no code implementations • 25 Mar 2024 • Yasushi Esaki, Satoshi Koide, Takuro Kutsuna
In DIL, we assume that samples from new domains are observed over time.
no code implementations • 8 Mar 2024 • Takuro Kutsuna
In this paper, we first identify activation shift, a simple but remarkable phenomenon in a neural network in which the preactivation value of a neuron has a non-zero mean that depends on the angle between the weight vector of the neuron and the mean of the activation vector in the previous layer.
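The phenomenon can be checked numerically. The sketch below (illustrative names, not the paper's notation; it assumes previous-layer activations with a non-zero mean, as after ReLU) shows that a neuron's preactivation mean equals the inner product of its weight vector with the mean activation vector, i.e. it depends on the angle between the two:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: previous-layer activations with a non-zero mean.
mu = np.full(100, 0.5)                         # mean activation vector
acts = rng.normal(loc=mu, scale=1.0, size=(10_000, 100))

w = rng.normal(size=100)                       # weight vector of one neuron
pre = acts @ w                                 # preactivation values

# The preactivation mean is w . mu = |w| |mu| cos(angle between w and mu),
# so it is non-zero whenever w is not orthogonal to mu.
cos_angle = (w @ mu) / (np.linalg.norm(w) * np.linalg.norm(mu))
predicted = np.linalg.norm(w) * np.linalg.norm(mu) * cos_angle

print(pre.mean(), predicted)  # the two values agree up to sampling noise
```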
no code implementations • 21 Feb 2024 • Yasushi Esaki, Akihiro Nakamura, Keisuke Kawano, Ryoko Tokuhisa, Takuro Kutsuna
We propose an accuracy-preserving calibration method using the Concrete distribution as the probabilistic model on the probability simplex.
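The Concrete (Gumbel-softmax) distribution is a standard relaxation of the categorical distribution onto the probability simplex; a minimal sampling sketch (not the paper's calibration method, and with illustrative parameter names) looks like this:

```python
import numpy as np

def sample_concrete(alpha, temperature, rng):
    """Draw one sample from the Concrete (Gumbel-softmax) distribution
    with location parameters `alpha` and the given temperature."""
    g = rng.gumbel(size=len(alpha))              # Gumbel(0, 1) noise
    logits = (np.log(alpha) + g) / temperature
    e = np.exp(logits - logits.max())            # numerically stable softmax
    return e / e.sum()

rng = np.random.default_rng(0)
x = sample_concrete(np.array([2.0, 1.0, 1.0]), temperature=0.5, rng=rng)
print(x, x.sum())  # a point on the probability simplex (sums to 1)
```

Lower temperatures push samples toward the simplex vertices (one-hot vectors), which is why the distribution can model categorical-like outputs while remaining continuous.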
no code implementations • 2 Feb 2024 • Keisuke Kawano, Takuro Kutsuna, Keisuke Sano
The generalization ability of Deep Neural Networks (DNNs) is still not fully understood, despite numerous theoretical and empirical analyses.
no code implementations • 7 Apr 2023 • Takuro Kutsuna
Distribution shift is a problem in which the distribution of data changes between training and testing; it can significantly degrade the performance of a model deployed in the real world.
no code implementations • 9 Mar 2023 • Keisuke Kawano, Takuro Kutsuna, Ryoko Tokuhisa, Akihiro Nakamura, Yasushi Esaki
One major challenge in machine learning applications is coping with mismatches between the datasets used in development and those obtained in real-world applications.
no code implementations • 29 Jun 2020 • Keisuke Kawano, Takuro Kutsuna, Satoshi Koide
Multiple sequence alignment (MSA) is a traditional and challenging task in time-series analysis.
no code implementations • NeurIPS 2019 • Ruho Kondo, Keisuke Kawano, Satoshi Koide, Takuro Kutsuna
Learning non-deterministic dynamics and intrinsic factors from images obtained through physical experiments is at the intersection of machine learning and material science.
no code implementations • NeurIPS 2018 • Satoshi Koide, Keisuke Kawano, Takuro Kutsuna
The evolution of biological sequences, such as proteins or DNA, is driven by three basic edit operations: substitution, insertion, and deletion.
no code implementations • ICLR 2018 • Takuro Kutsuna
In this paper, we first identify angle bias, a simple but remarkable phenomenon that causes the vanishing gradient problem in a multilayer perceptron (MLP) with sigmoid activation functions.
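A minimal sketch of the vanishing-gradient mechanism involved (illustrative only, not the paper's analysis): the sigmoid's derivative peaks at 0.25, so backpropagated gradients shrink with depth, and shrink far faster when preactivations are biased away from zero and the units saturate:

```python
import numpy as np

def sigmoid_grad(z):
    """Derivative of the logistic sigmoid at z; its maximum is 0.25 at z = 0."""
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

depth = 20  # per-layer derivative factors multiply along the backward pass
print(sigmoid_grad(0.0) ** depth)   # centered preactivations: ~9e-13
print(sigmoid_grad(3.0) ** depth)   # biased (saturated) preactivations: ~1e-27
```

A preactivation mean that is pushed away from zero (as angle bias does) thus makes an already-vanishing gradient vanish orders of magnitude faster.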