no code implementations • 25 Mar 2024 • Yasushi Esaki, Satoshi Koide, Takuro Kutsuna
In DIL (domain incremental learning), we assume that samples from new domains are observed over time.
no code implementations • 21 Feb 2024 • Yasushi Esaki, Akihiro Nakamura, Keisuke Kawano, Ryoko Tokuhisa, Takuro Kutsuna
We propose an accuracy-preserving calibration method using the Concrete distribution as the probabilistic model on the probability simplex.
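The paper's calibration method itself is not detailed in this snippet; as background, the Concrete (Gumbel-Softmax) distribution mentioned above is a standard relaxation that places probability mass on the simplex. Below is a minimal, illustrative sketch of drawing one Concrete sample, assuming the usual reparameterization with Gumbel noise and a temperature parameter; it is not the authors' calibration procedure.

```python
import numpy as np

def sample_concrete(logits, temperature, rng):
    """Draw one sample from the Concrete (Gumbel-Softmax) distribution.

    The result is a point on the probability simplex: nonnegative
    entries that sum to 1. Lower temperatures push samples toward
    the vertices (one-hot vectors).
    """
    # Gumbel(0, 1) noise via the inverse-CDF trick.
    u = rng.uniform(size=np.shape(logits))
    gumbel = -np.log(-np.log(u))
    # Temperature-scaled softmax of perturbed logits.
    y = (np.asarray(logits) + gumbel) / temperature
    y = y - y.max()  # subtract max for numerical stability
    expy = np.exp(y)
    return expy / expy.sum()

rng = np.random.default_rng(0)
p = sample_concrete(np.log([0.7, 0.2, 0.1]), temperature=0.5, rng=rng)
# p is a length-3 vector on the 2-simplex.
```

At high temperatures samples concentrate near the uniform center of the simplex; as the temperature goes to zero the distribution approaches a categorical draw, which is what makes the relaxation useful as a differentiable probabilistic model on the simplex.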
no code implementations • 9 Mar 2023 • Keisuke Kawano, Takuro Kutsuna, Ryoko Tokuhisa, Akihiro Nakamura, Yasushi Esaki
One major challenge in machine learning applications is coping with mismatches between the datasets used during development and those encountered in real-world applications.
no code implementations • 24 Sep 2020 • Yasushi Esaki, Yuta Nakahara, Toshiyasu Matsushima
We propose two new criteria to understand the advantage of deepening neural networks.