1 code implementation • 2 Feb 2024 • Sota Kudo, Naoaki Ono, Shigehiko Kanaya, Ming Huang
We theoretically demonstrate that, for all reasonable values of $\beta$, FVIB simultaneously maximizes an approximation of the objective function of Variational Information Bottleneck (VIB), the conventional IB method.
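For context, the Information Bottleneck objective that VIB approximates is conventionally written as follows (standard textbook notation, not quoted from the paper itself): a stochastic encoding $Z$ of the input $X$ should retain information about the target $Y$ while compressing away the rest, with $\beta$ trading off the two terms.

```latex
% Information Bottleneck objective: keep information about Y,
% discard information about X; beta controls the trade-off.
\max_{p(z \mid x)} \; I(Z; Y) \;-\; \beta \, I(Z; X)
```

VIB optimizes a variational lower bound on this quantity rather than the mutual-information terms directly.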
no code implementations • 20 Dec 2023 • Daiki Koge, Naoaki Ono, Shigehiko Kanaya
We show that our model achieves better prediction performance for molecular properties than existing pre-training methods that use molecular graphs and three-dimensional molecular structures.
no code implementations • 2 Jul 2023 • Daiki Koge, Naoaki Ono, Shigehiko Kanaya
To overcome this limitation, we propose a novel molecular deep generative model that incorporates a hierarchical structure into the probabilistic latent vectors.
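As a rough illustration of what "a hierarchical structure in the probabilistic latent vectors" can mean (a generic two-level Gaussian sketch, not the paper's actual architecture; the linear map `W` is a hypothetical stand-in for a learned conditional network): a top-level latent conditions the distribution of a lower-level latent, and samples are drawn by ancestral sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_hierarchical_latents(n, d1=4, d2=8):
    """Ancestral sampling through a two-level Gaussian hierarchy.

    z1 ~ N(0, I); z2 ~ N(W z1, I), where W is a fixed random linear
    map standing in for a learned conditional network (illustrative).
    """
    W = rng.normal(size=(d2, d1)) / np.sqrt(d1)   # hypothetical "learned" weights
    z1 = rng.normal(size=(n, d1))                 # top-level latent
    z2 = z1 @ W.T + rng.normal(size=(n, d2))      # lower level, conditioned on z1
    return z1, z2

z1, z2 = sample_hierarchical_latents(100)
```

The point of the hierarchy is that coarse structure captured by `z1` constrains the finer-grained variation expressed in `z2`, rather than both being drawn independently.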
no code implementations • 20 Jul 2022 • Zheng Chen, Ziwei Yang, Lingwei Zhu, Guang Shi, Kun Yue, Takashi Matsubara, Shigehiko Kanaya, MD Altaf-Ul-Amin
As such, existing methods often impose unrealistic assumptions to extract useful features from the data while avoiding overfitting to spurious correlations.
no code implementations • 7 Apr 2022 • Zheng Chen, Ziwei Yang, Lingwei Zhu, Wei Chen, Toshiyo Tamura, Naoaki Ono, MD Altaf-Ul-Amin, Shigehiko Kanaya, Ming Huang
This paper proposes a novel framework for automatically capturing the time-frequency nature of electroencephalogram (EEG) signals of human sleep, grounded in authoritative sleep-medicine guidelines.
no code implementations • 2 Apr 2022 • Ziwei Yang, Lingwei Zhu, Zheng Chen, Ming Huang, Naoaki Ono, MD Altaf-Ul-Amin, Shigehiko Kanaya
In this paper, we propose to investigate automatic subtyping from an unsupervised learning perspective by directly constructing the underlying data distribution, so that sufficient data can be generated to alleviate the issue of overfitting.
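A minimal sketch of the general idea of constructing a data distribution and sampling from it to augment a small cohort (a plain multivariate Gaussian as a stand-in for the paper's generative model; the data and all names here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative small-cohort data: few samples, moderate dimension.
X = rng.normal(loc=1.0, scale=2.0, size=(30, 5))

# "Construct the underlying data distribution": here, a single
# multivariate Gaussian estimated from the observed samples.
mu = X.mean(axis=0)
cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # jitter for numerical stability

# Generate additional synthetic samples; downstream models can then
# be trained on far more data than the original cohort provides.
X_aug = rng.multivariate_normal(mu, cov, size=500)
```

The same pattern applies with a richer learned model (e.g. a deep generative model) in place of the Gaussian: fit the distribution, then sample as much training data as needed.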