1 code implementation • 8 Sep 2023 • Chan Li, Junbin Qiu, Haiping Huang
Therefore, our model provides a starting point for investigating the connection among brain computation, next-token prediction, and general intelligence.
1 code implementation • 6 Dec 2022 • Chan Li, Zhenye Huang, Wenxuan Zou, Haiping Huang
A variational Bayesian learning setting is thus proposed, in which the neural networks are trained in a continuous field space rather than the discrete-weight space, where gradients are ill-defined. Moreover, weight uncertainty is naturally incorporated and modulates synaptic resources among tasks.
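The idea of replacing each discrete weight with a continuous field can be sketched as follows. This is a minimal toy illustration, not the paper's actual method or code: each weight is modeled as a Gaussian with a learnable mean and standard deviation (the "field"), trained by the reparameterization trick on a 1D regression problem, so that gradients flow through the continuous field parameters and the learned standard deviation expresses weight uncertainty.

```python
# Hedged sketch (assumed setup, not the authors' implementation):
# mean-field variational Bayesian learning of a single synaptic weight.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise; the true weight is 2.0.
x = rng.normal(size=(200, 1))
y = 2.0 * x + 0.1 * rng.normal(size=(200, 1))

def softplus(z):
    return np.log1p(np.exp(z))

# Variational field parameters: weight ~ N(m, softplus(rho)^2).
m, rho = 0.0, -3.0
lr = 0.05

for step in range(500):
    s = softplus(rho)
    eps = rng.normal()
    w = m + s * eps                       # reparameterized weight sample
    grad_w = np.mean(2 * (w * x - y) * x)  # d(MSE)/dw at the sampled weight
    # Chain rule back to the continuous field parameters (m, rho):
    m -= lr * grad_w
    rho -= lr * grad_w * eps / (1.0 + np.exp(-rho))  # d softplus/d rho
```

After training, `m` approaches the true weight while `softplus(rho)` gives a per-weight uncertainty; in a multi-task setting, such uncertainties can gate which synapses are free to change, which is the intuition behind modulating synaptic resources among tasks.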
no code implementations • 21 Aug 2022 • Chan Li, Haiping Huang
Training large-scale deep neural networks is expensive, yet the trained weight matrices that constitute the networks remain difficult to interpret.
no code implementations • 7 Feb 2021 • Wenxuan Zou, Chan Li, Haiping Huang
Recurrent neural networks are widely used for modeling spatio-temporal sequences in both natural language processing and neural population dynamics.
1 code implementation • 10 Jan 2020 • Chan Li, Haiping Huang
Therefore, our model learns the credit assignment underlying the decision and predicts an ensemble of sub-networks that can accomplish the same task, providing insight into the macroscopic behavior of deep learning through the lens of the distinct roles of synaptic weights.