no code implementations • 15 May 2023 • Wenxuan Zou, Haiping Huang
Dynamical mean-field theory is a powerful physics tool used to analyze the typical behavior of neural networks, where neurons can be recurrently connected, or multiple layers of neurons can be stacked.
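A minimal numerical illustration (not taken from the paper) of the self-averaging property that dynamical mean-field theory exploits: as the network size N grows, population-averaged observables of a random recurrent network concentrate across realizations of the random couplings, so a single typical-case description suffices. The function name, parameter values, and the choice of order parameter below are all hypothetical.

```python
import numpy as np

def order_parameter_dispersion(N, g=1.5, trials=10, T=100.0, dt=0.1, seed=0):
    """Std, across realizations of the random coupling matrix J, of the
    time-averaged mean-square activity of the recurrent network
    dx/dt = -x + g * J @ tanh(x) (Euler-discretized)."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    vals = []
    for _ in range(trials):
        J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
        x = rng.normal(size=N)
        acc = 0.0
        for _ in range(steps):
            x = x + dt * (-x + g * (J @ np.tanh(x)))
            acc += np.mean(x ** 2)
        vals.append(acc / steps)
    return float(np.std(vals))
```

Run at two sizes: the realization-to-realization scatter of the order parameter shrinks with N, which is what licenses a mean-field description of the typical behavior.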
1 code implementation • 6 Dec 2022 • Chan Li, Zhenye Huang, Wenxuan Zou, Haiping Huang
A variational Bayesian learning setting is thus proposed, in which the neural networks are trained in a field space rather than the discrete-weight space, where gradients are ill-defined; weight uncertainty is thereby naturally incorporated and modulates synaptic resources among tasks.
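A minimal sketch of the generic idea of training a distribution over weights rather than point weights, using a mean-field Gaussian posterior with the reparameterization trick on a toy logistic-regression task. This is a standard variational-Bayes baseline, not the paper's field-space construction; the data, variable names, and hyperparameters are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 2-D points labeled by the sign of x1 + x2.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Mean-field Gaussian posterior over weights: w ~ N(mu, sigma^2),
# with sigma = softplus(rho) to keep it positive.
mu = np.zeros(2)
rho = np.full(2, -3.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for step in range(500):
    sigma = np.log1p(np.exp(rho))
    eps = rng.normal(size=2)
    w = mu + sigma * eps                 # reparameterized weight sample
    p = sigmoid(X @ w)
    gw = X.T @ (p - y) / len(y)          # grad of mean NLL w.r.t. the sample
    # Closed-form gradients of KL(q || N(0, I)) added to the data term.
    g_mu = gw + mu
    g_rho = (gw * eps + (sigma - 1.0 / sigma)) * sigmoid(rho)
    mu -= lr * g_mu
    rho -= lr * g_rho
```

After training, `mu` gives a point predictor while `sigma` quantifies the remaining weight uncertainty, which is the quantity a continual-learning scheme can use to modulate how strongly each synapse is protected across tasks.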
1 code implementation • 16 Mar 2022 • Wenxuan Zou, Muyi Sun
To tackle these problems, we propose Graph Flow, a comprehensive knowledge distillation framework for both network-efficient and annotation-efficient medical image segmentation.
no code implementations • 27 Aug 2021 • Wenxuan Zou, Muyi Sun
To tackle this problem, we propose CoCo DistillNet, a novel Cross-layer Correlation (CoCo) knowledge distillation network for pathological gastric cancer segmentation.
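A minimal sketch of the generic correlation-transfer objective that correlation-based distillation methods build on: match the channel-wise correlation (Gram) matrices of teacher and student feature maps. This is not CoCo DistillNet's specific cross-layer formulation; the function names and shapes are hypothetical.

```python
import numpy as np

def correlation_matrix(F):
    """Channel-wise correlation matrix of a feature map F of shape
    (C, H, W): flatten each channel, L2-normalize, take inner products."""
    V = F.reshape(F.shape[0], -1)
    V = V / (np.linalg.norm(V, axis=1, keepdims=True) + 1e-8)
    return V @ V.T                       # (C, C), entries in [-1, 1]

def correlation_distill_loss(f_teacher, f_student):
    """Mean squared Frobenius distance between teacher and student
    correlation matrices; added to the student's segmentation loss."""
    Gt = correlation_matrix(f_teacher)
    Gs = correlation_matrix(f_student)
    return float(np.mean((Gt - Gs) ** 2))
```

Because the Gram matrices compare channels with each other rather than raw activations, the student is pushed to reproduce the teacher's inter-channel structure even when the two networks have different capacities.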
no code implementations • 26 Aug 2021 • Zhuojie Wu, Zijian Wang, Wenxuan Zou, Fan Ji, Hao Dang, Wanting Zhou, Muyi Sun
In the three-dimensional feature learning path, we design a novel Adaptive Pooling Module (APM) and propose a new Quadruple Attention Module (QAM).
no code implementations • 7 Feb 2021 • Wenxuan Zou, Chan Li, Haiping Huang
Recurrent neural networks are widely used for modeling spatio-temporal sequences in both natural language processing and neural population dynamics.
1 code implementation • 16 Jul 2020 • Wenxuan Zou, Haiping Huang
Here, we propose a statistical mechanics framework by directly building a least structured model of the high-dimensional weight space, considering realistic structured data, stochastic gradient descent training, and the computational depth of neural networks.