Thoughts on the Consistency between Ricci Flow and Neural Network Behavior

16 Nov 2021  ·  Jun Chen, Tianxin Huang, Wenzhou Chen, Yong Liu

The Ricci flow is a partial differential equation that evolves the metric of a Riemannian manifold to make it more regular. Neural networks, on the other hand, appear to exhibit similar geometric behavior on specific tasks. In this paper, we construct the linearly nearly Euclidean manifold as a background in which to observe both the evolution of the Ricci flow and the training of neural networks. Under the Ricci-DeTurck flow, we prove dynamical stability and convergence of the linearly nearly Euclidean metric under an $L^2$-norm perturbation. In practice, from the viewpoints of information geometry and mirror descent, we derive the steepest descent gradient flow for neural networks on the linearly nearly Euclidean manifold. During training, we observe that the network's metric also converges regularly to the linearly nearly Euclidean metric, which is consistent with the convergence behavior of linearly nearly Euclidean metrics under the Ricci-DeTurck flow.
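For context, the abstract refers to the Ricci flow and its DeTurck modification without stating them; in their standard forms (quoted from the general literature, not from this paper) they read

$$\partial_t g(t) = -2\,\mathrm{Ric}\big(g(t)\big), \qquad \partial_t g(t) = -2\,\mathrm{Ric}\big(g(t)\big) + \mathcal{L}_{W(t)}\,g(t),$$

where $\mathcal{L}_{W(t)}$ is the Lie derivative along the DeTurck vector field $W(t)$, which fixes the diffeomorphism gauge and makes the flow strictly parabolic.

The Python sketch below illustrates, under loose assumptions, the kind of metric-preconditioned (steepest descent / natural-gradient) update that the mirror descent viewpoint suggests: parameters are updated by the gradient multiplied by the inverse of a fixed "nearly Euclidean" metric $G = I + h$. The toy quadratic loss, the perturbation $h$, and the function name are illustrative choices, not the paper's construction.

```python
import numpy as np

def steepest_descent_step(theta, grad, metric, lr=1e-2):
    # One steepest-descent step with respect to a Riemannian metric G:
    #   theta <- theta - lr * G^{-1} grad   (natural-gradient form).
    # With G = I this reduces to ordinary gradient descent; for a
    # "nearly Euclidean" metric G = I + h with small h, the update
    # stays close to the Euclidean one.
    return theta - lr * np.linalg.solve(metric, grad)

# Hypothetical toy quadratic loss L(theta) = 0.5 * theta^T A theta
# (an assumption for illustration, not the paper's experiment).
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
theta = np.array([1.0, -2.0])

h = 0.05 * np.array([[0.0, 1.0],
                     [1.0, 0.0]])  # small symmetric perturbation (assumed)
G = np.eye(2) + h                  # "nearly Euclidean" metric

for _ in range(500):
    grad = A @ theta               # gradient of the quadratic loss
    theta = steepest_descent_step(theta, grad, G)

print(theta)  # approaches the minimizer at the origin
```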
