Thoughts on the Consistency between Ricci Flow and Neural Network Behavior

16 November 2021
Jun Chen, Tianxin Huang, Wenzhou Chen, Yong Liu
arXiv:2111.08410
Abstract

The Ricci flow is a partial differential equation that evolves the metric of a Riemannian manifold to make it more regular. Neural networks appear to exhibit similar geometric behavior on specific tasks. In this paper, we construct the linearly nearly Euclidean manifold as a background on which to observe both the evolution under Ricci flow and the training of neural networks. Under the Ricci-DeTurck flow, we prove the dynamical stability and convergence of the linearly nearly Euclidean metric under $L^2$-norm perturbations. In practice, from the information-geometry and mirror-descent points of view, we derive the steepest-descent gradient flow for neural networks on the linearly nearly Euclidean manifold. During training, we observe that the network's metric also converges to the linearly nearly Euclidean metric, which is consistent with the convergent behavior of linearly nearly Euclidean metrics under the Ricci-DeTurck flow.
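
For reference, the standard equations behind the abstract's terminology are sketched below; the notation ($g$, $\operatorname{Ric}$, $W$, $G$, $\theta$, $L$) is generic and not taken from the paper. The Ricci flow evolves a metric $g$ by

\[ \partial_t g_{ij} = -2\,R_{ij}, \]

while the Ricci-DeTurck flow adds a gauge-fixing term built from a vector field $W(g)$ determined by a fixed background metric,

\[ \partial_t g = -2\,\operatorname{Ric}(g) + \mathcal{L}_{W(g)}\,g. \]

On the optimization side, the steepest-descent gradient flow of a loss $L(\theta)$ with respect to a Riemannian metric $G(\theta)$ on parameter space, the general form suggested by the information-geometry and mirror-descent viewpoint, reads

\[ \dot{\theta}(t) = -\,G(\theta)^{-1}\,\nabla_\theta L(\theta), \]

which reduces to ordinary gradient descent when $G$ is the Euclidean metric.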
