This paper establishes a unified framework integrating geometric flows with deep learning through three fundamental innovations. First, we propose a thermodynamically coupled Ricci flow that dynamically adapts parameter space geometry to loss landscape topology, formally proved to preserve isometric knowledge embedding (Theorem~\ref{thm:isometric}). Second, we derive explicit phase transition thresholds and critical learning rates (Theorem~\ref{thm:critical}) through curvature blowup analysis, enabling automated singularity resolution via geometric surgery (Lemma~\ref{lem:surgery}). Third, we establish an AdS/CFT-type holographic duality (Theorem~\ref{thm:ads}) between neural networks and conformal field theories, providing entanglement entropy bounds for regularization design. Experiments demonstrate 2.1$\times$ convergence acceleration and 63\% topological simplification while maintaining $\mathcal{O}(N\log N)$ complexity, outperforming Riemannian baselines by 15.2\% in few-shot accuracy. Theoretically, we prove exponential stability (Theorem~\ref{thm:converge}) through a new Lyapunov function combining Perelman entropy with Wasserstein gradient flows, fundamentally advancing geometric deep learning.
翻译:本文通过三项基础性创新,建立了一个将几何流与深度学习相统一的框架。首先,我们提出了一种热力学耦合的Ricci流,它能根据损失景观的拓扑结构动态调整参数空间的几何性质,并严格证明了该方法能保持等距知识嵌入(定理~\ref{thm:isometric})。其次,通过曲率爆破分析,我们推导出了显式的相变阈值与临界学习率(定理~\ref{thm:critical}),从而能够通过几何手术实现自动化的奇点消解(引理~\ref{lem:surgery})。第三,我们在神经网络与共形场论之间建立了一种AdS/CFT类型的全息对偶关系(定理~\ref{thm:ads}),为正则化设计提供了纠缠熵的界。实验表明,该方法在保持$\mathcal{O}(N\log N)$复杂度的同时,实现了2.1倍的收敛加速与63%的拓扑结构简化,在小样本准确率上比黎曼基线方法高出15.2%。在理论上,我们通过结合Perelman熵与Wasserstein梯度流构造了一个新的Lyapunov函数,证明了指数稳定性(定理~\ref{thm:converge}),从而从根本上推进了几何深度学习的发展。