
arXiv: 2207.07287
This paper studies large-scale optimization problems on Riemannian manifolds whose objective function is a finite sum of negative log-probability losses. Such problems arise in various machine learning and signal processing applications. By introducing a notion of Fisher information matrix in the manifold setting, we propose a novel Riemannian natural gradient method, which can be viewed as a natural extension of the natural gradient method from the Euclidean setting to the manifold setting. We establish the almost-sure global convergence of the proposed method under standard assumptions. Moreover, we show that if the loss function satisfies certain convexity and smoothness conditions and the input-output map satisfies a Riemannian Jacobian stability condition, then the method enjoys a local linear rate of convergence; under Lipschitz continuity of the Riemannian Jacobian of the input-output map, the rate is even quadratic. We then prove that the Riemannian Jacobian stability condition holds with high probability for a two-layer fully connected neural network with batch normalization, provided that the width of the network is sufficiently large. This underscores the practical relevance of our convergence rate result. Numerical experiments on applications arising from machine learning demonstrate the advantages of the proposed method over state-of-the-art alternatives.
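The core iteration the abstract describes can be sketched on a toy problem: minimizing a finite sum of Gaussian negative log-likelihoods over the unit sphere. Everything below (the data model, the empirical Fisher matrix, the damping and step-size values) is an illustrative assumption for exposition, not the paper's implementation, which notably also covers Kronecker-factored approximations of the Fisher matrix for scalability.

```python
import numpy as np

# Toy setup (illustrative, not from the paper): observations a_i are noisy
# copies of a unit vector, and the per-sample loss is the Gaussian negative
# log-likelihood 0.5 * ||a_i - x||^2, minimized over the unit sphere.
rng = np.random.default_rng(0)
dim, n = 5, 200
target = rng.normal(size=dim)
target /= np.linalg.norm(target)
A = target + 0.1 * rng.normal(size=(n, dim))

def loss(x):
    return 0.5 * np.mean(np.sum((A - x) ** 2, axis=1))

x = rng.normal(size=dim)
x /= np.linalg.norm(x)

eta, damping = 0.1, 0.1  # step size and Fisher damping (assumed values)
losses = [loss(x)]
for _ in range(50):
    P = np.eye(dim) - np.outer(x, x)         # projector onto tangent space at x
    G = (x - A) @ P.T                        # per-sample Riemannian gradients
    g = G.mean(axis=0)                       # Riemannian gradient of mean loss
    F = G.T @ G / n + damping * np.eye(dim)  # damped empirical Fisher matrix
    direction = -np.linalg.solve(F, g)       # natural gradient direction
    direction = P @ direction                # keep the step tangent
    x = x + eta * direction
    x /= np.linalg.norm(x)                   # retraction back onto the sphere
    losses.append(loss(x))
```

The damping term keeps the Fisher matrix invertible when per-sample gradients are nearly collinear; in large-scale settings one would replace the dense solve with a structured (e.g. Kronecker-factored) approximation, as the keywords below suggest.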
Keywords: Large-scale problems in mathematical programming, FOS: Computer and information sciences, Computer Science - Machine Learning, Riemannian Fisher information matrix, manifold optimization, Nonconvex programming, global optimization, Machine Learning (cs.LG), Kronecker-factored approximation, Optimization and Control (math.OC), FOS: Mathematics, Derivative-free methods and methods using generalized derivatives, Semidefinite programming, natural gradient method, Mathematics - Optimization and Control
| Indicator | Value | Meaning |
| --- | --- | --- |
| selected citations | 2 | Citations derived from selected sources; an alternative to the "influence" indicator. |
| popularity | Top 10% | The "current" impact/attention (the "hype") of the article in the research community, based on the underlying citation network. |
| influence | Average | The overall/total impact of the article in the research community, based on the underlying citation network (diachronically). |
| impulse | Average | The initial momentum of the article directly after its publication, based on the underlying citation network. |
