
arXiv: 2505.16932
Computing the polar decomposition and the related matrix sign function has been a well-studied problem in numerical analysis for decades. Recently, it has emerged as an important subroutine within the Muon algorithm for training deep neural networks. However, the requirements of this application differ sharply from classical settings: deep learning demands GPU-friendly algorithms that prioritize high throughput over high precision. We introduce Polar Express, a new method for computing the polar decomposition. Like Newton-Schulz and other classical polynomial methods, our approach uses only matrix-matrix multiplications, making it very efficient on GPUs. Inspired by earlier work of Chen & Chow and Nakatsukasa & Freund, Polar Express adapts the update rule at each iteration by solving a minimax optimization problem. We prove that this strategy minimizes error in a worst-case sense, allowing Polar Express to converge as rapidly as possible both in the early iterations and asymptotically. We also address finite-precision issues, making it practical to use in bfloat16. When integrated into the Muon training framework, our method leads to consistent improvements in validation loss when training a GPT-2 model on one billion tokens from the FineWeb dataset, outperforming recent alternatives across a range of learning rates.
34 pages, 8 figures, 4 algorithms
Subjects: Numerical Analysis (math.NA); Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Optimization and Control (math.OC); Computation and Language (cs.CL)
MSC classes: 65F30, 68T07, 68N19
ACM classes: G.1.3; I.2.6; F.2.1; G.1.6
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources | 0 |
| Popularity | Current attention in the research community, based on the citation network | Average |
| Influence | Overall/total impact in the research community, based on the citation network (diachronically) | Average |
| Impulse | Initial momentum directly after publication, based on the citation network | Average |
