
Neural sequence models face a fundamental tension between representational capacity and geometric constraints. We present the Lorentz-Manifold Transformer (LMT), integrating hyperbolic geometry (the Lorentz model) with oscillatory dynamics (Hyperbolic Artificial Kuramoto Oscillatory Neurons, H-AKOrN) to address the Geometric Capacity Bottleneck. The LMT establishes mathematical guarantees for topological preservation through: (1) manifold capacity bounds proving an exponential advantage ($\alpha_c^{\mathbb{H}} / \alpha_c^{\mathbb{R}} = \Omega(e^r)$), (2) geometric frustration as a proxy for representational misalignment, and (3) Gromov-Wasserstein structural risk. Computational complexity (
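For context on the two geometric ingredients named above: the Lorentz model realizes hyperbolic space as the upper sheet of a hyperboloid in Minkowski space, with geodesic distance $d(x, y) = \operatorname{arcosh}(-\langle x, y \rangle_{\mathcal{L}})$, and the classical Kuramoto model couples oscillator phases via $\dot{\theta}_i = \omega_i + \tfrac{K}{N}\sum_j \sin(\theta_j - \theta_i)$. The sketch below is a minimal NumPy illustration of this standard machinery, not the authors' LMT or H-AKOrN implementation; all function names here are illustrative.

```python
import numpy as np

def minkowski_inner(x, y):
    """Lorentzian (Minkowski) inner product <x, y>_L = -x0*y0 + sum_i xi*yi."""
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def lift_to_hyperboloid(v):
    """Lift a Euclidean vector v in R^n onto the hyperboloid H^n
    (upper sheet, curvature -1) by solving <x, x>_L = -1 for the time coordinate x0."""
    x0 = np.sqrt(1.0 + np.sum(v * v, axis=-1, keepdims=True))
    return np.concatenate([x0, v], axis=-1)

def lorentz_distance(x, y):
    """Geodesic distance in the Lorentz model: arcosh(-<x, y>_L)."""
    inner = np.clip(-minkowski_inner(x, y), 1.0, None)  # guard against numerical error
    return np.arccosh(inner)

def kuramoto_step(theta, omega, K, dt=0.01):
    """One Euler step of the classical Kuramoto model:
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    n = theta.shape[0]
    coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    return theta + dt * (omega + (K / n) * coupling)

# Example: pairwise distances on the hyperboloid reflect the exponential
# volume growth of hyperbolic space that underlies the claimed capacity advantage.
rng = np.random.default_rng(0)
x = lift_to_hyperboloid(rng.normal(size=8))
y = lift_to_hyperboloid(rng.normal(size=8))
print(lorentz_distance(x, y))

theta = rng.uniform(0.0, 2.0 * np.pi, size=16)
omega = rng.normal(size=16)
print(kuramoto_step(theta, omega, K=1.0))
```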
Neuro-AI Alignment, Out-of-Distribution Generalization, Hyperbolic Neural Networks, Geometric Deep Learning, Gromov-Wasserstein Distance, Lorentz Model
