
Hello,

I present PégaseNet, a family of neural network architectures that achieve instantaneous learning by exploiting algebraic structure. Part I demonstrates PégaseNet V2, which solves modular addition with 100% accuracy and zero training iterations via circular convolution in the group (Z/qZ, +), achieving a >1000× speedup over standard gradient descent. Part II introduces PégaseNet-FSA, an extension that uses finite-state automata to capture word order in structured commands, bridging the gap between commutative operations and non-commutative language. We validate both architectures empirically and discuss applications in cryptography, error-correcting codes, IoT, and formal language parsing. These results establish a new paradigm of structure-driven computation, in which solutions emerge from mathematical properties rather than iterative optimization.

Keywords: Modular arithmetic, Circular convolution, Grokking, Finite-state automata, Zero-shot learning, Structure-driven computation.

Cordially
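To make the central claim concrete, here is a minimal sketch of the algebraic identity behind zero-training modular addition: circular convolution of one-hot vectors of length q yields a one-hot vector at the sum mod q, so the answer can be read off with argmax and no gradient steps. The function name and NumPy/FFT implementation below are illustrative assumptions, not the actual PégaseNet V2 code.

```python
import numpy as np

def mod_add_via_circular_conv(a: int, b: int, q: int) -> int:
    """Compute (a + b) mod q via circular convolution in (Z/qZ, +).

    Residues are encoded as one-hot vectors of length q; their circular
    convolution is one-hot at index (a + b) mod q, so no training is needed.
    (Illustrative sketch only; not the paper's actual architecture.)
    """
    x = np.zeros(q)
    x[a % q] = 1.0
    y = np.zeros(q)
    y[b % q] = 1.0
    # Circular convolution computed via the FFT convolution theorem:
    # conv = IFFT(FFT(x) * FFT(y)), indices wrap around mod q.
    conv = np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)).real
    return int(np.argmax(conv))

# Example: 5 + 9 = 14 ≡ 2 (mod 12)
print(mod_add_via_circular_conv(5, 9, 12))
```

The FFT route also shows where the speedup claim comes from: one convolution replaces an entire training run, and it costs only O(q log q).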
