
This work develops a unified, scale-dependent approximation theory that determines when local updating rules—such as neural network layers, message-passing steps, and Bayesian updates—are sufficient for prediction, and when a persistent latent representation, or world-model, becomes mathematically necessary. Building on the renormalization-based framework introduced in "Renormalization as a Unified Framework for Scalable Prediction," the paper establishes that the aggregated Lipschitz constant across scales governs the stability of predictive systems. When this constant exceeds one, approximation errors amplify and no purely local update can remain valid across scales. The theory provides necessary and sufficient conditions for world-model necessity, introduces identifiability and minimal latent-dimension requirements, and formalizes locality in terms of receptive-field growth relative to graph diameter. It further derives tight Lipschitz bounds for multi-head attention, analyzes the stability of graph neural networks (GNNs) and diffusion models, and constructs counterexamples—including chaotic systems—where local updating fails even under pointwise contractivity. Overall, the paper offers a mathematically rigorous foundation that explains why large-scale models such as Transformers, diffusion models, and world-model architectures succeed in long-range prediction, while purely local reactive policies fundamentally cannot. This framework unifies previously disconnected empirical observations and provides a structural basis for understanding scalable prediction in modern machine learning.
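The central stability claim above can be illustrated numerically. The sketch below is not from the paper; it is a hypothetical toy model of the worst-case error bound, where each scale applies a local update with its own Lipschitz constant and the bound after all scales is the product of those constants times the initial error.

```python
# Toy illustration (hypothetical, not the paper's construction):
# the aggregated Lipschitz constant — the product of per-scale
# constants — bounds how an initial error propagates across scales.

def propagate_error(eps0, lipschitz_constants):
    """Worst-case error after one local update per scale:
    |e_K| <= (prod_k L_k) * |e_0|."""
    eps = eps0
    for L in lipschitz_constants:
        eps *= L  # each scale can amplify the error by at most L
    return eps

# Contractive aggregate (product < 1): errors decay across scales.
decay = propagate_error(1.0, [0.9] * 10)

# Expansive aggregate (product > 1): errors grow geometrically, so
# no purely local update rule remains valid at every scale.
blowup = propagate_error(1.0, [1.1] * 10)
```

Here `decay` equals 0.9**10 (about 0.35) while `blowup` equals 1.1**10 (about 2.59), matching the abstract's dichotomy: stability when the aggregated constant stays below one, amplification once it exceeds one.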
Scale-Dependent Approximation Theory, World-Model Necessity, Transformer Lipschitz Analysis, Renormalization Framework, Identifiability and Latent Dimension, Locality and Receptive Field Growth, Local Updating Stability, Graph Neural Networks Stability, Aggregated Lipschitz Constant, Chaotic Dynamics in Prediction
