
What computational primitives are necessary for robust adaptation? We prove that six are indispensable: objective specification, breakpoint identification, global attraction, minimal intervention, feasibility projection, and feedback adaptation. For each primitive, we construct partially observable, non-stationary environments in which any algorithm lacking that primitive suffers Ω(T) regret (Ω(√T log T) for minimal intervention). These primitives are computationally independent under standard complexity assumptions, yet we present a concrete algorithm that integrates all six and achieves Õ(√T) regret under mild conditions. This work establishes a unified framework for analyzing adaptive systems and offers principled guidance for building robust AI.

Updated (Version 2): 5 January 2026.
reinforcement learning, non-stationary environments, computational primitives, safe exploration, robust AI, regret minimization, adaptation primitives
