ZENODO · Other literature type · 2026 · License: CC BY · Data source: Datacite

Thinking Machines: A Dual-System Framework for Metacognitive Control and Learning

Authors: Ananta Nair, Erin Austin, Jason Watson, Farnoush Banaei-Kashani

Abstract

A foundational challenge for artificial intelligence is not whether machines can solve well-defined tasks, but whether they can adapt across novel and open-ended domains. Biological systems achieve such adaptivity by coupling fast sensorimotor control with slower abstraction and memory consolidation across timescales. Despite remarkable progress, contemporary large-scale models remain energy-inefficient at inference, weakly coupled to embodied goal-directed control, and prone to interference without principled consolidation. We propose a dual-loop architecture that couples a fast recurrent perception–action loop with a slow consolidation-and-planning loop that reorganizes experience into compositional memories over a learned relational graph. The fast loop is implemented as a stable excitatory–inhibitory dynamical system with online prediction-error learning, uncertainty-aware state estimation, and asymmetric consequence-driven updating in which aversive outcomes produce rapid, preferential policy correction and memory consolidation. The slow loop performs attention-based associative retrieval over a spectrally structured memory graph, enabling context-sensitive diffusion-based recall and the composition of long-horizon plans from consolidated fragments. Both loops share a common dynamical substrate and are derived from a single variational objective that unifies learning, action selection, and uncertainty estimation via free energy minimization, enabling metacognitive regulation of computational depth by scaling inference resources to predictive uncertainty. We prove Lyapunov stability for the fast-loop dynamics under quasi-static learning assumptions and establish robustness bounds for inter-loop coupling under timescale separation.
The framework yields four falsifiable predictions: improved task switching, reduced catastrophic forgetting, compositional zero-shot transfer, and uncertainty-adaptive compute allocation, providing concrete criteria for evaluation in embodied control, continual learning, and compositional generalization.
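The abstract's fast loop and its metacognitive gating of compute can be illustrated with a minimal toy sketch. This is a hypothetical reconstruction, not the authors' implementation: the update rule, the leaky state dynamics, and all parameter names (`lr`, `base_steps`, `scale`) are assumptions chosen only to show the pattern of online prediction-error learning with more inference steps allocated when predictive uncertainty is high.

```python
import numpy as np

rng = np.random.default_rng(0)

def fast_loop_step(state, obs, W, lr=0.1):
    """One prediction-error update of the toy fast perception-action loop."""
    pred = np.tanh(W @ state)            # predicted observation
    err = obs - pred                     # prediction error
    W += lr * np.outer(err, state)       # online Hebbian-style correction
    state = 0.9 * state + 0.1 * err      # leaky recurrent state update
    return state, W, err

def allocate_compute(err, base_steps=1, max_steps=8, scale=4.0):
    """Metacognitive gate: spend more inference steps when uncertainty is high."""
    uncertainty = float(np.linalg.norm(err))   # error norm as uncertainty proxy
    return min(base_steps + int(scale * uncertainty), max_steps)

dim = 8
state = np.zeros(dim)
W = 0.1 * rng.standard_normal((dim, dim))

for t in range(50):
    obs = np.sin(0.3 * t) * np.ones(dim)       # toy sensory stream
    state, W, err = fast_loop_step(state, obs, W)
    steps = allocate_compute(err)
    for _ in range(steps - 1):                 # extra refinement under uncertainty
        state, W, err = fast_loop_step(state, obs, W)
```

The key design point the sketch isolates is that compute per timestep is not fixed: the gate scales the number of refinement iterations with the magnitude of the prediction error, capped by a hard budget.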

Keywords

Continual Learning, Metacognitive Control, Multi-timescale Consolidation, Lyapunov Stability, Spectral Graph Organization, Compositional Generalization, Modern Hopfield Networks, Active Inference, Dual-Loop Architecture, Free Energy Minimization, Temporal Difference Learning
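The slow loop's context-sensitive, diffusion-based recall over a spectrally structured memory graph can be sketched with a heat kernel on a toy graph Laplacian. Again a hypothetical illustration rather than the paper's method: the five-node adjacency matrix and the diffusion time `tau` are invented for the example, and the heat kernel is computed spectrally via eigendecomposition.

```python
import numpy as np

# Toy memory graph: 5 memory nodes, symmetric adjacency (assumed structure).
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))
L = D - A                          # combinatorial graph Laplacian

# Spectral heat kernel exp(-tau * L) from the Laplacian's eigendecomposition.
evals, evecs = np.linalg.eigh(L)
tau = 0.8                          # diffusion time: larger tau = broader recall
heat = evecs @ np.diag(np.exp(-tau * evals)) @ evecs.T

query = np.zeros(5)
query[0] = 1.0                     # cue memory node 0
recall = heat @ query              # diffusion-based relevance over the graph
ranked = np.argsort(-recall)       # most to least associated memories
```

Because the heat kernel conserves probability mass, `recall` stays normalized, and the diffusion time plays the role of a context knob: small `tau` retrieves only near neighbors of the cue, while large `tau` blends in more distant, weakly associated memories.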
