
PMID: 32909804
arXiv: 1902.02875
The ability of humans and animals to quickly adapt to novel tasks is difficult to reconcile with the standard paradigm of learning by slow synaptic weight modification. Here we show that fixed-weight neural networks can learn to generate required dynamics by imitation. After appropriate weight pretraining, the networks quickly and dynamically adapt to learn new tasks and thereafter continue to achieve them without further teacher feedback. We explain this ability and illustrate it with a variety of target dynamics, ranging from oscillatory trajectories to driven and chaotic dynamical systems.
Keywords: Neurons, Nerve Net, Computational Neuroscience, Models, Neurological, Cell Communication, Learning, Plasticity and Memory, Animals, Humans, Quantitative Biology - Neurons and Cognition (q-bio.NC), FOS: Biological sciences
| indicator | description | value |
| --- | --- | --- |
| selected citations | citations derived from selected sources, as an alternative to the "influence" indicator | 36 |
| popularity | "current" impact/attention (the "hype") in the research community at large, based on the underlying citation network | Top 10% |
| influence | overall/total impact in the research community at large, based on the underlying citation network (diachronically) | Top 10% |
| impulse | initial momentum directly after publication, based on the underlying citation network | Top 10% |
