
Data-driven reduced-order models of chaotic dynamics can either dissipate or diverge catastrophically. Leveraging the nonlinear dimensionality reduction of autoencoders and the flexibility of nonlinear operator inference with neural networks, we address this problem by imposing a synthetic constraint in the reduced-order space. The synthetic constraint lets the reduced-order model remain fully nonlinear and highly unstable while preventing divergence. We illustrate the methodology on the classical 40-variable Lorenz '96 equations and on a more realistic fluid-flow problem, the quasi-geostrophic equations, showing that our methodology produces medium-to-long-range forecasts with lower error using less data than other nonlinear methods.
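For context, the classical 40-variable Lorenz '96 system used as the first test case can be sketched as follows. This is a standard illustration of the benchmark equations dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F with cyclic indices and forcing F = 8, not the authors' reduced-order model; the RK4 integrator, step size, and initial perturbation are illustrative choices.

```python
import numpy as np

def lorenz96_rhs(x, F=8.0):
    # Lorenz '96 tendency with cyclic boundary conditions:
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    # One classical fourth-order Runge-Kutta step
    k1 = lorenz96_rhs(x, F)
    k2 = lorenz96_rhs(x + 0.5 * dt * k1, F)
    k3 = lorenz96_rhs(x + 0.5 * dt * k2, F)
    k4 = lorenz96_rhs(x + dt * k3, F)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# 40 variables in the chaotic regime (F = 8), started from a
# slightly perturbed equilibrium to trigger chaotic dynamics
N, dt = 40, 0.01
x = 8.0 * np.ones(N)
x[0] += 0.01
for _ in range(5000):  # integrate 50 model time units
    x = rk4_step(x, dt)
```

Trajectories of this system stay bounded but are strongly chaotic, which is what makes it a common stress test for data-driven surrogates that must neither dissipate nor blow up.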
Subjects: Machine Learning (cs.LG); Dynamical Systems (math.DS)
