
This dataset was used to learn visually interpretable oscillator networks in "Learning Visually Interpretable Oscillator Networks for Soft Continuum Robots from Video" (DOI: https://doi.org/10.48550/arXiv.2511.18322). Please cite this paper when using the dataset. The implementation of the visually interpretable oscillator networks and their training is published in a GitHub repository (https://github.com/UThenrik/visual_oscillators_for_SCR).

The dataset includes:
- Raw pressure and video data of a soft pneumatic robot during dynamic planar movements driven by step and oscillatory inputs
- A data processing script for data loading, synchronization, subsampling, and cropping
- The processed data used to train the networks in the paper
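To illustrate the synchronization and subsampling steps the processing script performs, here is a minimal sketch of one common approach: pairing each video frame with the nearest pressure sample by timestamp. All names, sampling rates, and signals below are illustrative assumptions, not taken from the dataset's actual script.

```python
import numpy as np

def synchronize(pressure_t, pressure_vals, frame_t):
    """Pair each frame timestamp with the nearest pressure sample.

    Assumes pressure_t is sorted; uses nearest-neighbor matching in time.
    """
    idx = np.searchsorted(pressure_t, frame_t)
    idx = np.clip(idx, 1, len(pressure_t) - 1)
    left = pressure_t[idx - 1]
    right = pressure_t[idx]
    # Step back one index where the left neighbor is closer in time.
    idx -= frame_t - left < right - frame_t
    return pressure_vals[idx]

# Synthetic example: pressure logged at 100 Hz, video frames at 25 fps.
pressure_t = np.arange(0, 10, 0.01)                   # seconds
pressure_vals = np.sin(2 * np.pi * 0.5 * pressure_t)  # dummy pressure signal
frame_t = np.arange(0, 10, 0.04)                      # frame timestamps

synced = synchronize(pressure_t, pressure_vals, frame_t)

# Subsampling: keep every other frame (e.g. 25 fps -> 12.5 fps).
sub_t, sub_p = frame_t[::2], synced[::2]
```

Nearest-timestamp matching keeps the pairing error bounded by half the pressure sampling period; the actual script in the repository may instead interpolate or resample, so treat this only as a conceptual sketch.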
Model Learning for Control, Koopman Theory, Modeling, Control, and Learning for Soft Robots, Representation Learning, Visual Learning
