This video gives an overview of EMPI, the embodied musical predictive interface. This interface allows constrained call-and-response interaction with a musical machine-learning model. The performer controls a basic synth sound with one lever, and the machine-learning model responds with the same sound and a second lever driven by a servo. We tested the EMPI with three ML models: one trained on human-sourced data, one on a synthetic dataset, and one on a noise dataset as a control. We also tested the EMPI with the motorized lever enabled and disabled. This video shows example interactions under each of the six resulting conditions.