We present research that extends the scope of the mobile application Control, a prototyping environment for defining multimodal interfaces that control real-time artistic and musical performances. Control allows users to rapidly create interfaces employing a variety of modalities, including: speech recognition, computer vision, musical feature extraction, touchscreen widgets, and inertial sensor data. Information from these modalities can be transmitted wirelessly to remote applications. Interfaces are declared using JSON and can be extended with JavaScript to add complex behaviors, including the concurrent fusion of multimodal signals. By simplifying the creation of interfaces via these simple markup files, Control allows musicians and artists to make novel applications that use and combine both discrete and continuous data from the wide range of sensors available on commodity mobile devices.
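To illustrate the workflow the abstract describes, the sketch below shows what a JSON-style interface declaration with a JavaScript fusion function might look like. The field names (`title`, `widgets`, `type`, `bounds`) and the `fuse` helper are illustrative assumptions for this sketch, not the app's documented schema.

```javascript
// Hypothetical sketch of a Control-style interface declaration.
// Field names here are assumptions, not the actual Control schema.
const sliderInterface = {
  title: "tiltAndTouch",
  widgets: [
    // a continuous touchscreen widget
    { type: "Slider", name: "volume", bounds: [0, 0, 1, 0.2] },
    // a continuous inertial-sensor source
    { type: "Accelerometer", name: "tilt" }
  ]
};

// Illustrative concurrent fusion of two modality streams:
// scale a touch value by the most recent tilt reading.
function fuse(touchValue, tiltValue) {
  return touchValue * (0.5 + 0.5 * tiltValue);
}
```

In a real interface, a function like `fuse` would run on each sensor update before the combined value is sent wirelessly to the remote application.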
| Indicator | Description | Value |
| --- | --- | --- |
| selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator. | 1 |
| popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
| views | | 28 |
| downloads | | 10 |

Views and downloads provided by UsageCounts.