Please cite the primary paper at https://doi.org/10.1145/3577190.3614120 when referencing this dataset.

This dataset contains multimodal tongue gesture data as a supplement to "TongueTap: Multimodal Tongue Gesture Recognition with Head-Worn Devices", published at ICMI (International Conference on Multimodal Interfaces) 2023. The data is provided in three formats (XDF, pickle, and NumPy) at different stages of pre-processing. Please review the README included with each file before working with it, and see the paper at https://doi.org/10.1145/3577190.3614120 for more information about the data and how it was collected.

Abstract

Mouth-based interfaces are a promising new approach enabling silent, hands-free and eyes-free interaction with wearable devices. However, interfaces sensing mouth movements are traditionally custom-designed and placed near or within the mouth. TongueTap synchronizes multimodal EEG, PPG, IMU, eye tracking and head tracking data from two commercial headsets to facilitate tongue gesture recognition using only off-the-shelf devices on the upper face. We classified eight closed-mouth tongue gestures with 94% accuracy, offering an invisible and inaudible method for discreet control of head-worn devices. Moreover, we found that the IMU alone differentiates eight gestures with 80% accuracy and a subset of four gestures with 92% accuracy. We built a dataset of 48,000 gesture trials across 16 participants, allowing TongueTap to perform user-independent classification. Our findings suggest tongue gestures can be a viable interaction technique for VR/AR headsets and earables without requiring novel hardware.
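As a starting point, the sketch below shows one way to open each of the three file formats in Python. The filenames are hypothetical placeholders, not actual paths from this dataset; consult the READMEs for the real file layout and the structure of the stored objects.

```python
# Minimal loading sketch for the three distribution formats.
# All filenames below are hypothetical; see the dataset READMEs for real paths.
import pickle

import numpy as np
import pyxdf  # pip install pyxdf

# XDF: synchronized multimodal streams (EEG, PPG, IMU, eye/head tracking).
streams, header = pyxdf.load_xdf("P01_session1.xdf")  # hypothetical filename
for stream in streams:
    name = stream["info"]["name"][0]
    samples = np.asarray(stream["time_series"])
    print(f"{name}: {samples.shape}")

# Pickle: intermediate pre-processed objects (structure documented in the READMEs).
with open("P01_preprocessed.pkl", "rb") as f:  # hypothetical filename
    data = pickle.load(f)

# NumPy: arrays at a later pre-processing stage, ready for analysis.
trials = np.load("P01_trials.npy")  # hypothetical filename
print(trials.shape)
```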
Keywords: tongue gestures, tongue interface, hands-free, non-intrusive, BCI
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator, which also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
| Views | Provided by UsageCounts. | 21 |