
Abstract Every person has a pair of hands that can form sophisticated gestures carrying different semantic meanings. In practice, disabled persons can communicate with each other through hand gestures. However, without specific training, it is challenging for others to understand the meanings of these varied gestures. In this work, we propose a novel quality-aware human–computer interaction (HCI) framework for understanding sophisticated human hand gestures, whose key technique is a multi-view feature learning algorithm that optimally fuses hand silhouettes, finger positions, and hand motions. More specifically, given each human hand, we first extract multimodal visual features: hand silhouettes via background removal, finger position localization via active learning, and hand motions via optical flow. Afterward, we propose a multi-view learning-based feature fusion scheme that optimizes the multimodal features both locally and globally, from which the optimal weights of the different feature channels can be calculated. Leveraging the optimally fused features, we train a multi-class kernel machine to classify human hand gestures into categories, each carrying a particular meaning. Comprehensive experiments on a large-scale human hand gesture data set demonstrate that our method achieves highly competitive recognition accuracy with lower time consumption. Moreover, our method robustly supports an arbitrary number of semantic categories of hand gestures. Last but not least, the proposed hand gesture understanding technique can be conveniently incorporated into many state-of-the-art HCI systems.
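The fusion step described above can be sketched in miniature: each feature channel (silhouette, finger positions, motion) is assigned a weight and the weighted channels are concatenated into one fused vector that a downstream classifier consumes. The sketch below is a hypothetical simplification of the paper's local/global multi-view optimization, using a softmax over per-channel quality scores as a stand-in for the learned weights; the channel names and score values are illustrative assumptions, not the authors' implementation.

```python
import math

def channel_weights(scores):
    """Turn per-channel quality scores into normalized fusion weights
    via a softmax (a stand-in for the paper's weight optimization)."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_views(views, weights):
    """Concatenate per-channel feature vectors, scaling each channel
    by its fusion weight, to form one fused feature vector."""
    assert len(views) == len(weights)
    fused = []
    for vec, w in zip(views, weights):
        fused.extend(w * x for x in vec)
    return fused

# Toy example: three channels standing in for silhouette, finger
# positions, and motion features (values are illustrative only).
silhouette = [0.2, 0.7]
fingers = [0.9, 0.1, 0.5]
motion = [0.4]
w = channel_weights([1.0, 2.0, 0.5])  # hypothetical per-channel scores
fused = fuse_views([silhouette, fingers, motion], w)
print(len(fused))  # 6: total dimensionality of the three channels
```

In a full pipeline, the fused vector would then be fed to a multi-class kernel machine (e.g. a one-vs-rest kernel SVM) trained to map each fused feature to a gesture category.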
