
Complex hand gesture interactions among dynamic sign words can lead to misclassification, which degrades the recognition accuracy of ubiquitous sign language recognition systems. This paper proposes to augment the feature vector of dynamic sign words with knowledge of hand dynamics as a proxy, and to classify dynamic sign words from motion patterns based on the extracted feature vector. In this setting, some double-hand dynamic sign words have ambiguous or similar features along a hand motion trajectory, which leads to classification errors. Similar or ambiguous hand motion trajectories are therefore identified by approximating a probability density function over a time frame. The extracted features are then enhanced by a transformation based on maximal information correlation. These enhanced features of 3D skeletal videos captured by a Leap Motion controller are fed as a state-transition pattern to a classifier for sign word classification. To evaluate the proposed method, an experiment is performed with 10 participants on 40 double-hand dynamic American Sign Language (ASL) words, achieving 97.98% accuracy. The method is further evaluated on the challenging ASL, SHREC, and LMDHG datasets, where it outperforms conventional methods by 1.47%, 1.56%, and 0.37%, respectively.
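The pipeline described above (density-based detection of ambiguous trajectories, feature enhancement, and a bidirectional LSTM classifier over per-frame skeletal features) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: all function names, thresholds, and toy shapes are hypothetical, the kernel density estimate stands in for the paper's probability density approximation, and mutual information is used as a stand-in for the maximal-information-correlation transform.

```python
# Hedged sketch of the recognition pipeline outlined in the abstract.
# Names, shapes, and the mutual-information stand-in are assumptions.
import numpy as np
import torch
import torch.nn as nn
from scipy.stats import gaussian_kde
from sklearn.feature_selection import mutual_info_classif


def trajectory_ambiguity(traj_a: np.ndarray, traj_b: np.ndarray, grid: np.ndarray) -> float:
    """Approximate each hand-motion trajectory's density over a time frame with a
    kernel density estimate and score their overlap; a high overlap flags the pair
    as similar/ambiguous (the decision threshold would be an assumption)."""
    kde_a = gaussian_kde(traj_a.T)          # traj_*: (frames, dims)
    kde_b = gaussian_kde(traj_b.T)
    pa, pb = kde_a(grid.T), kde_b(grid.T)   # densities evaluated on shared grid points
    return float(np.minimum(pa, pb).sum() / np.maximum(pa, pb).sum())


def enhance_features(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Re-weight each skeletal feature dimension by its dependence on the class
    label; mutual information is used here in place of the paper's
    maximal-information-correlation transform."""
    w = mutual_info_classif(X.mean(axis=1), y)   # frame-averaged features, shape (clips, dims)
    w = w / (w.max() + 1e-8)
    return X * w                                  # broadcast weights over frames


class BiLSTMClassifier(nn.Module):
    """Bidirectional LSTM over the per-frame feature sequence (state-transition pattern)."""
    def __init__(self, feat_dim: int, hidden: int, n_classes: int):
        super().__init__()
        self.rnn = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.rnn(x)           # (batch, frames, 2 * hidden)
        return self.head(out[:, -1])   # classify from the final time step


if __name__ == "__main__":
    # Toy shapes only: 8 clips, 30 frames, 24-dim skeletal features, 40 sign words.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 30, 24)).astype(np.float32)
    y = np.array([0, 0, 1, 1, 2, 2, 3, 3])

    # Ambiguity score between two 3D fingertip trajectories (hypothetical slice).
    score = trajectory_ambiguity(X[0, :, :3], X[1, :, :3], X[0, :, :3])

    enhanced = enhance_features(X, y)
    model = BiLSTMClassifier(feat_dim=24, hidden=64, n_classes=40)
    logits = model(torch.from_numpy(enhanced.astype(np.float32)))
    print(round(score, 3), logits.shape)   # e.g. 0.412 torch.Size([8, 40])
```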
Keywords: American Sign Language words; bidirectional long short-term memory; computer vision; deep learning; dynamic hand gestures; leap motion controller sensor; sign language recognition; ubiquitous system; video processing
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of the article based on the underlying citation network (diachronically). | 46 |
| Popularity | Reflects the current impact/attention (the "hype") of the article in the research community at large, based on the underlying citation network. | Top 1% |
| Influence | Reflects the overall/total impact of the article in the research community at large, based on the underlying citation network (diachronically). | Top 10% |
| Impulse | Reflects the initial momentum of the article directly after its publication, based on the underlying citation network. | Top 1% |
