Published in: INTERNATIONAL JOURNA...

Multi Sign Language Recognition

Authors: Dr. Kamini Nalavade; Dr. Pallavi Baviskar; Mayank Katiyara; Hitesh Paighan; Devendra Chaudhari; Sanketgir Gosavi

Abstract

The deaf and hard-of-hearing community often faces communication barriers because most hearing people are not versed in sign language.[1] This project addresses the problem by developing a comprehensive machine learning (ML) model that interprets hand gestures in American Sign Language (ASL) and Indian Sign Language (ISL) and translates them into the corresponding spoken or written language in real time.[2] Using camera input, the system is designed to predict and translate both ASL and ISL gestures accurately. The solution integrates Natural Language Processing (NLP) for context-aware translation, allowing the system to interpret the surrounding context and produce more meaningful and accurate output. The project also explores advanced gesture-recognition techniques, including deep learning models and recurrent neural networks, to improve the system's accuracy and adaptability.[3] Multimodal data input, such as skeletal tracking and hand-shape analysis, further strengthens gesture recognition. A user-friendly interface lets users customize settings and preferences, including language and voice options.[6] In addition, the project considers integration with other assistive technologies, such as speech-to-text devices and augmented-reality (AR) equipment, to provide a comprehensive communication assistant. Ultimately, this project aims to empower the deaf and hard-of-hearing community by enabling effective communication with the hearing population and by promoting social inclusion and access. The solution is applicable in schools, workplaces, and public services, offering everyone a tool that increases accessibility and understanding.
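The abstract does not specify the recognition algorithm beyond mentioning deep learning and skeletal tracking; as an illustration of the skeletal-tracking idea, the sketch below classifies a hand-keypoint trajectory by dynamic time warping (DTW) against stored gesture templates. The function names, templates, and coordinates are hypothetical examples, not taken from the paper.

```python
# Hypothetical sketch: matching a hand-gesture trajectory (a sequence of
# (x, y) keypoints from a tracker) to labeled templates via DTW.
# All gesture data below is synthetic illustration, not from the paper.

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two keypoint sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two keypoints being aligned.
            d = ((a[i-1][0] - b[j-1][0]) ** 2 +
                 (a[i-1][1] - b[j-1][1]) ** 2) ** 0.5
            cost[i][j] = d + min(cost[i-1][j],    # skip a frame in `a`
                                 cost[i][j-1],    # skip a frame in `b`
                                 cost[i-1][j-1])  # align both frames
    return cost[n][m]

def classify(query, templates):
    """Return the label of the template nearest to `query` under DTW."""
    return min(templates, key=lambda label: dtw_distance(query, templates[label]))

# Toy templates: an upward stroke vs. a horizontal stroke.
templates = {
    "up":    [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)],
    "right": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)],
}
query = [(0.0, 0.1), (0.05, 0.6), (0.0, 0.95)]  # noisy upward stroke
print(classify(query, templates))  # → up
```

A production system of the kind the abstract describes would replace the template matcher with a trained sequence model (e.g. an LSTM over keypoint features), but DTW keeps the example self-contained and shows how temporal alignment handles gestures performed at different speeds.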

  • BIP! impact indicators (provided by BIP!):
    selected citations: 0 — citations derived from selected sources; an alternative to the "Influence" indicator, which also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    popularity: Average — reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network.
    influence: Average — reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically).
    impulse: Average — reflects the initial momentum of an article directly after its publication, based on the underlying citation network.
Open Access status: gold