Powered by OpenAIRE graph
Recolector de Ciencia Abierta, RECOLECTA
Bachelor thesis, 2017 · License: CC BY-NC-ND
Bachelor thesis, 2019 · License: CC BY-NC-ND
(2 versions)

Desarrollo de una app para personas invidentes de reconocimiento de emociones a partir de expresiones faciales (Development of an app for blind people to recognize emotions from facial expressions)

Author: Pardo Abbiatti, Nahuel Ignacio


Abstract

Thanks to advances in many fields of science and engineering (biomedicine, mechanics, computing, and others), people who live with diverse and complex disabilities have seen notable improvements in their quality of life. Prostheses, hearing aids, artificial organs, and sensors are just a few examples of how technology helps people with very different disabilities. This project focuses on a disability that affects more than 55,000 people in our country: blindness. We review the state of the art in technology aimed at this disability, its nature, and forecasts of the number of people affected. We concentrate on the field of mobile telephony: these devices are increasingly sophisticated and available to an ever larger share of the population, which makes them a crucial tool for easing the life of any person with a disability, not only the blind. The project draws on technologies strengthened over recent decades for sentiment analysis and emotion recognition; among the existing subcategories, we focus on the analysis of facial expressions and gestural events.

We develop a mobile application that uses the Affectiva libraries to create a translator of expressions, physical appearance, and facial events into speech. This allows a blind person to obtain ambient and visual information about their interlocutor which, because of their disability, would otherwise be impossible to acquire. The uses that can be derived from this tool are varied, but they all share a common point: recovering part of the information carried by body language, a form of communication that is, by definition, detached from people with this disability (except for events involving physical contact). This opens up a very broad range of possibilities. For example, in a family setting it would be possible to learn about the facial gestures our interlocutor makes when reacting to an event; in a work setting, it would be possible to learn independently about the appearance of our interlocutor and, for instance, adjust the way we express ourselves from then on. In short, the advantages concern all the information that people without severe visual disabilities can obtain from their interlocutors without a word being spoken, simply by observing their appearance and facial gestures. Given that this disability is universal and does not distinguish between nationalities or languages, the application has been translated into eight languages.

Ingeniería de Sistemas Audiovisuales
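The core idea the abstract describes (per-frame emotion scores from an expression-analysis SDK such as Affectiva's, turned into a phrase a text-to-speech engine can speak) can be sketched as follows. This is a minimal illustration, not the thesis code: the emotion labels, phrases, 0-100 score scale, and threshold are all assumptions chosen for the example.

```python
# Hypothetical mapping from detected emotion labels to spoken phrases.
SPOKEN_PHRASES = {
    "joy": "Your interlocutor is smiling.",
    "anger": "Your interlocutor looks angry.",
    "surprise": "Your interlocutor looks surprised.",
    "sadness": "Your interlocutor looks sad.",
}

def describe_expression(scores: dict, threshold: float = 50.0):
    """Pick the dominant emotion in a frame's scores (assumed 0-100 scale)
    and return the phrase to announce, or None if no score is confident
    enough to be worth interrupting the user with."""
    emotion, value = max(scores.items(), key=lambda kv: kv[1])
    if value < threshold:
        return None  # stay silent on ambiguous frames
    return SPOKEN_PHRASES.get(emotion)

# Example: a frame where joy clearly dominates.
print(describe_expression({"joy": 92.0, "anger": 3.0, "surprise": 11.0}))
# -> Your interlocutor is smiling.
```

In the real app the returned phrase would be passed to the platform's text-to-speech engine (and localized, since the application ships in eight languages); the silence threshold matters so that the blind user is not flooded with announcements on every camera frame.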

Country: Spain
Keywords

Computer science, Emotions, Software development, Visual impairment, Emotion recognition, Computational biology, Face recognition, Psychology, Artificial intelligence, Face detection

  • BIP! impact indicators:
    selected citations: 0 (citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
    popularity: Average (the "current" impact/attention, the "hype", of an article in the research community at large, based on the underlying citation network)
    influence: Average (the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
    impulse: Average (the initial momentum of an article directly after its publication, based on the underlying citation network)
  • OpenAIRE UsageCounts: 33 views, 62 downloads
Open Access route: Green