IEEE Access · Article · 2025 · Peer-reviewed · Gold Open Access · License: CC BY
Data sources: Crossref; DOAJ (2 versions)

Hands-Free UAV Control: Real-Time Eye Movement Detection Using EOG and LSTM Networks

Authors: Niloofar Zendehdel; Khosro Ghorbani Zadeh; Haodong Chen; Yun Seong Song; Ming C. Leu

Abstract

Industry 4.0 has created a growing need for effective human-robot collaboration (HRC). As robots and humans work more closely together, efficient communication becomes essential for coordinating their actions seamlessly. While speech may seem like the obvious choice for communication, noisy factory environments can render it impractical. Additionally, workers often have their hands occupied with assembly tasks, making hand-controlled interfaces less practical for controlling robots. To address these challenges, this paper presents a novel, hands-free method for robot control using electrooculography (EOG) signals—specifically, eye movements and blinks—with unmanned aerial vehicles (UAVs) used as the demonstration platform. We developed a real-time system that captures EOG signals through a graphical user interface (GUI), allowing users to direct their gaze toward on-screen commands and confirm selections with blinks. A Long Short-Term Memory (LSTM) model was trained to detect gaze coordinates and blink events from the EOG data. For performance benchmarking, we implemented a vision-based video-oculography (VOG) system as a camera-based baseline and compared the two approaches under various conditions, including low light, wind disturbances, and deployment on resource-constrained embedded hardware. Our comparative analysis showed that the EOG-based method achieved 97.6% accuracy in command detection under normal conditions and maintained strong performance across challenging environments where the vision-based VOG system experienced significant degradation. The proposed EOG method also operated in real time at 47.97 FPS on a high-performance workstation and 38.87 FPS on the resource-constrained NVIDIA Jetson Orin Nano. On this platform, it ran 13.9× faster and used 319× less physical memory than the VOG-based alternative. These results underscore the advantages of EOG-based control in delivering reliable, real-time performance on both high-performance workstations and resource-constrained platforms, outperforming vision-based alternatives in efficiency and adaptability.
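
For readers who want a concrete picture of the kind of model the abstract describes, the sketch below shows a minimal LSTM that maps a window of EOG samples to gaze coordinates and a blink probability. All specifics here are assumptions, not details from the paper: the framework (PyTorch), the two-channel horizontal/vertical EOG montage, the 250 Hz sampling rate, the window length, and the layer sizes.

# Minimal sketch (assumptions ours, not the paper's): an LSTM that maps a
# window of EOG samples to (x, y) gaze coordinates and a blink probability.
import torch
import torch.nn as nn

class EOGGazeBlinkNet(nn.Module):
    def __init__(self, n_channels: int = 2, hidden: int = 64):
        super().__init__()
        # n_channels = 2 assumes a horizontal + vertical EOG montage.
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden,
                            num_layers=2, batch_first=True)
        self.gaze_head = nn.Linear(hidden, 2)   # regressed (x, y) screen position
        self.blink_head = nn.Linear(hidden, 1)  # blink logit; sigmoid -> P(blink)

    def forward(self, x: torch.Tensor):
        # x: (batch, time, channels) window of filtered EOG samples
        out, _ = self.lstm(x)
        last = out[:, -1, :]                    # final time step summarizes the window
        return self.gaze_head(last), self.blink_head(last)

# Example: one 1-second window at a hypothetical 250 Hz sampling rate.
model = EOGGazeBlinkNet()
window = torch.randn(1, 250, 2)
gaze_xy, blink_logit = model(window)
print(gaze_xy.shape, torch.sigmoid(blink_logit).item())

In this sketch the last LSTM output summarizes the window, and two small linear heads split the task: one regresses the on-screen gaze position used to select a command, the other emits a blink logit used to confirm it. The paper's actual architecture, preprocessing, and label format may differ.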

Keywords

machine learning; electrooculography (EOG); long short-term memory (LSTM); video-oculography (VOG); human-machine interface (HMI)
Subject classification (LCC): TK1-9971 (Electrical engineering. Electronics. Nuclear engineering)

  • Impact indicators (provided by BIP!):
    Selected citations: 1 (citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
    Popularity: Average (the "current" impact/attention of an article in the research community at large, based on the underlying citation network)
    Influence: Average (the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
    Impulse: Average (the initial momentum of an article directly after its publication, based on the underlying citation network)