Powered by OpenAIRE graph
ZENODO
Conference object
Data sources: ZENODO

IntelliMan_WP5_Grasping, Manipulation and Arm-Hand Coordination_T5.1_Data fusion and sensing technology_ISCAS2025

Authors: Khalifeh, Razan; Abbass, Yahya; Yaacoub, Mohamad; Gentile, Cosimo; Gruppioni, Emanuele; Valle, Maurizio


Abstract

This study presents a sensing system composed of PVDF-based sensing arrays to recognize an object's hardness and texture. The system is mounted on the fingertip of the Hannes prosthetic hand, which acquires texture and hardness information by grasping daily-life objects. Time-domain features are extracted from the sensor responses and evaluated using a Support Vector Machine (SVM) algorithm. Additionally, two deep learning (DL) models, a One-Dimensional Convolutional Neural Network (1-D CNN) and a Long Short-Term Memory (LSTM) network, were implemented. Results demonstrate that the 1-D CNN attains the highest recognition accuracy (≈90%) for both hardness and texture. Moreover, the findings indicate that hardness information is integrated throughout the full grasp action, while texture information can be extracted from the initial contact with the object. Deploying the 1-D CNN on a microcontroller further shows that the system is energy-efficient and capable of extracting tactile information in real time. Overall, this study demonstrates the effectiveness of the proposed system in recognizing object properties, highlighting its robustness and suitability for prosthetic applications.
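The abstract mentions extracting time-domain features from the tactile sensor responses before classification with an SVM. As an illustration only, the sketch below computes four time-domain features commonly used for such signals (mean absolute value, RMS, zero crossings, waveform length); the abstract does not list the paper's actual feature set, so this choice is an assumption.

```python
import numpy as np

def time_domain_features(signal):
    """Compute illustrative time-domain features from one sensor response.

    The specific features here (MAV, RMS, zero crossings, waveform length)
    are a common choice for tactile/biosignal pipelines, not necessarily
    the ones used in the paper.
    """
    x = np.asarray(signal, dtype=float)
    mav = np.mean(np.abs(x))                    # mean absolute value
    rms = np.sqrt(np.mean(x ** 2))              # root mean square
    zc = int(np.sum(np.diff(np.sign(x)) != 0))  # zero-crossing count
    wl = np.sum(np.abs(np.diff(x)))             # waveform length
    return np.array([mav, rms, zc, wl])
```

The resulting feature vectors would then be fed to a classifier (an SVM in the study's baseline).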
