
This study presents a sensing system composed of PVDF-based sensing arrays to recognize an object’s hardness and texture. The system is mounted on the fingertip of the Hannes prosthetic hand, which obtains texture and hardness features by grasping daily-life objects. Time-domain features are extracted from the sensor responses and evaluated using a Support Vector Machine (SVM) algorithm. Additionally, two deep learning (DL) models were implemented: a One-Dimensional Convolutional Neural Network (1-D CNN) and a Long Short-Term Memory (LSTM) network. Results demonstrate that the 1-D CNN attains the highest recognition accuracy (≈90%) for both hardness and texture. Moreover, the findings indicate that hardness information is integrated throughout the full grasp action, whereas texture information can be extracted from the initial contact with the object. Deploying the 1-D CNN on a microcontroller further shows that the system is energy-efficient and capable of extracting tactile information in real time. Overall, this study demonstrates the effectiveness of the proposed system in recognizing object properties, highlighting its robustness and suitability for prosthetic applications.
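As a rough illustration of the time-domain feature extraction mentioned above, the sketch below computes a typical feature set (mean, RMS, variance, peak-to-peak amplitude, zero-crossing count) from a 1-D sensor response. The specific features, window length, and signal shape are assumptions for illustration only; the paper's actual pipeline is not reproduced here.

```python
import math

def time_domain_features(signal):
    """Compute common time-domain features from a 1-D signal.

    Returns [mean, RMS, variance, peak-to-peak, zero-crossing count].
    This feature set is a typical choice for tactile signals, not the
    exact set used in the study.
    """
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    var = sum((x - mean) ** 2 for x in signal) / n
    ptp = max(signal) - min(signal)
    # Count sign changes of the mean-removed signal (zero crossings).
    centered = [x - mean for x in signal]
    zc = sum(1 for a, b in zip(centered, centered[1:]) if a * b < 0)
    return [mean, rms, var, ptp, zc]

# Example: a synthetic, PVDF-like decaying oscillation (hypothetical data).
t = [i / 500 for i in range(500)]
resp = [math.exp(-3 * x) * math.sin(2 * math.pi * 20 * x) for x in t]
feats = time_domain_features(resp)
```

Feature vectors of this kind, computed per sensor channel over the grasp window, are what a classifier such as an SVM would consume.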
