AI-Based User Gesture Recognition for Human-Robot Interaction Using Wrist Sensors

Abstract

Natural and intuitive interaction remains a central challenge in human-robot communication. This study presents a privacy-preserving gesture recognition framework that enables command-based interaction through simple hand and arm movements. Motion data are captured using a wrist-worn sensor, eliminating the need for cameras or other intrusive tracking systems. Data from ten participants performing seven gesture classes (move forward, move backward, move left-right, spin horizontally, spin vertically, wave, and move up-down) are used to train and validate a hybrid Convolutional Neural Network-Long Short-Term Memory (CNN-LSTM) model. An additional non-command state is included to ensure the robot remains inactive when no gesture is detected. The proposed model achieves an overall recognition accuracy of approximately 95% across all gesture classes under cross-validation. This framework enhances the safety, intuitiveness, and fluidity of human-robot interaction and provides a robust foundation for gesture-based robot control and human activity recognition.
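The setup described in the abstract — a continuous wrist-sensor stream classified into seven gesture commands plus a non-command state — typically begins with sliding-window segmentation before the CNN-LSTM model is applied. The sketch below illustrates only that preprocessing idea; the window length, stride, channel count, and class names are illustrative assumptions, not values taken from the chapter.

```python
import numpy as np

# Seven command gestures plus the inactive state described in the abstract;
# the identifier spellings here are assumptions for illustration.
GESTURES = ["move_forward", "move_backward", "move_left_right",
            "spin_horizontal", "spin_vertical", "wave", "move_up_down",
            "non_command"]

def segment_windows(stream: np.ndarray, window: int = 128, stride: int = 64) -> np.ndarray:
    """Slice a (T, C) sensor stream into overlapping windows of shape (N, window, C).

    Each window would be one input sequence to a CNN-LSTM classifier.
    The window/stride values are assumed, not taken from the publication.
    """
    T = stream.shape[0]
    starts = range(0, T - window + 1, stride)
    return np.stack([stream[s:s + window] for s in starts])

# Example: 10 s of 6-axis (accelerometer + gyroscope) data at an assumed 100 Hz
stream = np.random.randn(1000, 6)
batch = segment_windows(stream)
print(batch.shape)  # → (14, 128, 6)
```

In a full pipeline, each window in `batch` would pass through convolutional layers for local feature extraction and an LSTM for temporal modeling, with an 8-way softmax over `GESTURES`; the non-command class lets the robot stay inactive between gestures.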

Category

Academic chapter

Language

English

Author(s)

  • Geir Paulsen
  • Juan Sebastian Cardenas
  • Rosa Nicoline Pham Alsgaard
  • Adel Baselizadeh
  • Md Zia Uddin
  • Jim Tørresen

Affiliation

  • SINTEF Digital / Sustainable Communication Technologies
  • University of Oslo

Year

2025

Publisher

IEEE (Institute of Electrical and Electronics Engineers)

Book

2025 11th International Conference on Robotics and Artificial Intelligence (ICRAI), December 19-21, 2025, Nagoya, Japan

ISBN

9798331590680

Page(s)

161 - 165