Dynamic Gesture Recognition for Social Robots
The Ninth International Conference on Social Robotics
Tsukuba/Japan
2017-11-22

Interpreting users' messages, both verbal and non-verbal, is essential to achieving natural Human-Robot Interaction. Traditionally, Social Robots, and particularly Care Robots, rely on interfaces such as voice, touch, or images to acquire information from users, although the latter is usually used only to locate them. This manuscript presents the main steps of a machine-learning-based approach to detecting dynamic gestures, which provide more information than simple poses: skeleton extraction, feature computation, and classification. 123 classification algorithms were evaluated to analyse the performance and accuracy of the system. To train these classifiers, 30 users were recorded while performing 14 dynamic gestures, yielding 1355 instances with 900 features each. Results indicate that the Random Forest classifier achieves the highest F-score under cross-validation.
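The paper does not publish its implementation, so the following is only an illustrative sketch of the evaluation described in the abstract: a dataset matching the reported dimensions (1355 instances, 900 features, 14 gesture classes) is stood in for by random placeholder data, and a Random Forest classifier is scored with cross-validated macro F-score. All data, hyperparameters, and library choices here are assumptions, not the authors' setup.

```python
# Illustrative sketch only: placeholder data with the dimensions reported in
# the abstract (1355 instances x 900 features, 14 gesture classes), evaluated
# with a Random Forest and cross-validated F-score as the abstract describes.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1355, 900))        # placeholder feature matrix
y = rng.integers(0, 14, size=1355)      # placeholder gesture labels

# Hyperparameters are arbitrary; the paper does not specify them.
clf = RandomForestClassifier(n_estimators=20, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="f1_macro")
print(len(scores), float(scores.mean()))
```

With random labels the mean F-score hovers near chance level; on the real skeleton-derived features it would reflect the classifier's actual gesture-discrimination ability.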

CONFERENCE PROCEEDINGS
Social Robotics. ICSR 2017. Lecture Notes in Computer Science, vol 10652.
ISBN: 978-3-319-70021-2
Publisher: Springer, Cham
First page: 495
Last page: 505
Year: 2017