
Dynamic Gesture Recognition for Social Robots


Description

Interpreting users messages, both verbal and non-verbal is essential to achieve a natural Human-Robot Interaction. Traditionally, Social Robots, and particularly Care Robots, rely on interfaces such as voice, touch or images to acquire information from users although the latter is usually used to locate them. This manuscript present the main steps of machine learning-based approach, from the skeleton extraction to the features computation and the classification necessary to detect dynamic gestures, which provide more information than simple poses. 123 classification algorithms have been employed to analyse the performance and accuracy of the system. To train these classifiers a, 30 users were recording while performing 14 dynamic gestures, obtaining 1355 instances of 900 features for each of these. Results indicate that Random Forest classifier achieves the highest F-score using cross-validation.