HRI ’17 - Late Breaking Report
Vienna, Austria
2017-03-06
Perceiving when, where, and how a robot is touched is an important step toward natural Human-Robot Interaction (HRI). To date, several technologies have been used in Social Robotics to determine the area where a touch is performed, in some cases requiring many sensors. Moreover, most approaches do not address the type of touch performed.
In this paper, we introduce a novel technique based on audio analysis and machine learning. We present a proof of concept intended to offer several advantages over state-of-the-art touch technologies for HRI: cost-efficiency, since only a few microphones can cover the robot shell completely; robustness, as microphones are not affected by electromagnetic interference or by external sounds; and accuracy, as suggested by the preliminary results.
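The pipeline described above (microphone signal in, touch type out) can be sketched minimally. The feature set (energy, zero-crossing rate, duration), the nearest-centroid classifier, and the synthetic "tap"/"stroke" signals below are illustrative assumptions, not the authors' actual implementation:

```python
# Hypothetical sketch of audio-based touch classification.
# Features, classifier, and signals are assumptions for illustration.
import numpy as np

def extract_features(signal, sr=16000):
    """Three simple features: energy, zero-crossing rate, duration (s)."""
    energy = float(np.mean(signal ** 2))
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal)))) / 2)
    duration = len(signal) / sr
    return np.array([energy, zcr, duration])

class NearestCentroid:
    """Minimal stand-in for an off-the-shelf classifier."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {
            label: np.mean([x for x, yy in zip(X, y) if yy == label], axis=0)
            for label in self.labels_
        }
        return self

    def predict(self, x):
        return min(self.labels_,
                   key=lambda label: np.linalg.norm(x - self.centroids_[label]))

rng = np.random.default_rng(0)

def synth_tap(sr=16000):
    # Short, energetic impulse followed by silence.
    s = np.zeros(sr // 10)
    s[:200] = rng.normal(0.0, 1.0, 200)
    return s

def synth_stroke(sr=16000):
    # Longer, low-amplitude rubbing-like noise.
    return rng.normal(0.0, 0.05, sr // 2)

# Train on synthetic examples of each touch type, then classify new ones.
X = [extract_features(synth_tap()) for _ in range(10)] + \
    [extract_features(synth_stroke()) for _ in range(10)]
y = ["tap"] * 10 + ["stroke"] * 10
clf = NearestCentroid().fit(X, y)
print(clf.predict(extract_features(synth_tap())))
print(clf.predict(extract_features(synth_stroke())))
```

In a real system the synthetic signals would be replaced by recordings from microphones mounted on the robot shell, and the hand-picked features by a richer spectral representation.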
ISBN: 978-1-4503-4885-0/17/03
Publisher: ACM
Year: 2017