Updated: 13 May 2015 - 12:05am by C. Balaguer
Start:2011 / End:2012
Principal investigator: Alberto Jardón Huete
Comunidad de Madrid - UC3M
Safe and multimodal cooperation with robotic assistants for people with special needs
A new kind of robotics-empowered living facility must provide services covering the sensory, communication, and power requirements of the robotic devices operating inside it. Ideally, these robots can operate intelligently in unstructured indoor environments.
An important part of developing usable assistive robots for elderly and disabled people is enabling multimodal Human-Robot Interaction (HRI). However, the overall human-machine system is complex: the user and the robot operate in a closed loop, and both are potentially capable of adapting to the other. The work presented here approaches the problem from three perspectives, investigating methods for analyzing, implementing, and testing an enabling multimodal interface for the ASIBOT. The main purpose of multimodality here is to reduce inaccurate interpretation of user intentions. A reliable, human-centered software architecture accepts high-level commands from the different modalities; that is, the user simultaneously coordinates different modalities so as to make his or her intention clearer to the system. Safety and dependability are the underlying evaluation criteria for the new mechanical designs, actuation, and control architectures that deliver this performance.
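The idea of combining modalities to clarify user intention can be illustrated with a minimal late-fusion sketch. The modality names, candidate commands, weights, and threshold below are illustrative assumptions, not the actual ASIBOT interface.

```python
# Minimal sketch of late-fusion multimodal intent interpretation.
# Modality names, commands, weights, and the threshold are hypothetical,
# chosen only to illustrate how redundant modalities can disambiguate
# a user's intention.

def fuse_intents(modality_scores, weights, threshold=0.5):
    """Combine per-modality confidence scores over candidate commands.

    modality_scores: dict of modality name -> dict of command -> score in [0, 1]
    weights: dict of modality name -> relative reliability weight
    Returns the winning command, or None if no command is confident enough.
    """
    combined = {}
    total_weight = sum(weights[m] for m in modality_scores)
    for modality, scores in modality_scores.items():
        w = weights[modality] / total_weight
        for command, score in scores.items():
            combined[command] = combined.get(command, 0.0) + w * score
    best = max(combined, key=combined.get)
    return best if combined[best] >= threshold else None

# Example: the voice channel is ambiguous between two commands,
# but a joystick gesture disambiguates in favour of "move_to_kitchen".
scores = {
    "voice": {"move_to_kitchen": 0.55, "move_to_bedroom": 0.45},
    "joystick": {"move_to_kitchen": 0.9, "move_to_bedroom": 0.1},
}
weights = {"voice": 1.0, "joystick": 1.0}
print(fuse_intents(scores, weights))  # move_to_kitchen
```

A weighted-sum fusion is only one possible design choice; the point is that two noisy channels agreeing on the same command yield a combined score that crosses the acceptance threshold, whereas either channel alone might not.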