Template-Based Recognition of Pose and Motion Gestures on a Mobile Robot

Stefan Waldherr, Sebastian Thrun, Roseli Romero, Dimitris Margaritis

For mobile robots to assist people in everyday life, they must be easy to instruct. This paper describes a gesture-based interface for human-robot interaction, which enables people to instruct robots through easy-to-perform arm gestures. Such gestures might be static pose gestures, which involve only a specific configuration of the person’s arm, or they might be dynamic motion gestures (such as waving). Gestures are recognized in real time at approximately frame rate, using a hybrid approach that integrates neural networks and template matching. A fast, color-based tracking algorithm enables the robot to track and follow a person reliably through office environments with drastically changing lighting conditions. Results are reported in the context of an interactive clean-up task, in which a person guides the robot to specific locations that need to be cleaned, and the robot picks up trash, which it then delivers to the nearest trash bin.
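The abstract describes the color-based tracker only at a high level. The sketch below is a minimal illustration of the general idea behind such trackers (segmenting the pixels that lie near a tracked color and following their centroid from frame to frame); it is not the authors' implementation, and the function name, color model, and threshold are assumptions made purely for illustration.

import numpy as np

def track_colored_target(frame, target_color, threshold=40.0):
    """Return the (row, col) centroid of pixels close to target_color.

    frame:        H x W x 3 RGB image (uint8), e.g. one camera frame.
    target_color: length-3 RGB value of the tracked color (e.g. the person's shirt).
    threshold:    maximum Euclidean color distance counted as a match (assumed value).
    Returns None if no pixel matches.
    """
    diff = frame.astype(np.float32) - np.asarray(target_color, dtype=np.float32)
    dist = np.linalg.norm(diff, axis=2)   # per-pixel distance in color space
    mask = dist < threshold               # binary segmentation of candidate pixels
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()       # centroid used as the tracked position

# Usage example on a synthetic frame containing a red patch.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[40:60, 70:90] = (200, 30, 30)
print(track_colored_target(frame, (200, 30, 30)))

In a real system the centroid would be fed to the robot's motion controller so that it keeps the person centered in the camera image while following; the paper additionally adapts to changing lighting conditions, which this sketch does not attempt.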
