Interacting with a Pet Robot Using Hand Gestures

Milyn C. Moy, MIT Artificial Intelligence Lab and Oracle Corporation

This work focuses on the real-time, visual interpretation of 2D dynamic hand gestures in complex environments. Our goal is to enable humans to communicate and interact with Yuppy, a pet robot being developed at the MIT AI Lab. The gesture lexicon consists of a set of 2D gesture classes (primitives) that include linear (vertical, horizontal, and diagonal) as well as circular (clockwise and counterclockwise) gestures.
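The lexicon described above can be sketched as a small data structure. This is an illustrative reconstruction, not the paper's implementation: the enum and set names (`GesturePrimitive`, `LINEAR`, `CIRCULAR`) are assumed for the example, and the five primitives listed follow the classes named in the abstract.

```python
from enum import Enum, auto

class GesturePrimitive(Enum):
    """2D dynamic gesture classes, as named in the abstract (names assumed)."""
    VERTICAL = auto()          # linear: up/down stroke
    HORIZONTAL = auto()        # linear: left/right stroke
    DIAGONAL = auto()          # linear: diagonal stroke
    CLOCKWISE = auto()         # circular: clockwise loop
    COUNTERCLOCKWISE = auto()  # circular: counterclockwise loop

# Grouping into the two families the abstract distinguishes.
LINEAR = {GesturePrimitive.VERTICAL,
          GesturePrimitive.HORIZONTAL,
          GesturePrimitive.DIAGONAL}
CIRCULAR = {GesturePrimitive.CLOCKWISE,
            GesturePrimitive.COUNTERCLOCKWISE}

def family(g: GesturePrimitive) -> str:
    """Return which family a recognized primitive belongs to."""
    return "linear" if g in LINEAR else "circular"
```

A recognizer would map an observed hand trajectory to one of these five classes; the `family` helper simply reflects the linear/circular split stated in the text.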
