AAAI Publications, Workshops at the Twenty-Fifth AAAI Conference on Artificial Intelligence

Human Activity Detection from RGBD Images
Jaeyong Sung, Colin Ponce, Bart Selman, Ashutosh Saxena

Last modified: 2011-08-24

Abstract


Being able to detect and recognize human activities is important for making personal assistant robots useful in performing assistive tasks. The challenge is to develop a system that is low-cost, reliable in unstructured home settings, and straightforward to use. In this paper, we use an RGBD sensor (Microsoft Kinect) as the input sensor, and present learning algorithms to infer the activities. Our algorithm is based on a hierarchical maximum entropy Markov model (MEMM). It models a person's activity as composed of a set of sub-activities, and infers the two-layered graph structure using a dynamic programming approach. We test our algorithm on detecting and recognizing twelve different activities performed by four people in different environments, such as a kitchen, a living room, and an office, and achieve an average performance of 84.3% when the person was seen before in the training set (and 64.2% when the person was not seen before).
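To make the dynamic-programming inference mentioned above concrete, the sketch below shows a Viterbi-style decode over MEMM scores. It is a minimal, hypothetical illustration rather than the authors' implementation: the function name, array shapes, and random demo data are assumptions. In an MEMM, each transition score would come from a per-frame maximum-entropy classifier conditioned on the RGBD observation; the paper's hierarchical model adds a second layer that groups the decoded sub-activity states into an activity label.

```python
import numpy as np

def memm_decode(init_logp, trans_logp):
    """Viterbi-style dynamic programming over MEMM scores (illustrative sketch).

    init_logp:  (S,)        log P(state_0 | obs_0) for S sub-activity states
    trans_logp: (T-1, S, S) log P(state_t = j | state_{t-1} = i, obs_t),
                            i.e. scores a maximum-entropy classifier would give per frame
    Returns the highest-scoring sub-activity sequence of length T.
    """
    T = trans_logp.shape[0] + 1
    S = init_logp.shape[0]
    score = init_logp.copy()                 # best log-score of any path ending in each state
    backptr = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        # cand[i, j]: best path ending in state i at t-1, extended to state j at t
        cand = score[:, None] + trans_logp[t - 1]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # Backtrack from the best final state to recover the full sequence.
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    S, T = 4, 10                             # e.g. 4 sub-activity states over 10 frames
    init = np.log(rng.dirichlet(np.ones(S)))
    trans = np.log(rng.dirichlet(np.ones(S), size=(T - 1, S)))
    print(memm_decode(init, trans))
```

In the two-layered setting described in the abstract, one way to use this building block would be to run the same dynamic program over sub-activity states within each segment and then score activity labels over the resulting segments, though the exact factorization is the paper's, not this sketch's.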
