IPARS: Intelligent Portable Activity Recognition System via Everyday Objects, Human Movements, and Activity Duration

Chi-Yau Lin, Yung-jen Hsu

In this paper we define a general framework for activity recognition by building upon and extending the Lempel-Ziv multiway tree structure, known as a trie, to model human activities of daily living (ADLs). Our activity recognition system operates online. We use a wrist-worn Radio Frequency Identification (RFID) reader to detect everyday objects, and a WiFi positioning system installed in our experimental environment to capture the user's current position. Our activity models are formulated by translating labeled activities (such as grooming) into probabilistic collections of sequences of action steps (such as brushing teeth → combing hair), where each action step (such as making tea) is composed of a sequence of object-handling terms (such as cup → teabag → electric air pot → teaspoon) and human movements (bedroom → kitchen). RFID tag readings, WiFi signals, and the passage of time together directly yield the state of the physical world. We experimentally validate our approach using data gathered from actual human activity.
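The hierarchy described above (activity → action steps → object-handling sequences) can be sketched as a trie whose nodes count how often each activity label produced a given prefix of observations. This is a minimal illustrative sketch, not the authors' implementation; the class and method names (`ActivityTrie`, `insert`, `classify`) are hypothetical, and the probabilistic machinery of the paper is reduced here to simple frequency counts.

```python
class TrieNode:
    """One node of the trie; each edge is an observed term (object or movement)."""

    def __init__(self):
        self.children = {}  # term -> TrieNode
        self.counts = {}    # activity label -> times this prefix was observed


class ActivityTrie:
    """Toy trie over observation sequences, labeled with activities.

    Hypothetical sketch: real systems would use smoothed probabilities
    rather than raw counts.
    """

    def __init__(self):
        self.root = TrieNode()

    def insert(self, sequence, label):
        """Record one labeled observation sequence, updating prefix counts."""
        node = self.root
        for term in sequence:
            node = node.children.setdefault(term, TrieNode())
            node.counts[label] = node.counts.get(label, 0) + 1

    def classify(self, sequence):
        """Follow the longest matching prefix and return the most frequent label."""
        node = self.root
        for term in sequence:
            if term not in node.children:
                break
            node = node.children[term]
        if not node.counts:
            return None
        return max(node.counts, key=node.counts.get)


trie = ActivityTrie()
trie.insert(["cup", "teabag", "electric air pot", "teaspoon"], "making tea")
trie.insert(["toothbrush", "toothpaste", "comb"], "grooming")
print(trie.classify(["cup", "teabag"]))  # → making tea
```

A partial observation sequence is classified by its longest matching prefix, which is what lets recognition proceed online as objects are detected one at a time.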

Subjects: 1.6.1 Automated Device Modeling

Submitted: May 17, 2006


This page is copyrighted by AAAI. All rights reserved.