Source: Sensors, Signals and Information Processing Workshop, Sedona, AZ (2008)
In this paper, we describe a real-time system for detecting and recognizing lower-body activities (walking, sitting, standing, running, and lying down) using streaming data from tri-axial accelerometers. While there have been various attempts to solve this problem, our system is distinguished by its use of a minimal set of sensors and its real-time operation. We have divided the system into three components: preprocessing, feature extraction, and classification. This paper describes each component and addresses the placement of the sensors on the human body. We also discuss the elementary signal processing techniques we experimented with to extract salient features from the sensory stream, bearing in mind the computational cost of each method. For classification we used the AdaBoost algorithm built on decision stumps, and our system recognizes each of the five activities with 95% accuracy.
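To make the pipeline concrete, the following is a minimal sketch of two of its stages: extracting simple per-window features from a tri-axial accelerometer stream, and training a weighted decision stump, the weak learner that AdaBoost combines. The window length, the mean/standard-deviation feature set, and the synthetic "standing vs. walking" data are illustrative assumptions, not the paper's exact configuration.

```python
import math
import random

def window_features(ax, ay, az):
    """Mean and standard deviation per axis over one window (6 features).
    A deliberately cheap feature set, in keeping with the paper's concern
    for computational cost; the actual features used may differ."""
    feats = []
    for axis in (ax, ay, az):
        mean = sum(axis) / len(axis)
        var = sum((v - mean) ** 2 for v in axis) / len(axis)
        feats += [mean, math.sqrt(var)]
    return feats

def train_stump(X, y, w):
    """Exhaustively pick the (feature, threshold, polarity) that minimizes
    weighted error -- the weak-learner step inside one AdaBoost round."""
    best = None
    for f in range(len(X[0])):
        for thr in sorted({x[f] for x in X}):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (pol if xi[f] >= thr else -pol) != yi)
                if best is None or err < best[0]:
                    best = (err, f, thr, pol)
    return best

# Synthetic windows: "standing" (low variance, label -1) vs.
# "walking" (high variance, label +1). 32 samples per window.
random.seed(0)
def win(std):
    return [random.gauss(0, std) for _ in range(32)]

X, y = [], []
for _ in range(20):
    X.append(window_features(win(0.05), win(0.05), win(0.05))); y.append(-1)
    X.append(window_features(win(1.0), win(1.0), win(1.0))); y.append(+1)

w = [1.0 / len(X)] * len(X)          # uniform weights: AdaBoost round 1
err, f, thr, pol = train_stump(X, y, w)
print("weighted error:", err, "feature:", f)
```

In a full AdaBoost loop, the sample weights `w` would then be increased on misclassified windows and a new stump trained, for a fixed number of rounds; here the variance features separate the two synthetic classes, so a single stump already achieves zero weighted error.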
Traditional approaches to human activity recognition, relying on vision as the primary sensory medium, have met with little success. The emergence of the ubiquitous and pervasive computing paradigm has ushered in new low-bandwidth, wearable, unobtrusive, inexpensive and…