Wearable Sensor-based Activity Recognition
Contact
Members
Dr. Sethuraman "Panch" Panchanathan
Traditional approaches to human activity recognition that rely on vision as the primary sensory medium have met with limited success. The emergence of the ubiquitous and pervasive computing paradigm has ushered in low-bandwidth, wearable, unobtrusive, inexpensive, and easily deployable sensors such as accelerometers, gyroscopes, and RFID tags for human activity recognition.
The low-level processing of the accelerometer data stream is an interesting multimedia and pattern recognition problem. We are investigating the data stream to identify discriminative features that help distinguish between the different action patterns. In addition to standard statistical and spectral features based on the Fourier transform, we are studying the properties of the acceleration data stream under other signal processing techniques, such as the Wigner distribution and the wavelet transform, to identify features with good discriminative capability. We believe that a suitable time-frequency representation of the signal will be crucial in distinguishing between the different action patterns.
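To make the feature-extraction step concrete, the following is a minimal sketch of windowed feature computation on a tri-axial accelerometer stream, combining time-domain statistics, Fourier-based spectral features, and wavelet-decomposition energies. The window length, overlap, sampling rate, and wavelet family are illustrative assumptions, not values from the project.

```python
# Sketch: per-window feature extraction from a tri-axial accelerometer stream.
# Window size, step, sampling rate, and wavelet family are assumed values.
import numpy as np
import pywt  # PyWavelets, used here for the wavelet-based features


def extract_features(window, fs=50.0):
    """Statistical, spectral, and wavelet features for one window.

    window : ndarray of shape (n_samples, 3) -- x, y, z acceleration
    fs     : sampling rate in Hz (assumed)
    """
    feats = []
    for axis in range(window.shape[1]):
        sig = window[:, axis]

        # Statistical features in the time domain.
        feats += [sig.mean(), sig.std(), np.abs(sig).mean()]

        # Spectral features from the Fourier transform: dominant frequency
        # and total spectral energy (DC component removed).
        spectrum = np.abs(np.fft.rfft(sig - sig.mean())) ** 2
        freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
        feats += [freqs[np.argmax(spectrum)], spectrum.sum()]

        # Wavelet features: energy of each detail level of a multi-level
        # decomposition, a simple stand-in for a time-frequency representation.
        coeffs = pywt.wavedec(sig, 'db4', level=3)
        feats += [np.sum(c ** 2) for c in coeffs[1:]]

    return np.array(feats)


def sliding_windows(stream, win=128, step=64):
    """Split a (n_samples, 3) stream into overlapping windows."""
    for start in range(0, len(stream) - win + 1, step):
        yield stream[start:start + win]
```

Each window then yields one feature vector, which a standard classifier can map to an action pattern; the comparative value of the Wigner distribution and other time-frequency representations is exactly what the project is investigating.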
We are also developing a multimodal sensory framework in which vision sensors are supplemented with other low-bandwidth sensors such as accelerometers and RFID tags. We propose to use this framework to detect the objects the user is interacting with, along with the associated motion patterns. Bayesian networks that fuse the results obtained from the different data streams are one option we are considering for this multimodal sensory framework. We also plan to develop activity models that infer activities from information about the objects the user is interacting with, together with the motion patterns.
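The sketch below illustrates the fusion idea under a simple naive-Bayes assumption: object evidence from RFID tags and motion-pattern evidence from the accelerometer are treated as conditionally independent given the activity. The activities, objects, motion labels, and probability tables are illustrative placeholders, not models from the project.

```python
# Sketch: naive-Bayes fusion of RFID object evidence and accelerometer motion
# evidence into a posterior over activities. All tables are assumed examples.
import numpy as np

ACTIVITIES = ["drinking", "brushing_teeth", "writing"]

# P(activity): prior over activities (assumed uniform).
prior = {a: 1.0 / len(ACTIVITIES) for a in ACTIVITIES}

# P(object | activity): likelihood of observing an RFID-tagged object.
p_object = {
    "drinking":       {"cup": 0.8, "toothbrush": 0.1, "pen": 0.1},
    "brushing_teeth": {"cup": 0.3, "toothbrush": 0.6, "pen": 0.1},
    "writing":        {"cup": 0.1, "toothbrush": 0.1, "pen": 0.8},
}

# P(motion | activity): likelihood of the motion pattern recognised
# from the accelerometer stream.
p_motion = {
    "drinking":       {"raise_arm": 0.7, "oscillate": 0.2, "still": 0.1},
    "brushing_teeth": {"raise_arm": 0.2, "oscillate": 0.7, "still": 0.1},
    "writing":        {"raise_arm": 0.1, "oscillate": 0.3, "still": 0.6},
}


def infer_activity(observed_object, observed_motion):
    """Posterior over activities given one object and one motion observation."""
    post = np.array([
        prior[a] * p_object[a][observed_object] * p_motion[a][observed_motion]
        for a in ACTIVITIES
    ])
    post /= post.sum()  # normalise to a proper distribution
    return dict(zip(ACTIVITIES, post))


print(infer_activity("toothbrush", "oscillate"))
# The posterior concentrates on "brushing_teeth", since both evidence
# streams point toward that activity.
```

A full Bayesian network would replace these hand-set tables with learned conditional distributions and allow richer dependencies between the data streams, but the same fusion principle applies.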
Funding Sources
National Science Foundation
CUbiC Technology and Research Initiative Fund