The rapid advance of technology and the widespread adoption of modern technological equipment have resulted in the generation of large quantities of digital data. This has expanded the possibilities for solving real-world problems using computational learning frameworks. However, while gathering large quantities of unlabeled data is cheap and easy, annotating it with class labels entails significant human labor. The objective of this project is to develop a batch mode active learning...
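The core idea of batch mode active learning can be illustrated with a minimal sketch: rather than labeling samples at random, query the annotator with a batch of the unlabeled samples the current model is least certain about. The entropy-based criterion and batch size below are illustrative assumptions, not the project's actual selection strategy.

```python
import numpy as np

def select_batch(probs, batch_size):
    """Pick the batch_size most uncertain unlabeled samples.

    probs: (n_samples, n_classes) array of predicted class probabilities.
    Uncertainty is measured here by predictive entropy (one common choice).
    Returns the indices of the samples to send to the human annotator.
    """
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    return np.argsort(entropy)[::-1][:batch_size]  # highest entropy first

# Toy example: four unlabeled samples, two classes.
probs = np.array([[0.90, 0.10],
                  [0.50, 0.50],   # maximally uncertain
                  [0.60, 0.40],   # fairly uncertain
                  [0.99, 0.01]])  # nearly certain
print(select_batch(probs, 2))  # → [1 2]
```

A full batch-mode method would also encourage diversity within the batch (so the queried samples are not redundant); this sketch shows only the uncertainty half of that trade-off.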
Studying arm motion data from user studies conducted with the Wiimote. Performing pattern analysis in MATLAB to detect non-compliance with exercise routines. Developing pressure- and force-based physiological sensors to detect fatigue during exercise routines.
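One way to flag non-compliance in a recorded motion trace is to compare each repetition against a reference template with dynamic time warping (DTW), which tolerates differences in speed but penalizes differences in shape. The sketch below (in Python rather than MATLAB, and with made-up signals) is an illustrative assumption about the approach, not the project's actual analysis pipeline.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D motion traces."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Synthetic traces standing in for arm-angle recordings.
template = np.sin(np.linspace(0, 2 * np.pi, 50))        # reference repetition
good = np.sin(np.linspace(0, 2 * np.pi, 60))            # same motion, slower
bad = 0.2 * np.sin(np.linspace(0, 2 * np.pi, 50))       # too shallow a motion

# A slower but correctly shaped repetition scores much closer to the
# template than a shallow one, which would be flagged as non-compliant.
print(dtw_distance(template, good) < dtw_distance(template, bad))  # → True
```

In practice the threshold separating compliant from non-compliant repetitions would be calibrated from the user-study data rather than fixed by hand.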
Effective communication requires a shared context. In face-to-face interactions, this shared context includes the number and location of the people present, along with the facial expression, head pose, eye contact, and movements of each person engaged in the conversation. Faces serve an essential function in social interactions by providing cues to attention, understanding, and emotion. Faces also provide contextual information such as identity, age, family origin, and personal health. In face-to-face communication...
Every day, our bodies perform complex motor movements, even during the simplest physical tasks, often without conscious thought. The movements involved in new motor skills are typically learned through audiovisual instruction, sometimes with direct physical contact from an instructor. However, this type of instruction is limited when vision and/or hearing are unavailable, whether because of disability (visual and/or hearing impairments) or because of limited attentional resources in noisy...
Individuals with cognitive, developmental, and learning disabilities, in particular Autism Spectrum Disorders (ASD), often have significant social communication impairments that can lead to social isolation. Reducing the need for formal care services for individuals with ASD through intelligent technologies could save $90 billion annually in the US (a figure expected to double by 2017) and relieve caregivers of the “free care” they provide at the sacrifice of their own time, social lives...
Motivated by this vision of the future, automated analysis of nonverbal behavior, and especially of facial behavior, has attracted increasing attention in Computer Vision, Pattern Recognition, and Human-Computer Interaction. Because facial expression is one of the most effective channels of implicit communication, facial expression analysis lies at the core of most affective computing technologies. Commonly used facial expression descriptors in affective computing approaches are the six basic...
The VibroGlove is a novel interface for enhancing interpersonal interaction. Specifically, the device is designed as an assistive aid that delivers the facial expressions of an interaction partner to people who are blind or visually impaired. (This device is being used, along with the haptic belt, in the Social Interaction Assistant research project.) Vibration motors mounted on the back of a glove provide a means of conveying haptic emoticons that represent the six basic human...
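The idea of a haptic emoticon can be sketched as a lookup from a recognized expression to a sequence of motor pulses. The motor indices, durations, and the expression-to-pattern mapping below are entirely hypothetical; the VibroGlove's actual vibrotactile encoding is not specified here.

```python
# Hypothetical mapping from the six basic facial expressions to
# vibrotactile patterns on the back of a glove. Each pattern is a
# list of (motor_index, duration_ms) pulses; these values are
# illustrative placeholders, not the VibroGlove's real encoding.
HAPTIC_EMOTICONS = {
    "happiness": [(0, 100), (1, 100), (2, 100)],            # sweep across motors
    "sadness":   [(2, 100), (1, 100), (0, 100)],            # reverse sweep
    "surprise":  [(0, 50), (1, 50), (2, 50), (3, 50)],      # quick full sweep
    "anger":     [(3, 200), (3, 200)],                      # repeated long pulse
    "fear":      [(0, 50), (3, 50), (0, 50)],               # alternating pulses
    "disgust":   [(1, 150), (2, 150)],                      # central pulses
}

def haptic_pattern(expression):
    """Return the pulse sequence for a recognized expression,
    or an empty sequence when the expression is unrecognized."""
    return HAPTIC_EMOTICONS.get(expression, [])

print(haptic_pattern("surprise"))  # → [(0, 50), (1, 50), (2, 50), (3, 50)]
```

On real hardware, each (motor, duration) pair would drive a vibration motor via a microcontroller; the dictionary keeps the expression-to-pattern mapping easy to tune in user studies.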