Social Interaction Assistant
Effective communication requires a shared context. In face-to-face interactions, part of this shared context comes from the number and location of the people engaged in a conversation, along with each person's facial expressions, head pose, eye contact, and movements. Faces serve an essential function in social interactions by providing cues to attention, understanding, and emotion. Faces also provide contextual information such as identity, age, family origin, and personal health. In face-to-face communication, people with visual impairments are not privy to this visual information, nor are they necessarily aware of their own facial expressions. This lack of access to facial information negatively impacts employment opportunities and tends to socially isolate people who are visually impaired.
The goal of the Social Interaction Assistant project is to provide a user who is visually impaired with information that can improve their social interactions with sighted counterparts. The system consists of a camera mounted on a pair of sunglasses worn by the user. The video from the camera is analyzed to extract information that can help the user better understand other people in their surroundings. An important goal of the system is to leverage the capabilities of the user and provide only information that is not already available through other means. The work also concentrates on developing covert delivery mechanisms that allow the user to interpret the social information without interfering with the social interaction itself.
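The selection principle described above — deliver only cues the user cannot already perceive — can be sketched as a simple filtering step. This is a minimal illustrative model, not the project's actual implementation; the `SocialCue` structure, field names, and the `audible` flag are assumptions introduced here for the example.

```python
from dataclasses import dataclass

@dataclass
class SocialCue:
    """One piece of social information extracted from a video frame.
    (Hypothetical structure; not the project's real data model.)"""
    person_id: int
    kind: str      # e.g. "expression", "head_pose", "eye_contact"
    value: str
    audible: bool  # True if the cue is already conveyed non-visually, e.g. by speech

def select_cues_to_deliver(cues):
    """Leverage the user's own senses: suppress cues that are
    already available through non-visual channels."""
    return [c for c in cues if not c.audible]

# Example frame: the user can already hear that person 1 is speaking,
# so only the purely visual cues are passed on for delivery.
cues = [
    SocialCue(1, "expression", "smiling", audible=False),
    SocialCue(1, "speaking", "yes", audible=True),
    SocialCue(2, "eye_contact", "looking toward user", audible=False),
]
delivered = select_cues_to_deliver(cues)
```

In this sketch, `delivered` retains only the smiling-expression and eye-contact cues; a real system would feed such a filtered list into one of the covert delivery mechanisms (for example, haptic feedback) rather than printing or announcing it.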
National Science Foundation (NSF) Award 1116360;
CUbiC Technology and Research Initiative Fund