Audio-Haptic Description in Movies

Publication Type:

Conference Paper

Authors:

L.N. Viswanathan, T. McDaniel, S. Panchanathan

Source:

HCI International 2011, Orlando, Florida (2011)

Abstract:

This paper proposes a methodology to enhance audio-described movies (i.e., films augmented with additional narration to explain visual content to viewers who are blind or visually impaired) by providing positional information about on-screen actors through haptics. Using a vibrotactile belt, we map a character's location across the screen to a relative location around the waist, and the character's relative distance from the camera to a tactile rhythm. Character movement is then conveyed through these two dimensions. All participants, including one visually impaired subject, felt the vibrations improved their visualization of the clips. This subject also found it easy to combine the information received through audio and haptics, and reported that the vibrations did not interfere with attention to the audio.
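The abstract's two mappings can be sketched in code. The motor count, depth categories, and rhythm timings below are illustrative assumptions; the paper does not specify these parameters here.

```python
def screen_to_motor(x_norm: float, num_motors: int = 8) -> int:
    """Map a character's horizontal screen position (0.0 = left edge,
    1.0 = right edge) to a vibration motor index on a waist-worn belt.
    The number of motors is a hypothetical choice, not from the paper."""
    x_norm = min(max(x_norm, 0.0), 1.0)          # clamp to the screen
    return min(int(x_norm * num_motors), num_motors - 1)


def distance_to_rhythm(depth: str) -> tuple:
    """Map a coarse character-to-camera distance to a tactile rhythm,
    expressed as (pulse_on_s, pulse_off_s). The idea that nearer
    characters pulse faster is one plausible encoding, assumed here."""
    rhythms = {
        "near": (0.1, 0.1),   # rapid pulsing: character close to camera
        "mid":  (0.2, 0.3),
        "far":  (0.3, 0.6),   # slow pulsing: character far from camera
    }
    return rhythms[depth]
```

A character at mid-screen, for example, would activate a motor near the front of the belt, and movement toward the camera would be felt as an accelerating pulse on that motor.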

Authors

Lakshmie Narayan Viswanathan

Masters Student Researcher

Dr. Troy L. McDaniel

Assistant Professor, The Polytechnic School; Director, HAPT-X Laboratory; Co-Director, Center for Cognitive Ubiquitous Computing (CUbiC); Co-PI, NSF-NRT grant program, Citizen-Centered Smart Cities and Smart Living

Dr. Sethuraman "Panch" Panchanathan

Director, National Science Foundation