Modeling Context in Haptic Perception, Rendering, and Visualization

Publication Type:

Conference Paper


K. Kahol, P. Tripathi, T. McDaniel, S. Panchanathan


ACM International Workshop on Multimedia Information Systems, Sorrento, Italy (2005)


Haptic perception refers to the ability of human beings to perceive spatial properties through touch-based sensations. In haptics, contextual cues about the material, shape, size, texture, and weight configurations of an object are perceived by individuals, leading to recognition of the object and its spatial features. In this paper, we present strategies and algorithms to model context in haptic applications that allow users to haptically explore objects in virtual reality/augmented reality environments. Initial results show significant improvement in the accuracy and efficiency of haptic perception in augmented reality environments when compared to conventional approaches that do not model context in haptic rendering.


Dr. Troy L. McDaniel

Assistant Professor, The Polytechnic School; Director, HAPT-X Laboratory; Co-Director, Center for Cognitive Ubiquitous Computing (CUbiC); Co-PI, NSF-NRT grant program, Citizen-Centered Smart Cities and Smart Living

Dr. Sethuraman "Panch" Panchanathan

Director, National Science Foundation