Humans have the uncanny ability to estimate how an object feels, in terms of its shape, size, texture, material, etc., entirely from its visual image. From a biological standpoint, algorithms that estimate haptic features from images mimic the human ability to transfer knowledge from one perceptual modality (vision) to another (touch). These algorithms, known as visio-haptic transfer algorithms, can estimate haptic information from visual data at either a physical level or a perceptual level. Physical visio-haptic transfer algorithms perform 3D reconstruction to create virtual, tangible models of environments or objects, which users may then explore haptically. Perceptual visio-haptic transfer algorithms, on the other hand, classify haptic features of objects or surfaces, e.g., texture or shape, into pre-determined perceptual categories; for example, a surface's texture may be classified as smooth, rough, or bumpy. Hence, perceptual visio-haptic transfer algorithms provide information about objects or surfaces at a conceptual level, enabling real-time user perception.
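To illustrate the perceptual level of visio-haptic transfer, the following minimal sketch maps a simple image statistic to one of the pre-determined texture categories mentioned above. The feature (intensity variance as a crude roughness proxy), the thresholds, and the function names are illustrative assumptions, not the actual algorithm developed in this research.

```python
def intensity_variance(pixels):
    """Variance of grayscale pixel intensities: a crude roughness proxy.
    (Illustrative assumption, not the feature used in the actual work.)"""
    n = len(pixels)
    mean = sum(pixels) / n
    return sum((p - mean) ** 2 for p in pixels) / n

def classify_texture(pixels, smooth_max=50.0, rough_max=500.0):
    """Assign one of the pre-determined perceptual categories.
    Thresholds are hypothetical and chosen only for this sketch."""
    v = intensity_variance(pixels)
    if v <= smooth_max:
        return "smooth"
    if v <= rough_max:
        return "rough"
    return "bumpy"

# A nearly uniform patch reads as smooth; a high-contrast patch as bumpy.
print(classify_texture([128, 129, 130, 128, 127]))  # -> "smooth"
print(classify_texture([0, 255, 0, 255, 0, 255]))   # -> "bumpy"
```

A real perceptual visio-haptic transfer algorithm would replace this hand-tuned rule with a learned classifier over richer visual features, but the input-to-category mapping it performs is the same in structure.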
The aim of our research is to design a general framework for visio-haptic transfer and to develop and test both physical and perceptual visio-haptic transfer algorithms. Our application area of interest is assistive technology for individuals who are blind or visually impaired; specifically, we seek to develop a wearable assistive device for remote object perception for individuals who are blind.
CUbiC Technology and Research Initiative Fund