John Zelek, Sam Bromley, and Daniel Asmar
We are currently exploring the relaying of navigational information (e.g., obstacles, terrain, depth) to a visually impaired person using a tactile glove we have developed. The glove consists of a collection of vibrating motors; the collective patterns of motor activity convey navigational information mapped from an artificial perception system built around a wearable camera and computer. The tactile glove has a much lower bandwidth than the visual input stream. We are investigating three routes for the tactile mapping: (1) encoding information in terms of a minimal spanning basis set of spatial prepositions; (2) organizing the hand by function (e.g., obstacle motors, terrain motors); and (3) a direct fovea-periphery retinal distinction on the hand. The glove relies strongly on the information provided by the artificial perception system. We have explored a probabilistic framework (e.g., particle filtering) for modelling dynamical visual processes (e.g., tracking, optical flow, depth from stereo). We suspect that a probabilistic encoding is necessary to model the uncertainty in visual processing; in addition, exploiting the redundancy of the temporal stream improves the reliability of the perceived scene. The internal representations developed for this application should also be useful for mobile robot navigation.
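To make the bandwidth reduction concrete, one simple way to map perception output onto motor activity is to collapse the scene into a coarse depth grid and drive each motor's vibration intensity by obstacle proximity. The sketch below is purely illustrative: the function name, grid layout, and inverse-depth mapping are our assumptions for exposition, not the glove's actual encoding.

```python
def depth_to_motor_pattern(depth_grid, max_depth=5.0):
    """Map a coarse depth grid (rows x cols, metres) to motor
    intensities in [0, 1]; nearer obstacles vibrate harder.
    Hypothetical encoding: intensity = 1 - depth / max_depth, clamped."""
    return [
        [max(0.0, min(1.0, 1.0 - d / max_depth)) for d in row]
        for row in depth_grid
    ]

# A 2x2 grid of depths: a close obstacle (0.5 m) produces a strong
# vibration, a distant one (4.5 m) a weak vibration.
pattern = depth_to_motor_pattern([[4.5, 1.0],
                                  [2.5, 0.5]])
```

A real mapping would also have to respect the spatial-preposition or fovea-periphery organizations described above; this sketch only shows the bandwidth collapse from a dense depth image to a handful of motor channels.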
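The probabilistic framework mentioned above can be illustrated with a minimal bootstrap particle filter tracking a one-dimensional position, the simplest instance of the class of filters used for tracking and other dynamical visual processes. The motion model (random walk), noise levels, and particle count here are arbitrary assumptions chosen for a self-contained demonstration, not parameters from our system.

```python
import math
import random

random.seed(0)

def particle_filter_step(particles, weights, z, motion_std=0.5, meas_std=1.0):
    """One predict/update/resample cycle of a bootstrap particle filter."""
    # Predict: propagate each particle through a random-walk motion model.
    particles = [p + random.gauss(0.0, motion_std) for p in particles]
    # Update: reweight by the Gaussian likelihood of the measurement z.
    weights = [w * math.exp(-0.5 * ((z - p) / meas_std) ** 2)
               for w, p in zip(weights, particles)]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample when the effective sample size collapses.
    ess = 1.0 / sum(w * w for w in weights)
    if ess < len(particles) / 2:
        particles = random.choices(particles, weights=weights, k=len(particles))
        weights = [1.0 / len(particles)] * len(particles)
    return particles, weights

# Track a target drifting at 0.3 m per step from noisy measurements.
N = 500
true_pos = 0.0
particles = [random.gauss(0.0, 2.0) for _ in range(N)]
weights = [1.0 / N] * N
for _ in range(50):
    true_pos += 0.3
    z = true_pos + random.gauss(0.0, 1.0)
    particles, weights = particle_filter_step(particles, weights, z)

estimate = sum(p * w for p, w in zip(particles, weights))
```

The posterior is carried as a weighted particle set rather than a single state, which is exactly the property that lets the representation express uncertainty in the visual processing.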