Abstract

A lack of suitable assistive technology is a roadblock that prevents students and scientists who are blind or visually impaired (BVI) from advancing in careers in science, technology, engineering, and mathematics (STEM) fields. It is challenging for persons who are BVI to interpret the real-time visual scientific data commonly generated during laboratory experimentation, such as light microscopy, spectrometry, and the observation of chemical reactions. To address this problem, a real-time multimodal image perception system was developed that allows BVI individuals to perceive standard laboratory blood smear images through a combination of auditory, haptic, and vibrotactile feedback. These feedback modalities convey visual information through alternative perceptual channels, creating a palette of multimodal sensory information. Two sets of image features of interest (primary and peripheral features) were defined to characterize the images, and a Bayesian network was used to construct causal relations between these two groups of features. Two methods were devised to match primary features with sensory modalities. Experimental results confirmed that this real-time approach yielded higher accuracy in recognizing and analyzing objects within images than conventional tactile images.
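
The abstract does not include implementation details, so the following is only a minimal, hypothetical sketch of the general idea of pairing primary image features with sensory feedback channels. All feature names, modality assignments, and encoding functions below are assumptions for illustration, not the authors' published method.

```python
# Illustrative sketch only: feature names, modality pairings, and the
# encoding functions are hypothetical, not taken from the paper.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

# Hypothetical "primary" features extracted from a blood-smear image
# (cell size, shape regularity, staining intensity); "peripheral" features
# such as cell count or spatial distribution are omitted for brevity.
@dataclass
class ModalityMapping:
    """Pairs one primary image feature with a sensory feedback channel."""
    feature: str
    modality: str                       # "auditory", "haptic", or "vibrotactile"
    encode: Callable[[float], float]    # feature value -> stimulus parameter

# One plausible matching strategy is a fixed one-to-one assignment;
# the concrete pairing here is an assumption for illustration.
MAPPINGS: List[ModalityMapping] = [
    ModalityMapping("size", "vibrotactile", lambda v: min(1.0, v / 20.0)),  # vibration amplitude
    ModalityMapping("shape", "haptic", lambda v: v),                        # force-feedback contour weight
    ModalityMapping("intensity", "auditory", lambda v: 220.0 + 660.0 * v),  # tone frequency in Hz
]

def render_feedback(feature_values: Dict[str, float]) -> List[Tuple[str, float]]:
    """Convert extracted feature values into per-channel stimulus parameters."""
    stimuli = []
    for m in MAPPINGS:
        if m.feature in feature_values:
            stimuli.append((m.modality, m.encode(feature_values[m.feature])))
    return stimuli

if __name__ == "__main__":
    # Example: a detected cell with an 8 µm diameter, a roundness score of 0.9,
    # and a normalized staining intensity of 0.6.
    print(render_feedback({"size": 8.0, "shape": 0.9, "intensity": 0.6}))
```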
