Abstract
Tactile maps are widely recognized as useful tools for mobility training and the rehabilitation of visually impaired individuals. However, current tactile maps lack real-time versatility and are limited by high manufacturing and design costs. In this study, we introduce ClaySight, a device that automates tactile map generation, as well as a model for wearable devices that use low-cost laser imaging, detection, and ranging (LiDAR) to improve the immediate spatial knowledge of visually impaired individuals. Our system uses LiDAR sensors to (1) produce affordable, low-latency tactile maps, (2) function as a day-to-day wayfinding aid, and (3) provide interactivity through a wearable device. The system comprises a dynamic mapping and scanning algorithm and an interactive handheld 3D-printed device that houses the hardware. Our algorithm accommodates user specifications to dynamically interact with objects in the surrounding area and creates map models that can be rendered with haptic feedback or alternative tactile systems. Built from economical components and open-source software, the ClaySight system has significant potential to enhance independence and quality of life for the visually impaired.
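To make the mapping step concrete, the sketch below shows one plausible core of a scanning-to-map pipeline: converting a single sweep of LiDAR (angle, distance) readings into a coarse occupancy grid that a haptic display could render as a tactile map. This is a minimal illustration under our own assumptions, not the authors' algorithm; the grid size, cell resolution, and maximum range are hypothetical parameters.

```python
import math

# Illustrative parameters (assumptions, not from the paper).
GRID_SIZE = 16        # cells per side of the square tactile map
CELL_M = 0.25         # metres represented by each grid cell
MAX_RANGE_M = GRID_SIZE * CELL_M / 2  # readings beyond this are ignored

def scan_to_tactile_grid(scan):
    """Map (angle_rad, distance_m) LiDAR readings onto a 2D grid.

    Returns a GRID_SIZE x GRID_SIZE list of 0/1 cells, with 1 where an
    obstacle was detected; the sensor sits at the grid centre.
    """
    grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
    centre = GRID_SIZE // 2
    for angle, dist in scan:
        if dist <= 0 or dist > MAX_RANGE_M:
            continue  # drop invalid or out-of-range returns
        # Polar-to-Cartesian conversion, quantized to grid cells.
        x = centre + int(dist * math.cos(angle) / CELL_M)
        y = centre + int(dist * math.sin(angle) / CELL_M)
        if 0 <= x < GRID_SIZE and 0 <= y < GRID_SIZE:
            grid[y][x] = 1  # mark the cell as occupied
    return grid

# Example: a short wall roughly 1.8 m ahead of the user.
readings = [(math.radians(a), 1.8) for a in range(80, 101)]
for row in scan_to_tactile_grid(readings):
    print("".join("#" if cell else "." for cell in row))
```

In a full system, each occupied cell would drive an actuator (e.g., a raised pin or a vibration motor) rather than a printed character, and successive sweeps would refresh the grid to provide the low-latency updates described above.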