Abstract

Purpose – This study proposes a door detection method based on door properties in both depth and gray-level images. It can further help blind people (or mobile robots) find the doorway to their destination.

Design/methodology/approach – The proposed method uses the hierarchical point–line–region principle with majority voting to encode surface features pixel by pixel, then dominant scene entities line by line, and finally the prioritized scene entities in the center, left and right of the observed scene.

Findings – The approach is robust to noise and random misclassification at the pixel, line and region levels, and provides sufficient information about the pathway ahead and to the left and right of a scene. The proposed robot vision-assist system can be worn by visually impaired people or mounted on mobile robots. It provides more complete information about the surrounding environment to guide the user safely and effectively to the destination.

Originality/value – The proposed robot vision scheme provides detailed configurations of the environment encountered in daily life, including stairs (up and down), curbs/steps (up and down), obstacles, overheads, potholes/gutters, hazards and accessible ground. All the scene entities detected in the environment give blind people (or mobile robots) more complete information for better decision-making. In particular, this paper proposes a door detection method based on the door's features in both depth and gray-level images, which can further help blind people find the doorway to their destination in an unfamiliar environment.
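The hierarchical point–line–region voting described above can be sketched as follows. This is a minimal illustration only: the label names, the split of each row into left/center/right thirds, and the two-stage majority vote are assumptions for the sketch, not the paper's exact scheme.

```python
from collections import Counter

# Hypothetical scene-entity labels; the paper's full label set
# (stairs, curbs/steps, obstacles, overheads, potholes/gutters,
# hazards, accessible ground) is abbreviated here for illustration.
GROUND, OBSTACLE, HAZARD = "ground", "obstacle", "hazard"

def majority(labels):
    """Return the most frequent label (majority vote)."""
    return Counter(labels).most_common(1)[0][0]

def hierarchical_vote(pixel_labels):
    """Aggregate per-pixel labels -> per-line votes -> one dominant
    label per vertical third (left / center / right) of the scene.

    pixel_labels: 2D list where pixel_labels[row][col] is a class label.
    """
    width = len(pixel_labels[0])
    thirds = {"left": [], "center": [], "right": []}
    for row in pixel_labels:
        # Line level: majority vote within each horizontal third of the row.
        thirds["left"].append(majority(row[: width // 3]))
        thirds["center"].append(majority(row[width // 3 : 2 * width // 3]))
        thirds["right"].append(majority(row[2 * width // 3 :]))
    # Region level: majority vote over the per-line results, so isolated
    # pixel or line misclassifications are outvoted.
    return {region: majority(lines) for region, lines in thirds.items()}

# Example: a tiny 4x6 label image with an obstacle on the right side.
img = [
    [GROUND, GROUND, GROUND, GROUND, OBSTACLE, OBSTACLE],
    [GROUND, GROUND, GROUND, GROUND, OBSTACLE, OBSTACLE],
    [GROUND, GROUND, GROUND, GROUND, OBSTACLE, OBSTACLE],
    [GROUND, GROUND, GROUND, GROUND, GROUND, OBSTACLE],
]
print(hierarchical_vote(img))
# → {'left': 'ground', 'center': 'ground', 'right': 'obstacle'}
```

Note how the stray GROUND pixel in the bottom-right corner does not change the region-level result: the two-stage vote is what gives the method its robustness to random misclassification.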

