Abstract

Providing city navigation instructions to people with motion disabilities requires knowledge of urban features such as curb ramps, steps, or other obstacles along the way. Since these urban features are not available from maps and change over time, crowdsourcing this information from end-users is a scalable and promising solution. Our preliminary study with wheelchair users shows that an automatic crowdsourcing mechanism, one that avoids direct user involvement, is needed. In this contribution we present a solution to crowdsource urban features from inertial sensors installed on a wheelchair. Activity recognition techniques based on decision trees are used to process the sensor data stream. Experimental results, obtained from data acquired from 10 real wheelchair users navigating in an outdoor environment, show that our solution is effective in detecting urban features with precision around 0.9, while it is less reliable when classifying some fine-grained urban feature characteristics, such as step height. The experimental results also present our investigation aimed at identifying the best parameters for the given problem, including the number, position, and type of inertial sensors, classifier type, and segmentation parameters.
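As a rough illustration of the kind of pipeline the abstract describes (this is a hypothetical sketch, not the paper's implementation: the window size, the per-axis statistics used as features, and the synthetic two-class labels are all assumptions), a decision tree can be trained on simple statistics computed over segmented windows of 3-axis accelerometer data:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def extract_features(window):
    # Per-axis mean and standard deviation over one segmentation
    # window; real systems would likely use richer time/frequency
    # features, but the structure is the same.
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# Synthetic 3-axis accelerometer windows (50 samples each) standing in
# for two classes, e.g. "flat ground" (label 0) vs "curb ramp" (label 1).
flat = [rng.normal(0.0, 0.1, (50, 3)) for _ in range(100)]
ramp = [rng.normal(0.5, 0.4, (50, 3)) for _ in range(100)]
X = np.array([extract_features(w) for w in flat + ramp])
y = np.array([0] * 100 + [1] * 100)

# A shallow decision tree, as in the activity-recognition approach
# named in the abstract.
clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, y)
```

In practice the segmentation parameters (window length, overlap) and the feature set are exactly the kinds of design choices the abstract reports tuning experimentally.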
