Abstract

Assistive robotic applications require systems capable of interacting in the human world, a workspace which is highly dynamic and not always predictable. Mobile assistive devices face the additional and complex problem of whether and when intervention should occur; before any trajectory assistance is given, the robotic device must know where it is in real time, without unnecessary disruption or delay to the user. In this paper, we demonstrate a novel, robust method for determining room identity from floor features within a real-time computational frame, for autonomous and assistive robotics in the human environment. We utilize two inexpensive sensors: an optical mouse sensor for straightforward and rapid texture or pattern sampling, and a four-color photodiode light sensor for fast color determination. We show that floor texture and color data obtained from typical dynamic human environments using these two sensors compare favorably with data obtained from a standard webcam. We show that suitable data can be extracted from these two sensors at a rate 16 times faster than from a standard webcam, and that these data are in a form which can be rapidly processed using readily available classification techniques, making them suitable for real-time application. We achieved 95% correct classification accuracy identifying the flooring of 133 rooms across 35 classes, suitable for fast coarse global room localization, boundary-crossing detection, and some degree of surface-type identification.
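As a minimal sketch of the classification step mentioned above, here is one readily available technique, k-nearest neighbours, applied to combined texture-and-color feature vectors. The feature names, class labels, and numeric values are hypothetical illustrations, not the paper's dataset.

```python
# Sketch: coarse room classification from floor features. Each sample is
# assumed to be a feature vector of texture measures (contrast, SQUAL)
# and color-sensor readings (R, G, B, clear). All values are invented
# for illustration only.
import math

def knn_classify(sample, training, k=3):
    """Label `sample` by majority vote among its k nearest neighbours."""
    dists = sorted(
        (math.dist(sample, feats), label) for feats, label in training
    )
    votes = {}
    for _, label in dists[:k]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

# Hypothetical training data: (contrast, SQUAL, R, G, B, clear) -> class
training = [
    ((0.62, 40, 120, 110, 90, 310), "kitchen_tile"),
    ((0.58, 44, 125, 108, 85, 305), "kitchen_tile"),
    ((0.21, 12, 80, 70, 60, 200), "bedroom_carpet"),
    ((0.25, 15, 78, 72, 64, 210), "bedroom_carpet"),
    ((0.40, 28, 150, 140, 130, 400), "hall_laminate"),
    ((0.43, 30, 148, 138, 128, 395), "hall_laminate"),
]

print(knn_classify((0.60, 42, 122, 109, 88, 308), training))  # kitchen_tile
```

Because the feature vectors are short and the class count modest, even this simple classifier runs comfortably within a real-time budget.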

Highlights

  • Autonomous robotic systems function well in a carefully defined workspace

  • A very simple measure of surface texture, contrast, and relative brightness can be obtained from the average, maximum, and minimum pixel values available from the mouse sensor registers each clock cycle (Table 1); the optical mouse sensor is intrinsically designed to maintain these relative magnitudes, modulating its shutter period to keep features consistent between frames

  • Surface roughness, or colored patterning, correlates with the gradient between a pixel and its neighbors; this variation is better quantified by the 3D mappings of the three sandpaper images shown in Figure 3, and there is a direct relationship between the surface quality (SQUAL) count and surface homogeneity
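As a minimal sketch of the texture measures in the first highlight above, assuming an 8-bit sensor that exposes per-frame average, maximum, and minimum pixel values (the function arguments stand in for hypothetical register reads, not a specific sensor's interface):

```python
# Sketch: texture features from optical mouse sensor frame statistics,
# assuming 8-bit pixel values. The inputs stand in for hypothetical
# average/maximum/minimum pixel registers read once per frame.

def texture_features(avg_px, max_px, min_px):
    """Contrast and relative brightness from one frame's pixel statistics."""
    span = max_px - min_px
    # Michelson-style contrast; guard against a uniform (zero-sum) frame.
    contrast = span / (max_px + min_px) if (max_px + min_px) else 0.0
    relative_brightness = avg_px / 255.0  # assuming 8-bit pixel values
    return contrast, relative_brightness

print(texture_features(128, 200, 40))
```

Because the sensor already maintains these statistics every clock cycle, the features cost only a register read and a couple of arithmetic operations per frame.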
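The neighbour-gradient relationship in the last highlight can be illustrated with a small sketch; the two frames below are hypothetical grey-level arrays, not sensor data, and the measure shown is a generic mean absolute gradient, not the sensor's own SQUAL computation:

```python
# Sketch: surface homogeneity as the mean absolute difference between
# each pixel and its right/down neighbours over a small grey-level frame.
# A rougher, less homogeneous surface yields a larger mean gradient.

def mean_gradient(frame):
    total, count = 0, 0
    rows, cols = len(frame), len(frame[0])
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:  # horizontal neighbour
                total += abs(frame[r][c] - frame[r][c + 1])
                count += 1
            if r + 1 < rows:  # vertical neighbour
                total += abs(frame[r][c] - frame[r + 1][c])
                count += 1
    return total / count

smooth = [[100, 101, 100], [101, 100, 101], [100, 101, 100]]  # homogeneous
rough = [[20, 200, 30], [180, 10, 220], [40, 210, 25]]        # high-contrast

print(mean_gradient(smooth) < mean_gradient(rough))  # True
```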


Introduction

Autonomous robotic systems function well in a carefully defined workspace. Assistive devices such as robotic wheelchairs need to consider user requirements whilst negotiating highly dynamic and varied arenas, as indoor activity is highly room-correlated. Any robotic application must have an executable trajectory, and autonomous robotic devices require reference points and maps for localization and navigation, whether those data are known a priori or obtained dynamically whilst undertaking exploration. Assistive technologies such as electric wheelchairs are drawing mobile robotic interactions increasingly towards the uncertain and complex human environment. When the assistive device is first initialized, for example after powering down and having been manually moved, localization becomes the first dictate; current methods require some form of scanning or initial exploration to generate a map which is compared with a stored map. This approach requires time and unnecessary motion, both undesirable in any human assistive system. In addition, a habitable room may be cluttered and dynamically varying, so geometric mapping will not remain consistent over time.

