Abstract

Autonomous navigation has recently become an important research topic. Many applications call for autonomous robots, either because real or virtual human presence is impossible, dangerous, or expensive, or because the tasks to be solved are unsuited to humans. In most applications where robots are deployed, the conditions and environment change over time, creating an ever-increasing need for universal methods that are general enough to be applied to a wide range of problems. In this paper, a universal, hybrid navigation method is proposed that can operate in known, partially known, dynamically changing, or unknown environments. The model consists of two modules that can cooperate or work alone. The modules combine two techniques that handle a priori information and sensory data separately, thereby blending the intelligence and optimality of global navigation methods with the reactivity and low complexity of local ones. The first, global navigation module uses a priori information and the A* algorithm to choose intermediate goals for the local navigation module. The second module carries out local navigation relying on sensory data and applies a fuzzy-neural representation of an improved potential-field-based guiding navigation tool. Vision-based obstacle detection is implemented through difference detection on a combination of RGB and HSV representations of the pixels.
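The abstract does not specify how the RGB and HSV differences are fused or thresholded; as an illustration only, the sketch below combines a Euclidean RGB distance with an HSV distance (using the shorter arc for the circular hue channel) under assumed, hypothetical weights and threshold.

```python
import colorsys

def pixel_difference(p1, p2, w_rgb=0.5, w_hsv=0.5):
    """Combined RGB+HSV distance between two pixels.

    p1, p2: (r, g, b) tuples with components in [0, 255].
    The weights w_rgb/w_hsv are illustrative assumptions, not values
    taken from the paper.
    """
    # Normalized Euclidean distance in RGB space (max distance -> 1.0)
    d_rgb = sum((a - b) ** 2 for a, b in zip(p1, p2)) ** 0.5 / (3 ** 0.5 * 255)
    # Convert both pixels to HSV (colorsys expects components in [0, 1])
    h1, s1, v1 = colorsys.rgb_to_hsv(*(c / 255 for c in p1))
    h2, s2, v2 = colorsys.rgb_to_hsv(*(c / 255 for c in p2))
    # Hue is circular, so take the shorter arc between the two hues
    dh = min(abs(h1 - h2), 1 - abs(h1 - h2))
    d_hsv = (dh ** 2 + (s1 - s2) ** 2 + (v1 - v2) ** 2) ** 0.5 / (3 ** 0.5)
    return w_rgb * d_rgb + w_hsv * d_hsv

def is_obstacle(pixel, reference, threshold=0.15):
    """Flag a pixel as part of an obstacle if it differs sufficiently
    from a reference color (e.g. the known floor color). The threshold
    is an illustrative assumption."""
    return pixel_difference(pixel, reference) > threshold
```

Combining both color spaces makes the detector less sensitive to failure modes of either one alone: RGB distance reacts to brightness changes that HSV hue largely ignores, while HSV separates chromatic differences that can be small in raw RGB values.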
