Abstract

To perform service tasks effectively, service robots must be able to handle not only low-level sensory-motor data but also high-level semantic information. These two kinds of data are bi-directionally linked: low-level data is passed up and high-level information is passed down through semantic relations and hierarchy. In this paper, these data and information are described as robot knowledge and associated with each other to form a unified robot knowledge framework for service robots. Experimental results demonstrating the advantages of the proposed knowledge framework are also presented.

1. INTRODUCTION

Knowledge represented by a semantic network is key for service robots to successfully complete service tasks in real environments. Semantic knowledge can be represented as a network or graph that captures semantic relations between concepts. Service robots are designed to complete service tasks semi- or fully autonomously in a specific environment [1]. Robots must interact semantically with humans and understand user intentions to enable social collaboration [2]. A significant obstacle for service robots is the execution of complicated tasks in a real environment, and robot semantic knowledge can be extremely useful in overcoming this challenge. However, previous approaches to this issue have been limited in functionality: they have focused on partial rather than integrated tasks, and they have not combined low-level and high-level data. To compensate for incomplete information, robots must also be able to exploit semantic knowledge through the relations represented by links between pieces of information.
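As a concrete illustration of the kind of semantic network described above, the following is a minimal sketch in Python of a knowledge graph that links low-level perceptual data to high-level concepts through labeled relations. The class name, relation labels, and example entries are hypothetical and are not taken from the paper's actual knowledge representation.

```python
# Minimal sketch of a semantic network linking low-level sensory data to
# high-level concepts. All names and relations here are illustrative only;
# they are not the paper's actual knowledge framework.

from collections import defaultdict


class SemanticNetwork:
    """A directed graph of (subject, relation, object) triples."""

    def __init__(self):
        # relation -> list of (subject, object) pairs, plus a reverse index
        self.edges = defaultdict(list)
        self.reverse = defaultdict(list)

    def add(self, subject, relation, obj):
        self.edges[relation].append((subject, obj))
        self.reverse[relation].append((obj, subject))

    def related(self, subject, relation):
        """Objects reached from `subject` via `relation` (top-down lookup)."""
        return [o for s, o in self.edges[relation] if s == subject]

    def related_to(self, obj, relation):
        """Subjects that point to `obj` via `relation` (bottom-up lookup)."""
        return [s for o, s in self.reverse[relation] if o == obj]


if __name__ == "__main__":
    kb = SemanticNetwork()
    # High-level semantic knowledge (hypothetical example entries).
    kb.add("mug", "is_a", "container")
    kb.add("mug", "located_in", "kitchen")
    # Low-level perceptual data linked upward to a concept.
    kb.add("blob_17", "recognized_as", "mug")

    # Bottom-up: which percepts were recognized as a mug?
    print(kb.related_to("mug", "recognized_as"))   # ['blob_17']
    # Top-down: where is a mug expected to be found?
    print(kb.related("mug", "located_in"))         # ['kitchen']
```

The bidirectional indexing mirrors the bi-directional linking discussed above: the forward index supports top-down queries from concepts to expectations, while the reverse index supports bottom-up queries from sensory data to concepts.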
