Abstract

Natural-language-based scene understanding can enable heterogeneous robots to cooperate efficiently in large, unstructured environments. However, studies on symbolic planning rarely consider the problem of acquiring semantic knowledge about the surrounding environment. Meanwhile, recent deep learning methods show outstanding performance in semantic scene understanding using natural language. In this paper, a cooperation framework for heterogeneous robots that connects deep learning techniques with a symbolic planner is proposed. The framework is composed of a scene understanding engine, a planning agent, and a knowledge engine. We employ neural networks for natural-language-based scene understanding so that robots can share environmental information. We then generate a sequence of actions for each robot using a Planning Domain Definition Language (PDDL) planner, with JENA-TDB used as the knowledge storage. The proposed method is validated using simulation results obtained from one unmanned aerial vehicle and three unmanned ground vehicles.

Highlights

  • Natural language-based scene understanding is a critical issue in symbolic planning for heterogeneous multi-robot cooperation

  • The knowledge acquisition problem in symbolic planning can be mitigated by sharing environmental information expressed in natural language among diverse robots

  • We propose a new framework for heterogeneous multi-robot cooperation based on natural language-based scene understanding

Summary

Introduction

Natural language-based scene understanding is a critical issue in symbolic planning for heterogeneous multi-robot cooperation. Corah et al. [6] employed a Gaussian mixture model to map the surrounding environment while maintaining a low memory footprint for communication-efficient planning. Because this method relies on an algorithm designed for a specific sensor, it is difficult to apply to a heterogeneous multi-robot system composed of different processors, implementation techniques, and sensors. In this paper, we propose a symbolic planning method in which heterogeneous robots share natural-language-based environmental information containing semantic meaning, rather than raw sensor data. The proposed method is verified by a simulation using one unmanned aerial vehicle (UAV) and three unmanned ground vehicles (UGVs).
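The pipeline described above (natural-language observations shared among robots, stored in a knowledge base, and consumed by a PDDL planner) can be sketched minimally as follows. This is an illustrative mock-up, not the authors' implementation: the naive triple parser stands in for the neural scene-understanding engine, the in-memory triple store stands in for JENA-TDB, and all names and sentences are hypothetical.

```python
def parse_observation(sentence):
    """Naively split a 'subject predicate object' report into a triple.
    A real system would use a neural scene-understanding model here."""
    subject, predicate, obj = sentence.lower().split()
    return (subject, predicate, obj)

class KnowledgeBase:
    """Minimal in-memory triple store standing in for JENA-TDB."""
    def __init__(self):
        self.triples = set()

    def add(self, triple):
        self.triples.add(triple)

    def to_pddl_init(self):
        # Each stored triple (s, p, o) becomes a PDDL ground fact (p s o)
        # suitable for the :init section of a planning problem.
        return [f"({p} {s} {o})" for (s, p, o) in sorted(self.triples)]

kb = KnowledgeBase()
# Observations as a UAV might report them to the UGVs in natural language
for report in ["victim near building1", "uav1 sees victim"]:
    kb.add(parse_observation(report))

print(kb.to_pddl_init())
```

Because the shared representation is language-level triples rather than raw sensor data, each robot can contribute facts regardless of its sensor suite, which is the communication advantage the paper argues for.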

Heterogeneous Multi-Robot Cooperation Planning
Natural Language-Based Scene Understanding
Connecting Symbolic Planning and Deep Learning
Architecture
Natural Language-Based Cognition
Knowledge Engine
PDDL Planning Agent
Experiment
Experiment Setting
Scenario
Results
Conclusions