Abstract

Unmanned swarms are widely used in the Internet of Things (IoT). Their ability to achieve adaptive collective behavior in complicated mission scenarios is a prerequisite for meeting mission objectives. However, classical collective behavior models are typically constructed from a phenomenological perspective and take the velocity and position of neighbors as inputs. This complicates the construction of unmanned swarms and is inconsistent with how biological individuals perceive their surroundings. Inspired by the formation of biological collective behavior, this paper therefore proposes an Observation-Orientation-Decision-Action (OODA) framework for constructing adaptive collective behavior based on visual perception. The model contains no explicit alignment term, and no information is exchanged between individuals; instead, each individual makes decisions purely from the sight distances it observes at different relative orientations. Using adaptability metrics defined at the collective level, experiments covering coordinated collective motion, a single disturbed individual, a single external disturbance, a narrow passage, and multiple external disturbances show that the group responds adaptively in each scenario, maintaining a stable crystal structure while avoiding collisions. Beyond particle simulations, the same scenarios are also validated in the Webots robot simulator. This approach compensates for the inadequacies of existing models and provides technical support for the application of unmanned swarms in various IoT scenarios.
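To make the decision rule concrete, the following is a minimal illustrative sketch (not the paper's actual model) of a vision-only update in which each agent samples sight distances at discrete relative bearings and steers toward more open directions, with no alignment term and no communication. All parameter names and values (N_RAYS, MAX_SIGHT, PREFERRED_DIST, TURN_GAIN, etc.) are assumptions chosen for illustration only.

import numpy as np

# Hypothetical parameters, not taken from the paper.
N_AGENTS = 20          # swarm size
N_RAYS = 16            # discrete relative bearings sampled per agent
MAX_SIGHT = 10.0       # sight distance reported when no neighbor is visible
PREFERRED_DIST = 2.0   # desired spacing between neighbors
SPEED = 0.05           # constant forward speed
TURN_GAIN = 0.1        # how strongly sight-distance imbalance turns the agent

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 5.0, size=(N_AGENTS, 2))      # 2-D positions
heading = rng.uniform(-np.pi, np.pi, N_AGENTS)       # headings in radians


def sight_distances(i, pos, heading):
    """Approximate sight distance at each relative bearing for agent i.

    Each bearing bin reports the distance to the nearest neighbor whose
    direction falls inside that bin; empty bins report MAX_SIGHT (open view).
    """
    rel = np.delete(pos - pos[i], i, axis=0)           # vectors to neighbors
    dist = np.linalg.norm(rel, axis=1)
    bearing = np.arctan2(rel[:, 1], rel[:, 0]) - heading[i]
    bearing = (bearing + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi)

    bins = np.linspace(-np.pi, np.pi, N_RAYS + 1)
    sights = np.full(N_RAYS, MAX_SIGHT)
    idx = np.digitize(bearing, bins) - 1
    for k in range(N_RAYS):
        in_bin = dist[idx == k]
        if in_bin.size:
            sights[k] = min(in_bin.min(), MAX_SIGHT)
    return sights, 0.5 * (bins[:-1] + bins[1:])        # sight per bearing, bin centers


def step(pos, heading):
    """One update: each agent turns toward bearings that look more open than
    the preferred spacing and away from crowded ones, then moves forward."""
    new_heading = heading.copy()
    for i in range(N_AGENTS):
        sights, bearings = sight_distances(i, pos, heading)
        error = sights - PREFERRED_DIST                # >0 means more open than preferred
        new_heading[i] += TURN_GAIN * np.sum(np.sign(error) * np.sin(bearings)) / N_RAYS
    pos = pos + SPEED * np.column_stack((np.cos(new_heading), np.sin(new_heading)))
    return pos, new_heading


for _ in range(500):
    pos, heading = step(pos, heading)

This toy rule only illustrates how decisions can be driven entirely by per-bearing sight distances; the paper's OODA-based model and its adaptability metrics are not reproduced here.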
