Abstract

Enabling autonomous robots to model, communicate, and act on semantic "soft data" remains challenging. The human-assisted robotic planning and sensing (HARPS) framework is presented for active semantic sensing and planning in human–robot teams; it addresses these gaps by formally combining the benefits of online sampling-based partially observable Markov decision process (POMDP) policies, multimodal human–robot interaction, and Bayesian data fusion. HARPS lets humans impose model structure and extend the range of soft data by sketching and labeling new semantic features in uncertain environments. Dynamic model updating lets robotic agents actively query humans for novel and relevant semantic data, thereby improving model and state beliefs and enabling better online planning. Simulations of an unmanned aerial vehicle-enabled target search in a large-scale, partially structured environment show significant improvements in interception time and belief quality versus conventional planning with robot-only sensing. A human subject study in the same environment shows an average doubling of the dynamic target capture rate relative to the lone-robot case and highlights the robustness of HARPS across a range of user characteristics and interaction modalities.
