Abstract

Modeling, communicating, and acting on semantic “soft data” remains challenging for autonomous robots. The human-assisted robotic planning and sensing (HARPS) framework is presented for active semantic sensing and planning in human–robot teams, addressing these gaps by formally combining the benefits of online sampling-based partially observable Markov decision process (POMDP) policies, multimodal human–robot interaction, and Bayesian data fusion. HARPS lets humans impose model structure and extend the range of soft data by sketching and labeling new semantic features in uncertain environments. Dynamic model updating lets robotic agents actively query humans for novel and relevant semantic data, thereby refining model and state beliefs to support online planning. Simulations of an unmanned aerial vehicle-enabled target search in a large-scale partially structured environment show significant improvements in the time and belief quality required for interception versus conventional planning with robot-only sensing. A human subject study in the same environment shows an average doubling in dynamic target capture rate compared to the lone-robot case and highlights the robustness of HARPS across a range of user characteristics and interaction modalities.
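To make the Bayesian fusion of human “soft data” concrete, here is a minimal illustrative sketch. It is not the paper's implementation: the function name, the binary report model, and the true/false-positive rates are all assumptions chosen for illustration. A human's sketched region is treated as a binary semantic observation ("target is inside the sketch") and fused into a discrete belief over grid cells via Bayes' rule.

```python
import numpy as np

def fuse_soft_data(belief, in_region, p_true_positive=0.8, p_false_positive=0.2):
    """Bayesian update of a categorical target belief with a binary semantic report.

    belief       : prior probability per grid cell (sums to 1)
    in_region    : boolean mask of cells covered by the human's sketch
    p_*_positive : assumed human reliability model (hypothetical values)
    """
    # Likelihood of the human reporting "inside" given the target's true cell.
    likelihood = np.where(in_region, p_true_positive, p_false_positive)
    posterior = likelihood * belief
    return posterior / posterior.sum()

# Usage: uniform prior over a flattened 3x3 grid; the sketch covers three cells.
belief = np.full(9, 1 / 9)
in_region = np.zeros(9, dtype=bool)
in_region[[0, 1, 3]] = True
posterior = fuse_soft_data(belief, in_region)
```

After the update, probability mass shifts toward the sketched cells while the belief remains normalized, which is the mechanism by which human reports sharpen the robot's state estimate for planning.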
