Abstract
Most approaches to controlling autonomous systems require extensive pre-mission preparation or intensive effort by the human operator, or place strong limits on the range of missions that can be accomplished. In this paper we describe an approach called Cognitive Patterns that promises to alleviate these challenges by replicating three key processes of human cognition—pattern generation, perception/action, and adaptation—and instantiating them in a new architecture that can then be embedded in an autonomous system. An early version of this approach connected high-level knowledge representations in an ontology with a robot's sensing and acting abilities, and its advantages were demonstrated in a simulation environment. A more refined version, based on lessons learned and called Cognitive Patterns Knowledge Generation, can deal with anomalies, unexpected events, and uncertainties; it is also described in terms of its components, their interactions, and its benefits.
Published in: Proceedings of the Human Factors and Ergonomics Society Annual Meeting