Abstract

This paper surveys the area of “Trust Metrics” as it relates to security for autonomous robotic systems. As the robotics industry transforms from programmed, task-oriented systems to Artificial Intelligence-enabled learning systems, these autonomous systems become vulnerable to a range of security risks, making a security assessment of them critically important. Our focus is therefore on a holistic approach to assessing system trust, one that incorporates system-, hardware-, software-, cognitive-robustness-, and supplier-level trust metrics into a unified model of trust. We set out to determine whether existing trust metrics already defined such a holistic, system-wide approach. While there is extensive literature on various aspects of robotic systems, such as risk management, safety, and security assurance, each source covered only a subset of an overall system and did not consistently incorporate the relevant costs in its metrics. This paper puts this prior work into perspective and shows how it might be extended to develop useful system-level trust metrics for evaluating complex robotic (and other) systems.
