Large-scale tactile sensing for robots has become the focus of extensive research in the past few years, particularly for humanoid platforms. This research has produced a variety of fundamentally different robot skin systems. The differences lie in technological (e.g., sensory modes and networking), system-level (e.g., modularity and scalability) and representation (e.g., data structures, coherency and access efficiency) aspects. Differences may also exist within a single robot platform: different body parts (e.g., fingertips, forearms and the torso) may be endowed with robot skin tailored to meet specific design goals, which leads to local peculiarities in the technological, system-level and representation solutions adopted. This variety raises the problem of designing a software framework able to: (i) provide a unified interface for accessing information originating from heterogeneous robot skin systems; and (ii) ensure portability across different robot skin solutions. In this article, a real-time framework designed to address both issues is discussed. The framework, referred to as Skinware, acquires large-scale tactile data from heterogeneous networks in real time and provides tactile information through abstract data structures suitable for high-level robot behaviours. As a result, tactile-based robot behaviours can be implemented independently of the actual robot skin hardware and of body-part-specific features. An extensive validation campaign has been carried out to assess Skinware's capabilities with respect to real-time requirements, data coherency and data consistency when large-scale tactile information is needed.
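The unified-interface idea described above can be sketched as follows. This is a minimal illustration of hardware abstraction over heterogeneous skin patches, not Skinware's actual API; all class and function names here are hypothetical.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List


@dataclass
class TaxelReading:
    """One tactile element sample in a hardware-independent form."""
    taxel_id: int
    timestamp: float  # seconds
    value: float      # normalized contact intensity in [0, 1]


class SkinDriver(ABC):
    """Abstract interface every body-part-specific driver must implement."""

    @abstractmethod
    def acquire(self) -> List[TaxelReading]:
        """Return the latest coherent snapshot of this patch's taxels."""


class FingertipDriver(SkinDriver):
    # Hypothetical small, dense patch (12 taxels).
    def acquire(self) -> List[TaxelReading]:
        return [TaxelReading(i, 0.0, 0.5) for i in range(12)]


class ForearmDriver(SkinDriver):
    # Hypothetical large, sparse patch (120 taxels).
    def acquire(self) -> List[TaxelReading]:
        return [TaxelReading(i, 0.0, 0.1) for i in range(120)]


def total_contact(drivers: List[SkinDriver]) -> float:
    # Behaviour-level code sees only the abstract interface, so it is
    # independent of the underlying sensing technology and body part.
    return sum(r.value for d in drivers for r in d.acquire())
```

A behaviour such as `total_contact` works unchanged whether the robot carries one patch or many, and regardless of which transducer technology each patch uses, which is the portability property the abstract claims.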