Abstract

Robots increasingly work in close proximity to humans, outside the traditional safety fences of industrial settings. Real-time tactile interaction perception is therefore crucial to the safety of human–robot collaboration (HRC). In this work, we present a customized, wearable, and modular robot skin (TacSuit) that scales to large-area coverage of a robot's surface using readily accessible multi-modal sensors, including pressure, proximity, acceleration, and temperature sensors. The TacSuit co-designs the mechanical structure and the data fusion algorithm across three levels: sensor, cell (a group of multi-modal sensors), and block (a group of cells). The sensors are housed in custom-designed, 3D-printed capsules that provide conformity, scalability, and easy installation on arbitrary robot surfaces. A multi-level, event-driven data fusion algorithm enables efficient information processing for a large number of tactile sensors. Furthermore, a virtual interaction force fusion method combines proximity and force perception to keep the whole interaction safe both before and after direct physical contact. A humanoid robotic platform is successfully fitted with a TacSuit of 159 tactile cells. Obstacle detection experiments demonstrate the TacSuit's effective collision avoidance capability for safe HRC.
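As a rough illustration of the idea behind fusing proximity and force perception into a single safety signal, the sketch below blends a pre-contact "virtual" repulsive force with a measured post-contact force. This is not the authors' implementation; the function name, thresholds, and linear ramp are all illustrative assumptions.

```python
def virtual_interaction_force(distance_m: float,
                              contact_force_n: float,
                              d_max: float = 0.20,
                              f_virtual_max: float = 5.0) -> float:
    """Return a single force-like signal for a safety controller.

    Before contact, a virtual repulsive force grows linearly as an
    obstacle approaches (distance below d_max); once the pressure
    sensors register physical contact, the measured force dominates.
    All parameters here are hypothetical, not taken from the paper.
    """
    if contact_force_n > 0.0:      # physical contact detected
        return contact_force_n
    if distance_m >= d_max:        # obstacle out of proximity range
        return 0.0
    # Linear ramp: 0 at d_max, rising to f_virtual_max at zero distance
    return f_virtual_max * (1.0 - distance_m / d_max)
```

A controller could treat this fused signal uniformly, triggering avoidance motions before contact and compliant reactions after it.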
