Abstract

The emergence of smart electronics, human-friendly robotics and augmented or virtual reality demands electronic skins with both tactile and touchless perceptions for the manipulation of real and virtual objects. Here, we realize bifunctional electronic skins equipped with a compliant magnetic microelectromechanical system (m-MEMS) able to transduce tactile stimuli (via mechanical pressure) and touchless stimuli (via magnetic fields) simultaneously. The m-MEMS separates the electric signals from tactile and touchless interactions into two different regions, allowing the electronic skins to unambiguously distinguish the two modes in real time. Moreover, its inherent magnetic specificity suppresses interference from non-relevant objects and enables signal-programmable interactions. Ultimately, the m-MEMS enables complex interplay with physical objects enhanced with virtual content data in augmented reality, robotics, and medical applications.

Highlights

  • The emergence of smart electronics, human-friendly robotics and augmented or virtual reality demands electronic skins with both tactile and touchless perceptions for the manipulation of real and virtual objects

  • The m-MEMS platform is realized by packaging a flexible magnetic field sensor and a compliant permanent magnet with a pyramid-shaped extrusion on its top surface into a single architecture (Fig. 1a)

  • The magnetic field sensor of the m-MEMS changes its electrical resistance both when exposed to the external magnetic field of a magnetically functionalized object (touchless interaction) and when the m-MEMS package is mechanically deformed by applied pressure (tactile interaction) (Fig. 1b)
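
Both interaction modes are read out through the same quantity: a change of the sensor's electrical resistance. The sketch below shows a minimal way to express that readout; the baseline value and function name are illustrative assumptions, not values or code from the paper.

```python
# Toy readout sketch (assumption: the raw sensor value is already available,
# e.g. from an ADC; the baseline resistance below is a hypothetical placeholder).
R_BASELINE_OHM = 1_000.0  # zero-field, zero-pressure resistance R0 (illustrative)

def relative_resistance_change(r_measured_ohm: float) -> float:
    """Return the relative change (R - R0) / R0 of the magnetic field sensor.

    Both a nearby magnetically functionalized object (touchless mode) and
    mechanical deformation of the m-MEMS package under pressure (tactile mode)
    shift this single quantity.
    """
    return (r_measured_ohm - R_BASELINE_OHM) / R_BASELINE_OHM
```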

Summary

Introduction

The emergence of smart electronics, human-friendly robotics and augmented or virtual reality demands electronic skins with both tactile and touchless perceptions for the manipulation of real and virtual objects. The m-MEMS relies on a genuinely bimodal sensing principle: it separates the signals from tactile and touchless interactions into two non-overlapping regions, accomplishing the challenging task of unambiguously discriminating the two interaction modes without knowledge of the signal history. We design and fabricate a demonstrator in which our compliant m-MEMS skin is used to identify an object of interest, activate a pop-up menu, and interact with its content through a combination of gestures and physical pressing. This intrinsically bimodal, magnetosensitive smart skin reduces the number of physical “clicks” needed to activate a given device function to one, instead of the at least three required by state-of-the-art gadgets. Beyond the field of AR, e-skins with multimodal interaction abilities are expected to bring benefits to healthcare, e.g., by easing surgical procedures and the manipulation of medical equipment [30,31], and to humanoid robots, for which grasping remains a challenging task [32,33].
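
Because the tactile and touchless signals fall into two non-overlapping regions, each sample can be assigned to an interaction mode on its own, without knowing the signal history. The following sketch illustrates that idea; the numeric region boundaries and names are illustrative assumptions, not calibration values from the paper.

```python
# Illustrative, non-overlapping regions for the relative resistance change
# (hypothetical boundaries; a real device would use calibrated values).
TOUCHLESS_REGION = (0.02, 0.30)  # moderate signal from a stray magnetic field
TACTILE_REGION = (0.35, 1.00)    # large signal from pressing the package

def classify_sample(delta_r_over_r0: float) -> str:
    """Stateless, per-sample discrimination of the interaction mode."""
    if TACTILE_REGION[0] <= delta_r_over_r0 <= TACTILE_REGION[1]:
        return "tactile"    # physical pressing, e.g. confirming a menu entry
    if TOUCHLESS_REGION[0] <= delta_r_over_r0 <= TOUCHLESS_REGION[1]:
        return "touchless"  # proximity gesture over a magnetized object
    return "idle"

# Each sample is classified independently of what came before it.
print([classify_sample(s) for s in (0.00, 0.05, 0.12, 0.60, 0.01)])
```

Because the regions do not overlap, no state machine or signal history is needed to tell a touchless gesture from a physical press, which is what enables the single-click interaction described above.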
