Abstract
Toward smart buildings and smart homes, the floor, as one of our most frequently used interactive interfaces, can be embedded with sensors to extract abundant sensory information without the privacy concerns raised by video monitoring. Yet previously developed floor sensors have typically suffered from small scale, high implementation cost, large power consumption, and complicated device configuration. Here we show a smart floor monitoring system that integrates self-powered triboelectric floor mats with deep learning-based data analytics. The floor mats are fabricated with unique “identity” electrode patterns using a low-cost and highly scalable screen-printing technique, enabling a parallel connection that reduces both system complexity and the computational cost of the deep learning. Stepping position, activity status, and identity information can be determined from real-time analysis of the sensory data. This smart floor technology can establish the foundation for using the floor as a functional interface in diverse smart building/home applications, e.g., intelligent automation, healthcare, and security.
Highlights
Toward smart buildings and smart homes, the floor, as one of our most frequently used interactive interfaces, can be embedded with sensors to extract abundant sensory information without the privacy concerns raised by video monitoring
A potential application scenario of the smart floor monitoring system is shown in Fig. 1a, where the DLES-mat array is attached to the corridor floor
When a person is walking on the DLES-mat array, the electrical signals generated by the contact–separation motion of each step can be acquired and used for position sensing of the person
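As a rough illustration of how step events could be extracted from such a signal, the sketch below detects contact–separation pulses in a voltage trace by simple threshold crossing with a refractory window. The signal values, sampling layout, threshold, and function name are illustrative assumptions, not details from the paper.

```python
import numpy as np

def detect_steps(signal, threshold, refractory):
    """Return sample indices where the signal first exceeds `threshold`,
    skipping `refractory` samples after each detection to avoid counting
    one contact-separation pulse multiple times."""
    events = []
    i = 0
    while i < len(signal):
        if signal[i] > threshold:
            events.append(i)
            i += refractory
        else:
            i += 1
    return events

# Synthetic trace: two step pulses on a quiet baseline (values illustrative).
trace = np.zeros(1000)
trace[200:210] = 5.0   # first step pulse
trace[700:710] = 4.2   # second step pulse
print(detect_steps(trace, threshold=1.0, refractory=100))  # → [200, 700]
```

In practice, a band-pass filter or matched filter stage would likely precede the threshold test to reject baseline drift and interference.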
Summary
Toward smart buildings and smart homes, the floor, as one of our most frequently used interactive interfaces, can be embedded with sensors to extract abundant sensory information without the privacy concerns raised by video monitoring. Through the design of varying electrode coverage rates, triboelectric signals of different relative magnitudes are generated, which can be used to distinguish the outputs of different DLES-mats and thereby determine the corresponding walking positions.
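The relative-magnitude idea above can be sketched in a few lines: if each mat's electrode coverage rate scales its output amplitude, a peak measured on the shared parallel channel can be mapped back to a mat index by nearest expected amplitude. The coverage rates, reference voltage, and function name below are illustrative assumptions, not values reported in the paper.

```python
import numpy as np

# Assumed per-mat electrode coverage rates (fraction of mat area covered).
COVERAGE = np.array([0.2, 0.4, 0.6, 0.8])
# Assumed peak output voltage of a step on a fully covered mat, in volts.
V_REF = 10.0

def identify_mat(peak_voltage):
    """Return the index of the mat whose expected peak amplitude
    (coverage rate x reference voltage) is closest to the measured peak."""
    expected = COVERAGE * V_REF
    return int(np.argmin(np.abs(expected - peak_voltage)))

# A 4.3 V peak is closest to the 0.4-coverage mat's expected 4.0 V.
print(identify_mat(4.3))  # → 1
```

A deep learning classifier, as used in the paper, would replace this nearest-amplitude rule with a model learned from the full signal waveform, making the identification robust to user-to-user variation in stepping force.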