Abstract
Body motion capture is a vital approach that underpins natural human–machine interaction. Strain sensors that can detect both motion amplitude and direction are the basis for discerning interactive intent. However, most strain sensors with heterogeneous architectures fail to perceive motion direction accurately. Here, quasi‐homogeneous integrated strain vector sensors are constructed by vertically stacking elastomer meshes with different fiber orientations in sequence. By varying the fiber orientation, the proportion of intrinsic elastic and structural deformation in the elastomer mesh can be adjusted, enabling effective regulation of its electromechanical properties. A highly anisotropic elastomer mesh with unidirectional fibers is customized, enabling the strain vector sensors to perceive stretch amplitude and direction simultaneously. Notably, the strain vector sensors demonstrate a minimum directional resolution of 2° and a linear working range of up to 100% strain. The tough bonding between the quasi‐homogeneous interlayers ensures high robustness even after 10 000 loading cycles. Moreover, a wearable 3D motion capture system is built that can acquire comprehensive motion data, including amplitudes, directions, and modes, and visually synchronize them in virtual reality. Natural human–machine interaction is achieved by intuitively and effortlessly altering motion amplitudes and directions. This work provides an alternative route toward immersive human–machine interaction in the future.