Abstract

Humans spend about one-third of their lives resting. Reconstructing human dynamics in in-bed scenarios is of considerable significance for sleep studies, bedsore monitoring, and biomedical factor extraction. However, mainstream human pose and shape estimation methods rely mainly on visual cues and face serious issues in non-line-of-sight environments. Because in-bed scenarios involve complicated human-environment contact, pressure-sensing bedsheets offer a non-invasive and privacy-preserving way to capture the pressure distribution on the contact surface and have shown promise in many downstream tasks. However, few studies focus on in-bed human mesh recovery. To explore the potential of reconstructing human meshes from the sensed pressure distribution, we first build a high-quality temporal human in-bed pose dataset, TIP, with 152K multi-modality synchronized images. We then propose a label generation pipeline for in-bed scenarios that produces reliable 3D mesh labels with a SMPLify-based optimizer. Finally, we present PIMesh, a simple yet effective temporal human shape estimator that directly generates human meshes from pressure image sequences. We conduct extensive experiments to evaluate PIMesh's performance, showing that it achieves a joint position error of 79.17 mm on our TIP dataset. The results demonstrate that the pressure-sensing bedsheet could be a promising alternative for long-term in-bed human shape estimation.
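
For context, the reported 79.17 mm joint position error is consistent with the standard mean per-joint position error (MPJPE) metric commonly used to evaluate human pose and shape estimators. The sketch below illustrates how such an error would typically be computed from predicted and ground-truth 3D joints (e.g., joints regressed from SMPL meshes); it is a generic illustration under that assumption, not the paper's evaluation code, and the array shapes and 24-joint layout are assumptions.

    import numpy as np

    def mean_per_joint_position_error(pred_joints, gt_joints):
        """Mean per-joint position error (MPJPE), in the same units as the inputs.

        pred_joints, gt_joints: arrays of shape (T, J, 3) -- a sequence of T frames,
        each with J 3D joint positions (hypothetical layout; the paper may also
        apply root alignment before computing the error).
        """
        pred_joints = np.asarray(pred_joints, dtype=np.float64)
        gt_joints = np.asarray(gt_joints, dtype=np.float64)
        # Euclidean distance per joint, averaged over joints and frames.
        return np.linalg.norm(pred_joints - gt_joints, axis=-1).mean()

    # Toy example: a 16-frame sequence with 24 joints (the SMPL joint count), in meters.
    gt = np.random.randn(16, 24, 3)
    pred = gt + 0.01 * np.random.randn(16, 24, 3)  # small perturbation of the ground truth
    print(f"MPJPE: {1000 * mean_per_joint_position_error(pred, gt):.2f} mm")  # meters -> mm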
