Abstract
Augmented Reality (AR)-assisted exercises for the enhancement of finger function have been developed for many years and have proven to be effective. Most applications track the motion of the upper extremity using handheld trackers, which are inconvenient to put on and take off. Bare-hand tracking algorithms have been used to address this limitation; however, only a few AR-assisted healthcare approaches have been designed to train upper-extremity skills with bare-hand tracking, and even fewer detect the Range of Motion (ROM) of the fingers with low-cost devices. This paper presents a low-cost, multi-modal, residential-based AR-assisted therapeutic healthcare exercise system. A computer vision-based bare-hand interaction method is proposed in this system; it estimates finger bending degrees from their projection lengths. With an accurate feature detection strategy, the method detects the full ROM using two web cameras. The system also incorporates a vibration wristband that provides tactile feedback to the user, and an assessment module that evaluates user performance.
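The projection-length idea can be illustrated with a minimal sketch (a simplified model for illustration, not the authors' implementation): if a finger segment of known full length appears with a shorter projected length in the image plane, and the foreshortening is modeled as the cosine of the bending angle, the angle can be recovered as arccos of the length ratio. The function name and calibration values below are hypothetical.

```python
import numpy as np

def bending_angle_deg(projected_length: float, full_length: float) -> float:
    """Estimate a finger-segment bending angle from its projected length.

    Assumes the segment's full length is known from calibration and that
    the projection shortens with the cosine of the bending angle,
    i.e. projected = full * cos(theta).
    """
    ratio = np.clip(projected_length / full_length, 0.0, 1.0)
    return float(np.degrees(np.arccos(ratio)))

# Example: a segment calibrated at 40 mm that appears as 20 mm in the image
# corresponds to about 60 degrees of flexion under this model.
print(bending_angle_deg(20.0, 40.0))  # ~60.0
```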