Abstract

Background: Motion and mobility are vital for human health and are often compromised by illness and injury. At present, motion characterization in clinical encounters is largely qualitative. While hospital-based specialty labs exist for quantifying motion, there is a greater need for simple, readily deployable systems that can be used widely in clinical practice across specialties. Our group advanced an "around-body" (i.e., non-contact) motion quantitation system, termed MOCA, which relies on video analysis of body landmarks and/or colored markers to track and quantitate motion. In the original system, data extraction and quantitation relied on MATLAB, which was cumbersome due to inflexible data formats and slow processing, making the system less user-friendly overall. Here we advance this system as MOCA 2.0, utilizing an improved Python script that processes multiple data formats, runs rapidly, and offers flexible output. We test its efficacy with a range of video motion capture inputs.

Method: Six subjects (3 male, 3 female) placed color markers (red, green, or orange; 10 × 10 mm) at defined body locations. Six exercises were performed at 2 or 6 sec/rep. Video recordings were downloaded and analyzed frame by frame, using marker colors to track the xy-coordinates of each marker. A Python script was constructed, offering flexibility of data input, processing speed, and a range of outputs. MOCA 2.0 coordinates were converted into displacement using the subject's arm length. Python's matplotlib library was used to graph marker position along each axis over time. The new script is compatible with MOCA and other data formats.

Results: For all subjects and movements performed, the system captured, translated, and yielded quantitative motion information. The new script yielded accurate positional data of tracked markers using color matching, from which the angle of motion was calculated.
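The abstract does not include the script itself; the following is a minimal sketch of the color-matching step it describes, assuming a simple per-pixel RGB tolerance test and a centroid readout (the function name, tolerance value, and synthetic frame are illustrative, not from the paper):

```python
import numpy as np

def track_marker(frame_rgb, target_rgb, tolerance=40):
    """Locate a colored marker in an RGB frame by color matching.

    Returns the (x, y) centroid of all pixels within `tolerance`
    of the target color on every channel, or None if none match.
    """
    diff = np.abs(frame_rgb.astype(int) - np.array(target_rgb))
    mask = np.all(diff <= tolerance, axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 frame with a red 10x10-pixel "marker"
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[20:30, 50:60] = (255, 0, 0)
print(track_marker(frame, (255, 0, 0)))  # -> (54.5, 24.5)
```

Running this per frame yields the xy-coordinate time series that the script then converts to displacement and angle.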
An example of a generated angle calculation compared with the actual video of the movement is shown (Fig. A). Generated angles of motion for a bicep curl vs. time, with the elbow as pivot point, are shown (Fig. B). The Python-enhanced script allowed for increased accuracy in assessment of a standard bicep curl, with an average range of motion of 102°, from 47° at rest to 149° at full flexion.

Discussion: Python modification of the motion capture and analysis system enabled enhanced video motion analysis and quantitation useful for motion and mobility assessment in both health and medical settings, including physical exams. This advance provides greater flexibility in graphing and potential for further system feature development. Tracking serial motion data over time offers quantitative precision useful for guiding therapeutic intervention to foster clinical progress, guide rehabilitation, and advance therapy.
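The elbow-pivot angle reported above can be derived from three tracked marker positions per frame. A minimal sketch, assuming the angle is taken at the pivot between rays to the adjacent markers (the function name and example coordinates are illustrative):

```python
import math

def joint_angle(proximal, pivot, distal):
    """Angle in degrees at `pivot` between the rays toward
    `proximal` and `distal` (e.g., shoulder, elbow, wrist)."""
    v1 = (proximal[0] - pivot[0], proximal[1] - pivot[1])
    v2 = (distal[0] - pivot[0], distal[1] - pivot[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Arm fully extended: shoulder, elbow, wrist collinear
print(joint_angle((0, 0), (1, 0), (2, 0)))  # -> 180.0
# Forearm flexed perpendicular to upper arm
print(joint_angle((0, 0), (1, 0), (1, 1)))  # -> 90.0
```

Evaluating this for every frame and plotting the result against time with matplotlib would produce a curve like the bicep-curl trace in Fig. B.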
