Abstract

Running gait assessment is critical for performance optimization and injury prevention. Traditional approaches to running gait assessment are inhibited by unnatural running environments (e.g., indoor laboratories), assessor variability (i.e., subjective experience) and the high cost of traditional reference-standard equipment. Development of valid, reproducible and low-cost approaches is therefore key. Wearables such as inertial measurement units have shown promise, but despite their flexibility of use in any environment and reduced cost, they often retain complexities such as connectivity to mobile platforms and stringent attachment protocols. Here, we propose a non-wearable, camera-based approach to running gait assessment, focusing on identification of initial contact events within a runner's stride. We investigated different artificial intelligence and object-tracking approaches to determine the optimal methodology. A cohort of 40 healthy runners was video recorded (240 FPS, multi-angle) during 2-minute running bouts on a treadmill. The proposed approach was validated against manually labelled videos. The computer vision approach can accurately identify initial contact events (ICC(2,1) = 0.902).

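For readers unfamiliar with the agreement statistic reported above, the sketch below shows one standard way to compute ICC(2,1) (two-way random-effects, absolute-agreement, single-measurement intraclass correlation, per Shrout & Fleiss) between manually labelled and automatically detected initial contact times. It is a minimal illustration using NumPy; the data layout and example values are hypothetical and not taken from the study.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random-effects, absolute-agreement, single-measurement
    intraclass correlation (Shrout & Fleiss).

    ratings : (n, k) array; e.g. each row is one initial contact event and the
    columns hold its timing from manual labelling and from the automated
    pipeline (illustrative layout, not taken from the paper).
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand_mean = x.mean()

    # Two-way ANOVA sums of squares
    ss_rows = k * np.sum((x.mean(axis=1) - grand_mean) ** 2)   # between events
    ss_cols = n * np.sum((x.mean(axis=0) - grand_mean) ** 2)   # between methods/raters
    ss_total = np.sum((x - grand_mean) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical example: initial contact times (ms), manual labels vs. automated detection
manual = np.array([812, 1420, 2035, 2644, 3260])
automated = np.array([815, 1418, 2040, 2641, 3265])
print(round(icc_2_1(np.column_stack([manual, automated])), 3))
```

Values near 1 indicate that the automated detections closely reproduce the manually labelled event times; the 0.902 reported in the abstract corresponds to excellent agreement by conventional interpretation thresholds.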