Abstract

Extracting motion information from videos is important for quantifying data from behavioral experiments and deepening our understanding of the mechanisms that generate animal behavior. In insect walking, inter-leg coordination plays a crucial role, and the thorax-coxa (ThC) and femur-tibia (FTi) joint motions of the six legs reflect walking velocity and direction. This suggests that joint-motion information from a continuous time series is beneficial for predicting dynamic behavior. Because markerless pose estimation has been studied extensively, joint angles can now be calculated accurately from videos using deep-learning tools such as DeepLabCut. Herein, we propose a method for the single-step and multi-step prediction of whole-body velocity and direction from leg joint angles. The method constructs models based on long short-term memory (LSTM) and Hammerstein LSTM (HLSTM) networks, with joint-angle data as input, to predict the whole-body velocity and direction of insect walking. We investigated motion prediction using the ThC and FTi joint angles with both LSTM and HLSTM. The trained models predicted single-step motion with accuracies of 73.28%–92.12% for velocity and 66.66%–87.46% for direction. Multi-step prediction over the next 10 steps achieved accuracies of 99.56%–99.99% for velocity and 99.43%–99.95% for direction.
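The abstract describes the model only at a high level: a recurrent network that maps leg joint-angle time series (ThC and FTi for each of the six legs, i.e. 12 angles per frame) to whole-body velocity and direction. As a rough illustration of that setup, the sketch below shows a plain LSTM regressor in PyTorch. The window length, hidden size, layer count, and two-output head are illustrative assumptions, not the authors' reported configuration, and the Hammerstein (HLSTM) variant is not reproduced here.

```python
# Minimal sketch (assumed architecture, not the paper's exact model):
# an LSTM that maps a window of 12 joint angles (ThC and FTi for each
# of six legs) to whole-body velocity and direction.
import torch
import torch.nn as nn

class JointAngleLSTM(nn.Module):
    def __init__(self, n_joints=12, hidden_size=64, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(n_joints, hidden_size, num_layers,
                            batch_first=True)
        # Two regression outputs: whole-body velocity and direction.
        self.head = nn.Linear(hidden_size, 2)

    def forward(self, x):
        # x: (batch, time, n_joints) joint-angle time series
        out, _ = self.lstm(x)
        # Use the final time step's hidden state for single-step prediction.
        return self.head(out[:, -1, :])

model = JointAngleLSTM()
window = torch.randn(8, 30, 12)        # 8 windows of 30 frames (assumed length)
velocity_direction = model(window)     # shape (8, 2)
```

Multi-step prediction of the kind reported in the abstract could be obtained from such a model by feeding predictions back autoregressively or by widening the output head to emit all 10 future steps at once; which strategy the paper uses is not stated in this excerpt.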
