Abstract
To address the limitations of six-degree-of-freedom (6-DOF) visual tracking systems, namely their low dynamic performance and their inability to measure when the light path is occluded, this paper proposes a novel laser tracking attitude dynamic measurement method that integrates vision and an inertial measurement unit (IMU). A tailored real-time calibration model is developed to enable precise alignment of the vision and IMU coordinate systems during dynamic operation. To overcome the limitations of the extended Kalman filter (EKF), which relies on static noise assumptions in complex dynamic measurement scenarios, this study proposes an adaptive mechanism based on multi-factor influence analysis. By dynamically monitoring the visual observation noise, an adaptive EKF algorithm is designed to enhance performance under varying conditions. When integrated into the vision/IMU fusion system, the algorithm significantly enhances the system's sensitivity to variations in visual sensor observation noise caused by changes in lighting and motion state. This allows rapid adjustment of the filtering parameters, leading to substantial improvements in filtering accuracy and stability. Simulation results demonstrate that the proposed algorithm outperforms the traditional EKF in fusion measurement accuracy in simulated noise environments featuring sinusoidal and random jump signals. Furthermore, the fusion measurement accuracy is mainly influenced by the data density of the visual sensor and the measurement distance. Finally, the proposed method was validated on a precision turntable. Experimental results show that at a 3 m measurement distance and within a 0°–25° angular range, with the turntable rotating about a single axis at a constant speed of 5°/s, the maximum absolute repeatability deviation of the fused attitude measurements was below 0.11°, and the system achieved a measurement frequency of 100 Hz.
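The abstract describes an EKF whose measurement-noise model is adapted online from the monitored visual observation noise. The sketch below illustrates one common way such adaptation can be realized, innovation covariance matching over a sliding window; it is not the paper's specific multi-factor mechanism, and all names (e.g. AdaptiveEKFUpdate, window) are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' implementation): an EKF measurement
# update whose measurement-noise covariance R is rescaled from recent innovation
# statistics, so the filter reacts when visual observation noise grows
# (e.g. due to lighting changes or fast motion).
import numpy as np

class AdaptiveEKFUpdate:
    def __init__(self, R0, window=20):
        self.R = R0.copy()          # current measurement-noise covariance
        self.window = window        # length of the innovation history
        self._innovations = []      # recent innovation vectors

    def update(self, x_pred, P_pred, z, H):
        # Innovation (measurement residual).
        y = z - H @ x_pred

        # Covariance matching: estimate the actual innovation covariance from
        # a sliding window and move R toward the value it implies.
        self._innovations.append(y)
        if len(self._innovations) > self.window:
            self._innovations.pop(0)
        C = np.mean([np.outer(v, v) for v in self._innovations], axis=0)
        R_est = C - H @ P_pred @ H.T
        # Keep R symmetric positive definite by flooring its eigenvalues.
        R_est = 0.5 * (R_est + R_est.T)
        w, V = np.linalg.eigh(R_est)
        self.R = V @ np.diag(np.clip(w, 1e-9, None)) @ V.T

        # Standard EKF correction using the adapted R.
        S = H @ P_pred @ H.T + self.R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ y
        P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
        return x_new, P_new
```

In a vision/IMU fusion loop, the IMU would drive the prediction step at a high rate while each visual observation calls update(); the adapted R then down-weights the vision channel automatically whenever its noise level rises.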