In three-dimensional (3D) measurement, object motion inevitably introduces errors, posing significant challenges to high-precision 3D reconstruction. Most existing algorithms for compensating motion-induced phase errors are tailored to object motion along the camera's principal axis (Z direction), which limits their applicability in real-world scenarios where objects often undergo complex combined motion in the X/Y and Z directions. To address these challenges, we propose a universal motion error compensation algorithm that corrects both pixel mismatch and phase-shift errors, ensuring accurate 3D measurements under dynamic conditions. The method involves two key steps. First, pixel mismatch errors in the camera subsystem are corrected using adjacent coarse 3D point cloud data, aligning the captured data with the actual spatial geometry. Second, motion-induced phase errors, which appear as sinusoidal ripples at twice the frequency of the projected fringe pattern, are eliminated by applying the Hilbert transform to shift the fringes by π/2. Unlike conventional approaches that address these errors separately, our method compensates for camera-pixel mismatch and phase-shift errors simultaneously within the 3D coordinate space. This integrated approach improves the reliability and precision of 3D reconstruction, particularly for dynamic, multidirectional object motion. The algorithm has been experimentally validated, demonstrating its robustness and broad applicability in fields such as industrial inspection, biomedical imaging, and real-time robotics. By addressing longstanding challenges in dynamic 3D measurement, our method represents a significant advancement toward high-accuracy reconstruction in complex motion environments.
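To make the second step concrete, the sketch below shows, in Python, one common way such a double-frequency ripple can be cancelled with the Hilbert transform: a second wrapped phase is computed from π/2-shifted copies of the fringes, in which the motion ripple has the opposite sign, and the two phases are averaged. This is a minimal sketch under assumptions the abstract does not state (N-step phase shifting with I_n = A + B·cos(φ − 2πn/N), N ≥ 3, and fringes varying along the image columns); the function names are illustrative, and the paper's full pipeline also includes the point-cloud-based pixel mismatch correction, which is not reproduced here.

```python
import numpy as np
from scipy.signal import hilbert

def psp_phase(frames):
    """Wrapped phase from N equally shifted fringes I_n = A + B*cos(phi - 2*pi*n/N)."""
    n = len(frames)
    deltas = 2.0 * np.pi * np.arange(n) / n
    num = sum(f * np.sin(d) for f, d in zip(frames, deltas))
    den = sum(f * np.cos(d) for f, d in zip(frames, deltas))
    return np.arctan2(num, den)

def hilbert_compensated_phase(frames):
    """Suppress the double-frequency motion ripple by averaging the phase of the
    original fringes with the phase of their Hilbert-shifted (pi/2) copies."""
    frames = [np.asarray(f, dtype=float) for f in frames]
    dc = sum(frames) / len(frames)          # background A ~ temporal mean over the N shifts
    # pi/2-shifted fringes: imaginary part of the analytic signal, taken along the
    # axis in which the fringe phase varies (axis=1 here, an assumption).
    shifted = [np.imag(hilbert(f - dc, axis=1)) for f in frames]
    phi = psp_phase(frames)
    phi_h = psp_phase(shifted) + np.pi / 2  # the shifted set encodes phi - pi/2
    # Average on the unit circle so the pi-wraps of arctan2 do not bias the mean;
    # the motion-induced ripple has opposite sign in phi and phi_h and cancels.
    return np.angle(np.exp(1j * phi) + np.exp(1j * phi_h))
```

The circular average in the last line is a standard way to combine two wrapped phases without unwrapping them first; any residual π/2 offset introduced by the Hilbert shift is absorbed before averaging.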