The rhythmic movement of bioinspired robotic fish introduces undesirable visual jitter. Notably, the unstable camera path of such robots exhibits clear regularity and predictability, which traditional stabilization methods have not fully exploited. This article proposes a novel estimation-and-prediction framework for real-time digital video stabilization of bioinspired robotic fish. First, based on the attitude information from an inertial measurement unit (IMU), a camera–IMU model is established in which the homography transformation with eight degrees of freedom (DOFs) is reduced to a translation transformation with two DOFs. Second, traditional optical flow and gray projection methods, as well as a novel translation estimation network, are employed to estimate the translations between consecutive frames. Third, a lightweight long short-term memory (LSTM) network is constructed, enabling effective prediction and smoothing of the camera path. Finally, aquatic experiments under various scenarios are conducted on a manta-inspired robot, demonstrating the effectiveness of the proposed method. In particular, compared with state-of-the-art commercial offline stabilization software, our online stabilization algorithm achieves comparable visual stability at a markedly faster stabilization speed. The obtained results shed light on visual sensing and control applications of bioinspired underwater vehicles.
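As a rough illustration of the path prediction and smoothing stage summarized above, the sketch below shows a minimal lightweight LSTM (in PyTorch) that maps a short window of estimated two-DOF inter-frame translations to a predicted smooth camera motion. The module name, window length, hidden size, and usage pattern are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of a lightweight LSTM camera-path smoother.
# Assumption: the stabilizer keeps a sliding window of recent (dx, dy)
# inter-frame translation estimates (2 DOFs) and warps the current frame
# by the residual between the raw and the predicted smooth motion.
import torch
import torch.nn as nn

class PathSmootherLSTM(nn.Module):
    def __init__(self, hidden_size: int = 32):
        super().__init__()
        # Input and output are 2-DOF translations (dx, dy).
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 2)

    def forward(self, translations: torch.Tensor) -> torch.Tensor:
        # translations: (batch, window, 2) raw inter-frame translations.
        out, _ = self.lstm(translations)
        # Predict the smoothed translation for the next frame from the
        # last time step; the residual (raw - smoothed) is the jitter
        # to be compensated by shifting the frame.
        return self.head(out[:, -1, :])

# Hypothetical usage with a 16-frame window of translation estimates.
model = PathSmootherLSTM()
window = torch.zeros(1, 16, 2)   # last 16 (dx, dy) estimates
smooth_step = model(window)      # predicted smooth motion, shape (1, 2)
```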