This article presents a visual servoing strategy for multirotors that combines a physics-informed neural network (PINN), used to estimate system uncertainties and modeling inaccuracies, with a dynamics-centered visual servoing technique. The proposed method eliminates the need for inverse Jacobian calculations to determine multirotor motion by relating pixel variations directly to the multirotor's thrust and torque inputs. It also strengthens robustness by using the PINN to model and compensate for uncertainties in camera and multirotor parameters, as well as the modeling inaccuracies inherent in the dynamics-centered formulation. Compared with existing state-of-the-art data-driven approaches, the proposed PINN approach requires, on average, 65% less labeled data to characterize these uncertainties and inaccuracies. To enable real-time implementation, the PINN-learned model is combined with an adaptive-horizon, monotonically weighted nonlinear model predictive controller (NMPC) that computes control efforts at rates 10 times faster than existing Tube MPC and Adaptive MPC strategies. These findings are validated through real-time trajectory-tracking experiments, which highlight both the effectiveness of the proposed approach in approximating modeling inaccuracies and its ability to handle uncertainties of up to 70% in the camera parameters.
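The abstract does not include implementation details, but the physics-informed training idea it describes can be illustrated with a minimal sketch: a network learns an additive correction to a nominal multirotor model, trained with a data term on labeled samples plus a physics term on unlabeled collocation points. All names, dimensions, and the kinematic constraint used here (position rate equals velocity) are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class CorrectionNet(nn.Module):
    # Hypothetical state layout: [position(3), velocity(3), attitude(3), body rates(3)]
    def __init__(self, state_dim=12, control_dim=4, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + control_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, state_dim),
        )

    def forward(self, x, u):
        # Predict an additive correction to the nominal state derivative.
        return self.net(torch.cat([x, u], dim=-1))

def nominal_dynamics(x, u):
    # Placeholder for the known multirotor model x_dot = f(x, u);
    # only the kinematic part (p_dot = v) is filled in for this sketch.
    x_dot = torch.zeros_like(x)
    x_dot[:, 0:3] = x[:, 3:6]
    return x_dot

def pinn_loss(model, x_l, u_l, x_dot_l, x_c, u_c, lam=1.0):
    # Data term on labeled samples: the learned correction should match
    # the residual between measured and nominal state derivatives.
    data = torch.mean(
        (model(x_l, u_l) - (x_dot_l - nominal_dynamics(x_l, u_l))) ** 2
    )
    # Physics term on unlabeled collocation points: the corrected dynamics
    # must still satisfy p_dot = v, so the correction's position block
    # should vanish there. Enforcing known physics on unlabeled points is
    # what reduces the amount of labeled data needed.
    phys = torch.mean(model(x_c, u_c)[:, 0:3] ** 2)
    return data + lam * phys
```

In a setup like this, the trained correction model would be evaluated inside the NMPC's prediction step, so the controller plans against the nominal dynamics plus the learned uncertainty term.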