Mobile videos carry large amounts of data, in which the information of interest to the user may be discrete or distributed. This paper introduces a method for fusing video image textures into a 3D geographic information system (3D GIS). For the dynamic fusion of video in 3D GIS, where the position and attitude angle of the capture device change from moment to moment, the method integrates 3D GIS visualization, attitude solution, and motion interpolation, and proposes a projective texture mapping approach that constructs a dynamic depth camera to achieve the fusion. The accuracy and time efficiency of gradient descent and complementary filtering algorithms under different reference systems are compared quantitatively, and the quality of the dynamic fusion is evaluated using the playback delay and rendering frame rate of video on the 3D GIS as indicators. The experimental results show that the gradient descent method under an Attitude and Heading Reference System (AHRS) is better suited to smartphone attitude estimation, keeping the root mean square error of the attitude solution within 2°; video playback delay on the 3D GIS is within 29 ms, and the rendering frame rate is 34.9 fps, which satisfies the temporal resolution limit of the human eye.
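The gradient-descent attitude solution mentioned above can be illustrated with a minimal sketch in the style of the well-known Madgwick filter (IMU variant: gyroscope plus accelerometer only). This is not the paper's exact algorithm; the gain `beta`, the update rate, and the state layout are assumptions made for illustration.

```python
# Sketch of a gradient-descent attitude update: integrate the gyroscope rate,
# then correct drift by descending the gradient of the error between the
# gravity direction predicted by the quaternion and the accelerometer reading.
import math

def gd_update(q, gyro, accel, beta=0.1, dt=0.01):
    """One attitude step. q = (w, x, y, z) unit quaternion (sensor-to-earth),
    gyro in rad/s, accel in any units (only its direction is used)."""
    q1, q2, q3, q4 = q
    gx, gy, gz = gyro
    # Quaternion rate from gyroscope: q_dot = 0.5 * q (x) (0, gx, gy, gz)
    qd1 = 0.5 * (-q2 * gx - q3 * gy - q4 * gz)
    qd2 = 0.5 * ( q1 * gx + q3 * gz - q4 * gy)
    qd3 = 0.5 * ( q1 * gy - q2 * gz + q4 * gx)
    qd4 = 0.5 * ( q1 * gz + q2 * gy - q3 * gx)
    an = math.sqrt(accel[0] ** 2 + accel[1] ** 2 + accel[2] ** 2)
    if an > 0:
        ax, ay, az = (a / an for a in accel)
        # Objective f: predicted gravity direction minus measured direction
        f1 = 2 * (q2 * q4 - q1 * q3) - ax
        f2 = 2 * (q1 * q2 + q3 * q4) - ay
        f3 = 2 * (0.5 - q2 * q2 - q3 * q3) - az
        # Gradient s = J^T f, where J is the Jacobian of f w.r.t. q
        s1 = -2 * q3 * f1 + 2 * q2 * f2
        s2 =  2 * q4 * f1 + 2 * q1 * f2 - 4 * q2 * f3
        s3 = -2 * q1 * f1 + 2 * q4 * f2 - 4 * q3 * f3
        s4 =  2 * q2 * f1 + 2 * q3 * f2
        sn = math.sqrt(s1 * s1 + s2 * s2 + s3 * s3 + s4 * s4)
        if sn > 0:
            # Step against the normalized gradient, scaled by the gain beta
            qd1 -= beta * s1 / sn
            qd2 -= beta * s2 / sn
            qd3 -= beta * s3 / sn
            qd4 -= beta * s4 / sn
    # Integrate the corrected rate and renormalize the quaternion
    q1 += qd1 * dt; q2 += qd2 * dt; q3 += qd3 * dt; q4 += qd4 * dt
    n = math.sqrt(q1 * q1 + q2 * q2 + q3 * q3 + q4 * q4)
    return (q1 / n, q2 / n, q3 / n, q4 / n)
```

Fed with a stationary accelerometer reading and zero gyro rate, repeated updates pull an initially tilted quaternion back toward level, which is the drift-correction behavior the attitude solution relies on.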