Abstract

Augmented reality is currently a research hotspot; its main purpose is to insert virtual objects into the real scene around the user in real time. One of the difficulties of this technology is making virtual objects exhibit the same lighting effects as the real environment. In outdoor scenes it is difficult to simulate erratic weather and to model complex objects, so estimating outdoor illumination is comparatively hard and has received little study. We propose a novel method for estimating the illumination parameters of an outdoor scene, based on a BP (back-propagation) neural network and a basis-image decomposition model. Under supervised learning, the method predicts the outdoor illumination parameters of input images by training on an initial data set with a colour-constancy constraint. These parameters are used to render virtual objects so that their illumination is consistent with the real scene. The algorithm requires no 3D geometric information about the scene, no knowledge of object materials or textures, and no special objects or surfaces in the environment. Experiments show that the basis images and illumination coefficients obtained by this method accurately reconstruct the original image, and that virtual objects rendered with the estimated light coefficients blend seamlessly into real scenes on mobile devices.
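The core idea of the basis-image decomposition model can be illustrated with a minimal synthetic sketch. Here an observed image is modelled as a linear combination of fixed basis images weighted by illumination coefficients; the coefficients are recovered by least squares purely for demonstration (the paper instead trains a BP neural network under a colour-constancy constraint to predict them). All array shapes and values below are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: an 8x8 image and K = 4 basis images.
H, W, K = 8, 8, 4
B = rng.random((K, H * W))                  # basis images, flattened (assumed known)
c_true = np.array([0.7, 0.1, 0.15, 0.05])   # ground-truth illumination coefficients
I = c_true @ B                              # observed image synthesized from the model

# Recover the illumination coefficients by least squares: solve B^T c = I.
c_est, *_ = np.linalg.lstsq(B.T, I, rcond=None)

# Reconstruct the original image from the estimated coefficients.
I_rec = c_est @ B

print(np.allclose(c_est, c_true))  # coefficients recovered
print(np.allclose(I, I_rec))       # reconstruction matches the input
```

Once the coefficients for an input frame are estimated, the same coefficients can drive the lighting of a rendered virtual object, which is what gives the illumination consistency between virtual and real content described above.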
