In May 2022, Saudi Arabian Military Industries, a Saudi government agency, acquired an augmented reality (AR) platform for training pilots. In September 2022, the Boeing Corporation began developing an augmented reality pilot simulator, and in November a similar project was launched by BAE Systems, a leading British aerospace company. These events mark the beginning of a new era of aviation simulators: simulators built on augmented reality technology. One promising advantage of this technology is the ability to safely simulate dangerous situations in the real world. A necessary condition for exploiting this advantage is the visual coherence of augmented reality scenes: virtual objects must be indistinguishable from real ones. The global IT leaders regard augmented reality as the next wave of radical change in digital electronics, so visual coherence is becoming a key issue for the future of IT, and in aerospace applications it has already acquired practical significance. Research on visual coherence in general, and for augmented reality flight simulators in particular, remains far less developed in the Russian Federation: at the time of publication the authors were able to find only two papers on the subject in the Russian-language literature, whereas roughly a thousand have appeared abroad. The purpose of this review article is to lay the groundwork for addressing this problem. Visual coherence depends on many factors: lighting, color tone, shadows cast by virtual objects on real ones, mutual reflections, the textures of virtual surfaces, optical aberrations, convergence and accommodation, and others.
The article reviews publications on methods for estimating the illumination and color tone of a real scene and transferring them to virtual objects, both with various light probes and from individual images, as well as on neural network approaches to rendering virtual objects in augmented reality scenes.