Abstract

To make the operation of aircraft less dependent on visibility conditions, concepts using imaging sensors (enhanced vision systems, EVS) and concepts using databases (synthetic vision systems, SVS) have been and continue to be investigated. Operations that rely on the capability to 'see' using a database in order to go beyond the limits of current operations will require a level of safety equivalent to that of current operations. A capability for the timely detection of hazardous discrepancies between the real world and the depiction of the world generated from the database therefore needs to be provided. One potential approach is the use of imaging sensors. Previous research has already addressed the combination of data from imaging sensors with a computer-generated depiction of the environment. The general approach has been to perform spatial and temporal multiresolution image fusion. In the resulting image, none of the original sources is clearly distinguishable. Although such an approach may be desirable to compensate for some sensor deficiencies, it may be better if certain specific elements or features from the synthetic world remain clearly identifiable as such. This paper discusses research that aims to provide an integration of EVS and SVS in which the EVS data supports the integrity monitoring of the SVS.
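
To illustrate the kind of multiresolution fusion referred to above (the spatial part only, not the temporal component), the following is a minimal sketch of Laplacian-pyramid fusion of a registered EVS sensor frame and an SVS-rendered frame. The function and variable names (`fuse_pyramids`, `evs_frame`, `svs_frame`) and the per-level maximum-detail selection rule are illustrative assumptions, not the specific method used in the paper.

```python
import cv2
import numpy as np

def laplacian_pyramid(img, levels):
    """Decompose a grayscale image into `levels` band-pass levels plus a low-pass residual."""
    gauss = [img.astype(np.float32)]
    for _ in range(levels):
        gauss.append(cv2.pyrDown(gauss[-1]))
    lap = []
    for i in range(levels):
        h, w = gauss[i].shape[:2]
        up = cv2.pyrUp(gauss[i + 1], dstsize=(w, h))
        lap.append(gauss[i] - up)          # band-pass detail at level i
    lap.append(gauss[-1])                  # low-pass residual
    return lap

def fuse_pyramids(evs_frame, svs_frame, levels=4):
    """Fuse two registered images by keeping the stronger detail at each pyramid level."""
    lap_a = laplacian_pyramid(evs_frame, levels)
    lap_b = laplacian_pyramid(svs_frame, levels)
    fused = [np.where(np.abs(a) >= np.abs(b), a, b)
             for a, b in zip(lap_a[:-1], lap_b[:-1])]
    fused.append(0.5 * (lap_a[-1] + lap_b[-1]))   # average the low-pass residuals
    # Collapse the fused pyramid back into a single image.
    out = fused[-1]
    for level in reversed(fused[:-1]):
        h, w = level.shape[:2]
        out = cv2.pyrUp(out, dstsize=(w, h)) + level
    return np.clip(out, 0, 255).astype(np.uint8)
```

In a fused product of this kind the sensor and database contributions are blended at every scale, which is exactly why individual sources are no longer distinguishable; the integration discussed in the paper instead keeps selected synthetic elements identifiable so the EVS imagery can serve as an integrity check on the SVS depiction.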
