A novel workflow is proposed for identifying small-size objects in satellite images of insufficient resolution against a database of graphic reference images using neural network technology. The approach resolves a compromise contradiction: the resolution of the object segment of the input image is enhanced while the resolution of the reference image is simultaneously reduced, via simulation of the imaging system, to a joint intermediate resolution. This is necessary because of the significant discrepancy between the resolution of the input image and that of the graphic reference images used for identification. The resolution enhancement required to match the references is, as a rule, unattainable for satellite images, while significant coarsening of the reference images is undesirable because it causes identification errors. Therefore, identification is performed at an intermediate spatial resolution that, on the one hand, can actually be achieved and, on the other, keeps the loss of information contained in the reference image acceptable. The intermediate resolution is determined by simulating the image-acquisition process of the satellite imaging system. Such simulation is conveniently performed in the frequency domain, where advanced Fourier analysis is available and, as a rule, all the necessary transfer properties of the links of the image-formation chain are known. Three main functional elements are engaged for identification: an artificial neural network for resolution enhancement of input images, a module for frequency-domain simulation of the imaging of the graphic references, and an artificial neural network for comparing the enhanced object segment with the reference model images. The feasibility of the described approach is demonstrated by the successful identification of a sea vessel in a SPOT-7 satellite image.
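The frequency-domain simulation of reference-image degradation can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Gaussian modulation transfer function (MTF), its width `mtf_sigma`, and the plain decimation step are all assumptions standing in for the real transfer properties of the image-formation chain.

```python
import numpy as np

def simulate_imaging(reference, scale, mtf_sigma=0.35):
    """Degrade a high-resolution reference image to an intermediate
    resolution by filtering in the frequency domain.

    reference : 2-D float array (high-resolution reference image)
    scale     : integer downsampling factor to the intermediate resolution
    mtf_sigma : width of the assumed Gaussian MTF, in cycles/pixel
                (placeholder for the real sensor/optics transfer function)
    """
    h, w = reference.shape
    # Spatial frequency coordinates of the spectrum (cycles/pixel).
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    f = np.sqrt(fx**2 + fy**2)
    # Assumed Gaussian model of the combined optics + sensor MTF.
    mtf = np.exp(-0.5 * (f / mtf_sigma) ** 2)
    blurred = np.fft.ifft2(np.fft.fft2(reference) * mtf).real
    # Resample to the intermediate grid (simple decimation here).
    return blurred[::scale, ::scale]

# Example: degrade a 64x64 synthetic "reference" to 16x16.
ref = np.zeros((64, 64))
ref[24:40, 24:40] = 1.0            # a bright square target
low = simulate_imaging(ref, scale=4)
print(low.shape)                   # (16, 16)
```

In a full pipeline, each product of the real MTF chain (optics, detector aperture, platform motion) would replace the single Gaussian factor, and the decimation would be matched to the chosen intermediate resolution.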
Work is currently under way to compare the performance of a variety of neural network platforms for small-size object identification in satellite images, as well as to assess the achievable accuracy.