Abstract

Real-time electronic and optical pattern recognition systems use correlation as a means of discriminating objects of interest from unwanted objects and residual clutter in an input scene. However, correlation is a poor technique for certain gray-level input images, particularly when the background has a high average value, as is encountered in automatic target recognition environments. A better algorithm for these environments computes the difference-squared error between a reference template and the input image. This algorithm has been used for many years for recognizing gray-level images; however, because of the large number of bits of precision required to perform these computations, it is difficult to implement in real time on an embedded electronic computer where small size and low power are at a premium. This paper describes a method for implementing the difference-squared algorithm on an acousto-optic time-integrating correlator. This implementation can accommodate the high dynamic-range requirement inherent in gray-scale recognition problems. The acousto-optic correlator architecture is a natural fit for this implementation because of its capability to perform two-dimensional processing using relatively mature one-dimensional input devices. Furthermore, since this architecture uses an electronically stored reference, rotational and scale variations can be accommodated by rapidly searching through a library of templates, as first described by Psaltis [2]. The ability to implement the difference-squared algorithm in an acousto-optic correlator architecture has the potential to solve many practical target recognition problems where real-time discrimination of gray-level objects using a compact, low-power processor is required.
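The link between the difference-squared metric and correlation hardware follows from expanding the squared error into three terms, one of which is an ordinary cross-correlation, the quantity a time-integrating correlator produces directly. The sketch below is illustrative only and is not the paper's optical implementation; the names `scene`, `template`, and the SciPy-based helper `ssd_map` are assumptions introduced for this example.

```python
# Minimal sketch: difference-squared (SSD) error computed from correlation terms.
#   SSD(x, y) = sum(r**2) - 2 * (r corr s)(x, y) + local energy of s under the window
# Only the middle term depends jointly on reference and scene, which is why a
# correlator architecture can carry most of the work.
import numpy as np
from scipy.signal import correlate2d

def ssd_map(scene, template):
    """Sum of squared differences between the template and every
    template-sized window of the scene."""
    r = template.astype(np.float64)
    s = scene.astype(np.float64)

    # Term 1: energy of the reference template (a constant offset).
    r_energy = np.sum(r ** 2)

    # Term 2: cross-correlation of scene and template.
    cross = correlate2d(s, r, mode="valid")

    # Term 3: local energy of the scene under each template position,
    # obtained by correlating s**2 with an all-ones window.
    s_energy = correlate2d(s ** 2, np.ones_like(r), mode="valid")

    return r_energy - 2.0 * cross + s_energy

# Usage: the minimum of the SSD map marks the best match, even when a high
# background level would swamp a plain correlation peak.
scene = np.random.default_rng(0).uniform(100.0, 200.0, size=(64, 64))
template = scene[20:36, 20:36].copy()
y, x = np.unravel_index(np.argmin(ssd_map(scene, template)), (49, 49))
print("best match at", (y, x))  # expected (20, 20)
```

Under these assumptions, a high, spatially varying background inflates both the plain correlation and the local-energy term, but the SSD combines them so that only a true template match drives the metric toward zero.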

