Two perception experiments are conducted to quantify the relationship between imager sampling artifacts and target recognition and identification performance using that imager. The results of these experiments show that in-band aliasing (aliasing that overlaps the baseband signal) does not degrade target identification performance, but out-of-band aliasing (such as visible display raster) degrades identification performance significantly. Aliasing had less impact on the recognition task than on the identification task, but both in-band and out-of-band aliasing moderately degrade recognition performance. Based on these experiments and other results reported in the literature, it appears that in-band aliasing has a strong effect on low-level discrimination tasks such as point (hot-spot) detection, whereas out-of-band aliasing has only a minor impact on these tasks. For high-level discrimination tasks such as target identification, however, out-of-band aliasing has a significant impact on performance, whereas in-band aliasing has a minor effect. For intermediate-level discrimination tasks such as target recognition, both in-band and out-of-band aliasing have a moderate impact on performance. Based on data from the perception experiments, the modulation transfer function (MTF) squeeze model is developed. The degraded performance due to undersampling is modeled as an effective increase in system blur or, equivalently, a contraction or squeeze of the MTF. An equation is developed that quantifies the amount of MTF squeeze, or contraction, to apply to the system MTF to account for the performance degradation caused by sampling.
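The squeeze operation itself is simple to state: contracting the frequency axis of the MTF by a factor S (0 < S <= 1) lowers the response at every spatial frequency, which is equivalent to adding blur. The sketch below is only an illustration of that contraction, assuming a hypothetical Gaussian system MTF and an arbitrary squeeze factor; the paper's actual equation ties the squeeze to the measured in-band and out-of-band spurious response, and none of the numbers here are taken from it.

```python
import numpy as np

def gaussian_mtf(f, f_c=0.5):
    """Illustrative Gaussian system MTF with characteristic frequency f_c
    (assumed form; a real system MTF would come from the measured
    optics/detector/display chain)."""
    return np.exp(-(f / f_c) ** 2)

def squeezed_mtf(f, squeeze):
    """Apply the MTF squeeze: contract the frequency axis by `squeeze`
    (0 < squeeze <= 1). Evaluating the original MTF at f/squeeze narrows
    it, acting like an effective increase in system blur."""
    return gaussian_mtf(f / squeeze)

# Hypothetical squeeze factor for illustration only; the paper derives
# the factor to apply from the sampling-induced performance degradation.
squeeze = 0.85
f = np.linspace(0.0, 1.0, 11)
for fi, m0, m1 in zip(f, gaussian_mtf(f), squeezed_mtf(f, squeeze)):
    print(f"f={fi:.1f}  MTF={m0:.3f}  squeezed MTF={m1:.3f}")
```

At every nonzero frequency the squeezed curve sits below the original, which is the sense in which the contraction models undersampling as extra blur.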