Abstract

Morph detection is of paramount significance for the integrity of Automatic Face Recognition (AFR) systems. Given the risks posed by morphing attacks, a robust automated morph detector is required that can distinguish authentic bona fide samples from altered morphed images. We leverage the wavelet sub-band decomposition of an input image, which yields its fine-grained spatial-frequency content. To enhance the detection of morphed images, our goal is to find the most discriminative information across both frequency channels and the spatial domain. To this end, we propose an end-to-end attention-based deep morph detector that assimilates the most discriminative wavelet sub-bands of a given image, obtained via a group sparsity representation learning scheme. Specifically, our group sparsity-constrained Deep Neural Network (DNN) learns the most discriminative wavelet sub-bands (channels) of an input image, while the attention mechanism captures the most discriminative spatial regions for the downstream task of morph detection. We adopt three attention mechanisms to diversify our refined features. First, we employ the Convolutional Block Attention Module (CBAM), which provides refined feature maps. Second, compatibility scores between spatial locations and the output of our DNN highlight the most discriminative regions. Third, multi-headed self-attention augmented convolutions account for our final attention mechanism. We evaluate the efficiency of our proposed framework through extensive experiments on multiple morph datasets compiled from bona fide images in the FERET, FRLL, FRGC, and WVU Twin datasets. Most importantly, our proposed methodology reduces detection error rates compared with state-of-the-art results.
Finally, to further assess our multi-attentional morph detection, we delve into different combinations of attention mechanisms via a comprehensive ablation study.
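The wavelet sub-band decomposition the abstract builds on can be illustrated with a minimal one-level 2D Haar transform. This is a sketch, not the authors' pipeline: the function name `haar_subbands` and the averaging (unnormalized) Haar convention are our own choices for illustration.

```python
import numpy as np

def haar_subbands(img):
    """One-level 2D Haar decomposition of a 2D array (even height/width)
    into four sub-bands: LL (approximation), LH, HL, HH (details).
    Uses the averaging convention rather than the orthonormal 1/sqrt(2)."""
    a, b = img[0::2, :], img[1::2, :]
    lo = (a + b) / 2.0   # row-wise low-pass
    hi = (a - b) / 2.0   # row-wise high-pass

    def split_cols(x):
        c, d = x[:, 0::2], x[:, 1::2]
        return (c + d) / 2.0, (c - d) / 2.0

    LL, LH = split_cols(lo)
    HL, HH = split_cols(hi)
    return LL, LH, HL, HH
```

Applying the transform recursively to the LL band yields the multi-level sub-band stack that a detector would treat as frequency channels; a constant image produces all-zero detail bands, which is one quick sanity check.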
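Group-sparsity representation learning of the kind the abstract mentions is commonly realized with a group-L2,1 penalty, whose proximal operator zeroes out entire groups at once; applied to sub-band (channel) weights, it selects a sparse subset of discriminative sub-bands. The sketch below shows that proximal step under this assumption; `prox_group_l21` and the flat weight layout are hypothetical, not taken from the paper.

```python
import numpy as np

def prox_group_l21(W, groups, lam):
    """Proximal operator of lam * sum_g ||W[g]||_2 (group lasso).
    Each group whose L2 norm is <= lam is set exactly to zero;
    larger groups are shrunk toward zero, preserving direction."""
    out = W.copy()
    for g in groups:
        n = np.linalg.norm(W[g])
        out[g] = 0.0 if n <= lam else (1.0 - lam / n) * W[g]
    return out
```

In a training loop this step would follow each gradient update on the sub-band weights, so groups (sub-bands) that contribute little are driven exactly to zero rather than merely becoming small.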
