Abstract

The eye region is one of the most attractive sources for identification and verification because it readily provides two biometric modalities: the periocular region and the iris. Many score-level fusion approaches have been proposed to combine these two modalities with the aim of improving robustness. Score-level approaches can be grouped into three categories: transformation-based, classification-based, and density-based. Each category has its own benefits, and combining them can lead to a robust fusion mechanism. In this paper, we propose a hierarchical fusion network that fuses multiple fusion approaches from the transformation-based and classification-based categories into a unified framework for classification. The proposed hierarchical approach relies on the universal approximation theorem for neural networks to approximate each fusion approach with a child neural network, and then ensembles the child networks into a unified parent network. This mechanism exploits the advantages of both categories to improve fusion performance, as reflected in an improved equal error rate of the multimodal biometric system. We subsequently force the parent network to learn the representation of, and interaction strategy between, the child networks from the training data through a sparse autoencoder layer, leading to further improvements. Experiments on two public datasets (MBGC version 2 and CASIA-Iris-Thousand) and our own dataset validate the effectiveness of the proposed hierarchical fusion approach for the periocular and iris modalities.
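To make the architecture described above concrete, the sketch below illustrates the general idea of a hierarchical score-fusion network: small child networks each approximate one fusion rule over the two match scores, and a parent network with a sparse bottleneck learns how to combine the child outputs. This is a minimal conceptual sketch, not the authors' implementation; all class names, layer sizes, and the choice of a sigmoid bottleneck are assumptions made for illustration only.

```python
# Hypothetical sketch of hierarchical score-level fusion for periocular + iris
# match scores. Not the paper's implementation; names and sizes are assumed.
import torch
import torch.nn as nn


class ChildFusionNet(nn.Module):
    """Small MLP that approximates one fusion rule over the two modality scores."""

    def __init__(self, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, scores: torch.Tensor) -> torch.Tensor:
        # scores: (batch, 2) -> fused score: (batch, 1)
        return self.net(scores)


class ParentFusionNet(nn.Module):
    """Combines the child outputs through a sparse-autoencoder-style bottleneck."""

    def __init__(self, n_children: int = 2, bottleneck: int = 4):
        super().__init__()
        self.child_nets = nn.ModuleList([ChildFusionNet() for _ in range(n_children)])
        self.encoder = nn.Sequential(nn.Linear(n_children, bottleneck), nn.Sigmoid())
        self.classifier = nn.Linear(bottleneck, 1)

    def forward(self, scores: torch.Tensor):
        # Each child produces its own fused score; the parent learns how to combine them.
        child_out = torch.cat([c(scores) for c in self.child_nets], dim=1)
        code = self.encoder(child_out)  # a sparsity penalty (L1/KL) would target this code
        return self.classifier(code), code


if __name__ == "__main__":
    model = ParentFusionNet()
    scores = torch.rand(8, 2)  # 8 pairs of (periocular, iris) match scores
    logits, code = model(scores)
    # Training would add a genuine/impostor classification loss on `logits`
    # plus a sparsity term on `code`.
    print(logits.shape, code.shape)
```

In this sketch, the children stand in for individual transformation-based or classification-based fusion rules, while the bottleneck plays the role of the sparse autoencoder layer that learns the interaction between them from training data.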
