Abstract
Deep learning-based approaches have become the appealing choice for invariant ear recognition, which must cope with challenges common to biometric systems such as pose variations, illumination changes, and rotation. Recent work on 2D ear recognition achieves good performance when the sample images are captured in a constrained environment. This paper attempts to solve the problems of pose variation, illumination change, and rotation (generally caused by unconstrained environments) by feeding Gabor representations of ear images to an ensemble of pre-trained deep neural networks. Multiple Gabor filters generate multiple Gabor representations of each ear image that are rotation invariant and unaffected by pose and scale variation. When these Gabor features are given as input to the multiple pre-trained deep neural networks, the models train effectively owing to the robust and discriminant features produced by the Gabor filters. As a result, the classification accuracy of the resulting model improves even for sample images captured in unconstrained environments. The proposed work uses two pre-trained models, VGG19 and DenseNet161, both already trained on the ImageNet dataset, so only fine-tuning of these models is required. In this way, it also mitigates the problem of limited training data, one of the major challenges in deep learning.
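The following is a minimal sketch (not the authors' code) of the pipeline the abstract describes: a bank of Gabor filters produces multiple representations of an ear image, which are then passed to ImageNet-pretrained VGG19 and DenseNet161 backbones whose classifier heads are replaced and fine-tuned. The filter parameters, the number of classes, and the score-averaging fusion rule are illustrative assumptions.

```python
import cv2
import numpy as np
import torch
import torch.nn as nn
from torchvision import models


def gabor_bank(image_gray, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Apply a small Gabor filter bank; each orientation yields one response map."""
    responses = []
    for theta in thetas:
        # Kernel size, sigma, wavelength, and aspect ratio are assumed values.
        kernel = cv2.getGaborKernel((31, 31), 4.0, theta, 10.0, 0.5, 0)
        responses.append(cv2.filter2D(image_gray, cv2.CV_32F, kernel))
    return responses  # one Gabor representation per orientation


def build_finetune_model(arch="vgg19", num_classes=100):
    """Load an ImageNet-pretrained backbone and replace only the classifier head."""
    if arch == "vgg19":
        net = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1)
        net.classifier[6] = nn.Linear(net.classifier[6].in_features, num_classes)
    else:
        net = models.densenet161(weights=models.DenseNet161_Weights.IMAGENET1K_V1)
        net.classifier = nn.Linear(net.classifier.in_features, num_classes)
    return net


def ensemble_predict(model_list, x):
    """Fuse the two fine-tuned backbones by averaging their softmax scores
    (one plausible fusion rule; the paper may combine predictions differently)."""
    with torch.no_grad():
        probs = [torch.softmax(m(x), dim=1) for m in model_list]
    return torch.stack(probs).mean(dim=0).argmax(dim=1)
```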