Abstract

Human face recognition and tracking (FRT) plays a vital role in many fields, including security, authentication, and human-computer interaction. The main modules of an FRT system are face detection, feature extraction, and recognition/tracking; using a database, these units identify faces along with their location, movement, and visible features. The framework aims to process large volumes of visual data in real time, enabling accurate and fast FRT. This paper develops a real-time FRT framework based on deep learning (DL) with a lightweight convolutional neural network (CNN), complemented by post-feature extraction using linear discriminant analysis (LDA) and Histogram of Oriented Gradients (HOG) features, to accurately match face images across environments with differing illumination and expression. Experiments demonstrate that DL with lightweight CNN models provides a good solution for FRT tasks, even in challenging situations involving changes in pose, expression, illumination, and occlusion. The CNN-based DL model was evaluated across several experiments and compared with many modern methods, achieving better results: it reached 100% accuracy with a 70:30 train/test split, a learning rate of 0.001, and 100 epochs. This demonstrates the advantage of DL over other techniques and shows how well lightweight CNN methods handle FRT tasks, even in real time.
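The abstract's post-feature-extraction stage (HOG-style features followed by an LDA classifier, evaluated on a 70:30 split) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the data is synthetic (two artificial "identities" rather than real face images), and the `hog_features` helper is a simplified whole-image orientation histogram, not a full block-normalized HOG as the paper presumably uses.

```python
# Hypothetical sketch: simplified HOG features + LDA classifier with a
# 70:30 train/test split, mirroring the pipeline named in the abstract.
# Synthetic data stands in for real face images.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

def hog_features(img, n_bins=9):
    """Simplified histogram of oriented gradients over the whole image."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)             # unsigned orientation
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    return hist / (hist.sum() + 1e-8)                   # L1-normalize

rng = np.random.default_rng(0)
# Two synthetic "identities": horizontal vs. vertical stripe patterns + noise
X, y = [], []
for label, axis in ((0, 0), (1, 1)):
    for _ in range(50):
        img = np.zeros((32, 32))
        idx = [slice(None), slice(None)]
        idx[axis] = slice(0, 32, 4)
        img[tuple(idx)] = 1.0
        img += 0.1 * rng.standard_normal((32, 32))
        X.append(hog_features(img))
        y.append(label)
X, y = np.array(X), np.array(y)

# 70:30 split as quoted in the abstract
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.30, random_state=0)
clf = LinearDiscriminantAnalysis().fit(Xtr, ytr)
print(f"test accuracy: {clf.score(Xte, yte):.2f}")
```

In the paper's actual system the lightweight CNN provides the learned representation; LDA then projects features onto directions that maximize between-class separation, which is why it pairs well with a compact feature extractor in real-time settings.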
