Abstract
Identity verification is ubiquitous in daily life. Its applications range from unlocking a mobile device and accessing an online account to boarding an airplane or other transportation, recording arrival and departure times at work, and controlling access to a restricted area, facility, or vault. The traditional and most popular method is password authentication, but it faces many challenges. Human biometric identifiers such as fingerprints, retina scans, and 2D or 3D facial features have become popular alternatives. Some applications use two-factor or multi-factor authentication to increase system security, e.g., a password combined with a login code sent to a mobile device. All of these identity verification methods have drawbacks, ranging from forgotten or stolen passwords to unaware or unintentional authentication, as well as complexity and high cost. This paper presents a promising alternative that improves on existing identity verification methods: a two-factor approach that concurrently analyzes facial features and a unique facial action. To pass verification, the user's facial features and facial action must both match what is stored in the system. The method requires only a frontal view of the face and authenticates facial features and facial actions concurrently by generating a joint embedding of both from a short video for matching. We name this method Concurrent Two-Factor Identity Verification (C2FIV). We propose two frameworks that use recurrent neural networks to learn the representation of facial features and actions: one uses an auto-encoder, and the other uses metric learning. Experimental results show that the metric learning model performs reliably, with an average precision of 98.8%.
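The matching step described above can be illustrated with a minimal sketch: a short video is mapped to a joint embedding of facial features and facial action, and identity is accepted only if the probe embedding is close enough to the enrolled one. The abstract does not specify the similarity measure or threshold, so cosine similarity, the 128-dimensional embedding size, and the 0.9 threshold here are all illustrative assumptions, and the synthetic vectors stand in for real model output.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(enrolled: np.ndarray, probe: np.ndarray, threshold: float = 0.9) -> bool:
    """Accept the probe only if its embedding matches the enrolled one.

    In a concurrent two-factor scheme, both the facial features and the
    facial action contribute to the embedding, so a correct face paired
    with the wrong action (or vice versa) should fall below the threshold.
    The threshold value here is an illustrative assumption.
    """
    return cosine_similarity(enrolled, probe) >= threshold

# Toy demonstration with synthetic embeddings (not real model output):
rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)                    # stored at enrollment
genuine = enrolled + 0.05 * rng.normal(size=128)   # same user, same action
impostor = rng.normal(size=128)                    # different user or action

print(verify(enrolled, genuine))   # close embedding -> accepted
print(verify(enrolled, impostor))  # distant embedding -> rejected
```

In practice the embeddings would come from the trained recurrent network (via the auto-encoder or metric-learning framework), and the threshold would be tuned on validation data to trade off false accepts against false rejects.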