Abstract

Driver gaze zone estimation is an important task in Advanced Driver Assistance Systems (ADAS), complicated by head-pose variation, camera capture direction, eyeglass occlusion, and real-time requirements. Most previous methods combine face modalities and head pose by simple concatenation, which may cause over-fitting due to the imbalance in feature dimensions. Focusing on the gaze zone estimation problem, we propose the Head Pose Fusion Assisted supervision & Eye Region Weighted Encoding (HP-ERW) structure, which fuses the head pose attribute and face modalities through spatial attention and Kronecker product mechanisms. First, we introduce a pre-processing module for head pose and face information that extracts input vectors and improves the fusion speed of the HP-ERW structure. Second, we propose an Eye Region Weighted Encoding Network (ERW-Net) based on spatial attention to strengthen the network's ability to perceive encoded features. Finally, we propose a dual-channel Head Pose Fusion Network (HP-Net) based on the Kronecker product mechanism to fuse head pose and improve estimation accuracy. Experiments show that HP-ERW outperforms existing methods on several public datasets. An ADAS built with the proposed method runs in real time at 23.5 fps with a small memory footprint of 4,884 KB.
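The contrast the abstract draws between concatenation and Kronecker-product fusion can be illustrated with a minimal NumPy sketch. The feature dimensions below are hypothetical (the abstract does not specify them); the point is that concatenation leaves a low-dimensional head-pose vector dominated by a larger face feature, while the Kronecker product couples every head-pose component with every face component:

```python
import numpy as np

# Hypothetical dimensions (not given in the abstract): a 3-D head-pose
# vector (yaw, pitch, roll) and a 16-D encoded face feature.
rng = np.random.default_rng(0)
head_pose = rng.standard_normal(3)   # head-pose attribute vector
face_feat = rng.standard_normal(16)  # face-modality feature vector

# Concatenation: a 3 + 16 = 19-D vector in which the 3 head-pose entries
# are easily overwhelmed by the 16 face entries -- the "unbalanced
# dimension" issue the abstract attributes to concat-based fusion.
concat_fused = np.concatenate([head_pose, face_feat])

# Kronecker (outer) product fusion: a 3 * 16 = 48-D bilinear feature in
# which each head-pose component multiplies each face component, so the
# head-pose information is distributed across the whole fused vector.
kron_fused = np.kron(head_pose, face_feat)

print(concat_fused.shape)  # (19,)
print(kron_fused.shape)    # (48,)
```

This is only a sketch of the fusion arithmetic, not the HP-Net architecture itself, which the abstract describes as a dual-channel network built around this mechanism.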
