Abstract

Driver identification has become a popular topic in driving behavior analysis, with a broad range of applications in anti-theft systems, driving style recognition, insurance strategy, and fleet management. However, most studies to date have investigated driver identification without a robust verification stage. This paper addresses driver identification and verification through a deep learning (DL) approach using psychological behavioral data, i.e., vehicle control operation data and eye movement data collected from a driving simulator and an eye tracker, respectively. We design an architecture that analyzes three-second segmentation windows of these data to capture unique driving characteristics and differentiate drivers on that basis. The proposed model combines a fully convolutional network (FCN) with a squeeze-and-excitation (SE) block. Experimental results were obtained from 24 human participants driving in 12 different scenarios. The proposed driver identification system achieves an accuracy of 99.60% across 15 drivers. To tackle driver verification, we combine the proposed architecture with a Siamese neural network and map all behavioral data into two embedding layers for similarity computation. The verification system achieves an average precision of 96.91%, recall of 95.80%, F1 score of 96.29%, and accuracy of 96.39%. Importantly, we extend the verification system to imposter detection and achieve an average verification accuracy of 90.91%. These results suggest that human-factor characteristics are stable and discriminative compared with traditional data sources, providing a strong foundation for driving behavior authentication systems.
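To make the described pipeline concrete, the sketch below shows one plausible way to realize an FCN with SE blocks over three-second behavioral windows, producing both class logits for identification and an embedding that a Siamese comparison could consume for verification. The abstract does not specify layer sizes, kernel widths, channel counts, or sampling rates, so the concrete figures here (10 input channels, 60 Hz sampling, filter counts 128/256/128, reduction ratio 16) are illustrative assumptions rather than the authors' configuration.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation: re-weights feature channels using global temporal context."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                     # x: (batch, channels, time)
        w = self.fc(x.mean(dim=2))            # squeeze over time, excite per channel
        return x * w.unsqueeze(-1)            # rescale each channel

class FCNSE(nn.Module):
    """1-D fully convolutional network with SE blocks. Returns class logits
    (identification) and an embedding (verification via a Siamese distance)."""
    def __init__(self, in_channels, n_drivers, embed_dim=128):
        super().__init__()
        def block(c_in, c_out, k):
            return nn.Sequential(
                nn.Conv1d(c_in, c_out, k, padding=k // 2),
                nn.BatchNorm1d(c_out),
                nn.ReLU(inplace=True),
                SEBlock(c_out),
            )
        self.features = nn.Sequential(
            block(in_channels, 128, 8),
            block(128, 256, 5),
            block(256, embed_dim, 3),
        )
        self.classifier = nn.Linear(embed_dim, n_drivers)

    def forward(self, x):                     # x: (batch, in_channels, time)
        z = self.features(x).mean(dim=2)      # global average pooling -> embedding
        return self.classifier(z), z

# Hypothetical shapes: 10 behavioral channels sampled at 60 Hz over a 3 s window (180 steps).
model = FCNSE(in_channels=10, n_drivers=15)
logits, emb = model(torch.randn(32, 10, 180))
```

For verification, two windows would be passed through the same network and their embeddings compared, for example with a distance-based (contrastive-style) criterion; this mirrors the Siamese similarity computation the abstract describes, but the exact loss and threshold are not given there.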
