Abstract

The emerging automated driving technology poses a new challenge to driver–automation collaboration, which requires a mutual understanding between humans and machines through the identification of each other's intentions. In this study, oriented by human–machine mutual understanding, a driver steering intention prediction method is proposed to better understand the human driver's expectations during driver–vehicle interaction. The steering intention is predicted with a novel hybrid-learning-based time-series model built on deep learning networks. Two driving modes are studied: both-hand and single right-hand driving. Electromyography (EMG) signals from several upper-limb muscles are collected and used for the steering intention prediction. The relationship between the neuromuscular dynamics and the steering torque is analysed first. Then, the hybrid-learning-based model is developed to predict both continuous and discrete steering intentions. The two intention prediction networks share the same temporal pattern extraction layer, which is built with a bi-directional recurrent neural network and long short-term memory cells. The model's prediction performance is evaluated over varied history and prediction horizons to further explore its capability. The experimental data are collected from 21 participants with varied ages and driving experience. The results show that the proposed method achieves a prediction accuracy of around 95% for steering intention under both driving modes.
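
The abstract describes two prediction networks that share a bi-directional LSTM temporal pattern extraction layer, with one head for the continuous steering intention and one for the discrete intention. Below is a minimal PyTorch sketch of such an architecture, assuming eight EMG channels, three discrete intention classes, and illustrative layer sizes; these values and all names are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch of the shared-encoder, two-head architecture.
# Channel counts, hidden sizes, and class counts are assumed, not
# taken from the paper.
import torch
import torch.nn as nn

class SteeringIntentionNet(nn.Module):
    def __init__(self, n_emg_channels=8, hidden_size=64, n_classes=3):
        super().__init__()
        # Shared temporal pattern extraction layer: bi-directional LSTM
        self.encoder = nn.LSTM(
            input_size=n_emg_channels,
            hidden_size=hidden_size,
            num_layers=2,
            batch_first=True,
            bidirectional=True,
        )
        feat = hidden_size * 2  # forward + backward directions
        # Continuous head: regresses the steering signal (e.g. torque)
        self.regressor = nn.Linear(feat, 1)
        # Discrete head: classifies the intention (e.g. left/straight/right)
        self.classifier = nn.Linear(feat, n_classes)

    def forward(self, emg_window):
        # emg_window: (batch, history_horizon, n_emg_channels)
        features, _ = self.encoder(emg_window)
        last = features[:, -1, :]  # features at the final time step
        return self.regressor(last), self.classifier(last)

# Example: a batch of 4 EMG windows, 100 time steps, 8 channels
model = SteeringIntentionNet()
torque, intent_logits = model(torch.randn(4, 100, 8))
```

Varying the length of `emg_window` and the distance between the window's end and the prediction target corresponds to the history and prediction horizons evaluated in the study.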
