Abstract

Autonomous vehicles differ from many other autonomous systems in that they must share the environment with humans (e.g., pedestrians, cyclists, and other drivers in traffic). This requirement poses a challenging perception and planning problem. Our research focuses on the problem of a vehicle sharing the environment with a human passenger/driver and, specifically, on designing a controller that captures the natural tendencies of the human driver, so as to guarantee that the resulting control action is comparable to that of the human. In this paper, we propose to incorporate into the autonomous vehicle control framework a human driver control model that has previously been shown to predict short-term driver actions, and we develop an approach that reliably and accurately estimates the model's input-feature values from driver-point-of-view images. Once the input-feature values have been estimated, the human driver control model computes the corresponding steering-wheel angle. We thus provide more structure to the overall processing pipeline than recent end-to-end approaches. The proposed approach is validated using numerical simulations with synthetic images. The results validate the importance of combining traditional structured (e.g., transfer function) models with parsimonious neural-network representations.
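To illustrate the two-stage pipeline described above, the following is a minimal sketch, not the paper's implementation: it assumes a small convolutional network as the input-feature estimator and a simple discretized first-order transfer function as a stand-in for the human driver control model. The feature names (near- and far-point angles), the gains, and the time constant are hypothetical placeholders, not values identified in the paper.

```python
import torch
import torch.nn as nn


class FeatureEstimator(nn.Module):
    """Parsimonious CNN regressing visual input features from a driver-point-of-view image."""

    def __init__(self, n_features: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, n_features),
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.net(image)


class DriverSteeringModel:
    """Discretized first-order transfer-function stand-in for the driver control model.

    theta[k+1] = theta[k] + (dt / tau) * (k_near * f_near + k_far * f_far - theta[k])
    Gains and time constant are illustrative, not identified from driving data.
    """

    def __init__(self, k_near=0.8, k_far=2.5, tau=0.3, dt=0.05):
        self.k_near, self.k_far, self.tau, self.dt = k_near, k_far, tau, dt
        self.theta = 0.0  # current steering-wheel angle [rad]

    def step(self, f_near: float, f_far: float) -> float:
        target = self.k_near * f_near + self.k_far * f_far
        self.theta += (self.dt / self.tau) * (target - self.theta)
        return self.theta


if __name__ == "__main__":
    estimator = FeatureEstimator()       # in practice, trained on labeled synthetic images
    driver_model = DriverSteeringModel()

    image = torch.rand(1, 3, 120, 160)   # stand-in for a synthetic driver-point-of-view frame
    f_near, f_far = estimator(image).squeeze(0).tolist()
    steering_angle = driver_model.step(f_near, f_far)
    print(f"estimated features: ({f_near:.3f}, {f_far:.3f}), "
          f"steering-wheel angle: {steering_angle:.3f} rad")
```

The point of the structure is that the learned component is confined to feature estimation, while the mapping from features to steering-wheel angle remains an interpretable, structured model, in contrast to end-to-end approaches that learn the entire image-to-steering mapping.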
