Abstract

This paper describes an open-source implementation of an event-based dynamic and active pixel vision sensor (DAVIS) for racing a human against a computer on a slot car track. The DAVIS is mounted in an overhead ("eye-of-god") view. The DAVIS image frames are used only for setup and are subsequently turned off because they are not needed for tracking. The dynamic vision sensor (DVS) events are then used to track both the human- and computer-controlled cars. The precise control of throttle and braking afforded by the low latency of the sensor output enables consistent outperformance of human drivers at a laptop CPU load of <3% and an update rate of 666 Hz. The sparse output of the DVS event stream results in a data rate about 1000 times smaller than that of a frame-based camera with the same resolution and update rate. The scaled average lap speed of the 1/64-scale cars is about 450 km/h, roughly twice the fastest Formula 1 lap speed. A feedback-controller mode allows competitive racing by slowing the computer-controlled car when it is ahead of the human. In tests of human vs. computer racing, the computer still won more than 80% of the races.
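To make the described pipeline concrete, below is a minimal sketch, not the paper's actual open-source (jAER-based) code, of the two ideas the abstract names: tracking a car as a cluster of DVS events and a throttle law that brakes for curves and backs off when the computer car leads the human. The names `Event`, `CarTracker`, and `throttle_command`, and all constants, are illustrative assumptions.

```python
# Hypothetical sketch of event-cluster tracking and a "competitive" throttle law.
# Not the authors' implementation; parameter values are placeholders.
from dataclasses import dataclass

@dataclass
class Event:
    x: float      # pixel column of the DVS event
    y: float      # pixel row of the DVS event
    t_us: int     # timestamp in microseconds

class CarTracker:
    """Tracks one car as a cluster of DVS events via an exponential moving average."""
    def __init__(self, x0: float, y0: float, mix: float = 0.1):
        self.x, self.y = x0, y0
        self.mix = mix  # how strongly each new event pulls the cluster center

    def update(self, ev: Event, gate: float = 10.0) -> bool:
        """Fold the event into the cluster if it falls within the gate radius."""
        if abs(ev.x - self.x) > gate or abs(ev.y - self.y) > gate:
            return False
        self.x += self.mix * (ev.x - self.x)
        self.y += self.mix * (ev.y - self.y)
        return True

def throttle_command(curve_speed_limit: float, speed_est: float,
                     lead_over_human: float, competitive: bool) -> float:
    """Return throttle in [0, 1]: brake when too fast for the upcoming curve,
    and reduce throttle in proportion to the lead when racing competitively."""
    if speed_est > curve_speed_limit:
        return 0.0  # brake before the curve
    throttle = min(1.0, curve_speed_limit / max(speed_est, 1e-3))
    if competitive and lead_over_human > 0:
        throttle *= max(0.2, 1.0 - 0.05 * lead_over_human)  # slow down when ahead
    return throttle

if __name__ == "__main__":
    tracker = CarTracker(x0=64, y0=64)
    tracker.update(Event(x=66, y=63, t_us=1000))
    print(round(tracker.x, 2), round(tracker.y, 2))
    print(throttle_command(curve_speed_limit=2.0, speed_est=1.5,
                           lead_over_human=4.0, competitive=True))
```

Because each DVS event is processed independently at microsecond timestamps, an update loop of this form can easily run at the 666 Hz control rate mentioned in the abstract on a small fraction of a laptop CPU.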
