Intelligent transportation and smart city applications are currently on the rise. In many of these applications, diverse and accurate sensor perception of vehicles is crucial. Relevant information can be conveniently acquired with traffic cameras, which are already abundant in cities. However, cameras must be calibrated before vehicle position data can be extracted from them. This paper proposes a novel automated calibration approach for partially connected vehicle environments. The approach utilises Global Navigation Satellite System (GNSS) positioning information shared by connected vehicles. Corresponding vehicle GNSS locations and image coordinates are used to fit a direct transformation between image and ground-plane coordinates. The proposed approach was validated with a research vehicle equipped with a Real-Time Kinematic (RTK)-corrected GNSS receiver driving past three different cameras. On average, the camera-based estimates contained errors ranging from 1.5 to 2.0 m when compared to the GNSS positions of the vehicle. Given that the observed road sections extend up to 140 m, the accuracy of the camera-based localisation should be adequate for a number of intelligent transportation applications. In future work, the calibration approach should be evaluated with fused stand-alone GNSS positioning and inertial measurements, to validate the calibration methodology with more common vehicle sensor equipment.
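The abstract states only that corresponding GNSS locations and image coordinates are used to fit a direct image-to-ground-plane transformation; it does not specify the transformation model. The sketch below is one plausible realisation, assuming a planar homography estimated with RANSAC via OpenCV and an equirectangular projection of GNSS latitude/longitude to local east/north metres. All function and variable names (e.g. `gnss_to_local_en`, `calibrate_camera_plane`) are hypothetical and for illustration only.

```python
# Minimal sketch, not the authors' implementation: fit an image-to-ground-plane
# homography from vehicle detections (pixels) and shared GNSS positions (degrees).
import numpy as np
import cv2


def gnss_to_local_en(lat_deg, lon_deg, lat0_deg, lon0_deg):
    """Project GNSS latitude/longitude to local east/north metres using an
    equirectangular approximation around the reference point (lat0, lon0)."""
    r_earth = 6378137.0  # WGS-84 equatorial radius [m]
    lat0 = np.radians(lat0_deg)
    east = np.radians(np.asarray(lon_deg) - lon0_deg) * r_earth * np.cos(lat0)
    north = np.radians(np.asarray(lat_deg) - lat0_deg) * r_earth
    return np.column_stack([east, north])


def calibrate_camera_plane(image_pts, gnss_latlon, origin_latlon):
    """Fit a planar homography mapping image coordinates to ground-plane metres
    from corresponding vehicle image detections and GNSS positions."""
    ground_pts = gnss_to_local_en(gnss_latlon[:, 0], gnss_latlon[:, 1],
                                  origin_latlon[0], origin_latlon[1])
    # RANSAC rejects correspondences corrupted by detection or GNSS noise;
    # the 1.0 m reprojection threshold is an assumed tuning value.
    H, inliers = cv2.findHomography(image_pts.astype(np.float64),
                                    ground_pts, cv2.RANSAC, 1.0)
    return H, inliers


def image_to_ground(H, image_pts):
    """Map pixel coordinates of detected vehicles to ground-plane metres."""
    pts = np.asarray(image_pts, dtype=np.float64).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)
```

Once the homography is estimated from the connected vehicle's drive-past, every subsequent detection in that camera can be mapped to ground-plane coordinates with `image_to_ground`, which is what enables the metre-level accuracy comparison reported in the abstract.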