Abstract

Vehicle type recognition is an important application of object detection in intelligent transportation systems and remains an active research topic. Convolutional neural network (CNN) models have achieved impressive performance in several image processing tasks, including vehicle detection and type recognition. However, existing vehicle detection methods generally cannot identify the vehicle type, and the available vehicle type classification datasets lack attribute annotations such as location. We therefore propose an improved Faster R-CNN based method for vehicle detection and type recognition that consists of two parts: vehicle detection with type recognition, and automatic cross-dataset attribute annotation. The method first trains a detection and type recognition model on the BIT-Vehicle dataset, then transfers the model to the surveillance-nature portion of the CompCars dataset for labeling tests, thereby obtaining attribute labels indirectly. We compare the vehicle recognition performance of the detection model with ZF-Net, VGG-16, and ResNet-101 backbones on the BIT-Vehicle and CompCars datasets. The experimental results indicate that the proposed scheme not only achieves high recognition accuracy on the BIT-Vehicle dataset but also yields reliable labeling results on the surveillance-nature portion of the CompCars dataset.
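As an illustrative sketch only (not the authors' implementation), the transfer-and-auto-label workflow summarized above can be approximated with PyTorch/torchvision: fine-tune an off-the-shelf Faster R-CNN on the BIT-Vehicle vehicle types, then run the trained detector on CompCars surveillance images to harvest type and location pseudo-labels. The ResNet-50 FPN backbone, the 0.7 score threshold, and the helper names below are assumptions; the paper itself evaluates ZF-Net, VGG-16, and ResNet-101 backbones.

import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# BIT-Vehicle distinguishes six vehicle types (bus, microbus, minivan,
# sedan, SUV, truck); +1 for the background class required by Faster R-CNN.
NUM_CLASSES = 6 + 1

def build_detector():
    # Off-the-shelf torchvision Faster R-CNN (ResNet-50 FPN backbone,
    # assumed here in place of the paper's ZF-Net/VGG-16/ResNet-101),
    # with the box predictor replaced to output the BIT-Vehicle classes.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
    return model

@torch.no_grad()
def auto_label(model, images, score_thresh=0.7):
    # Run the BIT-Vehicle-trained detector on CompCars surveillance images
    # and keep confident detections as pseudo-annotations (type + location).
    model.eval()
    outputs = model(images)  # each output dict holds 'boxes', 'labels', 'scores'
    pseudo_labels = []
    for out in outputs:
        keep = out["scores"] >= score_thresh
        pseudo_labels.append({"boxes": out["boxes"][keep],
                              "labels": out["labels"][keep]})
    return pseudo_labels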
