Abstract

Intelligent transportation is an indispensable part of the smart city and a primary development direction of future transportation systems. Vehicle detection and recognition, one of the most important aspects of intelligent transportation, plays a significant role in many areas of daily life, such as criminal investigation. In fine-grained vehicle type detection and recognition, several difficult issues remain unsolved: the cost of data acquisition and tagging, the dramatic variance in data across different vehicle types, and the difficulty of distinguishing vehicles of the same brand with highly similar appearances. To address data acquisition and tagging, this paper presents a strategy for automatic data acquisition and tagging based on object detection that labels vehicle images efficiently while rapidly acquiring all types of fine-grained vehicle models. To handle data imbalance during training, this paper proposes a Faster-RCNN based data equalizing strategy (Faster-BRCNN), thereby improving object detection performance. In view of the severe information attenuation caused by the obstruction of feature information transfer between layers in traditional deep learning networks, the lack of mutual dependency among these features, and the inability of the network to focus on important regions and characteristics, we propose a dense attention network (DA-Net); its dense connections and attention units enhance the model's detection capability. The proposed method achieves mAP values of 94.5% and 95.8% on the Stanford Cars and FZU Cars datasets, respectively, verifying its effectiveness.
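
The abstract attributes two mechanisms to DA-Net: dense (intensive) connections that limit information attenuation between layers, and attention units that emphasize important regions and channels. The sketch below illustrates these two ideas only; the actual layer widths, growth rate, and attention design of DA-Net are not specified in the abstract, so every concrete choice here (squeeze-and-excitation style channel attention, four dense layers, growth rate 32) is an assumption for illustration, not the authors' implementation.

```python
# Minimal sketch (PyTorch) of a densely connected block followed by a
# channel-attention unit, assuming an SE-style attention design.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style attention unit (assumed design)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),             # squeeze: global spatial average
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),                        # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(x)                    # re-weight informative channels


class DenseAttentionBlock(nn.Module):
    """Dense block: each layer receives the concatenation of all earlier
    feature maps, reducing information attenuation across layers; an
    attention unit then emphasizes the important channels."""
    def __init__(self, in_channels: int, growth_rate: int = 32, num_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1, bias=False),
            ))
            channels += growth_rate              # dense concatenation grows the width
        self.attention = ChannelAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return self.attention(torch.cat(features, dim=1))


if __name__ == "__main__":
    block = DenseAttentionBlock(in_channels=64)
    out = block(torch.randn(1, 64, 56, 56))
    print(out.shape)  # torch.Size([1, 192, 56, 56]) with the assumed defaults
```

Such a block would sit inside the detection backbone (e.g., feeding a Faster-RCNN head), so that features passed to the region proposal and classification stages are both densely reused and attention-weighted.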
