Abstract
We take on the challenge of classifying car photos, from the most general car type down to the precise make, model, and year of the vehicle in a given image. After analyzing pre-existing datasets, we find that the CompCars-SV dataset is a suitable starting point for our classification project. We demonstrate that convolutional neural networks can reach a classification accuracy of more than 90% on the most difficult task. However, because of a skewed split between training and test sets, this result does not reflect real-world performance. Using a machine-learning car detector, we automatically generate a tight vehicle bounding box for each image and release it for the full dataset together with the existing (but limited) type-level annotations. We design and implement several car classification algorithms for this dataset, two of which exploit the hierarchical nature of the car annotations. Our experiments show that fine-grained car-type classification now reaches an accuracy of 99.25%, which serves as a baseline benchmark for future research. By focusing on vehicle tasks, this work aims to draw attention to how little the vision community has studied them compared with other object categories. A key reason for the higher accuracy is the extraction of binary descriptor (BD) features using edge detection before training the CNN; this step reduces the size of the input data, so the network trains faster. The results show that the presented 31-layer architecture, composed of 2D convolutional, batch normalization, max-pooling, ReLU, fully connected, and softmax classifier layers, achieves the higher accuracy. Our findings also indicate that many relevant car-related problems have yet to be carefully examined. Car model classification, model verification, and attribute prediction are just a few examples of how the dataset might be put to use.
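The abstract names the pipeline only at a high level: an edge-detection-based binary descriptor extracted before training, followed by a 31-layer stack of 2D convolution, batch normalization, ReLU, max-pooling, fully connected, and softmax layers. The exact layer configuration is not given here, so the following is a minimal illustrative sketch, assuming PyTorch, OpenCV's Canny detector for the binary descriptor, and a reduced version of the described layer stack; all layer sizes and thresholds are placeholders, not the authors' settings.

```python
# Illustrative sketch only: binary edge descriptor (Canny) as preprocessing,
# then a small conv / batch-norm / ReLU / max-pool / FC / softmax stack that
# mirrors the layer types named in the abstract (not the paper's 31-layer net).
import cv2
import torch
import torch.nn as nn


def binary_descriptor(image_bgr, size=(224, 224)):
    """Hypothetical preprocessing: resize, grayscale, Canny edges -> binary map tensor."""
    gray = cv2.cvtColor(cv2.resize(image_bgr, size), cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)                 # 0/255 edge map; thresholds are placeholders
    return torch.from_numpy(edges / 255.0).float().unsqueeze(0)  # shape: 1 x H x W


class CarTypeCNN(nn.Module):
    """Reduced stand-in for the described conv/BN/ReLU/maxpool/FC/softmax architecture."""

    def __init__(self, num_classes):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 28 * 28, num_classes)  # assumes 224x224 input

    def forward(self, x):
        x = self.features(x)
        logits = self.classifier(torch.flatten(x, 1))
        # Softmax classifier layer as named in the abstract; for training with
        # nn.CrossEntropyLoss one would instead return the raw logits.
        return torch.softmax(logits, dim=1)
```

Because the descriptor is a single-channel binary map rather than a three-channel RGB image, the network's input is smaller, which is consistent with the abstract's claim that this preprocessing shortens training time.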