Abstract

In computer vision, object detection has long been considered one of the most challenging tasks because it requires both classifying and locating objects in the same scene. Many object detection approaches based on deep convolutional neural networks (DCNNs) have recently been proposed and have been demonstrated to achieve outstanding detection performance compared with other approaches. However, the supervised training of DCNNs mostly relies on gradient-based optimization, in which all hidden-layer parameters require multiple iterations to update, and it often suffers from problems such as local minima, intensive human intervention, and long training times. In this paper, we propose a new method called Faster-YOLO, which is able to perform real-time object detection. A joint network of a deep random kernel convolutional extreme learning machine (DRKCELM) and a double hidden layer extreme learning machine auto-encoder (DLELM-AE) is used as the feature extractor for object detection, integrating the advantages of ELM-LRF and ELM-AE. It takes raw images directly as input and is therefore suitable for different datasets. In addition, most connection weights are randomly generated, so few parameters need to be set and training is faster. Experimental results on the Pascal VOC dataset show that Faster-YOLO improves detection accuracy by 1.1 percentage points over the original YOLOv2 and achieves an average 2x speedup over YOLOv3.
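The abstract's speed claim rests on the extreme learning machine principle: hidden-layer weights are drawn at random and never trained, so only the output weights are solved in closed form rather than by gradient descent. The following is a minimal sketch of that principle in plain NumPy; it is not the paper's DRKCELM/DLELM-AE architecture, and the function names and toy data are illustrative assumptions only.

```python
import numpy as np

# Minimal extreme-learning-machine (ELM) sketch: random, untrained hidden
# weights plus a closed-form least-squares solve for the output weights.
# No gradient-based iteration is involved. This is NOT the paper's
# DRKCELM/DLELM-AE network, only the underlying ELM idea it builds on.

rng = np.random.default_rng(0)

def elm_fit(X, Y, n_hidden=64):
    """Draw random input weights W, b; solve output weights by least squares."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)          # random hidden features (never updated)
    beta = np.linalg.pinv(H) @ Y    # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: regress y = sum(x) from random 5-dimensional inputs.
X = rng.standard_normal((200, 5))
Y = X.sum(axis=1, keepdims=True)
W, b, beta = elm_fit(X, Y)
mse = float(np.mean((elm_predict(X, W, b, beta) - Y) ** 2))
print(mse)  # training error well below the target variance of 5
```

Because the only fitted quantity is `beta`, obtained from a single pseudoinverse, training cost is one linear solve rather than many epochs; this is the property the abstract invokes to explain Faster-YOLO's reduced parameter tuning and faster training.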
