Abstract
A classification model is developed that consists of a motion detector, an object tracker, a convolutional sparse-coded feature extractor, and a stacked information-extreme classifier. The motion detector is built on the difference of consecutive aligned frames, where alignment is performed via keypoint matching, homography estimation, and projective transformation. The motion detector simplifies the object classification task by reducing input data variation, and it saves resources because the motion-region search model is synthesized without training. The proposed model has low computational complexity and can be used as a dataset-labeling tool for a deep moveable-object detector. Furthermore, a training method for the moving-object detector is developed. The method consists of unsupervised pretraining of the feature extractor based on sparse coding neural gas, followed by supervised pretraining and fine-tuning of the stacked information-extreme classifier. The soft-competitive learning scheme of sparse coding neural gas facilitates robust convergence to a near-optimal distribution of the neurons over the data, and it reduces the required volume of labeled observations and computational resources. As the criterion of the effectiveness of the classifier's machine training, a normalized modification of S. Kullback's information measure is considered. Labeling of newly arriving data through self-labeling for high-prediction-score cases and manual labeling for low-prediction-score cases, followed by tracking of the labeled objects, is also proposed. Class balancing is performed by undersampling within the dichotomous "one-against-all" strategy. The set of classes includes bicycle, bus, car, motorcycle, pickup truck, articulated truck, and background. Simulation results on the MIO-TCD dataset confirm the suitability of the proposed model and training method for practical use.
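The frame-alignment motion detector summarized above (keypoint matching, homography estimation, projective transformation, then frame differencing) can be illustrated with the minimal OpenCV sketch below. It is a sketch under assumptions, not the authors' implementation: the function name motion_mask, the choice of ORB keypoints, and all thresholds are illustrative.

```python
# Hedged sketch of a frame-alignment motion detector: match keypoints between
# consecutive frames, estimate a homography with RANSAC, warp the previous
# frame into the current frame's coordinates, and threshold the difference.
# ORB, the thresholds, and the function name are assumptions for illustration.
import cv2
import numpy as np

def motion_mask(prev_gray: np.ndarray, curr_gray: np.ndarray,
                diff_thresh: int = 25) -> np.ndarray:
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return np.zeros_like(curr_gray)

    # Brute-force Hamming matching suits binary ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < 4:  # a homography needs at least 4 correspondences
        return np.zeros_like(curr_gray)

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return np.zeros_like(curr_gray)

    # Projective transformation: align the previous frame to the current one
    # so that camera motion cancels out in the difference image.
    h, w = curr_gray.shape
    aligned_prev = cv2.warpPerspective(prev_gray, H, (w, h))

    diff = cv2.absdiff(curr_gray, aligned_prev)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    # Morphological closing merges fragmented motion blobs into regions.
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
```

Connected components of the returned mask can then serve as candidate motion regions for the tracker and classifier; because the pipeline requires no training, it matches the abstract's claim of model synthesis without training.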
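For readers unfamiliar with the training criterion, a normalized Kullback-type measure commonly used in the information-extreme learning literature takes the form below; the paper's exact modification may differ, and the symbols here are assumptions for illustration.

$$
\bar{E} \;=\; \frac{1}{E_{\max}}\,\bigl[1-(\alpha+\beta)\bigr]\,
\log_2\!\frac{2-(\alpha+\beta)+10^{-r}}{(\alpha+\beta)+10^{-r}},
$$

where $\alpha$ and $\beta$ are the false-positive and false-negative rates of the decision rule, $10^{-r}$ is a small regularizer that keeps the logarithm finite, and $E_{\max}$ is the criterion value at $\alpha=\beta=0$, so that $\bar{E}\in[0,1]$ and training maximizes $\bar{E}$.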