Abstract

Label distribution learning (LDL) has proven effective in many machine learning applications. Previous LDL methods have focused on learning a non-linear conditional probability mass function by maximizing entropy or minimizing the Kullback–Leibler (K-L) divergence. To make full use of the structural information among different classes, we adopt structured random forest (StructRF) regression, a technique previously applied to semantic image labeling and edge detection. StructRF is a general LDL model that treats the label distribution as an integral whole: at each split node of a random forest, all label distributions are mapped to a discrete space so that standard information gain measures can be evaluated, and at test time the predicted distribution is obtained directly, without computing the probability of each class individually. StructRF is shown to be fast to train and achieves higher accuracy and lower standard deviations across different measures. In addition, we propose an adaptive variable step method that speeds up training and significantly reduces the number of information gain computations; it is applicable to most decision-tree-based models.
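The abstract gives no implementation details, so the following Python listing is only an illustrative sketch of the two ideas it mentions: mapping label distributions to discrete codes at a split node so that standard information gain can be computed, and scanning candidate thresholds with an adaptive step instead of exhaustively. The mapping used here (random projection plus quantile binning) and the step-adaptation rule are hypothetical stand-ins, not the paper's actual procedures; all function names are invented for illustration.

import numpy as np

def entropy(codes):
    # Shannon entropy of a set of discrete codes
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def discretize_distributions(dists, n_codes=4, rng=None):
    # Hypothetical mapping of label distributions (rows of `dists`) to
    # discrete codes: project onto a random direction and quantile-bin
    # the projections. The paper's exact mapping may differ.
    rng = np.random.default_rng() if rng is None else rng
    direction = rng.normal(size=dists.shape[1])
    proj = dists @ direction
    edges = np.quantile(proj, np.linspace(0, 1, n_codes + 1)[1:-1])
    return np.digitize(proj, edges)

def information_gain(codes, left_mask):
    # Standard information gain of a binary split over discrete codes
    n = len(codes)
    left, right = codes[left_mask], codes[~left_mask]
    if len(left) == 0 or len(right) == 0:
        return 0.0
    return entropy(codes) - len(left) / n * entropy(left) - len(right) / n * entropy(right)

def best_split_adaptive(feature, codes, init_step=8):
    # Illustrative adaptive variable step search over candidate thresholds:
    # the step grows while the gain barely changes and shrinks near sharp
    # changes, so far fewer gain evaluations are needed than a full scan.
    order = np.argsort(feature)
    values, sorted_codes = feature[order], codes[order]
    n = len(values)
    best_gain, best_thr = 0.0, None
    i, step, prev_gain = 1, init_step, 0.0
    while i < n:
        thr = values[i]
        gain = information_gain(sorted_codes, values < thr)
        if gain > best_gain:
            best_gain, best_thr = gain, thr
        step = min(step * 2, max(n // 4, 1)) if abs(gain - prev_gain) < 1e-3 else max(step // 2, 1)
        prev_gain = gain
        i += step
    return best_thr, best_gain

# Toy usage: 200 samples with 5-class label distributions and one scalar feature
rng = np.random.default_rng(0)
dists = rng.dirichlet(np.ones(5), size=200)
feature = rng.normal(size=200)
codes = discretize_distributions(dists, n_codes=4, rng=rng)
print(best_split_adaptive(feature, codes))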
