Abstract
Attribute reduction has become an essential challenge in pattern recognition, data mining, and knowledge discovery. As a good indicator of the correlation between variables, information entropy has been widely used as a measure in many attribute reduction algorithms. However, the information used in its calculation comes only from the lower approximation, while other information generated during the computation is usually ignored. Moreover, because traditional information entropy must be computed over all objects, attribute reduction is time-consuming and may cause overfitting, so it is urgent to improve its computational efficiency and avoid overfitting. Therefore, a new information entropy, local information entropy, is defined. Experiments show that local information entropy further improves the computational efficiency of attribute reduction without significantly decreasing classification accuracy.
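The abstract does not give the formal definition of local information entropy, but the contrast it draws can be sketched with the standard Shannon entropy of an attribute-induced partition: the classical measure iterates over every object in the universe, whereas a local variant restricts the calculation to a subset of objects (here, hypothetically, one decision class). The decision table, attribute names, and the choice of subset below are illustrative assumptions, not the paper's actual construction.

```python
from collections import Counter
from math import log2

def entropy(objects, attrs, data):
    """Shannon entropy of the partition induced by `attrs` on `objects`.

    data: dict mapping object id -> dict of attribute values.
    """
    if not objects:
        return 0.0
    # Group objects into equivalence classes by their values on `attrs`.
    classes = Counter(tuple(data[x][a] for a in attrs) for x in objects)
    n = len(objects)
    return -sum((c / n) * log2(c / n) for c in classes.values())

# Toy decision table (assumed): condition attributes a, b and decision d.
data = {
    1: {"a": 0, "b": 0, "d": "yes"},
    2: {"a": 0, "b": 1, "d": "yes"},
    3: {"a": 1, "b": 0, "d": "no"},
    4: {"a": 1, "b": 1, "d": "no"},
    5: {"a": 0, "b": 0, "d": "yes"},
    6: {"a": 1, "b": 1, "d": "no"},
}

universe = list(data)
# Classical entropy scans every object in the universe ...
full = entropy(universe, ["a", "b"], data)
# ... while a local variant evaluates only a subset of objects,
# e.g. one decision class, shrinking the cost of each evaluation.
local = entropy([x for x in universe if data[x]["d"] == "yes"], ["a", "b"], data)
```

In a greedy attribute reduction loop this entropy is re-evaluated for every candidate attribute at every step, so shrinking the object set from the whole universe to a local subset directly reduces the dominant cost of the algorithm.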