Abstract
Classification is one of the major tasks in supervised machine learning and data science. K-Nearest Neighbour (KNN) and Decision Tree (DT) are among the most widely used classification techniques, commonly applied both as single models and within ensemble processes. KNN is known as a lazy learner because it does not build a decision boundary from the training data. DT, on the other hand, is a top-down recursive divide-and-conquer technique used for both classification and regression problems. DT has several advantages: for example, it requires little prior knowledge, and non-linear relationships among features do not affect tree performance. In this paper, we propose a new learning algorithm named KNNTree, a hybrid of the KNN and DT algorithms. The proposed model is essentially a decision tree whose leaf nodes are replaced by KNN classifiers. We have tested the proposed method against KNN and DT on 10 benchmark datasets taken from the UC Irvine Machine Learning Repository and found that it outperforms both KNN and DT classifiers.
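To make the hybrid structure concrete, below is a minimal sketch of how a tree-with-KNN-leaves model could be assembled, assuming scikit-learn's DecisionTreeClassifier and KNeighborsClassifier. The class name KNNTreeSketch, the depth cutoff, and the neighbour count are illustrative assumptions, not the settings or implementation used in the paper.

```python
# Minimal sketch: a shallow decision tree partitions the feature space, and
# each leaf holds a KNN classifier fitted on the training samples routed to it.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

class KNNTreeSketch:
    def __init__(self, max_depth=3, n_neighbors=5):
        self.max_depth = max_depth          # illustrative depth cutoff
        self.n_neighbors = n_neighbors      # illustrative neighbour count

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        # Grow a shallow tree; its leaves define the partition of the space.
        self.tree_ = DecisionTreeClassifier(max_depth=self.max_depth).fit(X, y)
        leaf_ids = self.tree_.apply(X)
        self.leaf_models_ = {}
        for leaf in np.unique(leaf_ids):
            mask = leaf_ids == leaf
            # Cap k at the number of training samples that fell into this leaf.
            k = min(self.n_neighbors, int(mask.sum()))
            self.leaf_models_[leaf] = KNeighborsClassifier(n_neighbors=k).fit(X[mask], y[mask])
        return self

    def predict(self, X):
        X = np.asarray(X)
        # Every leaf of a fitted tree contains at least one training sample,
        # so each test sample is routed to a leaf with a fitted KNN model.
        leaf_ids = self.tree_.apply(X)
        preds = np.empty(X.shape[0], dtype=self.tree_.classes_.dtype)
        for leaf in np.unique(leaf_ids):
            mask = leaf_ids == leaf
            preds[mask] = self.leaf_models_[leaf].predict(X[mask])
        return preds

# Example usage on a toy split (Iris is used here only for illustration).
if __name__ == "__main__":
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = KNNTreeSketch().fit(X_tr, y_tr)
    print("accuracy:", (model.predict(X_te) == y_te).mean())
```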