Regression models play a pivotal role in real-life applications by enabling the analysis and prediction of continuous outcomes. Among these, the k-Nearest Neighbours (KNN) model stands out as a significant advancement in machine learning, and its ability to make predictions based on the proximity of data points has found wide-ranging applications in various fields. However, the traditional KNN regression model has limitations, including sensitivity to noise and an uneven distribution of neighbours. In response, this paper introduces a novel approach: an empirical likelihood ratio (ELR) based regression algorithm. The ELR technique offers distinct advantages over distance-based nearest neighbour computations, particularly in handling skewed data distributions and minimizing the impact of outliers. The proposed ELR-based KNN regression model is rigorously assessed through both simulation studies and real-life scenarios, and the results demonstrate the enhanced performance of the ELR-based approach over the conventional KNN model. This research contributes to a deeper understanding of regression techniques and underscores the practical significance of leveraging empirical likelihood ratios in refining predictive models for real-world applications.

KEYWORDS: Regression model, k-Nearest neighbours, Empirical likelihood ratio, Distance measures, Data distributions.
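As a point of reference, the conventional distance-based KNN regression that serves as the paper's baseline can be sketched as below; this is a minimal illustration in plain NumPy, and the function name, parameters, and toy data are illustrative rather than taken from the paper (the paper's ELR-based neighbour selection replaces the distance-based step, but its details are not given in this abstract).

```python
import numpy as np

def knn_regress(X_train, y_train, x_query, k=5):
    """Predict the response at x_query as the mean response of the k
    nearest training points under Euclidean distance (conventional KNN)."""
    # Euclidean distance from the query point to every training point
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k closest neighbours
    nn_idx = np.argsort(dists)[:k]
    # Unweighted average of their responses
    return y_train[nn_idx].mean()

# Toy usage on noisy linear data (hypothetical example)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 2.0 * X[:, 0] + rng.normal(0, 1, size=100)
print(knn_regress(X, y, np.array([5.0]), k=5))  # roughly 10
```

Because each prediction averages over neighbours chosen purely by distance, a few outlying or unevenly spread points can dominate the neighbourhood, which is the sensitivity the ELR-based approach is designed to mitigate.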