Abstract

Given a set of noisy data values from a polynomial, determining the degree and coefficients of that polynomial is the problem of polynomial regression. Polynomial regressions are very common in engineering, science, and other disciplines, and they lie at the heart of data science. Linear regression and the least-squares method have been around for two hundred years. Existing techniques select a model, which includes both the degree and the coefficients of a polynomial, from a set of candidate models that have already been fitted to the data. The philosophy behind the proposed method is fundamentally different from what has been practised over the last two hundred years. In the first stage, only the degree of a polynomial to represent the noisy data is selected, without any knowledge of or reference to its coefficient values. The polynomial coefficients are then estimated in the second stage. The development of the first stage has been inspired by the very recent result that all polynomials of degree q give rise to the same set of known time-series coefficients of autoregressive models and a constant term μ. Computer experiments have been carried out with simulated noisy data from polynomials, using four well-known model selection criteria as well as the proposed method (PTS1). The results obtained from the proposed method for degree selection and prediction are significantly better than those from the existing methods. It is also experimentally observed that the root-mean-square (RMS) prediction errors, and the variation of the RMS prediction errors, from the proposed method appear to scale linearly with the standard deviation of the noise for each polynomial degree.

Highlights

  • Polynomial regression aims to select a polynomial that passes near a collection of noisy data values from a polynomial

  • Results are reported from computer experiments assessing the performance of four existing techniques (AIC, AICc, Generalised Information Criterion (GIC), and Bayesian Information Criterion (BIC)) as well as the proposed method (PTS1) for selecting polynomial models and predicting noisy polynomial data

  • Quadratic, cubic, and quartic polynomials with different amounts of Gaussian noise have been considered. It is clear from equations (4), (13), (14), and (15) that the Akaike Information Criterion (AIC), AICc, GIC, and BIC calculate log-likelihoods that require knowledge of the standard deviation of the noise, which is not available in real situations

Introduction

Polynomial regression aims to select a polynomial that passes near a collection of noisy data values from a polynomial. Polynomial regressions are very common in engineering, science, and other disciplines, and this is one of the important problems of data science. Polynomial regression models are generally fitted with the least-squares method to obtain estimated values of the polynomial coefficients. In 1815, Gergonne wrote a paper on ‘‘The application of the method of least squares to the interpolation of sequences’’ [3]; this title is from the English translation by St. John and Stigler [4] of the original paper, which was written in French. A few more recent and interesting diverse applications can be found in
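The conventional workflow described above — fit each candidate degree by least squares, then choose among the fitted models — can be sketched as follows. This is a minimal illustration, not the paper's experimental setup: the data, the degree range, and the use of a residual-based BIC (which avoids assuming the noise standard deviation is known) are all assumptions made here for the example.

```python
import numpy as np

# Illustrative noisy data from a cubic polynomial (an assumption for
# this sketch, not the paper's simulated data).
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = 1.0 - 2.0 * x + 3.0 * x**3 + rng.normal(scale=0.05, size=x.size)

def bic_for_degree(x, y, q):
    """Least-squares fit of degree q; BIC computed from the residual
    sum of squares, so the noise variance need not be known."""
    coeffs = np.polyfit(x, y, q)                 # least-squares fit
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    n, k = y.size, q + 1                         # k = number of coefficients
    return n * np.log(rss / n) + k * np.log(n)

# Fit every candidate degree, then select the one minimising BIC.
degrees = range(1, 7)
best = min(degrees, key=lambda q: bic_for_degree(x, y, q))
print("selected degree:", best)
```

Note that model selection happens only after all candidate models have been fully fitted — precisely the point of contrast with the two-stage method proposed in this paper, where the degree is chosen before any coefficients are estimated.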

