Abstract

This paper describes three different techniques for fitting straight lines to experimental data and discusses the corresponding evaluation of uncertainty. The techniques are (i) traditional fitting by least-squares, (ii) a Bayesian linear-regression analysis and (iii) an analysis according to the propagation of probability density functions attributed to the points measured. The material is presented to clarify assumptions underlying the techniques, to highlight differences between the techniques and to point to difficulties associated with applying the techniques under current views of ‘uncertainty analysis’. Considerable attention is given to the estimation of values of the function and not just to the estimation of parameters of the function. The paper gives a summary of many results of least-squares fitting, including some unfamiliar results for the simultaneous estimation of the unknown function at all points. On many occasions the unknown function will only be approximately linear, in which case we must define a unique unknown gradient to give proper meaning to our ‘estimate’ of slope. This can be achieved by defining an interval of interest and then applying a least-squares-type result.
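As a concrete illustration of technique (i), the traditional least-squares fit of a straight line can be sketched in a few lines of plain Python. This is a generic textbook implementation, not taken from the paper itself: it estimates slope and intercept by ordinary least squares and evaluates their standard uncertainties from the residual variance, assuming equal, uncorrelated errors in the ordinates only.

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares straight-line fit y = a + b*x.

    Returns (a, b, u_a, u_b): intercept, slope and their standard
    uncertainties, computed from the residual variance with n - 2
    degrees of freedom. Assumes equal-variance, uncorrelated errors
    in ys and exactly known xs (a generic sketch, not the paper's
    own derivation).
    """
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    delta = n * sxx - sx * sx           # determinant of the normal equations

    b = (n * sxy - sx * sy) / delta     # slope estimate
    a = (sxx * sy - sx * sxy) / delta   # intercept estimate

    # Residual variance with n - 2 degrees of freedom
    s2 = sum((y - a - b * x) ** 2 for x, y in zip(xs, ys)) / (n - 2)

    u_b = math.sqrt(n * s2 / delta)     # standard uncertainty of slope
    u_a = math.sqrt(sxx * s2 / delta)   # standard uncertainty of intercept
    return a, b, u_a, u_b
```

For data lying exactly on a line the residuals vanish, so the fit recovers the line with zero evaluated uncertainty; for noisy data the uncertainties scale with the scatter about the fitted line.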
