Abstract
Least squares regression is a widely used technique for mathematical modeling. It is so common that one might think the algorithm could not be improved any further, but it can: by minimizing the squares of the differences between measured and predicted values not only in the vertical but also in the horizontal direction. We call this ‘multidirectional regression analysis’. This improvement to the least squares technique is applicable to all kinds of invertible model functions: linear, exponential, power, logistic, and many others. Especially for power functions, which are often used in the biomedical sciences, the conclusions drawn from the data can change dramatically. An important example shown here is the Body Mass Index. We can now explain why scholars used to find a quadratic relationship between the mass of people and their height, contrary to the scaling laws, and we show that the scaling laws are indeed respected when the improved fitting method is used. Probably the most important advantage of multidirectional regression is that the fitted model is invariant when the dependent and independent variables are switched; this was a serious and neglected problem with the ordinary least squares method. The examples were calculated with a specially developed software program called ‘Fitting KV dm, version 1.0’.
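To make the idea concrete, the following is a minimal sketch of one plausible reading of the ‘multidirectional’ objective: the usual squared vertical residuals y − f(x) are augmented with the squared horizontal residuals x − f⁻¹(y), which is why the model function must be invertible. The power model, the parameter names a and b, the synthetic mass–height data, and the use of scipy.optimize.minimize are illustrative assumptions only; this is not the authors’ ‘Fitting KV dm’ program.

```python
import numpy as np
from scipy.optimize import minimize

# Model and its inverse (the method requires an invertible model function).
def f(x, a, b):
    return a * x ** b                 # power model: y = a * x^b

def f_inv(y, a, b):
    return (y / a) ** (1.0 / b)       # inverse model: x = (y/a)^(1/b)

def multidirectional_sse(params, x, y):
    a, b = params
    vertical = y - f(x, a, b)         # residuals in the vertical direction
    horizontal = x - f_inv(y, a, b)   # residuals in the horizontal direction
    return np.sum(vertical ** 2) + np.sum(horizontal ** 2)

# Synthetic mass-vs-height data, roughly following mass ~ height^3.
rng = np.random.default_rng(0)
height = rng.uniform(1.5, 2.0, 50)
mass = 13.0 * height ** 3 + rng.normal(0.0, 3.0, 50)

# Bounds keep a and b positive so the inverse model stays defined.
result = minimize(multidirectional_sse, x0=[10.0, 2.0], args=(height, mass),
                  method="L-BFGS-B", bounds=[(1e-6, None), (0.1, None)])
a_hat, b_hat = result.x
print(f"fitted model: mass = {a_hat:.2f} * height^{b_hat:.2f}")
```

Because the horizontal term is built from the inverse model, swapping which variable is treated as dependent and which as independent leaves this symmetric objective, and hence the fitted curve, unchanged, which would give the invariance property claimed in the abstract under this reading.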