Abstract

We consider discrete linear Chebyshev approximation problems in which the unknown parameters of a linear function are fitted by minimizing the maximum absolute deviation of the errors. Such problems find application in the solution of overdetermined systems of linear equations that appear in many practical contexts. The least maximum absolute deviation estimator is used in regression analysis in statistics when the distribution of errors has bounded support. To derive a direct solution of the problem, we propose an algebraic approach based on a parameter elimination technique. As a key component of the approach, we prove an elimination lemma that reduces the problem to one with a single parameter eliminated, together with a box constraint imposed on this parameter. We demonstrate the application of the lemma to the direct solution of linear regression problems with one and two parameters. We develop a procedure to solve multidimensional approximation (multiple linear regression) problems in a finite number of steps. The procedure follows a method that comprises two phases: backward elimination and forward substitution of parameters. We describe the main components of the procedure and estimate its computational complexity. We implement symbolic computations in MATLAB to obtain exact solutions for two numerical examples.
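
For concreteness, here is a minimal numerical sketch of the problem the abstract describes: a least maximum absolute deviation fit for a small, hypothetical overdetermined system, obtained from the standard linear programming reformulation and solved with MATLAB's linprog (Optimization Toolbox). It is a baseline illustration only, not the algebraic elimination method developed in the paper.

    % Chebyshev fit min_x max_i |a_i'*x - b_i| for an overdetermined system
    % A*x = b, recast as a linear program (hypothetical data).
    A = [1 1; 1 2; 1 3; 1 4];        % design matrix: 4 equations, 2 unknowns
    b = [2; 3; 5; 6];                % right-hand side
    [m, n] = size(A);
    % Decision vector z = [x; t]: minimize t subject to -t <= A*x - b <= t
    f     = [zeros(n, 1); 1];
    Aineq = [ A, -ones(m, 1);
             -A, -ones(m, 1)];
    bineq = [ b; -b];
    z = linprog(f, Aineq, bineq);
    x = z(1:n);                      % Chebyshev (minimax) estimate
    t = z(end);                      % least maximum absolute deviation

Such iterative numerical solutions are the standard route; the paper instead seeks a direct, exact solution in closed form via parameter elimination.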

Highlights

  • Discrete linear Chebyshev approximation problems where the errors of fitting the unknown parameters are measured by the Chebyshev norm are of theoretical interest and practical importance in many areas of science and engineering

  • An important area of applications of the discrete linear Chebyshev approximation is the solution of overdetermined systems of linear equations [4,5,6] that appear in many practical contexts

  • We have shown that the problem under consideration can be reduced, by eliminating an unknown parameter, to a problem with fewer unknowns and a box constraint on the eliminated parameter (see the sketch below)
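
As a hedged illustration of this reduction (not the paper's elimination lemma itself), the sketch below eliminates the intercept of a simple two-parameter model y ≈ p + q·x: for any fixed slope q, the best intercept is the midrange of the residuals y − q·x, which leaves a reduced one-parameter problem in q; the intercept is then recovered by substitution. The data are hypothetical, and the candidate-slope enumeration assumes distinct abscissas.

    % Backward step: eliminate the intercept p; for fixed q the inner minimum
    % over p is attained at the midrange of y - q*x, leaving a reduced
    % one-parameter objective in q. Forward step: recover p by substitution.
    x = [1; 2; 3; 4; 5];                     % hypothetical abscissas (distinct)
    y = [1.2; 1.9; 3.2; 3.8; 5.1];           % hypothetical observations
    m = numel(x);
    % Reduced objective after eliminating p: half the range of y - q*x
    g = @(q) (max(y - q*x) - min(y - q*x)) / 2;
    % With distinct abscissas, an optimal slope lies among the pairwise slopes
    % (a consequence of the discrete equioscillation characterization)
    [I, J] = find(triu(true(m), 1));
    cand   = (y(I) - y(J)) ./ (x(I) - x(J));
    vals   = arrayfun(g, cand);
    [dev, k] = min(vals);                    % least maximum absolute deviation
    q = cand(k);                             % fitted slope
    p = (max(y - q*x) + min(y - q*x)) / 2;   % back-substituted intercept

The two steps mirror the backward elimination and forward substitution phases described in the paper, although the exact form of the reduced problem and of the box constraints there is derived algebraically rather than by enumeration.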

Summary

Introduction

Discrete linear Chebyshev (minimax) approximation problems, where the errors of fitting the unknown parameters are measured by the Chebyshev (max, infinity, uniform or L∞) norm, are of theoretical interest and practical importance in many areas of science and engineering. To solve the Chebyshev approximation problem, a number of approaches are known which apply various iterative computational procedures to find numerical solutions. Along with existing iterative algorithms that find use in applications, direct analytical solutions of the linear Chebyshev approximation problem are of interest as an essential instrument of formal analysis and treatment of the problem. We reshape and adjust previously developed algebraic techniques to derive a direct solution of the discrete linear Chebyshev approximation problem in terms of conventional algebra. To provide illustrative but not cumbersome examples of the application of the lemma, we derive direct solutions of problems of low dimension, formulated as linear regression problems with one and two parameters.
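
As the simplest illustration of such a direct solution, consider the classical one-parameter problem of fitting a constant theta to observations y_1, ..., y_m in the Chebyshev sense, min_theta max_i |y_i - theta|: the optimum is the midrange of the data and the optimal deviation is half the range. A short MATLAB check with hypothetical data (this classical special case is shown only for orientation and is not the paper's general one-parameter formula):

    % Direct Chebyshev fit of a constant: the midrange of the data
    y = [4; 7; 5; 9; 6];             % hypothetical observations
    theta = (max(y) + min(y)) / 2;   % Chebyshev (minimax) estimate
    dev   = (max(y) - min(y)) / 2;   % least maximum absolute deviation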

Linear Chebyshev Approximation Problem
Elimination Lemma
Solution of One- and Two-Parameter Regression Problems
One-Parameter Linear Regression Problems
Two-Parameter Linear Regression Problem
General Solution Procedure
Elimination of Parameters
Derivation of Box Constraints
Solution Algorithm
Computational Complexity
Software Implementation and Numerical Examples
Conclusions