Abstract

Let n measurements of values of a real function of one variable be given, where the measurements include random errors. For given integers m and σ, we consider the problem of making the least sum of squares change to the data such that the sequence of the divided differences of order m of the fit changes sign at most σ times. The main difficulty in these calculations is that there are about O(n^σ) combinations of positions of sign changes, so it is impracticable to test each one separately. Since this minimization calculation has local minima, a general optimization algorithm can stop at a local minimum that need not be a global one. It is an open question whether there is an efficient algorithm that can compute a global solution to this important problem for general m and σ. It has been proved that the calculations when m = 1, which gives a piecewise monotonic fit to the data, and m = 2, which gives a piecewise convex/concave fit to the data, reduce to separating the data into σ + 1 disjoint sets of adjacent data and solving a structured quadratic programming problem for each set. Separation allows the development of dynamic programming procedures that solve these particular problems in O(n² + σn log₂ n) and about O(σn³) computer operations, respectively. We present an example which shows that the minimization calculation when m ≥ 3 and σ ≥ 1 may not be decomposed into separate calculations on subranges of adjacent data such that the associated divided differences of order m are either non-negative or non-positive. Therefore, the example rules out the possibility of solving the general problem by a similar dynamic programming calculation.
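To make the constraint concrete, the following minimal Python sketch (not from the paper; the function names are our own) computes the sequence of divided differences of order m of a data set and counts its sign changes, which is the quantity the fit is required to keep at most σ:

```python
def divided_differences(x, y, m):
    """Return the sequence of divided differences of order m of the
    data y on the abscissae x, built by the standard recursion
    d[i] <- (d[i+1] - d[i]) / (x[i+k] - x[i]) for k = 1, ..., m."""
    d = [float(v) for v in y]
    for k in range(1, m + 1):
        d = [(d[i + 1] - d[i]) / (x[i + k] - x[i])
             for i in range(len(d) - 1)]
    return d

def sign_changes(seq):
    """Count the sign changes in seq, ignoring zero entries."""
    signs = [s for s in ((v > 0) - (v < 0) for v in seq) if s != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)
```

For example, data that rise and then fall, such as y = [0, 2, 3, 2, 0] on x = [0, 1, 2, 3, 4], have first divided differences [2, 1, -1, -2] with exactly one sign change, so they already satisfy the constraint for m = 1 and σ = 1 (a piecewise monotonic fit with one turning point); the optimization problem in the abstract seeks the smallest sum-of-squares perturbation that achieves such a bound when the raw data violate it.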
