Abstract

Fourier approximation and estimation of discriminant, regression, and density functions are considered. A preference order is established for the frequency weights in multiple Fourier expansions and the connection weights in single hidden-layer neural networks. These preferred weight vectors, called good weights (good lattice weights for estimation of periodic functions), are generalizations for arbitrary periods of the hyperbolic lattice points of Korobov (1959) and Hlawka (1962) associated with classes of smooth functions of period one in each variable. Although previous results on approximation and quadrature are affinely invariant to the scale of the underlying periods, some of our results deal with optimization over finite sets and strongly depend on the choice of scale. It is shown how to count and generate good lattice weights. Finite sample bounds on mean integrated squared error are calculated for ridge estimates of periodic pattern class densities. The bounds are combined with a table of cardinalities of good lattice weight sets to furnish classifier design with prescribed class density estimation errors. Applications are presented for neural networks and projection pursuit. A hyperbolic kernel gradient transform is developed which automatically determines the training weights (projection directions). Its sampling properties are discussed. Algorithms are presented for generating good weights for projection pursuit.
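The hyperbolic lattice points of Korobov and Hlawka referred to above are, in the standard formulation, the integer frequency vectors whose hyperbolic weight (the product of the coordinate magnitudes, with each magnitude floored at one) stays below a bound. As an illustrative sketch only, the following brute-force enumeration (the function name and interface are ours, not from the paper) generates such a set:

```python
from itertools import product

def hyperbolic_cross(d, N):
    """Enumerate frequency vectors k in Z^d with hyperbolic weight
    prod_j max(1, |k_j|) <= N (Korobov/Hlawka-style hyperbolic
    lattice points). Brute force over the cube [-N, N]^d; the
    product constraint discards most of the cube as N grows."""
    points = []
    for k in product(range(-N, N + 1), repeat=d):
        weight = 1
        for kj in k:
            weight *= max(1, abs(kj))
        if weight <= N:
            points.append(k)
    return points

# Example: the 2-D hyperbolic cross with bound N = 3
pts = hyperbolic_cross(2, 3)
```

The cardinality of this set grows like N (log N)^(d-1) rather than N^d, which is what makes such frequency sets attractive for approximating smooth periodic functions of many variables.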
