Abstract

The paper investigates the approximation error of two-layer feedforward Fourier Neural Networks (FNNs). Such networks are motivated by the approximation properties of Fourier series. Several implementations of FNNs have been proposed since the 1980s: by Gallant and White, Silvescu, Tan, Zuo and Cai, and Liu. The main focus of our work is Silvescu's FNN, because its activation function does not fit into the category of networks in which a linearly transformed input is passed through an activation function; networks of the latter type were extensively described by Hornik. For Silvescu's non-trivial FNN, the convergence rate is proven to be of order O(1/n). The paper further investigates the classes of functions approximated by Silvescu's FNN, which turn out to belong to the Schwartz space and the space of positive definite functions.

Highlights

  • Artificial neural networks have been widely used in machine learning and acquired their popularity during the 1990s

  • The study assesses the convergence rate of a two-layer neural network whose activations are products of cosine functions with different frequencies

  • The architecture was proposed by Silvescu in 1999 [1]. Since this idea is inspired by Fourier series and the Fourier transform, such networks are referred to as “Fourier Neural Networks” (FNNs)


Introduction

Artificial neural networks have been widely used in machine learning and acquired their popularity during the 1990s. Modern trends in deep neural networks have helped to achieve superior results in pattern recognition, text processing, and other fields of machine learning. This paper focuses on “shallow” two-layer neural nets, a classic case of multilayer perceptrons. The study assesses the convergence rate of a two-layer neural network whose activations are products of cosine functions with different frequencies. The architecture was proposed by Silvescu in 1999 [1]. Since this idea is inspired by Fourier series and the Fourier transform, such networks are referred to as “Fourier Neural Networks” (FNNs).
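
As a rough illustration of this architecture, the following is a minimal NumPy sketch of such a network, assuming each hidden unit computes a product over the input coordinates of cosines with unit-specific frequencies and phases, and the output is a linear combination of the hidden units. The function and parameter names are illustrative and are not taken from [1].

import numpy as np

def silvescu_fnn(x, W, theta, c):
    """Evaluate a two-layer FNN in the spirit of Silvescu's architecture
    (illustrative sketch, not the exact parameterization from [1]).

    x     : (d,)   input vector
    W     : (n, d) per-unit, per-coordinate frequencies
    theta : (n, d) per-unit, per-coordinate phases
    c     : (n,)   output weights
    """
    # Hidden unit i: prod_j cos(W[i, j] * x[j] + theta[i, j])
    hidden = np.prod(np.cos(W * x + theta), axis=1)  # shape (n,)
    # Output: linear combination of the n hidden units
    return c @ hidden

# Tiny usage example with random parameters (illustrative only).
rng = np.random.default_rng(0)
d, n = 3, 8  # input dimension, number of hidden units
x = rng.standard_normal(d)
W = rng.standard_normal((n, d))
theta = rng.uniform(0.0, 2.0 * np.pi, (n, d))
c = rng.standard_normal(n)
print(silvescu_fnn(x, W, theta, c))

The number of hidden units n is the quantity appearing in the O(1/n) convergence rate discussed above.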
