Abstract

In the present study, we investigate the universality of neural networks, which concerns the density of the set of two-layer neural networks in function spaces. Many existing works establish convergence only over compact sets. In the present paper, we consider global convergence by introducing a suitable norm, so that our results are uniform over every compact set.
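As an illustration of what "global" means here: one way to compare functions on all of ℝ, rather than on a fixed compact set, is a weighted sup-norm. The weight below is a hypothetical example chosen for intuition, not necessarily the norm introduced in the paper. Since a finite ReLU network grows at most linearly, a natural candidate is

    ‖f‖ = sup_{x ∈ ℝ} |f(x)| / (1 + |x|),

and density with respect to such a norm yields approximation that is uniform over every compact set at once.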

Highlights

  • A neural network is a function that models the neuron system of a biological brain and is defined as an alternating composition of affine maps and nonlinear maps

  • Neural networks have played a central role in machine learning over the last decade, with a vast number of real-world applications

  • We focus on a two-layer feed-forward neural network with ReLU activation, which is a function f : ℝ → ℝ of the form f(x) = ∑_{i=1}^r c_i ReLU(a_i x + b_i) for some a_1, b_1, c_1, …, a_r, b_r, c_r ∈ ℝ

Introduction

A neural network is a function that models the neuron system of a biological brain and is defined as an alternating composition of affine maps and nonlinear maps. The nonlinear map in a neural network is called the activation function. We focus on a two-layer feed-forward neural network with ReLU (rectified linear unit) activation, which is a function f : ℝ → ℝ of the form f(x) = ∑_{i=1}^r c_i ReLU(a_i x + b_i) for some a_1, b_1, c_1, …, a_r, b_r, c_r ∈ ℝ.
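To make this definition concrete, here is a minimal sketch in Python; the width r = 3 and the weights a_i, b_i, c_i below are arbitrary illustrative values, not parameters taken from the paper.

    import numpy as np

    def relu(t):
        """Rectified linear unit: max(t, 0), applied elementwise."""
        return np.maximum(t, 0.0)

    def two_layer_relu_network(x, a, b, c):
        """Evaluate f(x) = sum_{i=1}^r c_i * ReLU(a_i * x + b_i),
        where a, b, c are length-r arrays holding the hidden weights,
        biases, and output weights, respectively."""
        return np.sum(c * relu(a * x + b))

    # Illustrative instance with width r = 3 (arbitrary weights).
    a = np.array([1.0, -0.5, 2.0])
    b = np.array([0.0, 1.0, -1.0])
    c = np.array([0.7, -1.2, 0.4])
    print(two_layer_relu_network(0.5, a, b, c))  # ≈ -0.55

Varying r and the parameters a_i, b_i, c_i traces out the set of networks whose density in a function space is the universality question studied here.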

