Abstract
Neural networks (NNs) with random weights are an interesting alternative to conventional NNs, which are more commonly used for data modeling. The random vector functional-link (RVFL) network is an established and theoretically well-grounded randomized learner. A key theoretical result for RVFL networks is that they provide universal approximation of continuous maps, in expectation, with respect to the square-integral ($L_2$) norm. We specialize and modify this result to show that RVFL networks can provide functional approximations that converge in Kullback–Leibler divergence when the target function is a probability density function. Expanding on the approximation results, we demonstrate that RVFL networks lead to a simple randomized mixture model (MM) construction for density estimation from sample data. An expectation–maximization (EM) algorithm is derived for the maximum likelihood estimation of our randomized MM. The EM algorithm is proved to be globally convergent, and the maximum likelihood estimator is proved to be consistent. A set of simulation studies provides empirical evidence for our approximation and density estimation results.
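To make the construction concrete, below is a minimal sketch of the kind of randomized-mixture density estimator the abstract describes, under simplifying assumptions of our own: one-dimensional data, Gaussian components with randomly drawn centres and bandwidths standing in for the randomly parameterized RVFL basis functions, and EM updating only the mixing weights. With the components held fixed, the log-likelihood is concave in the mixing weights, which is consistent with the global-convergence claim; this is an illustrative sketch, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a bimodal sample whose density we want to estimate.
x = np.concatenate([rng.normal(-2.0, 0.5, 300), rng.normal(1.5, 1.0, 700)])

# Randomized components (hypothetical stand-ins for RVFL basis functions):
# centres and bandwidths are drawn at random and then held fixed.
K = 50
mu = rng.uniform(x.min(), x.max(), K)
sigma = rng.uniform(0.2, 1.5, K)

# Fixed component densities evaluated at the data: an (n, K) matrix.
F = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# EM over the mixing weights only.
pi = np.full(K, 1.0 / K)
for _ in range(200):
    resp = F * pi                          # E-step: unnormalized responsibilities
    resp /= resp.sum(axis=1, keepdims=True)
    pi = resp.mean(axis=0)                 # M-step: re-estimate mixing weights

def density(t):
    """Evaluate the fitted mixture density at points t."""
    comp = (np.exp(-0.5 * ((np.asarray(t)[:, None] - mu) / sigma) ** 2)
            / (sigma * np.sqrt(2 * np.pi)))
    return comp @ pi

print(density(np.array([-2.0, 0.0, 1.5])))
```

Because only the mixing weights are estimated, each EM iteration is a closed-form reweighting, which is what makes the randomized MM inexpensive relative to fitting all component parameters.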