Abstract

In this paper, we propose a new regularization approach for Extreme Learning Machine (ELM)-based training of single-hidden-layer feedforward neural networks. We show that the proposed regularizer weights the dimensions of the ELM space according to the importance of the network's hidden-layer weights, without imposing additional computational or memory costs on the learning process. This enhances the network's performance and makes the proposed approach suitable for learning non-linear decision surfaces in large-scale classification problems. We evaluate our approach on medium- and large-scale face recognition problems, where it outperforms the existing regularized Extreme Learning Machine classifier in both constrained and unconstrained settings, making it applicable to demanding media analysis applications such as those arising in digital cinema production.
