Abstract
The idea of random weights in neural networks has existed for a long time and has attracted wide attention because it learns much faster than tuning-based methods. Recently, the extreme learning machine (ELM), a non-tuning algorithm for single-hidden-layer feedforward neural networks (SLFNs), was proposed and applied in many applications. ELM can learn hundreds of times faster than traditional tuning-based methods while obtaining better generalization performance. However, because ELM assigns the input weights randomly, its stability and accuracy decline sharply on high-dimension, small-sample datasets. Although dimension reduction can alleviate this, such a two-stage method requires more physical space and increases the structural complexity of the model. In this paper, we propose a novel single-stage algorithm, the base projection vector machine (BPVM), and its kernel version, the kernel projection vector machine (KPVM), by combining dimension reduction and neural network training. BPVM and KPVM are jointly referred to as the projection vector machine (PVM). Unlike the random weights used in ELM, the input weights of the SLFN in PVM are obtained by singular value decomposition, and the hidden neurons are ranked by their contribution so that only the important ones are selected. This gives PVM better generalization ability and a more compact structure than ELM. Like ELM, PVM does not require any iterative steps and therefore trains very fast. Additionally, compared with two-stage algorithms such as BP/SVD (back-propagation combined with singular value decomposition) and ELM/SVD (ELM combined with singular value decomposition), PVM needs fewer parameters and less running memory. The experimental results on many datasets show that, although PVM is not faster than ELM, it is much faster than BP, BP/SVD, and ELM/SVD while producing the best generalization ability in most cases. In particular, PVM is well suited to high-dimension, small-sample datasets.
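The following is a minimal sketch of the PVM idea as described in the abstract, not the authors' implementation: the input weights come from the singular value decomposition of the training matrix, hidden neurons are ranked by singular value (their "contribution") and only the top ones are kept, and the output weights are then solved in closed form as in ELM. The activation function, selection rule, and absence of biases are illustrative assumptions.

```python
import numpy as np

def pvm_train(X, T, k):
    """X: (n_samples, n_features) inputs, T: (n_samples, n_outputs) targets,
    k: number of hidden neurons to keep (ranked by singular value)."""
    # SVD of the input data: the right singular vectors serve as input weights.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    W = Vt[:k].T                      # keep the k most important projection directions
    H = np.tanh(X @ W)                # hidden-layer output (tanh chosen as an example)
    beta = np.linalg.pinv(H) @ T      # output weights via Moore-Penrose pseudoinverse, as in ELM
    return W, beta

def pvm_predict(X, W, beta):
    return np.tanh(X @ W) @ beta

# Toy usage on a high-dimension, small-sample problem
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))    # 50 samples, 200 features
T = rng.standard_normal((50, 1))
W, beta = pvm_train(X, T, k=10)
print(pvm_predict(X, W, beta).shape)  # (50, 1)
```

In contrast to ELM, which would draw `W` at random, here the projection and the network training share a single SVD step, which is what makes the method single-stage.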