Abstract

We present an extensive comparison of Hebbian and random input weights in Single-hidden Layer Feedforward Neural Networks (SLFNs), and we propose a novel fusion scheme that merges the Hebbian and random feature spaces; the suggested linear combination of the two feature spaces yields greater robustness. In this paper, we provide an experimental study of two unsupervised processes, namely random initialization and Hebbian learning, which can be used to determine the input weights of SLFNs. In addition, a fusion technique that combines the two feature spaces is proposed. Experiments are conducted on six publicly available facial image datasets. The results show that the proposed fusion technique can improve performance over the Hebbian and random feature spaces when the two achieve similar accuracy. In cases where the difference in performance between the two feature spaces is large, the proposed fusion scheme preserves the power of the more discriminative one and outperforms the average fused feature space. The results also show that there is a trade-off between the generalization ability of the Hebbian feature space and the low computational cost of the random one.
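The setup described above can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the paper's implementation: the exact Hebbian rule, activation function, fusion weight, and readout are assumptions. It shows random input weights, a simple normalized Hebbian update for the input weights, a linear combination of the two hidden feature spaces, and an ELM-style least-squares readout.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_input_weights(n_features, n_hidden, rng):
    # Random initialization: input weights drawn i.i.d. and never trained.
    return rng.standard_normal((n_features, n_hidden))

def hebbian_input_weights(X, n_hidden, lr=0.01, epochs=10, rng=None):
    # Plain Hebbian updates with column normalization (an assumed rule;
    # the abstract does not specify which Hebbian variant is used).
    W = 0.1 * rng.standard_normal((X.shape[1], n_hidden))
    for _ in range(epochs):
        for x in X:
            y = x @ W                      # hidden pre-activations
            W += lr * np.outer(x, y)       # Hebbian update: dW = lr * x * y
            W /= np.linalg.norm(W, axis=0, keepdims=True)
    return W

def hidden_features(X, W):
    # Nonlinear hidden feature space of the SLFN (tanh is an assumption).
    return np.tanh(X @ W)

def fuse(H_rand, H_hebb, alpha=0.5):
    # Linear combination of the two feature spaces, as the abstract
    # describes; alpha = 0.5 gives the average fused feature space.
    return alpha * H_rand + (1.0 - alpha) * H_hebb

# Toy two-class data standing in for the facial image datasets.
X = rng.standard_normal((40, 5))
T = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0).reshape(-1, 1)

W_r = random_input_weights(5, 20, rng)
W_h = hebbian_input_weights(X, 20, rng=rng)
H = fuse(hidden_features(X, W_r), hidden_features(X, W_h))

# ELM-style readout: output weights solved in closed form by least squares.
beta, *_ = np.linalg.lstsq(H, T, rcond=None)
train_acc = float(np.mean(np.sign(H @ beta) == T))
```

Because only the input weights differ between the two variants, fusing the feature spaces before the least-squares readout is cheap: it adds one weighted sum of two hidden-layer matrices.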
