Extreme learning machine (ELM) is a single-hidden-layer feed-forward neural network in which the input weights linking the input layer to the hidden layer are chosen randomly. The output weights linking the hidden layer to the output layer are determined analytically by solving a linear system of equations, which makes ELM one of the fastest learning algorithms; the Moore–Penrose (MP) generalized inverse is normally employed to obtain these output weights. Even though the random weights between the input and hidden layers need not be tuned, ELM provides good generalization performance with fast learning speed. In practice, however, data sets from real-world problems tend to make the linear system of ELM ill-conditioned owing to inconsistent noise levels in the input data, which leads to unreliable solutions and over-fitting. Regularization techniques address these issues in ELM, but they require estimating an additional quantity, the regularization parameter. Proper selection of this parameter is crucial, since it determines the quality of the solution obtained from the linear system. The most popular choice is Tikhonov regularization, which penalizes the ℓ2-norm of the model parameters; in ELM, this gives equal weight to the singular values of the hidden-layer matrix irrespective of the noise level present in the data. In the present work, a fractional framework is introduced into the Tikhonov-regularized ELM to weight the singular values through a fractional parameter and thereby reduce the effect of differing noise levels. An automated golden-section method is applied to choose the optimal fractional parameter, and the generalized cross-validation (GCV) method is used to obtain a suitable value of the regularization parameter. The proposed fractional Tikhonov-regularized ELM improves performance over conventional methods with respect to the considered performance measures, and the improvements obtained from the proposed fractional regularization are shown to be statistically significant.
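As a rough illustration of the pipeline described in the abstract, the sketch below implements a fractional Tikhonov-regularized ELM in NumPy: the output weights are computed through an SVD with the commonly used fractional filter factors σ^α / (σ^(α+1) + λ) (α = 1 recovering ordinary Tikhonov regularization), GCV selects λ over a grid, and a golden-section search selects α. The sigmoid activation, the hold-out objective driving the golden-section search, the parameter ranges, and all function names are illustrative assumptions and not the authors' implementation.

```python
# Minimal sketch of a fractional-Tikhonov-regularized ELM (assumptions as stated above).
import numpy as np

def elm_hidden(X, W, b):
    """Hidden-layer output H = sigmoid(X W + b) for randomly chosen W, b."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def frac_tikhonov_beta(H, T, lam, alpha):
    """Output weights via SVD with fractional Tikhonov filter factors
    sigma^alpha / (sigma^(alpha+1) + lam); alpha = 1 gives ordinary Tikhonov."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    filt = s**alpha / (s**(alpha + 1) + lam)            # filtered inverse singular values
    return Vt.T @ (filt[:, None] * (U.T @ T))

def gcv_lambda(H, T, alpha, lams):
    """Pick lam minimizing the GCV score ||H beta - T||^2 / (n - sum of filter factors)^2."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    UtT, n = U.T @ T, H.shape[0]
    best = None
    for lam in lams:
        f = s**(alpha + 1) / (s**(alpha + 1) + lam)     # filter factors of H @ beta
        resid = np.sum(((1.0 - f)[:, None] * UtT)**2) + (np.sum(T**2) - np.sum(UtT**2))
        score = resid / (n - np.sum(f))**2
        if best is None or score < best[0]:
            best = (score, lam)
    return best[1]

def golden_section_alpha(objective, lo=0.1, hi=2.0, tol=1e-2):
    """Golden-section search for the fractional parameter alpha (range is illustrative)."""
    invphi = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if objective(c) < objective(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

# --- illustrative usage on synthetic data ---
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
T = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(200, 1))  # noisy target
W = rng.normal(size=(5, 50))                             # random input weights
b = rng.normal(size=(1, 50))                             # random hidden biases
lams = np.logspace(-8, 2, 50)

# Hold-out split; using validation error as the golden-section objective is an assumption.
X_tr, X_va, T_tr, T_va = X[:150], X[150:], T[:150], T[150:]
H_tr, H_va = elm_hidden(X_tr, W, b), elm_hidden(X_va, W, b)

def val_error(alpha):
    lam = gcv_lambda(H_tr, T_tr, alpha, lams)
    beta = frac_tikhonov_beta(H_tr, T_tr, lam, alpha)
    return np.mean((H_va @ beta - T_va)**2)

alpha_star = golden_section_alpha(val_error)
lam_star = gcv_lambda(H_tr, T_tr, alpha_star, lams)
beta_star = frac_tikhonov_beta(H_tr, T_tr, lam_star, alpha_star)
print(f"alpha = {alpha_star:.3f}, lambda = {lam_star:.2e}")
```

The fractional exponent α reshapes how strongly small (noise-dominated) singular values are damped relative to large ones, which is the mechanism the abstract attributes to the proposed method; everything else in the sketch is standard ELM and GCV machinery.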