Abstract

The Sensitivity-Based Linear Learning Method (SBLLM) is a learning method for two-layer feedforward neural networks, based on sensitivity analysis, that computes the weights by solving a linear system of equations. This yields a considerable saving in computational time, which significantly improves the performance of the method compared with other learning algorithms. In this paper, a generalization of the SBLLM that includes a regularization term in the cost function is presented. The regularization parameter is estimated by means of an automatic technique. The theoretical basis for the method is given, and its performance is illustrated by comparing the results obtained with the automatic technique against those obtained manually by cross-validation.
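
To make the idea of "weights obtained by solving a regularized linear system" concrete, the following is a minimal sketch of a ridge-style closed-form solve for a single linear layer. It is only an illustration of the general technique suggested by the abstract, not the paper's actual SBLLM formulation; the names (X, d, lam), the single-layer setup, and the use of numpy.linalg.solve are assumptions.

    # Illustrative sketch: solving a regularized linear system for the weights
    # of one layer, in the spirit of the abstract. Not the paper's SBLLM.
    import numpy as np

    def regularized_linear_weights(X, d, lam=1e-2):
        """Solve (X^T X + lam * I) W = X^T d for the weight matrix W.

        X   : (n_samples, n_inputs) matrix of layer inputs.
        d   : (n_samples, n_outputs) desired (linearized) targets.
        lam : regularization parameter; the paper estimates it automatically,
              here it is simply a fixed value for illustration.
        """
        n_inputs = X.shape[1]
        A = X.T @ X + lam * np.eye(n_inputs)   # regularized normal equations
        b = X.T @ d
        return np.linalg.solve(A, b)           # weights in closed form

    # Usage with random data standing in for layer inputs and targets.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    d = rng.standard_normal((100, 1))
    W = regularized_linear_weights(X, d, lam=0.1)
    print(W.shape)  # (5, 1)

Because the system is linear, the weights are obtained in one solve rather than by iterative gradient descent, which is the source of the computational saving the abstract refers to.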
