Abstract

A new hybrid method is presented for designing feedforward, backpropagation neural models with small training data sets. The method minimizes the generalization error, a fundamental quantity that characterizes the effectiveness of regression models. It combines into one framework a bootstrap technique that estimates network generalization performance and a collection of stochastic and deterministic optimization techniques that adjust the neural network interconnection geometry. The approach is derived as a form of multi-objective optimization strategy, which allows more direct treatment of contradictory design criteria than traditionally employed single-objective techniques. A stochastic optimization method such as a genetic algorithm is used to select activation functions for hidden-layer nodes, whereas fast deterministic techniques, optimal brain surgeon and singular value decomposition, are used to perform connection and node pruning. The method is demonstrated by optimizing neural networks that model the high-lift aerodynamics of a multi-element airfoil. The neural model is constructed from a small computational data set of 227 points. In the numerical experiments presented, the solutions produced by this hybrid approach exhibit, on average, a five- to six-fold improvement in generalization ability compared to pruned models with only one type of activation function. When traditional fully connected networks with hyperbolic tangent activation functions are considered, the improvement in the generalization performance of the new models is even greater. The neural models exhibit superior generalization qualities that are virtually impossible to find by manual trial-and-error approaches.
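To illustrate the bootstrap component described above, the sketch below estimates a regression network's generalization error as the average out-of-bag error over bootstrap resamples, which is the kind of fitness signal such a hybrid optimizer can use. This is a minimal sketch under stated assumptions, not the authors' implementation: scikit-learn's MLPRegressor stands in for the paper's custom networks, and the function name, resample count, and network size are illustrative choices.

```python
# Bootstrap estimate of generalization error for a small-data regression
# network. Illustrative only: the paper's exact bootstrap variant, network
# topology, and airfoil data set are not reproduced here.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

def bootstrap_generalization_error(X, y, n_boot=50, seed=0):
    """Average out-of-bag MSE over n_boot bootstrap resamples."""
    rng = np.random.default_rng(seed)
    n = len(X)
    errors = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)        # resample with replacement
        oob = np.setdiff1d(np.arange(n), idx)   # held-out (out-of-bag) points
        if len(oob) == 0:
            continue
        # Stand-in network; the paper optimizes topology and activations.
        net = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                           max_iter=2000, random_state=0)
        net.fit(X[idx], y[idx])
        errors.append(mean_squared_error(y[oob], net.predict(X[oob])))
    return float(np.mean(errors))

# Hypothetical usage with a 227-point data set of the kind described:
# err = bootstrap_generalization_error(X_227, y_227)
```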
