Abstract

The approximations obtained by gradient descent optimization on a set of datasets were compared with those obtained with the Levenberg-Marquardt Optimization Method (LMOM) on the same datasets. The datasets, comprising three orthogonal databases drawn from the databases accompanying MATLAB's Neural Network Toolbox, were normalized and serially loaded into the artificial neural network Graphical User Interface (GUI) designed by the researchers. The GUI, built with the Visual Studio Programming Language (VSPL), implements a gradient descent optimization scheme for the back-propagation algorithm. The characteristics of each database used to determine the termination criteria were approximated with a feature-extractive iteration algorithm developed for this purpose. Revalidation sessions of the LMOM on the sampled datasets produced significantly spurious outputs when compared with the gradient descent results, which, although slower to converge, could be closely replicated. Analysis of the F-statistics and the Receiver Operating Characteristics (ROC) of both methods' results on the sampled datasets also showed that the gradient descent method achieved significantly greater accuracy and parsimony in approximating the nonlinear solutions in the datasets than the LMOM. Additionally, an algorithm for deducing and producing the ROC of analyzed Artificial Neural Network (ANN) sessions was developed and implemented in VSPL as part of this research.
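The contrast between the two update rules compared in the abstract can be sketched on a toy nonlinear least-squares problem. The example below is illustrative only: the model (fitting y = exp(a·x)), the data, the learning rate, and the damping factor are assumptions for demonstration, not the paper's datasets or GUI implementation. Gradient descent steps along the negative gradient of the squared-error loss, while Levenberg-Marquardt solves a damped normal-equations system each iteration.

```python
import numpy as np

def residuals(a, x, y):
    # Residual vector of the toy model y_hat = exp(a * x)
    return np.exp(a * x) - y

def jacobian(a, x):
    # Derivative of each residual w.r.t. the single parameter a
    return (x * np.exp(a * x)).reshape(-1, 1)

def gradient_descent(x, y, a=0.0, lr=0.01, steps=5000):
    # Many small steps along -gradient of 0.5 * sum(residual^2)
    for _ in range(steps):
        r = residuals(a, x, y)
        grad = jacobian(a, x).T @ r
        a -= lr * float(grad[0])
    return a

def levenberg_marquardt(x, y, a=0.0, lam=1e-3, steps=50):
    # Few damped Gauss-Newton steps: (J^T J + lam*I) delta = -J^T r
    for _ in range(steps):
        r = residuals(a, x, y)
        J = jacobian(a, x)
        delta = np.linalg.solve(J.T @ J + lam * np.eye(1), -J.T @ r)
        a += float(delta[0])
    return a

x = np.linspace(0.0, 1.0, 20)
y = np.exp(0.7 * x)  # noise-free data with ground-truth a = 0.7
a_gd = gradient_descent(x, y)
a_lm = levenberg_marquardt(x, y)
```

On this noise-free toy problem both methods recover a ≈ 0.7; the point of the sketch is the structural difference — LMOM takes far fewer, more expensive iterations, which is consistent with the abstract's observation that gradient descent is slower to converge.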
