Abstract

The probabilistic neural network (PNN) is an efficient classifier that can compute nonlinear decision boundaries. In this paper, the commonly used Gaussian probability density function is replaced by a new density function, yielding a new variant of the PNN. Since most higher-dimensional data are found, statistically, not to follow the normal distribution, we replace the Gaussian with the symmetric Laplace distribution. Further, the smoothing parameter of the proposed PNN model is estimated with three evolutionary algorithms, namely the bat algorithm (BA), the grey wolf optimizer (GWO), and the whale optimization algorithm (WOA), using a novel fitness function. The resulting PNN models, each with a different smoothing-parameter estimation method, are evaluated on five benchmark data sets. The performance of the three Laplace distribution-based PNN variants incorporating BA, GWO, and WOA is reported and compared with that of the Gaussian-based variants and of other commonly used classifiers, namely the conventional PNN, the extreme learning machine, and the K-nearest neighbor classifier, in terms of classification accuracy. The results demonstrate that the proposed evolutionary approaches can provide up to a ten percent increase in accuracy over the conventional PNN.
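To make the kernel substitution concrete, the following is a minimal sketch (not the authors' implementation) of a PNN classifier in which the pattern layer can use either a Gaussian kernel or a symmetric Laplace kernel; the smoothing parameter `sigma` is the quantity the paper tunes with BA, GWO, or WOA. All names and the toy data are illustrative assumptions.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5, kernel="laplace"):
    """Illustrative PNN classifier (sketch, not the paper's code).

    For each class, the pattern layer averages a kernel centred on every
    training sample of that class; the summation layer feeds these class
    densities to an argmax decision. `sigma` is the smoothing parameter.
    """
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        densities = []
        for c in classes:
            Xc = X_train[y_train == c]
            d = np.linalg.norm(Xc - x, axis=1)  # distances to class-c patterns
            if kernel == "laplace":
                k = np.exp(-d / sigma)              # symmetric Laplace kernel
            else:
                k = np.exp(-d**2 / (2 * sigma**2))  # conventional Gaussian kernel
            densities.append(k.mean())
        preds.append(classes[np.argmax(densities)])
    return np.array(preds)

# Toy usage with two well-separated clusters (hypothetical data):
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(pnn_predict(X, y, np.array([[0.1, 0.0], [3.1, 2.9]])))
```

An evolutionary algorithm such as BA, GWO, or WOA would then search over `sigma` (per the paper, with a novel fitness function, e.g. validation accuracy) rather than fixing it by hand.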
