Abstract

We present a new scalable Probabilistic Neural Network (PNN) construction method suited to data-neuron parallelism in a ring pipeline topology, which allows a large-scale distributed model to be trained on a large-scale distributed dataset. First, the recently proposed Kernel Gradient Subtractive Clustering (KG-SC) algorithm automatically selects representative exemplar centers, and their number, for the PNN kernels. Expectation Maximization (EM) then refines the PNN parameters. Experimental simulations compare the accuracy and performance of the proposed solution with PNNs produced by other state-of-the-art k-center clustering algorithms. The parallel and distributed implementations achieve near-linear speedups as the number of processors and the dataset size increase.
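
As a high-level illustration of the construction the abstract describes, the following minimal Python sketch shows how a PNN with Gaussian kernels classifies a sample once exemplar centers are available. The KG-SC center-selection and EM refinement stages are only indicated in comments; all function and variable names here are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of PNN classification with Gaussian kernels, assuming
# exemplar centers have already been selected (e.g. by KG-SC, whose
# details are not reproduced here). Names are illustrative, not from
# the paper.
import numpy as np

def pnn_predict(x, centers, labels, classes, sigma=1.0):
    """Classify sample x with a Parzen-window PNN.

    centers : (m, d) exemplar centers, one kernel per center
    labels  : (m,) class label of each center
    classes : list of class labels
    sigma   : shared kernel bandwidth (assumed fixed here; the paper's
              EM step would refine such kernel parameters)
    """
    # Pattern layer: Gaussian kernel activation of each exemplar.
    sq_dist = np.sum((centers - x) ** 2, axis=1)
    activations = np.exp(-sq_dist / (2.0 * sigma ** 2))

    # Summation layer: average activations per class to estimate
    # the class-conditional density p(x | c).
    densities = [activations[labels == c].mean() for c in classes]

    # Decision layer: pick the class with the largest estimated
    # density (uniform class priors assumed for simplicity).
    return classes[int(np.argmax(densities))]

# Hypothetical usage with toy two-class data:
rng = np.random.default_rng(0)
centers = np.vstack([rng.normal(0, 1, (5, 2)), rng.normal(3, 1, (5, 2))])
labels = np.array([0] * 5 + [1] * 5)
print(pnn_predict(np.array([2.8, 3.1]), centers, labels, [0, 1]))
```

In a data-neuron parallel deployment of the kind the paper targets, the pattern-layer activations would be computed per partition of centers and data, with partial class sums combined around the ring; the sketch above shows only the single-node computation.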
