Abstract

A back-propagation (BP) neural network can solve complicated nonlinear mapping problems, so it can be applied to a wide range of tasks. However, as the sample size increases, the time required to train a BP neural network becomes lengthy and the classification accuracy decreases. To improve the classification accuracy and runtime efficiency of the BP neural network algorithm, we propose a parallel design and realization method for a particle swarm optimization (PSO)-optimized BP neural network based on MapReduce on the Hadoop platform. The PSO algorithm is used to optimize the BP neural network’s initial weights and thresholds, which improves the accuracy of the classification algorithm. The MapReduce parallel programming model is used to parallelize the BP algorithm, thereby addressing the hardware and communication overhead that arises when the BP neural network processes big data. Datasets of five different scales were constructed from the scene image library of the SUN Database. The classification accuracy of the parallel PSO-BP neural network algorithm is approximately 92%, and the system efficiency is approximately 0.85, which gives it a clear advantage when processing big data. The proposed algorithm thus achieves both higher classification accuracy and better time efficiency, a significant improvement obtained by applying parallel processing to an intelligent algorithm on big data.
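To make the parallel design concrete, the sketch below emulates in simplified NumPy form the MapReduce-style split of BP training described above: each "map" task computes partial gradients for the weights and thresholds on its own data split, and a "reduce" step sums the partial gradients before a single update. This is a minimal illustration under assumed settings; the network size, learning rate, placeholder data, and helper names (init_network, map_gradients, reduce_and_update) are ours and not taken from the paper's Hadoop implementation.

    # Minimal sketch (not the paper's code) of MapReduce-style parallel BP:
    # mappers compute partial gradients per data split, a reducer sums them.
    import numpy as np

    def init_network(n_in, n_hidden, n_out, rng):
        """Random weights and thresholds (biases) for a three-layer BP network."""
        return {
            "W1": rng.uniform(-1, 1, (n_in, n_hidden)),
            "b1": rng.uniform(-1, 1, n_hidden),
            "W2": rng.uniform(-1, 1, (n_hidden, n_out)),
            "b2": rng.uniform(-1, 1, n_out),
        }

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def map_gradients(net, X_split, y_split):
        """'Map' task: forward and backward pass over one data split."""
        h = sigmoid(X_split @ net["W1"] + net["b1"])
        o = sigmoid(h @ net["W2"] + net["b2"])
        delta_o = (o - y_split) * o * (1 - o)            # output-layer error term
        delta_h = (delta_o @ net["W2"].T) * h * (1 - h)  # hidden-layer error term
        return {
            "W1": X_split.T @ delta_h, "b1": delta_h.sum(axis=0),
            "W2": h.T @ delta_o,       "b2": delta_o.sum(axis=0),
        }

    def reduce_and_update(net, partial_grads, lr, n_samples):
        """'Reduce' step: sum partial gradients, then apply one global update."""
        for key in net:
            total = sum(g[key] for g in partial_grads)
            net[key] -= lr * total / n_samples
        return net

    # Serial emulation with random placeholder data and four data splits.
    rng = np.random.default_rng(0)
    X, y = rng.random((600, 8)), rng.integers(0, 2, (600, 3)).astype(float)
    net = init_network(8, 12, 3, rng)
    for epoch in range(50):
        splits = zip(np.array_split(X, 4), np.array_split(y, 4))
        grads = [map_gradients(net, Xs, ys) for Xs, ys in splits]
        net = reduce_and_update(net, grads, lr=0.5, n_samples=len(X))

On an actual Hadoop cluster the splits would be processed by separate mapper tasks and the summation by a reducer; the serial loop above only imitates that data flow to show where the parallelism lies.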

Highlights

  • A Back-Propagation (BP) neural network is a type of multi-layered feed-forward neural network that learns by constantly modifying both the connection weights between the neurons in each layer and the neuron thresholds to make the network output continuously approximate the desired output [1]

  • Chiroma et al. [18] applied an artificial neural network optimized by the particle swarm optimization (PSO) algorithm to predict OPEC CO2 emissions

  • To validate the performance of the parallel PSO-BP neural network algorithm proposed in this study, we tested it on a semantic classification task with a large number of scene images on the Hadoop platform

Introduction

A Back-Propagation (BP) neural network is a type of multi-layered feed-forward neural network that learns by constantly modifying both the connection weights between the neurons in each layer and the neuron thresholds to make the network output continuously approximate the desired output [1]. Yu et al. [22] used a genetic algorithm (GA) to optimize a BP neural network: they improved the additional momentum factor and the self-adaptive learning rate and established a natural gas load forecasting model to make short-term forecasts of natural gas loads in Shanghai. Gao et al. [23] used a GA to optimize the initial weights and thresholds of a BP neural network and were able to predict housing prices in Guiyang City with improved accuracy. Similar to [3], Ren et al. [25] predicted wind speed; in contrast to [3], however, they first used a PSO algorithm to optimize the initial thresholds and weights of the BP neural network, establishing a forecasting model with better accuracy than [3]. Regardless of the optimization employed, BP neural networks perform well when the sample size is small; when the sample set grows, the time efficiency of these algorithms declines sharply and intolerably.
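The PSO step used in [25], and central to the method proposed here, treats one candidate set of initial weights and thresholds as a particle and searches for the set that gives the lowest network error before BP training starts. The sketch below illustrates this idea under assumed, simplified settings; the network dimensions, swarm parameters, placeholder data, and function names (unpack, mse, pso_init) are illustrative and do not come from the paper.

    # Minimal PSO sketch (illustrative only) for choosing a BP network's
    # initial weights and thresholds: each particle is one flat weight vector.
    import numpy as np

    def unpack(vec, n_in, n_hid, n_out):
        """Split a flat particle vector into W1, b1, W2, b2."""
        i = 0
        W1 = vec[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
        b1 = vec[i:i + n_hid]; i += n_hid
        W2 = vec[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
        b2 = vec[i:i + n_out]
        return W1, b1, W2, b2

    def mse(vec, X, y, n_in, n_hid, n_out):
        """Fitness: forward-pass error of the network encoded by one particle."""
        W1, b1, W2, b2 = unpack(vec, n_in, n_hid, n_out)
        h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
        o = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
        return float(np.mean((o - y) ** 2))

    def pso_init(X, y, n_in=8, n_hid=12, n_out=3, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
        """Return PSO-optimized initial weights/thresholds for BP training."""
        rng = np.random.default_rng(seed)
        dim = n_in * n_hid + n_hid + n_hid * n_out + n_out
        pos = rng.uniform(-1, 1, (n_particles, dim))
        vel = np.zeros((n_particles, dim))
        pbest = pos.copy()
        pbest_fit = np.array([mse(p, X, y, n_in, n_hid, n_out) for p in pos])
        gbest = pbest[np.argmin(pbest_fit)].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = pos + vel
            fit = np.array([mse(p, X, y, n_in, n_hid, n_out) for p in pos])
            improved = fit < pbest_fit
            pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
            gbest = pbest[np.argmin(pbest_fit)].copy()
        return unpack(gbest, n_in, n_hid, n_out)

    # Example with random placeholder data (8 inputs, 3 output classes).
    rng = np.random.default_rng(1)
    X_demo, y_demo = rng.random((200, 8)), rng.integers(0, 2, (200, 3)).astype(float)
    W1, b1, W2, b2 = pso_init(X_demo, y_demo)

The weights and thresholds returned by pso_init would then seed ordinary BP training; starting from this better initial position is what is expected to raise the final classification accuracy.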
