Abstract

Metaheuristic algorithms are powerful methods for solving compute-intensive problems, and well-trained neural networks excel at prediction and classification tasks. Backpropagation is the most popular method for obtaining the weights of neural networks, but it suffers from slow convergence and can get stuck in a local minimum. To overcome these limitations, this paper proposes a hybrid method that combines the parallel distributed bat algorithm with backpropagation to compute the weights of neural networks. The aim is to use the hybrid method in applications of a distributed nature. Our study uses the Matlab® software and Arduino® microcontrollers as a testbed, and the testbed's performance is evaluated on a speech recognition application. Because of the resource limitations of Arduino microcontrollers, the core speech pre-processing step of linear predictive coding (LPC) feature extraction is performed in Matlab®, and only the LPC parameters are passed to the neural networks implemented on the Arduino microcontrollers. The experimental results show that the proposed scheme produces promising results.
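
To make the hybrid scheme described above concrete, the following is a minimal Matlab sketch of one plausible realization: a bat-algorithm population searches the space of network weight vectors globally, and the current best candidate is periodically refined with a few gradient steps, standing in for backpropagation. All names and settings here (hybrid_bat_backprop, lossFcn, gradFcn, nBats, the parameter values) are illustrative assumptions, not the paper's actual implementation; lossFcn is assumed to map a weight vector to the training error and gradFcn to return its gradient, as backpropagation would compute it.

```matlab
% Hypothetical sketch of the hybrid idea (illustrative names and settings,
% not the paper's implementation): a bat-algorithm population searches the
% space of weight vectors, and the best bat is refined with gradient steps.
function w_best = hybrid_bat_backprop(lossFcn, gradFcn, dim)
    nBats = 20; nIter = 100;     % population size and iterations (assumed)
    fmin = 0;  fmax = 2;         % frequency range of the bats
    A = 0.9;   r = 0.5;          % loudness and pulse rate (kept fixed here)
    alpha = 0.01;                % step size for the backprop-style refinement

    x = randn(nBats, dim);       % bat positions = candidate weight vectors
    v = zeros(nBats, dim);       % bat velocities
    fitness = arrayfun(@(i) lossFcn(x(i,:)), 1:nBats)';
    [fbest, ibest] = min(fitness);
    w_best = x(ibest, :);

    for t = 1:nIter
        for i = 1:nBats
            f = fmin + (fmax - fmin) * rand;          % random frequency
            v(i,:) = v(i,:) + (x(i,:) - w_best) * f;  % pull toward best bat
            xnew = x(i,:) + v(i,:);
            if rand > r                               % occasional local walk
                xnew = w_best + 0.01 * randn(1, dim);
            end
            fnew = lossFcn(xnew);
            if fnew < fitness(i) && rand < A          % accept improving moves
                x(i,:) = xnew;  fitness(i) = fnew;
            end
            if fnew < fbest
                w_best = xnew;  fbest = fnew;
            end
        end
        % Refine the current best weights with a few gradient steps,
        % i.e. the backpropagation half of the hybrid.
        w_try = w_best;
        for k = 1:5
            w_try = w_try - alpha * gradFcn(w_try);
        end
        f_try = lossFcn(w_try);
        if f_try < fbest
            w_best = w_try;  fbest = f_try;
        end
    end
end
```

In a parallel distributed variant, one would expect each node to evolve its own sub-population and exchange the best weight vector periodically; the sketch above shows only the single-node logic.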

Highlights

  • Metaheuristic optimization algorithms have been used in many areas, such as biology, chemistry, control systems, and engineering, as well as other fields that require huge amounts of computation.

  • Even though neural networks and metaheuristic optimization algorithms have been applied in numerous research fields, this paper makes two contributions: first, a hybrid method is proposed to overcome the problem of convergence to a local minimum; second, the parallel distributed bat algorithm is applied to parallel neural networks [6].

  • A novel hybrid method combining the parallel distributed bat algorithm with backpropagation is presented.

Summary

Introduction

Metaheuristic optimization algorithms have been used in many areas, such as biology, chemistry, control systems, and engineering, as well as other fields that require huge amounts of computation. A neural network is adaptive in nature: its internal structure consists of layers of neurons with activation functions, and these neurons are connected to other neurons by weighted links. Even though neural networks and metaheuristic optimization algorithms have been applied in numerous research fields, this paper makes two contributions: first, a hybrid method is proposed to overcome the problem of convergence to a local minimum; second, the parallel distributed bat algorithm is applied to parallel neural networks [6]. The paper is organized as follows: Section Two presents the literature review, Section Three presents the parallel distributed bat algorithm, Section Four presents the background on neural networks, Section Five presents the results, and Section Six provides the conclusion and future work.
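
As a concrete illustration of the layered structure described above, the following is a minimal Matlab sketch of a feedforward pass through fully connected layers with sigmoid activations. The function name forward_pass and the cell-array representation of weights and biases are illustrative assumptions, not the paper's network.

```matlab
% Minimal sketch of a feedforward pass (assumed representation, not the
% paper's exact network): weights{l} holds the link weights into layer l,
% biases{l} the corresponding bias vector.
function y = forward_pass(x, weights, biases)
    a = x(:);                              % column vector of inputs
    for l = 1:numel(weights)
        z = weights{l} * a + biases{l};    % weighted sum over incoming links
        a = 1 ./ (1 + exp(-z));            % sigmoid activation of each neuron
    end
    y = a;                                 % output-layer activations
end
```

With such a representation, training amounts to choosing the entries of weights and biases, which is exactly the search space that the hybrid bat/backpropagation scheme explores.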

Literature Review
Artificial Neural Networks
Initialize
Application to Speech Recognition
Findings
Conclusion
