Abstract

Machine learning applications increasingly deal with large and/or distributed datasets. In this context, distributed learning is a promising line of research for handling both situations, since large datasets can be partitioned across several locations. Moreover, the current trend of favoring multi-core processors and computer clusters over faster single processors creates a suitable setting for distributed learning. Nevertheless, only a few distributed learning algorithms have been proposed in the literature so far. One of them is DEvoNet, which combines artificial neural networks with genetic algorithms. DEvoNet performs well on many datasets, but several limitations have been pointed out in connection with its poor performance on nonuniform class-probability distributions of the data. This paper presents an improvement of DEvoNet based on distributing the computation of the genetic algorithm. The experimental results show a marked improvement in the performance of DEvoNet on both uniform and nonuniform class-probability distributions.
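The abstract does not specify how the genetic algorithm's computation is distributed. As a hedged illustration only, the sketch below shows one common scheme for this, an island-model genetic algorithm, where independent subpopulations evolve in parallel worker processes and periodically exchange their best individuals. Every name here (fitness, evolve_island, the toy bit-counting objective, all parameter values) is hypothetical and stands in for DEvoNet's actual neural-network evaluation and operators; it is a minimal sketch under those assumptions, not the paper's implementation.

import random
from multiprocessing import Pool

GENOME_LEN = 16    # bits per individual (toy objective: maximize 1-bits)
ISLAND_SIZE = 20   # individuals per island
GENERATIONS = 30   # local generations between migrations
N_ISLANDS = 4      # parallel islands, one per worker process
MIGRANTS = 2       # individuals exchanged per migration round
ROUNDS = 5         # number of migration rounds

def fitness(ind):
    # Toy objective: count of 1-bits. A real distributed learner would
    # instead evaluate a neural network on the worker's local data split.
    return sum(ind)

def evolve_island(population):
    # A plain local GA: binary tournament selection, one-point crossover,
    # and bit-flip mutation, run for GENERATIONS on this island only.
    for _ in range(GENERATIONS):
        new_pop = []
        for _ in range(ISLAND_SIZE):
            a, b = random.sample(population, 2)
            parent1 = max(a, b, key=fitness)
            a, b = random.sample(population, 2)
            parent2 = max(a, b, key=fitness)
            cut = random.randrange(1, GENOME_LEN)
            child = parent1[:cut] + parent2[cut:]
            if random.random() < 0.1:          # mutation probability
                i = random.randrange(GENOME_LEN)
                child[i] ^= 1                  # flip one bit
            new_pop.append(child)
        population = new_pop
    return population

if __name__ == "__main__":
    # Random initial population for each island.
    islands = [
        [[random.randint(0, 1) for _ in range(GENOME_LEN)]
         for _ in range(ISLAND_SIZE)]
        for _ in range(N_ISLANDS)
    ]
    with Pool(N_ISLANDS) as pool:
        for _ in range(ROUNDS):
            # Each island evolves independently on its own worker.
            islands = pool.map(evolve_island, islands)
            # Ring migration: each island replaces its tail with the
            # best individuals of its predecessor.
            best = [sorted(isl, key=fitness, reverse=True)[:MIGRANTS]
                    for isl in islands]
            for i, isl in enumerate(islands):
                isl[-MIGRANTS:] = best[(i - 1) % N_ISLANDS]
    champion = max((ind for isl in islands for ind in isl), key=fitness)
    print("best fitness:", fitness(champion), "of", GENOME_LEN)

The island model is a natural fit for the setting the abstract describes: each location evolves against its own data partition, and only small sets of individuals cross the network, which keeps communication cost low.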
