Abstract
We propose a parallel distributed model of a hybrid fuzzy genetics-based machine learning (GBML) algorithm to drastically decrease its computation time. Our hybrid algorithm has a Pittsburgh-style GBML framework where a rule set is coded as an individual. A Michigan-style rule-generation mechanism is used as a kind of local search. Our parallel distributed model is an island model where a population of individuals is divided into multiple islands. Training data are also divided into multiple subsets. The main feature of our model is that a different training data subset is assigned to each island. The assigned training data subsets are periodically rotated over the islands. The best rule set in each island also migrates periodically. We demonstrate through computational experiments that our model decreases the computation time of the hybrid fuzzy GBML algorithm by one to two orders of magnitude using seven parallel processors without severely degrading the generalization ability of the obtained fuzzy rule-based classifiers. We also examine the effects of the training data rotation and the rule set migration on the search ability of our model.
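The island scheme described above (one training data subset per island, periodic subset rotation, and periodic migration of each island's best rule set) can be sketched in simplified form. This is a minimal illustrative skeleton, not the paper's implementation: the rule sets, the fitness function, the `local_search` stand-in for the Michigan-style rule-generation step, and the rotation/migration intervals are all hypothetical placeholders, and the islands are iterated sequentially rather than on parallel processors.

```python
import random

random.seed(42)

NUM_ISLANDS = 7      # matches the seven parallel processors in the experiments
ROTATE_EVERY = 10    # hypothetical rotation interval (generations)
MIGRATE_EVERY = 10   # hypothetical migration interval (generations)
GENERATIONS = 50
POP_PER_ISLAND = 10  # hypothetical subpopulation size

# Toy stand-ins: a "rule set" is a bit vector, and each training data subset
# is represented by a target pattern that rewards agreement.
def evaluate(rule_set, subset_target):
    # placeholder fitness: agreement between the rule set and the subset
    return sum(1 for a, b in zip(rule_set, subset_target) if a == b)

def make_rule_set(n=8):
    return [random.randint(0, 1) for _ in range(n)]

def local_search(rule_set):
    # stand-in for the Michigan-style rule-generation mechanism: flip one bit
    mutant = rule_set[:]
    i = random.randrange(len(mutant))
    mutant[i] ^= 1
    return mutant

# One subpopulation (island) per processor; one data subset per island.
islands = [[make_rule_set() for _ in range(POP_PER_ISLAND)]
           for _ in range(NUM_ISLANDS)]
subsets = [[(j >> k) & 1 for k in range(8)] for j in range(NUM_ISLANDS)]

for gen in range(1, GENERATIONS + 1):
    # Pittsburgh-style step on each island: accept improving rule sets only
    for idx, pop in enumerate(islands):
        target = subsets[idx]
        for i, rs in enumerate(pop):
            cand = local_search(rs)
            if evaluate(cand, target) >= evaluate(rs, target):
                pop[i] = cand
    if gen % ROTATE_EVERY == 0:
        # periodically rotate the training data subsets over the islands
        subsets = subsets[-1:] + subsets[:-1]
    if gen % MIGRATE_EVERY == 0:
        # the best rule set of each island migrates to the next island,
        # replacing that island's worst rule set
        bests = [max(pop, key=lambda rs, t=subsets[i]: evaluate(rs, t))
                 for i, pop in enumerate(islands)]
        for i, pop in enumerate(islands):
            incoming = bests[(i - 1) % NUM_ISLANDS]
            worst = min(range(len(pop)),
                        key=lambda j: evaluate(pop[j], subsets[i]))
            pop[worst] = incoming[:]
```

Because each island sees a different subset at any moment but eventually sees all subsets through rotation, every rule set is exposed to the whole training data over time, which is how the model avoids overfitting any island to a single subset.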