Abstract
As machine learning applications embrace ever larger datasets and model complexity, practitioners turn to distributed clusters to satisfy the growing computational and memory demands. Several parallel variants of the extreme learning machine (ELM) have recently been proposed, some of them cluster-based. However, these variants do not adequately address the computational and memory limitations that arise when both the data and the model are very large. Our goal is to build scalable ELMs with large numbers of samples and hidden neurons that run in parallel on clusters without computational or memory bottlenecks, while producing the same output as the sequential ELM. In this paper, we propose two parallel variants of ELM, referred to as local data and model parallel ELM (LDMP-ELM) and global data and model parallel ELM (GDMP-ELM). Both variants are implemented on clusters in a Message Passing Interface (MPI) environment; each makes a different tradeoff between efficiency and scalability, so their advantages are complementary. Collectively, the two variants are called data and model parallel ELMs (DMP-ELMs). Their advantages over existing variants are as follows: (1) they exploit data and model parallelism simultaneously to increase the parallelism of ELM training; (2) they scale to larger data and models because they remove the memory and computational bottlenecks present in existing variants. Extensive experiments on four large-scale datasets show that the proposed algorithms scale well and achieve nearly ideal speedup. To the best of our knowledge, this is the first time an ELM with 50,000 hidden neurons has been successfully trained on the mnist8m dataset, which contains 8.1 million samples with 784 features each.
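The claim that a parallel ELM can reproduce the sequential result rests on a standard property of regularized ELM training: the output weights β = (HᵀH + I/C)⁻¹HᵀT depend on the data only through the Gram matrices HᵀH and HᵀT, which decompose into sums over row shards of the hidden-layer matrix H. The sketch below is our own illustration, not the paper's code: it assumes mpi4py and NumPy, uses hypothetical sizes and variable names, and shows only the data-parallel half of the idea (the DMP-ELMs described above additionally partition the hidden neurons, i.e. the columns of H, across processes, which this sketch omits).

```python
# Minimal data-parallel ELM sketch (illustrative, not the paper's DMP-ELM code).
# Each MPI process holds a row shard (X_local, T_local) of the training set.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Illustrative sizes: local samples, input features, hidden neurons,
# output dimension, and the ridge regularization constant C.
n_local, d, L, n_out, C = 1000, 784, 512, 10, 1.0

# Synthetic local shard; in practice each process would read its own partition.
rng = np.random.default_rng(seed=rank)
X_local = rng.standard_normal((n_local, d))
T_local = rng.standard_normal((n_local, n_out))

# The random hidden-layer parameters must be identical on every process,
# so they are drawn from a shared seed rather than broadcast.
shared = np.random.default_rng(seed=42)
W = shared.standard_normal((d, L)) / np.sqrt(d)  # scaled to keep activations in range
b = shared.standard_normal(L)

# Local hidden-layer output H_i = g(X_i W + b), here with a sigmoid g.
H_local = 1.0 / (1.0 + np.exp(-(X_local @ W + b)))

# Data parallelism: H^T H = sum_i H_i^T H_i and H^T T = sum_i H_i^T T_i,
# so one Allreduce per Gram matrix reproduces the exact sequential result.
HtH = np.zeros((L, L))
HtT = np.zeros((L, n_out))
comm.Allreduce(H_local.T @ H_local, HtH, op=MPI.SUM)
comm.Allreduce(H_local.T @ T_local, HtT, op=MPI.SUM)

# Ridge-regularized output weights: beta = (H^T H + I/C)^{-1} H^T T.
beta = np.linalg.solve(HtH + np.eye(L) / C, HtT)
```

Run under an MPI launcher (e.g. `mpirun -np 4 python sketch.py`); every rank then holds an identical β matching what a single machine would compute on the concatenated data. Note the scalability limit this sketch hits and the paper targets: HtH is L×L on every process, so with L = 50,000 hidden neurons it alone occupies about 20 GB, which is why partitioning the model as well as the data becomes necessary at that scale.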