Abstract

Deep belief networks (DBNs), with their outstanding ability to learn features of input data, have attracted considerable attention and are widely applied in image processing, speech recognition, natural language understanding, and disease diagnosis, among other areas. However, with large datasets, the training of DBNs is time-consuming and may not satisfy the requirements of real-time application systems. In this study, a single dataset is decomposed into multiple subdatasets that are distributed to multiple computing nodes, each of which learns the features of its own subdataset. Under the precondition that the learned features remain equivalent to those a single computing node would learn from the total dataset, the single-dataset learning models and algorithms are extended to the case in which multiple computing nodes learn multiple subdatasets in parallel. Learning models and algorithms are proposed for the parallel computing of DBN learning processes. A master-slave parallel computing structure is designed, in which the slave computing nodes learn the features of their respective subdatasets and transmit them to the master computing node, which synthesizes the learned features and broadcasts the result back to the slaves. The broadcast, synchronization, and synthesis steps are repeated until all features of the subdatasets have been learned. The proposed parallel computing method is applied to traffic flow prediction using real traffic flow data. The experimental results verify its effectiveness in reducing pre-training and fine-tuning times while maintaining the strong feature learning ability of DBNs.

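The master-slave synchronization loop described above can be illustrated with a minimal sketch. The Python code below is an assumption-laden illustration, not the paper's exact algorithm: it treats the learned features as the weights of a single restricted Boltzmann machine layer, lets each simulated slave node run one contrastive divergence (CD-1) step on its subdataset, and has the master synthesize the results by simply averaging the parameter increments before broadcasting them back. All names and the averaging rule are hypothetical.

import numpy as np

# Minimal sketch of a master-slave data-parallel RBM pre-training loop.
# The "slaves" are simulated in-process here; the synthesis rule (plain
# averaging of increments) is an assumption, not the paper's method.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b, c, batch, lr=0.05):
    """One CD-1 step on a subdataset batch.
    Returns weight/bias increments instead of updating in place,
    so the master node can combine increments from several slaves."""
    h_prob = sigmoid(batch @ W + c)                               # positive phase
    h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)  # sample hidden units
    v_recon = sigmoid(h_sample @ W.T + b)                         # reconstruction
    h_recon = sigmoid(v_recon @ W + c)                            # negative phase
    dW = lr * (batch.T @ h_prob - v_recon.T @ h_recon) / len(batch)
    db = lr * (batch - v_recon).mean(axis=0)
    dc = lr * (h_prob - h_recon).mean(axis=0)
    return dW, db, dc

n_visible, n_hidden, n_slaves = 64, 32, 4
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b = np.zeros(n_visible)
c = np.zeros(n_hidden)

# Decompose one dataset into subdatasets, one per slave computing node.
data = (rng.random((4000, n_visible)) > 0.5).astype(float)
subdatasets = np.array_split(data, n_slaves)

for epoch in range(10):
    # Slave step: each node learns features of its own subdataset.
    increments = [cd1_update(W, b, c, sub) for sub in subdatasets]
    # Master step: synthesize the learned features (here, average the
    # increments) and broadcast the result back for the next round.
    W += np.mean([dW for dW, _, _ in increments], axis=0)
    b += np.mean([db for _, db, _ in increments], axis=0)
    c += np.mean([dc for _, _, dc in increments], axis=0)

In an actual deployment, the slave steps would run on separate computing nodes (for example, over MPI), the loop would cover every layer of the DBN plus fine-tuning, and the master's synthesis step would follow the paper's proposed models rather than plain averaging.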