Abstract

Stochastic configuration networks (SCNs) are a recently developed class of random neural networks (RNNs) that have demonstrated excellent capabilities in large-scale data analysis. As data volumes grow, distributed algorithms have become increasingly important. In this paper, the data set is stored across multiple agents, and the centralized SCN training problem is reformulated into equivalent sub-problems. The input weights and biases are randomly generated under a supervisory inequality constraint, which guarantees the universal approximation property of the model. A distributed algorithm based on the Alternating Direction Method of Multipliers (ADMM) is then proposed: L1 regularization is applied to find sparse solutions for the output weights, improving the stochastic neural network model. The method proves effective and greatly reduces the amount of information that must be exchanged between computing agents.
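The core computation the abstract describes, solving for sparse output weights over data partitioned across agents, can be sketched as consensus ADMM for an L1-regularized least-squares problem. The sketch below is illustrative only and is not the paper's exact algorithm: `H_i` stands for agent i's local hidden-layer output matrix, `y_i` for its local targets, and all parameter names (`lam`, `rho`, `iters`) are assumptions. Each agent keeps its data local and shares only its d-dimensional estimate, which is the source of the communication savings.

```python
import numpy as np

def soft_threshold(v, k):
    # Elementwise soft-thresholding: proximal operator of k * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def distributed_lasso_admm(H_list, y_list, lam=0.1, rho=1.0, iters=300):
    """Consensus-ADMM sketch for
        min_w  sum_i (1/2) * ||H_i w - y_i||^2  +  lam * ||w||_1,
    where agent i holds only (H_i, y_i). Per iteration each agent
    communicates just one d-dimensional vector (w_i + u_i)."""
    N = len(H_list)
    d = H_list[0].shape[1]
    w = [np.zeros(d) for _ in range(N)]   # local primal variables
    u = [np.zeros(d) for _ in range(N)]   # scaled dual variables
    z = np.zeros(d)                       # global consensus variable
    # Each agent pre-factors its local regularized normal equations.
    inv = [np.linalg.inv(H.T @ H + rho * np.eye(d)) for H in H_list]
    Hty = [H.T @ y for H, y in zip(H_list, y_list)]
    for _ in range(iters):
        # Local least-squares updates (done in parallel on each agent).
        for i in range(N):
            w[i] = inv[i] @ (Hty[i] + rho * (z - u[i]))
        # Global update: average the shared vectors, then soft-threshold.
        z = soft_threshold(np.mean([w[i] + u[i] for i in range(N)], axis=0),
                           lam / (N * rho))
        # Dual ascent keeps local estimates pulled toward the consensus z.
        for i in range(N):
            u[i] += w[i] - z
    return z
```

The soft-thresholding step is what produces the sparse output-weight vector: components whose averaged estimate stays below `lam / (N * rho)` are driven exactly to zero.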
