Abstract

Stochastic configuration networks (SCNs) are a class of randomized learner models that have garnered increasing attention in data analytics. In the original implementation of SCN, the pseudo-inverse of the hidden layer output matrix is used to compute the output weights. However, computing the pseudo-inverse of the hidden layer output matrix is quite challenging for the large-scale, high-complexity datasets that arise in many real-world applications. This paper aims to accelerate the learning process of SCNs by employing the well-known alternating direction method of multipliers (ADMM) solver together with CPU-GPU heterogeneous computing techniques. Empirical results on two benchmark datasets demonstrate the efficiency and effectiveness of the proposed method for large-scale data analytics.
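To illustrate the kind of solver the abstract refers to, the sketch below applies ADMM to an ℓ1-regularized least-squares problem for the output weights, min ½‖Hβ − T‖² + λ‖β‖₁, where H is the hidden layer output matrix and T the targets. This is a generic ADMM/lasso formulation, not the paper's exact algorithm; the function name, regularization choice, and parameters are assumptions for illustration. The expensive linear system is factorized once and reused across iterations, which is the step that benefits from GPU offloading.

```python
import numpy as np

def admm_output_weights(H, T, lam=1e-3, rho=1.0, n_iter=500):
    """Hypothetical ADMM sketch for min 0.5*||H b - T||^2 + lam*||b||_1.

    H : (m, n) hidden layer output matrix
    T : (m,) or (m, k) target matrix
    """
    n = H.shape[1]
    # Factorize (H^T H + rho I) once with Cholesky; reused every iteration.
    L = np.linalg.cholesky(H.T @ H + rho * np.eye(n))
    HtT = H.T @ T
    shape = (n,) + T.shape[1:]
    b, z, u = np.zeros(shape), np.zeros(shape), np.zeros(shape)
    for _ in range(n_iter):
        # b-update: solve (H^T H + rho I) b = H^T T + rho (z - u)
        b = np.linalg.solve(L.T, np.linalg.solve(L, HtT + rho * (z - u)))
        # z-update: soft-thresholding (proximal operator of the l1 norm)
        v = b + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual variable update
        u = u + b - z
    return z
```

In a CPU-GPU heterogeneous setting, the matrix products and triangular solves above would typically be dispatched to the GPU (e.g. via a CUDA-backed array library), while the lightweight elementwise updates can stay on the CPU.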
