As an important branch of neural networks, extreme learning machines (ELMs) have attracted wide interest in the fields of pattern classification and regression estimation. However, when facing learning problems with multi-dimensional outputs, known as multi-dimensional regression, conventional ELMs generally fail to produce satisfactory results because they cannot efficiently exploit the relatedness among outputs. To address this problem, a new regularized ELM is first proposed in this paper by introducing a hyper-spherical loss function as a regularizer. Since the regularized formulation with this loss function cannot be solved directly, an iterative solution procedure is presented. To further improve learning performance, the proposed algorithm is reformulated to identify the grouping structure hidden in the outputs, under the assumption that this structure is determined by different linear combinations of a small number of latent basis neurons. The resulting problem is a mixed integer program, and an alternating minimization method is presented to solve it. Experiments on two multi-dimensional data sets, a toy problem and a real-life dynamical cylindrical vibration data set, demonstrate the effectiveness of the proposed algorithm.
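For context, the sketch below shows the conventional multi-output ELM baseline that the abstract refers to: a fixed random hidden layer followed by a ridge-regularized least-squares solve for the output weights. The function names, hyperparameters (`n_hidden`, `lam`), and use of NumPy are illustrative assumptions; the paper's hyper-spherical regularizer and latent-basis-neuron grouping are not reproduced here.

```python
# Minimal sketch of a conventional (ridge-regularized) ELM for multi-dimensional
# regression. Each output column is solved independently, which is exactly why
# this baseline cannot exploit relatedness among outputs. Illustrative only;
# the proposed hyper-spherical regularizer is not implemented here.
import numpy as np

def fit_elm(X, T, n_hidden=100, lam=1e-3, seed=0):
    """Fit ELM output weights.

    X : (n_samples, n_features) inputs
    T : (n_samples, n_outputs) multi-dimensional targets
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (fixed)
    b = rng.normal(size=n_hidden)                # random biases (fixed)
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    # Ridge (L2) solution for the output weights, solved per output column.
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```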