Although stochastic configuration networks (SCNs) offer the universal approximation property and fast learning speed, the random assignment of weights and biases during model construction, together with the uncertainty of the model structure, leads to instability. Inspired by bagging, an ensemble model named bagging SCN is proposed to address the limitations of the single model. Firstly, multiple different training subsets are extracted by bootstrap sampling. Then, an SCN submodel is trained on each subset. Finally, the median output of these submodels is taken as the final prediction. Bagging SCN is evaluated on two public datasets and compared with other techniques, including a single SCN and bagging SCN with alternative aggregation rules. Experimental results demonstrate that the bagging SCN proposed in this study exhibits good stability and high prediction accuracy, making it suitable for quantitative analysis of spectral data.
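The three-step ensemble procedure described above (bootstrap sampling, per-subset submodel training, median aggregation) can be sketched as follows. This is a minimal illustration only: a plain least-squares line stands in for an actual SCN submodel, and all function names and parameters (`fit_linear`, `bagging_predict`, `n_models`) are hypothetical, not from the paper.

```python
import random
import statistics

def fit_linear(xs, ys):
    """Stand-in base learner: ordinary least squares for y = a*x + b.
    A real bagging SCN would train an SCN on each bootstrap subset instead."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    if sxx == 0:                      # degenerate bootstrap sample: fall back to a constant
        return 0.0, my
    a = sxy / sxx
    return a, my - a * mx

def bagging_predict(xs, ys, x_new, n_models=25, seed=0):
    """Bagging with median aggregation, mirroring the three steps in the text."""
    rng = random.Random(seed)
    n = len(xs)
    preds = []
    for _ in range(n_models):
        # Step 1: draw a bootstrap sample (with replacement) of the training set.
        idx = [rng.randrange(n) for _ in range(n)]
        # Step 2: train a submodel on this subset.
        a, b = fit_linear([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(a * x_new + b)
    # Step 3: take the median of the submodel outputs as the final prediction.
    return statistics.median(preds)
```

The median, rather than the mean, is used in the final step because it is robust to occasional badly fitted submodels, which is the aggregation rule the abstract singles out.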