Conventional machine learning algorithms face significant limitations when dealing with high-dimensional data. Moreover, despite their excellent performance, deep learning models often require substantial computational resources and long processing times. Hence, this paper proposes an expanded stochastic configuration network and a self-organizing hierarchical incremental learning (SHIL) framework to overcome these challenges. Specifically, this study introduces a novel supervised hierarchical clustering tree based on the minimum redundancy maximum correlation algorithm, which mines the internal structure of the data to construct diverse hierarchies. Subsequently, by exploiting the parent-child node relationships in the tree structure, SHIL defines the maximum number of nodes as the switching condition between levels, uses the supervisory mechanism as the parameter selection criterion, and adopts the tolerance error as the termination criterion for training. Furthermore, the universal approximation property of the SHIL framework is established. The proposed SHIL framework is validated on several benchmark datasets, image datasets, and industrial robot cases, with the corresponding experimental results demonstrating that SHIL significantly improves computational efficiency while maintaining high accuracy.
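To make the control flow described above concrete, the following is a minimal sketch of the level-wise incremental training loop. It is not the authors' implementation: the function name, the tanh activation, the single-output setting, and the simple residual-matching score standing in for the supervisory mechanism are all illustrative assumptions; only the three criteria from the abstract are mirrored (a node cap that triggers the switch to the next level, a score-based selection among random candidate nodes, and a tolerance error that terminates training).

```python
import numpy as np

def shil_train_sketch(X, y, levels, max_nodes_per_level=50, tol=1e-2,
                      candidates=20, rng=None):
    """Hypothetical sketch of the SHIL loop: nodes are added incrementally;
    reaching max_nodes_per_level switches to the next hierarchy level;
    training stops once the residual norm falls below tol."""
    rng = np.random.default_rng(rng)
    residual = y.astype(float).copy()
    model = []
    for level in levels:  # hierarchy levels from the clustering tree (assumed given)
        for _ in range(max_nodes_per_level):  # node cap = level-switching condition
            if np.linalg.norm(residual) < tol:  # tolerance-error termination
                return model
            best = None
            for _ in range(candidates):  # stand-in for the supervisory mechanism:
                w = rng.uniform(-1, 1, X.shape[1])  # keep the random candidate node
                b = rng.uniform(-1, 1)              # whose output best matches the
                h = np.tanh(X @ w + b)              # current residual
                score = (h @ residual) ** 2 / (h @ h)
                if best is None or score > best[0]:
                    best = (score, w, b, h)
            _, w, b, h = best
            beta = (h @ residual) / (h @ h)  # least-squares output weight
            residual -= beta * h
            model.append((level, w, b, beta))
    return model
```

Under these assumptions, each appended node greedily reduces the residual, so the loop either exhausts the hierarchy or exits early at the tolerance error, matching the incremental character of stochastic configuration networks.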