Concept-cognitive learning (CCL) concerns how a system incorporates new information into its existing knowledge by mimicking human cognitive processes. Numerous CCL systems based on formal concept analysis have been proposed to meet different requirements. However, most of them operate effectively only on small-scale datasets and lack classification ability. To overcome these two challenges, this study proposes a new concurrent CCL model, built on a regular formal decision context and a multi-thread technique, as an extension of classical CCL. More specifically, to enhance computational efficiency, a concurrent learning framework is designed, and the corresponding learning algorithms for the initial concept construction and CCL stages are developed. To meet the requirements of classification tasks, we further propose a concurrent incremental learning technique that continuously accommodates newly added data. Finally, experiments on various datasets, including real-world and synthetic data, demonstrate that the proposed concurrent method achieves comparable classification effectiveness while significantly improving concept learning performance.
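
The paper's own algorithms are not reproduced here; as a rough illustration of the general idea of concurrent concept construction, the following Python sketch computes the object-induced concepts (extent, intent) of a toy formal context with one worker task per object. The context data, helper names, and thread-pool partitioning are illustrative assumptions, not the authors' implementation, and only the basic derivation step of formal concept analysis is shown.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy formal context: objects, attributes, and their incidence relation (illustrative data).
OBJECTS = ["g1", "g2", "g3", "g4"]
ATTRIBUTES = ["m1", "m2", "m3"]
INCIDENCE = {
    "g1": {"m1", "m2"},
    "g2": {"m1"},
    "g3": {"m2", "m3"},
    "g4": {"m1", "m2"},
}

def intent(objs):
    """Attributes shared by every object in objs (derivation operator ')."""
    if not objs:
        return set(ATTRIBUTES)
    common = set(ATTRIBUTES)
    for g in objs:
        common &= INCIDENCE[g]
    return common

def extent(attrs):
    """Objects possessing every attribute in attrs (derivation operator ')."""
    return {g for g in OBJECTS if attrs <= INCIDENCE[g]}

def object_concept(g):
    """Object concept (g'', g') induced by a single object g."""
    b = intent({g})   # g'
    a = extent(b)     # g''
    return frozenset(a), frozenset(b)

# Compute the object concepts concurrently, one task per object.
with ThreadPoolExecutor(max_workers=4) as pool:
    concepts = set(pool.map(object_concept, OBJECTS))

for ext, inte in sorted(concepts, key=lambda c: sorted(c[0])):
    print(sorted(ext), sorted(inte))
```

In the paper's setting, the decision attributes of the regular formal decision context and the incremental accommodation of newly added data would be layered on top of such concurrent concept construction.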