Named entity recognition (NER) is a fundamental task in natural language processing with broad implications for downstream applications. Recent work shows that substantial performance gains can be obtained by integrating label knowledge into token representations, recasting NER as a machine reading comprehension task. However, existing methods often underexploit the potential of label knowledge. To address this limitation, we propose the Context-based Label Knowledge Enhanced Span Recognition (CLESR) architecture, which enriches label knowledge with contextual information. We design an annotation scheme tailored to nested scenarios and train external context-based label knowledge with conventional word-association learning algorithms. The resulting context-based label knowledge is injected into the model through a dedicated label attention module, strengthening label learning during training. To handle both flat and nested entities, we adopt a global pointer as the decoding strategy, which directly predicts the positions and categories of named entities. Experiments on six widely used benchmarks demonstrate the effectiveness of CLESR on both flat and nested tasks. Our model outperforms the BERT-MRC baseline, including gains of +0.29%, +0.77%, +0.39%, +1.90%, and +0.47% on English CoNLL2003, OntoNotes 4.0, Chinese MSRA, English ACE2004, and English ACE2005, respectively.
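As a rough illustration of the two mechanisms the abstract names, the PyTorch sketch below pairs a label attention module, which fuses externally trained label-knowledge embeddings into token representations via cross-attention, with a simplified global-pointer head that scores every (start, end, type) triple in one pass. All class names, shapes, and hyperparameters are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class LabelAttention(nn.Module):
    """Fuses label-knowledge embeddings into token representations via
    cross-attention (illustrative; the paper's exact fusion may differ)."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.q = nn.Linear(hidden_size, hidden_size)
        self.k = nn.Linear(hidden_size, hidden_size)
        self.v = nn.Linear(hidden_size, hidden_size)
        self.scale = hidden_size ** -0.5

    def forward(self, tokens: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, hidden); labels: (num_types, hidden)
        attn = torch.softmax(self.q(tokens) @ self.k(labels).T * self.scale, dim=-1)
        # Residual fusion keeps the original token signal: (batch, seq_len, hidden).
        return tokens + attn @ self.v(labels)


class GlobalPointerHead(nn.Module):
    """Scores every (start, end, type) triple in a single pass, so flat and
    nested entities are decoded uniformly (simplified global pointer)."""

    def __init__(self, hidden_size: int, num_types: int, head_dim: int = 64):
        super().__init__()
        self.num_types, self.head_dim = num_types, head_dim
        # One query/key projection pair per entity type, computed jointly.
        self.proj = nn.Linear(hidden_size, num_types * head_dim * 2)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        b, n, _ = tokens.shape
        qk = self.proj(tokens).view(b, n, self.num_types, 2, self.head_dim)
        q, k = qk[..., 0, :], qk[..., 1, :]            # each (b, n, T, d)
        # logits[b, t, i, j] scores the span (i, j) as an entity of type t;
        # at inference a span is predicted where its logit exceeds 0.
        logits = torch.einsum("bitd,bjtd->btij", q, k) / self.head_dim ** 0.5
        # Mask invalid spans with end < start (keep the upper triangle).
        valid = torch.triu(torch.ones(n, n, dtype=torch.bool, device=logits.device))
        return logits.masked_fill(~valid, float("-inf"))


# Toy usage: 8 entity types, BERT-sized hidden states.
tokens = torch.randn(2, 16, 768)   # stand-in for encoder output
labels = torch.randn(8, 768)       # stand-in for context-based label knowledge
fused = LabelAttention(768)(tokens, labels)
span_logits = GlobalPointerHead(768, num_types=8)(fused)
print(span_logits.shape)           # torch.Size([2, 8, 16, 16])
```

In this simplified form, training would apply a multi-label loss over the span logits; the sketch omits the rotary position encoding and the class-imbalance-aware loss that full global-pointer implementations typically add.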