Few-shot Named Entity Recognition (NER) systems aim to classify unseen named entity types with limited labeled examples. Significant progress has been made by leveraging large-scale pre-trained language models. However, boundary information and hidden relationships between entities beyond the sequence, which provide additional information and play a crucial role in few-shot NER, have received little attention in recent state-of-the-art methods. In this paper, we propose CEPTNER, a Contrastive learning Enhanced Prototypical network for Two-stage few-shot Named Entity Recognition, which leverages meta-learning and a prototypical network to identify unseen entity types with limited labeled data. Concretely, we first detect candidate entity boundaries in the boundary-detection stage, and then employ a prototypical network in the entity-classification stage to filter out false boundaries and assign types to the remaining spans. In addition, we apply entity-level contrastive learning while optimizing the prototypical network, exploiting the internal relationships between entities to provide extra information. CEPTNER is evaluated on two widely used few-shot NER datasets, Few-NERD and CrossNER, and a few-shot slot-tagging dataset, SNIPS. Extensive experiments on these benchmarks show the superiority of CEPTNER over previous few-shot NER methods.
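To make the two components named above concrete, the following is a minimal, illustrative PyTorch sketch of prototype-based span classification and an entity-level supervised contrastive loss. It is not the authors' released implementation; the function names, the squared-Euclidean distance, and the temperature value are assumptions made for illustration only.

```python
# Minimal sketch (NOT the paper's implementation) of two ideas from the abstract:
# (1) classifying candidate entity spans by distance to class prototypes, and
# (2) an entity-level supervised contrastive loss over span embeddings.
import torch
import torch.nn.functional as F


def prototype_logits(query_emb, support_emb, support_labels, num_types):
    """Score query spans by negative squared distance to each class prototype.

    query_emb:      (Q, d) query span embeddings
    support_emb:    (S, d) support span embeddings
    support_labels: (S,)   entity-type ids of the support spans
    """
    prototypes = torch.stack(
        [support_emb[support_labels == c].mean(dim=0) for c in range(num_types)]
    )  # (num_types, d); assumes every type appears in the support set
    # Higher logit = closer prototype.
    return -torch.cdist(query_emb, prototypes).pow(2)


def entity_contrastive_loss(span_emb, labels, temperature=0.1):
    """Entity-level contrastive loss: spans of the same type are pulled
    together, spans of different types are pushed apart (temperature is an
    assumed hyperparameter)."""
    z = F.normalize(span_emb, dim=-1)
    sim = z @ z.t() / temperature                       # (N, N) similarities
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    pos_mask.fill_diagonal_(0)                          # exclude self-pairs
    other_mask = 1.0 - torch.eye(len(labels))
    exp_sim = torch.exp(sim) * other_mask
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    return -((pos_mask * log_prob).sum(dim=1) / pos_count).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    d, num_types = 32, 3
    support_emb = torch.randn(12, d)
    support_labels = torch.arange(12) % num_types
    query_emb = torch.randn(5, d)
    print(prototype_logits(query_emb, support_emb, support_labels, num_types).shape)
    print(entity_contrastive_loss(support_emb, support_labels).item())
```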