Exemplar-based image colorization aims to colorize a target grayscale image based on a color reference image, and the key is to establish accurate pixel-level semantic correspondence between the two images. Previous methods directly search for correspondence over the entire reference image, and this kind of global matching is prone to mismatches. Intuitively, a reasonable correspondence should be established between semantically similar objects. Motivated by this, we introduce the idea of a semantic prior and propose SPColor, a semantic prior guided exemplar-based image colorization framework. SPColor systematically integrates several novel components: a semantic prior guided correspondence network (SPC), a category reduction algorithm (CRA), and a similarity masked perceptual loss (SMP loss). Unlike previous methods, SPColor establishes correspondence locally, between pixels belonging to the same semantic class. In this way, improper correspondence across different semantic classes is explicitly excluded, and mismatches are markedly alleviated. In addition, SPColor supports region-level class assignments before the SPC stage in the pipeline. Building on this capability, we propose a category manipulation process (CMP) as an interactive interface for controlling colorization, which also produces more varied colorization results and improves the flexibility of reference selection. Experiments demonstrate that our model outperforms recent state-of-the-art methods both quantitatively and qualitatively on public datasets. Our code is available at https://github.com/viector/spcolor.
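To make the core idea concrete, the following is a minimal, hypothetical sketch of semantic-class-masked correspondence: similarities are computed between target and reference features, pairs from different semantic classes are masked out, and reference ab colors are warped by soft attention. It is not the official SPC network (which is learned end-to-end); the function and variable names (`warp_colors_with_semantic_mask`, `feat_t`, `seg_r`, etc.) are illustrative assumptions.

```python
# Illustrative sketch only; assumes PyTorch. Not the paper's actual SPC implementation.
import torch
import torch.nn.functional as F

def warp_colors_with_semantic_mask(feat_t, feat_r, seg_t, seg_r, color_r, tau=0.01):
    """Warp reference ab colors to the target, restricting matches to pixels
    that share the same semantic class.

    feat_t, feat_r: (C, H, W) deep features of the target / reference image
    seg_t, seg_r:   (H, W) integer semantic label maps (the semantic prior)
    color_r:        (2, H, W) reference ab channels
    """
    C, H, W = feat_t.shape
    # Flatten and L2-normalize features so dot products are cosine similarities.
    ft = F.normalize(feat_t.reshape(C, -1), dim=0)              # (C, HW)
    fr = F.normalize(feat_r.reshape(C, -1), dim=0)              # (C, HW)
    sim = ft.t() @ fr                                           # (HW_t, HW_r)

    # Semantic prior: allow a match only when the two pixels share a class,
    # which explicitly excludes cross-class correspondence.
    same_class = seg_t.reshape(-1, 1) == seg_r.reshape(1, -1)   # (HW_t, HW_r) bool
    sim = sim.masked_fill(~same_class, float('-inf'))

    # Soft attention over reference pixels, then warp the ab channels.
    attn = torch.softmax(sim / tau, dim=1)
    attn = torch.nan_to_num(attn, nan=0.0)                      # rows with no valid match stay uncolored
    warped_ab = (attn @ color_r.reshape(2, -1).t()).t()         # (2, HW_t)
    return warped_ab.reshape(2, H, W)
```

In this toy formulation, manipulating `seg_t` at the region level before matching plays the role of the category manipulation process (CMP): reassigning a region's class redirects which reference pixels it is allowed to draw color from.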