Many currently popular models of categorization are either strictly parametric (e.g., prototype models, decision bound models) or strictly nonparametric (e.g., exemplar models) (F. G. Ashby & L. A. Alfonso-Reese, 1995, Journal of Mathematical Psychology, 39, 216–233). In this article, a family of semiparametric classifiers is investigated in which each category is represented by a finite mixture distribution. The advantage of these mixture models of categorization is that they contain several parametric and nonparametric models as special cases. Specifically, it is shown that both decision bound models (F. G. Ashby & W. T. Maddox, 1992, Journal of Experimental Psychology: Human Perception and Performance, 16, 598–612; 1993, Journal of Mathematical Psychology, 37, 372–400) and the generalized context model (R. M. Nosofsky, 1986, Journal of Experimental Psychology: General, 115, 39–57) can be interpreted as two extreme cases of a common mixture model. Furthermore, many other (semiparametric) models of categorization can be derived from the same generic mixture framework. Several examples are discussed, and a parameter estimation procedure for fitting these models is outlined. To illustrate the approach, several specific models are fitted to a data set collected by S. C. McKinley and R. M. Nosofsky (1995, Journal of Experimental Psychology: Human Perception and Performance, 21, 128–148). The results suggest that semiparametric models are a promising alternative for future model development.
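To make the mixture representation concrete, the following sketch illustrates the generic idea; the notation, component densities, and response rule below are illustrative assumptions rather than the article's own formulation. Each category J is encoded as a finite mixture of K_J component densities, and the two extreme cases correspond to the number of components used per category.

% Illustrative sketch (assumed notation): category J as a finite mixture.
\[
  f_J(\mathbf{x}) \;=\; \sum_{k=1}^{K_J} \pi_{Jk}\, g(\mathbf{x} \mid \boldsymbol{\theta}_{Jk}),
  \qquad \pi_{Jk} \ge 0, \quad \sum_{k=1}^{K_J} \pi_{Jk} = 1.
\]
% A stimulus x might then be assigned to category A rather than B by comparing the
% category densities, e.g., via a likelihood ratio with response criterion \beta:
\[
  \frac{f_A(\mathbf{x})}{f_B(\mathbf{x})} \;\gtrless\; \beta .
\]

Under these assumptions, setting K_J = 1 (a single parametric component per category) yields a parametric, decision-bound-style classifier, whereas placing one component on every stored exemplar turns the mixture into a kernel-type density estimate of the sort Ashby and Alfonso-Reese (1995) associate with exemplar models such as the generalized context model; intermediate numbers of components give the semiparametric models discussed in the article.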