ABSTRACT
Gaussian mixture models with eigen-decomposed covariance structures, i.e. the Gaussian parsimonious clustering models (GPCM), make up the most popular family of mixture models for clustering and classification. Although the GPCM family has been used for almost 20 years, selecting the best member of the family in a given situation remains a troublesome problem. Likelihood ratio (LR) tests are developed to tackle this problem: for a fixed number of mixture components, these tests compare each member of the family with the heteroscedastic model under the alternative hypothesis. Along the way, a novel maximum likelihood estimation procedure is developed for two members of the GPCM family. Simulations show that the reference distribution provides a reasonable approximation for the LR statistics when the sample size is not too small and the mixture components are sufficiently separated; for the remaining configurations, a parametric bootstrap approach is also discussed and evaluated. Furthermore, a closed testing procedure, with the proposed LR tests as local tests, is considered as a straightforward way to select a unique model within the general family. In contrast with the information criteria often employed in the literature as ‘black boxes’, this procedure relies on only one subjective element, the significance level, whose meaning is clear to everyone. Simulation results are presented to investigate the performance of the procedure in situations with gradual departures from the homoscedastic model, as well as its robustness to elliptical departures from normality within each mixture component. Finally, the advantages of the procedure are illustrated via applications to some well-known data sets.
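As a rough illustration of the kind of LR test described above (not the authors' implementation), the sketch below compares a homoscedastic mixture with a shared full covariance, analogous to the GPCM member usually labelled EEE, against the unrestricted heteroscedastic model with component-specific full covariances, analogous to VVV, for a fixed number of components. The use of scikit-learn's GaussianMixture, the EEE/VVV correspondence, the chi-squared degrees of freedom, and the simulated data are assumptions made for this example only.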
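```python
# Minimal sketch: LR test of a constrained (homoscedastic, "tied") Gaussian
# mixture against the heteroscedastic ("full") alternative with G components.
# The chi-squared reference is an illustrative assumption; the abstract notes
# it is only a reasonable approximation for large enough, well-separated data.
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture


def lr_test_tied_vs_full(X, G, random_state=0):
    n, d = X.shape
    # Null model: one covariance matrix shared by all components (EEE-like).
    null = GaussianMixture(n_components=G, covariance_type="tied",
                           n_init=10, random_state=random_state).fit(X)
    # Alternative model: a separate full covariance per component (VVV-like).
    alt = GaussianMixture(n_components=G, covariance_type="full",
                          n_init=10, random_state=random_state).fit(X)
    # score() returns the mean log-likelihood per observation.
    ll_null = null.score(X) * n
    ll_alt = alt.score(X) * n
    lr = 2.0 * (ll_alt - ll_null)
    # Difference in free covariance parameters:
    # full has G * d(d+1)/2, tied has d(d+1)/2, so df = (G - 1) * d(d+1)/2.
    df = (G - 1) * d * (d + 1) // 2
    p_value = stats.chi2.sf(lr, df)
    return lr, df, p_value


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two spherical, well-separated components: the null should not be rejected.
    X = np.vstack([rng.normal(0.0, 1.0, size=(150, 2)),
                   rng.normal(4.0, 1.0, size=(150, 2))])
    print(lr_test_tied_vs_full(X, G=2))
```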