Mixtures of multivariate normal inverse Gaussian (MNIG) distributions can be used to cluster data that exhibit features such as skewness and heavy tails. For cluster analysis under a traditional finite mixture model framework, the number of components either must be known a priori or must be estimated a posteriori using some model selection criterion, after fitting models over a range of possible numbers of components. However, different model selection criteria can yield different numbers of components, creating uncertainty about the final clustering. Here, an infinite mixture model framework, also known as a Dirichlet process mixture model, is proposed for mixtures of MNIG distributions. This Dirichlet process mixture model approach allows the number of components to grow or shrink freely from 1 to \(\infty\) (in practice, from 1 to \(N\), the sample size), and the number of components is inferred along with the parameter estimates in a Bayesian framework, thus removing the need for model selection criteria. We run our algorithm on simulated as well as real benchmark datasets and compare it with other clustering approaches. The proposed method provides competitive results on both the simulated and the real data.
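As a rough illustration of why the number of components need not be fixed in advance under a Dirichlet process prior, the following minimal Python sketch draws a random partition from a Chinese restaurant process with concentration parameter \(\alpha\); the function name and parameters are hypothetical and the example is only an illustration of the prior over partitions, not the authors' MNIG sampler, which would additionally reweight these prior probabilities by the MNIG component likelihoods.

\begin{verbatim}
import numpy as np

def crp_partition(n, alpha, seed=None):
    """Draw a random partition of n items from a Chinese restaurant
    process with concentration alpha; the number of clusters is not
    fixed in advance but emerges from the draw (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    assignments = [0]          # first item starts the first cluster
    counts = [1]               # current cluster sizes
    for i in range(1, n):
        # join existing cluster k with probability counts[k] / (i + alpha),
        # or open a new cluster with probability alpha / (i + alpha)
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):   # a new cluster is opened
            counts.append(1)
        else:
            counts[k] += 1
        assignments.append(k)
    return np.array(assignments), len(counts)

# Example: the number of clusters among 200 items is random a priori;
# in a DPMM, the data (via the component likelihoods) determine it a posteriori.
labels, n_clusters = crp_partition(200, alpha=1.0, seed=0)
print(n_clusters)
\end{verbatim}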