As a generative model, probabilistic linear discriminant analysis (PLDA) has achieved good performance in supervised learning tasks. The model incorporates both within-individual and between-individual variation, and the remaining unexplained variation in the data is assumed to follow a Gaussian distribution. However, the Gaussian assumption makes the model sensitive to noise and outliers in the training set. To address this issue, this paper proposes a robust probabilistic linear discriminant analysis model that places a Laplace prior on the noise term. Instead of solving high-dimensional linear systems, we embed a Kronecker-decomposable component in the new model for tensor data, significantly reducing the problem size. Because the non-conjugacy of the Laplace distribution complicates the computation of the posteriors of the latent variables, we reformulate it as a hierarchical model using an Inverse Gamma distribution and then adopt a variational expectation–maximization (EM) algorithm to learn the model parameters. Reconstruction and classification experiments on several public databases demonstrate that the proposed model outperforms state-of-the-art LDA-based algorithms.
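For context, the Gaussian scale-mixture idea behind such a hierarchy can be sketched as follows (a minimal illustration; the symbols $\epsilon$, $\tau$, and $b$ and the exponential mixing density shown are expository assumptions from the classical scale-mixture construction, not necessarily the exact Inverse Gamma parameterization used in the paper):
\[
\epsilon \mid \tau \sim \mathcal{N}(0,\tau), \qquad \tau \sim p(\tau), \qquad p(\epsilon) = \int_0^\infty \mathcal{N}(\epsilon;\, 0, \tau)\, p(\tau)\, \mathrm{d}\tau .
\]
In the classical construction, taking $p(\tau) = \tfrac{1}{2b^2}\, e^{-\tau/(2b^2)}$ yields the Laplace marginal $p(\epsilon) = \tfrac{1}{2b}\, e^{-|\epsilon|/b}$; conditioned on $\tau$, the model is Gaussian in the latent variables, which is what makes closed-form variational EM updates tractable.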