Abstract. In this paper, a new interpretation of parametric linear discriminants for binary classification problems is presented. Linear discriminants are described in terms of Disjoint Tangent Configurations (DTC) established between the ellipsoidal level surfaces resulting from the means and covariance matrices of the class distributions. This framework allows, first, a new interpretation and analysis of several well-known linear discriminants and, second, the design of new discriminants with very interesting properties. In particular, it is shown that the analytical expression of the Bayes Linear Discriminant (whose explicit expression is still unknown) can be derived from a particular DTC. Besides the Bayes discriminant, other classical linear discriminants are also described through the DTC analysis, in particular the Fisher and the Scatter-based Linear Discriminants. In addition, two new linear discriminants, for the minimax and the Bayesian solutions, are obtained from the DTC analysis. Both have a direct analytical expression, in contrast to the existing iterative solutions with which they are compared. The first, called the MPDH-DTC discriminant, is the solution of the Minimax Probabilistic Decision Hyperplane (MPDH) problem, the same solution that the Minimax Probability Machine (MPM) method approximates by iterative convex optimization. The second, called the Quasi-Bayes-DTC Linear Discriminant, is designed as an approximation to the Bayes Linear Discriminant, which otherwise requires a search procedure to find. Considering both the accuracy over several synthetic and real problems and the computational cost, the Quasi-Bayes-DTC is the preferred discriminant due to its high performance and low computational cost, unless a minimax solution is required, in which case the MPDH-DTC is preferred.