Least squares regression (LSR) is a popular framework for multicategory classification because it has a simple mathematical formulation and an efficient solution. The classification performance of LSR-based methods depends heavily on the discriminative capability of the label transformation. In this work, we aim to enhance the discriminative capability of the label transformation by imposing a new class-induced structure constraint. Specifically, we propose to regularize the label transformation matrix by the difference between the l2,1 norm and the l2,2 norm of the predicted labels of each class. The major advantage of the new regularizer is that it guarantees the ideal discrimination of the label transformation matrix and makes the classification more stable. For better generalization capability, we adopt the existing ε-dragging technique to relax the binary labels. Leveraging the new regularization term and the label relaxation, we formulate a group discriminative least squares regression (GDLSR) training model to learn the label transformation for multicategory classification. To solve the proposed model, we present an ADMM-like iterative algorithm, for which we can guarantee weak convergence. Experiments on several commonly used datasets show that our method outperforms both related LSR-based methods and some traditional methods.
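To make the class-induced regularizer concrete, the following is a minimal sketch of how the per-class norm difference described above could be computed: for each class, the l2,1 norm (sum of row l2 norms) minus the l2,2 (Frobenius) norm of that class's predicted-label submatrix. All function and variable names here are illustrative assumptions, not from the paper.

```python
import numpy as np

def group_discriminative_penalty(P, labels):
    """Sketch of the class-induced regularizer: for each class c, the
    difference between the l2,1 norm and the l2,2 (Frobenius) norm of
    the submatrix of predicted labels belonging to class c.

    P      : (n_samples, n_classes) matrix of predicted labels
    labels : (n_samples,) array of ground-truth class indices
    """
    total = 0.0
    for c in np.unique(labels):
        Pc = P[labels == c]                       # predictions for class c
        l21 = np.linalg.norm(Pc, axis=1).sum()    # sum of row l2 norms
        l22 = np.linalg.norm(Pc)                  # Frobenius (l2,2) norm
        total += l21 - l22                        # always non-negative
    return total
```

Since the l2,1 norm of a matrix always dominates its Frobenius norm, each per-class term is non-negative, so the penalty acts as a valid regularizer in the training objective.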