Representation based classification (RC) algorithms have been successfully applied to pattern classification. However, most existing RC algorithms are not robust to bad training samples, because they use the training samples directly as dictionary atoms rather than learning more discriminative atoms. In addition, to improve representational ability and recognition accuracy, RC algorithms often need to expand the number of dictionary atoms, which sharply increases storage and computational costs. To obtain a more discriminative and compact dictionary, this study proposes discriminative dictionary learning for nonnegative representation based classification (DDLNRC). Specifically, DDLNRC imposes a nonnegativity constraint to obtain a nonnegative representation of each training sample over the dictionary. In the dictionary learning stage, for each training sample, DDLNRC minimizes the intra-class reconstruction error of the sample while simultaneously enlarging the distance between the sample and the atom that contributes most to the inter-class reconstruction error. Experiments demonstrate the effectiveness of DDLNRC. Combined with deep neural network features, it also achieves higher accuracy than a Softmax classifier.
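
To make the classification step concrete, the following is a minimal sketch (not the authors' implementation) of nonnegative representation based classification only, omitting the DDLNRC dictionary learning stage: a test sample is coded over the dictionary under a nonnegativity constraint, and the class with the smallest class-wise reconstruction residual is predicted. The function name nrc_classify and the use of scipy's nonnegative least squares solver are illustrative assumptions.

    import numpy as np
    from scipy.optimize import nnls

    def nrc_classify(D, labels, y):
        """Classify test sample y by its nonnegative representation over dictionary D.

        D      : (d, n) dictionary whose columns are atoms (e.g., training samples)
        labels : (n,) class label of each atom
        y      : (d,) test sample
        Returns the predicted class label.
        (Illustrative sketch; DDLNRC would first learn D discriminatively.)
        """
        # Nonnegative coding: y ~= D @ x subject to x >= 0
        x, _ = nnls(D, y)

        # Assign the class whose atoms best reconstruct y
        best_label, best_err = None, np.inf
        for c in np.unique(labels):
            mask = (labels == c)
            err = np.linalg.norm(y - D[:, mask] @ x[mask])
            if err < best_err:
                best_label, best_err = c, err
        return best_label

In this sketch the dictionary columns are the raw training samples; DDLNRC instead learns the atoms so that intra-class reconstruction error is small and the most influential inter-class atom is pushed away from each training sample.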