Most relational knowledge distillation methods individually optimize pairwise similarities to improve the accuracy of a lightweight student network. However, this optimization strategy may be suboptimal for object re-identification (Re-ID), which prioritizes ranking, because it does not guarantee that the lightweight student network produces ranking results consistent with those of the large teacher network. To this end, we propose a novel method called pairwise difference relational distillation (PDRD) for object Re-ID. First, we theoretically prove that minimizing the difference between the pairwise similarities produced by the student and teacher networks ensures consistent ranking results between the two. Second, building on this theoretical foundation, we apply non-linear activation functions to the pairwise similarity differences to construct a non-linear pairwise difference relational knowledge loss, which enhances knowledge transfer. Extensive experiments on four public datasets demonstrate that our method achieves state-of-the-art performance. For example, on Market-1501, using ResNet18 as the lightweight student network, our method achieves a rank-1 identification rate of 93.62%.
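To make the core idea concrete, the following is a minimal PyTorch sketch of what a pairwise difference relational loss might look like; this is not the authors' implementation, and the function name `pdrd_loss` and the squared-exponential activation are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F

def pdrd_loss(student_feats: torch.Tensor, teacher_feats: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch of a pairwise difference relational loss.

    Builds pairwise cosine-similarity matrices for the student and teacher
    embeddings of a batch, then penalizes a non-linear function of their
    element-wise difference. The paper's exact activation is not given here;
    a squared-exponential is assumed for illustration.
    """
    s = F.normalize(student_feats, dim=1)   # (B, d_s) unit-norm student embeddings
    t = F.normalize(teacher_feats, dim=1)   # (B, d_t) unit-norm teacher embeddings
    sim_s = s @ s.t()                       # (B, B) student pairwise similarities
    sim_t = t @ t.t()                       # (B, B) teacher pairwise similarities
    diff = sim_s - sim_t                    # pairwise similarity differences
    # Assumed non-linear activation on the differences: large discrepancies
    # are amplified, near-zero ones contribute little to the loss.
    return torch.mean(torch.expm1(diff.pow(2)))
```

Note that the student and teacher may have different embedding dimensions; only the B-by-B similarity matrices are compared, which is what lets a relational loss bypass any feature-projection layer.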