Deep metric learning (DML) methods typically rely on class labels, pulling positive samples as close together as possible while pushing them away from negative samples. However, this objective tends to discard vital information inherent in the data, such as intra-class variance, which can hinder the generalization of the trained model. To address this issue, we propose an online batch diffusion-based self-distillation method (OBD-SD), which combines a progressive self-distillation (PSD) technique with an online batch diffusion process (OBDP). PSD is a simple yet effective self-distillation technique that promotes diversity in the learned embedding representations. OBDP applies a diffusion process to each mini-batch to reveal the intrinsic relationships among its samples and thereby produce better soft distance targets. Combining PSD with OBDP extracts richer relational information among samples by capturing the local geometric structure of the manifold within each mini-batch. OBD-SD is a highly flexible framework that can be integrated into state-of-the-art DML methods. Experimental results on the CUB200, CARS196, and Stanford Online Products datasets demonstrate that OBD-SD consistently improves the performance of existing DML methods without requiring additional training time, achieving competitive results.
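
To make the two components concrete, the following is a minimal, illustrative PyTorch sketch of one way a batch-level diffusion step and a relational self-distillation loss could fit together. It is not the paper's exact formulation: the function names (`batch_diffusion`, `diffusion_distillation_loss`), the cosine-similarity affinity, and the hyperparameters `alpha`, `n_iter`, and `tau` are all assumptions made for illustration.

```python
# Illustrative sketch only; names and hyperparameters are hypothetical,
# not taken from the OBD-SD paper.
import torch
import torch.nn.functional as F

def batch_diffusion(embeddings: torch.Tensor, alpha: float = 0.9,
                    n_iter: int = 3) -> torch.Tensor:
    """Diffuse pairwise affinities over the mini-batch graph so the
    refined similarities reflect the local manifold structure."""
    z = F.normalize(embeddings, dim=1)                 # unit-norm embeddings
    w = (z @ z.t()).clamp(min=0)                       # non-negative cosine affinities
    w.fill_diagonal_(0)                                # remove self-loops
    s = w / w.sum(dim=1, keepdim=True).clamp(min=1e-8) # row-stochastic transitions
    f = s.clone()
    for _ in range(n_iter):                            # iterative propagation:
        f = alpha * (s @ f) + (1.0 - alpha) * s        # mix walked and direct affinities
    return f                                           # diffused soft relational targets

def diffusion_distillation_loss(student_emb: torch.Tensor,
                                teacher_emb: torch.Tensor,
                                tau: float = 0.1) -> torch.Tensor:
    """Soft cross-entropy between the student's in-batch similarity
    distribution and diffused soft targets from the teacher embeddings."""
    mask = -1e4                                        # finite mask excluding self-pairs
    with torch.no_grad():
        t = batch_diffusion(teacher_emb) / tau
        t.fill_diagonal_(mask)
        target = F.softmax(t, dim=1)                   # soft distance targets
    zs = F.normalize(student_emb, dim=1)
    logits = (zs @ zs.t()) / tau
    logits.fill_diagonal_(mask)
    log_p = F.log_softmax(logits, dim=1)
    return -(target * log_p).sum(dim=1).mean()
```

In a progressive self-distillation setup of this kind, `teacher_emb` would come from the model's own earlier state (e.g., a snapshot from a previous epoch), so the soft targets evolve alongside training; this loss would then be added, with some weight, to the base DML loss of the host method.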