Federated Learning (FL) is a distributed machine learning approach that preserves privacy by allowing numerous devices to collaboratively train a global model without sharing raw data. However, model updates are frequently exchanged between these devices and the central server, and many of the updates are similar and redundant, wasting communication and computation. Selecting a subset of devices for each round of FL training can mitigate this issue. Nevertheless, most existing device selection methods are biased, while unbiased methods often perform unstably on Non-Independent and Identically Distributed (Non-IID) and unbalanced data. To address this, we propose Diversity-aware Unbiased Device Selection (DUDS), a stable device selection method for FL on Non-IID and unbalanced data. DUDS diversifies the participation probabilities used for device sampling in each FL training round, mitigating the randomness of the individual device selection process. By using a leader-based cluster adjustment mechanism to satisfy the unbiased selection constraint, DUDS achieves stable convergence and results close to the optimum attained when all devices participate. Extensive experiments demonstrate the effectiveness of DUDS in Non-IID and unbalanced data scenarios in FL.
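To make the unbiasedness constraint concrete, the sketch below illustrates the standard importance-weighting idea behind unbiased device sampling: if device i is selected with probability p_i and its update is scaled by w_i / p_i, the aggregated update equals the full-participation weighted average in expectation, whatever per-round probabilities are used. This is a minimal illustration of the general principle only, not the authors' DUDS algorithm; the function names and toy data are assumptions made for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_devices(probs, rng):
    # Independently include device i with probability probs[i].
    return np.flatnonzero(rng.random(len(probs)) < probs)

def unbiased_aggregate(updates, weights, probs, selected):
    # Scale each selected update by weights[i] / probs[i] so that the
    # expectation over random selections equals the full weighted average.
    agg = np.zeros_like(updates[0])
    for i in selected:
        agg += (weights[i] / probs[i]) * updates[i]
    return agg / weights.sum()

# Toy demo: 10 devices with scalar "updates" and equal data-size weights.
n = 10
updates = [np.array([float(i)]) for i in range(n)]
weights = np.ones(n)
probs = np.full(n, 0.3)  # a diversity-aware scheme would vary these per round

full_avg = sum(w * u for w, u in zip(weights, updates)) / weights.sum()
est = np.mean(
    [unbiased_aggregate(updates, weights, probs, sample_devices(probs, rng))
     for _ in range(20000)],
    axis=0,
)
print(full_avg, est)  # the estimator's mean approaches the full-participation average
```

In this toy setting the averaged estimate converges to the result obtained when all devices participate, which is the sense in which a selection rule is "unbiased"; the contribution described in the abstract lies in how the probabilities are diversified and adjusted per cluster while preserving this property.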