This work presents an efficient data-centric client selection approach, named DICE, to enable federated learning (FL) over distributed edge networks. Prior research has focused on assessing the computation and communication abilities of client devices when selecting them for FL. On-device data quality, in terms of data volume and heterogeneity, across these distributed devices has been largely overlooked. The consequence is the selection of an unsuitable subset of clients with poor-quality data, which inevitably degrades the trained model. To address this problem, we design DICE, which prioritizes the data quality of client devices during selection, in addition to their computation and communication abilities, to improve the accuracy of FL. Additionally, DICE enlists the assistance of vicinal edge devices to compensate for the limited computation or communication abilities of certain devices, without violating the privacy-preserving guarantees of FL. To this end, we propose a scheme that selects the optimal helper device in terms of latency and workload. The experimental results show that DICE achieves faster convergence to a given level of model accuracy. Further, the simulation results show that DICE reduces delay by at least 16%, energy consumption by at least 17%, and packet loss by at least 55% compared to existing benchmarks while prioritizing the on-device data quality across clients.
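For intuition, the following is a minimal sketch of the kind of selection criterion the abstract describes: scoring each client on both system capability (computation, communication) and data quality (volume, heterogeneity), then picking the top clients. The field names, weights, and scoring formula here are illustrative assumptions, not DICE's actual design, which is defined in the body of the paper.

```python
# Hypothetical illustration of data-centric client scoring; the weights,
# fields, and formula are assumptions, not DICE's actual selection rule.
from dataclasses import dataclass


@dataclass
class Client:
    client_id: int
    compute_capacity: float  # normalized computation ability in [0, 1]
    link_rate: float         # normalized communication ability in [0, 1]
    num_samples: int         # on-device data volume
    label_entropy: float     # proxy for data heterogeneity in [0, 1]


def score(c: Client, w=(0.25, 0.25, 0.25, 0.25), max_samples=10_000) -> float:
    """Weighted sum of capability and data-quality terms, each in [0, 1]."""
    volume = min(c.num_samples / max_samples, 1.0)
    return (w[0] * c.compute_capacity + w[1] * c.link_rate
            + w[2] * volume + w[3] * c.label_entropy)


def select_clients(clients: list[Client], k: int) -> list[Client]:
    """Select the top-k clients by the combined score."""
    return sorted(clients, key=score, reverse=True)[:k]
```

A real implementation would additionally handle the helper-device assignment sketched in the abstract, e.g., pairing a resource-poor client with the vicinal edge device that minimizes a latency-plus-workload cost.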