Federated learning (FL) has emerged as a crucial technology in today’s data-centric environment, enabling decentralized machine learning while safeguarding user privacy. This study introduces “Federated Learning ML Operations (FedOps) Mobile”, a novel FL framework optimized for the dynamic and heterogeneous ecosystem of mobile devices. FedOps Mobile addresses the inherent challenges of FL, such as system scalability, device heterogeneity, and operational efficiency, through on-device training with TensorFlow Lite and CoreML. The framework includes client selection mechanisms that assess device readiness and capabilities, ensuring equitable and efficient participation across the network. In addition, FedOps Mobile uses remote device control for task management and continuous learning without compromising the user experience. The main contribution of this study is the demonstration that federated learning across heterogeneous devices, including devices running different operating systems, can be both practical and efficient with the FedOps Mobile framework. This was validated through experiments covering three key areas: operational efficiency, model personalization, and resource optimization in multi-device settings. The results show that the proposed method performs strongly in client selection, energy efficiency, and model optimization, establishing a new benchmark for federated learning in diverse and complex environments.
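To give a concrete sense of what readiness-aware client selection can look like, the sketch below scores candidate devices by battery level, network quality, idle state, and compute capability before sampling participants for a training round. This is a minimal illustration only, assuming hypothetical fields, weights, and function names (DeviceStatus, readiness_score, select_clients); it is not taken from the FedOps Mobile implementation.

# Hypothetical sketch of readiness-aware client selection (not the FedOps Mobile API).
# All fields and scoring weights are illustrative assumptions.
import random
from dataclasses import dataclass
from typing import List

@dataclass
class DeviceStatus:
    device_id: str
    battery_level: float      # 0.0 - 1.0
    on_unmetered_network: bool
    is_idle: bool
    compute_score: float      # normalized device benchmark, 0.0 - 1.0

def readiness_score(d: DeviceStatus) -> float:
    """Combine simple readiness signals into a single score in [0, 1]."""
    score = 0.4 * d.battery_level + 0.3 * d.compute_score
    score += 0.2 if d.on_unmetered_network else 0.0
    score += 0.1 if d.is_idle else 0.0
    return score

def select_clients(candidates: List[DeviceStatus], k: int,
                   min_score: float = 0.5) -> List[str]:
    """Keep devices above a readiness threshold, then sample k of them
    at random so participation stays spread across the eligible pool."""
    eligible = [d for d in candidates if readiness_score(d) >= min_score]
    chosen = random.sample(eligible, min(k, len(eligible)))
    return [d.device_id for d in chosen]

if __name__ == "__main__":
    pool = [
        DeviceStatus("android-01", 0.9, True, True, 0.7),
        DeviceStatus("ios-02", 0.3, False, False, 0.8),
        DeviceStatus("android-03", 0.8, True, False, 0.5),
    ]
    print(select_clients(pool, k=2))

Random sampling among eligible devices, rather than always picking the top scorers, is one simple way to keep participation equitable across the network while still excluding devices that are not ready to train.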