Federated learning (FL) enables distributed on-device computation systems to train an optimal model efficiently by communicating local model updates and distributing a global model produced through an aggregation (averaging) procedure. On-device FL is a common application in wireless environments, where several mobile devices participate as nodes in the FL network. Managing high-dimensional model updates and resource-constrained computation in large-scale heterogeneous IoT cellular networks is challenging. This article introduces a Lifetime Maximization using Optimal Directed Acyclic Graph Federated Learning in IoT Communication Networks (LM-ODAGFL) technique. The proposed LM-ODAGFL technique combines FL with metaheuristic optimization for energy-efficient IoT networks. The Directed Acyclic Graph (DAG) model addresses device asynchrony in FL while minimizing additional resource usage. The Archimedes Optimization Algorithm (AOA) is employed to optimize the DAG model by jointly reducing user energy consumption and the training loss of the FL model. The performance of the LM-ODAGFL technique is validated through a series of experiments. The results show that the LM-ODAGFL model achieves superior performance, consuming significantly less energy than SDAGFL and ESDAGFL, with per-round values of 0.373 to 0.485 kJ on the FMNIST-Clustered dataset and 16.27 to 20.34 kJ on the Poets dataset, compared with 0.000 to 1.442 kJ and 0.00 to 63.89 kJ, respectively.
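
Because the abstract describes the optimization only at a high level, the following Python sketch illustrates the general idea of using an AOA-style population search to minimize a weighted combination of per-round energy consumption and FL training loss over candidate DAG/FL configuration parameters. It is a minimal, self-contained illustration under stated assumptions: the objective terms, weights, bounds, and parameter names are hypothetical and do not reproduce the authors' LM-ODAGFL implementation.

    # Simplified AOA-style search (illustrative only, not the paper's code).
    import numpy as np

    rng = np.random.default_rng(0)

    DIM = 4            # hypothetical decision variables (e.g., tip-selection / CPU-frequency knobs)
    LB, UB = 0.0, 1.0  # normalized search bounds
    POP, ITERS = 20, 50
    W_ENERGY, W_LOSS = 0.5, 0.5   # hypothetical weights of the two objectives

    def fitness(x):
        """Toy bi-objective: lower energy and lower training loss are both better."""
        energy = np.sum(x ** 2)               # stand-in for per-round energy cost
        loss = np.sum((x - 0.3) ** 2) + 0.1   # stand-in for FL training loss
        return W_ENERGY * energy + W_LOSS * loss

    # Initialize object positions, densities, volumes, and accelerations.
    pos = rng.uniform(LB, UB, (POP, DIM))
    den = rng.random((POP, DIM))
    vol = rng.random((POP, DIM))
    acc = rng.uniform(LB, UB, (POP, DIM))

    fit = np.array([fitness(p) for p in pos])
    best = int(np.argmin(fit))
    best_pos, best_den = pos[best].copy(), den[best].copy()
    best_vol, best_acc = vol[best].copy(), acc[best].copy()
    best_fit = fit[best]

    C1, C2, C3, C4 = 2, 6, 2, 0.5

    for t in range(1, ITERS + 1):
        tf = np.exp((t - ITERS) / ITERS)             # transfer operator: exploration -> exploitation
        d = np.exp((ITERS - t) / ITERS) - t / ITERS  # decreasing density factor
        den = den + rng.random((POP, DIM)) * (best_den - den)
        vol = vol + rng.random((POP, DIM)) * (best_vol - vol)

        acc_new = np.empty_like(acc)
        for i in range(POP):
            if tf <= 0.5:   # exploration: collide with a random object
                mr = rng.integers(POP)
                acc_new[i] = (den[mr] * vol[mr] * acc[mr]) / (den[i] * vol[i] + 1e-12)
            else:           # exploitation: follow the best object
                acc_new[i] = (best_den * best_vol * best_acc) / (den[i] * vol[i] + 1e-12)
        # Normalize accelerations to [0.1, 0.9].
        acc = 0.9 * (acc_new - acc_new.min()) / (acc_new.max() - acc_new.min() + 1e-12) + 0.1

        for i in range(POP):
            if tf <= 0.5:
                rand_pos = pos[rng.integers(POP)]
                pos[i] = pos[i] + C1 * rng.random(DIM) * acc[i] * d * (rand_pos - pos[i])
            else:
                T = min(C3 * tf, 1.0)
                flag = 1 if (2 * rng.random() - C4) <= 0.5 else -1
                pos[i] = best_pos + flag * C2 * rng.random(DIM) * acc[i] * d * (T * best_pos - pos[i])
            pos[i] = np.clip(pos[i], LB, UB)
            f = fitness(pos[i])
            if f < best_fit:
                best_fit, best_pos = f, pos[i].copy()
                best_den, best_vol, best_acc = den[i].copy(), vol[i].copy(), acc[i].copy()

    print("best configuration:", np.round(best_pos, 3), "objective:", round(best_fit, 4))

In this sketch the returned configuration vector would correspond to the DAG/FL settings that minimize the weighted energy-plus-loss objective; in practice, the energy and loss terms would come from the wireless device model and the FL training process rather than the toy functions used here.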