Federated Learning (FL) has gained popularity due to its advantages over centralized learning. However, existing FL research has primarily focused on unconstrained wired networks, neglecting the challenges posed by wireless Internet of Things (IoT) environments. Successfully integrating FL into IoT networks requires tailored adaptations to their unique constraints, especially in computation and communication. This paper introduces Communication-Aware Federated Averaging (CAFA), a novel algorithm designed to enhance FL operations in wireless IoT networks with shared communication channels. CAFA primarily exploits the latent computational capacity available during the communication phase for local training and aggregation. Through extensive and realistic evaluations in a dedicated FL-IoT framework, our method demonstrates significant advantages over state-of-the-art approaches. Specifically, CAFA achieves up to a 4x reduction in communication costs and accelerates FL training by as much as 70%, while preserving model accuracy. These results position CAFA as a promising solution for the efficient implementation of FL in constrained wireless networks.
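
To give a concrete flavor of the overlap principle the abstract refers to (using otherwise idle compute time while an update is in flight over a shared channel), the sketch below shows one hypothetical way a client could keep training locally during its upload. It is a minimal illustration, not the paper's CAFA implementation; the function names, timing constants, and the toy "model" are assumptions introduced here for clarity.

```python
# Minimal sketch (not the paper's implementation): a client continues local
# training while its previous model update is transmitted over a slow,
# shared wireless channel. All names and timings below are illustrative.
import threading
import time
import random


def train_one_step(model):
    """Placeholder for one local SGD step (assumed ~10 ms of compute)."""
    time.sleep(0.01)
    return model + random.uniform(-0.01, 0.01)  # toy "update"


def upload_update(update, channel_busy_s=0.5):
    """Placeholder for sending an update over the shared channel."""
    time.sleep(channel_busy_s)


def client_round(model, steps_per_round=20):
    # Start uploading the current update in the background ...
    uploader = threading.Thread(target=upload_update, args=(model,))
    uploader.start()

    # ... and exploit the otherwise idle CPU time to keep training locally,
    # which is the communication-aware overlap the abstract describes.
    for _ in range(steps_per_round):
        model = train_one_step(model)

    uploader.join()  # the round ends once the transmission completes
    return model


if __name__ == "__main__":
    m = 0.0
    for r in range(3):
        m = client_round(m)
        print(f"round {r}: local model value = {m:.4f}")
```

In this toy setup the 0.5 s transmission window absorbs the 20 local steps at no extra wall-clock cost, which is the kind of hidden computational budget the paper's approach targets.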