Federated learning protects data privacy by having users upload only local model updates rather than raw data. However, recent research has shown that private data can still be inferred from these local updates. Privacy-preserving federated learning also faces challenges concerning the confidentiality of local updates, the robustness of aggregation under dynamic membership, and the determination of an appropriate clipping threshold in differentially private federated learning. To address these issues, this paper presents EPFL-DAC: Enhancing Privacy in Federated Learning with Dynamic Aggregation and Clipping. First, Paillier homomorphic encryption is employed to aggregate local updates securely, ensuring that the server can access only the aggregated result. Second, a dynamic robustness algorithm is designed to keep the aggregation scheme robust when users join or drop out. Finally, a dynamic threshold determination method is introduced to calibrate the injected noise more accurately while also resisting collusion attacks in the differentially private setting. Experimental results demonstrate that the dynamic aggregation scheme is effective, incurs low computational overhead, and preserves model accuracy. Moreover, the proposed dynamic threshold determination method performs well, especially at higher privacy-protection levels and larger initial clipping thresholds.
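To illustrate the secure-aggregation building block the abstract refers to, the sketch below shows Paillier's additive homomorphism: the server multiplies ciphertexts and can decrypt only the sum of the users' values, never the individual updates. The primes, function names, and scalar-valued "updates" here are toy choices for exposition, assumed for this example rather than taken from the paper's protocol, and the key size is far too small for real use.

```python
import math
import random

def keygen(p=104_729, q=1_299_709):
    # Toy demo primes -- NOT secure; a real deployment needs ~2048-bit primes.
    n = p * q
    lam = math.lcm(p - 1, q - 1)          # Carmichael function of n
    mu = pow(lam, -1, n)                  # valid because we fix g = n + 1
    return n, (lam, mu)                   # (public key, secret key)

def encrypt(n, m):
    n2 = n * n
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    # c = (1 + n)^m * r^n mod n^2; note (1 + n)^m == 1 + m*n (mod n^2)
    return ((1 + m * n) % n2) * pow(r, n, n2) % n2

def decrypt(n, sk, c):
    lam, mu = sk
    n2 = n * n
    x = pow(c, lam, n2)
    return ((x - 1) // n) * mu % n

def aggregate(n, ciphertexts):
    # Multiplying ciphertexts yields an encryption of the sum of plaintexts,
    # so the server learns only the aggregate after decryption.
    n2 = n * n
    out = 1
    for c in ciphertexts:
        out = out * c % n2
    return out

n, sk = keygen()
local_updates = [3, 5, 7]                      # stand-ins for users' updates
cts = [encrypt(n, m) for m in local_updates]   # each user encrypts locally
print(decrypt(n, sk, aggregate(n, cts)))       # only the sum is recovered: 15
```

In a real scheme each "update" is a high-dimensional vector (encrypted coordinate-wise or packed), and the decryption capability is held so that no single party can open individual ciphertexts.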