Purpose: This article investigates the intricate balance between free speech and content restrictions within the frameworks of the EU Digital Services Act (DSA) and the Digital Markets Act (DMA). It examines the duties of 'gatekeepers' under the DMA, the challenges of content moderation, and the delicate equilibrium between fostering open communication and curbing harmful material. The primary aim is to contribute to the ongoing debate on digital platform regulation, focusing in particular on the gatekeeper designation under the DMA and its potential to enhance freedom of speech while guarding against over-compliance and excessive censorship.

Methodology: The study employs a mixed-methods approach to provide a comprehensive analysis of the regulatory landscape and its implications. It combines a literature review of academic journals, legal documents, and policy papers; in-depth document analysis of the DMA and DSA; case studies of landmark decisions such as GS Media and Delfi AS v. Estonia; interviews with key stakeholders; a survey of platform users; comparative analysis with other jurisdictions; policy impact assessment; and content analysis of platform policies. The research adheres to ethical guidelines, ensuring informed consent and data protection.

Findings: The findings reveal that the DMA's gatekeeper rules have the potential to significantly enhance freedom of speech by curbing platform dominance, promoting content diversity, and encouraging transparent content moderation. At the same time, the study identifies a risk that over-compliance may lead to the suppression of free speech. It highlights the importance of independent oversight, clear platform guidelines, and continuous dialogue among stakeholders to maintain a balance between open markets and free speech.

Unique Contribution to Theory, Policy and Practice: This article offers a novel perspective on how the DMA's gatekeeper designation can improve freedom of speech while avoiding excessive censorship. It provides valuable insights for policymakers, digital platforms, and civil society organizations, emphasizing the need for a nuanced approach to content moderation that respects users' rights. The article also underscores the importance of ongoing review and adaptation of regulations to address the evolving digital landscape, thereby contributing to a safer, more competitive, and fairer online environment.