Swiss German dialects pose significant challenges for natural language processing (NLP) applications due to their lack of a standard orthography, their linguistic diversity, and the scarcity of annotated data. We introduce a method for normalizing Swiss German text to Standard German by employing mT5, a state-of-the-art multilingual large language model (LLM) capable of performing a wide range of text-to-text transformations across many languages. Our approach aims not only to improve the processing of Swiss German dialects but also to broaden the understanding of how well pre-trained LLMs can be adapted to dialect normalization. We fine-tuned the small, base, and large variants of mT5 on the SwissDial dataset under various hyperparameter settings and evaluated the resulting models using the character n-gram F-score (chrF) and COMET metrics. The results show that mT5, even in its smallest variant, achieves high-quality normalization of Swiss German dialects, with only minimal performance differences between model sizes. This indicates that the SwissDial dataset is sufficiently large for effective fine-tuning and that less resource-intensive models are viable for this task. Our findings highlight the potential of LLMs such as mT5 as powerful tools for dialect normalization and related NLP challenges, offering a promising alternative to traditional methods.