This paper explores the application of Large Language Models (LLMs) in the automotive and automotive-supplier industries, with a particular focus on retrieval-augmented generation (RAG) systems for streamlining information retrieval from technical documentation. The research, conducted within the CoLab4DigiTwin project, investigates how digital twins supported by smart services can enhance interdisciplinary collaboration and reduce reliance on manual data searches. We developed a pipeline based on a RAG architecture that uses a vector database for efficient data management and fast access to relevant information, avoiding the need for expensive computational resources. We evaluated the performance of several open-source LLMs fine-tuned for German, focusing on readability, clarity, and accuracy. The results show solid system performance without any additional task-specific model fine-tuning. Future research will refine these processes and extend the applicability of RAG systems, highlighting the potential of Large Language Models to transform how industry interacts with its data.
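
To make the described architecture concrete, the following is a minimal sketch of the retrieval step of a generic RAG pipeline; it is not the project's implementation. The toy embedding function, the example document chunks, and the query are illustrative stand-ins assumed for the example, and in practice a trained (e.g. German-language) sentence encoder and a dedicated vector database would take their place.

```python
# Hypothetical sketch of RAG retrieval: embed document chunks, store them in a
# vector index, and pass the top-k matches for a query to an LLM as context.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy stand-in for a real embedding model (assumption, not the project's encoder)."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

# "Vector database" stand-in: an in-memory matrix of chunk embeddings.
chunks = [
    "Drehmomentvorgaben für die Montage der Radnabe ...",
    "Wartungsintervalle der Hochvoltbatterie ...",
    "Spezifikation der CAN-Bus-Schnittstelle ...",
]
index = np.stack([embed(c) for c in chunks])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k chunks most similar to the query (cosine similarity)."""
    scores = index @ embed(query)
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

context = "\n".join(retrieve("Welche Drehmomente gelten für die Radnabe?"))
prompt = f"Beantworte anhand des folgenden Kontexts:\n{context}\n\nFrage: ..."
# `prompt` would then be sent to the selected open-source LLM for answer generation.
```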