Achieving stable and reliable autonomous driving in complex traffic environments, while ensuring safety under unpredictable conditions, is a critical challenge in autonomous driving technology. To address this challenge, this study proposes the Safedrive Dreamer navigation framework, which reduces reliance on trial-and-error learning in real-world scenarios, thereby mitigating the risks associated with dynamic driving conditions and enhancing vehicle foresight. The framework integrates the predictive capabilities of world models with a constrained Markov decision process (CMDP) formulation and safe reinforcement learning to anticipate future environmental changes accurately, ensuring the reliability of planned driving routes and improving both safety and efficiency. Furthermore, to reduce trial-and-error costs in real-world deployment, this study employs PAC-Bayesian methods to derive generalization error bounds between simulation and reality, enabling more effective transfer of knowledge and experience from simulated to real-world scenarios. Validation experiments in simulated and real environments showed that Safedrive Dreamer outperformed existing autonomous driving solutions by 3.8% on key safety metrics, excelling in collision avoidance and risk reduction. This study offers new insights into integrating world models into decision-making processes for safety-critical applications, thereby contributing to the improvement of autonomous driving safety and reliability.
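The CMDP objective referenced above can be written in its standard form (a general sketch; the paper's specific reward, cost functions, and safety threshold are not given here, so the symbols below are generic placeholders):

```latex
\[
\max_{\pi} \; \mathbb{E}_{\tau \sim \pi}\!\left[\sum_{t=0}^{\infty} \gamma^{t}\, r(s_t, a_t)\right]
\quad \text{subject to} \quad
\mathbb{E}_{\tau \sim \pi}\!\left[\sum_{t=0}^{\infty} \gamma^{t}\, c(s_t, a_t)\right] \le d,
\]
```

where $\pi$ is the driving policy, $r$ the task reward, $c$ a safety cost (e.g., a collision-risk signal), $\gamma$ the discount factor, and $d$ a fixed safety budget. In this framing, the world model supplies predicted rollouts $\tau$ over which both expectations are estimated, so constraint satisfaction can be checked before actions are executed in the real environment.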