This study presents a novel path planning and path-following system designed for real-world driving conditions in autonomous vehicles. The pipeline accounts for uncertainties in the vehicle's dynamic parameters, which are common in practice; to address them, a robust multi-stage model predictive control approach is developed to generate optimal control outputs. A customized convolutional neural network, Nano-FastSCNN, quantized to half precision, accurately segments the left and right lane boundaries, and its generalization under challenging weather types is rigorously evaluated to ensure the visual planner remains robust in adverse conditions. The centerline of the detected lanes is selected as the desired path and transformed into the camera reference frame by leveraging the camera projection matrix; the controller then uses this reference trajectory to guide the vehicle along the desired path. To demonstrate real-time processing and the scalability of the architecture, the entire pipeline, including the vision module and the controller, is implemented in a hardware-in-the-loop setup on three NVIDIA Jetson devices. Comprehensive evaluations show that the pipeline achieves an inference speed of 15.6 FPS even on the least powerful device, the NVIDIA Jetson Nano, and that the system reliably mitigates the detrimental effects of the uncertainties while following the desired path.
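The centerline-to-camera-frame transformation mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a pinhole camera with a hypothetical intrinsic matrix `K`, a flat ground plane, and an assumed camera mounting height, and it back-projects centerline pixels onto the ground to obtain metric points in the camera reference frame.

```python
import numpy as np

# Hypothetical intrinsics for a 1280x720 camera (assumed values, not from the paper).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

def pixel_to_ground(u, v, cam_height=1.2):
    """Back-project pixel (u, v) onto the flat ground plane located
    cam_height metres below the camera (camera frame: x right, y down,
    z forward). Returns None for pixels at or above the horizon."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray, z = 1
    if ray[1] <= 0.0:
        return None
    scale = cam_height / ray[1]  # stretch the ray until it hits the ground
    return ray * scale           # 3-D point in the camera frame (metres)

# Segmented left/right lane-boundary pixels (illustrative values);
# the desired path is the centerline, i.e. their midpoints.
left = np.array([[500, 700], [540, 600], [570, 500]], dtype=float)
right = np.array([[800, 700], [750, 600], [710, 500]], dtype=float)
center = (left + right) / 2.0

path = [pixel_to_ground(u, v) for u, v in center]  # reference path for the controller
```

Under the flat-ground assumption, pixels closer to the horizon (smaller `v`) map to points farther ahead of the vehicle, so the recovered path naturally extends forward along the road.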