Advances in lithium-ion battery technology have made battery thermal management systems an essential component of the battery pack in next-generation energy storage systems. Dielectric immersion cooling has been shown to achieve high heat transfer rates owing to the direct contact between cells and the coolant. However, feedback control has not been widely applied to immersion cooling schemes, and current research has not considered battery pack plant design when optimizing feedback control. Uncertainties are inherent in the cooling equipment, producing temperature and flow-rate fluctuations, so these uncertainties must be considered systematically during cooling system design to improve the performance and reliability of the battery pack. To fill this gap, we established a reliability-based control co-design optimization framework using machine learning for immersion-cooled battery packs. We first developed an experimental setup for immersion cooling of 21700 cells, and the experimental data were used to build a high-fidelity multiphysics finite element model that accurately represents the electrical and thermal behavior of the battery. We then developed surrogate models based on the finite element simulations to reduce computational cost. Reliability-based control co-design optimization was employed to find the best plant and control design for the cooling system, in which an outer optimization loop minimizes the cooling system cost while an inner loop ensures battery pack reliability. Finally, an optimal cooling system design was obtained and validated, showing a 90% reduction in cooling system energy consumption.
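The nested structure described above (an outer loop minimizing cooling system cost over plant and control variables, an inner loop enforcing a reliability constraint via surrogate predictions) can be sketched as follows. This is a minimal illustration only: the surrogate functions, the 40 °C temperature limit, the choice of flow rate and controller gain as design variables, and the grid search are all hypothetical stand-ins, not the paper's actual models or optimizer.

```python
# Toy sketch of a bilevel (control co-design) optimization with surrogates.
# All function forms, limits, and grids below are illustrative assumptions.

def surrogate_peak_temp(flow_rate, gain):
    """Hypothetical surrogate: predicted peak cell temperature (degC)."""
    return 45.0 - 8.0 * flow_rate - 3.0 * gain + 0.5 * flow_rate * gain

def surrogate_energy(flow_rate, gain):
    """Hypothetical surrogate: pumping/control energy cost (arbitrary units)."""
    return 2.0 * flow_rate ** 2 + 0.8 * gain ** 2

T_LIMIT = 40.0  # assumed reliability constraint on peak temperature (degC)

def inner_reliable(flow_rate, gain):
    """Inner loop: accept a candidate design only if the surrogate-predicted
    peak temperature satisfies the reliability constraint."""
    return surrogate_peak_temp(flow_rate, gain) <= T_LIMIT

def outer_optimize(flow_grid, gain_grid):
    """Outer loop: minimize energy cost over plant (flow rate) and control
    (gain) variables, restricted to designs the inner loop deems reliable."""
    best = None
    for q in flow_grid:
        for k in gain_grid:
            if not inner_reliable(q, k):
                continue
            cost = surrogate_energy(q, k)
            if best is None or cost < best[0]:
                best = (cost, q, k)
    return best

grid = [0.1 * i for i in range(1, 21)]  # coarse grid over both variables
cost, q_opt, k_opt = outer_optimize(grid, grid)
print(f"optimal flow={q_opt:.1f}, gain={k_opt:.1f}, cost={cost:.3f}")
```

In practice the grid search would be replaced by a gradient-based or evolutionary optimizer and the inner check by a probabilistic reliability analysis, but the division of labor between the two loops is the same: the inner loop filters designs for feasibility under uncertainty, the outer loop ranks the survivors by cost.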