The hot tearing susceptibility (HTS) of Mg–xAl–yCa (x + y = 8) alloys with different Ca/Al ratios (0.06, 0.34, 0.63, 1.04, 1.75, and 2.82) is experimentally investigated using a “T-shaped” hot tearing measurement system. Additionally, the HTS of the alloys is evaluated with an optimized version of the Clyne–Davies model in combination with the measured solidification parameters. The prediction of the optimized model, namely that the HTS decreases as the Ca/Al ratio increases, agrees well with both the numerical simulations and the experimental results. The hot tearing curves, thermal analysis curves, and microstructures further show that, as the Ca/Al ratio increases, the intergranular bonding before hot tear initiation changes from relying on a liquid film alone to involving both intergranular bridging and the liquid film, which strengthens the bonding between grains. Moreover, the increases in eutectic phase content, in the number of voids in the dendrite skeleton, and in dendrite gap width provide more feeding channels and lower the flow resistance of the residual liquid, thereby improving its ability to feed incipient tears. In addition, the larger grain size of the high-Ca/Al-ratio alloys reduces both the number and the total length of grain boundaries, and thus the number of sites where hot tears can initiate. As a result, the HTS of alloys with a high Ca/Al ratio is significantly reduced.
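For reference, the standard Clyne–Davies cracking susceptibility coefficient, on which the optimized model builds (the specific optimization used here is not reproduced in this summary), can be written as CSC = t_V / t_R = (t_0.99 − t_0.90) / (t_0.90 − t_0.40), where t_fs denotes the time at which a given solid fraction fs is reached, t_V is the time spent in the crack-vulnerable late stage of solidification (conventionally fs ≈ 0.90–0.99), and t_R is the time available for stress relief by liquid feeding and mass flow (conventionally fs ≈ 0.40–0.90); a larger CSC indicates a higher hot tearing susceptibility.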