High-strength steels generally contain a large number of spherical carbides, with sizes ranging from several hundred nanometers to several micrometers, which strongly affect fatigue performance. Most studies hold that carbides, acting like inclusions as fatigue crack initiators, should be refined as much as possible to improve fatigue performance. However, when the carbide size reaches the submicron scale (several hundred nanometers to one micrometer), the traditional theory of how carbides affect fatigue performance may no longer apply. In this paper, it is found that smaller carbides are not always better and that an optimal submicron carbide size exists. By means of experiments and simulations, the mechanism by which submicron carbides affect the fatigue performance of high-strength steels is systematically studied. It is found that a microstructural transition occurs around carbides as a result of local stress concentration. This transition leads to the formation of Effective Strengthening Layers (ESLs), which force short fatigue cracks to propagate along the ESL-matrix interface and thereby decelerate crack propagation. However, the local stress concentration that generates an ESL also decreases as the carbide size decreases. Refining carbides thus has two competing effects, a larger specific ESL area but a weaker stress concentration, and this competition yields an optimum carbide size. Based on this finding, a novel quantitative fatigue performance evaluation model, validated experimentally, is proposed for high-strength steels, providing theoretical guidance for the microstructural design of submicron carbides to improve fatigue performance.
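To illustrate why a trade-off of this kind can produce an optimum, a minimal toy model can be sketched (for illustration only, not the quantitative evaluation model proposed in this paper; the exponential activation form and the characteristic length $d_c$ below are assumptions). At a fixed carbide volume fraction $f_v$, spherical carbides of diameter $d$ provide a specific ESL-matrix interface area $S_V(d) \propto f_v/d$, while the fraction of carbides whose local stress concentration is sufficient to trigger ESL formation can be modeled as increasing with size, e.g. $\phi(d) = e^{-d_c/d}$. A simple measure of the overall crack-retarding effect is then
\[
E(d) \;\propto\; S_V(d)\,\phi(d) \;\propto\; \frac{f_v}{d}\,e^{-d_c/d},
\]
which vanishes both for very small $d$ (the stress concentration is too weak to form ESLs) and for very large $d$ (too little interface area per unit volume); setting $\mathrm{d}E/\mathrm{d}d = 0$ gives a finite optimum at $d^{*} = d_c$.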