Late fibrosis can occur in breast cancer patients treated with curative-intent radiotherapy. Predicting this toxicity is of clinical interest in order to adapt the delivered irradiation dose. Radiation-induced CD8 T-lymphocyte apoptosis (RILA) has been shown to be associated with fewer grade ≥2 late radiation-induced toxicities in patients with various cancers. Tobacco smoking status and adjuvant hormone therapy have also been identified as potential factors related to late breast fibrosis-free survival. This article evaluates the predictive performance of RILA using a ROC curve analysis that accounts for the dynamic nature of fibrosis occurrence. This time-dependent ROC curve approach is also applied to evaluate the performance of RILA combined with the other previously identified factors. Our analysis includes a Monte Carlo cross-validation procedure and the calculation of an expected cost of misclassification, which gives greater weight to patients at no risk of late fibrosis, so that they can be treated with the maximal irradiation dose. Predictive performance was assessed at 12, 24, 36 and 50 months. At 36 months, our results were comparable to those obtained in a previous study, thus underlining the predictive power of RILA. Based on specificity and cost, RILA alone performed best, while its combination with the other factors yielded better negative predictive values.
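To make the weighting idea concrete, the minimal Python sketch below (not the authors' implementation) computes cumulative/dynamic sensitivity and specificity at a fixed horizon together with an asymmetric expected misclassification cost that penalises false positives more heavily, reflecting the priority given to patients at no risk of fibrosis. All variable names, the cutoff, the cost weights and the toy data are hypothetical, and patients censored before the horizon are simply dropped rather than handled with the inverse-probability-of-censoring weighting a proper time-dependent ROC estimator would use.

```python
import numpy as np

def cumulative_dynamic_perf(time, event, score, horizon, cutoff,
                            fp_cost=2.0, fn_cost=1.0):
    """Sensitivity, specificity and expected misclassification cost at a horizon.

    Illustrative definition only: cases are patients with fibrosis observed by
    `horizon`, controls are patients still fibrosis-free beyond `horizon`;
    patients censored before the horizon are excluded (no IPCW correction).
    """
    time, event, score = map(np.asarray, (time, event, score))
    cases = (time <= horizon) & (event == 1)   # fibrosis observed by the horizon
    controls = time > horizon                  # fibrosis-free beyond the horizon
    positive = score >= cutoff                 # flagged as "at risk" by the marker

    sens = positive[cases].mean() if cases.any() else np.nan
    spec = (~positive[controls]).mean() if controls.any() else np.nan

    # Asymmetric expected cost: false positives (patients at no risk wrongly
    # flagged, hence denied the maximal dose) weigh more than false negatives.
    n = cases.sum() + controls.sum()
    cost = (fp_cost * (1 - spec) * controls.sum()
            + fn_cost * (1 - sens) * cases.sum()) / n
    return sens, spec, cost

# Toy usage with made-up data: lower RILA is assumed to indicate higher risk,
# so the risk score is taken as the negated RILA value.
rng = np.random.default_rng(0)
rila = rng.uniform(5, 40, size=200)
follow_up = rng.uniform(6, 60, size=200)
fibrosis = ((rila < 15) & (rng.random(200) < 0.7)).astype(int)
sens, spec, cost = cumulative_dynamic_perf(follow_up, fibrosis, -rila,
                                            horizon=36.0, cutoff=-15.0)
print(f"36-month sensitivity={sens:.2f}, specificity={spec:.2f}, cost={cost:.2f}")
```

In practice the cost weights would be chosen to reflect the clinical trade-off described above, and the cutoff, sensitivity, specificity and cost would be re-estimated within each Monte Carlo cross-validation split rather than on the full sample.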