PURPOSE Machine learning (ML) algorithms are used for predictive modeling in medicine, but studies often do not evaluate or report on the potential biases of the models. Our purpose was to develop clinical prediction models for readmission after surgery in patients with colorectal cancer (CRC) and to examine their potential for racial bias.

METHODS We used the 2012-2020 American College of Surgeons' National Surgical Quality Improvement Program Participant Use File and Targeted Colectomy File. Patients were categorized into four race groups—White, Black or African American, other, and unknown/not reported. Potential predictive features were identified from studies of risk factors for 30-day readmission in patients with CRC. We compared four ML-based methods—logistic regression, multilayer perceptron, random forest, and XGBoost (XGB). Model bias was assessed using false-negative rate (FNR) difference, false-positive rate (FPR) difference, and disparate impact (DI).

RESULTS In all, 112,077 patients were included, 67.2% of whom were White, 9.2% Black, 5.6% other race, and 18% with race not recorded. There were significant differences in the area under the receiver operating characteristic curve, FPR, and FNR between race groups across all models. Notably, patients in the other race category had higher FNR than Black patients in all but the XGB model, while Black patients had higher FPR than White patients in some models. Patients in the other category consistently had the lowest FPR. Applying the 80% rule for DI, the models consistently met the threshold for unfairness for the other race category.

CONCLUSION Predictive models for 30-day readmission after colorectal surgery may perform unequally across race groups, potentially propagating inequalities in care delivery and patient outcomes if the predictions from these models are used to direct care.
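The fairness metrics named above can be made concrete with a short sketch. The code below is an illustrative, minimal implementation of per-group FPR, FNR, and disparate impact with the 80% rule; the toy prediction vectors and the choice of reference group are assumptions for demonstration, not the study's actual data or pipeline.

```python
# Minimal sketch of the fairness metrics used in the abstract:
# per-group false-positive rate (FPR), false-negative rate (FNR),
# and disparate impact (DI) with the 80% rule.
# The example data below are hypothetical, not from the study.

def rates(y_true, y_pred):
    """Return (FPR, FNR) for binary labels and predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fpr = fp / (fp + tn) if (fp + tn) else 0.0
    fnr = fn / (fn + tp) if (fn + tp) else 0.0
    return fpr, fnr

def disparate_impact(y_pred_group, y_pred_ref):
    """DI = positive-prediction rate of a group / rate of the reference group."""
    p_group = sum(y_pred_group) / len(y_pred_group)
    p_ref = sum(y_pred_ref) / len(y_pred_ref)
    return p_group / p_ref

# Toy readmission predictions for a reference group and a comparison group.
ref_pred = [1, 0, 1, 0, 1, 0, 0, 0, 1, 0]   # reference group: 40% flagged
grp_pred = [1, 0, 0, 0, 0, 0, 0, 0, 1, 0]   # comparison group: 20% flagged

di = disparate_impact(grp_pred, ref_pred)
# Under the 80% rule, DI below 0.8 flags potential unfairness.
print(f"DI = {di:.2f}, 80% rule violated: {di < 0.8}")
```

FNR and FPR differences between groups are then just the pairwise differences of the per-group `rates` outputs; the abstract's comparisons (e.g., other vs. Black FNR) follow this pattern.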