The aim of this work was to provide a method to evaluate the yield of DNA double-strand breaks (DSBs) for carbon ions, overcoming the bias in existing methods caused by the nonrandom distribution of DSBs. A previously established biophysical program based on radiation track structure and a multilevel chromosome model was used to simulate DNA damage induced by x-rays and carbon ions. The fraction of activity retained (FAR) as a function of absorbed dose or particle fluence was obtained by counting the fraction of DNA fragments larger than 6 Mbp. Simulated FAR curves for 250 kV x-rays and carbon ions at various energies were compared with measurements made with constant-field gel electrophoresis. The doses or fluences at a FAR of 0.7, obtained by linear interpolation, were used to estimate the simulation error for DSB production. The relative difference in dose at a FAR of 0.7 between simulation and experiment was -8.5% for the 250 kV x-rays; the relative differences in fluence were -17.5%, -42.2%, -18.2%, -3.1%, 10.8%, and -14.5% for the 34, 65, 130, 217, 2232, and 3132 MeV carbon ions, respectively. For comparison, the measurement uncertainty was about 20%. Carbon ions produced markedly more DSBs and DSB clusters per unit dose than x-rays. The yield of DSBs for carbon ions increased with linear energy transfer (LET) from 10 Gbp⁻¹Gy⁻¹ at the low-LET end to 16 Gbp⁻¹Gy⁻¹ at the high-LET end, where it plateaued, with about 20% uncertainty. The yield of DSB clusters first increased and then decreased with LET, a pattern similar to the relative biological effectiveness for cell survival for heavy ions.
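The comparison metric described above, the dose (or fluence) at which a FAR curve crosses 0.7 found by linear interpolation, followed by the relative difference between simulation and experiment, can be sketched as follows. All curve values below are illustrative placeholders, not the measured or simulated data from this work.

```python
def dose_at_far(doses, fars, target=0.7):
    """Linearly interpolate the dose (or fluence) at which FAR crosses `target`.

    Assumes `doses` is increasing and `fars` is decreasing
    (FAR falls as dose increases)."""
    for i in range(len(doses) - 1):
        d0, f0, d1, f1 = doses[i], fars[i], doses[i + 1], fars[i + 1]
        if f0 >= target >= f1:
            # linear interpolation between the two bracketing points
            return d0 + (f0 - target) * (d1 - d0) / (f0 - f1)
    raise ValueError("target FAR not bracketed by the data")


def relative_difference(simulated, measured):
    """Relative difference of simulation vs. experiment, as a fraction."""
    return (simulated - measured) / measured


# Illustrative FAR curves (dose in Gy); placeholder numbers only.
sim_doses, sim_fars = [0, 20, 40, 60], [1.0, 0.85, 0.65, 0.45]
exp_doses, exp_fars = [0, 20, 40, 60], [1.0, 0.88, 0.70, 0.50]

d_sim = dose_at_far(sim_doses, sim_fars)  # 35.0 Gy
d_exp = dose_at_far(exp_doses, exp_fars)  # 40.0 Gy
print(relative_difference(d_sim, d_exp))  # -0.125, i.e. -12.5%
```

With a denser sampling of the FAR curve, the same interpolation applies segment by segment; only the bracketing pair around FAR = 0.7 contributes to the estimate.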