We aimed to evaluate the effects of tube voltage, tube current-time product, and iterative reconstruction on iodine quantification using a dual-layer spectral CT scanner. Two mediastinal iodine phantoms, each containing six tubes of different iodine concentrations (0, 1, 2.5, 5, 10, and 20 mg I/mL; the tubes of the two phantoms held contrast media diluted in water and in 10% amino acid solution, respectively), were inserted into an anthropomorphic chest phantom and scanned with varying acquisition parameters (120 and 140 kVp; 20, 40, 60, 80, 100, 150, and 200 mAs; and spectral reconstruction levels 0 and 6). Iodine density was then measured (in milligrams of iodine per milliliter) using a dedicated software program, and the effects of the acquisition parameters on iodine density and on its relative measurement error (RME) were analyzed using a linear mixed-effects model. Tube voltage (p < 0.001 for all comparisons) and tube current-time product (p < 0.05, depending on the interaction terms, for iodine density; p = 0.023 for RME) had statistically significant effects on iodine density and RME, but the magnitude of these effects was minimal: estimated differences between tube voltage settings ranged from 0 to 0.8 mg I/mL for iodine density and from 1.0% to 4.2% for RME, and a change of 100 mAs in tube current-time product altered iodine density and RME by approximately 0.1 mg I/mL and 0.6%, respectively. Spectral reconstruction level had no significant effect on iodine quantification (p = 0.647 for iodine density and p = 0.813 for RME). Iodine quantification using dual-layer spectral CT was therefore feasible irrespective of the CT acquisition parameters tested, because their effects on iodine density and RME were minimal.
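The abstract reports RME values but does not state the formula. A minimal sketch, assuming the conventional definition of relative measurement error as the signed percentage deviation of the measured iodine density from the known (nominal) tube concentration; the function name and example values are illustrative, not from the study:

```python
def relative_measurement_error(measured_mg_ml: float, true_mg_ml: float) -> float:
    """Return the RME (%) of a measured iodine density against the nominal
    concentration. Undefined for the 0 mg I/mL tube (division by zero), which
    is presumably why such a tube serves only as a baseline.

    Note: this is the conventional definition; the study's exact formula is
    not given in the abstract.
    """
    return (measured_mg_ml - true_mg_ml) / true_mg_ml * 100.0

# Illustrative example: a nominal 5 mg I/mL tube measured at 5.2 mg I/mL
rme = relative_measurement_error(5.2, 5.0)
print(round(rme, 1))  # 4.0
```

Under this definition, the reported RME range of 1.0% to 4.2% between tube voltage settings corresponds to absolute deviations well under 1 mg I/mL for most tube concentrations, consistent with the stated 0 to 0.8 mg I/mL differences in iodine density.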