The first systematic study of opacity dependence on atomic number at stellar interior temperatures is used to evaluate discrepancies between measured and modeled iron opacity [J. E. Bailey et al., Nature (London) 517, 56 (2015), doi:10.1038/nature14048]. High-temperature (>180 eV) chromium and nickel opacities are measured with ±6%-10% uncertainty, using the same methods employed in the previous iron experiments. The 10%-20% experiment-to-experiment reproducibility demonstrates the reliability of the measurements. The overall model-data disagreements are smaller than for iron. However, the systematic study reveals shortcomings in models for density effects, excited states, and open L-shell configurations. The 30%-45% underestimate in the modeled quasicontinuum opacity at short wavelengths was observed only for iron and only at temperatures above 180 eV. Thus, either opacity theories are missing physics that has a nonmonotonic dependence on the number of bound electrons, or there is an experimental flaw unique to the iron measurements at temperatures above 180 eV.