A higher-than-predicted measurement of iron opacity at solar interior temperatures

Abstract
Nearly a century ago it was recognized [1] that radiation absorption by stellar matter controls the internal temperature profiles within stars. Laboratory opacity measurements, however, have never been performed at stellar interior conditions, introducing uncertainties in stellar models [2,3,4,5]. A particular problem arose [2,3,6,7,8] when refined photosphere spectral analysis [9,10] led to reductions of 30–50 per cent in the inferred amounts of carbon, nitrogen and oxygen in the Sun. Standard solar models [11] using the revised element abundances disagree with helioseismic observations that determine the internal solar structure using acoustic oscillations. This could be resolved if the true mean opacity for the solar interior matter were roughly 15 per cent higher than predicted [2,3,6,7,8], because increased opacity compensates for the decreased element abundances. Iron accounts for a quarter of the total opacity [2,12] at the solar radiation/convection zone boundary. Here we report measurements of wavelength-resolved iron opacity at electron temperatures of 1.9–2.3 million kelvin and electron densities of (0.7–4.0) × 10²² per cubic centimetre, conditions very similar to those in the solar region that affects the discrepancy the most: the radiation/convection zone boundary. The measured wavelength-dependent opacity is 30–400 per cent higher than predicted. This represents roughly half the change in the mean opacity needed to resolve the solar discrepancy, even though iron is only one of many elements that contribute to opacity.
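As a rough consistency check (an illustrative estimate based only on the figures quoted above, not a calculation from the paper), suppose iron contributes a fraction f_Fe ≈ 0.25 of the total Rosseland-mean opacity at the radiation/convection zone boundary, and suppose the measured wavelength-dependent increases translate into a Rosseland-mean iron opacity increase of roughly 30 per cent (a value assumed here for illustration, consistent with the "roughly half" statement). The implied change in the total mean opacity is then

Δκ/κ ≈ f_Fe × (Δκ_Fe/κ_Fe) ≈ 0.25 × 0.30 ≈ 0.075,

that is, about 7.5 per cent, or roughly half of the ~15 per cent increase needed to reconcile standard solar models with helioseismic observations.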
Keywords