Abstract
Experimental measurements of X-ray line intensity ratios in a transmission electron microscope are compared with Monte Carlo simulations from two different software packages over several orders of magnitude of sample thickness, from the nm to the mm range. It is shown that the form of the thickness dependence of the K/L ratio of characteristic X-ray lines for GaAs is reproduced qualitatively, but the numerical differences between the software packages are large. A scheme is presented for improving the simple k-factor method by taking explicit account of the thickness dependence that remains even after application of the usual absorption and fluorescence corrections. This is done in a first-order approximation by linear regression. The resulting improvement in determining the correct indium concentration in InGaAs specimens is calculated to be 1 at.%.
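The first-order correction described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the calibration thicknesses, k-factor values, and function names are hypothetical, and a simple Cliff-Lorimer-style two-element quantification is assumed, with the residual thickness dependence of the k-factor modelled by a straight line fitted via least-squares regression.

```python
import numpy as np

# Hypothetical calibration data: specimen thickness t (nm) and the
# effective k-factor measured at each thickness after the usual
# absorption and fluorescence corrections (illustrative values only).
t_cal = np.array([50.0, 100.0, 200.0, 400.0])
k_cal = np.array([1.02, 1.05, 1.11, 1.23])

# First-order (linear) model of the residual thickness dependence,
# fitted by ordinary least-squares regression: k(t) = k0 + a*t.
a, k0 = np.polyfit(t_cal, k_cal, 1)

def indium_fraction(intensity_ratio, thickness_nm):
    """Two-element Cliff-Lorimer-style quantification,
    C_In / C_other = k(t) * I_In / I_other,
    using the thickness-corrected k-factor. Returns the In atomic
    fraction of the binary pair (hypothetical helper)."""
    k_t = k0 + a * thickness_nm
    ratio = k_t * intensity_ratio      # concentration ratio C_In/C_other
    return ratio / (1.0 + ratio)       # normalise to an atomic fraction

# Usage: for a 300 nm specimen, compare the thickness-corrected result
# with a naive quantification that uses the fitted intercept k0 alone.
x_corrected = indium_fraction(0.25, 300.0)
x_naive = k0 * 0.25 / (1.0 + k0 * 0.25)
```

In this sketch the difference between `x_corrected` and `x_naive` plays the role of the concentration error that the regression-based correction removes; with real calibration data the magnitude of that difference corresponds to the ~1 at.% improvement quoted in the abstract.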