Abstract

The long-term shift of impurity depth profiles in silicon, observed by secondary ion mass spectrometry under oblique low-energy O2+ bombardment with oxygen flooding, is simulated using a simple model. The erosion rate is assumed to decrease in two steps: a rapid initial falloff due to oxygen incorporation in the sample, followed by a less pronounced but longer-term change associated with ripple formation. Two methods of depth calibration are compared: the standard procedure of measuring the crater depth and the more recent approach of using shallow delta doping markers with known spacing. The results obtained by the two methods exhibit pronounced differences, which arise because the delta-spacing approach relies on a depth-dependent local erosion rate, whereas the crater-depth method involves a mean erosion rate averaged over the total sputtered depth. The vastly different shifts reported by independent groups for profiling of boron delta markers in silicon by 1 keV O2+ at about 60° can be reproduced surprisingly well. It is shown that the apparent shifts observed under conditions of ongoing changes in erosion rate are not fixed numbers, as in the case of normally incident O2+ beams, but depend strongly on the details of the depth calibration procedure. Used in combination with shallow markers (<20 nm), the delta-spacing approach yields misleadingly small apparent shifts because the local erosion rate is still significantly higher than the (quasi-)stationary value. Full control of erosion artifacts can only be achieved by profiling to sufficiently large depths (>50 nm). The cumulative shifts for 1 keV O2+ at 60° with flood are about three times larger than at normal incidence in vacuum, and even four times larger for 0.5 and 1 keV at 55° with flood.
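To illustrate the kind of model described above, the sketch below assumes a two-step (two-exponential) decrease of the erosion rate with eroded depth and contrasts the two calibration methods: the crater-depth scale uses a single mean rate over the full crater, while the delta-spacing scale uses a local rate inferred from shallow markers. The functional form, all numerical parameters, and the marker depths are hypothetical choices for demonstration only, not values taken from the paper.

```python
import numpy as np

# Assumed two-step erosion-rate model (hypothetical parameters):
# a rapid falloff over ~2 nm (oxygen incorporation) followed by a slow
# decay over ~30 nm (ripple formation) toward a quasi-stationary value.
r_ss, A1, lam1, A2, lam2 = 0.20, 0.15, 2.0, 0.05, 30.0  # nm/s, nm

def erosion_rate(z):
    """Local erosion rate (nm/s) at eroded depth z (nm)."""
    return r_ss + A1 * np.exp(-z / lam1) + A2 * np.exp(-z / lam2)

# Integrate dz/dt = r(z) to obtain the true depth versus sputter time.
dt = 0.01                      # time step (s)
times = np.arange(0.0, 600.0, dt)
z_true = np.empty_like(times)
z = 0.0
for i in range(times.size):
    z_true[i] = z
    z += erosion_rate(z) * dt

# Crater-depth calibration: one mean rate averaged over the whole crater.
mean_rate = z_true[-1] / times[-1]

# Delta-spacing calibration: local rate from the sputter-time interval
# between two shallow delta markers (assumed here at 10 and 20 nm).
d1, d2 = 10.0, 20.0
t1 = times[np.searchsorted(z_true, d1)]
t2 = times[np.searchsorted(z_true, d2)]
local_rate = (d2 - d1) / (t2 - t1)

# Apparent depth assigned by each scale to a marker actually at 15 nm.
marker = 15.0
t_m = times[np.searchsorted(z_true, marker)]
print(f"crater-depth scale : {mean_rate * t_m:5.1f} nm (true {marker} nm)")
print(f"delta-spacing scale: {local_rate * t_m:5.1f} nm (true {marker} nm)")
```

With a decreasing erosion rate, the shallow local rate exceeds both the mean rate and the quasi-stationary rate, so the two scales assign different apparent depths to the same marker, which is the qualitative effect the abstract attributes to the choice of depth calibration procedure.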
