Abstract

A technique using a Therma-Wave Therma-Probe has been developed that allows detection of parts-per-million levels of high-energy contaminants in low-energy beams [C. Jaussaud et al., Nucl. Instr. and Meth. B 74 (1993) 571]. The technique relies on the differential hold-up of the desired (low-energy) and contaminant (high-energy) ions in a screen oxide. In this study we extend the technique by using a wedge oxide, varying from 0 to 1700 Å, as the hold-up oxide. Unlike a single oxide thickness across the wafer, this approach permits empirical determination of the exact oxide thickness required to completely trap the desired-energy ions while allowing the maximum number of contaminant ions to pass through the oxide and produce a Therma-Wave signal in the underlying silicon. Using an Eaton NV-8200P medium-current implanter, we implanted a 2 × 10¹⁵ ions/cm² dose of boron at 3 and 10 keV and of BF₂ at 13.36 and 44.55 keV. We then compared these Therma-Wave values to those produced by reference "contaminant" boron implants, on other wafers, at 15 and 40 keV. These measurements indicate that the wedge oxide allows determination of the energetic purity of the beam at or below the 5–10 ppm level.
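The discrimination principle can be sketched with the standard first-order model of an implant depth profile: a Gaussian characterized by a projected range Rp and straggle ΔRp, so the fraction of ions coming to rest beyond an oxide of thickness t is given by a complementary error function. The range parameters below are purely illustrative placeholders, not values from the paper; real Rp/ΔRp values for B in SiO₂ would come from range tables or simulation.

```python
import math

def penetrating_fraction(t_oxide_A, rp_A, drp_A):
    """Fraction of a Gaussian implant profile (projected range rp_A,
    straggle drp_A, both in angstroms) that comes to rest deeper than
    an oxide of thickness t_oxide_A, i.e. reaches the silicon."""
    return 0.5 * math.erfc((t_oxide_A - rp_A) / (math.sqrt(2.0) * drp_A))

# Hypothetical range parameters (angstroms), for illustration only:
low  = penetrating_fraction(1000, rp_A=150,  drp_A=80)   # shallow, low-energy implant
high = penetrating_fraction(1000, rp_A=1300, drp_A=400)  # deep, high-energy contaminant
```

With these placeholder numbers, a 1000 Å oxide traps essentially all of the low-energy beam while most of the high-energy contaminant reaches the silicon, which is the condition the wedge oxide lets one find empirically for each beam energy.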

