Abstract

HgCdTe heteroepitaxy on low-cost, large-lattice-mismatched substrates such as Si continues to be plagued by high threading dislocation densities that ultimately reduce the operability of thermal imaging detector arrays. Molecular-beam epitaxy (MBE) of 10 μm- to 15 μm-thick CdTe buffer layers has played a crucial role in reducing dislocation densities to current state-of-the-art levels. Herein, we examine the possibility that growth on locally back-thinned substrates could prove advantageous in further reducing dislocation densities in the CdTe/Si heteroepitaxial system. Using defect decoration techniques, a decrease in dislocation (etch-pit) density of up to ~42% has been measured in CdTe regions where the underlying Si substrate was chemically back-thinned to ~20 μm. A theoretical explanation is proposed in which a substrate-thickness-dependent dislocation image force is the likely cause of the experimentally observed reduction in threading dislocation density. These observations raise the prospect of combining localized substrate thinning with other techniques to further reduce dislocation densities to the levels sought for HgCdTe/CdTe/Si and other large-lattice-mismatched systems.
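To make the proposed mechanism concrete, the sketch below evaluates the textbook image force per unit length on a straight dislocation a distance d from a free surface, which scales as 1/d; this is not the authors' specific model, and the material constants (rough values for Si) and the comparison thicknesses are illustrative assumptions only.

```python
# Illustrative sketch (not the paper's model): the classical image force per unit
# length on an edge-character dislocation a distance d from a free surface,
#   F(d) = mu * b^2 / (4 * pi * (1 - nu) * d),
# grows as the free surface gets closer, suggesting why back-thinning the
# substrate could help drive threading dislocations out of the material.
import math

MU = 64e9       # shear modulus, Pa (approximate value for Si; assumption)
NU = 0.28       # Poisson's ratio (approximate value for Si; assumption)
B = 0.384e-9    # Burgers vector magnitude, m (a/sqrt(2) with a = 5.43 A; assumption)

def image_force_per_length(d_m, mu=MU, nu=NU, b=B):
    """Image force per unit dislocation length (N/m) at distance d_m from a free surface."""
    return mu * b**2 / (4.0 * math.pi * (1.0 - nu) * d_m)

# Compare a nominal full-thickness wafer (~500 um, assumed) with the ~20 um
# back-thinned region reported in the abstract.
for d_um in (500.0, 20.0):
    f = image_force_per_length(d_um * 1e-6)
    print(f"d = {d_um:6.1f} um  ->  image force ~ {f:.2e} N/m")
```

Under these assumptions the ~20 μm region sees an image force roughly 25 times larger than the full-thickness case, consistent in spirit with the thickness-dependent mechanism invoked in the abstract.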
