Abstract

Light absorption enhancement in a 1.5 μm thick mercury cadmium telluride (Hg0.762Cd0.238Te, MCT) layer at room temperature using a 1D dielectric grating at mid-wave infrared (MWIR) wavelengths (3–5 μm) has been theoretically investigated. The optimized dielectric grating facilitates light diffraction and scattering into the absorbing MCT waveguiding layer, resulting in an increased lateral optical path. Light absorption at normal incidence is improved from ∼37.5% to ∼71% (TE) and ∼70% (TM). With this enhanced absorption, the photocarrier generation rate in the thin layer would be comparable to that of a bulk 5 μm thick MCT layer. A ∼3× reduction in MCT layer thickness without compromising absorption offers the potential to realize infrared photodetectors with improved sensitivity at conventional and/or elevated operating temperatures.
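The thickness-equivalence argument can be illustrated with a rough single-pass Beer–Lambert estimate, A = 1 − exp(−αd). The sketch below is not the paper's method (the abstract indicates a rigorous electromagnetic treatment of the grating structure); it simply back-derives an effective absorption coefficient α from the quoted ∼37.5% absorption of the bare 1.5 μm layer and shows how the ∼71% figure maps onto an equivalent optical path of several microns. Reflection and interference are ignored, so the numbers are only indicative of the scaling.

```python
import numpy as np

# Rough single-pass Beer-Lambert sketch (assumption: no front-surface
# reflection, no interference, no explicit grating model).
d_thin = 1.5e-6   # thin MCT layer thickness (m), from the abstract
d_bulk = 5.0e-6   # reference bulk MCT layer thickness (m), from the abstract
A_thin = 0.375    # quoted single-pass absorption of the bare thin layer

# Effective absorption coefficient back-derived from the quoted figures
# (an illustrative assumption, not a measured MCT material parameter).
alpha = -np.log(1.0 - A_thin) / d_thin   # ~3.1e5 m^-1, i.e. ~0.31 um^-1

def single_pass_absorption(thickness_m: float) -> float:
    """Beer-Lambert absorption A = 1 - exp(-alpha * d) for a single pass."""
    return 1.0 - np.exp(-alpha * thickness_m)

print(f"alpha ~ {alpha * 1e-6:.2f} per um")
print(f"5 um bulk, single pass: A ~ {single_pass_absorption(d_bulk):.0%}")

# Equivalent optical path needed to reach the reported ~71% absorption,
# interpreting the grating as lengthening the lateral path in the thin layer.
A_target = 0.71
d_equiv = -np.log(1.0 - A_target) / alpha
print(f"equivalent path for ~71% absorption: ~{d_equiv * 1e6:.1f} um")
```

Under these simplifying assumptions the equivalent path comes out at roughly 4 μm, i.e. several times the physical 1.5 μm thickness, consistent with the abstract's statement that the enhanced thin layer behaves comparably to a much thicker bulk absorber.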
