Abstract

The channel temperature (Tch) and thermal resistance (Rth) of Ga2O3 metal-oxide-semiconductor field-effect transistors were investigated through electrical measurements complemented by electrothermal device simulations incorporating experimental Ga2O3 thermal parameters. The analysis technique compared DC and pulsed drain currents (IDS) at known applied biases: because self-heating is negligible under pulsed conditions, Tch can be approximated by the ambient temperature (Tamb), allowing IDS to be correlated with Tch. The device model was validated by calibration against the DC data. The experimental Tch agreed well with simulations for Tamb between 20 °C and 175 °C. The large Rth of 48 mm·K/W extracted at room temperature highlights the value of thermal analysis for understanding degradation mechanisms and improving the reliability of Ga2O3 power devices.
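The extraction procedure described above can be sketched numerically. The sketch below uses entirely illustrative numbers (bias point, current values, and the resulting Rth are assumptions, not data from the paper): a pulsed temperature sweep calibrates IDS versus Tch, the DC current is inverted through that calibration to estimate Tch, and Rth follows from the temperature rise divided by the width-normalised dissipated power.

```python
import numpy as np

# Illustrative sketch of the pulsed-vs-DC channel-temperature extraction.
# All numerical values are assumed for demonstration and are NOT taken
# from the paper.

# Step 1: pulsed I-V calibration. Under short pulses self-heating is
# negligible, so Tch ~= Tamb. Sweeping the stage temperature gives IDS
# as a function of Tch at a fixed bias point.
tamb_cal = np.array([20.0, 60.0, 100.0, 140.0, 175.0])  # stage temp, deg C
ids_pulsed = np.array([95.0, 88.0, 81.0, 74.0, 68.0])   # mA/mm (assumed)

# Step 2: DC measurement at the same bias. Self-heating lowers IDS;
# inverting the calibration curve maps the DC current back to Tch.
ids_dc = 88.0   # mA/mm at Tamb = 20 deg C (assumed)
tamb = 20.0     # deg C
# IDS decreases with temperature, so reverse the arrays for np.interp,
# which requires monotonically increasing x-values.
tch = np.interp(ids_dc, ids_pulsed[::-1], tamb_cal[::-1])

# Step 3: thermal resistance, normalised per mm of gate width as in
# the abstract's units (mm*K/W).
vds = 10.0                      # V (assumed bias)
p_diss = vds * ids_dc * 1e-3    # W per mm of gate width
rth = (tch - tamb) / p_diss     # mm*K/W

print(f"Tch = {tch:.1f} degC")  # -> Tch = 60.0 degC
print(f"Rth = {rth:.1f} mm*K/W")
```

With these assumed numbers the extracted Rth comes out near the tens of mm·K/W, the same order as the 48 mm·K/W reported in the abstract, but the agreement is coincidental to the chosen inputs.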
