Abstract

The complex nonlinear dependence of ultra-scaled transistor performance on channel geometry and source/drain (S/D) doping profile poses obstacles to advanced technology path-finding and optimization. A machine learning-based multi-objective optimization (MOO) workflow is proposed to optimize sub-3-nm-node gate-all-around (GAA) three-layer-stacked nanosheet transistors (NSFETs), accounting for the key performance knobs of channel geometry and S/D doping profile. An artificial neural network (ANN) is trained to learn the compact current–voltage ( <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">${I}$ </tex-math></inline-formula> – <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">${V}$ </tex-math></inline-formula> ) relationship of NSFETs from 3-D technology computer-aided design (TCAD) simulation results. Based on the ANN model, MOO among the subthreshold swing, on–off ratio, and ON-state current of NSFETs is performed with the adaptive weighted-sum method. The proposed workflow efficiently suggests an optimized design window for the channel geometry and doping profile of NSFETs. The proposed devices satisfy the 2025 International Roadmap for Devices and Systems (IRDS) targets for the electrical characteristics of digital circuits.
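The core idea of the workflow, scalarizing several competing figures of merit through a weighted sum and searching the design space with a fast surrogate in place of full 3-D TCAD, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `surrogate` function, its toy coefficients, and the candidate grids are all hypothetical stand-ins for the trained ANN and the real NSFET design space, and a plain (non-adaptive) weighted sum is used, whereas the paper's adaptive variant refines the weights iteratively to cover the Pareto front.

```python
import itertools

# Hypothetical surrogate standing in for the trained ANN: maps a design point
# (nanosheet width, S/D doping) to toy figures of merit. The coefficients are
# invented for illustration only.
def surrogate(width_nm, doping_1e20):
    ss = 65.0 + 0.2 * width_nm - 1.5 * doping_1e20       # subthreshold swing (mV/dec), lower is better
    ion = 40.0 * width_nm + 30.0 * doping_1e20           # ON-state current (arb. units), higher is better
    on_off = 6.0 - 0.05 * width_nm + 0.3 * doping_1e20   # log10 on-off ratio, higher is better
    return ss, ion, on_off

def scalarize(metrics, weights):
    """Weighted-sum scalarization; SS is minimized, so it enters with a minus sign."""
    ss, ion, on_off = metrics
    w_ss, w_ion, w_onoff = weights
    return -w_ss * ss + w_ion * ion + w_onoff * on_off

def weighted_sum_moo(weights):
    """Exhaustively search a small candidate grid for the best scalarized score."""
    widths = [10, 20, 30, 40, 50]      # nanosheet width candidates (nm), hypothetical
    dopings = [0.5, 1.0, 1.5, 2.0]     # S/D doping candidates (1e20 cm^-3), hypothetical
    return max(itertools.product(widths, dopings),
               key=lambda d: scalarize(surrogate(*d), weights))
```

Sweeping the weight vector and collecting the winners traces out candidate Pareto-optimal designs; the surrogate makes each evaluation cheap enough that such sweeps are practical, which is the motivation for replacing direct TCAD calls with the ANN.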
