Atomic layer deposition (ALD) processes are usually highly exothermic. For ALD on substrates with high specific surface area, the reaction heat per ALD cycle may lead to a substantial temperature rise of the substrate. The novelty of this study lies in quantifying the temperature excursions of porous substrates during ALD via numerical simulation and in addressing how this type of thermal effect can be tuned. A comprehensive model has been developed and validated to investigate the ALD thermal effect, accounting for the precursor transport phenomena within the ALD reactor system and for the coupling among precursor diffusion, heat transfer, film deposition, and reaction heat generation within the porous substrates. The results show that the thermal effect of sub-saturated ALD varies with substrate thickness, precursor partial pressure profile, heat transfer coefficient, and kinetic parameters. In contrast, for saturated ALD under adiabatic conditions, the maximum temperature rise of the substrate is a deterministic value, independent of the substrate thickness and precursor partial pressure profile. The thermal effect in porous substrates can influence the ultimate deposit amount and its distribution for sub-saturated ALD, and it changes the time required to reach saturation for saturated ALD. The temperature rises of several common substrates during typical half-ALD cycles have been simulated to demonstrate the capability of the present model in predicting the thermal effect of ALD.
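To illustrate why the adiabatic, saturated-ALD temperature rise is deterministic, the sketch below works through a simple energy balance: the heat released by saturating the available surface sites depends only on the reaction enthalpy, the surface site density, and the specific surface area, so its ratio to the substrate heat capacity fixes the maximum rise irrespective of substrate thickness or pressure profile. This is a minimal back-of-the-envelope estimate, not the comprehensive model of the study; the function name and all numerical values are illustrative assumptions.

```python
# Hypothetical estimate of the adiabatic temperature rise for a saturated
# half-ALD cycle on a high-surface-area porous substrate.
# All parameter values are order-of-magnitude placeholders, not data from the study.

def adiabatic_temperature_rise(reaction_enthalpy_J_per_mol: float,
                               site_density_mol_per_m2: float,
                               specific_surface_area_m2_per_g: float,
                               specific_heat_J_per_gK: float) -> float:
    """Energy balance: all heat released by saturating the reactive surface
    sites goes into heating the substrate (no heat loss to the surroundings)."""
    heat_per_gram = (abs(reaction_enthalpy_J_per_mol)
                     * site_density_mol_per_m2
                     * specific_surface_area_m2_per_g)  # J per gram of substrate
    return heat_per_gram / specific_heat_J_per_gK       # temperature rise in K


if __name__ == "__main__":
    # Illustrative inputs only:
    dT = adiabatic_temperature_rise(
        reaction_enthalpy_J_per_mol=2.0e5,     # ~200 kJ/mol released per half-reaction
        site_density_mol_per_m2=4.0e-6,        # ~4 umol of reactive sites per m^2
        specific_surface_area_m2_per_g=300.0,  # high-surface-area powder, ~300 m^2/g
        specific_heat_J_per_gK=0.8,            # ~0.8 J/(g K)
    )
    print(f"Estimated adiabatic temperature rise: {dT:.0f} K")
```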