Abstract
This paper describes the development of a machine learning model for screening SiC MOSFETs for paralleling with minimized transient current imbalance. The spread of device parameters is first characterized to isolate those parameters likely to significantly influence transient current distribution. A linear regression model is then developed and trained on parameter data measured from forty devices of the same production lot. The resulting model expresses current imbalance as a function of the device parameters, with the weight of each parameter determined. The performance of the trained model on a set of twenty test devices is then evaluated and verified via a double pulse test experiment with paralleled devices. The model is found to perform satisfactorily and can be effectively deployed for screening devices for paralleling.
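The screening approach summarized above can be illustrated with a minimal sketch: fit a linear model mapping device parameters to measured current imbalance, then rank candidate devices by predicted imbalance. All parameter names, values, and the synthetic data below are hypothetical placeholders, not the paper's actual measurements; the real parameter set is the one isolated by the spread analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter matrix for 40 training devices (columns could be,
# e.g., threshold voltage, on-resistance, transconductance -- placeholders).
X = rng.normal(size=(40, 3))

# Hypothetical measured transient current imbalance for each device,
# generated from assumed "true" weights plus measurement noise.
true_w = np.array([0.8, 0.3, -0.2])
y = X @ true_w + rng.normal(scale=0.05, size=40)

# Ordinary least squares: imbalance ~= X @ w + b, giving the weight of
# each device parameter.
A = np.hstack([X, np.ones((40, 1))])      # append intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
w, b = coef[:-1], coef[-1]

# Screening: predict imbalance for 20 test devices and rank them so the
# devices with the smallest predicted imbalance are paralleled first.
X_test = rng.normal(size=(20, 3))
pred = X_test @ w + b
ranked = np.argsort(np.abs(pred))
```

In practice the predictions would be validated against a double pulse test on paralleled devices, as the paper does, before using the ranking for lot screening.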