Abstract

XGBoost is an optimized form of gradient boosting that delivers strong overall performance among machine learning algorithms. By introducing a regularization term into the loss function of gradient boosting, XGBoost can effectively limit model complexity, improve generalization ability, and mitigate overfitting. In this paper, XGBoost is first introduced into modeling radio‐frequency (RF) power amplifiers (PAs) under different temperatures. Furthermore, the modeling performance of XGBoost depends strongly on its hyperparameters. Because traditional grid search is time‐consuming and labor‐intensive, this paper combines particle swarm optimization (PSO) with XGBoost to search for the hyperparameters. The experimental results show that XGBoost effectively suppresses the overfitting observed in gradient boosting when modeling RF PAs at different ambient temperatures. In addition, compared with classic machine learning algorithms, including support vector regression (SVR), gradient boosting, and XGBoost, the proposed PSO‐XGBoost improves modeling accuracy by one order of magnitude or more while also increasing modeling speed by more than one order of magnitude. The PSO‐XGBoost model proposed in this paper can also be applied to modeling other microwave/RF devices and circuits to improve modeling accuracy and reduce modeling time.
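As a rough illustration of the PSO‐driven hyperparameter search described above, the sketch below wraps an XGBoost regressor in a minimal global‐best PSO loop. This is not the authors' implementation: the synthetic data standing in for measured PA samples, the chosen search ranges, and the PSO coefficients (w, c1, c2) are all illustrative assumptions.

```python
import numpy as np
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic placeholder data standing in for measured PA input/output samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(2000, 4))
y = np.tanh(X @ np.array([1.5, -0.7, 0.3, 0.9])) + 0.01 * rng.standard_normal(2000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Assumed search space: (max_depth, learning_rate, n_estimators, reg_lambda).
lo = np.array([2, 0.01, 50, 0.0])
hi = np.array([10, 0.30, 500, 5.0])

def fitness(p):
    """Train XGBoost with candidate hyperparameters and return held-out MSE."""
    model = XGBRegressor(
        max_depth=int(round(p[0])),
        learning_rate=float(p[1]),
        n_estimators=int(round(p[2])),
        reg_lambda=float(p[3]),
        verbosity=0,
    )
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_te, model.predict(X_te))

# Minimal global-best PSO loop; particle count, iterations, and coefficients are illustrative.
n_particles, n_iter = 8, 10
w, c1, c2 = 0.7, 1.5, 1.5
pos = rng.uniform(lo, hi, size=(n_particles, 4))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best hyperparameters:", gbest, "held-out MSE:", pbest_val.min())
```

In practice, the fitness function would evaluate the model against measured PA data at each ambient temperature, and cross-validation could replace the single train/test split used here for brevity.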
