Abstract

In recent years, improvements in wireless communication have driven the development of microstrip (patch) antennas. This article assesses antenna performance using simulation, measurement, an RLC equivalent-circuit model, and machine learning. The antenna measures 1.01 λ0 × 0.612 λ0 with respect to the lowest operating frequency; the maximum achieved gain is 6.76 dB, the maximum directivity is 8.21 dBi, and the maximum efficiency is 83.05%. The prototype's measured return loss is compared against CST and ADS simulations. The antenna's gain and directivity are predicted using several supervised regression machine learning (ML) methods. ML model performance is evaluated with the explained variance score, R-squared (R²), mean square error (MSE), mean absolute error (MAE), root mean square error (RMSE), and mean squared logarithmic error (MSLE). With errors below unity and an accuracy of roughly 98%, Ridge regression outperforms the other seven ML models for gain prediction, while Gaussian process regression is the best method for predicting directivity. Finally, the simulation results from CST and ADS, together with the measured and ML-predicted results, indicate that the proposed antenna is a good candidate for LTE applications.
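The model comparison described above can be illustrated with a minimal sketch. This is not the authors' actual pipeline: the data here is synthetic (the feature and target names are hypothetical stand-ins for the antenna parameters and the CST-derived gain values), but it shows how two of the named models, Ridge regression and Gaussian process regression, would be fitted and scored with the same metrics listed in the abstract using scikit-learn.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import (
    explained_variance_score, r2_score, mean_squared_error,
    mean_absolute_error, mean_squared_log_error)

# Synthetic stand-in data: frequency samples (GHz) vs. gain (dB).
# In the paper these would come from CST parameter sweeps.
rng = np.random.default_rng(0)
X = rng.uniform(1.8, 2.7, size=(200, 1))
y = 6.0 + 0.5 * np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, model in [("Ridge", Ridge(alpha=1.0)),
                    ("GPR", GaussianProcessRegressor(normalize_y=True))]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    mse = mean_squared_error(y_te, pred)
    # Report the six metrics named in the abstract
    # (RMSE is the square root of MSE; MSLE needs positive targets).
    print(name,
          round(explained_variance_score(y_te, pred), 3),
          round(r2_score(y_te, pred), 3),
          round(mse, 4),
          round(mean_absolute_error(y_te, pred), 4),
          round(float(np.sqrt(mse)), 4),
          round(mean_squared_log_error(y_te, pred), 5))
```

In this kind of comparison, the model with errors closest to zero and R² closest to one on held-out data would be selected, which is how the abstract arrives at Ridge regression for gain and Gaussian process regression for directivity.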
