In recent years, machine learning techniques (MLTs) have become increasingly important for predicting the evaporation and condensation performance of refrigerants, namely the heat transfer coefficient (HTC) and frictional pressure drop (FPD). In the present study, the evaporation HTC and FPD of R1234yf, R290, and R13I1/R290 were compared experimentally in an offset strip fin-plate heat exchanger. Six MLTs were then applied to predict the HTC and FPD of these refrigerants: support vector regressor (SVR), multilayer perceptron regressor (MLPR), gradient boosting regressor (GBR), AdaBoost regressor (ABR), ridge regressor (RR), and K-nearest neighbors regressor (KNNR). The experimental comparison showed that the evaporation HTC of R290 was 27.8–76.2 % and 46.2–73.8 % higher than that of R1234yf and R13I1/R290, respectively, while its FPD was up to 74.8 % and 57.2 % lower, respectively. Under identical operating conditions, the transition from nucleate to convective boiling occurred first for R1234yf, followed by R13I1/R290 and then R290. Nucleate boiling dominated at lower vapor quality, whereas convective boiling prevailed at higher vapor quality. A total of 336 experimental data points covering various test conditions from existing studies and the present experiments were used for the MLT prediction analysis. Without any enhancement techniques, GBR was the best-performing approach for predicting the FPD and HTC, with mean absolute errors (MAE) of 0.125 % and 0.131 %, respectively. To further improve predictive performance, feature selection, principal component analysis, and hyperparameter tuning were applied. Feature importance analysis identified mass flux, reduced pressure, saturation temperature, heat flux, and inlet vapor quality as the parameters most influential on the FPD and HTC. Ultimately, the most accurate predictions of HTC and FPD, both with the lowest MAE of 0.111 %, were achieved using MLPR and SVR, respectively, combined with feature selection.
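To make the comparison workflow concrete, the following is a minimal, hypothetical Python sketch of fitting the six named regressors and comparing their mean absolute errors. It is not the authors' implementation: the feature matrix, target values, train/test split, and default hyperparameters are placeholders standing in for the 336 experimental points (mass flux, reduced pressure, saturation temperature, heat flux, inlet vapor quality) and the measured HTC or FPD.

```python
# Hypothetical sketch, not the paper's code: train the six regressors named in the
# abstract on placeholder data and rank them by mean absolute error (MAE).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import GradientBoostingRegressor, AdaBoostRegressor
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.random((336, 5))   # placeholder: 336 points x 5 operating-condition features
y = rng.random(336)        # placeholder target (normalized HTC or FPD)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Scale-sensitive models are wrapped with StandardScaler; tree ensembles are not.
models = {
    "SVR":  make_pipeline(StandardScaler(), SVR()),
    "MLPR": make_pipeline(StandardScaler(), MLPRegressor(max_iter=2000, random_state=0)),
    "GBR":  GradientBoostingRegressor(random_state=0),
    "ABR":  AdaBoostRegressor(random_state=0),
    "RR":   make_pipeline(StandardScaler(), Ridge()),
    "KNNR": make_pipeline(StandardScaler(), KNeighborsRegressor()),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{name}: MAE = {mae:.3f}")
```

In practice the same loop would be repeated after feature selection, principal component analysis, or hyperparameter tuning (e.g., via scikit-learn's `GridSearchCV`) to reproduce the enhancement comparison described above; the tree-based GBR additionally exposes `feature_importances_`, which is one common way to rank the influence of the input parameters.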