Nanofluids are engineered thermal fluids with enhanced heat transfer efficiency relative to conventional thermal fluids. The density of a nanofluid is an important thermophysical property that influences its heat transfer coefficient. Hybrid nanofluids are known to show superior thermal properties compared to single-particle nanofluids, yet investigations of predictive modeling of the density of hybrid nanofluids are scarce in the literature. In this report, the application of machine learning (ML) to predicting the density of hybrid nanofluids is examined. The considered hybrid nanofluids consist of Al2O3-SiO2, TiO2-SiO2, Fe3O4-MWCNT, Al2O3-CNT, Al2O3-MWCNT, TiO2-MWCNT, CeO2-MWCNT, ZnO-MWCNT, MgO-MWCNT, CuO-MWCNT, Co3O4-rGO, TiO2-MgO, Ag-GNP, and ND-Fe3O4 nanoparticles suspended in H2O, GB, DW, and W-EG (60:40%). The ML algorithms examined in this report are: support vector regression optimized with a genetic algorithm (SVR-GA), gradient boosting regression with grid search optimization (GBR-GSO), decision tree (DT), and voting ensemble (VE). These models were developed using the following input parameters: temperature, volume concentration, the density of the individual nanoparticles, and the density of the base fluid. Excellent correlations of 99.81%, 99.76%, 96.36%, and 95.05% were obtained for the SVR-GA, GBR-GSO, VE, and DT models, respectively, showing that SVR-GA agreed most closely with the experimental results. The development of a highly accurate predictive model for the density of hybrid nanofluids is essential because it can facilitate the rapid design of heat transfer devices. Developing a diffusivity model for hybrid nanofluids using a machine learning approach is a possible extension of the present study.
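The modeling workflow described above can be sketched in scikit-learn. This is an illustrative example, not the authors' code: the hyperparameter grid, the synthetic data, and the mixture-rule target used to make the script runnable are all assumptions; only the four input features (temperature, volume concentration, nanoparticle density, base-fluid density) and the GBR-GSO approach come from the abstract.

```python
# Hedged sketch of a GBR-GSO density model on synthetic placeholder data.
# The data-generating mixture rule and the grid values are assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
n = 400

# Four input features named in the abstract (units are assumptions):
T = rng.uniform(20, 70, n)            # temperature, degC
phi = rng.uniform(0.001, 0.03, n)     # volume concentration, fraction
rho_np = rng.uniform(2200, 5200, n)   # effective nanoparticle density, kg/m^3
rho_bf = rng.uniform(990, 1070, n)    # base-fluid density, kg/m^3

# Synthetic target from a simple mixture rule plus noise,
# used only so the sketch runs end to end:
rho = phi * rho_np + (1 - phi) * rho_bf + rng.normal(0.0, 0.5, n)

X = np.column_stack([T, phi, rho_np, rho_bf])
X_tr, X_te, y_tr, y_te = train_test_split(X, rho, random_state=0)

# Gradient boosting regressor tuned by exhaustive grid search (GBR-GSO):
grid = GridSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [2, 3]},
    cv=3,
)
grid.fit(X_tr, y_tr)
r2 = grid.score(X_te, y_te)  # held-out coefficient of determination
print(f"held-out R^2: {r2:.3f}")
```

In practice the synthetic arrays would be replaced by the experimental density measurements, and the reported correlation would be computed against the held-out experimental values.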