The primary objective of this work is to apply machine learning (ML) methods to build models that predict temperature (T) from the input features r and z in a membrane separation process. A hybrid modeling approach was developed in which computational fluid dynamics (CFD) simulations of the separation process provide the data used to train the ML models. The CFD simulations estimate the temperature distribution in a vacuum membrane distillation (VMD) process for the separation of liquid mixtures. The evaluated ML models were Support Vector Machine (SVM), Elastic Net Regression (ENR), Extremely Randomized Trees (ERT), and Bayesian Ridge Regression (BRR). Performance was improved through hyper-parameter tuning with Differential Evolution (DE), and the models were validated using Monte Carlo cross-validation. The results indicate that all models are effective for temperature prediction, with SVM achieving the highest accuracy. The SVM model attained a mean R² of 0.9969 with a standard deviation of 0.0001, indicating a strong and consistent fit to the membrane data. It also exhibited the lowest mean squared error, mean absolute error, and mean absolute percentage error, signifying superior predictive accuracy and reliability. These outcomes highlight the importance of selecting a suitable model and optimizing its hyper-parameters to guarantee accurate predictions in ML tasks, and demonstrate that an SVM optimized with DE improves accuracy and consistency for this predictive task in the membrane separation context.
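As a rough illustration of the workflow described above, the sketch below couples an SVM regressor with Differential Evolution hyper-parameter tuning and Monte Carlo cross-validation. The synthetic data, hyper-parameter search bounds, and library choices (scikit-learn, SciPy) are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch of a DE-tuned SVM with Monte Carlo cross-validation.
# The dataset, bounds, and split settings below are illustrative assumptions.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import ShuffleSplit, cross_val_score

# Placeholder for CFD-generated data: input features (r, z) and target T.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 2))            # stand-in for (r, z)
y = 300 + 50 * X[:, 0] - 20 * X[:, 1]     # stand-in for temperature T (K)

# Monte Carlo cross-validation: repeated random train/test splits.
mc_cv = ShuffleSplit(n_splits=20, test_size=0.2, random_state=0)

def negative_mean_r2(params):
    """DE objective: minimise the negative mean R^2 over the MC-CV splits."""
    C, epsilon, gamma = params
    model = make_pipeline(StandardScaler(), SVR(C=C, epsilon=epsilon, gamma=gamma))
    scores = cross_val_score(model, X, y, cv=mc_cv, scoring="r2")
    return -scores.mean()

# Assumed search ranges for the SVR hyper-parameters.
bounds = [(1e-1, 1e3),   # C
          (1e-4, 1.0),   # epsilon
          (1e-4, 10.0)]  # gamma

result = differential_evolution(negative_mean_r2, bounds,
                                seed=1, maxiter=30, polish=False)
best_C, best_eps, best_gamma = result.x
print(f"best C={best_C:.3g}, epsilon={best_eps:.3g}, gamma={best_gamma:.3g}, "
      f"mean R2={-result.fun:.4f}")
```

In this sketch the DE search directly optimizes the cross-validated R², so the reported hyper-parameters are those that generalize best across the repeated random splits rather than on a single hold-out set.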