Abstract

Film cooling is a crucial technique for protecting critical components of gas turbines from excessive temperatures. Multiparameter film cooling optimization remains time-consuming owing to the substantial computational demands of computational fluid dynamics (CFD) methods. To reduce this computational cost, the present study develops a data-driven, deep-learning framework for predicting and optimizing the film cooling effectiveness of high-pressure turbines. Multiple rows of cooling holes located on the pressure surface of the turbine blade are optimized, with the coolant hole diameter, the inclination angle, and the compound angle as design parameters. A conditional generative adversarial network model combining a gated recurrent unit and a convolutional neural network is designed to establish the complex nonlinear regression between the design parameters and the film cooling effectiveness. The surrogate model is trained and tested on independent CFD results. A sparrow search algorithm is then combined with the well-trained surrogate model to obtain the optimal film cooling parameters. The proposed framework is found to improve multi-row film cooling effectiveness by 21.2% at an acceptable computational cost.
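The surrogate-assisted optimization loop described above can be sketched in miniature. The snippet below is a hedged illustration only: the bounds, the toy effectiveness function standing in for the trained CGAN surrogate, and the simple population-based search standing in for the sparrow search algorithm are all hypothetical placeholders, not the paper's actual models or data.

```python
import random

# Hypothetical design-parameter bounds (illustrative, not from the paper):
# hole diameter [mm], inclination angle [deg], compound angle [deg]
BOUNDS = [(0.5, 1.5), (20.0, 60.0), (0.0, 90.0)]

def surrogate_effectiveness(x):
    """Toy stand-in for the trained deep-learning surrogate: maps a
    design-parameter vector to a scalar film cooling effectiveness."""
    d, inclination, compound = x
    # Peak effectiveness at an arbitrary interior point, for illustration.
    return (1.0
            - 0.4 * abs(d - 1.0)
            - 0.005 * abs(inclination - 35.0)
            - 0.003 * abs(compound - 45.0))

def optimize(fitness, bounds, pop_size=30, iters=200, seed=0):
    """Simple population-based random search; a placeholder for the
    sparrow search algorithm used in the paper."""
    rng = random.Random(seed)
    sample = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    best = max((sample() for _ in range(pop_size)), key=fitness)
    for _ in range(iters):
        # Perturb the incumbent within 5% of each parameter's range,
        # clamping the candidate back into the feasible bounds.
        cand = [min(hi, max(lo, b + rng.gauss(0.0, 0.05 * (hi - lo))))
                for b, (lo, hi) in zip(best, bounds)]
        if fitness(cand) > fitness(best):
            best = cand
    return best

best = optimize(surrogate_effectiveness, BOUNDS)
print("optimal parameters:", best)
print("predicted effectiveness:", surrogate_effectiveness(best))
```

The design point is that every fitness evaluation during the search queries the cheap surrogate rather than a CFD solve, which is what makes multiparameter optimization tractable once the surrogate is trained.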
