Abstract
Film cooling is a crucial technique for protecting critical components of gas turbines from excessive temperatures. Multiparameter film cooling optimization remains time-consuming owing to the substantial computational demands of computational fluid dynamics (CFD) methods. To reduce the computational cost, the present study develops a data-driven framework based on deep learning for predicting and optimizing the film cooling effectiveness of high-pressure turbines. Multiple rows of cooling holes located on the pressure surface of the turbine blade are optimized, with the coolant hole diameter, the inclination angle, and the compound angle as design parameters. A conditional generative adversarial network model combining a gated recurrent unit and a convolutional neural network is designed to establish the complex nonlinear regression between the design parameters and the film cooling effectiveness. The surrogate model is trained and tested on independent CFD results. A sparrow search algorithm is then combined with the well-trained surrogate model to acquire the optimal film cooling parameters. The proposed framework improves multi-row film cooling effectiveness by 21.2% at an acceptable computational cost.
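The optimization step described above, in which a sparrow search algorithm queries a trained surrogate instead of running CFD, can be sketched as follows. This is a minimal, hypothetical illustration: the objective function below is a toy stand-in for the trained CGAN surrogate, the bounds and parameter names (hole diameter, inclination angle, compound angle) are assumed for illustration, and the sparrow search update rules are simplified relative to the full published algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the trained surrogate: maps a design vector
# (hole diameter, inclination angle, compound angle) to a scalar
# "cooling effectiveness" to maximize. NOT the paper's model.
def surrogate_effectiveness(x):
    d, alpha, beta = x
    return (np.sin(d)
            + np.cos(alpha / 30.0)
            + 0.5 * np.exp(-((beta - 45.0) / 30.0) ** 2))

# Assumed design-space bounds: diameter [mm], angles [deg].
lb = np.array([0.5, 20.0, 0.0])
ub = np.array([2.0, 60.0, 90.0])

n, dim, iters = 30, 3, 100
pop = lb + rng.random((n, dim)) * (ub - lb)

def fitness(pop):
    return np.array([surrogate_effectiveness(x) for x in pop])

for t in range(iters):
    order = np.argsort(-fitness(pop))       # maximize: best first
    pop = pop[order]
    n_prod = n // 5                         # top 20% act as producers
    best, worst = pop[0].copy(), pop[-1].copy()
    # Producers explore around their current positions.
    for i in range(n_prod):
        if rng.random() < 0.8:
            pop[i] = pop[i] * np.exp(-i / (rng.random() * iters + 1e-9))
        else:
            pop[i] = pop[i] + rng.normal(size=dim)
    # Scroungers follow the producers; the worst half re-scatter.
    for i in range(n_prod, n):
        if i > n // 2:
            pop[i] = rng.normal() * np.exp((worst - pop[i]) / (i + 1) ** 2)
        else:
            pop[i] = best + np.abs(pop[i] - best) * rng.choice([-1, 1], size=dim)
    pop = np.clip(pop, lb, ub)              # keep designs feasible

fit = fitness(pop)
best_x = pop[np.argmax(fit)]
print("best design:", best_x, "effectiveness:", fit.max())
```

In the actual framework, `surrogate_effectiveness` would be replaced by a forward pass through the trained CGAN surrogate, so each of the thousands of candidate evaluations costs milliseconds rather than a full CFD run.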