Abstract

High-pressure turbine blades are the aero-engine components most exposed to extreme thermal conditions. To ensure long-term operation, the blades are equipped with cooling systems. However, accurately predicting the blade temperature and designing the cooling system in an industrial context remain major challenges. Potential improvement is foreseen with Large-Eddy Simulation (LES), which is well suited to predicting turbulent flows in such complex systems. Nonetheless, performing an LES of a real cooled high-pressure turbine remains expensive. To alleviate this CPU cost, a cooling model recently developed in the context of combustion chamber liners is assessed for blade cooling. The model was initially designed to mimic coolant jets injected at the wall surface and does not require meshing the cooling pipes, leading to a significant reduction in CPU cost. Its applicability is evaluated here on the cooled Nozzle Guide Vanes (NGV) of the Full Aerothermal Combustor Turbine interactiOns Research (FACTOR) test rig. To do so, a hole-modeled LES using the cooling model is compared to a hole-meshed LES. Both simulations yield very similar results, confirming the capability of the approach to predict the adiabatic film effectiveness. Advanced post-processing and analyses of the coolant mass fraction profiles show, however, that the turbulent mixing between the coolant and hot flows is reduced with the model. This finding is confirmed by the turbulence maps, whose levels are lower in the modeled approach. Potential improvements are hence proposed to increase the accuracy of such methods.
