Film cooling is one of the key technologies advancing the performance of gas turbine engines; it is extensively applied in hot turbine components for thermal protection by injecting cooling air through numerous film holes. Achieving an optimal film cooling scheme requires many iterations over the position, diameter, and orientation of the holes in computational simulations so that the holes fit the aerodynamic profiles of the turbine blades. However, a conventional computational simulation that resolves each individual hole and its coolant supply channel incurs substantial computational cost. In this study, a novel film cooling modeling method that uses a virtual boundary to represent the film holes is assessed against conventional simulation in two representative test cases: flat-plate film cooling and turbine endwall inlet film cooling. Companion experiments are carried out to provide data for verifying the approach at various coolant injection rates. Overall, the virtual boundary model produces results largely equivalent to those of the conventional simulation method. Although the virtual boundary model over-estimates the coolant concentration downstream of the holes because in-hole mixing is neglected, the overall flow fields and film cooling performance are well captured, demonstrating that the novel strategy can model film cooling accurately at a much lower computational cost. This concept allows designers to modify the film hole geometry independently of the blade geometry, and thus helps accelerate the design of a film-cooled turbine or enables multiple film cooling configurations to be assessed more quickly than with conventional computational simulation.