Abstract

A turbulent mixing layer formed by two supersonic streams is investigated using a new hybrid computational method. The hybrid method uses a Reynolds-averaged Navier-Stokes (RANS) approach for wall-bounded regions and a large-eddy simulation (LES) approach for the turbulent mixing region. Mean axial velocities and turbulence intensities from the hybrid RANS-LES simulations are compared to data from a benchmark compressible mixing layer experiment. Parametric studies were conducted of the LES subgrid modeling settings, the spanwise extent of the computational domain, and the wall temperature settings in the RANS region. The subgrid model cases, which used a baseline computational grid with a small spanwise domain, all overpredicted the mixing layer turbulence levels; the subgrid model settings producing the largest eddy viscosities yielded the lowest axial and vertical turbulence intensities. A second set of cases with wider spanwise domains allowed more of the turbulent energy to be released in the spanwise direction, which in turn reduced axial and vertical turbulence levels. Finally, prescribing the wall temperatures in the RANS regions, instead of using the more traditional adiabatic wall boundary conditions, further reduced turbulence levels and gave the best agreement with the experimental data.
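The zonal split described above can be sketched as follows. This is an illustrative assumption only: the abstract does not state the paper's actual interface treatment or subgrid model, so a constant-coefficient Smagorinsky form is assumed for the LES region, and the zone indicator, function names, and parameters are hypothetical.

```python
def smagorinsky_viscosity(c_s: float, delta: float, strain_rate_mag: float) -> float:
    """Standard Smagorinsky subgrid eddy viscosity: nu_sgs = (C_s * delta)**2 * |S|.

    c_s:             Smagorinsky coefficient (assumed constant here)
    delta:           local filter width, typically tied to the grid spacing
    strain_rate_mag: magnitude of the resolved strain-rate tensor |S|
    """
    return (c_s * delta) ** 2 * strain_rate_mag


def hybrid_eddy_viscosity(in_wall_region: bool, nu_rans: float,
                          c_s: float, delta: float, strain_rate_mag: float) -> float:
    """Zonal hybrid closure (illustrative): use the RANS eddy viscosity in
    wall-bounded regions and an LES subgrid viscosity in the mixing region.
    A real solver would also need a blending or interface treatment between
    the two zones, which is not shown here."""
    if in_wall_region:
        return nu_rans
    return smagorinsky_viscosity(c_s, delta, strain_rate_mag)


# Example: at a point in the free mixing layer (not wall-bounded),
# with C_s = 0.17, delta = 0.01 m, |S| = 100 1/s:
nu_t = hybrid_eddy_viscosity(False, 1.0e-3, 0.17, 0.01, 100.0)
```

Larger values of `c_s` (and hence larger subgrid eddy viscosities) damp the resolved fluctuations, which is consistent with the abstract's observation that the settings producing the largest eddy viscosities gave the lowest resolved turbulence intensities.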
