Abstract

Previous studies have demonstrated the importance of downgradient transport by dissipating waves, and in particular of downward heat fluxes by gravity waves undergoing thermal dissipation. With a few exceptions, however, this effect has not been represented in the gravity‐wave parameterizations commonly employed in global numerical models. A general expression relating the heat flux to the wave energy deposition rate caused by thermal dissipation is obtained within the standard linear‐theory approach. Although the flux is directed down the gradient of potential temperature, it is not proportional to the magnitude of that gradient, i.e., it is not formally diffusive, as it is commonly represented. With the necessary assumptions regarding the partitioning of the total wave energy deposition rate between the thermal and frictional channels, the heat flux may be calculated within any suitable parameterization of gravity‐wave drag. The general relation may also be used to estimate net heating rates from observations of wave heat transport. In a more general thermodynamic context, it is noted that gravity‐wave dissipation results in atmospheric entropy production, as expected for a dissipative process. Without friction, entropy is produced under conservation of the column potential enthalpy. Thermally dissipating waves thus represent an example of an entropy‐generating process hypothesized in the literature but not previously identified. Although the downward heat transport results in a local cooling of upper levels, the integrated net effect of the wave energy deposition and heat transport combined is always a heating of the whole atmospheric layer in which the dissipation occurs.
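The final statement of the abstract can be illustrated with a simple numerical sketch. The profiles below are purely hypothetical (a Gaussian dissipation layer and a collocated downward heat flux that vanishes at the layer boundaries) and are not taken from the paper; they merely show why local cooling aloft is compatible with net column heating: the flux-divergence term integrates to zero over the layer, leaving only the non-negative energy deposition.

```python
import numpy as np

# Hypothetical, illustrative profiles only (not the paper's derivation):
z = np.linspace(0.0, 10e3, 1001)           # altitude grid [m]
dz = z[1] - z[0]
rho = 1.0 * np.exp(-z / 7e3)               # density with a 7 km scale height [kg/m^3]
cp = 1004.0                                # specific heat of air [J/(kg K)]

# Assumed wave energy deposition rate, a Gaussian layer centered at 6 km [W/kg]:
eps = 1e-3 * np.exp(-((z - 6e3) / 1e3) ** 2)

# Assumed downward (negative) heat flux confined to the same layer [W/m^2],
# vanishing at the boundaries of the domain:
F = -5.0 * np.exp(-((z - 6e3) / 1e3) ** 2)

# Local temperature tendency: energy deposition plus flux convergence [K/s]
dTdt = eps / cp - np.gradient(F, dz) / (rho * cp)

# The upper flank of the layer is cooled by the downward transport:
assert dTdt[z > 7.2e3].min() < 0.0

# ...yet the column-integrated enthalpy tendency is positive (net heating),
# because the flux divergence integrates to ~zero across the layer:
column_heating = np.trapz(rho * cp * dTdt, z)   # [W/m^2]
assert column_heating > 0.0
print(f"column-integrated heating: {column_heating:.3f} W/m^2")
```

In this sketch the column-integrated tendency reduces to the integral of the deposition term alone, which is strictly positive wherever dissipation occurs, consistent with the abstract's conclusion that the combined effect over the whole dissipating layer is always a heating.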
