Abstract
In the atmosphere, microphysics refers to the microscale processes that affect cloud and precipitation particles and is a key linkage among the various components of Earth's atmospheric water and energy cycles. The representation of microphysical processes in models continues to pose a major challenge leading to uncertainty in numerical weather forecasts and climate simulations. In this paper, the problem of treating microphysics in models is divided into two parts: (i) how to represent the population of cloud and precipitation particles, given the impossibility of simulating all particles individually within a cloud, and (ii) uncertainties in the microphysical process rates owing to fundamental gaps in knowledge of cloud physics. The recently developed Lagrangian particle‐based method is advocated as a way to address several conceptual and practical challenges of representing particle populations using traditional bulk and bin microphysics parameterization schemes. For addressing critical gaps in cloud physics knowledge, sustained investment for observational advances from laboratory experiments, new probe development, and next‐generation instruments in space is needed. Greater emphasis on laboratory work, which has apparently declined over the past several decades relative to other areas of cloud physics research, is argued to be an essential ingredient for improving process‐level understanding. More systematic use of natural cloud and precipitation observations to constrain microphysics schemes is also advocated. Because it is generally difficult to quantify individual microphysical process rates from these observations directly, this presents an inverse problem that can be viewed from the standpoint of Bayesian statistics. Following this idea, a probabilistic framework is proposed that combines elements from statistical and physical modeling. Besides providing rigorous constraint of schemes, there is an added benefit of quantifying uncertainty systematically. Finally, a broader hierarchical approach is proposed to accelerate improvements in microphysics schemes, leveraging the advances described in this paper related to process modeling (using Lagrangian particle‐based schemes), laboratory experimentation, cloud and precipitation observations, and statistical methods.
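To make the advocated Lagrangian particle‐based approach concrete, the sketch below evolves a small population of "superdroplets," each representing many real droplets, by diffusional (condensational) growth. This is a minimal illustration only: the particle count, multiplicity, growth coefficient, and fixed supersaturation are assumed values, not those of any published scheme.

```python
import numpy as np

# Minimal sketch of the Lagrangian particle-based ("superdroplet") approach:
# each computational particle represents a multiplicity of real droplets and
# carries its own attributes, so the size distribution emerges from the
# ensemble rather than being prescribed. All constants are illustrative.

rng = np.random.default_rng(0)

N_SD = 1000                                        # superdroplets in the volume
radius = rng.lognormal(np.log(10e-6), 0.4, N_SD)   # droplet radius [m]
multiplicity = np.full(N_SD, 1e8)                  # real droplets per superdroplet

A_GROWTH = 1.0e-10        # lumped diffusional-growth coefficient [m^2/s], assumed

def condensation_step(radius, supersaturation, dt):
    """Advance each superdroplet by diffusional growth, dr/dt = A*S/r."""
    # Integrating r dr = A*S dt gives r_new = sqrt(r^2 + 2*A*S*dt); clip to
    # avoid negative radicands when evaporation (S < 0) would shrink past zero.
    r2 = radius**2 + 2.0 * A_GROWTH * supersaturation * dt
    return np.sqrt(np.clip(r2, 1e-18, None))

for _ in range(60):       # one minute of 1-s steps at 0.5% supersaturation
    radius = condensation_step(radius, supersaturation=0.005, dt=1.0)

# Bulk quantities are diagnosed by summing over particles, weighted by
# multiplicity; dividing by the grid-box volume would give liquid water content.
mass = (4/3) * np.pi * 1000.0 * np.sum(multiplicity * radius**3)  # kg of liquid
print(f"mean radius: {radius.mean()*1e6:.1f} um, total liquid mass: {mass:.3e} kg")
```

The key design point is that the drop size distribution emerges from the particle ensemble rather than being prescribed by a functional form, which is what distinguishes this method from traditional bulk and bin schemes.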
Highlights
The problem of treating microphysics in models is divided into two parts: (i) how to represent the population of cloud and precipitation particles, given the impossibility of simulating all particles individually within a cloud, and (ii) uncertainties in the microphysical process rates owing to fundamental gaps in knowledge of cloud physics.
Some approaches simplify this exercise into labeling the "dominant" hydrometeor species in an observed volume. These "hydrometeor classification" or "hydrometeor identification" algorithms provide some operational value in comparisons to simulations (e.g., Dolan & Rutledge, 2009; Liu & Chandrasekar, 2000; Park et al., 2009; Straka et al., 2000; Vivekanandan et al., 1999), but they impose predefined categories that may have overlapping radar signatures, do not span the full range of true microphysical variability that occurs in nature, and may be inconsistent with how hydrometeor categories are defined in models (a minimal fuzzy-logic sketch follows these highlights).
Microphysics is a key component of cloud, weather, and climate models.
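As referenced in the second highlight, the sketch below shows the general shape of a fuzzy-logic hydrometeor classifier: each class is scored by membership functions over radar variables, and the highest-scoring class is reported as the "dominant" species. The categories, variable set, and trapezoid bounds here are illustrative assumptions, not the published values from the cited algorithms.

```python
# Toy fuzzy-logic hydrometeor classifier in the spirit of the cited algorithms
# (e.g., Vivekanandan et al., 1999; Park et al., 2009). Bounds are illustrative.

# Trapezoidal membership bounds (x1, x2, x3, x4) per class, for reflectivity
# Z [dBZ] and differential reflectivity ZDR [dB].
MEMBERSHIP = {
    "rain": {"Z": (20, 30, 50, 60), "ZDR": (0.5, 1.0, 3.0, 4.0)},
    "hail": {"Z": (45, 55, 70, 75), "ZDR": (-1.0, -0.5, 0.5, 1.0)},
    "snow": {"Z": (5, 15, 30, 40),  "ZDR": (0.0, 0.3, 1.0, 1.5)},
}

def trapezoid(x, x1, x2, x3, x4):
    """Membership in [0, 1]: ramps up x1->x2, flat x2->x3, ramps down x3->x4."""
    if x <= x1 or x >= x4:
        return 0.0
    if x2 <= x <= x3:
        return 1.0
    return (x - x1) / (x2 - x1) if x < x2 else (x4 - x) / (x4 - x3)

def classify(z_dbz, zdr_db):
    """Label the 'dominant' species as the class with the highest total score."""
    scores = {
        cls: trapezoid(z_dbz, *b["Z"]) + trapezoid(zdr_db, *b["ZDR"])
        for cls, b in MEMBERSHIP.items()
    }
    return max(scores, key=scores.get), scores

label, scores = classify(z_dbz=52.0, zdr_db=0.2)
print(label, scores)  # nonzero competing scores show why labels can be ambiguous
```

For the example input, both the rain and hail memberships are nonzero, which illustrates the overlapping-signature ambiguity noted in the highlight above.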
Summary
Microphysics refers to the physical and chemical processes occurring at the scale of individual cloud and precipitation particles, or hydrometeors (sub‐micron to several centimeters). There is no well‐defined physical scale at which microphysical processes are fully "resolved"; unlike the Kolmogorov scale for turbulence, scales all the way down to the molecular are potentially important for determining the nucleation and growth of hydrometeors, especially ice particles (see section 3.2). It follows that there are important uncertainties even in particle‐by‐particle direct numerical simulation (DNS), despite these models representing all hydrometeors individually within a volume. Meanwhile, microphysics schemes have grown steadily more complex over time. This trend has likely been driven by increasing knowledge that many process details are important for simulation outcomes, and perhaps reflects a perceived necessity to incorporate more detail in order to model a highly complicated, nonlinear system such as microphysics (made possible by increasing computing power). Greater complexity has exacerbated the problem of constraining schemes; in general, increasing the number of parameters that need to be calibrated or "tuned" leads to increased uncertainty in model‐predicted variables. The paper also briefly discusses the history of microphysics scheme development, with the goal of addressing a basic question: how did the community arrive at the current state of microphysics parameterization, framed by the challenges discussed above?
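The inverse problem described in the abstract can be made concrete with a toy example: constraining a single process‐rate parameter from a noisy bulk observation by sampling its Bayesian posterior. The forward model, observation values, and sampler settings below are hypothetical stand‐ins for the far more complex model-observation relationships encountered in practice.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Bayesian constraint of one microphysical process-rate parameter k
# (e.g., an autoconversion coefficient) from a noisy bulk observation.
# The forward model and observation are illustrative assumptions.

def forward_model(k):
    """Hypothetical observable (e.g., rain-water path) as a function of k."""
    return 2.0 * np.sqrt(k)

obs, obs_sigma = 3.0, 0.3          # assumed observation and its uncertainty

def log_posterior(k):
    if k <= 0.0:
        return -np.inf             # flat prior restricted to k > 0
    resid = (forward_model(k) - obs) / obs_sigma
    return -0.5 * resid**2         # Gaussian log-likelihood (up to a constant)

# Random-walk Metropolis sampler over the single parameter.
samples, k = [], 1.0
logp = log_posterior(k)
for _ in range(20000):
    k_prop = k + 0.1 * rng.standard_normal()
    logp_prop = log_posterior(k_prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        k, logp = k_prop, logp_prop
    samples.append(k)

post = np.array(samples[5000:])    # discard burn-in
print(f"posterior mean k = {post.mean():.2f} +/- {post.std():.2f}")
```

Extending this to many tunable parameters widens the posterior for a fixed set of observations, which is precisely the constraint problem the summary describes.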