Uncertainty: a word at the heart of most, if not all, science, but of paramount importance to climate scientists. How we handle uncertainty is one of our greatest challenges, both in our research and in how we communicate our findings to fellow scientists, policy-makers and the general public. In December, an RMetS National Meeting, held jointly with the Royal Statistical Society, aimed to illustrate some of the work being undertaken to quantify uncertainty in climate science. Introducing the subject, organiser Mat Collins (University of Exeter) highlighted some of the main reasons uncertainty exists in climate science and how it has previously been addressed. The Earth's climate system is highly complex, with components spanning all scientific disciplines, and so it is impossible to represent fully in computer-based climate models, whether because our knowledge of a process is incomplete or because of how that process is parameterised within the model. The presentations at the meeting would investigate how we can quantify this uncertainty, to better target model evaluation and improvement.

It is also important to consider how uncertainty applies to the evaluation of climate change risk. Joint organiser, statistician Jonty Rougier (University of Bristol), used an idealised example of the expected cost of sea-level rise, as a function of time to 2100, for three adaptation strategies. One strategy could be deemed ‘inadmissible’ because it cost more than the ‘business as usual’ (BAU) strategy, but how could the BAU strategy and a ‘sea wall’ strategy be separated? By applying probability distributions of likely sea-level rise, an ‘integrated risk’ cost function can be determined for each admissible strategy. The compelling message is that applying probabilities to scenarios is necessary to reduce the risk that inadequate adaptation increases future costs.
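The integrated-risk idea can be sketched in a few lines of code. Everything below is invented for illustration: the cost functions, the 0.5 m median rise and the lognormal distribution are assumptions, not figures from the talk.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical cost functions (arbitrary units) for two admissible
# strategies, as a function of sea-level rise by 2100 (metres).
def cost_bau(rise):
    # 'Business as usual': no up-front spend, damages grow rapidly.
    return 100.0 * rise ** 2

def cost_sea_wall(rise):
    # 'Sea wall': fixed capital cost, damages grow slowly.
    return 40.0 + 20.0 * rise

# An assumed probability distribution of sea-level rise to 2100:
# lognormal with median 0.5 m and a heavy upper tail.
rise = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=100_000)

# 'Integrated risk': the expected cost of each strategy under the
# distribution, estimated by Monte Carlo.
risk_bau = cost_bau(rise).mean()
risk_wall = cost_sea_wall(rise).mean()
```

With these illustrative numbers, BAU looks cheaper if one only considers the median rise, but integrating over the whole distribution lets the tail risk dominate and the sea wall comes out ahead, which is precisely why applying probabilities, rather than a single scenario, matters.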
Generating such probability distributions, or as David Sexton (Hadley Centre) described them, ‘Strength of Evidence’ distributions, is crucial to communicating uncertainty. The UK Climate Projections 2009 (UKCP09) project utilises a ‘perturbed physics ensemble’, investigating the effects of adjusting the parameter values that represent sub-grid-scale climate processes in a climate model. Combined with a rigorous statistical framework and an estimate of the discrepancy between the ‘model’ and ‘real’ worlds, UKCP09 is able to offer policy-makers probabilistic projections for the UK in decadal slices to 2100, which could be applied to the scenarios Jonty suggested. One of the biggest uncertainties in the projection of future climate is our understanding of climate sensitivity: the warming resulting from a doubling of atmospheric CO2 concentration. The higher the climate sensitivity, the greater the warming from increasing greenhouse gases. Presently, estimates range from 1 to 5 degC, a range that has remained, as Tamsin Edwards (University of Bristol) described, ‘stubbornly broad’ for 30 years. Reducing this range would massively reduce uncertainty, and one way to do so is to utilise palaeoclimates. This work in progress uses the Last Glacial Maximum and the mid-Holocene warm period to constrain climate sensitivity, applying a new statistical method that combines known modelling uncertainties and palaeo-proxies (such as fossil pollen or foraminifera) to produce a new estimate of climate sensitivity. The advantage of using palaeoclimate research is that datasets reconstructing past climates provide a ‘sanity check’ on model projections, and guide research towards where we need to improve our understanding to reduce uncertainty.
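The common thread of these two talks, a perturbed-parameter ensemble narrowed by evidence into a probability distribution, can be sketched under toy assumptions. The zero-dimensional relation, the prior range for the feedback parameter and the Gaussian ‘observational constraint’ below are all invented for illustration; they are not the UKCP09 framework or the palaeo method themselves.

```python
import numpy as np

rng = np.random.default_rng(0)

F2X = 3.7  # radiative forcing from doubled CO2, W m^-2 (standard value)

# A perturbed-parameter ensemble for a toy zero-dimensional model:
# equilibrium warming dT = F2X / lam, where the climate feedback
# parameter lam (W m^-2 K^-1) is uncertain. Prior range is illustrative.
lam = rng.uniform(0.7, 3.0, size=50_000)
ecs = F2X / lam  # implied climate sensitivity of each ensemble member

# Weight members by agreement with a hypothetical observational
# constraint on lam, turning the raw ensemble into a 'strength of
# evidence' distribution over climate sensitivity.
obs_lam, obs_sd = 1.3, 0.4
w = np.exp(-0.5 * ((lam - obs_lam) / obs_sd) ** 2)
w /= w.sum()

# Weighted 10-90% range of climate sensitivity.
order = np.argsort(ecs)
cdf = np.cumsum(w[order])
p10, p90 = ecs[order][np.searchsorted(cdf, [0.10, 0.90])]
```

The unweighted ensemble spans the full prior range; the weighted 10–90% interval is what a better constraint (observational or palaeo) buys you: a narrower, evidence-based range to hand to decision-makers.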
Combined with work on palaeoclimate in the UK and internationally, across both the modelling and data communities, Tamsin's results will hopefully improve our understanding of how the climate system responds to natural perturbations. Climate sensitivity does not exist as a measurable quantity in the real world, but it is a useful indicator of what is essentially the total response, over time, to changes in the radiative forcing budget of the planet. The IPCC Fourth Assessment Report (AR4) in 2007 set out each constituent of this budget, its range of uncertainty and the level of scientific understanding we have of it. Aerosol effects, both direct (such as dust) and indirect (upon clouds), are highly uncertain terms in the budget, and so increase the overall uncertainty. By varying parameters within a global aerosol model (GLOMAP) and combining modelling techniques with statistical emulation, Lindsay Lee (University of Leeds) has been able to attribute the uncertainty in the model output to individual key parameters, both spatially and temporally through the year. This work can be used to drive field campaigns to observe the real-world counterparts of these parameters, so as to evaluate and improve the model, resulting in increased understanding of this important climate forcing and greater confidence in our climate predictions.

Through the work of the last three speakers, we can develop our understanding of climate system processes and identify where our knowledge must improve if we are to quantify, and then reduce, uncertainty. All of these models and methods, however, use deterministic parameterisations; Paul Williams (University of Reading) suggested that these parameterisations should instead be stochastic. The code that forms climate models discretises the equations of motion and circulation, creating a fixed representation of the processes they describe.
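To see why a stochastic term can matter, consider a toy sketch: a bistable system standing in for a climate process with two regimes. The system, the noise level and the time-stepping below are assumptions for illustration, not the gravity-wave models from the talk.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy bistable system dx/dt = x - x**3, with stable states at x = +/-1,
# standing in for a process with two regimes.
def simulate(noise_amp, steps=100_000, dt=0.01, x0=1.0):
    """Euler-Maruyama integration with additive noise of size noise_amp."""
    x = np.empty(steps)
    x[0] = x0
    for i in range(1, steps):
        drift = x[i - 1] - x[i - 1] ** 3
        x[i] = (x[i - 1] + drift * dt
                + noise_amp * np.sqrt(dt) * rng.standard_normal())
    return x

deterministic = simulate(noise_amp=0.0)
stochastic = simulate(noise_amp=0.5)

# The deterministic run never leaves the regime it starts in; the
# stochastic run crosses between regimes.
crossed_det = bool((deterministic < 0.0).any())
crossed_sto = bool((stochastic < 0.0).any())
```

The deterministic version, like a fixed parameterisation, can never represent a transition between the two states; adding modest random noise lets the system explore both, which is the behaviour Paul argued climate-model parameterisations should capture.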
Paul demonstrated, using examples from the fluid modelling of gravity-wave formation, that expert-elicited random noise introduced into the equations could improve the representation of transitional states in such systems. Applying this to a selection of climate-model parameters, he showed that uncertainty in representing the mean climate state and inter-annual variability (such as El Niño) can be reduced.

These short presentations on a vast subject highlighted the broad, interdisciplinary nature of the problem. But whether we are challenging how we think about climate models, constraining climate sensitivity or determining where improvements in our understanding of the physical climate system are needed, it is all in vain without accurate and clear communication of the uncertainties. Communicating what the uncertainty means, how it is addressed and why it exists is as important as quantifying the uncertainty itself. Are we doing enough to communicate this work to the general public? Fully explaining why it matters, and what it means for our knowledge and understanding of climate change, probably needs another national meeting.