Introducing independent patterns into the Stochastically Perturbed Parametrisation Tendencies (SPPT) scheme
Abstract:
The Stochastically Perturbed Parametrisation Tendencies (SPPT) scheme is used at weather and climate forecasting centres worldwide to represent model uncertainty that arises from simplifications involved in the parametrisation process. It uses spatio-temporally correlated multiplicative noise to perturb the sum of the parametrised tendencies. However, SPPT does not distinguish between different parametrisation schemes, which do not necessarily have the same error characteristics. A generalisation to SPPT is proposed, whereby the tendency from each parametrisation scheme can be perturbed using an independent stochastic pattern. This acknowledges that the forecast errors arising from different parametrisations are not perfectly correlated. Two variations of this ‘independent SPPT’ (iSPPT) approach are tested in the Integrated Forecasting System (IFS). The first perturbs all parametrised tendencies independently, while the second groups tendencies before perturbation. The iSPPT schemes lead to statistically significant improvements in forecast reliability in the tropics in medium-range weather forecasts. This improvement can be attributed to a large, beneficial increase in ensemble spread in regions with significant convective activity. The iSPPT schemes also lead to improved forecast skill in the extratropics for a set of cases in which the synoptic initial conditions were more likely to result in European ‘forecast busts’. Longer 13-month simulations are also considered to indicate the effect of iSPPT on the mean climate of the IFS.
The primacy of doubt: Evolution of numerical weather prediction from determinism to probability
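The distinction between SPPT and iSPPT described in this abstract can be sketched at a single grid point. This is a minimal illustrative sketch, not the IFS implementation: the scheme names, tendency values, and Gaussian patterns below are hypothetical, and the real scheme uses spatio-temporally correlated pattern fields with tapering near the surface and stratosphere rather than plain white noise.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_sppt(tendencies, pattern):
    """Classic SPPT: one shared multiplicative pattern scales the
    *sum* of all parametrised tendencies."""
    total = sum(tendencies.values())
    return total * (1.0 + pattern)

def perturb_isppt(tendencies, patterns):
    """iSPPT: each parametrisation scheme's tendency is perturbed
    by its own independent pattern before the sum is taken."""
    return sum(t * (1.0 + patterns[name]) for name, t in tendencies.items())

# Toy tendencies (e.g. K/s at two grid points) from hypothetical schemes.
tendencies = {
    "convection": np.array([1.0, 2.0]),
    "radiation":  np.array([-0.5, 0.3]),
    "turbulence": np.array([0.2, -0.1]),
}
shared = rng.normal(0.0, 0.5, size=2)
independent = {name: rng.normal(0.0, 0.5, size=2) for name in tendencies}

print("SPPT :", perturb_sppt(tendencies, shared))
print("iSPPT:", perturb_isppt(tendencies, independent))
```

With zero-amplitude patterns both variants reduce to the unperturbed summed tendency; the difference only appears once the patterns decorrelate across schemes, which is exactly the extra spread the abstract attributes to iSPPT.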
Stochastic parameterization: Towards a new view of weather and climate models
Abstract:
The last decade has seen the success of stochastic parameterizations in short-term, medium-range, and seasonal forecasts: operational weather centers now routinely use stochastic parameterization schemes to represent model inadequacy better and to improve the quantification of forecast uncertainty. Developed initially for numerical weather prediction, the inclusion of stochastic parameterizations not only provides better estimates of uncertainty, but it is also extremely promising for reducing long-standing climate biases and is relevant for determining the climate response to external forcing. This article highlights recent developments from different research groups that show that the stochastic representation of unresolved processes in the atmosphere, oceans, land surface, and cryosphere of comprehensive weather and climate models 1) gives rise to more reliable probabilistic forecasts of weather and climate and 2) reduces systematic model bias. We make a case that the use of mathematically stringent methods for the derivation of stochastic dynamic equations will lead to substantial improvements in our ability to accurately simulate weather and climate at all scales. Recent work in mathematics, statistical mechanics, and turbulence is reviewed; its relevance for the climate problem is demonstrated; and future research directions are outlined.
Climate SPHINX: evaluating the impact of resolution and stochastic physics parameterisations in the EC-Earth global climate model
Abstract:
The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 up to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available from a single year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of postprocessed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including the details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking following resolution increase is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate – specifically the Madden–Julian Oscillation and the tropical rainfall variability.
These findings show the importance of representing the impact of small-scale processes on the large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
A study of reduced numerical precision to make superparameterization more competitive using a hardware emulator in the OpenIFS model
Abstract:
The use of reduced numerical precision to reduce computing costs for the cloud resolving model of superparameterised simulations of the atmosphere is investigated. An approach to identify the optimal level of precision for many different model components is presented and a detailed analysis of precision is performed. This is non-trivial for a complex model that shows chaotic behaviour such as the cloud resolving model in this paper.
Results of the reduced-precision analysis provide valuable information for the quantification of model uncertainty for individual model components. The precision analysis is also used to identify model parts that are of less importance, thus enabling a reduction of model complexity. It is shown that the precision analysis can be used to improve model efficiency for both simulations in double precision and in reduced precision. Model simulations are performed with a superparameterised single-column model version of the OpenIFS model that is forced by observational datasets. A software emulator was used to mimic the use of reduced precision floating-point arithmetic in simulations.
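The core idea of emulating reduced precision can be illustrated with a much simpler stand-in than the software emulator used in the study: zeroing trailing significand bits of IEEE 754 double-precision values. This is a hypothetical sketch for intuition only; it ignores exponent range, rounding modes, and the per-component precision assignment that the paper's emulator handles.

```python
import numpy as np

def truncate_significand(x, bits):
    """Emulate lower-precision floats by zeroing the trailing
    (52 - bits) significand bits of each float64 value.
    `bits` is the number of significand bits retained."""
    x = np.ascontiguousarray(x, dtype=np.float64)
    as_int = x.view(np.uint64)
    mask = ~np.uint64((1 << (52 - bits)) - 1)  # keep the leading bits
    return (as_int & mask).view(np.float64)

x = np.array([np.pi, np.e, 1.0 / 3.0])
for bits in (52, 23, 10):  # double, roughly single, and very low precision
    err = np.max(np.abs(truncate_significand(x, bits) - x))
    print(f"{bits:2d} significand bits: max abs error {err:.2e}")
```

Running the model component with progressively fewer retained bits, and checking where the solution degrades, is the kind of per-component precision scan the abstract describes.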