Stochastic parameterization: Towards a new view of weather and climate models
Abstract:
The last decade has seen the success of stochastic parameterizations in short-term, medium-range, and seasonal forecasts: operational weather centers now routinely use stochastic parameterization schemes to represent model inadequacy better and to improve the quantification of forecast uncertainty. Developed initially for numerical weather prediction, the inclusion of stochastic parameterizations not only provides better estimates of uncertainty, but it is also extremely promising for reducing long-standing climate biases and is relevant for determining the climate response to external forcing. This article highlights recent developments from different research groups that show that the stochastic representation of unresolved processes in the atmosphere, oceans, land surface, and cryosphere of comprehensive weather and climate models 1) gives rise to more reliable probabilistic forecasts of weather and climate and 2) reduces systematic model bias. We make a case that the use of mathematically stringent methods for the derivation of stochastic dynamic equations will lead to substantial improvements in our ability to accurately simulate weather and climate at all scales. Recent work in mathematics, statistical mechanics, and turbulence is reviewed; its relevance for the climate problem is demonstrated; and future research directions are outlined.
Climate SPHINX: evaluating the impact of resolution and stochastic physics parameterisations in the EC-Earth global climate model
Abstract:
The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 up to 16 km). The project includes more than 120 simulations in both a historical scenario (1979–2008) and a climate change projection (2039–2068), together with coupled transient runs (1850–2100). A total of 20.4 million core hours have been used, made available through a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including details on the forcings used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking with increasing resolution is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate – specifically the Madden–Julian Oscillation and tropical rainfall variability.
These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
A study of reduced numerical precision to make superparameterization more competitive using a hardware emulator in the OpenIFS model
Abstract:
The use of reduced numerical precision to reduce the computing cost of the cloud-resolving model in superparameterised simulations of the atmosphere is investigated. An approach to identify the optimal level of precision for many different model components is presented, and a detailed analysis of precision is performed. This is non-trivial for a complex model that shows chaotic behaviour, such as the cloud-resolving model considered here.
Results of the reduced-precision analysis provide valuable information for the quantification of model uncertainty for individual model components. The precision analysis is also used to identify model parts that are of lesser importance, thus enabling a reduction of model complexity. It is shown that the precision analysis can be used to improve model efficiency for simulations in both double precision and reduced precision. Model simulations are performed with a superparameterised single-column model version of the OpenIFS model that is forced by observational datasets. A software emulator is used to mimic the use of reduced-precision floating-point arithmetic in the simulations.
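As a hedged illustration of how a software emulator of this kind can mimic reduced floating-point precision, the minimal sketch below truncates the mantissa of double-precision numbers by zeroing low-order bits; the function name, bit counts, and approach are illustrative assumptions, not the actual emulator used in the study.

```python
import numpy as np

def truncate_mantissa(x, bits):
    """Emulate reduced precision by zeroing the lowest (52 - bits)
    mantissa bits of IEEE 754 double-precision values.
    `bits` is the number of explicit mantissa bits retained.
    Illustrative sketch only; real emulators also handle rounding
    modes and reduced exponent ranges."""
    x = np.asarray(x, dtype=np.float64)
    raw = x.view(np.uint64)              # reinterpret bit patterns
    keep = np.uint64(0xFFFFFFFFFFFFFFFF ^ ((1 << (52 - bits)) - 1))
    return (raw & keep).view(np.float64)  # back to floats

# Example: keep ~10 mantissa bits of pi; for a value in [2, 4) the
# absolute truncation error is bounded by roughly 2**(1 - 10).
approx = truncate_mantissa(np.pi, 10)
```

Truncation (rather than rounding) is the simplest possible scheme, but it already allows the kind of component-by-component precision analysis described in the abstract: run the model with progressively fewer retained bits and record where results degrade.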
Single Precision in Weather Forecasting Models: An Evaluation with the IFS
Report on the SPARC QBO Workshop: The QBO and its Global Influence - Past, Present and Future
Abstract:
There is no known atmospheric phenomenon with a longer horizon of predictability than the quasi-biennial oscillation (QBO) of the tropical stratospheric circulation. With a mean period of about 28 months, the QBO phase can routinely be predicted at least a year in advance. This predictability arises from internal atmospheric dynamics, rather than from external forcings with long timescales, and it offers the tantalizing prospect of improved predictions for any phenomena influenced by the QBO. Observed QBO teleconnections include an apparent QBO influence on the stratospheric winter polar vortices in both hemispheres, the Madden-Julian Oscillation (MJO), and the North Atlantic Oscillation (NAO). Yet the degree to which such teleconnections are real, robust, and sufficiently strong to provide useful predictive skill remains an important topic of research. Utilizing and understanding these linkages will require atmospheric models that adequately represent both the QBO and the mechanisms by which it influences other aspects of the general circulation, such as tropical deep convection.
The 2016 QBO workshop in Oxford aimed to explore these themes, and to build on the outcomes of the first QBO workshop, held in March 2015 in Victoria, BC, Canada (as reported in SPARC Newsletter No. 45). This earlier workshop was the kick-off meeting of the SPARC QBOi (QBO Initiative) activity, and its key outcome was to plan a series of coordinated Atmosphere General Circulation Model (AGCM) experiments (the "phase-one" QBOi experiments). These experiments provide a multi-model dataset that can be used to investigate the aforementioned themes. While the focus of the Victoria meeting was primarily on the QBO itself, the Oxford workshop has broadened the scope of the QBOi activity to encompass QBO impacts. Its primary outcome is a planned set of core papers analysing the phase-one QBOi experiments.