The impact of a stochastic parameterization scheme on climate sensitivity in EC‐Earth
Journal of Geophysical Research: Atmospheres American Geophysical Union 124:23 (2019) 12726-12740
Abstract:
Stochastic schemes, designed to represent unresolved subgrid-scale variability, are frequently used in short- and medium-range weather forecasts, where they are found to improve several aspects of the model. In recent years, the impact of stochastic physics has also been found to be beneficial for the model's long-term climate. In this paper, we demonstrate for the first time that the inclusion of a stochastic physics scheme can notably affect a model's projection of global warming, as well as its historical climatological global temperature. Specifically, we find that when including the 'stochastically perturbed parametrisation tendencies' scheme (SPPT) in the fully coupled climate model EC-Earth v3.1, the predicted level of global warming between 1850 and 2100 is reduced by 10% under an RCP8.5 forcing scenario. We link this reduction in climate sensitivity to a change in the cloud feedbacks with SPPT. In particular, the scheme appears to reduce the positive low cloud cover feedback and increase the negative cloud optical feedback. A key role is played by a robust, rapid increase in cloud liquid water with SPPT, which we speculate is due to the scheme's non-linear interaction with condensation.
Stochastic weather and climate models
Nature Reviews Physics Springer Science and Business Media LLC 1:7 (2019) 463-471
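The SPPT scheme discussed in the first abstract above multiplies the net parameterization tendency by one plus a spatially and temporally correlated random pattern. A minimal sketch of that mechanism, assuming a simple AR(1) pattern on a toy grid (all names and parameter values here are hypothetical illustrations, not EC-Earth's actual configuration):

```python
import numpy as np

# Minimal SPPT-style sketch (illustrative only): perturb the net
# parameterization tendency with a multiplicative random pattern,
#   T_perturbed = (1 + r) * T,
# where r evolves as an AR(1) process, r_new = phi * r + sigma * noise.

rng = np.random.default_rng(0)

def ar1_update(r, phi=0.95, sigma=0.1):
    """Advance the perturbation pattern r by one AR(1) step."""
    return phi * r + sigma * rng.standard_normal(r.shape)

def sppt_perturb(tendency, r, clip=0.9):
    """Apply the multiplicative perturbation, clipped for stability."""
    return (1.0 + np.clip(r, -clip, clip)) * tendency

r = np.zeros((4, 4))          # perturbation pattern on a toy 4x4 grid
tendency = np.ones((4, 4))    # toy net parameterization tendency
for _ in range(10):           # march a few timesteps
    r = ar1_update(r)
    tendency_p = sppt_perturb(tendency, r)
```

The clipping step bounds the multiplier to (1 - clip, 1 + clip), a common guard against destabilizing a model with large perturbations.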
A Stochastic Representation of Subgrid Uncertainty for Dynamical Core Development
Bulletin of the American Meteorological Society American Meteorological Society 100:6 (2019) 1091-1101
Abstract:
Numerical weather prediction and climate models comprise a) a dynamical core describing the resolved parts of the climate system and b) parameterizations describing unresolved components. The development of new subgrid-scale parameterizations is particularly uncertain compared to representing resolved scales in the dynamical core. This uncertainty is currently represented by stochastic approaches in several operational weather models, and it will inevitably percolate into the dynamical core. Hence, implementing dynamical cores with excessive numerical accuracy will not bring forecast gains, and may even hinder them, since valuable computer resources will be tied up doing insignificant computation and therefore cannot be deployed for more useful gains, such as increasing model resolution or ensemble sizes. Here we describe a low-cost stochastic scheme that can be implemented in any existing deterministic dynamical core as an additive noise term. This scheme could be used to adjust accuracy in future dynamical core development work. We propose that such an additive stochastic noise test case should become part of the routine testing and development of dynamical cores in a stochastic framework. The key point of the study is that we should not develop dynamical cores that are more precise than the level of uncertainty provided by our stochastic scheme. In this way, we present a new paradigm for dynamical core development, ensuring that weather and climate models become more computationally efficient. We show results based on tests with the European Centre for Medium-Range Weather Forecasts (ECMWF) Integrated Forecasting System (IFS) dynamical core.
Accelerating high-resolution weather models with deep-learning hardware
PASC '19 Proceedings of the Platform for Advanced Scientific Computing Conference Association for Computing Machinery (2019)
Abstract:
The next generation of weather and climate models will have an unprecedented level of resolution and model complexity, and running these models efficiently will require taking advantage of future supercomputers and heterogeneous hardware. In this paper, we investigate the use of mixed-precision hardware that supports floating-point operations at double, single and half precision. In particular, we investigate the potential use of the NVIDIA Tensor Core, a mixed-precision matrix-matrix multiplier mainly developed for use in deep learning, to accelerate the calculation of the Legendre transforms in the Integrated Forecasting System (IFS), one of the leading global weather forecast models. In the IFS, the Legendre transform is one of the most expensive model components and dominates the computational cost for simulations at very high resolution. We investigate the impact of mixed-precision arithmetic in IFS simulations of operational complexity through software emulation. Through a targeted but minimal use of double-precision arithmetic, we are able to use either half-precision arithmetic or mixed half/single-precision arithmetic for almost all of the calculations in the Legendre transform without affecting forecast skill.
Bell Inequality Violation with Free Choice and Local Causality on the Invariant Set
arXiv:1903.10537 (2019)
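The mixed-precision idea in the "Accelerating high-resolution weather models" abstract above rests on the fact that the spectral Legendre transform reduces to matrix-matrix products, which Tensor Cores evaluate with half-precision inputs and single-precision accumulation. A rough software emulation of that rounding behaviour, assuming NumPy and toy matrix sizes (this is not the IFS code; all names and dimensions are hypothetical):

```python
import numpy as np

# Emulate a Tensor-Core-style matrix product: round both operands to
# half precision (float16), then accumulate in single precision
# (float32), and compare against a double-precision reference.
rng = np.random.default_rng(1)

n = 64
legendre = rng.standard_normal((n, n))   # stand-in for Legendre coefficients
fields = rng.standard_normal((n, n))     # stand-in for spectral field data

ref = legendre @ fields                  # float64 reference product

a16 = legendre.astype(np.float16)        # inputs rounded to half precision
b16 = fields.astype(np.float16)
tensorcore_like = a16.astype(np.float32) @ b16.astype(np.float32)

# Relative error introduced by the reduced-precision inputs.
rel_err = np.linalg.norm(tensorcore_like - ref) / np.linalg.norm(ref)
```

For well-scaled operands the error is dominated by the half-precision rounding of the inputs (unit roundoff about 5e-4), which is the kind of loss the paper reports as small enough not to affect forecast skill when double precision is retained for a few sensitive steps.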