Singular vectors, predictability and ensemble forecasting for weather and climate
Journal of Physics A: Mathematical and Theoretical 46:25 (2013) 254018
The use of imprecise processing to improve accuracy in weather & climate prediction
Journal of Computational Physics (2013)
Abstract:
The use of stochastic processing hardware and low precision arithmetic in atmospheric models is investigated. Stochastic processors allow hardware-induced faults in calculations, sacrificing bit-reproducibility and precision in exchange for improvements in performance and, potentially, in forecast accuracy, because the reduction in power consumption could allow higher resolution. A similar trade-off is achieved using low precision arithmetic, with improvements in computation and communication speed and savings in storage and memory requirements. As high-performance computing becomes more massively parallel and power intensive, these two approaches may be important stepping stones in the pursuit of global cloud-resolving atmospheric modelling. The impact of both hardware-induced faults and low precision arithmetic is tested using the Lorenz '96 model and the dynamical core of a global atmosphere model. In the Lorenz '96 model there is a natural scale separation; the spectral discretisation used in the dynamical core also allows large-scale and small-scale dynamics to be treated separately within the code. Such scale separation allows the impact of lower-accuracy arithmetic to be restricted to components close to the truncation scales, and hence close to the necessarily inexact parametrised representations of unresolved processes, while the larger scales are calculated using high precision deterministic arithmetic. Hardware faults from stochastic processors are emulated using a bit-flip model with different fault rates. Our simulations show that neither approach to inexact calculation substantially affects the large-scale behaviour, provided the inexactness is restricted to the smaller scales. Moreover, Lorenz '96 simulations in which the small scales are calculated on an emulated stochastic processor outperform simulations in which those small scales are parametrised. This suggests that inexact calculations at the small scale could reduce computation and power costs without adversely affecting the quality of the simulations, allowing higher resolution models to be run at the same computational cost. © 2013 The Authors.
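As a rough illustration of the bit-flip fault model described in this abstract, the sketch below integrates a two-scale Lorenz '96 system in which only the small-scale (Y) tendencies pass through an emulated faulty processor, while the large-scale (X) tendencies remain exact. All parameter values, the fault rate p, the helper name flip_low_mantissa_bits, and the restriction of flips to low-order mantissa bits are illustrative assumptions, not the paper's actual experimental setup.

# Hedged sketch (not the authors' code): emulated stochastic-processor
# bit flips applied only to the small-scale tendencies of the two-scale
# Lorenz '96 model. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

K, J = 8, 32          # number of large-scale (X) and per-X small-scale (Y) variables
F, h, b, c = 20.0, 1.0, 10.0, 10.0
p = 1e-4              # assumed probability of a bit flip per computed value

def flip_low_mantissa_bits(a, p):
    """Emulate hardware faults: XOR a random low-order mantissa bit of
    randomly chosen float64 values. Restricting flips to the low 32
    mantissa bits keeps faults 'inexact' rather than catastrophic."""
    bits = a.copy().view(np.uint64)
    hit = rng.random(a.shape) < p
    which = rng.integers(0, 32, size=a.shape).astype(np.uint64)
    bits[hit] ^= np.uint64(1) << which[hit]
    return bits.view(np.float64)

def tendencies(X, Y):
    """Two-scale Lorenz '96 tendencies; only dY is routed through the
    emulated faulty hardware, mirroring the scale-separation idea."""
    Yk = Y.reshape(K, J)
    dX = (-np.roll(X, 1) * (np.roll(X, 2) - np.roll(X, -1))
          - X + F - (h * c / b) * Yk.sum(axis=1))          # exact arithmetic
    dY = (-c * b * np.roll(Y, -1) * (np.roll(Y, -2) - np.roll(Y, 1))
          - c * Y + (h * c / b) * np.repeat(X, J))
    return dX, flip_low_mantissa_bits(dY, p)               # inexact small scales

# Forward-Euler integration, kept deliberately simple for the sketch.
dt = 0.001
X = F + rng.standard_normal(K)
Y = 0.1 * rng.standard_normal(K * J)
for _ in range(10000):
    dX, dY = tendencies(X, Y)
    X, Y = X + dt * dX, Y + dt * dY
print(X)

Routing only dY through the fault emulator reflects the design choice highlighted in the abstract: the small scales sit closest to the already-inexact parametrised processes, so they tolerate inexact arithmetic that the large scales would not.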
Reliability of decadal predictions
Geophysical Research Letters 39:21 (2012)
Abstract:
The reliability of multi-year climate predictions is assessed using probabilistic Attributes Diagrams for near-surface air temperature and sea surface temperature, based on 54-member ensembles of initialised decadal hindcasts made with the ECMWF coupled model. It is shown that the reliability of the ensemble system is good over global land areas, Europe and Africa, and for the North Atlantic, Indian Ocean and, to a lesser extent, North Pacific basins for lead times of up to 6-9 years. North Atlantic SSTs are reliably predicted even when the climate trend is removed, consistent with the known predictability of this region. By contrast, reliability in the Indian Ocean, where external forcing accounts for most of the variability, deteriorates severely after detrending. More conventional measures of forecast quality, such as the anomaly correlation coefficient (ACC) of the ensemble mean, are also considered, showing that the ensemble has significant skill in predicting multi-annual temperature averages. © 2012. American Geophysical Union. All Rights Reserved.
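For readers unfamiliar with the ACC mentioned above, a minimal sketch follows, computed here over a time series of ensemble-mean anomalies for simplicity. The function name acc, the array shapes, the climatology handling, and the synthetic data are assumptions for illustration, not the paper's verification code.

# Hedged sketch of the anomaly correlation coefficient (ACC) of an
# ensemble mean; shapes and climatology handling are illustrative.
import numpy as np

def acc(hindcasts, observations, climatology):
    """ACC between ensemble-mean forecast anomalies and observed anomalies.

    hindcasts:    (n_members, n_times) forecast values
    observations: (n_times,) verifying observations
    climatology:  (n_times,) reference used to form anomalies
    """
    f_anom = hindcasts.mean(axis=0) - climatology   # ensemble-mean anomaly
    o_anom = observations - climatology
    return (f_anom @ o_anom) / np.sqrt((f_anom @ f_anom) * (o_anom @ o_anom))

# Toy usage with synthetic data (54 members, as in the hindcast ensemble).
rng = np.random.default_rng(1)
truth = rng.standard_normal(40)
ens = truth + rng.standard_normal((54, 40))
print(acc(ens, truth, np.zeros(40)))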
Quantum Theory and The Symbolic Dynamics of Invariant Sets: Towards a Gravitational Theory of the Quantum
ArXiv 1210.394 (2012)
Towards the probabilistic Earth-system simulator: A vision for the future of climate and weather prediction
Quarterly Journal of the Royal Meteorological Society 138:665 (2012) 841-861