Dr Adam Povey FRMetSoc FHEA

Visitor

Research theme

  • Climate physics

Sub department

  • Atmospheric, Oceanic and Planetary Physics

Research groups

  • Earth Observation Data Group
Adam.Povey@physics.ox.ac.uk
Robert Hooke Building, room S46
Publications

Energy balance climate models as a tool for investigating the linkage between the energy imbalance and the hydrological cycle 

(2026)

Authors:

Nedim Sladić, Tim Trent, Adam Povey, Richard P. Allan, Kate Willett

Abstract:

The planetary energy imbalance depends on the amount of solar energy entering and leaving the system, as well as changes in greenhouse gas concentrations. Since the start of the 21st century, the Earth’s energy imbalance (EEI) is estimated to have doubled, linked to a reduction in the solar radiation reflected back to space due to atmospheric dimming. Rapid and responsive feedback mechanisms have contributed to the accumulation of excess heat within the global oceans. This ocean warming drives the positive change in EEI and impacts the hydrological cycle, which becomes more intense. This linkage disturbs well-established weather patterns and causes their alteration. Traditionally, complex state-of-the-art coupled climate models would be used to understand these phenomena. However, the ability of simpler energy balance climate models to capture large-scale features has been shown to offer an alternative approach to understanding the general state of the climate. In this study, we utilise the ocean component of the newly developed novel energy balance climate model (nEBM) to examine the relationship between EEI and ocean warming. Our approach perturbs key hydrological cycle elements (e.g., precipitation, runoff, evaporation) in addition to other forcing components (e.g., CO2) to show the resulting ocean response and the subsequent impacts on EEI. These results are compared to observational datasets to demonstrate the performance of the nEBM ocean model, as well as to CMIP6 output and the relevant literature. Finally, we discuss the ability of simpler climate models such as the nEBM to quantify sensitivity in climate studies.
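To illustrate the class of model described above (not the nEBM itself, which is more elaborate), a minimal zero-dimensional energy balance model can be written as C dT/dt = F − λT, where T is the temperature anomaly, F a radiative forcing, λ the climate feedback parameter, and C an effective heat capacity. A sketch with illustrative parameter values:

```python
# Minimal zero-dimensional energy balance model: C dT/dt = F(t) - lam * T
# T is the temperature anomaly (K), F a radiative forcing (W m^-2),
# lam the climate feedback parameter (W m^-2 K^-1) and C an effective
# ocean heat capacity per unit area (J m^-2 K^-1). Values are illustrative.

def integrate_ebm(forcing, lam=1.3, c=8.0e8, dt=3.15e7):
    """Euler-integrate the EBM; returns the temperature anomaly (K) per step."""
    t_anom = 0.0
    out = []
    for f in forcing:
        # Energy imbalance at the top of the atmosphere (W m^-2)
        imbalance = f - lam * t_anom
        t_anom += imbalance * dt / c
        out.append(t_anom)
    return out

# Example: a constant forcing of 3.7 W m^-2 (roughly a CO2 doubling),
# integrated for 200 one-year steps; the anomaly relaxes toward the
# equilibrium F / lam = 3.7 / 1.3, roughly 2.85 K.
temps = integrate_ebm([3.7] * 200)
```

Perturbation experiments of the kind the abstract describes amount to varying `forcing` (or the model parameters) and comparing the resulting trajectories.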

Making Sense of Uncertainties: Ask the Right Question

(2026)

Authors:

Alexander Gruber, Claire Bulgin, Wouter Dorigo, Owen Embury, Maud Formanek, Christopher Merchant, Jonathan Mittaz, Joaquín Muñoz-Sabater, Florian Pöppl, Adam Povey, Wolfgang Wagner

Abstract:

It is well known that scientific data have uncertainties and that it is crucial to take these uncertainties into account in any decision-making process. Nevertheless, despite data producers’ best efforts to provide complete and rigorous uncertainty estimates alongside their data, users commonly struggle to make sense of uncertainty information. This is because uncertainties are usually expressed as the statistical spread in the observations (for example, as a random error standard deviation), which does not relate to the intended use of the data. Put simply, data and their uncertainty are usually expressed as something like “x plus/minus y”, which does not answer the really important question: how much can I trust “x”, or any use of or decision based upon “x”? Consequently, uncertainties are often either ignored altogether and the data taken at face value, or interpreted by experts (or non-experts) heuristically to arrive at rather subjective, qualitative judgements of the confidence they can have in the data. In line with existing practices (e.g., the communication of uncertainties in the IPCC reports), we conjecture that the key to enabling users to make sense of uncertainties is to represent them as the confidence one can have in whatever event one is interested in, given the available data and their uncertainty. To that end, we propose a novel, generic framework that transforms common uncertainty representations (i.e., estimates of stochastic data properties, such as “the state of this variable is x plus/minus y”) into more meaningful, actionable information that actually relates to the intended use (i.e., statements such as “the data and their uncertainties suggest that we can be z% confident that…”).
This is done by first formulating a meaningful question that links the available data to some events of interest, and then deriving quantitative estimates of the confidence in the occurrence of these events using Bayes’ theorem. We demonstrate this framework using two case examples: (i) using satellite soil moisture retrievals and their uncertainty to derive how confident one can be in the presence and severity of a drought; and (ii) how ocean temperature analyses and their uncertainty can be used to determine how confident one can be that prevailing conditions are likely to cause coral bleaching.
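In the simplest case the abstract describes, the transformation reduces to a textbook probability calculation: given an observation “x plus/minus y” with a Gaussian error and a flat prior, the confidence that the true value lies beyond some decision threshold follows from the error function. A minimal sketch under those assumptions (the threshold and values below are hypothetical, not the paper’s):

```python
# Turn an observation "x plus/minus y" into the confidence that an event
# of interest occurred. Assumes a Gaussian error model with standard
# uncertainty y and a flat prior; the paper's framework is more general.
import math

def confidence_below(threshold, x, y):
    """P(true value < threshold) given observation x with standard uncertainty y."""
    z = (threshold - x) / (y * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))

# Hypothetical soil moisture retrieval of 0.12 m^3/m^3 with uncertainty
# 0.03, against a drought threshold of 0.15:
conf = confidence_below(0.15, 0.12, 0.03)  # roughly 0.84
```

Note how the answer changes the question: instead of reporting 0.12 ± 0.03, the user learns they can be about 84% confident that drought conditions prevail.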

A Practical Introduction to Utilising Uncertainty Information in the Analysis of Essential Climate Variables

Surveys in Geophysics, Springer Science and Business Media LLC (2025)

Authors:

Adam C Povey, Claire E Bulgin, Alexander Gruber

Abstract:

An estimate of uncertainty is essential to understanding what information is conveyed by data and how it relates to the wider context of what one intended to measure. It can be difficult to know how to use uncertainty during the analysis of environmental data and the best way to present that information within a dataset. In many common uses, such as calculating statistical significance, it is easy to make mistakes due to incomplete or inappropriate use of the available uncertainty information. Uncertainty is itself uncertain, such that many practical or empirical solutions are available when a comprehensive uncertainty budget is impractical to produce. This manuscript collects actionable guidance on how uncertainty can be used, presented, and calculated when working with essential climate variables (ECVs). This includes qualitative discussions of the utility of uncertainties, explanations of common misconceptions, advice on presentation style, and plain descriptions of the essential equations. Selected worked examples are included on the propagation of uncertainties, particularly for data aggregation and merging. Uncertainty need not be off-putting as even incomplete uncertainty budgets add value to any observation. This paper aims to provide a starting point, or refresher, for researchers in the environmental sciences to make more complete use of uncertainty in their work.
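A concrete instance of the propagation rules for data aggregation that the paper covers: when averaging n values, independent (random) error components shrink as 1/√n while a fully correlated (systematic) component passes through the mean unchanged. A minimal sketch of this standard textbook result (not code from the paper):

```python
# Combined standard uncertainty of an unweighted mean, separating
# independent (random) from fully correlated (systematic) components.
import math

def uncertainty_of_mean(random_u, systematic_u):
    """Uncertainty of the mean of n values, each with independent standard
    uncertainty random_u[i] and a shared, fully correlated uncertainty
    systematic_u."""
    n = len(random_u)
    # Independent components add in quadrature, then divide by n
    random_part = math.sqrt(sum(u * u for u in random_u)) / n
    # A fully correlated component is unchanged by averaging
    return math.sqrt(random_part ** 2 + systematic_u ** 2)

# Ten observations, each with 0.5 random and 0.2 systematic uncertainty:
# the random part averages down to 0.5 / sqrt(10), about 0.158, giving a
# combined uncertainty of roughly 0.255 -- dominated by the systematic term.
u = uncertainty_of_mean([0.5] * 10, 0.2)
```

The example illustrates a common mistake the paper warns against: dividing the whole uncertainty by √n treats systematic error as random and understates the result.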

The Challenges and Limitations of Validating Satellite-Derived Datasets Using Independent Measurements: Lessons Learned from Essential Climate Variables

Surveys in Geophysics, Springer Science and Business Media LLC (2025)

Authors:

Mary Langsdale, Tijl Verhoelst, Adam Povey, Nick Schutgens, Thomas Dowling, Jean-Christopher Lambert, Steven Compernolle, Stefan Kern

Abstract:

Validation of satellite-derived essential climate variable (ECV) datasets requires comparison against independent measurements. These independent measurements, which include ground-based, airborne, and other non-satellite-based measurements, are typically the product of a different measurement system and may include some contribution from models. These reference data therefore have their own characteristics, uncertainties, and limitations which must be accounted for in the validation process. In addition, they typically differ from the data to be validated in spatio-temporal resolution, sensitivity, and sampling. As such, comparisons to independent data do not necessarily yield clear feedback on the quality of satellite data and insufficient awareness of these issues can lead to erroneous interpretation. This is the cost of leaving the laboratory and studying the real world. In this review paper, we examine the challenges and limitations of evaluating satellite-derived datasets with independent measurements, using examples across different ECVs within the terrestrial, ocean, and atmospheric domains. Drawing from other studies, we discuss issues with the reference datasets themselves, issues specific to use of these data for validation, and issues resulting from the comparison methodology. We conclude with recommendations to the community based on this review. In this, we highlight the importance of continued efforts towards (1) advancing uncertainty modelling of reference datasets and quality control knowledge and procedures, (2) establishing and communicating limitations in reference data, (3) reference data (and metadata) timeliness and preservation, and (4) best practices for validation methodologies that address the spatio-temporal differences of the measurements.

Making Sense of Uncertainties: Ask the Right Question

Surveys in Geophysics, Springer Science and Business Media LLC (2025)

Authors:

Alexander Gruber, Claire E Bulgin, Wouter Dorigo, Owen Embury, Maud Formanek, Christopher Merchant, Jonathan Mittaz, Joaquín Muñoz-Sabater, Florian Pöppl, Adam Povey, Wolfgang Wagner

Abstract:

Earth observation data should inform decision making, but good decisions can only be made if the uncertainties in the data are taken into account. Making sense of uncertainty information can be difficult, because uncertainties represent the statistical spread in the observations (e.g., expressed as x ± y), which does not relate directly to one specific use case of the data. Here, we propose a Bayesian framework to transform Earth observation product uncertainties into actionable information, i.e., estimates of how confident one can be in the occurrence of specific events of interest given the data and their uncertainty. We demonstrate this framework using two case examples: (i) monitoring drought severity based on soil moisture and (ii) estimating coral bleaching risk based on sea surface temperature. In both cases, we show that ignoring uncertainties can easily lead to misinterpretation of the data, making any decisions based on these data unlikely to be the best course of action. The proposed framework is general and can, in principle, be applied to a wide range of applications. Doing so requires a careful dialogue between data users, to formulate meaningful use cases and decision criteria, and data producers, to provide a rigorous description of their data and its uncertainties. The next step would then be to confront the uncertainty-informed estimates of event probabilities (created by the framework proposed here) with the costs and benefits of possible courses of action in order to make the best possible decisions that maximize socioeconomic merit.
