
Tim Palmer

Emeritus

Sub-department

  • Atmospheric, Oceanic and Planetary Physics

Research groups

  • Predictability of weather and climate
Tim.Palmer@physics.ox.ac.uk
Telephone: 01865 (2)72897
Robert Hooke Building, room S43
Publications

Rethinking superdeterminism

Frontiers in Physics Frontiers 8 (2020) 139

Authors:

Sabine Hossenfelder, Tim Palmer

Abstract:

Quantum mechanics has irked physicists ever since its conception more than 100 years ago. While some of the misgivings, such as it being unintuitive, are merely aesthetic, quantum mechanics has one serious shortcoming: it lacks a physical description of the measurement process. This “measurement problem” indicates that quantum mechanics is at least an incomplete theory—good as far as it goes, but missing a piece—or, more radically, is in need of complete overhaul. Here we describe an approach which may provide this sought-for completion or replacement: Superdeterminism. A superdeterministic theory is one which violates the assumption of Statistical Independence (that distributions of hidden variables are independent of measurement settings). Intuition suggests that Statistical Independence is an essential ingredient of any theory of science (never mind physics), and for this reason Superdeterminism is typically discarded swiftly in any discussion of quantum foundations. The purpose of this paper is to explain why the existing objections to Superdeterminism are based on experience with classical physics and linear systems, but that this experience misleads us. Superdeterminism is a promising approach not only to solve the measurement problem, but also to understand the apparent non-locality of quantum physics. Most importantly, we will discuss how it may be possible to test this hypothesis in an (almost) model independent way.
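The abstract defines Statistical Independence only in words. In the conventional Bell-type hidden-variables notation (the symbols below are the standard ones, not necessarily the paper's), it is the assumption that the distribution of hidden variables does not depend on the measurement settings:

```latex
% Statistical Independence (conventional Bell-type notation):
% \lambda -- hidden variables; a, b -- measurement settings
\rho(\lambda \mid a, b) = \rho(\lambda)
% A superdeterministic theory is one in which this equality fails,
% i.e. \rho(\lambda \mid a, b) \neq \rho(\lambda) for some settings.
```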

Discretization of the Bloch sphere, fractal invariant sets and Bell's theorem

Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences The Royal Society 476:2236 (2020) 20190350

Abstract:

An arbitrarily dense discretization of the Bloch sphere of complex Hilbert states is constructed, where points correspond to bit strings of fixed finite length. Number-theoretic properties of trigonometric functions (not part of the quantum-theoretic canon) are used to show that this constructive discretized representation incorporates many of the defining characteristics of quantum systems: complementarity, uncertainty relationships and (with a simple Cartesian product of discretized spheres) entanglement. Unlike Meyer's earlier discretization of the Bloch sphere, there are no orthonormal triples, hence the Kochen-Specker theorem is not nullified. A physical interpretation of points on the discretized Bloch sphere is given in terms of ensembles of trajectories on a dynamically invariant fractal set in state space, where states of physical reality correspond to points on the invariant set. This deterministic construction provides a new way to understand the violation of the Bell inequality without violating statistical independence or factorization, where these conditions are defined solely from states on the invariant set. In this finite representation, there is an upper limit to the number of qubits that can be entangled, a property with potential experimental consequences.
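The paper's construction rests on number-theoretic properties of trigonometric functions, but the basic idea of a finite grid of Hilbert states whose squared amplitudes are rational can be sketched numerically. The toy below (purely illustrative, not the paper's construction; all names are hypothetical) builds a grid of Bloch-sphere points with cos²(θ/2) = m/N and φ = 2πn/N, so that every Born probability is a multiple of 1/N, mirroring relative frequencies in a bit string of length N:

```python
import numpy as np

def discretized_bloch_points(N):
    """Grid of Bloch-sphere points with cos^2(theta/2) = m/N, phi = 2*pi*n/N.
    Illustrative toy only: the paper additionally uses number-theoretic
    results about which such angles admit rational cosines."""
    points = []
    for m in range(N + 1):              # squared amplitude m/N
        theta = 2 * np.arccos(np.sqrt(m / N))
        for n in range(N):              # phase 2*pi*n/N
            points.append((theta, 2 * np.pi * n / N))
    return points

def state(theta, phi):
    """Qubit state |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>."""
    return np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])

# Every grid point yields a Born probability |<0|psi>|^2 that is an exact
# multiple of 1/N, i.e. a possible frequency of 0s in a length-N bit string.
N = 16
probs = [abs(state(th, ph)[0]) ** 2 for th, ph in discretized_bloch_points(N)]
```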

Single-precision in the tangent-linear and adjoint models of incremental 4D-Var

Monthly Weather Review American Meteorological Society 148:4 (2020) 1541-1552

Authors:

S Hatfield, A McRae, T Palmer, P Düben

Abstract:

The use of single-precision arithmetic in ECMWF's forecasting model gave a 40% reduction in wall-clock time over double precision, with no decrease in forecast quality. However, the use of reduced precision in 4D-Var data assimilation is relatively unexplored, and there are potential issues with using single precision in the tangent-linear and adjoint models. Here, we present the results of reducing numerical precision in an incremental 4D-Var data assimilation scheme with an underlying two-layer quasigeostrophic model. The minimizer used is the conjugate gradient method. We show how reducing precision increases the asymmetry between the tangent-linear and adjoint models. For ill-conditioned problems, this leads to a loss of orthogonality among the residuals of the conjugate gradient algorithm, which slows the convergence of the minimization procedure. However, we also show that a standard technique, reorthogonalization, eliminates these issues and therefore could allow the use of single-precision arithmetic. This work is carried out within ECMWF's data assimilation framework, the Object-Oriented Prediction System (OOPS).
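The precision effect described in the abstract can be reproduced in miniature. The sketch below (an illustration on a hypothetical test matrix, not the ECMWF/OOPS implementation) runs conjugate gradients on an ill-conditioned symmetric positive-definite system in single precision; without reorthogonalization the residuals lose their mutual orthogonality, while a modified Gram-Schmidt projection of each new residual against its predecessors restores it:

```python
import numpy as np

def cg(A, b, dtype=np.float32, reorthogonalize=False, iters=40):
    """Conjugate gradients in a chosen precision.  Optionally re-orthogonalize
    each new residual against all previous ones (modified Gram-Schmidt).
    Illustrative sketch only -- not the ECMWF/OOPS implementation."""
    A = A.astype(dtype)
    b = b.astype(dtype)
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    qs = [r / np.linalg.norm(r)]          # normalized residual history
    for _ in range(iters):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if reorthogonalize:               # project out previous residuals
            for q in qs:
                r_new = r_new - (r_new @ q) * q
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
        qs.append(r / (np.linalg.norm(r) + np.finfo(dtype).tiny))
    return x, np.array(qs)

def max_offdiag(qs):
    """Largest |q_i . q_j| for i != j; 0 means perfectly orthogonal residuals."""
    G = np.abs(qs @ qs.T)
    return (G - np.diag(np.diag(G))).max()

# Ill-conditioned SPD system (condition number ~1e6, hypothetical test case)
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((80, 80)))
A = Q @ np.diag(np.logspace(0, 6, 80)) @ Q.T
b = rng.standard_normal(80)

_, qs_plain = cg(A, b, np.float32)
_, qs_reorth = cg(A, b, np.float32, reorthogonalize=True)
# qs_plain shows a marked loss of orthogonality; qs_reorth does not.
```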

Seasonal forecasts of the 20th century

Bulletin of the American Meteorological Society American Meteorological Society 101:8 (2020) E1413-E1426

Authors:

Antje Weisheimer, Daniel Befort, David Macleod, Timothy Palmer, Chris O’Reilly, Kristian Strømmen

Abstract:

New seasonal retrospective forecasts for 1901-2010 show that skill for predicting ENSO, NAO and PNA is reduced during mid-century periods compared to earlier and more recent high-skill decades.

Forecasts of seasonal climate anomalies using physically based global circulation models are routinely made at operational meteorological centers around the world. A crucial component of any seasonal forecast system is the set of retrospective forecasts, or hindcasts, from past years which are used to estimate skill and to calibrate the forecasts. Hindcasts are usually produced over a period of around 20-30 years. However, recent studies have demonstrated that seasonal forecast skill can undergo pronounced multi-decadal variations. These results imply that relatively short hindcasts are not adequate for reliably testing seasonal forecasts and that small hindcast sample sizes can potentially lead to skill estimates that are not robust. Here we present new and unprecedented 110-year-long coupled hindcasts of the next season over the period 1901 to 2010. Their performance for the recent period is in good agreement with that of operational forecast models. While skill for ENSO is very high during recent decades, it is markedly reduced during the 1930s to 1950s. Skill at the beginning of the 20th century is, however, as high as for recent high-skill periods. Consistent with findings in atmosphere-only hindcasts, a mid-century drop in forecast skill is found for a range of atmospheric fields, including large-scale indices such as the NAO and the PNA patterns. As with ENSO, skill scores for these indices recover in the early 20th century, suggesting that the mid-century drop in skill is not due to a lack of good observational data.

A public dissemination platform for our hindcast data is available, and we invite the scientific community to explore these data.
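The point about short hindcast windows can be illustrated with synthetic data. The sketch below (randomly generated series, purely illustrative, no relation to the paper's hindcasts) computes correlation skill in sliding 20-year windows over a 110-year forecast/observation pair whose predictable signal weakens mid-century; the estimated skill depends strongly on which decades the window happens to cover:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1901, 2011)

# Synthetic "observations" and "forecasts": a shared predictable signal whose
# amplitude dips mid-century, plus independent noise (illustrative only).
signal = rng.standard_normal(years.size)
amp = np.where((years >= 1935) & (years <= 1955), 0.3, 1.0)
obs = amp * signal + rng.standard_normal(years.size)
fcst = amp * signal + rng.standard_normal(years.size)

def window_skill(obs, fcst, width=20):
    """Correlation skill estimated from sliding windows of `width` years."""
    return np.array([np.corrcoef(obs[i:i + width], fcst[i:i + width])[0, 1]
                     for i in range(obs.size - width + 1)])

skill = window_skill(obs, fcst)
# A 20-30 year hindcast corresponds to picking just one of these windows:
# the resulting skill estimate varies substantially with the chosen period.
```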


Human creativity and consciousness: unintended consequences of the brain's extraordinary energy efficiency?

Entropy MDPI 22:3 (2020) 281

Abstract:

It is proposed that both human creativity and human consciousness are (unintended) consequences of the human brain's extraordinary energy efficiency. The topics of creativity and consciousness are treated separately, though they have a common sub-structure. It is argued that creativity arises from a synergy between two cognitive modes of the human brain (which broadly coincide with Kahneman's Systems 1 and 2). In the first, available energy is spread across a relatively large network of neurons, many of which are small enough to be susceptible to thermal (ultimately quantum decoherent) noise. In the second, available energy is focussed on a smaller subset of larger neurons whose action is deterministic. Possible implications for creative computing in silicon are discussed. Starting with a discussion of the concept of free will, the notion of consciousness is defined in terms of an awareness of what are perceived to be nearby counterfactual worlds in state space. It is argued that such awareness arises from an interplay between memories on the one hand, and quantum physical mechanisms (where, unlike in classical physics, nearby counterfactual worlds play an indispensable dynamical role) in the ion channels of neural networks, on the other. As with the brain's susceptibility to noise, it is argued that in situations where quantum physics plays a role in the brain, it does so for reasons of energy efficiency. As an illustration of this definition of consciousness, a novel proposal is outlined as to why quantum entanglement appears to be so counter-intuitive.

