Human creativity and consciousness: unintended consequences of the brain's extraordinary energy efficiency?

Entropy MDPI 22:3 (2020) 281

Abstract:

It is proposed that both human creativity and human consciousness are (unintended) consequences of the human brain's extraordinary energy efficiency. The topics of creativity and consciousness are treated separately, though they share a common sub-structure. It is argued that creativity arises from a synergy between two cognitive modes of the human brain (which broadly coincide with Kahneman's Systems 1 and 2). In the first, available energy is spread across a relatively large network of neurons, many of which are small enough to be susceptible to thermal (ultimately quantum decoherent) noise. In the second, available energy is focussed on a smaller subset of larger neurons whose action is deterministic. Possible implications for creative computing in silicon are discussed. Starting with a discussion of the concept of free will, the notion of consciousness is defined in terms of an awareness of what are perceived to be nearby counterfactual worlds in state space. It is argued that such awareness arises from an interplay between memories on the one hand, and quantum physical mechanisms (where, unlike in classical physics, nearby counterfactual worlds play an indispensable dynamical role) in the ion channels of neural networks, on the other. As with the brain's susceptibility to noise, it is argued that in situations where quantum physics plays a role in the brain, it does so for reasons of energy efficiency. As an illustration of this definition of consciousness, a novel proposal is outlined as to why quantum entanglement appears to be so counter-intuitive.

Reduced-precision parametrization: lessons from an intermediate-complexity atmospheric model

Quarterly Journal of the Royal Meteorological Society Wiley 146:729 (2020) 1590-1607

Authors:

Leo Saffin, Sam Hatfield, Peter Duben, Tim Palmer

Abstract:

Reducing numerical precision can save computational costs, which can then be reinvested for more useful purposes. This study considers the effects of reducing precision in the parametrizations of an intermediate-complexity atmospheric model (SPEEDY). We find that the difference between double-precision and reduced-precision parametrization tendencies is proportional to the expected machine rounding error if individual timesteps are considered. However, if reduced precision is used in simulations that are compared to double-precision simulations, a range of precisions is found where the differences are approximately the same for all simulations. Here, rounding errors are small enough not to perturb the model dynamics directly, but can perturb conditional statements in the parametrizations (such as convection active/inactive), leading to similar error growth for all runs. For lower precisions, simulations are perturbed significantly. The acceptable precision cannot be constrained without some quantification of the uncertainty. The inherent uncertainty in numerical weather and climate models is often represented explicitly in simulations by stochastic schemes that randomly perturb the parametrizations. A commonly used scheme is stochastic perturbation of parametrization tendencies (SPPT). A strong test of whether a given precision is acceptable is whether a low-precision ensemble produces the same probability distribution as a double-precision ensemble in which the only difference between ensemble members is the model uncertainty (i.e., the random seed in SPPT). Tests with SPEEDY suggest that a precision as low as 3.5 decimal places (equivalent to half precision) could be acceptable, which is surprisingly close to the lowest precision that produces similar error growth in the experiments without SPPT mentioned above. Minor changes to the model code to express variables as anomalies rather than absolute values reduce rounding errors and low-precision biases, allowing even lower precision to be used. These results provide a pathway for implementing reduced-precision parametrizations in more complex weather and climate models.
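The precision reduction studied in this abstract can be emulated in software by rounding each value's significand to a chosen number of bits, in the spirit of the reduced-precision emulators used in such experiments. The following is a minimal sketch, not the paper's actual code; the function name and the use of NumPy are illustrative assumptions:

```python
import numpy as np

def reduce_precision(x, significand_bits):
    """Round x to a reduced-precision significand while keeping the
    double-precision exponent range (software emulation only)."""
    x = np.asarray(x, dtype=np.float64)
    # frexp splits x into a mantissa in [0.5, 1) and an integer exponent.
    mantissa, exponent = np.frexp(x)
    scale = 2.0 ** significand_bits
    # Round the mantissa to the requested number of bits, then reassemble.
    mantissa = np.round(mantissa * scale) / scale
    return np.ldexp(mantissa, exponent)

# IEEE half precision keeps 10 explicit significand bits (~3.3 decimal
# digits), close to the 3.5 decimal places found acceptable above.
values = np.array([1.0, np.pi, 1.0e-3])
rounded = reduce_precision(values, 10)
```

With round-to-nearest, the relative rounding error introduced this way is bounded by 2^-significand_bits, which is the "expected machine rounding error" against which the parametrization tendency differences are compared.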

Anthropogenic influence on the 2018 summer warm spell in Europe: the impact of different spatio-temporal scales

Bulletin of the American Meteorological Society American Meteorological Society 101:S1 (2020) S41-S46

Authors:

Nicholas Leach, S Li, S Sparrow, GJ Van Oldenborgh, FC Lott, A Weisheimer, Allen

Abstract:

We demonstrate that, in attribution studies, events defined over longer time scales generally produce higher probability ratios due to lower interannual variability, reconciling the seemingly inconsistent attribution results for Europe's 2018 summer heatwaves reported in previous studies.

Beyond skill scores: exploring sub-seasonal forecast value through a case study of French month-ahead energy prediction

(2020)

Authors:

Joshua Dorrington, Isla Finney, Tim Palmer, Antje Weisheimer

The physics of numerical analysis: a climate modelling case study

Philosophical Transactions A: Mathematical, Physical and Engineering Sciences Royal Society 378:2166 (2020) 20190058

Abstract:

The case is made for a much closer synergy between climate science, numerical analysis and computer science. This article is part of a discussion meeting issue 'Numerical algorithms for high-performance computational science'.