Publications


Number Formats, Error Mitigation, and Scope for 16‐Bit Arithmetics in Weather and Climate Modeling Analyzed With a Shallow Water Model

Journal of Advances in Modeling Earth Systems American Geophysical Union (AGU) 12 (2020)

M Klöwer, P Düben, T Palmer


Beyond skill scores: exploring sub‐seasonal forecast value through a case‐study of French month‐ahead energy prediction

Quarterly Journal of the Royal Meteorological Society Wiley (2020) qj.3863

J Dorrington, I Finney, T Palmer, A Weisheimer


Assessing the robustness of multidecadal variability in Northern Hemisphere wintertime seasonal forecast skill

Quarterly Journal of the Royal Meteorological Society Wiley (2020) qj.3890

CH O'Reilly, A Weisheimer, D MacLeod, DJ Befort, T Palmer


The value of initialisation on decadal timescales: state dependent predictability in the CESM Decadal Prediction Large Ensemble

Journal of Climate American Meteorological Society 33 (2020) 7353-7370

H Christensen, J Berner, S Yeager

Information in decadal climate prediction arises from a well initialised ocean state and from the predicted response to an external forcing. The length of time over which the initial conditions benefit the decadal forecast depends on the start date of the forecast. We characterise this state-dependent predictability for decadal forecasts of upper ocean heat content in the Community Earth System Model. We find regionally dependent initial condition predictability, with extended predictability generally observed in the extra-tropics. We also detect state-dependent predictability, with the year of loss of information from the initialisation varying between start dates. The decadal forecasts in the North Atlantic show substantial information from the initial conditions beyond the ten-year forecast window, and a high degree of state-dependent predictability. We find some evidence for state dependent predictability in the ensemble spread in this region, similar to that seen in weather and subseasonal-to-seasonal forecasts. For some start dates, an increase of information with lead time is observed, for which the initialised forecasts predict a growing phase of the Atlantic Multidecadal Oscillation. Finally we consider the information in the forecast from the initial conditions relative to the forced response, and quantify the crossover timescale after which the forcing provides more information. We demonstrate that the climate change signal projects onto different patterns than the signal from the initial conditions. This means that even after the crossover timescale has been reached in a basin-averaged sense, the benefits of initialisation can be felt locally on longer timescales.


Jet Speed Variability Obscures Euro‐Atlantic Regime Structure

Geophysical Research Letters American Geophysical Union (AGU) 47 (2020)

J Dorrington, K Strommen


Continuous Structural Parameterization: A proposed method for representing different model parameterizations within one structure demonstrated for atmospheric convection

Journal of Advances in Modeling Earth Systems American Geophysical Union (AGU) (2020) e2020MS002085

F Lambert, P Challenor, N Lewis, D McNeall, N Owen, I Boutle, H Christensen, R Keane, N Mayne, A Stirling, M Webb


Seasonal forecasts of the 20th century

Bulletin of the American Meteorological Society American Meteorological Society 101 (2020) E1413-E1426

A Weisheimer, D Befort, D Macleod, T Palmer, C O’Reilly, K Strømmen

New seasonal retrospective forecasts for 1901-2010 show that skill for predicting ENSO, NAO and PNA is reduced during mid-century periods compared to earlier and more recent high-skill decades.

Forecasts of seasonal climate anomalies using physically-based global circulation models are routinely made at operational meteorological centers around the world. A crucial component of any seasonal forecast system is the set of retrospective forecasts, or hindcasts, from past years which are used to estimate skill and to calibrate the forecasts. Hindcasts are usually produced over a period of around 20-30 years. However, recent studies have demonstrated that seasonal forecast skill can undergo pronounced multi-decadal variations. These results imply that relatively short hindcasts are not adequate for reliably testing seasonal forecasts and that small hindcast sample sizes can potentially lead to skill estimates that are not robust. Here we present new and unprecedented 110-year-long coupled hindcasts of the next season over the period 1901 to 2010. Their performance for the recent period is in good agreement with those of operational forecast models. While skill for ENSO is very high during recent decades, it is markedly reduced during the 1930s to 1950s. Skill at the beginning of the 20th century is, however, as high as for recent high-skill periods. Consistent with findings in atmosphere-only hindcasts, a mid-century drop in forecast skill is found for a range of atmospheric fields including large-scale indices such as the NAO and the PNA patterns. As with ENSO, skill scores for these indices recover in the early 20th century, suggesting that the mid-century drop in skill is not due to lack of good observational data.

A public dissemination platform for our hindcast data is available and we invite the scientific community to explore them.


Embeddedness of Timelike Maximal Surfaces in (1+2)-Minkowski Space

Annales Henri Poincaré (2020)

EA Paxton

We prove that if ϕ: ℝ^2 → ℝ^{1+2} is a smooth, proper, timelike immersion with vanishing mean curvature, then necessarily ϕ is an embedding, and every compact subset of ϕ(ℝ^2) is a smooth graph. It follows that if one evolves any smooth, self-intersecting spacelike curve (or any planar spacelike curve whose unit tangent vector spans a closed semi-circle) so as to trace a timelike surface of vanishing mean curvature in ℝ^{1+2}, then the evolving surface will either fail to remain timelike, or it will fail to remain smooth. We show that, even allowing for null points, such a Cauchy evolution will be C^2 inextendible beyond some singular time. In addition we study the continuity of the unit tangent for the evolution of a self-intersecting curve in isothermal gauge, which defines a well-known evolution beyond singular time.


An analysis of ways to decarbonize conference travel after COVID-19

Nature Nature Research 583 (2020) 356-360

D Hopkins, M Klöwer, J Higham, M Allen


Euro-Atlantic weather Regimes in the PRIMAVERA coupled climate simulations: impact of resolution and mean state biases on model performance

Climate Dynamics Springer Science and Business Media LLC 54 (2020) 5031-5048

F Fabiano, H Christensen, K Strommen, P Athanasiadis, A Baker, R Schiemann, S Corti


Short-term tests validate long-term estimates of climate change

Nature Springer Nature 582 (2020) 185-186

T Palmer


Rethinking superdeterminism

Frontiers in Physics Frontiers 8 (2020) 139

S Hossenfelder, T Palmer

Quantum mechanics has irked physicists ever since its conception more than 100 years ago. While some of the misgivings, such as it being unintuitive, are merely aesthetic, quantum mechanics has one serious shortcoming: it lacks a physical description of the measurement process. This “measurement problem” indicates that quantum mechanics is at least an incomplete theory—good as far as it goes, but missing a piece—or, more radically, is in need of complete overhaul. Here we describe an approach which may provide this sought-for completion or replacement: Superdeterminism. A superdeterministic theory is one which violates the assumption of Statistical Independence (that distributions of hidden variables are independent of measurement settings). Intuition suggests that Statistical Independence is an essential ingredient of any theory of science (never mind physics), and for this reason Superdeterminism is typically discarded swiftly in any discussion of quantum foundations. The purpose of this paper is to explain why the existing objections to Superdeterminism are based on experience with classical physics and linear systems, but that this experience misleads us. Superdeterminism is a promising approach not only to solve the measurement problem, but also to understand the apparent non-locality of quantum physics. Most importantly, we will discuss how it may be possible to test this hypothesis in an (almost) model independent way.
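For reference, the Statistical Independence assumption described above is usually written ρ(λ | a, b) = ρ(λ): the distribution ρ of the hidden variables λ does not depend on the measurement settings a and b. A superdeterministic theory is, by definition, one in which this equality fails.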


Discretization of the Bloch sphere, fractal invariant sets and Bell's theorem.

Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences The Royal Society 476 (2020) 20190350

T Palmer

An arbitrarily dense discretization of the Bloch sphere of complex Hilbert states is constructed, where points correspond to bit strings of fixed finite length. Number-theoretic properties of trigonometric functions (not part of the quantum-theoretic canon) are used to show that this constructive discretized representation incorporates many of the defining characteristics of quantum systems: complementarity, uncertainty relationships and (with a simple Cartesian product of discretized spheres) entanglement. Unlike Meyer's earlier discretization of the Bloch sphere, there are no orthonormal triples, hence the Kochen-Specker theorem is not nullified. A physical interpretation of points on the discretized Bloch sphere is given in terms of ensembles of trajectories on a dynamically invariant fractal set in state space, where states of physical reality correspond to points on the invariant set. This deterministic construction provides a new way to understand the violation of the Bell inequality without violating statistical independence or factorization, where these conditions are defined solely from states on the invariant set. In this finite representation, there is an upper limit to the number of qubits that can be entangled, a property with potential experimental consequences.


Reduced-precision parametrization: lessons from an intermediate-complexity atmospheric model

Quarterly Journal of the Royal Meteorological Society Wiley 146 (2020) 1590-1607

L Saffin, S Hatfield, P Duben, T Palmer

Reducing numerical precision can save computational costs which can then be reinvested for more useful purposes. This study considers the effects of reducing precision in the parametrizations of an intermediate complexity atmospheric model (SPEEDY). We find that the difference between double-precision and reduced-precision parametrization tendencies is proportional to the expected machine rounding error if individual timesteps are considered. However, if reduced precision is used in simulations that are compared to double-precision simulations, a range of precision is found where differences are approximately the same for all simulations. Here, rounding errors are small enough to not directly perturb the model dynamics, but can perturb conditional statements in the parametrizations (such as convection active/inactive) leading to a similar error growth for all runs. For lower precision, simulations are perturbed significantly. Precision cannot be constrained without some quantification of the uncertainty. The inherent uncertainty in numerical weather and climate models is often explicitly considered in simulations by stochastic schemes that will randomly perturb the parametrizations. A commonly used scheme is stochastic perturbation of parametrization tendencies (SPPT). A strong test on whether a precision is acceptable is whether a low-precision ensemble produces the same probability distribution as a double-precision ensemble where the only difference between ensemble members is the model uncertainty (i.e., the random seed in SPPT). Tests with SPEEDY suggest a precision as low as 3.5 decimal places (equivalent to half precision) could be acceptable, which is surprisingly close to the lowest precision that produces similar error growth in the experiments without SPPT mentioned above. Minor changes to model code to express variables as anomalies rather than absolute values reduce rounding errors and low-precision biases, allowing even lower precision to be used. These results provide a pathway for implementing reduced-precision parametrizations in more complex weather and climate models.
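Reduced-precision experiments of this kind are typically carried out with a software emulator that rounds double-precision results to a chosen number of significand bits. Below is a minimal NumPy sketch of that rounding operation (illustrative only; it is not the emulator used in the study, and the function name and test values are invented):

```python
import numpy as np

def round_significand(x, sbits):
    """Emulate reduced precision: round x to `sbits` significand bits.

    Keeps the float64 exponent range and only coarsens the significand
    (round-to-nearest), which is how software emulators typically mimic
    lower-precision arithmetic inside an otherwise double-precision code.
    """
    x = np.asarray(x, dtype=np.float64)
    m, e = np.frexp(x)                       # x = m * 2**e with |m| in [0.5, 1)
    m = np.round(m * 2.0**sbits) / 2.0**sbits
    return np.ldexp(m, e)

# Half precision keeps 11 significand bits (1 implicit + 10 stored),
# i.e. roughly three significant decimal digits.
t = np.array([273.1534, 1.2345e-3, 9.87654e5])
print(round_significand(t, 11))
```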


Single-precision in the tangent-linear and adjoint models of incremental 4D-Var

Monthly Weather Review American Meteorological Society 148 (2020) 1541-1552

S Hatfield, A McRae, T Palmer, P Düben

The use of single-precision arithmetic in ECMWF’s forecasting model gave a 40% reduction in wall-clock time over double precision, with no decrease in forecast quality. However, the use of reduced precision in 4D-Var data assimilation is relatively unexplored and there are potential issues with using single precision in the tangent-linear and adjoint models. Here, we present the results of reducing numerical precision in an incremental 4D-Var data assimilation scheme, with an underlying two-layer quasigeostrophic model. The minimizer used is the conjugate gradient method. We show how reducing precision increases the asymmetry between the tangent-linear and adjoint models. For ill-conditioned problems, this leads to a loss of orthogonality among the residuals of the conjugate gradient algorithm, which slows the convergence of the minimization procedure. However, we also show that a standard technique, reorthogonalization, eliminates these issues and therefore could allow the use of single-precision arithmetic. This work is carried out within ECMWF’s data assimilation framework, the Object Oriented Prediction System.
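A generic illustration of the reorthogonalization remedy mentioned above: the NumPy sketch below (not the OOPS/4D-Var code; the names, tolerances and test problem are illustrative) runs conjugate gradients in single precision and, optionally, re-orthogonalizes each new residual against the stored earlier residuals.

```python
import numpy as np

def cg(A, b, tol=1e-3, maxiter=1000, reorthogonalize=False, dtype=np.float32):
    """Conjugate gradients in reduced precision.

    With reorthogonalize=True each new residual is explicitly
    orthogonalized (classical Gram-Schmidt) against the stored unit
    residuals, countering the loss of mutual orthogonality that finite
    precision causes for ill-conditioned A.
    """
    A, b = A.astype(dtype), b.astype(dtype)
    x = np.zeros_like(b)
    r = b.copy()                       # residual for the initial guess x = 0
    p = r.copy()                       # first search direction
    R = [r / np.linalg.norm(r)]        # stored unit residuals
    for k in range(maxiter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        if reorthogonalize:
            for q in R:                # Gram-Schmidt sweep over old residuals
                r_new = r_new - (q @ r_new) * q
        if np.linalg.norm(r_new) < tol * np.linalg.norm(b):
            return x, k + 1
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
        R.append(r / np.linalg.norm(r))
    return x, maxiter

# Illustrative ill-conditioned symmetric positive-definite test system.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((100, 100)))
A = Q @ np.diag(np.logspace(0, 4, 100)) @ Q.T
b = rng.standard_normal(100)
for flag in (False, True):
    _, iters = cg(A, b, reorthogonalize=flag)
    print(f"reorthogonalize={flag}: stopped after {iters} iterations")
```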


The physics of numerical analysis: a climate modelling case study

Philosophical Transactions A: Mathematical, Physical and Engineering Sciences Royal Society 378 (2020) 20190058

T Palmer

The case is made for a much closer synergy between climate science, numerical analysis and computer science. This article is part of a discussion meeting issue 'Numerical algorithms for high-performance computational science'.


Machine Learning for Stochastic Parameterization: Generative Adversarial Networks in the Lorenz’96 Model

Journal of Advances in Modeling Earth Systems American Geophysical Union 12 (2020) e2019MS001896

DJ Gagne, HM Christensen, AC Subramanian, AH Monahan

Stochastic parameterizations account for uncertainty in the representation of unresolved subgrid processes by sampling from the distribution of possible subgrid forcings. Some existing stochastic parameterizations utilize data‐driven approaches to characterize uncertainty, but these approaches require significant structural assumptions that can limit their scalability. Machine learning models, including neural networks, are able to represent a wide range of distributions and build optimized mappings between a large number of inputs and subgrid forcings. Recent research on machine learning parameterizations has focused only on deterministic parameterizations. In this study, we develop a stochastic parameterization using the generative adversarial network (GAN) machine learning framework. The GAN stochastic parameterization is trained and evaluated on output from the Lorenz '96 model, which is a common baseline model for evaluating both parameterization and data assimilation techniques. We evaluate different ways of characterizing the input noise for the model and perform model runs with the GAN parameterization at weather and climate time scales. Some of the GAN configurations perform better than a baseline bespoke parameterization at both time scales, and the networks closely reproduce the spatiotemporal correlations and regimes of the Lorenz '96 system. We also find that, in general, those models which produce skillful forecasts are also associated with the best climate simulations.
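The Lorenz '96 testbed is compact enough to sketch directly. The NumPy snippet below (parameter values are common defaults, not necessarily those used in the paper; all names are illustrative) integrates the two-level system and collects pairs of resolved state X_k and subgrid forcing U_k, the kind of data on which a machine-learning parameterization such as a GAN can be trained.

```python
import numpy as np

# Common parameter choices for the two-level Lorenz '96 system.
K, J = 8, 32                    # slow variables X_k; J fast variables Y per X_k
F, h, c, b = 20.0, 1.0, 10.0, 10.0

def l96_rhs(X, Y):
    """Tendencies of the two-level Lorenz '96 model.

    The coupling term (h*c/b) * sum_j Y[j, k] is the 'subgrid' forcing
    that a parameterization of the slow variables has to represent.
    """
    Ysum = Y.reshape(K, J).sum(axis=1)
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F - (h * c / b) * Ysum)
    dY = (-c * b * np.roll(Y, -1) * (np.roll(Y, -2) - np.roll(Y, 1))
          - c * Y + (h * c / b) * np.repeat(X, J))
    return dX, dY

def rk4_step(X, Y, dt=0.001):
    """One fourth-order Runge-Kutta step for the coupled system."""
    k1x, k1y = l96_rhs(X, Y)
    k2x, k2y = l96_rhs(X + 0.5 * dt * k1x, Y + 0.5 * dt * k1y)
    k3x, k3y = l96_rhs(X + 0.5 * dt * k2x, Y + 0.5 * dt * k2y)
    k4x, k4y = l96_rhs(X + dt * k3x, Y + dt * k3y)
    X_next = X + dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
    Y_next = Y + dt / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
    return X_next, Y_next

# Generate (X_k, U_k) samples, where U_k is the subgrid forcing a
# deterministic or stochastic parameterization would have to predict.
rng = np.random.default_rng(0)
X, Y = rng.standard_normal(K), 0.1 * rng.standard_normal(K * J)
samples = []
for step in range(20000):
    X, Y = rk4_step(X, Y)
    if step >= 2000 and step % 10 == 0:            # discard spin-up, thin output
        U = (h * c / b) * Y.reshape(K, J).sum(axis=1)
        samples.append(np.stack([X, U], axis=1))
samples = np.concatenate(samples)                  # columns: X_k, U_k
```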


Human creativity and consciousness: unintended consequences of the brain's extraordinary energy efficiency?

Entropy MDPI 22 (2020) 281

T Palmer

It is proposed that both human creativity and human consciousness are (unintended) consequences of the human brain's extraordinary energy efficiency. The topics of creativity and consciousness are treated separately, though they have a common sub-structure. It is argued that creativity arises from a synergy between two cognitive modes of the human brain (which broadly coincide with Kahneman's Systems 1 and 2). In the first, available energy is spread across a relatively large network of neurons, many of which are small enough to be susceptible to thermal (ultimately quantum decoherent) noise. In the second, available energy is focussed on a smaller subset of larger neurons whose action is deterministic. Possible implications for creative computing in silicon are discussed. Starting with a discussion of the concept of free will, the notion of consciousness is defined in terms of an awareness of what are perceived to be nearby counterfactual worlds in state space. It is argued that such awareness arises from an interplay between memories on the one hand, and quantum physical mechanisms (where, unlike in classical physics, nearby counterfactual worlds play an indispensable dynamical role) in the ion channels of neural networks, on the other. As with the brain's susceptibility to noise, it is argued that in situations where quantum physics plays a role in the brain, it does so for reasons of energy efficiency. As an illustration of this definition of consciousness, a novel proposal is outlined as to why quantum entanglement appears to be so counter-intuitive.


Anthropogenic influence on the 2018 summer warm spell in Europe: the impact of different spatio-temporal scales

Bulletin of the American Meteorological Society American Meteorological Society 101 (2020) S41-S46

N Leach, S Li, S Sparrow, GJ Van Oldenborgh, FC Lott, A Weisheimer, M Allen

We demonstrate that, in attribution studies, events defined over longer time scales generally produce higher probability ratios due to lower interannual variability, reconciling seemingly inconsistent attribution results of Europe’s 2018 summer heatwaves in reported studies.
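For reference, the probability ratio used in such attribution studies is PR = p1/p0: the probability of exceeding the event threshold in the factual climate (with anthropogenic forcing) divided by that in the counterfactual climate without it. Defining the event over longer time scales reduces interannual variability and so tends to increase this ratio.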


Patterns in Wall-Bounded Shear Flows

(2020)

LS Tuckerman, M Chantry, D Barkley
