# Publications

## Seasonal forecasts of the 20th century

Bulletin of the American Meteorological Society, American Meteorological Society (2020) BAMS-D-19-0019.1

New seasonal retrospective forecasts for 1901-2010 show that skill for predicting ENSO, the NAO and the PNA is reduced during mid-century decades compared with earlier and more recent high-skill periods. Forecasts of seasonal climate anomalies using physically based general circulation models are routinely made at operational meteorological centers around the world. A crucial component of any seasonal forecast system is the set of retrospective forecasts, or hindcasts, from past years which are used to estimate skill and to calibrate the forecasts. Hindcasts are usually produced over a period of around 20-30 years. However, recent studies have demonstrated that seasonal forecast skill can undergo pronounced multi-decadal variations. These results imply that relatively short hindcasts are not adequate for reliably testing seasonal forecasts and that small hindcast sample sizes can lead to skill estimates that are not robust. Here we present unprecedented 110-year-long coupled hindcasts of the coming season over the period 1901 to 2010. Their performance for the recent period is in good agreement with that of operational forecast models. While skill for ENSO is very high during recent decades, it is markedly reduced during the 1930s to 1950s. Skill at the beginning of the 20th century is, however, as high as for recent high-skill periods. Consistent with findings in atmosphere-only hindcasts, a mid-century drop in forecast skill is found for a range of atmospheric fields, including large-scale indices such as the NAO and PNA patterns. As with ENSO, skill scores for these indices recover in the early 20th century, suggesting that the mid-century drop in skill is not due to a lack of good observational data. A public dissemination platform for our hindcast data is available, and we invite the scientific community to explore it.
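The skill estimates discussed above are typically correlation-based. As a hedged illustration (not the authors' code; the function name and the synthetic data are invented for this sketch), sliding-window correlations between a hindcast index and observations show how a short hindcast drawn from a low-skill era would misrepresent long-term skill:

```python
import numpy as np

# Hedged sketch: correlation skill in sliding windows. All names and the
# synthetic data below are invented for illustration.

def sliding_correlation(forecast, observed, window):
    """Pearson correlation of forecast vs. observations in each window."""
    n = len(forecast) - window + 1
    return np.array([
        np.corrcoef(forecast[i:i + window], observed[i:i + window])[0, 1]
        for i in range(n)
    ])

rng = np.random.default_rng(0)
obs = rng.normal(size=110)          # 110 "years" of a climate index
noise = rng.normal(size=110)
noise_amp = np.ones(110)
noise_amp[30:60] = 2.5              # noisier forecasts mid-record
fcst = obs + noise_amp * noise      # synthetic hindcast of the index

skill = sliding_correlation(fcst, obs, window=30)
# Windows covering the noisy mid-record era show reduced correlation,
# while the earliest and latest windows remain comparatively skilful.
```

A 20-30-year hindcast sampled entirely from the noisy era would report much lower skill than the full record supports, which is the sampling problem the abstract describes.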

## Anthropogenic influence on the 2018 summer warm spell in Europe: the impact of different spatio-temporal scales

Bulletin of the American Meteorological Society, American Meteorological Society **101** (2020) S41-S46

We demonstrate that, in attribution studies, events defined over longer time scales generally yield higher probability ratios because of their lower interannual variability, which reconciles the seemingly inconsistent attribution results for Europe’s 2018 summer heatwaves reported in previous studies.
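The probability ratio underlying such statements can be sketched as follows (a minimal illustration with synthetic Gaussian data; the distributions, thresholds and sample sizes are invented and this is not the study's method or data):

```python
import numpy as np

# Hedged sketch of the probability ratio PR = P1/P0 used in event
# attribution: P1 is the exceedance probability in the factual (forced)
# climate, P0 in the counterfactual climate.
rng = np.random.default_rng(0)

def probability_ratio(factual, counterfactual, threshold):
    p1 = np.mean(factual >= threshold)         # factual exceedance prob.
    p0 = np.mean(counterfactual >= threshold)  # counterfactual prob.
    return p1 / p0

n = 1_000_000
# A +1 K forced shift; a seasonal mean has much lower variability than a
# short-duration extreme, so the same threshold sits "further out".
short_f, short_c = rng.normal(1.0, 2.0, n), rng.normal(0.0, 2.0, n)
season_f, season_c = rng.normal(1.0, 0.5, n), rng.normal(0.0, 0.5, n)

pr_short = probability_ratio(short_f, short_c, threshold=2.0)
pr_season = probability_ratio(season_f, season_c, threshold=2.0)
# Lower interannual variability at the longer time scale yields the
# larger probability ratio, as described above.
```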

## Euro-Atlantic weather regimes in the PRIMAVERA coupled climate simulations: impact of resolution and mean state biases on model performance

Climate Dynamics, Springer Science and Business Media LLC **54** (2020) 5031-5048

## Machine Learning for Stochastic Parameterization: Generative Adversarial Networks in the Lorenz’96 Model

Journal of Advances in Modeling Earth Systems, American Geophysical Union **12** (2020) e2019MS001896

Stochastic parameterizations account for uncertainty in the representation of unresolved subgrid processes by sampling from the distribution of possible subgrid forcings. Some existing stochastic parameterizations utilize data‐driven approaches to characterize uncertainty, but these approaches require significant structural assumptions that can limit their scalability. Machine learning models, including neural networks, are able to represent a wide range of distributions and build optimized mappings between a large number of inputs and subgrid forcings. Recent research on machine learning parameterizations has focused only on deterministic parameterizations. In this study, we develop a stochastic parameterization using the generative adversarial network (GAN) machine learning framework. The GAN stochastic parameterization is trained and evaluated on output from the Lorenz '96 model, which is a common baseline model for evaluating both parameterization and data assimilation techniques. We evaluate different ways of characterizing the input noise for the model and perform model runs with the GAN parameterization at weather and climate time scales. Some of the GAN configurations perform better than a baseline bespoke parameterization at both time scales, and the networks closely reproduce the spatiotemporal correlations and regimes of the Lorenz '96 system. We also find that, in general, those models which produce skillful forecasts are also associated with the best climate simulations.
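For context, the Lorenz '96 testbed named above can be sketched in a few lines (a generic single-level implementation, not the paper's code; the dimension, forcing and step size are illustrative). A stochastic parameterization such as the paper's GAN would replace the constant forcing with a learned, state-dependent subgrid term sampled at each step:

```python
import numpy as np

# Generic single-level Lorenz '96 system (illustrative parameters).
K, F = 40, 8.0  # state dimension and forcing, classic chaotic regime

def l96_tendency(x, forcing):
    # dx_k/dt = (x_{k+1} - x_{k-2}) * x_{k-1} - x_k + forcing
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, forcing):
    # Standard fourth-order Runge-Kutta step
    k1 = l96_tendency(x, forcing)
    k2 = l96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = l96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = l96_tendency(x + dt * k3, forcing)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

x = F * np.ones(K)
x[0] += 0.01           # small perturbation to trigger chaos
for _ in range(2000):  # spin up onto the attractor
    x = rk4_step(x, 0.005, F)
```

In the two-tier version used for parameterization studies, the constant `F` is augmented by a coupling term from fast small-scale variables; that coupling term is the "subgrid forcing" whose distribution the GAN is trained to sample.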

## Short-term tests validate long-term estimates of climate change

Nature **582** (2020) 185-186

## Reduced-precision parametrization: lessons from an intermediate-complexity atmospheric model

Quarterly Journal of the Royal Meteorological Society, Wiley (2020)

Reducing numerical precision can save computational cost, which can then be reinvested for more useful purposes. This study considers the effects of reducing precision in the parametrizations of an intermediate-complexity atmospheric model (SPEEDY). We find that the difference between double-precision and reduced-precision parametrization tendencies is proportional to the expected machine rounding error if individual timesteps are considered. However, if reduced precision is used in simulations that are compared to double-precision simulations, a range of precision is found where differences are approximately the same for all simulations. Here, rounding errors are small enough to not directly perturb the model dynamics, but can perturb conditional statements in the parametrizations (such as convection active/inactive), leading to a similar error growth for all runs. For lower precision, simulations are perturbed significantly. Precision cannot be constrained without some quantification of the uncertainty. The inherent uncertainty in numerical weather and climate models is often explicitly considered in simulations by stochastic schemes that randomly perturb the parametrizations. A commonly used scheme is stochastic perturbation of parametrization tendencies (SPPT). A strong test on whether a precision is acceptable is whether a low-precision ensemble produces the same probability distribution as a double-precision ensemble where the only difference between ensemble members is the model uncertainty (i.e., the random seed in SPPT). Tests with SPEEDY suggest a precision as low as 3.5 decimal places (equivalent to half precision) could be acceptable, which is surprisingly close to the lowest precision that produces similar error growth in the experiments without SPPT mentioned above. Minor changes to model code to express variables as anomalies rather than absolute values reduce rounding errors and low-precision biases, allowing even lower precision to be used. These results provide a pathway for implementing reduced-precision parametrizations in more complex weather and climate models.
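Two of the abstract's ingredients, emulated reduced precision and the anomaly trick, can be sketched as follows (an illustration with an invented `round_to_bits` helper, not the SPEEDY implementation; the field values and bit budget are made up):

```python
import numpy as np

# Hedged sketch: emulate reduced precision by rounding to t significand
# bits, then show that storing a field as reference + anomaly wastes
# fewer bits on the large constant offset.

def round_to_bits(x, t):
    """Round nonzero x to roughly t significand bits via power-of-two
    scaling (a simple emulator, not IEEE half precision)."""
    x = np.asarray(x, dtype=np.float64)
    expo = np.floor(np.log2(np.abs(x)))   # exponent of each value
    scale = 2.0 ** (t - 1 - expo)
    return np.round(x * scale) / scale

temp = np.array([288.15, 288.17, 288.12])  # absolute temperatures (K)
ref = 288.0                                # fixed reference state

# Spend the same 10-bit budget on the full field vs. the anomaly only.
absolute = round_to_bits(temp, 10)
anomaly = ref + round_to_bits(temp - ref, 10)

err_abs = np.max(np.abs(absolute - temp))
err_ano = np.max(np.abs(anomaly - temp))
# The anomaly formulation keeps absolute rounding error far smaller.
```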

## Revisiting the Identification of Wintertime Atmospheric Circulation Regimes in the Euro-Atlantic Sector

Quarterly Journal of the Royal Meteorological Society, Wiley (2020) qj.3818

## Assessing the robustness of multidecadal variability in Northern Hemisphere wintertime seasonal forecast skill

Quarterly Journal of the Royal Meteorological Society, Wiley (2020) qj.3890

## Constraining stochastic parametrisation schemes using high-resolution simulations

Quarterly Journal of the Royal Meteorological Society, Wiley (2019)

## Jet Speed Variability Obscures Euro‐Atlantic Regime Structure

Geophysical Research Letters, American Geophysical Union (AGU) **47** (2020)

## Beyond skill scores: exploring sub‐seasonal forecast value through a case‐study of French month‐ahead energy prediction

Quarterly Journal of the Royal Meteorological Society, Wiley (2020) qj.3863

## Discretization of the Bloch sphere, fractal invariant sets and Bell's theorem

Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences **476** (2020) 20190350

An arbitrarily dense discretization of the Bloch sphere of complex Hilbert states is constructed, where points correspond to bit strings of fixed finite length. Number-theoretic properties of trigonometric functions (not part of the quantum-theoretic canon) are used to show that this constructive discretized representation incorporates many of the defining characteristics of quantum systems: complementarity, uncertainty relationships and (with a simple Cartesian product of discretized spheres) entanglement. Unlike Meyer's earlier discretization of the Bloch sphere, there are no orthonormal triples, hence the Kochen-Specker theorem is not nullified. A physical interpretation of points on the discretized Bloch sphere is given in terms of ensembles of trajectories on a dynamically invariant fractal set in state space, where states of physical reality correspond to points on the invariant set. This deterministic construction provides a new way to understand the violation of the Bell inequality without violating statistical independence or factorization, where these conditions are defined solely from states on the invariant set. In this finite representation, there is an upper limit to the number of qubits that can be entangled, a property with potential experimental consequences.

## Single-precision in the tangent-linear and adjoint models of incremental 4D-Var

Monthly Weather Review, American Meteorological Society **148** (2020) 1541-1552

The use of single-precision arithmetic in ECMWF's forecasting model gave a 40% reduction in wall-clock time relative to double precision, with no decrease in forecast quality. However, the use of reduced precision in 4D-Var data assimilation is relatively unexplored, and there are potential issues with using single precision in the tangent-linear and adjoint models. Here, we present the results of reducing numerical precision in an incremental 4D-Var data assimilation scheme with an underlying two-layer quasigeostrophic model. The minimizer used is the conjugate gradient method. We show how reducing precision increases the asymmetry between the tangent-linear and adjoint models. For ill-conditioned problems, this leads to a loss of orthogonality among the residuals of the conjugate gradient algorithm, which slows the convergence of the minimization procedure. However, we also show that a standard technique, reorthogonalization, eliminates these issues and could therefore allow the use of single-precision arithmetic. This work is carried out within ECMWF's data assimilation framework, the Object-Oriented Prediction System (OOPS).
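The reorthogonalization fix can be sketched generically (a textbook conjugate-gradient loop with optional modified Gram-Schmidt against stored residuals, not ECMWF's OOPS code; the problem size and conditioning are invented):

```python
import numpy as np

# Hedged sketch: conjugate gradients with explicit reorthogonalization.
# In reduced precision, rounding error destroys the mutual orthogonality
# of CG residuals on ill-conditioned problems; re-projecting each new
# residual against the stored history restores it.

def cg(A, b, iters, reorthogonalize=False):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    basis = [r / np.linalg.norm(r)]   # normalized residual history
    for _ in range(iters):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        if reorthogonalize:           # modified Gram-Schmidt vs history
            for q in basis:
                r_new = r_new - (r_new @ q) * q
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
        nrm = np.linalg.norm(r)
        if nrm < 1e-9 * np.linalg.norm(b):
            break
        basis.append(r / nrm)
    return x

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.normal(size=(50, 50)))
A = Q @ np.diag(np.logspace(0, 6, 50)) @ Q.T  # SPD, condition ~1e6
b = rng.normal(size=50)
x = cg(A, b, iters=100, reorthogonalize=True)
```

Storing the residual history costs memory, which is why reorthogonalization is a deliberate trade-off rather than a default in large assimilation systems.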

## Rethinking Superdeterminism

Frontiers in Physics **8** (2020) 139

Quantum mechanics has irked physicists ever since its conception more than 100 years ago. While some of the misgivings, such as it being unintuitive, are merely aesthetic, quantum mechanics has one serious shortcoming: it lacks a physical description of the measurement process. This “measurement problem” indicates that quantum mechanics is at least an incomplete theory—good as far as it goes, but missing a piece—or, more radically, is in need of complete overhaul. Here we describe an approach which may provide this sought-for completion or replacement: Superdeterminism. A superdeterministic theory is one which violates the assumption of Statistical Independence (that distributions of hidden variables are independent of measurement settings). Intuition suggests that Statistical Independence is an essential ingredient of any theory of science (never mind physics), and for this reason Superdeterminism is typically discarded swiftly in any discussion of quantum foundations. The purpose of this paper is to explain why the existing objections to Superdeterminism are based on experience with classical physics and linear systems, but that this experience misleads us. Superdeterminism is a promising approach not only to solve the measurement problem, but also to understand the apparent non-locality of quantum physics. Most importantly, we will discuss how it may be possible to test this hypothesis in an (almost) model independent way.

## The physics of numerical analysis: a climate modelling case study

Philosophical Transactions A: Mathematical, Physical and Engineering Sciences, Royal Society **378** (2020) 20190058

The case is made for a much closer synergy between climate science, numerical analysis and computer science. This article is part of a discussion meeting issue 'Numerical algorithms for high-performance computational science'.

## Human creativity and consciousness: unintended consequences of the brain's extraordinary energy efficiency?

Entropy, MDPI **22** (2020) 281

It is proposed that both human creativity and human consciousness are (unintended) consequences of the human brain's extraordinary energy efficiency. The topics of creativity and consciousness are treated separately, though they share a common sub-structure. It is argued that creativity arises from a synergy between two cognitive modes of the human brain (which broadly coincide with Kahneman's Systems 1 and 2). In the first, available energy is spread across a relatively large network of neurons, many of which are small enough to be susceptible to thermal (ultimately quantum decoherent) noise. In the second, available energy is focussed on a smaller subset of larger neurons whose action is deterministic. Possible implications for creative computing in silicon are discussed. Starting with a discussion of the concept of free will, the notion of consciousness is defined in terms of an awareness of what are perceived to be nearby counterfactual worlds in state space. It is argued that such awareness arises from an interplay between memories on the one hand, and quantum physical mechanisms (where, unlike in classical physics, nearby counterfactual worlds play an indispensable dynamical role) in the ion channels of neural networks, on the other. As with the brain's susceptibility to noise, it is argued that in situations where quantum physics plays a role in the brain, it does so for reasons of energy efficiency. As an illustration of this definition of consciousness, a novel proposal is outlined as to why quantum entanglement appears to be so counter-intuitive.

## Continuous Structural Parameterization: A proposed method for representing different model parameterizations within one structure demonstrated for atmospheric convection

Journal of Advances in Modeling Earth Systems, American Geophysical Union (AGU) (2020) e2020MS002085

## Optimising the use of ensemble information in numerical weather forecasts of wind power generation

Environmental Research Letters, IOP Publishing **14** (2019) 124086

## The Impact of a Stochastic Parameterization Scheme on Climate Sensitivity in EC-Earth

Journal of Geophysical Research: Atmospheres **124** (2019) 12726-12740

Stochastic schemes, designed to represent unresolved subgrid-scale variability, are frequently used in short- and medium-range weather forecasts, where they are found to improve several aspects of the model. In recent years, the impact of stochastic physics has also been found to be beneficial for a model's long-term climate. In this paper, we demonstrate for the first time that the inclusion of a stochastic physics scheme can notably affect a model's projection of global warming, as well as its historical climatological global temperature. Specifically, we find that when including the “stochastically perturbed parametrization tendencies” (SPPT) scheme in the fully coupled climate model EC-Earth v3.1, the predicted level of global warming between 1850 and 2100 is reduced by 10% under an RCP8.5 forcing scenario. We link this reduction in climate sensitivity to a change in cloud feedbacks with SPPT. In particular, the scheme appears to reduce the positive low-cloud-cover feedback and increase the negative cloud optical feedback. A key role is played by a robust, rapid increase in cloud liquid water with SPPT, which we speculate is due to the scheme's nonlinear interaction with condensation.
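The SPPT idea can be sketched generically (an illustrative AR(1)-in-time multiplicative perturbation of a stand-in tendency, not EC-Earth's scheme; all parameter values are invented):

```python
import numpy as np

# Hedged sketch of SPPT: multiply the net parametrized tendency by
# (1 + r), where r evolves as an AR(1) process so the perturbation is
# correlated in time rather than white noise.
rng = np.random.default_rng(0)

n_points, n_steps = 64, 500
phi, sigma = 0.95, 0.2             # AR(1) memory and pattern std. dev.
r = np.zeros(n_points)
perturbed_means = []

for _ in range(n_steps):
    # AR(1) update with stationary variance sigma**2
    r = phi * r + np.sqrt(1.0 - phi**2) * rng.normal(0.0, sigma, n_points)
    tendency = np.ones(n_points)                 # stand-in net tendency
    perturbed = (1.0 + np.clip(r, -1.0, 1.0)) * tendency
    perturbed_means.append(perturbed.mean())

# The perturbation is (close to) unbiased in the mean while adding
# variability that is correlated from one timestep to the next.
```

Because the scheme multiplies the whole tendency, its effect on nonlinear processes such as condensation need not average out, which is the mechanism the abstract invokes for the change in cloud liquid water.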