Publications


Forecast skill of the Indian monsoon and its onset in the ECMWF seasonal forecasting system 5 (SEAS5)

Climate Dynamics Springer (2021)

A Chevuturi, A Turner, S Johnson, A Weisheimer, J Shonk, T Stockdale, R Senan

Accurate forecasting of variations in Indian monsoon precipitation and progression on seasonal time scales remains a challenge for prediction centres. We examine prediction skill for the seasonal-mean Indian summer monsoon and its onset in the European Centre for Medium-Range Weather Forecasts (ECMWF) seasonal forecasting system 5 (SEAS5). We analyse summer hindcasts initialised on the 1st of May, with 51 ensemble members, for the 36-year period 1981–2016. We evaluate the hindcasts against Global Precipitation Climatology Project (GPCP) precipitation observations and the ECMWF reanalysis 5 (ERA5). The model has significant skill at forecasting dynamical features of the large-scale monsoon, and the tercile category of the local-scale monsoon onset, one month in advance. SEAS5 shows higher skill for monsoon features calculated using large-scale indices than for those at smaller scales. Our results also highlight possible model deficiencies in forecasting the all-India monsoon rainfall.
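
The tercile-category onset verification described above can be illustrated with a minimal sketch (synthetic onset-date anomalies stand in for the GPCP/ERA5 data; the climatological thresholds and the majority-vote rule are illustrative assumptions, not the paper's exact procedure):

    import numpy as np

    rng = np.random.default_rng(0)
    n_years, n_members = 36, 51                       # 1981-2016, 51-member hindcasts
    obs = rng.normal(size=n_years)                    # toy observed onset-date anomalies
    fcst = obs[:, None] + rng.normal(size=(n_years, n_members))  # toy ensemble

    # Assign each year to a tercile category (early/normal/late onset) using
    # thresholds estimated from the observed climatology.
    lo, hi = np.quantile(obs, [1 / 3, 2 / 3])
    obs_cat = np.digitize(obs, [lo, hi])              # 0, 1 or 2

    # Forecast category = category receiving the most ensemble members.
    fcst_cat = np.array([np.bincount(np.digitize(f, [lo, hi]), minlength=3).argmax()
                         for f in fcst])

    hit_rate = (fcst_cat == obs_cat).mean()
    print(f"tercile hit rate: {hit_rate:.2f} (1/3 expected by chance)")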


The representation of winter Northern Hemisphere atmospheric blocking in the ECMWF seasonal prediction systems

Quarterly Journal of the Royal Meteorological Society Wiley (2021) qj.3974

P Davini, A Weisheimer, M Balmaseda, SJ Johnson, F Molteni, CD Roberts, R Senan, TN Stockdale


Impact of stochastic physics and model resolution on the simulation of tropical cyclones in climate GCMs

Journal of Climate American Meteorological Society (2021)

P Vidale, K Hodges, B Vanniere, P Davini, M Roberts, K Strommen, A Weisheimer, E Plesca, S Corti

The role of model resolution in simulating geophysical vortices with the characteristics of realistic tropical cyclones (TCs) is well established. The push for increasing resolution continues, with general circulation models (GCMs) starting to use sub-10 km grid spacing. In the same context, it has been suggested that the use of stochastic physics (SP) may act as a surrogate for high resolution, providing some of the benefits at a fraction of the cost. Either technique can reduce model uncertainty and enhance reliability by providing a more dynamic environment for initial synoptic disturbances to be spawned and to grow into TCs. We present results from a systematic comparison of the role of model resolution and SP in the simulation of TCs, using EC-Earth simulations from project Climate-SPHINX, in large-ensemble mode, spanning five different resolutions. All tropical cyclonic systems, including TCs, were tracked explicitly. As in previous studies, the number of simulated TCs increases with the use of higher resolution, but SP further enhances TC frequencies by ≈30%, in a strikingly similar way. The use of SP is beneficial for removing systematic climate biases, albeit not consistently so for interannual variability; conversely, the use of SP improves the simulation of the seasonal cycle of TC frequency. An investigation of the mechanisms behind this response indicates that SP generates both higher TC (and TC seed) genesis rates and more suitable environmental conditions, enabling a more efficient transition of TC seeds into TCs. These results were confirmed by the use of equivalent simulations with the HadGEM3-GC31 GCM.


OpenEnsemble 1.0: a boon for the research community

Copernicus GmbH (2020)

H Christensen


Representing model uncertainty in multi‐annual predictions

Geophysical Research Letters Wiley (2020)

DJ Befort, C O'Reilly, A Weisheimer

The most prominent way to account for model uncertainty is through the pragmatic combination of simulations from individual climate models into a multi-model ensemble (MME). However, alternative approaches to represent intrinsic model errors within single-model ensembles (SMEs) using stochastic parameterisations have proven beneficial in numerical weather prediction. Nevertheless, stochastic parameterisations are not included in most current decadal prediction systems. Here, the effect of the stochastically perturbed physical tendency scheme (SPPT) is examined in 28-month predictions using ECMWF’s forecast model and contrasted with an MME constructed from current decadal prediction systems. Compared to SMEs, SPPT improves the skill and reliability of tropical SST forecasts during the first 18 months (similar to the MME). Thus, stochastic schemes can be an effective and low-cost alternative, used separately or in conjunction with multi-model combination, to improve the reliability of climate predictions on multi-annual time scales.
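
For readers unfamiliar with SPPT, the toy sketch below shows the core idea: the net parameterised tendency is multiplied by (1 + r), where r is a temporally correlated random perturbation. The relaxation "physics" and all coefficients are invented for illustration and are not ECMWF's operational settings:

    import numpy as np

    rng = np.random.default_rng(1)
    dt, n_steps = 900.0, 96            # 15-minute steps, one model day
    tau, sigma = 6 * 3600.0, 0.5       # decorrelation time and std of the pattern
    phi = np.exp(-dt / tau)            # AR(1) autocorrelation per step

    x, r = 280.0, 0.0                  # toy scalar state and its perturbation
    for _ in range(n_steps):
        physics_tendency = -1e-5 * (x - 285.0)   # stand-in parameterised tendency
        # SPPT: evolve r as an AR(1) process and rescale the whole tendency.
        r = phi * r + np.sqrt(1 - phi**2) * sigma * rng.normal()
        x += dt * (1.0 + np.clip(r, -1.0, 1.0)) * physics_tendency
    print(f"perturbed state after one day: {x:.3f}")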


Calibrating large-ensemble European climate projections using observational data

Earth System Dynamics Copernicus Publications 11 (2020) 1033-1049

C O'Reilly, D Befort, A Weisheimer

This study examines methods of calibrating projections of future regional climate using large single-model ensembles (the CESM Large Ensemble and the MPI Grand Ensemble), applied over Europe. The three calibration methods tested here are more commonly used for initialised forecasts from weekly up to seasonal timescales. The calibration techniques are applied to ensemble climate projections, fitting seasonal ensemble data to observations over a reference period (1920–2016). The calibration methods were tested and verified with an imperfect-model approach based on the historical/RCP8.5 simulations from the CMIP5 archive. All the calibration methods exhibit a similar performance, generally improving the out-of-sample projections in comparison to the uncalibrated (bias-corrected) ensemble. The calibration methods give results that are largely indistinguishable from one another, so the simplest of these methods, namely homogeneous Gaussian regression, is used for the subsequent analysis. An extension to this method – applying it to dynamically decomposed data, in which the underlying data are separated into dynamical and residual components – is also tested. The verification indicates that this calibration method produces more reliable and accurate projections than the uncalibrated ensemble for future climate over Europe. The calibrated projections for temperature demonstrate a particular improvement, whereas the projections for changes in precipitation generally remain fairly unreliable. When the two large ensembles are calibrated using observational data, the climate projections for Europe are far more consistent between the two ensembles, with both projecting a reduction in warming but a general increase in the uncertainty of the projected changes.
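
The homogeneous Gaussian regression idea can be sketched briefly: regress the observations on the ensemble mean over the reference period, then rescale the ensemble spread towards the regression residual variance. The estimator below is a simplified assumption for illustration, not the paper's exact implementation:

    import numpy as np

    def hgr_calibrate(ens, obs):
        """Sketch of homogeneous-Gaussian-regression-style recalibration.

        ens: (n_years, n_members) raw ensemble; obs: (n_years,) observations.
        """
        m = ens.mean(axis=1)                          # ensemble mean per year
        beta = np.cov(m, obs)[0, 1] / m.var(ddof=1)   # regress obs on the mean
        alpha = obs.mean() - beta * m.mean()
        resid_var = np.var(obs - (alpha + beta * m), ddof=1)
        spread_var = ens.var(axis=1, ddof=1).mean()   # mean intra-ensemble variance
        gamma = np.sqrt(resid_var / spread_var)       # inflate/deflate the spread
        return (alpha + beta * m)[:, None] + gamma * (ens - m[:, None])

    rng = np.random.default_rng(2)
    obs = rng.normal(size=97)                             # 1920-2016 reference period
    ens = 0.5 * obs[:, None] + rng.normal(size=(97, 40))  # toy overconfident ensemble
    print(hgr_calibrate(ens, obs).shape)                  # (97, 40)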


Number formats, error mitigation, and scope for 16‐bit arithmetics in weather and climate modeling analyzed with a shallow water model

Journal of Advances in Modeling Earth Systems American Geophysical Union 12 (2020) e2020MS002246

P Düben, T Palmer

The need for high‐precision calculations with 64‐bit or 32‐bit floating‐point arithmetic for weather and climate models is questioned. Lower‐precision numbers can accelerate simulations and are increasingly supported by modern computing hardware. This paper investigates the potential of 16‐bit arithmetic when applied within a shallow water model that serves as a medium complexity weather or climate application. There are several 16‐bit number formats that can potentially be used (IEEE half precision, BFloat16, posits, integer, and fixed‐point). It is evident that a simple change to 16‐bit arithmetic will not be possible for complex weather and climate applications as it will degrade model results by intolerable rounding errors that cause a stalling of model dynamics or model instabilities. However, if the posit number format is used as an alternative to the standard floating‐point numbers, the model degradation can be significantly reduced. Furthermore, mitigation methods, such as rescaling, reordering, and mixed precision, are available to make model simulations resilient against a precision reduction. If mitigation methods are applied, 16‐bit floating‐point arithmetic can be used successfully within the shallow water model. The results show the potential of 16‐bit formats for at least parts of complex weather and climate models where rounding errors would be entirely masked by initial condition, model, or discretization error.
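
The stalling of model dynamics under naive 16-bit arithmetic, and its mitigation, can be reproduced in a few lines. The toy below accumulates small tendencies in numpy's IEEE half precision rather than running the shallow water model itself, and uses compensated (Kahan) summation as a stand-in for the paper's broader family of mitigation methods:

    import numpy as np

    # Time stepping accumulates small increments; in float16 the running sum
    # stalls once an increment falls below half a unit in the last place.
    increments = np.full(10_000, 0.0001, dtype=np.float16)   # exact sum ~= 1.0

    naive = np.float16(0.0)
    for inc in increments:
        naive = np.float16(naive + inc)        # rounds after every addition

    # Mitigation: Kahan summation carries the lost low-order bits forward.
    total, carry = np.float16(0.0), np.float16(0.0)
    for inc in increments:
        y = np.float16(inc - carry)
        t = np.float16(total + y)
        carry = np.float16((t - total) - y)
        total = t

    print(f"naive: {float(naive):.3f}  compensated: {float(total):.3f}  (target ~1.0)")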


Assessing the robustness of multidecadal variability in Northern Hemisphere wintertime seasonal forecast skill

Quarterly Journal of the Royal Meteorological Society Wiley 146 (2020) qj.3890

CH O'Reilly, A Weisheimer, D MacLeod, DJ Befort, T Palmer

Recent studies have found evidence of multidecadal variability in Northern Hemisphere wintertime seasonal forecast skill. Here we assess the robustness of this finding by extending the analysis to a diverse set of ensemble atmospheric model simulations. These simulations differ in numerical model or type of initialisation, and include atmospheric model experiments initialised with reanalysis data as well as free-running atmospheric model ensembles. All ensembles are forced with observed SST and sea-ice boundary conditions. Analysis of large-scale circulation indices over the Northern Hemisphere (namely the North Atlantic Oscillation, the Pacific North American pattern and the Arctic Oscillation) reveals that in all ensembles there is larger correlation skill in the late-century period than in the mid-century period. Similar multidecadal variability in skill is found in a measure of total skill integrated over the whole of the extratropics. Most of the differences in large-scale circulation skill between the skillful late period (as well as the early period) and the less skillful mid-century period seem to be due to a reduction in skill over the North Pacific and a disappearance of skill over North America and the North Atlantic. The results are robust across different models and different types of initialisation, indicating that the multidecadal variability in Northern Hemisphere winter skill is a robust feature of 20th-century climate variability. Multidecadal variability in skill therefore arises from the evolution of the observed SSTs, likely related to a weakened influence of ENSO on the predictable extratropical circulation signal during the middle of the 20th century, and is evident in the signal-to-noise ratio of the different ensembles, particularly the larger ensembles.


Beyond skill scores: exploring sub‐seasonal forecast value through a case‐study of French month‐ahead energy prediction

Quarterly Journal of the Royal Meteorological Society Wiley (2020) qj.3863

J Dorrington, I Finney, T Palmer, A Weisheimer


Constraining projections using decadal predictions

Geophysical Research Letters American Geophysical Union 47 (2020) e2020GL087900

DJ Befort, C O'Reilly, A Weisheimer

There is increasing demand for robust, reliable and actionable climate information for the next 1 to 50 years. This is challenging for the scientific community, as the longest initialized predictions are limited to 10 years (decadal predictions). Thus, to provide seamless information for the upcoming 50 years, information from decadal predictions and uninitialized projections needs to be merged. In this study, the ability to obtain valuable climate information beyond decadal time scales by constraining uninitialized projections using decadal predictions is assessed. Applying this framework to surface temperatures over the North Atlantic subpolar gyre region shows that the constrained uninitialized sub-ensemble has higher skill than the overall projection ensemble, even beyond ten years when information from decadal predictions is no longer available. While this demonstrates the potential of such a constraining approach to obtain climate information for the near-term future, its utility depends on the added value of initialization.
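
A minimal sketch of the constraining step: select the projection members that best track an initialised decadal prediction over the overlap window and carry that sub-ensemble beyond year ten. The RMSE metric and the sub-ensemble size are illustrative assumptions, not the paper's configuration:

    import numpy as np

    rng = np.random.default_rng(3)
    n_proj, n_years = 100, 50
    projections = np.cumsum(rng.normal(0.02, 0.1, (n_proj, n_years)), axis=1)
    decadal = np.cumsum(rng.normal(0.02, 0.1, 10))       # initialised years 1-10

    # Keep the members closest to the decadal prediction over the overlap.
    rmse = np.sqrt(((projections[:, :10] - decadal) ** 2).mean(axis=1))
    constrained = projections[np.argsort(rmse)[:20]]     # constrained sub-ensemble

    # Beyond year 10 the selection still conditions the projection spread.
    print("year-20 spread, full vs constrained:",
          round(float(projections[:, 19].std()), 3),
          round(float(constrained[:, 19].std()), 3))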


The value of initialisation on decadal timescales: state-dependent predictability in the CESM Decadal Prediction Large Ensemble

Journal of Climate American Meteorological Society 33 (2020) 7353-7370

H Christensen, J Berner, S Yeager

Information in decadal climate prediction arises from a well-initialised ocean state and from the predicted response to an external forcing. The length of time over which the initial conditions benefit the decadal forecast depends on the start date of the forecast. We characterise this state-dependent predictability for decadal forecasts of upper-ocean heat content in the Community Earth System Model. We find regionally dependent initial-condition predictability, with extended predictability generally observed in the extratropics. We also detect state-dependent predictability, with the year of loss of information from the initialisation varying between start dates. The decadal forecasts in the North Atlantic show substantial information from the initial conditions beyond the ten-year forecast window, and a high degree of state-dependent predictability. We find some evidence for state-dependent predictability in the ensemble spread in this region, similar to that seen in weather and subseasonal-to-seasonal forecasts. For some start dates, an increase of information with lead time is observed, for which the initialised forecasts predict a growing phase of the Atlantic Multidecadal Oscillation. Finally, we consider the information in the forecast from the initial conditions relative to the forced response, and quantify the crossover timescale after which the forcing provides more information. We demonstrate that the climate change signal projects onto different patterns than the signal from the initial conditions. This means that even after the crossover timescale has been reached in a basin-averaged sense, the benefits of initialisation can be felt locally on longer timescales.


Jet Speed Variability Obscures Euro‐Atlantic Regime Structure

Geophysical Research Letters American Geophysical Union 47 (2020)

J Dorrington, K Strommen


Continuous structural parameterization: a proposed method for representing different model parameterizations within one structure demonstrated for atmospheric convection

Journal of Advances in Modeling Earth Systems American Geophysical Union 12 (2020) e2020MS002085

F Lambert, P Challenor, N Lewis, D McNeall, N Owen, I Boutle, H Christensen, R Keane, N Mayne, A Stirling, M Webb

Continuous structural parameterization (CSP) is a proposed method for approximating different numerical model parameterizations of the same process as functions of the same grid‐scale variables. This allows systematic comparison of parameterizations with each other and observations or resolved simulations of the same process. Using the example of two convection schemes running in the Met Office Unified Model (UM), we show that a CSP is able to capture concisely the broad behavior of the two schemes, and differences between the parameterizations and resolved convection simulated by a high resolution simulation. When the original convection schemes are replaced with their CSP emulators within the UM, basic features of the original model climate and some features of climate change are reproduced, demonstrating that CSP can capture much of the important behavior of the schemes. Our results open the possibility that future work will estimate uncertainty in model projections of climate change from estimates of uncertainty in simulation of the relevant physical processes.
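
A toy version of the CSP idea: two hypothetical convection schemes, treated as black-box functions of the same grid-scale variables, are represented as coefficient vectors within one shared polynomial structure. The predictors, schemes and basis are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.uniform(0.0, 1.0, (500, 2))    # toy grid-scale predictors (e.g. CAPE, humidity)

    def scheme_a(x):                       # stand-ins for two schemes' heating rates
        return 2.0 * x[:, 0] + 0.5 * x[:, 0] * x[:, 1]

    def scheme_b(x):
        return 1.5 * x[:, 0] ** 2 + 0.8 * x[:, 1]

    # One shared structure; each scheme becomes a point in coefficient space,
    # which makes the schemes directly comparable with each other.
    basis = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                             X[:, 0] ** 2, X[:, 0] * X[:, 1]])
    coeffs = {name: np.linalg.lstsq(basis, f(X), rcond=None)[0]
              for name, f in [("A", scheme_a), ("B", scheme_b)]}
    print({k: v.round(2) for k, v in coeffs.items()})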


An analysis of ways to decarbonize conference travel after COVID-19

Nature Nature Research 583 (2020) 356-360

M Klöwer, D Hopkins, M Allen, J Higham


Revisiting the identification of wintertime atmospheric circulation regimes in the Euro‐Atlantic sector

Quarterly Journal of the Royal Meteorological Society Wiley (2020) qj.3818

S Falkena, J de Wiljes, A Weisheimer, TG Shepherd

Atmospheric circulation is often clustered in so‐called circulation regimes, which are persistent and recurrent patterns. For the Euro‐Atlantic sector in winter, most studies identify four regimes: the Atlantic Ridge, Scandinavian Blocking and the two phases of the North Atlantic Oscillation. These results are obtained by applying k‐means clustering to the first several empirical orthogonal functions (EOFs) of geopotential height data. Studying the observed circulation in reanalysis data, it is found that when the full field data are used for the k‐means cluster analysis instead of the EOFs, the optimal number of clusters is no longer four but six. The two extra regimes that are found are the opposites of the Atlantic Ridge and Scandinavian Blocking, meaning they have a low‐pressure area roughly where the original regimes have a high‐pressure area. This introduces an appealing symmetry in the clustering result. Incorporating a weak persistence constraint in the clustering procedure is found to lead to a longer duration of regimes, extending beyond the synoptic time‐scale, without changing their occurrence rates. This is in contrast to the commonly used application of a time‐filter to the data before the clustering is executed, which, while increasing the persistence, changes the occurrence rates of the regimes. We conclude that applying a persistence constraint within the clustering procedure is a better way of stabilizing the clustering results than low‐pass filtering the data.
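
The persistence-constrained clustering can be sketched as k-means with an extra assignment cost for switching regime between consecutive days. The penalty formulation below is a simplified assumption in the spirit of the paper, not its exact algorithm:

    import numpy as np

    def persistent_kmeans(X, k=6, lam=0.1, n_iter=50, seed=0):
        """k-means with a weak persistence constraint; lam=0 recovers k-means."""
        rng = np.random.default_rng(seed)
        centroids = X[rng.choice(len(X), k, replace=False)]
        labels = np.zeros(len(X), dtype=int)
        for _ in range(n_iter):
            d2 = ((X[:, None, :] - centroids[None]) ** 2).sum(-1)  # (T, k) distances
            scale = d2.mean()
            for t in range(len(X)):            # assign sequentially in time
                cost = d2[t] / scale
                if t > 0:                      # penalise switching regimes
                    cost = cost + lam * (np.arange(k) != labels[t - 1])
                labels[t] = cost.argmin()
            for j in range(k):                 # standard centroid update
                if (labels == j).any():
                    centroids[j] = X[labels == j].mean(axis=0)
        return labels, centroids

    X = np.random.default_rng(5).normal(size=(1000, 10))   # stand-in for Z500 fields
    labels, _ = persistent_kmeans(X)
    print("mean regime duration (days):",
          round(len(labels) / (1 + (np.diff(labels) != 0).sum()), 2))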


Euro-Atlantic weather regimes in the PRIMAVERA coupled climate simulations: impact of resolution and mean state biases on model performance

Climate Dynamics Springer Science and Business Media LLC 54 (2020) 5031-5048

F Fabiano, H Christensen, K Strommen, P Athanasiadis, A Baker, R Schiemann, S Corti


Short-term tests validate long-term estimates of climate change

Nature Springer Nature 582 (2020) 185-186

T Palmer


Rethinking superdeterminism

Frontiers in Physics Frontiers 8 (2020) 139

S Hossenfelder, T Palmer

Quantum mechanics has irked physicists ever since its conception more than 100 years ago. While some of the misgivings, such as it being unintuitive, are merely aesthetic, quantum mechanics has one serious shortcoming: it lacks a physical description of the measurement process. This “measurement problem” indicates that quantum mechanics is at least an incomplete theory—good as far as it goes, but missing a piece—or, more radically, is in need of complete overhaul. Here we describe an approach which may provide this sought-for completion or replacement: Superdeterminism. A superdeterministic theory is one which violates the assumption of Statistical Independence (that distributions of hidden variables are independent of measurement settings). Intuition suggests that Statistical Independence is an essential ingredient of any theory of science (never mind physics), and for this reason Superdeterminism is typically discarded swiftly in any discussion of quantum foundations. The purpose of this paper is to explain why the existing objections to Superdeterminism are based on experience with classical physics and linear systems, but that this experience misleads us. Superdeterminism is a promising approach not only to solve the measurement problem, but also to understand the apparent non-locality of quantum physics. Most importantly, we will discuss how it may be possible to test this hypothesis in an (almost) model independent way.


Discretization of the Bloch sphere, fractal invariant sets and Bell's theorem

Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences The Royal Society 476 (2020) 20190350

T Palmer

An arbitrarily dense discretization of the Bloch sphere of complex Hilbert states is constructed, where points correspond to bit strings of fixed finite length. Number-theoretic properties of trigonometric functions (not part of the quantum-theoretic canon) are used to show that this constructive discretized representation incorporates many of the defining characteristics of quantum systems: complementarity, uncertainty relationships and (with a simple Cartesian product of discretized spheres) entanglement. Unlike Meyer's earlier discretization of the Bloch sphere, there are no orthonormal triples, hence the Kochen-Specker theorem is not nullified. A physical interpretation of points on the discretized Bloch sphere is given in terms of ensembles of trajectories on a dynamically invariant fractal set in state space, where states of physical reality correspond to points on the invariant set. This deterministic construction provides a new way to understand the violation of the Bell inequality without violating statistical independence or factorization, where these conditions are defined solely from states on the invariant set. In this finite representation, there is an upper limit to the number of qubits that can be entangled, a property with potential experimental consequences.
