Publications by Tim Palmer


Reduced-precision parametrization: lessons from an intermediate-complexity atmospheric model

Quarterly Journal of the Royal Meteorological Society (2020)

L Saffin, S Hatfield, P Düben, T Palmer

Reducing numerical precision can save computational costs, which can then be reinvested for more useful purposes. This study considers the effects of reducing precision in the parametrizations of an intermediate-complexity atmospheric model (SPEEDY). We find that the difference between double-precision and reduced-precision parametrization tendencies is proportional to the expected machine rounding error if individual timesteps are considered. However, if reduced precision is used in simulations that are compared to double-precision simulations, a range of precision is found where differences are approximately the same for all simulations. Here, rounding errors are small enough not to perturb the model dynamics directly, but can perturb conditional statements in the parametrizations (such as convection active/inactive), leading to a similar error growth for all runs. For lower precision, simulations are perturbed significantly. Precision cannot be constrained without some quantification of the uncertainty. The inherent uncertainty in numerical weather and climate models is often explicitly represented in simulations by stochastic schemes that randomly perturb the parametrizations. A commonly used scheme is stochastic perturbation of parametrization tendencies (SPPT). A strong test of whether a precision is acceptable is whether a low-precision ensemble produces the same probability distribution as a double-precision ensemble in which the only difference between ensemble members is the model uncertainty (i.e., the random seed in SPPT). Tests with SPEEDY suggest a precision as low as 3.5 decimal places (equivalent to half precision) could be acceptable, which is surprisingly close to the lowest precision that produces similar error growth in the experiments without SPPT mentioned above. Minor changes to model code to express variables as anomalies rather than absolute values reduce rounding errors and low-precision biases, allowing even lower precision to be used. These results provide a pathway for implementing reduced-precision parametrizations in more complex weather and climate models.
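
The anomaly trick mentioned at the end of the abstract is easy to illustrate. Below is a minimal sketch (invented numbers, with numpy's float16 standing in for half precision): storing a temperature as an anomaly about a fixed reference keeps the stored magnitude small, so the absolute rounding error incurred when adding a small parametrization tendency is smaller too.

```python
import numpy as np

T_ref = 300.0                         # fixed reference temperature (K)
T_abs = np.float16(302.65)            # absolute value in half precision
T_anom = np.float16(302.65 - T_ref)   # anomaly in half precision

tendency = np.float16(0.01)           # small parametrization tendency (K)
exact = 302.65 + 0.01                 # double-precision reference result

err_abs = abs(float(T_abs + tendency) - exact)
err_anom = abs(T_ref + float(T_anom + tendency) - exact)
print(f"error, absolute form: {err_abs:.5f} K")   # ~0.09 K
print(f"error, anomaly form:  {err_anom:.5f} K")  # ~0.0002 K
```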


Discretization of the Bloch sphere, fractal invariant sets and Bell's theorem

Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 476 (2020) 20190350

TN Palmer

An arbitrarily dense discretization of the Bloch sphere of complex Hilbert states is constructed, where points correspond to bit strings of fixed finite length. Number-theoretic properties of trigonometric functions (not part of the quantum-theoretic canon) are used to show that this constructive discretized representation incorporates many of the defining characteristics of quantum systems: complementarity, uncertainty relationships and (with a simple Cartesian product of discretized spheres) entanglement. Unlike Meyer's earlier discretization of the Bloch sphere, there are no orthonormal triples, hence the Kochen-Specker theorem is not nullified. A physical interpretation of points on the discretized Bloch sphere is given in terms of ensembles of trajectories on a dynamically invariant fractal set in state space, where states of physical reality correspond to points on the invariant set. This deterministic construction provides a new way to understand the violation of the Bell inequality without violating statistical independence or factorization, where these conditions are defined solely from states on the invariant set. In this finite representation, there is an upper limit to the number of qubits that can be entangled, a property with potential experimental consequences.
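
As background for the "number-theoretic properties of trigonometric functions" the abstract invokes, a representative result of this kind (the precise statements used in the paper may differ) is Niven's theorem:

```latex
% Niven's theorem: at rational multiples of \pi, the cosine is
% rational only in five places.
\frac{\theta}{\pi}\in\mathbb{Q} \quad\text{and}\quad \cos\theta\in\mathbb{Q}
\quad\Longrightarrow\quad \cos\theta\in\left\{0,\pm\tfrac{1}{2},\pm 1\right\}
```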


The physics of numerical analysis: a climate modelling case study

Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 378 (2020) 20190058

T Palmer

The case is made for a much closer synergy between climate science, numerical analysis and computer science. This article is part of a discussion meeting issue 'Numerical algorithms for high-performance computational science'.


Human creativity and consciousness: Unintended consequences of the brain's extraordinary energy efficiency?

Entropy 22 (2020)

T Palmer

It is proposed that both human creativity and human consciousness are (unintended) consequences of the human brain's extraordinary energy efficiency. The topics of creativity and consciousness are treated separately, though they have a common sub-structure. It is argued that creativity arises from a synergy between two cognitive modes of the human brain (which broadly coincide with Kahneman's Systems 1 and 2). In the first, available energy is spread across a relatively large network of neurons, many of which are small enough to be susceptible to thermal (ultimately quantum decoherent) noise. In the second, available energy is focussed on a smaller subset of larger neurons whose action is deterministic. Possible implications for creative computing in silicon are discussed. Starting with a discussion of the concept of free will, the notion of consciousness is defined in terms of an awareness of what are perceived to be nearby counterfactual worlds in state space. It is argued that such awareness arises from an interplay between memories on the one hand, and quantum physical mechanisms (where, unlike in classical physics, nearby counterfactual worlds play an indispensable dynamical role) in the ion channels of neural networks, on the other. As with the brain's susceptibility to noise, it is argued that in situations where quantum physics plays a role in the brain, it does so for reasons of energy efficiency. As an illustration of this definition of consciousness, a novel proposal is outlined as to why quantum entanglement appears to be so counter-intuitive.


Optimising the use of ensemble information in numerical weather forecasts of wind power generation

Environmental Research Letters IOP Publishing 14 (2019) 124086

JDY Stanger, I Finney, A Weisheimer, T Palmer

Electricity generation output forecasts for wind farms across Europe use numerical weather prediction (NWP) models. These forecasts influence decisions in the energy market, some of which help determine daily energy prices or the usage of thermal power generation plants. The predictive skill of power generation forecasts has an impact on the profitability of energy trading strategies and the ability to decrease carbon emissions. Probabilistic ensemble forecasts contain valuable information about the uncertainties in a forecast. The energy market typically takes basic approaches to using ensemble data to obtain more skilful forecasts. There is, however, evidence that more sophisticated approaches could yield significant further improvements in forecast skill and utility. In this letter, the application of ensemble forecasting methods to the aggregated electricity generation output for wind farms across Germany is investigated using historical ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF). Multiple methods for producing a single forecast from the ensemble are tried and tested against traditional deterministic methods. All the methods exhibit positive skill, relative to a climatological forecast, out to a lead time of at least seven days. A wind energy trading strategy involving ensemble data is implemented and produces significantly more profit than trading strategies based on single forecasts. It is thus found that ensemble spread is a good predictor of wind power forecast uncertainty and is extremely valuable in informing wind energy trading strategy.
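
As an illustration only (not the authors' method, and with synthetic data), the basic ensemble-use approach the letter improves upon can be sketched in a few lines: collapse the ensemble to its mean for a point forecast and use the spread as a predictor of forecast uncertainty when sizing a trade.

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical ensemble of aggregated wind-power forecasts:
# 51 members x 168 hourly lead times, in GW
ens = rng.gamma(shape=4.0, scale=2.0, size=(51, 168))

point_forecast = ens.mean(axis=0)    # ensemble-mean forecast
spread = ens.std(axis=0, ddof=1)     # spread as an uncertainty predictor

# toy trading rule: commit less volume where relative spread is large
volume = point_forecast / (1.0 + spread / np.maximum(point_forecast, 1e-6))
```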


Experimental Non-Violation of the Bell Inequality

Entropy 20 (2018) 356

TN Palmer


Stochastic weather and climate models

Nature Reviews Physics Springer Science and Business Media LLC 1 (2019) 463-471

TN Palmer


The ECMWF ensemble prediction system: Looking back (more than) 25 years and projecting forward 25 years

Quarterly Journal of the Royal Meteorological Society (2018)

T Palmer

This paper has been written to mark 25 years of operational medium-range ensemble forecasting. The origins of the ECMWF Ensemble Prediction System are outlined, including the development of the precursor real-time Met Office monthly ensemble forecast system. In particular, the reasons for the development of singular vectors and stochastic physics, distinctive features of the ECMWF Ensemble Prediction System, are discussed. The author speculates about the development and use of ensemble prediction in the next 25 years.


How confident are predictability estimates of the winter North Atlantic Oscillation?

Quarterly Journal of the Royal Meteorological Society Wiley 145 (2018) 140-159

A Weisheimer, D Decremer, D Macleod, C O’Reilly, TN Stockdale, S Johnson, T Palmer

Atmospheric seasonal predictability in winter over the Euro-Atlantic region is studied with an emphasis on the signal-to-noise paradox of the North Atlantic Oscillation. Seasonal hindcasts of the ECMWF model for the recent period 1981-2009 show, in agreement with other studies, that correlation skill over Greenland and parts of the Arctic is higher than the signal-to-noise ratio implies. This leads to the paradoxical situation where the real world appears more predictable than the models suggest, with the forecast ensembles being overly dispersive (or underconfident). However, it is demonstrated that these conclusions are not supported by the diagnosed relationship between ensemble-mean RMSE and ensemble spread, which indicates a slight underdispersion (overconfidence). Furthermore, long atmospheric seasonal hindcasts suggest that over the 110-year period from 1900 to 2009 the ensemble system is well calibrated (neither over- nor underdispersive). The observed skill changed drastically in the middle of the 20th century, and regions that appear paradoxical in more recent hindcast periods were strongly underdispersive during mid-century decades.

Due to non-stationarities of the climate system in the form of decadal variability, relatively short hindcasts are not sufficiently representative of longer-term behaviour. In addition, small hindcast sample size can lead to skill estimates, in particular of correlation measures, that are not robust. It is shown that the relative uncertainty due to small hindcast sample size is often larger for correlation-based than for RMSE-based diagnostics. Correlation-based measures like the ratio of predictable components (RPC) are shown to be highly sensitive to the strength of the predictable signal, implying that disentangling physical deficiencies in the models on the one hand, and the effects of sampling uncertainty on the other, is difficult. Given the current lack of a causal physical mechanism to unravel the puzzle, our hypotheses of non-stationarity and sampling uncertainty provide simple yet plausible explanations for the paradox.
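
For reference, a sketch of the RPC diagnostic in its usual form, RPC = r / sqrt(var(ensemble mean) / var(members)), computed here on synthetic data rather than the hindcasts analysed above; an RPC above 1 is the paradox's signature.

```python
import numpy as np

rng = np.random.default_rng(1)
n_years, n_members = 30, 25
signal = rng.standard_normal(n_years)              # shared predictable signal
obs = signal + rng.standard_normal(n_years)        # synthetic observations
ens = 0.3 * signal[:, None] + rng.standard_normal((n_years, n_members))

ens_mean = ens.mean(axis=1)
r = np.corrcoef(obs, ens_mean)[0, 1]               # correlation skill
pc_model = np.sqrt(ens_mean.var(ddof=1) / ens.var(ddof=1))
print(r / pc_model)                                # RPC > 1 by construction here
```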


Scale-Selective Precision for Weather and Climate Forecasting

Monthly Weather Review 147 (2019) 645-655

M Chantry, T Thornes, T Palmer, P Düben


The impact of a stochastic parameterization scheme on climate sensitivity in EC-Earth

Journal of Geophysical Research: Atmospheres American Geophysical Union 124 (2019) 12726-12740

K Strommen, PAG Watson, TN Palmer

Stochastic schemes, designed to represent unresolved sub-grid-scale variability, are frequently used in short- and medium-range weather forecasts, where they are found to improve several aspects of the model. In recent years, the impact of stochastic physics has also been found to be beneficial for the model's long-term climate. In this paper, we demonstrate for the first time that the inclusion of a stochastic physics scheme can notably affect a model's projection of global warming, as well as its historical climatological global temperature. Specifically, we find that when including the 'stochastically perturbed parametrisation tendencies' scheme (SPPT) in the fully coupled climate model EC-Earth v3.1, the predicted level of global warming between 1850 and 2100 is reduced by 10% under an RCP8.5 forcing scenario. We link this reduction in climate sensitivity to a change in the cloud feedbacks with SPPT. In particular, the scheme appears to reduce the positive low cloud cover feedback and increase the negative cloud optical feedback. A key role is played by a robust, rapid increase in cloud liquid water with SPPT, which we speculate is due to the scheme's non-linear interaction with condensation.
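
The core of SPPT, as commonly described, fits in a few lines. This is a deliberately stripped-down sketch: in operational implementations the multiplicative pattern r is spatially and temporally correlated (a spectral pattern with AR(1) evolution) and tapered near the surface and stratosphere, none of which is reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

def sppt_perturb(net_tendency, sigma=0.5, clip=2.0):
    """Multiply the net parametrized tendency by (1 + r), r random and bounded."""
    r = np.clip(sigma * rng.standard_normal(net_tendency.shape), -clip, clip)
    return (1.0 + r) * net_tendency

dT = 1e-5 * rng.standard_normal((64, 128))  # hypothetical tendency field (K/s)
dT_sppt = sppt_perturb(dT)                  # perturbed tendency passed to dynamics
```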


Accelerating high-resolution weather models with deep-learning hardware

PASC '19 Proceedings of the Platform for Advanced Scientific Computing Conference Association for Computing Machinery (2019)

S Hatfield, M Chantry, P Düben, T Palmer

The next generation of weather and climate models will have an unprecedented level of resolution and model complexity, and running these models efficiently will require taking advantage of future supercomputers and heterogeneous hardware. In this paper, we investigate the use of mixed-precision hardware that supports floating-point operations at double-, single- and half-precision. In particular, we investigate the potential use of the NVIDIA Tensor Core, a mixed-precision matrix-matrix multiplier mainly developed for use in deep learning, to accelerate the calculation of the Legendre transforms in the Integrated Forecasting System (IFS), one of the leading global weather forecast models. In the IFS, the Legendre transform is one of the most expensive model components and dominates the computational cost for simulations at a very high resolution. We investigate the impact of mixed-precision arithmetic in IFS simulations of operational complexity through software emulation. Through a targeted but minimal use of double-precision arithmetic we are able to use either half-precision arithmetic or mixed half/single-precision arithmetic for almost all of the calculations in the Legendre transform without affecting forecast skill.
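
The arithmetic pattern being exploited, half-precision inputs with single-precision accumulation, can be mimicked roughly in numpy (this is not ECMWF's emulator, and the random matrix below merely stands in for the Legendre coefficient matrices):

```python
import numpy as np

def tensor_core_like_matmul(a, b):
    """Round inputs to float16, then accumulate products in float32."""
    a32 = a.astype(np.float16).astype(np.float32)
    b32 = b.astype(np.float16).astype(np.float32)
    return a32 @ b32

rng = np.random.default_rng(3)
legendre_like = rng.standard_normal((640, 640))  # stand-in coefficient matrix
fields = rng.standard_normal((640, 32))          # stand-in spectral data

approx = tensor_core_like_matmul(legendre_like, fields)
exact = legendre_like @ fields
print(np.max(np.abs(approx - exact)))            # error from input rounding alone
```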


Progress Towards a Probabilistic Earth System Model: Examining The Impact of Stochasticity in EC-Earth v3.2

Geoscientific Model Development European Geosciences Union 12 (2019) 3099-3118

K Strommen, H Christensen, D Macleod, S Juricke, T Palmer

We introduce and study the impact of three stochastic schemes in the EC-Earth climate model: two atmospheric schemes and one stochastic land scheme. These form the basis for a probabilistic Earth system model in atmosphere-only mode. Stochastic parametrizations have become standard in several operational weather-forecasting models, in particular due to their beneficial impact on model spread. In recent years, stochastic schemes in the atmospheric component of a model have been shown to improve aspects important for the model's long-term climate, such as the El Niño–Southern Oscillation (ENSO), North Atlantic weather regimes, and the Indian monsoon. Stochasticity in the land component has been shown to improve the variability of soil processes and the representation of heatwaves over Europe. However, the raw impact of such schemes on the model mean is less well studied. It is shown that the inclusion of all three schemes notably changes the model mean state. While many of the impacts are beneficial, some are too large in amplitude, leading to significant changes in the model's energy budget and atmospheric circulation. This implies that, in order to maintain the benefits of stochastic physics without shifting the mean state too far from observations, a full re-tuning of the model will typically be required.


The scientific challenge of understanding and estimating climate change

Proceedings of the National Academy of Sciences of the United States of America 116 (2019) 24390-24395

T Palmer, B Stevens

Given the slow unfolding of what may become catastrophic changes to Earth's climate, many are understandably distraught by failures of public policy to rise to the magnitude of the challenge. Few in the science community would think to question the scientific response to the unfolding changes. However, is the science community continuing to do its part to the best of its ability? In the domains where we can have the greatest influence, is the scientific community articulating a vision commensurate with the challenges posed by climate change? We think not.


Signal and noise in regime systems: A hypothesis on the predictability of the North Atlantic Oscillation

Quarterly Journal of the Royal Meteorological Society (2019)

K Strommen, TN Palmer

Studies conducted by the UK Met Office reported significant skill in predicting the winter North Atlantic Oscillation (NAO) index with their seasonal prediction system. At the same time, a very low signal-to-noise ratio was observed, as measured using the “ratio of predictable components” (RPC) metric. We analyse both the skill and signal-to-noise ratio using a new statistical toy model, which assumes NAO predictability is driven by regime dynamics. It is shown that if the system is approximately bimodal in nature, with the model consistently underestimating the level of regime persistence each season, then both the high skill and high RPC value of the Met Office hindcasts can easily be reproduced. Underestimation of regime persistence could be attributable to any number of sources of model error, including imperfect regime structure or errors in the propagation of teleconnections. In particular, a high RPC value for a seasonal mean prediction may be expected even if the model's internal level of noise is realistic.
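
A minimal sketch in the spirit of the toy model described (the paper's actual construction differs in detail): two-regime Markov chains in which the "model" has weaker daily regime persistence than the "truth", the ingredient claimed to reproduce both high skill and a high RPC.

```python
import numpy as np

rng = np.random.default_rng(4)

def regime_series(p_stay, n_days):
    """Two-state Markov chain; p_stay is the daily regime persistence."""
    s = np.empty(n_days, dtype=int)
    s[0] = rng.integers(2)
    for t in range(1, n_days):
        s[t] = s[t - 1] if rng.random() < p_stay else 1 - s[t - 1]
    return s

truth = regime_series(p_stay=0.95, n_days=90)  # persistent real-world winter
model = regime_series(p_stay=0.80, n_days=90)  # model loses regimes too quickly
# seasonal-mean occupancy: the signal whose variance the model underestimates
print(truth.mean(), model.mean())
```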


Posits as an alternative to floats for weather and climate models

CoNGA'19 Proceedings of the Conference for Next Generation Arithmetic 2019 Association for Computing Machinery (2019)

M Klöwer, PD Düben, TN Palmer

Posit numbers, a recently proposed alternative to floating-point numbers, claim to have smaller arithmetic rounding errors in many applications. By studying weather and climate models of low and medium complexity (the Lorenz system and a shallow water model) we present the benefits of posits compared with floats at 16 bits. As a standardised posit processor does not exist yet, we emulate posit arithmetic on a conventional CPU. Using a shallow water model, forecasts based on 16-bit posits with 1 or 2 exponent bits are clearly more accurate than half-precision floats. We therefore propose 16 bits with 2 exponent bits as a standard posit format, as its wide dynamic range of 32 orders of magnitude provides great potential for many weather and climate models. Although the focus is on geophysical fluid simulations, the results are also meaningful and promising for reduced-precision posit arithmetic in the wider field of computational fluid dynamics.
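
For readers unfamiliar with the format, a minimal decoder for the proposed 16-bit, 2-exponent-bit posits, following the draft posit standard (an illustrative sketch, not the emulator used in the study):

```python
def decode_posit(bits, nbits=16, es=2):
    """Decode an unsigned integer bit pattern as a posit(nbits, es) float."""
    if bits == 0:
        return 0.0
    if bits == 1 << (nbits - 1):
        return float("nan")                    # Not-a-Real (NaR)
    sign = bits >> (nbits - 1)
    if sign:
        bits = -bits & ((1 << nbits) - 1)      # two's complement for negatives
    body = bits & ((1 << (nbits - 1)) - 1)     # drop the sign bit
    # regime: run of identical leading bits encodes a power of useed = 2^(2^es)
    first = (body >> (nbits - 2)) & 1
    i, run = nbits - 2, 0
    while i >= 0 and (body >> i) & 1 == first:
        run, i = run + 1, i - 1
    k = run - 1 if first else -run
    i -= 1                                     # skip the regime's terminating bit
    # exponent bits (possibly truncated at the end of the word)
    e, e_bits = 0, 0
    while e_bits < es and i >= 0:
        e = (e << 1) | (body >> i) & 1
        e_bits, i = e_bits + 1, i - 1
    e <<= es - e_bits
    # remaining bits form the fraction with a hidden leading 1
    n_frac = max(i + 1, 0)
    frac = 1.0 + (body & ((1 << n_frac) - 1)) / (1 << n_frac)
    value = frac * 2.0 ** (k * (1 << es) + e)
    return -value if sign else value

assert decode_posit(0x4000) == 1.0                 # sign 0, regime "10", e=0, f=0
print(decode_posit(0x0001), decode_posit(0x7FFF))  # minpos and maxpos
```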


A Stochastic Representation of Subgrid Uncertainty for Dynamical Core Development

Bulletin of the American Meteorological Society 100 (2019) 1091-1101

A Subramanian, S Juricke, P Düben, T Palmer


Estimates of flow-dependent predictability of wintertime Euro-Atlantic weather regimes in medium-range forecasts

Quarterly Journal of the Royal Meteorological Society 144 (2018) 1012-1027

M Matsueda, TN Palmer


Seasonal to annual ocean forecasting skill and the role of model and observational uncertainty

Quarterly Journal of the Royal Meteorological Society Wiley 144 (2018) 1947-1964

S Juricke, D Macleod, A Weisheimer, L Zanna, T Palmer

Accurate forecasts of the ocean state and the estimation of forecast uncertainties are crucial when it comes to providing skilful seasonal predictions. In this study we analyse the predictive skill and reliability of the ocean component in a seasonal forecasting system. Furthermore, we assess the effects of accounting for model and observational uncertainties. Ensemble forecasts are carried out with an updated version of the ECMWF seasonal forecasting model System 4, with a forecast length of ten months, initialized every May between 1981 and 2010. We find that, for essential quantities such as sea surface temperature and upper-ocean 300 m heat content, the ocean forecasts are generally underdispersive and skilful beyond the first month mainly in the Tropics and parts of the North Atlantic. The reference reanalysis used for the forecast evaluation considerably affects diagnostics of forecast skill and reliability, throughout the entire ten-month forecasts but mostly during the first three months. Accounting for parametrization uncertainty by implementing stochastic parametrization perturbations has a positive impact on both reliability (from month 3 onwards) and forecast skill (from month 8 onwards). Skill improvements also extend to atmospheric variables such as 2 m temperature, mostly in the extratropical Pacific but also over the midlatitudes of the Americas. Hence, while model uncertainty impacts the skill of seasonal forecasts, observational uncertainty impacts our assessment of that skill. Future ocean model development should therefore aim not only to reduce model errors but also to simultaneously assess and estimate uncertainties.
