Publications


The response of the Pacific storm track and atmospheric circulation to Kuroshio Extension variability

Quarterly Journal of the Royal Meteorological Society 141 (2015) 52-66

CH O'Reilly, A Czaja


Simulating weather regimes: impact of model resolution and stochastic parameterization

Climate Dynamics 44 (2015) 2177-2193

A Dawson, TN Palmer

The simulation of quasi-persistent regime structures in an atmospheric model with horizontal resolution typical of the Intergovernmental Panel on Climate Change fifth assessment report simulations is shown to be unrealistic. A higher-resolution configuration of the same model, with horizontal resolution typical of that used in operational numerical weather prediction, is able to simulate these regime structures realistically. The spatial patterns of the simulated regimes are remarkably accurate at high resolution. A model configuration at intermediate resolution shows a marked improvement over the low-resolution configuration, particularly in terms of the temporal characteristics of the regimes, but does not produce a simulation as accurate as the very-high-resolution configuration. It is demonstrated that the simulation of regimes can be significantly improved, even at low resolution, by the introduction of a stochastic physics scheme. At low resolution the stochastic physics scheme drastically improves both the spatial and temporal aspects of the regime simulation. These results highlight the importance of small-scale processes for large-scale climate variability, and indicate that although simulating variability at small scales is a necessity, it may not be necessary to represent the small scales accurately, or even explicitly, in order to improve the simulation of large-scale climate. It is argued that these results could have important implications for improving both global climate simulations and the ability of high-resolution limited-area models, forced by low-resolution global models, to reliably simulate regional climate change signals.


Does the ECMWF IFS Convection Parameterization with Stochastic Physics Correctly Reproduce Relationships between Convection and the Large-Scale State?

Journal of the Atmospheric Sciences 72 (2015) 236-242

PAG Watson, HM Christensen, TN Palmer


Architectures and precision analysis for modelling atmospheric variables with chaotic behaviour

2015 IEEE 23rd Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM) (2015) 171-178

FP Russell, PD Düben, X Niu, W Luk, TN Palmer


Opportunities for Energy Efficient Computing: A Study of Inexact General Purpose Processors for High-Performance and Big-data Applications

2015 Design, Automation &amp; Test in Europe Conference &amp; Exhibition (DATE) (2015) 764-769

P Düben, J Schlachter, Parishkrati, S Yenugula, J Augustine, C Enz, K Palem, TN Palmer


News/Interview/Editorial

Significance 12 (2015) 2-7

H Christensen, B Tarran


Solving difficult problems creatively: a role for energy optimised deterministic/stochastic hybrid computing.

Frontiers in Computational Neuroscience 9 (2015) 124

TN Palmer, M O'Shea

How is the brain configured for creativity? What is the computational substrate for 'eureka' moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal (ultimately quantum decoherent) noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete.


Simulating weather regimes: impact of stochastic and perturbed parameter schemes in a simple atmospheric model

Climate Dynamics 44 (2015) 2195-2214

HM Christensen, IM Moroz, TN Palmer

Representing model uncertainty is important for both numerical weather and climate prediction. Stochastic parametrisation schemes are commonly used for this purpose in weather prediction, while perturbed parameter approaches are widely used in the climate community. The performance of these two representations of model uncertainty is considered in the context of the idealised Lorenz ’96 system, in terms of their ability to capture the observed regime behaviour of the system. These results are applicable to the atmosphere, where evidence points to the existence of persistent weather regimes, and where it is desirable that climate models capture this regime behaviour. The stochastic parametrisation schemes considerably improve the representation of regimes when compared to a deterministic model: both the structure and persistence of the regimes are found to improve. The stochastic parametrisation scheme represents the small-scale variability present in the full system, which enables the system to explore a larger portion of its attractor, improving the simulated regime behaviour. It is important that temporally correlated noise is used in the stochastic parametrisation: white noise schemes performed similarly to the deterministic model. In contrast, the perturbed parameter ensemble was unable to capture the regime structure of the attractor, with many individual members exploring only one regime. This poor performance was not evident in other climate diagnostics. Finally, a ‘climate change’ experiment was performed, where a change in external forcing resulted in changes to the regime structure of the attractor. The temporally correlated stochastic schemes captured these changes well.
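The setup described in this abstract can be illustrated with a minimal sketch (not the authors' code): a single-scale Lorenz '96 model integrated with an additive AR(1) ('red') noise term standing in for the stochastic parametrisation of unresolved scales. All parameter values here are illustrative assumptions; setting phi to zero recovers the white noise scheme the paper compares against.

```python
import numpy as np

def l96_tendency(x, forcing):
    # Lorenz '96: dx_k/dt = -x_{k-1}(x_{k-2} - x_{k+1}) - x_k + F_k
    return np.roll(x, 1) * (np.roll(x, -1) - np.roll(x, 2)) - x + forcing

def integrate(n_steps, K=8, F=8.0, dt=0.005, phi=0.98, sigma=0.5, seed=0):
    """Euler-Maruyama integration of Lorenz '96 with an additive AR(1)
    stochastic forcing term e_k representing unresolved small scales."""
    rng = np.random.default_rng(seed)
    x = F + 0.1 * rng.standard_normal(K)   # perturbed rest state
    e = np.zeros(K)                        # AR(1) noise state
    traj = np.empty((n_steps, K))
    for t in range(n_steps):
        # red noise update; phi -> 0 gives the white noise limit
        e = phi * e + sigma * np.sqrt(1 - phi**2) * rng.standard_normal(K)
        x = x + dt * l96_tendency(x, F + e)
        traj[t] = x
    return traj

traj = integrate(5000)
```

Regime diagnostics in the paper are then computed from long trajectories of this kind, with and without the temporal correlation in the noise.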


The use of imprecise processing to improve accuracy in weather & climate prediction

Journal of Computational Physics 271 (2014) 2-18

PD Düben, H McNamara, TN Palmer

The use of stochastic processing hardware and low precision arithmetic in atmospheric models is investigated. Stochastic processors allow hardware-induced faults in calculations, sacrificing bit-reproducibility and precision in exchange for improvements in performance and potentially in forecast accuracy, owing to a reduction in power consumption that could allow higher resolution. A similar trade-off is achieved using low precision arithmetic, with improvements in computation and communication speed and savings in storage and memory requirements. As high-performance computing becomes more massively parallel and power intensive, these two approaches may be important stepping stones in the pursuit of global cloud-resolving atmospheric modelling. The impact of both hardware-induced faults and low precision arithmetic is tested using the Lorenz '96 model and the dynamical core of a global atmosphere model. In the Lorenz '96 model there is a natural scale separation; the spectral discretisation used in the dynamical core also allows large- and small-scale dynamics to be treated separately within the code. Such scale separation allows the impact of lower-accuracy arithmetic to be restricted to components close to the truncation scales, and hence close to the necessarily inexact parametrised representations of unresolved processes. By contrast, the larger scales are calculated using high precision deterministic arithmetic. Hardware faults from stochastic processors are emulated using a bit-flip model with different fault rates. Our simulations show that neither approach to inexact calculation substantially affects the large-scale behaviour, provided it is restricted to act only on smaller scales. Results from the Lorenz '96 simulations are in fact superior when small scales are calculated on an emulated stochastic processor than when those small scales are parametrised. This suggests that inexact calculations at the small scale could reduce computation and power costs without adversely affecting the quality of the simulations, allowing higher resolution models to be run at the same computational cost.
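The bit-flip emulation mentioned in the abstract can be sketched in a few lines. This is a hypothetical illustration, not the authors' emulator; restricting faults to the low-order mantissa bits of an IEEE 754 double, and the fault rate, are assumptions made here for the example.

```python
import random
import struct

def flip_random_bit(value, n_mantissa_bits=16, rng=random):
    """Emulate a hardware fault by flipping one random low-order mantissa
    bit of an IEEE 754 double, leaving the sign, exponent and leading
    mantissa bits untouched."""
    (bits,) = struct.unpack("<Q", struct.pack("<d", value))
    bit = rng.randrange(n_mantissa_bits)   # faults confined to low-order bits
    return struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))[0]

def faulty(value, fault_rate=1e-3, rng=random):
    """Apply a bit flip with probability fault_rate per operation."""
    return flip_random_bit(value, rng=rng) if rng.random() < fault_rate else value
```

Wrapping the small-scale arithmetic of a model in a function like `faulty` at varying fault rates reproduces the spirit of the experiments: the flip perturbs only bits far below the precision needed at the truncation scale.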


The real butterfly effect

Nonlinearity 27 (2014) R123-R141

TN Palmer, A Döring, G Seregin


Energy- and enstrophy-conserving schemes for the shallow-water equations, based on mimetic finite elements

Quarterly Journal of the Royal Meteorological Society 140 (2014) 2223-2234

ATT McRae, CJ Cotter


Build high-resolution global climate models

Nature 515 (2014) 338-339

T Palmer


Stochastic modelling and energy-efficient computing for weather and climate prediction.

Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 372 (2014) 20140118

T Palmer, P Düben, H McNamara


On the use of inexact, pruned hardware in atmospheric modelling.

Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 372 (2014) 20130276

PD Düben, J Joven, A Lingamneni, H McNamara, G De Micheli, KV Palem, TN Palmer

Inexact hardware design, which advocates trading the accuracy of computations in exchange for significant savings in area, power and/or performance of computing hardware, has received increasing prominence in several error-tolerant application domains, particularly those involving perceptual or statistical end-users. In this paper, we evaluate inexact hardware for its applicability in weather and climate modelling. We expand previous studies on inexact techniques, in particular probabilistic pruning, to floating point arithmetic units and derive several simulated set-ups of pruned hardware with reasonable levels of error for applications in atmospheric modelling. The set-up is tested on the Lorenz '96 model, a toy model for atmospheric dynamics, using software emulation for the proposed hardware. The results show that large parts of the computation tolerate the use of pruned hardware blocks without major changes in the quality of short- and long-time diagnostics, such as forecast errors and probability density functions. This could open the door to significant savings in computational cost and to higher resolution simulations with weather and climate models.


Benchmark Tests for Numerical Weather Forecasts on Inexact Hardware

Monthly Weather Review 142 (2014) 3809-3829

PD Düben, TN Palmer


More reliable forecasts with less precise computations: a fast-track route to cloud-resolved weather and climate simulators?

Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 372 (2014) 20130391

TN Palmer

This paper sets out a new methodological approach to solving the equations for simulating and predicting weather and climate. In this approach, the conventionally hard boundary between the dynamical core and the sub-grid parametrizations is blurred. This approach is motivated by the relatively shallow power-law spectrum for atmospheric energy on scales of hundreds of kilometres and less. It is first argued that, because of this, the closure schemes for weather and climate simulators should be based on stochastic-dynamic systems rather than deterministic formulae. Second, as high-wavenumber elements of the dynamical core will necessarily inherit this stochasticity during time integration, it is argued that the dynamical core will be significantly over-engineered if all computations, regardless of scale, are performed completely deterministically and if all variables are represented with maximum numerical precision (in practice using double-precision floating-point numbers). As the era of exascale computing is approached, an energy- and computationally efficient approach to cloud-resolved weather and climate simulation is described where determinism and numerical precision are focused on the largest scales only.


Atlantic meridional overturning circulation and the prediction of North Atlantic sea surface temperature

Earth and Planetary Science Letters 406 (2014) 1-6

M Klöwer, M Latif, H Ding, RJ Greatbatch, W Park


Lorenz, Gödel and Penrose: new perspectives on determinism and causality in fundamental physics

Contemporary Physics 55 (2014) 157-178

TN Palmer


On the reliability of seasonal climate forecasts.

Journal of the Royal Society Interface 11 (2014) 20131162

A Weisheimer, TN Palmer

Seasonal climate forecasts are being used increasingly across a range of application sectors. A recent UK governmental report asked: how good are seasonal forecasts on a scale of 1-5 (where 5 is very good), and how good can we expect them to be in 30 years' time? Seasonal forecasts are made from ensembles of integrations of numerical models of climate. We argue that 'goodness' should be assessed first and foremost in terms of the probabilistic reliability of these ensemble-based forecasts; reliable inputs are essential for any forecast-based decision-making. We propose that a '5' should be reserved for systems that are not only reliable overall, but where, in particular, small ensemble spread is a reliable indicator of low ensemble forecast error. We study the reliability of regional temperature and precipitation forecasts of the current operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts, universally regarded as one of the world-leading operational institutes producing seasonal climate forecasts. We find a wide range of 'goodness' rankings depending on region and variable, with summer forecasts of rainfall over Northern Europe performing exceptionally poorly. Finally, we discuss the prospects of reaching '5' across all regions and variables in 30 years' time.
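The spread-error notion of reliability invoked here can be illustrated with synthetic data (a toy sketch, not the paper's verification suite): in a reliable ensemble system, when cases are binned by ensemble spread, the RMSE of the ensemble mean in each bin should roughly match the mean spread in that bin, so that small spread reliably indicates small error.

```python
import numpy as np

rng = np.random.default_rng(42)
n_forecasts, n_members = 2000, 15

# synthetic perfectly-reliable ensemble: for each case, the truth and the
# members are drawn from the same distribution, whose width varies by case
true_spread = rng.uniform(0.5, 2.0, n_forecasts)
truth = rng.standard_normal(n_forecasts) * true_spread
members = rng.standard_normal((n_forecasts, n_members)) * true_spread[:, None]

spread = members.std(axis=1, ddof=1)    # ensemble spread per case
error = members.mean(axis=1) - truth    # ensemble-mean error per case

# bin cases by spread; reliability shows up as RMSE tracking mean spread
for b in np.array_split(np.argsort(spread), 5):
    rmse = np.sqrt((error[b] ** 2).mean())
    print(f"mean spread {spread[b].mean():5.2f}  rmse {rmse:5.2f}")
```

An unreliable system would show RMSE decoupled from spread across the bins, which is the kind of deficiency the paper's ranking penalises.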