Today’s climate models successfully capture broad global warming trends. However, because of uncertainties about processes that are small in scale yet globally important, such as clouds and ocean turbulence, these models’ predictions of upcoming climate changes are not very accurate in detail. For example, predictions of the time by which the global mean surface temperature of Earth will have warmed 2℃, relative to preindustrial times, vary by 40–50 years (a full human generation) among today’s models. As a result, we do not have the accurate and geographically granular predictions we need to plan resilient infrastructure, adapt supply chains to climate disruption, and assess the risks of climate-related hazards to vulnerable communities.

In large part this is because clouds dominate errors and uncertainties in climate predictions for the coming decades [1, 2, 3]. Clouds reflect sunlight and exert a greenhouse effect, making them crucial for regulating Earth’s energy balance and mediating the response of the climate system to changes in greenhouse gas concentrations. However, they are too small in scale to be directly resolvable in today’s climate models. Current climate models resolve motions at scales of tens to a hundred kilometers, with a few pushing toward the kilometer scale, whereas the turbulent air motions that sustain, for example, the low clouds covering large swaths of tropical oceans have scales of meters to tens of meters. Because of this wide difference in scale, climate models cannot simulate clouds directly and instead rely on empirical parameterizations of them, which introduce large errors and uncertainties.

While clouds cannot be directly resolved in global climate models, their turbulent dynamics can be simulated in limited areas by using high-resolution large eddy simulations (LES). However, the high computational cost of simulating clouds with LES has inhibited broad and systematic numerical experimentation, and it has held back the generation of large datasets for training parameterization schemes to represent clouds in coarser-resolution global climate models.
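To make the scale argument above concrete, here is a rough back-of-the-envelope sketch in Python (the solver described below is written in TensorFlow, so Python is the natural choice). The 100 km and 10 m grid spacings are illustrative round numbers drawn from the ranges quoted above, not values from the paper.

```python
# Back-of-the-envelope comparison of a global climate model grid with the
# resolution needed to resolve low clouds. The 100 km and 10 m spacings are
# illustrative round numbers, not configurations from the paper.

EARTH_SURFACE_KM2 = 510e6  # approximate surface area of Earth, in km^2

def horizontal_columns(grid_spacing_km):
    """Number of horizontal grid columns needed to tile Earth's surface."""
    return EARTH_SURFACE_KM2 / grid_spacing_km**2

gcm_columns = horizontal_columns(100.0)  # ~100 km global climate model grid
les_columns = horizontal_columns(0.01)   # ~10 m grid needed for low clouds

print(f"~100 km global model: {gcm_columns:.1e} columns")
print(f"~10 m LES-like grid:  {les_columns:.1e} columns")
print(f"ratio: {les_columns / gcm_columns:.0e}")  # ~1e8 times more columns

# The cost gap is even larger than the column count suggests: refining the
# grid also multiplies the number of vertical levels and, through the CFL
# stability condition, shrinks the allowable time step, so the cost of an
# explicit solver grows roughly like (1 / grid_spacing)**4.
```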


In “Accelerating Large-Eddy Simulations of Clouds with Tensor Processing Units”, published in Journal of Advances in Modeling Earth Systems (JAMES), and in collaboration with a Climate Modeling Alliance (CliMA) lead who is a visiting researcher at Google, we demonstrate that Tensor Processing Units (TPUs) — application-specific integrated circuits that were originally developed for machine learning (ML) applications — can be effectively used to perform LES of clouds. We show that TPUs, in conjunction with tailored software implementations, can be used to simulate particularly computationally challenging marine stratocumulus clouds in the conditions observed during the Dynamics and Chemistry of Marine Stratocumulus (DYCOMS) field study. This successful TPU-based LES code reveals the utility of TPUs, with their large computational resources and tight interconnects, for cloud simulations.

Climate model accuracy for critical metrics, like precipitation or the energy balance at the top of the atmosphere, has improved roughly 10% per decade in the last 20 years. Our goal is for this research to enable a 50% reduction in climate model errors by improving their representation of clouds.

Large-eddy simulations on TPUs

In this work, we focus on stratocumulus clouds, which cover ~20% of the tropical oceans and are the most prevalent cloud type on Earth. Current climate models are not yet able to reproduce stratocumulus cloud behavior correctly, which has been one of the largest sources of errors in these models. Our work will provide a much more accurate ground truth for large-scale climate models.

Our simulations of clouds on TPUs exhibit unprecedented computational throughput and scaling, making it possible, for example, to simulate stratocumulus clouds with 10× speedup over real-time evolution across areas up to about 35 × 54 km². Such domain sizes are close to the cross-sectional area of typical global climate model grid boxes. Our results open up new avenues for computational experiments, and for substantially enlarging the sample of LES available to train parameterizations of clouds for global climate models.
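As a rough illustration of what those numbers imply, the sketch below sizes a 35 × 54 km² domain under assumed grid spacings (50 m horizontally and 20 m vertically over a 1.5 km deep boundary layer; these are illustrative choices, not the configuration reported in the paper) and converts the 10× real-time throughput into wall-clock time.

```python
# Illustrative sizing of a 35 x 54 km^2 stratocumulus domain. The grid
# spacings and boundary-layer depth below are assumptions for illustration,
# not the configuration reported in the paper.

dx = dy = 0.05                # assumed horizontal grid spacing, km (50 m)
dz = 0.02                     # assumed vertical grid spacing, km (20 m)
lx, ly, lz = 35.0, 54.0, 1.5  # domain extent, km (1.5 km deep boundary layer)

nx, ny, nz = int(lx / dx), int(ly / dy), int(lz / dz)
print(f"grid: {nx} x {ny} x {nz} = {nx * ny * nz:.1e} points")  # ~5.7e7 points

# A 10x speedup over real-time evolution means one simulated day of cloud
# evolution takes roughly 24 / 10 = 2.4 hours of wall-clock time.
simulated_hours = 24.0
print(f"wall clock for {simulated_hours:.0f} simulated hours: "
      f"{simulated_hours / 10.0:.1f} hours")
```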




Rendering of the cloud evolution from a simulation of a 285 × 285 × 2 km³ stratocumulus cloud sheet. This is the largest cloud sheet of its kind ever simulated. Left: An oblique view of the cloud field with the camera cruising. Right: Top view of the cloud field with the camera gradually pulled away.

The LES code is written in TensorFlow, an open-source software platform developed by Google for ML applications. The code takes advantage of TensorFlow’s graph computation and Accelerated Linear Algebra (XLA) optimizations, which enable the full exploitation of TPU hardware, including the high-speed, low-latency inter-chip interconnects (ICI) that helped us achieve this unprecedented performance. At the same time, the TensorFlow code makes it easy to incorporate ML components directly within the physics-based fluid solver.
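The published solver is not reproduced here, but the sketch below shows the general pattern it relies on: a physics update expressed as TensorFlow ops, compiled with XLA via tf.function(jit_compile=True), and replicated across TPU cores with tf.distribute.TPUStrategy. The periodic diffusion step and all names are illustrative placeholders rather than code from the paper.

```python
import tensorflow as tf

# Minimal sketch (not the published solver): an XLA-compiled, TPU-friendly
# update step for a 3-D field with periodic boundaries. The diffusion physics
# and all parameter values are illustrative placeholders.

@tf.function(jit_compile=True)  # let XLA fuse the ops into one TPU program
def diffusion_step(field, nu=0.1, dt=0.01, dx=1.0):
    """One explicit time step of 3-D diffusion with periodic boundaries."""
    lap = tf.zeros_like(field)
    for axis in range(3):  # central second differences via tf.roll
        lap += (tf.roll(field, 1, axis) - 2.0 * field
                + tf.roll(field, -1, axis)) / dx**2
    return field + dt * nu * lap

def run(steps=100, shape=(128, 128, 64)):
    # On a Cloud TPU VM this picks up the attached TPU; elsewhere it falls
    # back to the default (CPU/GPU) strategy so the sketch still runs.
    try:
        resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
        tf.config.experimental_connect_to_cluster(resolver)
        tf.tpu.experimental.initialize_tpu_system(resolver)
        strategy = tf.distribute.TPUStrategy(resolver)
    except (ValueError, tf.errors.NotFoundError):
        strategy = tf.distribute.get_strategy()

    field = tf.random.normal(shape)
    for _ in range(steps):
        field = strategy.run(diffusion_step, args=(field,))
    return field
```

In a full solver, neighboring TPU cores would also exchange halo data over the inter-chip interconnects mentioned above; that detail is omitted from this sketch.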

We validated the code by simulating canonical test cases for atmospheric flow solvers, such as a buoyant bubble that rises in neutral stratification and a negatively buoyant bubble that sinks and impinges on the surface. These test cases show that the TPU-based code faithfully simulates the flows, with increasingly fine turbulent details emerging as the resolution increases. The validation tests culminate in simulations of the conditions during the DYCOMS field campaign. The TPU-based code reliably reproduces the cloud fields and turbulence characteristics that aircraft observed during the campaign, a feat that is notoriously difficult for LES because of the rapid changes in temperature and other thermodynamic properties at the top of the stratocumulus decks.

One of the test cases used to validate our TPU Cloud simulator. The fine structures from the density current generated by the negatively buoyant bubble impinging on the surface are much better resolved with a high-resolution grid (10 m, bottom row) than with a low-resolution grid (200 m, top row).
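For reference, the rising warm-bubble test mentioned above starts from a Gaussian potential-temperature perturbation in a neutrally stratified (constant potential temperature) background. A minimal initialization sketch, with illustrative numbers rather than the exact setup used in the paper, looks like this:

```python
import numpy as np

# Minimal sketch of the initial condition for a rising warm-bubble test in a
# neutrally stratified atmosphere. Domain size, bubble radius, and amplitude
# are illustrative choices, not the exact configuration used in the paper.

nx, nz = 256, 128                # grid points in x and z
lx, lz = 20_000.0, 10_000.0      # domain size, m
theta0 = 300.0                   # constant background potential temperature, K

x = np.linspace(0.0, lx, nx)
z = np.linspace(0.0, lz, nz)
xx, zz = np.meshgrid(x, z, indexing="ij")

# Warm bubble: +2 K Gaussian perturbation centered at (10 km, 2 km).
xc, zc, radius, amplitude = 10_000.0, 2_000.0, 2_000.0, 2.0
theta = theta0 + amplitude * np.exp(-((xx - xc)**2 + (zz - zc)**2) / radius**2)

# Positive buoyancy b = g * (theta - theta0) / theta0 drives the bubble upward.
# A cold (negative-amplitude) bubble sinks instead and, on impinging on the
# surface, spreads out as the density current shown in the figure above.
g = 9.81
buoyancy = g * (theta - theta0) / theta0
```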

Outlook

With this foundation established, our next goal is to substantially enlarge existing databases of high-resolution cloud simulations that researchers building climate models can use to develop better cloud parameterizations, whether those parameterizations are physics-based, ML-based, or hybrids of the two. Doing so requires additional physical processes beyond those described in the paper; for example, radiative transfer needs to be integrated into the code. We also aim to generate data across a variety of cloud types, e.g., thunderstorm clouds.
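As a purely hypothetical illustration of where such data could be used (this is not a model from the paper or from CliMA), a learned cloud parameterization might map coarse-model column profiles to subgrid cloud tendencies, with training pairs obtained by coarse-graining LES output:

```python
import tensorflow as tf

# Hypothetical sketch of a learned cloud parameterization: a small network
# mapping coarse-grained column profiles to subgrid tendencies. All shapes,
# variable choices, and layer sizes are placeholder assumptions.

n_levels = 64   # vertical levels in a coarse-model column (assumed)
n_inputs = 4    # e.g., temperature, humidity, and two wind components
n_outputs = 2   # e.g., subgrid heating and moistening tendencies

parameterization = tf.keras.Sequential([
    tf.keras.Input(shape=(n_levels, n_inputs)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(n_levels * n_outputs),
    tf.keras.layers.Reshape((n_levels, n_outputs)),
])

# Training pairs would come from the LES database: coarse-grained LES fields
# as inputs, LES-diagnosed subgrid tendencies as targets.
parameterization.compile(optimizer="adam", loss="mse")
```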


Rendering of a thunderstorm simulation using the same simulator as in the stratocumulus simulations. Rainfall can also be observed near the ground.

This work illustrates how advances in hardware for ML can be surprisingly effective when repurposed in other research areas — in this case, climate modeling. These simulations provide detailed training data for processes such as in-cloud turbulence, which are not directly observable, yet are crucially important for climate modeling and prediction.

Acknowledgements

We would like to thank the co-authors of the paper: Sheide Chammas, Qing Wang, Matthias Ihme, and John Anderson. We’d also like to thank Carla Bromberg, Rob Carver, Fei Sha, and Tyler Russell for their insights and contributions to the work.
