Bringing Clouds into Focus
A New Global Climate Model May Reduce the Uncertainty of Climate Forecasting
May 11, 2010
Contact: John Hules, JAHules@lbl.gov, +1 510 486 6008
Clouds exert two competing effects on the Earth's temperature: they cool the planet by reflecting solar radiation back to space, but they also warm the planet by trapping heat near the surface. These two effects coexist in a delicate balance.
In our current climate, clouds have an overall cooling effect on the Earth. But as global warming progresses, the cooling effect of clouds might be enhanced or weakened—global climate models are evenly divided on this issue. In fact, inter-model differences in cloud feedbacks are considered the principal reason why various models disagree on how much the average global temperature will increase in response to greenhouse gas emissions, when it will happen, and how it will affect specific regions.
Clouds also affect climate in other ways, such as transporting heat and moisture from lower to higher altitudes, producing precipitation, and many other interrelated mechanisms. Current global climate models are unable to directly simulate individual cloud systems from physical principles, because the size and speed of supercomputers place a limit on the number of grid cells that can practically be included in the model. As a result, global models do not have fine enough horizontal resolution to represent large clouds.
Instead, global climate models must rely on parameterizations, which are statistical representations of phenomena, such as cloud cover or precipitation rates, that cannot be directly modeled. Different models use different parameterizations, which is an important reason why their results differ. Cloud parameterizations are the greatest source of uncertainty in today's climate models.
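To make the idea concrete, here is a minimal sketch of what a cloud parameterization can look like. The quadratic relative-humidity scheme below is a generic, textbook-style illustration, not the formulation used in the GCRM or any particular model, and the threshold value is an arbitrary assumption.

```python
# Illustrative (hypothetical) cloud-cover parameterization: diagnose the
# fractional cloud cover of a grid cell from its mean relative humidity.
# A quadratic ramp above a critical humidity is a textbook-style example,
# not the scheme used in any specific climate model.

def cloud_fraction(relative_humidity: float, rh_crit: float = 0.8) -> float:
    """Return the diagnosed cloud fraction (0-1) for one grid cell.

    relative_humidity: grid-cell mean relative humidity, 0-1
    rh_crit: assumed humidity threshold below which the cell is cloud-free
    """
    if relative_humidity <= rh_crit:
        return 0.0
    if relative_humidity >= 1.0:
        return 1.0
    # Smoothly ramp cloud cover from 0 at rh_crit up to 1 at saturation.
    return ((relative_humidity - rh_crit) / (1.0 - rh_crit)) ** 2

# Example: a cell at 90% relative humidity is diagnosed as 25% cloudy.
print(cloud_fraction(0.90))  # 0.25
```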
David Randall, a professor of atmospheric science at Colorado State University, is working to clear up that uncertainty by developing and testing a new kind of global climate model, called a global cloud resolving model (GCRM)—a model that's designed to take advantage of the extreme-scale computers expected in the near future.
Randall is the principal investigator of the "Global Cloud Modeling" project that computes at NERSC, and was one of two coordinating lead authors of Chapter 8, "Climate Models and Their Evaluation," in the Intergovernmental Panel on Climate Change's (IPCC's) Fourth Assessment Report, which was honored with the 2007 Nobel Peace Prize. He also directs the Center for Multiscale Modeling of Atmospheric Processes, sponsored by the National Science Foundation.
From a single thunderstorm to the whole Earth
"The GCRM is a computer model that simulates the motions of the atmosphere on scales from a single thunderstorm all the way up to the size of the entire earth," Randall explains. "It has about a billion little grid cells to represent the three-dimensional structure of the air. Each grid cell has a wind, a temperature, a humidity, and some other things that are needed. So the number of numbers involved is in the tens of billions, just as a snapshot of what's going on at a given second."
Large thunderstorms play an important role in global atmospheric circulation. They pack a lot of energy in the form of updrafts that move, in extreme cases, at 30 to 40 meters per second—"scary fast," Randall says. They "lift air from near Earth's surface to way up near the stratosphere in just a few minutes." In this way, thunderstorms carry moisture, momentum, carbon dioxide, and other chemical species through great depths of the atmosphere very quickly.
Cumulus clouds, Randall says, make the upper troposphere wet by transporting water from its source, the oceans. "A lot of it will rain out along the way, but some of it is still left, and it gets spread out up there and makes cirrus clouds, comprised largely of ice, which are very important for climate. We're especially interested to see how storms that create cirrus affect the climate." Cirrus clouds block Earth's infrared radiation from flowing out to space, and that tends to warm the climate. "If we have more cirrus in the future, that will enhance warming. If we have less, it will reduce the warming."
The GCRM also will give scientists new insights into tropical cyclones, which, Randall says, "are much bigger than thunderstorms, and in fact they contain many thunderstorms simultaneously. They affect the climate in part by cooling the sea surface as they move over the ocean."
The GCRM, supported by the Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, is built on a geodesic grid that consists of about 100 million mostly hexagonal columns, each with 128 levels, representing layers of atmosphere that reach 50 kilometers above the Earth. The model predicts the wind, temperature, and humidity in every one of these cells, at points spaced just 4 kilometers apart horizontally (with a goal of 2 kilometers on the next generation of supercomputers). That's an unprecedented resolution—most global atmospheric models provide detail at a 100-kilometer scale.
"No one has done this before in quite this manner, and it's our hope that our project will point the way to future generations of models," says Randall.
The geodesic grid used in the GCRM, also developed with SciDAC support, is itself quite innovative. If you want to tile a plane with regular polygons, you have only three choices: triangles, squares, or hexagons. Most climate models use some form of square (or rectangular) grid; but the geometry of the grid complicates the calculations, because each square has two different kinds of neighbors—four wall neighbors and four corner neighbors—which require different treatment in the equations. In addition, a square grid poses complications in modeling the Earth's polar regions, where grid cells lose symmetry because of longitudinal convergence. There are solutions to these problems, but they are computationally expensive.
The GCRM, in contrast, uses a geodesic, hexagonal grid. In a hexagonal grid, all neighbors of a given cell lie across cell walls; there are no corner neighbors. A geodesic grid on a sphere has twelve pentagonal cells in addition to the many hexagonal cells; but each cell still has only wall neighbors, and all cells are roughly the same size. This type of grid also eliminates the pole problem.
As a result, equations constructed on hexagonal grids treat all neighboring cells in the same way, reducing the complexity and increasing the speed, productivity, and accuracy of the code. The number of cells (both grid columns and levels) can easily be changed for a particular computer run, depending on what the researchers want to simulate. Models based on geodesic grids are now being used by several major weather and climate modeling groups around the world.
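Geodesic grids of this kind are typically built by recursively subdividing the faces of an icosahedron and then taking the dual mesh, which is why exactly twelve pentagons appear: they sit at the icosahedron's twelve original vertices. The sketch below assumes that standard construction; the GCRM's own grid generator may differ in detail.

```python
# Cell count of a standard icosahedral geodesic grid: start from an icosahedron
# (12 vertices, 20 triangular faces), bisect every edge r times, and take the
# dual mesh. The result has 10 * 4**r + 2 cells, of which exactly 12 are
# pentagons (the original icosahedron vertices) and the rest are hexagons.
def geodesic_cells(refinement_level: int) -> int:
    return 10 * 4**refinement_level + 2

print(geodesic_cells(13))   # 671,088,642 -- matching the ~671 million columns
                            # cited near the end of this article for a ~1 km grid
```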
Vorticity: Where the action is
Climate models are systems of partial differential equations that simulate how the atmosphere and oceans move and change over time, based on the laws of physics, fluid motion, and chemistry. Since the equations are all interrelated, the dynamical core of the model has to solve these equations simultaneously for every grid cell at each time step—which is why climate models require massive computing power.
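As a structural illustration only, the toy loop below shows the shape of such a calculation: every prognostic field in every grid cell is advanced at every time step. It is a deliberately minimal sketch, not the GCRM's actual dynamical core, and the field names, grid size, and time step are placeholders.

```python
import numpy as np

# Toy skeleton of a dynamical core's main loop (not the GCRM's actual code):
# every prognostic field is updated in every grid cell at every time step,
# which is why cost grows with both grid size and simulated time.
n_columns, n_levels = 1_000, 32                          # placeholder grid, tiny compared to the GCRM
state = {
    "wind_u":      np.zeros((n_columns, n_levels)),
    "wind_v":      np.zeros((n_columns, n_levels)),
    "temperature": np.full((n_columns, n_levels), 260.0),  # kelvin
    "humidity":    np.zeros((n_columns, n_levels)),
}

def tendencies(state):
    """Placeholder for the coupled dynamics and physics; returns d(field)/dt."""
    return {name: np.zeros_like(field) for name, field in state.items()}

dt, n_steps = 60.0, 10                                   # seconds per step, steps to take
for _ in range(n_steps):
    dstate = tendencies(state)
    for name in state:                                   # simple forward-Euler update of every cell
        state[name] += dt * dstate[name]
```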
Because the GCRM has such high resolution, Randall's research team knew they needed to use equations that reproduce accurate motions at a wide range of scales to get the most realistic results; so team members Akio Arakawa of the University of California, Los Angeles (UCLA), and Celal Konor of Colorado State University (CSU) developed the Unified System of governing equations (so called because it unifies the quasi-hydrostatic compressible system with the nonhydrostatic anelastic system). The Unified System can cover a wide range of horizontal scales, from turbulence to planetary waves. It also filters out vertically propagating sound waves of all scales, without excluding relevant waves such as inertia-gravity waves, Lamb waves, and Rossby waves.
"The atmosphere can make lots of different kinds of waves" Randall says, "but in choosing equations we knew we wanted to avoid those that include sound waves, because sound waves are completely irrelevant to weather and climate. Because sound moves too fast, if you include sound waves in your model, you have to take very small time steps. If you eliminate sound waves completely, then you can take much longer time steps. There have been other ways to get rid of them in the past, but they've been considerably less accurate. The new method that we've developed does involve approximations, because you're leaving something out, but it has much smaller errors that are, we believe, quite acceptable."
Another key feature of the Unified System is that it solves the three-dimensional vector vorticity equation rather than the vector momentum equation. Vorticity, or spinning motion, "is really at the core of much of the important fluid dynamics in the atmosphere," Randall says. "Vortices move around and maintain their identities and live a life, like little animals. Sometimes two vortices will merge and make a bigger one. Almost everything that is interesting and important in the motion of the atmosphere predominantly involves the spinning part."
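For reference, vorticity is the curl of the velocity field. The relations below are the standard textbook definition, not the Unified System's specific prognostic equations:

```latex
\boldsymbol{\omega} \;=\; \nabla \times \mathbf{v},
\qquad
\zeta \;=\; \frac{\partial v}{\partial x} - \frac{\partial u}{\partial y}
\quad \text{(vertical component, from the horizontal winds } u \text{ and } v\text{)}
```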
"This project could not have happened without a lot of support from the federal government... We've been computing at NERSC for more than a decade, and it's been an excellent experience. We have a lot of respect for and gratitude to everyone at NERSC for all the excellent support they have given us over the years,"—David Randall, Professor Atmospheric Science at Colorado State University
Most climate models use the momentum equation because it is easier to solve than the vorticity equation, and vorticity can be derived from momentum. But Arakawa and Joon-Hee Jung of CSU found a more efficient way of solving the vorticity equation that represents the important spinning motions much more directly and explicitly than the momentum equation does. "You really have to get that spinning part right, because that's where most of the action is," Randall explains. "Working with the vorticity equation directly means focusing in on the part of the physics that is most important to what we care about."
The component algorithms in the GCRM were selected for their good scaling properties, so the model scales linearly with the number of grid cells used. "Depending on the details of the configuration, we can do a few simulated days per wall clock day on 40,000 processors of Franklin," Randall says, referring to NERSC's Cray XT4 supercomputer. "Which means that doing a whole year is a very big calculation—it might be like a hundred days continuously around the clock on 40,000 processors or more—a big chunk of a very expensive machine. So what we're doing is just barely doable now."
"But in ten more years," he adds, "we expect computers to be a hundred times faster, whether it’s Green Flash or some other system. Then we’ll be getting, say, a simulated year for a wall-clock day. That’s a big improvement. You can start thinking about doing simulated decades or even longer. You’re almost getting into the climate regime of about a century. So that’s exciting."
"This project could not have happened without a lot of support from the federal government, especially the Department of Energy. We have to use the very fastest, most powerful machines in the world, and DOE, of course, is where you go for that. They're 'Supercomputing Central.' We've been computing at NERSC for more than a decade, and it's been an excellent experience. We have a lot of respect for and gratitude to everyone at NERSC for all the excellent support they have given us over the years."
Further computational challenges
The development of a geodesic dynamical core with a unique system of equations was the major, but not the only computational challenge. Other challenges include parallel input/output (I/O), including storage, management, and distribution of the voluminous output, and visualization of the results. The SciDAC Scientific Application Partnership titled "Community Access to Global Cloud Resolving Model and Data," led by Karen Schuchardt of Pacific Northwest National Laboratory, has been working to address those issues.
As for Randall's group, they are now adding parameterizations of various physical processes, such as cloud microphysics, to the dynamical core of the GCRM, and they are also working on a method to include topography in the model, which will add vertically propagating waves produced by air flow over mountains. While continuing to run various tests on Franklin at NERSC, including numerical accuracy, stability, and parallel scaling performance, they are also running larger tests on up to 80,000 cores of Jaguar, a Cray XT system at the Oak Ridge Leadership Computing Facility (OLCF).
Early tests of the model will span just a few simulated days and will focus on short-range global weather prediction, starting from high-resolution analyses produced by working weather prediction centers. Tropical cyclones and other extreme weather events will be particular areas of focus. By 2011, the researchers plan to use the GCRM to perform two or more annual-cycle simulations, at least one of which will be coupled to the geodesic ocean general circulation model that they developed under SciDAC Phase 1.
Within the next ten years or so, the researchers expect, models similar to the GCRM will be used for operational weather prediction, and eventually GCRMs will be used for multi-century climate simulations. The Green Flash project, a Berkeley Lab effort to design an energy-efficient supercomputer tailored to kilometer-scale climate modeling, may make this possible sooner rather than later. The long-term target resolution for a Green Flash system is a horizontal grid spacing of about 1 km, which will require approximately 671 million grid columns, each with about 100 layers.
About NERSC and Berkeley Lab
The National Energy Research Scientific Computing Center (NERSC) is a U.S. Department of Energy Office of Science User Facility that serves as the primary high performance computing center for scientific research sponsored by the Office of Science. Located at Lawrence Berkeley National Laboratory, NERSC serves almost 10,000 scientists at national laboratories and universities researching a wide range of problems in climate, fusion energy, materials science, physics, chemistry, computational biology, and other disciplines. Berkeley Lab is a DOE national laboratory located in Berkeley, California. It conducts unclassified scientific research and is managed by the University of California for the U.S. Department of Energy.