January 12, 2021

For decades, supercomputers have played a critical role in studying our climate, predicting how global changes will affect the world, and identifying potential ways to mitigate those effects. With each new generation of more powerful computers, scientists have been able to run simulations at higher resolution, revealing both new challenges and potential solutions.

At the same time, the effects of rising global temperatures are forcing individuals and communities to adapt to rising sea levels, more frequent catastrophic storms, severe droughts, more wildfires, and more extreme temperatures. These changes also carry health and economic consequences, leading to migration and social unrest.

In a Dec. 16 keynote address to the HiPC 2020 conference, UC Berkeley Computer Science Professor Kathy Yelick described how both the technology used to study climate and the impacts of our changing climate are at a crossroads. Computing will remain central to studying the problem, but instead of focusing mainly on physical modeling, researchers will increasingly draw on data from many sources; combined with machine learning, these analyses will provide insight into a wide range of topics critical to addressing the many impacts of global climate change.

HiPC 2020, the 27th IEEE International Conference on High-Performance Computing, Data, & Analytics, was held virtually Dec. 16-18, drawing more than 500 attendees from four continents. In her talk, “Computing and Data Challenges in Climate Change,” Yelick said the complexity of the problem and the importance of AI and modeling in addressing it mean that the global community needs to work together. Yelick, a senior advisor for computing at Lawrence Berkeley National Laboratory (Berkeley Lab), is also the Associate Dean for Research at UC Berkeley’s Division of Computing, Data Science, and Society, which aims to foster broad, multidisciplinary collaborations on societal issues such as climate change and sustainability. For example, melting polar ice, which raises ocean levels, will likely force migrations from low-lying nations, with social, political, and economic consequences.

The broadening of climate research

Current climate models are already ensembles of interconnected physical models of components such as the atmosphere, oceans, and land surface, but Yelick said that future research will require a broader range of disciplines to fully understand and mitigate the impacts. Fields such as economics, sociology, engineering, materials science, law, and public policy will need to join the community.

For the past few decades, climate researchers have focused on increasing the resolution of their models and adding new physics to them. Twenty years ago, leading climate models could resolve conditions only down to 200-kilometer grid squares, forming a mesh around the globe. In 2008, climate researchers started running models at a resolution of 25 kilometers. Today, some teams are working on very short runs at 2-3 kilometer resolution. With the emergence of exascale computers, researchers are looking to extend their modeling runs to simulate climate change over a century or longer.
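
A back-of-the-envelope calculation (ours, not from the talk) shows why finer grids demand ever-larger machines: halving the horizontal grid spacing roughly quadruples the number of surface cells, before counting the extra vertical levels and shorter time steps finer grids require. A minimal Python sketch:

    # Rough count of surface grid cells at each horizontal resolution; the
    # real cost grows faster still, since finer grids also need more
    # vertical levels and shorter time steps.
    EARTH_SURFACE_KM2 = 510e6  # approximate surface area of Earth

    for spacing_km in (200, 25, 3):
        cells = EARTH_SURFACE_KM2 / spacing_km**2
        print(f"{spacing_km:>4} km grid: ~{cells:,.0f} surface cells")

Moving from 200-kilometer to 3-kilometer cells multiplies the surface cell count by a factor of more than 4,000, which is why century-long runs at that resolution await exascale systems.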

But Yelick sees a broader range of computing and data being brought to bear on the problem.          

For example, Trevor Keenan, who has a joint appointment at UC Berkeley and Berkeley Lab, is using data-driven models to understand the impacts of climate variability and long-term change on ecosystem functions, as well as the related feedbacks to the atmosphere through ecosystem carbon cycling and water use. His work combines large ecological datasets, including field studies and remote sensing, with machine learning methods and models of ecosystem state and function to study key physical and biological processes.

One of Yelick’s research projects, ExaBiome, has developed new tools for studying microbiomes: communities of hundreds or thousands of microbial species, of varying abundance and diversity, that are central players in climate change, environmental remediation, food production, and human health. By studying their genomes, researchers expect to learn how microbes can be used to help remediate environmental problems or to manufacture novel chemicals and medicines. Yet only about 1 percent of all microbes have been studied, a gap the ExaBiome project is working to close.
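
To give a flavor of the computation involved, here is a toy illustration, far simpler than ExaBiome's distributed tools, of one core step in metagenome analysis: counting k-mers, the short subsequences whose overlaps drive genome assembly. The sample reads below are made up.

    from collections import Counter

    def count_kmers(reads, k=5):
        """Count every length-k substring across a set of sequencing reads."""
        counts = Counter()
        for read in reads:
            for i in range(len(read) - k + 1):
                counts[read[i:i + k]] += 1
        return counts

    # Made-up reads; real metagenome datasets hold billions of them,
    # which is why ExaBiome targets exascale machines.
    reads = ["ACGTACGTGG", "CGTACGTGGA", "GTACGTGGAT"]
    for kmer, n in count_kmers(reads).most_common(3):
        print(kmer, n)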

Machine learning is being used not only to understand the physical aspects of climate but also to predict the effectiveness of reduction and remediation efforts. New materials offer promise in energy capture and storage as well as carbon capture. The Materials Project at Berkeley Lab has calculated the properties of hundreds of thousands of candidate materials for batteries, solar cells, and the like, and makes the results publicly available. Machine learning is being used to improve searches of the existing database and to explore the viability of new materials.
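
As a generic sketch of how such a search might work (this is not the Materials Project's actual pipeline, and the descriptors are hypothetical stand-ins), a surrogate model can be trained on properties already computed with expensive physics codes and then used to rank untested candidates, so only the most promising get a full calculation:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Stand-in training data: rows are materials, columns are simple
    # composition/structure descriptors; y is a property already computed
    # with an expensive physics code (e.g., a formation energy).
    X_known = rng.random((500, 6))
    y_known = X_known @ rng.random(6) + 0.1 * rng.standard_normal(500)

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_known, y_known)

    # Rank a large pool of untested candidates by predicted property and
    # send only the top few to the full simulation.
    X_candidates = rng.random((10_000, 6))
    predictions = model.predict(X_candidates)
    top = np.argsort(predictions)[:10]  # lowest predicted energy first
    print("Candidates worth a full calculation:", top)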

Machine learning is also speeding up the design of metal-organic frameworks (MOFs), which have the potential to remove more than 90 percent of the CO2 from natural gas power plant emissions, a six-fold improvement over current methods. The MOFs act like a sponge, adsorbing the carbon dioxide, and can be regenerated with steam and used again.

Beyond technical solutions

Yelick also discussed using machine learning to manage infrastructure in ways that reduce energy use and its effects on the climate. One example is transportation, which accounts for about 30 percent of the energy used in the U.S. Congested traffic is not just frustrating for drivers; it also wastes fuel and worsens air quality. Using data from vehicles and roadway sensors, machine learning can help optimize traffic flow.
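
A toy example (ours, not from the talk) of the kind of decision such data feeds: allocating green time at a signal in proportion to measured traffic counts, a simple heuristic that data-driven controllers refine with learned predictions. The intersection flows below are invented.

    def green_splits(flows_veh_per_hr, cycle_s=90, lost_time_s=10):
        """Split usable green time across approaches in proportion to flow."""
        usable = cycle_s - lost_time_s
        total = sum(flows_veh_per_hr)
        return [usable * f / total for f in flows_veh_per_hr]

    # Made-up hourly flows on four approaches to one intersection.
    flows = [600, 300, 450, 150]
    for approach, g in zip("NESW", green_splits(flows)):
        print(f"approach {approach}: {g:.0f} s of green per cycle")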

By equipping buildings with data-collecting sensors connected to computers for analysis, machine learning can tune a building's systems to operate in the most energy-efficient manner while still meeting the needs of its occupants.

Ensuring the economic viability of new infrastructure is another area where modeling and machine learning are being applied. For instance, researchers have modeled wind farm installations to understand how the placement of turbines affects the wind reaching machines farther down the line, and how to maximize the efficiency of the whole array.
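
A simple way to see the effect is the classic Jensen (Park) wake model, sketched below with illustrative parameter values; this textbook approximation stands in for the production codes such studies use. Each turbine leaves a slower, expanding wake behind it, and because power scales with the cube of wind speed, a turbine sited too close behind another loses a large share of its output:

    def jensen_wake_speed(u0, distance_m, rotor_radius_m=50.0,
                          thrust_coeff=0.8, decay_k=0.075):
        """Wind speed at a given distance directly downstream of a turbine."""
        deficit = ((1 - (1 - thrust_coeff) ** 0.5)
                   / (1 + decay_k * distance_m / rotor_radius_m) ** 2)
        return u0 * (1 - deficit)

    u0 = 10.0  # free-stream wind speed, m/s
    for spacing_rotors in (3, 5, 7, 10):
        d = spacing_rotors * 2 * 50.0  # rotor diameters to meters (D = 100 m)
        u = jensen_wake_speed(u0, d)
        # Power falls with the cube of wind speed, so modest speed deficits
        # mean large power losses for the downstream turbine.
        print(f"{spacing_rotors:>2} rotor diameters: wind {u:.2f} m/s, "
              f"power ratio {(u / u0) ** 3:.2f}")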

At UC Berkeley, economist David Zilberman has studied how one form of renewable energy can “cannibalize” the revenue of another, such as when wind and solar energy are used together. He found that introducing wind turbines into an area with solar reduces the value of solar while adding solar panels increases the value of wind power.

“The economic model is just as important as the technology,” Yelick said. 

There is also the social cost to consider, she said.

“We need to understand the economic impacts of climate change, which now involves policy experts. Where are the biggest impacts from drought? How will this affect the people who live there? Will they be forced to migrate? Human behavior is difficult to model but absolutely essential to understand.”

Yelick closed her presentation by citing President Barack Obama’s call to action on climate, “We are the first generation to feel the effect of climate change and the last generation who can do something about it.”