November 1, 2023
2 min read
Scientists have proposed a network of supercomputing centers that would focus on local climate impacts
Scientists have used computer models to predict global warming’s implications for more than five decades. As climate change intensifies, these increasingly precise models require more and more computing power. For a decade the best simulations have been able to predict climate change effects down to a 25-square-kilometer area. Now a new modeling project could tighten the resolution to one kilometer, helping policymakers and city planners spot the neighborhoods—or even individual buildings—most vulnerable to extreme weather events.
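To give a rough sense of why tighter resolution demands so much more computing power, the back-of-the-envelope arithmetic below (an illustration, not from the proposal) compares horizontal grid-cell counts for a global model at the two resolutions the article mentions. The Earth surface area figure is an approximation introduced for this sketch.

```python
# Illustrative sketch (not from the EVE proposal): how many horizontal
# grid cells a global model needs as cell size shrinks.
EARTH_SURFACE_KM2 = 510e6  # approximate surface area of Earth, km^2

def cell_count(cell_area_km2: float) -> float:
    """Approximate number of horizontal grid cells for a given cell area."""
    return EARTH_SURFACE_KM2 / cell_area_km2

coarse = cell_count(25.0)  # cells covering 25 km^2 each
fine = cell_count(1.0)     # 1 km x 1 km cells

print(f"25 km^2 cells: {coarse:,.0f}")   # 20,400,000
print(f" 1 km^2 cells: {fine:,.0f}")     # 510,000,000
print(f"increase: {fine / coarse:.0f}x")  # 25x more cells
```

And the cell count understates the cost: finer grids also force models to take shorter time steps and resolve more vertical detail, so total compute grows considerably faster than the 25-fold jump in cells alone.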
“Climate [science] has always had a computing problem,” says Bjorn Stevens, director of Germany’s Max Planck Institute for Meteorology. Recent technological advances such as shrinking transistors, however, have made computers far more capable, Stevens says. He and a group of climatologists and scientists from other disciplines are developing a network of global supercomputing centers called Earth Visualization Engines, or EVE, which they hope to complete within the decade. The centers would run climate models on supercomputers, with machine-learning algorithms interpreting the results, to predict climatic shifts and severe weather events at the local level.
This international push, which organizers have called “the CERN of climate science,” could help municipalities mitigate disasters, say supporters who plan to present the proposal at the 28th United Nations Climate Change Conference in November. Higher-resolution modeling could show how wind shear affects certain buildings, where floodwaters might go, or what areas are most vulnerable to damage. These details could inform measures taken before dangerous events such as heat waves, hurricanes or droughts, helping officials determine when and where to save water, set up cooling centers or shore up infrastructure.
Such fine-grained modeling may be enabled by a recent technological advance: a superchip called Grace Hopper, named after the pioneering computer scientist and developed by computer technology company Nvidia. Ten years in the making, it could process models up to six times faster than other superchips while using less energy, says Dion Harris, Nvidia’s head of accelerated data center product marketing.
As EVE moves forward, Stevens and other planners envision making the data and models publicly available. Doing so—especially in developing countries hit hardest by the climate crisis—should be prioritized before rolling out new and expensive computing technologies, says Gavin Schmidt of NASA’s Goddard Institute for Space Studies, who is not involved with EVE.
“There is a huge amount of useful climate information that isn’t accessible,” Schmidt says. Climate modelers are “trying to make the best of the information, get it out there, and help people make better decisions for adaptation.”