As artificial intelligence rapidly expands across industries, from healthcare and education to transportation and finance, its environmental toll also rises.
Large language models (LLMs), image generators, and other AI tools consume enormous amounts of electrical power, much of it still drawn from fossil-fueled grids. The environmental cost goes beyond carbon emissions, stretching to water use, data center locations, and the growing strain on power grids that run and cool massive server farms.
A growing movement known as “Green AI” seeks to prioritize energy-efficient algorithms, transparent emissions reporting, and environmentally conscious computing practices. The idea is not just to offset AI’s footprint, but to fundamentally rethink how these systems are developed and deployed.
During Harvard’s Climate Action Week 2025, a panel of researchers gathered to explore AI’s sustainability. Hosted by the Chan School of Public Health and moderated by Winkler associate professor of environmental respiratory health Mary Berlik Rice, who directs the Center for Climate Health and the Global Environment, the panel featured assistant professor of environmental health and population science Amruta Nori-Sarma; Gamble professor of biostatistics, population, and data science Francesca Dominici; Harvard Medical School assistant professor Nick Nassikas; and postdoctoral research fellow Claudio Battiloro.
The Outsized Fossil Fuel Impact of AI Data Centers
One of the most immediate concerns lies in the energy demands of AI data centers. These facilities, which house the hardware needed to train and run AI models, are consuming an increasingly large share of the national grid’s output (read more in “How AI Could Be Raising Your Energy Bill”).
As Francesca Dominici explained, current estimates suggest that data centers already account for roughly 4 percent of all U.S. electricity use. “But what is more concerning,” said Dominici, “is that the electricity that they need comes mostly from fossil fuels, so there is 58 percent higher average carbon intensity” for this type of use. In other words, not only is AI consuming more electricity—it’s contributing to greenhouse gas emissions at a higher rate than other forms of increased energy demand.
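Dominici's figures allow a rough back-of-envelope estimate of what that share means in emissions. The sketch below pairs the article's numbers (a 4 percent share, 58 percent higher carbon intensity) with two assumed round values, total annual U.S. generation and the average grid carbon intensity, which are illustrative placeholders, not sourced figures.

```python
# Back-of-envelope estimate of U.S. data center CO2 emissions.
# The 4% share and 58% intensity premium come from the article;
# the two constants below are ASSUMED round numbers for illustration.
US_ELECTRICITY_TWH = 4000          # assumed annual U.S. generation, TWh
AVG_INTENSITY_KG_PER_KWH = 0.4     # assumed U.S. grid average, kg CO2/kWh

datacenter_twh = US_ELECTRICITY_TWH * 0.04       # 4% share -> 160 TWh
dc_intensity = AVG_INTENSITY_KG_PER_KWH * 1.58   # 58% above grid average

# 1 TWh = 1e9 kWh; dividing by 1e9 converts kg to million metric tons (Mt)
emissions_mt = datacenter_twh * 1e9 * dc_intensity / 1e9
print(f"~{datacenter_twh:.0f} TWh/yr at elevated intensity "
      f"-> ~{emissions_mt:.0f} Mt CO2/yr")
```

Even with these placeholder inputs, the arithmetic shows why the intensity premium matters: the same electricity drawn at average grid intensity would emit roughly a third less.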
“We also know that these fossil fuel emissions aren’t just CO2, or carbon dioxide, but PM2.5 emissions,” continued Dominici. PM2.5 refers to fine particles 2.5 microns or less in diameter, produced during the combustion of fossil fuels and released from power plant smokestacks and vehicle exhaust; they penetrate deep into the lungs and are small enough to cross into the bloodstream. These particles can cause premature death in individuals with heart and lung disease and lead to increased hospital admissions for a broad range of ailments in the general population.
This situation may worsen. “The electricity that is now used to power AI infrastructure is not only increasing exponentially,” Dominici added, “but because of our current political environment, coal-fired power plants that were supposed to be decommissioned are being reopened so they can power AI infrastructure—and we know from our research that PM2.5 from coal-fired power plants is even more toxic than fine particulate matter from all sources.”
The long-term consequences could include not only increased greenhouse gas emissions, but also heightened public health risks, particularly for communities already suffering from poor air quality and limited access to care (read more in “How Air Pollution Affects Our Brains”).
How AI Data Centers Threaten Communities and the Environment
The environmental costs of AI are not limited to emissions. The site selection for data centers brings another layer of complexity, particularly when communities with limited resources end up hosting power-intensive infrastructure that wealthier communities don’t want nearby.
In the fall of 2024, the developer Balico LLC proposed to build a massive “MegaCampus” data center powered by a gas-fired plant in Pittsylvania County, Virginia. At 3,500 megawatts, the proposed facility would have been the second-largest natural gas plant in the U.S.
At the time, Dominici’s lab was doing pro bono work for the Southern Environmental Law Center (SELC), which brought her team in to assess the public health impact. “What’s happening in this type of negotiation,” explained Dominici, “is that the community that is going to be impacted has very little information.” Local residents were told, in this instance, that the pollution risk would be minor: “It’s going to be the equivalent of you standing near to a grill for a couple hours on the Fourth of July.”
But a full environmental analysis from Dominici’s team told a different story. More than 1.28 million people would have been exposed to increased PM2.5 pollution, with public health costs estimated at more than $625 million over 10 years. After the SELC made the report public, the developer withdrew the plan. “Communities must be made a factor in decision-making,” said Dominici, “and not provided with misinformation that supports the business interests of the developer.”
All four panelists emphasized that truly sustainable AI will involve rethinking the incentive structures that currently reward scale, speed, and performance in development above all else.
“Instead of celebrating the largest, fastest, most power-hungry models, why not reward ones that are marginally less accurate but exponentially more sustainable?” asked Dominici. That might include everything from carbon labeling on AI tools to setting industry benchmarks that account not only for performance, but also environmental impact. A chatbot that is 0.2 percent less accurate, she noted, might still be preferable if it reduces emissions by half.
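The trade-off Dominici describes can be made concrete with a simple scoring rule. The sketch below is hypothetical: the two model profiles, the emissions unit, and the penalty weight are all invented for illustration, not drawn from any real benchmark.

```python
# Hypothetical "green benchmark": rank models by accuracy minus an
# emissions penalty, rather than by accuracy alone. All numbers and the
# weight are invented for illustration.

def green_score(accuracy: float, kg_co2_per_1k_queries: float,
                emissions_weight: float = 0.1) -> float:
    """Higher is better: accuracy (points) minus an emissions penalty."""
    return accuracy - emissions_weight * kg_co2_per_1k_queries

# Two made-up chatbots: B is 0.2 points less accurate but halves emissions.
model_a = green_score(accuracy=92.0, kg_co2_per_1k_queries=10.0)  # ~91.0
model_b = green_score(accuracy=91.8, kg_co2_per_1k_queries=5.0)   # ~91.3

better = "B" if model_b > model_a else "A"
print(f"A: {model_a:.1f}, B: {model_b:.1f} -> model {better} preferred")
```

Under this weighting, the slightly less accurate but far cleaner model wins; the substantive policy question, of course, is who sets the weight.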
Dominici pushed the idea further: “If we require a driver’s license to operate a car, why not have a license for deploying powerful AI models? That license could include principles like the responsible use of data, accountability, transparency, and so on.” In other words, she said, the accessibility of information might need to be transformed if AI is going to be consistently integrated sustainably—that is, without long-term impacts on populations and the planet.
Tying AI Expansion to Clean Energy
The panel identified several key priorities for advancing “Green AI”:
- Decarbonizing data centers
- Upgrading and optimizing AI infrastructure
- Choosing data center locations with sustainable, community-informed approaches
- Minimizing related environmental impacts, such as air pollution exposure
- Reducing water consumption for cooling
- Lowering overall energy demand and developing ways to reuse energy from data centers and other sources
- Reforming incentive structures that currently prioritize scale and speed over sustainability
These priorities fit within the broader context of global climate action. “One point five degrees Celsius is no longer achievable—and pretending like it is probably doing a disservice to public trust,” said Nick Nassikas, referencing the international climate goal set by the 2015 Paris Agreement. With global warming already at 1.2–1.3 degrees Celsius, many scientists now see 1.5 degrees as a likely threshold to be crossed—if it hasn’t been already.
Still, Nassikas pointed to signs of momentum in the U.S. energy sector in 2024: renewables accounted for more than 90 percent of new electricity generation, wind and solar surpassed coal in total output, and battery storage capacity doubled.
These trends create a more supportive foundation for sustainable AI, panelists said. But without active regulation, market forces are unlikely to drive change (and may worsen AI’s environmental impact). As AI becomes more embedded in society, the panelists concluded, the imperative is not just to make it smarter—but cleaner and more transparent.