As artificial intelligence rapidly expands and drives surges in demand for energy and investment, markets may struggle to keep pace. If providers cannot supply and finance enough energy to power AI, and policymakers fail to implement regulations to mitigate broader financial risks, the entire sector—and the downstream economy—could implode.
That warning was at the center of a February 5 panel at Harvard Kennedy School’s (HKS) Institute of Politics, where policymakers and economists examined the scale and risk profile of AI investments, as well as their impact on the U.S. energy sector, climate policymaking, and the environment itself.
Joseph Aldy, Heinz professor of the practice of environmental policy at HKS and a former White House climate advisor, opened the discussion by outlining AI’s growing environmental costs. Electricity demand in the United States is rising sharply, driven in part by rapidly proliferating data centers. At the same time, political resistance has slowed the buildout of wind and solar power, technologies that could otherwise help ease some of that strain.
The result, Aldy said, is a troubling paradox: “We just can’t build enough gas-fired power plants to make up the gap.” Coal plants once expected to retire are still online, electricity prices are rising, and emissions are beginning to tick upward.
Aldy noted AI’s potential to accelerate the discovery of energy-efficient materials, improve climate modeling, optimize power grids, and even support carbon removal. But he cautioned that those environmental benefits are far from immediate or guaranteed. And without clear policy signals steering AI toward climate-relevant uses, he added, investment will flow toward whatever is most quickly profitable—which may not be green energy.
Robert Rubin ’60, who served as the U.S. secretary of the treasury under President Bill Clinton and was the first director of the National Economic Council, situated the AI infrastructure investment boom within a broader macroeconomic cycle that is, itself, increasingly driven by AI. He warned of what he called a “circularity risk,” in which large AI firms make massive future commitments to suppliers, prompting those suppliers to borrow aggressively, invest in capacity, and channel capital back into the same firms whose demand projections justified the expansion.
Rubin pointed to OpenAI, which has made approximately $1.4 trillion in long-term commitments to suppliers. “The stocks of the suppliers go up because of the commitments,” he explained, and that rising valuation feeds further financing back into the AI firm itself. OpenAI now faces rapidly intensifying competition from other companies (Anthropic, Google DeepMind, Meta, and others) and platforms, raising the possibility that it may be unable to meet its commitments—spelling disaster for stakeholders, the sector at large, and the many other industries that the company now touches.
Jason Furman, Aetna professor of the practice of economic policy at HKS, offered a more layered view than a simple boom-or-bust interpretation. Investment returns, Furman said, can rise significantly while still falling short of today’s most optimistic projections.
Furman pointed to research by economists at the Federal Reserve, who track real estate listings and early construction activity to estimate future data-center capacity one to two years out. “There’s an enormous amount of data on what has been started,” Furman noted. The indicators, he said, suggest that while growth may not match the most staggering forecasts from figures like Elon Musk and Sam Altman, AI-related investment is likely to continue expanding.
Aldy added that much of the available data is inconclusive, in part because of a “broken regulatory system.” Long, inefficient approval processes encourage firms to hedge their bets by submitting multiple proposals at once, assuming that only one is likely to clear regulatory hurdles.
In regional electricity markets such as PJM, which serves much of the Mid-Atlantic, rapid data-center growth has contributed to higher prices and renewed debates over whether consumers or companies should bear the cost of new infrastructure. The core issue, Aldy noted, is that AI companies are massive power users that do not face the full costs they impose on the system.
Toward the end of the session, a first-year master of public policy student asked how a government like that of the United States might create policy in a world 30 to 50 years from now that is largely “agentic” (that is, driven by AI agents). In an economy dominated by autonomous systems, she asked, what incentives would people have to participate at all? Would traditional economic structures give way to “something more relational”?
As Rubin put it, the challenges ahead carry “tremendous potential on both the plus side and the minus side.” Whether the U.S. captures the former or succumbs to the latter, he suggested, may depend less on innovation than on political will.