Google’s investment in a new data center partially powered by a massive natural gas plant has ignited a critical conversation about the energy demands of artificial intelligence. The facility, which will emit carbon dioxide equivalent to adding nearly a million gas-powered cars to the road annually, highlights the growing tension between AI’s computational needs and environmental sustainability. This development comes at a time when the AI industry faces unprecedented pressure to balance innovation with responsible energy consumption.
The Energy Reality Behind AI’s Growth
Data centers are the backbone of AI infrastructure, consuming vast amounts of electricity to power the servers running complex algorithms. According to BloombergNEF projections, US data center power demand is expected to surge from 34.7 gigawatts in 2024 to 106 gigawatts by 2035. This projected tripling of demand has created a scramble for reliable power sources, with companies like Google, Meta, and Microsoft racing to secure energy contracts for their AI operations.
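The growth rate implied by those BloombergNEF figures can be worked out directly (a back-of-the-envelope calculation, not a figure from the report itself):

```python
# Implied compound annual growth rate of US data center power demand,
# from the BloombergNEF figures cited above: 34.7 GW in 2024 to 106 GW in 2035.
start_gw, end_gw, years = 34.7, 106.0, 2035 - 2024

cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 10.7% — demand compounding at roughly 10.7% per year
```

At that pace, capacity roughly triples over the eleven-year window, which is why utilities and hyperscalers are locking in long-term supply now.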
David Crane, CEO of Generate Capital, warns about the risks of overbuilding energy infrastructure for AI data centers. “As much as the data centre people tell you their demand for electricity is infinite, it feels to me like there will be a time when they’ll be overbuilt,” he told the Financial Times. Crane advocates for ‘take-or-pay’ contracts where data centers cover infrastructure costs regardless of usage, ensuring power companies don’t bear the financial burden of unused capacity.
Geopolitical Factors Complicating Energy Strategy
The energy landscape for AI companies has become increasingly complex due to geopolitical tensions. The ongoing conflict in the Middle East has disrupted global energy supplies, with the Strait of Hormuz – a crucial shipping lane for 20% of the world’s energy – effectively shut down. This has caused oil prices to jump 4.8% following recent political developments, creating uncertainty for companies planning long-term energy strategies.
These disruptions have broader implications for the tech industry. As noted in Nikkei Asia’s analysis, supply chain constraints are affecting everything from semiconductors to specialized components like external modulation lasers. Jose Liao, general manager of systems business at Asus, summarized the challenge: “No one can escape” the impact of these energy market disruptions on technology manufacturing and deployment.
Competing Approaches to AI Infrastructure
Different companies are taking varied approaches to addressing AI’s energy challenges. Meta’s Hyperion AI data center in Louisiana will consume electricity comparable to South Dakota’s entire power demand, requiring ten natural gas power plants generating a combined 7.5 gigawatts. This contradicts Meta’s climate pledges, as the plants will emit 12.4 million metric tons of CO2 annually – 50% more than Meta’s entire 2024 carbon footprint.
Meanwhile, French AI startup Mistral AI is taking an $830 million loan to build a data center near Paris with 13,800 Nvidia GPUs, aiming to strengthen Europe’s AI autonomy. The company plans to reach 200 megawatts of AI computing capacity in Europe by late 2027, with 60% of its revenue coming from European clients including ASML, TotalEnergies, and several European governments.
Technical Innovations and Efficiency Gains
Some companies are focusing on technical solutions to reduce AI’s energy footprint. Google has introduced TurboQuant, a real-time quantization technique designed to reduce AI memory usage by at least 6x. It compresses the key-value cache, a major memory bottleneck in large language models, potentially making AI more accessible by lowering inference costs.
However, experts note that efficiency improvements may lead to increased overall AI usage due to the Jevons paradox, where technological progress that increases efficiency leads to greater consumption rather than reduced demand. Vivek Arya of Merrill Lynch suggests that “the 6x improvement in memory efficiency [will] likely [lead] to 6x increase in accuracy (model size) and/or context length, rather than 6x decrease in memory.”
The Business Impact and Industry Response
The energy demands of AI are reshaping business strategies across the technology sector. AI startup Poolside faced setbacks when its $2 billion funding round anchored by Nvidia fell apart and its deal with CoreWeave collapsed. The company had planned a 2-gigawatt data center complex in Texas’s Permian Basin that CoreWeave would fill with Nvidia’s Blackwell AI chips, but the partnership ended due to missed deadlines.
Ben Hertz-Shargel, Global Head of Grid Edge at Wood Mackenzie, observes: “The AI ship has sailed, but the energy cost of serving it is very much in question.” This uncertainty is prompting companies to reconsider their infrastructure investments and explore alternative energy sources, including renewable options where feasible.
Balancing Innovation with Responsibility
The debate around Google’s gas-powered data center reflects broader questions about how the AI industry should address its environmental impact. While natural gas provides reliable power for data-intensive operations, it comes with significant carbon emissions. The plants supporting Google’s new facility will emit the yearly equivalent of putting more than 970,000 additional gas-powered cars on the road.
As companies navigate these challenges, they must balance the need for computational power with environmental responsibility. The industry faces pressure to develop more sustainable approaches while maintaining the rapid innovation that has characterized AI development in recent years. How companies address this energy dilemma will likely shape public perception and regulatory approaches to AI infrastructure in the coming years.

