
A single ChatGPT query is estimated to use ten times more energy than a Google search. Sound trivial? Multiply that by billions of queries per day. Add the energy for training the models. Factor in data center cooling. Suddenly we're talking about the electricity consumption of entire countries.
The AI revolution has an energy problem. And it's growing faster than the models themselves.
1,050 terawatt-hours — that's how much electricity data centers, driven largely by AI, are projected to consume worldwide by 2026. That's more than Japan's total annual electricity consumption, and nearly as much as Germany, France, and Spain combined.
The trajectory is staggering: US data centers consumed about 183 terawatt-hours in 2023; projections for 2030 put the figure at 426 TWh — more than doubling in seven years, with AI workloads driving most of the growth.
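The implied growth rate is easy to check from the two figures above:

```python
# Implied compound annual growth rate of US data center electricity use,
# from the article's 2023 and 2030 figures.
twh_2023 = 183
twh_2030 = 426
years = 7

growth_factor = twh_2030 / twh_2023
cagr = growth_factor ** (1 / years) - 1

print(f"Growth factor: {growth_factor:.2f}x")
print(f"Implied CAGR: {cagr:.1%}")
```

A roughly 13 percent compound annual growth rate — sustained for seven years — is what turns 183 TWh into 426 TWh.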
And it's not just electricity. The AI industry's carbon footprint in 2025 is estimated at 32.6 to 79.7 million metric tons of CO₂ — comparable to the annual emissions of Belgium or the Czech Republic. Water consumption for cooling: 312 to 764 billion liters annually, roughly Denmark's annual water consumption.
To put it in perspective: the electricity needed to train a single large language model like GPT-4 equals the annual consumption of roughly 1,000 US households. And that's just training — daily use by millions of users consumes multiples of that.
AI's energy consumption splits into two phases: training and inference (actual usage).
Training is the more energy-intensive process per run: thousands of high-performance GPUs work for weeks or months to optimize a model on billions of data points. GPT-4 was reportedly trained on over 20,000 Nvidia A100 GPUs — each consuming up to 400 watts. Add cooling, network infrastructure, and redundancy, and you understand why a single training run can cost millions.
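The order of magnitude can be sanity-checked with back-of-envelope arithmetic. A minimal sketch — the 90-day run length and the PUE of 1.2 (power usage effectiveness, covering cooling and overhead) are illustrative assumptions, not reported figures for GPT-4:

```python
# Back-of-envelope energy estimate for a large training run.
# Assumptions: 90-day duration and PUE of 1.2 are illustrative only.
num_gpus = 20_000
watts_per_gpu = 400           # A100 SXM TDP, per the text
days = 90                     # assumed run length
pue = 1.2                     # assumed cooling/infrastructure overhead

it_power_mw = num_gpus * watts_per_gpu / 1e6        # GPU load in megawatts
energy_mwh = it_power_mw * pue * days * 24          # total facility energy

households = energy_mwh / 10.5                      # ~10.5 MWh per US household/year

print(f"GPU load: {it_power_mw:.1f} MW")
print(f"Training energy: {energy_mwh:,.0f} MWh")
print(f"≈ {households:,.0f} household-years of electricity")
```

Under these assumptions the run lands in the tens of gigawatt-hours — the same order of magnitude as the household comparison above; the exact figure swings with run length, utilization, and overhead.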
Inference — each individual user query — consumes less energy per operation. But through the sheer volume of queries, total inference consumption far exceeds training. ChatGPT alone processes an estimated 100 million queries daily. Each query activates billions of parameters.
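The same arithmetic shows why inference dominates. A sketch using rough figures from the text — roughly 3 Wh per query (ten times a ~0.3 Wh Google search) is an estimate, not a measured value:

```python
# Aggregate inference energy at the article's rough figures.
wh_per_query = 3.0             # estimate: ~10x a ~0.3 Wh Google search
queries_per_day = 100_000_000  # estimated daily ChatGPT volume

daily_mwh = wh_per_query * queries_per_day / 1e6    # Wh -> MWh
yearly_gwh = daily_mwh * 365 / 1000                 # MWh -> GWh

print(f"Daily inference energy: {daily_mwh:,.0f} MWh")
print(f"Yearly inference energy: {yearly_gwh:,.1f} GWh")
```

At these assumptions, a year of serving queries consumes on the order of 100 GWh — several times a single multi-week training run. And that's one service among many.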
Then there's a physical problem: cooling. GPU clusters generate enormous heat. Traditional air cooling is no longer sufficient — more and more data centers are adopting liquid cooling or being built in cold regions. But cooling also consumes energy. It's a vicious cycle.
In January 2025, Donald Trump and Sam Altman announced the Stargate Project: $500 billion for ten new AI data centers in the US. Each site is planned to draw roughly 5 gigawatts — the output of about five large nuclear reactors. Per data center.
In the same year, Google announced $75 billion in AI infrastructure investment. Microsoft and Meta followed with investments of similar magnitude. In total, over $300 billion flowed into new data centers in 2025 — primarily for AI.
The paradox: the same companies that publicly committed to carbon neutrality are building the most energy-intensive infrastructure in human history. Google had to admit in 2024 that its greenhouse gas emissions had risen 48 percent since 2019 — despite billions invested in renewable energy. The reason: AI workloads are growing faster than energy efficiency.
The companies point to solar and wind investments, long-term Power Purchase Agreements, fusion research. These are real efforts. But the honest answer is: currently, AI energy consumption is growing faster than renewables can keep up.
The costs of AI infrastructure aren't distributed evenly. Certain regions bear a disproportionate burden.
Ireland is an extreme example: data centers could consume up to 32 percent of the country's entire electricity by 2026. In a country already struggling with grid constraints, the government has imposed moratoriums on new data centers in the Dublin region.
In the US, grid operator PJM Interconnection — responsible for 65 million people in 13 states — projects a 6-gigawatt shortfall by 2027. This has concrete consequences: in Maryland and Ohio, residential electricity bills are rising by $16 to $18 per month. Real people, paying for the AI industry's appetite for energy.
In Germany, too, the construction of new data centers — particularly in Frankfurt, Europe's largest data center hub — is increasingly controversial. The city has frozen development plans and is discussing upper limits. The question of who bears the costs for grid expansion and energy supply remains unresolved.
The good news: there are counter-strategies. And some of them are not just more environmentally friendly, but also better.
Smaller, more efficient models demonstrate that bigger doesn't always mean better. Meta's Llama 3.2 achieves results with 1 billion parameters that earlier models needed 10 billion for. Mistral and other European AI companies deliberately focus on lean architectures that require less energy.
Model distillation compresses the knowledge of large models into smaller, more efficient versions. A distilled model can deliver 90 percent of the performance at 10 percent of the compute cost — that's not just an environmental argument, it's a hard-nosed business case.
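That business case can be made concrete. A sketch with hypothetical numbers — query volume and per-query cost below are purely illustrative, not vendor figures:

```python
# Hypothetical monthly cost comparison: full model vs. distilled model,
# using the rule of thumb above (~90% of performance at ~10% of compute).
queries_per_month = 10_000_000
cost_per_query_full = 0.002        # USD per query, illustrative
distilled_compute_fraction = 0.10  # distilled model's share of compute cost

full_cost = queries_per_month * cost_per_query_full
distilled_cost = full_cost * distilled_compute_fraction

print(f"Full model:      ${full_cost:,.0f}/month")
print(f"Distilled model: ${distilled_cost:,.0f}/month")
print(f"Savings:         ${full_cost - distilled_cost:,.0f}/month")
```

A 90 percent cost reduction at scale is the kind of number that moves budgets — the environmental benefit comes along for free.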
Renewable energy for data centers is a growth area. Microsoft is investing in nuclear energy, Google in geothermal. But an honest assessment shows: these investments currently don't fully offset increased consumption. Both are needed — more efficient models and cleaner energy.
There's a strong argument for European AI in this context: smaller, focused models optimized for specific tasks consume a fraction of the energy of general-purpose giants like GPT-4. An AI text analysis tool that does one thing really well is not just more precise — it's also more sustainable.
You don't have to be an AI developer to make a difference. As a user of AI tools, your choices have impact: reaching for a lean, specialized model instead of a general-purpose giant, and skipping AI entirely where a conventional tool solves the problem just as well.
The AI industry faces an inconvenient truth: its most impressive innovations carry an ecological price tag that most marketing departments would rather not mention. But silence doesn't change the numbers.
The solution isn't to abolish AI — that would be neither realistic nor sensible. The solution lies in efficiency. In models that achieve more with less. In data centers powered by renewable energy. In companies that consciously decide when AI makes sense — and when it doesn't.
The most sustainable AI is the one that solves the problem with the fewest resources. That's not a compromise. It's the definition of intelligence.
deepsight uses lean, specialized models for text analysis — efficient and hosted in Europe. Learn more.
