AI Now Consumes as Much Power as Airlines, and Eric Schmidt Says the Real Surge Is Yet to Come
AI is using the same amount of energy as the entire airline industry.
Former Google CEO Eric Schmidt has issued urgent warnings about the rapidly escalating energy demands of artificial intelligence (AI) and the consequent expansion of data centres over the next 10 to 15 years. In recent testimony before the U.S. House Energy and Commerce Committee, Schmidt projected that AI could consume up to 99% of the world's electricity if current growth trends continue unchecked. He emphasised the urgent need for increased energy production across all renewable and non-renewable sources to meet this growing demand.
Schmidt underscored the pressing need for a substantial expansion of energy infrastructure, estimating that data centres will require an additional 29 gigawatts of power capacity by 2027 and a further 67 gigawatts by 2030 to support the advancement of AI. He also emphasised that the energy requirements of AI data centres are a looming problem, with companies like Elon Musk's xAI actively seeking to secure power resources to sustain AI development.
Despite the real environmental implications, Schmidt argued that current climate goals should not hinder the evolution of AI. He highlighted AI's potential, with its ability to solve complex problems, as a key player in addressing climate-related issues. He stated, "We need energy in all forms, renewable, non-renewable, whatever. It needs to be there, and it needs to be there quickly."
Schmidt's comments have sparked debate, with some viewing his warnings as a strategy to influence lawmakers and secure favourable treatment for the AI industry, especially given his investments in AI startups and his role as chairman of the National Security Commission on Artificial Intelligence.
As AI continues to evolve and integrate into various sectors, the balance between technological advancement and sustainable energy consumption remains a critical issue for policymakers, industry leaders, and society.
The Hidden Cost of Progress: AI's Growing Energy Appetite
We live in a world where artificial intelligence is no longer a futuristic concept—it is deeply embedded in our daily lives. AI is everywhere, from generative models like ChatGPT to autonomous vehicles learning to navigate our roads to recommendation engines that predict what we'll buy, read, and watch.
And it's only just beginning. Behind the magic of AI lies an often-overlooked reality: the energy consumption it entails.
Every AI system depends on massive computational power. Each model we train, each query we run, and every product we personalise rely on data centres humming in the background, processing staggering amounts of information.
As a result, the energy demand is surging.
Today, data centres account for 1–2% of global electricity consumption, roughly equivalent to the consumption of the airline industry. But this figure is rapidly climbing. Analysts project that by 2030, data centres could consume 3–8% of the world's electricity, primarily driven by the rapid growth of AI systems.
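To make those percentage shares concrete, here is a quick back-of-envelope conversion into absolute terms. The roughly 27,000 TWh figure for total annual global electricity consumption is an assumed round number for illustration, not a statistic from this article:

```python
# Rough sanity check of the shares quoted above.
# Assumption: global electricity use is roughly 27,000 TWh per year.
GLOBAL_ELECTRICITY_TWH = 27_000

def share_to_twh(share: float) -> float:
    """Convert a fractional share of global electricity into TWh per year."""
    return GLOBAL_ELECTRICITY_TWH * share

# Today's estimated data-centre range: 1-2% of global electricity
today_low, today_high = share_to_twh(0.01), share_to_twh(0.02)

# Projected 2030 range: 3-8%
future_low, future_high = share_to_twh(0.03), share_to_twh(0.08)

print(f"Data centres today: {today_low:.0f}-{today_high:.0f} TWh/yr")
print(f"Data centres 2030:  {future_low:.0f}-{future_high:.0f} TWh/yr")
```

Even at the low end of the 2030 projection, the absolute demand roughly triples, which is why the infrastructure question below matters.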
The infrastructure we have today is not enough to support the next phase of AI growth.
The Ticking Clock: Energy Requirements for the AI Future
The next generation of AI will not simply mean smarter models; it will mean bigger, more complex, and more energy-hungry models:
Large language models (LLMs) like GPT-5, advanced AI models created to comprehend and generate human language, will need exponentially more computing power than their predecessors. Autonomous fleets (cars, drones, ships) will demand real-time data processing at the edge, requiring a vast expansion of micro-data centres.
Personalised AI agents, such as virtual assistants or personalised recommendation systems, which could be available to every person and company, will add billions of new workloads to the cloud.
We may need thousands of new data centres globally, each the size of several football fields, packed with servers that must run 24/7 to meet these demands.
And here's the catch: training just one large AI model today can consume as much energy as a small town uses in a year.
If we continue on our current trajectory without major innovation, AI's energy footprint could become one of the most significant sustainability challenges of the 21st century.
The High Stakes of AI's Energy Crisis
As AI systems continue to expand into every corner of business, society, and daily life, an urgent truth is emerging: the future of AI is inextricably linked to the future of energy.
If we don't act now, AI's growing energy demands could create a cascade of new problems, straining not just the environment but our entire global infrastructure.
Here's what's at stake:
1. Energy Infrastructure Strain
AI doesn't run on magic; it runs on millions of servers.
Those servers require constant, stable, 24/7 power to function.
The challenge? Our current electrical grids were built for a different era that didn't imagine every home, city, and business leaning on continuous cloud computing and AI inference at scale. AI inference refers to using a trained AI model to make predictions or decisions based on new data.
Without massive upgrades, we risk frequent outages, unstable grids, and skyrocketing electricity costs. As AI continues to grow, our infrastructure becomes increasingly vulnerable.
2. Environmental Costs
Training and running AI models require a significant amount of energy, much of which is derived from fossil fuels.
If renewable energy sources don't scale just as fast, or faster, carbon emissions will climb dramatically, undercutting global climate goals.
The same AI innovations we hope will help combat climate change could exacerbate it unless the sector adopts an entirely green energy transition.
3. Global Inequality
The benefits of AI, such as more competent healthcare, improved logistics, and increased productivity, will primarily be reaped by wealthy nations that can afford to build the necessary data centres and infrastructure.
Meanwhile, environmental costs, rising temperatures, extreme weather, and resource scarcity are global issues.
Developing nations, which contributed least to the AI-driven energy boom, will likely bear the heaviest burden.
This growing imbalance risks deepening global inequality at a time when cohesion and cooperation are more critical than ever.
4. Regulatory Risk
Governments are paying attention.
If AI energy consumption becomes a political flashpoint, public backlash over emissions, energy prices, or equity could trigger regulatory intervention.
We could see limits on data centre expansion, mandatory carbon offset requirements, or even caps on AI training runs.
This could spell massive disruption for AI companies betting their future on limitless computing growth.
The energy crisis tied to AI isn't a distant worry; it's unfolding quietly and rapidly. Addressing it will require immediate and decisive action.
Those who invest in solving it, through clean energy, efficient design, and more intelligent infrastructure, will not only future-proof their businesses but also help define the next era of sustainable computing.
The Big Numbers Behind AI's Energy Explosion
If the concerns surrounding energy and AI seem abstract, let's ground them in reality.
The numbers tell a story that is too big and urgent to ignore.
1. Global Data Centre Power Demand Is Projected to Double by 2030
Today, data centres consume about 1–2% of global electricity.
However, as AI continues to scale, analysts forecast that this figure could double by 2030, putting data centres on track to consume as much power as some of the world's largest industrial sectors.
The race to build new data centres is underway, with tech giants like Microsoft, Google, Amazon, and others announcing multibillion-dollar investments.
The question is: Can our power grids and our planet keep up?
2. Training a Single AI Model Can Emit as Much CO₂ as 5+ Cars Over Their Lifetime
Training today's large language models (LLMs) like GPT-4 is energy-intensive.
Studies estimate that a training run can produce over 284,000 kilograms of CO₂ emissions, equivalent to the lifetime emissions of more than five average cars.
This doesn't even account for the energy required to deploy and run these models once they're in production, serving millions of queries 24/7 worldwide.
The more ambitious our AI becomes, the higher the environmental toll will be unless innovation and clean energy scale alongside it.
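The "five cars" comparison above can be checked with simple arithmetic. The per-car lifetime figure (about 57,000 kg of CO₂, fuel included) is an assumption drawn from commonly cited estimates, not a number stated in this article:

```python
# Back-of-envelope check of the "5+ cars" comparison.
TRAINING_RUN_CO2_KG = 284_000   # emissions quoted above for one large training run
CAR_LIFETIME_CO2_KG = 57_000    # assumed lifetime emissions of one average car

cars_equivalent = TRAINING_RUN_CO2_KG / CAR_LIFETIME_CO2_KG
print(f"One training run is roughly {cars_equivalent:.1f} cars' lifetime emissions")
```

The ratio lands at roughly five cars per training run, consistent with the claim, and that is before counting the ongoing energy cost of serving the model in production.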
3. 80% of New Internet Data Is Processed Through Cloud Data Centres
We are firmly in the cloud-first era.
80% of all new internet data traffic, including videos, apps, AI queries, and IoT devices, is already processed and stored in massive cloud data centres.
This means the energy footprint of the cloud is no longer a niche concern.
It is now one of the major drivers of global energy demand growth, and AI is accelerating this trend.
The numbers are damning:
AI's future success is directly tied to solving its energy problem.
If we're serious about unlocking the next era of technological progress, we must be just as serious about building an energy system that is sustainable, equitable, and resilient.
Emerging Solutions: Building an AI Future Without Breaking the Planet
The good news? While the energy challenge of AI is real, so is human ingenuity. Across the tech ecosystem, innovative solutions are already emerging to make AI robust and sustainable.
1. AI for Energy Efficiency
AI itself could be part of the solution to its energy crisis.
Leading companies are training specialised AI models to monitor, predict, and optimise server workloads, dramatically reducing wasted energy. For example, companies have already achieved 10–30% energy savings in major data centres by using AI-driven systems to manage server cooling and dynamic workloads.
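As an illustration of the idea, not any company's actual system, here is a minimal sketch of one such optimisation: carbon-aware scheduling, where deferrable batch jobs are placed in the forecast hours with the lowest grid carbon intensity. All numbers below are hypothetical:

```python
# Illustrative sketch only: shift deferrable batch workloads to the
# hours with the cleanest forecast grid power.
from typing import List

def schedule_batches(carbon_forecast: List[float], n_jobs: int) -> List[int]:
    """Return the indices of the n_jobs hours with the lowest carbon intensity."""
    ranked = sorted(range(len(carbon_forecast)), key=lambda h: carbon_forecast[h])
    return sorted(ranked[:n_jobs])

# Hypothetical forecast grid carbon intensity (gCO2/kWh) for the next 8 hours
forecast = [450, 430, 300, 220, 210, 260, 380, 470]

# Place 3 deferrable training jobs in the greenest hours
print(schedule_batches(forecast, 3))  # -> [3, 4, 5]
```

Production systems are far more sophisticated, folding in cooling, hardware placement, and service-level constraints, but the core principle is the same: move flexible work to where and when energy is cheapest and cleanest.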
In a world where every watt counts, that's a game-changer.
2. Green Data Centres
Tech giants are leading the way toward carbon-neutral operations by investing heavily in green data centres powered by renewable energy:
Google achieved 100% renewable energy matching for all its global operations, including data centres, and is pushing toward operating on clean energy 24/7.
Microsoft has pledged to be carbon-negative by 2030, building new data centres that rely on wind, solar, and experimental geothermal technologies.
Smaller players are innovating as well, utilising underwater cooling, AI-optimised airflow, and locally generated solar grids.
These green infrastructure projects show it's possible to scale AI without sacrificing our planet, but only if we prioritise sustainable design.
3. Hardware Innovation
The future isn't just better algorithms; it's better hardware.
New generations of energy-efficient chips (like Google's TPU, Nvidia's AI-focused GPUs, and upcoming neuromorphic processors) are dramatically improving the performance-per-watt of AI computations.
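The "performance-per-watt" framing can be made concrete with a toy comparison. The chip names and numbers below are entirely hypothetical and chosen only to show the metric:

```python
# Hypothetical numbers purely for illustration: comparing accelerators
# by performance-per-watt rather than raw throughput.
chips = {
    # name: (throughput in TFLOPS, power draw in watts) -- made-up values
    "general_purpose_gpu": (100.0, 400.0),
    "ai_optimised_accelerator": (180.0, 300.0),
}

def perf_per_watt(tflops: float, watts: float) -> float:
    """Throughput delivered per watt of power consumed."""
    return tflops / watts

for name, (tflops, watts) in chips.items():
    print(f"{name}: {perf_per_watt(tflops, watts):.2f} TFLOPS/W")
```

In this made-up example the specialised chip is slower-sounding on paper per watt of headline power, yet delivers more than twice the useful work per unit of energy, which is the figure that actually drives a data centre's electricity bill.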
Meanwhile, modular data centre designs allow operators to quickly add or reconfigure hardware, making it easier to optimise for energy use, cooling needs, and local renewable energy availability.
In short, smarter hardware enables smarter energy choices.
What Is the Vision for the Future?
If we want AI to continue improving the world, investing in sustainable infrastructure is no longer optional—it's mission-critical.
Founders, tech leaders, policymakers, and investors must come together now to:
Incentivise the development of green technologies and energy innovations.
Set high standards for sustainable AI deployments.
Develop regulatory frameworks that foster, rather than hinder, responsible AI development.
Because the real breakthrough won't just be bigger, faster models.
It will be a future where technology, energy, and the environment coexist in harmony.
Our choices today will define the AI landscape for generations to come.