
The global appetite for generative AI is skyrocketing, but so is its energy consumption.
A new report jointly released by UNESCO and University College London (UCL) warns that the energy demands of artificial intelligence, especially large language models (LLMs), have reached unsustainable levels.
The research estimates that a single prompt to a generative AI model consumes around 0.34 watt-hours. Scaled across daily use by more than a billion people, that adds up to roughly 310 gigawatt-hours a year, about as much electricity as more than 3 million people in a low-income country consume.
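As a back-of-envelope check on that scaling, the sketch below multiplies the per-prompt figure by an assumed global prompt volume; the 2.5 billion prompts per day (a billion-plus users sending a few prompts each) is an illustrative assumption chosen to show how the annual total arises, not a number from the report.

```python
# Back-of-envelope scaling of per-prompt energy to an annual total.
# The daily prompt volume is an illustrative assumption, not a report figure.
WH_PER_PROMPT = 0.34      # watt-hours per prompt (UNESCO/UCL estimate)
PROMPTS_PER_DAY = 2.5e9   # assumed global prompt volume per day
DAYS_PER_YEAR = 365

annual_wh = WH_PER_PROMPT * PROMPTS_PER_DAY * DAYS_PER_YEAR
annual_gwh = annual_wh / 1e9   # 1 GWh = 1e9 Wh

print(f"{annual_gwh:.0f} GWh per year")  # ~310 GWh under these assumptions
```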
“Generative AI’s annual energy footprint is already equivalent to that of a low-income country, and it is growing exponentially,” said Tawfik Jelassi, UNESCO’s assistant director-general for communication and information. “To make AI more sustainable, we need a paradigm shift in how we use it, and we must educate consumers about what they can do to reduce their environmental impact.”
Efficiency without compromise
The UCL research team ran a series of real-world experiments on different open-source LLMs. Their findings revealed three key strategies for cutting down AI's energy appetite:
- Smaller, task-specific models: Most users currently rely on large, general-purpose models for all tasks, from translation to summarization. However, the report shows that smaller models, focused on specific tasks, can reduce energy consumption by up to 90% without affecting accuracy. The study also highlights the "Mixture of Experts" model design, where only specialized modules are activated when needed, instead of powering an entire model every time (a toy routing sketch appears after this list).
- Concise prompts and responses: Trimming down how we communicate with AI matters, too. The report found that shorter interactions can cut energy use by more than 50%.
- Model compression techniques: The report noted that by applying methods such as quantization, researchers achieved energy savings of up to 44% with no compromise in performance (a minimal quantization sketch follows the list).
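The Mixture of Experts idea in the first bullet can be illustrated with a toy layer in which a gate routes each input to a single small expert, so only a fraction of the parameters do any work per prompt. The dimensions and top-1 routing below are illustrative assumptions, not the models the UCL team tested.

```python
# Toy Mixture-of-Experts layer: a gate picks one expert per input row,
# so only a fraction of the parameters run for each prompt.
# Sizes and top-1 routing are illustrative assumptions.
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    def __init__(self, dim=64, num_experts=4):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)  # routing scores
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):
        # Route each row to its single highest-scoring expert (top-1 routing).
        expert_ids = self.gate(x).argmax(dim=-1)
        out = torch.empty_like(x)
        for i, expert in enumerate(self.experts):
            mask = expert_ids == i
            if mask.any():
                out[mask] = expert(x[mask])  # only this expert does work
        return out

moe = TinyMoE()
print(moe(torch.randn(8, 64)).shape)  # torch.Size([8, 64])
```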
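The compression claim in the last bullet corresponds to standard post-training quantization. Below is a minimal sketch using PyTorch's dynamic quantization; the small feed-forward stand-in model is a hypothetical example, not one of the systems evaluated in the report.

```python
# Minimal post-training dynamic quantization sketch with PyTorch.
# The stand-in model is a hypothetical feed-forward block, not a report model.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(768, 3072),
    nn.ReLU(),
    nn.Linear(3072, 768),
)

# Store Linear weights as int8; activations are quantized on the fly at inference.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 768)
with torch.no_grad():
    print(quantized(x).shape)  # same interface, smaller weight footprint
```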
Beyond energy savings, smaller AI models may also help close the technology gap between high-income and low-income regions. Today, most of the infrastructure required to build and run AI tools is concentrated in wealthy countries, while developing nations lag far behind.
Smaller, more efficient models could help make this technology more accessible in regions with limited electricity, water, and connectivity.
UNESCO’s ethical AI mandate
This new report builds on the UNESCO Recommendation on the Ethics of AI, adopted by all 194 member states in 2021. That framework includes environmental safeguards and aims to ensure that AI is developed in ways that support sustainability, human rights, and digital equity.
As the organization puts it, the goal isn’t just to reduce energy use — it’s to unlock innovation that is “inclusive, rights-affirming, and environmentally aligned.”
While AI is often hailed as a key solution to global challenges, it now poses a challenge of its own: resource consumption at scale. If unchecked, the energy cost of AI could worsen climate change and deepen digital inequality, especially in countries already grappling with energy poverty.
“Embracing energy- and resource-efficient AI is key to ensuring that digital transformation advances in a way that is both inclusive and ecologically responsible,” the report states.