Study: AI needs to be smaller, reduce energy footprint



PARIS: The potential of artificial intelligence is immense – but its equally vast energy consumption needs curbing, with asking shorter questions one way to achieve this, said a UNESCO study unveiled on July 8.

A combination of shorter queries and more specific models could cut AI energy consumption by up to 90% without sacrificing performance, UNESCO said in a report published to mark the AI for Good global summit in Geneva.

OpenAI CEO Sam Altman recently revealed that each request sent to its popular generative AI app ChatGPT consumes on average 0.34 Wh of electricity – between 10 and 70 times the energy of a Google search.

With ChatGPT receiving around a billion requests per day, that amounts to 310 GWh annually – equivalent, for example, to the annual electricity consumption of three million people in Ethiopia.

Moreover, UNESCO calculated that AI energy demand is doubling every 100 days as generative AI tools become embedded in everyday life.
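As a rough illustration of the arithmetic behind these figures, the sketch below multiplies a per-request energy figure by a daily request volume to get an annual total, and converts the "doubling every 100 days" claim into an implied annual growth factor. It is a back-of-envelope calculation only: the 0.34 Wh and one-billion-requests values are the figures quoted above, and holding them constant over a full year is a simplifying assumption, not the method behind the report's own estimate.

```python
# Back-of-envelope sketch only. Inputs are the figures quoted in this article,
# held constant over a year for simplicity; aggregate estimates such as the
# 310 GWh cited above rest on their own assumptions about request volume and
# per-query energy, so the results here are illustrative rather than definitive.

WH_PER_REQUEST = 0.34             # average energy per ChatGPT request (watt-hours)
REQUESTS_PER_DAY = 1_000_000_000  # "around a billion requests per day"
DOUBLING_PERIOD_DAYS = 100        # UNESCO: AI energy demand doubles every 100 days

# Annual energy in gigawatt-hours (1 GWh = 1e9 Wh)
annual_gwh = WH_PER_REQUEST * REQUESTS_PER_DAY * 365 / 1e9

# A quantity that doubles every 100 days grows by 2 ** (365 / 100), about 12.6x, in a year
annual_growth_factor = 2 ** (365 / DOUBLING_PERIOD_DAYS)

print(f"Annual energy at these inputs: ~{annual_gwh:.0f} GWh")
print(f"Implied growth: ~{annual_growth_factor:.1f}x per year")
```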

"The exponential growth in computational power needed to run these models is placing increasing strain on global energy systems, water resources, and critical minerals, raising concerns about environmental sustainability, equitable access, and competition over limited resources," the UNESCO report warned.

However, the study found it was possible to achieve a nearly 90% reduction in electricity usage by shortening the query, or prompt, and by using a smaller AI model, without a drop in performance.

Many AI models like ChatGPT are general-purpose models designed to respond on a wide variety of topics, meaning that they must sift through an immense volume of information to formulate and evaluate responses.

The use of smaller, specialised AI models offers major reductions in electricity needed to produce a response.

So did cutting prompts from 300 to 150 words.

Already aware of the energy issue, the tech giants all now offer miniature versions of their large language models with fewer parameters.

For example, Google offers Gemma, Microsoft has Phi-3, and OpenAI has GPT-4o mini. French AI companies have done likewise; Mistral AI, for instance, has introduced its Ministral model. – AFP
