Study: AI needs to be smaller, reduce energy footprint



PARIS: The potential of artificial intelligence is immense – but its equally vast energy consumption needs curbing, and asking shorter questions is one way to achieve this, said a UNESCO study unveiled on July 8.

A combination of shorter queries and more specific models could cut AI energy consumption by up to 90% without sacrificing performance, UNESCO said in a report published to mark the AI for Good global summit in Geneva.

OpenAI CEO Sam Altman recently revealed that each request sent to its popular generative AI app ChatGPT consumes 0.34 Wh of electricity on average, between 10 and 70 times the energy of a Google search.

With ChatGPT receiving around a billion requests per day, that amounts to 310 GWh annually, equivalent to the annual electricity consumption of three million people in Ethiopia, for example.

Moreover, UNESCO calculated that AI energy demand is doubling every 100 days as generative AI tools become embedded in everyday life.

"The exponential growth in computational power needed to run these models is placing increasing strain on global energy systems, water resources, and critical minerals, raising concerns about environmental sustainability, equitable access, and competition over limited resources," the UNESCO report warned.

However, UNESCO found it could achieve a nearly 90% reduction in electricity usage by reducing the length of a query, or prompt, and by using a smaller AI model, without a drop in performance.

Many AI models like ChatGPT are general-purpose models designed to respond on a wide variety of topics, meaning they must sift through an immense volume of information to formulate and evaluate responses.

The use of smaller, specialised AI models offers major reductions in electricity needed to produce a response.

So did cutting prompts from 300 to 150 words.
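The scale of these combined savings can be sketched with a back-of-envelope calculation. The per-word energy costs below are illustrative assumptions, not figures from the UNESCO report; only the prompt lengths (300 and 150 words) and the rough 90% outcome come from the article.

```python
# Back-of-envelope estimate of energy saved by halving prompt length and
# switching to a smaller model. Per-word costs are hypothetical.

def query_energy_wh(prompt_words: int, energy_per_word_wh: float) -> float:
    """Energy for one query, assuming cost scales with prompt length."""
    return prompt_words * energy_per_word_wh

# Assumption: a small specialised model uses a fifth of the energy
# per word of a large general-purpose one.
LARGE_MODEL_WH_PER_WORD = 0.001   # hypothetical
SMALL_MODEL_WH_PER_WORD = 0.0002  # hypothetical: 5x cheaper

baseline = query_energy_wh(300, LARGE_MODEL_WH_PER_WORD)   # long prompt, large model
optimised = query_energy_wh(150, SMALL_MODEL_WH_PER_WORD)  # short prompt, small model

saving = 1 - optimised / baseline
print(f"Estimated saving: {saving:.0%}")  # 90% under these assumptions
```

Under these illustrative numbers, halving the prompt and using a five-times-cheaper model multiply together to the roughly 90% reduction the report describes.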

Tech giants, already aware of the energy issue, now all offer miniature versions of their large language models with fewer parameters.

For example, Google sells Gemma, Microsoft has Phi-3, and OpenAI has GPT-4o mini. French AI companies have done likewise, for instance, Mistral AI has introduced its model Ministral. – AFP
