Generative AI could soon use more power than a country


AI requires the deployment of many dedicated servers, which consume a lot of energy. — AFP Relaxnews

A Dutch researcher has highlighted the enormous energy use associated with the new generation of tools powered by generative artificial intelligence. If adopted widely, these tools could eventually end up using as much energy as an entire country, or even several countries combined.

Alex de Vries, a PhD candidate at Vrije Universiteit Amsterdam, has published research in the journal Joule into the environmental impact of emerging technologies such as generative artificial intelligence.

The arrival, in less than a year, of tools such as ChatGPT (from OpenAI), Bing Chat (Microsoft) and Bard (Google), as well as Midjourney and others in the image sector, has greatly boosted demand for servers, and consequently the energy required to keep them running smoothly. This development inevitably raises concerns about the environmental impact of this technology, which is already being used by many people.

In recent years, excluding cryptocurrency mining, electricity consumption by data centers has been relatively stable, at around 1% of global electricity consumption. However, the expansion of AI, which is becoming indispensable in many fields, is likely to change the game.

According to Alex de Vries, training the GPT-3 language model alone consumed an estimated 1,287 MWh. After this training phase comes the inference phase, when the tool is put to work – in ChatGPT's case, generating responses to users' prompts.

At the start of the year, SemiAnalysis estimated that OpenAI needed 3,617 servers, with a total of 28,936 graphics processing units (GPUs), to support ChatGPT, which would correspond to an electricity demand of some 564 MWh per day.

And that, of course, is just the beginning. Again according to SemiAnalysis, implementing an AI similar to ChatGPT in every Google search would require 512,821 dedicated servers, or a total of over 4 million GPUs.

With a power demand of 6.5 kW per server, this would translate into a daily electricity consumption of 80 GWh and an annual consumption of 29.2 TWh (terawatt-hours, or one billion kilowatt-hours). In this most pessimistic scenario, AI deployed on a mass scale by Google could consume as much electricity as a country like Ireland (29.3 TWh per year).
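For readers who want to see how these numbers fit together, here is a rough back-of-the-envelope check in Python. It simply reproduces the SemiAnalysis figures quoted above, assuming every server draws a constant 6.5 kW around the clock; the server counts are those reported for ChatGPT today and for the hypothetical Google-scale deployment.

```python
# Back-of-the-envelope check of the SemiAnalysis estimates quoted above,
# assuming each server draws a constant 6.5 kW, 24 hours a day.

POWER_PER_SERVER_KW = 6.5
HOURS_PER_DAY = 24

def daily_mwh(servers: int) -> float:
    """Daily electricity use, in MWh, for a fleet of always-on servers."""
    return servers * POWER_PER_SERVER_KW * HOURS_PER_DAY / 1_000

# ChatGPT today: 3,617 servers -> roughly 564 MWh per day
print(f"ChatGPT estimate: {daily_mwh(3_617):,.0f} MWh/day")

# Hypothetical ChatGPT-like AI in every Google search: 512,821 servers
google_daily_gwh = daily_mwh(512_821) / 1_000        # ~80 GWh per day
google_annual_twh = google_daily_gwh * 365 / 1_000   # ~29.2 TWh per year
print(f"Google-scale estimate: {google_daily_gwh:,.0f} GWh/day, "
      f"{google_annual_twh:.1f} TWh/year (Ireland: 29.3 TWh/year)")
```

Run as written, this prints roughly 564 MWh per day for ChatGPT and 80 GWh per day, or 29.2 TWh per year, for the Google-scale scenario, matching the figures in the article.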

Alphabet has already confirmed that an interaction with a language model could consume up to 10 times more energy than a standard keyword search, rising from 0.3 Wh to around 3 Wh. Nvidia, the leading supplier of AI servers, could sell more than 1.5 million units by 2027, which together would consume between 85 and 134 TWh per year.

In conclusion, AI-related electricity consumption is fast becoming a major concern. Nevertheless, there are a number of ways that this can be reduced. The first would obviously be to prioritise renewable energy sources to power data centers.

Next comes the need to develop algorithms that consume less energy. Finally, Internet users could be educated to use AI responsibly, and without excess. – AFP Relaxnews
