
AI tools like ChatGPT could have higher energy demands than entire countries by 2027

According to recent research published in the journal Joule, the energy consumption associated with artificial intelligence (AI) applications is projected to surpass the annual electricity requirements of nations like the Netherlands, Argentina, and Sweden by 2027.

This energy-intensive trend is driven by the widespread adoption of AI, particularly generative AI technologies like ChatGPT, which produce new content based on the data they were trained on. Use of such AI systems has surged since 2022.

Training these tools is an energy-intensive process because it requires feeding the models large amounts of data, said author Alex de Vries, a researcher at Vrije Universiteit Amsterdam, the Netherlands.

Hugging Face, a New York-based AI company, reported that training its text-generating AI tool consumed 433 megawatt-hours (MWh), enough to power 40 average American homes for a year, said de Vries, who also founded Digiconomist, a research company dedicated to exposing the unintended consequences of digital trends.
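The homes comparison is simple arithmetic; the sketch below checks it, assuming an average US household uses roughly 10,800 kWh of electricity per year (a ballpark figure in line with EIA averages, not a number from the study).

```python
# Rough check: does 433 MWh roughly equal powering 40 US homes for a year?
# Assumption (not from the study): an average US household uses ~10,800 kWh of electricity per year.
TRAINING_ENERGY_KWH = 433_000            # 433 MWh reported for training the model
AVG_US_HOME_KWH_PER_YEAR = 10_800        # assumed ballpark, roughly in line with EIA averages

homes_for_a_year = TRAINING_ENERGY_KWH / AVG_US_HOME_KWH_PER_YEAR
print(f"433 MWh is roughly {homes_for_a_year:.0f} average US homes for a year")  # ~40
```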

Beyond training, these AI tools also demand significant computing power, and thus energy, every time they generate text or an image in response to a prompt.

Running ChatGPT, for example, could consume 564 MWh of electricity a day, de Vries’s analysis showed.

De Vries also estimated that if Google, which currently processes up to 9 billion searches a day, were to power its search engine with AI, it would need about 30 terawatt-hours (TWh) of electricity a year, close to the annual electricity consumption of Ireland.
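To see what that estimate implies per query, the sketch below divides the roughly 30 TWh annual figure by the stated search volume; the comparison with a conventional search assumes Google's oft-cited figure of about 0.3 Wh per standard query, which is not part of de Vries's analysis.

```python
# What does ~30 TWh per year imply per search, given up to 9 billion searches a day?
SEARCHES_PER_DAY = 9e9                   # search volume cited in the article
ANNUAL_AI_SEARCH_TWH = 30                # de Vries's rough estimate for AI-powered search
STANDARD_SEARCH_WH = 0.3                 # assumed: Google's oft-cited ~0.3 Wh per conventional search

searches_per_year = SEARCHES_PER_DAY * 365
wh_per_ai_search = ANNUAL_AI_SEARCH_TWH * 1e12 / searches_per_year
print(f"~{wh_per_ai_search:.1f} Wh per AI-assisted search")                    # ~9.1 Wh
print(f"~{wh_per_ai_search / STANDARD_SEARCH_WH:.0f}x a conventional search")  # ~30x
```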

Google has already been incorporating generative AI into its email service and is testing AI-powered search.

By 2027, worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually, said de Vries, based on projections of AI-server production, which is expected to grow rapidly in the coming years.
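That 85 to 134 TWh range is what puts AI in the same league as the countries named at the top of this article; the comparison below uses approximate recent annual electricity consumption figures for those countries (my ballpark assumptions, not numbers from this article).

```python
# Where the projected 2027 range sits relative to whole-country electricity use.
# Country figures are approximate recent annual consumption values (assumed ballparks).
PROJECTED_AI_TWH_2027 = (85, 134)        # de Vries's projected annual range
countries_twh = {"Netherlands": 111, "Argentina": 122, "Sweden": 124}

low, high = PROJECTED_AI_TWH_2027
for country, twh in countries_twh.items():
    verdict = "inside" if low <= twh <= high else "outside"
    print(f"{country}: ~{twh} TWh/year -> {verdict} the 85-134 TWh range")
```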

And even as companies around the world work to make AI hardware and software more efficient and less energy-intensive, de Vries said these gains may simply increase demand for AI.

“The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it,” said de Vries in the commentary.

“Looking at the growing demand for AI service, it’s very likely that energy consumption related to AI will significantly increase in the coming years.

“The potential growth highlights that we must be mindful of what we use AI for.

“It’s energy-intensive, so we don’t want to put it in all kinds of things where we don’t need it,” said de Vries.
