
Generative AI: energy consumption soars

Anne-Laure Ligozat
Professor of Computer Science at ENSIIE and LISN

Alex de Vries
PhD Student at the School of Business and Economics at the University of Amsterdam
Key takeaways
  • The energy consumption of artificial intelligence is skyrocketing with the boom in generative AI, although companies disclose little data on the subject.
  • Interactions with AIs such as ChatGPT could consume 10 times more electricity than a standard Google search, according to the International Energy Agency (IEA).
  • Between 2022 and 2026, the increase in electricity consumption by data centres, cryptocurrencies and AI could be equivalent to the entire electricity consumption of Sweden or Germany.
  • AI's carbon footprint is far from negligible: scientists estimate that training the BLOOM AI model emitted 10 times more greenhouse gases than a French person emits in a year.
  • Reducing the energy consumption of AI looks difficult, which makes promoting moderation in its use essential.

Artificial intelligence (AI) has found its way into a wide range of sectors: healthcare, digital services, buildings, mobility and more. Defined as “a computer’s ability to automate a task that would normally require human judgement1”, artificial intelligence has a cost: its large-scale deployment is generating growing energy requirements. The IT tasks needed to run AI rely on user terminals (computers, telephones, etc.) and, above all, on data centres. There are currently more than 8,000 of these around the world, 33% of which are in the United States, 16% in Europe and almost 10% in China, according to the International Energy Agency2 (IEA). Data centres, cryptocurrencies and artificial intelligence together accounted for almost 2% of global electricity consumption in 2022, or 460 TWh. By comparison, French electricity consumption stood at 445 TWh in 20233.
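As a quick consistency check on these orders of magnitude, the 460 TWh figure described as “almost 2%” of global electricity use implies a global total in the low twenty-thousands of TWh, which is indeed the right ballpark for 2022. The figures below are the article's, not additional IEA data:

```python
# Sanity check: what global electricity total does "460 TWh = almost 2%" imply?
data_centre_twh = 460   # data centres + crypto + AI in 2022, per the IEA
share = 0.02            # "almost 2%" of global electricity consumption

implied_global_twh = data_centre_twh / share
print(f"{implied_global_twh:,.0f} TWh")  # 23,000 TWh
```

That implied total of roughly 23,000 TWh is consistent with the ~25,000 TWh of global final electricity consumption generally reported for 2022, so the two numbers in the paragraph hang together.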

AI electricity consumption: a lack of data?

How much of this electricity consumption is actually dedicated to AI? “We don’t know exactly,” replies Alex de Vries. “In an ideal case, we would use the data provided by the companies that use AI, in particular the GAFAMs, which are responsible for a large proportion of the demand.” In 2022, Google provided information on the subject for the first time4: “The percentage [of energy used] for machine learning has held steady over the past three years, representing less than 15% of Google’s total energy consumption.” However, in its latest environmental report5, the company provides no precise data on artificial intelligence. Only the total electricity consumption of its data centres is given: 24 TWh in 2023 (compared with 18.3 TWh in 2021).

In the absence of data from companies, the scientific community has been trying to estimate the electricity consumption of AI for several years. In 2019, an initial article6 caused a stir: “The development and training of new AI models are costly, both financially […] and environmentally, due to the carbon footprint associated with powering the equipment.” The team estimated that the carbon footprint of fully training BERT, a language model developed by Google, for a given task is roughly equivalent to that of a transatlantic flight. A few years later, Google scientists argued that such estimates overstate the real carbon footprint by 100 to 1,000 times. For his part, Alex de Vries has chosen to rely on sales of AI hardware7. NVIDIA dominates the AI server market, accounting for 95% of sales. Based on server sales and their power consumption, de Vries projected AI electricity consumption of 5.7 to 8.9 TWh in 2023 – a low figure compared with global data centre consumption (460 TWh).
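The bottom-up logic of this hardware-based approach can be sketched in a few lines. The inputs below are illustrative round numbers in the spirit of de Vries' method, not his actual data: a fleet size of 100,000 servers and a 6.5 kW draw (roughly the rated power of an NVIDIA DGX A100-class system) are assumptions.

```python
# Illustrative bottom-up estimate: fleet size x power draw x running hours.
def annual_twh(servers: int, power_kw: float, utilization: float) -> float:
    """Electricity use in TWh/year for a fleet of AI servers."""
    hours_per_year = 8760
    kwh = servers * power_kw * utilization * hours_per_year
    return kwh / 1e9  # kWh -> TWh

# Assumed: 100,000 AI servers at 6.5 kW each, running flat out all year.
estimate = annual_twh(servers=100_000, power_kw=6.5, utilization=1.0)
print(f"{estimate:.1f} TWh/year")  # 5.7 TWh/year
```

With these assumed inputs the result lands at 5.7 TWh, the low end of the 5.7–8.9 TWh range quoted above; varying the fleet size and utilization spans the rest of the range.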

The tasks examined in our study and the average amount of carbon emissions they produce (in g of CO2eq) per 1,000 queries. N.B. The y-axis is on a logarithmic scale8.

The generative AI revolution

But these figures could skyrocket. Alex de Vries estimates that by 2027, if production capacity matches the companies’ promises, NVIDIA servers dedicated to AI could consume 85 to 134 TWh of electricity every year. The cause: the surge in the use of generative AI. ChatGPT, Bing Chat, DALL-E and the like – these types of artificial intelligence, which generate text, images or even conversations, have spread across the sector at record speed. Yet this type of AI requires a lot of computing resources and therefore consumes a lot of electricity. According to the IEA, interactions with AIs such as ChatGPT could consume 10 times more electricity than a standard Google search. If all Google searches – 9 billion every day – were based on ChatGPT, an additional 10 TWh of electricity would be consumed every year. Alex de Vries puts the increase at 29.3 TWh per year, as much as Ireland’s electricity consumption. “The steady rise in the energy consumption, and therefore in the carbon footprint, of artificial intelligence is a well-known phenomenon,” comments Anne-Laure Ligozat. “AI models are becoming increasingly complex: the more parameters they include, the longer the equipment runs. And as machines become more and more powerful, this leads to ever more complex models…” For its part, the International Energy Agency estimates that by 2026, the increase in electricity consumption by data centres, cryptocurrencies and AI could amount to between 160 and 590 TWh compared with 2022. This is equivalent to the electricity consumption of Sweden (low estimate) or Germany (high estimate).
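The search-scenario arithmetic above can be reproduced directly. The per-query figures used here are commonly cited estimates (roughly 0.3 Wh for a standard Google search and about ten times that for a generative-AI interaction), not official numbers from either company:

```python
# Back-of-the-envelope version of the "what if every search used ChatGPT" scenario.
searches_per_day = 9e9   # Google searches per day, as quoted in the article
google_wh = 0.3          # assumed Wh per standard search
chatgpt_wh = 3.0         # assumed Wh per generative-AI interaction (~10x more)

extra_wh_per_year = searches_per_day * (chatgpt_wh - google_wh) * 365
extra_twh = extra_wh_per_year / 1e12  # 1 TWh = 1e12 Wh
print(f"{extra_twh:.1f} TWh/year")  # 8.9 TWh/year
```

Under these assumptions the additional demand comes out just under 9 TWh per year, the same order as the IEA's 10 TWh figure; de Vries' higher 29.3 TWh estimate rests on a different, hardware-based calculation rather than per-query energy.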

Estimated electricity demand from traditional data centres, AI-dedicated data centres and cryptocurrencies, 2022 and 2026 (reference scenario)9. Note: data-centre electricity demand excludes consumption by data transmission networks.

AI’s computing needs can be broken down into distinct phases. Developing an AI model involves an initial learning phase based on databases, known as the training phase. Once the model is ready, it can be applied to new data: this is the inference phase10. The training phase long dominated scientific attention, as it was the most energy-intensive. But new AI models have changed all that, as Alex de Vries explains: “With the massive adoption of AI models like ChatGPT, everything has been reversed and the inference phase has become predominant.” Recent data from Meta and Google indicate that it accounts for 60–70% of energy consumption, compared with 20–40% for training11.

Carbon neutrality: mission impossible for AI?

While AI’s energy consumption is fraught with uncertainty, estimating its carbon footprint is a challenge for the scientific community. “We are able to assess the footprint linked to the dynamic consumption of training, and that linked to the manufacture of computer equipment, but it remains complicated to assess the total footprint linked to use. We don’t know the precise number of uses, or the proportion of use dedicated to AI on the terminals used by users,” stresses Anne-Laure Ligozat. “However, colleagues have just shown that the carbon footprint of user terminals is not negligible: it accounts for between 25% and 45% of the total carbon footprint of certain AI models.” Anne-Laure Ligozat and her team estimate that training the BLOOM AI model – an open-access model – emits around 50 tonnes of greenhouse gases, or 10 times more than the annual emissions of a French person. This makes it difficult for the tech giants to achieve their carbon neutrality targets, despite the many offsetting measures they have taken. Google admits in its latest environmental report: “Our [2023] emissions […] have increased by 37% compared to 2022, despite considerable efforts and progress in renewable energy. This is due to the electricity consumption of our data centres, which exceeds our capacity to develop renewable energy projects.”

Limiting global warming means drastically reducing global greenhouse gas emissions. Is AI at an impasse? “None of the arguments put forward by Google to reduce AI emissions hold water,” deplores Anne-Laure Ligozat. “Improving equipment requires new equipment to be manufactured, which in turn emits greenhouse gases. Optimising infrastructure – such as water cooling for data centres – shifts the problem to water resources. And relocating data centres to countries with a low-carbon electricity mix assumes those grids can absorb the additional electricity demand…” As for the optimisation of models, while it does reduce their consumption, it also leads to increased use – the famous ‘rebound’ effect. “This tends to cancel out any potential energy savings,” concludes Alex de Vries. “My main argument is that AI should be used sparingly.”

Anaïs Marechal
1. Stuart J. Russell, Peter Norvig and Ernest Davis, Artificial Intelligence: A Modern Approach, Prentice Hall Series in Artificial Intelligence, Prentice Hall, Upper Saddle River, New Jersey, third edition, 2010.
2. IEA (2024), Electricity 2024, IEA, Paris, https://www.iea.org/reports/electricity-2024, Licence: CC BY 4.0.
3. Website consulted on 26 September 2024: https://analysesetdonnees.rte-france.com/bilan-electrique-2023/consommation#Consommationcorrigee
4. D. Patterson et al., “The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink,” Computer, vol. 55, no. 7, pp. 18–28, July 2022, doi: 10.1109/MC.2022.3148714.
5. Google, 2024, Environmental Report.
6. Strubell et al. (2019), Energy and Policy Considerations for Deep Learning in NLP, arXiv.
7. De Vries, The growing energy footprint of artificial intelligence, Joule (2023), https://doi.org/10.1016/j.joule.2023.09.004
8. Source of first graph: ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT ’24), June 3–6, 2024, Rio de Janeiro, Brazil.
9. Source of second graph: IEA forecasts based on data and projections from Data Centres and Data Transmission Networks; Joule (2023) – Alex de Vries, The growing energy footprint of artificial intelligence; Crypto Carbon Ratings Institute, Indices; Ireland – Central Statistics Office, Data Centres Metered Electricity Consumption 2022; and Danish Energy Agency, Denmark’s Energy and Climate Outlook 2018.
10. Adrien Berthelot, Eddy Caron, Mathilde Jay, Laurent Lefèvre, Estimating the environmental impact of Generative-AI services using an LCA-based methodology, Procedia CIRP, Volume 122, 2024, Pages 707–712, ISSN 2212-8271.
11. Website consulted on 25 September 2024: https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks
