
Generative AI: energy consumption soars

Anne-Laure Ligozat
Professor of Computer Science at ENSIIE and LISN
Alex De Vries
PhD Student at the School of Business and Economics at the University of Amsterdam
Key takeaways
  • The energy consumption of artificial intelligence is skyrocketing with the boom in generative AI, although companies provide little data on the subject.
  • Interactions with AIs like ChatGPT could consume 10 times more electricity than a standard Google search, according to the International Energy Agency (IEA).
  • The increase in electricity consumption by data centres, cryptocurrencies and AI between 2022 and 2026 could be equivalent to the electricity consumption of Sweden or Germany.
  • AI’s carbon footprint is far from negligible: scientists estimate that training the BLOOM AI model emitted 10 times more greenhouse gases than a French person emits in a year.
  • Reducing the energy consumption of AI looks difficult, which makes promoting moderation in its use essential.

Artificial intelligence (AI) has found its way into a wide range of sectors: medicine, digital services, buildings, mobility, etc. Defined as “a computer’s ability to automate a task that would normally require human judgement1”, artificial intelligence has a cost: its large-scale deployment is generating growing energy requirements. The IT tasks needed to implement AI require user terminals (computers, telephones, etc.) and above all data centres. There are currently more than 8,000 of these around the world, 33% of which are in the United States, 16% in Europe and almost 10% in China, according to the International Energy Agency2 (IEA). Data centres, cryptocurrencies and artificial intelligence accounted for almost 2% of global electricity consumption in 2022, or 460 TWh. By comparison, French electricity consumption stood at 445 TWh in 20233.

AI electricity consumption: a lack of data?

How much of this electricity consumption is actually dedicated to AI? “We don’t know exactly,” replies Alex de Vries. “In an ideal case, we would use the data provided by the companies that use AI, in particular the GAFAMs, which are responsible for a large proportion of the demand.” In 2022, Google provided information on the subject for the first time4: “The percentage [of energy used] for machine learning has held steady over the past three years, representing less than 15% of Google’s total energy consumption.” However, in its latest environmental report5, the company provides no precise data on artificial intelligence. Only the total electricity consumption of its data centres is given: 24 TWh in 2023 (compared with 18.3 TWh in 2021).

In the absence of data provided by companies, the scientific community has been trying to estimate the electricity consumption of AI for several years. In 2019, an initial article6 caused a stir: “The development and training of new AI models are costly, both financially […] and environmentally, due to the carbon footprint associated with powering the equipment.” The team estimated that the carbon footprint of fully training BERT, a language model developed by Google, for a given task is roughly equivalent to that of a transatlantic flight. A few years later, Google scientists argued that these estimates overstate the real carbon footprint by 100 to 1,000 times. For his part, Alex de Vries has chosen to rely on sales of AI hardware7. NVIDIA dominates the AI server market, accounting for 95% of sales. Based on server sales and their power consumption, Alex de Vries projected an electricity consumption of 5.7 to 8.9 TWh for 2023, a modest figure compared with global data centre consumption (460 TWh).
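
To see how such a bottom-up estimate works in practice, here is a minimal back-of-the-envelope sketch in Python. The fleet size, per-server power draw and utilisation are illustrative assumptions, not figures taken from the article or from de Vries’s paper; they simply show how server sales and power consumption combine into an annual figure of the order quoted.

```python
# Back-of-the-envelope, bottom-up estimate of AI electricity use from hardware
# sales, in the spirit of the sales-based approach described above.
# Every numeric input is an illustrative assumption, not a figure from the article.

def annual_energy_twh(servers: int, power_kw: float, utilisation: float = 1.0,
                      hours_per_year: float = 8760.0) -> float:
    """Annual electricity use, in TWh, of a fleet of identical servers."""
    kwh = servers * power_kw * utilisation * hours_per_year
    return kwh / 1e9  # kWh -> TWh

# Hypothetical fleet: 100,000 AI servers drawing 6.5 to 10.2 kW each, running year-round.
low = annual_energy_twh(servers=100_000, power_kw=6.5)
high = annual_energy_twh(servers=100_000, power_kw=10.2)
print(f"{low:.1f} to {high:.1f} TWh per year")  # ~5.7 to 8.9 TWh: the order of magnitude quoted
```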

The tasks examined in the study and the average amount of carbon emissions they produce (in g of CO2eq) per 1,000 queries. N.B. the y-axis is on a logarithmic scale8.

The generative AI revolution

But these figures could skyrocket. Alex de Vries estimates that by 2027, if production capacity matches the companies’ promises, NVIDIA servers dedicated to AI could consume 85 to 134 TWh of electricity every year. The cause: the surge in the use of generative AI. ChatGPT, Bing Chat, Dall-E and the like, which generate text, images or even conversations, have spread across the sector at record speed. However, this type of AI requires a lot of computing resources and therefore consumes a lot of electricity. According to the IEA, interactions with AIs such as ChatGPT could consume 10 times more electricity than a standard Google search. If all Google searches (9 billion every day) were based on ChatGPT, an additional 10 TWh of electricity would be consumed every year. Alex de Vries estimates the increase at 29.3 TWh per year, as much as Ireland’s electricity consumption. “The steady rise in energy consumption, and therefore in the carbon footprint of artificial intelligence, is a well-known phenomenon,” comments Anne-Laure Ligozat. “AI models are becoming increasingly complex: the more parameters they include, the longer the equipment runs. And as machines become more and more powerful, this leads to increasingly complex models…” For its part, the International Energy Agency estimates that in 2026, the increase in electricity consumption by data centres, cryptocurrencies and AI could amount to between 160 and 590 TWh compared with 2022. This is equivalent to the electricity consumption of Sweden (low estimate) or Germany (high estimate).
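
As a rough cross-check of these two scenarios, the sketch below derives the extra electricity per query that each annual total implies. Only the 9-billion-searches-per-day figure and the 10 TWh and 29.3 TWh totals come from the article; the per-query values are back-calculated from them.

```python
# What do the two "all Google searches via ChatGPT" scenarios imply per query?
# Only the 9-billion-searches-per-day figure and the two annual totals come from
# the article; the per-query values below are simply derived from them.

SEARCHES_PER_DAY = 9e9
QUERIES_PER_YEAR = SEARCHES_PER_DAY * 365

for label, extra_twh in [("IEA scenario", 10.0), ("de Vries scenario", 29.3)]:
    extra_wh_per_query = extra_twh * 1e12 / QUERIES_PER_YEAR  # 1 TWh = 1e12 Wh
    print(f"{label}: ~{extra_wh_per_query:.1f} Wh of additional electricity per query")

# Prints roughly 3.0 Wh and 8.9 Wh per query. If a standard Google search is assumed
# to use about 0.3 Wh (a commonly cited but unofficial figure), this is consistent
# with the "10 times more electricity per interaction" claim above.
```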

Estimated electricity demand from traditional data centres, AI-dedicated data centres and cryptocurrencies, 2022 and 2026 (reference scenario)9. Note: electricity demand for data centres excludes consumption by data transmission networks.

The processing needs of AI arise in distinct phases. AI development involves an initial learning phase on large datasets, known as the training phase. Once the model is ready, it can be used on new data: this is the inference phase10. The training phase has long been the focus of scientific attention, as it is the most energy-intensive. But new AI models have changed all that, as Alex de Vries explains: “With the massive adoption of AI models like ChatGPT, everything has been reversed and the inference phase has become predominant.” Recent data provided by Meta and Google indicate that it accounts for 60–70% of energy consumption, compared with 20–40% for training11.

Carbon neutrality: mission impossible for AI?

While AI’s energy consumption is fraught with uncertainty, estimating its carbon footprint is a challenge for the scientific community. “We are able to assess the footprint linked to the dynamic consumption of training, and that linked to the manufacture of computer equipment, but it remains complicated to assess the total footprint linked to use. We don’t know the precise number of uses, or the proportion of use dedicated to AI on the terminals used by users,” stresses Anne-Laure Ligozat. “However, colleagues have just shown that the carbon footprint of user terminals is not negligible: it accounts for between 25% and 45% of the total carbon footprint of certain AI models.” Anne-Laure Ligozat and her team estimate that training the BLOOM AI model, an open-access model, emits around 50 tonnes of greenhouse gases, or 10 times more than the annual emissions of a French person. This makes it difficult for the tech giants to achieve their carbon neutrality targets, despite the many offsetting measures they have taken. Google admits in its latest environmental report: “Our [2023] emissions […] have increased by 37% compared to 2022, despite considerable efforts and progress in renewable energy. This is due to the electricity consumption of our data centres, which exceeds our capacity to develop renewable energy projects.”
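
As an illustration of how such a figure is typically assembled, the sketch below applies the standard conversion from training energy to CO2 equivalent (dynamic consumption multiplied by the data centre’s PUE and the grid’s carbon intensity, plus a share of embodied emissions). It is not the methodology of the BLOOM study itself, and every numeric input is a placeholder.

```python
# Generic conversion from training energy to a carbon footprint:
#   operational CO2eq = dynamic energy x data-centre PUE x grid carbon intensity,
# to which a share of the emissions embodied in manufacturing the hardware is added.
# All numeric inputs are placeholders, not the values used in the BLOOM study.

def training_footprint_tonnes(energy_mwh: float, pue: float,
                              grid_kgco2_per_kwh: float,
                              embodied_tonnes: float) -> float:
    """Rough training footprint in tonnes of CO2 equivalent."""
    operational_kg = energy_mwh * 1000 * pue * grid_kgco2_per_kwh  # MWh -> kWh -> kgCO2eq
    return operational_kg / 1000 + embodied_tonnes                 # kg -> tonnes

# Illustrative run: 400 MWh of dynamic consumption, a PUE of 1.2, a low-carbon grid
# (~0.06 kgCO2eq/kWh), plus 20 tonnes of embodied emissions attributed to the training.
print(f"{training_footprint_tonnes(400, 1.2, 0.06, 20):.0f} tCO2eq")  # ~49 t, the order of the BLOOM estimate
```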

Limiting global warming means drastically reducing global greenhouse gas emissions. Is AI at an impasse? “None of the arguments put forward by Google to reduce AI emissions hold water,” deplores Anne-Laure Ligozat. “Improving equipment requires new equipment to be manufactured, which in turn emits greenhouse gases. Optimising infrastructure, such as water cooling for data centres, shifts the problem to water resources. And relocating data centres to countries with a low-carbon electricity mix means being able to manage the additional electricity demand…” As for optimising the models themselves, while it does reduce their consumption, it also leads to increased use: the famous rebound effect. “This tends to cancel out any potential energy savings,” concludes Alex de Vries. “My main argument is that AI should be used sparingly.”

Anaïs Marechal
1. Stuart J. Russell, Peter Norvig, and Ernest Davis. Artificial Intelligence: A Modern Approach. Prentice Hall Series in Artificial Intelligence. Prentice Hall, Upper Saddle River, New Jersey, third edition, 2010.
2. IEA (2024), Electricity 2024, IEA, Paris, https://www.iea.org/reports/electricity-2024, Licence: CC BY 4.0.
3. Website consulted on 26 September 2024: https://analysesetdonnees.rte-france.com/bilan-electrique-2023/consommation#Consommationcorrigee
4. D. Patterson et al., “The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink,” in Computer, vol. 55, no. 7, pp. 18–28, July 2022, doi: 10.1109/MC.2022.3148714.
5. Google, 2024, Environmental Report.
6. Strubell et al. (2019), Energy and Policy Considerations for Deep Learning in NLP, arXiv.
7. De Vries, The growing energy footprint of artificial intelligence, Joule (2023), https://doi.org/10.1016/j.joule.2023.09.004
8. Source of the first graph: ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT ’24), June 3–6, 2024, Rio de Janeiro, Brazil.
9. Source of the second graph: IEA forecasts based on data and projections from Data Centres and Data Transmission Networks; Joule (2023), Alex de Vries, The growing energy footprint of artificial intelligence; Crypto Carbon Ratings Institute, Indices; Ireland, Central Statistics Office, Data Centres Metered Electricity Consumption 2022; and Danish Energy Agency, Denmark’s Energy and Climate Outlook 2018.
10. Adrien Berthelot, Eddy Caron, Mathilde Jay, Laurent Lefèvre, Estimating the environmental impact of Generative-AI services using an LCA-based methodology, Procedia CIRP, Volume 122, 2024, Pages 707–712, ISSN 2212-8271.
11. Website consulted on 25 September 2024: https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks
