Commentary on AI and climate protection: the tech industry in a defiant phase

When it comes to AI, tech companies behave like children at play. But Susanne Nolte wonders how this fits in with these companies' lofty climate goals.

(Image: child eating cookies, playing on a tablet and sticking his tongue out at the camera; Shutterstock/dotshock)


You might think it's all just a game to the IT industry. It behaves a bit like a toddler: it pounces unrestrained on every new toy, shouting "Here, here, here!" and "Me, me, me!", but when it's time to tidy up the room, it turns grumpy: "That wasn't me", "I'm tired now".

An opinion by Susanne Nolte

Susanne Nolte covers servers, data centers, storage and green IT.

Ask what the shiniest, most coveted toy of the moment is, and the answer is unanimous: AI, AI and more AI. AI in all colors and shapes: generative AI, symbolic and neural AI, AI as rational, visual or manipulative intelligence and for pattern prediction. And everywhere pure enthusiasm, an almost childlike delight in all the toys and possibilities of today, the possibilities of tomorrow, and the beautiful life those possibilities promise us for the day after tomorrow.

Haven't we forgotten something? Oh yes, AI costs money, lots of money. Escalating costs and unclear business value are among the reasons the latest Gartner report gives for abandoned GenAI projects. And what else? Oh yes, above all, AI costs energy, lots of energy, and, as Mark Zuckerberg has now explained to us, much more of it than previously thought.

Llama 4 is said to require ten times the computing power of its predecessor. That predecessor, Llama 3.1 with its 405 billion parameters, reportedly took 39.3 million GPU hours on Nvidia's H100 (80 GB) to train, which at a TDP of 700 W works out to a good 27.5 GWh and 11,390 tons of CO₂e. Unfortunately, Meta's figures cover only the GPUs. An older study on the 176-billion-parameter BLOOM model assumes a ratio of 1:2 between the GPUs and the rest of the server, which brings the total to a good 80 GWh and 34,000 tons of CO₂e. Add the same again for storage, network, cooling and losses in the power supply, and we arrive at a good 160 GWh, around three times what Germany's power plants produce in an hour, and 68,000 t of CO₂e.
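This back-of-envelope arithmetic can be reproduced in a few lines of Python. Note that the multipliers are the article's assumptions, not measured values: ×3 extrapolates from GPUs to the whole server (the 1:2 ratio from the BLOOM study), and ×2 adds storage, network, cooling and power-supply losses.

```python
# Rough estimate of Llama 3.1 training energy and emissions,
# following the article's assumptions.
GPU_HOURS = 39.3e6   # H100 GPU hours reported by Meta
TDP_KW = 0.7         # 700 W TDP per H100 (80 GB)

gpu_energy_gwh = GPU_HOURS * TDP_KW / 1e6     # kWh -> GWh
server_energy_gwh = gpu_energy_gwh * 3        # GPU : rest of server = 1:2
total_energy_gwh = server_energy_gwh * 2      # + storage, network, cooling, PSU losses

# CO2e is assumed to scale linearly with energy;
# Meta's 11,390 t figure covers the GPU share alone.
co2_gpu_t = 11_390
co2_total_t = co2_gpu_t * 3 * 2

print(f"GPUs alone:   {gpu_energy_gwh:.1f} GWh")    # ~27.5 GWh
print(f"Whole server: {server_energy_gwh:.1f} GWh") # ~82.5 GWh
print(f"Total:        {total_energy_gwh:.1f} GWh, ~{co2_total_t:,} t CO2e")
```

The result, roughly 165 GWh and 68,000 t of CO₂e, matches the figures in the text once rounded.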

Oh yes, and didn't we read in Google's environmental report that Scope 1 and 2, i.e. the operational energy consumption, account for only 25 percent of a cloud data center's footprint, while Scope 3, the upstream and downstream value chain, accounts for 75 percent? So the whole thing times 4: that would be 272,000 t of CO₂e. And with Llama 4, the whole thing times 10, i.e. 2.7 million tons of CO₂e.
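Carrying the estimate forward is just two more multiplications, applied here to the rough 68,000-ton operational figure. The factors come straight from the claims quoted above and are only as good as the assumptions stacked beneath them:

```python
co2_operational_t = 68_000  # rough Scope 1+2 estimate for Llama 3.1 training

# If Scope 1+2 is only 25% of the footprint, the full footprint is 4x.
co2_full_footprint_t = co2_operational_t * 4

# Llama 4 is said to need ten times the compute of its predecessor.
co2_llama4_t = co2_full_footprint_t * 10

print(f"Llama 3.1 incl. Scope 3: {co2_full_footprint_t:,} t CO2e")
print(f"Llama 4 projection:      {co2_llama4_t / 1e6:.2f} million t CO2e")
```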

Since this is by no means the end of the line for Zuckerberg, and he prefers to build up capacity before he needs it, he increased Meta's investments in servers, data centers and network infrastructure by around 33 percent to 8.5 billion US dollars in the second quarter of 2024 alone. It goes without saying that none of the environmental targets so loudly propagated by all the cloud players in recent years can be met this way. But then, climate protection was yesterday's toy anyway. It's about time someone tidied it away.

(vbr)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.