Bill Gates misrepresents the power consumption of AI data centres

Training AI costs a lot of energy. Bill Gates thinks it would be saved elsewhere. However, the calculation seems skewed.

Portrait of Bill Gates (Image: lev radin/Shutterstock.com)

This article was originally published in German and has been automatically translated.

Bill Gates is not very concerned about the energy requirements of data centers for training artificial intelligence (AI). This is despite the fact that hyperscalers such as Amazon, Google, Meta and Microsoft are currently massively expanding their computing capacity in order to train increasingly complex AI models.

This consumes not only considerable manufacturing resources, with the associated environmental damage, but also large amounts of electrical energy. Even the best greenwashing cannot conceal this fact.

A commentary by Mark Mantel

Mark Mantel has been an editor at heise online and c't since 2019. He mainly covers online reporting on PC hardware.

At a London event organized by his energy company Breakthrough Energy, Gates said: "Let’s not go overboard on this. Datacentres are, in the most extreme case, a 6% addition [in energy demand] but probably only 2% to 2.5%. The question is, will AI accelerate a more than 6% reduction? And the answer is: certainly."

Gates also argued that the hyperscalers are financing a significant share of the expansion of green power generation. In some locations, however, electricity is already scarce, which is why operators are tapping into any source of power they can get.

The International Energy Agency (IEA) estimates that all data centers worldwide together consumed more than 500 terawatt hours (TWh) of electrical energy in 2023. This figure includes "classic" servers and crypto miners, both of which are difficult to factor out. With linear growth, consumption would rise to a good 800 TWh per year by 2026. At the moment, however, the trend looks more like exponential growth, which would put the figure above 1000 TWh. The main driver is AI training.
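The two scenarios can be sketched as a quick back-of-the-envelope calculation. The annual increment and growth rate below are illustrative assumptions chosen only to reproduce the article's round numbers, not figures from the IEA report itself:

```python
# Rough sketch of the two growth scenarios described above.
# Assumed parameters: +100 TWh/year (linear) and ~26 %/year (exponential).

BASE_TWH = 500   # estimated worldwide data-center consumption in 2023 (IEA)
YEARS = 3        # projection horizon: 2023 -> 2026

# Linear scenario: a fixed increment each year
linear_2026 = BASE_TWH + 100 * YEARS          # 800 TWh

# Exponential scenario: constant relative growth
exponential_2026 = BASE_TWH * 1.26 ** YEARS   # just over 1000 TWh

print(f"linear:      {linear_2026} TWh")
print(f"exponential: {exponential_2026:.0f} TWh")
```

Even this crude sketch shows why the growth mode matters: over only three years, the exponential path already ends up roughly 200 TWh above the linear one.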

For comparison: According to estimates by the German Association of Energy and Water Industries (BDEW), German electricity demand in 2023 was around 467 TWh, including that of local industry.

Gates mentions savings made possible by AI algorithms. However, the current AI hype is not about energy savings at all; AI algorithms for such purposes already existed years ago. Google's Deepmind team announced in the summer of 2016, for example, that machine learning had allowed it to cut the power required to cool data centers by 40 percent. To do this, the AI model drew on more real-time data, such as weather conditions and load distribution, than conventional automated systems did.

The tech giants are currently training huge generative models that may eventually lead to superintelligence. Primarily, hyperscalers throw hardware at a problem in the hope of being able to solve it in the future. They accept the costs in order to be able to advertise with functions such as Windows Copilot. They are currently encouraging everyone to try out chatbots, image generators and other programs as much as possible – and to consume electricity with every request.

Microsoft is at the forefront of this, providing the computing capacity for OpenAI, for example. Although Gates himself no longer holds a position at Microsoft, he invests billions in the company through his Bill & Melinda Gates Foundation. At the end of March, the foundation owned 36,499,597 Microsoft shares according to a disclosure to the US Securities and Exchange Commission (SEC); this corresponds to 16.3 billion US dollars or just under 15.3 billion euros today.
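The filing figures quoted above can be cross-checked with simple arithmetic. The implied per-share price and exchange rate below are derived values, not figures stated in the source:

```python
# Sanity check of the SEC filing figures quoted above (article values).

shares = 36_499_597   # Microsoft shares held per the SEC disclosure
value_usd = 16.3e9    # stated value in US dollars
value_eur = 15.3e9    # stated value in euros

implied_price = value_usd / shares   # derived: roughly 447 USD per share
implied_fx = value_eur / value_usd   # derived: roughly 0.94 EUR per USD

print(f"implied share price:  {implied_price:.0f} USD")
print(f"implied EUR/USD rate: {implied_fx:.2f}")
```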

If Microsoft's AI strategy succeeds, Gates would therefore benefit massively through his foundation, so he may well be biased. But even if he believes in good conscience that AI will save more energy than it consumes, that assumption is, in view of the current reality, at best negligent.

(mma)