Zuckerberg: Llama 4 needs ten times more computing power than its predecessor

Mark Zuckerberg assumes that the development of the next AI model will require ten times more computing power.

Artificial intelligence costs a lot of money, both to develop and to operate. When presenting the company's quarterly figures, Mark Zuckerberg said that he expects the computing power required for the upcoming model, Llama 4, to increase tenfold.

Meta publishes its large language models as open source. Zuckerberg expects this to create an entire ecosystem around its AI, and it also allows the company to sidestep regulation of the models to some extent. Llama 4, however, is expected to cost the company a great deal of money. Development costs, including salaries for employees and researchers, are one factor, but the main one is training: according to the head of Meta, it will require ten times more computing power than training Llama 3 did. Llama 3.1 is already a model with 405 billion parameters, twice as many as OpenAI's GPT-4o.
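
To get a feel for the orders of magnitude involved, a rough back-of-envelope sketch is possible using the common approximation that training compute is about 6 × parameters × training tokens (FLOPs). The short Python snippet below applies this rule of thumb to Meta's published figures for Llama 3.1 (405 billion parameters, roughly 15 trillion training tokens); the Llama 4 line simply multiplies the result by Zuckerberg's tenfold estimate and is not an official Meta calculation.

```python
# Back-of-envelope estimate of training compute using the common
# C ≈ 6 * N * D approximation (N = parameters, D = training tokens).
# Llama 3.1 figures are Meta's published numbers; the Llama 4 line
# just applies Zuckerberg's "ten times more compute" remark.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs via C ≈ 6 * N * D."""
    return 6 * params * tokens

llama31_params = 405e9   # 405 billion parameters
llama31_tokens = 15e12   # ~15 trillion training tokens

c_llama31 = training_flops(llama31_params, llama31_tokens)
print(f"Llama 3.1 405B: ~{c_llama31:.1e} FLOPs")  # ~3.6e25 FLOPs

# Zuckerberg's tenfold estimate, applied naively:
print(f"Llama 4 (10x):  ~{10 * c_llama31:.1e} FLOPs")
```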

Zuckerberg also expects future models to keep growing. At the same time, he conceded during the earnings presentation that it is difficult to predict how large language models will develop. "But at this point, given the long lead times for new inference projects, I'd rather take the risk of building capacity before it's needed," TechCrunch quotes Zuckerberg as saying. Meta's capital expenditure rose by around 33 percent to USD 8.5 billion in the second quarter of 2024; the money went into servers, data centers, and network infrastructure.

It is unclear whether ever more parameters and ever larger models will actually deliver better performance. Some AI experts believe that artificial general intelligence (AGI) can be created simply by scaling up; many scientists, however, are extremely critical of this view. Meta's head of AI research, Naila Murray, also said in an interview with heise online that she believes other types of AI models are needed to create agents and, ultimately, something like intelligence.

How high the costs of training large language models actually are remains largely unknown. Meta's CFO has stated that generative AI is not expected to contribute a large share of the company's revenue. Zuckerberg has also written in a blog post that Meta can only afford to develop AI and release it as open source because the company has other sources of income.

OpenAI, for example, is reportedly not generating enough revenue to cover its own costs; its business model currently rests on investments, such as those from Microsoft. Little is known, however, about the actual costs of development.

In the EU, providers of so-called General Purpose AI (GPAI) will have to disclose the energy consumption of their models in the future, covering both development, including training, and operation. The AI Act also stipulates that providers must develop their AI as resource-efficiently as possible.

(emw)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.