Report: Meta Platforms tests self-developed chip for AI training

The AI accelerator developed by Meta itself is intended to reduce dependency on Nvidia and costs. It is planned for recommendations and generative AI.

(Image: Below the Sky/Shutterstock.com)


Meta Platforms began using AI processors developed in-house last year, but initially only as inference accelerators. Now the world's largest social-network operator is reportedly also testing its first self-developed AI chip for training artificial intelligence (AI). If the tests succeed, the chip's higher energy efficiency could cut costs and reduce Meta's dependence on suppliers such as Nvidia and its AI accelerators.

In April 2024, Meta presented its self-developed AI accelerator MTIA for Facebook and Instagram. As its name suggests, the “Meta Training and Inference Accelerator” (MTIA) is suited both to training AI models and to using trained models for decisions and predictions, the latter known as AI inference. Inference was apparently the initial focus; AI training on the MTIA chips is now being added.

As the news agency Reuters reports, Meta Platforms has started using a limited number of its AI accelerators for AI training. Reuters' unnamed sources give no details about the chips beyond saying they were produced by the Taiwanese contract manufacturer TSMC. Because the accelerators are specialized for AI tasks, they are said to be more energy-efficient than the graphics cards also used for AI.


This matches information from Meta itself: the company stated a power consumption of 90 watts per chip for the MTIA presented last year, which is manufactured at TSMC in a 5-nanometer process. Direct comparisons of AI power consumption are difficult, but AI accelerators such as Nvidia's H100 and AMD's MI300X are designed for 700 and 750 watts per card, respectively.

Neither Meta nor TSMC was willing to comment on the report. However, the operator of Facebook and Instagram had previously stated that MTIA chips power the recommendation systems of those platforms – based on already trained AI models – which automatically surface content that is (supposedly) relevant to the user. The more relevant the recommendations, the longer users stay on the platform, which in turn generates more advertising revenue.

According to earlier statements by Meta managers, the company also planned to use its own chips for AI training from 2026, again starting with recommendation models and later extending the self-developed accelerators to generative AI, such as Meta's own AI chatbot. Chris Cox, Meta's Chief Product Officer, confirmed this approach at a conference last week: “We are working on how we design training for recommendation systems and how we ultimately think about training and inference for generative AI.”

(fds)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.