Attack on Nvidia and AMD: Qualcomm Announces New AI Chips

US chip group Qualcomm is introducing new AI accelerators. With this move, the company is focusing on the AI data center market.

(Image: Sundry Photography/Shutterstock.com)

By Andreas Knobloch

US semiconductor manufacturer Qualcomm will launch its own AI accelerators, the company announced on Monday, realigning its portfolio in the process. Qualcomm is the world's largest provider of modem chips, which smartphones use to connect to wireless data networks. Amid the global AI boom, the chip maker is now also targeting large data centers for artificial intelligence (AI), positioning itself as a competitor to market leaders Nvidia and AMD.

Qualcomm announced the introduction of its "new generation of AI inference-optimized solutions for data centers." The AI accelerators AI200 and AI250 have been designed for higher memory capacity and the efficient execution of AI applications. According to the company's announcement, they are scheduled to launch next year and the year after, respectively, and will be available as a system that fills a complete, liquid-cooled server rack.

Qualcomm's data center chips are based on the AI components in its smartphone chips, known as Hexagon Neural Processing Units (NPUs). "We wanted to prove ourselves in other areas first, and once we had built up our strength there, it was quite easy for us to take the next step and enter the data center sector," said Durga Malladi, General Manager for Data Centers and Edge Computing at Qualcomm, as quoted by US news channel CNBC.

Qualcomm's announcement of its own AI chips marks the entry of a new competitor into this rapidly growing market. Global investments in AI chips have recently surged as cloud providers and chip groups rush to build infrastructure to meet the increasing demand for computing power in the race to develop sophisticated AI systems.

In mid-October, it became known that chip manufacturer AMD and software group Oracle are expanding their cooperation; Oracle offers cloud services running on AI chips from AMD. ChatGPT developer OpenAI, in turn, wants to develop custom AI chips together with semiconductor company Broadcom. At the beginning of the month, OpenAI also announced the purchase of AI chips with a total capacity of six gigawatts from AMD for several billion US dollars over a period of five years. And at the end of September, US chip manufacturer Nvidia announced it would invest a total of 100 billion US dollars in OpenAI to implement the "largest AI infrastructure project in history"; as part of that cooperation, the two companies intend to jointly build new data centers with a total capacity of at least ten gigawatts. Companies such as Google, Amazon, and Microsoft are likewise developing their own AI accelerators for their cloud services.

Qualcomm explained that its new chips focus on inference, i.e., running AI models, rather than on training large AI language models. According to CNBC, Malladi said the company will also sell the AI chips and other components separately, particularly to customers such as hyperscalers that prefer to design their own racks. Even other AI chip manufacturers such as Nvidia or AMD could become customers for some of Qualcomm's data center components, including its CPUs, Malladi added.

(akn)

This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.