Nvidia competition: Google wants to expand its TPU business significantly
Google and Meta have reportedly already concluded a multi-billion dollar deal for AI chips. New joint ventures could also lease TPUs.
Ironwood is the codename for Google's TPU v7.
(Image: Google)
Google has apparently landed a big fish in Meta, which wants to buy Tensor Processing Units (TPUs) and deploy them in its own data centers. TPUs are Google's AI accelerators, serving as an alternative to GPUs from Nvidia and AMD, among others. The current version is TPU v7.
According to The Information, the agreement between Google and Meta runs for several years. Compared with the millions of accelerators that Meta is buying from AMD and Nvidia for tens of billions of US dollars, the Google deal involves significantly smaller sums; several billion US dollars are mentioned. The price per accelerator is also likely lower than the competition's.
This gives Google a foot in the door as a hardware supplier. The company allegedly aims to capture ten percent of Nvidia's market in the future. Based on last year's figures, that would have been around 16 billion US dollars for AI accelerators alone: Nvidia generated 162.4 billion US dollars in annual revenue with such products, plus another 31.4 billion with networking hardware, and the trend is still upward.
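The ten-percent figure can be checked with a quick back-of-the-envelope calculation using the revenue numbers quoted above:

```python
# Back-of-the-envelope check of the market-share figure from the article.
nvidia_accelerator_revenue_b = 162.4  # Nvidia's annual AI-accelerator revenue, billions USD
target_share = 0.10                   # share Google allegedly wants to capture

target_revenue_b = nvidia_accelerator_revenue_b * target_share
print(f"10% of Nvidia's accelerator revenue: ~{target_revenue_b:.1f} billion USD")
```

That works out to roughly 16.2 billion US dollars, matching the "around 16 billion" cited above.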
Google Joint Ventures for Wider Distribution
To drive the adoption of TPUs, Google reportedly wants to form a joint venture with an unnamed major investor. Through this, the company could lease the AI accelerators to other data center operators. Further joint ventures with other investors could follow. As an alternative to leasing, these subsidiaries could also operate entire data centers for customers.
Meanwhile, Google must balance marketing its own TPUs against its continued use of Nvidia hardware. At least for rentals within its own cloud, Google still relies on Nvidia's AI accelerators, because many customers work with Nvidia's software.
According to The Information, Nvidia CEO Jensen Huang is deliberately striking deals to tie promising companies to Nvidia, most recently Anthropic. Similar moves were evident earlier, for example Nvidia's investment in Groq, which preempted a deal between OpenAI and Groq. Huang is all too aware that some of the best AI models, such as Gemini and Claude, were trained on Google hardware.
TPU v7 as an Alternative
On paper, Google's TPU v7 is slower than Nvidia's Blackwell Ultra (B300): 4.6 FP8 petaflops versus 5 petaflops, or 10 with sparsity (skipping the zeros in matrices). Its memory capacity is also lower, at 192 instead of 288 GByte of HBM3e.
However, the TPU v7 is more economical, at an estimated 1000 instead of 1400 watts. Google relies on the modern 3-nanometer N3P process from chip contract manufacturer TSMC, while Nvidia uses 4NP, an improved 5-nanometer process. Efficiency is becoming the most important metric, as all hyperscalers are constrained by the power available to their data centers. Moreover, hyperscalers obviously do not want to become dependent on Nvidia as their sole supplier.
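Putting the quoted specs side by side makes the efficiency argument concrete. A small sketch using the figures from this article (note that both wattages are estimates, not official specifications):

```python
# Peak FP8 throughput per watt, based on the figures quoted in the article.
chips = {
    "Google TPU v7":       {"fp8_pflops": 4.6, "watts": 1000},  # estimated power
    "Nvidia B300 (dense)": {"fp8_pflops": 5.0, "watts": 1400},  # estimated power
}

for name, c in chips.items():
    tflops_per_watt = c["fp8_pflops"] * 1000 / c["watts"]  # petaflops -> teraflops
    print(f"{name}: {tflops_per_watt:.1f} FP8 TFLOPS per watt")
```

By this rough measure, the TPU v7 delivers about 4.6 FP8 teraflops per watt against roughly 3.6 for the B300 at dense throughput, which is why power-constrained hyperscalers care despite the lower peak performance.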
(mma)