Anthropic raises Claude usage limits for subscribers through SpaceX partnership

Usage limits for Claude subscribers are being raised substantially as Anthropic secures new AI computing capacity from SpaceX, possibly soon from space as well.

Anthropic on a smartphone, with Claude in the background.

(Image: Stockinq/Shutterstock.com)

Anthropic has entered into a partnership with SpaceX, securing a large share of the AI computing capacity of SpaceX's Colossus supercomputer. At the same time, and perhaps because of this, the AI company has dramatically raised the usage limits for subscribers to its AI models. Users of Claude Code receive significantly more usage time, and the previous restriction during peak times is being lifted. In addition, the API token limits for Claude Opus are being raised substantially.

The reason is likely the newly procured computing capacity. Upon its completion in autumn 2024, Colossus was considered possibly the fastest supercomputer in the world and was initially used to train Grok, xAI's AI model. Now, however, Anthropic writes that it has secured “all of the compute capacity” of this data center. That corresponds to a power draw of around 300 megawatts, based on more than 220,000 Nvidia GPUs.

The additional computing capacity is meant to benefit Claude Pro and Max subscribers. According to Anthropic, the previous plan limits, which apply in five-hour windows, are being doubled for Pro, Max, Team, and Enterprise subscriptions of Claude Code. Pro and Max subscribers of this AI agent for programming tasks are also exempt from the peak-time restriction. These upgrades are available immediately.

For Claude Opus, which was released in version 4.7 in mid-April and is said to follow instructions substantially better, the token limits for API usage have also been raised dramatically, effective immediately. In the lowest tier (Tier 1), input tokens have increased more than sixteenfold (from 30,000 to 500,000 per minute) and output tokens exactly tenfold (from 8,000 to 80,000 per minute). In the highest tier (Tier 4), the limits are now 10 million instead of 2 million input tokens per minute and 800,000 instead of 400,000 output tokens per minute.
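As a rough illustration of what such per-minute token budgets mean in practice, the following Python sketch estimates how many average-sized requests fit into one minute. The tier figures are the numbers reported above; the helper and its parameter names are hypothetical, not part of any official SDK, and actual limits should be checked against Anthropic's rate-limit documentation.

```python
from dataclasses import dataclass

@dataclass
class TierLimits:
    """Per-minute token budgets for one API tier (illustrative values)."""
    input_tpm: int   # input tokens allowed per minute
    output_tpm: int  # output tokens allowed per minute

# Figures as reported in the article; treat as examples, not guarantees.
TIER_1 = TierLimits(input_tpm=500_000, output_tpm=80_000)
TIER_4 = TierLimits(input_tpm=10_000_000, output_tpm=800_000)

def max_requests_per_minute(limits: TierLimits,
                            avg_input_tokens: int,
                            avg_output_tokens: int) -> int:
    """How many average-sized requests fit into one minute's budget.

    The binding constraint is whichever budget (input or output)
    is exhausted first.
    """
    by_input = limits.input_tpm // avg_input_tokens
    by_output = limits.output_tpm // avg_output_tokens
    return min(by_input, by_output)

# Example: requests averaging 4,000 input and 1,000 output tokens.
print(max_requests_per_minute(TIER_1, 4_000, 1_000))  # output budget binds
print(max_requests_per_minute(TIER_4, 4_000, 1_000))
```

For this workload the output budget is the bottleneck in both tiers: Tier 1 allows 80 such requests per minute, Tier 4 allows 800.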

Anthropic is also considering using data centers in space for its AI models in the future. “We have also expressed interest in partnering with SpaceX to develop multiple gigawatts of orbital AI compute capacity,” the company states. There is no timeline yet, but when SpaceX acquired xAI, including Grok and X, in February 2026, CEO Elon Musk said that Earth orbit would become the cheapest location for AI data centers within a few years thanks to solar energy.

(fds)

This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.