Cloud market: Gartner forecasts one trillion US dollars by 2027

The global cloud market is expected to reach one trillion US dollars by 2027. AI usage and digital sovereignty will shape the future.

[Image: server hardware with a cloud graphic (Image: heise medien)]

By Harald Weiss

Gartner sees the global cloud market continuing its upward trend. By 2027, it is expected to rise to one trillion US dollars. However, there will be massive changes due to increasing AI usage. By 2030, AI is expected to be integrated into over 90 percent of cloud strategies, compared to less than ten percent today.

Gartner analyst Milind Govekar sees two developments here: "There will be two different investment models for cloud providers. One focuses on a highly vertically integrated stack, as with Google, from CPUs to the AI layer. The other is modular and partner-oriented." As an example of the latter, he points to the cooperation between Microsoft and OpenAI, which covers a wide range of AI functions, from GPUs and CPUs to AI features. These different models have far-reaching consequences for using AI from the cloud. The vertically integrated stack offers high cost efficiency, good performance optimization, and control over the entire technology stack – control that, however, rests with the provider. The modular, partner-oriented approach, on the other hand, offers more flexibility, shortens time to market, and enables more innovation.

Govekar also addressed what kind of innovations these are: "Essentially, these are packaged components with industry-specific solutions," is his assessment. Consequently, he expects a significant increase in cloud services expanding into traditional industries, pointing to AWS's banking service in the US and Google's healthcare services as examples.

All of this is based on three extensive resources: first, a complex core infrastructure of network, computing power, storage, AI models, and the corresponding support software, including a powerful data infrastructure that can capture huge amounts of data from various areas and make them usable for training and inference. Second, composition capabilities built on cloud-native technologies such as containers, which are used to develop and deploy AI agents. And third, the ability to quickly create packaged business capabilities (PBCs), establish marketplaces and industry clouds, and provide industry-specific functions that other providers and users can combine and extend into services of their own.

Many cloud providers are investing heavily in these areas. Govekar sees automotive cloud solutions in Germany as an example of this. This development will also be reflected in the market: "We expect the share of industry-specific AI systems to support critical business objectives to increase to about 80 percent by 2030 – from currently less than ten percent," he predicts. Govekar also believes that these AI systems will make autonomous decisions and orchestrate tasks. The scalability of computing resources in the cloud proves to be extremely useful here, as large amounts of company-wide data need to be captured and made available for autonomous decisions.

However, these developments also bring a number of problems – for example, with cloud costs: "Most agent-based AI workloads are deployed using containers. I have often seen these containers being over-provisioned by up to 70 percent – which costs a lot of money," Govekar reports from his customer contacts. His conclusion: "Companies that do not optimize their underlying AI computing environment will pay 50 percent more by 2030 than their competitors."
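The scale of that waste is easy to estimate with back-of-the-envelope arithmetic. The sketch below is illustrative only; the core price and the fleet sizes are assumptions, not Gartner figures – only the 70 percent over-provisioning ratio comes from the article:

```python
# Back-of-the-envelope estimate of container over-provisioning cost.
# The price per core-hour and fleet sizes are illustrative assumptions.

def overprovisioning_waste(requested_cores: float, used_cores: float,
                           price_per_core_hour: float, hours: float) -> float:
    """Cost of CPU capacity that is reserved but never used in a billing period."""
    idle_cores = max(requested_cores - used_cores, 0.0)
    return idle_cores * price_per_core_hour * hours

# Example: a fleet requests 1000 cores but actually uses 300,
# i.e. it is over-provisioned by 70 percent.
requested, used = 1000.0, 300.0
price = 0.04   # assumed USD per core-hour
month = 730.0  # hours per month

waste = overprovisioning_waste(requested, used, price, month)
print(f"Wasted spend per month: ${waste:,.0f}")
```

Even with these modest assumptions, 700 permanently idle cores add up to a five-figure monthly bill, which is why right-sizing container requests is usually the first FinOps measure.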

Another problem is AI's energy demand. Many linear extrapolations predict extreme energy consumption, but Govekar is skeptical: "Sales of traditional servers are flat, while sales of AI-optimized servers are multiplying, which means significantly higher energy efficiency," is his assessment. Nevertheless, he too assumes that energy consumption will more than triple by 2030, which in his view could lead to major political upheavals. "In the Netherlands, Great Britain, and other countries, water suppliers are already refusing to supply data centers with cooling water. Many citizens in the EMEA region say: 'I don't want this data center because it consumes electricity needed for new housing.'" This has direct consequences for corporate IT's energy planning.

A particularly significant factor of uncertainty in further cloud development is the issue of digital sovereignty. Govekar confirms this: "Digital sovereignty is a highly sensitive topic in Europe; many companies are concerned about their data sovereignty and ask: Where is my data stored and who has access to it?" This is particularly noticeable among defense companies. In many cases, they use a cloud provider's core infrastructure but add a local telecommunications provider that implements a sovereign data-management layer on top. Mobile networks are often used instead of internet or Wi-Fi connections because they are considered more secure than the open internet – and with the expansion of 5G, this kind of communication is gaining importance.

Overall, Gartner analysts are divided on how to approach the issue of digital sovereignty. While Govekar has a pragmatic assessment, primarily focused on particularly sensitive areas like the defense industry, his colleague René Büst is very skeptical. "The geopolitical situation is one of the biggest concerns for IT decision-makers when it comes to cloud usage," he said at a press conference. Due to the geopolitical situation, many CIOs and IT managers want to increasingly use local or regional cloud providers in the future. Specifically mentioned are OVH, Telekom, noris network, StackIT, and Ionos.

A Gartner survey in seven major European countries revealed that in Germany the issue is not as high on the agenda as expected. The statement "Geopolitical factors will increase the future use of local/regional cloud providers by our organization" was affirmed by 69 percent of respondents in both the UK and France; Germany, at 51 percent, ranked only fifth. The much-cited preference for open source is also comparatively weak in Germany: the statement "Geopolitical concerns have made open source a more important criterion when selecting new cloud solutions" was affirmed by 63 percent in the UK and France, while Germany came last with 35 percent.

Other studies also paint an inconsistent picture. IDC concludes in one study that 60 percent of European companies prefer sovereign cloud solutions – but this applies exclusively to AI workloads, not to cloud usage as a whole. A Bitkom study is also interesting in this regard: in it, 97 percent of the surveyed companies state that the country of origin plays a role in selecting a cloud provider. This approval shrinks, however, as soon as the provider's performance is at stake: if using a local provider means sacrifices in response time, functionality, price, or service, 65 percent would not opt for it.

And finally, a look at cloud revenues provides no reliable indications either. Revenues are increasing in Europe, but they are increasing worldwide, and how much of the European growth is due to a shift away from US providers remains unclear. "From a market perspective, hyperscalers from the US continue to dominate in Europe," is the assessment of the eco association. "EU providers have maintained a constant share of 15 percent for years."


The trend toward repatriation is often seen in close connection with sovereignty, although it can also stem from other cloud problems such as costs, proprietary usage models, or response times. Gartner analysts have a clear opinion on this: "Cloud repatriation is not a macro trend," said Ted McHugh at the very beginning of his presentation. In his view, it is a marketing ploy by on-premises infrastructure providers that the media have picked up and blown out of proportion.

McHugh confirms that there are many such cases, but the reasons behind them are not a general cloud problem – and certainly not a sovereignty problem. "The most common reasons for repatriating applications are similar to those for edge computing: autonomy, latency, efficient data management, or applications not designed for the cloud," is his assessment.

Finally, the Gartner analysts also offered some advice on how IT managers can improve their infrastructure: First, create a cost-benefit analysis of the current infrastructure options. Second, identify areas where changes in deployment styles and infrastructure would improve returns. Third, consider using FinOps and cost-optimization tools and create business cases for the necessary changes. Fourth, weigh all risks and costs associated with geolocation and sovereignty requirements. The result should be a concept for a company-optimal infrastructure that considers all deployment options, from cloud and edge computing to colocation, diverse services, and traditional on-premises solutions.
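The cost-benefit analysis in the first step can be sketched as a simple ranking of deployment options by net benefit. The options and all figures below are illustrative placeholders, not Gartner data:

```python
# Minimal cost-benefit comparison across deployment options.
# All monthly figures are illustrative placeholders, not Gartner data.

options = {
    # option: (monthly_cost_usd, estimated_monthly_benefit_usd)
    "public cloud": (120_000, 150_000),
    "colocation":   (90_000, 100_000),
    "on-premises":  (80_000, 85_000),
    "edge + cloud": (140_000, 180_000),
}

def net_benefit(cost: float, benefit: float) -> float:
    """Return the estimated monthly benefit minus the monthly cost."""
    return benefit - cost

# Rank options from highest to lowest net benefit.
ranked = sorted(options.items(),
                key=lambda kv: net_benefit(*kv[1]), reverse=True)

for name, (cost, benefit) in ranked:
    print(f"{name:12s} net: ${net_benefit(cost, benefit):>8,.0f}")
```

In practice the benefit side would be broken down further (returns per workload, latency and sovereignty penalties as cost terms), but even this crude ranking makes the business case for each option explicit.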

(mack)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.