ServiceNow wants to become the AI control center for companies

Traditional IT service management is not enough: provider ServiceNow wants to reinvent itself and offer companies an entire ecosystem of networked AI systems.

A presentation on a conference stage at the ServiceNow in-house exhibition.

(Image: Jonas Härtfelder)

By Prof. Jonas Härtfelder

At the heart of this year's Knowledge 2025 in-house conference, ServiceNow presented a strategic realignment of its platform: the previous "ServiceNow Platform" becomes the "ServiceNow AI Platform". It is meant to unite artificial intelligence, data, and workflows across the entire company, creating a central platform for productive, scalable AI applications and securing ServiceNow a key position in the competition for enterprise AI.

At the heart of the revised platform are so-called "ready-to-work" AI agents. These autonomous software agents go far beyond conventional virtual assistants: as independent team members, they are designed to take on tasks, prepare decision-making processes and accelerate workflows – across all systems, functions and departments.

ServiceNow explicitly emphasizes end-to-end integration: the AI agents do not work in isolation but run on a unified platform, as live demonstrations with VISA, Google, and Microsoft were meant to show.

With the growing number of autonomous agents – whether built in the company's own AI Agent Studio or supplied by third-party providers – the need for centralized management and control grows as well. This is where the newly introduced AI Control Tower comes in. This central management instance is intended to monitor all deployed AI agents and document their activities and results, ensuring that security and compliance standards are met and that humans remain in control.
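ServiceNow has not published implementation details of the AI Control Tower, but the core idea – a central registry through which every agent must pass and which records its actions for audit – can be sketched as follows. All class, method, and agent names here are hypothetical illustrations, not the actual product API:

```python
import datetime

class ControlTower:
    """Hypothetical sketch of a central AI-agent registry with an audit log."""

    def __init__(self):
        self.agents = {}      # agent_id -> metadata (owner, origin)
        self.audit_log = []   # chronological record of agent activity

    def register(self, agent_id, owner, source):
        # Every agent, whether in-house or third-party, is registered
        # before it is allowed to act.
        self.agents[agent_id] = {"owner": owner, "source": source}

    def record(self, agent_id, action, result):
        # Agents report each action and its outcome, so compliance teams
        # can later review what happened, when, and by whom.
        if agent_id not in self.agents:
            raise PermissionError(f"unregistered agent: {agent_id}")
        self.audit_log.append({
            "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "agent": agent_id,
            "action": action,
            "result": result,
        })

tower = ControlTower()
tower.register("ticket-triage-01", owner="it-ops", source="AI Agent Studio")
tower.record("ticket-triage-01", "classify_incident", "routed to network team")
print(len(tower.audit_log))  # 1
```

The point of the sketch is the gatekeeping: an unregistered agent cannot write to the audit log, which is one simple way to keep humans in control of which agents are active.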

In addition, another key component was introduced: the AI Agent Fabric. This communication layer enables collaboration between AI agents within complex IT environments, supporting AI agent-to-AI agent, AI agent-to-tool, and even cross-system agent communication. ServiceNow relies on open protocols such as the Model Context Protocol (MCP) developed by Anthropic and the Agent2Agent protocol (A2A) introduced by Google, which are intended to enable dynamic, real-time information exchange.
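MCP messages are plain JSON-RPC 2.0, and tools are invoked via the protocol's "tools/call" method. A tool invocation as one agent might send it to a tool server looks roughly like this; the tool name and its arguments are invented for illustration:

```python
import json

# A minimal MCP-style tool call. MCP is built on JSON-RPC 2.0 and uses
# the "tools/call" method for tool invocation. The tool name and
# arguments below are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "lookup_incident",                 # hypothetical tool
        "arguments": {"incident_id": "INC0012345"},
    },
}

# Serialize for transport; the receiving server parses the message
# and dispatches on the "method" field.
wire_message = json.dumps(request)
parsed = json.loads(wire_message)
print(parsed["method"])  # tools/call
```

Using an open, JSON-based wire format like this is what lets agents from ServiceNow, third-party vendors, and in-house teams interoperate without bespoke integrations for every pairing.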


Another important topic at ServiceNow's in-house conference: a deepened partnership with Nvidia, presented by Nvidia CEO Jensen Huang. Together, the two companies developed the Nemotron-15B language model – a powerful, open-source-based Large Language Model (LLM) specially optimized for enterprise use. The two partners promise low latency, reduced inference costs, and fast response times. The model was trained on Nvidia NeMo with industry-specific ServiceNow data. In production, Nemotron-15B is offered via Nvidia Inference Microservices (NIM) and scales in GPU-optimized cloud environments.
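NIM services expose an OpenAI-compatible HTTP API, so a client application would talk to a deployed Nemotron-15B much like any other chat-completion endpoint. The sketch below only builds the request payload; the endpoint URL and model identifier are placeholders, not confirmed values:

```python
import json

# Placeholder endpoint: NIM containers typically serve an
# OpenAI-compatible API under /v1/chat/completions.
NIM_ENDPOINT = "http://localhost:8000/v1/chat/completions"

# Placeholder model id; the real identifier depends on the deployment.
payload = {
    "model": "servicenow/nemotron-15b",
    "messages": [
        {"role": "system", "content": "You are an IT service assistant."},
        {"role": "user", "content": "Summarize the open P1 incidents."},
    ],
    "max_tokens": 256,
    "temperature": 0.2,
}

body = json.dumps(payload)
# In production the body would be POSTed to NIM_ENDPOINT with a
# Content-Type: application/json header, e.g. via requests.post().
print(len(json.loads(body)["messages"]))  # 2
```

Because the interface follows the de-facto OpenAI chat-completion format, existing client libraries and agent frameworks can switch to a NIM-hosted model largely by changing the base URL.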

With the transformation into the ServiceNow AI Platform, the provider, originally known for classic IT service management, marks a significant shift: ServiceNow is riding the current momentum of artificial intelligence to offer companies an entire ecosystem of networked AI systems, going far beyond its previous automation and workflow-management portfolio. Whether ServiceNow will actually set new standards in enterprise AI should become clear in the coming months. In any case, the course toward an AI-centered future is clearly mapped out.

(mma)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.