Apple wants to release its own AI models for developers

Developers currently build their applications on models from OpenAI, Anthropic, or Meta. Going forward, Apple wants to get in on that – with its own technology.


AI meets humans (symbolic image): Will Apple achieve a breakthrough with its own AI models for developers?

(Image: Ole.CNX/Shutterstock.com)


It is not only the model makers themselves who drive the adoption of AI models, but above all the developer community: they are the ones who integrate ChatGPT, Claude, or Llama into their apps and thereby popularize them further. Apple now wants to tap into this dynamic as well: in the future, the iPhone company plans to open up its language models and other AI technology it has developed to third-party developers. This is reported by the financial news agency Bloomberg.

A suitable software development kit (SDK) and corresponding frameworks that allow developers to build AI functions on top of Apple's large language models (LLMs) could be presented at the WWDC developer conference on June 9, according to the report. How extensive the functionality will be remains an open question.


Currently, the features Apple offers as part of its AI system Apple Intelligence on the iPhone, iPad, Mac, and Vision Pro are considered rather meager. Texts can be rewritten, notifications and emails are summarized, and images can be generated – though the latter are not photorealistic. There is no chatbot of Apple's own – that role is optionally filled by OpenAI – and the output quality is not outstanding. Recently, however, it was reported that Apple is at least closing the gap to the competition thanks to major investments.

Apparently, the focus – at least initially – is on smaller language models that run directly on the iPhone. These are, however, even weaker than Apple's full AI system, which offloads more demanding tasks to the cloud. Developers can already use the standard AI functions such as the writing tools, Genmoji, and Image Playground, including notification summaries. These are active by default, although there are ways to bypass them.

The purpose of the SDK and frameworks is to enable developers to build their own AI features on top of Apple's technology. What this will cost – and whether it might even be free – remains to be seen. The App Store itself could serve as a template: when the software store for iOS launched in 2008, Apple also opened up numerous proprietary technologies to developers. It is also conceivable that Apple will unveil new language models of its own at WWDC. An "LLM Siri", however, is not expected before 2026 or even 2027.
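To illustrate what such a developer-facing SDK might look like, here is a purely hypothetical Swift sketch. Neither the framework name (AppleLLM) nor the types and methods below have been announced by Apple; they merely show how an app could call an on-device language model for a simple task like summarization.

```swift
import Foundation

// Hypothetical framework name – not an announced Apple API.
import AppleLLM

struct SummaryFeature {
    // Hypothetical session type wrapping Apple's on-device language model.
    let session = OnDeviceLanguageModel.Session()

    func summarize(_ text: String) async throws -> String {
        // Ask the local model for a short summary. In this sketch the request
        // stays on the device, mirroring the on-device focus described above.
        let prompt = "Summarize the following text in two sentences:\n\(text)"
        let response = try await session.generate(prompt: prompt, maxTokens: 120)
        return response.text
    }
}
```

Whether Apple's actual SDK exposes anything resembling this interface, and whether cloud fallback is available to third parties, is exactly the kind of detail the WWDC announcement would have to clarify.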


(bsc)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.