Foundation Models Framework: Apple shows third-party apps with local AI
With iOS 26 and its sibling releases, Apple is giving developers the option to use Apple's local AI model. The first examples are now being presented.
With the release of iOS 26, iPadOS 26, and macOS 26, Apple has made it possible for third-party apps to use Apple's local AI model directly on the device. In a press release, the iPhone manufacturer has now highlighted some examples of what developers have built with this option.
Interest among the programmers attending the Worldwide Developers Conference (WWDC) was high when Apple announced that it was opening up its AI, and the first developers were already experimenting with it during the event. Local processing is attractive for several reasons: beyond data protection, it lets apps use AI functions even when the device has a poor network connection or none at all. And developers no longer have to bear the costs of calling external AI models.
Advantages for developers
Of course, this is only useful if Apple's AI can keep up with external solutions such as those from OpenAI or Anthropic. The apps published so far show where developers see reliable support from Apple's model. Another major advantage of Apple's solution is that the AI's answers are returned in a structured format that the app can process directly.
The video editing app "Detail: AI Video Editor" creates teleprompter scripts from drafts or outlines, for example. The app automatically generates titles, descriptions, and hashtags for finished videos.
"SmartGym" transforms simple descriptions from the user into structured training plans with sets, repetitions, and rest periods. The app learns from training data, provides personalized recommendations with explanations, and automatically creates training notes and dynamic welcome messages.
"Stoic" generates personalized diary prompts based on users' previous entries and moods. The app summarizes past entries, organizes thematically related entries, and offers improved natural language search.
"Streaks" intelligently suggests tasks in to-do lists and categorizes them automatically. The app helps to organize daily routines. And "Lil Artist" combines AI text creation with image generation to create illustrated children's stories. Children select characters and themes via the user interface, making the experience more accessible.
Integrated into Swift
The Foundation Models framework is integrated into Apple's Swift programming language: requests to the roughly three-billion-parameter AI model can be sent directly from existing Swift code. By processing exclusively on the device, Apple aims to score particularly well on data protection. This also means the framework does not give developers access to Apple's cloud AI, Private Cloud Compute, although device users can reach it themselves, for example via the Shortcuts app. The Foundation Models framework ships with iOS 26, iPadOS 26, and macOS 26 and works on every Apple Intelligence-compatible device, provided Apple Intelligence is activated.
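To illustrate the Swift integration and the structured output mentioned above, here is a minimal sketch based on the API Apple presented at WWDC; the `WorkoutPlan` type and its fields are invented for illustration, and exact signatures may differ from the shipping SDK:

```swift
import FoundationModels

// A @Generable type tells the model what shape the answer must take,
// so the app receives typed data instead of free-form text.
@Generable
struct WorkoutPlan {
    @Guide(description: "Short name for the workout")
    var title: String
    @Guide(description: "Exercises with sets and repetitions")
    var exercises: [String]
}

func makePlan() async throws -> WorkoutPlan {
    // Check that the on-device model is available (Apple Intelligence
    // must be enabled on a compatible device).
    guard case .available = SystemLanguageModel.default.availability else {
        throw CancellationError()
    }

    // A session sends prompts to the local ~3B-parameter model;
    // no data leaves the device.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Create a short upper-body workout for a beginner.",
        generating: WorkoutPlan.self
    )
    return response.content
}
```

Because the response is decoded into the app's own Swift type, apps like SmartGym can feed the result straight into their UI without parsing model output by hand.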
(mki)