Analysis of new plans: Is Apple finally doing everything right with AI?
Apple has lagged behind in AI. Now the company is reportedly making progress, partly with the help of Google Gemini. What should we make of these plans?
Apple and AI: Did it wait just the right amount of time?
(Image: Below the Sky / Shutterstock / Editing: heise medien)
Apple Intelligence is a term that AI enthusiasts have so far tended to dismiss: the iPhone maker's language and image models are simply not considered competitive, nor are the services built on them, such as the writing tools, Image Playground, or Genmoji. Apple also offers no chatbot of its own, and Siri is a poor conversationalist. Two pieces of news from recent weeks could now change all of that: Apple reportedly plans to integrate a chatbot of its own directly into its operating systems for the first time and, even more surprisingly, to launch its own AI wearable.
Chatbots apparently interesting after all
First, the chatbot. Apple had long rejected the idea, both internally and externally. Software chief Craig Federighi is widely regarded as an AI skeptic who until now had blocked the kind of massive investments made by other IT giants. The company apparently feared, first, that it could not keep up with the competition and, second, that it would fall into traps Apple dislikes, for example when AIs hallucinate, disregard guardrails, and thereby tarnish the company's family-friendly image.
But with iOS 27, macOS 27, and the other operating systems, expected to be unveiled in the summer and released in the fall, that is set to change: a system-wide chatbot is reportedly planned, serving as a kind of super-Siri. The whole thing will be "powered" by the company's "preferred cloud provider," a role that Google recently officially took on. In other words: Apple on the outside, a Google Gemini-based model on the inside. It will be interesting to see how Apple markets this. The company does not comment on technical details, but it might be advisable to make it known that Google technology is being used, if only to deflect the potential embarrassments mentioned above.
The Preferred Cloud Provider
It is also still completely unclear how the data exchange will work. So far, the assumption has been that Apple will initially run simpler Gemini models on its own servers as part of the Private Cloud Compute initiative, but these are probably not good enough for the chatbot planned for the fall. Beyond that, Apple's only real option is to acquire Google's Tensor systems (TPUs) optimized for the latest Gemini models and put them in its own data centers. Or the computation will take place in Google's facilities, which probably makes sense in terms of price and practicality, especially since half of iCloud is already hosted on Google Cloud servers (alongside Microsoft Azure and Amazon AWS).
A first idea of how all this is meant to look will likely emerge in April. Then, by all current indications, the context-aware Siri will finally arrive as part of iOS 26.4. As reported so far, it will be based on Apple's own models as well as simpler Gemini models. The company cannot afford any missteps here, especially since the project was already announced in the summer of 2024 (!). Siri is finally supposed to become more intelligent, draw on users' content to provide real added value, and also control, or at least read, apps. Chatbot qualities along the lines of ChatGPT or Claude are not to be expected, however; those will come later.
A genuine AI gadget
Finally, news also arrived that Apple is planning an AI gadget of its own in the form of a wearable, in direct competition with former design chief Jony Ive. When I first heard this, I thought it was a joke. After all, previous products in this segment have failed spectacularly, and it is by no means certain that Jony Ive, together with Sam Altman, will succeed in establishing a commercially successful AI wearable from OpenAI.
Nevertheless, the information appears to be accurate. According to The Information, a kind of clip-on AirTag with a speaker, camera, microphones, and wireless charging is expected to be released in 2027. OpenAI itself reportedly has pens, glasses, pins, and/or earbuds in the pipeline. Such devices are only useful if they work without additional hardware such as a smartphone; otherwise, one could simply use the phone.
Apple also already has the Apple Watch, which ought to be the ideal platform for an always-available AI assistant, yet officially it can so far access Apple Intelligence only indirectly. Apple's potential AI pin is expected to be a million-seller. How intelligent or how cloud-dependent it will be remains open. It would be unusual for Apple to venture into such an unproven category. More exciting is the acquisition of the AI company Q.AI, whose technology is intended to enable silent communication with voice assistants. That would be real progress.
(bsc)