These AI functions in iOS 18 & Co. will have to wait until 2025

Apple is planning a slow rollout of its Apple Intelligence components. There are technical and personnel reasons for this - and it's also about image.

Apple Intelligence on Mac, iPhone and iPad.

(Image: Apple)

This article was originally published in German and has been automatically translated.

Apple will not be able to offer important parts of its Apple Intelligence features in iOS 18, iPadOS 18 and macOS 15 until 2025, including the biggest improvements to the voice assistant Siri and support for languages other than US English. This was reported by the financial news agency Bloomberg on Sunday. ChatGPT support may also miss the first releases of the new operating systems, though the plan is still to introduce the feature in 2024.

Among other things, Apple has announced that Siri will handle natural language much better in future. This feature could be available for the US region as early as this year. It means, for example, that users will be able to stumble over their words without having to repeat the entire request. An improved "Type to Siri" (for keyboard input) could also become available right away, as could a knowledge base with information and instructions on Apple products.

More context for Siri, including the use of information already on the device from appointments, emails or chats, may not arrive until 2025, as with the planned "semantic index" of all on-device content. Voice control of apps and device functions, for example editing photos for Messages directly by voice, also does not appear to be planned before 2025. On-screen awareness, which gives Siri access to screen content, is likewise said to be taking Apple some time.

Apple had already made clear during its keynote that the Apple Intelligence features would be rolled out gradually. The lack of internationalization at launch is likely to be a particular pain point. Among other things, Apple wants to avoid overloading its own engineers, Bloomberg reports: "Trying to launch too many new things at once has hurt Apple in the past. It also gives developers more time to support the new features in their apps." Starting with American English also frees up time to train the models on other languages.

Because Apple uses its own language models, this costs time and money. The same applies to expanding the cloud infrastructure, which Apple wants to run on its own servers with M-series SoCs. Too many users could overwhelm the technology, a problem familiar from ChatGPT's early days. Apple also fears damage to its image if generative AI features produce hallucinated output. With a smaller user base, such problems can be identified and fixed more quickly.

Apple is also looking for new partners for further chatbot integrations. Talks are said to be ongoing with Alphabet, Anthropic and Chinese companies such as Baidu and Alibaba. Bloomberg also writes that other announced features may not appear until later in 2024, including the new category management in Apple Mail and the coding assistant Swift Assist for Xcode.


(bsc)