Coming with iOS 18: eye tracking, haptic music and motion sickness prevention
Apple has presented the first functions designed to make the iOS and iPadOS mobile operating systems and CarPlay even more accessible.
![Live Captions on the Apple Vision Pro](https://heise.cloudimg.io/width/610/q85.png-lossy-85.webp-lossy-85.foil1/_www-heise-de_/imgs/18/4/5/9/0/9/7/3/Apple-accessibility-features-Apple-Vision-Pro-Live-Captions-8c6e8715453ec706.jpeg)
(Image: Apple)
Apple has announced further accessibility functions for iPhones and iPads. Among the innovations expected with iOS 18 in the fall is the ability to control smartphones and tablets with the eyes. The built-in front camera captures the user's gaze, and all data is processed exclusively on the device, as the manufacturer emphasized; no additional hardware is required. Inputs are made by dwelling on a button with the eyes for a moment.
On the iPhone, the vibration motor will in the future also provide haptic feedback during music playback, Apple announced – allowing deaf and hard-of-hearing users to experience music as well. The function is said to work with many songs from the Apple Music catalog already and will also be offered as an API for other music service providers.
Function to prevent motion sickness in the car
To counter motion sickness when using screens in vehicles, iOS 18 lets users display small dots on the screen that react to the vehicle's movements – ideally helping to prevent nausea. CarPlay is also set to become more accessible: the planned innovations include voice control as well as the sound recognition already familiar from the iPhone, which can alert users to horns or sirens, for example. Text can also be adjusted to make it easier to read.
With Vocal Shortcuts, users will be able to define custom voice or sound commands and use them, for example, to trigger shortcuts. Speech recognition is also expected to cope better with atypical speech patterns.
More accessibility – also for the Vision Pro
The Apple Vision Pro will gain further accessibility functions familiar from the iPhone, such as Live Captions, which transcribe speech into text in real time – both in apps such as FaceTime and in conversations with people who are physically present. Apple is also promising additional accessibility functions with iOS 18, including a new reading mode for the Magnifier, improved Braille input, and the option to display larger letters against a background of the user's choice when typing. Apple will present iOS 18 at its WWDC developer conference in early June; the iPhone update is expected to be released in September.
(lbe)