Starting Monday: Apple gives companies direct Vision Pro camera access
Until now, what the Vision Pro sees has not been directly accessible to developers. With visionOS 2, this changes – for a select group of developers.
Vision Pro in production settings: corporate customers are supposed to get a better overview.
(Image: Apple / YouTube screenshot)
Apple's first mixed-reality headset, the Vision Pro, sees a great deal of its surroundings. Numerous cameras capture not only the room but also hand movements close to the body and, internally, the wearer's eyes to drive the Persona feature. For privacy reasons, most of this data is processed directly on the device. Developers only receive abstracted information sufficient to render their apps, never the raw camera data. With visionOS 2, which is expected to be released on Monday evening, this is now changing, at least for a special group of developers.
Lower-level development, including camera access
With special "Managed Entitlements", which must be requested from Apple, plus a license file tied to the Apple ID, developers can build apps for companies via an Enterprise API that operate closer to the hardware. The idea is to enable more research and development, as well as internal applications that can do more than regular apps are allowed to. For the privacy reasons mentioned above, however, such programs are not supposed to end up "in the wild".
The Enterprise API features in visionOS 2, first shown in a session at the WWDC 2024 developer conference in the summer, include capturing what is seen in the Vision Pro, including the passthrough view. There is also direct access to the main camera, which the operating system currently blocks for apps. Developers in companies are also given greater leeway regarding the performance limits of the M2 SoC ("Increased Performance Headroom"), something regular apps are likewise not allowed.
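According to Apple's WWDC 2024 session, main camera access is exposed through ARKit's `CameraFrameProvider`. The sketch below assumes the corresponding enterprise entitlement and license file are already in place; it is an illustration, not production code.

```swift
import ARKit

// Sketch: reading frames from the Vision Pro's main camera
// (requires the main-camera-access enterprise entitlement).
func readMainCameraFrames() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    // Query the video formats the left main camera supports.
    let formats = CameraVideoFormat.supportedVideoFormats(for: .main,
                                                          cameraPositions: [.left])

    try await session.run([provider])

    guard let format = formats.first,
          let updates = provider.cameraFrameUpdates(for: format) else { return }

    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            // sample.pixelBuffer holds the raw camera image as a CVPixelBuffer.
            _ = sample.pixelBuffer
        }
    }
}
```

Without the entitlement, `ARKitSession.run` refuses the provider, which is how the system keeps raw camera data away from regular apps.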
Spatial barcodes, Neural Engine access
Apple emphasizes that the Enterprise API is about features to be used "solely in a business context". Accordingly, such apps cannot be distributed in the App Store; they must be deployed via Apple Business Manager instead. The new interface also brings smaller features to the headset that are not yet available to regular App Store developers for the Vision Pro, including scanning QR codes and spatial barcodes to determine positions in a room.
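Barcode scanning follows the same ARKit provider pattern; a minimal sketch, assuming the barcode-detection enterprise entitlement, might look like this:

```swift
import ARKit

// Sketch: detecting spatial barcodes/QR codes and their position in the room
// (requires the barcode-detection enterprise entitlement).
func trackBarcodes() async throws {
    let session = ARKitSession()
    let barcodes = BarcodeDetectionProvider(symbologies: [.qr])

    try await session.run([barcodes])

    for await update in barcodes.anchorUpdates where update.event == .added {
        let anchor = update.anchor
        // The anchor's transform localizes the code in the room;
        // payloadString carries the decoded content.
        print(anchor.originFromAnchorTransform, anchor.payloadString ?? "")
    }
}
```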
Finally, Apple is also providing more direct access to the Apple Neural Engine (ANE) for machine-learning tasks, which is supposed to work similarly to iOS. Object tracking is also being made easier for enterprise developers: known objects can be identified and captured more quickly using "configurable parameters". Enterprise developers who want to use the Enterprise API for the Vision Pro must submit a "Development Only" request to Apple.
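On iOS, Core ML already lets apps opt into the Neural Engine by choosing the compute units for a model. Assuming visionOS 2 mirrors that pattern, the selection would look like the sketch below; the model class name is a placeholder, not a real API.

```swift
import CoreML

// Sketch: asking Core ML to schedule work on the Neural Engine where available.
let configuration = MLModelConfiguration()
configuration.computeUnits = .all  // CPU, GPU, and Neural Engine

// "DefectClassifier" stands in for any compiled Core ML model class.
let model = try DefectClassifier(configuration: configuration)
```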
(bsc)