
Apple will let the Vision Pro ‘see’ for you

Approved third-party developers will be able to offer their own solutions using a new API, as well.

The Apple Vision Pro. Photo: Nilay Patel
Wes Davis is a former weekend editor who covered tech and entertainment. He has written news, reviews, and more as a tech journalist since 2020.

Apple previewed new Vision Pro accessibility features today that could turn the headset into a proxy for eyesight when they launch in visionOS later this year. The update uses the headset’s main camera to magnify what a user sees or to enable live, machine-learning-powered descriptions of surroundings.

The new magnification feature works on virtual objects as well as real-world ones. An example from Apple’s announcement shows a first-person view as a Vision Pro wearer goes from reading a zoomed-in view of a real-world recipe book to the Reminders app, also blown up to be easier to read. It’s a compelling use for the Vision Pro, freeing up a hand for anyone who might’ve done the same thing with comparable smartphone features.

The Vision Pro’s coming magnification feature. GIF: Apple / The Verge

Also as part of the update, Apple’s VoiceOver accessibility feature will “describe surroundings, find objects, read documents, and more” in visionOS.


The company will release an API to give approved developers access to the Vision Pro’s camera for accessibility apps. That could be used for things like “live, person-to-person assistance for visual interpretation in apps like Be My Eyes, giving users more ways to understand their surroundings hands-free.” It may not seem like much now, given the Vision Pro’s reportedly meager sales, but the features could be useful in rumored future Apple wearables, like camera-equipped AirPods or even new Apple-branded, Meta Ray-Ban-like smart glasses.
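
Apple hasn’t published the new API, but visionOS already streams the headset’s main-camera feed to licensed enterprise apps through ARKit’s CameraFrameProvider, and the accessibility access described here will presumably look similar. Below is a minimal sketch of that existing pattern; the types and method names are recalled from Apple’s enterprise camera documentation and are assumptions, not the new accessibility API, and the describeSurroundings helper is hypothetical.

```swift
import ARKit
import CoreVideo

/// Rough sketch: stream frames from the Vision Pro's main camera and hand
/// each one to an app-defined analysis step (e.g. a scene-description model
/// or a live human-assistance session). Assumes camera-access authorization
/// has been granted to the app.
func streamMainCameraFrames() async throws {
    let session = ARKitSession()
    let cameraProvider = CameraFrameProvider()

    // Ask the system for camera authorization before running the provider.
    _ = await session.queryAuthorization(for: [.cameraAccess])

    // Pick a supported video format for the left main camera.
    let formats = CameraVideoFormat.supportedVideoFormats(for: .main,
                                                          cameraPositions: [.left])
    guard let format = formats.first else { return }

    try await session.run([cameraProvider])

    // Each update delivers a frame; pull out its pixel buffer for analysis.
    guard let updates = cameraProvider.cameraFrameUpdates(for: format) else { return }
    for await frame in updates {
        guard let sample = frame.sample(for: .left) else { continue }
        describeSurroundings(in: sample.pixelBuffer)   // hypothetical helper
    }
}

/// Hypothetical placeholder for whatever ML or person-to-person assistance
/// pipeline an app like Be My Eyes would plug in here.
func describeSurroundings(in pixelBuffer: CVPixelBuffer) {
    // ...
}
```

An accessibility app would presumably swap the placeholder for its own pipeline, whether that’s an on-device description model or a video call to a sighted volunteer.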

Finally, Apple says it’s adding a new protocol to visionOS, iOS, and iPadOS that supports brain-computer interfaces (BCIs) through its Switch Control accessibility feature, which offers alternate input methods, such as controlling parts of your phone with head movements captured by your iPhone’s camera.

A Wall Street Journal report today explains that Apple worked on this new BCI standard with Synchron, a brain implant company that lets users select icons on a screen by thinking about them. The report notes that Synchron’s tech doesn’t enable things like mouse movement, which Elon Musk’s Neuralink has accomplished.
