Apple is creating a feature for the Vision Pro that will allow users to scroll through apps using only their eyes, a new report by Bloomberg’s Mark Gurman claims.
Citing people familiar with the development, Gurman said Apple is testing the eye-scrolling feature as part of the upcoming visionOS 3 update for its Vision Pro headset. The new interface would remove the need for hand gestures.
Eye-Based Scrolling: Extension Of Current Capabilities
The Vision Pro can already navigate its operating system through eye tracking. The headset has several cameras that are utilised for biometric authentication (iris scanning) and eye tracking: users navigate the software by focusing on items and then pinching their fingers to select them.
Hence, eye-based scrolling would be a logical expansion of the Vision Pro’s present OS navigation capabilities. Apple intends to add the scrolling functionality to all of the Vision Pro’s built-in apps, in addition to creating APIs that will enable developers to do the same.
Apple is set to unveil visionOS 3 at the Worldwide Developers Conference starting June 9, where users can expect a first look at the eye-scrolling feature.
New Vision Pro Accessibility Features
Apple also gave a first peek at new accessibility features for the Vision Pro that could let the headset serve as a vision aid. Expected to arrive in visionOS later this year, the update will enable live, machine-learning-powered descriptions of the user’s surroundings and can magnify what the user sees using the headset’s main camera.
Both virtual and real-world items can be magnified using the new function. An example from Apple’s announcement displays a first-person view as a Vision Pro user switches between reading a recipe book and the Reminders app, both of which are zoomed in.
A VoiceOver accessibility feature in the visionOS update will further “describe surroundings, find objects, read documents, and more.”
Furthermore, Apple has announced a new protocol in visionOS, iOS, and iPadOS that supports brain-computer interfaces (BCIs). Through the Switch Control accessibility feature, users will be able to choose alternative input methods, such as controlling features of their phone with head movements tracked by the iPhone’s camera.