Apple Announces AI-Powered Accessibility Features: Nutrition Labels, Mac Magnifier App, Braille Access

The features are powered by on-device machine learning and AI.


Apple has announced new accessibility features coming later this year, powered by on-device machine learning and artificial intelligence. These include Accessibility Nutrition Labels on the App Store, a Magnifier app for Mac, a Braille Access feature and updates to visionOS. 

Accessibility Nutrition Labels On App Store

A new section on App Store product pages will highlight accessibility features within apps and games. These labels will allow users to learn if an app will be accessible to them before they download it. 

New Magnifier For Mac

Magnifier on iPhone and iPad gives users who are blind or have low vision tools to zoom in, read text and detect objects. This year, Magnifier will be available on Mac, connecting to a user's camera so they can zoom in on their surroundings, such as a screen or whiteboard. 

Users can multitask by viewing a presentation with a webcam while simultaneously following along in a book using Desk View. Users can also adjust brightness, contrast, colour filters and perspective. Views can be captured, grouped and saved to return to later. 

Accessibility Reader

The new system-wide reading mode is designed to make text easier to read for users with disabilities such as dyslexia or low vision. Available on iPhone, iPad, Mac and Apple Vision Pro, it allows users to customise text and focus on content they want to read, with options for font, colour and spacing, and support for Spoken Content. 

Braille Access

Braille Access turns iPhone, iPad, Mac and Apple Vision Pro into a braille note taker. With a built-in app launcher, users can open apps by typing with Braille Screen Input or a connected braille device.

Users can take notes in braille format and perform calculations using Nemeth Braille. In addition, Live Captions allows conversations to be transcribed in real time on braille displays.

Live Captions On Apple Watch

For users who are deaf or hard of hearing, Live Listen controls on Apple Watch have new features, including real-time Live Captions. Users can use their iPhone as a remote microphone to stream content to AirPods, Made for iPhone hearing aids or Beats headphones. 

Users can view Live Captions of what their iPhone hears on a paired Apple Watch while listening to the audio. Apple Watch also serves as a remote control to start or stop Live Listen sessions. 

Vision Pro Enhancements

For users who are blind or have low vision, visionOS will expand vision accessibility features using the camera system on Vision Pro. Users can magnify views via the main camera.

For VoiceOver users, Live Recognition in visionOS uses on-device machine learning to describe surroundings, find objects, read documents and more. 

