Apple is introducing a new wave of accessibility enhancements across its product ecosystem, including iPhone, iPad, Mac, Apple TV, and Apple Vision Pro.
Highlights
The upcoming updates are designed to provide more inclusive user experiences and address a wider range of accessibility needs. Notably, Apple is also exploring the integration of brain-computer interface (BCI) technology as a longer-term effort.
Among the key announcements is the introduction of Accessibility Nutrition Labels on the App Store. These labels are designed to help users better understand how accessible an app or game is before downloading it.
Another major update, Braille Access, enables Apple devices to function as Braille note takers—expanding support for users who are blind or have low vision.
System-wide features such as Accessibility Reader, a customizable reading mode designed to make text easier to follow, and a dedicated Assistive Access app for Apple TV are also part of the rollout. Users will also be able to sync their accessibility settings across multiple Apple devices more easily.
Enhancements are being made to existing tools as well, including Live Listen, Background Sounds, Personal Voice, and Vehicle Motion Cues.
Brain-Computer Interface Support Under Development
Apple is reportedly collaborating with Synchron, a startup focused on brain-computer interface technology, to explore device control through neural input.
According to The Wall Street Journal, the collaboration centers on a new input protocol for Switch Control, Apple's existing accessibility feature for alternative input methods, that may one day allow users to interact with Apple devices using neural signals.
Synchron’s Stentrode implant, which sits within a blood vessel near the brain’s motor cortex, is capable of capturing these signals.
Although still in early stages, this initiative may benefit individuals with severe physical impairments, such as those living with ALS or spinal cord injuries. It also marks Apple’s first public involvement in the brain-computer interface space.
Eye Tracking
Apple is introducing Eye Tracking for iPhone and iPad, offering users the ability to navigate their devices using only eye movement.
The feature uses the front-facing camera and on-device machine learning to let users control apps and system elements without additional hardware.
Eye Tracking supports Dwell Control, which triggers actions like taps and swipes when the user's gaze rests on a specific area for a set time. This feature is intended to assist individuals with limited physical mobility; a sketch of the dwell mechanism follows the comparison table below.
| Feature | Activation Method | Target Users | Device Support | Primary Benefit | Setup Complexity |
| --- | --- | --- | --- | --- | --- |
| AssistiveTouch | Settings > Accessibility > Touch > AssistiveTouch, or Siri | Limited motor control or dexterity | iPhone, iPad, iPod touch | On-screen gesture menu and shortcuts | Moderate |
| VoiceOver | Settings > Accessibility > VoiceOver, triple-click of the side button, or Siri | Blind or low vision | iPhone, iPad, Mac, Apple Watch, Apple TV | Gesture-based screen reader | Moderate to high |
| Eye Tracking | Settings > Accessibility > Eye Tracking (or a Shortcut) | Severe physical impairments | iPhone and iPad with a front-facing camera | Hands-free navigation via gaze | Low |
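To make the dwell idea concrete, here is a minimal sketch of dwell-based selection, assuming gaze estimates come from ARKit face tracking (ARFaceAnchor.lookAtPoint). The drift radius and dwell duration are illustrative values, not Apple's, and the shipping Eye Tracking feature is implemented at the system level rather than through a public API.

```swift
import ARKit
import simd

// Sketch of dwell-based selection: an action fires when the estimated
// gaze point stays within a small region for a set duration. Gaze here
// comes from ARKit face tracking; the threshold and dwell time are
// illustrative, not Apple's values.
final class DwellDetector {
    private var anchorPoint: simd_float3?
    private var dwellStart: Date?
    private let radius: Float = 0.03          // allowed gaze drift, meters
    private let dwellTime: TimeInterval = 1.0 // seconds before firing

    /// Call once per frame with the current gaze estimate.
    /// Returns true when the gaze has dwelled long enough to trigger a tap.
    func update(gaze: simd_float3, at now: Date = .now) -> Bool {
        if let anchor = anchorPoint, simd_distance(anchor, gaze) < radius {
            if let start = dwellStart, now.timeIntervalSince(start) >= dwellTime {
                dwellStart = nil  // re-arm only after the gaze moves again
                return true
            }
        } else {
            anchorPoint = gaze    // gaze moved: restart the dwell timer
            dwellStart = now
        }
        return false
    }
}

// Feeding it from an ARSessionDelegate:
// func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
//     guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
//     if detector.update(gaze: face.lookAtPoint) { performTap() }
// }
```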
Music Haptics
With Music Haptics, Apple aims to make music more accessible for users who are deaf or hard of hearing. The feature uses the iPhone’s Taptic Engine to convert audio into vibrations, taps, and textures that reflect the rhythm and dynamics of the music.
Music Haptics will support millions of songs in the Apple Music catalog and will be available to developers through an API, encouraging broader adoption in third-party apps.
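Apple has not published the details of that API here, but the underlying idea, mapping beat times to Taptic Engine output, can be sketched with the public Core Haptics framework. This is a conceptual illustration rather than the Music Haptics API itself, and the beat timestamps are placeholder data.

```swift
import CoreHaptics

// Conceptual sketch only: play a haptic "tap" for each beat of a track,
// roughly what Music Haptics does at the system level. Beat timestamps
// are placeholders; a real pipeline would derive them from audio analysis
// or from the Music Haptics API once it is available.
func playBeatHaptics(beatTimes: [TimeInterval]) throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One transient event per beat; intensity and sharpness chosen arbitrarily.
    let events = beatTimes.map { t in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.9),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6),
            ],
            relativeTime: t
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
    // In production code, keep the engine and player alive for the
    // duration of playback instead of letting them go out of scope.
}

// Example: four beats at 120 BPM.
// try playBeatHaptics(beatTimes: [0.0, 0.5, 1.0, 1.5])
```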
Vocal Shortcuts and Speech Recognition Enhancements
Apple is introducing Vocal Shortcuts, which let users teach Siri a custom spoken phrase that triggers a shortcut or performs a complex task.
Another feature, Listen for Atypical Speech, leverages on-device machine learning to better understand speech patterns that may differ due to conditions such as cerebral palsy, ALS, or stroke.
These tools are designed to make voice interaction more inclusive and adaptable to a wider range of communication styles.
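Vocal Shortcuts can trigger anything an app exposes to the Shortcuts system, which third-party apps do through the App Intents framework. The sketch below shows a hypothetical intent (StartFocusTimerIntent and its timer behavior are invented for illustration) that a user could then bind to a custom spoken phrase.

```swift
import AppIntents

// A hypothetical App Intent. Exposing an action this way makes it
// reachable from the Shortcuts app, and therefore from a Vocal Shortcut.
// "Start Focus Timer" is an invented example, not a real Apple feature.
struct StartFocusTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Focus Timer"

    @Parameter(title: "Minutes", default: 25)
    var minutes: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work would go here, e.g. a hypothetical
        // TimerStore.shared.start(minutes: minutes)
        return .result(dialog: "Focus timer started for \(minutes) minutes.")
    }
}
```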
Vehicle Motion Cues: Reducing Motion Sickness
To help address motion sickness when using devices in moving vehicles, Apple is launching Vehicle Motion Cues. This feature displays subtle, animated dots on the screen edges that correspond with the vehicle’s movement, helping reduce sensory conflict that can cause discomfort.
The feature uses the device's built-in motion sensors and can be toggled quickly from Control Center.
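As a rough sketch of the mechanism, the snippet below reads those sensors with Core Motion and converts acceleration into screen offsets for the animated dots. The MotionCueModel type, its scaling factor, and the axis mapping are invented for illustration; Apple's implementation is a system feature, not a public API.

```swift
import SwiftUI
import CoreMotion

// Illustrative model for motion cues: sample device motion at 60 Hz and
// translate felt acceleration into an on-screen offset, so what the eyes
// see matches what the inner ear senses. Scale and axis mapping are
// invented values, simplified for the sketch.
final class MotionCueModel: ObservableObject {
    @Published var dotOffset: CGSize = .zero
    private let motion = CMMotionManager()
    private let scale: CGFloat = 40 // points per g; illustrative only

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self, let a = data?.userAcceleration else { return }
            // Shift the dots opposite to the felt acceleration.
            self.dotOffset = CGSize(width: CGFloat(-a.x) * self.scale,
                                    height: CGFloat(a.z) * self.scale)
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```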
Additional Accessibility Features
- Live Captions in visionOS: Real-time captions for conversations and app audio.
- Braille Access: Support for both connected Braille devices and Braille Screen Input.
- Accessibility Reader: Customizable text formatting across devices, extended to physical text via the Magnifier app.
- AssistiveTouch for Apple Watch: Navigation using hand gestures and built-in motion sensors.
- Enhanced VoiceOver: New voice options, improved image descriptions, and customizable keyboard shortcuts.