Apple's new accessibility features are coming later this year: Eye Tracking, Music Haptics, Vocal Shortcuts, and more
Jun 08, 2024 | Apple recently announced a set of new accessibility features launching later this year, aimed at delivering a better accessibility experience across its devices.
According to the official announcement, the new features include Eye Tracking, Music Haptics, Vocal Shortcuts, Vehicle Motion Cues, and more, and visionOS will also gain additional accessibility features.
AI-powered Eye Tracking gives users a built-in option to navigate iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking takes only seconds to set up and calibrate, using the front-facing camera together with on-device machine learning.
Eye Tracking works across iPadOS and iOS apps and requires no additional hardware or accessories.
With Eye Tracking, users can navigate the elements of an app and use Dwell Control to activate each element, accessing functions such as physical buttons, swipes, and other gestures with their eyes alone.

Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone.
When this accessibility feature is turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations along with the music. Music Haptics works with songs in Apple Music and will be made available to developers as an API.
With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri understands to launch shortcuts and complete complex tasks.
Another new feature, Listen for Atypical Speech, offers an option to broaden the range of speech recognition, using on-device machine learning to recognize the user's speech patterns.

Designed for users whose speech is affected by conditions such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features build on those introduced in iOS 17, providing a new level of customization and control for users who cannot speak or are at risk of losing their ability to speak.
Vehicle Motion Cues is a new feature for iPhone and iPad that can help passengers reduce motion sickness.
The feature displays animated dots at the edges of the screen that represent changes in the vehicle's motion, helping to reduce sensory conflict without interfering with the main content. Using the sensors built into iPhone and iPad, Vehicle Motion Cues recognizes when the user is in a moving vehicle and responds accordingly.

The feature can be set to appear automatically on iPhone, or it can be turned on and off in Control Center.
visionOS will also bring a number of accessibility upgrades.
This year, visionOS will gain systemwide Live Captions to help all users, including those who are deaf or hard of hearing, follow spoken dialogue and app audio in real time.
On Apple Vision Pro, users will be able to move captions using the window bar during Apple Immersive Video, and the headset will add support for additional Made for iPhone (MFi) hearing devices and cochlear hearing processors.
Updates to vision accessibility will add Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision or who want to avoid bright lights and frequent flashing.
Other updates include improvements to VoiceOver, Magnifier, Braille Screen Input, braille keyboards, Hover Typing, Personal Voice, Live Speech, Virtual Trackpad, Switch Control, Voice Control, and more.


