Apple Unveils Visual, Sound, and Haptic Accessibility Features

May 15, 2024, 00:25
Across its many platforms and operating systems, Apple continues to offer valuable accessibility features that both help users overcome disabilities and enhance individual experiences. The company announced new accessibility features coming later in 2024, including new visual, sound, and haptic options that harness the power of Apple hardware and software, Apple silicon, artificial intelligence, and machine learning.

Among the new accessibility features, Apple demonstrated Eye Tracking, a way for users with physical disabilities to control iPad or iPhone with their eyes. Additionally, Music Haptics will offer users who are deaf or hard of hearing a new way to experience music using the Taptic Engine in iPhone; Vocal Shortcuts will allow users to perform tasks by making a custom sound; Vehicle Motion Cues can help reduce motion sickness when using iPhone or iPad in a moving vehicle; and more accessibility features will come to visionOS, the operating system for Apple's Vision Pro headset.

"Each year, we break new ground when it comes to accessibility," says Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. "These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world."

Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds. Because the feature relies on on-device machine learning, all data used to set it up and control it is kept securely on the device and is not shared with Apple.

Eye Tracking works across iPadOS and iOS apps and doesn't require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, performing actions such as button presses, swipes, and other gestures solely with their eyes.
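
Apple has not said how Eye Tracking is implemented, and it ships as a system setting rather than a developer API. As a rough illustration of the kind of on-device signal such a feature can build on, ARKit's face tracking already exposes a gaze estimate; the sketch below simply reads it (the class name and logging are ours, not Apple's):

```swift
import ARKit

// Minimal sketch: reading an on-device gaze estimate from ARKit face
// tracking. Illustrative only; this is not Apple's Eye Tracking feature.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the eyes converge, in face space.
            print("Gaze estimate: \(face.lookAtPoint)")
        }
    }
}
```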

Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations to the audio of the music. Music Haptics will work at launch across millions of songs in the Apple Music catalog, and will be available as an API for developers to make music more accessible in their apps.
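
Apple has not yet detailed the Music Haptics API. As a rough sketch of the underlying technology, the existing Core Haptics framework can already drive the Taptic Engine with the kinds of output described here, a transient "tap" followed by a continuous "texture"; the pattern below is our own illustration, not the Music Haptics API:

```swift
import CoreHaptics

// Illustrative only: a transient "tap" followed by a continuous "texture",
// the kinds of output Music Haptics layers over audio.
func playTapAndTexture() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0)],
        relativeTime: 0)
    let texture = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3),
        ],
        relativeTime: 0.2,
        duration: 1.0)

    let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```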

With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks. Listen for Atypical Speech, another new feature, gives users an option for enhancing speech recognition for a wider range of speech. Listen for Atypical Speech uses on-device machine learning to recognize user speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customization and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.
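
Apple has not described how Vocal Shortcuts hooks into third-party apps, but tasks exposed through the existing App Intents framework surface as shortcuts, which a user could plausibly bind to a custom utterance. A minimal, hypothetical intent (the name and behavior are ours):

```swift
import AppIntents

// A minimal App Intent. Exposed intents appear in Shortcuts, where a user
// could trigger them hands-free. The intent itself is hypothetical.
struct LogWaterIntent: AppIntent {
    static var title: LocalizedStringResource = "Log a Glass of Water"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real app logic would go here.
        return .result(dialog: "Logged one glass of water.")
    }
}
```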

"Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers," says Mark Hasegawa-Johnson, the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign’s principal investigator. "The Speech Accessibility Project was designed as a broad-based, community-supported effort to help companies and universities make speech recognition more robust and effective, and Apple is among the accessibility advocates who made the Speech Accessibility Project possible."

Vehicle Motion Cues is another new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles. Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, a conflict that can prevent some users from comfortably using iPhone or iPad while riding in a moving vehicle. With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content. Using sensors built into iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to appear automatically on iPhone, or can be turned on and off in Control Center.

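Apple has not published how Vehicle Motion Cues reads those sensors. For illustration, Core Motion's activity classifier can already tell an app, on device, that it is riding in a vehicle; a minimal sketch under that assumption:

```swift
import CoreMotion

// Minimal sketch: detecting automotive motion with Core Motion.
// Illustrative of the sensor signal involved, not Apple's implementation.
let activityManager = CMMotionActivityManager()

if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity, activity.automotive,
              activity.confidence != .low else { return }
        print("Device appears to be in a moving vehicle")
    }
}
```
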
CarPlay Gets Voice Control
Accessibility features coming to CarPlay include Voice Control, Color Filters, and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps with just their voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. For users who are colorblind, Color Filters make the CarPlay interface visually easier to use, with additional visual accessibility features including Bold Text and Large Text.
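
Sound Recognition in CarPlay is a system feature, but the on-device technique resembles what the SoundAnalysis framework already offers developers: a built-in classifier that labels environmental sounds. A sketch under that assumption (the "siren"/"car_horn" labels and the 0.8 threshold are ours):

```swift
import AVFoundation
import SoundAnalysis

// Illustrative sketch of on-device sound classification with the
// SoundAnalysis built-in classifier. Requires microphone permission.
final class AlertSoundListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        try analyzer.add(SNClassifySoundRequest(classifierIdentifier: .version1),
                         withObserver: self)
        self.analyzer = analyzer  // keep a strong reference for the session

        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) {
            buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        print("Heard \(top.identifier)")  // e.g. "siren" or "car_horn"
    }
}
```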

Accessibility features coming to visionOS will include systemwide Live Captions to help everyone — including users who are deaf or hard of hearing — follow along with spoken dialogue in live conversations and in audio from apps. Apple Vision Pro will add the capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors. Updates for vision accessibility will include the addition of Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.

For users who are blind or have low vision, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac. Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.

Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Japanese language availability for Braille Screen Input; support for multi-line braille with Dot Pad; and the option to choose different input and output tables.

For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin Chinese. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.

For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions. For users with physical disabilities, Virtual Trackpad for AssistiveTouch allows users to control their device using a small region of the screen as a resizable trackpad. Switch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches. Voice Control will offer support for custom vocabularies and complex words.
www.apple.com