Soon, you’ll control your iPhone with a glance

News | May 15, 2024 | 5 mins
Apple, iOS, iPad

Apple has announced a range of accessibility features including Eye Tracking, Music Haptics, and more.

Perhaps building on lessons learned by developing eye-tracking controls for the Vision Pro, Apple today announced AI-driven eye-tracking support for the iPad and iPhone. The idea is that it becomes possible to use apps on the iPad using only your eyes.

The feature is part of the latest package of accessibility features Apple unveiled to mark Global Accessibility Awareness Day. The company touts new accessibility features every year at this time — last year it introduced Assistive Access and Detection Mode in Magnifier, which join a lengthy list of tools that can help people.

“We believe deeply in the transformative power of innovation to enrich lives,” Apple CEO Tim Cook said in a statement. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software.”

The iPhone maker has always taken accessibility seriously and has garnered a lot of support from within some communities because it builds powerful accessibility tools into its platforms for free; at one point, it was alone in doing so. An extensive video playlist details its existing accessibility features and what they are already capable of.

What is Eye Tracking?

Eye Tracking on iPhone and iPad uses AI and the front-facing camera, works across apps, doesn’t require any additional hardware, and can be calibrated for use in “seconds,” the company said. The idea is that in combination with Dwell Control, users can navigate what’s on screen and activate those elements with their eyes. The system understands swipe and other gestures based on eye movement.

Apple also introduced several additional features, including Music Haptics and Vocal Shortcuts. The latter lets users perform tasks by making a custom sound, while the former aims to give deaf or hard-of-hearing users a route to gain some experience of music. It does so by playing taps, textures, and refined vibrations in time with the music’s audio.

(It must be noted that Music Haptics requires tracks to be prepared to support it. Apple says it already works across millions of songs in the Apple Music catalogue and promises to introduce a developer API so music can be made more accessible within apps.)

Using your own voice

Apple last year introduced Personal Voice. This allows you to train your device to create and use a voice like your own, and was intended in particular for people living with speech loss. Users can type what they want to say and have it read aloud in their own voice.

The company has gone further this year, introducing something it calls “Listen for Atypical Speech.” This uses on-device machine learning to recognize user speech patterns, and is designed to make speech recognition systems easier to use and more accurate for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke.

“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

Even more accessibility updates

Another new feature, Motion Cues, can help reduce motion sickness when using a device in a moving vehicle; the company also promised a range of additional accessibility features for visionOS — including Live Captions, which lets people follow live or recorded conversations with captions on the display.

As powerful as these new accessibility tools might be, Apple has more. The company says Voice Control and Sound Recognition, a tool for deaf or hard-of-hearing vehicle occupants, are coming to CarPlay.

The company also discussed numerous additional updates, including new voices in VoiceOver, a new Reader Mode in Magnifier that makes it possible to have documents read to you by pointing your iPhone at them, improved support for Braille, typing for users with low vision, a virtual trackpad, Switch Control, and more. Voice Control gains support for custom vocabularies and complex words.

“We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users,” said Cook.

Celebrate Global Accessibility Awareness Day

Finally, Apple has initiated extensive promotional activity to support Global Accessibility Awareness Day. This includes free Apple Store sessions to help customers use the accessibility features built into their devices, and a variety of relevant content across its App Store, books, TV, podcasts, and fitness services. 

Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

Jonny Evans

Hello, and thanks for dropping in. I'm pleased to meet you. I'm Jonny Evans, and I've been writing (mainly about Apple) since 1999. These days I write my daily AppleHolic blog at Computerworld.com, where I explore Apple's growing identity in the enterprise. You can also keep up with my work at AppleMust, and follow me on Mastodon, LinkedIn and (maybe) Twitter.