Eye Tracking Control is Coming to iPhone and iPad

Key Takeaways

  • Apple is bringing an Eye Tracking control feature to iPhones and iPads.
  • You can navigate entirely with your eyes without using special hardware or apps.
  • Vision Pro users are also getting live captions for video calls and app audio.

Apple has unveiled a host of accessibility features coming later in the year, including an Eye Tracking option that lets people with disabilities control the iPhone and iPad hands-free.

The Eye Tracking technology uses AI to follow your eyes, and can activate on-screen items when your gaze dwells on them. All the data stays on your device, Apple said, and the feature doesn't require special app support or any extra hardware.

The company also hopes to make the Vision Pro more accessible to people with hearing issues. Live Captions will provide real-time speech-to-text for FaceTime calls and app audio, making it possible to call friends and join video meetings even if you can't hear the audio.

Other accessibility features are subtler than Eye Tracking and Live Captions, but potentially very helpful. Music Haptics will use the Taptic Engine (the iPhone's vibration system) to play taps, textures, and vibrations in sync with songs, so you can feel music even if you can't hear it. It will initially work with Apple Music, but Apple has promised a toolkit to build support into other apps.

Vocal Shortcuts will let iPhone and iPad users say custom keywords to perform more complicated tasks, while Listen for Atypical Speech will help those with conditions like ALS use voice control. And if you get motion sick while looking at your device in a moving car, a Vehicle Motion Cues feature will display moving dots to reduce the queasiness.

CarPlay will offer voice-only control, color filters for colorblind users, and sound recognition to identify horns and sirens.

Apple didn’t say exactly when the accessibility features would arrive, although they might be tied to iOS 18, iPadOS 18, and other platforms the company is expected to announce at WWDC in June.

Eye control isn't a new concept in computing; companies like Tobii have offered it for years. It has typically required specialized hardware and software, however, and remains relatively rare on mobile devices. If Apple's approach works as promised, it could make iPhones and iPads far more practical for people with paralysis or motor control issues.