Apple has unveiled a series of upcoming accessibility enhancements, including Eye Tracking, Music Haptics, Vocal Shortcuts, and Vehicle Motion Cues, set to be released later this year.
Among these updates, Eye Tracking is a standout feature powered by artificial intelligence. It allows users to control their iPad and iPhone with their eyes alone, offering a more accessible interaction method for people with physical disabilities. Setup is quick and simple, using the front-facing camera for calibration. Privacy is a priority: all eye-tracking data stays on the device and is not shared with Apple or any third parties. The feature works across iPadOS and iOS apps, with no additional hardware or accessories required.
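Apple has not published how Eye Tracking is implemented, but ARKit's public face-tracking API hints at how a front-camera gaze signal can be read entirely on device. The Swift sketch below is an illustration under that assumption, not Apple's code: `GazeReader` is a hypothetical class, while `ARFaceTrackingConfiguration` and `ARFaceAnchor.lookAtPoint` are ARKit's existing face-tracking configuration and per-frame gaze estimate.

```swift
import ARKit

// Hypothetical sketch: reading an on-device gaze estimate from ARKit's
// public face-tracking API. This is NOT Apple's Eye Tracking feature;
// it only illustrates that a front-camera gaze signal never has to
// leave the device.
final class GazeReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit updates face anchors every frame; `lookAtPoint` is the
    // estimated gaze target in the face anchor's coordinate space.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let gaze = face.lookAtPoint // simd_float3, in meters
            print("gaze target:", gaze)
        }
    }
}
```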
With Eye Tracking, users can navigate app interfaces by looking at on-screen elements and activating them with Dwell Control, which selects an element once the user's gaze rests on it for a brief period.
Users can also press buttons, swipe, and perform other gestures entirely with their eyes, with no physical touch required (see the sketch below).
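Apple does not expose Dwell Control's internals, but the behavior described above amounts to a timer keyed to a steady gaze point. The following is a minimal hypothetical sketch: `DwellSelector`, the one-second dwell duration, and the 30-point jitter radius are illustrative assumptions, not Apple's API or values.

```swift
import CoreGraphics
import Foundation

// Hypothetical dwell-selection helper: an element is "activated" once
// the gaze point holds within a small radius for a set duration.
final class DwellSelector {
    private let dwellDuration: TimeInterval = 1.0 // seconds of steady gaze (assumed)
    private let radius: CGFloat = 30              // gaze jitter tolerance, in points (assumed)
    private var anchorPoint: CGPoint?
    private var dwellStart: Date?

    /// Feed one gaze sample; returns true the moment a dwell completes.
    func update(gaze: CGPoint, at time: Date = Date()) -> Bool {
        if let anchor = anchorPoint, squaredDistance(gaze, anchor) <= radius * radius {
            // Gaze is holding steady near the anchor; fire once the timer elapses.
            if let start = dwellStart, time.timeIntervalSince(start) >= dwellDuration {
                dwellStart = nil // require the gaze to move before firing again
                return true
            }
        } else {
            // Gaze moved somewhere new; restart the dwell timer there.
            anchorPoint = gaze
            dwellStart = time
        }
        return false
    }

    private func squaredDistance(_ a: CGPoint, _ b: CGPoint) -> CGFloat {
        let dx = a.x - b.x, dy = a.y - b.y
        return dx * dx + dy * dy
    }
}
```

In a real pipeline, a client would feed this from a per-frame gaze stream and, when `update` returns true, synthesize a tap on whatever interface element sits under the anchor point.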
Tim Cook, Apple’s CEO, emphasized Apple’s commitment to inclusive design, stating that innovation enriches lives. He highlighted Apple’s longstanding dedication to accessibility, which the company has built into both its hardware and software for nearly four decades, and said Apple continues to push technological boundaries to enhance user experiences. Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, underscored the impact of the new features, stating that they will provide new communication methods, device control options, and mobility enhancements for a wide range of users.