Apple's new feature lets you control your iPhone with your eyes
Apple has announced a range of new accessibility features for iPhone and iPad users. The features were revealed ahead of Global Accessibility Awareness Day (observed on the third Thursday of May each year) and are designed to cater to a wide range of user needs. The tech giant aims to make its products more accessible by introducing eye tracking, vocal shortcuts, a haptic music experience, and more.
Eye-tracking tech: A game changer for accessibility?
Apple's new eye-tracking feature, powered by artificial intelligence (AI), lets users navigate apps with their eyes via the front-facing camera. The built-in option uses AI to determine where the user is looking and which gesture they intend to perform, such as a swipe or a tap. A feature called "Dwell Control" senses when a person's gaze lingers on an element, indicating they want to select it. The technology marks a significant step forward in device accessibility.
'Vocal Shortcuts' and 'Listen for Atypical Speech'
Apple has also introduced "Vocal Shortcuts," an improvement on its voice-based controls. The feature lets users assign custom sounds or words to launch shortcuts and complete tasks; for example, Siri can launch an app when the user utters something as simple as "Ah!" Another new feature, "Listen for Atypical Speech," uses machine learning to recognize unique speech patterns, catering specifically to users with conditions that affect speech, such as amyotrophic lateral sclerosis (ALS), cerebral palsy, and stroke.
'Music Haptics' and 'Vehicle Motion Cues' enhance user experience
For users who are deaf or hard of hearing, Apple has introduced "Music Haptics," which lets them experience songs in Apple Music through a series of taps, textures, and refined vibrations. The feature will also be available as an API for music app developers. Additionally, a new feature called "Vehicle Motion Cues" aims to reduce motion sickness in cars by displaying animated dots at the edges of the screen that move with changes in the vehicle's motion.
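For developers, the promise of a Music Haptics API is the most interesting detail, though Apple has not yet published documentation for it. As a rough illustration of what audio-driven haptics involve, the sketch below uses Apple's existing Core Haptics framework rather than the unreleased Music Haptics API; the class name and the beat-callback idea are assumptions for illustration only.

import CoreHaptics

// Illustrative sketch: pairs a transient haptic "tap" with a beat detected in audio.
// Uses the existing Core Haptics framework; the actual Music Haptics API may differ.
final class HapticBeatPlayer {
    private var engine: CHHapticEngine?

    init?() {
        // Haptics require hardware with a supported Taptic Engine.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return nil }
        engine = try? CHHapticEngine()
        try? engine?.start()
    }

    // Plays a short tap whose strength mirrors the loudness of the detected beat.
    func playBeat(intensity: Float) {
        let event = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: 0
        )
        guard let pattern = try? CHHapticPattern(events: [event], parameters: []),
              let player = try? engine?.makePlayer(with: pattern) else { return }
        try? player.start(atTime: CHHapticTimeImmediate)
    }
}

// Hypothetical usage: call playBeat(intensity:) from an audio-analysis callback,
// once per detected beat, scaling intensity to the beat's loudness.
let hapticPlayer = HapticBeatPlayer()
hapticPlayer?.playBeat(intensity: 0.9)

In practice, Music Haptics will presumably handle the audio analysis itself; the sketch only shows how taps and vibrations are expressed on today's iPhones.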
CarPlay and visionOS updates improve accessibility
Apple's CarPlay is also receiving an update, including a "Voice Control" feature, "Color Filters" for colorblind users, and "Sound Recognition," which notifies users who are deaf or hard of hearing about car horns and sirens. Apple has additionally introduced an accessibility feature for visionOS that enables live captions during FaceTime calls. These updates are expected to debut with iOS 18 and iPadOS 18 later this year, with all data used to control the features stored securely on the device.