Apple has announced a host of new accessibility features for its iPhone, iPad and Mac devices. The new features will be released later this year and are expected to be part of iOS 17, which could be previewed at this year's WWDC.
Apple CEO Tim Cook shared the news about the new accessibility features on Twitter, writing, "At Apple, we believe technology should be designed to help everyone do what they love. We're excited to preview new accessibility features to help even more people follow their dreams."
Here are the new accessibility features announced by Apple:
Assistive Access:
The new Assistive Access feature distills Apple's applications and experiences down to their most essential features. It also combines Phone and FaceTime into a single app. Aimed at people with cognitive disabilities, Assistive Access is designed to reduce the "cognitive load" on users while tailoring the iPad and iPhone experience to their needs.
While introducing the new feature, Apple said, "The feature offers a distinct interface with high contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support."
Live Speech and Personal Voice Advance Speech Accessibility:
Live Speech will allow users to type what they want to say and have it spoken aloud during a phone call or FaceTime conversation. The feature will also let users save commonly used phrases and have them spoken out loud during a conversation.
Interestingly, users can create a voice that sounds just like them using the Personal Voice feature. It will be particularly helpful for users with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively affect the ability to speak.
While talking about the Personal Voice feature, Apple said, "Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad."
Point and Speak:
Aimed at users with visual disabilities, Point and Speak will make it easier to use physical objects that have text labels. Built into the Magnifier app, Point and Speak combines input from the camera and the LiDAR Scanner with on-device machine learning to help users interact with day-to-day physical objects like microwaves or refrigerators.