While we all eagerly anticipate WWDC 2023, Apple has offered a pleasant surprise: a sneak peek at the accessibility features coming in iOS 17. The announcement is all the more significant for arriving just ahead of Global Accessibility Awareness Day.
Apple has given fans a glimpse of features that will ship later this year. Before we get to try them firsthand, let’s walk through an overview of the accessibility features in iOS 17.
Apple previews new accessibility features in iOS 17
Enhancing people’s lives has always been central to Apple’s mission, and in keeping with it, the company has developed a range of accessibility features for iOS 17. These features are designed to assist users with cognitive, vision, hearing, and mobility impairments, and they are powered by on-device machine learning.
During the announcement, Apple CEO Tim Cook expressed his enthusiasm, stating, “Today, we are thrilled to unveil remarkable new features that further build upon our longstanding commitment to making technology accessible, ensuring that everyone has the opportunity to create, communicate, and pursue their passions.”
Now, let’s explore these upcoming iPhone features and see what Apple has in store for us.
1. Assistive Access
Assistive Access is a cognitive accessibility feature that aims to make iPhone and iPad easier to use independently. It distills apps down to their essential elements, reducing cognitive load. Apple collaborated closely with individuals with cognitive disabilities to ensure the design reflects their real needs.
Essential apps such as Camera, Photos, Music, Calls, and Messages will feature high-contrast buttons, large text labels, and customizable options in Assistive Access. Users can choose between a visual, grid-based layout and a text-focused, row-based one, tailoring the experience to their preferences. These changes aim to improve ease of use and promote greater independence.
In addition, Apple has consolidated Phone and FaceTime into a single Calls app, giving users easy access to both. Messages gains an emoji-only keyboard for those who prefer to communicate visually, along with the option to record and send video messages to loved ones.
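If you’re curious what this pared-down design might look like in code, here is a minimal SwiftUI sketch that mimics the Assistive Access look: a high-contrast grid of oversized, clearly labeled buttons. It is purely illustrative; Apple has not previewed a developer API for Assistive Access, so the view name and app list here are hypothetical.

```swift
import SwiftUI

// Illustrative sketch only: Assistive Access is a system mode, not a public
// API. This plain SwiftUI view simply mimics its design principles --
// a high-contrast grid of large, clearly labeled buttons.
struct SimplifiedHomeView: View {
    // Hypothetical app list for the demo.
    let apps = ["Calls", "Messages", "Camera", "Photos", "Music"]

    var body: some View {
        ScrollView {
            LazyVGrid(columns: [GridItem(.adaptive(minimum: 150))], spacing: 16) {
                ForEach(apps, id: \.self) { name in
                    Button(action: { /* launch placeholder */ }) {
                        VStack {
                            Image(systemName: "app.fill")
                                .font(.system(size: 60))
                            Text(name)
                                .font(.largeTitle.bold())   // large text labels
                        }
                        .frame(maxWidth: .infinity, minHeight: 160)
                        .foregroundColor(.white)
                        .background(Color.blue)             // high-contrast buttons
                        .cornerRadius(16)
                    }
                }
            }
            .padding()
        }
    }
}
```

Swapping the grid for a VStack of full-width rows would approximate the text-focused, row-based layout instead.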
2. Live Speech and Personal Voice Advance Speech Accessibility
Apple has introduced Live Speech and Personal Voice to advance speech accessibility, particularly for users at risk of losing their ability to speak, such as people diagnosed with ALS (amyotrophic lateral sclerosis). Together, the two features aim to help people keep communicating, even in a voice that sounds like their own.
With Live Speech, users can type what they want to say during phone and FaceTime calls, and the iPhone will speak the text aloud, keeping communication accessible even when speaking verbally isn’t possible. Users can also save frequently used phrases for quick access mid-conversation.
Meanwhile, Personal Voice lets users generate a synthesized voice that resembles their own. By reading a set of text prompts and recording 15 minutes of audio on their iPhone, they can create a customized voice that integrates seamlessly with Live Speech. Apple remains committed to keeping these conversations private, protecting user data throughout the process.
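Apple hasn’t said whether Live Speech or Personal Voice will be exposed to third-party apps, but the basic type-to-speak idea can be sketched with AVSpeechSynthesizer, a public API that has shipped in iOS for years. Consider this a conceptual stand-in rather than the mechanism Live Speech actually uses:

```swift
import AVFoundation

// Conceptual sketch: Live Speech itself is a built-in system feature, but the
// same "type to speak" idea can be demonstrated with the public
// AVSpeechSynthesizer API. The class name here is hypothetical.
final class TypedSpeech {
    private let synthesizer = AVSpeechSynthesizer()

    // Speak whatever the user has typed.
    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Usage:
// let speech = TypedSpeech()
// speech.speak("I'll be ten minutes late.")
```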
3. Point and Speak
The Magnifier app’s Detection Mode will gain a new vision accessibility feature called Point and Speak. Designed for users who are blind or have low vision, it recognizes the text a user points at and reads it aloud, making it far easier to interact with real-world objects that carry text labels.
Point and Speak works by combining input from the Camera app, the LiDAR Scanner, and on-device machine learning. It is compatible with VoiceOver and can be used alongside other Magnifier features such as People Detection, Door Detection, and Image Descriptions, helping users with vision disabilities navigate their surroundings with greater independence and confidence.
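Point and Speak is a built-in Magnifier feature rather than an API, but its recognize-then-speak pipeline can be approximated with the public Vision and AVFoundation frameworks. The sketch below performs on-device text recognition on a still image and speaks the result; the real feature additionally fuses live camera frames, LiDAR depth, and finger tracking, none of which are modeled here:

```swift
import UIKit
import Vision
import AVFoundation

// Keep the synthesizer long-lived so speech isn't cut off by deallocation.
private let synthesizer = AVSpeechSynthesizer()

// Hypothetical helper: recognize text in a still image, then speak it.
func readTextAloud(from image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // On-device OCR via the Vision framework.
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Take the best candidate string from each detected text region.
        let recognized = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: ". ")
        guard !recognized.isEmpty else { return }

        // Speak the recognized text, loosely mirroring Point and Speak.
        synthesizer.speak(AVSpeechUtterance(string: recognized))
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```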
Cheers to empowerment!
The upcoming iOS 17 features exemplify Apple’s dedication to inclusivity and to empowering disabled users around the globe. By collaborating directly with disability communities, Apple ensures these features address the real challenges people face, while the reliance on on-device machine learning keeps user privacy and data protection front and center. Like you, I’m excited to try these features out and see the difference they make in accessibility and everyday use.