Apple treats accessibility as a top-tier priority, which makes the entire experience better for everyone.
With the recent release of iOS 14, iPadOS 14, and watchOS 7, we’re all giggling with excitement over awesome new features like emoji watch faces and Home screen widgets (you should check out our iOS 14 review, iPadOS 14 review, and watchOS 7 review for more on the new features), but some of the best features that get overlooked have to do with accessibility. Apple has always been a leader in the accessibility field, and I think Apple should wear its accessibility achievements on its chest as a badge of honor. I often speak to tech lovers who tell me they only use Apple devices, or have switched to Apple products, because of the many useful accessibility features.
This success doesn’t come from simply adding on an accessibility option after everything’s already been done. It comes from Apple putting accessibility at the top of its list of priorities. Accessibility isn’t an afterthought for Apple. It’s a before thought.
Accessibility features benefit everyone. Mouse and trackpad support started as an adaptive pointer device feature. Back Tap, the double-tap-the-back-of-your-phone feature in iOS 14? That’s an accessibility thing. We all get to take advantage of this useful feature, but it was built for folks with limited motor skills.
This year, Apple has added new accessibility features and improved old ones so they are easier to access, more robust, and better for everyone.
Hearing needs differ vastly across all people. Headphones shouldn’t be a one-setting-for-all situation and Apple is changing that for AirPods 2, AirPods Pro, and some Beats headphones. For example, I have tinnitus, which is a constant faint (sometimes loud) ringing in my ear. This makes listening to bright sounds painful. My brother, however, is deaf in one ear and prefers audio levels to have a brighter tone because he can pick up individual sounds better.
Headphone Accommodations is a set of AirPods and Beats settings that adjusts certain frequencies, amplifying or dampening particular sounds so you can tune your audio to your particular hearing needs.
You can adjust the EQ of AirPods 2, AirPods Pro, and Beats headphones with presets for Balanced Tone, Vocal Range, or Brightness, and adjust the strength of those presets. You can further customize your audio experience for phone calls and for movies and music, and use an audiogram to set up unique audio profiles for different activities. You can set up to nine audio profiles.
Specifically, with AirPods Pro, you can use Transparency mode to make voices and environmental sounds louder. Not only is this safer for people with certain hearing impairments, letting them take in environmental sounds more clearly, but the feature also acts as a sort of general hearing enhancement.
For years, Apple has had accommodations for receiving alerts on your phone with a visual aid (Settings > Accessibility > Audio/Visual > LED Flash for Alerts) to help people with hearing impairments know when they’re receiving an alert. In iOS 14, iPadOS 14, and watchOS 7, users can set up Sound Recognition, which will alert you to certain sounds, like a smoke alarm, siren, or even a doorbell. Apple is working on adding more recognizable sounds, like people and animals.
This is another example of technology built for accessibility purposes that can also benefit the wider tech-loving community. Many of us are working from home lately, and we are sometimes in meetings with headphones on. If someone rings the doorbell, we may not hear it because of our headphones. This sound alert could give you the notification you need to answer the door before the delivery person leaves with your important package.
FaceTime Sign Language recognition
When you’re in a group FaceTime call with multiple little people bubbles, FaceTime usually makes the current speaker’s bubble larger on the screen. With the iOS 14 and iPadOS 14 accessibility update, FaceTime will recognize when someone is speaking in Sign Language and make that person’s bubble bigger.
For those using Real-Time Text for phone calls, you can switch out of the phone call to access another app on your phone. When the person on the other end of the call speaks, you’ll receive a notification with what was said. You can also respond to that notification without having to switch back to the phone call.
Hearing health upgrades
The Noise app, which first appeared in watchOS 6, sends you a notification on your wrist when you’re in an environment that’s louder than the recommended decibel level. In watchOS 7, this works with headphones, too.
When you’re blasting your favorite tunes in your earbuds (doesn’t have to be Apple headphones), you’ll get that same notification if the sound levels are too loud. This works with listening to music on your iPhone, iPad, and Apple Watch. There is also a new user interface in the Health app that lets you keep track of your listening levels. If you’ve reached the WHO’s recommended maximum weekly listening level limits, you’ll get a notification.
Taking advantage of Apple’s Bionic chips and Neural Engine, iPhones and iPads now have additional VoiceOver features that go beyond just button recognition. If a developer doesn’t include audio descriptions for some element of their user interface (which happens more often than it should), VoiceOver will automatically supplement them with optical recognition. In addition to buttons, iPhone and iPad will recognize elements like sliders, tables, and other UIKit interfaces. Apple says it can even recognize custom interface elements added in an app.
Apple’s Neural Engine is also being used to add more context to image descriptions in apps and on the web. Currently, you might come across an image of two people sitting in chairs at a table, but thanks to image recognition, VoiceOver can add context to the image and describe two people sitting at a restaurant enjoying a meal. Context is everything.
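Automatic recognition is a fallback, not a substitute for developers labeling their interfaces. As a minimal sketch (the view controller and element names here are hypothetical, not from any specific app), this is roughly how a UIKit developer supplies the descriptions VoiceOver reads directly, so recognition never has to guess:

```swift
import UIKit

final class PlayerViewController: UIViewController {
    private let playButton = UIButton(type: .system)
    private let volumeSlider = UISlider()
    private let artworkView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // An icon-only button says nothing to VoiceOver on its own;
        // an explicit label means it is read as "Play, button".
        playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)
        playButton.accessibilityLabel = "Play"
        playButton.accessibilityHint = "Starts playback"

        // Sliders benefit from both a label and a spoken value.
        volumeSlider.accessibilityLabel = "Volume"
        volumeSlider.accessibilityValue = "50 percent"

        // Images aren't accessibility elements by default; opting in
        // with a description is what image recognition tries to
        // approximate when developers leave this out.
        artworkView.isAccessibilityElement = true
        artworkView.accessibilityLabel = "Album artwork"
    }
}
```

When labels like these are missing, the new VoiceOver Recognition features step in with their best machine-generated guess.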
Coding for accessibility by accessibility
Apple has also added features to Swift for sight-impaired coders. Swift Blind Preview gives coders more access to preview their apps and games. And Xcode Playgrounds have more accessibility features, too.
I’m sure you’ve heard about this one. Apple added Back Tap, a feature that lets you double- or triple-tap the back of your iPhone to trigger an action, and you can assign a different action to each gesture.
There are already nearly two dozen actions you can choose from, but Back Tap also supports Shortcuts. That means you can trigger any Shortcut you want by double or triple-tapping the back of your iPhone.
Remember that useful Shortcut for quickly setting your iPhone to start filming and then storing the clip in the cloud? What if you didn’t have the ability to say, “Hey Siri, start filming” but you could, instead, subtly double-tap the back of your iPhone?
Yet another example of how advancements in accessibility tech can help everyone.
Apple Watch setup
When you set up an Apple Watch as new in watchOS 7, you can configure your accessibility settings right out of the gate instead of having to follow up after the Watch is set up.
More vision-enabling updates
Magnifier is, more and more, becoming my most-used accessibility feature as my eyesight starts to diminish. New Magnifier features in iOS 14 and iPadOS 14 include on-screen controls for filters, adjustments, and additional lighting. You can also take more than one snapshot (not saved to Photos) so you can review them side by side instead of only being able to view one shot at a time. Magnifier can also magnify more of an area, and it gains multitasking support on iPad as well.
Braille gets the AutoPanning feature, which allows for panning braille text without needing to press a physical button.
Rotor is supported on Apple Watch in watchOS 7, and there are a few new Rotor features coming to macOS Big Sur.
Voice Control gets new features, including British English and Indian English voices, the ability to use alongside VoiceOver, and improvements for Sleep/Wake commands.
Apple Pay and Face ID can be triggered without needing to press the side button. Instead, you can set a trigger with AssistiveTouch or Switch Control.
And, of course, big news for gamers with mobility needs, Apple Arcade and Apple TV will support Xbox Adaptive Controllers (XAC). XACs are much-beloved in the accessibility community, creating a customizable way for different people to game in whatever way works best for them.
More to come
Every time Apple updates its operating systems, it adds improvements to accessibility features to help anyone and everyone use its technologies. I look forward to finding out what Apple has next on its list of accessibility features and updates.
How about you? Do you have any particular favorite accessibility features that you use all the time? Share your uses in the comments.
Updated September 2020: Updated for the official launch of iOS 14, iPadOS 14, and watchOS 7.