With iOS 14, Apple brings countless new accessibility tools to its devices, including some that people without disabilities may find helpful too. The list ranges from the ability to adjust Transparency mode on AirPods Pro to capturing multiple images with the iPhone's Magnifier. And the new Back Tap feature lets you tap the back of your iPhone to take a screenshot, for example.
Many of the new improvements are likely to appeal to people who are deaf or have hearing loss, while other features will benefit users who are blind or visually impaired. All of them extend Apple's years-long effort to make its devices and software more accessible.
The improvements don't just apply to iPhones and iPads. Apple Watch users can now configure accessibility features while setting up a watch, and can turn on an extra-large watch face with bigger, bolder complications (at-a-glance information about things like the weather) to help people with visual impairments see them better.
Apple revealed iOS 14, iPadOS 14 and its other updated software on Monday during its annual Worldwide Developers Conference. The company uses WWDC to preview the major updates to its operating systems before they become available to all Apple device users later in the year. For now, developers and other beta testers get early versions of the software so they can build their apps and help Apple spot bugs before the improvements roll out. That includes the accessibility features.
The U.S. Centers for Disease Control and Prevention estimates that a quarter of Americans live with a disability. In the past, people with special needs had to spend thousands of dollars on technologies that enlarged their computer screens, gave navigation instructions, identified their money and recognized the color of their clothes. Today, users need only a smartphone or computer and a handful of apps and accessories to get through their physical and online worlds.
Apple has integrated accessibility features into its products for years. It offers technology that lets people with visual impairments navigate the iPhone's touchscreen, for example, and lets people with motor impairments virtually tap icons on the user interface. Its Made for iPhone hearing aid program certifies hearing aids that work with its devices. And two years ago, Apple gave users the ability to turn their iPhones and AirPods into remote microphones with its Live Listen feature.
iOS 14, iPadOS 14, watchOS 7 and Apple's other upcoming software expand these offerings.
Hearing features
- With Headphone Accommodations, users can adjust the audio frequencies streamed through their AirPods Pro, second-generation AirPods, select Beats headphones and EarPods. Each person can tune the settings to suit them by dampening or amplifying certain sounds. Users can set up to nine unique profiles (say, one for movies and another for calls), built from three amplification tunings at three different strengths.
- Transparency mode on AirPods Pro gets its own benefit from Headphone Accommodations: the ability to adjust how much of your environment you hear. Quiet voices can become crisper, and outside noise can become more detailed.
- Sound Recognition makes it easier for deaf people to be aware of audible warnings, alarms and notifications. When an iPhone, iPad or iPod Touch detects a certain type of sound or alert, it sends a notification to the user's device, including a paired Apple Watch. The system can detect alarms such as sirens, home smoke detectors and fire alarms, as well as household noises such as doorbells, car horns, appliance sounds and running water. Apple is also working on detecting sounds from people and animals.
- Group FaceTime calls now work better for people who use sign language instead of speaking. Normally, the person who is speaking becomes more prominent to the other participants in a group call, with that person's video tile getting bigger. With iOS 14, FaceTime can detect when someone is using sign language and make that person's video window prominent instead.
- The Noise app, introduced in watchOS 6 last year, measures ambient sound levels to give users a sense of how loud their environment is. With watchOS 7, users can also see how loudly they're listening to audio through headphones connected to an iPhone, iPod or Apple Watch. A hearing control panel shows in real time whether the audio is playing above the limit recommended by the World Health Organization: audio at 80 decibels can be played for about 40 hours a week without damaging hearing (a worked sketch of this rule follows the list). When the safe weekly listening level is reached, the Apple Watch sends the wearer a notification.
- Real-time text lets people with hearing problems or speech impairments communicate with two-way text in real time during a phone call. The iPhone has had RTT since 2017, but Apple has now made it easier for users to multitask while handling calls and incoming RTT messages. They'll receive notifications even when they're not in the Phone app and don't have the RTT conversation view open.
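To make the WHO guideline above concrete, here is a minimal Swift sketch of the equal-energy rule that underlies such limits: every 3-decibel increase in level halves the allowed listening time. This is an illustrative calculation only, not Apple's or the WHO's exact methodology.

```swift
import Foundation

/// Approximate safe weekly listening time using the equal-energy rule:
/// 40 hours at 80 dB, halved for every 3 dB above that baseline.
/// Illustrative only -- not Apple's or the WHO's implementation.
func safeWeeklyHours(atDecibels level: Double) -> Double {
    let baselineHours = 40.0   // WHO reference: about 40 hours per week...
    let baselineLevel = 80.0   // ...at 80 decibels
    let exchangeRate = 3.0     // each +3 dB halves the allowed time
    return baselineHours / pow(2, (level - baselineLevel) / exchangeRate)
}

print(safeWeeklyHours(atDecibels: 80))  // 40.0 hours
print(safeWeeklyHours(atDecibels: 89))  // 5.0 hours
print(safeWeeklyHours(atDecibels: 95))  // 1.25 hours
```

Under this rule, even a modest jump from 80 to 89 decibels cuts the safe weekly allowance from 40 hours to 5, which is why a live readout like the hearing control panel is useful.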
Vision features
- VoiceOver, Apple's technology that converts on-screen text to speech, gets some updates with iOS 14. It now uses Apple's on-device machine learning and Neural Engine to recognize and audibly describe more of what's happening on screen, even when third-party developers haven't enabled the feature in their apps (a developer-side sketch follows this list). An iPhone or iPad now automatically recognizes more of the objects, images, text and controls displayed on a screen, and VoiceOver gives more natural, contextual feedback. For images and photos, VoiceOver now reads full-sentence descriptions to detail what's on the screen. And it automatically detects user interface controls such as buttons, labels, toggles, sliders and indicators.
- Rotor, a gesture-based way to customize the VoiceOver experience, can do more than before. Users can already use it to adjust speech rate and volume, select special input types such as braille, or choose how VoiceOver moves from one element to the next on the screen. watchOS 7 brings the technology to the Apple Watch, letting users navigate by characters, words, lines, headings and links. And with macOS Big Sur, users can configure rotors with preferred braille tables and get access to more code customization options while developing apps in Xcode (a rotor sketch for developers also follows this list).
- Apple's Magnifier, one of the most commonly used accessibility features, gets an update with iOS 14 and iPadOS 14. Users can now zoom in on the area they're pointing at and capture multi-shot still images. They can also filter or brighten images to improve clarity, and capture multiple images at once to make it easier to review multi-page documents or longer content. Magnifier also works with multitasking on the iPad.
- Apple's new software extends braille support with Braille AutoPanning, which lets users pan across large amounts of braille text without pressing a physical pan button on their external refreshable braille displays.
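Although iOS 14's on-device recognition works even when developers have done nothing, apps still give VoiceOver the best material when they describe their interface explicitly. A minimal UIKit sketch, with a hypothetical button and strings:

```swift
import UIKit

// Explicit accessibility metadata: with it, VoiceOver can announce
// "Play, button. Plays the selected track." rather than relying on
// iOS 14's automatic control and screen recognition.
let playButton = UIButton(type: .system)
playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)
playButton.accessibilityLabel = "Play"                     // what it is
playButton.accessibilityHint = "Plays the selected track"  // what it does
playButton.accessibilityTraits = .button                   // how to treat it
```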
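On the developer side, apps can also add their own categories to the rotor through UIKit's custom rotor API. Here's a sketch, with hypothetical view and controller names, that lets VoiceOver users flick between a screen's headings:

```swift
import UIKit

class GalleryViewController: UIViewController {
    var headingViews: [UIView] = []   // hypothetical heading elements
    private var rotorIndex = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        // Adds a "Photo headings" category to the VoiceOver rotor; flicking
        // up or down while it's selected jumps between the heading views.
        let rotor = UIAccessibilityCustomRotor(name: "Photo headings") { [weak self] predicate in
            guard let self = self, !self.headingViews.isEmpty else { return nil }
            let step = predicate.searchDirection == .next ? 1 : -1
            self.rotorIndex = (self.rotorIndex + step + self.headingViews.count)
                % self.headingViews.count
            return UIAccessibilityCustomRotorItemResult(
                targetElement: self.headingViews[self.rotorIndex], targetRange: nil)
        }
        view.accessibilityCustomRotors = [rotor]
    }
}
```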
Back Tap
- One accessibility feature likely to catch on with many users is Back Tap. With this iOS 14 feature, iPhone users can perform a variety of quick actions by double- or triple-tapping the back of the phone. Users can trigger certain accessibility features or take a screenshot. They can also scroll, open Control Center, return to the home screen or bring up the app switcher.
- One thing Back Tap doesn't offer out of the box is launching the camera or taking a photo. Users can configure those actions by first creating a Siri Shortcut. The Shortcuts app, introduced two years ago, automates common and routine tasks. It lets users create custom commands, such as a request that pulls up a surf report, the current weather, travel time to the beach and a sunscreen reminder simply by saying, "Hey Siri, surf time." Those shortcuts can then be assigned to Back Tap in settings (a developer-side sketch follows this list).
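For developers who want their app's actions to appear as assignable shortcuts in the first place, one common route is donating an NSUserActivity to the system. A minimal sketch; the activity type, title and view controller are hypothetical:

```swift
import UIKit
import Intents

class SurfReportViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Donate this screen to the system so it can be offered as a
        // Siri Shortcut -- and, from there, assigned to Back Tap.
        let activity = NSUserActivity(activityType: "com.example.surf-report")
        activity.title = "Surf Time"
        activity.isEligibleForPrediction = true        // surface as a shortcut
        activity.persistentIdentifier = "surf-report"  // stable ID for later removal
        userActivity = activity                        // system donates it when current
    }
}
```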
Mobility and physical motor features
- Apple's Voice Control tool gains new British and Indian English voices as well as some new features. The technology, introduced at last year's WWDC, enables people with physical motor impairments to search on and operate their devices with voice commands. For example, users can ask for an emoji while dictating an email, or have a screen split into a numbered grid so they can replicate a tap or mouse click by calling out a number. Now device owners can use Voice Control together with VoiceOver to perform common VoiceOver actions such as Read All or activating on-screen controls. Apple has also added annotations and persistent grid or number overlays to make navigating a device by voice more consistent. And sleep/wake commands can now be handled separately when multiple devices are running at the same time.
Accessible coding
- Apple is expanding the accessibility of its Xcode coding tools. The company's Xcode Playgrounds and live previews are now more accessible to blind programmers, much as its coding curriculum for Swift Playgrounds has been for years. The hope is that making Xcode accessible will encourage more visually impaired people to become programmers.
Xbox Adaptive Controller support
- Apple's devices now support Microsoft's Xbox Adaptive Controller. That means players of Apple Arcade games, including on Apple TV, can use Microsoft's $100 device to make gaming more accessible. Players can plug switches, buttons, pressure-sensitive tubes and other devices into the controller to perform all the functions a standard controller normally handles (a short developer sketch follows this list).
- Apple also added support for other popular controllers, including the Xbox Wireless Controller with Bluetooth, the PlayStation DualShock 4 and MFi game controllers. They work alongside touch controls and the Siri Remote.
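From an app's perspective, the Xbox Adaptive Controller shows up through the same GameController framework as any other gamepad, so code that supports one generally supports the others. A minimal Swift sketch; the button handling is illustrative:

```swift
import GameController

// Watch for any controller connecting -- an Xbox Adaptive Controller,
// an Xbox Wireless Controller or a DualShock 4 all arrive the same way
// once they've been paired in Settings.
NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect, object: nil, queue: .main
) { notification in
    guard let controller = notification.object as? GCController else { return }
    print("Connected: \(controller.vendorName ?? "unknown controller")")

    // The extended gamepad profile covers the standard button/stick layout.
    controller.extendedGamepad?.buttonA.valueChangedHandler = { _, _, pressed in
        if pressed { print("A button pressed") }
    }
}
```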