Apple packs iOS 14 with new accessibility features, like AirPods Pro audio tweaks


Apple has designed new accessibility features for its popular devices. 

Angela Lang/CNET

With iOS 14, Apple brings tons of new accessibility features to its devices, including some that people without disabilities could also find helpful. The list ranges from the ability to customize Transparency mode in AirPods Pro to capturing multiple frames with the iPhone Magnifier function. And the new Back Tap feature lets you tap the backside of your iPhone to do things like take a screenshot. 

Many of the new enhancements will likely appeal to people who are deaf or have hearing loss, while other features will benefit users who are blind or low vision, expanding on Apple’s efforts over the years to make its devices and software more accessible. 

The improvements aren’t just for iPhones and iPads. Apple Watch users will now have the option to configure accessibility features while setting up a watch, as well as turn on an extra-large watch face with bigger, bolder “complications” (glanceable bits of info about things like the weather) to help people with low vision see them better. 

On Monday, Apple unveiled iOS 14, iPadOS 14 and its other updated software during its annual Worldwide Developers Conference. The company uses WWDC to show off the biggest updates to its operating systems before making them available to all Apple device users later in the year. Right now, developers and other beta testers have access to early versions of the software so they can update their apps and help Apple find bugs before the improvements roll out broadly. That includes accessibility features.

iOS 14's Sound Recognition setup screen.

Apple has long offered accessibility features across its devices, including the ability to turn AirPods into remote microphones through its Live Listen feature. 

iOS 14, iPadOS 14, WatchOS 7 and Apple’s other upcoming software expand those offerings. 

Hearing features

  • Headphone Accommodations lets users adjust the frequencies of audio streamed through their AirPods Pro, second-generation AirPods, select Beats headphones and EarPods. Each person can tune the settings to what’s right for them, either dampening or amplifying particular sounds. Users can set up to nine unique profiles (like one for movies and another for calls) that combine three amplification tunings with three strengths.
  • AirPods Pro Transparency mode gets its own benefit from Headphone Accommodations: the ability to customize how much of the surrounding environment you hear. Quiet voices can become crisper, and outside environmental sounds can become more detailed.
  • Sound Recognition makes it easier for people who are deaf to be aware of sound-based alerts, alarms and notifications. When an iPhone, iPad or iPod Touch picks up a particular type of sound, it sends a notification to the user’s devices, including an Apple Watch. The system can detect alarms such as sirens, home smoke alarms and building fire alarms, as well as household noises like doorbell chimes, car horns, appliance beeps and running water. Apple also is working on detecting sounds from people or animals. (A developer-side sketch of similar sound classification follows this list.)
  • Group FaceTime calls will now accommodate people who use sign language instead of speaking. Typically, in a group call, the person speaking appears more prominently to the other participants, with that person’s video box becoming larger. With iOS 14, FaceTime will be able to detect when someone is signing and make that person’s video window prominent. 
  • The Noise app, introduced in last year’s WatchOS 6, measures ambient sound levels to give users a sense of how loud their surroundings are. With WatchOS 7, customers will also be able to see how loudly they’re listening to audio through headphones connected to their iPhone, iPod or Apple Watch. A hearing control panel shows in real time whether audio is playing above the World Health Organization’s recommended limit of 80 decibels for no more than 40 hours a week. When the wearer reaches the safe weekly listening amount, the Apple Watch sends a notification. (A sketch of reading these headphone levels from HealthKit follows this list.)
  • Real-Time Text lets people who have hearing difficulties or speech disabilities communicate using two-way text in real time while on a phone call. The iPhone has had RTT since 2017, but Apple has now made it simpler for users to multitask while interacting with calls and incoming RTT messages. They’ll get notifications even when they’re not in the phone app and don’t have RTT conversation view enabled. 
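The system-level Sound Recognition setting has no public API, but developers can build similar on-device detection with Apple's SoundAnalysis framework, available since iOS 13. Here's a minimal sketch: "DoorbellClassifier" stands in for a hypothetical Core ML sound classifier trained with Create ML, while the framework calls themselves are real.

```swift
import AVFoundation
import CoreML
import SoundAnalysis

// Hypothetical sketch of on-device sound classification with SoundAnalysis.
// DoorbellClassifier is a made-up Create ML model; requires microphone
// permission (NSMicrophoneUsageDescription in Info.plist).
final class DoorbellListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        self.analyzer = analyzer

        // Wrap the (hypothetical) Core ML model in a classification request.
        let model = try DoorbellClassifier(configuration: MLModelConfiguration()).model
        try analyzer.add(try SNClassifySoundRequest(mlModel: model), withObserver: self)

        // Feed microphone buffers into the analyzer as they arrive.
        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }

    // Called whenever the classifier produces a result for a buffer.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first, top.confidence > 0.8 else { return }
        print("Heard: \(top.identifier)") // e.g. schedule a local notification here
    }
}
```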
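The headphone levels behind these Noise features are also exposed to apps through HealthKit as the headphoneAudioExposure quantity type, added in iOS 13. A minimal sketch, assuming an app already configured with the HealthKit entitlement and usage descriptions:

```swift
import HealthKit

// Sketch: read recent headphone audio exposure samples from HealthKit.
let store = HKHealthStore()
let exposure = HKQuantityType.quantityType(forIdentifier: .headphoneAudioExposure)!

store.requestAuthorization(toShare: nil, read: [exposure]) { granted, _ in
    guard granted else { return }
    let query = HKSampleQuery(sampleType: exposure, predicate: nil,
                              limit: 25, sortDescriptors: nil) { _, samples, _ in
        for case let sample as HKQuantitySample in samples ?? [] {
            // Values are A-weighted sound pressure levels in decibels.
            let dB = sample.quantity.doubleValue(for: .decibelAWeightedSoundPressureLevel())
            print("\(dB) dBA from \(sample.startDate) to \(sample.endDate)")
        }
    }
    store.execute(query)
}
```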

Vision features

  • VoiceOver, Apple’s technology that translates on-screen text into speech, gets some updates with iOS 14. It now taps into Apple’s on-device machine learning and Neural Engine to recognize and audibly describe more of what’s happening on screen, even when third-party developers haven’t enabled the ability in their apps. An iPhone or iPad will now automatically provide better optical recognition of objects, images, text and controls displayed on a screen, and VoiceOver gives more natural and contextual feedback. For images and photos, VoiceOver now reads complete sentence descriptions of what’s on the screen. And it automatically detects user interface controls like buttons, labels, toggles, sliders and indicators. (A short sketch of the explicit labeling VoiceOver relies on follows this list.)
  • Rotor, a gesture-based way to customize the VoiceOver experience, now can do more than before. The system already lets users make tweaks like adjust the speaking rate and volume, select special types of input such as braille or adjust how VoiceOver moves from one item to the next on the screen. WatchOS 7 brings the technology to Apple Watches, letting users customize characters, words, lines, headings and links. And with MacOS Big Sur, users can configure Rotors with preferred braille tables and access more options to adjust code while developing apps in Xcode. 
  • Apple’s Magnifier technology, one of its most-used accessibility features, gets an upgrade with iOS 14 and iPadOS 14. It now lets users magnify more of the area they’re pointing at and capture multishot freeze frames. They can also filter or brighten images for better clarity, and reviewing several captured frames at once makes it simpler to work through multipage documents or longer content. Magnifier also works with multitasking on the iPad.
  • Apple’s new software expands support for Braille with Braille AutoPanning. It lets users pan across larger amounts of Braille text without needing to press a physical pan button on their external refreshable displays.
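For context on what the new automatic recognition supplements, here is a short UIKit sketch of the hand-written labels and hints developers have long provided for VoiceOver; the view names, asset name and strings are illustrative.

```swift
import UIKit

// Illustrative sketch: explicit accessibility metadata for VoiceOver.
// "q2-sales-chart" is a hypothetical image asset.
let chart = UIImageView(image: UIImage(named: "q2-sales-chart"))
chart.isAccessibilityElement = true
chart.accessibilityLabel = "Bar chart of quarterly sales, peaking in June"

let submit = UIButton(type: .system)
submit.setTitle("Submit", for: .normal)
submit.accessibilityHint = "Sends the completed order form" // spoken after the label
```

Labels like these still give the most accurate spoken feedback; iOS 14's on-device recognition acts as a fallback when developers haven't supplied them.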

Back Tap

  • One accessibility feature that many people could end up using is Back Tap. The feature, found in iOS 14, lets iPhone users trigger a variety of quick actions by double- or triple-tapping the back of an iPhone. Users can turn on specific accessibility features or take a screenshot. They can also scroll, open Control Center, go to the home screen or open the app switcher. 
  • One thing Back Tap doesn’t easily do is launch the camera or take a photo. Users can configure those actions by first making a Siri Shortcut. The Shortcuts app, introduced two years ago, automates common and routine tasks. With Shortcuts, people have been able to create customized commands, like setting up a request that brings together a surf report, current weather, travel time to the beach and a sunscreen reminder, all by just saying, “Hey Siri, surf time.” Those Shortcuts can then be mapped to Back Tap in Settings, as sketched below.
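On the developer side, one established way an app's actions show up in the Shortcuts app (and from there become mappable to Back Tap) is by donating an NSUserActivity, a mechanism available since iOS 12. A hedged sketch, with a made-up activity type and view controller:

```swift
import Intents
import UIKit

// Hypothetical sketch: donating an "Open Camera" action to the system so it
// appears in Shortcuts. The activity type string must also be listed under
// NSUserActivityTypes in Info.plist for the app to resume it.
final class CameraViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let activity = NSUserActivity(activityType: "com.example.app.open-camera")
        activity.title = "Open Camera"
        activity.isEligibleForPrediction = true          // lets Shortcuts offer it
        activity.suggestedInvocationPhrase = "Surf time" // optional Siri phrase
        userActivity = activity
        activity.becomeCurrent()                         // donates it to the system
    }
}
```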

Mobility/physical motor features

  • Apple’s Voice Control tool gets new British English and Indian English voices, as well as some new capabilities. The technology, introduced at last year’s WWDC, allows people with physical motor limitations to browse and operate their devices by issuing voice commands. It lets users do things like ask for an emoji while dictating an email, or divide the screen into a numbered grid so they can replicate a screen tap or mouse click by calling out a number. Now Apple device owners can use Voice Control along with VoiceOver to perform common VoiceOver actions like “read all” or “activate” a display control. Apple also has built in hints and persistent grid or number overlays to make voice navigation more consistent, and Sleep/Wake commands can now be issued to individual devices when several are in use at the same time. 

Accessible coding

  • Apple is expanding the accessibility of its Xcode coding tools. The company’s Xcode Playgrounds and Live Previews will be more accessible to coders who are blind, similar to how its Swift Playgrounds coding curriculum has been accessible for years. The hope is that making Xcode accessible, too, will encourage more people with low vision to become coders. 

Xbox Adaptive Controller support

  • Apple’s devices will now support the Microsoft Xbox Adaptive Controller. That means people playing games in Apple Arcade, including on Apple TVs, will be able to use Microsoft’s $100 device that was designed to make gaming more accessible. Gamers can plug switches, buttons, pressure-sensitive tubes and other gear into the controller to handle any function a standard controller performs. 
  • Apple already supports other popular controllers, including Xbox Wireless Controllers with Bluetooth, the PlayStation DualShock 4 and MFi game controllers. Games also work with touch controls and the Siri Remote. (A short connection sketch follows.)
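For developers, all of these controllers arrive through the same GameController framework, the Xbox Adaptive Controller included. A minimal sketch, with illustrative handler bodies, that listens for any supported controller connecting:

```swift
import GameController

// Sketch: react when any supported game controller connects.
NotificationCenter.default.addObserver(forName: .GCControllerDidConnect,
                                       object: nil, queue: .main) { note in
    guard let controller = note.object as? GCController else { return }
    print("Connected: \(controller.vendorName ?? "controller")")

    // The extended gamepad profile covers the standard button/stick layout.
    controller.extendedGamepad?.buttonA.pressedChangedHandler = { _, _, pressed in
        if pressed { print("A pressed") }
    }
}
GCController.startWirelessControllerDiscovery(completionHandler: nil)
```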

