Apple is testing Conversation Boost, an AirPods Pro feature that focuses the microphones on the person in front of you, boosting their voice without cutting out the sounds of the world around you. Currently in beta, the feature continues to blur the line between augmented reality and accessibility.

“This is definitely a feature that all users, even those with normal hearing, would like to have because we are often in noisy environments and this amplifies the voices that we care about while rejecting other sounds,” John Carter, former chief engineer at Bose, told Lifewire via email.
How Conversation Boost Works
Apple detailed Conversation Boost during this year’s Worldwide Developer Conference keynote. It relies on beamforming microphones, which are designed to detect the direction and distance of incoming audio. Combined with some computational wizardry, they make it possible to focus on the sounds you want and reject the ones you don’t.

“Because there is a microphone in each AirPod, you have the ability to use beam steering to increase the intelligibility and sound level from a speaker you would like to hear, and reduce noise and other sound from other conversations or noises,” says Carter.

Previously, Apple added Live Listen to the iPhone, which lets you use the phone as a remote microphone to transmit conversations to your AirPods; Conversation Boost, by contrast, uses the AirPods’ own mics. More recently, the company added Headphone Accommodations, which fine-tunes the headphones’ audio output to your own hearing, typically by boosting the frequencies your ears no longer pick up properly.

Taken together, you can read all this as a continuation of Apple’s industry-leading commitment to accessibility. But it’s also an impressive demonstration of augmented reality.
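The “computational wizardry” behind mic arrays like this usually starts with something like delay-and-sum beamforming: time-align the two microphones’ signals for the direction you care about, so sound from that direction adds up while off-axis sound partially cancels. Below is a rough NumPy sketch of the idea. It is not Apple’s code (Conversation Boost’s actual processing hasn’t been published), and the mic spacing, sample rate, and test signals are invented for illustration.

```python
# A minimal delay-and-sum beamformer sketch (illustrative only; not Apple's
# actual Conversation Boost algorithm).
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second


def delay_and_sum(left, right, sample_rate, mic_spacing_m, steer_deg):
    """Time-align both channels for sound arriving from steer_deg, then average.

    Sound from the steered direction adds in phase (kept), while sound from
    other directions adds out of phase (attenuated).
    """
    extra_path = mic_spacing_m * np.sin(np.radians(steer_deg))
    delay_samples = int(round(extra_path / SPEED_OF_SOUND * sample_rate))
    aligned_right = np.roll(right, delay_samples)
    return 0.5 * (left + aligned_right)


if __name__ == "__main__":
    fs = 48_000
    t = np.arange(fs) / fs

    # A "voice" from straight ahead reaches both mics at the same time.
    voice = np.sin(2 * np.pi * 500 * t)

    # Off-axis "noise" reaches the right mic about 21 samples later
    # (roughly 0.15 m of extra path at 343 m/s, sampled at 48 kHz).
    noise = 0.5 * np.sin(2 * np.pi * 1000 * t)
    left = voice + noise
    right = voice + np.roll(noise, 21)

    # Steer the beam straight ahead (0 degrees): the voice survives intact,
    # while the off-axis noise partially cancels.
    out = delay_and_sum(left, right, fs, mic_spacing_m=0.15, steer_deg=0.0)
    residual = out - voice
    print(f"noise RMS before: {np.sqrt(np.mean(noise ** 2)):.3f}")
    print(f"noise RMS after:  {np.sqrt(np.mean(residual ** 2)):.3f}")
```

Real systems layer adaptive filtering, voice detection, and per-ear processing on top of this, but the underlying trick, favoring one direction by exploiting the tiny time differences between two microphones, is the same.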
Audio AR or Accessibility?
Apple has made no secret of its obsession with AR. It’s a common feature of Apple keynotes, and AR-friendly tech like LIDAR cameras has been added even to seemingly AR-unfriendly gadgets like the iPad Pro. That’s probably all leading toward a pair of Apple AR glasses, but right now, Apple’s audio AR features are already impressive.

For instance, Siri can read incoming messages aloud through AirPods, and in iOS 15 it also will read out notifications, so you never have to look at a screen to keep up. AirPods also already block or augment background audio, letting you cancel noise while still letting the important parts through. This selective blending of iPhone audio with real-world audio lets Apple not only combine the two, but also pluck selected sounds from the surrounding world, augment them, and add them back in.

“[I]t looks like Apple [designed the] dual microphones to create a directional effect (for optimal hearing in noise—with and without a noise cancellation feature activated) and/or a reduction of sound behind the hearing aid wearer,” audiologist Steve DeMari told Lifewire via email.
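Conceptually, that blending amounts to mixing the device’s playback with the outside world, with extra gain on the slice of the outside world you care about. Here’s a deliberately crude sketch of that idea; the function name, gains, and band edges are invented for illustration, and this is not Apple’s actual audio pipeline.

```python
# A conceptual "transparency plus speech boost" blend: pull a speech band out
# of the ambient mic feed, amplify it, and mix it with the device's playback.
# Names and numbers are hypothetical; this is not Apple's implementation.
import numpy as np


def blend_with_ambient(playback, ambient, sample_rate,
                       transparency_gain=0.5, speech_boost=2.0,
                       speech_band=(300.0, 3000.0)):
    """Mix device audio with outside sound, giving the speech band extra gain."""
    # Crude FFT brick-wall filter to isolate the speech band from the ambient feed.
    spectrum = np.fft.rfft(ambient)
    freqs = np.fft.rfftfreq(len(ambient), d=1.0 / sample_rate)
    in_band = (freqs >= speech_band[0]) & (freqs <= speech_band[1])
    speech_only = np.fft.irfft(spectrum * in_band, n=len(ambient))

    # Sum: your music or call, some of the outside world, and a boosted voice band.
    return playback + transparency_gain * ambient + speech_boost * speech_only
```

A real implementation would use low-latency filters running continuously on each earbud rather than an offline FFT, but the idea of plucking a chosen piece of the surrounding world forward and laying it over your audio is the same.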
Benefits for Everyone
It’s become something of a cliché to say accessibility is a boon not just to people with hearing loss or diminished motor or visual abilities, but to everyone. That’s true, but it doesn’t go far enough. Because it’s researching accessibility, AR, and clever audio processing (like making the HomePod sound awesome), Apple is able to offer new features that combine all three.

This, in turn, destigmatizes the use of technology to augment our senses. Hearing aids used to be (and often still are) pink blobs with a hunger for disposable batteries, but AirPods are an aspirational product. And while folks might be ashamed to use a magnifying glass to read in public, nobody cares about using the Magnifier on the iPhone to do the same, or even using iOS 15’s new Live Text feature to translate real-world text in languages we can’t otherwise read.

Accessibility has become less about restoring diminished senses to a notional average, and more about using technology to extend our senses to levels that were previously impossible. And that’s great for everyone.