iOS 12’s ‘Live Listen’ Feature Will Allow AirPods to Do So Much More

AirPods Live Listen | Credit: The Mac Observer

Apple showcased the majority of new iOS 12 features live from WWDC 2018 on Monday, but it wasn’t until Tuesday that many of the bigger additions began surfacing, as iOS developers fortunate enough to get their hands on developer beta 1 started digging in.

The new and potentially groundbreaking features Apple introduced include group FaceTime calling and systemwide iOS performance enhancements, but there’s another intriguing feature coming to iOS 12 this fall, one first discovered by TechCrunch less than a day after iOS 12 beta 1 went live.

Live Listen on iOS 12

Though the underlying technology has been available for years as an exclusive facet of Apple-certified hearing aids, the feature, dubbed Live Listen, has been reengineered specifically for use with Apple’s $159 AirPods, according to Business Insider.

“Essentially, Live Listen turns your iPhone into a microphone: If you’re in a crowded bar, point your iPhone’s microphones at the person across the table from you, or even slide it over, and you’ll hear what they have to say in your hearing aid,” the publication said, noting that the same technology will be coming to Apple’s AirPods this fall.
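Live Listen itself is a system-level feature, and Apple doesn’t expose it through a public API. Still, the basic idea described above, routing live microphone input to whatever headphones are connected, can be sketched in Swift with Apple’s AVAudioEngine. The class below is purely illustrative and is not Apple’s implementation; a real app would also need a microphone usage description in its Info.plist, and would have to contend with the latency and feedback issues that Apple’s feature handles for you.

import AVFoundation

// A minimal mic-passthrough sketch of the concept behind Live Listen:
// audio captured by the iPhone's microphone is routed straight to the
// current output route (e.g. connected AirPods).
final class MicPassthrough {
    private let engine = AVAudioEngine()

    func start() throws {
        // Allow simultaneous recording and playback, with Bluetooth
        // headsets such as AirPods as a valid output route.
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord,
                                mode: .default,
                                options: [.allowBluetooth, .allowBluetoothA2DP])
        try session.setActive(true)

        // Wire the microphone input directly into the output mixer.
        let mic = engine.inputNode
        engine.connect(mic, to: engine.mainMixerNode,
                       format: mic.outputFormat(forBus: 0))
        try engine.start()
    }

    func stop() { engine.stop() }
}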

To be clear, that’s not to suggest that AirPods can substitute for an actual medical device like a hearing aid, which has to be prescribed by a medical professional in the first place — but that the technology itself will allow more people to take advantage of the “potentially very handy feature.”

A Small Step Towards Audible Computing

As Business Insider notes, because AirPods offer one-touch access to Siri, Apple’s voice-driven personal assistant, and can tap into a plethora of iOS App Store offerings, early reviewers and optimists hoped that AirPods could one day enable a range of high-level computing operations. Yet here we are, two years post-launch, and AirPods are still best suited to listening to music and placing phone calls under ideal noise conditions.

Still, while Apple has been slow to roll out major new features, preferring to perfect them before launch rather than rush them out the door, it’s clear that Live Listen is a strategic step toward what analysts have dubbed the future of ‘Audible Computing.’ Taken at face value, audible computing means performing “any kind of computing task or tasks” through an audible interface rather than a visual, kinetic, or other physical or sensory one.

And with Apple’s addition of Live Listen in iOS 12, it’s clear the company is taking one very small but nonetheless significant step towards ushering in the future of audible computing. It’s an extremely slow process, Business Insider notes, but once we move closer to Apple’s ultimate goal, “things are going to get wild.”

What Will Live Listen Allow AirPods to Do?

One day, the iOS App Store could be flooded with new apps designed to take advantage of AirPods’ Live Listen functionality, perhaps in the form of a live language translation app, or an advanced voice recognition app that lets you dictate and save audio documents using AirPods and the sound of your voice.
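To make the voice recognition idea concrete, here’s a hypothetical Swift sketch built on Apple’s public Speech framework rather than on anything Live Listen-specific: microphone audio is streamed into a recognizer, and partial transcriptions arrive in a callback as you speak. The class name and callback are our own inventions, and the required permission prompts (microphone and speech recognition authorization) are omitted for brevity.

import AVFoundation
import Speech

// A hypothetical dictation sketch: live microphone audio is fed into
// SFSpeechRecognizer, and the best transcription so far is reported
// on every update.
final class DictationSketch {
    private let engine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private var task: SFSpeechRecognitionTask?

    func startTranscribing(onText: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()

        // Feed microphone buffers into the recognition request.
        let mic = engine.inputNode
        mic.installTap(onBus: 0, bufferSize: 1024,
                       format: mic.outputFormat(forBus: 0)) { buffer, _ in
            request.append(buffer)
        }
        try engine.start()

        // Report the running transcription as results stream in.
        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let result = result {
                onText(result.bestTranscription.formattedString)
            }
        }
    }
}

A live-translation app could layer a translation step on top of the same pipeline, feeding each transcription into a translation service before playing back or displaying the result.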

It’s going to be some time before we start seeing apps and features like these in the conceptual way we’re describing them, but the fact that Apple has set the gears in motion by introducing a technology like Live Listen on a device as functionally capable as AirPods is beyond promising.
