Rumors have long suggested that Apple is developing a version of the AirPods Pro equipped with infrared (IR) cameras (via MacRumors). The purpose behind the cameras, however, has remained quite vague until now. 

Why did Apple spend $2 billion on a company earlier this year? 

Earlier this year, the Cupertino giant paid $2 billion for Q.ai (its second-largest acquisition, behind only Beats), an Israeli AI startup that develops technology for interpreting microfacial movements. 

I’m talking about reading whispered or unspoken words by analyzing skin and muscle movements in real time. At the time, the acquisition raised plenty of eyebrows but offered few answers. Now, a growing theory is connecting the dots. 

The idea is quite simple: IR cameras on the AirPods Pro could track microfacial movements around the mouth and jaw, while Q.ai’s software translates those movements into commands or text. 

In other words, facial movements could let users draft messages, control apps, or speak to Siri without actually making a sound.

So what do earbuds have to do with silent speech?

In July 2025, Apple was also granted a patent for camera-based systems, similar to Face ID’s dot projector, covering proximity detection and 3D depth mapping. 

AirPods already include accelerometers and skin-detection sensors, which might mean the hardware foundation is already in place. For everyday users, this might mean interacting with devices privately, especially in noisy environments, or without disrupting those around them. 

The exact use cases, how Apple implements the technology, and how it will surface in iOS are still unknown. For now, AirPods Pro with IR cameras are expected to arrive in September 2026.
