Apple does not secretly analyze sine waves to infer head motion. Instead, AirPods Pro/Max/3rd-gen include actual IMUs (inertial measurement units), and iOS exposes their readings through Core Motion.
What you mentioned is a known research technique called acoustic motion tracking (some labs use inaudible chirps to locate phones or headsets), but it's not how AirPods head tracking works.
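For anyone curious what that research technique looks like, here's a toy sketch of chirp-based ranging (the signal details, constants, and function names are made up for illustration; real systems deal with noise, multipath, and clock sync):

```swift
import Foundation

// Toy acoustic ranging: correlate a known emitted chirp against a microphone
// recording to find the time of flight, then convert that lag to a distance.
func estimatedDistance(chirp: [Double], recording: [Double], sampleRate: Double) -> Double {
    var bestLag = 0
    var bestScore = -Double.infinity
    // Brute-force cross-correlation over all candidate lags.
    for lag in 0...(recording.count - chirp.count) {
        var score = 0.0
        for i in 0..<chirp.count {
            score += chirp[i] * recording[lag + i]
        }
        if score > bestScore {
            bestScore = score
            bestLag = lag
        }
    }
    let timeOfFlight = Double(bestLag) / sampleRate
    return timeOfFlight * 343.0 // speed of sound, m/s
}
```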
I think they're more talking about measuring the attenuation Apple applies for the "spatial audio" effect (after Apple has done all of the fancy IMU tracking for you): with a known input signal amplitude and the ability to programmatically monitor the signal after the effect, you can reverse-engineer a crude estimated angle from the delta between the two.
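Roughly like this, as a toy sketch of that thought experiment (the pan law and the 20 dB maximum level difference are invented for illustration; this is not how the app actually works):

```swift
import Foundation

// Known input amplitude in, measured per-channel amplitude out after the
// spatializer, and the interaural level difference mapped back to a crude azimuth.
func estimatedAzimuthDegrees(inputRMS: Double, leftRMS: Double, rightRMS: Double) -> Double {
    // Gain each channel received relative to the known input, in dB.
    let leftGainDB = 20 * log10(leftRMS / inputRMS)
    let rightGainDB = 20 * log10(rightRMS / inputRMS)
    let ildDB = rightGainDB - leftGainDB // positive = louder on the right
    // Pretend the renderer's level difference maxes out at +/- 20 dB at
    // +/- 90 degrees and varies sinusoidally in between.
    let maxILD = 20.0
    let normalized = max(-1.0, min(1.0, ildDB / maxILD))
    return asin(normalized) * 180 / .pi
}

// Example: output is ~6 dB hotter on the right channel than the left.
print(estimatedAzimuthDegrees(inputRMS: 1.0, leftRMS: 0.5, rightRMS: 1.0))
```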
I don't think that's how this app works, though; after installing it I got a permission prompt for motion tracking.
Looks like there is an API for this. Here's an example: https://github.com/tukuyo/AirPodsPro-Motion-Sampler/blob/8ac...
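The core of it is CMHeadphoneMotionManager in Core Motion. A minimal sketch of the public API (not code from the linked repo; it also needs an NSMotionUsageDescription entry in Info.plist, which is why you get that motion-tracking permission prompt):

```swift
import CoreMotion

let manager = CMHeadphoneMotionManager()

if manager.isDeviceMotionAvailable {
    manager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // Head orientation reported by the AirPods' IMU, in radians.
        let a = motion.attitude
        print("pitch: \(a.pitch), roll: \(a.roll), yaw: \(a.yaw)")
    }
}
```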
Yup.
Since the author of the app mentioned reverse engineering, analyzing audio was the approach that immediately came to mind. It should be quite precise, too, at the cost of some extra CPU cycles.
I did not imply that there is no API to get head tracking data (even though Google search overview straight up says that). It’s mostly a thought experiment. Kudos for digging up CMHeadphoneMotionManager.
> Apple does not secretly analyze sine waves to infer head motion.
Duh. The mechanism I described hinges on Apple being able to track head movements in the first place in order to convert that virtual 3D scene to stereo sound.