A few weeks ago, while listening to music with my AirPods and developing an app, I noticed their spatial audio feature. I started thinking about what else could be done with it through reverse engineering, including the possibility of using AirPods as a motion controller. I built the world's first AirPods-controlled game, in which you steer a motorcycle with your head movements. In fact, it's not just AirPods: it's a game that uses a headset as a motion controller. And today, it was approved on the App Store. Who knows, maybe AirPods Arcade has even started? :)

This is an amazing idea! Even more surprising is the fact that Apple approved it; usually they're not a fan of features being used in ways they weren't intended for...

We made a cooking thermometer that plugged into the iPhone’s headphone jack before Bluetooth LE took off. Did not run into trouble with the app review team.

Very curious how you get the motion data from AirPods. I read the app description and noticed that no modification to the AirPods is needed.

Apple has public developer-accessible Core Motion APIs for this

https://developer.apple.com/documentation/coremotion/cmheadp...
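
For anyone curious what that looks like in practice, here's a minimal sketch (not the app's actual code, just the public API) that streams head attitude from supported AirPods. It assumes an NSMotionUsageDescription entry in Info.plist and a headset that supports headphone motion:

    import CoreMotion

    // Minimal sketch: stream head attitude from supported AirPods via Core Motion.
    let headphoneManager = CMHeadphoneMotionManager()

    func startHeadTracking() {
        guard headphoneManager.isDeviceMotionAvailable else {
            print("Headphone motion not available")
            return
        }
        headphoneManager.startDeviceMotionUpdates(to: .main) { motion, error in
            guard let motion = motion else { return }
            // roll/pitch/yaw are in radians, relative to an arbitrary
            // reference frame established when tracking starts.
            print("roll: \(motion.attitude.roll), pitch: \(motion.attitude.pitch), yaw: \(motion.attitude.yaw)")
        }
    }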

This is so insanely cool!

Fun idea!

This reminds me of that ripe, beautiful period after the iPhone was released, when all the apps were these weird, idiosyncratic, almost "game-like" offerings that weren't tied to platforms, and either just brought amusement to a user or solved a simple problem. They were often free or cost $0.99 to $1.99, and there were no subscriptions.

At any rate, you earned your upvotes today, tanis. Keep at her!

Yeah it was really a magical moment. I think the recipe for such moments is:

1. One piece of technology that becomes fairly widely accessible all at once

2. The technology MUST be physical (a device, etc.)

GenAI is certainly having its incredible moment now, but without a physical layer to connect people through, it doesn't have that same edge.

Compare with something as niche as the Kinect: for the engineering crowd, at least in my experience, the excitement around it was maybe less intense (no promise of money), but more crystalline.

Yeah man, I'm with you on the Kinect -- the WiiMote too. I remember how geeked out my family was when we finally got it to work as a mouse on our computer. It was, once again, "a hell of a time to be alive."

Without being simple-minded, there might be fertility in "can't we get the Kinect/WiiMote to control an AI?" Couldn't I swing my device around (giggidy) and have the AI analyze the movements to do something "novel"? Food for thought. Thanks for commenting, wellthisisgreat.


Neat concept, tho tbh after about 2 minutes of playing I could tell it would lead to neck pain, so it was an instant delete for me.

Something that required slower reaction time might be more appropriate than a racing game.

This is the kind of creative, out-of-the-box thinking that official technology is lacking right now. Many times we don't need new technologies; what we need are new ways to use what we have right now.

Thanks for your effort.

First off, kudos and congrats on the launch; seems like a fun idea! I am curious, since you mentioned reverse engineering: how difficult was it to retrieve the raw gyroscope data from the AirPods? AFAIK there is no API to access this information, right?

If you're looking to develop based on raw or processed gyroscope data, then Core Motion is helpful:

https://developer.apple.com/documentation/coremotion

"For example, a game might use accelerometer and gyroscope input to control onscreen game behavior."

The part specifically about obtaining headphone orientation is here (added in iOS 14):

https://developer.apple.com/documentation/coremotion/cmheadp...
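
As a rough illustration of how a game might turn those readings into a control input, here's a hypothetical sketch (the class name and the ~30° "full lock" threshold are my own assumptions, not anything from the app) that maps head yaw to a normalized steering value:

    import CoreMotion

    // Hypothetical sketch: map head yaw (turning left/right) into a
    // normalized steering value in [-1, 1] for a game to consume.
    final class HeadSteering {
        private let manager = CMHeadphoneMotionManager()
        private(set) var steering: Double = 0   // -1 = full left, +1 = full right

        func start(maxYaw: Double = .pi / 6) {  // assume ~30° of head turn = full lock
            guard manager.isDeviceMotionAvailable else { return }
            manager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
                guard let self = self, let yaw = motion?.attitude.yaw else { return }
                // Clamp and normalize; the sign convention depends on the
                // reference frame, so a real game would calibrate at start.
                self.steering = max(-1.0, min(1.0, yaw / maxYaw))
            }
        }

        func stop() { manager.stopDeviceMotionUpdates() }
    }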

Put a sine wave emitter (or multiple) in the scene. Enable head tracking. Analyze the stereo sound at the output. Mute the output. There you go: you can now track the user's head without direct access to gyroscope data.

Apple does not secretly analyze sine waves to infer head motion. Instead, AirPods Pro/Max/3rd-generation include actual IMUs (inertial measurement units), and iOS exposes their readings through Core Motion.

What you mentioned is a known research technique called acoustic motion tracking (some labs use inaudible chirps to locate phones or headsets), but it's not how AirPods head tracking works.

I think they're more talking about measuring the attenuation that Apple applies for the "spatial audio" effect (after Apple does all of the fancy IMU tracking for you). With a known amplitude of signal going in, and the ability to programmatically monitor the signal coming out after the effect, you could reverse engineer a crude estimated angle from the delta between the two.

I don't think that's how this app works, though; after installing it I got a permission prompt for motion tracking.

Looks like there is an API for this. Here's an example: https://github.com/tukuyo/AirPodsPro-Motion-Sampler/blob/8ac...

Yup.

Since the author of the app mentioned reverse engineering, analyzing audio is an approach that immediately came to mind. It should be quite precise, too, only at the expense of extra CPU cycles.

I did not imply that there is no API to get head tracking data (even though Google search overview straight up says that). It’s mostly a thought experiment. Kudos for digging up CMHeadphoneMotionManager.

> Apple does not secretly analyze sine waves to infer head motion.

Duh. The mechanism I described hinges on Apple being able to track head movements in the first place in order to convert that virtual 3D scene to stereo sound.

Is it possible for an app to access the actual output played by the AirPods after 3D audio is applied?

I got the impression that you can, but I have not dug into the developer reference deeply enough to cite it for sure.

Reminds me of when the first iPhones came out and developers were very creative, for the time, with the available features: flashlight apps, bubble level apps, Asphalt 4.

The most fun for me was the beer-drinking app and the see-through wallpapers (though the latter was only on Android).

Okay, so there is something called TrackIR, and it has an open-source implementation called OpenTrack. It's used for simulators like flight or driving sims.

I've seen iPhone apps that use a neural net to determine the direction your head is facing using the camera. I think they network with OpenTrack somehow, but I'm not sure about the details.

I wonder if you could use the AirPods to track your head and also direct audio from the PC through them with some networking!

I don't know how to do Apple development, but this would be a very cool application.

I use a head tracking app called SmoothTrack, which can use both the iPhone camera and AirPods. It connects to OpenTrack via WiFi or cable.

So that’s already taken care of :)

On my iPhone 16 Pro + AirPods Pro, when I start the game (tapping the screen when it says "Tap to start") I get a message saying "AirPods Disconnected", even though Control Center on the phone reports them as connected.

Tried restarting the app, and disconnecting and reconnecting the AirPods, with no luck.

Love the concept, though!

Amazing that it has taken this long for someone to do this.

I don’t have the latest AirPods so I haven’t downloaded it, but I’ll put in a feature request to enable the iPhone to be used as a tilt controller. It doesn’t have to change the marketing that AirPods are the intended controller.

But even though I can’t play it, great job on doing something new and creative.

This is amazing! I'm about to waste my entire afternoon playing this. Feature request: nod to restart

Since I don't have AirPods I found this funny video using the game: <https://youtu.be/sr-dsSWsMfE?si=9UlqjNw8_HJhtZ04>

Love it - quite a unique idea!

Can I ask about the tech stack? What did you use to build it, just plain Swift and SceneKit? I did notice the app download is over 100 MB, which seems a bit excessive for the gameplay.

Really cool! Small complaint: the "near miss" text obstructs the view, at least when playing in landscape. But great idea and execution!

Can one also use AirPods for e.g. Google Maps navigation?

Theoretically yes. There are apps for the blind that do this[1]; you set a "beacon" at the location you want to navigate to, and the apps use head tracking and 3D (HRTF) audio to show you which direction the beacon is in.

Most of these are based on Microsoft's discontinued Soundscape app[2], which MS open sourced after its discontinuation.

[1] https://apps.apple.com/us/app/voicevista/id6450388413
[2] https://github.com/microsoft/soundscape
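
For what it's worth, the core math behind such a beacon is fairly small. A hypothetical sketch (function and parameter names are mine; it assumes the head yaw has already been referenced to compass north, which in practice means combining headphone attitude with the phone's heading):

    import CoreLocation
    import Foundation

    // Hypothetical sketch: angle at which to render the beacon in HRTF audio,
    // given the user's location, the beacon's location, and head yaw.
    func relativeBeaconAngle(user: CLLocationCoordinate2D,
                             beacon: CLLocationCoordinate2D,
                             headYawRadians: Double) -> Double {
        // Initial great-circle bearing from user to beacon
        // (radians, clockwise from north).
        let lat1 = user.latitude * .pi / 180, lat2 = beacon.latitude * .pi / 180
        let dLon = (beacon.longitude - user.longitude) * .pi / 180
        let y = sin(dLon) * cos(lat2)
        let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
        let bearing = atan2(y, x)

        // Subtract where the head is pointing and wrap into (-π, π].
        var angle = bearing - headYawRadians
        while angle > .pi { angle -= 2 * .pi }
        while angle <= -.pi { angle += 2 * .pi }
        return angle   // 0 = straight ahead, positive = to the right
    }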

I see that global leaderboard… over 6000! This is great! Gives me old-school arcade vibes, and obviously Need for Speed.

I love this! It’s so reminiscent of the way we all would swerve our bodies when playing these types of games as a kid.

(Flying would be amazing.)

Great gameplay, well done. It is correctly telling me I should not ride a motorcycle ;-)

What a cool project! I wish I was the one to have thought of this. :)

Fantastic! It makes me a bit carsick. TBF, I am ~getting~ old.

Also, I often rear-end red cars. It dawned on me that it's probably because I'm red-green colorblind and I can hardly see them on the horizon.

For a second I thought you were talking about real life, and "often rear-ending red cars" seemed like a very serious problem.

So it can read left and right motion; can it do other directions as well?

This is so dumb - I love it. So much of modern computing is monetised and sanitised.

It’s so cool to see an interesting toy-like tech demo that does something new and different.
