But perhaps the first well-executed one?

All the best to them; this remains to be seen, including what the SDK tooling will look like.

I consider Microsoft's great, given its C#- and .NET-based SDK, instead of yet again C and C++.

The Unity SDK is C# and slots right into the typical Unity patterns for touch input.

Two contact types are provided:

- Finger – representing a single touch point, e.g., a finger
- Glyph – representing a tangible object

For each contact, you get ID, Position, and Phase. For Glyphs, you also get orientation and touched status, i.e., the system knows whether the tangible object is currently being touched. There are tunable parameters for the tracking system as well.
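For illustration, per-frame handling of the contact data described above might look roughly like this in Unity. Note that the type and member names here (`Board.Contacts`, `Glyph.Orientation`, `Glyph.IsTouched`) are my assumptions for the sketch, not the confirmed SDK API:

```csharp
using UnityEngine;

// Hypothetical sketch only: the Board SDK types and members below are
// guesses based on the description, not the actual API surface.
public class ContactLogger : MonoBehaviour
{
    void Update()
    {
        // Assumed: the SDK exposes the current frame's contacts as a collection.
        foreach (var contact in Board.Contacts)
        {
            // Common to both contact types: ID, Position, and Phase.
            Debug.Log($"{contact.Id} at {contact.Position} ({contact.Phase})");

            // Glyphs additionally report orientation and whether the
            // tangible object is currently being touched.
            if (contact is Glyph glyph)
                Debug.Log($"  orientation: {glyph.Orientation}, touched: {glyph.IsTouched}");
        }
    }
}
```

This mirrors the polling style Unity developers already use for `Input.touches`, which is presumably why the SDK slots into typical Unity touch-input patterns.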

For event systems (e.g., menus), a BoardUIInputModule is provided in place of Unity's default InputSystemUIInputModule.

Please reach out if you're interested in developing: https://board.fun/pages/developers

Thanks for the overview, and all the best for the project.

Godot SDK should be the goal