Without having the time right now to dig too deep, I have a few questions:
1. Does it have any idea of the range at which motion is happening, relative to the distance between nodes or otherwise?
2. Can it correlate motion with that of another WiFi or Bluetooth device? So if I'm carrying my phone and it can see me moving, can it tell that it's me?
3. What's the movement -> signal latency?
Thanks for the questions.
1. As of now it only knows whether there is motion in the zone or not. Actual (x,y) coordinates within the zone are theoretically possible and something I have on the roadmap for 2026.
2. No, it doesn't correlate with other devices currently. But Wi-Fi sensing can theoretically identify who is moving by analyzing how you disrupt the signal. We all have unique movement patterns. I actually have a working proof of concept that can differentiate between me and my girlfriend, but it's very experimental and not stable enough for release yet. Definitely on the future roadmap though. (There's a toy sketch of the idea at the bottom of this comment.)
3. Pretty much instant (around 50ms). The detection happens in real-time as the signal disruption is analyzed. (Rough sketch of that loop just below.)
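
To give a rough sense of what "analyzed in real-time" can mean here (this is a generic illustration, not TOMMY's implementation): the simplest detectors keep a short sliding window of CSI amplitude samples and flag motion when the short-term variance jumps above an "empty room" baseline. All names, window sizes and thresholds below are made up:

    from collections import deque
    import statistics

    WINDOW_SIZE = 50      # e.g. half a second of samples at 100 Hz (illustrative)
    THRESHOLD = 2.5       # multiplier over the empty-room baseline variance
    BASELINE_VAR = 0.05   # would be calibrated while the zone is empty

    window = deque(maxlen=WINDOW_SIZE)

    def on_csi_sample(amplitude: float) -> bool:
        """Called for every CSI amplitude sample the radio reports.
        Returns True as soon as the current window looks like motion."""
        window.append(amplitude)
        if len(window) < WINDOW_SIZE:
            return False  # still warming up
        # A moving body perturbs the channel, which shows up as a jump
        # in short-term amplitude variance.
        return statistics.variance(window) > THRESHOLD * BASELINE_VAR

The check runs on every incoming sample, so the flag goes up within a handful of samples of the disruption starting, not after a whole window has elapsed, which is how latencies in the tens of milliseconds become plausible.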
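
And on the "unique movement patterns" point: conceptually that's a classifier over features describing how a person disrupts the channel (how strong the fluctuation is, how fast it oscillates as they walk, and so on). Again a generic toy sketch with invented numbers, not the actual proof of concept:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Each row summarises one window of CSI disruption, e.g.
    # [variance, dominant fluctuation frequency (Hz), peak amplitude].
    # All numbers and labels are placeholders, not real measurements.
    X_train = np.array([
        [0.42, 1.8, 0.9],   # person A walking through the zone
        [0.39, 1.7, 0.8],
        [0.61, 2.3, 1.4],   # person B walking through the zone
        [0.58, 2.2, 1.3],
    ])
    y_train = ["person_a", "person_a", "person_b", "person_b"]

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(X_train, y_train)

    # Classify a freshly observed window of disruption features.
    print(clf.predict([[0.40, 1.75, 0.85]]))   # -> ['person_a']

The hard part in practice is keeping those features stable across rooms, furniture changes and router placement, which is where the "experimental" caveat comes from.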
1 and 3: intriguing.
With 2, I'm trying to think how effectively it could replace something like ESPresense. With that you've got a BLE tag with an ID that you can just associate with a person, so there's no learning of movement patterns needed, but getting it accurate to a room is a nightmare. It also works when there's no movement, which is convenient, and you can put the tag down if you don't want to be tracked. The ergonomics are a bit nicer; it's just the resolution that sucks.
ESPresense and TOMMY solve different problems right now. ESPresense tracks specific people with IDs (even when stationary), while TOMMY detects anonymous motion across zones. For your use case, ESPresense is probably the better fit currently. Though once stationary presence detection (planned for Q1 2026) and person identification are added, TOMMY could do similar tracking without needing physical tags to carry around.
Ah, I was hoping (at least rough) coordinates were something that this would do. I have HA and have been looking at mmwave devices for that use case. For things like "turn on a light, but only once I've sat down in the chair."
Unfortunately it doesn't support this yet. But it's on the roadmap and something I want for my apartment.