The assumption that the data must be "high resolution" is erroneous. Low-resolution, noisy data works just fine; you just need a lot more of it. Standard signal-processing tricks, such as stacking many noisy low-resolution observations, let you extract high-resolution features. That takes far more processing, but processing isn't much of a limitation. These reconstruction techniques work even if the data comes from unrelated sources that aren't even trying to measure the thing you are measuring.
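As a rough illustration of the stacking idea (a toy sketch; the signal, noise level, and sampling parameters below are invented for demonstration, and real sensors would also need blur/deconvolution handling that is skipped here): each observation is a coarse, noisy view of the same signal at a random sub-sample offset, and averaging thousands of them onto a finer grid recovers detail that no single observation contains.

    # Toy sketch: stack many coarse, noisy, randomly offset observations
    # onto a fine grid. All names and parameters are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)

    fine_n = 400              # resolution of the underlying "truth"
    coarse_factor = 8         # each observation is 8x coarser
    n_obs = 2000              # many cheap, noisy observations
    noise_sigma = 0.5

    x = np.linspace(0, 1, fine_n, endpoint=False)
    truth = np.sin(2 * np.pi * 3 * x) + 0.5 * np.sin(2 * np.pi * 17 * x)

    accum = np.zeros(fine_n)
    counts = np.zeros(fine_n)

    for _ in range(n_obs):
        shift = rng.integers(0, coarse_factor)         # random sub-sample offset
        idx = np.arange(shift, fine_n, coarse_factor)  # coarse sampling positions
        samples = truth[idx] + rng.normal(0, noise_sigma, size=idx.size)
        accum[idx] += samples                          # "stack" onto the fine grid
        counts[idx] += 1

    estimate = accum / np.maximum(counts, 1)
    rmse = np.sqrt(np.mean((estimate - truth) ** 2))
    print(f"stacked-estimate RMSE: {rmse:.3f} (per-sample noise sigma = {noise_sigma})")

No single observation has more than fine_n / coarse_factor samples, but the stacked estimate sits on the full fine grid with noise reduced by roughly the square root of the number of observations per grid point.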
Any data exhaust will work; people have created interesting PoCs leveraging things like HVAC data, RF attenuation, etc. High-precision weather models essentially work the same way, making inferences by stitching together diverse event data that has nothing to do with weather.
High-quality, high-resolution data sources largely don't exist in the way people imagine they do, so you need to do this anyway. If you have a high-resolution spatiotemporal graph for entities, tying it to identity is always trivial.
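A hedged sketch of why that linkage is so easy (toy, uniformly random mobility data; the population, place count, and observation count are invented for illustration): a handful of (time, place) observations about someone, obtained from any side channel, is usually enough to single out one trajectory among thousands.

    # Toy sketch: match a few known (timestep, place) points against a
    # population of synthetic trajectories. All sizes are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)

    n_people, n_timesteps, n_places = 10_000, 200, 500
    trajectories = rng.integers(0, n_places, size=(n_people, n_timesteps))

    target = 4242
    # An observer only knows four (timestep, place) points about the target.
    known_steps = rng.choice(n_timesteps, size=4, replace=False)
    known_places = trajectories[target, known_steps]

    # Count how many people in the whole population match all four points.
    matches = np.all(trajectories[:, known_steps] == known_places, axis=1)
    print("candidates matching 4 observations:", matches.sum())  # typically 1

With 500 possible places, the expected number of non-target people matching four independent points is on the order of 10,000 / 500^4, which is effectively zero, so the target is usually singled out uniquely; real mobility data is less uniform but tends to be even more identifying.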
This kind of analysis would be more common if it weren't for the fact that open-source platforms scale poorly for this type of analytical processing.