Great execution on aggregating live feeds. Two questions from someone who does similar work on the B2B side:
1. How do you handle deduplication when the same event surfaces across multiple feeds simultaneously? For news aggregation, this is the hard part - an event that appears in Reuters, Bloomberg, and 12 downstream outlets is one story, not 13.
2. What's your rate limiting strategy across 15 sources? Some of the better data APIs (Shodan, GreyNoise, etc.) have strict per-minute limits that become a real constraint at even modest query frequencies.
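On the dedup question, in case it's useful: one approach that worked for us is shingled-title Jaccard similarity with greedy clustering. This is just a sketch of that idea, not a claim about how your pipeline should do it - the threshold and k are things you'd tune on real data:

```python
import re


def shingles(title: str, k: int = 3) -> set:
    """Normalize a headline and break it into word k-grams."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    if len(words) < k:
        return {tuple(words)}
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}


def jaccard(a: set, b: set) -> float:
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)


def dedupe(stories: list[str], threshold: float = 0.5) -> list[str]:
    # Greedy clustering: keep a story only if it isn't a near-duplicate
    # of anything already kept. O(n^2), fine for modest batch sizes;
    # swap in MinHash/LSH when volume grows.
    kept: list[str] = []
    for story in stories:
        sig = shingles(story)
        if all(jaccard(sig, shingles(k)) < threshold for k in kept):
            kept.append(story)
    return kept
```

The nice property is that it catches rewrites of the same wire story (downstream outlets mostly lightly edit the headline) without needing exact string matches. It will miss genuinely different framings of the same event - for that you'd need entity + timestamp matching on top.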
The B2B application of this pattern is company intelligence - pulling company news, job postings, funding signals, and tech stack changes from 10+ sources and surfacing the relevant signal per account. It's the same architecture challenge (deduplication, rate limits, signal:noise ratio), but with a much smaller initial data volume and higher precision requirements per entity.
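For the per-source rate limits, the pattern we settled on is a per-source sliding-window limiter so one strict API (Shodan-style per-minute caps) doesn't throttle the whole poll loop. A minimal sketch - class and method names are mine, and real per-source caps come from each API's docs:

```python
import time
from collections import deque


class SourceLimiter:
    """Sliding-window limiter: at most max_calls per window seconds, per source."""

    def __init__(self, max_calls: int, window: float = 60.0):
        self.max_calls = max_calls
        self.window = window
        self.calls: dict[str, deque] = {}  # source name -> timestamps of recent calls

    def acquire(self, source: str, now: float = None) -> bool:
        """Return True if a call to `source` is allowed right now, recording it."""
        t = time.monotonic() if now is None else now
        q = self.calls.setdefault(source, deque())
        # Drop timestamps that have aged out of the window.
        while q and t - q[0] >= self.window:
            q.popleft()
        if len(q) < self.max_calls:
            q.append(t)
            return True
        return False
```

In production you'd sleep or queue on a False return rather than drop the call, and keep a separate limiter instance (or separate caps) per API tier, but the bookkeeping is the same.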