What if we had more specialized search engines? There should be a recipe aggregator that searches for recipes and nothing else, and prioritizes high value recipe sites.
Then we would need a search engine for search engines…
Also, how would a search engine for recipes work? How does THAT search engine find out when a new recipe site is created? It would have to scrape the whole internet just to find all the recipe sites…
Random ideas:
- it could work like the Kagi smallweb: people submit sites, and you can’t submit your own until you’ve submitted (and had accepted) enough of other people’s
- I’m also envisioning a parallel world where the big tech monopolies never existed. Maybe there could be crawler/indexer companies whose product was the stream of new content. Then you, as a specialist search engine, could consume that stream to build your own custom index and weights
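That split could be sketched in a few lines. This is a toy illustration, not a real protocol: the crawl stream is just a list of (url, text) records, and the site names and trust weights are all invented for the example.

```python
# Toy sketch: a specialist search engine consuming a generic crawl
# stream and building its own weighted inverted index. The stream
# format, site names, and weights are all invented for illustration.
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical trust weights for "high value" recipe sites;
# unknown hosts get a neutral weight of 1.0.
SITE_WEIGHTS = {"seriouseats.com": 3.0, "bbcgoodfood.com": 2.0}

def build_index(crawl_stream):
    """Consume (url, text) records; return term -> {url: weight}."""
    index = defaultdict(dict)
    for url, text in crawl_stream:
        host = urlparse(url).netloc
        weight = SITE_WEIGHTS.get(host, 1.0)
        for term in set(text.lower().split()):
            index[term][url] = weight
    return index

def search(index, query):
    """Rank documents by summed per-term site weight."""
    scores = defaultdict(float)
    for term in query.lower().split():
        for url, w in index.get(term, {}).items():
            scores[url] += w
    return sorted(scores, key=scores.get, reverse=True)

stream = [
    ("https://seriouseats.com/pasta", "fresh pasta recipe"),
    ("https://example.com/blog", "my pasta trip story"),
]
idx = build_index(stream)
print(search(idx, "pasta recipe"))
# The trusted recipe site ranks first despite both pages matching "pasta".
```

The point is the separation of concerns: the crawler company only produces the stream; every specialist engine applies its own domain weighting on top.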
This exists and is called a meta search engine. For example MetaGer, which was extremely famous in Germany 20 years ago. https://metager.org/
Reminds me a bit of how Yahoo got started: categories, sub-categories, etc.
Of course, back then they had thousands of websites to categorize, not billions.