The hosted ones still have the advantage of being able to search the internet for live info rather than being limited to a knowledge cutoff date.
I’m not sure why a model needs to be hosted in order to make network calls?
Is there a library of good tools for LLMs to call? I have to imagine the bot-detection avoidance mechanisms are a major engineering effort and not likely to work out of the box with a simple harness and random local LLM.
If your volume is low enough, it should be fine. It can just piggyback on your personal browser cookies for Cloudflare.
Even the hosted ones are blocked from searching certain sites, for example Claude is banned from searching Reddit:
`Error: "The following domains are not accessible to our user agent: ['reddit.com']."`
Tavily, Exa, Firecrawl, Perplexity, and Linkup are all tools for agents to search the web.
I’ve been building a harness for the past few months, and it supports them all out of the box with an API key.
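As a sketch of how a harness like that might expose several search providers behind one interface (the function names, environment variables, and the stubbed-out request are all hypothetical placeholders here, not any vendor's real API):

```python
import os

# Providers named above; a real harness would map each to its own client.
SUPPORTED = {"tavily", "exa", "firecrawl", "perplexity", "linkup"}

def make_searcher(provider: str, api_key: str):
    """Return a search function bound to one provider."""
    def search(query: str) -> str:
        # A real implementation would POST to the provider's endpoint
        # with the API key; this stub just echoes what it would do.
        return f"{provider} results for {query!r}"
    return search

def searcher_from_env():
    # Hypothetical config convention: pick the provider and key from env.
    provider = os.environ.get("SEARCH_PROVIDER", "tavily")
    if provider not in SUPPORTED:
        raise ValueError(f"unsupported provider: {provider}")
    return make_searcher(provider, os.environ.get("SEARCH_API_KEY", ""))
```

The point of the indirection is that the agent only ever sees one `search(query)` tool, regardless of which vendor is behind it.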
Kagi also has an API. People who hate ads are probably the same folks who should be paying for Kagi. That's the sane alternative world where companies respect their users.
Oh, you got me so excited. I've had a Kagi sub for 3 years, but their API is still in closed beta. I guess I could (and should) reach out and ask for access.
That's not how it works. Whether local or hosted, every modern model has a cutoff date for its training data, and can be leveraged by agents / harnesses / tools to fetch context from the internet or wherever.
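A minimal sketch of that agent loop: the model (local or hosted) emits a tool call, the harness executes it and feeds the result back. The `fake_model` and `web_search` here are stand-ins for illustration, not any real model or search API:

```python
def web_search(query: str) -> str:
    # A real harness would call a search API (Tavily, Exa, etc.) here.
    return f"[live results for: {query}]"

TOOLS = {"web_search": web_search}

def run_agent(model, prompt: str) -> str:
    """Loop until the model answers instead of requesting a tool."""
    messages = [{"role": "user", "content": prompt}]
    while True:
        reply = model(messages)
        if reply.get("tool_call"):
            call = reply["tool_call"]
            result = TOOLS[call["name"]](**call["args"])
            messages.append({"role": "tool", "content": result})
        else:
            return reply["content"]

def fake_model(messages):
    # Pretend model: requests one search, then answers using the result.
    tool_msgs = [m for m in messages if m["role"] == "tool"]
    if not tool_msgs:
        return {"tool_call": {"name": "web_search",
                              "args": {"query": "latest news"}}}
    return {"content": f"Based on {tool_msgs[-1]['content']}, here you go."}
```

Nothing in the loop cares where the model's weights live; the training cutoff only limits what's baked into the weights, not what the harness can fetch.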
Local ones that support tool use can do the same.
You can do that locally too!