Aren't browsers starting to ship with built-in LLMs? I don't know much about this but if so then surely your extension won't need to send queries to LLM APIs?
There are two types of built-in LLMs:
- The ones the user sees (like a sidepanel). These often call hosted LLM APIs like OpenAI's behind the scenes.
- The browser API ones. These are indeed local, but they're usually much smaller, more limited models (for Chrome this is Gemini Nano). Results from these would be lower quality, and with large contexts they'd be either impossible or slower than using a hosted API.
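
For the second type, here's a minimal sketch of what calling Chrome's on-device Prompt API might look like from an extension. This assumes the `LanguageModel` global with `availability()`, `create()`, and `prompt()` methods that recent Chrome versions expose; the exact surface has changed across versions, so treat the names as assumptions rather than a stable contract:

```javascript
// Sketch of Chrome's built-in Prompt API (Gemini Nano).
// Assumed API surface; it may differ by Chrome version.
async function summarizeLocally(text) {
  // Check whether the on-device model can be used at all.
  const availability = await LanguageModel.availability();
  if (availability === "unavailable") {
    // This is where an extension would fall back to a remote LLM API.
    throw new Error("No on-device model; falling back to a hosted API.");
  }
  // Creating a session may trigger a one-time model download.
  const session = await LanguageModel.create();
  // Small local models have tight context limits, so keep prompts short.
  return session.prompt(`Summarize in one sentence:\n${text}`);
}
```

Even when the local path works, an extension would likely still want the remote-API fallback for long pages that exceed the small model's context window.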