I wonder if it makes sense for browser vendors to agree upon and ship various ‘standard models’ that are released into the public domain or something, and the API lets you pick between them.
The models themselves would be standardized and the weights and everything should be identical between browsers. They’d be standard and ‘web-safe’ like CSS colors or fonts. Probably would help to give them really boring/unbranded names too. These would work identically across browsers and web developers can rely on them existing on modern setups.
If you want more models, you could install them yourself, your browser could ship them, or web developers could bundle them through a CDN (which would probably also need a new standard for sharing big files across domains).
It doesn't make sense at all. So as a user how do you choose which model to use? There could be 3824 models to choose from. The browser might as well set one as default, and we all know how that goes (see: search engine).
Not to mention the many other UX questions that come with this, most importantly how unusable these local models are on regular 3-year-old laptops constrained in RAM, GPU/CPU capability, and likely disk space, despite what enthusiasts here say. (Someone with a MacBook Pro with 32+ GB of RAM reports it works great with xyz model -- fine -- but somehow concludes it works for everyone and that local models are the future.)
The Chrome model requires either "16 GB of RAM or more and 4 CPU cores or more" or "Strictly more than 4 GB of VRAM", plus "22 GB of free space" (the model itself only takes around 4.4 GB, but Chrome won't install it without that much headroom).
The model is pretty slow on my M4 Pro mac.
The API allows the browser to use a cloud service instead, but then privacy is lower. So, more privacy for the rich.
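For context, the API in question is Chrome's experimental Prompt API. Here's a minimal sketch of how a page would feature-detect the on-device model before prompting it; the `LanguageModel` global and its method names reflect the origin-trial shape and may change, so treat this as an assumption rather than a stable interface:

```javascript
// Hedged sketch of Chrome's experimental Prompt API. The `LanguageModel`
// global is the origin-trial shape; names may change between releases.
// Feature-detects, so it degrades gracefully outside supporting browsers.
async function summarize(text) {
  if (typeof LanguageModel === "undefined") {
    return "Prompt API not available in this environment";
  }
  // availability() reports one of:
  // "unavailable" | "downloadable" | "downloading" | "available"
  const availability = await LanguageModel.availability();
  if (availability === "unavailable") {
    // e.g. the machine is below the RAM/VRAM thresholds quoted above
    return "On-device model not supported on this hardware";
  }
  // On a "downloadable" device, create() can trigger the multi-GB download.
  const session = await LanguageModel.create();
  return session.prompt(`Summarize in one sentence: ${text}`);
}
```

Note that `availability()` is exactly where the hardware gate from the comment above bites: machines under the RAM/VRAM thresholds report "unavailable", and that is the point at which a browser could quietly substitute a cloud backend instead.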
> It doesn't make sense at all. So as a user how do you choose which model to use? There could be 3824 models to choose from. The browser might as well set one as default, and we all know how that goes (see: search engine).
...what's the exact problem here? Believe it or not, most non-tech-savvy users use the search engine just fine.
With regards to search engines, Google paid billions of dollars [0] to become the default on major browsers. I guess GP's implying that something similar might happen with LLMs.
[0] https://www.reuters.com/technology/google-paid-26-bln-be-def...
The rate of model development is an issue here. Once there are many cross-origin models, it becomes a fingerprinting vector. Also even the small models are many GBs.
Browsers do not need to force LLMs on their users.