True, but arguably better than "sorry, to use our website, you must have a ChatGPT subscription."
More like "you need to sign up for our website and pay for a subscription", and I'd much rather do that, provided it's actually delivering value. I am absolutely not going to run a model locally that slowly churns out words at 5 tps while making the computer hot to the touch.
Also much better than every website wanting its own 22 GB rather than the 22 GB being a shared resource.
I would very much like not to have to download 22 GB for some inference capability that is far worse than API calls in both quality and speed.
I would rather pay money than watch this thing run in my browser, printing only 5 tps on high-end consumer hardware.
Why are you pretending those are the options?
The options are:
1. 22GB per website
2. 22GB per browser
3. 0GB / No AI capabilities
By having this in Chrome they are simply ensuring that option 2 replaces option 1. You still have option 3.
That is ~9% of the total available disk space on baseline phones and laptops, for a model that is not that useful.
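For what it's worth, the ~9% figure roughly checks out if you take 256 GB as the baseline device storage (that baseline is my assumption, not stated in the thread):

```python
# Back-of-envelope check of the "~9% of disk" claim.
# Assumes a 256 GB baseline device; real devices vary (128 GB-1 TB).
model_gb = 22
device_gb = 256

share = model_gb / device_gb
print(f"{share:.1%}")  # prints "8.6%", i.e. roughly 9%
```

On a 128 GB phone the same model would eat about 17% of total storage, which makes the objection even stronger.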