I think you should mention that LM Studio isn't open source.
I mean, what's the point of using local models if you can't trust the app itself?
> I mean, what's the point of using local models if you can't trust the app itself?
and you think ollama doesn't do telemetry/etc. just because it's open source?
You're welcome to go through the source: https://github.com/ollama/ollama/
That's why i suggested using llama.cpp in my other comment.
Depends on what people use them for; not every user of local models is doing so for privacy, and some just don't like paying for online models.
Most LLM sites now offer free plans, and those are usually better than what you can run locally, so I think people are running local models for privacy 99% of the time.