It seems to use https://api.deepgram.com (and other web endpoints) and apparently needs an API key, so it's not actually local. Why is it being compared to ollama, which does run fully locally?
It can run Whisper and Moonshine models locally, and it can optionally use other API providers as well. Read the docs, or at least this post.
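For illustration only, here is a rough sketch of the distinction being made between the local path and a hosted provider. The library calls (the openai-whisper package, Deepgram's REST endpoint) are my own assumptions for the example, not this project's actual code or configuration.

```python
# Illustrative sketch: "local" vs "API provider" transcription.
# Assumes the openai-whisper package and Deepgram's prerecorded REST API;
# this is NOT the project's own code.
import os
import requests   # hosted-API path
import whisper    # pip install openai-whisper; runs entirely on local hardware

def transcribe_locally(path: str) -> str:
    """No network, no API key: the Whisper model runs on your machine."""
    model = whisper.load_model("base")
    return model.transcribe(path)["text"]

def transcribe_via_deepgram(path: str) -> str:
    """Hosted path: audio is sent to api.deepgram.com and requires an API key."""
    with open(path, "rb") as f:
        resp = requests.post(
            "https://api.deepgram.com/v1/listen",
            headers={
                "Authorization": f"Token {os.environ['DEEPGRAM_API_KEY']}",
                "Content-Type": "audio/wav",
            },
            data=f,
        )
    resp.raise_for_status()
    return resp.json()["results"]["channels"][0]["alternatives"][0]["transcript"]
```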
I would want such information accessible without having to go hunt for it. You could improve your presentation by interposing fewer clicks between a reader and the thing they want to know.
The information is readily available in the open-your-eyes section.
> I would want such information accessible without having to go hunt for it.
Where exactly, if not in the FM?