Yeah, we use whisper.cpp for Whisper inference. This is more of a community-focused project, not a commercial product!
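
(In case it helps anyone reading along: a rough sketch of what that whisper.cpp inference path looks like through its C API. The model path and the audio loading here are placeholder assumptions for illustration, not a description of our actual code.)

```cpp
// Minimal whisper.cpp inference sketch.
// Assumes 16 kHz mono float PCM has already been decoded into `pcm`
// by your own audio loader; the model path below is just an example.
#include <cstdio>
#include <vector>
#include "whisper.h"

int main() {
    // Load a ggml model ("models/ggml-base.en.bin" is a placeholder path).
    struct whisper_context *ctx = whisper_init_from_file_with_params(
        "models/ggml-base.en.bin", whisper_context_default_params());
    if (!ctx) return 1;

    std::vector<float> pcm; // fill with 16 kHz mono samples in [-1, 1]

    // Run full transcription with default greedy decoding.
    whisper_full_params params = whisper_full_default_params(WHISPER_SAMPLING_GREEDY);
    if (whisper_full(ctx, params, pcm.data(), (int) pcm.size()) != 0) {
        whisper_free(ctx);
        return 1;
    }

    // Print the decoded text segments.
    for (int i = 0; i < whisper_full_n_segments(ctx); ++i) {
        printf("%s\n", whisper_full_get_segment_text(ctx, i));
    }

    whisper_free(ctx);
    return 0;
}
```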

Yeah, after spending a decent amount of time in r/localllama, I was surprised that a project would want to name itself in association with Ollama; it's got a pretty bad reputation in the community at this point.