If you want to run transcription locally to save on OpenAI costs, you can use my crate, mutter [0], which makes self-hosting the Whisper model super easy :)

[0]: https://crates.io/crates/mutter

Can I box this up in a docker container and stuff it behind an API?

Absolutely! There are existing solutions for this if you hunt around, but if you wanted to customize it, it would be pretty easy.
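For a rough idea, here's a minimal multi-stage Dockerfile sketch. It assumes you've written a small HTTP server binary (called `transcribe-server` here, a hypothetical name) that wraps mutter behind an endpoint; adjust the binary name, base images, and port to match your project:

```dockerfile
# Stage 1: build the Rust server in a full toolchain image
FROM rust:slim AS builder
WORKDIR /app
COPY . .
RUN cargo build --release

# Stage 2: slim runtime image with just the binary
FROM debian:bookworm-slim
# whisper.cpp-based crates typically need a C++ runtime and certs
RUN apt-get update && \
    apt-get install -y --no-install-recommends libstdc++6 ca-certificates && \
    rm -rf /var/lib/apt/lists/*
# "transcribe-server" is a placeholder for your own binary name
COPY --from=builder /app/target/release/transcribe-server /usr/local/bin/
EXPOSE 8080
CMD ["transcribe-server"]
```

One thing to watch: Whisper model weights are large, so you'll probably want to mount them as a volume or download them at container start rather than baking them into the image.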