Local-first, controllability (the custom-endpoint part), and eventually extensibility (the VSCode part of the post)

We're putting a lot of effort into making it run smoothly on local machines. There are no signups, and the app works without any internet connection after downloading models.

One of the things I'd want to do: as the meeting is going on, I'd like to ask an LLM what questions I could ask at that point in time, especially if it's a subject I'm not an expert in.

Would I be able to create an extension that could do this?

you can definitely do that in the future. we've had that in mind as well from multiple requests - planning to add "eli5 - explain like i'm five" and "mmss - make me sound smart" ;)
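To make the idea concrete: since the app is local-first and supports a custom endpoint, an extension like this could presumably package the live transcript into a request to a local, OpenAI-compatible server (e.g. a llama.cpp or Ollama instance). A minimal sketch, assuming a hypothetical `build_suggestion_request` helper - none of these names come from the app itself:

```python
import json

def build_suggestion_request(transcript_lines, model="local-model"):
    """Hypothetical: package the recent transcript into a chat-completions
    payload asking the LLM for questions worth raising right now."""
    context = "\n".join(transcript_lines[-20:])  # only the last few utterances
    return {
        "model": model,
        "messages": [
            {
                "role": "system",
                "content": "You are a meeting assistant. Suggest 3 short, "
                           "insightful questions the listener could ask next.",
            },
            {"role": "user", "content": f"Transcript so far:\n{context}"},
        ],
    }

payload = build_suggestion_request([
    "Alice: we plan to shard the database by tenant id",
    "Bob: latency should stay under 50ms after the change",
])
# This JSON body would be POSTed to the configured local endpoint,
# e.g. http://localhost:<port>/v1/chat/completions - no data leaves the machine.
body = json.dumps(payload)
```

The privacy point above is the design constraint here: because the endpoint is user-configured and local, the "suggest questions" loop never requires a signup or an internet connection.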

Wow, does anything like this exist in current commercial tools?

ELI5 sounds useful.

MMSS sounds terrifying though, honestly.

i think there are tools like cluely, which propose to "cheat" on everything in real time, or wearables like waves that show ar displays with real-time assist. (i've never used either of them, but that's my understanding of their products.) so proactive ai agents are somewhat becoming a thing, i guess. but for us it all boils down to privacy.

mmss was something that a lot of users suggested - they wanted to be saved from public humiliation