I pay to use ProtonMail’s privacy-preserving Lumo LLM chat, which has good web_search tooling. Lumo is powered by Mistral models.

I use Lumo a lot and usually the results are good enough. To be clear though, I do fall back on gemini-cli and OpenAI’s Codex systems for coding a few times a week.

I live in the US, but if I were a European, I would be all in on supporting Mistral. Strengthen your own country and region.

I wonder what ProtonMail are doing internally? Mistral's public API endpoints route via Cloudflare, just like apparently every other hosted LLM out there, including every Chinese model I've checked.

>I live in the US, but if I were a European, I would be all in on supporting Mistral. Strengthen your own country and region

That's a bit of a double-edged sword. My support goes as far as giving local offerings a try when I might not have done otherwise. But at that point they need to be able to compete on merit.

https://proton.me/support/lumo-privacy

> Lumo is powered by open-source large language models (LLMs) which have been optimized by Proton to give you the best answer based on the model most capable of dealing with your request. The models we’re using currently are Nemo, OpenHands 32B, OLMO 2 32B, and Mistral Small 3. These run exclusively on servers Proton controls so your data is never stored on a third-party platform.

Mistral Small and Large are open-weight, so they are likely self-hosting?
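Self-hosting an open-weight model is straightforward in principle. As a rough sketch (not anything Proton has published; the model identifier, server choice, and flags are all assumptions), a vLLM deployment exposing an OpenAI-compatible endpoint on your own hardware might look like:

```shell
# Sketch only: serve an open-weight Mistral model behind an
# OpenAI-compatible HTTP API using vLLM.
# The HF model id below is an assumption, not Proton's actual setup.
vllm serve mistralai/Mistral-Small-24B-Instruct-2501 \
    --host 0.0.0.0 --port 8000

# Query it like any OpenAI-style chat endpoint:
curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "mistralai/Mistral-Small-24B-Instruct-2501",
         "messages": [{"role": "user", "content": "Hello"}]}'
```

Run this way, prompts and completions never leave servers the operator controls, which is consistent with Proton's "never stored on a third-party platform" claim.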

Note that Proton is sketchy about their code being open-source and available for anyone to review: https://news.ycombinator.com/item?id=44665398

>> I live in the US, but if I were a European, I would be all in on supporting Mistral. Strengthen your own country and region.

The problem is that if it's actually successful, it'll just be bought by one of the big US-based competitors.

I don't think France would allow that to happen - they would block it on national interest grounds.