> I prefer llamacpp because doing things "seamlessly" obscures the way things work behind the scenes, which is what I want to learn and play with.
And that's a fine choice, but some of us just want to hack with the models, not hack on them. Ollama is great for that, and SSHing into my server to restart the process every time I want to change models just doesn't work for me.
I'd rather wait a few weeks for the newest model and be able to switch between models easily than stay on the bleeding edge and sacrifice that.