I was really impressed with how Llama 3 (running via Ollama) performed on an AMD Vega 64 (~2017 tech) with only 8 GB of HBM2. It was definitely limited, but very local and helpful.
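For anyone curious what "local and helpful" looks like in practice, here is a minimal sketch of querying a local Ollama server from Python. It assumes Ollama is running on its default port (11434) and that the 8B model has already been pulled (e.g. with `ollama pull llama3:8b`); the model tag and prompt are just illustrative placeholders.

```python
# Minimal sketch: query a locally running Ollama server over its REST API.
# Assumes `ollama serve` is running on the default port and llama3:8b is pulled.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def ask(prompt: str, model: str = "llama3:8b") -> str:
    """Send a prompt to the local model and return the full (non-streamed) response."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,  # small GPUs like an 8 GB Vega 64 can be slow on long prompts
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    print(ask("In one sentence, why does memory bandwidth matter for LLM inference?"))
```

Everything stays on the machine, which is the appeal: the only network traffic is the loopback request to the local Ollama process.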