But are these LLMs worth their salt?
They're not, unless you grade on a curve because they're running locally.
Which some people do, but I don't think the average person asking this question does (and I don't).
With 128GB of memory they can have real-world use cases, but they won't be as good as SoTA hosted models.