Mirrors the geohot rants about AMD at the time, though as others point out this (2024) is ancient news in the AI world, and I'm not sure what value it adds to the current discussion.

Has this changed? If I want to go hands-on with development using PyTorch (or whatever is used now), would you recommend an AMD card?

Genuine question, I have not followed this topic closely for years :)

Please just get everything in PyTorch to work, and work well (and across all graphics cards too). This is the starting point, and it doesn't matter how you do it. But the fact that you cannot even do some very basic stuff on AMD means you will be ignored by researchers, so getting further up the stack is going to be almost impossible.

The problem is "just". "Just" getting PyTorch to work, and to work well, is a huge undertaking.
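For context on what "working" even means here: PyTorch's ROCm builds expose AMD GPUs through the familiar `torch.cuda` API, and set `torch.version.hip` instead of `torch.version.cuda`. A minimal sketch (assuming PyTorch may or may not be installed) of probing which backend a given install was built for:

```python
# Hedged sketch: probe a PyTorch install for its GPU backend.
# On ROCm builds, torch.version.hip is set and AMD GPUs are driven
# through the usual torch.cuda.* calls; on CUDA builds, torch.version.cuda
# is set instead. Guard the import so this also runs without PyTorch.
try:
    import torch
except ImportError:
    torch = None

def gpu_backend() -> str:
    """Return a rough label for the GPU backend of this PyTorch build."""
    if torch is None:
        return "no-pytorch"
    if getattr(torch.version, "hip", None):   # present only on ROCm builds
        return "rocm"
    if getattr(torch.version, "cuda", None):  # present only on CUDA builds
        return "cuda"
    return "cpu-only"

print(gpu_backend())
```

The irony is that even when this probe reports "rocm", it says nothing about whether a given op actually runs, or runs fast, on a given AMD card — which is the gap the parent comment is pointing at.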

Correction: why wasn't it competitive two years ago, basically half an AI summer ago?

[deleted]

If AMD is betting the company on AI compute, they had best follow the advice in the article, because the only way to compete with NVIDIA is to meet or exceed not just the performance but also the DevX.

These days it's the dev environment, for sure, that is lacking: the hardware is okay (potentially great?!), the software abysmal. Running a local LLM in a stable manner implies using Vulkan... any attempt at ROCm is hamstrung by haphazard hardware support, alongside an online presence poisoned by people primarily discussing workarounds rather than work when it comes to AMD as a platform. Argh.

You can't have good performance without good DevX. There's a reason we get a new Python DSL for NVIDIA GPUs every week.

I love how they just butcher that article.

I remember when it came out a little over a year ago, and it's just as wrong today as it was then.

[2024]