> AMD doesn't seem like it's going to be a player of note in the AI sphere.
I thought AMD was well positioned in the inference space? They have high VRAM, reasonably high connectivity, and are already shipping pods that are pretty decent for inference. Training is still dominated by nvda and their CUDA moat, but there's a growing need for scalable inference, and that should fit AMD well. Am I wrong in thinking that?
That's what I've read as well, but the ocean-wide gap between Nvidia and AMD in AI chip market share doesn't reflect it.