The cross entropy loss function is softmax. They are one and the same.

They’re not. Cross entropy loss is E[-log q], where q is a predicted probability. You could convert the model outputs x into probabilities using some other normalization, say q_i = x_i^2 / Z with Z = sum_j x_j^2, and compute cross entropy loss just fine.
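A minimal sketch of that point (function names here are made up for illustration; the squared normalization is the one from the comment above, not anything standard):

```python
import torch

def squared_norm_probs(x: torch.Tensor) -> torch.Tensor:
    # Not softmax: q_i = x_i^2 / Z, with Z = sum_j x_j^2,
    # which is still a valid probability distribution.
    sq = x ** 2
    return sq / sq.sum(dim=-1, keepdim=True)

def cross_entropy_from_probs(q: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    # Cross entropy against a one-hot target reduces to -log q[target].
    eps = 1e-12  # guard against log(0)
    picked = q.gather(-1, target.unsqueeze(-1)).squeeze(-1)
    return -torch.log(picked + eps).mean()

logits = torch.randn(4, 10)          # raw model outputs
target = torch.randint(0, 10, (4,))  # class indices

loss = cross_entropy_from_probs(squared_norm_probs(logits), target)
print(loss)  # a perfectly well-defined cross entropy, no softmax involved
```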

Behold the softmax, baked right into PyTorch's CrossEntropyLoss: https://docs.pytorch.org/docs/2.11/generated/torch.nn.CrossE...

Behold the actual definition of cross entropy: https://en.wikipedia.org/wiki/Cross-entropy

It's true that the PyTorch API conflates the two (torch.nn.CrossEntropyLoss expects raw logits and applies log-softmax internally), but cross entropy and softmax are separate concepts.
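You can see the conflation directly, since PyTorch's fused loss is equivalent to log-softmax followed by negative log-likelihood (a quick check, assuming PyTorch is installed):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 10)
target = torch.randint(0, 10, (4,))

# F.cross_entropy applies log-softmax to the logits internally...
fused = F.cross_entropy(logits, target)

# ...so it matches doing the two steps by hand.
manual = F.nll_loss(F.log_softmax(logits, dim=1), target)

print(torch.allclose(fused, manual))  # True
```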