It's actually almost exactly the other way around.
The probability of a model M given data X, or P(M|X), is the posterior probability. The likelihood of data X given model M, or P(X|M), is the probability (or probability density, depending on whether your data is discrete or continuous) of observing data X given model M. We are often given un-normalised likelihoods, which is what the linked paper talks about. These quantities are related via Bayes' Theorem.
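Written out, Bayes' Theorem makes the relationship explicit (posterior equals likelihood times prior, divided by the evidence):

$$
P(M \mid X) = \frac{P(X \mid M)\,P(M)}{P(X)}
$$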
Now, you may ask: isn't the probability of observing data X given model M still a probability? Yes, a properly normalised likelihood is indeed a probability. The likelihood is not the mirror image of probability; it is just an un-normalised probability (or probability distribution) of the data given a model or model parameters.
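To make this concrete, here is a minimal sketch in Python, assuming a toy coin-flip setup (the numbers and names are illustrative, not from the paper): each likelihood value is a genuine probability of the data, but summed across models it does not add to 1, whereas the posterior, normalised via Bayes' Theorem, does.

```python
import numpy as np
from scipy.stats import binom

# Illustrative data: X = 7 heads observed in n = 10 tosses.
n, k = 10, 7

# Candidate models M: a grid of coin-bias parameters theta.
thetas = np.linspace(0.01, 0.99, 99)

# Likelihood P(X|M): probability of the observed data under each model.
# Each entry is a proper probability of X, but summed over the *models*
# it does not total 1 -- the likelihood is not a distribution over M.
likelihood = binom.pmf(k, n, thetas)
print(likelihood.sum())   # generally != 1

# Posterior P(M|X) via Bayes' Theorem with a uniform prior:
# normalising over models turns it into a distribution over M.
prior = np.full_like(thetas, 1.0 / len(thetas))
unnormalised = likelihood * prior
posterior = unnormalised / unnormalised.sum()
print(posterior.sum())    # == 1
```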