I don't think there is a generic computational model that would prevent deadlocks, so no, Pony doesn't solve them either.

In Pony, messages are handled via behaviors, which look like normal methods; the only differences are that a behavior cannot return anything and that it runs asynchronously. Hence, the example in the post cannot occur, since you cannot really wait for something in Pony. You'd have to explicitly call some other behavior on the other actor to "respond" to its first message, which breaks the circularity.
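A minimal sketch of what that looks like (hypothetical actor and behavior names; behaviors are declared with `be`, are asynchronous, and cannot return a value, so the only way to get an "answer" is a callback into another behavior):

```pony
actor Ponger
  be pong(sender: Pinger, env: Env) =>
    // Runs asynchronously; Pinger never blocked waiting for this.
    // The "reply" is just another one-way message back.
    sender.done(env)

actor Pinger
  be ping(other: Ponger, env: Env) =>
    // Fire-and-forget: send ourselves along so Ponger can call back.
    other.pong(this, env)

  be done(env: Env) =>
    env.out.print("got the callback")

actor Main
  new create(env: Env) =>
    let pinger = Pinger
    pinger.ping(Ponger, env)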

Yes, deadlocks are not hard to prevent entirely. Doing so comes with consequences, but they're not too crazy. Without provable termination, though, I don't know of any model that prevents livelocks (and many deadlocks simply become livelocks in a deadlock-free system, though design habits often change to mitigate this somewhat).

The only ones I know of that prevent both are proof-oriented languages that cannot run an infinite loop, which disqualifies them on many technical and practical grounds. But they do exist.

Yeah. That's why I said "generic", and we would probably agree that Coq and the like are not generic languages.