It's not a question of support, it's a question of defaults. Rust code exercises a single-ownership discipline by default, and you must opt in to dynamic multi-ownership (via refcounting or otherwise). In languages that have pervasive dynamic multi-ownership by default (via GC, etc.), enforced single-ownership is instead opt-in, which means that you cannot expect code in the wild to use it. In Rust, you can expect code in the wild to exhibit a single-ownership discipline; that's just the prevailing style for Rust code, as encouraged by the design of the language itself.
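A minimal sketch of those defaults: values move (single ownership) unless you explicitly opt in to shared ownership via reference counting. The names here are just illustrative.

```rust
use std::rc::Rc;

fn main() {
    // Single ownership by default: assigning `s` moves it into `t`.
    let s = String::from("owned");
    let t = s;
    // println!("{}", s); // compile error: `s` was moved out of

    // Opt-in dynamic multi-ownership: both handles share one value,
    // tracked by a runtime reference count.
    let a = Rc::new(String::from("shared"));
    let b = Rc::clone(&a);
    assert_eq!(Rc::strong_count(&a), 2);
    println!("{} {}", t, b);
}
```

The point is that the shared case is syntactically loud (`Rc::new`, `Rc::clone`), so idiomatic code reaches for it only when it's actually needed.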
In fact, we can see this "defaults matter" problem in Rust as well. Note that Rust by default assumes that code is running in a context where a dynamic allocator is available, but allows one to opt out of this ("no_std" mode). Code written for embedded devices or bare-metal contexts uniformly opts into this mode, but because it's not the default, you can't just pull any old library off the shelf and expect it to work for you, so that ecosystem is much smaller and less mature. Defaults matter.
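To sketch what that opt-out looks like: a no_std library puts `#![no_std]` at the crate root, which swaps the `std` prelude for `core` and drops the assumption of an OS and a heap allocator. The function below uses only `core` items, so it would compile unchanged in such a crate (a real embedded crate would also need a panic handler and an entry point; this is a simplified illustration).

```rust
// Allocator-free by construction: borrowed slice in, value out.
// No Vec, no String, nothing that requires a heap.
fn checksum(bytes: &[u8]) -> u8 {
    bytes.iter().fold(0u8, |acc, &b| acc.wrapping_add(b))
}

fn main() {
    assert_eq!(checksum(&[1, 2, 3]), 6);
    assert_eq!(checksum(&[255, 1]), 0); // wraps modulo 256
    println!("ok");
}
```

Any library whose API looks like this works in both worlds, but a library that hands back a `Vec` does not, which is exactly why the no_std ecosystem is a smaller subset.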
True, defaults matter. In many cases, however, a language that defaults to a GC for memory safety is simply preferable, or at least easier to use.
The argument is often about when ownership and borrowing are truly necessary. Rust has its uses, but, because of its defaults, arguably not for everything all the time.
I'm going to thoughtfully disagree here. The more I use Rust, the more I feel like we got it wrong all the way back in the mists of ALGOL, and that copy-by-default (which, in managed languages, translates to multi-ownership-by-default) was a mistake in the same league as null. Being able to convincingly express resource consumption in a first-class way is just too broadly useful, IMO. Certainly we can imagine a "high-level Rust" that doesn't make you care so much about pointers and the stack/heap distinction, but I still think such a language would want to make single-ownership the default. To that end, rather than any of the aforementioned languages, I'll suggest Austral as looking fairly interesting, although it's still far from high-level: https://austral-lang.org/
I’m writing some C# code at the moment, and the fact that `ObjectDisposedException` exists should give you a clue that “consume” semantics are desirable in all languages.
It opens up entirely new avenues for statically error-free programming, letting you model things like “if the caller has an instance of this type, I can guarantee that this other, larger proposition is true,” without also having to handle the case where the caller smuggled in another instance from another call site.
This is really, really useful.
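Here is a minimal Rust sketch of both ideas: consume semantics (a method that takes `self` by value, so use-after-close is a compile error rather than an `ObjectDisposedException`) and a non-cloneable witness type whose mere existence proves something happened. All the names here are hypothetical.

```rust
struct Connection {
    id: u32,
}

// A receipt proving that a particular connection was closed. It has
// no `Clone` impl and no public constructor, so the only way to get
// one is to actually close a connection; a caller cannot smuggle one
// in from some other call site.
struct Closed {
    id: u32,
}

impl Connection {
    fn open(id: u32) -> Connection {
        Connection { id }
    }

    fn send(&self, msg: &str) {
        println!("conn {}: {}", self.id, msg);
    }

    // Consumes the connection and returns the proof of closure.
    fn close(self) -> Closed {
        Closed { id: self.id }
    }
}

// Requires evidence of closure; you simply can't call this
// without having closed a connection first.
fn audit(receipt: &Closed) {
    println!("conn {} verified closed", receipt.id);
}

fn main() {
    let conn = Connection::open(7);
    conn.send("hello");
    let receipt = conn.close();
    // conn.send("again"); // compile error: `conn` was moved by `close`
    audit(&receipt);
}
```

The “larger proposition” in this toy is just “connection N was closed,” but the same shape scales to things like “the config was validated” or “the lock is held,” all checked at compile time.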