If memory serves, PyPy supports a subset of Python and focused its optimizations on software transactional memory.

Back in 2022 it worked fine with literally all modules except some ssh, ssl, and C-based modules.

With a little bit of tinkering (multiprocessing, choosing the right libraries written in pure Python, PyPy, plus a lot of memory) I was able to optimize some workflows, going from 24h down to just 17 minutes :) Good times...

It felt like magic.

The "C based modules" bit is the kicker. A significant chunk of Python users essentially use it as a friendly wrapper for more-powerful C/C++ libraries underneath the hood.

They've long since fixed the C-based module interaction; unfortunately, a lot of common knowledge dates from when it couldn't interact with everything.

If you've written it off on that basis, I'd suggest it's worth giving it another shot at some stage. It might surprise you.

Last I saw there was still a little more overhead around the C interface, so hot loops that just call out to a C module can be a smidgen slower, but I haven't seen it be appreciably slower in a fair while.

The FAQ states it is often much slower:

> We have support for c-extension modules (modules written using the C-API), so they run without modifications. This has been a part of PyPy since the 1.4 release, and support is almost complete. CPython extension modules in PyPy are often much slower than in CPython due to the need to emulate refcounting. It is often faster to take out your c-extension and replace it with a pure python or CFFI version that the JIT can optimize.
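The FAQ's "replace it with a pure Python version" advice can be illustrated with a trivial sketch (my own example, not from the FAQ): a tight Python-level loop is exactly what PyPy's tracing JIT compiles well, whereas calling into a C extension on every iteration would pay the emulated-refcounting boundary cost each time.

```python
# A pure-Python hot loop like this is what PyPy's tracing JIT optimizes
# well; routing each iteration through a CPython C extension would
# instead pay the CPyExt boundary cost on every call.
def sum_of_squares(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

print(sum_of_squares(10))  # 285
```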

https://doc.pypy.org/en/latest/faq.html#do-c-extension-modul...

I have seen great success with cffi though.
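For anyone who hasn't tried it, here's roughly what CFFI's ABI mode looks like (a minimal sketch; it assumes a POSIX system, where `ffi.dlopen(None)` loads the C library of the current process). No compiler is needed, and PyPy's JIT can optimize through these calls:

```python
from cffi import FFI

ffi = FFI()
# Declare the C function signature we want to call (ABI mode).
ffi.cdef("size_t strlen(const char *s);")
# dlopen(None) loads the current process's C library on POSIX systems.
libc = ffi.dlopen(None)

print(libc.strlen(b"hello"))  # 5
```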

I see, and it's a pretty short list:

https://doc.pypy.org/en/latest/cpython_differences.html#exte...

""" The extension modules (i.e. modules written in C, in the standard CPython) that are neither mentioned above nor in lib_pypy/ are not available in PyPy. """

The lifecycle of generators makes PyPy code very verbose without refcounting. I've already been bitten by generator lifecycles and shared resources. PEP 533, which would fix this, was deferred. Probably for the best, as it seems a bit heavy-handed.
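A small sketch of the verbosity in question (my own example): on CPython, refcounting finalizes an abandoned generator, and thus releases any resource it holds, almost immediately; on PyPy that waits for the GC, so you have to close generators explicitly, e.g. with `contextlib.closing`.

```python
import contextlib
import os
import tempfile

def read_lines(path):
    # The file is released only when the generator is exhausted or closed.
    with open(path) as f:
        for line in f:
            yield line.rstrip("\n")

# Set up a throwaway file for the demonstration.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("first\nsecond\n")

# On CPython, dropping the generator would close the file promptly via
# refcounting; on PyPy it wouldn't, hence the explicit close here.
with contextlib.closing(read_lines(path)) as lines:
    first = next(lines)
print(first)  # first

os.remove(path)
```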

Yep, I had a script that was doing some dict mapping and re-indexing; I wrote the high-level code to be as optimal as possible, and switching from CPython to PyPy brought the run time from 5 minutes down to 15 seconds.
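A hypothetical sketch of that kind of workload (the function and data here are my own, not the commenter's actual script): re-indexing a list of dicts by a key is a tight Python-level loop over dict operations, which is the sort of thing PyPy's JIT speeds up dramatically.

```python
# Group a list of record dicts by the value of one key.
def reindex(records, key):
    index = {}
    for rec in records:
        index.setdefault(rec[key], []).append(rec)
    return index

rows = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}, {"id": 1, "v": "c"}]
by_id = reindex(rows, "id")
print(sorted(by_id))  # [1, 2]
```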

If PyPy worked with Retux, the game would get a big boost. Although the main issue is that it tries to redraw many objects at once per frame.

Not a subset. It covers 100% of pure Python. CPyExt works fine; it just needs optimization in some parts. The only things PyPy does not officially support are the private CPyExt calls that some libraries use as hacks (PyO3's Rust-Python bindings use those).