You can go a long way with just Termux. You can upcycle old phones by installing or building software in Termux, turning them into AI inference nodes, file servers, web servers, or nodes in a small compute grid.
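As a rough sketch of the "old phone as server" idea (package names are from Termux's own repos; the port and directory choices here are assumptions, not requirements):

```shell
# Inside Termux on the phone
pkg update
pkg install openssh python
termux-setup-storage          # grants access to shared storage under ~/storage

# Start sshd so you can manage the phone headlessly
# (Termux's sshd listens on port 8022 by default, not 22)
sshd

# Instant file server: serve shared storage over HTTP
python -m http.server 8080 --directory ~/storage/shared
```

From there it's the same as any small Linux box: cron via `crond`, reverse-SSH tunnels, nginx from `pkg`, and so on.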

I was actually just going to do that with an old Galaxy S24. Seems like there's no easy way to add something like Docker; the best I can find is to use QEMU to get a full Linux VM.

Do you happen to know what kind of performance you can expect? Or perhaps a better way?
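For reference, the QEMU route looks roughly like this in Termux (a sketch - the exact package name and the `alpine.iso` image are assumptions you'd substitute; and since Android gives you no KVM, this is pure TCG emulation, so expect it to be noticeably slow):

```shell
# Termux packages QEMU system emulators in its repos
pkg install qemu-system-x86-64-headless qemu-utils

# Create a disk image and boot an installer ISO you downloaded yourself
qemu-img create -f qcow2 disk.qcow2 8G
qemu-system-x86_64 -m 1024 -smp 2 \
  -drive file=disk.qcow2,if=virtio \
  -cdrom alpine.iso -nographic
```

The lack of KVM is the main performance answer: every guest instruction is translated in software, so a VM this way is fine for light services but painful for anything compute-heavy.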

There's an app called Termux that ships distro packages compiled for Android's flavor of Linux. They're not binary-compatible with regular GNU/Linux, but most software installs and runs the way it would on a normal distro.
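A common middle ground, short of a full VM, is proot-distro, which Termux also packages. It gives you a regular distro userland under ptrace-based emulation - no kernel namespaces, so still no Docker, but apt and friends work normally:

```shell
# Inside Termux
pkg install proot-distro
proot-distro install debian
proot-distro login debian
# Inside the Debian rootfs you can now `apt install` packages as usual
```

Much faster than QEMU since it runs natively on the host kernel; the tradeoff is that anything needing real kernel features (containers, most mounts) won't work.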

> AI inference nodes

Are phones any good for that? (I agree with the rest, and I'm a big fan of termux, I just wouldn't have thought of a phone - especially an old phone - as a useful way to run AI)

Modern phones pack a good bit of compute, and can run things like VLAs (vision-language-action models) decently well.

Of course, that would require today's phones to age out of the "being used as a phone" bracket, and robotics VLAs to become actually useful. But things like the Comma AI autopilot hardware use slightly obsolete smartphone chips internally - so it's not like it's impossible to run a useful AI on this kind of HW.
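As one concrete data point for the "inference node" idea, llama.cpp builds natively in Termux on arm64 (a sketch; `model.gguf` is a placeholder for any quantized GGUF model you'd download yourself):

```shell
# Inside Termux: toolchain plus build
pkg install git cmake clang
git clone https://github.com/ggerganov/llama.cpp
cmake -B llama.cpp/build llama.cpp
cmake --build llama.cpp/build -j

# Run a small quantized model on-device (model.gguf is an assumption)
llama.cpp/build/bin/llama-cli -m model.gguf -p "Hello" -n 32
```

Small quantized models run at usable speeds on recent phone SoCs, which is the sense in which an old flagship can be a modest inference node.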