One challenge is that a server's power usage is an order of magnitude (or more) greater than a laptop's. That makes the cost of doing what you describe significant, so it has to be taken into account when trying to build a cluster that is competitive...

Yeah, I agree with that. Power savings are a big priority for datacenters these days, so perhaps as more efficient chips go into production, the feasibility of "self-contained" servers increases. I could serve a lot of websites from my phone, and I've never had to fire up a diesel generator to get 100% uptime on it. (But the network infrastructure uses more power than the phone itself: my ONT + router draw more than 20 W! The efficiency has to be everywhere for this to work.)
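
To put rough numbers on that gap, here's a back-of-envelope sketch in Python. Every wattage except the >20 W ONT + router figure above, plus the electricity rate, is an assumption for illustration, not a measurement:

    # Yearly energy cost of a server vs. a phone plus home network gear.
    HOURS_PER_YEAR = 24 * 365
    RATE_USD_PER_KWH = 0.15  # assumed electricity price

    def yearly_cost_usd(watts: float) -> float:
        """Cost to run a device continuously for one year."""
        kwh = watts * HOURS_PER_YEAR / 1000
        return kwh * RATE_USD_PER_KWH

    server_w = 300   # assumed draw of a typical 1U server under load
    phone_w = 3      # assumed average draw of a phone serving light traffic
    network_w = 20   # the ONT + router figure above

    print(f"server:          ${yearly_cost_usd(server_w):.2f}/yr")             # ~$394/yr
    print(f"phone + network: ${yearly_cost_usd(phone_w + network_w):.2f}/yr")  # ~$30/yr

So even counting the network gear, the phone setup is more than an order of magnitude cheaper to power continuously, which is the gap a laptop- or phone-based cluster would have to exploit.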