Old hardware is cheap to buy, but the electricity to run those old rigs isn't cheap in many areas anymore. My server costs me about $100/month in electricity.

It does have 16 spinning disks in it, so I accept that I pay for the energy to keep them spinning 24/7, but I like the redundancy of RAID10, and I have two 8-disk arrays in the machine. Plus a Ryzen 7 5700G, a 10Gbit NIC, a 16-port RAID card, and 96GB of RAM.

It depends on the type of hardware you use for your server. If it's really server-grade, you're totally right. For example, a cheap X99 CPU + memory + motherboard combo off AliExpress is cheap to buy, but not very efficient.

In my case I fell in love with the tiny/mini/micros and have a refurbished Lenovo M710q running 24/7, using only 5W when idling. I know it doesn't support ECC memory or more than 8 threads, but for my use case it's more than enough.

I’ve been watching some storage- and homelab-themed videos, and I’ve heard there are a lot of optimizations you can make to lower power usage: spinning the disks down, only powering the machine on for limited periods, etc.
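To get a rough feel for what spinning disks down could actually save, here's a back-of-envelope sketch in Python. All the per-drive wattages, idle hours, and the electricity rate are illustrative assumptions, not measurements from anyone's setup in this thread:

```python
# Back-of-envelope estimate of savings from spinning down idle disks.
# Every figure below is an assumption for illustration, not a measured value.

DRIVES = 16              # number of spinning disks (as in the 2x8-disk server above)
SPINNING_W = 5.0         # assumed idle power per drive while spinning, watts
STANDBY_W = 0.8          # assumed power per drive in standby, watts
IDLE_HOURS_PER_DAY = 12  # assumed hours/day the array could actually be spun down
RATE_PER_KWH = 0.23      # assumed electricity rate, $/kWh

def monthly_savings(drives, spinning_w, standby_w, idle_hours, rate):
    """Dollars/month saved by putting idle drives into standby (30-day month)."""
    watts_saved = drives * (spinning_w - standby_w)
    kwh_saved = watts_saved * idle_hours * 30 / 1000
    return kwh_saved * rate

print(f"~${monthly_savings(DRIVES, SPINNING_W, STANDBY_W, IDLE_HOURS_PER_DAY, RATE_PER_KWH):.2f}/month")
```

With these assumed numbers it works out to only a few dollars a month, which is why spindown mostly pays off when the drives themselves are the bulk of the load, and not at all when something is hitting them around the clock.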

That doesn't work for me. The main server is constantly using the disks: it records security cameras and runs VMs 24/7, Plex, a web server, a VPN (so I can dial in to my local network remotely), and a lot more.

How have you measured the power usage/cost? That seems like an incredibly high price for electricity, similar to a 600W constant load in my part of the world.

All of my IT equipment in my office is running through a single UPS that measures power consumption.

I do have a bit more than just that server hooked up to it. There's also a Dell i5 running DD-WRT as my main gateway/router, the fiber internet modem, a small Synology NAS, a couple of Wi-Fi routers, etc. It all adds up.

That doesn't include my backup server out in the garage with another 8-disk RAID10 array and an LTO tape drive that is often backing up data, 5 more Wi-Fi routers around the property, and 10 or so security cameras. So I'm probably well over $100/mo for all my tech stuff.