> First, there’s a signal being given about the complexity and fragility of the software. A 16GB RAM requirement just screams “this is a big, complex piece of software that will break in mysterious ways”. Not very scientific, I know, but based on experience.

This lines up with my experience self-hosting a headless BI service. In "developer mode" it takes maybe 1GB of RAM, but as soon as you want to go to prod you need multiple servers with 4+ cores and 16GB+ RAM, plus a strongly consistent shared file store. Add confusing docs, mysterious breakages, incomplete logging and admin APIs, and a reliance on community templates for stuff like the k8s deployment... it was very painful. I too gave up on self-hosting.

In most corporate projects I worked on, the day the product went to production started the clock on disbanding the core team, and optimizations were not on the table. Once the product works, they move on. The only area where performance optimization is a mandatory step is video games, but there the clients are external. Most business software sits in the "it finally works, but barely" state, and new features beat performance improvements.

This is caused by short-sighted management that needs to deliver and move on. "Long term" is at odds with their business model; in this case, "long term" means "after product launch".