SRE here who has dealt with this before.
Everything in the article is an excellent point, but another big one is that schema changes become extremely difficult, because you may have unknown applications relying on that schema.
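A minimal sketch of that failure mode, using Python with an in-memory SQLite database standing in for the shared production DB (the `orders` table and `legacy_report` consumer are hypothetical): a rename that looks harmless to the team making it breaks a consumer they didn't know existed.

```python
import sqlite3

# In-memory stand-in for a database shared by many applications.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_name TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 'Acme')")

def legacy_report(db):
    # A consumer you may not even know exists, coupled to the column name.
    return db.execute("SELECT customer_name FROM orders").fetchall()

assert legacy_report(conn) == [("Acme",)]  # works today

# An apparently harmless rename breaks that unknown consumer at runtime.
conn.execute("ALTER TABLE orders RENAME COLUMN customer_name TO customer")

err = None
try:
    legacy_report(conn)
except sqlite3.OperationalError as exc:  # "no such column: customer_name"
    err = exc
print("legacy consumer broke:", err)
```

With one known consumer you'd just update the query; the pain is that in an integration-database setup there is no complete list of consumers, so every change carries this risk.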
Also, at a certain point the database becomes absolutely massive, and you will need teams of DBAs for its care and feeding.
This is true. But at the same time, people need to understand that most companies will never hit that point. For them it's a matter of if, not when.
Everyone tries to plan for a world where they've become one of the hyperscalers. Better to optimize for the much more likely scenarios.
We were not a hyperscaler; we were a boring company that you've never heard of.
The database is still 40TB, with 3200 stored procedures.
I've dealt with Postgres DBs larger than that, though with no stored procedures, and have never run into such problems. The one exception was a single table in a single DB at one stop, and that was a special case of people being extra stupid.
Granted, DB size isn't the best metric to be using here in terms of performance, but it's the one you used.
Not only will you need a team of DBAs caring for it, but you'll never be able to hire them.
No organization I have seen prioritizes a DBA's requirements, concerns, or approach. They certainly don't pay them enough to deal with that bullshit, so I was out.