Exactly. I think this shows two things quite nicely:
- Very few sites need to cope with more than a handful of hits per second. A regular DSL connection and a desktop PC can host the vast majority of them; you don't need the cloud if you don't want it. (Even under variable load: if you need 80% of your capacity more than 40% of the time, scaling down is probably not worth the cloud premium; see the back-of-envelope sketch after this list.)
- If a site can't handle HN traffic, that's a software limitation. Compare WordPress's insanely slow page generation to simple blog software that renders a page in 5 milliseconds, or to hosting the blog as static HTML files. I wouldn't be surprised if you could serve Wikipedia's page text from one Raspberry Pi 5 per country. Not that you'd want to, for reliability and redundancy reasons, plus there's the constant stream of edits to process and templates to (re-)render, and media/blob hosting is a separate beast. Thankfully, most sites aren't among the world's ten most popular, so you can get away with a lot.
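
To put rough numbers on the scaling-down point from the first bullet (every figure below is a made-up assumption for illustration, not a quote from any provider):

```python
# Back-of-envelope for "is scaling down worth the cloud premium?"
# All numbers are assumptions for illustration only.

peak_capacity = 10        # servers needed at peak
high_load_share = 0.4     # at least 40% of the time...
high_load_need = 0.8      # ...you need at least 80% of peak capacity
flat_price = 1.0          # normalized price of one always-on server-hour
cloud_premium = 3.0       # assumed markup of on-demand over that flat price

# Option A: provision flat-rate capacity for peak and never scale.
flat_cost_per_hour = peak_capacity * flat_price

# Option B, best case for the cloud: scale to zero outside the busy hours.
cloud_cost_per_hour = (peak_capacity * high_load_need * high_load_share
                       * flat_price * cloud_premium)

print(f"flat-rate at peak capacity: {flat_cost_per_hour:.2f}/h")
print(f"cloud, best possible case : {cloud_cost_per_hour:.2f}/h")
# 10.00/h vs 9.60/h: even scaling to zero for the other 60% of the time
# barely breaks even once the assumed premium is factored in.
```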
For the vast majority of its users, WordPress is effectively a static host, so the generation time is irrelevant: almost by default, it caches the rendered page and serves every subsequent request from that cache.
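
The whole trick is a full-page cache. A toy sketch of the pattern (not WordPress's actual code; render_page and page_cache are names made up for illustration):

```python
# Toy sketch of the full-page-cache pattern that WordPress caching plugins
# and managed hosts rely on. Illustration only, not real WordPress code.

import time

page_cache: dict[str, str] = {}

def render_page(path: str) -> str:
    """Stand-in for the expensive part: PHP, templates, database queries."""
    time.sleep(0.2)  # pretend rendering takes 200 ms
    return f"<html><body>content for {path}</body></html>"

def serve(path: str) -> str:
    # Cache hit: return the stored HTML, no rendering at all.
    if path in page_cache:
        return page_cache[path]
    # Cache miss: render once, store, and serve from the cache from now on.
    # (A real plugin would also invalidate the entry when the post is edited.)
    html = render_page(path)
    page_cache[path] = html
    return html

if __name__ == "__main__":
    for label in ("first request (rendered) ", "second request (cache hit)"):
        start = time.perf_counter()
        serve("/hello-world")
        print(f"{label}: {(time.perf_counter() - start) * 1000:.1f} ms")
```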