I feel like the SPA vs. SSR debate misses the point: SPAs are most often web applications (as opposed to informational websites). I built SPAs as a contractor for 10+ years, and they have always been B2B web apps for large corporations. The users are always professionals who work with the app almost daily.

Since .js, .css and other assets are immutable between releases (and modern tooling like Next.js appends content hashes to the filenames so they can be served with 'Cache-Control: immutable'), the app is always served from browser cache until there is a new release, which is usually weeks away, not days. And if the browser cache happens to be empty, you are comparing a 500 ms-1 s wait against an app you will then use for hours that day. If, however, every link click, every route change, every interaction triggers an SSR server roundtrip, the app will not feel snappy during usage.
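The caching scheme above can be sketched roughly like this (the filename regex and function name are my own illustration, not any framework's actual implementation): hashed assets are safe to mark immutable because a new release produces new filenames, while the HTML entry point must keep revalidating so users pick up new releases.

```typescript
// Assumption: the build tool embeds a hex content hash in asset filenames,
// e.g. "main.3f9a1c2b.js". HTML is left unhashed and must revalidate.
const HASHED_ASSET = /\.[0-9a-f]{8,}\.(js|css|woff2?|png|svg)$/i;

function cacheControlFor(path: string): string {
  return HASHED_ASSET.test(path)
    ? "public, max-age=31536000, immutable" // safe: a new release means new hashes
    : "no-cache"; // HTML entry point: revalidate on every load
}
```

With this split, a returning user's browser only revalidates the small HTML document; every hashed bundle is served straight from cache until the next release changes the hashes.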

Now, if people choose the wrong tool for the job and use a 1 MB SPA to serve a landing page, that is where things go wrong. But for me, metrics that include the download time of the .js/.css assets are pointless, as that cost is paid once and is negligible relative to the total time of app usage. After the initial load, the snappiness of your SPA will mostly depend on your database queries and API performance, which is also the case in an SSR solution. YMMV of course.

> if people choose the wrong tool for the job and use a 1 MB SPA to serve a landing page, that is where things go wrong

That is exactly the case. You can't really blame people when every learning resource, React evangelist, tweet and post points you towards that.

> If, however, every link click, every route change, every interaction triggers an SSR server roundtrip, the app will not feel snappy during usage

SPAs still make server round trips, often more of them, for API requests; we all know how endemic loading spinners have become. Rendering HTML does not meaningfully affect server response times. And frameworks like Datastar (used in this benchmark), htmx and Alpine.js let you avoid full page loads.
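For what it's worth, the partial-update idea those libraries implement declaratively can be sketched imperatively (all names here are hypothetical; a Map stands in for the DOM and a callback stands in for fetch): the server renders only an HTML fragment, and the client swaps it into one target element instead of reloading the whole page.

```typescript
// Stand-in for a network call that returns server-rendered HTML.
type FetchFragment = (url: string) => Promise<string>;

// Swap a server-rendered fragment into one "element" of the page.
// htmx expresses the same thing declaratively with hx-get/hx-swap attributes.
async function swapInto(
  page: Map<string, string>, // stand-in for the DOM: element id -> innerHTML
  targetId: string,
  url: string,
  fetchFragment: FetchFragment,
): Promise<void> {
  page.set(targetId, await fetchFragment(url));
}
```

One route change then costs a single small fragment response, while the rest of the page, and all cached assets, stay untouched.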

I have almost never encountered an SPA where anything happened without a very perceptible delay. My Windows 95 PC literally felt snappier, on a 100 MHz Pentium with 8 MiB of RAM...

Yeah, but the problem is that people don't just use a single web app all the time. We all browse many different websites, each with payloads they want us to download and run. So in practice we end up re-downloading bundles constantly, many of which contain the exact same libraries; but because they are bundled and minified per site, they can't be shared from cache, so we fetch them over and over again.

Don't believe me? Check this out: https://tonsky.me/blog/js-bloat/