Given how much front end has evolved over the last 15 years, is it really still the case that you should do SSR for SEO? It would be a strong indictment of Google and the other search engines if, all these years later, they still weren't properly crawling JS-based websites, wouldn't it?
Google does crawl client-side-generated content, but at a low priority and with mixed results. Rendering JS pages is expensive and slow, so I see where they're coming from.
I built a small router that can use Prerender for SEO compatibility on vanilla React projects. So there are some alternatives.
https://www.npmjs.com/package/routerino
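To give a feel for the approach: the whole trick is pairing each route with its own title and meta description, so a prerender pass can capture them as static HTML for crawlers. Here's a minimal sketch of that idea (my own illustration, not Routerino's actual API; check the README for the real props):

```tsx
import React from "react";
import { createRoot } from "react-dom/client";

// Illustrative only: a tiny router that pairs each route with the meta
// tags a prerender pass will snapshot into static HTML.
type Route = {
  path: string;
  element: React.ReactNode;
  title: string;
  description: string;
};

function TinyRouter({ routes }: { routes: Route[] }) {
  const route =
    routes.find((r) => r.path === window.location.pathname) ?? routes[0];

  React.useEffect(() => {
    // Set <title> and the meta description so the prerendered
    // snapshot of this route carries its own SEO tags.
    document.title = route.title;
    document
      .querySelector('meta[name="description"]')
      ?.setAttribute("content", route.description);
  }, [route]);

  return <>{route.element}</>;
}

const routes: Route[] = [
  { path: "/", element: <h1>Home</h1>, title: "Home", description: "Welcome home" },
  { path: "/tours", element: <h1>Tours</h1>, title: "Tours", description: "Browse our tours" },
];

createRoot(document.getElementById("root")!).render(<TinyRouter routes={routes} />);
```

Point Prerender (or any snapshot service) at a site like this and crawlers get fully populated HTML per route, even though the app itself is plain client-side React.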
hey, thanks for linking to routerino <3. i've thought about using something like this for my vanilla react projects and this seems perfect. do you know what the actual downsides are for the end-user/client compared to SSR (besides requiring client JS)?
the main downside as a developer is that, depending on your hosting setup, it can be a little more work to set up Prerender so Googlebot can view the page JS-free (although on Netlify it's just a checkbox to enable). I've been playing around with some ideas to eliminate that requirement, however.
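for anyone curious what that setup looks like outside of Netlify, the usual pattern is a small middleware that sniffs crawler user agents and proxies those requests through the prerender service. a minimal sketch with Node/Express (this assumes Prerender.io's proxy URL and X-Prerender-Token header, a placeholder origin, and Node 18+ for global fetch; other services differ):

```ts
import express from "express";
import path from "node:path";

const app = express();

// common crawler user agents; extend to taste
const BOT_UA =
  /googlebot|bingbot|yandexbot|duckduckbot|twitterbot|facebookexternalhit|linkedinbot/i;

const SITE = "https://example.com"; // placeholder: your canonical origin

app.use(async (req, res, next) => {
  const ua = req.headers["user-agent"] ?? "";
  // only reroute page navigations from bots, not JS/CSS/image requests
  if (BOT_UA.test(ua) && path.extname(req.path) === "") {
    const upstream = await fetch(
      `https://service.prerender.io/${SITE}${req.originalUrl}`,
      { headers: { "X-Prerender-Token": process.env.PRERENDER_TOKEN ?? "" } }
    );
    res.status(upstream.status).send(await upstream.text());
    return;
  }
  next();
});

// everyone else gets the normal client-rendered SPA shell
app.use(express.static("dist"));
app.use((_req, res) => res.sendFile(path.resolve("dist/index.html")));

app.listen(3000);
```

the same idea works as an nginx rule or a CDN edge function; the only real requirement is being able to branch on the user agent.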
as an end user, the only downsides would be the typical SPA ones, such as the client needing to render the site with JS like you mentioned (except where pre-rendering is used).
here's an example site built with Routerino if you'd like to see what it's like in use: https://virtourlimited.netlify.app/
Unfortunately, yes. While Google and many other search engines/crawlers can render JS, it's much faster to give them the rendered page, especially for very large sites.
But page-load optimization does typically bring better user engagement, so it's not just an SEO strategy.