Maybe it is because I'm a bit too deep in the React ecosystem at this point, but once you start trying to do any reasonably complex task, this seems like it would be significantly harder to reason about.
Also, unless I'm misunderstanding it, this revolves a lot around the backend returning HTML to do backend-as-frontend, which, given my previous experiences with the concept, I wouldn't want to touch with a 10-foot pole. When you consider users with extremely bad internet connections (there are still people on DSL/older satellite/2G), having to make more requests to the backend that return larger blobs of HTML (as opposed to fewer requests that return JSON when it is relevant) will result in a significantly more degraded user experience.
In my experience using React apps on 2G/3G… I would much prefer the HTML. With HTML you usually get the content in 1-2 seconds, whereas a React app can end up never loading at all. Why does it never load? Because enterprising engineers usually invent their own timeouts that make no sense when you are dealing with bytes per second. There’s already a one-minute timeout on the socket, and the socket knows whether it is still receiving data, whereas your application has no sense of progress.
Please don’t reinvent things.
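To make that concrete, here's a sketch (all names hypothetical) of the hand-rolled wall-clock timeout I mean, next to an idle timeout that at least distinguishes "slow" from "dead":

```ts
// Hand-rolled wall-clock timeout: fine on broadband, fatal on 2G,
// because it aborts even while bytes are still arriving.
async function fetchWithTimeout(url: string, ms = 10_000): Promise<Response> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    return await fetch(url, { signal: controller.signal });
  } finally {
    clearTimeout(timer);
  }
}

// If you must time out at all, reset the clock on every received chunk,
// so only a truly stalled connection gets killed.
async function readWithIdleTimeout(
  res: Response,
  idleMs = 60_000
): Promise<Uint8Array[]> {
  const reader = res.body!.getReader();
  const chunks: Uint8Array[] = [];
  while (true) {
    let timer!: ReturnType<typeof setTimeout>;
    const stalled = new Promise<never>((_, reject) => {
      timer = setTimeout(() => reject(new Error('connection stalled')), idleMs);
    });
    try {
      // Whichever settles first: the next chunk, or the idle deadline.
      const { done, value } = await Promise.race([reader.read(), stalled]);
      if (done) return chunks;
      chunks.push(value);
    } finally {
      clearTimeout(timer);
    }
  }
}
```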
The issue is that if you have to make more requests overall, there is a non-negligible RTT for each of those requests, which on slow connections adds up very fast. For rough numbers: at the ~600 ms round trips typical of 2G, five sequential requests spend about 3 s on latency alone, before any payload arrives.
Try my demo [1] on 3G slow. It works fine.
Turns out brotli compression does wonders.
[1] https://checkboxes.andersmurphy.com
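You can sanity-check the compression claim with Node's built-in brotli. The markup below is hypothetical, but it's the same kind of highly repetitive HTML the demo serves:

```ts
import { brotliCompressSync, constants } from 'node:zlib';

// Highly repetitive markup, like a grid of thousands of checkboxes.
const html = '<input type="checkbox" class="cell">'.repeat(10_000);

const compressed = brotliCompressSync(Buffer.from(html), {
  params: { [constants.BROTLI_PARAM_QUALITY]: 5 },
});

console.log(`raw: ${html.length} B, brotli: ${compressed.length} B`);
console.log(`saved: ${(100 * (1 - compressed.length / html.length)).toFixed(1)}%`);
```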
On real 3G, it seems to load in about 5 s. As good as Hacker News, IMHO.
Yup, for the initial load. But the subsequent interactions, which each require a network hop and send down the whole page, are lightning fast.
Because once that connection is primed you are off to the races.
"this revolves a lot around the backend returning HTML"
→ This is the way the Web used to work in the era of 56kbps modems (also, with ten levels of "<TABLE>" for layout).
Yeah, and it used to be normal to get up and grab a drink between page transitions.
If you are truly offline, then of course this won't work.
Though it is certainly possible to cache pages, since most sites do not need persistent state; it's just ephemeral things like toggling a button, linking an input box to a caption, etc.
You could also run the Datastar JS SDK from a service worker, and if you've synced the necessary state to browser storage, your "backend" could live there.
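As a rough sketch of the caching half (cache name hypothetical): a network-first service worker that falls back to the last HTML it saw, so previously visited pages still render offline:

```ts
/// <reference lib="webworker" />
declare const self: ServiceWorkerGlobalScope;

const CACHE = 'pages-v1';

self.addEventListener('fetch', (event) => {
  if (event.request.method !== 'GET') return;
  event.respondWith(
    fetch(event.request)
      .then((res) => {
        // Network first: stash a copy of the fresh HTML for offline use.
        const copy = res.clone();
        event.waitUntil(
          caches.open(CACHE).then((c) => c.put(event.request, copy))
        );
        return res;
      })
      // Offline: serve the last cached copy of this page, if any.
      .catch(async () => (await caches.match(event.request)) ?? Response.error())
  );
});
```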
Also, for slow connections, compression is very powerful, especially over an SSE stream where redundant info is being sent. Another comment here has links to some immense demos, and the compression ratios are north of 90%.
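The reason the ratios get so extreme on a stream is that the compressor's window is shared across events, so each patch is encoded against everything already sent. A minimal Node sketch (route and payloads hypothetical; the http/zlib APIs are real):

```ts
import { createServer } from 'node:http';
import { createBrotliCompress, constants } from 'node:zlib';

createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    // Real code should check the client's Accept-Encoding first.
    'Content-Encoding': 'br',
    'Cache-Control': 'no-store',
  });

  // One long-lived brotli context for the whole connection.
  const br = createBrotliCompress({
    params: { [constants.BROTLI_PARAM_QUALITY]: 5 },
  });
  br.pipe(res);

  // Successive HTML patches share most of their bytes, so after the
  // first event each one costs almost nothing on the wire.
  let n = 0;
  const timer = setInterval(() => {
    br.write(`event: patch\ndata: <div id="count">${n++}</div>\n\n`);
    // Flush so the event reaches the client immediately instead of
    // sitting in the compressor's buffer.
    br.flush(constants.BROTLI_OPERATION_FLUSH);
  }, 1_000);

  req.on('close', () => {
    clearInterval(timer);
    br.end();
  });
}).listen(3000);
```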
Also, slow internet is often associated with slow devices, which cannot handle the bloat of React, CSS-in-JS, etc.
Datastar VASTLY simplifies everything while sacrificing essentially nothing in the way of functionality.
"Frontend" is a huge field.
Some people write their own websites where they blog (where WordPress would be enough), some build shops with mostly static content that has to load fast but has no real interactivity, others build full-fledged software like Figma or Discord, and for the true masters, the DOM is a prison: only <canvas> coupled with GPU-accelerated computation will do.
Obviously htmx and its siblings are nice for blogs, docs, and shops. You don't build a "software"-level website with them.