I agree with what you're saying, of course, simplicity is better, etc.
But the nav on your blog is a terrible example.
Firstly, you don't get to just click on the links to go to where you want to go, you first have to click the three-lines button, even on a desktop with an enormous screen.
And secondly, despite your claims about an "enhanced experience with a modern browser", it seems to work exactly as if there were no enhancement at all. I click the three-lines menu and it takes me to a new page listing the links I can click. The "X" button to "close" the menu does navigate me back noticeably quickly, but that is the only unusual behaviour I can detect.
I'm using Firefox 136 on Ubuntu.
And in any event, this is all unnecessary, because you can make a nav by just putting a bunch of links at the top of the page, like HN does.
Wow, I love it when people discover these NEW ways of doing things.
Not a fan of the nav, but loved the transitions. Applied them to my website (bespoke C++ static generator + template library): https://vittorioromeo.com/
I've been trying to steer LLMs towards the Astro framework due to this - the island concept seems like a perfect fit
https://docs.astro.build/en/concepts/islands/
Isn't this just HATEOAS as espoused by libraries like htmx, datastar, hotwire etc.
https://htmx.org/essays/hateoas/
When I first saw the title, I thought "great! Someone else has (re)discovered hypermedia".
But they have gone in a completely bewildering direction. Rather than swap/morph html fragments in, they're doing full page navigations and using view transitions to make it look smooth.
Worse, they are manipulating the history to cover these blasphemous tracks.
Datastar would make this particularly simple - just include whatever menu/nav stuff that you need and show/hide/toggle it with a few signals/attributes.
No, they're not related. The submission's article is about CSS view transitions. HATEOAS seems to involve interacting with an API, but linking between pages on a blog does not require this.
As the article in the comment you are replying to makes very clear, this couldn't be more wrong
HATEOAS does not require interacting with an API.
I have a question: after clicking on a post on the listing page ("Collective Speed is..."), the page navigated to that particular post. What CSS transitions are used to convert that title into a header? I saw an animation that pushed the title up to become the header. How does that work? I'm curious.
That'd be the View Transition API. Specifically, it is a cross-document view transition. https://developer.chrome.com/docs/web-platform/view-transiti...
Really awesome new(ish) feature of the web platform. Unfortunately it doesn't yet have full support in Firefox, but it is supported in Chrome and Safari.
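For the cross-document case described here, both pages opt in with an `@view-transition` rule, and the title-to-header morph comes from giving both elements the same `view-transition-name`. A minimal sketch (the selector names are hypothetical, not taken from the site in question):

```css
/* Both documents must opt in for a cross-document transition. */
@view-transition {
  navigation: auto;
}

/* On the listing page: the title of the clicked post. */
.post-list .active-post-title {
  view-transition-name: post-title;
}

/* On the article page: the header the title morphs into. */
.article-header h1 {
  view-transition-name: post-title;
}
```

Note that each `view-transition-name` must be unique within a page, so a listing with many posts needs a distinct name per post (or a name applied dynamically to the clicked item).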
This is close to how things used to be, in the time before server-side includes.
isn't this just the old-school way of making a static website? it's nice and chill and peaceful
Would love to hear anecdata from others but I'd say...not really? I was a kid in those days but there's no way I'd make a server round trip for /menu/ to open a menu.
the go-to method is using frames.
OK...and what does that look like on a desktop browser?
Because if I click on a menu button on a desktop browser, I generally don't expect it to take over the entire page with a menu.
This seems like an example of unhelpfully mobile-centric website design, which has been becoming more prevalent in recent years.
I just tried it on their website, using a desktop browser, and the experience is absolutely fine: you get the menu as in any web app, and you can close it to go back, etc. Just an old-school page that is blazing fast ... because it is an old-school page. It renders faster than the typical animation used to open a sidebar.
I agree, but it's not intrinsic to the approach of less JS and more pages.
You should of course not have a menu button on a desktop view. There is plenty of space to show the menu without hiding it behind a button.
Maybe it is you who are mobile centric?
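The usual pattern for that is a media query: show the nav inline by default and collapse it behind a button only on narrow viewports. A sketch with hypothetical class names:

```css
/* Wide screens: full nav visible, no hamburger button. */
.menu-button { display: none; }
.site-nav    { display: flex; }

/* Narrow screens: hide the inline nav behind the menu button. */
@media (max-width: 40rem) {
  .menu-button { display: block; }
  .site-nav    { display: none; }  /* revealed when the button is activated */
}
```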
I dunno; it wants to challenge our dependence on JavaScript, and then, to make it work, it needs to inject a "back" behavior into a normal link?
JS and fallbacks for menus are a solved issue. This is just another form of LLM Dunning-Kruger derangement, where you think the LLM-suggested solution is novel because you haven't encountered it before, or because you fundamentally don't understand the underlying problems that we have already solved.
Yeah, I don't think the menu should've been a separate page. It can be made JavaScript-less as a dialog opened by the popover HTML attributes,[0] and the escape key would be able to close it.
[0]: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...
I'm not at all against using some JS for the menu, but a native dialog or popover is definitely a nice idea here.
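A sketch of that idea, with hypothetical link targets: the `popover` attribute plus `popovertarget` gives a toggleable, light-dismissable menu (including Esc to close) with no script at all:

```html
<!-- The button toggles the popover; Esc or clicking outside closes it. -->
<button popovertarget="site-menu">Menu</button>

<nav id="site-menu" popover>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/posts/">Posts</a></li>
    <li><a href="/about/">About</a></li>
  </ul>
</nav>
```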
I guess it doesn't have to use JavaScript for the back behavior. It could use a server-side rendered referrer if that hasn't been stripped by the browser?
You say that JavaScript and fallbacks for menus are a solved issue, but the number of menus that are an absolute clusterfuck on the web today is ridiculous. They're really not a solved issue. Progressive enhancement is hard to do; genuinely hard in some cases.
On balance, while this is not without flaws, it's interesting. Accessibility, deep linking, reduction in cognitive load for the developer. There's some merit here.
I'm unsure why you think this was an LLM-suggested solution.
...because the opening line of the blog post says he's been "building websites with LLMs", and then attempts to cutely redefine that abbreviation as "Lots of Little htMl pages" in a parenthetical.
It's, um. Not the best kind of communication, and very easily leads to this kind of misunderstanding.
Why do you think this is LLM?
[dead]
I just don't see the appeal when it's much easier to just build a nice website using JavaScript.
Google Search doesn't work without JavaScript.
Seriously, what's the point? Don't just reflexively downvote me. Try to articulate why this is a good idea. It's not that hard to use your words.
Your pre-emptive hostility seems rather unwarranted.
This article is my usual go-to and lists several reasons why JavaScript might not be available, and thus why you shouldn't take it for granted: https://piccalil.li/blog/a-handful-of-reasons-javascript-won...
I feel compelled to add:
- the user explicitly disabled JavaScript
- the browser does not support JavaScript (I sometimes view websites using elinks)
AFAIK screen readers also work better without JavaScript, so it's also an accessibility issue.
JavaScript doesn't affect screen readers at all unless you dynamically add content without the proper ARIA roles, and that is trivial to correct.
As I just said, users who explicitly disable JavaScript cannot even use Google Search. Why should I accommodate those users when even Google refuses to do so? They are actively choosing to have a limited web experience. The vast majority of the internet is completely broken for them.
How am I being hostile? I'm just tired of being downvoted every single time I mention that JavaScript is necessary on the modern web, and attempts to avoid it are quixotic at best.
That link is not nearly as convincing as you seem to think it is. I suppose that I will need to refute the points if I want you to stop sharing it, so here we go:
A browser extension has interfered with the site - okay? That can be true of literally anything. An extension can interfere with View Transitions too.
A spotty connection hasn’t loaded the dependencies correctly - Either they load or they don't. How would the dependencies load "incorrectly"? Does this author know how JavaScript works?
Internal IT policy has blocked dependencies - How? Are they bundled? Does this author still think modern websites load things like jQuery from a CDN? What year is it? (WYII from this point on, for the sake of brevity)
WIFI network has blocked certain CDNs - WYII
A user is viewing your site on a train which has just gone into a tunnel - The CSS and HTML won't load either!
A device doesn’t have enough memory available - WYII???
There’s an error in your JavaScript - and, you don't have any tests? You didn't notice when developing the site? Can you not have errors in your CSS? Sure, an error in JS is worse, but that doesn't mean you should never use it.
An async fetch request wasn’t fenced off in a try catch and has failed - This usually wouldn't change anything. fetch failures are rarely actual errors (even a 500 response doesn't result in an exception), and it's async so it wouldn't affect the initial load.
A user has a JavaScript toggle accidentally turned off - The <noscript> tag exists.
A user uses a JavaScript toggle to prevent ads loading - <noscript>
An ad blocker has blocked your JavaScript from loading - Modern ad blockers are URL based. How are they loading literally anything else from my domain?
A user is using Opera Mini - No, they aren't.
A user has data saving turned on - Okay... And!??!?!
Rogue, interfering scripts have been added by Google Tag manager - Do I really need to explain how module scoping works here?
The browser has locked up trying to parse your JS bundle - This literally doesn't happen.
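On the fetch point above specifically, this is easy to demonstrate: `fetch()` rejects only on network-level failures, while an HTTP error such as a 500 resolves normally with `response.ok === false`. A minimal sketch (the `loadMenu` helper and URLs are hypothetical):

```javascript
// fetch() only rejects on network-level failures (DNS, tunnel, bad URL).
// An HTTP 500 still resolves: the promise fulfils with res.ok === false.
async function loadMenu(url) {
  try {
    const res = await fetch(url);
    if (!res.ok) return null;   // HTTP error status: degrade gracefully
    return await res.text();
  } catch (err) {
    return null;                // network-level failure lands here
  }
}
```

Because the request is async, neither branch blocks the initial page render; the menu simply stays in its server-rendered fallback state.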
>A spotty connection hasn’t loaded the dependencies correctly - Either they load or they don't. How would the dependencies load "incorrectly"?
Let's say you have 5-7 dependencies to load, but 3 of them timed out because your train entered the tunnel. Your app ends up in an incorrect state, fails silently, and the UX degrades unpredictably. This is where conversion often drops visibly, and the reason SSR is now a go-to solution for any marketing website.
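The failure mode described above, where some dependencies arrive and others time out, can at least be made explicit rather than silent. A sketch using `Promise.allSettled`, with a hypothetical `boot` helper and dependency list:

```javascript
// Load several independent dependencies; report partial failure
// explicitly instead of leaving the app half-initialized and silent.
async function boot(urls) {
  const results = await Promise.allSettled(urls.map((u) => fetch(u)));
  const failed = results.filter((r) => r.status === 'rejected').length;
  if (failed > 0) {
    // Surface the degraded state (banner, retry, etc.) rather than fail silently.
    console.warn(`${failed} of ${urls.length} dependencies failed to load`);
  }
  return results;
}
```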
A normal person would immediately think "dang, page didn't load before I entered the tunnel. Guess I'll wait til I'm out again and refresh".
And if they're deliberately going somewhere with no signal for an extended period of time, and really want it to work, they'll make sure they've loaded everything before doing so.
And I say this as someone who is developing a PWA for people with low-end phones and very inconsistent connections. I'm very cognizant of and empathetic to their situation.
Why am I loading dependencies from 5-7 places? Why is my website not using a bundler if it has so many varied dependencies? Why do we not expect the user to understand that they are in a tunnel without internet?
Regardless, this isn't really restricted to the usage of JavaScript. The website would likely have pretty bad UX if only half of the CSS loaded correctly, but no one programs defensively around it being absent.
Have you ever developed an enterprise-scale frontend application optimized for conversion targets? It feels like you have not. You may ship your own code in a bundle, yes, but all the integrations come on top of that: the chatbot, the tracker, the A/B testing logic, etc. are all loaded separately from your service provider's CDN.
A user opening a web page is not expecting a full-blown app with multi-second loading times. If that happens, they bounce, and you lose revenue. The web is supposed to have a very short time to first contentful paint and a very short time to interactive; the shorter the better, and less than 0.5 s is the goal. It can deliver that, if built properly. Many SPAs and bulky JS apps are built this way for developer convenience, not for end users. The only real use case for an SPA is when you deal with a lot of local data: a spreadsheet, a document or image editor, a diagram tool (though WASM is probably a better choice there).
You may say you are not building an enterprise-grade frontend. But if you are small enough, you don't need an SPA either.
Don't worry, it's only in the HN bubble that adding a 500ms round trip to open a damn menu is acceptable and commended.
> I'm just tired of being downvoted every single time I mention that JavaScript is necessary on the modern web
Downvotes should give you a hint: the few users who know what JavaScript is don't like it, and the rest, if they learned, most likely wouldn't like it either. Your attitude shows that you don't care.
Lots of important points have already been posted, from reliability and accessibility to SEO. To add a personal reason: speed. I hate slow computer things. If I open a page that I don't have to use for work or similar (unlike horror shows such as Google Ads or LinkedIn), and loading the page takes longer than a second, I just bounce.
I built and maintain a static site for a company, and it's just wonderful. It opens instantly (which the search engines like), rebuilding and deploying are trivial, and I am not forced to update every week because the JS framework has a new vuln. (There are a few dynamic, JS-based parts that load lazily and fail gracefully.)
> when it's much easier to just build a nice website using JavaScript
I'm currently building a web-based tool that uses dynamic forms for the UI, without the help of a framework (yeah, I know; I have reasons). This is the result: https://github.com/KaliedaRik/sc-filter-builder/blob/main/js...
It's not "easier" using JavaScript; raw JavaScript websites are a nightmare to build, maintain, and reason about. It is "easier" with JavaScript plus the current favourite framework of the day.
Also: accessibility, SEO, the all-new Generative Engine Optimization (GEO) thing, etc.
i'd say it really depends on the website you're building and what you define as "using javascript"
are you comparing fully client rendering vs SSR without client JS at all? are you building a marketing site or a dashboard type tool?
there is subtlety to why each decision would be made, based on these.
i think HN tends to demonize JS because we're all well aware of how bad the state of a lot of web apps has become, with developers only knowing React and having no background in web fundamentals or accessibility.
i see no reason to completely shun JS, just use it intentionally and not reflexively.
Why wouldn't google work without javascript? Just use some forms, submit the text the user entered, get back a page with the search results...
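That flow needs nothing beyond a plain HTML form; the browser serializes the input into the query string on its own. A sketch with a hypothetical endpoint:

```html
<!-- Submits a GET request to /search?q=... with zero JavaScript. -->
<form action="/search" method="get">
  <label for="q">Search</label>
  <input type="search" id="q" name="q" required>
  <button type="submit">Search</button>
</form>
```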
OP is right. A quick Google (hehe) shows they require it as of 2024.
But you don't need JS to make a search engine work. I'm pretty sure I've used links2 in the past to Google things just fine. Google Search would work fine without JS; Google just wants to track people.
Google is an advertising company. Not a search engine company.