I don't think this article actually does a great job of explaining why Nextcloud feels slow. It shows lots of big numbers for MBs of JavaScript being downloaded, but how does that actually impact the user experience? Is the "slow" Nextcloud just sitting around waiting for these JS assets to load and parse?

In my experience, that isn't what meaningfully hurts performance. The real problems come from "accidentally quadratic" logic in the frontend, poorly optimised UI updates, and too many API calls.
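To illustrate what I mean by "accidentally quadratic" — a made-up sketch, not actual Nextcloud code, but it's the shape of thing I keep finding:

    // Hypothetical: an O(files × shares) lookup hidden in a render loop.
    interface FileEntry { id: string; name: string; }
    interface Share { fileId: string; sharee: string; }

    function renderSlow(files: FileEntry[], shares: Share[]): string[] {
      // Every file re-scans the whole shares array: O(files × shares).
      return files.map((f) => {
        const share = shares.find((s) => s.fileId === f.id);
        return share ? `${f.name} (shared with ${share.sharee})` : f.name;
      });
    }

    function renderFast(files: FileEntry[], shares: Share[]): string[] {
      // Build the index once; each lookup is then O(1): O(files + shares).
      const byFile = new Map<string, Share>();
      for (const s of shares) byFile.set(s.fileId, s);
      return files.map((f) => {
        const share = byFile.get(f.id);
        return share ? `${f.name} (shared with ${share.sharee})` : f.name;
      });
    }

Both versions look fine in review and behave identically on a ten-file demo account; the first one only falls over once someone has a few thousand files.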

It downloads a lot of JavaScript, it decompresses a lot of JavaScript, it parses a lot of JavaScript, it runs a lot of JavaScript, it creates a gazillion onFoundMyNavel event callbacks which all run JavaScript, it does all manner of uncontrolled DOM-touching while its millions of script fragments do their thing, it xhr’s in response to xhrs in response to DOM content ready events, it throws and swallows untold exceptions, it has several dozen slightly unoptimized (but not too terrible) page traversals, … the list goes on and on. The point is this all adds up, and having 15MB of code gives a LOT of opportunity for all this to happen. I used to work on a large site where we would break out the stopwatch and paring knife if the homepage got to more than 200KB of code, because it meant we were getting sloppy.
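If you want to check whether all of that activity is actually what the user is waiting on, the Long Tasks API gives a rough answer. A sketch — note the "longtask" entry type is currently Chromium-only, and it has to be registered as early in the page load as possible:

    // Count main-thread stalls of more than 50ms caused by script work.
    let blockedMs = 0;
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        blockedMs += entry.duration;
        console.log(`long task: ${Math.round(entry.duration)}ms at ${Math.round(entry.startTime)}ms`);
      }
    }).observe({ type: "longtask", buffered: true });

    // After the page settles, see how much of the wait was script, not network.
    setTimeout(() => console.log(`~${Math.round(blockedMs)}ms of main-thread blocking so far`), 10000);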

15+ megabytes of executable code begins to look quite insane when you start to take a gander at many AAA games. You can produce a non-trivial Unity WebGL build that fits in <10 megabytes.

It’s the kind of code size where you analyze it and find 13 different versions of jQuery and a hundred different bespoke console.log wrappers.

Yes, and Windows 3.11 came on six 1.44 MB floppy disks. Modern software is so offensive.

Windows 3.11 also wasn’t shipped to you over a cellular connection when you clicked on it. If it had been, 6×1.44 MB would have been considered quite unacceptable.

But at least they’re not prematurely optimizing

Agreed. Plus if it truly downloads all of that every time, something has gone wrong with caching.

Overeager warming/precomputation of resources on page load (rather than on use) can be a culprit as well.
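Roughly the difference between these two patterns — hypothetical names throughout, not real Nextcloud APIs:

    // Hypothetical names — placeholders for whatever gets warmed on load.
    type SearchIndex = Map<string, string[]>;
    declare function fetchAllFileMetadata(): Promise<string[]>;
    declare function buildSearchIndex(files: string[]): SearchIndex;

    // Eager: every page load pays for the index, even if search is never used.
    //   const searchIndex = buildSearchIndex(await fetchAllFileMetadata());

    // Lazy: the same cost moves to the first actual use.
    let searchIndexPromise: Promise<SearchIndex> | null = null;
    function getSearchIndex(): Promise<SearchIndex> {
      if (!searchIndexPromise) {
        searchIndexPromise = fetchAllFileMetadata().then(buildSearchIndex);
      }
      return searchIndexPromise;
    }

    // Same idea for the code itself: dynamic import() keeps a feature's JS out
    // of the initial bundle until someone actually opens that feature.
    async function openPhotoViewer() {
      const { PhotoViewer } = await import("./photo-viewer"); // hypothetical module
      new PhotoViewer().show();
    }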

Relying on cache to cover up a 15MB JavaScript load is a serious crutch.

Oh totally, but normal caching behavior would lead to different results than the article reports: it would hurt cold-start scenarios, not every page load. So something else is up.
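This is easy enough to verify with the Resource Timing API — a rough sketch; note transferSize is only exposed for same-origin resources or ones served with a Timing-Allow-Origin header:

    // Which scripts were actually re-downloaded vs. served from cache?
    // A transferSize of 0 generally means a cache hit; a small non-zero value
    // is usually a 304 revalidation (headers only).
    const scripts = (performance.getEntriesByType("resource") as PerformanceResourceTiming[])
      .filter((e) => e.initiatorType === "script");

    const overTheWire = scripts.reduce((sum, e) => sum + e.transferSize, 0);
    const fromCache = scripts.filter((e) => e.transferSize === 0).length;

    console.log(`${scripts.length} scripts, ${fromCache} from cache, ` +
                `${(overTheWire / 1024 / 1024).toFixed(1)} MB over the wire`);

If that shows ~15 MB over the wire on a warm reload, the cache headers (or a cache-busting query string) are the thing to look at, not just the bundle size.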