For reference, 20 MB is three hundred and thirteen Commodore 64s.

The complete Doom 2, including all graphics, maps, music and sound effects, shipped on 4 floppies, totalling 5.76MB.

The original Doom 2 rendered 64,000 pixels (320x200). 4K UHD monitors now show 8.3 million pixels.

YMMV.
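
A quick back-of-the-envelope check of those figures, as a TypeScript sketch (assuming the Commodore 64's 64 kB of RAM read as 64,000 bytes, and four 1.44 MB floppies):

    // Rough arithmetic behind the comparison above.
    const nextcloudPayload = 20_000_000;       // ~20 MB of JavaScript
    const c64Memory = 64_000;                  // Commodore 64 RAM, taken as 64 kB
    const floppyTotal = 4 * 1_440_000;         // four 1.44 MB floppies
    const doomPixels = 320 * 200;              // 64,000 pixels
    const uhdPixels = 3840 * 2160;             // ~8.3 million pixels

    console.log(nextcloudPayload / c64Memory); // ~313 Commodore 64s
    console.log(floppyTotal / 1_000_000);      // 5.76 MB
    console.log(uhdPixels / doomPixels);       // ~130x more pixels than Doom 2 rendered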

Of course, Doom 2 is full of Carmack shenanigans to squeeze every possible ounce of performance out of every byte, written in hand-optimized C and assembly. Nextcloud is delivered as UTF-8 text in a high-level scripting language, entirely unoptimized, with lots of low-hanging fruit for improvement.

Yes, but why isn't it optimised? Not as extreme as Doom had to be, but at least a bit better? Especially the low-hanging fruit.

This is why I think there's another version for customers who are paying for it, with tuning, optimization, whatever.
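
For a concrete sense of what that low-hanging fruit can look like, here is a minimal sketch of one common technique, code-splitting with a dynamic import, so a heavy feature is only fetched when it is actually used. The module and element names are made up for illustration and are not taken from Nextcloud's codebase:

    // Hypothetical example: load the calendar UI on demand instead of shipping
    // it in the initial bundle. Bundlers such as webpack or Vite split the
    // dynamically imported module into its own chunk.
    async function openCalendar(container: HTMLElement): Promise<void> {
      const { renderCalendar } = await import("./calendar"); // fetched only now
      renderCalendar(container);
    }

    document.getElementById("open-calendar")?.addEventListener("click", () => {
      const app = document.getElementById("app");
      if (app) void openCalendar(app);
    });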

Sure, but I doubt there is more image data in the delivered Nextcloud payload than in Doom 2; games famously need textures, while a website usually needs mostly vector and CSS-based graphics.

Actually, Carmack did squeeze every possible ounce of performance out of DOOM; however, that does not always mean he was optimizing for size. If you want to see a project optimized for size, you might check out ".kkrieger" from ".theprodukkt", which accomplishes a 3D shooter in 97,280 bytes.

You know how many characters 20 MB of UTF-8 text is, right? If we are talking about JavaScript, it's probably mostly ASCII, so quite close to 20 million characters. With a wild estimate of 80 characters per line, that would be 250,000 lines of code.
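
The same estimate as a tiny sketch (assumptions as in the comment above: mostly ASCII, so roughly one byte per character, and 80 characters per line):

    const payloadBytes = 20_000_000;  // ~20 MB of JavaScript
    const bytesPerChar = 1;           // ASCII characters take one byte in UTF-8
    const charsPerLine = 80;          // wild guess, as above

    const characters = payloadBytes / bytesPerChar; // ~20 million characters
    const lines = characters / charsPerLine;        // 250,000 lines
    console.log(characters, lines);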

I personally think 20MB is outrageous for any website, webapp or similar. Especially if you want to offer a product to a wide range of devices on a lot of different networks. Reloading a huge chunk of that on every page load feels like bad design.

Developers usually take the modern convenience of a good network connection for granted. Imagine using this on a slow connection: it would be horrid. Even in western "first world" countries there are still quite a few people connecting with outdated hardware or over slow connections, and we often forget them.

If you are making any sort of webapp, you ideally have to think about every byte you send to your customer.
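
One illustration of the "don't re-send it on every page load" point: static bundles with content-hashed filenames can be marked immutable so the browser never re-downloads them. A minimal Node sketch under that assumption; the paths and port are hypothetical:

    import http from "node:http";

    // Serve content-hashed bundles with long-lived caching so repeat page
    // loads don't re-download the same payload.
    http.createServer((req, res) => {
      if (req.url?.startsWith("/assets/")) {
        // Hashed filenames (e.g. app.3f2a9c.js) never change in place, so the
        // browser can cache them for a year without revalidating.
        res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
      } else {
        // The HTML shell should always revalidate so it can point at new hashes.
        res.setHeader("Cache-Control", "no-cache");
      }
      res.end(); // actual file serving omitted in this sketch
    }).listen(8080);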

I mean, if you’re going to include Carmack’s relentless optimizer mindset in the description, I feel like your description of the NextCloud situation should probably end with “and written by people who think shipping 15MB of JavaScript per page is reasonable.”

You know apps don't store pixels, right? So why are you counting pixels?

A single picture that looks decent on a modern screen, taken from a modern camera, can easily be larger than the original Doom 2 binary.

You don't need pictures for a CRUD app. It should all be vector graphics in any case.

The article suggests that it takes 14MB of JavaScript to do just the calendar. I doubt that all of my calendar events for 2025 add up to 14MB.
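
A rough, assumption-heavy comparison of the data involved (the event count and per-event size below are guesses, not measurements):

    // How much does a year of calendar *data* plausibly weigh, versus ~14 MB of UI code?
    const eventsPerYear = 1_000;   // a fairly busy calendar
    const bytesPerEvent = 1_000;   // ~1 kB: title, times, attendees, description

    const dataBytes = eventsPerYear * bytesPerEvent;
    console.log(dataBytes / 1_000_000); // ~1 MB of events vs ~14 MB of JavaScript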

Sure, but what people leave out is that it’s mostly C and assembly. That just isn’t realistic anymore if you want a better developer experience that leads to faster feature rollout, better security, and better stability.

This is like when people reminisce about the performance of Windows 95 and its apps while forgetting about getting a blue screen of death every other hour.

Exactly. JavaScript is a higher-level language with a lot of required functionality built in. Compared to C, you would (for most tasks) need to write way less actual code in JavaScript to achieve the same result, for example graphics or maths routines. Therefore it's crazy that it's that big.
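
As a small illustration of that "batteries included" point: fetching JSON and summing a field is a few lines in the browser, where a C version would typically pull in an HTTP client and a JSON parser and manage memory by hand. The URL and field name here are placeholders:

    // Hypothetical endpoint and data shape, purely for illustration.
    async function totalMinutes(url: string): Promise<number> {
      const response = await fetch(url);                            // built-in HTTP client
      const events: { minutes: number }[] = await response.json();  // built-in JSON parsing
      return events.reduce((sum, e) => sum + e.minutes, 0);         // built-in array helpers
    }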

I think it's a double-edged sword of Open-Source/FLOSS... some problems are hard and take a lot of effort. One example I consistently point to is core component libraries... React has MUI and Mantine, and I'm not familiar with any open-source alternatives that come close. As a developer, if there were one for Leptos/Yew/Dioxus, I'd have likely jumped ship to Rust+WASM. They're all fast enough, with different advantages and disadvantages.

All said... I actually like TypeScript and React fine for teams of developers... I think NextCloud likely has coordination issues that go beyond the language or even the libraries used.

Windows 2000 was quite snappy on my Pentium 150, and pretty rock solid. It was when I stopped being good at fixing computers because it just worked, so I didn't get much practice.

I did get a BSOD from a few software packages in Win2k, but they were fewer and much farther between than on Win9x/Me... I didn't bump to XP until after SP3 came out... I also liked Win7 a lot. I haven't liked much of Windows since 7, though.

Currently using Pop + Cosmic.

Win2000 is in the same class as Win95 despite being slightly more stable. It still locked up and crashed more frequently than modern software.

Then you did something special. For me, Win2k was at least three orders of magnitude more stable, and based on my buddies' experience, that was not exceptional.