20 years ago, grandpa could go to limewire.com, download setup.exe, and click next -> next -> next to install a fully functional file-hosting server+client. It was so easy that one third of the world's computers had LimeWire installed in 2007 [1]. ONE FUCKING THIRD!

Today, to install even the simplest self-hosted software, you effectively have to be a professional software engineer: use SSH, use Docker, use Tailscale, understand TLS and generate certificates, perform maintenance updates, check backups, and a million other things that are automatable.

No idea why self-hosted software isn't `apt-get install` and forget, just like LimeWire. But that's the reason no one self-hosts.

[1] https://en.wikipedia.org/wiki/LimeWire

> No idea why self-hosted software isn't `apt-get install` and forget.

Some of it is. But as soon as you want your services to be accessible from the Internet, you need to have a domain name and HTTPS. To run Limewire or a BitTorrent client, you don't need a domain name yourself because you use a central server (in the case of BitTorrent, a tracker) to help you discover peers.

All the popular domain name services and certificate issuers have APIs. All grandpa has to do is go online and buy a domain, which is a very reasonable step; grandpa, after all, buys stuff online. After that, the self-hosted app should be able to leverage those APIs to configure all the settings.
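As a rough sketch of what that API-driven setup could look like today if you did it by hand (the Cloudflare-managed domain and the certbot DNS plugin are assumptions for illustration, not the parent's setup):

```
# assumes the domain's DNS is hosted at a provider with a certbot plugin (Cloudflare here)
sudo apt-get install certbot python3-certbot-dns-cloudflare

# API token from the DNS provider's dashboard (placeholder value)
cat > ~/cloudflare.ini <<'EOF'
dns_cloudflare_api_token = <token>
EOF
chmod 600 ~/cloudflare.ini

# obtain a certificate via the DNS-01 challenge; certbot's own timer handles renewals
sudo certbot certonly --dns-cloudflare \
  --dns-cloudflare-credentials ~/cloudflare.ini \
  -d home.example.com
```

A self-hosted app could run the equivalent of this for grandpa behind a single "set up my domain" button.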

That's because LimeWire is a client and not a server. If you wanted decent share ratios, you needed to update your firewall to allow the correct inbound ports (or leave UPnP on, which is a bad idea).

A self-hosted server is an entirely different beast. You're right, it's not easy to set up and run -- but that's the world we live in. Malicious actors have ruined something that could have been relatively easy and automated to set up and run; even the most experienced of us wouldn't stand a chance against professional penetration testers or nation states.

I don't think many people would consider limewire to be "self-hosting". That is just installing a program.

That's OK, because "self-hosting" is not a goal in itself; it's a means to an end.

> No idea why self-hosted software isn't `apt-get install` and forget.

Ubuntu tried to fix this with snaps, but the whole Linux community raged and pushed back at them. Yeah, snap has its faults, but it was designed initially for server-side apps.

`snap install xyz-selfhosted-app` was the initial goal. You can install Nextcloud as a snap right now.
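For reference, the happy path really is a one-liner on a machine with snapd (with HTTPS as a follow-up command, if I remember the snap's helpers right):

```
sudo snap install nextcloud
# the snap bundles a Let's Encrypt helper, along the lines of:
sudo nextcloud.enable-https lets-encrypt
```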

Instead the Linux community let perfect be the enemy of good and successfully convinced everyone else to dump and avoid snaps as a format at all costs.

I don't recall any of that narrative being why people didn't like snaps.

One of the early sticking points was switching Firefox from deb to snap. That doesn't fit into your characterization.

Right, it got a bad rap on the desktop, which tarnished its reputation as a packaging format entirely.

Isn't Ubuntu primarily a desktop distribution?

The numbers might favor server installs (no idea), but it seems like the decisions must be driven primarily by desktop (i.e., a server admin or business that installs a thousand Ubuntu instances still represents just a single decision).

Either way, if Canonical's goals for snaps included easing people into self-hosting their services, surely making the experience pleasant on desktop would be a priority?

I don't recall any positive changes brought by snaps. I was looking at it through a desktop lens at the time, but my general perspective is mostly server-side, so I might be biased in that direction.

I don't think the two perspectives are necessarily in conflict, but noted just for framing... :)

Nextcloud snap is really easy to install, and has been solid. Zero maintenance.

Fully agreed that any rough edges/onboarding can be solved (with a lot of work, care, etc.).

I just have one main question: what would you like to self-host? Limewire was about file sharing, so the "value proposition" was clearly-ish defined. The "what does Limewire do" was clear.

Are you interested in hosting your own web site? Or email? Or google cal/drive/photos-equivalent? Some of it, all of it?

I'm genuinely curious, and also would love to know: is this a 80% of people want X (self-hosted file storage? web serving?), and then there's a very long tail of other services? Does everyone want a different thing? Or are needs power-law distributed? Cheers

Self-hosting involves three steps in my life:

1) Find the docker compose file.
2) Change the ports line to bind to a specific address, 10.0.10.1:9000, instead of the default 0.0.0.0:9000 (see the sketch below).
3) Connect via WireGuard.

(This answers the "security" point a sister comment brought up, too.)
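A minimal sketch of steps 1 and 2, using a placeholder app/image name and port 9000; the only change from a typical upstream compose file is the bind address:

```
cat > docker-compose.yml <<'EOF'
services:
  someapp:                      # placeholder service and image
    image: someapp/someapp:latest
    ports:
      - "10.0.10.1:9000:9000"   # bind to the WireGuard IP instead of the usual "9000:9000" (= 0.0.0.0)
EOF
docker compose up -d
```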

Not to mention the 100 steps you have to do to get there, of course...

Yes, of course. The first time you try to get WireGuard working, you will not get it to ping the other side right away. It is a process. The next few times it'll be much quicker. Then it will keep running forever. Or maybe mine isn't working and I never noticed.

I had this WireGuard setup in place long before I even ran my first Docker container. It's all building on top of things that are already there.
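For anyone curious what "the process" roughly looks like, a bare-bones server-side sketch (keys, addresses, and port are placeholders; the client needs the mirror-image config):

```
sudo apt-get install wireguard
wg genkey | sudo tee /etc/wireguard/server.key | wg pubkey | sudo tee /etc/wireguard/server.pub

sudo tee /etc/wireguard/wg0.conf >/dev/null <<'EOF'
[Interface]
Address    = 10.0.10.1/24
ListenPort = 51820
PrivateKey = <contents of server.key>

[Peer]                            # one block per client
PublicKey  = <client public key>
AllowedIPs = 10.0.10.2/32
EOF

sudo wg-quick up wg0              # then try to ping 10.0.10.2 from the client
```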

It's really frustrating how every single compose file publishes on 0.0.0.0 instead of 127.0.0.1.
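It's worth checking what your containers actually publish; one quick way, assuming the standard Docker CLI:

```
docker ps --format '{{.Names}}\t{{.Ports}}'   # look for 0.0.0.0:... entries
ss -tlnp | grep -v 127.0.0.1                  # listening sockets on the host, minus loopback
```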

and goodbye security...

Yes. It's implicit. Goodbye security indeed.

XAMPP seems to still be alive and maintained.

https://www.apachefriends.org/

I haven't used it in over a decade, but I'm glad to see it's still kicking.

Not as click-click-click, but still awesome: copyparty.

>Today, to install even the simplest self-hosted software, one has to be effectively a professional software engineer.

I'm a regular engineer (non-software); my coding knowledge is very basic, and I could never be employed even as a junior dev unless I wanted to spend evenings grinding and learning.

Still I was able to set up a NAS and a personal server using Docker. I think a basic and broad intro to programming class like Harvard’s CS50 is all that would be required to learn enough to be able to figure out self-hosting.

Those things would hardly take an hour to set up; it's the cost of freedom and control. Don't want to put in any effort? Might as well be a cloud slave and complain about the lack of digital sovereignty while using gdrive like a fucking normie.

Because of American copyright predators and jurisdictions friendly to their lobbying, such as Germany. If you're planning to get involved in this kind of software, better think beforehand about practices to ensure your anonymity.

>No idea why self-hosted software isn't `apt-get install` and forget. Just like Limewire. But that's the reason no one self-hosts.

Security.

As an avid self-hoster with a rack next to my desk, I shudder as I read your comment, unfortunately.

It's in fact the opposite. If the user has to manually write/fix endless configuration files, they are likely to make a mistake and have gaps in their security. And they will not know, because their settings are distinct from everyone else's.

If they `apt-get install` on a standard Debian computer, and the application's defaults are already configured for high security, and those exact settings have been tested by everyone else with the same software, you have a much higher chance of being secure. And if a gap is found, an update is pushed by the authors and downloaded by everyone in their automatic nightly update.
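For the "automatic nightly update" half, Debian already ships the pieces; the usual setup is roughly:

```
sudo apt-get install unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades   # enables the periodic security-update job
```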

The core point is valid. As someone who self-hosts, I've found it has become so complicated to get the most basic functionality set up that someone with little to no knowledge would really struggle, whereas years ago it was much simpler. Functionally we can now do much more, but practically, we've regressed.

What's so complicated? I'm currently on DigitalOcean, but I've self-hosted before. My site is largely a basic LAMP setup with Let's Encrypt and a cron job to install security updates. Self-hosting that on one of my machines would only be a matter of buying a static IP and port forwarding.
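A rough sketch of that kind of stack on a Debian/Ubuntu box (the domain and the cron schedule are placeholders, not necessarily the parent's exact setup):

```
# LAMP + free certificate
sudo apt-get install apache2 php libapache2-mod-php mariadb-server \
                     certbot python3-certbot-apache
sudo certbot --apache -d example.com      # certbot installs its own renewal timer

# one blunt way to do "a cron job to install security updates"
echo '0 4 * * * root apt-get update -q && apt-get -y upgrade' | sudo tee /etc/cron.d/auto-upgrade

# then forward ports 80/443 on the router to this machine
```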

LAMP with dynamic webpages (I assume that's your approach) works just like it ever did (besides SSL).

But are you really keen to make a dynamic PHP web application where each page imports some database functions/credentials and uses them to render HTML?

Can you keep the behavior of a fluent user flow (e.g. the menu not re-rendering) that way? Only with a minimal design.

When, in 2006, most webpages had an iframe for the main content, an iframe for the menu, and maybe an iframe for some other element (e.g. a chat window), it was fine to refresh one of those or have a link in one load another dynamic page. Today that is not seen as very attractive, and to ordinary people (consumers and businesses), unattractive means low-trust, which means less income. Just my experience, unfortunately. I also loved that era in hindsight, even though the bugs were frustrating, let alone the JS binding and undefined errors if you added that...

You can make modern single-page web apps with a LAMP back-end if you want. PHP is perfectly capable of serving database query results as JSON, and Apache will happily serve your (now static) HTML and JS framework-based page.
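A toy sketch of that split (the endpoint and data are made up; in a real LAMP setup Apache serves the same files instead of PHP's built-in dev server):

```
cat > api.php <<'EOF'
<?php
// in a real app this would be a database query via PDO/mysqli
header('Content-Type: application/json');
echo json_encode(['items' => [['id' => 1, 'name' => 'example']]]);
EOF

php -S 127.0.0.1:8080 &       # built-in dev server, just for the demo
curl http://127.0.0.1:8080/api.php
# a static index.html/JS bundle alongside it can simply fetch('/api.php')
```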

I was doing web development in 2006, and that's not how it was. Websites were not all in iframes, and they were not all insecure. Setting up a dynamic PHP website with Apache does not have to be insecure, and it didn't have to be back then, either.

Putting something on the Internet by yourself has always been outside the reach of a non-tech person. Years ago regular people weren't deploying globally available complex software from their desktops either.

To an extent, the point is for it to have friction.

If you don't care enough to figure it out, then you don't care enough to make it secure, and that leads to a very, very bad time in a modern, largely internet-centric world.

100% wrong