>, we simply have to make a cultural change where non-technical people do more for themselves. I don't even think it's about technical difficulty (most of the time). I think people just want someone else to take care of their shit.
The above includes us highly technical people on HN. We really can't expect (or lecture) the normal mainstream population to make a cultural change to adopt decentralized tech when most of us don't do it ourselves.
E.g., most of us don't want to self-host our public git repos. Instead, we just use centralized GitHub. We have the technical knowledge to self-host git, but we have valid reasons for not wanting to do it and willingly outsource it to GitHub. (Notice this thread's Show HN about decentralized social networking hosts its public repo on centralized GitHub.)
And consider we're not on decentralized USENET nodes discussing this. Instead, we're here on centralized HN. It's more convenient. Same reason technical folks shut down their self-hosted PHP forum software and migrate to centralised Discord.
The reason can't be reduced to just "people being lazy". It's about tradeoffs. This is why it's incorrect to think that futuristic scenarios of a hypothetical easy-to-use "internet appliance" (possibly provided by ISPs) to self-host email/git/USENET/videos/etc. and a worldwide rollout of IPv6 to avoid NAT will remove the barriers to decentralization.
The popular essay "Protocols Not Platforms" about the benefits of decentralization often gets reposted here but that doesn't help because "free protocols" don't really solve the underlying reasons centralization keeps happening: money, time, and motivation to follow the decentralized ethos.
"But you become a prisoner of centralized services!" -- True, but for some folks a self-hosted tech stack can be a prison too. It's just a different type. To get "freedom" and escape the self-hosted hassles, they flee to centralized services!
I agree with you that it's about tradeoffs.
The cost ($$$, opportunity cost, and mental toll) of maintenance is very real. It can be hugely advantageous to outsource that effort to a professional, PROVIDED the professional is trustworthy and competent. To ensure that most professionals are trustworthy and competent two things need to be present:
1. A very high degree of transparency, so that it's very difficult for a service provider to act contrary to their users' interests without the users knowing about it.
2. Very low switching costs, so that if the service provider ever does act against their users' interests, they will be likely to lose their users.
As long as our laws encourage providers to operate in black-box fashion, and to engineer artificially high switching costs into their products, I believe there will continue to be a case for self-hosting among a minority of the population. And because they are a minority, they will be forced to also make use of centralized services in order to connect to the people who are held hostage by those high switching costs.
Somewhere in the multiverse, there's a world in which interoperability and accountability have been enshrined as bedrock principles and enforced since the beginning of the internet. It would be very interesting to compare that world with the one we inhabit.
I do wonder if self-host or centralised are the only options.
Something like IPFS, but that works remains my dream - decentralised, but in the cloud nonetheless.
It depends a lot on how accessible those services are. I tried to host some git repos 5 years ago and it was a hassle (I mostly needed private git and reviews, nothing fancy). I tried again this year, and using Forgejo was extremely easy. I don't remember exactly what problems I had before, so maybe I got better at finding things, but this time felt more polished: containers, reasonable defaults, a good tutorial on how to start. It took less than one hour in total. In the meantime I also did an upgrade, and that really was 5 minutes (check the changelog, apply it, and go).
Of course, lots of work was done in the background to reach this point, but I think it is possible. Will I make the effort to make that happen for a social network? No, because I am not using them that much.
Technically, things are becoming simpler (in the sense that you can do it "at home", and if you add LLMs to answer you when you don't know some obscure option, it's even easier), but identifying the use-case well, deciding defaults, writing documentation, and juggling trade-offs will remain as hard as before.
Note/edit: something being possible does not mean one should do it, so I think it will depend on everybody's priorities and skills. I wish good luck, though, to anybody trying...
Out of curiosity, how do you handle backups?
(To my great disappointment, a lot of "how to self-host" guides just omit that step, and quietly assume that disks don't go bad...)
Not the poster, but: use ZFS, or LVM + XFS, on your machine; take a snapshot; then use restic or kopia to back it up to cheap object storage in the cloud, such as R2. If that's too technical, run Syncthing and mirror it to a USB-connected external disk, preferably a couple of meters away from your machine.
A poor haphazard backup is better than no backup.
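To make the "haphazard but better than nothing" idea concrete, here's a minimal Python sketch (my own illustration, not the poster's setup) of a timestamped full-copy snapshot with simple pruning. It does none of the deduplication or encryption that restic/kopia provide; the function name and `keep` parameter are assumptions for the example:

```python
import shutil
import time
from pathlib import Path

def snapshot(source: str, backup_root: str, keep: int = 5) -> Path:
    """Copy `source` into a timestamped folder under `backup_root`,
    then prune all but the newest `keep` snapshots."""
    root = Path(backup_root)
    root.mkdir(parents=True, exist_ok=True)
    dest = root / time.strftime("%Y%m%d-%H%M%S")
    # Naive full copy; real tools (restic, kopia) deduplicate and encrypt.
    shutil.copytree(Path(source), dest)
    # Keep only the newest `keep` snapshot directories.
    snaps = sorted(p for p in root.iterdir() if p.is_dir())
    for old in snaps[:-keep]:
        shutil.rmtree(old)
    return dest
```

Pointing `backup_root` at a USB disk (or a spouse's machine over a network mount) gets you the "different room" protection mentioned above, at the cost of full copies instead of incremental ones.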
> A poor haphazard backup is better than no backup.
but is it better than a cloud provider?
Cloud provider can lock you out without recourse and you'll lose your data.
Local backups can fail, be destroyed (for example, a failed PSU kills both your PC and any attached devices), or be deleted by malware.
How complex do you need to have your local backup to achieve cloud providers' reliability?
The best backup is a proper 3-2-1, with regular testing of integrity, and regular restoration from a backup as an exercise. But most people cannot be bothered to care quite so much.
So, keeping a half-assed backup copy on a spouse's machine in a different room is still better than not keeping any copy at all. It will not protect from every disaster, but it will protect against some.
My own backups progressed from manual rsync to syncthing to syncthing for every machine in the house + restic backups (which saved my bacon more than once).
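The "regular testing of integrity" part of 3-2-1 is the step most guides skip. A minimal sketch of what that can mean in practice (my illustration, not the poster's tooling): build a SHA-256 manifest of the source and compare it against a restored copy, so you learn about corrupted or missing files before you actually need the backup:

```python
import hashlib
from pathlib import Path

def manifest(root: str) -> dict:
    """Map each file's relative path to its SHA-256 digest."""
    base = Path(root)
    return {
        str(p.relative_to(base)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(base.rglob("*"))
        if p.is_file()
    }

def verify(source: str, restored: str) -> list:
    """Return relative paths that are missing or differ in the restored copy."""
    src, dst = manifest(source), manifest(restored)
    return [path for path, digest in src.items() if dst.get(path) != digest]
```

Tools like restic have their own `check` command for this; the point of the sketch is that "restore and compare" is a small script, not a big project, so there's little excuse not to exercise it regularly.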
>And consider we're not on decentralized USENET nodes discussing this. Instead, we're here on centralized HN. It's more convenient. Same reason technical folks shut down their self-hosted PHP forum software and migrate to centralised Discord.
You're contradicting yourself. Why is HN centralized, while a phpBB forum is decentralized? Are you conflating decentralization and being open source?
>Why is HN centralized, while a phpBB forum is decentralized?
There's a spectrum of decentralized <--> centralized for different audiences.
For this tech demographic here where installing some type of p2p or federated discussion tech (Mastodon? Matrix?) is not rocket science, it's more convenient for us to avoid that and just be on a "centralized" HN. I used to be very active on USENET and HN is relatively more centralized than a hypothetical "comp.programming.hackernews" newsgroup. This is not a complaint. It's an observation of our natural preferences and how it aggregates. (Btw, it's interesting that Paul Graham started this HN website but doesn't post here anymore. Instead, he's more active on Twitter. He's stated his reasons and it's very understandable why.)
For phpBB forums where lots of non-tech people discuss hobbies such as woodworking, guitar gear, etc., decentralization means those independent phpBB forums and centralization means big platforms such as reddit / Discord / Facebook Groups.
I see similar decentralized --> centralized trends in blogs. John Carmack abandoned his personal website and now posts on centralized Twitter.
My overall point is that a lot of us techies wish the general public would get enlightened about decentralization, but that's unrealistic when we don't follow that ideal ourselves. We have valid reasons for that. But it does create a cognitive dissonance and/or confusion as to why the world doesn't do what we think it should do.
EDIT add reply: >Wouldn't comp.programming.hackernews concentrate discussion under a single heading and also be hosted from a single specific computer?
Usenet is more decentralized/federated: https://en.wikipedia.org/wiki/Usenet#:~:text=Usenet%20is%20t...
> I used to be very active on USENET and HN is relatively more centralized than a hypothetical "comp.programming.hackernews" newsgroup.
How so? Wouldn't comp.programming.hackernews concentrate discussion under a single heading and also be hosted from a single specific computer? This confuses me even further; I don't understand what you mean by centralization.
>For phpBB forums where lots of non-tech people discuss hobbies such as woodworking, guitar gear, etc., decentralization means those independent phpBB forums and centralization means big platforms such as reddit / Discord / Facebook Groups.
Surely by this interpretation HN is decentralized. It's a special interest (if relatively broad) forum just like those phpBB forums were. I ask again: is HN "centralized" just because you can't spin up your own copy of the software to use it to talk about gardening?