> Viewed through the lens of digital autonomy and citizenship, the question isn’t simply “Is Linux perfect?” but rather: Do we want our fundamental computing environment to be ultimately under our control, or controlled by private interests with their own incentives?
I've used Linux as my main desktop OS for more than 20 years, used Linux in general far longer than that, and promoted FOSS before that was even a term, and this has always been the question. Most of the world does not care. I suspect that is more true today than ever before. There are now adults who grew up in the age of social media and have no idea how local computing works.
Not to be negative, but the "obstacles" to adopting Linux were rarely actual obstacles. Fifteen years ago my mother started using Linux as her main OS with no training. I gave her the login information but never had a chance to show her how to use it; she figured it out on her own. Everything just worked, including exchanging MS Office documents for work.
> Most of the world does not care. I suspect that is more true today than ever before. There are now adults who grew up in the age of social media and have no idea how local computing works.
Yep. I was amazed when I was talking to a friend who's a bit younger (late 20s) and told him about a fangame you could just download from a website (Dr Robotnik's Ring Racers, for the record) and he was skeptical and concerned at the idea of just downloading and running an executable from somewhere on the internet.
I suspect most adults these days are like this: their computing experience is limited to the web browser and big official corporate-run software repositories, e.g. app stores and Steam. Which ironically means they would do just fine on Linux, but there's also no incentive for them to switch off Windows/macOS.
To them, Microsoft and Apple having control of their files and automatically backing up their home directory to Azure/iCloud is a feature, not a problem.
> and he was skeptical and concerned at the idea of just downloading and running an executable from somewhere on the internet
Ironically, being concerned and skeptical about running random executables from the internet is a good idea in general.
> Ironically, being concerned and skeptical about running random executables from the internet is a good idea in general.
I agree you shouldn't run random executables, but the key word is "random". In this case, Ring Racers is a relatively established and somewhat well-known game, plus it's open-source.
It doesn't guarantee it's not harmful, of course, but ultimately, someone with the mindset of "I should never run any programs that aren't preapproved by a big corporation" may as well just stick to Windows/macOS or mobile devices, where this is built into the ecosystem.
> plus it's open-source
Open-source only matters if you have the time/skill/willingness to download said source (and that of any dependencies) and compile it.
Otherwise you're still running a random binary and there's no telling whether the source is malicious or whether the binary was even built with the published source.
It's no guarantee, but it's a positive indicator of trustworthiness if a codebase is open source.
I don't have hard numbers on this, but in my experience it's pretty rare for an open source codebase to contain malware. Few malicious actors are bold enough to publish the source of their malware. The exception that springs to mind is source-based supply-chain attacks, such as publishing malicious packages to PyPI, Python's package index.
You have a valid point that a binary might not correspond to the supposed source code, but I think this is quite uncommon.
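Verifying that a binary was actually built from the published source takes reproducible builds, which few projects offer. But the weaker property, that the file you downloaded is the one the maintainers published, is cheap to check with a checksum. A minimal sketch, with a stand-in local file in place of a real download (the filename is hypothetical):

```shell
# Stand-in for a downloaded release artifact (hypothetical filename).
printf 'release payload' > ring-racers.tar.gz

# Maintainers normally publish this checksum file alongside the release;
# here we generate it locally just to make the sketch self-contained.
sha256sum ring-racers.tar.gz > SHA256SUMS

# After downloading both files, verify the artifact matches the hash.
sha256sum -c SHA256SUMS    # prints "ring-racers.tar.gz: OK" on a match
```

This only proves the download wasn't swapped or corrupted in transit; it says nothing about whether the maintainers themselves are trustworthy.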
Of course this is true. But you can keep going down the rabbit hole. How do you know there isn't a backdoor hidden in the source code? How do you know there isn't a compromised dependency, maybe intentionally?
Ultimately there needs to be trust at some point because nobody is realistically going to do a detailed security analysis of the source code of everything they install. We do this all the time as software developers; why do I trust that `pip install SQLAlchemy==2.0.45` isn't going to install a cryptominer on my system? It's certainly not because I've inspected the source code, it's because there's a web of trust in the ecosystem (well-known package, lots of downloads, if there were malware someone would have likely noticed before me).
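For what it's worth, pip does offer a mechanical complement to that web of trust: hash-checking mode, where the requirements file pins the exact digest of the artifact you expect. A sketch (the sha256 value below is a placeholder, not SQLAlchemy's real digest):

```
# requirements.txt -- with hash pinning, pip refuses any artifact
# whose digest doesn't match (the sha256 below is a placeholder)
SQLAlchemy==2.0.45 \
    --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
```

Installing with `pip install --require-hashes -r requirements.txt` turns "I trust PyPI" into "I trust these exact bytes". It doesn't answer who audited the code, but it does rule out a silently swapped artifact.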
> still running a random binary
Again, "random" is the wrong word here; there's nothing random about it. You're running a binary published by the maintainers of some software. You're deciding how much you trust those maintainers (and their binary publishing processes, and whoever hosts their binary).
The problem is that on Windows or your typical Linux distro "how much you trust" needs to be "with full access to all of the information on my computer, including any online accounts I access through that computer". This is very much unlike Android, for example, where all apps are sandboxed by default.
That's a pretty high bar, I don't blame your friend at all for being skeptical.
> Open-source only matters if you have the time/skill/willingness to download said source (and any dependencies') and compile it.
Not really. The fact that an application is open-source means its originator can't rug-pull its users at some random future date (as so often happens with closed-source programs). End users don't need to compile the source for that to be true.
> Otherwise you're still running a random binary and there's no telling whether the source is malicious or whether the binary was even built with the published source.
This is also not true in general. Most open-source programs are available from an established URL, for example a GitHub archive with an appropriate track record. And the risks of downloading and running a closed-source app are much the same.
How do they know they’ve found the legitimate Ring Racers download and not some scammer who managed to get their search result above the real one?
Nothing wrong with downloading and running programs you trust, but there needs to be a good answer to that question.
To be fair, downloading and running random executables from the internet is a genuinely terrible security model when the OS (Windows, Linux, or to a lesser extent macOS) does nothing to stop the program from doing anything you can do.
> Most of the world does not care. I suspect that is more true today than ever before
100% of the people I have spoken with, from Uber drivers to grandparents, have noticed and hated the rental/subscription economy, and are sympathetic to the fight against it. In 2025 I don't think I've had a single person defend the status quo, because they all know what's coming.
I think Arduino and RPi demonstrate that there is still a relatively strong attraction to tinkering. In the past, freedom meant a lot to tinkerers. My sense is that this is not so true today. Perhaps I am wrong. It may be that few people respect licensing enough to care. As long as somebody (not necessarily the producer) has made a YouTube video on how to hack something, that's good enough.
This was probably always true. Replace YouTube with Byte magazine and it was probably the same 45 years ago. I wonder if the percentage of true FOSS adherents has changed much. It would be a bit of a paradox if the percentage of FOSS software has exploded while the percentage of FOSS adherents has declined.
Note: I mean "adherent" to mean something different than "user".
> I think Arduino and RPi demonstrate that there is still a relatively strong attraction for tinkering
Raspberry Pi is an interesting example because it is constantly criticized by people who complain about the closed source blobs, the non-open schematics, and other choices that don’t appease the purists.
Yet it does a great job at letting users do what they want to do with it, which is get to using it. It’s more accessible than the open counterparts, more available, has more guides, and has more accessories.
The situation has a lot of parallels to why people use Windows instead of seeking alternatives: It’s accessible, easy, and they can focus on doing what they want with the computer.
The problems with SBCs are primarily software. I have a ton of SBCs, mostly Raspberry Pis and OrangePis.
OrangePi boards are great. The Zero is almost stamp-sized; the Plus and Pro have tons of options, on-board NVMe, fast-ish eMMC, great official cases, and so on.
But, guess what? The OS is bad. I mean unpatched, mishmashed, about-as-secure-as-an-open-door bad.
You get an OS image that automatically drops you into a root terminal on the console, runs many services you don't need on the board, ships as an image rather than an installer, and points all its repositories at Chinese servers.
Armbian is not a good solution either, because it isn't designed to roll over between releases the way Debian and Raspberry Pi OS are. So you can't build a long-term system on it the way you can with a Raspberry Pi.
On top of that, you can't boot anything mainline on most of them, because either the drivers are closed source, or the kernel carries weird hacks to make things work, or usually both.
So what makes the Raspberry Pi isn't the hardware; it's the software support.
I don't think tinkering is the dominant culture behind tech anymore, but it's definitely operating at a larger scale than ever before. There are more OSS projects than ever, and there are tons of niche areas with entire communities: LoRa radios (or LoRA adapters!), 3D printing, FPGA hacking, new games for retro hardware...
There used to be a gap (think the '90s and early 2000s) between the niche tinkering crowd and the more mainstream user/power-user/programmer crowds. All these groups had knowledge gaps between them, but the gaps were surmountable.
Now the groups have drifted apart. Even if you're a programmer, unless you care about or get excited by the hardware, you don't know how things work. You follow the docs, push the code to the magical gate via that magical command, and it works. It's similar even for desktop applications.
When you care about performance and try to understand how these things actually work, you have to break through thick ice before you can start learning, and things are much more complicated now, so people tend to run away and pretend it's not there.
Also, since the "the network is reliable, compute is cheap" gospel took hold, 90% of programmers don't care how much performance or energy they waste.
I'm guilty of this. I started with a C64 and love hardware and programming, but modern CPUs and MCUs are so complicated I can't be bothered learning about them.
The old 8-bit Arduinos were pretty understandable, but with an ESP32 I just assume the compiler knows what it's doing and that the Espressif libs are 'good enough'.
> There are now adults who grew up in the age of social media and have no idea how local computing works.
Very few people of any age have ever understood how local computing (or any computing) works. There are probably more who do now, since most of the world is connected.
Profit scale has reached the point where commercial OS makers resort to shoving ads into the UI. There's probably more legitimate need for non-developers to use Linux now than ever before, just to get a better baseline user experience.
You are right. Most will never care. I think of it as: let's try to keep the lights on for the folks who inevitably get burned and need an escape hatch. Many will not, but some always will. At least that's my way of not being a techno-nihilist.
Same with multiple people I know. It's not perfect, but neither is Windows.
> There are now adults who grew up in the age of social media and have no idea how local computing works.
They like it given a chance. My daughters for example far prefer Linux to Windows.
> They like it given a chance. My daughters for example far prefer Linux to Windows.
The two topics are orthogonal. GP talks about "local computing" vs. "black box in the cloud", the difference between running it vs. using it. You're talking about various options to run locally, the difference between running it this way or that way.
Linux or Windows users probably understand basic computing concepts: files and a file-system structure, processes, basic networking. Many modern phone "app" users know only what the app and device show them, and that's not much. Every bit of useful knowledge is hidden and abstracted away from the user. They get nothing beyond whatever service the provider wants them to consume.
>> Do we want our fundamental computing environment to be ultimately under our control, or controlled by private interests with their own incentives?
Define "our".
Because having general compute under developer/engineering control does not mean end users want, need, or should tinker inside appliances.
So there are two definitions of "our": the end users', and ours as engineers.
Worldwide, in aggregate, far more harm comes to users from malware (destroying work at the office and life memories at home) than benefit comes from non-tech-savvy users being able to agree to a dialog box (INSTALL THIS OR YOUR VOTING REGISTRATION WILL BE SWITCHED IN 30 MINUTES!!!) and end up with rootkits.
Having our (hackers') tinkering guardrailed behind extra steps, on hardware we can still work within, so we can help general computing become as "don't make me think, and don't do me harm" as a nightstand clock radio, seems like a good thing.
It's not hard to see through the quote's false "only two cases" premise, however un-hip it may be to say so.