This is one way to look at it, but it ignores the fact that most users use third-party community plugins.
Obsidian has a truly terrible security model for plugins. As I realized while building my own, Obsidian plugins have full, unrestricted access to all files in the vault.
Obsidian could've opted to be more 'batteries-included', at the cost of more development effort, but instead it leaves this to the community, which significantly increases the attack surface.
Or it could have a browser-extension-like manifest that declares all the permissions the plugin uses, where any attempt to access a permission that wasn't granted gets blocked.
Both of these approaches would've provided more real security for end users than "we have few third party dependencies".
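A rough sketch of what the host-side check could look like (all names here are hypothetical, not a real Obsidian API):

    // Hypothetical manifest-based permission gate, TypeScript
    interface PluginManifest { id: string; permissions: string[]; }

    function guardedFetch(plugin: PluginManifest, url: string): Promise<Response> {
      const host = new URL(url).hostname;
      // Any host the manifest didn't declare is blocked outright
      if (!plugin.permissions.includes("network:" + host)) {
        return Promise.reject(new Error(plugin.id + ": no permission for " + host));
      }
      return fetch(url);
    }

The plugin would declare something like "permissions": ["vault:read", "network:api.example.com"] in its manifest, and users could review the list before installing, same as browser extensions.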
When I was young there were a few luminaries in the software world who talked about how there is a steady if small flow of ideas from video game design into conventional software.
But I haven't heard anyone talk like that in quite some time (unless it's me parroting them). Which is quite unfortunate.
I think for example if someone from the old guard of Blizzard were to write a book or at least a novella that described how the plugin system for World of Warcraft functioned, particularly during the first ten years, where it broke, how they hardened it over time, and how the process worked of backporting features from plugins into the core library...
I think that would be a substantial net benefit to the greater software community.
Far too many ecosystems make ham-fisted, half-assed, hare-brained plugin systems. And the vast majority can be consistently described by at least two of the three.
I’ve been of the opinion that every hard problem in CS shows up somewhere in gamedev. It’s a great space for inspo.
Game dev also rewards people for applying the 80/29 rule effectively and you see less of that in commercial software.
In each game generation there’s a game that would be easy to write on the next or subsequent generation of hardware and is damned difficult to implement on the current one. Cleverness and outright cheating make it work, after a fashion.
80/29 rule is the paretypo principle?
Typo but yeah.
It reaches a dead end eventually. That's where we are, edge of speed, where the only mods left are aesthetics veering toward photorealism.
The game simulation will get more detailed/granular as aesthetics dial down in perceived value. You can always go bigger/wider/more procedural/more multiplayer.
This is also why every hard problem eventually shows up — games are just simulation + interaction, and eventually everything that can be simulated will have some attempted implementation out there, struggling along. (For some reason, this does not appear to stop at “interesting” things to simulate — see all the literal simulators on steam)
The simulations have yet to trade photo-realism for event-perception, where simulation parallels reality, but that's not really playable as a game, only as a view.
My team's take is that game dev is probably where the hard problems live: reality, behavior, ecology, language.
I came to learn that even though in-process plugins are easier to implement and less resource-demanding, anyone serious about host stability and security can only allow plugins that run out of process, over OS IPC.
And in general, it will still take fewer hardware resources than the usual Electron stuff.
Kernel design is (to me) another one where ideas have flowed into other software fields - there were monolithic kernels, micro kernels, and hybrid kernels, and they all need to work with third party modules (drivers)
The lessons from all fields seem to be relearnt again and again in new fields :-)
Because learning how to make a proper one requires building your own broken one first.
It might be slightly sped up by reading up on theory and past experiences of others.
I am around midlife, and I see how I can tell people stuff and point them to resources, but they still won’t learn until they hit the problem themselves and put their mind into figuring it out.
A lot of stuff we think is top shelf today was tried on mainframes in the late 80’s through the 90’s. Cloud computing is mostly recycled 90’s “fashion”.
See also people trying to bring Erlang back into fashion.
Having recently read through a handful of issues on their forums, they seem to brush aside a lot of things. It's a useful tool, but the mod/dev team they have working with the community could use some training.
If you're using a flatpak, that's not actually the case. It would have very restricted access, to the point where you would even have to explicitly give it access to the user's /home.
Interesting, I thought I had to turn that on for Obsidian!
The first time I started installing flatpaks I ran into a bit of permission / device isolation trouble and ever since then, I use flatseal after installing an app to make sure it actually has access to things.
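For the CLI-inclined, flatpak can do the same without Flatseal (the app ID below is what Flathub uses for Obsidian; double-check yours with flatpak list):

    # revoke the default home access, then grant back a single vault folder
    flatpak override --user --nofilesystem=home md.obsidian.Obsidian
    flatpak override --user --filesystem=~/Vaults md.obsidian.Obsidian
    flatpak override --user --show md.obsidian.Obsidian   # verify what applies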
I "love" such sandboxing defaults. Apps like Docker Desktop also share the whole home directory by default [1], which is pretty interesting if a big selling point is keeping stuff separated. No idea why node_modules needs to have access to my tax returns :). Of course you can change that, but I bet many users keep the default paths intact.
Yeah, I forgot there’s the intermediate VM level, and user folders are shared there so that folders could be mounted to the individual containers using host paths.
I'm not claiming it's a security feature of Obsidian, I'm saying it's a consequence of running a flatpak - and in this situation it could be advantageous for those interested.
I have been using firejail for most of these kinds of applications, be it Obsidian, Discord, or the browser I am using. I definitely recommend people start using it.
I feel like I should keep track of all my comments on HN because I remember writing a lengthy comment on firejail more than once. I cannot keep doing this. :D
For user-space, there is usually bubblewrap vs. firejail. I have not personally used bubblewrap, so I cannot comment on that, but firejail is great at what it does.
The last comment was about restricting clipboard access to either X11 or Wayland which is possible with firejail quite easily, so if you want that, you can have that.
So do you configure firejail to give each app their own separate, permanent home directories? Like "firejail --private=/home/user/firejails/discord discord", "firejail --private=/home/user/firejails/chromium chromium", and so on?
FWIW, once you start whitelisting, it will only have access to those directories and files, so Discord has no access to anything other than its own directory and ${DOWNLOADS}, which I should probably change.
You should check out the default profiles for many programs / apps under directory "/etc/firejail".
[1] You run it via "firejail Discord" or "firejail ./Discord" if you name it "Discord.profile".
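For anyone curious, a whitelist-style profile is only a few lines. A rough sketch (crib from the real ones in /etc/firejail before trusting it):

    # ~/.config/firejail/Discord.profile
    include /etc/firejail/disable-common.inc   # standard hardening includes
    whitelist ${HOME}/firejails/discord        # its own persistent data dir
    whitelist ${DOWNLOADS}                     # the one shared folder it gets
    private-tmp
    caps.drop all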
I treat Little Snitch (LS) as a privacy/anti-telemetry/anti-accident tool, not as anti-malware.
Obviously it can detect malware if there’s a connection to some weird site, but it’s more like a bonus than a reliable test.
If you need to block FS access, then per-app containers or VMs are the way to go. The container/VM sandboxes your files, and Little Snitch can then manage external connectivity (you might still want to allow connections to some legit domains, but maybe not github.com, as that can be used to upload your data; I mean something like updates.someapp.com).
I believe LS has some protections against this. Never tried them, but there are config related security options, incl. protection against synthetic events. So they definitely put some thought into that.
Is this true on Mac? Usually I am notified when programs request access outside the normal sandboxed or temp folders. Not sure how that works in any detail though.
Funny enough, I thought this earlier about Arch Linux and its derivatives. It was mentioned on Reddit that they operate on a small budget. A maintainer replied that they have very low overhead, and the first thought that popped into my mind was that most of the software I use and rely on comes from the AUR, which relies on the user to manage their own security.
If engineers can't even manage their own security, why are we expecting users to do so?
I'm shocked it is most of your software. I think I have under a dozen AUR packages. It has been that way for about a decade. I added a couple for gaming recently (mostly because Lutris just crashes for me), but nearly all of my software comes from the official repos.
I think this criticism is unfair, because most common packages are covered by the core and extra repos, which are maintained by Arch Linux. The AUR is a collection of user build scripts, and using it has a certain skill cliff, such that I expect most users to have explicit knowledge of the security dangers. I understand your concern, but it would be weird and out of scope for Arch to maintain or moderate the AUR when what Arch provides here amounts to little more than hosting. Instead, Arch rightly gives users the tools to moderate it themselves through the votes and comments features. Also, the most popular AUR packages are maintained by well-known maintainers.
The derivatives are obviously completely separate from Arch and thus are not the responsibility of Arch maintainers.
Disagree. AUR isn’t any trickier than using pacman most of the time. Install a package manager like Yay or Paru and you basically use it the same way as the default package manager.
It’s still the same problem, relying on the community and trusted popular plugin developers to maintain their own security effectively.
I understood GP's point to be that because Obsidian leaves a lot of functionality to plugins, most people are going to use unverified third-party plugins. On Arch, however, most packages are in core or extra, so most people won't need to go to the AUR. They are more likely to install the flatpak or get the AppImage for apps not in the repos, as that's much easier.
yay or paru (or other AUR helpers, afaik) are not in the repos. To install them, one needs to know how to use the AUR in the first place. If you are technical enough to do that, you should know about the security risks, since almost all tutorials for the AUR come with security warnings. It's also inconvenient enough that most people won't bother.
In Obsidian, plugins can seem central to the experience, so users might not think much of installing them; in Arch, the AUR is very much a non-essential component. At least that's how I understand it.
> It's also inconvenient enough that most people won't bother.
> in Arch, the AUR is very much a non-essential component.
While somewhat true, we are talking about a user who has installed Arch on their machine. If a user wanted to not bother with installation details, they would've installed Ubuntu.
> If engineers can't even manage their own security, why are we expecting users to do so?
This latest attack hit CrowdStrike as well. Imagine they had gotten inside Huntress, who have been open about how much the access they're given could be abused: https://news.ycombinator.com/item?id=45183589
Security folks and companies think they are important. The C-suite sees them as a scapegoat for WHEN the shit hits the fan, and most end users feel the same about security as they do about taking off their shoes at the airport (what is this nonsense for?), and they mostly aren't wrong.
It's not that engineers can't take care of their own security. It's that we have made it a fight with an octopus rather than something that is seamless and second nature. Furthermore, security and privacy go hand in hand... Teaching users that is not to the benefit of a large portion of our industry.
> It's not that engineers can't take care of their own security.
I dunno. My computer has at least one hardware backdoor that I know of, and I just can't get hardware without an equivalent exploit.
My OS is developed with a set of tools that is known to make code review about as hard as possible. It provides the bare minimum of application isolation. And it is two orders of magnitude larger than any single person can read in their lifetime. It's also the usable OS out there with the best security guarantees; everything else is much worse or useless.
A browser is almost a complete new layer above the OS. And it's 10 times larger. Also written in a way that famously makes review impossible.
And then there are the applications, which are what everybody is focusing on today. Keeping them secure is close to useless if one doesn't fix all of the above.
I'm developing an Obsidian plugin commercially. I wish there was a higher tier of vetting available to a certain grade of plugin.
IMO they should do something like the AUR on Arch Linux: have a community-managed plugin repo and then a smaller, more vetted one. That would help with plugin review time too.
The plugin is called Relay [0] -- it makes Obsidian more useful in a work setting by adding real-time collaboration.
One thing that makes our offering unique is the ability to self-host your Relay Server so that your docs are completely private (we can't read them). At the same time you can use our global identity system / control plane to collaborate with anyone in the world.
We have pretty solid growth, a healthy paid consumer base (a lot of students and D&D/TTRPG), and we're starting to get more traction with businesses and enterprise.
I think it's a matter of time until we see a notable plugin in the Obsidian space get caught exfiltrating data. I imagine then, after significant reputational harm, the team will start introducing safeguards. At a minimum, create some sort of verified publisher system.
Don’t most plugin models work this way? Do VSCode, Vim, Emacs, and friends do anything to segregate content? Gaming is the only area where I expect plugins have limited permissions.
Browser extensions also have a relatively robust permissions-based system.
If they wanted to, one would guess that browser-ish local apps based on stuff like Electron/node-webkit could probably figure out some way to limit extension permissions more granularly.
I would have thought so, but it has been how many years, and as far as I know there is still no segregation for VSCode extensions. Microsoft has all the money, and if they cannot be bothered, I am not encouraged that smaller applications will be able to iron out the details.
I think it's just because supply-chain attacks are not common enough / their attack surfaces not large enough to be worth the dev time... yet...
Sneak in a malicious browser extension that breaks the permissions sandbox, and you have hundreds of thousands to millions of users as an attack surface.
Make a malicious VSCode/IDE extension and maybe you hit some hundreds or thousands of devs, a couple of smaller companies, and probably can get on some infosec blogs...
>Make a malicious VSCode/IDE extension and maybe you hit some hundreds or thousands of devs, a couple of smaller companies, and probably can get on some infosec blogs..
Attackers just have to hit one dev with commit rights to an app or library that gets distributed to millions of users. Devs are multipliers.
The time has come. The nx supply chain attack a couple weeks ago literally exfiltrated admin tokens from your local dev machine, because the VS Code extension for nx always downloaded the latest version of nx from npm. And since nx is a monorepo tool, it's more applicable to larger projects with more valuable tokens to steal.
The solution at my job is you can only install extensions vetted by IT and updates are significantly delayed. Works well enough but sucks if you want one that isn't available inside the firewall.
>Browser extensions also have a relatively robust permissions-based system.
Yeah and they suck now. We need a better security model where it's still possible to do powerful stuff on the whole machine (it's MY computer after all) without compromises.
>We need a better security model where it's still possible to do powerful stuff on the whole machine
That's not possible. If you can do powerful stuff on the whole machine by definition you have no security. Security is always a question of where you create a perimeter. You can hand someone a well defined box in which they can do what they want, you can give someone broader access with fewer permissions, but whether vertically or horizontally to have security is to exercise control and limit an attack surface.
That's even implicit in the statement that it's YOUR computer. The justification being that there's a dividing line between your computer and other computers. If you're part of a network, that logic ceases to hold. Same when it comes to components on your machine.
vim and emacs are over 30 years old and therefore living with an architecture created when most code was trusted. Encrypting network protocols was extremely rare, much less disks or secrets. I don't think anything about the security posture of vim and emacs should be emulated by modern software.
I would say VSCode has no excuse. It's based on a browser which does have capabilities to limit extensions. Huge miss on their part, and one that I wish drew more ire.
I'd love to see software adopt strong capabilities-based models that enforce boundaries even within parts of a program. That is, with the principle of least authority (POLA), code that you call is passed only the capabilities you wish (e.g. opening a file, or a network socket), and not everything that the current process has access to. Thomas Leonard's post (https://roscidus.com/blog/blog/2023/04/26/lambda-capabilitie...) covers this in great detail, and OCaml's newer Eio effect system has aspects of this too.
The Emily language (locked-down subset of OCaml) was also interesting for actively removing parts of the standard library to get rid of the escape hatches that would enable bypassing the controls.
Linux has seccomp, but I think that changes access for an entire process. The language-focused aspect seems useful to me, for the case where maybe I want access to something, but I don't want to pass that access on to all the code that I might call from a library.
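Even without language support, you can approximate POLA by construction in ordinary code. A minimal sketch in TypeScript (the interfaces are made up for illustration):

    // Capabilities are just narrow interfaces; code only gets what it's handed.
    interface FileReader { read(path: string): Promise<string>; }
    interface HttpPoster { post(url: string, body: string): Promise<void>; }

    // This exporter can read the files it's given, but has no handle to the
    // network, because nobody passed it an HttpPoster.
    async function exportNotes(fs: FileReader, paths: string[]): Promise<string> {
      const parts = await Promise.all(paths.map((p) => fs.read(p)));
      return parts.join("\n---\n");
    }

The catch, and the reason the linked post argues for doing this at the language/runtime level, is that in Node any module can still import fs behind your back; the discipline only holds if ambient authority is actually removed.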
You have to get off the beaten path to get plugins into Vim/Emacs. It's not difficult, but you don't have access to a marketplace open to the world from the get-go. I think Emacs has ELPA, but I would put that at the level of OS repos like Debian/Alpine.
iirc vscode has RCE by design when you use the remote editing feature (i.e. editing files on a server, which is obviously a bad idea anyway, but still a feature) and nobody gives a fuck.
> The code in Mods for Cities: Skylines is not executed in a sandbox.
> While we trust the gaming community to know how to behave and not upload malicious mods that will intentionally cause damage to users, what is uploaded on the Workshop cannot be controlled.
> Like with any files acquired from the internet, caution is recommended when something looks very suspicious.
I think they meant games that specifically come with a sandboxed scripting layer. Otherwise, I agree that most mods are indeed just untrusted patches for a native executable or .NET assembly.
I guess the intent behind Cities Skylines's support for mods is just removing the need for a mod manager and enabling Steam Workshop support.
I was thinking more Lua/Luau, which make it trivial to restrict permissions. In general, the gaming client has access to a lot more information than it shares, so to prevent cheats from plugins, the developers have to be explicit about security boundaries.
One of the large dependencies they call out is an excellent example: pdf.js.
There is no reason for pdf.js to ever access anything other than the files you wish to export. The Export to PDF process could spawn a containerized subprocess with zero filesystem or network access and constrained CPU and memory limits. Files could be sent to the export process over stdin, and the resulting PDF could be streamed back over stdout, with stderr used for logging.
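A rough sketch of the host side of that shape, assuming Node; the worker binary name is made up, and the actual lockdown (bwrap/nsjail/seccomp flags) is left out:

    import { spawn } from "node:child_process";

    function exportToPdf(markdown: string): Promise<Buffer> {
      return new Promise((resolve, reject) => {
        // stdin: document in, stdout: PDF out, stderr: passed through for logs
        const child = spawn("pdf-export-worker", [], {
          stdio: ["pipe", "pipe", "inherit"],
        });
        const chunks: Buffer[] = [];
        child.stdout.on("data", (c: Buffer) => chunks.push(c));
        child.on("error", reject);
        child.on("close", (code) =>
          code === 0
            ? resolve(Buffer.concat(chunks))
            : reject(new Error("exporter exited with " + code)));
        child.stdin.end(markdown); // send the doc, close stdin
      });
    }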
There are lots of plugin systems that work this way. I wish it were commoditized and universally available. AFAIK there's very little cross-platform tooling to help you solve this problem easily, and that's a pity.
Another thought: what about severely sandboxing plugins, so that while they have access to your notes, they have no network or disk access and in general lack any way to exfiltrate your sensitive info? Might not be practical, but approaches like this appeal to me.
As someone who specifically started building Octarine, just for this reason, I understand.
Having to rely on random devs for the most basic functionality and passing it off as `community does what it wants` is weird. Either add it in yourselves, or accept the fact that, given your app requires external contributors to work at a little above the basic level, there are going to be security issues.
Writing a whole blog post, and throwing shade on "other apps" that have far more dependencies than Obsidian is weird to me.
Anyway, it seems like you can't really talk badly about them, since there's a huge following that just comes at you, and that feels weird, 'cause apparently they can throw shade but others can't talk back.
I use “Templater” and “Dataview” but now I am rethinking my usage; they were required for the daily template I use (found here on HN) but this is probably overkill.
> could've opted to be more 'batteries-included', at the cost of more development effort, but instead it leaves this to the community, which significantly increases the attack surface
This app deals with very critical, personal, and intimate data – personal notes and professional/work-related notes – but proudly has an Electron app. This alone has seemed like a massive red flag to me.
There are better alternatives. It's just that people have convinced themselves they need the features Obsidian offers - because it makes them feel smart and important.
At the end of the day, you're just taking notes. If you write a journal, don't put it in something like Obsidian. Even Apple Notes is better (in security, privacy, etc) in this regards.
Point is, you don't need Obsidian (or all of its plugins). People have been making do with Dropbox and plain text (.txt) files perfectly fine for years.
Wow I never knew I "can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem".
Plain-text folder on a cloud sharing service. Edit with notepad.exe or whatever editor you prefer. Others have been doing it with .doc files forever, or .rtf.
It's no worse than vscode. Sure there's permissions, but it's super common for an extension to start a process and that process can do anything it wants.
Because it is one of the most popular dev tools out there? If not the most popular. It also uses Electron, like Obsidian. Has thousands of plugins, like obsidian.
Plus, vscode is maintained by a company with thousands of devs; Obsidian is fewer than 10 people, which is amazing. As for plugins, why blame the product? Please check what you install on your machine instead.
My personal take is that the only way to be reasonably sure you're OK is to install as few apps as possible and then as few plugins as possible (and ideally stick to the bundled ones only). I don’t think it’s controversial, but for some reason this is not how many people think, even if in the real world you don’t give keys to your place to everyone who says they’re cool :)
Among others, this is a big reason I want effect systems to gain more attention. After having seen them, the idea that in most languages, the only option is that any function can do anything without keeping track of what it affects in its type signature is bonkers to me.
I agree Obsidian plugins do nothing about safety. But I'm not sure "most users use plugins", that's not my impression from reading the subreddit. I wonder if there's any data on it?
Is this true? Is there any source on how many Obsidian users use third-party plugins? For one, I don't. Moreover, Obsidian by default runs in "restricted mode", which does not allow community plugins. You have to specifically enable it to be able to install community plugins, hence I assume somebody who does that understands the risks involved. How many people even get as far as enabling that?
For me it is not even about security foremost; the whole appeal of markdown is simplicity and interoperability. The more I depend on "plugins", the more I am locked into this specific platform.
Operating systems are different though, since their whole purpose is to host _other_ applications.
FWIW, MacOS isn't any better or worse for security than any other desktop OS tbh....
I mean, MacOS just had its "UAC" rollout not that long ago... and not sure about you, but I've encountered many times where someone had to hang up a Zoom or browser call because they updated the app or OS and had to re-grant screenshare permissions or something. So, not that different. (Pre-"UAC" versions of MacOS didn't do any sandboxing when it came to user files / device access.)
Yes, on desktop, Obsidian plugins can access files on your system, unless you run it in a container. On iOS, iPadOS, and Android the app is sandboxed so plugins are more constrained.
This is not unique to Obsidian. VS Code (and Cursor) work the same way despite Microsoft being a multi-trillion dollar company. This is why Obsidian ships in restricted mode and there's a full-screen warning before you turn on community plugins.
VS Code and Obsidian have similar tradeoffs, both being powerful file-based tools on the Electron stack. This fear about plugins was raised on the Obsidian forums in 2020 when Obsidian was still new, and Licat explained[1] why it’s not possible to effectively sandbox plugins without making them useless.
So... what do you do?
The drastic option is to simply not use community plugins. You don't have to leave restricted mode. For businesses there are several ways to block network access and community plugins[2]. And we're currently planning to add more IT controls via a policy.json file[3].
The option of using Obsidian without plugins is more viable in 2025 than it was in 2020, as the app has become more full-featured. And we're now regularly doing third-party security audits[4].
But realistically, most people want to run community plugins, and don't have the technical skills to run Obsidian in a container, nor the ability and time to review the code for every plugin update.
So the solution that appeals to us most is similar to the "Marketplace protections"[5] that Microsoft gradually implemented for VS Code. For example, implementing a trusted developer program, and automated scanning of each new plugin update. We plan to significantly revamp the community directory over the coming year and this is part of it.
Note that Obsidian is a team of 7 people. We're 100% user-supported[6] and competing with massive companies like Microsoft, Apple, Google, etc. Security audits are not cheap. Building an entire infrastructure like the one I described above is not easy. We're committing to doing it, but it wouldn't be possible without our supporters.
Yes, you are responsible for all the code you ship to your users. Not pinning dependencies is asking for trouble. It is literally, "download random code from the Internet and hope for the best."
Pinning dependencies also means you're missing any security fixes that come in after your pinned versions. That's asking for trouble too, so you need a mechanism by which you become aware of these fixes and either backport them or upgrade to versions containing them.
Things like dependabot or renovate solve the problem of letting you know when security updates are available, letting you have your cake and eat it too.
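For reference, the GitHub flavor is just a tiny YAML file:

    # .github/dependabot.yml
    version: 2
    updates:
      - package-ecosystem: "npm"
        directory: "/"
        schedule:
          interval: "weekly"  # version bumps on this cadence; security updates are a separate repo setting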
This statement is one of those useless exercises in pedantry like when people say "well technically coffee is a drug too, so..."
Code with publicly-known weaknesses poses exponentially more danger than code with unknown weaknesses.
It's like telling sysadmins to not waste time installing security patches because there are likely still vulnerabilities in the application. Great way to get n-day'd into a ransomware payment.
Have you spent time reviewing the security patches for any nontrivial application recently? 90% of them are worthless, the 10% that are actually useful are pretty easy to spot. It's not as big of a deal as people would like to have you think.
The real answer is to minimize dependencies (and subdependencies) to the greatest extent practical. In some cases you can get by with surprisingly few without too much pain (and in the long run, maybe less pain than if you'd pulled in more).
Yep, and for the rest I've gotten a lot of mileage, when shipping server apps, by deploying on Debian or Ubuntu* and trying to limit my dependencies to those shipped by the distro (not snap). The distro security team worries about keeping my dependencies patched and I'm not forced to take new versions until I have to upgrade to the next OS version, which could be quite a long time.
It's a great way to keep lifecycle costs down and devops QoL up, especially for smaller shops.
*Insert favorite distro here that backports security fixes to stable package versions for a long period of time.
No. "Always downloading random code and hoping" is not the only option. Even w/ the supply-chain shitshow that the public npmjs registry has become, using pnpm and a private registry makes it possible to leverage a frozen lockfile that represents the entire dependency graph and supports vulnerability-free reproducible builds.
EDIT to add:
Of course, reaching a state where the whole graph is free of CVEs is a fleeting state of affairs. Staying reasonably up-to-date and using only scanned dependencies is an ongoing process that takes more effort and attention to detail than many projects are willing or able to apply; but it is possible.
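Concretely, that's about two lines of setup (the registry URL is a placeholder):

    # .npmrc — route all installs through the vetted internal registry
    registry=https://npm.internal.example.com/

    # CI: install exactly what the lockfile says, or fail the build
    pnpm install --frozen-lockfile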
This. It would be a partial improvement. A greater improvement would be rewriting it natively per platform, conscientiously sandboxing plugins, and trimming the supported "js" language to a strict subset that doesn't allow arbitrary file, network, or system operations unless signed and approved entitlements are granted.
> We do not run postinstall scripts. This prevents packages from executing arbitrary code during installation.
I get the intent, but I’m not sure this really buys much. If a package is compromised, the whole thing is already untrustworthy and skipping postinstall doesn’t suddenly make the rest of the code safe. If it isn’t compromised, then you risk breaking legitimate installation steps.
From a security perspective, it feels like an odd tradeoff. I don’t have hard data, but I’d wager we see far more vulnerabilities patched through regular updates than actual supply-chain compromises. Delaying or blocking updates in general tends to increase your exposure rather than reduce it.
I agree but there's a bit of nuance here. Today scanning steps typically happen post install, which is wild but the status quo. Therefore preventing anything from running during install is desirable.
I'd like to see the ability to scan/restrict as part of the installation step become popular, there are some proprietary tools that do this already but it's not yet a common capability.
Yes. For instance, when we had that crypto malware npm fiasco a few days back, I happened to be updating one of my packages. The audit lit up with dozens of critical issues, but of course this was after it had installed everything. Luckily I had disabled install scripts, so it became a matter of not running the code until I could revert everything.
It does protect the build machine though. Seems like quality, low-hanging security fruit to me. If I want to casually hack on some random web app, I don’t have to worry about giving arbitrary scripts running from the ~4 gazillion dependencies.
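For anyone wanting the same guardrail, it's a one-liner for npm (pnpm has equivalents, and iirc newer pnpm won't run dependency scripts at all unless you allowlist them):

    # .npmrc — never let packages run code at install time
    ignore-scripts=true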
I’ve been using other apps than Obsidian for notes and sharing, so this is nice to read and consider. But isn’t Obsidian an electron app or whatever? Electron has always seemed resource intensive and not native. JavaScript has never struck me as “secure”. Am I just out of touch?
JavaScript is a very secure language. The browser is a massive success at running secure JavaScript on a global scale. Every website you use is running JavaScript and not able to read other site data. Electron is the same, running v8 to sandbox JavaScript. Assuming you aren't executing user input inside that sandbox (something many programming languages allow, including JS), it's very secure.
The problem with supply chain attacks is specifically related to npm, and not related to JS. npm as an organization needs to be taking more responsibility for the recent attacks and essentially forcing everyone to use more strict security controls when publishing their dependencies.
Doesn’t this mean browser sandboxing is secure, not JS? Or are you referring to some specific aspect of JS I’m not aware of? (I’m not aware of a lot of JS)
It’s maybe a nit-pick, since most JS is run sandboxed, so it’s sort of equivalent. But it was explicitly what GP asked for. Would it be more accurate to say Electron is secure, not JS?
Turing completeness is irrelevant, as it only addresses computation. Security has to do with system access, not computational capacity. Brainfuck is Turing complete, but lacks any primitives to do more than read from a single input stream and write to a single output stream. Unless someone hooks those streams up to critical files, you can't use it to attack a system.
Language design actually has a lot of impact on security, because it defines what primitives you have available for interacting with the system. Do you have an arbitrary syscall primitive? Then the language is not going to help you write secure software. Is your only ability to interact with the system via capability objects that must be provided externally to authorize your access? Then you're probably using a language that put a lot of thought into security and will help out quite a lot.
A number of operating system security features, such as ASLR, exist because low level languages allow reading and writing memory that they didn't create.
Conversely, barring a bug in the runtime or compiler, higher level languages don't enable those kinds of shenanigans.
See for example the Heartbleed bug, where OpenSSL would read memory it didn't own when given a suitably malformed request.
I mean, JavaScript doesn’t even have APIs for reading a file from disk, let alone executing an arbitrary binary. (Anything similar comes from a runtime like NodeJS.) You can’t access memory in different JS processes… so what would make it insecure?
To be fair, a plugin system built on JS with all plugins interacting in the same JS context as the main app has some big risks. Any plugin can change definitions and variables in the global scope, with some restrictions. But any language where you execute untrusted code in the same context/memory/etc. as trusted code has risks. The only solution is sandboxing plugins.
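To make the risk concrete, here's the kind of thing any same-context plugin could do on load (a toy version that only logs; a hostile one would copy the data elsewhere):

    // Wrap a global the app itself relies on:
    const realFetch = window.fetch.bind(window);
    window.fetch = (input: RequestInfo | URL, init?: RequestInit) => {
      console.log("plugin observed a request to:", String(input));
      return realFetch(input, init);
    };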
None of this makes JS secure, nor does it follow from the language being secure. Security is far and away predominantly a matter of how a language is used, not a characteristic of the language itself. "Safety" helps, but you can still easily write and package unsafe and insecure code in "safe" languages like Rust, just as you can in C, JS, Python, etc etc etc.
Not a huge Electron fan (thank god for Tauri), but Obsidian is a fantastic app and you shouldn't let the Electron put you off of it. You can even hook an MCP up to it and an agent can use it as a personal knowledge base; it's quite handy.
I think I would prefer to see official support for major package managers, even with unofficial repos (Debian, MacPorts, ...). We went from a time when software was usually tarballed to one where devs are encouraging piping to shell.
There’s some advice that’s been going around lately that I’ve been having trouble understanding: the idea that you should not be updating your dependencies when new patches are released (e.g., X.X.PATCH).
I understand that not updating your dependencies when new patches are released reduces the chance of accidentally installing malware, but aren’t patches regularly released in order to improve security? Wouldn’t it generally be considered unwise to not install new patches?
There's a key missing piece to this puzzle: being informed about _why_ you're updating and what the patches are.
Nobody has time to read source code, but there are many tools and services that will tell you brief summaries of release notes. Npm Audit lists security vulnerabilities in your package versions for example.
I do adopt the strategy of not updating unless required, as updates are not only an attack vector, but also an extremely common source of bugs that I'd prefer to avoid.
But importantly I stay in the loop about what exploits I'm vulnerable to. Packages are popping up with vulnerabilities constantly, but if it's a ReDoS vulnerability in part of the package I definitely don't use or pass user input to? I'm happy to leave that alone with a notice. If it's something I'm worried another package might use unsafely, with knowledge of the vulnerability I can decide how important it is, and if I need to update immediately, or if I can (preferably) wait some time for the patch to cook in the wild.
That is the important thing to remember about security in this context: it is an active, continuous, process. It's something that needs to be tuned to the risk tolerance and risk appetite of your organisation, rather than a blanket "never update" or "always update" - for a well-formed security stance, one needs more information than that.
Exactly. If you can avoid having to do _any_ patches except those that have a security purpose you've already reduced your risk to supply chain attacks considerably.
This isn't trivial to organise though, since semver by itself doesn't denote whether a patch is security-related or not. Of course, you can always review the release notes, but this is time-consuming and doesn't scale well when a product grows either in size of code base or community support.
This is where there's a fairly natural place for SAST (e.g., Semgrep or Snyk; many more, but these are the two I've used the most, in no particular order) and supply chain scans to fall into place, but they're prohibitively expensive.
There is a lot of open source tooling out there that can achieve the same too of course.
I've found that overhead/TOIL climbs roughly linearly with the number of open source tools you commit to for a security baseline. Unfortunately, this realistically means that at most companies, where time is scarcer than money, more money shifts into closed source products like those I listed, rather than stacks built from open source tools.
On the ReDoS aspect, I find the current CVSS rating system lacking. Availability is important, but following secure coding principles (fail closed), I'd much rather my system go down than have integrity/confidentiality compromised.
It's frustrating that a potential availability issue often gets the same (high) rating as an integrity/confidentiality issue.
I believe it's about waiting a bit before a new patch is released, not fully avoiding installing updates. Seems like compromises are being caught quickly these days, usually within hours. There are multiple companies monitoring npm package releases because they sell security scanning products and so it's part of their business to be on top of it.
pnpm has a setting where you can tell it that a package needs to be at least X minutes old in order to install it. I would wait at least 24 hours just to be safe.
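If I'm remembering the name right, it's minimumReleaseAge in recent pnpm versions (double-check against the docs for your version):

    # pnpm-workspace.yaml
    minimumReleaseAge: 1440   # minutes; refuse versions published <24h ago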
The attack that hit my packages two weeks ago was a patch release, taking advantage of this exact assumption. Wasn't a Post-Install script either.
With all of the latest in automated scanning and whatnot, this is more or less a moot point. You'll know when a package is vulnerable, and the alarm bells are loud and unambiguous. I really agree, and have always pushed the point, that version ranges are the worst things you can have if you care about supply chain attacks.
> Only a handful of packages are part of the app you run, e.g. Electron, CodeMirror, moment.js.
So they ship an extremely bloated package to ship a WebView based on one of the most complex pieces of software ever written, an entire code editor for text editing, and a deprecated time library that could be substituted by newer APIs and some glue code?
Honestly, it doesn't seem impressive at all. What Obsidian does is the bare minimum of how we should manage packages in any piece of software, not a testament to a serious security policy. They do security audits, though, which I find to be a good practice.
I love Obsidian dearly, but if you build an app that's only really useful with plugins, and that has a horrifyingly bad security model for plugins and little to no assurance of integrity of the plugins...
Maybe, just maybe, don't give full-throated advice on reducing risk in the supply chain.
Going to preface this post by saying I use and love Obsidian, my entire life is effectively in an Obsidian vault, I pay for sync and as a user I'm extremely happy with it.
But as a developer, this post is nonsense and extremely predictable [1]. We can expect countless others like it that explain how their use of these broken tools is different, and just don't worry about it!
By their own linked Credits page there are 20 dependencies. Let's take one of those, electron, which itself has 3 dependencies according to npm. Picking one of those electron/get has 7 dependencies. One of those dependencies got, has 11 dependencies, one of those cacheable-request has 7 dependencies etc etc.
Now go back and pick another direct dependency of Obsidian and work your way down the dependency tree again. Does the Obsidian team review all of these, and who owns them? Do they trust each layer of the chain to pick up issues before they get to them? Any one of these dependencies can be compromised. This is what makes it a supply chain attack: you only have to quietly slip something into any one of these dependencies to get access to countless users' critical data.
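You can eyeball this in any checkout yourself:

    npm ls --all | wc -l        # rough line count of the full installed tree
    npm ls cacheable-request    # trace which chain pulls in a given package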
Coincidentally I did that yesterday. Mermaid pulls in 137 dependencies. I love Obsidian and the Obsidian folks seem like good people but I did end up sandboxing it.
To be fair, the Electron project likely invests some resources in reviewing its own dependencies, because of its scale. But yeah, this is a good exercise; I think we need more systems like Yocto which prioritize complete understanding of the entire product from source.
Absolutely love Obsidian but had to stop using it because Electron apps don't play well with Wayland. After lots of tinkering with flags and settings for compatibility layers, it became obvious that it would never work as seamlessly as it did on Windows (and probably does on X11). So it was either give up Wayland compositors or give up Obsidian. Luckily I don't use any plugins, so moving to other software was easy, but I would still prefer Obsidian. Electron's "works everywhere" works about as well as Java's "works everywhere", which is to say it works great, until it doesn't, at which point it's a mess of tinkering.
If you use Wayland and it works for you, that's great, but it's not my experience.
In my experience electron + Wayland was absolutely god awful for a long time, but it got dramatically better in the last 4-5ish months. So depending on when you last tried it, might be worth a revisit. Heavily depends on which GPU+DE though, Nvidia+Plasma here.
I went from Evernote to Joplin to Obsidian and really like the model of md files with some logic added on top. Initially I used a lot of plugins mostly out of curiosity, but now it's mostly RemotelySave for syncing and some small convenience plugins, so I hope my exposure isn't too high.
The app doesn't have the bloaty feel of other Electron apps, but if there was a good more native alternative that fits my bill (md files editable in a graphical mode with good wikilink-syntax support and similar search etc), then I would actually consider switching.
If the obsidian team did a 2 hour q&a livestream every week, I'd watch every one (or at least get the AI summary). One of my favorite pieces of software ever.
To be honest, right now I'm thinking about isolating the build process for frontend work in my local environment. It seems like it would not be hard to send my local environment variables like OPENAI_API_KEY or .ssh/* to some remote machine.
I know it is not very different compared to Python or projects in any other language. But I don't feel that I can trust the node/js community at this point.
Switching to Deno might help. It's sandboxed by default and offers granular escape hatches. So if a script needs access to a specific environment variable or read or write specific files, it's simple to configure that only those accesses are allowed.
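For example, something along these lines (the script name and allowlist values are illustrative):

    deno run \
      --allow-env=OPENAI_API_KEY \
      --allow-read=./src --allow-write=./dist \
      --allow-net=api.example.com \
      build.ts

Anything not on those lists (other env vars, ~/.ssh, arbitrary hosts) is blocked, or prompted for, at runtime.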
These practices are very similar to what I've done in the past, for a large, sensitive system, and they worked very well.
(IIUC, we actually were the first to get a certain certification for cloud deployment, maybe because we had a good handle on this and other factors.)
From the language-specific network package manager, I pulled the small number of third-party packages we used into the filesystem tree of the system's repo, and audited each new version. And I disabled the network package manager in the development and deployment environments, to make it much harder for people to add dependencies accidentally.
Dependencies outside this were either from the Linux distro (nice, because well-managed security updates), or go in the `vendor` or `ots` (off-the-shelf) trees of the repo (and are monitored for security updates).
Though, I look at some of the Python, JS, or Rust dependency explosions I sometimes see -- all dependent on being hooked up to the language's network package manager, with many people adding these cavalierly -- and it becomes a much harder problem.
I was talking to our chief architect about a blog post about our zero-dependency home-grown HTTP server[0]; the project just hit 1.2 and uses virtual threads. I'm generally a fan of "don't reinvent the wheel"[1], but I think there are some cases where that control is worth the cost.
Defending against security vulnerabilities is definitely one of them, as long as you make the proper investments to harden what you write.
Joel Spolsky wrote about some other reasons too[2].
Can’t wait for “implements mechanism to delay application of new patches” to start showing up compliance checklists. My procrastination will finally pay off!
This is obviously the way to do it, assuming you have the skills and resources to operate in this manner. If you don't, then godspeed, but you have to know going in that you are trading expediency now for risk later. Risk of performance issues, security vulnerabilities, changes in behavior, etc. And when the mess inevitably comes, at whatever inopportune time, you don't really get to blame other people...
An alternative for those who want a native application and/or even less supply-chain risk is Zim [1], which uses GTK and is packaged by the major Linux distributions.
Zim doesn't have a native phone app and syncing, though, and that's a big draw of Obsidian. It's plenty secure if you don't install plugins all willy-nilly.
I installed an AppArmor profile for Obsidian. For an application that displays text files, it needed a lot of permissions. It would refuse to run without network access.
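For anyone attempting the same, the overall shape is something like this (paths illustrative, far from complete):

    # /etc/apparmor.d/opt.Obsidian.obsidian — rough sketch only
    abi <abi/3.0>,
    include <tunables/global>

    /opt/Obsidian/obsidian {
      include <abstractions/base>
      network,                              # it refuses to run without this
      owner @{HOME}/Vaults/** rw,           # the vault itself
      owner @{HOME}/.config/obsidian/** rw,
    }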
You can install Obsidian flatpak and lock it down with flatseal.
It has been so rewarding and timely to see this post.
I just decided on Thursday after years of covering my ears and eyes from my obsidian-obsessed friends and coworkers that the tool just didn’t make sense to me, and I felt like I’d be in plugin purgatory on my computer for eternity.
I’ve ended up going with Reflect, after ~forever using Apple Notes primarily. So far so good, but I genuinely felt for so long I was supposed to love Obsidian because that’s the trope - appears that’s changing.
It's amazing how Google, Mozilla, Apple, etc can collaborate to come up with web standards, so they can ship browsers that display apps/webpages pretty much the same across multiple platforms. But Microsoft, Apple, Canonical, Google, etc can't collaborate to come up with apis that allow everyone to make desktop apps once and they display mostly the same on all platforms.
I use Emacs and Org-Roam, and some Emacs packages. It is hosted in a VM that is not connected to the internet (a Qube in Qubes OS). I just cannot review all the code running in Emacs.
My assumption is that most people on HN are making programmer money. $4 - 5 USD per month is affordable even on a junior engineer’s salary in many parts of the world.
The price per GB isn’t as good as the services you mentioned, but their storage limits are fine for the primary use case — storing a lot of plain text notes.
I’ve also had no problems with it, in contrast with iCloud which has routinely gotten stuck for me.
And if price per GB is what you care most about, use something else. That’s one of the great things about Obsidian.
The plain text thing is more of a feel-good argument than a practical one. If there’s a solid export path, the format isn’t really the issue... what matters is whether the app actually works the way you need it to. At the end of the day, your workflow lives or dies on how the software behaves... not on the file extension.
Was hoping they outlined their approach to handling potentially compromised packages running on dev machines prior to even shipping. That seems like a much harder problem to solve.
Really? Is this some kind of perverted joke?
Electron based thing wants to brag about less being safer?
Get rid of the browser, then we can talk about less.
But still along the same lines as "safer". The stresses are different: "safer" is stressed "SAY-fer" and "secure" is stressed "sih-KYOOR". The latter sounds more similar to (and rhymes better with) "more", the originator of the phrase "less is more".
> Obsidian has a low number of dependencies compared to other apps in our category
Whataboutism. Relative comparisons don't address absolute risk. I checked three random packages: prism pulls 22, remark pulls 51, pixijs 179! So that's 250+ transitive dependencies just from those.
> Features like Bases and Canvas were implemented from scratch instead of importing off-the-shelf libraries. This gives us full control over what runs in Obsidian.
Full control? There are still hundreds of dependencies.
> This approach keeps our dependency graph shallow with few sub-dependencies. A smaller surface area lowers the chance of a malicious update slipping through.
> The other packages help us build the app and never ship to users, e.g. esbuild or eslint.
Build tools like esbuild don't ship to users, but a compromised build tool can still inject malicious code during compilation. This is supply chain security 101.
> All dependencies are strictly version-pinned and committed with a lockfile
Version pinning has been, I would hope, standard practice in any professional development team for years and years. It prevents accidental updates but doesn't stop compromised existing versions.
> When we do dependency updates, we:
> [snip]
While these practices are better than nothing, they don't fundamentally address the core issue.
> That gap acts as an early-warning window: the community and security researchers often detect malicious versions quickly
According to whom? Heartbleed, a vulnerability in a package with far more scrutiny than a typical npm module took what, 2 years to be found? The "community detection" assumption is flawed.
I'm not trying to put Obsidian down here; I sympathize (aside from implementing everything themselves, what can they do?). I'm trying to point out that while their intent is good, this is a serious problem and their solution is not a solution.
Of course, it's the same in any project with dependencies. It's the same in other languages as well - if they have a convenient package manager. Like Rust and Cargo.
This problem came with the convenience of package managers and it should be fixed there, not by every application like Obsidian. I'm not sure how but maybe once a package is starting to become popular, additional security measures must be put in place for the author to be able to continue to commit to it. Signing requirements, reproducible builds, 2fa, community reputation systems, who knows.
Individual applications can't solve supply chain security through wishful thinking and version pinning.
Package managers need to solve this at the infrastructure level through measures like mandatory code signing, automated security auditing, dependency isolation, or similar system level approaches.
Obsidian's practices are reasonable given the current tooling limitations, but they don't eliminate the fundamental risks that the package managers bring to modern dependency ecosystems.
> It may sound obvious but the primary way we reduce the risk of supply chain attacks is to avoid depending on third-party code.
What a horribly disingenuous statement, for a product that isn't remotely usable without 3rd-party plugins. The "Obsidian" product would be more aptly named "Mass Data Exfiltration Facilitator Pro".
I have to agree that I don't find plugins necessary, and I'm not sure why you're so down on people using a solid backlinking note taker. I don't think I have low standards, I think Roam and Logseq aren't that great and Obsidian is all I need.
A more charitable interpretation would be that that have different needs. My keyboard costs more than my computer, but most people probably spend $15-$50 on a keyboard. Even my mouse is well outside that range. Do I have high standards or do I have tendonitis?
Software usability, in this context, is measured objectively; there is no interpretation. This is separate from a specific user's preferences, the ergonomics of their hardware, etc. As for your high standards vs. tendonitis distinction, I'd say these things are not mutually exclusive, and the comparison is not related to what we're talking about.
That's just...what? It's highly usable without plugins. Yes, I use plugins...but that's by choice. Obsidian is still a superior Markdown editor with backlink support, plugins or not.
I think you and your fellow commenters are missing the point. The degree to which Obsidian is "a superior Markdown editor with backlink support" can be debated. What I'm saying is that actual usability, in this context, is not a matter of opinion -- see ISO 9241-210, ISO/IEC 25010, etc. Having said that, I'm glad you're happy.
This doesn't make any sense to me. I've always been told you don't write anything yourself unless you absolutely have to and having a million micro-dependencies is a good thing. JavaScript and now Rust devs have been saying this for years. Surely they know what they're doing...
There is a balance to be struck. NPM in particular has been a veritable dependency hell for a long time. I don't know if it just attracts inexperienced developers, or if its security model is fundamentally flawed, but there have been soooo many supply chain attacks using NPM that being extra careful is very much warranted.
Right, I get that. I was just making a joke because I really dislike the micro-dependency approach. Honestly, it bothers me more in Rust than JavaScript, but that's probably just because I'm not a web dev.
From my buddies who have worked in JS, my understanding is that a lot of it is rooted in JavaScript having had a really shitty standard library for a long time. So that led to a culture of code sharing early on, since everyone was rewriting the same utilities and whatnot. Which evolved into the package management ecosystem over time.
Why that came about in something like JavaScript but not C, which also obviously has a really small std, is super interesting. My only guess is that JavaScript came up with the internet, when package managers started to take on a new meaning, and web dev is more relevant to a broader range of (nontech) companies, which are more likely to put a huge focus on quality over quantity.
I've been focused on writing software the last couple of weeks. What is obsidian again? I can't find a simple "what is obsidian?" FAQ on their site. Is it a browser or a node replacement like deno? Or an AI library? Clearly obsidian has plugins, but what are they in service of?
If it's a browser, they should have something on their web site that says "obsidian is a really cool browser." I think there are a lot of people out there who are ignoring the hype-train and it would do the community a service if they just started with answering that simple question. I mean sure, I get it, it's a "sharpen your thinking app," but I'm not sure what that means.
This is one way to look at it, but ignores the fact that most users use third party community plugins.
Obsidian has a truly terrible security model for plugins. As I realized while building my own, Obsidian plugins have full, unrestricted access to all files in the vault.
Obsidian could've instead opted to be more 'batteries-included', at the cost of more development effort, but instead leaves this to the community, which in turn increases the attack surface significantly.
Or it could have a browser extension like manifest that declares all permissions used by the plugin, where attempting to access a permission that's not granted gets blocked.
Both of these approaches would've led to more real security to end users than "we have few third party dependencies".
When I was young there were a few luminaries in the software world who talked about how there is a steady if small flow of ideas from video game design into conventional software.
But I haven't heard anyone talk like that in quite sometime (unless it's me parroting them). Which is quite unfortunate.
I think for example if someone from the old guard of Blizzard were to write a book or at least a novella that described how the plugin system for World of Warcraft functioned, particularly during the first ten years, where it broke, how they hardened it over time, and how the process worked of backporting features from plugins into the core library...
I think that would be a substantial net benefit to the greater software community.
Far too many ecosystems make ham-fisted, half-assed, hair-brained plugin systems. And the vast majority can be consistently described by at least two of the three.
I’ve been of the opinion that every hard problem in CS shows up somewhere in gamedev. It’s a great space for inspo.
Game dev also rewards people for applying the 80/29 rule effectively and you see less of that in commercial software.
In each game generation there’s a game that would be easy to write on the next or subsequent generation of hardware and is damned difficult to implement on the current one. Cleverness and outright cheating make it work, after all fashion.
80/29 rule is the paretypo principle?
Typo but yeah.
It reaches a dead end eventually. That's where we are, edge of speed, where the only mods left are aesthetics veering at photorealism.
The game simulation will get more detailed/granular as aesthetics dial down in perceived value. You can always go bigger/wider/more procedural/more multiplayer.
This is also why every hard problem eventually shows up — games are just simulation + interaction, and eventually everything that can be simulated will have some attempted implementation out there, struggling along. (For some reason, this does not appear to stop at “interesting” things to simulate — see all the literal simulators on steam)
The simulations have yet to release photo-realism in lieu of event-perception, where simulation parallels reality, but that's not really playable as a game, only as a view.
My team's approach to game dev is probably aimed at the hard problems in reality: behavior, ecology, language.
I came to learn that even though in process plugins are easier to implement, and less resource demanding, anyone serious about host stability and security can only allow for plugins based on OS IPC.
And in general, it will take less hardware resources that the usual Electron stuff.
Kernel design is (to me) another one where ideas have flowed into other software fields - there were monolithic kernels, micro kernels, and hybrid kernels, and they all need to work with third party modules (drivers)
The lessons from all fields seem to be relearnt again and again in new fields :-)
Because learning how to make a proper one requires building your own broken one first.
It might be slightly sped up by reading up on theory and past experiences of others.
I am around mid life and I see how I can tell people stuff, I can point people to resources but they still won’t learn until they hit the problem themselves and put their mind into figuring it out.
A lot of stuff we think is top shelf today was tried on mainframes in the late 80’s through the 90’s. Cloud computing is mostly recycled 90’s “fashion”.
See also people trying to bring Erlang back into fashion.
> Obsidian plugins have full, unrestricted access to all files in the vault.
Unless something has changed, it's worse than that. Plugins have unrestricted access to any file on your machine.
When I brought this up in discord a while back they brushed it aside.
Having recently read through a handful of issues on their forums, they seem to brush aside a lot of things. It's a useful tool, but the mod / dev team they have working with the community could use some training.
If you're using a flatpak, that's not actually the case. It would have very restricted access, to the point where you would even have to explicitly give it access to your /home.
You're wrong. The obsidian flatpak ships by default with access to /home. https://github.com/flathub/md.obsidian.Obsidian/blob/5e594a4...
Interesting, I thought I had to turn that on for Obsidian!
The first time I started installing flatpaks I ran into a bit of permission / device isolation trouble and ever since then, I use flatseal after installing an app to make sure it actually has access to things.
I guess I misremembered in the case of Obsidian.
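If you'd rather not rely on memory, Flatseal works, but so does a single command (using the Flathub app ID from the link above):

    flatpak override --user --nofilesystem=home md.obsidian.Obsidian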
I „love” such sandboxing defaults. Apps like Docker Desktop also share the whole home by default [1], which is pretty interesting if a big selling point is to keep stuff separated. No idea why node_modules need access to my tax returns :). Of course you can change that, but I bet many users keep the default paths intact.
[1] https://docs.docker.com/desktop/settings-and-maintenance/set...
Needed for volume mounting to work easily I assume.
Yeah, I forgot there’s the intermediate VM level, and user folders are shared there so that folders could be mounted to the individual containers using host paths.
So if I run their software in a container they can't access my entire filesystem. I don't think that is a security feature.
It sounds like if I ever run Obsidian I should be using Flatseal too.
Er, what?
I'm not claiming it's a security feature of Obsidian, I'm saying it's a consequence of running a flatpak - and in this situation it could be advantageous for those interested.
Sorry, it genuinely sounded to me like you were saying that it's not a problem because flatpak.
No, lol
What if you run Little Snitch and block any communications from Obsidian to anything?
Or firejail. Or QubesOS using a dedicated VM. There are options, but it would still be nice if Obsidian had a more robust security model.
I have been using firejail for most of these kinds of applications, be it Obsidian, Discord, or the browser I am using. I definitely recommend people start using it.
Sell it to us! Why do you use specifically firejail?
There are so many options, from so many different security perspectives, that analysis paralysis is a real issue.
I feel like I should keep track of all my comments on HN because I remember writing a lengthy comment on firejail more than once. I cannot keep doing this. :D
For user-space, there is usually bubblewrap vs. firejail. I have not personally used bubblewrap, so I cannot comment on that, but firejail is great at what it does.
The last comment was about restricting clipboard access to either X11 or Wayland which is possible with firejail quite easily, so if you want that, you can have that.
You can do a LOT more with firejail though.
https://wiki.archlinux.org/title/Firejail
https://man.archlinux.org/man/firejail.1
> bubblewrap vs. firejail
In case anyone else is curious, I found the following comparison in bubblewrap's repo.
- https://github.com/containers/bubblewrap#related-project-com...
I'm gonna try both and see which one I like. Thanks for this info! You're sure living up to your user name there. (:
So do you configure firejail to give each app their own separate, permanent home directories? Like "firejail --private=/home/user/firejails/discord discord", "firejail --private=/home/user/firejails/chromium chromium", and so on?
I have my own Discord.profile!
This is my ~/.config/firejail/Discord.profile[1]:
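It's along these lines (a trimmed-down sketch of the kind of directives involved, not the file verbatim):

    # ~/.config/firejail/Discord.profile (sketch)
    include globals.local

    # Only these paths stay visible; the rest of $HOME disappears.
    noblacklist ${HOME}/.config/discord
    mkdir ${HOME}/.config/discord
    whitelist ${HOME}/.config/discord
    whitelist ${DOWNLOADS}
    include whitelist-common.inc

    # Drop privileges and kernel attack surface.
    caps.drop all
    nonewprivs
    noroot
    nogroups
    protocol unix,inet,inet6,netlink
    seccomp
    private-tmp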
I have some things commented out, but you could probably uncomment most of it.
FWIW, once you start whitelisting, the app only has access to those directories and files, so Discord has no access to anything other than its own directory and ${DOWNLOADS} (which I should probably change). You should also check out the default profiles for many programs / apps under the "/etc/firejail" directory.
[1] You run it via "firejail Discord" or "firejail ./Discord" if you name it "Discord.profile".
This is great. Thanks for the detailed reply!
It was not THAT detailed and it makes me feel a bit guilty, so if you have any questions let me know.
FYI you can search your comment history with hn.algolia.com:
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
Thank you, exactly what I have been looking for!
Little snitch can block open(2)?
I treat LS as a privacy/anti-telemetry/anti-accident tool, not as anti malware.
Obviously it can detect malware if there’s a connection to some weird site, but it’s more like a bonus than a reliable test.
If you need to block FS access, then per-app containers or VMs are the way to go. The container/VM sandboxes your files, and Little Snitch can then manage external connectivity (you might still want to allow connections to some legit domains, but maybe not github.com, as that can be used to upload your data; I mean something like updates.someapp.com).
Very, very good point
I got lazy
Time to crank the paranoidmeter up again
ty
I believe they're saying it can open, it just can't send the data anywhere.
Seems a little excessive, but here we are.
It still can encrypt everything and demand you pay some ₿₿₿₿.
If it can open and write any file on the OS, it's pretty much game over. Too many ways to exfiltrate data even without network/socket access.
Worse, what keeps this from editing the config files for Little Snitch (or similar blockers)?
I believe LS has some protections against this. Never tried them, but there are config related security options, incl. protection against synthetic events. So they definitely put some thought into that.
File system permissions?
Is this true on Mac? Usually I am notified when programs request access outside the normal sandboxed or temp folders. Not sure how that works in any detail though.
To be fair it also ships with the ability to install community plugins disabled.
Ah, I guess that's one reason some folks started running it in a docker container. I think LinuxServer recently released a container for it.
To be fair, it’s no worse of a dumpsterfire than any other plug-in ecosystem.
Funny enough, I thought this earlier about Arch Linux and its derivatives. It was mentioned on reddit that they operate on a small budget. A maintainer replied that they have very low overhead, and the first thought that popped into my mind was that most of the software I use and rely on comes from the AUR, which relies on the user to manage their own security.
If engineers can't even manage their own security, why are we expecting users to do so?
I'm shocked it is most of your software. I think I have under a dozen AUR packages. It has been that way for about a decade. I added a couple for gaming recently (mostly because Lutris just crashes for me), but nearly all of my software comes from the official repos.
Same for me. I learned about AUR before installing Arch, but went months before installing my first package from there.
I think this criticism is unfair because most common packages are covered by the core and extra repos which are maintained by Arch Linux. AUR is a collection of user build scripts and using it has a certain skill cliff such that I expect most users to have explicit knowledge of the security dangers. I understand your concern but it would be weird and out of scope for Arch to maintain or moderate AUR when what Arch is providing here amounts to little more than hosting. Instead Arch rightly gives the users tools to moderate it themselves through the votes and comments features. Also the most popular AUR packages are maintained by well known maintainers.
The derivatives are obviously completely separate from Arch and thus are not the responsibility of Arch maintainers.
Disagree. AUR isn’t any trickier than using pacman most of the time. Install a package manager like Yay or Paru and you basically use it the same way as the default package manager.
It’s still the same problem, relying on the community and trusted popular plugin developers to maintain their own security effectively.
I understood GP's point to be that because Obsidian leaves a lot of functionality to plugins, most people are going to use unverified third party plugins. On Arch, however, most packages are in core or extra, so most people won't need to go to the AUR. They are more likely to install the flatpak or get the AppImage for apps not in the repos, as that's much easier.
yay or paru (or other AUR helpers afaik) are not in the repos. To install them one needs to know how to use the AUR in the first place. If you are technical enough to do that, you should know about the security risks, since almost all tutorials for the AUR come with security warnings. It's also inconvenient enough that most people won't bother.
In Obsidian, plugins can seem central to the experience, so users might not think much of installing them; in Arch, the AUR is very much a non-essential component. At least that's how I understand it.
> It's also inconvenient enough that most people won't bother.
> in Arch, the AUR is very much a non-essential component.
While somewhat true, we are talking about a user who has installed Arch on their machine. If a user wanted to not bother with installation details, they would've installed Ubuntu.
The Arch-based distros that most normies will install have AUR helpers installed by default.
I can't even install Brave without the AUR.
> If engineers can't even manage their own security, why are we expecting users to do so?
This latest attack hit Crowdstrike as well. Imagine they had gotten inside Huntress, who opened up about how much they can abuse the access given: https://news.ycombinator.com/item?id=45183589
Security folks and companies think they are important. The C-suite sees them as a scapegoat for WHEN the shit hits the fan, and most end users feel the same about security as they do about taking off their shoes at the airport (what is this nonsense for?), and they mostly aren't wrong.
It's not that engineers can't take care of their own security. It's that we have made it a fight with an octopus rather than something that is seamless and second nature. Furthermore, security and privacy go hand in hand... Teaching users that is not to the benefit of a large portion of our industry.
> It's not that engineers cant take care of their own security.
I dunno. My computer has at least 1 hardware backdoor that I know of, and I just can't get hardware without an equivalent exploit.
My OS is developed with a set of tools that is known to make code review about as hard as possible. It provides the bare minimum application insulation. And it is 2 orders of magnitude larger than any single person can read in their lifetime. It's also the usable OS out there with the best security guarantees; everything else is much worse or useless.
A browser is almost a complete new layer above the OS. And it's 10 times larger. Also written in a way that famously makes review impossible.
And then there are the applications, which is what everybody is focusing on today. Keeping them secure is close to useless if one doesn't fix all of the above.
You never actually told us what your OS is.
Because that would be a distraction to the point they're actually making.
The point is thoroughly undermined since we can't judge the veracity of their claims
And discussing the specifics of whatever OS GP uses is exactly the type of OT he was wise enough to avoid.
Personally, I think he uses Emacs.
They must mean macos, right?
I think you could find a dozen different operating systems that someone, somewhere, would say similar about.
I'm developing an Obsidian plugin commercially. I wish there was a higher tier of vetting available to a certain grade of plugin.
IMO they should do something like the AUR on Arch Linux and have a community-managed plugin repo and then a smaller, more vetted one. That would help with the plugin review time too.
Just out of curiosity, what's the plugin? Are there folks interested in paying for plugins?
The plugin is called Relay [0] -- it makes Obsidian more useful in a work setting by adding real-time collaboration.
One thing that makes our offering unique is the ability to self-host your Relay Server so that your docs are completely private (we can't read them). At the same time you can use our global identity system / control plane to collaborate with anyone in the world.
We have pretty solid growth, a healthy paid consumer base (a lot of students and D&D/TTRPG), and starting to get more traction with businesses and enterprise.
[0] https://relay.md
Are you worried about being sherlocked at all? I know "multiplayer" is on their official roadmap.
yeah, definitely.
It might not be the most strategic move, but I want to build cool and useful tools, and the Obsidian folks are a big inspiration.
I hope there's a way to collaborate and/or coexist.
This open letter seems relevant here: https://www.emilebangma.com/Writings/Blog/An-open-letter-to-...
I think it's a matter of time until we see a notable plugin in the Obsidian space get caught exfiltrating data. I imagine then, after significant reputational harm, the team will start introducing safeguards. At a minimum, create some sort of verified publisher system.
Don’t most plugin models work this way? Do VSCode, Vim, Emacs, and friends do anything to segregate content? Gaming is the only area where I expect plugins to have limited permissions.
Browser extensions also have a relatively robust permissions-based system.
If they wanted to, one would guess that browser-ish local apps based on stuff like Electron/node-webkit could probably figure out some way to limit extension permissions more granularly.
I would have thought so, but it has been how many years, and as far as I know there is still no segregation for VSCode extensions. Microsoft has all the money, and if they cannot be bothered, I am not encouraged that smaller applications will be able to iron out the details.
I think it's just because supply-chain attacks are not common enough / their attack surfaces not large enough to be worth the dev time... yet...
Sneak in a malicious browser extension that breaks the permissions sandbox, and you have hundreds of thousands to millions of users as an attack surface.
Make a malicious VSCode/IDE extension and maybe you hit some hundreds or thousands of devs, a couple of smaller companies, and probably can get on some infosec blogs...
>Make a malicious VSCode/IDE extension and maybe you hit some hundreds or thousands of devs, a couple of smaller companies, and probably can get on some infosec blogs..
Attackers just have to hit one dev with commit rights to an app or library that gets distributed to millions of users. Devs are multipliers.
The time has come. The nx supply chain attack a couple weeks ago literally exfiltrated admin tokens from your local dev machine, because the VS Code extension for nx always downloaded the latest version of nx from npm. And since nx is a monorepo tool, it's more applicable to larger projects with more valuable tokens to steal.
The solution at my job is you can only install extensions vetted by IT and updates are significantly delayed. Works well enough but sucks if you want one that isn't available inside the firewall.
>Browser extensions also have a relatively robust permissions-based system.
Yeah and they suck now. We need a better security model where it's still possible to do powerful stuff on the whole machine (it's MY computer after all) without compromises.
>We need a better security model where it's still possible to do powerful stuff on the whole machine
That's not possible. If you can do powerful stuff on the whole machine by definition you have no security. Security is always a question of where you create a perimeter. You can hand someone a well defined box in which they can do what they want, you can give someone broader access with fewer permissions, but whether vertically or horizontally to have security is to exercise control and limit an attack surface.
That's even implicit in the statement that it's YOUR computer. The justification being that there's a dividing line between your computer and other computers. If you were part of a network, that logic would cease to hold. Same when it comes to components on your machine.
vim and emacs are over 30 years old and therefore living with an architecture created when most code was trusted. Encrypting network protocols was extremely rare, much less disks or secrets. I don't think anything about the security posture of vim and emacs should be emulated by modern software.
I would say VSCode has no excuse. It's based on a browser which does have capabilities to limit extensions. Huge miss on their part, and one that I wish drew more ire.
I'd love to see software adopt strong capability-based models that enforce boundaries even within parts of a program. That is, with the principle of least authority (POLA), code that you call is passed only the capabilities you wish (e.g., opening a file, or a network socket), and not everything that the current process has access to. Thomas Leonard's post (https://roscidus.com/blog/blog/2023/04/26/lambda-capabilitie...) covers this in great detail, and OCaml's newer Eio effect system has aspects of this too.
The Emily language (locked-down subset of OCaml) was also interesting for actively removing parts of the standard library to get rid of the escape hatches that would enable bypassing the controls.
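To make that concrete, here is a tiny capability-passing sketch (plain TypeScript, no particular library; the names are made up):

    // Library code receives a narrow capability instead of ambient
    // authority over the filesystem.
    import { readFile } from "node:fs/promises";

    interface ReadCap {
      read(): Promise<string>;
    }

    // This function can only do what its capability allows: no fs module,
    // no paths, no network.
    async function wordCount(doc: ReadCap): Promise<number> {
      const text = await doc.read();
      return text.split(/\s+/).filter(Boolean).length;
    }

    // Only the host, which holds broader authority, mints capabilities.
    const capFor = (path: string): ReadCap => ({
      read: () => readFile(path, "utf8"),
    });

    wordCount(capFor("./notes/today.md")).then(console.log);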
Sadly capabilities are older than emacs. I’d welcome advancements here but their practical utility is clearly not a foregone conclusion.
It seems to me that it's not their utility, but lack of support in general for the sorts of changes that enable its wider use. E.g., looks like it's getting practical use in FreeBSD: https://www.cl.cam.ac.uk/research/security/capsicum/freebsd....
Linux has seccomp, but I think that was changing the access for an entire process. The language-focused aspect seems useful to me, from that application aspect where maybe I want access to something, but I don't want to pass that access on to all the code that I might call from a library.
> OCaml's newer Eio effect system
Eio is an IO library out of many competing ones, not OCaml's effect system. The capabilities are an Eio thing, not an effects thing.
Gotcha, thanks!
You have to get off the beaten path to get plugins into Vim/Emacs. It's not difficult, but you don't have access to a marketplace open to the world from the get-go. I think Emacs has ELPA, but I would put that at the level of OS repos like Debian/Alpine.
iirc vscode has RCE by design when you use the remote editing feature (i.e. editing files on a server, which is obviously a bad idea anyway, but still a feature) and nobody gives a fuck.
> Gaming is the only area where I expect plugins have limited permissions.
It's pretty much the opposite. A lot of modding communities' security model is literally just to "trust the community."
Example: https://skylines.paradoxwikis.com/Modding_API
> The code in Mods for Cities: Skylines is not executed in a sandbox.
> While we trust the gaming community to know how to behave and not upload malicious mods that will intentionally cause damage to users, what is uploaded on the Workshop cannot be controlled.
> Like with any files acquired from the internet, caution is recommended when something looks very suspicious.
I think they meant games that specifically come with a sandboxed scripting layer. Otherwise, I agree that most mods are indeed just untrusted patches for a native executable or .NET assembly.
I guess the intent behind Cities Skylines's support for mods is just removing the need for a mod manager and enabling Steam Workshop support.
> Gaming is the only area where I expect plugins have limited permissions.
Do you mean mods on Steam? If you do, then that's down to the individual game. Sandboxing mods isn't universal.
I was thinking more of Lua/Luau, which make it trivial to restrict permissions. In general, the gaming client has access to a lot more information than it shares, so to prevent cheats from plugins, the developers have to be explicit about security boundaries.
Perhaps, but I think what you might put into Obsidian (personal thoughts, journal entries, etc.) can be more sensitive than code.
> Obsidian plugins have full, unrestricted access to all files in the vault.
And how exactly can you solve that?
I don't want to press 'allow access' for every file some plugin is accessing.
One of the large dependencies they call out is an excellent example: pdf.js.
There is no reason for pdf.js to ever access anything other than the files you wish to export. The Export to PDF process could spawn a containerized subprocess with 0 filesystem or network access and constrained CPU and memory limits. Files could be sent to the Export process over stdin, and the resulting PDF could be streamed back over stdout, with stderr used for logging.
There are lots of plugin systems that work this way. I wish it were commoditized and universally available. AFAIK there's very little cross-platform tooling to help you solve this problem easily, and that's a pity.
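In Node, that shape is only a few lines (a sketch only: "pdf-export-worker" is a made-up binary, and the actual sandboxing -- container, seccomp, rlimits -- would wrap the spawn):

    import { spawn } from "node:child_process";
    import { readFile, writeFile } from "node:fs/promises";

    async function exportToPdf(notePath: string, outPath: string): Promise<void> {
      // stdin/stdout are pipes; stderr passes through for logging.
      const worker = spawn("pdf-export-worker", [], {
        stdio: ["pipe", "pipe", "inherit"],
      });

      const chunks: Buffer[] = [];
      worker.stdout!.on("data", (c: Buffer) => chunks.push(c));

      // The worker never touches the filesystem: the note goes in on stdin...
      worker.stdin!.end(await readFile(notePath));

      await new Promise<void>((resolve, reject) => {
        worker.on("error", reject);
        worker.on("close", (code) =>
          code === 0 ? resolve() : reject(new Error(`worker exited ${code}`))
        );
      });

      // ...and the finished PDF comes back out on stdout.
      await writeFile(outPath, Buffer.concat(chunks));
    }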
Specific permissions declared in a manifest much like browser extensions could be a good first step.
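Something like this, say (entirely hypothetical fields; as far as I know, today's manifest.json declares no permissions at all):

    {
      "id": "sample-plugin",
      "minAppVersion": "1.0.0",
      "permissions": [
        "vault:read",
        "vault:write:daily-notes/",
        "network:api.example.com"
      ]
    }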
Another thought: what about severely sandboxing plugins so that while they have access to your notes, they have no network or disk access and in general lack any way to exfiltrate your sensitive info? Might not be practical, but approaches like this appeal to me.
Deno would be a good candidate for this.
As someone who started building Octarine specifically for this reason, I understand.
Having to rely on random devs for the most basic functionality and passing it off as "the community does what it wants" is weird. Either add it in yourselves, or accept the fact that, given your app requires external contributors to work at a little above the basic level, there are going to be security issues.
Writing a whole blog post, and throwing shade on "other apps" that have far more dependencies than Obsidian is weird to me.
Anyway, it seems like you can't really talk badly about them, since there's a huge following that just comes at you, and that feels weird, because apparently they can throw shade but others can't talk back.
That's ok. I haven't come across an Obsidian plug-in that's worth introducing a dependency for.
I use “Templater” and “Dataview” but now I am rethinking my usage; they were required for the daily template I use (found here on HN) but this is probably overkill.
I did too but have switched over to “bases” now that that’s in core. Before that I had an apparmor profile restricting Obsidian from reaching the web.
> could've instead opted to be more 'batteries-included', at the cost of more development effort, but instead leaves this to the community, which in turn increases the attack surface significantly.
Ah, the WordPress model.
This app deals with very critical, personal, and intimate data – personal notes and professional/work-related notes – yet it is proudly an Electron app. This alone has seemed like a massive red flag to me.
Until there is a better alternative you're left with Electron. Nothing comes close to Obsidian.
Give Octarine (https://octarine.app) a try?
Built with Tauri & Rust, it's more performant and doesn't rely on random contributions for basic things like plugins.
Disclaimer - I built it.
There are better alternatives. It's just that people have convinced themselves they need the features Obsidian offers - because it makes them feel smart and important.
At the end of the day, you're just taking notes. If you write a journal, don't put it in something like Obsidian. Even Apple Notes is better (in security, privacy, etc.) in this regard.
How do I use Apple Notes cross-platform?
You can't. But that wasn't the point, was it?
Point is, you don't need Obsidian (or all of its plugins). People have been making do with Dropbox and plain text (.txt) files perfectly fine for years.
Wow I never knew I "can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem".
This is why people use Obsidian.
Plain-text folder on a cloud sharing service. Edit with notepad.exe or whatever editor you prefer. Others have been doing it with .doc files forever, or .rtf.
It's no worse than vscode. Sure there's permissions, but it's super common for an extension to start a process and that process can do anything it wants.
It's *significantly* worse than vscode. vscode is at least attempting to grapple with the problem: https://code.visualstudio.com/docs/configure/extensions/exte....
And why is VSCode our baseline?
Because it is one of the most popular dev tools out there? If not the most popular. It also uses Electron, like Obsidian. Has thousands of plugins, like obsidian.
Plus, VSCode is maintained by a company with thousands of devs; Obsidian is fewer than 10 people, which is amazing. As for plugins, why blame the product? Please check what you install on your machine instead.
My personal take is that the only way to be reasonably sure you're OK is to install as few apps as possible and then as few plugins as possible (and ideally stick to the bundled ones only). I don’t think it’s controversial, but for some reason this is not how many people think, even if in the real world you don’t give keys to your place to everyone who says they’re cool :)
Among others, this is a big reason I want effect systems to gain more attention. After having seen them, the idea that in most languages, the only option is that any function can do anything without keeping track of what it affects in its type signature is bonkers to me.
I agree Obsidian plugins do nothing about safety. But I'm not sure "most users use plugins", that's not my impression from reading the subreddit. I wonder if there's any data on it?
> most users use third party community plugins
Is this true? Is there any source on how many Obsidian users use third party plugins? For one, I don't. Moreover, Obsidian by default runs in "restricted mode", which does not allow community plugins. You have to specifically enable them to be able to install community plugins, hence I assume somebody who does that understands the risks involved. How many people even get as far as enabling that?
For me it is not even about security foremost; the whole appeal of markdown is simplicity and interoperability. The more I depend on "plugins", the more I am locked into this specific platform.
That just sounds like Linux packages; also not a system known for the security of desktop apps and scripts, especially compared to macOS. Shoot me.
Operating systems are different though, since their whole purpose is to host _other_ applications.
FWIW, MacOS isn't any better or worse for security than any other desktop OS tbh....
I mean, macOS just had its "UAC" rollout not that long ago... and I'm not sure about you, but I've encountered many times where someone had to hang up a Zoom or browser call because they updated the app or OS and had to re-grant screenshare permissions or something. So, not that different. (Pre-"UAC" versions of macOS didn't do any sandboxing when it came to user files / device access.)
Is there an alternative to Obsidian?
(CEO of Obsidian here)
Yes, on desktop, Obsidian plugins can access files on your system, unless you run it in a container. On iOS, iPadOS, and Android the app is sandboxed so plugins are more constrained.
This is not unique to Obsidian. VS Code (and Cursor) work the same way despite Microsoft being a multi-trillion dollar company. This is why Obsidian ships in restricted mode and there's a full-screen warning before you turn on community plugins.
VS Code and Obsidian have similar tradeoffs, both being powerful file-based tools on the Electron stack. This fear about plugins was raised on the Obsidian forums in 2020 when Obsidian was still new, and Licat explained[1] why it’s not possible to effectively sandbox plugins without making them useless.
So... what do you do?
The drastic option is to simply not use community plugins. You don't have to leave restricted mode. For businesses there are several ways to block network access and community plugins[2]. And we're currently planning to add more IT controls via a policy.json file[3].
The option of using Obsidian without plugins is more viable in 2025 than it was in 2020, as the app has become more full-featured. And we're now regularly doing third-party security audits[4].
But realistically, most people want to run community plugins, and don't have the technical skills to run Obsidian in a container, nor the ability and time to review the code for every plugin update.
So the solution that appeals to us most is similar to the "Marketplace protections"[5] that Microsoft gradually implemented for VS Code. For example, implementing a trusted developer program, and automated scanning of each new plugin update. We plan to significantly revamp the community directory over the coming year and this is part of it.
Note that Obsidian is a team of 7 people. We're 100% user-supported[6] and competing with massive companies like Microsoft, Apple, Google, etc. Security audits are not cheap. Building an entire infrastructure like the one I described above is not easy. We're committing to doing it, but it wouldn't be possible without our supporters.
[1] https://forum.obsidian.md/t/security-of-the-plugins/7544/3
[2] https://help.obsidian.md/teams/deploy
[3] https://x.com/kepano/status/1957927003254059290
[4] https://obsidian.md/security
[5] https://code.visualstudio.com/docs/configure/extensions/exte...
[6] https://stephango.com/vcware
The Simpsons Springfield Nuclear Plant Security scene in real life.
https://www.youtube.com/watch?v=eU2Or5rCN_Y
Yes, you are responsible for all the code you ship to your users. Not pinning dependencies is asking for trouble. It is literally, "download random code from the Internet and hope for the best."
Pinning dependencies also means you're missing any security fixes that come in after your pinned versions. That's asking for trouble too, so you need a mechanism by which you become aware of these fixes and either backport them or upgrade to versions containing them.
Things like Dependabot or Renovate solve the problem of letting you know when security updates are available, letting you have your cake and eat it too.
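The stock Dependabot config is a few lines, e.g. for npm:

    # .github/dependabot.yml
    version: 2
    updates:
      - package-ecosystem: "npm"
        directory: "/"
        schedule:
          interval: "daily"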
> so you need a mechanism by which you become aware of these fixes and either backport them or upgrade to versions containing them
RSS Feeds?
All code is fundamentally not ever secure.
This statement is one of those useless exercises in pedantry like when people say "well technically coffee is a drug too, so..."
Code with publicly-known weaknesses poses exponentially more danger than code with unknown weaknesses.
It's like telling sysadmins to not waste time installing security patches because there are likely still vulnerabilities in the application. Great way to get n-day'd into a ransomware payment.
Have you spent time reviewing the security patches for any nontrivial application recently? 90% of them are worthless, the 10% that are actually useful are pretty easy to spot. It's not as big of a deal as people would like to have you think.
That's why I run Windows 7. It's going to be insecure anyways so what's the big deal?
Pinned dependencies usually have their own dependencies so you are generally always downloading random code and hoping.
I mean, jeeze, how much code comes along for the ride with Electron...
The real answer is to minimize dependencies (and subdependencies) to the greatest extent practical. In some cases you can get by with surprisingly few without too much pain (and in the long run, maybe less pain than if you'd pulled in more).
Yep, and for the rest I've gotten a lot of mileage, when shipping server apps, by deploying on Debian or Ubuntu* and trying to limit my dependencies to those shipped by the distro (not snap). The distro security team worries about keeping my dependencies patched and I'm not forced to take new versions until I have to upgrade to the next OS version, which could be quite a long time.
It's a great way to keep lifecycle costs down and devops QoL up, especially for smaller shops.
*Insert favorite distro here that backports security fixes to stable package versions for a long period of time.
No. "Always downloading random code and hoping" is not the only option. Even w/ the supply-chain shitshow that the public npmjs registry has become, using pnpm and a private registry makes it possible to leverage a frozen lockfile that represents the entire dependency graph and supports vulnerability-free reproducible builds.
EDIT to add: Of course, reaching a state where the whole graph is free of CVEs is a fleeting state of affairs. Staying reasonably up-to-date and using only scanned dependencies is an ongoing process that takes more effort and attention to detail than many projects are willing or able to apply; but it is possible.
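Concretely (the registry URL is a placeholder):

    # .npmrc -- resolve everything through the vetted internal mirror
    registry=https://registry.internal.example.com/

    # CI: refuse to install anything that isn't already in pnpm-lock.yaml
    pnpm install --frozen-lockfile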
This. It would be a partial improvement. A greater improvement would be rewriting it for native per platform, conscientiously sandboxing plugins, and minimizing the supported "js" language with a strict subset that doesn't allow arbitrary file, network, or system operations unless signed and approved entitlements are granted.
> We do not run postinstall scripts. This prevents packages from executing arbitrary code during installation.
I get the intent, but I’m not sure this really buys much. If a package is compromised, the whole thing is already untrustworthy and skipping postinstall doesn’t suddenly make the rest of the code safe. If it isn’t compromised, then you risk breaking legitimate installation steps.
From a security perspective, it feels like an odd tradeoff. I don’t have hard data, but I’d wager we see far more vulnerabilities patched through regular updates than actual supply-chain compromises. Delaying or blocking updates in general tends to increase your exposure rather than reduce it.
I agree, but there's a bit of nuance here. Today, scanning steps typically happen post-install, which is wild, but it's the status quo. Therefore preventing anything from running during install is desirable.
I'd like to see the ability to scan/restrict as part of the installation step become popular; there are some proprietary tools that do this already, but it's not yet a common capability.
Yes. For instance, when we had that crypto malware npm fiasco a few days back, I happened to be updating one of my packages. The audit lit up with dozens of critical issues, but of course this was after it had installed everything. Luckily I had disabled install scripts, so it became a matter of not running the code until I could get it reverted back.
It does protect the build machine though. Seems like quality, low-hanging security fruit to me. If I want to casually hack on some random web app, I don't have to worry about arbitrary scripts running from the ~4 gazillion dependencies.
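Turning them off is one line, which both npm and pnpm honor:

    # .npmrc
    ignore-scripts=true

The tradeoff is that the handful of packages with legitimate build steps (native addons, mostly) then need those steps run explicitly.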
I’ve been using apps other than Obsidian for notes and sharing, so this is nice to read and consider. But isn’t Obsidian an Electron app or whatever? Electron has always seemed resource intensive and not native. JavaScript has never struck me as “secure”. Am I just out of touch?
JavaScript is a very secure language. The browser is a massive success at running secure JavaScript on a global scale. Every website you use is running JavaScript and not able to read other site data. Electron is the same, running v8 to sandbox JavaScript. Assuming you aren't executing user input inside that sandbox (something many programming languages allow, including JS), it's very secure.
The problem with supply chain attacks is specifically related to npm, and not related to JS. npm as an organization needs to be taking more responsibility for the recent attacks and essentially forcing everyone to use more strict security controls when publishing their dependencies.
Doesn’t this mean browser sandboxing is secure, not JS? Or are you referring to some specific aspect of JS I’m not aware of? (I’m not aware of a lot of JS)
It’s maybe a nit-pick, since most JS is run sandboxed, so it’s sort of equivalent. But it was explicitly what GP asked for. Would it be more accurate to say Electron is secure, not JS?
I'm really curious about this comment. What would it mean for a programming language to be secure?
Any two Turing-complete programming languages are equally secure, no?
Surely the security can only ever come from whatever compiles/interprets it? You can run JavaScript on a piece of paper.
Turing completeness is irrelevant, as it only addresses computation. Security has to do with system access, not computational capacity. Brainfuck is Turing complete, but lacks any primitives to do more than read from a single input stream and write to a single output stream. Unless someone hooks those streams up to critical files, you can't use it to attack a system.
Language design actually has a lot of impact on security, because it defines what primitives you have available for interacting with the system. Do you have an arbitrary syscall primitive? Then the language is not going to help you write secure software. Is your only ability to interact with the system via capability objects that must be provided externally to authorize your access? Then you're probably using a language that put a lot of thought into security and will help out quite a lot.
A number of operating system security features, such as ASLR, exist because low level languages allow reading and writing memory that they didn't create.
Conversely, barring a bug in the runtime or compiler, higher level languages don't enable those kinds of shenanigans.
See, for example, the Heartbleed bug, where OpenSSL would read memory it didn't own when given a properly malformed request.
I mean, JavaScript doesn’t even have APIs for reading a file from disk, let alone executing an arbitrary binary. (Anything similar comes from a runtime like NodeJS.) You can’t access memory in different JS processes… so what would make it insecure?
To be fair, a plugin system built on JS with all plugins interacting in the same JS context as the main app has some big risks. Any plugin can change definitions and variables in the global scope, with some restrictions. But any language where you execute untrusted code in the same context/memory/etc. as trusted code has risks. The only solution is sandboxing plugins.
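A toy illustration of the shared-context problem:

    // Any plugin loaded into the same context can quietly wrap globals
    // that the host and every other plugin rely on.
    const realFetch = globalThis.fetch;
    globalThis.fetch = async (input: RequestInfo | URL, init?: RequestInit) => {
      // A malicious plugin could copy every request body somewhere else here.
      console.log("intercepted:", String(input));
      return realFetch(input, init);
    };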
None of this makes, or is a result of, the language JS being secure. Security is far and away predominantly a matter of how a language is used, not a character of the language itself. "Safety" helps, but you can still easily write and package unsafe and insecure code in "safe" languages like Rust, just as you can in C, JS, Python, etc.
> JavaScript is a very secure language.
I almost fell out of my chair laughing. Thanks for the comedic relief.
I need more evidence to believe this.
Javascript is probably one of the most used languages on earth, depending on how you measure it.
It runs on a majority of computers and basically all phones. There will be many security issues that get discovered by virtue of these facts.
What makes you think that "native" apps are any more secure?
Nit: "Earth" is the proper noun for the planet most of us live on, "earth" is dirt.
Not a huge electron fan (thank god for tauri), but Obsidian is a fantastic app and you shouldn't let the Electron put you off of it. You can even hook an MCP up to it and an agent can use it as a personal knowledge base; it's quite handy.
> Thank god for tauri
I’d love to try it, but speaking of security, this was the first thing I saw:
sh <(curl https://create.tauri.app/sh)
Right. But you know how to fetch and inspect (yea?), so I'm with you that piping random crap to sh is bad. Maybe these snips encourage that behavior.
Tauri is trustable (for some loose definition) and the pipe to shell is just a well known happy-path.
All that to say it's a low value smell test.
Also, I'm in the camp that would rather git clone and then docker up. My understanding is it gives me a little more sandboxing.
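And the fetch-then-inspect route is only one step longer than the pipe:

    curl -fsSL https://create.tauri.app/sh -o tauri-init.sh
    less tauri-init.sh    # actually read it first
    sh tauri-init.sh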
I think I would prefer to see official support for major package managers, even with unofficial repos (Debian, MacPorts, ...). We went from a time when software was usually tarballed to one where devs are encouraging piping to shell.
https://snapcraft.io/obsidian
No, it's not really an issue. GitHub and VS Code are also Electron apps. So are Slack and Discord. Postman is, as well.
I'd also be forced to ask... what exactly are you doing with a markdown note-taking application such that performance is a legitimate concern?
But, I mean, maybe you're reading this in a Lynx session on your ThinkPad 701C.
> what exactly are you doing with a markdown note-taking application such that performance is a legitimate concern?
Launching it and expecting a fast startup.
That’s a reason I moved away from Notion. The startup is so terribly slow (perhaps because it’s updating too often?).
If you have to render HTML, which is what markdown ultimately becomes, you might as well use a web browser.
It is resource intensive.
It's not a problem on pc, but an obsidian vault with thousands of notes can have a laggy startup on mobile, even if you disable plugins.
Users sidestep this issue with quick capture plugins and apps, but I wish there was a native stripped-down version of obsidian.
Javascript is a lot more secure than C++, since it's a memory managed language.
Buffer overflows are 0.001 percent of security incidents in practice.
Let's fix private key leakage and supply chain issues before worrying about C++ haxxors p0wning your machines.
Memory safety vulnerabilities are estimated to account for 70% of security bugs.
As less code at trust boundaries is being written in memory-unsafe languages, we'll get to 0.001%!
There’s some advice that’s been going around lately that I’ve been having trouble understanding: the idea that you should not be updating your dependencies when new patches are released (e.g., X.X.PATCH).
I understand that not updating your dependencies when new patches are released reduces the chance of accidentally installing malware, but aren’t patches regularly released in order to improve security? Wouldn’t it generally be considered unwise to not install new patches?
There's a key missing piece to this puzzle: being informed about _why_ you're updating and what the patches are.
Nobody has time to read source code, but there are many tools and services that will give you brief summaries of release notes. npm audit lists security vulnerabilities in your package versions, for example.
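For example:

    # Fail only on advisories at or above the severity you care about
    npm audit --audit-level=high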
I do adopt the strategy of not updating unless required, as updates are not only an attack vector but also an extremely common source of bugs that I'd prefer to avoid.
But importantly I stay in the loop about what exploits I'm vulnerable to. Packages are popping up with vulnerabilities constantly, but if it's a ReDoS vulnerability in part of the package I definitely don't use or pass user input to? I'm happy to leave that alone with a notice. If it's something I'm worried another package might use unsafely, with knowledge of the vulnerability I can decide how important it is, and if I need to update immediately, or if I can (preferably) wait some time for the patch to cook in the wild.
That is the important thing to remember about security in this context: it is an active, continuous, process. It's something that needs to be tuned to the risk tolerance and risk appetite of your organisation, rather than a blanket "never update" or "always update" - for a well-formed security stance, one needs more information than that.
Exactly. If you can avoid having to do _any_ patches except those that have a security purpose, you've already reduced your risk of supply chain attacks considerably.
This isn't trivial to organise though, since semver by itself doesn't denote whether a patch is security related or not. Of course, you can always review the release notes, but this is time consuming and doesn't scale well when a product grows either in size of code base or community support.
This is where there's a fairly natural place for SAST (e.g., Semgrep, Snyk; many more, but these are the two I've used the most, in no particular order) and supply chain scans, but they're prohibitively expensive.
There is a lot of open source tooling out there that can achieve the same too of course.
I've found there's a considerable linear climb in overhead/TOIL with the number of open source tools you commit to for a security baseline. Unfortunately, this realistically means that in most companies, where time is scarcer than money, more money shifts into closed source products like those I listed, rather than those run by open source projects/companies.
On the ReDoS aspect, I find the current CVSS rating system lacking. Availability is important, but following secure coding principles (fail closed), I'd much rather my system go down than have integrity/confidentiality compromised.
It's frustrating that a potential availability issue often gets the same (high) rating as an integrity/confidentiality issue.
I believe it's about waiting a bit before a new patch is released, not fully avoiding installing updates. Seems like compromises are being caught quickly these days, usually within hours. There are multiple companies monitoring npm package releases because they sell security scanning products and so it's part of their business to be on top of it.
pnpm has a setting that lets you tell it a package needs to be at least X minutes old in order to install it. I would wait at least 24 hours just to be safe.
https://pnpm.io/settings#minimumreleaseage
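Per those docs it's a one-liner in pnpm-workspace.yaml (value in minutes):

    # pnpm-workspace.yaml
    minimumReleaseAge: 1440   # only install versions published at least 24h ago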
The attack that hit my packages two weeks ago was a patch release, taking advantage of this exact assumption. Wasn't a Post-Install script either.
With all of the latest in automated scanning and whatnot, this is more or less a moot point. You'll know when a package is vulnerable, and the alarm bells are loud and unambiguous. I really agree, and have always pushed the point, that version ranges are the worst thing you can have if you care about supply chain attacks.
The tension between improvement or regression when updating is real and ubiquitous but it is worse for apps in the npm ecosystem (like Obsidian).
Not only is npm a prominent target, but it also does not allow packages to be removed or blocked without a human on their side in the loop.
The result is that they are slow to remove malicious packages and slowing down your own updates helps to mitigate this a little.
Honestly, it doesn't seem impressive at all. What Obsidian does is the bare minimum of how we should manage packages in any piece of software, not a testament to a serious security policy. They do security audits, though, which I find to be a good practice.
“Supply chain attacks are malicious updates that sneak into open source code used by many apps.” No!
This should be: Supply chain attacks are malicious updates that sneak into source code used by many apps.
Stop blaming FOSS. Too many people still have the perception that FOSS software is insecure.
I love Obsidian dearly, but if you build an app that's only really useful with plugins, and that has a horrifyingly bad security model for plugins and little to no assurance of integrity of the plugins...
Maybe, just maybe, don't give full-throated advice on reducing risk in the supply chain.
But what about VScode?
Going to preface this post by saying I use and love Obsidian, my entire life is effectively in an Obsidian vault, I pay for sync and as a user I'm extremely happy with it.
But as a developer, this post is nonsense and extremely predictable [1]. We can expect countless others like it that explain how their use of these broken tools is different and you just don't need to worry about it!
By their own linked Credits page there are 20 dependencies. Let's take one of those, electron, which itself has 3 dependencies according to npm. Picking one of those electron/get has 7 dependencies. One of those dependencies got, has 11 dependencies, one of those cacheable-request has 7 dependencies etc etc.
Now go back and pick another direct dependency of Obsidian and work your way down the dependency tree again. Does the Obsidian team review all of these, and who owns them? Do they trust each layer of the chain to pick up issues before it gets to them? Any one of these dependencies can be compromised. This is what it means to be a supply chain attack: you only have to quietly slip something into any one of these dependencies to have access to countless critical user data.
[1] https://drewdevault.com/2025/09/17/2025-09-17-An-impossible-...
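You can reproduce that walk with npm itself:

    npm ls --all                      # print the full resolved dependency tree
    npm view electron dependencies    # direct deps of a single package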
Coincidentally I did that yesterday. Mermaid pulls in 137 dependencies. I love Obsidian and the Obsidian folks seem like good people but I did end up sandboxing it.
To be fair, the electron project likely invests some resources in reviewing its own dependencies, because of its scale. But yeah, this is a good exercise. I think we need more systems like Yocto, which prioritize complete understanding of the entire product from source.
Absolutely love Obsidian but had to stop using it because Electron apps don't play well with Wayland. After lots of tinkering around with flags and settings for compatibility layers, it became obvious that it would never work seamlessly like it did on Windows (and probably does on x11). So it was either give up Wayland compositors or give up Obsidian. Luckily I don't use any plugins, so moving to other software was easy, but I still would prefer Obsidian. Electron's "works everywhere" works about as good as Java's "works everywhere", which is to say it works great, until it doesn't, at which point it's a mess of tinkering.
If you use Wayland and it works for you, that's great, but it's not my experience.
In my experience electron + Wayland was absolutely god awful for a long time, but it got dramatically better in the last 4-5ish months. So depending on when you last tried it, might be worth a revisit. Heavily depends on which GPU+DE though, Nvidia+Plasma here.
What issues are you seeing? Haven't noticed the slightest glitch here, using Sway.
I went from Evernote to Joplin to Obsidian and really like the model of md files with some logic added on top. Initially I used a lot of plugins mostly out of curiosity, but now it's mostly RemotelySave for syncing and some small convenience plugins, so I hope my exposure isn't too high.
The app doesn't have the bloaty feel of other Electron apps, but if there was a good more native alternative that fits my bill (md files editable in a graphical mode with good wikilink-syntax support and similar search etc), then I would actually consider switching.
If the obsidian team did a 2 hour q&a livestream every week, I'd watch every one (or at least get the AI summary). One of my favorite pieces of software ever.
I recently had a similar experience using Libby for the first time.
An absolutely incredible piece of software. If anyone here on HN works on it, you deserve to be proud of your work.
Fully agree. Libby is awesome!
To be honest, right now I'm thinking about isolating the build process for frontend work in my local environment. It seems it would not be hard to send my local environment variables like OPENAI_API_KEY or .ssh/* to some remote machine.
I know it is not very different compared to Python or projects in any other language. But I don't feel that I can trust the node/js community at this point.
Switching to Deno might help. It's sandboxed by default and offers granular escape hatches. So if a script needs access to a specific environment variable or read or write specific files, it's simple to configure that only those accesses are allowed.
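For example (file names made up):

    # Only this env var and these paths; everything else is denied by default
    deno run --allow-env=OPENAI_API_KEY \
             --allow-read=./src --allow-write=./dist build.ts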
Running vite inside a docker container would probably get you what you want
I don't think you even need a container for that type of containment.
You could do it with namespaces.
I think node/whatever-js-runtime/package-manager could allow for namespaced containment of packages with simple modern Linux things.
The realms proposal was a step towards that at one time.
These practices are very similar to what I've done in the past, for a large, sensitive system, and they worked very well.
(IIUC, we actually were the first to get a certain certification for cloud deployment, maybe because we had a good handle on this and other factors.)
From the language-specific network package manager, I pulled the small number of third-party packages we used into the filesystem tree of the system's repo, and audited each new version. I also disabled the network package manager in the development and deployment environments, to make it much harder for people to add dependencies accidentally.
Dependencies outside this either came from the Linux distro (nice, because of well-managed security updates), or went into the `vendor` or `ots` (off-the-shelf) trees of the repo (and were monitored for security updates).
Though, when I look at some of the Python, JS, or Rust dependency explosions out there -- all dependent on being hooked up to the language's network package manager, with many people adding dependencies cavalierly -- it becomes a much harder problem.
Love it. Jonathan Blow had a nice thread about dependencies a while back: https://x.com/Jonathan_Blow/status/1924509394416632250
I also recommend using this site to evaluate the dependencies of your dependencies:
https://npmgraph.js.org/?q=express
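For a quick local number, you can also count what's actually recorded in the lockfile. A sketch (the filename count-deps.ts is mine) that assumes an npm v7+ package-lock.json, where every installed package, transitive ones included, appears under the "packages" key:

    // count-deps.ts -- rough count of everything npm installed
    import { readFileSync } from "node:fs";

    const lock = JSON.parse(readFileSync("package-lock.json", "utf8"));
    // The "" entry is the root project itself, so exclude it from the count.
    const deps = Object.keys(lock.packages ?? {}).filter((k) => k !== "");
    console.log(`${deps.length} installed packages`);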
I was talking to our chief architect about a blog post on our zero-dependency, home-grown HTTP server[0]; the project just hit 1.2 and uses virtual threads. I'm generally a fan of "don't reinvent the wheel"[1], but I think there are some cases where that control is worth the cost.
Defending against security vulnerabilities is definitely one of them, as long as you make the proper investments to harden what you write.
Joel Spolsky wrote about some other reasons too[2].
0: https://github.com/FusionAuth/java-http
1: https://en.wikipedia.org/wiki/Reinventing_the_wheel
2: https://www.joelonsoftware.com/2001/10/14/in-defense-of-not-...
> We don’t rush upgrades.
Can’t wait for “implements mechanism to delay application of new patches” to start showing up in compliance checklists. My procrastination will finally pay off!
It's kind of weird to use the word "less" for software that does a million things instead of being a simple note-taking tool.
This is obviously the way to do it, assuming you have the skills and resources to operate in this manner. If you don't, then godspeed, but you have to know going in that you are trading expediency now for risk later. Risk of performance issues, security vulnerabilities, changes in behavior, etc. And when the mess inevitably comes, at whatever inopportune time, you don't really get to blame other people...
Is that the reason why Obsidian does not support the https://textbundle.org/ format for import/export like any other markdown editor?
For import it does: https://help.obsidian.md/import/textbundle
For export, it's the same reason many features don't exist yet: the team only has 3 full-time developers.
An alternative for those who want a native application and/or even less supply-chain risk is Zim [1], which uses GTK and is packaged by the major Linux distributions.
[1] https://zim-wiki.org/
Zim doesn't have a native phone app and syncing, though, and that's a big draw of Obsidian. It's plenty secure if you don't install plugins all willy-nilly.
How much supply chain vulnerability can be mitigated just by pinning known-safe versions of dependencies?
Did anyone need the newest xz version in the first place? What negative tradeoffs would have come from pinning a 2022 release, for example?
Depends on the pinned version. Pinned versions might have vulnerabilities themselves. The problem is trusting the ecosystem.
I installed an AppArmor profile for Obsidian. For an application that displays text files, it needed a lot of permissions. It would refuse to run without network access.
You can install Obsidian flatpak and lock it down with flatseal.
I’m pretty sure that rewriting things themselves is what has made it so fast. The great thing about Obsidian is how fast it is compared to Notion.
It has been so rewarding and timely to see this post.
I just decided on Thursday, after years of covering my ears and eyes around my Obsidian-obsessed friends and coworkers, that the tool just didn’t make sense to me, and I felt like I’d be in plugin purgatory on my computer for eternity.
I’ve ended up going with Reflect, after ~forever using Apple Notes primarily. So far so good, but I genuinely felt for so long that I was supposed to love Obsidian because that’s the trope - it appears that’s changing.
"The other packages help us build the app and never ship to users, e.g. esbuild or eslint."
Still, they can present a security risk by injecting malware at build time
Obsidian is an Electron application. Bloat and the lack of security come with the territory.
It also looks and feels the exact same on every platform I use it on and natively supports accessibility.
It's amazing how Google, Mozilla, Apple, etc. can collaborate on web standards so they can ship browsers that display apps/webpages pretty much the same across multiple platforms, yet Microsoft, Apple, Canonical, Google, etc. can't collaborate on APIs that would let everyone write a desktop app once and have it display mostly the same on all platforms.
What about the third party extensions?
I use Emacs and Org-Roam, and some Emacs packages. It is hosted in a VM that is not connected to the internet (a Qube in Qubes OS). I just cannot review all the code running in Emacs.
I've been using Roam Research since about 2020. Is Obsidian better?
Haven’t used Roam, but what I like about Obsidian:
- All your data is just plain files on your file system. Automation and interop are great, including with tools like Claude Code.
- It’s local-first, so performance is good.
- It’s extensible. Write extensions in HTML, CSS, and JS.
- It’s free.
- Syncing files is straightforward. Use git, Syncthing, Google Drive, or pay for their cheap sync service which is quite good.
- Product development is thoughtful and well done.
- They’re explicitly not trying to lock you in or own your data. They define open specs and build on them when Markdown doesn’t cut it.
Things you might not like:
- Their collaboration story isn’t great yet. No collaborative editing.
- It’s an Electron app.
In what universe is their sync service cheap?
It's literally at least 100 times more expensive than Dropbox/OneDrive/Google Drive/iCloud sync.
My assumption is that most people on HN are making programmer money. $4-5 USD per month is affordable even on a junior engineer’s salary in many parts of the world.
The price per GB isn’t as good as the services you mentioned, but their storage limits are fine for the primary use case — storing a lot of plain text notes.
I’ve also had no problems with it, in contrast with iCloud which has routinely gotten stuck for me.
And if price per GB is what you care most about, use something else. That’s one of the great things about Obsidian.
The plain text thing is more of a feel-good argument than a practical one. If there’s a solid export path, the format isn’t really the issue... what matters is whether the app actually works the way you need it to. At the end of the day, your workflow lives or dies on how the software behaves... not on the file extension.
Was hoping they outlined their approach to handling potentially compromised packages running on dev machines prior to even shipping. That seems like a much harder problem to solve.
On the other hand, their actual dependency list looks closer to this, and this is definitely not comprehensive:
https://github.com/ionic-team/capacitor
https://github.com/Microsoft/tslib
https://github.com/codemirror
https://github.com/codemirror/autocomplete
https://github.com/codemirror/language
https://github.com/marijnh/style-mod
https://github.com/marijnh/crelt
https://github.com/marijnh/find-cluster-break
https://github.com/marijnh/w3c-keyname
https://github.com/cure53/DOMPurify
https://github.com/electron/electron
https://github.com/electron/get
https://github.com/debug-js/debug
https://github.com/vercel/ms
https://github.com/sindresorhus/env-paths
https://github.com/sindresorhus/got
https://github.com/sindresorhus/is
https://github.com/visionmedia/node-progress
https://github.com/npm/node-semver
https://github.com/malept/sumchecker
https://github.com/szmarczak/http-timer
https://github.com/szmarczak/defer-to-connect
https://github.com/szmarczak/cacheable-lookup
https://github.com/jaredwray/cacheable
https://github.com/biomejs/biome
https://github.com/sindresorhus/decompress-response
https://github.com/sindresorhus/mimic-response
https://github.com/octet-stream/form-data-encoder
https://github.com/szmarczak/http2-wrapper
https://github.com/szmarczak/resolve-alpn
https://github.com/sindresorhus/lowercase-keys
https://github.com/sindresorhus/p-cancelable
https://github.com/sindresorhus/responselike
https://github.com/sindresorhus/type-fest
https://github.com/sindresorhus/tagged-tag
https://github.com/max-mapper/extract-zip
https://github.com/sindresorhus/get-stream
https://github.com/Sec-ant/readable-stream
https://github.com/sindresorhus/is-stream
https://github.com/thejoshwolfe/yauzl
https://github.com/brianloveswords/buffer-crc32
https://github.com/andrewrk/node-pend
https://github.com/i18next/i18next
https://github.com/babel/babel
https://github.com/microsoft/TypeScript
https://github.com/lezer-parser
https://github.com/lucide-icons/lucide
https://github.com/mathjax/MathJax
https://github.com/mermaid-js/mermaid
https://github.com/moment/moment
https://github.com/mozilla/pdf.js
https://github.com/pixijs/pixijs
https://github.com/mapbox/earcut
https://github.com/primus/eventemitter3
https://github.com/matt-way/gifuct-js
https://github.com/matt-way/jsBinarySchemaParser
https://github.com/kaimallea/isMobile
https://github.com/browserslist/caniuse-lite
https://github.com/jkroso/parse-svg-path
https://github.com/avoidwork/tiny-lru
https://github.com/PrismJS/prism/
https://github.com/mourner/rbush
https://github.com/mourner/quickselect
https://github.com/remarkjs/remark
https://github.com/hakimel/reveal.js
https://github.com/barrysteyn/node-scrypt
https://github.com/nodejs/nan
https://github.com/mixmark-io/turndown
https://github.com/mixmark-io/domino
https://github.com/webpack/webpack
https://github.com/acornjs/acorn
https://github.com/nicolo-ribaudo/acorn-import-phases
https://github.com/browserslist/browserslist
https://github.com/web-platform-dx/baseline-browser-mapping
https://github.com/kilian/electron-to-chromium
https://github.com/chicoxyzzy/node-releases
https://github.com/browserslist/update-db
https://github.com/lukeed/escalade
https://github.com/alexeyraspopov/picocolors
https://github.com/samccone/chrome-trace-event
https://github.com/webpack/enhanced-resolve
https://github.com/isaacs/node-graceful-fs
https://github.com/webpack/tapable
https://github.com/guybedford/es-module-lexer
https://github.com/eslint/js
https://github.com/browserify/events
https://github.com/fitzgen/glob-to-regexp
https://github.com/npm/json-parse-even-better-errors
https://github.com/webpack/loader-runner
https://github.com/jshttp/mime-types
https://github.com/jshttp/mime-db
https://github.com/suguru03/neo-async
https://github.com/webpack/schema-utils
https://github.com/ajv-validator/ajv
https://github.com/epoberezkin/fast-deep-equal
https://github.com/fastify/fast-uri
https://github.com/epoberezkin/json-schema-traverse
https://github.com/floatdrop/require-from-string
https://github.com/ajv-validator/ajv-formats
https://github.com/ajv-validator/ajv-keywords
https://github.com/DefinitelyTyped/DefinitelyTyped
https://github.com/webpack-contrib/terser-webpack-plugin
https://github.com/jridgewell/sourcemaps
https://github.com/jestjs/jest
https://github.com/yahoo/serialize-javascript
https://github.com/browserify/randombytes
https://github.com/terser/terser
https://github.com/tj/commander.js
https://github.com/evanw/node-source-map-support
https://github.com/LinusU/buffer-from
https://github.com/mozilla/source-map
https://github.com/webpack/watchpack
https://github.com/webpack/webpack-sources
https://github.com/eemeli/yaml
Has there been a supply chain attack with an LLM conduit yet? Because that would be spicy and is assuredly possible and plausible too.
I love Obsidian and wish I could make it my default markdown handler on Windows.
While we're on the topic: what's your default markdown handler on Windows?
Not my favorite but I was surprised recently when Windows 11 Notepad popped up something mentioning markdown support.
I wonder if we could make the case that simpler always leads to less, in favor of lean apps.
No mention of plugins, which are a core differentiator of Obsidian, so part of the overall "supply chain" for the app.
I wish they could add Google Drive support to their mobile app. I'd be happy to pay $100+ for one-time-only Google Drive support.
Really? Is this some kind of perverted joke? An Electron-based thing wants to brag about less being safer? Get rid of the browser, then we can talk about less.
Unbelievable
> The other packages help us build the app and never ship to users, e.g. esbuild or eslint.
ESLint, with such wonderful dependencies as is-glob, smh.
missed opportunity for "less is secure"
"Secure" is a different, harder promise than safeR.
But it's still along the same lines as "safer". The stresses are different: "safer" is stressed "SAY-fer" while "secure" is stressed "sih-KYOOR". The latter sounds more similar to (and rhymes better with) "more", the originator of the phrase "less is more".
Well uh sure if meter's all you're going for here
> Obsidian has a low number of dependencies compared to other apps in our category
Whataboutism. Relative comparisons don't address absolute risk. I checked three random packages: prism pulls 22, remark pulls 51, pixijs 179! So that's 250+ transitive dependencies just from those.
> Features like Bases and Canvas were implemented from scratch instead of importing off-the-shelf libraries. This gives us full control over what runs in Obsidian.
Full control? There are still hundreds of dependencies.
> This approach keeps our dependency graph shallow with few sub-dependencies. A smaller surface area lowers the chance of a malicious update slipping through.
Really? Again, this is just one package: https://npmgraph.js.org/?q=pixijs
> The other packages help us build the app and never ship to users, e.g. esbuild or eslint.
Build tools like esbuild don't ship to users, but a compromised build tool can still inject malicious code during compilation. This is supply chain security 101.
> All dependencies are strictly version-pinned and committed with a lockfile
Version pinning has been, I would hope, standard practice in any professional development team for years and years. It prevents accidental updates but doesn't stop already-compromised versions.
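For what it's worth, enforcing that is a one-file CI check. A sketch (check-pins.ts is a name I made up; exact prerelease pins like 1.2.3-beta.1 would need a looser test):

    // check-pins.ts -- fail the build if a dependency range allows silent upgrades
    import { readFileSync } from "node:fs";

    const pkg = JSON.parse(readFileSync("package.json", "utf8"));
    const all: Record<string, string> = { ...pkg.dependencies, ...pkg.devDependencies };
    // Exact pins look like "1.2.3"; anything with ^, ~, *, x, or ranges can float.
    const loose = Object.entries(all).filter(([, v]) => !/^\d+\.\d+\.\d+$/.test(v));
    for (const [name, range] of loose) console.error(`${name}: "${range}" is not an exact pin`);
    process.exit(loose.length ? 1 : 0);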
> When we do dependency updates, we: > [snip]
While these practices are better than nothing, they don't fundamentally address the core issue.
> That gap acts as an early-warning window: the community and security researchers often detect malicious versions quickly
According to whom? Heartbleed, a vulnerability in a package with far more scrutiny than a typical npm module, took what, 2 years to be found? The "community detection" assumption is flawed.
I'm not trying to put Obsidian down here. I sympathize; aside from implementing everything themselves, what can they do? I'm trying to point out that while their intent is good, this is a serious problem and their solution is not a solution.
Of course, it's the same in any project with dependencies. It's the same in other languages as well - if they have a convenient package manager. Like Rust and Cargo.
This problem came with the convenience of package managers, and it should be fixed there, not by every application like Obsidian. I'm not sure how, but maybe once a package starts to become popular, additional security measures should be required for the author to keep committing to it: signing requirements, reproducible builds, 2FA, community reputation systems, who knows.
Individual applications can't solve supply chain security through wishful thinking and version pinning.
Package managers need to solve this at the infrastructure level through measures like mandatory code signing, automated security auditing, dependency isolation, or similar system level approaches.
Obsidian's practices are reasonable given the current tooling limitations, but they don't eliminate the fundamental risks that the package managers bring to modern dependency ecosystems.
"It may sound obvious but the primary way we reduce the risk of supply chain attacks is to avoid depending on third-party code."
What a horribly disingenuous statement, for a product that isn't remotely usable without 3rd-party plugins. The "Obsidian" product would be more aptly named "Mass Data Exfiltration Facilitator Pro".
I've used Obsidian for years without a single 3rd party plugin.
It is possible to make the same point without histrionic excess.
Yeah, this is always the response. Usability can be assessed objectively, so you just have low standards.
I have to agree that I don't find plugins necessary, and I'm not sure why you're so down on people using a solid backlinking note taker. I don't think I have low standards, I think Roam and Logseq aren't that great and Obsidian is all I need.
A more charitable interpretation would be that they have different needs. My keyboard costs more than my computer, but most people probably spend $15-$50 on a keyboard. Even my mouse is well outside that range. Do I have high standards, or do I have tendonitis?
Software usability, in this context, is measured objectively; there is no interpretation. This is separate from a specific user's preferences, the ergonomics of their hardware, etc. As for your high-standards-vs-tendonitis distinction, I'd say these things are not mutually exclusive, and the comparison is not related to what we're talking about.
> a product that isn't remotely usable without 3rd-party plugins
That's not even remotely true. Obsidian out of the box is very usable and covers pretty much all use cases for note-taking software.
I've been using it for ages, and I haven't needed to turn on the Community Plugins switch for anything.
Usually the people I see with tons of Obsidian plugins are people who think "just one more plugin" is what stands between them and productivity.
That's just...what? It's highly usable without plugins. Yes, I use plugins...but that's by choice. Obsidian is still a superior Markdown editor with backlink support, plugins or not.
I think you and your fellow commenters are missing the point. The degree to which Obsidian is "a superior Markdown editor with backlink support" can be debated. What I'm saying is that actual usability, in this context, is not a matter of opinion -- see ISO 9241-210, ISO/IEC 25010, etc. Having said that, I'm glad you're happy.
This doesn't make any sense to me. I've always been told you don't write anything yourself unless you absolutely have to and having a million micro-dependencies is a good thing. JavaScript and now Rust devs have been saying this for years. Surely they know what they're doing...
There is a balance to be struck. NPM in particular has been a veritable dependency hell for a long time. I don't know if it just attracts inexperienced developers, or if its security model is fundamentally flawed, but there have been soooo many supply chain attacks using NPM that being extra careful is very much warranted.
Right, I get that. I was just making a joke because I really dislike the micro-dependency approach. Honestly, it bothers me more in Rust than in JavaScript, but that's probably just because I'm not a web dev.
From my buddies who have worked in JS, my understanding is that a lot of it is rooted in JavaScript having had a really shitty standard library for a long time. That led to a culture of code sharing early on, since everyone was rewriting the same utilities and whatnot, which evolved into the package management ecosystem over time.
Why that came about in something like JavaScript but not C, which also obviously has a really small std, is super interesting. My only guess is that JavaScript came up on the internet as package managers started to take on a new meaning, and web dev is relevant to a broader range of (nontech) companies, which are more likely to put a huge focus on quantity over quality.
I've been focused on writing software the last couple of weeks. What is Obsidian again? I can't find a simple "What is Obsidian?" FAQ on their site. Is it a browser, or a node replacement like Deno? Or an AI library? Clearly Obsidian has plugins, but what are they in service of?
If it's a browser, they should have something on their website that says "Obsidian is a really cool browser." I think there are a lot of people out there ignoring the hype train, and it would do the community a service to just start by answering that simple question. I mean sure, I get it, it's a "sharpen your thinking" app, but I'm not sure what that means.
It’s a notetaking app.
Thanks.