This is one way to look at it, but ignores the fact that most users use third party community plugins.
Obsidian has a truly terrible security model for plugins. As I realized while building my own, Obsidian plugins have full, unrestricted access to all files in the vault.
Obsidian could've instead opted to be more 'batteries-included', at the cost of more development effort, but instead leaves this to the community, which in turn increases the attack surface significantly.
Or it could have a browser-extension-like manifest that declares all the permissions a plugin uses, where any attempt to access a permission that hasn't been granted gets blocked.
Both of these approaches would've led to more real security to end users than "we have few third party dependencies".
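A rough sketch of the shape I mean, in TypeScript (a hypothetical manifest and permission gate, not Obsidian's actual plugin API):

    // Hypothetical manifest + permission gate; a sketch of the idea,
    // not Obsidian's real plugin API.
    interface PluginManifest {
      id: string;
      permissions: Array<"vault:read" | "vault:write" | "network" | "clipboard">;
    }

    class PermissionGate {
      constructor(private manifest: PluginManifest) {}

      assert(permission: PluginManifest["permissions"][number]): void {
        if (!this.manifest.permissions.includes(permission)) {
          throw new Error(`${this.manifest.id} was not granted "${permission}"`);
        }
      }
    }

    // The host routes every privileged call through the gate:
    const gate = new PermissionGate({ id: "sample-plugin", permissions: ["vault:read"] });
    gate.assert("vault:read"); // ok
    gate.assert("network");    // throws: not declared in the manifest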
When I was young there were a few luminaries in the software world who talked about how there is a steady if small flow of ideas from video game design into conventional software.
But I haven't heard anyone talk like that in quite some time (unless it's me parroting them). Which is quite unfortunate.
I think for example if someone from the old guard of Blizzard were to write a book or at least a novella that described how the plugin system for World of Warcraft functioned, particularly during the first ten years, where it broke, how they hardened it over time, and how the process worked of backporting features from plugins into the core library...
I think that would be a substantial net benefit to the greater software community.
Far too many ecosystems make ham-fisted, half-assed, hare-brained plugin systems. And the vast majority can be consistently described by at least two of the three.
I’ve been of the opinion that every hard problem in CS shows up somewhere in gamedev. It’s a great space for inspo.
Game dev also rewards people for applying the 80/29 rule effectively and you see less of that in commercial software.
In each game generation there's a game that would be easy to write on the next or a subsequent generation of hardware and is damned difficult to implement on the current one. Cleverness and outright cheating make it work, after a fashion.
80/29 rule is the paretypo principle?
Typo but yeah.
It reaches a dead end eventually. That's where we are, at the edge of speed, where the only mods left are aesthetics veering toward photorealism.
The game simulation will get more detailed/granular as aesthetics dial down in perceived value. You can always go bigger/wider/more procedural/more multiplayer.
This is also why every hard problem eventually shows up — games are just simulation + interaction, and eventually everything that can be simulated will have some attempted implementation out there, struggling along. (For some reason, this does not appear to stop at “interesting” things to simulate — see all the literal simulators on steam)
The simulations have yet to release photo-realism in lieu of event-perception, where simulation parallels reality, but that's not really playable as a game, only as a view.
My team's view is that game dev is probably for the hard problems in reality, behavior, ecology, and language.
I came to learn that even though in-process plugins are easier to implement and less resource-demanding, anyone serious about host stability and security can only allow plugins that run out of process and communicate over OS IPC.
And in general, it will still take fewer hardware resources than the usual Electron stuff.
Kernel design is (to me) another one where ideas have flowed into other software fields - there were monolithic kernels, micro kernels, and hybrid kernels, and they all need to work with third party modules (drivers)
The lessons from all fields seem to be relearnt again and again in new fields :-)
Because learning how to make a proper one requires building your own broken one first.
It might be slightly sped up by reading up on theory and past experiences of others.
I am around mid life and I see how I can tell people stuff, I can point people to resources but they still won’t learn until they hit the problem themselves and put their mind into figuring it out.
A lot of stuff we think is top shelf today was tried on mainframes in the late 80’s through the 90’s. Cloud computing is mostly recycled 90’s “fashion”.
See also people trying to bring Erlang back into fashion.
> Obsidian plugins have full, unrestricted access to all files in the vault.
Unless something has changed, it's worse than that. Plugins have unrestricted access to any file on your machine.
When I brought this up in discord a while back they brushed it aside.
Having recently read through a handful of issues on their forums, they seem to brush aside a lot of things. It's a useful tool, but the mod / dev team they have working with the community could use some training.
If you're using a flatpak, that's not actually the case. It would have very restricted access, to the point where you would even have to explicitly give it access to the user's /home.
You're wrong. The obsidian flatpak ships by default with access to /home. https://github.com/flathub/md.obsidian.Obsidian/blob/5e594a4...
Interesting, I thought I had to turn that on for Obsidian!
The first time I started installing flatpaks I ran into a bit of permission / device isolation trouble and ever since then, I use flatseal after installing an app to make sure it actually has access to things.
I guess I misremembered in the case of Obsidian.
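If you'd rather not reach for Flatseal every time, the same thing can be done from the CLI; something like this should drop the default home access and grant back only the vault folder (the ~/Notes path is just a placeholder, and the app ID is the one from the Flathub link above):

    flatpak override --user --nofilesystem=home md.obsidian.Obsidian
    flatpak override --user --filesystem=~/Notes md.obsidian.Obsidian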
I "love" such sandboxing defaults. Apps like Docker Desktop also share the whole home directory by default [1], which is pretty interesting if a big selling point is keeping stuff separated. No idea why node packages need to have access to my tax returns :). Of course you can change that, but I bet many users keep the default paths intact.
[1] https://docs.docker.com/desktop/settings-and-maintenance/set...
Needed for volume mounting to work easily I assume.
Yeah, I forgot there’s the intermediate VM level, and user folders are shared there so that folders could be mounted to the individual containers using host paths.
So if I run their software in a container they can't access my entire filesystem. I don't think that is a security feature.
It sounds like if I ever run Obsidian I should be using Flatseal too.
Er, what?
I'm not claiming it's a security feature of Obsidian, I'm saying it's a consequence of running a flatpak - and in this situation it could be advantageous for those interested.
Sorry, it genuinely sounded to me like you were saying that it's not a problem because flatpak.
No, lol
What if you run little snitch and block any communications from obsidian to anything?
Or firejail. Or QubesOS using a dedicated VM. There are options, but it would still be nice if Obsidian had a more robust security model.
I have been using firejail for most of these kind of applications, be it Obsidian, Discord, or the browser I am using. I definitely recommend people start using it.
Sell it to us! Why do you use specifically firejail?
There are so many options, from so many different security perspectives, that analysis paralysis is a real issue.
I feel like I should keep track of all my comments on HN because I remember writing a lengthy comment on firejail more than once. I cannot keep doing this. :D
For user-space, there is usually bubblewrap vs. firejail. I have not personally used bubblewrap, so I cannot comment on that, but firejail is great at what it does.
The last comment was about restricting clipboard access to either X11 or Wayland which is possible with firejail quite easily, so if you want that, you can have that.
You can do a LOT more with firejail though.
https://wiki.archlinux.org/title/Firejail
https://man.archlinux.org/man/firejail.1
> bubblewrap vs. firejail
In case anyone else is curious, I found the following comparison in bubblewrap's repo.
- https://github.com/containers/bubblewrap#related-project-com...
I'm gonna try both and see which one I like. Thanks for this info! You're sure living up to your user name there. (:
So do you configure firejail to give each app their own separate, permanent home directories? Like "firejail --private=/home/user/firejails/discord discord", "firejail --private=/home/user/firejails/chromium chromium", and so on?
I have my own Discord.profile!
This is my ~/.config/firejail/Discord.profile[1]. I have some things commented out, but you could probably uncomment most of them.
FWIW, once you start whitelisting, the app only has access to those directories and files, so Discord has no access to anything other than its own directory and ${DOWNLOADS}, which I should probably change. You should also check out the default profiles for many programs / apps under "/etc/firejail".
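Roughly, the whitelist part looks something like this (an illustrative sketch, not my exact file; adjust the paths for your setup):

    # illustrative whitelist-style Discord.profile; exact entries vary per setup
    noblacklist ${HOME}/.config/discord
    whitelist ${HOME}/.config/discord
    whitelist ${DOWNLOADS}

    # harden the rest of the environment
    caps.drop all
    noroot
    nonewprivs
    seccomp
    private-dev
    private-tmp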
[1] You run it via "firejail Discord" or "firejail ./Discord" if you name it "Discord.profile".
This is great. Thanks for the detailed reply!
It was not THAT detailed and it makes me feel a bit guilty, so if you have any questions let me know.
FYI you can search your comment history with hn.algolia.com:
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
Thank you, exactly what I have been looking for!
Little snitch can block open(2)?
I treat LS as a privacy/anti-telemetry/anti-accident tool, not as anti malware.
Obviously it can detect malware if there’s a connection to some weird site, but it’s more like a bonus than a reliable test.
If you need to block FS access, then per-app containers or VMs are the way to go. The container/VM sandboxes your files, and Little Snitch can then manage external connectivity (you might still want to allow connections to some legit domains, but maybe not github.com, as that can be used to upload your data; I meant something like updates.someapp.com).
Very, very good point
I got lazy
Time to crank the paranoidmeter up again
ty
I believe they're saying it can open, it just can't send the data anywhere.
Seems a little excessive, but here we are.
It still can encrypt everything and demand you pay some ₿₿₿₿.
If it can open and write any file on the OS, it's pretty much game over. Too many ways to exfiltrate data even without network/socket access.
Worse, what keeps this from editing the config files for Little Snitch (or similar blockers)?
I believe LS has some protections against this. Never tried them, but there are config related security options, incl. protection against synthetic events. So they definitely put some thought into that.
File system permissions?
Is this true on Mac? Usually I am notified when programs request access outside the normal sandboxed or temp folders. Not sure how that works in any detail though.
To be fair it also ships with the ability to install community plugins disabled.
Ah, I guess that's one reason some folks started running it in a Docker container. I think LinuxServer.io recently released a container for it.
To be fair, it’s no worse of a dumpsterfire than any other plug-in ecosystem.
Funny enough, I thought this earlier about Arch Linux and its derivatives. It was mentioned on Reddit that they operate on a small budget. A maintainer replied that they have very low overhead, and the first thought that popped into my mind was that most of the software I use and rely on comes from the AUR, which relies on the user to manage their own security.
If engineers can't even manage their own security, why are we expecting users to do so?
I'm shocked it is most of your software. I think I have under a dozen AUR packages. It has been that way for about a decade. I added a couple for gaming recently (mostly because Lutris just crashes for me), but nearly all of my software comes from the official repos.
Same for me. I learned about AUR before installing Arch, but went months before installing my first package from there.
I think this criticism is unfair because most common packages are covered by the core and extra repos which are maintained by Arch Linux. AUR is a collection of user build scripts and using it has a certain skill cliff such that I expect most users to have explicit knowledge of the security dangers. I understand your concern but it would be weird and out of scope for Arch to maintain or moderate AUR when what Arch is providing here amounts to little more than hosting. Instead Arch rightly gives the users tools to moderate it themselves through the votes and comments features. Also the most popular AUR packages are maintained by well known maintainers.
The derivatives are obviously completely separate from Arch and thus are not the responsibility of Arch maintainers.
Disagree. AUR isn’t any trickier than using pacman most of the time. Install a package manager like Yay or Paru and you basically use it the same way as the default package manager.
It’s still the same problem, relying on the community and trusted popular plugin developers to maintain their own security effectively.
I understood GP's point to be that because Obsidian leaves a lot of functionality to plugins, most people are going to use unverified third-party plugins. On Arch, however, most packages are in core or extra, so most people won't need to go to the AUR. They are more likely to install the flatpak or get the AppImage for apps not in the repos, as that's much easier.
yay or paru (or other AUR helpers AFAIK) are not in the repos. To install them one needs to know how to use the AUR in the first place. If you are technical enough to do that, you should know about the security risks, since almost all tutorials for the AUR come with security warnings. It's also inconvenient enough that most people won't bother.
In Obsidian, plugins can seem central to the experience, so users might not think much of installing them; in Arch, the AUR is very much a non-essential component. At least that's how I understand it.
> It's also inconvenient enough that most people won't bother.
> in Arch, the AUR is very much a non-essential component.
While somewhat true, we are talking about a user who has installed Arch on their machine. If a user wanted to not bother with installation details, they would've installed Ubuntu.
The Arch-based distros that most normies will install have AUR helpers installed by default.
I can't even install Brave without the AUR.
> If engineers can't even manage their own security, why are we expecting users to do so?
This latest attack hit Crowdstrike as well. Imagine they had gotten inside Huntress, who opened up about how much they can abuse the access given: https://news.ycombinator.com/item?id=45183589
Security folks and companies think they are important. The C-suite sees them as a scapegoat WHEN the shit hits the fan, and most end users feel the same about security as they do about taking off their shoes at the airport (what is this nonsense for), and they mostly aren't wrong.
It's not that engineers can't take care of their own security. It's that we have made it a fight with an octopus rather than something that is seamless and second nature. Furthermore, security and privacy go hand in hand... Teaching users that is not to the benefit of a large portion of our industry.
> It's not that engineers can't take care of their own security.
I dunno. My computer has at least 1 hardware backdoor that I know of, and I just can't get hardware without an equivalent exploit.
My OS is developed with a set of tools that is known to make code review about as hard as possible. It provides the bare minimum application insulation. And it is 2 orders of magnitude larger than any single person can read in their lifetime. It's also the usable OS out there with the best security guarantees; everything else is much worse or useless.
A browser is almost a complete new layer above the OS. And it's 10 times larger. Also written in a way that famously makes review impossible.
And then there are the applications, which is what everybody is focusing on today. Keeping them secure is close to useless if one doesn't fix all of the above.
You never actually told us what your OS is.
Because that would be a distraction to the point they're actually making.
The point is thoroughly undermined since we can't judge the veracity of their claims
And discussing the specifics of whatever OS GP uses is exactly the type of OT he was wise enough to avoid.
Personally, I think he uses Emacs.
They must mean macos, right?
I think you could find a dozen different operating systems that someone, somewhere, would say similar about.
I'm developing an Obsidian plugin commercially. I wish there was a higher tier of vetting available to a certain grade of plugin.
IMO they should do something like the AUR on Arch Linux: have a community-managed plugin repo and then a smaller, more vetted one. That would help with the plugin review time too.
Just out of curiosity, what's the plugin? Are there folks interested in paying for plugins?
The plugin is called Relay [0] -- it makes Obsidian more useful in a work setting by adding real-time collaboration.
One thing that makes our offering unique is the ability to self-host your Relay Server so that your docs are completely private (we can't read them). At the same time you can use our global identity system / control plane to collaborate with anyone in the world.
We have pretty solid growth, a healthy paid consumer base (a lot of students and D&D/TTRPG), and starting to get more traction with businesses and enterprise.
[0] https://relay.md
Are you worried about being sherlocked at all? I know "multiplayer" is on their official roadmap.
yeah, definitely.
It might not be the most strategic move, but i want to build cool and useful tools, and the Obsidian folks are a big inspiration.
I hope there's a way to collaborate and/or coexist.
This open letter seems relevant here: https://www.emilebangma.com/Writings/Blog/An-open-letter-to-...
I think it's a matter of time until we see a notable plugin in the Obsidian space get caught exfiltrating data. I imagine then, after significant reputational harm, the team will start introducing safeguards. At a minimum, create some sort of verified publisher system.
Don’t most plugin models work this way? Do VSCode, Vim, Emacs, and friends do anything to segregate content? Gaming is the only area where I expect plugins to have limited permissions.
Browser extensions also have a relatively robust permissions-based system.
If they wanted to, one would guess that browser-ish local apps based on stuff like Electron/node-webkit could probably figure out some way to limit extension permissions more granularly.
I would have thought so, but it has been how many years, and as far as I know, there is still no segregation for VSCode extensions. Microsoft has all the money, and if they cannot be bothered, I'm not encouraged that smaller applications will be able to iron out the details.
I think it's just because supply-chain attacks are not common enough / their attack surfaces not large enough to be worth the dev time... yet...
Sneak in a malicious browser extension that breaks the permissions sandbox, and you have hundreds of thousands to millions of users as an attack surface.
Make a malicious VSCode/IDE extension and maybe you hit some hundreds or thousands of devs, a couple of smaller companies, and probably can get on some infosec blogs...
>Make a malicious VSCode/IDE extension and maybe you hit some hundreds or thousands of devs, a couple of smaller companies, and probably can get on some infosec blogs..
Attackers just have to hit one dev with commit rights to an app or library that gets distributed to millions of users. Devs are multipliers.
The time has come. The nx supply chain attack a couple weeks ago literally exfiltrated admin tokens from your local dev machine because the VS Code extension for nx always downloaded the latest version of nx from npm. And since nx is a monorepo tool, it's more applicable to larger projects with more valuable tokens to steal.
The solution at my job is you can only install extensions vetted by IT and updates are significantly delayed. Works well enough but sucks if you want one that isn't available inside the firewall.
>Browser extensions also have a relatively robust permissions-based system.
Yeah and they suck now. We need a better security model where it's still possible to do powerful stuff on the whole machine (it's MY computer after all) without compromises.
>We need a better security model where it's still possible to do powerful stuff on the whole machine
That's not possible. If you can do powerful stuff on the whole machine by definition you have no security. Security is always a question of where you create a perimeter. You can hand someone a well defined box in which they can do what they want, you can give someone broader access with fewer permissions, but whether vertically or horizontally to have security is to exercise control and limit an attack surface.
That's even implicit in the statement that it's YOUR computer. The justification being that there's a dividing line between your computer and other computers. If you're part of a network, that logic ceases to hold. Same when it comes to components on your machine.
vim and emacs are over 30 years old and therefore living with an architecture created when most code was trusted. Encrypting network protocols was extremely rare, much less disks or secrets. I don't think anything about the security posture of vim and emacs should be emulated by modern software.
I would say VSCode has no excuse. It's based on a browser which does have capabilities to limit extensions. Huge miss on their part, and one that I wish drew more ire.
I'd love to see software adopt strong capabilities-based models that enforce boundaries even within parts of a program. That is, with the principle of least authority (POLA), code that you call is passed only the capabilities you wish (e.g. opening a file, or a network socket), and not everything that the current process has access to. Thomas Leonard's post (https://roscidus.com/blog/blog/2023/04/26/lambda-capabilitie...) covers this in great detail, and OCaml's newer Eio effect system will has aspects of this too.
The Emily language (locked-down subset of OCaml) was also interesting for actively removing parts of the standard library to get rid of the escape hatches that would enable bypassing the controls.
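A tiny TypeScript-flavored sketch of the POLA idea (not Eio's API, and in plain Node this is only a convention unless the runtime actually enforces it):

    // Sketch of capability passing: the callee only gets the narrow capability
    // it is handed, not ambient authority over the whole filesystem.
    import { promises as fs } from "fs";

    interface FileReader {
      read(): Promise<string>;
    }

    // The capability is scoped to one already-chosen path.
    function fileReaderFor(path: string): FileReader {
      return { read: () => fs.readFile(path, "utf8") };
    }

    // countWords can count words, but the capability it was given offers no way
    // to open other files or sockets.
    async function countWords(reader: FileReader): Promise<number> {
      const text = await reader.read();
      return text.split(/\s+/).filter(Boolean).length;
    }

    // The caller decides exactly how much authority to hand over.
    countWords(fileReaderFor("./notes/journal.md")).then(console.log);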
Sadly capabilities are older than emacs. I’d welcome advancements here but their practical utility is clearly not a foregone conclusion.
It seems to me that it's not their utility, but lack of support in general for the sorts of changes that enable its wider use. E.g., looks like it's getting practical use in FreeBSD: https://www.cl.cam.ac.uk/research/security/capsicum/freebsd....
Linux has seccomp, but I think that changes access for an entire process. The language-focused aspect seems useful to me, for the case where maybe I want access to something but I don't want to pass that access on to all the code that I might call from a library.
> OCaml's newer Eio effect system
Eio is an IO library out of many competing ones, not OCaml's effect system. The capabilities are an Eio thing, not an effects thing.
Gotcha, thanks!
You have to get off the beaten path to get plugins into Vim/Emacs. It's not difficult, but you don't have access to a marketplace open to the world from the get-go. I think Emacs has ELPA, but I would put that at the level of OS repos like Debian/Alpine.
iirc vscode has RCE by design when you use the remote editing feature (i.e. editing files on a server, which is obviously a bad idea anyway, but still a feature) and nobody gives a fuck.
> Gaming is the only area where I expect plugins have limited permissions.
It's pretty much the opposite. A lot of modding communities' security model is literally just to "trust the community."
Example: https://skylines.paradoxwikis.com/Modding_API
> The code in Mods for Cities: Skylines is not executed in a sandbox.
> While we trust the gaming community to know how to behave and not upload malicious mods that will intentionally cause damage to users, what is uploaded on the Workshop cannot be controlled.
> Like with any files acquired from the internet, caution is recommended when something looks very suspicious.
I think they meant games that specifically come with a sandboxed scripting layer. Otherwise, I agree that most mods are indeed just untrusted patches for a native executable or .NET assembly.
I guess the intent behind Cities Skylines's support for mods is just removing the need for a mod manager and enabling Steam Workshop support.
> Gaming is the only area where I expect plugins have limited permissions.
Do you mean mods on Steam? If you do, then that's down to the individual game. Sandboxing mods isn't universal.
I was thinking more Lua/Luau, which make it trivial to restrict permissions. In general, the gaming client has access to a lot more information than it shares, so to prevent cheats from plugins, the developers have to be explicit about security boundaries.
Perhaps, but I think what you might put onto Obsidian (personal thoughts, journal entries etc) can be more sensitive than code.
> Obsidian plugins have full, unrestricted access to all files in the vault.
And how exactly can you solve that?
I don't want to press 'allow access' for every file some plugin accesses.
One of the large dependencies they call out is an excellent example: pdf.js.
There is no reason for pdf.js to ever access anything other than the files you wish to export. The Export to PDF process could spawn a containerized subprocess with zero filesystem or network access and constrained CPU and memory limits. Files could be sent to the export process over stdin, and the resulting PDF could be streamed back over stdout, with stderr used for logging.
There are lots of plugin systems that work this way. I wish it were commodified and universally available. AFAIK there's very little cross-platform tooling to help you solve this problem easily, and that's a pity.
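A rough Node-flavored sketch of that shape (the render-pdf.js worker name is a placeholder, and the sandboxing wrapper is only hinted at in a comment; this is not how Obsidian actually wires up pdf.js):

    // Sketch: run the export step in a separate child process and talk to it
    // only over stdin/stdout; stderr is reserved for logging.
    import { spawn } from "child_process";
    import { readFile, writeFile } from "fs/promises";

    async function exportToPdf(notePath: string, outPath: string): Promise<void> {
      const note = await readFile(notePath);

      // In a real setup this command line is where the sandbox goes: a container
      // runtime, a seccomp/firejail wrapper, or a runtime launched with no
      // filesystem or network permissions at all.
      const worker = spawn("node", ["render-pdf.js"]);
      worker.stderr.pipe(process.stderr); // logging only

      const chunks: Buffer[] = [];
      worker.stdout.on("data", (chunk: Buffer) => chunks.push(chunk));

      worker.stdin.write(note); // hand the note over stdin...
      worker.stdin.end();

      await new Promise<void>((resolve, reject) => {
        worker.on("close", (code) =>
          code === 0 ? resolve() : reject(new Error(`exporter exited with code ${code}`))
        );
      });

      // ...and collect the rendered PDF from stdout.
      await writeFile(outPath, Buffer.concat(chunks));
    }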
Specific permissions declared in a manifest much like browser extensions could be a good first step.
Another thought: what about severely sandboxing plugins so that while they have access to your notes, they have no network or disk access and in general no way to exfiltrate your sensitive info? Might not be practical, but approaches like this appeal to me.
Deno would be a good candidate for this.
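For example, the host could launch each plugin as a Deno subprocess with read access limited to the vault and nothing else (the paths and file names here are placeholders):

    // Sketch: a plugin process that can read only the vault directory,
    // with no write or network permissions granted.
    import { spawn } from "child_process";

    const plugin = spawn("deno", [
      "run",
      "--no-prompt",                  // deny missing permissions instead of prompting
      "--allow-read=/home/me/vault",  // read access limited to the vault
      "plugin.ts",
    ]);

    plugin.stdout.pipe(process.stdout);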
As someone who specifically started building Octarine, just for this reason, I understand.
Having to rely on random devs for the most basic functionality and passing it off as `community does what it wants` is weird. Either add it in yourselves, or accept the fact that given your app requires external contributors to work at a little above the basic level, there are going to be security issues.
Writing a whole blog post, and throwing shade on "other apps" that have far more dependencies than Obsidian is weird to me.
Anyway, it seems like you can't really talk badly about them, since there's a huge following that just comes at you, and that feels weird, because apparently they can throw shade but others can't talk back.
That's ok. I haven't come across an Obsidian plug-in that's worth introducing a dependency for.
I use “Templater” and “Dataview” but now I am rethinking my usage; they were required for the daily template I use (found here on HN) but this is probably overkill.
I did too but have switched over to “bases” now that that’s in core. Before that I had an apparmor profile restricting Obsidian from reaching the web.
> could've instead opted to be more 'batteries-included', at the cost of more development effort, but instead leaves this to the community, which in turn increases the attack surface significantly.
Ah, the WordPress model.
This app deals with very critical, personal, and intimate data – personal notes and professional/work-related notes, but proudly has an Electron app. This alone has seemed like a massive red flag to me.
Until there is a better alternative you're left with Electron. Nothing comes close to Obsidian.
Give Octarine (https://octarine.app) a try?
Built with Tauri and Rust, it's more performant and doesn't rely on random contributions for basic things like plugins.
Disclaimer - I build it.
There are better alternatives. It's just that people have convinced themselves they need the features Obsidian offers - because it makes them feel smart and important.
At the end of the day, you're just taking notes. If you write a journal, don't put it in something like Obsidian. Even Apple Notes is better (in security, privacy, etc.) in this regard.
How do I use Apple Notes cross-platform?
You can't. But that wasn't the point, was it?
Point is, you don't need Obsidian (or all of its plugin). People have been making do with Dropbox and plain text (.txt) files perfectly fine for years.
Wow I never knew I "can already build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem".
This is why people use Obsidian.
Plain-text folder on a cloud sharing service. Edit with notepad.exe or whatever editor you prefer. Others have been doing it with .doc files forever, or .rtf.
It's no worse than vscode. Sure there's permissions, but it's super common for an extension to start a process and that process can do anything it wants.
It's *significantly* worse than VSCode. VSCode is at least attempting to grapple with the problem: https://code.visualstudio.com/docs/configure/extensions/exte....
And why is VSCode our baseline?
Because it is one of the most popular dev tools out there? If not the most popular. It also uses Electron, like Obsidian. Has thousands of plugins, like obsidian.
Plus, VSCode is maintained by a company with thousands of devs; Obsidian is fewer than 10 people, which is amazing. As for plugins, why blame the product? Please check what you install on your machine instead.
My personal take is that the only way to be reasonably sure you're OK is to install as few apps as possible and then as few plugins as possible (and ideally stick to the bundled ones only). I don’t think it’s controversial, but for some reason this is not how many people think, even if in the real world you don’t give keys to your place to everyone who says they’re cool :)
Among others, this is a big reason I want effect systems to gain more attention. After having seen them, the idea that in most languages, the only option is that any function can do anything without keeping track of what it affects in its type signature is bonkers to me.
I agree Obsidian plugins do nothing about safety. But I'm not sure "most users use plugins", that's not my impression from reading the subreddit. I wonder if there's any data on it?
> most users use third party community plugins
Is this true? Is there any source on how many Obsidian users use third-party plugins? For one, I don't. Moreover, Obsidian by default runs in "restricted mode", which does not allow community plugins. You have to specifically enable it to be able to install community plugins, hence I assume somebody who does that understands the risks involved. How many people even get around to enabling that?
For me it is not even about security first and foremost; the whole appeal of markdown is simplicity and interoperability. The more I depend on "plugins", the more I am locked into this specific platform.
That just sounds like Linux packages; also not a system known for the security of desktop apps and scripts, especially compared to macOS. Shoot me.
Operating systems are different though, since their whole purpose is to host _other_ applications.
FWIW, macOS isn't any better or worse for security than any other desktop OS, tbh...
I mean, macOS just had its "UAC" rollout not that long ago... and not sure about you, but I've encountered many times where someone had to hang up a Zoom or browser call because they updated the app or the OS and had to re-grant screen-share permissions or something. So, not that different. (Pre-"UAC" versions of macOS didn't do any sandboxing when it came to user files / device access.)
Is there an alternative to Obsidian?
(CEO of Obsidian here)
Yes, on desktop, Obsidian plugins can access files on your system, unless you run it in a container. On iOS, iPadOS, and Android the app is sandboxed so plugins are more constrained.
This is not unique to Obsidian. VS Code (and Cursor) work the same way despite Microsoft being a multi-trillion dollar company. This is why Obsidian ships in restricted mode and there's a full-screen warning before you turn on community plugins.
VS Code and Obsidian have similar tradeoffs, both being powerful file-based tools on the Electron stack. This fear about plugins was raised on the Obsidian forums in 2020 when Obsidian was still new, and Licat explained[1] why it’s not possible to effectively sandbox plugins without making them useless.
So... what do you do?
The drastic option is to simply not use community plugins. You don't have to leave restricted mode. For businesses there are several ways to block network access and community plugins[2]. And we're currently planning to add more IT controls via a policy.json file[3].
The option of using Obsidian without plugins is more viable in 2025 than it was in 2020, as the app has become more full-featured. And we're now regularly doing third-party security audits[4].
But realistically, most people want to run community plugins, and don't have the technical skills to run Obsidian in a container, nor the ability and time to review the code for every plugin update.
So the solution that appeals to us most is similar to the "Marketplace protections"[5] that Microsoft gradually implemented for VS Code. For example, implementing a trusted developer program, and automated scanning of each new plugin update. We plan to significantly revamp the community directory over the coming year and this is part of it.
Note that Obsidian is a team of 7 people. We're 100% user-supported[6] and competing with massive companies like Microsoft, Apple, Google, etc. Security audits are not cheap. Building an entire infrastructure like the one I described above is not easy. We're committing to doing it, but it wouldn't be possible without our supporters.
[1] https://forum.obsidian.md/t/security-of-the-plugins/7544/3
[2] https://help.obsidian.md/teams/deploy
[3] https://x.com/kepano/status/1957927003254059290
[4] https://obsidian.md/security
[5] https://code.visualstudio.com/docs/configure/extensions/exte...
[6] https://stephango.com/vcware
The Simpsons Springfield Nuclear Plant Security scene in real life.
https://www.youtube.com/watch?v=eU2Or5rCN_Y