This is so weird that a random website on a random server (having SSL doesn’t change that in the slightest) is considered less of a risk than a file I have on my own computer.
Can somebody help me understand what’s going on?
There’s some discussion in the specs here https://w3c.github.io/webappsec-secure-contexts/#is-origin-t...
> In particular, the user agent SHOULD treat file URLs as potentially trustworthy.
> User agents which prioritize security over such niceties MAY choose to more strictly assign trust in a way which excludes file.
A potentially trustworthy URL is a secure context: https://html.spec.whatwg.org/multipage/webappapis.html#secur...
So this is a matter of browsers not implementing it, probably because there’s just not a lot of demand for it.
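If you want to see what your own browser actually does, the simplest check is from the devtools console: window.isSecureContext plus the presence of secure-context-gated APIs like navigator.clipboard tell you how the file: page is being classified. A quick diagnostic sketch (results differ between Chrome, Firefox, and Safari):

    // Run in the devtools console of a page opened via file://,
    // then again on an https:// page for comparison.
    console.log('secure context?', window.isSecureContext);
    console.log('async clipboard exposed?', 'clipboard' in navigator);
    console.log('service workers exposed?', 'serviceWorker' in navigator);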
The random website cannot read arbitrary files on your computer's file system and send them somewhere else. An HTML file with JavaScript running locally, if it were fully trusted, could do exactly that on a typical personal computer.
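To make that threat model concrete (my illustration, not anything from the spec): a fully trusted local page could read files with script and POST them anywhere; today browsers refuse even the first step, e.g. Chrome's fetch() doesn't accept the file: scheme and Firefox treats each local file as its own unique origin. The function name and path below are just examples.

    // Script in an HTML file opened via file://, trying to read another
    // local file. Current browsers normally reject this, which is exactly
    // the sandboxing being discussed here.
    async function readOtherLocalFile(): Promise<string> {
      const resp = await fetch('file:///etc/hosts');  // example path, nothing special
      return resp.text();
    }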
There should be some way to mark files as accessible, for example by placing them in a folder with a specific name (like "html-accessible-files") at the same level as the HTML file.
Internet Explorer, in an ancient age, had .HTA ("HTML Application") files: a double-clickable, self-contained HTML file that could act as a small local application. It did a lot of what PWAs are still trying to do, somewhat more effectively and more simply. (An .HTA was mostly just an HTML file with a different extension! Simple to build.) It also had a lot of security holes that gave it a terrible reputation and plenty of reasons to get killed. (It was very early days for "AJAX" and modern browser security tools like CORS and whatnot, after all.)
Such things seem to come in cycles.
Today a lot of browsers support .MHT, which is a similar format but worse in many ways. (The M stands for MIME, and wrapping a website like an email seems somehow sillier and weirder to me than wrapping it in a ZIP file, though I get that MIME wrappers are ancient internet tech with an ancient track record.)
Then there are the millions of apps shipped as PWAs and Electron downloads.
At some point it feels like we should have better solutions and cut the Gordian knot of cycling between "local apps are too much of a security risk" and "local apps should be complicated collections of Service Workers to get offline support" and "local apps should just embed a full browser and all its security implications/risks rather than letting browsers directly open local apps" and back and forth. .HTA and .MHT both showcase possible directions back to something "simpler" than PWAs/Electron; they just have such fascinating and weird histories.
The rule was probably intended to block public non-SSL pages, so your clipboard data doesn't ever get sent over the wire unencrypted.
Why does it block local pages? Well, what benefit is it to Apple or Google if it were easier to make good localhost web apps?
Try deleting Safari site data (IndexedDB, etc.) for your localhost site. You won't be able to. Hell, even deleting data for a specific public site is hilariously painful. Try adding a certificate authority to your iPhone. Try playing locally cached audio blobs via web APIs on an iPhone. There are probably 1,000 more examples.
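For what it's worth, the usual workarounds for the clipboard restriction are to serve the folder over http://localhost (which the secure-contexts spec does treat as potentially trustworthy) or to feature-detect the secure-context-gated Clipboard API and fall back to the older, deprecated execCommand path. A rough sketch of the latter, assuming all you want is copy-to-clipboard:

    // Copy text, preferring the async Clipboard API when the page is a
    // secure context, falling back to the deprecated execCommand('copy').
    async function copyText(text: string): Promise<boolean> {
      if (window.isSecureContext && navigator.clipboard) {
        await navigator.clipboard.writeText(text);
        return true;
      }
      // Fallback: select a temporary textarea and use execCommand.
      const ta = document.createElement('textarea');
      ta.value = text;
      document.body.appendChild(ta);
      ta.select();
      const ok = document.execCommand('copy');  // deprecated, but widely implemented
      ta.remove();
      return ok;
    }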