Did not know that. That sounds extraordinarily wasteful; there must be a file-hash-based method that would allow sharing such files between domains.

It offers security.

Just like you wouldn't use the same table for all users in a multi-tenant application.

If the file is hashed strongly enough, then it can be no other file. I can see how information on previously visited sites could be leaked, and how that could be bad, but I think whitelisting by end users could still allow some files to be shared. E.g. the code for React.

The fact that you don't see it doesn't mean it doesn't exist. I make up a unique file, put it on site X, and get your browser to cache it. Then I try to load the same file on site Y and time how long it takes. If it loads instantly, site Y knows you visited site X.

Tadaaa! Tracking.
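The attack can be sketched as a toy simulation. This is not a real browser API; the `SharedCache` class and its `fetch` method are hypothetical, modeling a cache keyed by URL alone, which is exactly what turns it into a cross-site oracle:

```python
# Toy model of the cache-timing tracking attack.
# SharedCache is hypothetical, for illustration; real attacks infer
# the hit/miss result from load timing rather than a return value.

class SharedCache:
    """A browser cache keyed by URL alone, shared across all sites."""
    def __init__(self):
        self.entries = set()

    def fetch(self, url):
        # True  -> served from cache ("instant" load)
        # False -> had to hit the network (slow), and is now cached
        if url in self.entries:
            return True
        self.entries.add(url)
        return False

TRACKER_URL = "https://site-x.example/unique-tracking-file.js"

cache = SharedCache()

# Victim visits site X, which embeds the unique file.
cache.fetch(TRACKER_URL)

# Later, site Y embeds the same URL and times the load.
was_cached = cache.fetch(TRACKER_URL)
print(was_cached)  # True -> site Y learns the victim visited site X
```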

I said I ‘can see’; I already understand that. Hence the whitelisting of files that aren't unique / created for tracking purposes.

Ah, my bad, sorry.

It's a security feature. Otherwise my malicious site could check whether content from cdn.sensitivephotoswebsite.com was already cached and blackmail you.

It would be nice if there was a whitelist option for non-sensitive content. I stopped using CDN links due to the overhead of the extra domain lookups, but I did assume that my self-hosted content would be cached across domains.

It would be nice if there was a whitelist option for non-sensitive content.

There's no such thing as non-sensitive content from a CDN, though. Scripts are obviously sensitive, styles can be used to exfiltrate data through background-image url() directives, and things like images gain no benefit from being cached across sites.

Fonts might be one exception, but I bet those are exploitable somehow.

Seems like a solvable problem: per-origin cache control. But really, just serve the data locally.
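Modern browsers do something close to this: the HTTP cache is partitioned (double-keyed) by the top-level site in addition to the URL, so a probe from another site always misses. A minimal sketch of that idea; the `PartitionedCache` class here is hypothetical, for illustration only:

```python
# Toy model of a partitioned (double-keyed) HTTP cache.
# PartitionedCache is hypothetical; real browsers key on more
# dimensions (e.g. frame site), but the principle is the same.

class PartitionedCache:
    """Entries are keyed by (top-level site, URL), not URL alone."""
    def __init__(self):
        self.entries = set()

    def fetch(self, top_level_site, url):
        # True on a cache hit; False on a miss, which then populates the cache.
        key = (top_level_site, url)
        if key in self.entries:
            return True
        self.entries.add(key)
        return False

cache = PartitionedCache()
url = "https://cdn.example/react.min.js"

cache.fetch("site-x.example", url)             # miss: fetched, cached for site X
hit_on_x = cache.fetch("site-x.example", url)  # hit: repeat visits to X still benefit
hit_on_y = cache.fetch("site-y.example", url)  # miss: site Y learns nothing
print(hit_on_x, hit_on_y)  # True False
```

The trade-off in the thread falls out directly: each site keeps its own warm cache, but the cross-site sharing (and the tracking oracle) is gone.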