Closed Bug 402983 Opened 17 years ago Closed 17 years ago

Security checks that should be symmetric are now asymmetric

Categories

(Core :: Security: CAPS, defect, P1)

x86
Linux

Tracking


RESOLVED FIXED
mozilla1.9beta5

People

(Reporter: bzbarsky, Assigned: jst)

References

(Depends on 1 open bug, Blocks 3 open bugs)

Details

(Keywords: dev-doc-complete)

Attachments

(7 files)

Bug 230606 made SecurityCompareURIs asymmetric again (as it was in 1.8). Unfortunately, we have a number of callers at this point that assume it's symmetric; nsPrincipal::Equals comes to mind. We need to either make it symmetric again or audit all callers and make sure they're either using it in the right direction or calling it both ways.
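[Editor's note: a minimal standalone C++ sketch, with plain strings and hypothetical function names rather than the actual CAPS code, of why an asymmetric compare breaks an Equals()-style caller that only checks one direction.]

```cpp
#include <cassert>
#include <string>

// Hypothetical asymmetric check: |source| may access |target| if the target
// lives at or below the source (roughly a "subdirectory"-style file policy).
static bool SecurityCompareURIs(const std::string& source,
                                const std::string& target) {
  return target.compare(0, source.size(), source) == 0;
}

// An Equals()-style caller that implicitly assumes the check is symmetric.
static bool PrincipalsEqual(const std::string& a, const std::string& b) {
  return SecurityCompareURIs(a, b);  // only one direction is consulted
}

int main() {
  const std::string dir = "file:///home/user/";
  const std::string page = "file:///home/user/docs/index.html";
  // Equality must be symmetric, but with an asymmetric compare it is not:
  assert(PrincipalsEqual(dir, page));   // passes
  assert(!PrincipalsEqual(page, dir));  // the reverse direction disagrees
  // One fix: call the check in both directions, or make the check symmetric.
  return 0;
}
```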
Flags: blocking1.9?
Severity: normal → critical
Blocks: 230606
Flags: blocking1.9? → blocking1.9+
Priority: -- → P3
Or we can ratchet up the fileuri origin policy to "samedir", which is symmetric. Will have to fix some of the testcases that break when we do that.
I think we should do that, given that ~/ and ~/Desktop/ are common download locations.
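[Editor's note: a rough sketch of what a symmetric "samedir" comparison amounts to, on plain path strings; DirName and SameDir are hypothetical helpers, not the pref implementation.]

```cpp
#include <string>

// Directory part of a file URI/path: everything up to and including the
// last '/'.
static std::string DirName(const std::string& path) {
  return path.substr(0, path.rfind('/') + 1);
}

// "samedir": two file URIs are same-origin only if they sit in exactly the
// same directory. Unlike a subdirectory rule, this is symmetric by
// construction: SameDir(a, b) == SameDir(b, a).
static bool SameDir(const std::string& a, const std::string& b) {
  return DirName(a) == DirName(b);
}
```

Under this rule a page saved to ~/Desktop/ is same-origin only with other files sitting directly in ~/Desktop/, not with anything elsewhere under ~.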
Yeah, using "samedir" (and removing the asymmetric options from that pref) would be an acceptable fix for this bug as far as I'm concerned. I suspect it'll cause all sorts of usability issues, unfortunately. :(

I wonder: can we somehow flag files that we save from the Internet and enforce a "samedir" policy for those, while allowing things that originated on the user's hard drive (and in particular things that live outside the user's homedir altogether) to access arbitrary files? That would allow HTML manuals, for example, to continue working. I believe both some Windows versions and OS X have a way of flagging such untrusted content. In fact, I seem to recall that OS X sets the flag itself, while on Windows the app needs to do it (and we currently don't).
> I suspect it'll cause all sorts of usability issues, unfortunately. :(

We'll see... maybe we'll need to give users a way to specify that a directory should be treated as a single local web application. Or maybe we'll need to let pages opt in, similar to document.domain. I'm not sure I understand your "HTML manuals" example. Most of them don't use cross-page scripting (if they use scripting at all), right?

> I wonder: can we somehow flag files that we save from the Internet and
> enforce a "samedir" policy for those...

I don't think so. There are lots of ways for HTML files to end up on users' hard drives; see bug 230606 comment 26 for some examples.
> Most of them don't use cross-page scripting (if they use scripting at all),
> right?

Yes, but apparently some of them use other things that we perform cross-origin checks on. Client-side XSLT, for example.

Again, most things will be OK. Some things will break. Some of the things that will break are living on CDs and can't be updated. The only question is how many and how important...

Can we treat anything outside ~ and /tmp (and their conceptual equivalents) as a single origin, perhaps? That might alleviate some of the concerns about breaking existing installed stuff, somewhat.
Oh, we also do same-origin-type checks for window targeting, which could affect someone if their windowing setup is confused enough (the confusion is needed because if you open a window you can target it regardless of origin, so you'd need to be targeting a window you didn't open).
Bz/jst should we move the priority up on this?
Yes. This definitely needs to block the release, so it should be either a P1 or a P2. Probably P2, since it doesn't _really_ need to block the beta, except insofar as every time we change this, people will need to retest their file:// stuff. Ideally we would in fact fix this for beta 3, but I have a hard time advocating that without being able to commit to fixing it. Bumping to P2 for now.
Priority: P3 → P2
(In reply to comment #5)
> Again, most things will be OK. Some things will break. Some of the things
> that will break are living on CDs and can't be updated. The only question is
> how many and how important...

Example: http://forums.mozillazine.org/viewtopic.php?t=610173
I was thinking this morning... could we perhaps do something where a subframe that has a file:// principal also stores the principal of its parent frame, or better yet of whoever loaded it? If the latter is also a file:// principal, it's used for same-origin checks. In other words, the principals for a frame and its parent test equal, but still have different URIs (so that base URI stuff works right).

This wouldn't help with opening a page from a subdir in a new window (though we could perhaps propagate the principal there too). We would want to change checkLoadURI so that random content can't link to ancestors, perhaps, but maybe we can make this work?

Sorry about the fuzziness of the idea; I haven't had the time to really think it through.
So one thing that I think we're going to break here is CDs containing HTML files. I recently got a mail from the producer of such a CD. I think it might be fairly common to produce a CD that uses HTML either as the documentation format or as the format for the whole contents.

Could we detect which paths (drives on Windows) map to CDs or similar media and consider all files on a CD same-origin? I realize this would take a lot of platform-specific code, unfortunately.
I've been thinking about my proposal some more, and I think I was proposing the following system:

1) For every file:// URI that's linked to or loaded by another file:// URI keep track of the "original loader" (the original loader of the thing loading you, or the thing loading you if it has no original loader).
2) Use CheckLoadURI from the original loader and disallow loading things higher up in the hierarchy than that.
3) Use the original loader in security checks.

This will help in all the cases when the original entrypoint is the rootmost of the set of HTML pages one will look at... Is that a good assumption?
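[Editor's note: a hedged sketch of the bookkeeping described in comment 12, using hypothetical types and names (FilePrincipal, IsUnder, MayLoad). The fix that eventually landed took a different shape, so this only illustrates the proposal.]

```cpp
#include <string>

// A file:// document's principal, which remembers the "original loader":
// the entry-point document that (directly or indirectly) caused this load.
struct FilePrincipal {
  std::string uri;             // this document's own URI (base URI, etc.)
  std::string originalLoader;  // entry point used for security decisions
};

// True if |target| lives in the directory containing |root| or below it.
static bool IsUnder(const std::string& target, const std::string& root) {
  const std::string dir = root.substr(0, root.rfind('/') + 1);
  return target.compare(0, dir.size(), dir) == 0;
}

// (1) Propagate the original loader to anything a file:// page loads.
static FilePrincipal MakeChildPrincipal(const FilePrincipal& loader,
                                        const std::string& childURI) {
  return FilePrincipal{childURI, loader.originalLoader};
}

// (2) CheckLoadURI-style rule: disallow loading anything higher up in the
//     hierarchy than the original loader.
static bool MayLoad(const FilePrincipal& loader, const std::string& target) {
  return IsUnder(target, loader.originalLoader);
}

// (3) Same-origin checks compare original loaders, not the documents' own
//     URIs, so a manual's table of contents and its sections test equal.
static bool SameOrigin(const FilePrincipal& a, const FilePrincipal& b) {
  return a.originalLoader == b.originalLoader;
}
```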
Boris' solution in comment 12 sounds like a good one to me, although naturally one would have to test to make sure that the original loader *and* all the pages it loads still honor security.fileuri.origin_policy while impersonating the original loader. HTML manuals loaded from CD should be able to organize themselves into subdirectories and have things "just work", but they should still not be allowed to reach outside their sandbox.
The patch in bug 410700 makes this symmetric again, but does not implement the suggestions in this bug.
(In reply to comment #12)
> 2) Use CheckLoadURI from the original loader and disallow loading things
> higher up in the hierarchy than that.

That would break on-disk help or other reference material when you navigate into a section and then click the "Table of Contents" button to go back up to the top. Blocking the loads is far more incompatible than simply preventing reading of content.
Wouldn't item (1) in the list handle that? The idea is that you can't link to ancestors, but if you get to A from B, A gets the principal of B. So you can go back to where you started, but no further. If you start in the section, that wouldn't work, of course. So it really depends on the use cases...
Flags: tracking1.9+ → blocking1.9+
Priority: P2 → P1
Blocks: 420425
No longer blocks: 420425
No patch, and no discussion since Jan 26. Does this really block in a way that requires beta exposure, if at all? I'm hoping that there's been discussion which hasn't been reflected in this bug, or that the solution is really simple.
Target Milestone: --- → mozilla1.9beta5
Yes, I really do think this needs beta exposure. But I think there has been work going on in another bug, at least to some degree. Dveditz knows better.
This patch addresses the symmetry concerns Boris had, and also addresses the performance concerns of bug 410700 by moving the file checks out of caps and into docshell loading, where we're touching the disk anyway. There are no more options: the new strict behavior is either on or off via a pref.

Once you open a file, any subframe gets the same principal as long as it opens a file in the same directory or a subdirectory. If you attempt to open a file "up" a directory, it picks up the traditional origin and DOM access is blocked.

This patch does not address bug 380994 or bug 395353.
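[Editor's note: a simplified standalone sketch of the containment rule described above, with plain strings standing in for nsIURI/nsIFile; an illustration of the rule, not the patch itself. A document keeps the opener's principal only if it sits in the opener's directory or a subdirectory; anything "up" falls back to the traditional per-file origin.]

```cpp
#include <string>

// Directory containing |fileURI|, including the trailing '/'.
static std::string ContainingDir(const std::string& fileURI) {
  return fileURI.substr(0, fileURI.rfind('/') + 1);
}

// True if |child| may inherit |opener|'s principal under the strict policy.
static bool MayInheritFilePrincipal(const std::string& opener,
                                    const std::string& child) {
  const std::string dir = ContainingDir(opener);
  return child.compare(0, dir.size(), dir) == 0;
}

// MayInheritFilePrincipal("file:///cd/manual/index.html",
//                         "file:///cd/manual/ch1/sec2.html")  -> true
// MayInheritFilePrincipal("file:///cd/manual/ch1/sec2.html",
//                         "file:///cd/manual/index.html")     -> false
//                         (a load "up" gets its own origin instead)
```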
Attachment #310650 - Flags: superreview?(bzbarsky)
Attachment #310650 - Flags: review?
Comment on attachment 310650 [details] [diff] [review]
make checks symmetric, and faster

not sure of bz's availability, but Jonas was interested in this one.
Attachment #310650 - Flags: superreview?(jonas)
Attachment #310650 - Flags: superreview?(bzbarsky)
Attachment #310650 - Flags: review?(jst)
Attachment #310650 - Flags: review?
Comment on attachment 310650 [details] [diff] [review]
make checks symmetric, and faster

Boris should look this over; beyond that, I'll take whichever of Johnny and Jonas looks at this sooner :)
Attachment #310650 - Flags: superreview?(jonas)
Attachment #310650 - Flags: superreview?(bzbarsky)
Attachment #310650 - Flags: review?(jonas)
Comment on attachment 310650 [details] [diff] [review]
make checks symmetric, and faster

>+ mStrictFilePolicy(PR_TRUE);
>  mFiredUnloadEvent(PR_FALSE),

Arg, should end in a comma of course.
Since dveditz is out for a few days, can you take this, jst? Is there any way to add unit tests for this?
Assignee: dveditz → jst
Comment on attachment 310650 [details] [diff] [review]
make checks symmetric, and faster

This looks good to me. The only nit I had was that you're inconsistent in the whitespace usage:

  if ( foobar )
vs.
  if (foobar)

Follow the file's policy.

It would be great to have some tests, though. It should be possible to let a mochitest create a few test files, open them, and see if it works. I think we have other mochitests that do similar things.
Attachment #310650 - Flags: review?(jonas) → review+
Then, let's add tests here, please.
Blocks: 424261
Comment on attachment 310650 [details] [diff] [review]
make checks symmetric, and faster

bz, please feel free to review this still. Unless I hear back from you within a few hours, I'll be landing this and we'll deal with any issues you may find after it's in the tree.
Attachment #310650 - Flags: superreview?(bzbarsky)
Attachment #310650 - Flags: superreview+
Attachment #310650 - Flags: review?(jst)
Attachment #310650 - Flags: review?(bzbarsky)
Damon, no, we're adding tests in bug 424261. This will land later tonight, assuming nothing else comes up.
Attached patch Updated diff. (deleted) — Splinter Review
This is what I'll be checking in (same as above, with minor tweaks per Jonas' review comments).
Comment on attachment 310650 [details] [diff] [review]
make checks symmetric, and faster

So I have two issues, both minor. In docshell, we probably want to observe the pref, since secman does so. And in secman, in nsScriptSecurityManager::SecurityCompareURIs, the patch has:

>+ nsresult rv = sourceBaseURI->Equals(aTargetURI, &filesAreEqual);

That should probably be targetBaseURI, not aTargetURI.

With those fixed, r=bzbarsky
Attachment #310650 - Flags: review?(bzbarsky) → review+
Blocks: 424292
Fix checked in with all but the pref observing issue fixed. Filed followup bug 424292 on that issue. Marking FIXED on behalf of dveditz.
No longer blocks: 424292
Status: NEW → RESOLVED
Closed: 17 years ago
Resolution: --- → FIXED
Blocks: 424292
This makes file:///foo and file:///foo#bar compare equal.
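[Editor's note: a standalone sketch of the idea; the real patch works on nsIURI, and StripRef/SameFileIgnoringRef are hypothetical helpers. The fragment is dropped before comparing, so a URI and the same URI with a #ref name the same file.]

```cpp
#include <string>

// Drop any #fragment: file:///foo and file:///foo#bar name the same file.
static std::string StripRef(const std::string& uri) {
  return uri.substr(0, uri.find('#'));  // npos -> whole string, which is fine
}

static bool SameFileIgnoringRef(const std::string& a, const std::string& b) {
  return StripRef(a) == StripRef(b);
}
```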
Attachment #310934 - Flags: superreview?(bzbarsky)
Attachment #310934 - Flags: review?(bzbarsky)
Attachment #310934 - Flags: superreview?(bzbarsky)
Attachment #310934 - Flags: superreview+
Attachment #310934 - Flags: review?(bzbarsky)
Attachment #310934 - Flags: review+
Reopening; I had to disable this check (flipped the pref) as it made a bunch of reftests fail :(
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
This, in addition to what already went in, gets us down to one failing reftest (well, 3, but the other two really don't seem related to this change, or could be a local test issue): one that uses an <object> tag with a file:// URI (relative path).
Attachment #310957 - Flags: superreview?(bzbarsky)
Attachment #310957 - Flags: review?(bzbarsky)
Comment on attachment 310957 [details] [diff] [review]
Make nsPrincipal::CheckMayLoad() work like nsDocShell does for file:// URIs

> nsPrincipal::CheckMayLoad(nsIURI* aURI, PRBool aReport)
>+ if (/* mStrictFilePolicy && */

You could ask the security manager for that boolean, right?

>+ // If the this is not also a file: uri then forget it

s/the this/the codebase/ ?

The rest look OK. That said, I really wonder whether we could put most of this code into a helper method in nsNetUtil or something. We're going to have it here, in nsObjectLoadingContent, and in docshell, and divergence seems like a serious danger.

r+sr=me so we can get the tests fixed, but it'd be a really good idea to refactor this code.
Attachment #310957 - Flags: superreview?(bzbarsky)
Attachment #310957 - Flags: superreview+
Attachment #310957 - Flags: review?(bzbarsky)
Attachment #310957 - Flags: review+
Attached patch Fix that passes all reftests. (deleted) — Splinter Review
bz, this passes all reftests (except the ones that fail here locally both with and w/o any of these changes). This leaves the code in nsPrincipal, but uses that code from nsDocShell and nsObjectLoadingContent, and seems to do the right thing, so I don't think we need to split this out to another location. This also means docshell doesn't need to know whether the pref that controls this is enabled or not, as the nsPrincipal::CheckMayLoad() call does that check for it.
Attachment #311084 - Flags: superreview?(bzbarsky)
Attachment #311084 - Flags: review?(bzbarsky)
Comment on attachment 311084 [details] [diff] [review]
Fix that passes all reftests.

>+++ b/caps/src/nsPrincipal.cpp
>+ if (!codebaseFileURL ||
>+ NS_FAILED(fileURL->GetFile(getter_AddRefs(targetFile))) ||
>+ NS_FAILED(codebaseFileURL->GetFile(getter_AddRefs(codebaseFile)))

You should probably add "!fileURL ||" to this condition.

>+++ b/content/base/src/nsObjectLoadingContent.cpp
>+ if (inheritPrincipal || IsAboutBlank(aURI) ||
>+ NS_SUCCEEDED(thisContent->NodePrincipal()->CheckMayLoad(aURI,
>+ PR_FALSE))) {

That doesn't look right. If NodePrincipal() is an HTTP URI and aURI is an HTTP URI on the same host, we'll set the owner with this code. Docshell correctly has a check to make sure that aURI is a file URI first.

We should also add similar code to <http://bonsai.mozilla.org/cvsblame.cgi?file=mozilla/layout/style/nsCSSLoader.cpp&rev=3.244&mark=1434-1442#1434>. Perhaps a URIIsLocalFile() function in nsContentUtils or nsNetUtil?

With the first two issues fixed, r+sr=bzbarsky. We can do the stylesheet thing in a separate bug, if we want.
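[Editor's note: for clarity, a standalone sketch of the pattern asked for above, with hypothetical names and a simplified strict-policy check. The owner is only considered for inheritance when the target is itself a file:// URI, so an http:// target on the same host never takes this branch.]

```cpp
#include <string>

// Simplified stand-ins for the real checks.
static bool IsFileURI(const std::string& uri) {
  return uri.compare(0, 5, "file:") == 0;
}

// Strict policy: the target must be in the codebase's directory or below it.
static bool CheckMayLoad(const std::string& codebaseURI,
                         const std::string& targetURI) {
  const std::string dir = codebaseURI.substr(0, codebaseURI.rfind('/') + 1);
  return targetURI.compare(0, dir.size(), dir) == 0;
}

// Inherit the owner/principal only for file:// targets that pass the check;
// without the IsFileURI() guard, a same-host http:// load would also match.
static bool ShouldInheritFileOwner(const std::string& codebaseURI,
                                   const std::string& targetURI) {
  return IsFileURI(targetURI) && CheckMayLoad(codebaseURI, targetURI);
}
```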
Attachment #311084 - Flags: superreview?(bzbarsky)
Attachment #311084 - Flags: superreview+
Attachment #311084 - Flags: review?(bzbarsky)
Attachment #311084 - Flags: review+
We should add tests for this. As tinderbox showed, testing this (the original thing this bug was filed on) with reftest should be pretty simple... In particular, we should add a test that the subframe can access the parent and change its styling or something to match a reference rendering.

Alternately, we could use mochitest and construct the files/directory trees by hand in /tmp. Or we could wait for the infrastructure from bug 424484, because nothing guarantees that reftests will always run from file://.

I think we should do a combination of reftest and getting explicit file:// tests set up once bug 424484 is fixed.
Flags: in-testsuite?
Attached patch Fixes for bz's last comments. (deleted) — Splinter Review
Boris, this applies on top of the previous patch and fixes the issues you found, including the CSS loader check. I won't be landing this tonight as it's too late to stay up and watch the tree now, but I'll land over the weekend, reviews or not. I'll attach a complete updated patch as well.
Attachment #311115 - Flags: superreview?(bzbarsky)
Attachment #311115 - Flags: review?(bzbarsky)
Attached patch Complete diff for checkin. (deleted) — Splinter Review
Comment on attachment 311115 [details] [diff] [review]
Fixes for bz's last comments.

>+++ b/content/base/public/nsContentUtils.h

>+ * Return true if aURI is a local file URI (i.e. file:// or
>+ * resource://).

Just file://, actually. Not resource://.

With that nit, looks great. Thanks!
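[Editor's note: a minimal sketch of the helper as finally scoped, on a plain string stand-in (in Gecko it would presumably take an nsIURI): strictly the file: scheme, nothing else. It does not unwrap nested URIs, which is the naming concern raised a few comments further down.]

```cpp
#include <string>

// file:// only (not resource://), and no unwrapping of nested URIs, so
// something like jar:file:///x.jar!/y.html is not a "local file" here.
static bool URIIsLocalFile(const std::string& uri) {
  return uri.compare(0, 5, "file:") == 0;
}
```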
Attachment #311115 - Flags: superreview?(bzbarsky)
Attachment #311115 - Flags: superreview+
Attachment #311115 - Flags: review?(bzbarsky)
Attachment #311115 - Flags: review+
This was checked in earlier today. Triggered bug 424594 and bug 424571, but those will be fixed independently. Marking this bug FIXED.
Status: REOPENED → RESOLVED
Closed: 17 years ago17 years ago
Resolution: --- → FIXED
Filed bug 424860 about concerns with the name URIIsLocalFile if we're exposing it as a generic helper function rather than something specific to CAPS. In general, people finding something with that name might expect it to get the innermost URI for them, and outside of this particular usage it probably should.
Blocks: 424860
Depends on: 425201
Depends on: 427051
Depends on: 424484
Filed bug 429467 about the security policy being too strict for the XMLHttpRequest object.
Depends on: 431082
No longer depends on: 431082
Depends on: 431133
We need to document the fact that the security.fileuri.strict_origin_policy preference is no longer an integer but is now a boolean (if we had MDC docs for it in the first place).
Keywords: dev-doc-needed
Er, rather, the "security.fileuri.origin_policy" pref is just gone, replaced by the boolean "security.fileuri.strict_origin_policy".
I know I am late to this discussion, but I am in charge of an HTML online help system, and with the release of FF 3 this system will not work. It is used by dozens of applications. The core help rendering files are shared amongst many separate "books", and with FF 3 this type of help system will not work. Early on in this thread you talked about help systems not working. There needs to be a better way to handle this than forcing all web apps to be architected under one folder with only one file at the top. Most help systems do not work that way.
Isaac, would you mind describing the actual directory structure you guys use, and how one is to tell your use of it from an attempt to read the user's private files?
I have been developing browser-based (all of them, even Unix) help systems for over 8 years. They typically all use a shared (parallel-folder) library code base. This allows all the "books" in the help system to reference a single folder for rendering .css/.js/.htm/.html code via relative paths. The application that calls the help may open any one of the book's HTML files, not just a top-level entry file. The book's file may then use this shared rendering code. By using the shared code, changes can be made to it in one location, changing its look and feel, bugs, or general functions without forcing the calling application to change code, and without having to duplicate the change as would be needed if the code were placed under every "book".

By forcing all complex browser-based applications to have a single top-level point of entry, you limit the kind of development Firefox supports. Plus it breaks the many past years of numerous deployed applications that have worked until FF 3.

One solution could be to just allow certain types of files to be accessed (.js, .css, .htm, .html, .ps, .jpg, .gif, ...)? Or a minimum number of upper directory levels that may be referenced? Or both?

If this issue is a concern for the FF community, then why could it not be solved in the "Save As" function? As long as Firefox was used to save the content, couldn't data be placed in the saved files to cover Firefox's concern about this issue? Then FF would not have to be responsible for the vast amount of content that could not support this.
Depends on: 445004
(In reply to comment #47)
> Isaac, would you mind describing the actual directory structure you guys use,
> and how one is to tell your use of it from an attempt to read the user's
> private files?

I'm not Isaac, but the NAG manuals are seriously broken by this behaviour. They work online now; they didn't in the beta (see http://dpcarlisle.blogspot.com/2008/05/xslt-in-browser.html). But it appears that the behaviour in 3.03 is that for XML files accessed via file: using an xml-stylesheet PI of the form href="../styles/pmathml.xsl", the stylesheet is just silently ignored.

It's really hard to think of any circumstance where an end user would have a stylesheet at a known location on the filesystem that could do any kind of malicious damage. As it happens, in previous releases the stylesheet was not needed in Firefox, as it just installs MathML support in browsers that don't have it natively; but with Firefox 3, because of bug 427990, the stylesheet is needed to reinstate all hyperlinks in mathematics (by converting xlink to html a@href links).

To see an example of the usage, see http://www.nag.co.uk/numeric/FL/manual/xhtml/A02/a02aaf.xml which works online, but not from the filesystem. That is particularly bad as the content is distributed as part of the product, on CD.
David, it sounds like you're seeing bug 397894. Please see the discussion there. We just need to fix bug 427990, as far as that goes.