Closed Bug 40538 Opened 25 years ago Closed 24 years ago

security.checkloadURI logs file URL errors to JavaScript Console

Categories

(Core :: Security, defect, P3)

x86
Linux
defect

Tracking


VERIFIED FIXED
mozilla0.9

People

(Reporter: blizzard, Assigned: security-bugs)

References

Details

(Keywords: relnote, verifyme, Whiteboard: send post-fix dupes to bug 84128)

If you have a file:/// URL in a web page, it doesn't get downloaded; however, it works fine if the original page was on the local filesystem. Given this file:

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML//EN">
<html>
<body>
<a href="file:///tmp/">file:///tmp/</a>
</body>
</html>

it works fine on the local filesystem, but put it up on a web server and it never downloads the content.
Adding people who might know more than I do.
This is the correct behavior. A page loaded from the web should not be allowed to load pages from the local filesystem. Among other things, this allows an attacker to see if a specific local file exists. IE allows this behavior, and it has led to several vulnerabilities recently. (Notably, using IE to read someone's Netscape prefs.js file). Do you have code that depends on this behavior? I'm reassigning this to me, and if I don't hear any objections, I'm going to mark this INVALID.
Assignee: gagan → mstoltz
See this URL in NS4: http://feed.0xdeadbeef.com/~blizzard/test.html How is this a vulnerability? Are you worried about someone doing something across frames or something?
See bug 16858 and the recent IE exploit involving reading the prefs.js file. Opening arbitrary file: URLs facilitates the sorts of problems IE has been prone to lately...if someone can cause a file to be downloaded to your machine (in a cache file, or by having the user consent to download it to a known directory), and the downloaded file contains JS, then referencing it via a file: URL on a hostile web page would cause the JS to be run, and since it's coming from the local drive, it would run with privileges. I can see this leading to 'love bug' type scenarios. At the very least, it gives an attacker information about your local machine which should not be available.
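(For illustration only: the pattern described here would look roughly like the hostile page sketched below. The profile path is an assumption that varies by install and OS, and this is a simplified sketch of the historical technique, not a working exploit.)

<!-- hypothetical hostile page served over http:// -->
<script>
  // prefs.js is a series of user_pref("name", value); calls, so a page
  // that defines its own user_pref() before loading the file as a script
  // would see each preference as the file is "executed".
  var captured = [];
  function user_pref(name, value) { captured.push([name, value]); }
</script>
<!-- assumed default Netscape 4.x profile location; real paths vary -->
<script src="file:///C:/Program Files/Netscape/Users/default/prefs.js"></script>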
I save html files to my hard disk often, so I don't see this as a complete fix for the security issue.
OK, I can see that you have security concerns. That doesn't change the fact that, as someone who is using Mozilla in an embedding context, I need to be able to handle file:// URLs with a viewer other than Mozilla. So, how do I find out when someone requests one? It never seems to make it to the nsIWebBrowser code.
Chris, I'm not clear on what you need. Can you give me an example?
Status: NEW → ASSIGNED
Mitch, are you saying that the file:// protocol in SeaMonkey will be different than in 4.x? In 4.x, if I type this into the location bar: file:///tmp/ the browser goes there. In SeaMonkey it doesn't if the content came from an http:// stream; it does if the content itself came from a local file. It seems to me this should work the same as it did in 4.x. Now, the issue of evil JavaScript code inside that content being executed is something else; of course the security model in SeaMonkey should prevent such code from executing. Or did I totally miss something?

To be more specific, I need this feature to control my embedded application. I'm working on a file manager which (of course!) can deal with file:// URIs just fine. When I try to get embedded Mozilla to deal with <a href="file:///foo/"> content, it doesn't. I need this URL to be dispatched as any other would be, so that in my embedding app I can track the opening of the URL and deal with it myself. But because of this bug, I'm not even given the chance! Perhaps there is a way for me to query some interface to turn this security check off? I could do this in my embedded app, safely knowing that I am always going to handle file:/// URIs myself. Thanks for looking at this, and please let me know if there is anything I can do to help get this fixed or work around it.
Adding a hidden preference for turning this security check off. Nominating nsbeta2.
Blocks: 37425
Keywords: nsbeta2
Whiteboard: Fix in hand
Target Milestone: --- → M17
Putting on [nsbeta2-] radar. Will not hold beta 2 for this. ramiro could land the pref via mozilla...see brendan/waterson. If you can give us a top-100 site, PDT would make this [nsbeta2+].
Whiteboard: Fix in hand → [nsbeta2-] Fix in hand
Pref is checked in. I'm leaving this bug open as a placeholder for URL loading security issues which still need to be worked out. This code is not in its final form yet, and the behavior of this pref needs to be tested after the code is finished.
Whiteboard: [nsbeta2-] Fix in hand → [nsbeta2-]
*** Bug 47988 has been marked as a duplicate of this bug. ***
I posted these comments to bug 47988; reposting here for comments: Thanks for your input as to the inconsistencies. However, inconsistencies are not a justification for removing this security feature. I realize this security check (nsScriptSecurityManager::CheckLoadURI) is not called everywhere it should be (it's not currently called for IMG tags, for example), and it looks like right-clicking a link takes a different code path which doesn't have the check. I'm going to make this consistent by adding the check everywhere it's needed.

The reason this is a security issue is that it facilitates exploits. Although web scripts are denied access to the content of pages coming from other sites, even being able to load those pages in a window can allow some malicious behavior. For example, accessing a user's prefs.js file (easy to find if the user is using the default profile) inside a SCRIPT tag causes the prefs file to be executed as Javascript, which allows stealing your email settings, among other things. That's just one example. With some types of tags (such as STYLE), pointing at a local URL allows an attacker to determine whether a local file exists at a particular location. In general, it's a bad idea. This "feature" of being able to point at local files has led to a bunch of exploits in IE.

Brad, the 'very bad behavior' is on the part of sites which attempt to make use of your local drive. I realize that this is a tradeoff between security and functionality. Any concrete data (rather than opinions) which you can give me on the necessity of allowing local files to be loaded by remote content would be very helpful. Are there popular, large-scale, 'top-100' sites which depend on this behavior? Are there corporate users who need it, and for whom setting "security.checkloaduri" to bypass this restriction is not an option? Am I overstating the security vulnerability here?
The 'very bad behavior' quote is out of context. I was referring to the fact that Mozilla presents a link to the user, and when the user clicks on it, absolutely nothing happens. Mozilla should give some indication that the hyperlink was blocked due to security concerns. Perhaps this should be filed as a separate bug? Is there any way to differentiate between loading a local file as a "top-level" document and loading it as part of another document? Then the only time one could access a local file from a remote site would be directly through a hyperlink, while loading a file as a smaller part of a page would be denied. I don't have a 'top-100' site, but I do have a web server script that runs on our internal network that provides 'file:' links to various NFS-mounted directories. I'd hate to disable this protection in everyone's browser (opening them to attacks from outside sites) just to allow them to view these URLs. That application is available here: http://public.perforce.com/cgi-bin/p4db/dtb.cgi?FSPC=guest/brad_garcia&HIDEDEL=NO
Depends on: 24739
Target Milestone: M17 → M18
You wouldn't be "opening them to attacks from outside sites." Allowing local file links doesn't directly open up any exploits that I'm aware of; it just makes the environment a bit less secure. Netscape browsers through the current 4.7, and most versions of IE, allow local file links, so you're not exposing yourself terribly by setting "security.checkloaduri" to false. People have asked me to allow this on a site-by-site basis, like our per-domain DOM security policy mechanism, but I haven't seen a huge demand for this. I'm marking this bug FUTURE so we can revisit this issue after NS6.0 ships.
Target Milestone: M18 → Future
Thanks for the explanation. There is still the issue of a user clicking on a link and nothing happening. This makes for a bad interface and confuses the user. I can see users complaining to web site operators (or the IT staff, for intranet servers like ours) that the links on a page are bad. Expecting every user to have "security.checkloaduri" set to false is not the solution to this problem. When it is set to true, Mozilla should tell the user that it is choosing not to follow the link. If you'd like to keep this bug around with a Target of "Future" to track it from a security standpoint, then I think it would be good to open a new bug about the user interface issue with a more immediate Target. Does this sound reasonable to you?
Brad, sounds reasonable, except that I'm probably not going to have the time to do it. I think the appropriate way to inform the user, the way we use in similar situations involving "bad" JavaScript or HTML content, is to post a message to the console. This is easy. Would you like to do it, or do you know someone who would? Take a look at http://lxr.mozilla.org/seamonkey/source/modules/libjar/nsJAR.cpp#768 for an example.
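(A minimal sketch of what posting such a console message could look like, modeled loosely on the nsJAR.cpp code linked above; the message text and surrounding call site are placeholders, not the actual patch.)

// Hedged sketch: log a CheckLoadURI denial to the JavaScript console
// using nsIConsoleService, the same service the nsJAR.cpp example uses.
nsCOMPtr<nsIConsoleService> console(do_GetService("@mozilla.org/consoleservice;1"));
if (console) {
    // Illustrative text only; a real message would include the blocked
    // target URL and the origin that attempted the load.
    console->LogStringMessage(
        NS_LITERAL_STRING("Security error: this content may not load or link to the requested file: URL.").get());
}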
Keywords: nsbeta2
Whiteboard: [nsbeta2-]
*** Bug 54286 has been marked as a duplicate of this bug. ***
*** Bug 67200 has been marked as a duplicate of this bug. ***
*** Bug 69975 has been marked as a duplicate of this bug. ***
It might be good to include, as part of the error message, a hint about how to relax the restriction for a specific website.
Sure, if that were possible. But it isn't yet.
*** Bug 69546 has been marked as a duplicate of this bug. ***
Changing description to "[RFE] Need console message when CheckLoadURI fails."
Summary: file:// urls from downloaded content aren't downloaded → [RFE] Need console message when CheckLoadURI fails
Target Milestone: Future → mozilla0.9
*** Bug 74747 has been marked as a duplicate of this bug. ***
*** Bug 75577 has been marked as a duplicate of this bug. ***
Fix checked in.
Status: ASSIGNED → RESOLVED
Closed: 24 years ago
Resolution: --- → FIXED
x86 Linux, 2001-04-18-08: I'm not seeing any sort of message being generated. This bug does not appear to be fixed. Where is the message supposed to appear? What is the message supposed to look like? And why don't I have the choice to reopen this bug?
Brad, if you go to this URL and then click the link, you should see a message in the JavaScript console (Tasks->Tools->JavaScript Console): data:text/html,<a href="file:///c|/autoexec.bat">c:\autoexec.bat
This is not a legitimate fix. Users don't open up the JavaScript console to find error messages, especially when JavaScript doesn't even appear on the page! The message needs to appear in the browser window itself, possibly in the status bar at the bottom. Bringing up an "about:security" page might not be a bad idea either. It would then be impossible for a user to miss the message. It would also allow us to display the rationale for not allowing the user to go to the link, as well as ways to disable this feature. Agree? Disagree?
Brad, the message is mainly for web developers, not users. Web developers know where to look for error messages, and if they see the message, they'll stop using file:// links. Getting in the user's face with dialogs is bad, especially when the fault lies with the content author, not the user.
That's just silly. The people who really need to see the error message are the users! The developers will already know what's going on. A user will think that the link is bad and complain to the webmaster. And in my particular case, we're talking about an internal website, where there is no security risk in having such a link.

> Getting in the user's face with dialogs is bad

No, having an expected action fail with no visible reason is bad. A message in the status bar is not "in the user's face"; it's reasonable. More importantly, it's expected! An "about:security" page is more in-your-face, but it is still better than simply "ignoring the click" from a user's point of view. And it's equivalent to clicking on a "rotted" link and getting a 404 message on a new page.
I agree that the user should see a message. The majority of web developers currently check their pages only with IE on Windows, which allows users to click on file:// links from web pages. Typically, it will be the users who try Mozilla who will see this problem first. I'm not even sure that most web developers who do try their pages out with Mozilla know to look in the JavaScript console for error messages. I certainly didn't when I first encountered this problem and reported it as a bug.
I agree with Brad, a status bar message would be nice -- it's good for the user to know that there *is* an error on the page and that the page isn't working as its author intended. I think that showing a status bar message for the CheckLoadURI message is covered by bug 47128, "Display JavaScript error indicator in status bar." Depending on how bug 47128 is fixed, it might be necessary to change the message to a warning or error in order to make it show up. Mitch, what do you think of making the CheckLoadURI message an error? I think that would be consistent with how messages and errors are used for problems with javascript code. The other "messages" I've seen are for deprecated, but working, methods.
Component: Networking → Networking: File
VERIFIED: Win32, but not MacOS or Linux. Does the data: URL provided act as a test case for all platforms, or do I actually need to point to a real file in each OS? Changed the summary to describe the feature. If we need to debate this further, please start a thread in the netlib newsgroup or file new bugs. I see two possible RFEs: domain-based security and logging to the normal console.
Summary: [RFE] Need console message when CheckLoadURI fails → security.checkURI logs file URL errors to JavaScript Console
Keywords: relnote, verifyme
Summary: security.checkURI logs file URL errors to JavaScript Console → security.checkloadURI logs file URL errors to JavaScript Console
*** Bug 89046 has been marked as a duplicate of this bug. ***
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
I have a real existing file on the intranet, and a colleague sent me a *MESSAGE* with the file:/// link in it (see bug 89046), and it does not work in the news nightly from yesterday (Mozilla/5.0 (Windows; U; WinNT4.0; en-US; rv:0.9.2+) Gecko/20010709)...on WinNT.
Oops, wrong prefs.js, sorry :-))) Close and forget...
OK...
Status: REOPENED → RESOLVED
Closed: 24 years ago24 years ago
Resolution: --- → FIXED
*** Bug 89917 has been marked as a duplicate of this bug. ***
Bug 84128 is a request to report this error so end users can see it. I think we are getting enough dupes that we need to put this at the top of the list of errors to be fixed.
Whiteboard: send post-fix dupes to bug 84128
*** Bug 91164 has been marked as a duplicate of this bug. ***
I'm not happy with that solution, as we can do better and be more secure, so view this as a reopening with a request for enhancement.

Expected: Mozilla should check whether local access rights apply (as it does in the URL bar) and should load the pages accordingly. It should deny loading if access rights do not apply, regardless of whether the page containing the file:/// link is loaded via file: or http:. As (at least in Unix environments) it's easy to find out whether a user has access rights (when he/she is a user on the local system), it should be possible to implement this finer granularity in security checking.

The argument that it should prevent exploits is not valid, since the possible exploiter CAN SEE the link if it's on the page, even if it does not open, and he CAN TYPE it into the URL bar. It's not the responsibility of Mozilla to assist in keeping weak sites alive, since the exploits are not through Mozilla's weaknesses. The 'evil javascript' argument cannot hold as a reason, because then you could deny access to and forbid almost everything using the same reasoning. Guys brave enough to turn JavaScript on should definitely be on their own, because there are MILLIONS of exploits with that (personally I would kick JavaScript out of Mozilla completely, since that's the biggest security hole in any browser!). A secure system can open part of its internal structure to the outside without being compromised (tautological definition of a 'secure site'). E.g., we are behind a firewall and extensively use the feature of accessing local files through pages loaded via our local web server (documentation) with Netscape x.x, and it works without our ever having had any security problems with it.

By the way, nobody who has turned off JavaScript by default (as everyone should!) expects anything on a JavaScript Console...;-) And there should definitely be NO FLAG that opens local files to the outside! Instead, implement REAL access rights checking, and we will all be happy again!
grabow: See Mitch's 2000-05-25 15:12 comments on this bug for why we disallow all links from http:// to file:/// by default.
jesse: See my comments on javashit ;-) OK, OK, I see we cannot totally ignore the mere existence of this scrap... But is THAT really the way to go? I would propose switching the access off ONLY if JavaScript is switched on (since that is the only applicable reason to block access). That would solve all our problems, wouldn't it?
Again, in a more formal way: file:///path/file is standard HTML content and is supposed to work as described in the standard. JavaScript is a nonstandard (hopefully forever) add-on. Therefore, if this nonstandard add-on causes security problems in combination with a standard W3C token, that problem should be solved in the framework of the optional add-on, and not in a way that blocks HTML standard conformance! The current state is therefore that Mozilla IS NOT STANDARD CONFORMING! (Or to put it more clearly: the way you "fixed" it is the wrong philosophy, guys!) Therefore, please REOPEN for standard conformance.
grabow, you have completely misunderstood the security issue. Please read the last 6 or 8 comments on bug 91316 for clarification. If you argue that JavaScript should be removed, I guarantee that you will not be taken seriously in any Mozilla forum. Javascript is a powerful technology, and with power comes risk. IMHO, the risks are minimal and manageable. If you're afraid of the risks inherent in rich interactive Web content, delete Mozilla and go get Lynx. Meanwhile, we will continue to make JavaScript as safe as possible.
RELNOTE for NS6.1: "File URLs will not be read if they are inside a network-based (HTTP) document. To disable this feature, set 'security.checkloadURI' to false in your prefs.js."
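(For reference, disabling the check as the relnote describes amounts to adding a line like the following to prefs.js, or to user.js, while the browser is not running. The pref name appears with varying capitalization in this bug; the lowercase spelling used in the earlier comments is shown here.)

user_pref("security.checkloaduri", false);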
qa to me.
QA Contact: tever → benc
-> security
Component: Networking: File → Security: General
QA Contact: benc → bsharma
Verified on the 2001-10-22 branch build on WinNT. Loaded the test case locally and through the web server; the behavior is as expected.
Status: RESOLVED → VERIFIED
Wow, an amazing case of wrongthink. I have a local HTML document on an internal website. It makes reference to local network resources with a file:// URL. These links don't work. This carries no security risk. This is clearly broken. The error messages are displayed in the JAVASCRIPT console even though there is no JavaScript on the pages in question. Clearly the bug is that Mozilla does not handle file:// URLs correctly. If you think this is unsafe, you should make Mozilla throw up obnoxious warnings in the same way it does when leaving secure sites, etc. The user should have a clear option to disable it. I myself surf with JavaScript disabled, yet my browser's conformance must suffer because you can't make a feature I don't use secure? Get a grip, people.