Bug 292789
Opened 20 years ago
Closed 17 years ago
SCRIPT tags can access chrome
Categories: Core :: General, defect, P2
Status: RESOLVED FIXED
Target Milestone: mozilla1.9
People: Reporter: mikx, Assigned: dveditz
References: Depends on 1 open bug, Blocks 1 open bug
Keywords: addon-compat, dev-doc-complete; Whiteboard: [sg:want]
Attachments (4 files, 5 obsolete files):
- testcase, text/html (deleted)
- patch (deleted); flags: bzbarsky: review+, jst: superreview+, beltzner: approval1.9+
- patch (deleted); flags: beltzner: approval1.9+
- patch (deleted); flags: dveditz: review+
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.7) Gecko/20050414 Firefox/1.0.3
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.7) Gecko/20050414 Firefox/1.0.3
IMG and SCRIPT tags (and INPUT type=image) can access chrome and use this
ability to check if an extension is installed.
The following piece of code alerts whether or not Adblock is installed.
<script>var uninstallAdblock = false;</script>
<script src="chrome://adblock/content/uninstall.js"></script>
<script>alert("Adblock is "+(uninstallAdblock?"":"not ")+"installed")</script>
Is there any valid reason why unprivileged content should have read
access to the contents of chrome files?
The only thing i can imagine is something like
<img src="chrome://browser/content/about.png">
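The same sort of probe works without running any chrome script at all, via image load events. A hedged sketch of that variant (the chrome: URLs and the injectable ImageCtor parameter are illustrative, not part of the original report; in a real page the DOM Image constructor would be used):

```javascript
// Hypothetical sketch of extension fingerprinting via image loads.
// A page points an image at a chrome: URL; whether onload or onerror
// fires reveals whether that chrome package (i.e. the extension) is
// registered. ImageCtor is injectable only so the logic can be
// exercised outside a browser.
function probeChromeUrl(url, callback, ImageCtor) {
  var img = new (ImageCtor || Image)();
  img.onload = function () { callback(true); };   // package registered
  img.onerror = function () { callback(false); }; // no such package
  img.src = url; // starts the load attempt
}
```

A page would call this with something like `chrome://adblock/content/icon.png` and branch on the result, which is exactly the fingerprinting this bug is about.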
Reproducible: Always
Comment 1•20 years ago
This would be because we don't do cross-domain checks on <img> and <script>,
presumably. (I'm guessing the script is running in the context of the page, not
with privs.)
Didn't we already block http:// content from using <img> or <script> on file://
URIs? We should probably block chrome:// and resource: (if we still have that)
as well, if so. We can't just block different schemes, that would block data:,
javascript:, https:, about:, etc, which should keep on working.
Assignee
Comment 2•20 years ago
IMG was explicitly allowed to load chrome with bug 69070 (which blocked file:
access). Not convinced that was a great idea.
Adblock, of course, can be detected easily in other ways: "var hasAdblock =
window._AdblockFiltered ? true : false", but not every extension sticks things
in the DOM like that.
In general there's not much reason to allow <script src> such wide scope. I'm
sure someone can come up with something cool they can do with it (distribute a
library of useful web-app helper code via extension?), but any legitimate use
could be accomplished in other ways.
Assignee
Updated•20 years ago
Status: UNCONFIRMED → NEW
Ever confirmed: true
Flags: blocking1.8b3+
Flags: blocking-aviary1.1+
Flags: blocking-aviary1.0.4?
Whiteboard: [sg:fix]
Comment 3•19 years ago
> IMG was explicitly allowed to load chrome
That's needed for proper functioning of various XBL-like things in remote XUL,
proper functioning of icons in ftp listings, etc, etc. That is, lots of our own
"UI" somehow depends on loading chrome images and has the permissions of a web page.
Not sure what the deal with script was, except insofar as remote XUL may want
to use things like contentAreaUtils.js?
Comment 4•19 years ago
It seems pretty dangerous to allow SCRIPT tags to load chrome:// URLs. There
are probably interesting ways in which chrome-privileged scripts may be
exploited. Perhaps by changing a property on the global window?
Comment 5•19 years ago
Scripts are compiled with the permissions of the document that loaded them, no
matter what URI they were loaded from, I believe. The only thing we should be
basing on the script URI is the decision as to whether to create
XPCNativeWrappers....
Comment 6•19 years ago
From comment #0 it looks to me as though uninstall.js is able to interact with
JS from the containing document. That seems like a recipe for trouble.
Comment 7•19 years ago
Yes, in comment 0 the uninstall.js script is being compiled against the content
script global and ending up setting properties on the content global object...
As I said in comment 3, I see no obvious "it'll break the world" situation with
<script> if we disallow linking to chrome script from non-chrome pages.
Comment 8•19 years ago
This is WONTFIX: we do want to allow untrusted content to link to various pieces
of installed JS, as long as there is no privilege elevation.
Status: NEW → RESOLVED
Closed: 19 years ago
Flags: blocking1.8b3+
Flags: blocking-aviary1.0.5?
QA Contact: general → benjamin
Resolution: --- → WONTFIX
Reporter
Comment 9•19 years ago
Well, I disagree. In the spirit of the "proactive security" MoFo claims, this
should be fixed and not left alone because it is "too painful and probably not
worth it" - I guess this is what most developers think about it *g*
Another argument against this behavior is that website developers could come to
rely on this kind of code. They could include chrome JavaScript files or images
in their pages and use them to reduce traffic. And when Firefox changes
internally, their pages will break and they will blame MoFo.
If this is a WONTFIX please remove the security flag, make it a documented
feature and tell people if they can/can't rely on this in the future.
Assignee
Comment 10•19 years ago
Unflagging security sensitive as requested, and because we were convinced it was
safe enough to close as WONTFIX.
Reopening though, as this just seems like a bad idea. It's bad enough when
extensions rely on unfrozen chrome scripts; we don't want web sites doing the same.
Group: security
Status: RESOLVED → REOPENED
Resolution: WONTFIX → ---
Summary: IMG and SCRIPT tags can access chrome → SCRIPT tags can access chrome
Comment 11•19 years ago
*** Bug 308808 has been marked as a duplicate of this bug. ***
Comment 12•19 years ago
Quoting myself from bug 308808:
> This is probably not a security bug in itself, but I'd classify it as an
> unwanted information leak that makes targeting exploits easier. When a security
> hole is discovered in an extension, it is easy for the attacker to only activate
> the exploit on systems where the extension is installed.
> Other potential uses are keeping out "unwanted" AdBlock or GreaseMonkey users
> from web sites - sure, they can disable JavaScript for web pages, but to enjoy
> the web most users won't do that, and "crossing" the web<>chrome barrier in this
> way shouldn't be possible from the start.
Attaching demo page detecting the presence of GreaseMonkey, IE View,
FlashGot and Mouse Gestures.
Updated•19 years ago
Attachment #196527 - Attachment is patch: false
Attachment #196527 - Attachment mime type: text/plain → text/html
Assignee
Comment 13•19 years ago
Easy to fix here:
http://bonsai.mozilla.org/cvsblame.cgi?file=mozilla/content/base/src/nsScriptLoader.cpp&mark=517-518#510
Allowing access to chrome scripts has been in since r1.1 of this file. It's hard
to believe any legitimate pages are using this, but we need to watch out for
breaking sites this close to release.
Stopping this won't prevent pages from detecting (some) extensions in other ways.
Comment 14•19 years ago
The problem is that we really must allow sourcing of chrome JS scripts from XBL
(for remote XUL), which will expose the same "problem" as using a <script>
element directly.
Comment 15•19 years ago
Hmm... Why does remote XUL XBL need to source chrome via <script> tags?
Comment 16•19 years ago
Doesn't matter how it sources it. The point is if it is possible to source it
using one technique, then the privacy/security/whatever risk is present anyway.
If the reason to block <script src=""> is a privacy/security/whatever risk, then
we must block all such possibilities. If there is something that outweighs that,
then we should just allow access everywhere. Inconsistent "allowed if you use
this syntax but not if you use that syntax"-type access rights seem bad.
Comment 17•19 years ago
(In reply to comment #14)
> The problem is that we really must allow sourcing of chrome JS scripts from XBL
> (for remote XUL), which will expose the same "problem" that as using a <script>
> element directly.
Benjamin, could you (or someone else who isn't on vacation) elaborate for
exactly which purpose remote XUL needs to source a chrome JS script? And is the
XBL that needs to source the chrome script part of our own code, or is it (can
it be) third-party, i.e. served remotely, too?
My thinking is that there are those scripts that "want" to be available to
untrusted content (some XBL-related JS files), and those that are not (all other
chrome JS + extensions). Perhaps we can find a way to distinguish those
(whitelists, or even a special 'publicchrome://' protocol if backwards
compatibility is not an issue)...
(In reply to comment #16)
> Inconsistent "allowed if you use this syntax but not if you use that
> syntax"-type access rights seem bad.
Full ack.
Comment 18•19 years ago
> Doesn't matter how it sources it.
Was anyone suggesting that this does matter?
Reporter
Comment 19•19 years ago
(In reply to comment #17)
> Perhaps we can find a way to distinguish those
> (whitelists, or even a special 'publicchrome://' protocol
Please not another pseudo protocol - especially not chrome related. That smells
like another door for privilege escalation: getting from a normal website to
publicchrome and then over to "real" chrome, including chrome privileges.
In the last months we finally fixed most of the unchecked places for chrome
access (menus, dialogs, favicons, etc). Please don't allow yet another way to
access something that could potentially allow a way into chrome.
Creating a remote app that is not a part of the browser (unlike an auto-update)
should simply never be allowed to use or rely on something of the local
installation. Sure, it's nice to save bandwidth and have remote XUL apps use
local scripts for common stuff - but "remote XUL apps" should be just a "stand-
alone" thing. Interacting with local files is something for extensions. What if
another vendor (e.g. Opera) ever adds support for XUL? By allowing remote XUL
apps to interact with Firefox's local chrome files, we would limit those
XUL apps to Firefox and make XUL an even more proprietary thing. Or we would
force other vendors to ship Firefox chrome files and add the publicchrome
protocol (if we implement it).
Comment 20•19 years ago
We allow remote XUL apps to interact with the local theme.
The problem is one of defining what is part of "XUL" and what is part of "chrome
that we just happen to have". Why is a part of XUL implemented in C++ somehow
more acceptable for use in remote XUL than one implemented in JavaScript (via
XBL or pure JS components or whatever)?
The real problem Michael put his finger on is that we use chrome://
indiscriminately to refer to both parts of the XUL platform and parts of our
browser UI (including extensions, etc, etc). Which makes it very hard to make
one work without giving access to the other. :(
Comment 21•19 years ago
Bug 59701 might be related.
Updated•19 years ago
Flags: blocking1.8b5?
Reporter
Comment 22•19 years ago
(In reply to comment #20)
> Why is a part of XUL implemented in C++ somehow
> more acceptable for use in remote XUL than one implemented in JavaScript (via
> XBL or pure JS components or whatever).
Personally I would draw the line at the API. If there is a defined and frozen
API to call via JavaScript or a tag-based language like XUL, everything is fine
by me. XUL authors can then use those objects and manipulate them with
everything JavaScript offers.
But as soon as you include the _source_ of such an object via the chrome protocol
and a script tag, you are breaking a barrier (you can't include the C++ code, to
answer your question). What if a programmer relies on a function in the source
that was not frozen? Or relies on a special return value of a private function?
Or just uses something - maybe even by getting the string value of a function,
changing it and evaling it - and bypasses a security check or string validation
that way?
Extension authors should be allowed to do all those things - they extend the
browser and may want to intentionally break/modify a built-in function. XUL
app authors should just be able to use existing functions and never need to
bother with the source code behind them - and shouldn't even have the option,
since not all programmers know what they are doing *g*
Maybe I am wrong. Or I don't understand the full purpose of remote XUL. But
that is the way I would implement it ;)
Comment 23•19 years ago
The problem is that with JS you can't very well draw the line between the API
and the source -- you have to have access to the source to be able to call the
functions in it, generally speaking.... A good case in point is the perennial
problem of chrome XBL bindings. To use a <xul:browser>, remote XUL must be able
to load the corresponding XBL, which lives at a chrome URI.
Updated•19 years ago
Flags: blocking1.8b5?
Flags: blocking1.8b5+
Flags: blocking-aviary1.5+
Comment 24•19 years ago
Hey folks - it's looking like we are not going to get a fix in time for the 1.8b5
release, so taking it off the list. If we converge on a solution we can take it
in the post-1.5 timeframe.
Updated•19 years ago
Flags: blocking1.8b5+
Comment 25•19 years ago
It seems to me that this problem could be partially ameliorated by preventing
script tags from loading chrome scripts outside of <chrome://global/>; this
would at least eliminate attacks through host app or extension scripts. Note
that at a minimum this assumes that host apps shouldn't provide XBL bindings
intended to be used remotely, which may not be a valid assumption.
Assignee
Updated•19 years ago
Whiteboard: [sg:fix] → [sg:want]
Comment 26•19 years ago
SCRIPT tags aren't the only way to detect extensions, of course.
Comment 27•18 years ago
After seeing a reference to this bug in a site that demonstrated a way for pages to determine what Firefox Extensions were installed, I put together a GM userscript to provide some control. It can be found at http://www.mittineague.com/dev/chromesentry.user.js and http://userscripts.org/scripts/show/5899
Comments regarding the possibility of testing for a "window" tag, or another way of identifying legitimate use would be greatly appreciated.
I also would like feedback regarding possible security issues involved with the placing of filtered script content within the replacement script tag.
Comment 28•18 years ago
Having javascript.options.showInConsole set to true, I get these messages when logging into my GMail account:
No chrome package registered for chrome://adblockplus/skin/adblockplus.png .
No chrome package registered for chrome://flashblock/skin/flash-on-24.png .
It turned out to be this script:
http://mail.google.com/mail/?view=page&name=js&ver=a9a9lo240wm7
I don't know for how long the URL is accessible. Also, the script comes as text/html, so you have to look at the source to see:
var f={"chrome://flashblock/skin/flash-on-24.png":"bf-fb","chrome://adblockplus/skin/adblockplus.png":"bf-ap"};
sidki
Comment 29•18 years ago
That's not this bug. That's "IMG tags can access chrome" and it's arguably not a bug.
Comment 30•18 years ago
They're all bugs, Jesse. What we _should_ have is a small subset of chrome that's world-accessible instead of making it all world-accessible. Please see my post to .security (the one that went completely unanswered) about this...
Comment 31•18 years ago
Jesse, they are very much bugs. Have a look at http://jeremiahgrossman.blogspot.com/2006/08/i-know-what-youve-got-firefox.html and say hello to Big Brother. Fortunately I could implement a work-around in Adblock Plus. I designed it in such a way that it can be used by other extensions as well (http://adblockplus.org/en/faq_internal#protectchrome), but not every extension can rely on Adblock Plus being installed. It would be nice to see something like this in Gecko.
Comment 32•18 years ago
I agree with bz that it would be better if only a small subset of chrome were accessible to content. But extensions that put things into that subset would still be detectable with the same method. How about denying content *all* access to chrome, and instead allowing chrome scripts (and user stylesheets) to stick images (and XBL) with chrome: URLs into web pages? See also bug 204140.
Comment 33•18 years ago
Yes, my solution to the problem of extensions needing to put chrome content into documents is more of a hack. Its only advantage - it works.
Generally disallowing chrome access: stylesheets are not a problem I guess, allowing DOM nodes to request chrome content if the request was initiated by a chrome script will be more of an issue however. But that surely can be implemented. However, it will break some functionality in remote XUL that currently works - having content XBL that extends a chrome XBL or binding chrome XBL to selected nodes (via CSS or document.addBinding) will no longer work.
Comment 34•18 years ago
Yeah, that's the real question. We need to decide what our remote XUL story is and implement it, while also keeping security in mind (unlike the current remote XUL impl). This needs people familiar with XUL, the chrome registry, and security to sit down and talk to each other. The problem has been finding said people.
I do think we should fix this for Gecko 1.9 -- it keeps coming up as a problem, and it's a serious one imo. Sadly, this bug doesn't seem to be in Core...
Flags: blocking-firefox3?
Updated•18 years ago
Status: REOPENED → NEW
Flags: blocking-firefox3?
Product: Firefox → Core
QA Contact: benjamin → general
Version: unspecified → Trunk
Updated•18 years ago
Flags: blocking1.9?
Comment 35•18 years ago
If you could make Gecko automatically import chrome://global/skin/ in such a way that it overrides xul.css then remote XUL should never have to explicitly refer to chrome. (As yet I'm not convinced on remote XUL deriving bindings from chrome). What I don't know is how you would know whether a certain image request originated from a chrome stylesheet or binding or not.
You can still work out what the user's current theme is, by including a representative sample of XUL elements and inspecting their computed style.
Comment 36•18 years ago
> What I don't know is how you would know whether a certain image request
> originated from a chrome stylesheet or binding or not.
The changes I'm considering for bug 221428 would allow me to know.
Importing chrome://global/skin/ should be pretty trivial -- just do it the same way we do mathml.css and svg.css, no? Or hardcode it in the document viewer for the XUL type. Or whatever.
Depends on: 221428
Hmm.. so how does XBL currently pull in chrome script? Isn't all script in XBL1 inside the XBL file rather than in external script files? Or are we worried about extensions putting <script> in the anonymous content?
Comment 38•18 years ago
I don't think web pages should have access to extension-provided XBL either, generally speaking.
Basically, what I think we should do is have a chrome registry way to flag packages as "web-accessible" or something, and not allow things not flagged that way to be linked to (fail CheckLoadURI checks for them).
Agreed, I'm just trying to figure out where we need to add blockers. If we want chrome-xbl to be able to have <script>s linking to chrome-scripts we have to somehow flag this on the individual bindings and make script elements check which binding they live in before allowing linking to chrome. Though on second thought that wouldn't be safe since the page can reach in to the binding and insert script elements there. Alternatively we could flag the individual anonymous content nodes, but I bet there'd be ways to work around that too.
So it looks like a first step would be to block non-chrome <script> elements from linking to chrome files, that should take care of the ability to access pure javascript files.
Comment 40•18 years ago
I don't think we need to check where we're linking *from*... only where we're linking *to*:
e.g. it's ok for web content to link to chrome://xul-bindings (scripts, css, xbl bindings, etc)
but not to chrome://adblock
Not sure I follow you. We don't want to block chrome://browser/content/browser.xul from loading browser.js, but we do want to block http://evil.com/ from doing the same.
Comment 42•18 years ago
We want to block evil.com from loading chrome://browser/content/browser.js
But we want evil.com to be able to load chrome://xul-bindings/content/textbox.js
e.g. in the chrome manifest there would be a flag saying the "xul-bindings" package is safe for content to load.
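As a sketch, the manifest flag being proposed here might look like the following (the package names and jar paths are hypothetical; the `contentaccessible=yes` spelling is the form this bug's later patches added to chrome registration):

```
# chrome.manifest sketch: only the flagged package may be loaded by web content
content xul-bindings jar:toolkit.jar!/content/xul-bindings/ contentaccessible=yes
# unflagged packages (e.g. an extension) fail CheckLoadURI checks from content
content adblock jar:adblock.jar!/content/adblock/
```

The point of putting the flag in the manifest is that the chrome registry, not a pref, becomes the single source of truth for which packages are web-accessible.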
Comment 43•18 years ago
Jonas, there are not only <script>s but also images. And images are far more likely to be found in the anonymous nodes. The problem with flagging anonymous nodes is that web sites have access to those and can change attributes like src. So to do this properly you would have to unflag the nodes when content changes the src attribute and flag them again when chrome does it - that's going to be a mess. It is probably easier to track down the source of each request, but this still means lots of changes.
Bsmedberg: do we really have any textbox.js type files that anyone needs to load? Looks to me like all of xul lives in .xml xbl files.
Wladimir: Yes, this doesn't solve images, but fixing scripts seems like it's a step on the way.
Comment 45•18 years ago
sicking, does it matter? The point is that script can load anything from some specific set of chrome packages, and not others (and the default would be inaccessible). This solves the problem of detecting whether an extension is installed (unless the extension wants to make their chrome available to web content, of course).
Sure, I'd be fine with having a flag like that if we think that anyone's actually going to use it.
Hmm.. Would such a flag solve images even? I.e. could we apply that flag to <img>s and only flag the few directories that XUL and editor[*] use as being ok to link to?
[*] I believe that one reason we allow pointing <img>s to chrome urls is because editor needs it to implement some of its editing widgets.
Comment 48•18 years ago
Benjamin, would it still allow an extension to use addBinding() with a protected chrome URL on a content node? I guess not, checkLoadURI will only know that there is a content document that tries to load an URL that it shouldn't load. In Adblock Plus I have exactly that case - I don't want content to be able to load any of its files (including that binding) but I need the possibility to apply the binding when I want to. Don't know how common this case would be, my solution is linked in comment 31.
Comment 49•18 years ago
Yes, that flag should be used for images as well.
Comment 50•18 years ago
Hrm, if you are manually calling addBinding(), why use a chrome: URI instead of, say, a data: URI?
Comment 51•18 years ago
I tried, data: URIs don't work for bindings. I guess the anchor is not recognized.
Comment 52•18 years ago
We could make addBinding() work (make it pass an explicit principal, gotten off the JS stack, to the XBL core, and change that to use that explicit principal). In fact, we'll probably need to do something like that if we decide to _really_ keep track of who's loading the XBL (and give principals to stylesheets, etc).
And yes, XBL doesn't support nsSimpleURI URIs because they don't have anchors as far as our code is concerned.
Comment 53•18 years ago
> I tried, data: URIs don't work for bindings. I guess the anchor is not
> recognized.
Bug 243917 :(
Comment 54•18 years ago
How about adding something like a .access file to each chrome:// directory? Then this file could be consulted to determine which groups can access files in that directory. What exactly these different groups are still needs to be decided, but does this work as a solution?
Comment 55•18 years ago
First, I'm all for making Mozilla more secure, where possible. I would understand if this "untrusted can load chrome scripts as untrusted" feature got chopped.
We have a useful use for it, though: most of the UI is chrome with system privileges, and needs them. Some of it interacts only with the server, though, so in the interest of security, I made it untrusted. We still want to have the same look for everything, though, and we need (almost) the same self-written JS libraries and XBL widgets on both sides of the fence. The solution right now is to let untrusted content load them from chrome: URIs. Any other suitable solution would be welcome, too, and I'd be willing to adapt to any Mozilla changes, given a warning.
There's a potential spoofing danger to allowing untrusted content to access local skins, though, no? Brendan mentioned the changeable skins as anti-spoof protection. (Granted, that's largely irrelevant now that skins are no longer in vogue, with Firefox.)
Comment 56•18 years ago
You could probably implement a very thin shim protocol handler to do this; that way you could control access to exactly the scripts you want to expose.
That's assuming we don't end up with a "native" mechanism for doing that, of course.
Isn't the 'res' protocol still around?
Comment 58•17 years ago
Is this what http://www.0x000000.com/?i=417 is talking about?
Comment 59•17 years ago
(In reply to comment #58)
> Is this what http://www.0x000000.com/?i=417 is talking about?
Yes it is, as I suggested in http://ha.ckers.org/blog/20070811/firefox-remote-variable-leakage/#comment-45648
Didn't notice this passage, though:
"I've tested this against my own Firefox extension called: Fire Encrypter. And I was able to steal a dynamically generated password successfully."
Passwords or private keys hard-coded in a script?!
Or does this extension persistently pollute content in some way?
Comment 61•17 years ago
> I was able to steal a dynamically generated password successfully.
If (just guessing) Fire Encrypter would at initialization/loading time generate a password and store it in a variable, remote content would have access to it. It would just not be the *same* password that the real extension running with system rights uses, because it's a new instance with new variables.
Comment 62•17 years ago
The security of private values being stored in javascript isn't what concerns me as a consumer. What concerns me is that I am becoming less anonymous in that websites can determine what plugins I am running, such as firebug, greasemonkey, and ad-blocker and websites are beginning to block me based on this information.
Comment 63•17 years ago
NoScript 1.1.6.15 addresses this somewhat:
+ noscript.forbidChromeScripts preference (true by default), prevents
script tags in content (non chrome:/resource:/file:) documents from
referencing chrome: scripts, see
https://bugzilla.mozilla.org/show_bug.cgi?id=292789
http://noscript.net/changelog
Comment 64•17 years ago
AdBlock Plus has also addressed it for quite some time:
http://adblockplus.org/en/faq_internal#protectchrome
Comment 65•17 years ago
See discussion at http://forums.mozillazine.org/viewtopic.php?t=590601
I've got Adblock Plus 0.7.5.3 (and NoScript 1.1.7.2) and that website can still tell whether Adblock Plus is enabled or not. It's serving random content when Adblock Plus is enabled. Very disturbing!
Comment 66•17 years ago
This page simply shows random content, every time you do a full reload it is something different. Somebody probably misconfigured the server. Nothing to do with Adblock Plus or the issue discussed here.
Comment 67•17 years ago
If access to the configuration, addons/extensions or even themes is available without consent, it is a privacy issue.
Comment 68•17 years ago
Everyone, please take a step back and go read https://bugzilla.mozilla.org/etiquette.html
Feel free to readd me to the cc list once this bug stops being a pointless-comment-fest.
Comment 69•17 years ago
Having read the previous posts and comments, and later on realizing I could have added detail to my post, I offer the following for a better understanding. Also, sorry if there is some disagreement in my theory; it is not argument but just a consideration.
If one has a mechanical device, one of multiple duplicates manufactured, its operation creates a series of sounds, a signature, specific to that unit, different from the other peer units. It would seem the information which is central to this issue could be treated as a "signature" in its own right. While not actually specific to one individual user, it could create class identifiers which could then be combined with other indicators, such as time, for more definition.
My apology if this comment is not considered relevant.
Comment 70•17 years ago
jruderman: +'ing and assigning to you. Please determine if we are going to fix this or not. Also, setting priority at P4 as we're not going to block a beta any time soon on this.
Assignee: nobody → jruderman
Flags: blocking1.9? → blocking1.9+
Priority: -- → P4
Assignee
Updated•17 years ago
Assignee: jruderman → dveditz
Assignee
Updated•17 years ago
Blocks: CVE-2008-2802
Updated•17 years ago
Flags: tracking1.9+ → blocking1.9+
Assignee
Comment 73•17 years ago
This blocks content access to chrome: urls, even in places where the ALLOW_CHROME CheckLoadURI() flag has been set. To keep too many things from breaking there's a whitelist of allowed packages, initially populated with global and browser so as not to break the feed reader.
Testcases that need to continue working:
+ view a feed (http://rss.news.yahoo.com/rss/topstories#)
should show feed, and media links should show the moz-icon: items
+ Open resource:/// and check directory. Should show the moz-icon:s in the listing.
+ view-source -- should be styled.
This is extensible, so if an extension really needs to allow links to its own chrome (icons maybe?) it can do so, knowing that that potentially allows attackers to fingerprint users of that one extension. For example, Jesse's shell bookmarklet includes scripts from extensiondev if it's installed, so that extension could add the pref "security.chrome_access.extensiondev" and set it to true in order to keep working from the shell bookmarklet.
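The pref-based opt-in described above would look roughly like this in a defaults file (a sketch only; `extensiondev` is just the example package named in this comment, and the `security.chrome_access.*` prefix is the one proposed by this patch):

```
// defaults/preferences sketch: opt one chrome package back in to content access
pref("security.chrome_access.extensiondev", true);
```

An extension that ships such a pref accepts the trade-off spelled out above: its users become fingerprintable through that package.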
Attachment #314817 - Flags: superreview?(jst)
Attachment #314817 - Flags: review?(bzbarsky)
Comment 74•17 years ago
I don't think the extensiondev bits actually work outside of a chrome context anyway, I think they're just included so Jesse doesn't have to maintain two different versions of the file. :)
Comment 75•17 years ago
In the comment in the default prefs, please make clear that the string "browser" is not the English term, but the name of the chrome package. Ideally, give a (commented-out) example of how to add my own chrome package with a funny name. (It's clear to me, but won't be for others.)
Comment 76•17 years ago
Comment on attachment 314817 [details] [diff] [review]
whitelist chrome packages that ALLOW_CHROME applies to
>Index: caps/src/nsScriptSecurityManager.cpp
>+ nsCAutoString accesspref("security.chrome_access.");
Let's make that "security.chrome_access_allowed."
Other than that nit, I'm pretty happy with this approach. Though it might also make sense to ask the chrome registry instead of checking prefs, and have that use bits like it does for XPCNativeWrapper or whatnot... I'd check with bsmedberg to see which he thinks is a better idea.
>Index: browser/app/profile/firefox.js
Please file bugs on Seamonkey and Thunderbird to add such prefs as needed?
Also, please file a bug to move as much as possible out of the "browser" and "global" packages, or to leave things there and move the "web-accessible" stuff to different packages and adjust the prefs appropriately?
There should be some tests here. Certainly tests to make sure things are no longer accessible when they shouldn't be, but ideally also the "this should keep working" tests you list.
Attachment #314817 -
Flags: review?(bzbarsky) → review+
Comment 77•17 years ago
|
||
I certainly think asking the chrome registry is the better long-term approach: it avoids spreading information about a chrome package over a large surface area. And it's quite easy to implement, if all you want is something like:
nsIChromeRegistry::CanScriptAccess(in nsIURL chromeURL)
that's a 30-minute patch
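A sketch of what that addition might look like in XPIDL (the signature is taken from this comment; per the review in comment 82, the method that actually landed was named allowContentToAccess):

```
/**
 * Sketch only -- returns whether unprivileged web content may load
 * resources from the package of the given chrome: URL.
 */
boolean canScriptAccess(in nsIURL chromeURL);
```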
Comment 78•17 years ago
|
||
Attachment #314919 -
Flags: superreview?(djst)
Attachment #314919 -
Flags: review?
Comment 79•17 years ago
|
||
Oh, please pretend that I rev'ved the IID of nsIXULChromeRegistry
Updated•17 years ago
|
Attachment #314919 -
Flags: superreview?(djst) → superreview?(jst)
Updated•17 years ago
|
Attachment #314919 -
Flags: review? → review?(dtownsend)
Assignee | ||
Comment 80•17 years ago
|
||
Alternate implementation of whitelisting using Benjamin's ChromeReg patch.
Attachment #314817 -
Attachment is obsolete: true
Attachment #314954 -
Flags: superreview?(jst)
Attachment #314954 -
Flags: review?(bzbarsky)
Attachment #314817 -
Flags: superreview?(jst)
Comment 81•17 years ago
|
||
Comment on attachment 314954 [details] [diff] [review]
alternate w/chromereg change
I like this!
Attachment #314954 -
Flags: review?(bzbarsky) → review+
Updated•17 years ago
|
Attachment #314919 -
Flags: superreview?(jst) → superreview+
Updated•17 years ago
|
Attachment #314954 -
Flags: superreview?(jst) → superreview+
Comment 82•17 years ago
|
||
Comment on attachment 314919 [details] [diff] [review]
add a contentaccessible flag to chrome registration
>diff --git a/content/base/public/nsIChromeRegistry.idl b/content/base/public/nsIChromeRegistry.idl
>--- a/content/base/public/nsIChromeRegistry.idl
>+++ b/content/base/public/nsIChromeRegistry.idl
>@@ -93,6 +93,13 @@ interface nsIXULChromeRegistry : nsIChro
> * method.
> */
> boolean allowScriptsForPackage(in nsIURI url);
>+
>+ /**
>+ * Content should only be allowed to load chrome JS from certain packages.
>+ * This method reflects the allowContentAccess flag on packages.
>+ * Do not pass non-chrome URIs to this method.
>+ */
>+ boolean allowContentToAccess(in nsIURI url);
> };
Nit: The flag is named "contentaccessible"
r=me on the code changes, however you should add a unit test for the flag.
The test changes in the patch seem to be unrelated to this bug and appear to be broken. It looks like you have added a head js file but not included it in the patch.
Attachment #314919 -
Flags: review?(dtownsend) → review+
Comment 83•17 years ago
|
||
Argh, I spent an extra 30 minutes on the unit tests and then forgot to attach them!
Comment 84•17 years ago
|
||
Mossop looked over my unit tests and suggested one change.
Attachment #314919 -
Attachment is obsolete: true
Attachment #315001 -
Attachment is obsolete: true
Attachment #315008 -
Flags: approval1.9?
Assignee | ||
Updated•17 years ago
|
Attachment #314954 -
Flags: approval1.9?
Assignee | ||
Comment 85•17 years ago
|
||
Attachment #315024 -
Flags: review?(bzbarsky)
Comment 86•17 years ago
|
||
Comment on attachment 315024 [details] [diff] [review]
mochitest
r=bzbarsky if you run the is/ok() things from a load event. Otherwise the images might not have loaded yet...
Attachment #315024 -
Flags: review?(bzbarsky) → review+
Comment 87•17 years ago
|
||
Though ideally we'd have a guaranteed way of testing the "can't load" thing. Right now, if someone moves those resources and then this bug regresses we won't catch that...
Comment 88•17 years ago
|
||
Comment on attachment 315024 [details] [diff] [review]
mochitest
>+ <img id="img-global" src="chrome://global/skin/icons/Error.png">
gnomestripe's use of this image is being deprecated soon, so it may be wise to use something that is actually themed by all three themes, such as "chrome://global/skin/icons/find.png". For toolkit, gnomestripe overlays winstripe, so this might continue to work, but it's not guaranteed to, the way a file that all three themes individually provide would be.
Comment 89•17 years ago
|
||
Comment on attachment 315008 [details] [diff] [review]
add a contentaccessible flag to chrome registration, final
a1.9=beltzner
Attachment #315008 -
Flags: approval1.9? → approval1.9+
Comment 90•17 years ago
|
||
Comment on attachment 314954 [details] [diff] [review]
alternate w/chromereg change
a1.9=beltzner
Attachment #314954 -
Flags: approval1.9? → approval1.9+
Comment 91•17 years ago
|
||
(In reply to comment #87)
> Though ideally we'd have a guaranteed way of testing the "can't load" thing.
Do a chrome test and include in the page an iframe which sources the same script and checks for |typeof ChromeVar == "undefined"|, changing its query string to indicate the result of the test. At load, the outer window can add a load listener to the iframe and set its location to start the test. The second time the iframe listener fires, check for success/failure using the iframe's location. Finally, check |typeof ChromeVar != "undefined"| and you're foolproof.
Assignee | ||
Comment 92•17 years ago
|
||
This one verifies that the chrome we're blocked from loading actually exists, that the load failed because of this bug's fix and not because someone happened to remove one of the resources we're trying to load.
The img onError fires twice: mochitest (in my build) reports 7 tests when there are really only 6.
Attachment #315024 -
Attachment is obsolete: true
Attachment #315049 -
Flags: review?(bzbarsky)
Comment 93•17 years ago
|
||
Comment on attachment 315049 [details] [diff] [review]
updated mochitest
>Index: caps/tests/mochitest/Makefile.in
>+ img.src = uri;
>+ document.getElementById("content").appendChild(img);
This will possibly kick off the load twice (which might explain your error event thing). I wouldn't bother inserting into the document if you're listening for error/load events and don't care about the sizing.
With that one line removed, looks wonderful.
Attachment #315049 -
Flags: review?(bzbarsky) → review+
Comment 94•17 years ago
|
||
Landed the chrome-registry changes on CVS trunk.
Assignee | ||
Updated•17 years ago
|
Keywords: dev-doc-needed,
late-compat
Comment 95•17 years ago
|
||
I just tried to package an extension that adds the contentaccessible=yes flag to its chrome registration, but installation apparently fails:
Warning: Warning: Unrecognized chrome registration modifier 'contentaccessible=yes'
Even if it's reported as a warning, the chrome does not get registered (yesterday's trunk and branch).
How are we supposed to handle backward compatibility for extensions which want to keep working on Fx 2 for some time?
Comment 96•17 years ago
|
||
You can probably use version modifiers of the form:
content mypackage location/ contentaccessible=yes
content mypackage location/ application={ec8030f7-c20a-464f-9b0e-13a3a9e97384} appversion<3.0pre
Comment 97•17 years ago
|
||
That doesn't sound great when you have to support five applications. I guess something like this will work as well:
content mypackage location/
content mypackage location/ contentaccessible=yes
Older browsers should ignore the second line, while in newer browsers the second line should override the first (?). I didn't try it, but that should work?
Comment 98•17 years ago
|
||
Yes, I believe it should work. Please try it and let me know.
Assignee | ||
Comment 99•17 years ago
|
||
> How are we supposed to handle backward compatibility for extensions which want
> to keep working on Fx 2 for some time?
Should we backport this fix to FF2, or even just the chrome registry part? Doesn't help if you need to support down-rev FF2 versions, but if your users are in the vast majority who keep up to date (which is a good bet for NoScript users) you'd be fine.
caps part of fix checked in.
Status: NEW → RESOLVED
Closed: 19 years ago → 17 years ago
Resolution: --- → FIXED
Comment 100•17 years ago
|
||
Are we sure we didn't want to wait until closer to the freeze, so we could have a bigger surprise for extension authors? :(
Can someone here familiar with the details write a doc (or multiple) for MDC describing how to address the use cases that motivated the WONTFIX in comment 8? Given that the remote XUL use cases are important enough to keep us from removing the content-XBL attack surface, I presume we're about to break some pretty important applications with this...
Comment 101•17 years ago
|
||
(In reply to comment #76)
> Please file bugs on Seamonkey and Thunderbird
Oh, don't worry about us, we'll find out when you break us, same as always.
Comment 102•17 years ago
|
||
(In reply to comment #76)
>Also, please file a bug to move as much as possible out of the "browser" and
>"global" packages, or to leave things there and move the "web-accessible" stuff
>to different packages and adjust the prefs appropriately?
Would it work to drop a file into resource: somewhere, and make that import the chrome file e.g. for the four or so stylesheets SeaMonkey and Thunderbird use:
chrome://messenger/skin/messageBody.css
chrome://editor/content/EditorContent.css
chrome://editor/content/EditorAllTags.css
chrome://editor/content/EditorParagraphMarks.css
Comment 103•17 years ago
|
||
(In reply to comment #98)
> Yes, I believe it should work. Please try it and let me know.
Tested it - yes, my suggestion in comment 97 works. Firefox 2 will warn about the unknown "contentaccessible" flag but accept the first declaration. And Firefox 3 will allow access because of the second declaration.
However, as it turns out doing that isn't necessary for Adblock Plus - the call to addBinding happens with chrome privileges, apparently that's enough to attach a chrome binding to content.
Comment 104•17 years ago
|
||
Causes a regression in Camino Flash blocking.
From Bug 292789 Comment 1
> Ew. We didn't get a complete fix for bug 292789, so we might be stuck :(
> http://tinyurl.com/4dbqes
Blocks: 428747
Comment 105•17 years ago
|
||
(In reply to comment #100)
> Are we sure we didn't want to wait until closer to the freeze, so we could have
> a bigger surprise for extension authors? :(
(In reply to comment #101)
> Oh, don't worry about us, we'll find out when you break us, same as always.
Like Phil says.
Comment 106•17 years ago
|
||
> From Bug 292789 Comment 1
Correction: From Bug 428747 Comment 1
Updated•17 years ago
|
Comment 107•17 years ago
|
||
(In reply to comment #102)
> chrome://messenger/skin/messageBody.css
Fix(es) is in bug 428767.
> chrome://editor/content/EditorContent.css
> chrome://editor/content/EditorAllTags.css
> chrome://editor/content/EditorParagraphMarks.css
No fix needed: see bug 428852...
Target Milestone: --- → mozilla1.9
Comment 108•17 years ago
|
||
(In reply to comment #102)
> chrome://editor/content/EditorContent.css
> chrome://editor/content/EditorAllTags.css
> chrome://editor/content/EditorParagraphMarks.css
As with comment #103, EditorAllTags.css is loaded by chrome, so no problem there.
However, I'd still appreciate an answer, as I don't really want to expose all of mailnews just for one CSS file.
I notice that venkman also tries to load chrome stylesheets from content.
Comment 109•17 years ago
|
||
> Would it work to drop a file into resource: somewhere, and make that import the
> chrome file
It should, yes.
Comment 110•17 years ago
|
||
Though that depends on the "resource and chrome are same-origin" thing, which is something we really want to change sometime. Not for 1.9, though.
The "right" solution to that problem is to put the CSS file into its own chrome package which is exposed.
Comment 111•17 years ago
|
||
(In reply to comment #108)
> So as with comment #103 EditorAllTags.css is loaded by chrome, so no problem.
See bug 428852 comment 5 and 6:
The two used and still-working editor CSS files are loaded by
|.addOverrideStyleSheet()| and |.enableStyleSheet(..., false)|,
and not by a SCRIPT tag.
The current bug blocked the SCRIPT tag.
Is it fine that these functions still work?
(Dumb question, as I really don't know anything about all this.)
> However I'd still appreciate an answer as I don't really want to expose all of
> mailnews just for one CSS file.
Sure, it would be nice if bug 428767 could be a short-term only solution...
> I notice that venkman also tries to load chrome stylesheets from content.
I filed bug 428848.
Severity: minor → normal
Comment 112•17 years ago
|
||
In Firefox 3.0 beta 4, I can still do something like this:
if( "nsIFireBug" in Components.interfaces ) {
alert("You seem to have FireBug installed.");
}
This has been used in the wild to block Adblock users. This information leak has been well known for ages, yet I don't see it mentioned anywhere in here. Does Mozilla need an information leak meta-bug?
Comment 113•17 years ago
|
||
Sorry for the spam, but does the resolution of this bug affect the application of a chrome:// XBL to a web page from chrome JS? I am afraid it does...
Comment 114•17 years ago
|
||
Note that the contentaccessible=yes flag is only accepted on the content declaration but will affect skin and locale as well - I think that should be noted in the documentation.
(In reply to comment #113)
> Sorry for the spam, but does the resolution of this bug affect the application
> of a chrome:// XBL to a web page from chrome JS ? I am afraid it does...
Yes, it does - that's the main reason why chrome://browser/ and chrome://toolkit/ have been whitelisted.
Severity: normal → minor
Target Milestone: mozilla1.9 → ---
Updated•17 years ago
|
Severity: minor → normal
Target Milestone: --- → mozilla1.9
Comment 115•17 years ago
|
||
I reworked the test to avoid using setTimeout, and instead rely on the load events. I'm going to land this now in an attempt to address the linux orange that's been around since this test landed.
Attachment #315049 -
Attachment is obsolete: true
Attachment #315488 -
Flags: review?(dveditz)
Comment 116•17 years ago
|
||
James, please do file a bug report on the Components.interfaces information leak. A metabug about leaks of extension information wouldn't hurt (parallel to bug 412525?).
Assignee | ||
Updated•17 years ago
|
Attachment #315488 -
Flags: review?(dveditz) → review+
Comment 118•17 years ago
|
||
Jesse, I filed Bug 429070.
Comment 119•17 years ago
|
||
Comment on attachment 314954 [details] [diff] [review]
alternate w/chromereg change
>+ reg->AllowContentToAccess(targetBaseURI, &accessAllowed);
Eww, no rv check - no wonder attachment 316287 [details] [diff] [review] works.
Comment 120•17 years ago
|
||
Is it possible to do a security check that allows linking to contentaccessible chrome packages? Bug 429721 added <link rel="icon" type="image/png" id="favicon" href="chrome://global/skin/icons/warning-16.png"/> to netError.xhtml which causes SeaMonkey to fail its security check.
Comment 121•16 years ago
|
||
I noticed that this fix broke our extension (LibX) in FF3 RC1 and later.
This breakage occurs despite the fact the scenario against which this fix is directed (untrusted code trying to access chrome:// content) does not apply.
The restriction you put in place now prevents even trusted extension code from adding <img> elements to a document if this img's src attribute refers to a chrome:// URL.
See Bug 436989 for details.
Comment 122•16 years ago
|
||
(In reply to comment #121)
> The restriction you put in place now prevents even trusted extension code from
> adding <img> elements to a document if this img's src attribute refers to a
> chrome:// URL.
>
We don't really care _how_ the img src attribute gets set to "chrome://"; the fact that a web content element is trying to access a chrome resource is all that matters, and we shut it down - unless contentaccessible=yes is set in the chrome manifest. This isn't really a scripting issue, it's a chrome protocol issue.
> See Bug 436989 for details.
>
I marked that INVALID based on the above
Comment 123•16 years ago
|
||
> We don't really care _how_ the img src attribute is set the "chrome://", the
> fact that a web content element is trying to access a chrome resource is all
> that matters, and we shut it down
That strategy is what I am questioning (or looking for rationale.)
The original complaint in this bug report was that sites can detect if an extension is installed if web content is allowed to add elements that refer to chrome:// resources to their DOM.
This goal could have been achieved by disallowing only web content from installing or using chrome:// URLs in img and script elements created by it. Instead, you ignore who is setting the chrome:// URL in an element and simply do not render the element. What is the rationale for this decision? (Other than that it took only 30mins to implement, as noted in comment #77 ?)
Comment 124•16 years ago
|
||
(In reply to comment #123)
> That strategy is what I am questioning (or looking for rationale.)
>
> The original complaint in this bug report was that sites can detect if an
> extension is installed if web content is allowed to add elements that refer to
> chrome:// resources to their DOM.
>
> This goal could have been achieved by disallowing only web content from
> installing or using chrome:// URLs in img and script elements created by it.
> Instead, you ignore who is setting the chrome:// URL in an element and simply
> do not render the element. What is the rationale for this decision? (Other than
> that it took only 30mins to implement, as noted in comment #77 ?)
>
From a security standpoint, it's much better to solve a problem at the mouth of the river than to patch up all the little streams.
Solve something at the root and it's solved everywhere. The fact that this was quick and simple was even better; instead of spending many hours (which are needed in many places) trying to fix all the cases where web content might be able to add something, and missing something, it was done correctly and fairly quickly too.
Sounds like efficiency and good security practice to me. If this has limited features, I would suggest a new bug should be filed asking for the ability to add things to the page, in some way, without letting the page add them itself.
-[Unknown]
Comment 125•16 years ago
|
||
(In reply to comment #123)
> This goal could have been achieved by disallowing only web content from
> installing or using chrome:// URLs in img and script elements created by it.
Why bother? Dump the stuff you want usable by web content in its own named package, set the content-accessible flag, and let 'er rip. If the choice here truly limited what you could do, I might sympathize, but you can opt in to exactly the behavior you had in the past with no skin off your back.
Comment 126•16 years ago
|
||
> Why bother? Dump the stuff you want usable by web content in its own named
> package, set the content-accessible flag, and let 'er rip.
I'm sorry if this hasn't been clear, I do not want anything to be "usable by web content." I don't care what web content uses, knows, or sees about the images I add, as long as they display.
> I would suggest a new bug should be filed asking for the ability to
> add things to the page, in some way, without letting the page add them itself.
Like Bug 436989 ?
Other than that, I do agree with the simplicity argument made in comment #124.
Comment 127•16 years ago
|
||
(In reply to comment #126)
> > Why bother? Dump the stuff you want usable by web content in its own named
> > package, set the content-accessible flag, and let 'er rip.
>
> I'm sorry if this hasn't been clear, I do not want anything to be "usable by
> web content." I don't care what web content uses, knows, or sees about the
> images I add, as long as they display.
>
> > I would suggest a new bug should be filed asking for the ability to
> > add things to the page, in some way, without letting the page add them itself.
>
> Like Bug 436989 ?
Perhaps so, but it sounds like this really hasn't limited features.
All you'd do is create a second chrome content directory in your extension, and expose it under a different base chrome:// URI. So, let's say your current is "chrome://libx/", you'd have "chrome://libx-www/".
Now, since you don't care if web content links to your chrome there (although it's unlikely), you set the content-accessible flag in chrome.manifest.
Then your extension (js code under chrome://libx/) creates an img tag, and sets its src to chrome://libx-www/whatever.png.
And life is good. This is probably a cleaner way to separate it anyway. In fact, if you're using a skin (instead of content) in chrome.manifest for the images you want to reference, you might even be able to just put the flag on there and not worry about anything.
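A chrome.manifest sketch of that layout (the package names and directory paths here are illustrative, following the hypothetical "libx"/"libx-www" split described above):

```
content libx      chrome/content/
content libx-www  chrome/www/ contentaccessible=yes
```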
-[Unknown]
Comment 128•16 years ago
|
||
[ I was going to drop this thread, but I keep getting misunderstood ].
Comment #127 and comment #125 are correct that your fix does not limit features if I set contentaccessible=yes. Doing so produces the behavior that was in Firefox before. It just means that I do not benefit from the intended effect of this fix, which is to allow extensions to stay undetected by untrusted web content sniffing for them.
The suggestion to create a second chrome directory with the contentaccessible=yes flag is, in my opinion, entirely beside the point. If the goal is for an extension to stay undetected, then it can't set the contentaccessible=yes flag on *any* of its chrome paths. If I have to set it on any, I may as well set it on all paths.
My suggestion would be to not consider this issue closed, but mark it for future consideration.
Comment 129•16 years ago
|
||
If you are adding chrome content to a webpage, then you are explicitly telling the page that your extension exists. The page could have a DOM mutation event listener, and notice your images being inserted. Even if we lied about the URL or denied access, the page could probably use that misinformation to realize that it was being modified. There's really no good solution for that.
Comment 130•16 years ago
|
||
True --- but the applied fix, which forces me to set contentaccessible=yes to at least some part of my chrome:// content, leaves me open to detection even by pages whose DOM my extension does not change, just like it was before the fix. So the fix does not achieve its objective for at least some extensions, which is why I wouldn't close it entirely.
Comment 131•16 years ago
|
||
For images, you can construct a data: URL and use it, or probably use a canvas element and drawImage into it with your chrome image element as the source. For scripts, just create a script element and insert the script you want as child content, and then add it to the document -- should work fine. (Or use a data: URL for scripts as well, of course.)
Comment 132•16 years ago
|
||
I'm saying your extension's behavior defies the intent of this fix, so it's a moot point. Just set contentaccessible=yes and you're back where you started.
If you want the protection this fix enables, you will have to find a different way to do what you're doing. End of story.
Comment 133•16 years ago
|
||
data: URIs, to my (possibly incorrect) knowledge, are limited in length and do not make for a fully functional replacement. In our extension, which actually comes in over 360 editions, adopters control the type and size of the images added.
This fix, as is, creates a (minor) pain, no gain situation for our extension and extensions like it. What is starting to make this a major pain is the apparent refusal to acknowledge and document this trade-off, however reasonable it may be.
Comment 134•16 years ago
|
||
We've been acknowledging that trade-off all over the place, in this very conversation. You could add it to the documentation if you wanted, certainly, if you think that the current documentation of it is insufficient. We didn't document "web pages can detect extensions" at all before, so it still seems like your complaint is "this doesn't improve things for me, because I want to mutate content pages without detection" plus "I have to add a few dozen characters to my chrome.manifest" -- not sure why you think that it needs more documentation than just the chrome.manifest change.
I don't know of any limitations on data: URI length in Firefox, and <canvas>+drawWindow should let you work with any image you choose, regardless of the source, and without converting to base64. Tried it?
Comment 135•16 years ago
|
||
Fair enough.
Comment 136•16 years ago
|
||
My profile's userChrome.css became broken in the last 2 weeks.
I wonder if the fix for this bug is the cause.
The userChrome file contains:
#navigator-throbber
{list-style-image : url("chrome://communicator/skin/brand/throbber-single.gif")
!important; }
And the new error (shown in error console) is:
Security Error: Content at
file:///.../Mozilla/SeaMonkey/Profiles/.../chrome/userChrome.css
may not load or link to
chrome://communicator/skin/brand/throbber-single.gif.
Clearly the contents of userChrome.css are not web content, and denying
userChrome.css access to chrome is a bug.
Comment 137•16 years ago
|
||
I have the same problem with an extension of mine, which adds a chrome CSS to mailnews messages; it's BiDi Mail UI 0.9, and I get:
Security Error: Content at mailbox:///C|/Some/Where/MyFolder?number=22870681 may not load or link to chrome://bidimailpack/content/quotebar.css.
this is not userChrome.css but I suspect the cause is the same.
Comment 138•16 years ago
|
||
(In reply to comment #137)
> I have the same problem with an extension of mine, which adds a chrome CSS to
> mailnews messages; it's BiDi Mail UI 0.9, and I get:
>
> Security Error: Content at mailbox:///C|/Some/Where/MyFolder?number=22870681
> may not load or link to chrome://bidimailpack/content/quotebar.css.
Please see comment #132, comment #114, comment #97, etc.
-[Unknown]
Comment 139•16 years ago
|
||
I am yet another person faced with this problem (I tested my extension with some Firefox 3 beta, where it worked perfectly, then all of a sudden it stopped working when Firefox 3 was published). Well, after some googling I found contentaccessible=yes, answered some emails, and I am done.
But - does there exist any document which explains what I should do to avoid contentaccessible? My extension refers to the following content:
- a bunch of images (icons I inject into the page) - here I have no idea how could I avoid contentaccessible
- CSS files which I activate by creating a link element inside head from an overlay - here I could probably inject style elements directly, but as this CSS is rather long I find that unaesthetic at least
- javascript files which I activate by creating script elements inside head from an overlay (I need them to provide callback functions extending built-in page functionality, working in the page context) - here again I could probably inject the javascript directly, but considering the code is fairly long I again find that unnatural
Comment 140•16 years ago
|
||
Marcin,
for images, the Mozilla developers suggest you use data: URIs. They are described here: http://en.wikipedia.org/wiki/Data:_URI_scheme
All you need to do is write the code to convert your data to Base64 and identify the correct MIME type. Good luck.