Closed Bug 1307604 Opened 8 years ago Closed 8 years ago

Add blacklist and whitelist to permit/deny Flash plugin use

Categories

(Core Graveyard :: Plug-ins, defect, P2)

Tracking

(firefox52 wontfix, firefox53 fixed, firefox54 fixed)

RESOLVED FIXED
mozilla54

People

(Reporter: ddurst, Assigned: bytesized)

Attachments

(3 files)

To be able to control enabling of critical Flash, we need to establish an "allowedlist" (by host or source domain). This would be referenced when determining which Flash objects should be allowed to play unimpeded (i.e. rather than CTA or preferring fallback content). Ideally, this "allowedlist" should be modifiable independently of the train schedule.
Depends on: 1314094
Depends on: 1318768
Depends on: 1319571
Depends on: 1321377
Since it does not make sense to write a separate patch for the blacklist and whitelist changes, I am going to combine the two bugs.
Summary: add "allowedlist" to permit Flash plugin use → Add blacklist and whitelist to permit/deny Flash plugin use
Attachment #8822484 - Flags: review?(benjamin)
Hmm. Looks like my build is failing in other OSes due to member initialization order. Fixing...
The tests I wrote are failing on Linux (PLUGIN_UNSUPPORTED). I'm currently asking around as to whether this is fixable or whether the test should simply not run on Linux.
Assignee: nobody → ksteuber
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review102258

::: dom/base/nsDocument.h:1300 (Diff revision 4)
> virtual nsIScriptGlobalObject* GetScriptHandlingObjectInternal() const override;
> virtual bool InternalAllowXULXBL() override;
>
> void UpdateScreenOrientation();
>
> + virtual PluginClassification DocumentPluginClassification() const override;

I am not a DOM peer, nor even a good person to review DOM code. This should be qdot, bz, or somebody at https://wiki.mozilla.org/Modules/All#Document_Object_Model

::: modules/libpref/init/all.js:5096 (Diff revision 4)
> pref("urlclassifier.downloadAllowTable", "goog-downloadwhite-digest256");
> #else
> pref("urlclassifier.downloadAllowTable", "");
> #endif
>
> -pref("urlclassifier.disallow_completions", "test-malware-simple,test-phish-simple,test-unwanted-simple,test-track-simple,test-trackwhite-simple,test-block-simple,goog-downloadwhite-digest256,base-track-digest256,mozstd-trackwhite-digest256,content-track-digest256,mozplugin-block-digest256,mozplugin2-block-digest256");
> +pref("urlclassifier.disallow_completions", "test-malware-simple,test-phish-simple,test-unwanted-simple,test-track-simple,test-trackwhite-simple,test-block-simple,test-flashblockallow-simple,test-flashblockallowexceptions-simple,test-flashblockdeny-simple,test-flashblockdenyexceptions-simple,test-flashblockthirdparty-simple,test-flashblockthirdpartyexceptions-simple,goog-downloadwhite-digest256,base-track-digest256,mozstd-trackwhite-digest256,content-track-digest256,mozplugin-block-digest256,mozplugin2-block-digest256");

Dare I ask what this pref actually *does*? I can't find documentation of it anywhere, and this pattern of listing a bunch of unrelated lists in one pref is a maintenance nightmare.

::: modules/libpref/init/all.js:5171 (Diff revision 4)
> pref("browser.safebrowsing.provider.mozilla.lists.base.name", "mozstdName");
> pref("browser.safebrowsing.provider.mozilla.lists.base.description", "mozstdDesc");
> pref("browser.safebrowsing.provider.mozilla.lists.content.name", "mozfullName");
> pref("browser.safebrowsing.provider.mozilla.lists.content.description", "mozfullDesc");
>
> +pref("urlclassifier.flashBlockAllowTable", "");

This patch doesn't add or update any in-tree docs. Given that the current specification is in a rather confusing google-doc, I think we should arrange the code like this:

* add in-tree .rst docs describing these lists and their intended behavior, reviewed by me
* the DOM and safebrowsing peers will review the code to make sure that the code behavior matches the in-tree docs

That way, if there are any questions, we can resolve them in the docs and there is a permanent record of the desired behavior.

::: toolkit/components/url-classifier/SafeBrowsing.jsm:46 (Diff revision 4)
> }
> return pref.split(",")
>            .map(function(value) { return value.trim(); });
> }
>
> const tablePreferences = [

Similarly, not my code to review. Owner/peers for urlclassifier are at https://wiki.mozilla.org/Modules/All#URL_Classifier

::: toolkit/components/url-classifier/tests/browser/browser.ini:3 (Diff revision 4)
> +[DEFAULT]
> +support-files =
> +  test_flash_block_lists.html

It is unusual and confusing to have a support file start with "test_". Usually only tests have that prefix. Please rename this file to avoid confusion.
Attachment #8822484 - Flags: review?(benjamin)
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review102258

> Dare I ask what this pref actually *does*? I can't find documentation of it anywhere, and this pattern of listing a bunch of unrelated lists in one pref is a maintenance nightmare.

The Safe Browsing standard specifies that only partial hashes of URLs are transmitted; upon a match, the whole hash must be looked up via a completion URL to see if it matches. Tables listed in this preference do not support that behavior and instead expect the entire hashes to be transmitted during the original request.
Attachment #8822484 - Flags: review?(francois)
Attachment #8822484 - Flags: review?(felipc)
Attachment #8822484 - Flags: review?(bzbarsky)
I'm not going to be able to get to this for at least two days; I have too many other things on my plate right now.
Attachment #8822484 - Flags: review?(benjamin) → review+
(In reply to Benjamin Smedberg [:bsmedberg] from comment #15)
> ::: modules/libpref/init/all.js:5096 (Diff revision 4)
> > pref("urlclassifier.downloadAllowTable", "goog-downloadwhite-digest256");
> > #else
> > pref("urlclassifier.downloadAllowTable", "");
> > #endif
> >
> > -pref("urlclassifier.disallow_completions", "test-malware-simple,test-phish-simple,test-unwanted-simple,test-track-simple,test-trackwhite-simple,test-block-simple,goog-downloadwhite-digest256,base-track-digest256,mozstd-trackwhite-digest256,content-track-digest256,mozplugin-block-digest256,mozplugin2-block-digest256");
> > +pref("urlclassifier.disallow_completions", "test-malware-simple,test-phish-simple,test-unwanted-simple,test-track-simple,test-trackwhite-simple,test-block-simple,test-flashblockallow-simple,test-flashblockallowexceptions-simple,test-flashblockdeny-simple,test-flashblockdenyexceptions-simple,test-flashblockthirdparty-simple,test-flashblockthirdpartyexceptions-simple,goog-downloadwhite-digest256,base-track-digest256,mozstd-trackwhite-digest256,content-track-digest256,mozplugin-block-digest256,mozplugin2-block-digest256");
>
> Dare I ask what this pref actually *does*? I can't find documentation of it
> anywhere, and this pattern of listing a bunch of unrelated lists in one pref
> is a maintenance nightmare.

I filed bug 1329349 to clean this up.
Status: NEW → ASSIGNED
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review103562

This looks quite good.

One thing that would be really handy and would enable quick manual testing (probably belongs in a new dependent bug) is to add a few hard-coded URLs and a new test page on itisatrap.org (see https://github.com/mozilla/itisatrap). It might also simplify the test setup for the browser test you added.

Another thing I would like to point out is that the list names are quite long. How about shortening them to:

- `*-flashthirdpartyexcept-simple-*` instead of `*-flashblockthirdpartyexceptions-*`
- `urlclassifier.flashThirdPartyExceptTable` instead of `urlclassifier.flashBlockDenyThirdPartyExceptionsTable`
- etc.

We can assume that nothing means "deny" since that's what Safe Browsing does most of the time, and label the whitelists with "allow".

::: build/pgo/server-locations.txt:185 (Diff revision 6)
> https://tracking.example.com:443
> https://not-tracking.example.com:443
> https://tracking.example.org:443
>
> +#
> +# Used while testing flash blocking

nit: a link to bug 1307604 in that comment could be useful

::: dom/base/nsDocument.cpp:12890 (Diff revision 6)
> + * allow for quick lookups of the classification of plugins in this document.
> + */
> +nsresult
> +nsDocument::ClassifyDocumentPlugins()
> +{
> +  // If flash blocking is disabled, it is equivilant to all sites being

typo: equivalent

::: dom/base/nsDocument.cpp:12932 (Diff revision 6)
> +  nsresult rv;
> +  nsCOMPtr<nsIURIClassifier> uriClassifier =
> +    do_GetService(NS_URICLASSIFIERSERVICE_CONTRACTID, &rv);
> +  NS_ENSURE_SUCCESS(rv, rv);
> +
> +  // URI Classification is by domain in this context, so pass a URI with an

This block is unnecessary because if you have "foo.com/" on the list and you classify "foo.com/bar.html", it will match, as per the [simplified regexp algorithm](https://web.archive.org/web/20160422212049/https://developers.google.com/safe-browsing/developers_guide_v2#RegexLookup).

::: dom/base/nsDocument.cpp:12943 (Diff revision 6)
> +  NS_ENSURE_SUCCESS(rv, rv);
> +
> +  nsTArray<nsCString> results;
> +  rv = uriClassifier->ClassifyLocalWithTables(classificationURI, tables,
> +                                              results);
> +  NS_ENSURE_SUCCESS(rv, rv);

You might want to add an early return here if `results` is empty. I assume that will be the most common case.

::: modules/libpref/init/all.js:5171 (Diff revision 6)
> pref("browser.safebrowsing.provider.mozilla.lists.base.name", "mozstdName");
> pref("browser.safebrowsing.provider.mozilla.lists.base.description", "mozstdDesc");
> pref("browser.safebrowsing.provider.mozilla.lists.content.name", "mozfullName");
> pref("browser.safebrowsing.provider.mozilla.lists.content.description", "mozfullDesc");
>
> +pref("urlclassifier.flashBlockAllowTable", "");

These should be set to the `test-*-simple` lists mentioned above. It won't take effect until we flip the `.enabled` prefs below anyways.

::: modules/libpref/init/all.js:5178 (Diff revision 6)
> +pref("urlclassifier.flashBlockDenyTable", "");
> +pref("urlclassifier.flashBlockDenyExceptionsTable", "");
> +pref("urlclassifier.flashBlockDenyThirdPartyTable", "");
> +pref("urlclassifier.flashBlockDenyThirdPartyExceptionsTable", "");
> +
> +pref("plugins.flashBlock.lists.enabled", false);

Why do we need two settings to control this? Couldn't we just have the list updates be predicated on `plugins.flashBlock.enabled == true`?

::: toolkit/components/url-classifier/flash-block-lists.rst:9 (Diff revision 6)
>
> List based Flash blocking currently uses six lists.
> The lists specify what domains/subdomains Flash is allowed to or denied from loading on.
> The domains specified by the lists indicate the domain of the document that the Flash is loaded in, not the domain hosting the Flash content itself.
>
> * Allow List

It would be good to note the middle part of each list. Something like:

* Allow List (*-flashblockallow-*)
* Allow Exceptions List (*-flashblockallowexceptions-*)
* ...

::: toolkit/components/url-classifier/flash-block-lists.rst:23 (Diff revision 6)
> +Classification
> +==============
> +
> +Documents can be classified as Allow, Deny or Unknown.
> +Documents with an Allow classification may load Flash normally.
> +Documents with a Deny classification may not load Flash at all.

It might be worth repeating here that `Deny` takes precedence over `Allow`.

::: toolkit/components/url-classifier/tests/browser/browser_flash_block_lists.js:151 (Diff revision 6)
> +    expectedPluginFallbackType: Ci.nsIObjectLoadingContent.PLUGIN_CLICK_TO_PLAY,
> +    expectedActivated: false,
> +    expectedHasRunningPlugin: false
> +  },
> +  {
> +    name: "Third-party blocked domain in first-part context",

typo: first-party

::: toolkit/components/url-classifier/tests/browser/browser_flash_block_lists.js:179 (Diff revision 6)
> +    expectedPluginFallbackType: Ci.nsIObjectLoadingContent.PLUGIN_SUPPRESSED,
> +    expectedActivated: false,
> +    expectedHasRunningPlugin: false
> +  },
> +  {
> +    name: "Third-party exception in first-part context",

typo: first-party

::: toolkit/components/url-classifier/tests/browser/browser_flash_block_lists.js:200 (Diff revision 6)
> +  return Task.spawn(function* () {
> +    let depth = 0;
> +    for (let domain of iframeDomains) {
> +      // Firefox does not like to load the same page in its own iframe. Put some
> +      // bogus query strings in the URL to make it happy.
> +      let url = domain + URL_PATH + "?date=" + Date.now() + "rand=" + Math.random();

Is the `rand=` part necessary? The date should be all that's needed (and this would also make the test deterministic).

::: toolkit/components/url-classifier/tests/browser/classifierHelper.js:112 (Diff revision 6)
> +  classifierHelper._updatesToCleanup.push(update);
> +  testUpdate +=
> +    "n:1000\n" +
> +    "i:" + LISTNAME + "\n" +
> +    "ad:1\n" +
> +    "a:" + ADD_CHUNKNUM + ":" + HASHLEN + ":" + CHUNKLEN + "\n" +

The existing code is abusing the Safe Browsing protocol in a way that's not guaranteed to keep working. If we're adding lots of chunks into the same test table, they need unique CHUNKNUM. In this case, we could simply stick all of the URLs into the same CHUNKNUM like we do here:

http://searchfox.org/mozilla-central/rev/0f254a30d684796bcc8b6e2a102a0095d25842bb/toolkit/components/url-classifier/SafeBrowsing.jsm#352-357

I filed bug 1329366 to clean up the original test you copied.

::: toolkit/components/url-classifier/tests/browser/classifierHelper.js:193 (Diff revision 6)
> +    },
> +    updateSuccess: function(requestedTimeout) {
> +      resolve();
> +    }
> +  };
> +  dbService.beginUpdate(listener, "test-flashblockallow-simple,test-unwanted-simple", "");

I don't understand why "test-unwanted-simple" is included here.
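For reference, the update format quoted from classifierHelper.js above can be sketched as follows. `buildTestUpdate` is a hypothetical helper name, the hash values are printable placeholders (real chunk data is raw hash bytes), and the `ad:1` line is simply carried over from the helper quoted above.

```javascript
// Sketch of the Safe Browsing v2 update string built by the test helper
// quoted above. Names and values are illustrative only.
function buildTestUpdate(listName, chunkNum, hashLen, hashes) {
  const data = hashes.join("");
  if (data.length !== hashLen * hashes.length) {
    throw new Error("each hash must be exactly hashLen bytes");
  }
  return (
    "n:1000\n" +             // client back-off interval, in seconds
    "i:" + listName + "\n" + // table the following chunks apply to
    "ad:1\n" +               // kept as-is from the helper above
    "a:" + chunkNum + ":" + hashLen + ":" + data.length + "\n" +
    data                     // CHUNKLEN bytes of concatenated hashes
  );
}

// Two 32-byte "hashes" in a single add chunk. Reusing the same chunk
// number twice within one table is what the review warns against.
const update = buildTestUpdate("test-flashblockallow-simple", 1, 32,
                               ["a".repeat(32), "b".repeat(32)]);
console.log(update.split("\n")[3]); // prints "a:1:32:64"
```

Putting all the URLs for a table into one add chunk, as suggested in the review, means each table only ever sees one `a:` header and the chunk-number uniqueness problem never arises.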
Attachment #8822484 - Flags: review?(francois) → review-
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review103562

> This block is unnecessary because if you have "foo.com/" on the list and you classify "foo.com/bar.html", it will match, as per the [simplified regexp algorithm](https://web.archive.org/web/20160422212049/https://developers.google.com/safe-browsing/developers_guide_v2#RegexLookup).

I know that it will match without the conversion. I thought that this would be a simple optimization, because it would prevent the hashing of a bunch of URLs that will not match, since the list does not contain URLs of that format. Do you think it would be better not to do this?

> These should be set to the `test-*-simple` lists mentioned above.
>
> It won't take effect until we flip the `.enabled` prefs below anyways.

I don't see why the test lists should be in there by default. They are only needed during tests, which explicitly overwrite the prefs with the `test-*-simple` values (as I believe they should). Is there some reason the prefs should be pre-populated with the test lists?

> Why do we need two settings to control this? Couldn't we just have the list updates be predicated on `plugins.flashBlock.enabled == true`?

We could. I intended `plugins.flashBlock.enabled` to be the preference to enable/disable all of Flash blocking (eventually including the work Felipe is doing with heuristics), but thought that it could be good to have a separate preference controlling the Flash blocking lists. If you think only one preference is necessary, we can consolidate.

> It would be good to note the middle part of each list. Something like:
>
> * Allow List (*-flashblockallow-*)
> * Allow Exceptions List (*-flashblockallowexceptions-*)
> * ...

I was not aware that all lists for the same preference had to have the same middle section. Is that convention? Is it possible that some list names would not have the same middle section as other names in that preference? What about `test-unwanted-simple`, which is grouped with lists that have a middle section of `*-malware-*`?

> Is the `rand=` part necessary?
>
> The date should be all that's needed (and this would also make the test deterministic).

I think it is necessary. If the same page is being nested multiple frame levels deep, both copies could well be requested in under a second, causing `Date.now()` to return the same value for both.

> The existing code is abusing the Safe Browsing protocol in a way that's not guaranteed to keep working.
>
> If we're adding lots of chunks into the same test table, they need unique CHUNKNUM.
>
> In this case, we could simply stick all of the URLs into the same CHUNKNUM like we do here:
>
> http://searchfox.org/mozilla-central/rev/0f254a30d684796bcc8b6e2a102a0095d25842bb/toolkit/components/url-classifier/SafeBrowsing.jsm#352-357
>
> I filed bug 1329366 to clean up the original test you copied.

It sounds like you are saying that repeated calls to this function may not work properly, because later calls will overwrite the URLs previously added to the table rather than adding to them. However, there is really no use case for calling this function more than once. Do you want me to document its behavior and possibly change its name? Or do you want me to write a system where different chunk numbers are used for each insertion (even though only one insertion is likely to ever be made)? If so, do I need to make sure that the chunk number picked is one that has not been used? How do I do that?

> I don't understand why "test-unwanted-simple" is included here.

I pulled it from [here](http://searchfox.org/mozilla-central/source/toolkit/components/url-classifier/tests/mochitest/classifierCommon.js#41). I don't really understand why it was that way there either. That code ultimately updates more than just those tables even in its [original usage](http://searchfox.org/mozilla-central/source/toolkit/components/url-classifier/tests/mochitest/test_classifier.html#24). What *should* go there?
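The matching behavior discussed above (why "foo.com/" on a list matches a lookup of "foo.com/bar.html" with no extra conversion) comes from the Safe Browsing v2 host-suffix/path-prefix expansion. Here is a rough, simplified sketch; all helper names are hypothetical, and the real implementation canonicalizes the URL first and enforces stricter limits on the combinations generated.

```javascript
// Simplified sketch of Safe Browsing v2 lookup-expression generation.
function hostSuffixes(host) {
  const parts = host.split(".");
  const hosts = [host];
  // up to four suffixes formed from the last five host components
  for (let i = Math.max(parts.length - 5, 1); i < parts.length - 1; i++) {
    hosts.push(parts.slice(i).join("."));
  }
  return hosts;
}

function pathPrefixes(path) {
  const noQuery = path.split("?")[0];
  const prefixes = new Set(["/"]);
  const segs = noQuery.split("/").filter(Boolean);
  let prefix = "";
  // directory prefixes, capped at a few components as in the spec
  for (const seg of segs.slice(0, -1).slice(0, 3)) {
    prefix += "/" + seg;
    prefixes.add(prefix + "/");
  }
  prefixes.add(noQuery);                     // exact path
  if (path !== noQuery) prefixes.add(path);  // exact path with query
  return [...prefixes];
}

function lookupExpressions(host, path) {
  const exprs = [];
  for (const h of hostSuffixes(host)) {
    for (const p of pathPrefixes(path)) {
      exprs.push(h + p);
    }
  }
  return exprs;
}

// "foo.com/" is among the expressions generated for foo.com/bar.html,
// so a list entry for the bare domain matches without any URI rewriting.
console.log(lookupExpressions("foo.com", "/bar.html").includes("foo.com/")); // true
```

Every generated expression is hashed and checked against the table, which is also why the proposed "strip the path before classifying" optimization only saves a handful of hash computations per document.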
Flags: needinfo?(francois)
Flags: needinfo?(francois)
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review103562

> I know that it will match without the conversion. I thought that this would be a simple optimization because it would prevent the hashing of a bunch of URLs that will not match, since the list does not contain URLs of that format. Do you think it would be better not to do this?

I don't have numbers on this, so I don't know if that optimization is worth it. I would be tempted to leave it out to reduce the amount of code.

> I don't see why the test lists should be in there by default. They are only needed during tests, which explicitly overwrite the prefs with the `test-*-simple` values (as I believe they should). Is there some reason the prefs should be pre-populated with the test lists?

The reason why we include the `test-*-simple` lists in all of the other `urlclassifier.*Table` prefs is so that we can easily do manual tests via pages like https://itisatrap.org/firefox/its-a-trap.html and https://itisatrap.org/firefox/its-a-tracker.html

> We could. I intended `plugins.flashBlock.enabled` to be the preference to enable/disable all of Flash blocking (eventually including the work Felipe is doing with heuristics), but thought that it could be good to have a separate preference controlling the Flash blocking lists. If you think only one preference is necessary, we can consolidate.

If you wanted to just disable the lists, you could also just clear the relevant `urlclassifier.flashBlock*` prefs.

> I was not aware that all lists for the same preference had to have the same middle section. Is that convention? Is it possible that some list names would not have the same middle section as other names in that preference? What about `test-unwanted-simple`, which is grouped with lists that have a middle section of `*-malware-*`?

`unwanted` is a little strange in that we use the middle part to differentiate between `malware` and `unwanted` and show the right error page, but for historical reasons (we didn't want extra pref UI at the time it landed) they are both enabled via the same `urlclassifier.malwareTable` pref.

In the case of FlashBlocking, we might be able to simplify the classification code even further:

- why not have a single `urlclassifier.flashblockTable` pref with all of the tables? we could eliminate all of the `MaybeAddTableToTableList()` calls
- change the checks from `if (ArrayContainsTable(results, denyTables)` to something like `if (ArrayContainsTable(results, '-flashblockdeny-')`

> I think it is necessary. If the same page is being nested multiple frame levels deep, they could well both be requested in under a second, causing `Date.now()` to return the same value for both.

Good point. Unless we add IDs to the frames and include that, the time might be the same, or worse, it could be intermittently the same.

> It sounds like you are saying that repeated calls to this function may not work properly because it will overwrite the URLs previously added to the table rather than adding to them. However, there is really no use case for calling this function more than once. Do you want me to document its behavior and possibly change its name? Do you want me to write a system where different chunk numbers are used for each insertion (even though only one insertion is likely to ever be made)? If so, do I need to make sure that the chunk number picked is one that has not been used? How do I do that?

If it's not going to be used to add more than one URL at a time, then I guess we can remove the for loop and just add a single entry to `testUpdate`?

> I pulled it from [here](http://searchfox.org/mozilla-central/source/toolkit/components/url-classifier/tests/mochitest/classifierCommon.js#41). I don't really understand why it was that way there either. That code ultimately updates more than just those tables even in its [original usage](http://searchfox.org/mozilla-central/source/toolkit/components/url-classifier/tests/mochitest/test_classifier.html#24). What *should* go there?

I'm quite confused by the original test too. I'd say if it's not needed, remove `test-unwanted-simple` to reduce the confusion. Otherwise, if it needs to be there for the test to pass, then let's figure this out later when we clean up our tests.
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review103562

> `unwanted` is a little strange in that we use the middle part to differentiate between `malware` and `unwanted` and show the right error page but for historical reasons (we didn't want extra pref UI at the time it landed) they are both enabled via the same `urlclassifier.malwareTable` pref.
>
> In the case of FlashBlocking, we might be able to simplify the classification code even further:
>
> - why not have a single `urlclassifier.flashblockTable` pref with all of the tables? we could eliminate all of the `MaybeAddTableToTableList()` calls
> - change the checks from `if (ArrayContainsTable(results, denyTables)` to something like `if (ArrayContainsTable(results, '-flashblockdeny-')`

That doesn't seem simpler to me. I don't like the idea of having tables that serve totally different purposes all be in the same preference and then differentiating them using magic substrings. I tend to expect the names of things to contain semantic information, but not functional information.

Indeed, you just had to explain to me how those two lists (`malware` and `unwanted`) do different things despite being in the same preference. By separating the lists by functionality and giving them descriptive preference names, the explanation is no longer necessary and the code is more maintainable.

I feel that your proposal results in a reduction in self-documentation without any obvious benefit. Why would we do this?

> If it's not going to be used to add more than one URL at a time, then I guess we can remove the for loop and just add a single entry to `testUpdate`?

It does add multiple URLs to multiple tables, but I thought that was acceptable. My understanding was that a chunk number corresponds to any number of URLs added to a particular table at one time. Can't we say "Add these two URLs to table1 as chunk 100. Also add these two different URLs to table2 as chunk 100." without issues, since they are being added to different tables?

> I'm quite confused by the original test too. I'd say if it's not needed, remove `test-unwanted-simple` to reduce the confusion. Otherwise if it needs to be there for the test to pass, then let's figure this out later when we clean up our tests.

Do I remove both table names or just one? I'm not really sure what the other one is there for either.
Flags: needinfo?(francois)
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

(I think you have enough reviewers here that I don't need to do a detailed review too. I'm keeping an eye on this bug, of course, and looking over the patch to make sure it fits the code from bug 1282484.)
Attachment #8822484 - Flags: review?(felipc) → feedback+
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review103562

> That doesn't seem simpler to me. I don't like the idea of having tables that serve totally different purposes all be in the same preference and then differentiating them using magic substrings. I tend to expect the names of things to contain semantic information, but not functional information.
>
> Indeed, you just had to explain to me how those two lists (`malware` and `unwanted`) do different things despite being in the same preference. By separating the lists by functionality and giving them descriptive preference names, the explanation is no longer necessary and the code is more maintainable.
>
> I feel that your proposal results in a reduction in self-documentation without any obvious benefit. Why would we do this?

For better or for worse, both the middle part and the last part of list names already have functional implications in the URL classifier. The last part is used to determine which parser to use, and the middle part is used to determine the type of error page to show, for example. The only part that is free-form is the first one.

> It does add multiple URLs to multiple tables, but I thought that was acceptable. My understanding was that a chunk number corresponds to any number of URLs added to a particular table at one time. Can't we say "Add these two URLs to table1 as chunk 100. Also add these two different URLs to table2 as chunk 100." without issues since they are being added to different tables?

Sorry, I was a little sloppy in my last comment. You're right. As long as you don't add chunk 100 twice in the same table, you're good. You can add multiple URLs to the same table, either in a single chunk or in multiple chunks, and you can add the same chunks to different tables. The chunk numbers are table-specific.

> Do I remove both table names or just one? I'm not really sure what the other one is there for either.

If the test passes, it's probably fine to remove both. I'm not sure either about the purpose of this in that test.
Flags: needinfo?(francois)
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review103562

I am happy to change the table names. Do we want to use `*-flash-*`, `*-flashexcept-*`, `*-flashallow-*`, `*-flashallowexcept-*`, `*-flashthirdparty-*`, `*-flashthirdpartyexcept-*`?

> For better or for worse, both the middle part and the last part of list names already have functional implications in the URL classifier. The last part is used to determine which parser to use, and the middle part is used to determine the type of error page to show, for example. The only part that is free-form is the first one.

It is fine that Shavar uses list names to convey information, but I still see no reason to change my code to do the same when prefs can do the same thing more clearly. Unless you object, I will add the list name conventions to the documentation, but will not change the code to use table names this way.

> Sorry, I was a little sloppy in my last comment. You're right. As long as you don't add chunk 100 twice in the same table, you're good.
>
> You can add multiple URLs to the same table, either in a single chunk or in multiple chunks, and you can add the same chunks to different tables. The chunk numbers are table-specific.

So do you want me to fix something here or not? What changes are necessary?
Flags: needinfo?(francois)
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review103562

These names look good to me. We can use these shorter ones in the pref names too.

> So do you want me to fix something here or not? What changes are necessary?

It looks like you're only adding a single chunk to each table, so this will work. You can leave it as-is. I've just added a note in bug 1329366 so that this gets cleaned up at the same time.
Flags: needinfo?(francois)
Comment on attachment 8822484 [details]
Bug 1307604 - Add allow and deny lists for Flash Blocking

https://reviewboard.mozilla.org/r/101394/#review107706

::: toolkit/components/url-classifier/flash-block-lists.rst:9 (Diff revisions 6 - 8)
>
> List based Flash blocking currently uses six lists.
> The lists specify what domains/subdomains Flash is allowed to or denied from loading on.
> The domains specified by the lists indicate the domain of the document that the Flash is loaded in, not the domain hosting the Flash content itself.
>
> -* Allow List
> +* Allow List (list naming convention: base-flashallow-\*)

Now that we have only 3 list types instead of 6, it might be better not to mention the naming convention, since it looks a little confusing. Sorry for changing my mind on this.
Attachment #8822484 - Flags: review?(francois) → review+
@bz Are you still available to review this patch?
Flags: needinfo?(bzbarsky)
I should be able to get to this tomorrow. Did any of the above discussion affect the parts I assume you want me to review (the dom/base pieces)?

One question I do have already: why do we want to ClassifyDocumentPlugins on every SetDocumentURI? Do we really care about the entire document URI, including the path, ref, etc.?
Flags: needinfo?(bzbarsky) → needinfo?(ksteuber)
The discussion pretty much just covered the preferences, naming, and testing related to lists. I don't think it affected any dom/base pieces. You are correct that documents only need to be reclassified if the domain (or subdomain) changes. Is there a better place that you would suggest calling ClassifyDocumentPlugins from?
Flags: needinfo?(ksteuber)
bz -- can you give us an idea of when you think you'll be able to look at this? I don't want to throw bytesized onto something else before this is wrapped up.
Flags: needinfo?(bzbarsky)
Yes, I'm looking at it right now, for the last hour or so. ;)
Flags: needinfo?(bzbarsky)
\o/ (sorry to interrupt. Thanks!)
Comment on attachment 8822484 [details] Bug 1307604 - Add allow and deny lists for Flash Blocking https://reviewboard.mozilla.org/r/101394/#review109046 I'm really sorry for the lag here. It took me a while to figure out the context for all this stuff, including why we couldn't do this via the normal permission manager codepaths, etc... I probably should have just asked for an explanation earlier. :( I hope I did understand things well enough that the advice below won't lead you astray. ::: dom/base/nsDocument.h:1382 (Diff revision 10) > > // The root of the doc tree in which this document is in. This is only > // non-null when this document is in fullscreen mode. > nsWeakPtr mFullscreenRoot; > > + // The classification that applies to all plugins in this document. So I think we should just have a single PluginClassification member, because whether a document is top level or not does not change. More on this below. ::: dom/base/nsDocument.cpp:2790 (Diff revision 10) > { > nsCOMPtr<nsIURI> oldBase = GetDocBaseURI(); > mDocumentURI = NS_TryToMakeImmutable(aURI); > nsIURI* newBase = GetDocBaseURI(); > > + nsresult rv = ClassifyDocumentPlugins(); Alright, so per conversation with ddurst (and finally getting hold of the design doc), there are a few issues here: 1) We probably want to base this on the URI of the document's principal, not the URI of the document. Otherwise sites could try to evade a blacklist entry just by putting the flash in a srcdoc iframe or about:blank iframe or whatnot. Though it would be somewhat mitigated by walking up the parent chain, but window.open() with about:blank could still evade things. 2) We don't care about the non-hostname parts of the URI, which is good, because those parts aren't really well-reflected in the principal anyway. And the good news is that the hostname of the document's principal doesn't ever change after a very early part of the document's lifetime, so we don't have to worry about stuff changing on us. 
There's one other problem that remains if we do that: how do we want to handle URIs (coming from a principal) without a host? In practice this could be any of:

a) file:// URI
b) data: (or other host-less scheme) loaded directly from URL bar.
c) iframe sandboxed without allow-same-origin
d) A load that had resetPrincipalsToNullPrincipal() called on its loadinfo.

I _think_ those are conceptually the only ways this could happen. This is mostly a policy decision. What the patches here currently implement is that "no host in the URI" means eUnknownPlugin, which is the same as "not on any of the lists". It's not clear to me that this is desirable. Plausible policies here:

* No host means deny. Would break the file:// URI case, but ok.
* Null principal means deny, otherwise no host means treat as unlisted. This would disallow cases (b), (c), (d) but allow (a).
* Deny if sandboxed, else treat as unlisted if no host. This would disallow (c) but allow the others.

Telling apart (b) and (d) is a bit hard, but doable if we directly examine the loadinfo. In any case, I think we should do this classification stuff lazily, not eagerly. That way we don't pay any performance costs in documents that don't try to use flash. This is especially important for things like XHR responseXML, documents created via DOMParser, etc: doing all this work there is a complete waste of time. We could exclude them directly, but it's probably simpler to just do the work lazily and get the data document behavior for free, since nsObjectLoadingContent::LoadObject does the IsLoadedAsData check quite early. We'll want to add a comment there pointing out that doing that check up front, as opposed to waiting for AsyncOpen2 to do it, is very important... Anyway, so if we do this lazily then we can have a single member for our flash plugin state, pre-initialized to a "figure it out" enum value. Then when DocumentPluginClassification is called it checks the member. 
If it's set to "figure it out" it figures it out and saves this value, which can't change for the lifetime of the document. Then it just returns the cached value. ::: dom/base/nsDocument.cpp:13001 (Diff revision 10) > + aTableList.AppendLiteral(","); > + } > + aTableList.Append(aTableNames); How long do we expect these strings to get in practice? Should these appends be fallible, or will these be fairly short lists? ::: dom/base/nsDocument.cpp:13023 (Diff revision 10) > + nsACString::const_iterator strStart, strEnd; > + aTableNames.BeginReading(strStart); > + aTableNames.EndReading(strEnd); > + // This check is sufficient because table names cannot contain commas and > + // cannot contain another existing table name. > + if (FindInReadable(table, strStart, strEnd)) { How about: if (FindInReadable(table, aTableNames)) { and drop the manual iterator bits? ::: dom/base/nsDocument.cpp:13035 (Diff revision 10) > +/** > + * Sets mTopLevelPluginClassification and mThirdPartyPluginClassification to > + * allow for quick lookups of the classification of plugins in this document. > + */ > +nsresult > +nsDocument::ClassifyDocumentPlugins() It might be a good idea to have a pointer to the .rst documenting these prefs here, in a code comment. ::: dom/base/nsDocument.cpp:13077 (Diff revision 10) > + } > + > + nsresult rv; > + nsCOMPtr<nsIURIClassifier> uriClassifier = > + do_GetService(NS_URICLASSIFIERSERVICE_CONTRACTID, &rv); > + NS_ENSURE_SUCCESS(rv, rv); This failure should probably make us fall back to "deny" or something, right? 
::: dom/base/nsDocument.cpp:13081 (Diff revision 10) > + do_GetService(NS_URICLASSIFIERSERVICE_CONTRACTID, &rv); > + NS_ENSURE_SUCCESS(rv, rv); > + > + nsTArray<nsCString> results; > + rv = uriClassifier->ClassifyLocalWithTables(mDocumentURI, tables, results); > + NS_ENSURE_SUCCESS(rv, rv); This failure could mean a bunch of things, but we have other code assuming that urlclassifier returning NS_ERROR_MALFORMED_URI means "no hostname" (which brings us back to the previous discussion) and any other return value means something actually went wrong (which should perhaps result in "deny"?). ::: dom/base/nsDocument.cpp:13110 (Diff revision 10) > +{ > + aClassification = eUnknownPlugin; > + > + nsCOMPtr<nsIDocShellTreeItem> current = this->GetDocShell(); > + nsCOMPtr<nsIDocShellTreeItem> parent; > + nsresult rv = current->GetSameTypeParent(getter_AddRefs(parent)); GetSameTypeParent never fails in practice (we should probably document that in the IDL or something), so it would be OK to just assert that and not worry about this method being able to fail. Then you could make it just return the PluginClassification value. ::: dom/base/nsDocument.cpp:13115 (Diff revision 10) > + nsresult rv = current->GetSameTypeParent(getter_AddRefs(parent)); > + NS_ENSURE_SUCCESS(rv, rv); > + > + if (parent) { > + // Not the top level document > + if (mThirdPartyPluginClassification == eDeniedPlugin || OK, so this is NOT the normal meaning of "third party", because we're not bothering to check whether we're actually third-party with our parent or toplevel or anything. We should probably use "NonTopLevel" or something instead of "ThirdParty" in the pref name and our variable names, and so forth, since that's what it means. 
::: dom/base/nsDocument.cpp:13119 (Diff revision 10) > + // Not the top level document > + if (mThirdPartyPluginClassification == eDeniedPlugin || > + mTopLevelPluginClassification == eDeniedPlugin) { > + aClassification = eDeniedPlugin; > + } else { > + nsCOMPtr<nsIDocument> parentDocument = parent->GetDocument(); Unfortunately, parent->GetDocument() may not be our parent document, in cases when parent got navigated. You probably want to just use GetParentDocument() here, which will do the right thing. You do still need the "parent" thing, because GetParentDocument() will happily cross the content/chrome boundary. ::: dom/base/nsDocument.cpp:13121 (Diff revision 10) > + mTopLevelPluginClassification == eDeniedPlugin) { > + aClassification = eDeniedPlugin; > + } else { > + nsCOMPtr<nsIDocument> parentDocument = parent->GetDocument(); > + // Recurse up to the top level document > + aClassification = parentDocument->DocumentPluginClassification(); So just to be clear, since as far as I can tell the classification of a document is an invariant, we can just do the parent check when computing our own classification, etc. Which is why it's enough to store a single PluginClassification result. ::: dom/base/nsDocument.cpp:13124 (Diff revision 10) > + nsCOMPtr<nsIDocument> parentDocument = parent->GetDocument(); > + // Recurse up to the top level document > + aClassification = parentDocument->DocumentPluginClassification(); > + // "Allow" classification can override the parent's "Unknown" > + // classification, but not a "Deny" classification. > + // (This scenario results in a trusted frame in an unknown site). s/results in/results from/? ::: dom/base/nsIDocument.h:2901 (Diff revision 10) > + enum PluginClassification { > + eUnknownPlugin, > + eAllowedPlugin, > + eDeniedPlugin > + }; > + virtual PluginClassification DocumentPluginClassification() const = 0; This function is totally Flash-specific, right? How about we have the name reflect that? 
So DocumentFlashClassification, and FlashClassification for the enum. Which should probably become an enum class. The values can then be "Unknown" (or "Unlisted?"), "Allowed", "Denied", "Uninitialized" or so. ::: dom/base/nsObjectLoadingContent.cpp:3428 (Diff revision 10) > return false; > } > > switch (enabledState) { > case nsIPluginTag::STATE_ENABLED: > - return true; > + return documentClassification == nsIDocument::eAllowedPlugin; So this part changes behavior for non-Allow cases to "click to play" in practice, right? And I guess this is ok for now because if "plugins.flashBlock.enabled" is false everything ends up with the Allow policy? So we're not actually changing any behavior until we flip that flag? ::: modules/libpref/init/all.js:5197 (Diff revision 10) > pref("browser.safebrowsing.provider.mozilla.lists.base.name", "mozstdName"); > pref("browser.safebrowsing.provider.mozilla.lists.base.description", "mozstdDesc"); > pref("browser.safebrowsing.provider.mozilla.lists.content.name", "mozfullName"); > pref("browser.safebrowsing.provider.mozilla.lists.content.description", "mozfullDesc"); > > +pref("urlclassifier.flashAllowTable", "test-flashallow-simple"); Do we really want to ship this to our users? Or should this be set in the test profile's prefs.js instead? The latter approach means we'd have to update the test profile whenever we change the main prefs.js for these prefs... though in practice tests shouldn't be accessing any hostnames that would normally appear on our lists, right? So it might be enough to have the test profile have just the test lists.
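The lazily computed, cached classification suggested earlier in this review (a single member pre-initialized to a "figure it out" sentinel, filled in on first use) could be sketched roughly like this. The class and member names below are illustrative stand-ins, not the actual nsDocument code:

```cpp
#include <cassert>

// "Unclassified" is the "figure it out" sentinel; the other values are
// the real classification states discussed in the review.
enum class FlashClassification { Unclassified, Unknown, Allowed, Denied };

// Hypothetical stand-in for nsDocument; only the caching pattern is real.
class DocumentSketch {
public:
  FlashClassification DocumentFlashClassification() {
    if (mFlashClassification == FlashClassification::Unclassified) {
      // Computed at most once: a document's principal (and therefore its
      // classification) cannot change after early document setup, so the
      // cached value stays valid for the document's lifetime.
      mFlashClassification = ComputeFlashClassification();
    }
    return mFlashClassification;
  }

private:
  FlashClassification ComputeFlashClassification() {
    // Placeholder for the real work: list lookup via the URL classifier
    // plus the parent-chain walk.
    return FlashClassification::Unknown;
  }

  FlashClassification mFlashClassification = FlashClassification::Unclassified;
};
```

Documents that never try to load Flash never pay for the lookup, which is the point of doing this lazily rather than in SetDocumentURI.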
Attachment #8822484 - Flags: review?(bzbarsky) → review-
(In reply to Boris Zbarsky [:bz] (still a bit busy) from comment #46) > ::: modules/libpref/init/all.js:5197 > (Diff revision 10) > > pref("browser.safebrowsing.provider.mozilla.lists.base.name", "mozstdName"); > > pref("browser.safebrowsing.provider.mozilla.lists.base.description", "mozstdDesc"); > > pref("browser.safebrowsing.provider.mozilla.lists.content.name", "mozfullName"); > > pref("browser.safebrowsing.provider.mozilla.lists.content.description", "mozfullDesc"); > > > > +pref("urlclassifier.flashAllowTable", "test-flashallow-simple"); > > Do we really want to ship this to our users? Or should this be set in the > test profile's prefs.js instead? > > The latter approach means we'd have to update the test profile whenever we > change the main prefs.js for these prefs... though in practice tests > shouldn't be accessing any hostnames that would normally appear on our > lists, right? So it might be enough to have the test profile have just the > test lists. We ship similar test lists to our users for all other Safe Browsing lists. These lists only contain entries on the itisatrap.org domain so it won't come up normally. They allow us to do quick manual tests (on official builds) of the features that depend on these lists.
Comment on attachment 8822484 [details] Bug 1307604 - Add allow and deny lists for Flash Blocking https://reviewboard.mozilla.org/r/101394/#review109046 > So I think we should just have a single PluginClassification member, because whether a document is top level or not does not change. > > More on this below. I thought about doing it this way originally, but I wasn't sure if a level change was possible for a document. I'll change it. To be clear though, right click->Show only this Frame (or something similar) does not move a sub-document to the top level, right? > Alright, so per conversation with ddurst (and finally getting hold of the design doc), there are a few issues here: > > 1) We probably want to base this on the URI of the document's principal, not the URI of the document. Otherwise sites could try to evade a blacklist entry just by putting the flash in a srcdoc iframe or about:blank iframe or whatnot. Though it would be somewhat mitigated by walking up the parent chain, but window.open() with about:blank could still evade things. > > 2) We don't care about the non-hostname parts of the URI, which is good, because those parts aren't really well-reflected in the principal anyway. > > And the good news is that the hostname of the document's principal doesn't ever change after a very early part of the document's lifetime, so we don't have to worry about stuff changing on us. > > There's one other problem that remains if we do that: how do we want to handle URIs (coming from a principal) without a host? In practice this could be any of: > > a) file:// URI > b) data: (or other host-less scheme) loaded directly from URL bar. > c) iframe sandboxed without allow-same-origin > d) A load that had resetPrincipalsToNullPrincipal() called on its loadinfo. > > I _think_ those are conceptually the only ways this could happen. > > This is mostly a policy decision. 
What the patches here currently implement is that "no host in the URI" means eUnknownPlugin, which is the same as "not on any of the lists". It's not clear to me that this is desirable. > > Plausible policies here: > > * No host means deny. Would break the file:// URI case, but ok. > * Null principal means deny, otherwise no host means treat as unlisted. This would disallow cases (b), (c), (d) but allow (a). > * Deny if sandboxed, else treat as unlisted if no host. This would disallow (c) but allow the others. > > Telling apart (b) and (d) is a bit hard, but doable if we directly examine the loadinfo. > > In any case, I think we should do this classification stuff lazily, not eagerly. That way we don't pay any performance costs in documents that don't try to use flash. This is especially important for things like XHR responseXML, documents created via DOMParser, etc: doing all this work there is a complete waste of time. We could exclude them directly, but it's probably simpler to just do the work lazily and get the data document behavior for free, since nsObjectLoadingContent::LoadObject does the IsLoadedAsData check quite early. We'll want to add a comment there pointing out that doing that check up front, as opposed to waiting for AsyncOpen2 to do it, is very important... > > Anyway, so if we do this lazily then we can have a single member for our flash plugin state, pre-initialized to a "figure it out" enum value. Then when DocumentPluginClassification is called it checks the member. If it's set to "figure it out" it figures it out and saves this value, which can't change for the lifetime of the document. Then it just returns the cached value. I can probably switch from classifying by URI to principal pretty easily. As for the rest, I think ideally we would allow case (a) and deny case (b). I don't really understand cases (c) or (d). Particularly case (c): why can't we classify sandboxed iframes? Do they not have a principal? 
Do they not have a pointer to their parent document? It seems like as long as we have those two things we should be able to classify it properly, right? (Sorry, I know very little about sandboxed iframes). > How long do we expect these strings to get in practice? Should these appends be fallible, or will these be fairly short lists? I don't expect any individual pref to contain more than two tables (the test table and the "real" table). A rough, back-of-the-envelope calculation places the upper bound around 400 characters (longest table name is 33 characters plus one comma character times 6 prefs times 2 tables per pref = 408 characters). Maybe it makes sense to make these fallible. > It might be a good idea to have a pointer to the .rst documenting these prefs here, in a code comment. What is the correct way to point to a document in the tree? Path from mozilla-central directory? > This failure should probably make us fall back to "deny" or something, right? I agree, that would be safer. I'll change that. > OK, so this is NOT the normal meaning of "third party", because we're not bothering to check whether we're actually third-party with our parent or toplevel or anything. We should probably use "NonTopLevel" or something instead of "ThirdParty" in the pref name and our variable names, and so forth, since that's what it means. While I agree that the implementation is not totally faithful to the name, this is what this list/feature has been called since its creation. So although it is not technically accurate, I tend to think we should keep the name consistent with what we have been calling it in the documentation/specifications. If anything, we might consider making the behavior closer to what "third party" actually means. I didn't do this originally because it adds a bit of complexity that frankly I thought seemed unnecessary. > Unfortunately, parent->GetDocument() may not be our parent document, in cases when parent got navigated. 
> > You probably want to just use GetParentDocument() here, which will do the right thing. You do still need the "parent" thing, because GetParentDocument() will happily cross the content/chrome boundary. I am confused about how the parent gets navigated while the child is still around, but I'll ignore that for now. I also do not understand the second bit. How do I keep from travelling outside of content documents when ascending the tree? > So just to be clear, since as far as I can tell the classification of a document is an invariant, we can just do the parent check when computing out own classification, etc. Which is why it's enough to store a single PluginClassification result. Now I am a little confused. Didn't you say in the last comment that the parent can be navigated away? Doesn't that mean that we need to check with the parent every time to see if its classification has changed? > s/results in/results from/? Well I was thinking of it as the classification causes the frame to be treated as trustworthy. You could also say the frame's trustworthiness causes its classification. I am not sure the distinction is important but it might be a little more clear your way. I'll change it. > So this part changes behavior for non-Allow cases to "click to play" in practice, right? > > And I guess this is ok for now because if "plugins.flashBlock.enabled" is false everything ends up with the Allow policy? So we're not actually changing any behavior until we flip that flag? Correct.
Flags: needinfo?(bzbarsky)
> right click->Show only this Frame (or something similar) does not move a sub-document to the top level, right? Right. It ends up creating a new document. > Particularly case (c): why can't we classify sandboxed iframes? Because their principal is a nullprincipal which has no useful URI. Case (d) ends up in the same boat. data: loaded directly from the url bar also has a nullprincipal. So the principal URI is a nullprincipal uri, not data:, fwiw. > I think don't expect any individual pref to contain more than two tables Oh, so the prefs just have table names, not lists of hostnames? That explains the use of the urlclassifier then. Also, no need to worry about OOM on these appends given that. > What is the correct way to point to a document in the tree? Path from mozilla-central directory? Yep. > I tend to think we should keep the name consistent with what we have been > calling it in the documentation/specifiations. The problem is that we already have uses of "third party" in our tree, with the actual third party meaning. And it will _really_ confuse people looking at this code, because they will expect some sort of checks for third-partiness and not find them. Consistency with the not-even-visible-to-people doc is a much lower priority for me here than consistency with the rest of our codebase... > I am confused about how the parent gets navigated while the child is still around Well, say I load foo.com, it loads a subframe. Then I navigate toplevel to bar.com and foo.com goes into bfcache. Then I have script poke at foo.com and change the "data" attribute of an <object>. In theory we should prevent that load anyway, for various reasons. In practice, it's best to actually check against our parent document, not some other document. ;) > How do I keep from travelling outside of content documents when ascending the tree? You're doing that part correctly already. GetSameTypeParent() on the docshell will do the right thing here. 
_If_ that returns non-null then GetParentDocument() will return the thing you want for your parent-document check. > Didn't you say in the last comment that the parent can be navigated away? The parent _docshell_ can navigate. At which point it gets a new nsIDocument object and so forth. The parent _document_ (as reported by GetParentDocument()) is immutable for our purposes. In my example above, the parent document of the iframe remains foo.com, while the document of the parent docshell becomes bar.com.
Flags: needinfo?(bzbarsky)
> Then I have script poke at foo.com and change the "data" attribute of an <object>. I meant "poke at foo.com's subframe".
If our terminology is going to cause confusion elsewhere in the tree, we should change our terminology. As for the null principals, I'm not sure. I see no reason not to deny b), c), and d). I'm torn on a), as I can see why this should be considered permitted by default, but I worry about the un-savvy, exploited user (which is one of the reasons Flash is a vector anyway). NI bsmedberg for his opinion on this too:
> There's one other problem that remains if we do that: how do we want to handle URIs (coming from a principal) without a host? In practice this could be any of:
> a) file:// URI
> b) data: (or other host-less scheme) loaded directly from URL bar.
> c) iframe sandboxed without allow-same-origin
> d) A load that had resetPrincipalsToNullPrincipal() called on its loadinfo.
Flags: needinfo?(benjamin)
> I see no reason not to deny b), c), and d). The question is what should happen for case (c) for a sandboxed thing opened from a site that is denied. It's sandboxed, so its principal is not associated with that site, but imo it should still be denied.
Isn't Flash blocked (by spec) on sandboxed iframes?
Good question about file:. I think we should, for security reasons, disallow plugins from file: entirely (HTTP/HTTPS-only). I'm happy for that to be a separate patch/bug that just rides the trains. For b) and c): if a site triggers these, we should disallow plugins in them. This is already the case for c) because sandboxed frames never allow plugins. I'm tempted to say that data: URIs should never load plugins, but I'm not sure what that would break so let's not do that here. I don't know much about d) - what are the cases where this happens?
Flags: needinfo?(benjamin)
I feel like I need to say that I really do not like the idea of disallowing plugins from file:. I think a lot of developers use Firefox (including me) and would be irritated at their new inability to test their content locally. What is the security concern that blocking local content would address?
> Isn't Flash blocked (by spec) on sandboxed iframes? Ah, there's no sandbox flag for allow-plugins. OK, so no need to worry about sandboxed stuff, that's good. > I don't know much about d) - what are the cases where this happens Whenever something in our UI has content it doesn't want to be running with the privileges of the site it was loaded from, basically. Looks like it was added for the json viewer, in particular.
Ah, in that case it sounds like d) should be treated as sandboxed without any plugin support. Kirk, let's move the file: thing to a different bug. I'm pretty sure I want to do that, and developers can serve their content via localhost. The line of security vulnerabilities there is long and complicated.
So, to be clear, for this bug we are going to deny plugins for cases (b), (c), and (d), but not (a)?
Flags: needinfo?(benjamin)
b) and c) are based on the domain that triggers the load.
d) is always-deny-plugins no matter who triggers the load (if that's possible).
Flags: needinfo?(benjamin)
I'm not quite sure I follow comment 59. Seems to me like (c) always blocks plugins anyway and we want (d) to do the same (certainly at least for Flash). (b) is only about data: entered through the URL bar, not loaded from sites, for the moment, because data: loaded from sites will inherit those sites' principal. So the simplest thing to do may be to deny flash if the principal is a nullprincipal, but allow for now if the principal URI has no host and the principal is not nullprincipal. Then we can decide in followups how to handle other cases like that (e.g. file://).
Oh, I misread. Let me see if I can write this all down again (and Kirk this should go into the docs):

* If a data: URI is loaded from a web page, it should inherit the permissions of that page
* If a data: URI is loaded directly from the URL bar and therefore has a null principal, deny plugins
* If a URI is loaded with resetPrincipalsToNullPrincipal() (e.g. json viewer), deny plugins
* If a URI has a non-null principal but no host (file:), no whitelist/blacklist should apply and the global default should apply

Please file followups to deny plugins for ftp:, file:, and perhaps all other non-HTTP/HTTPS protocols.
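The principal-based policy agreed on above can be sketched as a small decision function. PrincipalInfo and ClassifyPrincipal below are hypothetical models, not the real nsIPrincipal API; the actual table lookup for hosted pages is elided:

```cpp
#include <cassert>
#include <optional>
#include <string>

enum class FlashClassification { Unknown, Allowed, Denied };

// Hypothetical stand-in for the relevant bits of nsIPrincipal.
struct PrincipalInfo {
  bool isNullPrincipal;             // bare data: from the URL bar, sandboxed
                                    // frames, resetPrincipalsToNullPrincipal()
  std::optional<std::string> host;  // empty for file: and other host-less URIs
};

FlashClassification ClassifyPrincipal(const PrincipalInfo& aPrincipal) {
  if (aPrincipal.isNullPrincipal) {
    // Null principal: deny plugins outright.
    return FlashClassification::Denied;
  }
  if (!aPrincipal.host) {
    // Non-null principal but no host (e.g. file:): no list applies,
    // so the global default ("unlisted") takes over.
    return FlashClassification::Unknown;
  }
  // Hosted page: this is where the allow/deny table lookup would go;
  // stubbed out here as "not on any list".
  return FlashClassification::Unknown;
}
```

Note that data: loaded *from* a web page never reaches the null-principal branch, because it inherits the loading page's principal and host.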
Comment on attachment 8822484 [details] Bug 1307604 - Add allow and deny lists for Flash Blocking https://reviewboard.mozilla.org/r/101394/#review109776 r=me with the nits below. Thank you, this looks great! ::: dom/base/nsDocument.cpp:13028 (Diff revision 11) > + * > + * For more information, see > + * toolkit/components/url-classifier/flash-block-lists.rst > + */ > +nsresult > +nsDocument::URIFlashClassification(FlashClassification& aClassification, This is more of a PrincipalFlashClassification(), I think. ::: dom/base/nsDocument.cpp:13049 (Diff revision 11) > + return NS_OK; > + } > + > + nsCOMPtr<nsIURI> classificationURI; > + rv = principal->GetURI(getter_AddRefs(classificationURI)); > + NS_ENSURE_SUCCESS(rv, rv); So nsIPrincipal::GetURI will only return failure if NS_EnsureSafeToReturn does. That means either we had some really weird and very broken URI (e.g. extension-implemented) or we OOMed. In either case, I think it's fine to classify as Denied. That's what you end up doing in the caller in practice anyway (treating errors as Deny). The benefit of doing it inside this method would be that we could just have the method return a FlashClassification, and not have the kinda-weird outparam. Oh, and GetURI _can_ output a null nsIURI (while returning NS_OK), in the case of system or expanded principals. We shouldn't have documents whose principals are expanded principals, but system can happen. Passing null to urlclassifier will assert and then probably just fail out, based on code inspection. Seems to me like we should probably just deny those cases when classificationURI is null, though maybe it's worth checking what our policy should be for system-principal stuff that's using Flash. (I hope it's "What? Heck no!", personally.) In either case, we should not pass a null URI to the urlclassifier. 
::: dom/base/nsDocument.cpp:13114 (Diff revision 11) > + > + return NS_OK; > +} > + > +nsresult > +nsDocument::GetFlashClassification(FlashClassification& aClassification) Again, I think this can just return FlashClassification. And maybe call this method ComputeFlashClassification, for clarity? ::: dom/base/nsDocument.cpp:13116 (Diff revision 11) > +} > + > +nsresult > +nsDocument::GetFlashClassification(FlashClassification& aClassification) > +{ > + if (mFlashClassification != FlashClassification::Unclassified) { This check should probably be up in nsDocument::DocumentFlashClassification(). ::: dom/base/nsDocument.cpp:13121 (Diff revision 11) > + if (mFlashClassification != FlashClassification::Unclassified) { > + aClassification = mFlashClassification; > + return NS_OK; > + } > + > + nsCOMPtr<nsIDocShellTreeItem> current = this->GetDocShell(); If current is null, should deny. Just in case someone manages to reach here in that state. ::: dom/base/nsDocument.cpp:13138 (Diff revision 11) > + aClassification = FlashClassification::Denied; > + } else { > + rv = URIFlashClassification(aClassification, false); > + NS_ENSURE_SUCCESS(rv, rv); > + > + // Allow a unknown children to inherit allowed status from parent, but s/a unknown/unknown/ ::: dom/base/nsDocument.cpp:13151 (Diff revision 11) > + // This is the top level document > + rv = URIFlashClassification(aClassification, true); > + NS_ENSURE_SUCCESS(rv, rv); > + } > + > + mFlashClassification = aClassification; This part should also be up in DocumentFlashClassification. So DocumentFlashClassification would look something like:

if (mFlashClassification == FlashClassification::Unclassified) {
  mFlashClassification = ComputeFlashClassification();
  MOZ_ASSERT(mFlashClassification != FlashClassification::Unclassified);
}
return mFlashClassification;

or so. ::: dom/base/nsDocument.cpp:13161 (Diff revision 11) > +} > + > +/** > + * Retrieves the classification of plugins in this document. 
This is dependent > + * on the classification of this document and all parent documents. > + * This function is infalliable - It must return some classification that "infallible" ::: dom/base/nsIDocument.h:2896 (Diff revision 11) > void NoteScriptTrackingStatus(const nsACString& aURL, bool isTracking); > bool IsScriptTracking(const nsACString& aURL) const; > > bool PrerenderHref(nsIURI* aHref); > > + enum class FlashClassification { Might be good to document the values a bit.
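The parent-chain rule the review settles on — a "Deny" anywhere in the chain wins, a frame's own "Allow" entry can override a parent's "Unknown" but not a parent's "Deny" — can be written as a small combining function. The names below are illustrative, not the actual nsDocument code:

```cpp
#include <cassert>

enum class FlashClassification { Unknown, Allowed, Denied };

// Combine a (sub)document's own list classification with the already
// computed classification of its parent document.
FlashClassification CombineWithParent(FlashClassification aParent,
                                      FlashClassification aOwn) {
  if (aParent == FlashClassification::Denied ||
      aOwn == FlashClassification::Denied) {
    // A denial anywhere in the chain is final.
    return FlashClassification::Denied;
  }
  if (aOwn == FlashClassification::Allowed) {
    // A trusted (allowed) frame inside an unlisted site stays allowed.
    return FlashClassification::Allowed;
  }
  // An unlisted frame inherits whatever the parent decided.
  return aParent;
}
```

Because each document's classification is invariant for its lifetime, this combination only needs to run once per document and the result can be cached.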
Attachment #8822484 - Flags: review?(bzbarsky) → review+
@bz Do my changes look ok?
Flags: needinfo?(bzbarsky)
Looks great, thank you!
Flags: needinfo?(bzbarsky)
Pushed by felipc@gmail.com: https://hg.mozilla.org/integration/autoland/rev/ab5b2bbc74b1 Add allow and deny lists for Flash Blocking r=bsmedberg,bz,francois
Keywords: checkin-needed
Status: ASSIGNED → RESOLVED
Closed: 8 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla54
Comment on attachment 8822484 [details] Bug 1307604 - Add allow and deny lists for Flash Blocking Approval Request Comment [Feature/Bug causing the regression]: This feature is needed for the Shield study to be run on Release 52 (bug 1335232), which we'll use to study the effect of making flash click-to-play by default. [User impact if declined]: Can't run the study as intended [Is this code covered by automated tests?]: Yes [Has the fix been verified in Nightly?]: Yes [Needs manual test from QE? If yes, steps to reproduce]: Not for the feature independently. We'll do QE on the study as a whole to make sure all pieces work as expected [List of other uplifts needed for the feature/fix]: bug 1282484, which has already been uplifted. Bug 1335379 and bug 1335388 will be needed too. [Is the change risky?]: Somewhat yes, but only if enabled. [Why is the change risky/not risky?]: This feature will not be enabled for release users, except a few that are recruited through Shield and offered to opt-in into the study. [String changes made/needed]: none
Attachment #8822484 - Flags: approval-mozilla-beta?
Attachment #8822484 - Flags: approval-mozilla-aurora?
Bug 1336714 should also be uplifted with this one as it is a fix for a crash caused by this bug.
(In reply to Kirk Steuber [:bytesized] from comment #74) > Bug 1336714 should also be uplifted with this one as it is a fix for a > crash caused by this bug. And also bug 1333483 so we can manually test this and ensure that it works in other channels. Since the main patch makes changes to safe browsing, we should get SoftVision to run their Safe Browsing smoketests on this to make sure we don't break anything.
Comment on attachment 8822484 [details] Bug 1307604 - Add allow and deny lists for Flash Blocking For shield study purpose. Take it in aurora first. Aurora53+.
Attachment #8822484 - Flags: approval-mozilla-aurora? → approval-mozilla-aurora+
seems needs rebasing for aurora as well: grafting 387769:ab5b2bbc74b1 "Bug 1307604 - Add allow and deny lists for Flash Blocking r=bsmedberg,bz,francois" merging dom/base/nsDocument.cpp merging dom/base/nsIDocument.h merging dom/base/nsObjectLoadingContent.cpp merging modules/libpref/init/all.js merging toolkit/components/url-classifier/SafeBrowsing.jsm warning: conflicts while merging dom/base/nsDocument.cpp! (edit, then use 'hg resolve --mark') abort: unresolved conflicts, can't continue (use 'hg resolve' and 'hg graft --continue')
Flags: needinfo?(ksteuber)
Attached patch flashkill-aurora.patch (deleted) — Splinter Review
This patch has been rebased for merge to mozilla-aurora.
Flags: needinfo?(ksteuber)
Attachment #8835616 - Flags: approval-mozilla-beta?
Attachment #8835616 - Flags: approval-mozilla-aurora?
Attachment #8822484 - Flags: approval-mozilla-beta?
Comment on attachment 8835616 [details] [diff] [review] flashkill-aurora.patch You don't need to re-request approval for rebases unless it's significantly changing the patch (i.e. it would change the answer to the approval questions you previously filled out).
Attachment #8835616 - Flags: approval-mozilla-aurora?
Comment on attachment 8835616 [details] [diff] [review] flashkill-aurora.patch flash plugin {white,black}lists, for shield study in 52
Attachment #8835616 - Flags: approval-mozilla-beta? → approval-mozilla-beta+
Hrm. This bug depends on bugs 1318768, 1319571 and 1321377 which aren't in 52. Is this actually safe to uplift to beta without those other changes?
Attachment #8835616 - Flags: approval-mozilla-beta+ → approval-mozilla-beta?
Flags: needinfo?(ksteuber)
@jcristau You are correct. I will seek uplift of those patches.
Flags: needinfo?(ksteuber)
Attached patch flashkill-beta.patch (deleted) — Splinter Review
Rebased for beta
Attachment #8837373 - Flags: approval-mozilla-beta?
Attachment #8835616 - Flags: approval-mozilla-beta?
Comment on attachment 8837373 [details] [diff] [review] flashkill-beta.patch this was deemed too risky for beta
Attachment #8837373 - Flags: approval-mozilla-beta? → approval-mozilla-beta-
Depends on: 1344908
Product: Core → Core Graveyard