Stop allowing markup injection via DTD in system-privileged contexts
Categories: Core :: XML, task, P1
People: Reporter: Gijs; Assigned: peterv
Keywords: csectype-priv-escalation, sec-want
Whiteboard: [prevents sec-high sandbox escape] [adv-main68-]
Attachments (4 files, 1 obsolete file):
- (deleted), text/x-phabricator-request | jcristau: approval-mozilla-beta+
- (deleted), text/x-phabricator-request
- (deleted), text/plain
- (deleted), text/plain
I believe that once bug 1523741 and bug 1539758 are fixed, we can forbid inline markup at the expat parser level and avoid injection for system-privileged documents that way.
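To illustrate the vector the bug title describes, here is a minimal, hypothetical sketch using Python's bindings to expat (the same underlying parser): an entity declared in a DTD can smuggle an element into the parsed document even though no such tag appears literally in the document body. The names (DOC, parsed_elements) are illustrative, not from the actual patch.

```python
import xml.parsers.expat

# The <evil> element never appears in the body; it rides in on the entity value.
DOC = ('<?xml version="1.0"?>'
       '<!DOCTYPE root [<!ENTITY e "<evil>injected</evil>">]>'
       '<root>&e;</root>')

def parsed_elements(doc):
    """Return the element names expat reports, in document order."""
    seen = []
    parser = xml.parsers.expat.ParserCreate()
    parser.StartElementHandler = lambda name, attrs: seen.append(name)
    parser.Parse(doc, True)
    return seen

print(parsed_elements(DOC))  # the entity expansion contributes 'evil'
```

In a system-privileged chrome document, an element injected this way runs with full privileges, which is why the fix forbids it at the parser level.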
Comment 1•6 years ago (Reporter)
Comment 2•6 years ago (Reporter)
Comment 3•6 years ago
Backed out for build bustages in nsExpatDriver.cpp:
https://hg.mozilla.org/integration/autoland/rev/5a65e18dc7d5ae6f38352112d9323205ad36f790
Failure log: https://treeherder.mozilla.org/logviewer.html#?job_id=244109380&repo=autoland
[task 2019-05-02T08:37:21.590Z] 08:37:21 INFO - In file included from /builds/worker/workspace/build/src/obj-firefox/parser/htmlparser/Unified_cpp_parser_htmlparser0.cpp:29:
[task 2019-05-02T08:37:21.591Z] 08:37:21 ERROR - /builds/worker/workspace/build/src/parser/htmlparser/nsExpatDriver.cpp:493:7: error: no viable conversion from 'nsString' (aka 'nsTString<char16_t>') to 'bool'
[task 2019-05-02T08:37:21.592Z] 08:37:21 INFO - mInternalSubset || mInExternalDTD,
Please check all reported errors.
Comment 4•6 years ago (Reporter)
Ugh, mInternalSubset vs. mInInternalSubset.
Comment 5•6 years ago
Backed out for various test failures:
https://hg.mozilla.org/integration/autoland/rev/751ffd71976190312033e68a27affb501136bafb
Push which ran all jobs: https://treeherder.mozilla.org/#/jobs?repo=autoland&resultStatus=testfailed%2Cbusted%2Cexception%2Cretry%2Cusercancel%2Crunnable&group_state=expanded&revision=bfebd4300498feec18eab878c5fecf005d468d13
Failures:
https://treeherder.mozilla.org/logviewer.html#/jobs?job_id=244126180&repo=autoland&lineNumber=3354
https://treeherder.mozilla.org/logviewer.html#?job_id=244124004&repo=autoland
Please check all failed jobs to see if there are more issues.
Comment 6•6 years ago (Reporter)
Comment 7•6 years ago (Reporter)
Comment 8•6 years ago
Comment 9•6 years ago
Could this be causing bug 1548990?
Comment 10•6 years ago
(In reply to Panos Astithas (he/him) [:past] (please ni?) from comment #9)
> Could this be causing bug 1548990?
Yes, it looks like there are entities with embedded markup in browser.dtd in at least the ja langpack. Possibly only comments:
browser/chrome/ja/locale/browser/browser.dtd:<!ENTITY tabCmd.label "新しいタブ"><!-- 表示カ所によりアクセスキーが異なる -->
(The entity value is "New Tab"; the trailing comment says the access key differs depending on where the string is displayed.)
Comment 11•6 years ago
> browser/chrome/ja/locale/browser/browser.dtd:<!ENTITY tabCmd.label "新しいタブ"><!-- 表示カ所によりアクセスキーが異なる -->
That's a comment outside of the entity; is the parser throwing for any content in the DTD file?
All the examples mentioned in bug 1548990 have < in the source strings (en-US), and some of them are not in use in Nightly (the Panic button was migrated to Fluent in bug 1495861). Why is en-US not broken?
Comment 12•6 years ago
(In reply to Francesco Lodolo [:flod] from comment #11)
> > browser/chrome/ja/locale/browser/browser.dtd:<!ENTITY tabCmd.label "新しいタブ"><!-- 表示カ所によりアクセスキーが異なる -->
> That's a comment outside of the entity, it the parser throwing for any content in the DTD file?
Yeah, it's late and I misread that line. But there are others with markup in the actual DTD value.
> All the examples mentioned in bug 1548990 have < in the source strings (en-US), and some of them are not in use in Nightly (Panic button was migrated to Fluent in bug 1495861). Why is en-US not broken?
They don't. Or, at least the ones in mozilla-central don't. The entities don't need to be used. Just their presence is enough to cause a parse failure.
Comment 13•6 years ago
OK, so our cross-channel localization system wasn't taken into account here.
The strings we expose for localization cover everything from central to release, so https://hg.mozilla.org/l10n/gecko-strings/file/dc3afb0408aa1d15e87a36dab1f93804024ea62a/browser/chrome/browser/browser.dtd#l1090 still has markup, for example, and so do localizations.
We'll need to back this out, and wait with re-landing it until:
- all markup is gone from DTD files loaded by system principal in gecko-strings
- all localizations have been updated to also remove those strings
The latter is a manual step, as our usual process doesn't remove obsolete strings until someone modifies the file. We know how to do that, but it's an extra step.
To clarify, the list of files loaded only matters for the branches where this code lands again. It'd be good to have the actual list before trying to land this again.
Comment 14•6 years ago
I wouldn't be surprised if this broke Thunderbird in general, fwiw. Maybe CC Jörg?
Comment 15•6 years ago (Reporter)
This was backed out in https://hg.mozilla.org/mozilla-central/rev/24a6a4f933a8289666dbda9b9c5e39c2de89fa4f on past's request.
OK, so, apologies for breaking all the non-en-US builds...
(In reply to Axel Hecht [:Pike] from comment #13)
> OK, so our cross-channel localization system wasn't taken in account here.
> The strings we expose for localization cover all from central to release, and thus https://hg.mozilla.org/l10n/gecko-strings/file/dc3afb0408aa1d15e87a36dab1f93804024ea62a/browser/chrome/browser/browser.dtd#l1090 still has markup, for example. And thus, also localizations.
> We'll need to back this out, and wait with re-landing it until:
> - all markup is gone from DTD files loaded by system principal in gecko-strings
We produce per-version langpacks, right? Are you saying the 67 and 68 ones are generated with the same data? That should be fixed...
This patch needs to make it into 68 release (see whiteboard), and ideally that should happen before all the l10n work for 68 is finished (which would be late in 68 beta, with no testing runway). All the removals of these strings are automated, AFAIK, with fluent migrations, so there's no actual localizer work required. So presumably we can land this early in the 69 cycle and uplift to 68 beta immediately once the DTD copies have been removed from gecko-strings.
The other issue is that without this patch in the tree, nothing prevents people from reintroducing DTD strings with these issues.
There's already an 'if' condition that governs whether we break parsing in this way. Perhaps we can detect the langpack case there so we can at least get infra coverage, and then uplift a patch to enable the checks for langpacks when we're confident no more langpacks have issues. I'll look into this in more detail early next week.
Comment 16•6 years ago
(In reply to :Gijs (he/him) from comment #15)
> All the removals of these strings are automated, AFAIK, with fluent migrations, so there's no actual localizer work required.
They're not. If you migrate feature X in 68, strings are added to Fluent. Those strings are still needed for 67 and 66, and remain available in cross-channel.
When 69 is in Nightly, we still need those strings in 67 (release). BTW, cross-channel is the only reason we're able to support projects like TrailHead, which require manual uplifts to 67.0.5.
Unless I'm mixing things up, the soonest this can land is 70. With the open question of having to support a Fennec 68 ESR.
Why does it need to land in 68?
Comment 17•6 years ago (Reporter)
(In reply to Francesco Lodolo [:flod] from comment #16)
> (In reply to :Gijs (he/him) from comment #15)
> > All the removals of these strings are automated, AFAIK, with fluent migrations, so there's no actual localizer work required.
> They're not. If you migrate feature X in 68, strings are added to Fluent.
Sure, my point is that it's not like we need localizers to come up with new translations to deal with these changes. The "only" issue is bit-shunting, i.e. getting the right things into the right xpis/builds/whatever, and "making things work".
> Those strings are still needed for 67 and 66, and remain available in cross-channel.
> When 69 is in Nightly, we still need those strings in 67 (release). BTW, cross-channel is the only reason we're able to support projects like TrailHead, which require manual uplifts to 67.0.5
I don't really understand. If we have l10n-central and l10n-beta we can just uplift things, right? Anyway, the details here are probably off-topic.
> Unless I'm mixing things up, the sooner this can land is 70. With the open question of having to support a Fennec 68 ESR.
Yeah, that's not gonna work.
> Why does it need to land in 68?
Copying from dveditz' email:
> This fix is a primary break in the chain of the sandbox escape used in the Pwn2Own bug this year (https://bugzilla.mozilla.org/show_bug.cgi?id=1538007).
So, we're gonna need to do better than 70.
Isn't it true that we build separate builds and xpis on a per-version basis? So, say I come up with a list of problematic string identifiers; we could script our way to victory by ensuring the problematic strings don't make it into the copies of the DTDs we put in 68-compatible xpis/builds, given that they're not needed on 68. Or is that also not possible (and if not, why)?
Comment 18•6 years ago
(In reply to :Gijs (he/him) from comment #17)
> I don't really understand. If we have l10n-central and l10n-beta we can just uplift things, right? Anyway, the details here are probably off-topic.
We DO NOT have different repositories for different versions. Every shipping build is built out of a single l10n repository per locale:
https://hg.mozilla.org/l10n-central/
> > Unless I'm mixing things up, the sooner this can land is 70. With the open question of having to support a Fennec 68 ESR.
> Yeah, that's not gonna work.
> > Why does it need to land in 68?
> Copying from dveditz' email:
> > This fix is a primary break in the chain of the sandbox escape used in the Pwn2Own bug this year (https://bugzilla.mozilla.org/show_bug.cgi?id=1538007).
> So, we're gonna need to do better than 70.
I don't see how it can be fixed on our side. Leaving to Axel to confirm that, and answer your last question.
Side question: how long has Firefox been affected by this bug? My guess is "forever".
Comment 19•6 years ago (Reporter)
Peter, I looked for this originally and didn't find it, but just to be 100% sure I didn't miss it... is there a callback handler in expat we could use to be notified when actually inserting an entity into the doc, instead of when we reference an external DTD + all the entity definitions in said DTD?
Comment 20•6 years ago
Generally speaking, can we not abort parsing? Escape the contents we get, or fall back to just showing the ID or something?
L10n aside, I'm concerned that if a user is affected by this and it's on the startup path, we can't help them resolve their issue in-product. And if it's profile data, they would even need to uninstall and remove their profile.
From the l10n side of things, we want to get to a world where our language assets work independently of the build version and channel. We're not there yet, but I'd rather not make us be further from that vision instead of closer.
Comment 21•6 years ago (Reporter)
(In reply to Axel Hecht [:Pike] from comment #20)
> Generally speaking, can we not abort parsing? Escape the contents we get, or fall back to just showing the ID or something?
Unfortunately, it doesn't seem like there is a straightforward way to do this with the current approach. Expat does not provide a hook for when an entity is used, and does not provide write access to the entities that get declared. The only real escape we have is aborting parsing.
peterv suggested we could try to add a hack to expat that exposes some internal state, and key off that in the handlers that actually create elements. I don't know if that will allow us to do something less extreme than abort parsing.
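For context on what expat does offer: entity *declarations* (though not entity *use*) are observable through a callback. The sketch below, using Python's expat bindings, flags declared entities whose replacement text contains markup at declaration time; this is illustrative only and is not the approach the patch takes.

```python
import xml.parsers.expat

def find_markup_entities(doc):
    """Collect names of general entities whose replacement text contains '<'.
    This is a declaration-time check; expat fires EntityDeclHandler for every
    declared entity, whether or not the entity is ever referenced."""
    suspicious = []
    parser = xml.parsers.expat.ParserCreate()

    def entity_decl(name, is_param, value, base, system_id, public_id, notation):
        # value is None for external entities; internal ones carry their text.
        if value is not None and "<" in value:
            suspicious.append(name)

    parser.EntityDeclHandler = entity_decl
    parser.Parse(doc, True)
    return suspicious

doc = ('<?xml version="1.0"?>'
       '<!DOCTYPE r [<!ENTITY ok "plain text">'
       '<!ENTITY bad "<em>markup</em>">]><r>ok</r>')
print(find_markup_entities(doc))
```

Note that "bad" is reported even though it is never referenced, which matches the earlier observation that an entity's mere presence was enough to trigger the original parse failure.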
Comment 22•6 years ago
(In reply to Axel Hecht [:Pike] from comment #20)
> From the l10n side of things, we want to get to a world where our language assets work independently of the build version and channel. We're not there yet, but I'd rather not make us be further from that vision instead of closer.
Getting there ultimately depends on removing our reliance on DTDs for localization altogether, so I don't think this takes us farther from that vision.
Comment 23•6 years ago
The theory is that we can move to cross-channel language packs for beta and release "basically now", maybe late-beta and release. For nightly, we need to be in shape to not add strings to chrome:// anymore.
Neither requires a full migration to Fluent.
Comment 24•6 years ago (Assignee)
Comment 25•6 years ago (Assignee)
This seems to work to just ignore elements in entities (but not their contents). Do we want to report the ignored elements somewhere?
Comment 26•6 years ago (Reporter)
Passing this to Peter as he's working on this patch now.
(In reply to Peter Van der Beken [:peterv] from comment #25)
> This seems to work to just ignore elements in entities (but not their contents). Do we want to report the ignored elements somewhere?
I don't think we need to, though I guess we could report to the browser console if that's easy to do.
Comment 27•6 years ago
Landed: https://hg.mozilla.org/integration/autoland/rev/16c03995ac55ce131282881c7c5d31ee6061441e
Backed out for causing global buffer overflow in nsExpatDriver.cpp:
https://hg.mozilla.org/integration/autoland/rev/ed8137fa8e4ad6b7a9c8c64a51fdcf2b6be3ce2c
Push which ran failing tests: https://treeherder.mozilla.org/#/jobs?repo=autoland&resultStatus=testfailed%2Cbusted%2Cexception%2Cretry%2Cusercancel%2Crunnable&group_state=expanded&revision=8990b9990220bf229abe7c5f5ec6934802559af2
Failure log: https://treeherder.mozilla.org/logviewer.html#?job_id=246120215&repo=autoland
[task 2019-05-13T13:06:31.334Z] 13:06:31 INFO - TEST-START | dom/base/test/browser_bug593387.js
[task 2019-05-13T13:06:31.586Z] 13:06:31 INFO - GECKO(2499) | =================================================================
[task 2019-05-13T13:06:31.589Z] 13:06:31 ERROR - GECKO(2499) | ==2499==ERROR: AddressSanitizer: global-buffer-overflow on address 0x7f4894e03f42 at pc 0x7f488851b528 bp 0x7ffee77aeb70 sp 0x7ffee77aeb68
[task 2019-05-13T13:06:31.589Z] 13:06:31 INFO - GECKO(2499) | READ of size 2 at 0x7f4894e03f42 thread T0
[task 2019-05-13T13:06:32.257Z] 13:06:32 INFO - GECKO(2499) | #0 0x7f488851b527 in AppendErrorPointer /builds/worker/workspace/build/src/parser/htmlparser/nsExpatDriver.cpp:291:9
[task 2019-05-13T13:06:32.259Z] 13:06:32 INFO - GECKO(2499) | #1 0x7f488851b527 in nsExpatDriver::HandleStartElementForSystemPrincipal(void*, char16_t const*, char16_t const**) /builds/worker/workspace/build/src/parser/htmlparser/nsExpatDriver.cpp:322
[task 2019-05-13T13:06:32.263Z] 13:06:32 INFO - GECKO(2499) | #2 0x7f489061d7b2 in doContent /builds/worker/workspace/build/src/parser/expat/lib/xmlparse.c:2872:11
[task 2019-05-13T13:06:32.264Z] 13:06:32 INFO - GECKO(2499) | #3 0x7f48906199f2 in processInternalEntity /builds/worker/workspace/build/src/parser/expat/lib/xmlparse.c:5409:14
[task 2019-05-13T13:06:32.268Z] 13:06:32 INFO - GECKO(2499) | #4 0x7f489061fd17 in doContent /builds/worker/workspace/build/src/parser/expat/lib/xmlparse.c:2787:20
[task 2019-05-13T13:06:32.271Z] 13:06:32 INFO - GECKO(2499) | #5 0x7f4890611b9f in contentProcessor /builds/worker/workspace/build/src/parser/expat/lib/xmlparse.c:2528:27
[task 2019-05-13T13:06:32.272Z] 13:06:32 INFO - GECKO(2499) | #6 0x7f4890611b9f in doProlog /builds/worker/workspace/build/src/parser/expat/lib/xmlparse.c:4585
[task 2019-05-13T13:06:32.273Z] 13:06:32 INFO - GECKO(2499) | #7 0x7f4890608863 in prologProcessor /builds/worker/workspace/build/src/parser/expat/lib/xmlparse.c:4311:10
[task 2019-05-13T13:06:32.277Z] 13:06:32 INFO - GECKO(2499) | #8 0x7f4890608863 in prologInitProcessor /builds/worker/workspace/build/src/parser/expat/lib/xmlparse.c:4120
[task 2019-05-13T13:06:32.278Z] 13:06:32 INFO - GECKO(2499) | #9 0x7f48906067a4 in MOZ_XML_Parse /builds/worker/workspace/build/src/parser/expat/lib/xmlparse.c:1894:17
[task 2019-05-13T13:06:32.280Z] 13:06:32 INFO - GECKO(2499) | #10 0x7f488852198a in nsExpatDriver::ParseBuffer(char16_t const*, unsigned int, bool, unsigned int*) /builds/worker/workspace/build/src/parser/htmlparser/nsExpatDriver.cpp:819:16
[task 2019-05-13T13:06:32.281Z] 13:06:32 INFO - GECKO(2499) | #11 0x7f48885222b1 in nsExpatDriver::ConsumeToken(nsScanner&, bool&) /builds/worker/workspace/build/src/parser/htmlparser/nsExpatDriver.cpp:914:5
[task 2019-05-13T13:06:32.281Z] 13:06:32 INFO - GECKO(2499) | #12 0x7f488852e7b8 in nsParser::Tokenize(bool) /builds/worker/workspace/build/src/parser/htmlparser/nsParser.cpp:1414:25
[task 2019-05-13T13:06:32.282Z] 13:06:32 INFO - GECKO(2499) | #13 0x7f488852a6d7 in nsParser::ResumeParse(bool, bool, bool) /builds/worker/workspace/build/src/parser/htmlparser/nsParser.cpp:965:45
[task 2019-05-13T13:06:32.283Z] 13:06:32 INFO - GECKO(2499) | #14 0x7f488852f9e3 in nsParser::OnDataAvailable(nsIRequest*, nsIInputStream*, unsigned long, unsigned int) /builds/worker/workspace/build/src/parser/htmlparser/nsParser.cpp:1321:12
[task 2019-05-13T13:06:32.284Z] 13:06:32 INFO - GECKO(2499) | #15 0x7f4887edb395 in nsJARChannel::OnDataAvailable(nsIRequest*, nsIInputStream*, unsigned long, unsigned int) /builds/worker/workspace/build/src/modules/libjar/nsJARChannel.cpp:1055:19
[task 2019-05-13T13:06:32.300Z] 13:06:32 INFO - GECKO(2499) | #16 0x7f48864e218d in nsInputStreamPump::OnStateTransfer() /builds/worker/workspace/build/src/netwerk/base/nsInputStreamPump.cpp:554:23
[task 2019-05-13T13:06:32.302Z] 13:06:32 INFO - GECKO(2499) | #17 0x7f48864e0e37 in nsInputStreamPump::OnInputStreamReady(nsIAsyncInputStream*) /builds/worker/workspace/build/src/netwerk/base/nsInputStreamPump.cpp:399:21
[task 2019-05-13T13:06:32.320Z] 13:06:32 INFO - GECKO(2499) | #18 0x7f4886234de6 in nsInputStreamReadyEvent::Run() /builds/worker/workspace/build/src/xpcom/io/nsStreamUtils.cpp:91:20
[task 2019-05-13T13:06:32.322Z] 13:06:32 INFO - GECKO(2499) | #19 0x7f48862c4faf in nsThread::ProcessNextEvent(bool, bool*) /builds/worker/workspace/build/src/xpcom/threads/nsThread.cpp:1175:14
[task 2019-05-13T13:06:32.323Z] 13:06:32 INFO - GECKO(2499) | #20 0x7f48862cb108 in NS_ProcessNextEvent(nsIThread*, bool) /builds/worker/workspace/build/src/xpcom/threads/nsThreadUtils.cpp:486:10
[task 2019-05-13T13:06:32.340Z] 13:06:32 INFO - GECKO(2499) | #21 0x7f488732ab7a in mozilla::ipc::MessagePump::Run(base::MessagePump::Delegate*) /builds/worker/workspace/build/src/ipc/glue/MessagePump.cpp:88:21
[task 2019-05-13T13:06:32.341Z] 13:06:32 INFO - GECKO(2499) | #22 0x7f48872581d2 in RunInternal /builds/worker/workspace/build/src/ipc/chromium/src/base/message_loop.cc:315:10
[task 2019-05-13T13:06:32.342Z] 13:06:32 INFO - GECKO(2499) | #23 0x7f48872581d2 in RunHandler /builds/worker/workspace/build/src/ipc/chromium/src/base/message_loop.cc:308
[task 2019-05-13T13:06:32.344Z] 13:06:32 INFO - GECKO(2499) | #24 0x7f48872581d2 in MessageLoop::Run() /builds/worker/workspace/build/src/ipc/chromium/src/base/message_loop.cc:290
[task 2019-05-13T13:06:32.348Z] 13:06:32 INFO - GECKO(2499) | #25 0x7f488e7ffde9 in nsBaseAppShell::Run() /builds/worker/workspace/build/src/widget/nsBaseAppShell.cpp:137:27
[task 2019-05-13T13:06:32.350Z] 13:06:32 INFO - GECKO(2499) | #26 0x7f48921d1540 in nsAppStartup::Run() /builds/worker/workspace/build/src/toolkit/components/startup/nsAppStartup.cpp:276:30
[task 2019-05-13T13:06:32.358Z] 13:06:32 INFO - GECKO(2499) | #27 0x7f489246f77e in XREMain::XRE_mainRun() /builds/worker/workspace/build/src/toolkit/xre/nsAppRunner.cpp:4555:22
[task 2019-05-13T13:06:32.358Z] 13:06:32 INFO - GECKO(2499) | #28 0x7f4892471810 in XREMain::XRE_main(int, char**, mozilla::BootstrapConfig const&) /builds/worker/workspace/build/src/toolkit/xre/nsAppRunner.cpp:4693:8
[task 2019-05-13T13:06:32.359Z] 13:06:32 INFO - GECKO(2499) | #29 0x7f489247316e in XRE_main(int, char**, mozilla::BootstrapConfig const&) /builds/worker/workspace/build/src/toolkit/xre/nsAppRunner.cpp:4774:21
[task 2019-05-13T13:06:32.362Z] 13:06:32 INFO - GECKO(2499) | #30 0x55f7accc304d in do_main /builds/worker/workspace/build/src/browser/app/nsBrowserApp.cpp:212:22
[task 2019-05-13T13:06:32.362Z] 13:06:32 INFO - GECKO(2499) | #31 0x55f7accc304d in main /builds/worker/workspace/build/src/browser/app/nsBrowserApp.cpp:291
[task 2019-05-13T13:06:32.420Z] 13:06:32 INFO - GECKO(2499) | #32 0x7f48a698682f in __libc_start_main /build/glibc-LK5gWL/glibc-2.23/csu/../csu/libc-start.c:291
[task 2019-05-13T13:06:32.421Z] 13:06:32 INFO - GECKO(2499) | #33 0x55f7acbe4af8 in _start (/builds/worker/workspace/build/application/firefox/firefox+0x2aaf8)
[task 2019-05-13T13:06:32.422Z] 13:06:32 INFO - GECKO(2499) | 0x7f4894e03f42 is located 62 bytes to the left of global variable '<string literal>' defined in '/builds/worker/workspace/build/src/xpcom/string/nsSubstring.cpp:289:3' (0x7f4894e03f80) of size 59
[task 2019-05-13T13:06:32.426Z] 13:06:32 INFO - GECKO(2499) | '<string literal>' is ascii string 'data[aLen] == char16_t(0) (data should be null terminated)'
[task 2019-05-13T13:06:32.427Z] 13:06:32 INFO - GECKO(2499) | 0x7f4894e03f42 is located 0 bytes to the right of global variable 'gNullChar' defined in '/builds/worker/workspace/build/src/xpcom/string/nsSubstring.cpp:56:23' (0x7f4894e03f40) of size 2
[task 2019-05-13T13:06:32.429Z] 13:06:32 INFO - GECKO(2499) | SUMMARY: AddressSanitizer: global-buffer-overflow /builds/worker/workspace/build/src/parser/htmlparser/nsExpatDriver.cpp:291:9 in AppendErrorPointer
[task 2019-05-13T13:06:32.430Z] 13:06:32 INFO - GECKO(2499) | Shadow bytes around the buggy address:
[task 2019-05-13T13:06:32.431Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b8790: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
[task 2019-05-13T13:06:32.432Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b87a0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
[task 2019-05-13T13:06:32.433Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b87b0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
[task 2019-05-13T13:06:32.433Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b87c0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
[task 2019-05-13T13:06:32.434Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b87d0: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
[task 2019-05-13T13:06:32.434Z] 13:06:32 INFO - GECKO(2499) | =>0x0fe9929b87e0: 00 00 f9 f9 f9 f9 f9 f9[02]f9 f9 f9 f9 f9 f9 f9
[task 2019-05-13T13:06:32.434Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b87f0: 00 00 00 00 00 00 00 03 f9 f9 f9 f9 00 00 00 00
[task 2019-05-13T13:06:32.437Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b8800: 00 00 00 00 00 07 f9 f9 f9 f9 f9 f9 00 00 00 00
[task 2019-05-13T13:06:32.438Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b8810: 00 00 07 f9 f9 f9 f9 f9 00 00 00 00 00 00 00 00
[task 2019-05-13T13:06:32.440Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b8820: 00 03 f9 f9 f9 f9 f9 f9 06 f9 f9 f9 f9 f9 f9 f9
[task 2019-05-13T13:06:32.442Z] 13:06:32 INFO - GECKO(2499) | 0x0fe9929b8830: 00 06 f9 f9 f9 f9 f9 f9 02 f9 f9 f9 f9 f9 f9 f9
[task 2019-05-13T13:06:32.444Z] 13:06:32 INFO - GECKO(2499) | Shadow byte legend (one shadow byte represents 8 application bytes):
[task 2019-05-13T13:06:32.446Z] 13:06:32 INFO - GECKO(2499) | Addressable: 00
[task 2019-05-13T13:06:32.447Z] 13:06:32 INFO - GECKO(2499) | Partially addressable: 01 02 03 04 05 06 07
[task 2019-05-13T13:06:32.449Z] 13:06:32 INFO - GECKO(2499) | Heap left redzone: fa
[task 2019-05-13T13:06:32.451Z] 13:06:32 INFO - GECKO(2499) | Freed heap region: fd
[task 2019-05-13T13:06:32.453Z] 13:06:32 INFO - GECKO(2499) | Stack left redzone: f1
[task 2019-05-13T13:06:32.454Z] 13:06:32 INFO - GECKO(2499) | Stack mid redzone: f2
[task 2019-05-13T13:06:32.456Z] 13:06:32 INFO - GECKO(2499) | Stack right redzone: f3
[task 2019-05-13T13:06:32.458Z] 13:06:32 INFO - GECKO(2499) | Stack after return: f5
[task 2019-05-13T13:06:32.461Z] 13:06:32 INFO - GECKO(2499) | Stack use after scope: f8
[task 2019-05-13T13:06:32.463Z] 13:06:32 INFO - GECKO(2499) | Global redzone: f9
[task 2019-05-13T13:06:32.464Z] 13:06:32 INFO - GECKO(2499) | Global init order: f6
[task 2019-05-13T13:06:32.466Z] 13:06:32 INFO - GECKO(2499) | Poisoned by user: f7
[task 2019-05-13T13:06:32.471Z] 13:06:32 INFO - GECKO(2499) | Container overflow: fc
[task 2019-05-13T13:06:32.473Z] 13:06:32 INFO - GECKO(2499) | Array cookie: ac
[task 2019-05-13T13:06:32.474Z] 13:06:32 INFO - GECKO(2499) | Intra object redzone: bb
[task 2019-05-13T13:06:32.476Z] 13:06:32 INFO - GECKO(2499) | ASan internal: fe
[task 2019-05-13T13:06:32.478Z] 13:06:32 INFO - GECKO(2499) | Left alloca redzone: ca
[task 2019-05-13T13:06:32.479Z] 13:06:32 INFO - GECKO(2499) | Right alloca redzone: cb
[task 2019-05-13T13:06:32.481Z] 13:06:32 INFO - GECKO(2499) | Shadow gap: cc
[task 2019-05-13T13:06:32.483Z] 13:06:32 INFO - GECKO(2499) | ==2499==ABORTING
Comment 28•6 years ago
Merge of the backout:
https://hg.mozilla.org/mozilla-central/rev/16c03995ac55
Comment 29•6 years ago (Assignee)
Eric, could you review the latest patch?
Comment 30•6 years ago
(In reply to Peter Van der Beken [:peterv] from comment #29)
> Eric, could you review the latest patch?
r=me
Comment 31•6 years ago
https://hg.mozilla.org/integration/autoland/rev/723d1a2c81e8bfb8dfce985e6e9fffa3716a1c7b
https://hg.mozilla.org/mozilla-central/rev/723d1a2c81e8
Comment 33•6 years ago
[Tracking Requested - why for this release]:
This is a place that breaks the Pwn2Own "sandbox-bypass via langpack" chain, and more generally protects against privilege escalation from rogue language packs. ZDI is itching to tell people about the rest of the Pwn2Own bug so we'd like a fix sooner than later (their standard 120-day disclosure policy lands around mid-July, just after the Fx68 release).
Comment 34•5 years ago (Assignee)
Comment 35•5 years ago (Assignee)
Comment on attachment 9063307 [details]
Bug 1539759 - Improve DTD entity handling. r?erahm!
Beta/Release Uplift Approval Request
- User impact if declined: Vulnerable to privilege escalation from rogue language packs.
- Is this code covered by automated tests?: Yes
- Has the fix been verified in Nightly?: Yes
- Needs manual test from QE?: Yes
- If yes, steps to reproduce: I haven't checked in the automated tests yet, so one could apply the patch that adds them and run the test (parser/htmlparser/tests/mochitest/browser_elementindtd.js).
- List of other uplifts needed: None
- Risk to taking this patch: Low
- Why is the change risky/not risky? (and alternatives if risky): Limited to DTDs loaded from chrome documents. Simply skips adding elements from entities in those DTDs.
- String changes made/needed: None
Comment 36•5 years ago
Comment on attachment 9063307 [details]
Bug 1539759 - Improve DTD entity handling. r?erahm!
sec fix for 68.0b8.
might be worth sanity checking a localized build or two?
Comment 37•5 years ago (Reporter)
(In reply to Julien Cristau [:jcristau] from comment #36)
> Comment on attachment 9063307 [details]
> Bug 1539759 - Improve DTD entity handling. r?erahm!
> sec fix for 68.0b8.
> might be worth sanity checking a localized build or two?
I think a sanity check makes sense. The automated test gives us some assurance that this works as expected, but I suppose in theory it is possible a locale is currently using markup where en-US isn't, in order to achieve an effect that's expected in that locale or something...
Comment 38•5 years ago
uplift
Comment 39•5 years ago (Assignee)
(In reply to :Gijs (he/him) from comment #37)
> I think a sanity check makes sense. The automated test gives us some assurance that this works as expected, but I suppose in theory it is possible a locale is currently using markup where en-US isn't, in order to achieve an effect that's expected in that locale or something...
Yeah, I hadn't thought of that. If we didn't have cross-channel l10n files we could just have checked the l10n source files :-(. So a sanity check seems like a good idea.
Comment 40•5 years ago
I'm not sure there is anything manual QA can do here to reproduce the initial issue on an affected build and verify it on the latest build. Can you please give us some pointers if there is?
Comment 41•5 years ago
It might make sense for someone to do a quick scan of the gecko-strings repo to check whether any entities present in mozilla-central contain markup in any locale.
I'm not sure what there is for QA to do except maybe make sure that any langpacks with unexpected markup in their DTDs don't fail too catastrophically.
Comment 42•5 years ago
Do we have existing manual QA testing for localized builds? If so, that will likely catch things.
Comment 43•5 years ago (Reporter)
(In reply to Frederik Braun [:freddyb] from comment #42)
> Do we have existing manual QA testing for localized builds? If so, that will likely catch things.
I don't think so. Flod, can you clarify what testing we have in place?
Comment 44•5 years ago
(In reply to Frederik Braun [:freddyb] from comment #42)
> Do we have existing manual QA testing for localized builds? If so, that will likely catch things.
Even if we do, I don't think we have the resources to fully QA every locale, and if issues do show up, it's likely that some of them will be in inconspicuous places.
Comment 45•5 years ago
(In reply to :Gijs (he/him) from comment #43)
> (In reply to Frederik Braun [:freddyb] from comment #42)
> > Do we have existing manual QA testing for localized builds? If so, that will likely catch things.
> I don't think so. Flod, can you clarify what testing we have in place?
I'm afraid I don't have that information either, I believe there are only spot checks when testing new features, but I could be wrong.
Can someone help me understand what scenarios we're worried about? In the first iteration of this patch, we were throwing the moment a locale had markup in strings, and showing a YSOD. Now this content is just ignored, and the parsing moves on. Is that accurate?
Looking at the l10n repos is not going to be extremely helpful. There's going to be markup, though not in strings that are used in 68/69.
Comment 46•5 years ago
To evaluate this, we'll need a list of DTDs loaded in principals affected by the change here, on the affected branches.
Comment 47•5 years ago (Reporter)
(In reply to Francesco Lodolo [:flod] from comment #45)
> (In reply to :Gijs (he/him) from comment #43)
> > (In reply to Frederik Braun [:freddyb] from comment #42)
> > > Do we have existing manual QA testing for localized builds? If so, that will likely catch things.
> > I don't think so. Flod, can you clarify what testing we have in place?
> I'm afraid I don't have that information either, I believe there are only spot checks when testing new features, but I could be wrong.
> Can someone help me understand what scenarios we're worried about? In the first iteration of this patch, we were throwing the moment a locale had markup in strings, and showing a YSOD. Now this content is just ignored, and the parsing moves on. Is that accurate?
Yes, though AIUI the text content is still inserted (i.e. if you had a string like Foo <em>Bar</em> Baz, the result would now be Foo Bar Baz). I think my worry was that in some locales the elements would be important (maybe things like <ruby> elements? But those specific ones don't seem to occur in l10n-central; I don't know enough about HTML in non-English locales to know for sure what other things might be affected).
Looking at the l10n repos is not going to be extremely helpful. There's going to be markup, just not in strings that are used in 68/69.
Well, at this point, given people have expressed low confidence that QA can catch this anyway, I'm wondering if we should script this. The list of entities used in 68 is knowable, and with a bit of work, filterable on whether they're used in anything that has system principal (e.g. we can exclude netError.dtd) and we can then check the filtered set of strings on l10n-central for elements (there's python-based DTD parsers in the tree, right?) statically.
Comment 48•5 years ago
(In reply to :Gijs (he/him) from comment #47)
Well, at this point, given people have expressed low confidence that QA can catch this anyway, I'm wondering if we should script this. The list of entities used in 68 is knowable, and with a bit of work, filterable on whether they're used in anything that has system principal (e.g. we can exclude netError.dtd) and we can then check the filtered set of strings on l10n-central for elements (there's python-based DTD parsers in the tree, right?) statically.
If provided with a list of entities, I should be able to easily script that (compare-locales has a parser).
Reporter
Comment 49•5 years ago
(In reply to Axel Hecht [:Pike] from comment #46)
To evaluate this, we'll need a list of DTDs loaded in principals affected by the change here, on the affected branches.
I think this is insufficient given what happened before, because the localized files contain strings not used on those branches, right?
(In reply to Francesco Lodolo [:flod] from comment #48)
If provided with a list of entities, I should be able to easily script that (compare-locales has a parser).
Does the attached work?
Comment 50•5 years ago
We'll need the DTD file and entity, not the loading X/HTML file.
Which is what I meant by "we need the DTDs": we need the list of actual DTD files loaded in affected contexts. The IDs in those files are icing on the cake, then.
AFAICT, we failed on the first attempt on <!ENTITY>s that were in DTD files in chrome context but whose IDs were not used.
Thus, l10n file level first, key second.
Reporter
Comment 51•5 years ago
(In reply to Axel Hecht [:Pike] from comment #50)
Thus, l10n file level first, key second.
There is no way to get this information without a DTD parser and a parser that understands which entity gets loaded for a given XUL/XHTML reference to an entity, which I don't have.
I think you could take the list of entity IDs I produced and run them against all DTDs (so yes, if there's a window.title entity, check all those entities in all DTDs that have them, which in theory could cause false positives for brokenness and in practice won't because those entities won't contain elements), and check any matches.
The attached is a list of only the DTD files, though I don't much see the point of that either - you could run whatever script we need here that flod is writing on all DTDs, filter for entities that occur in the en-US mozilla-beta or mozilla-central version of these DTDs, and create a DTD file / key list of output (with a list of which locales have elements in that entity) - auditing that would have a smaller chance of accidentally excluding some DTDs that really do get loaded with system principal.
Comment 52•5 years ago
Can we not just check all of the DTDs for any entities that contain markup in a given locale, but not in mozilla-central (or just en-US)?
Comment 53•5 years ago
(In reply to Kris Maglione [:kmag] from comment #52)
Can we not just check all of the DTDs for any entities that contain markup in a given locale, but not in mozilla-central (or just en-US)?
That I already did ages ago, checking if a string contains "<" but en-US doesn't
https://bugzilla.mozilla.org/show_bug.cgi?id=1548990#c15
Reporter
Comment 54•5 years ago
(In reply to Francesco Lodolo [:flod] from comment #53)
(In reply to Kris Maglione [:kmag] from comment #52)
Can we not just check all of the DTDs for any entities that contain markup in a given locale, but not in mozilla-central (or just en-US)?
That I already did ages ago, checking if a string contains "<" but en-US doesn't
https://bugzilla.mozilla.org/show_bug.cgi?id=1548990#c15
Can we re-run that check but ignore entities not present in en-US on beta (which includes the example in that comment)?
Comment 55•5 years ago
I can easily run checks against gecko-strings (cross-channel repository) and l10n repositories, but not against mozilla-beta (that would require writing a different parser that goes through the entire repo and ignores most of the content).
What my check does is:
- Parse the en-US repository (gecko-strings). That includes all strings shipping in Nightly, Beta, Release.
- Parse the l10n repositories.
- If a string is not available in en-US, it's ignored (obsolete).
- If it contains <, it checks if en-US contains that character as well. If it doesn't, that's an issue and it's reported. If en-US contains markup, then it's assumed that having markup is acceptable.
At this point, localized strings don't contain markup unless markup is also available in the en-US source. There might be obsolete strings in the files containing markup, but they're unused, so they're not relevant (the parser will strip away markup, but they're unused anyway). The same applies to unused strings (e.g. strings for 67).
In case someone wants to do a sanity check:
https://gist.github.com/flodolo/5051d8063c00b4d5d11ae373dbe7a8d1
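As a rough illustration of the check described above (this is not flod's actual script, which uses the compare-locales parser; the regex-based DTD parsing here is a simplified stand-in):

```python
import re

# Hypothetical minimal stand-in for the compare-locales DTD parser:
# extract entity id -> value pairs from a DTD file's contents.
ENTITY_RE = re.compile(r'<!ENTITY\s+([\w.-]+)\s+(["\'])(.*?)\2', re.S)

def parse_dtd(text):
    return {m.group(1): m.group(3) for m in ENTITY_RE.finditer(text)}

def suspicious_entities(en_us_text, localized_text):
    """Report entities whose localized value contains '<' while the
    en-US value does not (mirroring the check described above)."""
    en_us = parse_dtd(en_us_text)
    localized = parse_dtd(localized_text)
    issues = []
    for key, value in localized.items():
        if key not in en_us:
            continue  # obsolete string, ignored
        if "<" in value and "<" not in en_us[key]:
            issues.append(key)
    return issues

# Hypothetical example entity, for illustration only.
en_us = '<!ENTITY window.title "Add-ons Manager">'
de = '<!ENTITY window.title "Add-ons-<em>Verwaltung</em>">'
print(suspicious_entities(en_us, de))  # ['window.title']
```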
Comment 56•5 years ago
As I understand from comment 41, there's nothing much that manual QA can do here. I'll remove the qe+ flag in this case.
Reporter
Comment 57•5 years ago
I think we're done here, based on the last few comments. Thanks, flod!
Comment 58•5 years ago
Comment on attachment 9069932 [details]
Bug 1539759 - Improve DTD entity handling. Testcase. r?Gijs!
Dropping the uplift request for this patch since it hasn't landed on central (and in any case is test-only).
Updated•5 years ago
Comment 59•5 years ago
Julien: why did you "wontfix" this for ESR-60? Since we can't backport the actual localization changes (e.g. re-writing about:telemetry using fluent) this is about the only working break in the sandbox escape we might have for ESR.
Gijs: is this at all feasible for ESR-60?
Comment 60•5 years ago
My bad, I don't remember why I did that, I guess I shouldn't have. :(
Reporter
Comment 61•5 years ago
(In reply to Daniel Veditz [:dveditz] from comment #59)
Julien: why did you "wontfix" this for ESR-60? Since we can't backport the actual localization changes (e.g. re-writing about:telemetry using fluent) this is about the only working break in the sandbox escape we might have for ESR.
Gijs: is this at all feasible for ESR-60?
Only if we're willing to accept some UI breakage. In esr60 there are a lot more DTD strings that might be problematic, and so it depends how important the markup in those strings is for the functioning of the relevant content. The change from this bug would essentially strip nested elements. Using the same query as Kris used in https://bugzilla.mozilla.org/show_bug.cgi?id=1538007#c21, looking at: https://searchfox.org/mozilla-esr60/search?q=%3C[a-z]&case=true&regexp=true&path=.dtd
- emphasis will be lost from the panic/forget button view (bold text becomes non-bold). We probably don't care that much.
- addonPostInstallMessage.label will lose the images of the items to click, meaning we end up with the string "Manage your add-ons by clicking in the menu.", which sounds surprisingly intelligible but doesn't actually make any sense, and probably won't remain a grammatical sentence in some other languages.
- aboutDevtools.newsletter.privacy.label would lose the link to the privacy policy in about:devtools, so it'd just say "I’m okay with Mozilla handling my info as explained in this Privacy Policy." without any idea what the privacy policy is/was.
- about:about would lose some formatting in a description. We probably don't care that much.
- about:networking would lose a link to documentation about http logging.
- about:telemetry would lose all links to documentation and web-based dashboards.
- about:support would lose links to support.m.o, and potentially break in other ways depending on how code copes with links that it expects to be there, not being there.
So, not straightforwardly. We could potentially try to write some kind of complicated patch for all the important cases here, where we load the DTD string into a dummy unprivileged document and try to work out based on that where the relevant markup needs to go into the 'real' document, but it'd be a lot of work, bug-prone, and potentially problematic from a security perspective, and we're about to release esr68...
We can't normally touch the strings because l10n, AIUI. I have no idea what state fluent is in on 60; it's possible we could re-run some of the conversion scripts and get around the issues above that way, but that too would be a lot of work without any scope to figure out issues. :flod, thoughts?
I think we likely have a better shot at uplifting bug 1539598, ie enabling a prompt to install the langpack. Andrew, can you confirm (I'm a bit confused about some of the discussion there re: things running in the same process as the disco pane...)? Dan, how would you feel about that as a mitigation?
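The stripping behavior described in comment 61 (nested elements removed, text content kept) can be sketched roughly as follows; this is only an approximation of the effect for illustration, not the actual expat-level change:

```python
from html.parser import HTMLParser

class MarkupStripper(HTMLParser):
    # Collects only text content, dropping element tags entirely,
    # approximating what happens to a localized string that
    # (illegally) contains nested elements.
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def strip_markup(value):
    stripper = MarkupStripper()
    stripper.feed(value)
    stripper.close()
    return "".join(stripper.parts)

print(strip_markup("Foo <em>Bar</em> Baz"))  # Foo Bar Baz
```

This makes concrete why links and images are the painful cases: for "Click <a href='x'>here</a>" the text survives but the destination is simply gone.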
Comment 62•5 years ago
Taking the question for flod:
ESR 60 is on Fluent 0.6.2, which has syntax changes compared to today. Also, there's been a ton of bug fixes since Fluent 0.6.2, so it's unclear that we can use the same coding patterns as on 68, or that we'd get the same performance characteristics. So neither the code nor the migrations can be assumed to just apply to 60. For migration code, we'd need to check if we can use a better version than the one in tree. Probably, but it's trial and error.
For the l10n side of things, we'd need to create a rel-branch for each locale, based on the current esr 60 state, apply the migrations to those branches (100 times), and update the l10n-changesets.json to take all of those changes.
We might break Thunderbird as we go, so they'd need to update, too. No idea if they're OK to use the same base versions.
Comment 63•5 years ago
(In reply to :Gijs (he/him) from comment #61)
I think we likely have a better shot at uplifting bug 1539598, ie enabling a prompt to install the langpack. Andrew, can you confirm (I'm a bit confused about some of the discussion there re: things running in the same process as the disco pane...)? Dan, how would you feel about that as a mitigation?
The actual patch is pretty straightforward, and the mozAddonManager code hasn't been changing much, so from eyeballing it, it looks like it should apply without any trouble. The description of that bug isn't very clear; it would cause the "This site would like to install an add-on in Firefox" notification to appear for any add-on installation request coming from a content process (even without that patch we always show that prompt for extensions; the patch changes it so that a prompt is shown for themes, langpacks, etc.)
Comment 64•5 years ago
Per the recent comments, trying to backport this to ESR60 this late in its life cycle is too invasive and regression-prone for a late-RC uplift. We've uplifted bug 1538015 and bug 1539598 to help mitigate the sandbox escape on that branch and will let the remaining mitigations ride with 68.
Comment 65•5 years ago
I think we likely have a better shot at uplifting bug 1539598, ie enabling a prompt to install the langpack. [...] Dan, how would you feel about that as a mitigation?
Your searchfox query is convincing, even apart from knowing of places where localizations add more markup that won't be found by that query. I guess bug 1539598 will have to do.
Comment 66•5 years ago
I just want to note that we probably could uplift a less extreme version of this patch that allowed a whitelisted set of elements, or otherwise did some sanitization or blacklist checks, to ESR. It's mostly a question of whether we think it's worth the effort.
Alternatively, we could just make sure we do similar checks on AMO before signing any third-party langpacks. I suspect that would require the less effort of the two options.
Reporter
Comment 67•5 years ago
(In reply to Kris Maglione [:kmag] from comment #66)
I just want to note that we probably could uplift a less extreme version of this patch that allowed a whitelisted set of elements, or otherwise did some sanitization or blacklist checks, to ESR. It's mostly a question of whether we think it's worth the effort.
I thought about this, but we'd want to keep allowing img and a at a minimum, and we'd then need to also whitelist attributes on those, and in the case of href, parse the value, ... Doing all that inside expat while it's dealing with DTDs seemed very unappealing, especially given it'd get no test coverage on other branches before being released to ESR users.
Alternatively, we could just make sure we do similar checks on AMO before signing any third-party langpacks. I suspect that would require the less effort of the two options.
This seems like a more sensible solution - we could more easily leverage existing libraries to parse DTD string content and check that it doesn't include any untoward markup. Who would be able to do work on something like that?
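A minimal sketch of the kind of AMO-side allowlist check being proposed here (the element set, function names, and attribute handling are all hypothetical; a real check would also need to validate attributes such as href, which this deliberately omits):

```python
import re

# Hypothetical allowlist; the actual set AMO would accept is an open question.
ALLOWED_ELEMENTS = {"em", "strong", "br"}

# Matches the element name of any opening or closing tag in a string.
TAG_RE = re.compile(r"</?([a-zA-Z][a-zA-Z0-9]*)")

def entity_value_ok(value):
    """Reject a DTD entity value if it contains any element outside the
    allowlist. Attribute checking (e.g. parsing href on <a>) is
    deliberately omitted and would need its own validation."""
    return all(tag.lower() in ALLOWED_ELEMENTS
               for tag in TAG_RE.findall(value))

print(entity_value_ok("Foo <em>Bar</em>"))            # True
print(entity_value_ok('Click <a href="x">here</a>'))  # False
```

Running a check like this over every DTD file in a submitted langpack before signing would catch injected elements without any client-side parser changes.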
Comment 68•5 years ago
(In reply to :Gijs (he/him) from comment #67)
(In reply to Kris Maglione [:kmag] from comment #66)
I just want to note that we probably could uplift a less extreme version of this patch that allowed a whitelisted set of elements, or otherwise did some sanitization or blacklist checks, to ESR. It's mostly a question of whether we think it's worth the effort.
I thought about this, but we'd want to keep allowing img and a at a minimum, and we'd then need to also whitelist attributes on those, and in the case of href, parse the value, ... Doing all that inside expat while it's dealing with DTDs seemed very unappealing, especially given it'd get no test coverage on other branches before being released to ESR users.
Sure. I definitely agree that this is nontrivial. But it's also probably doable if we think it's worth the effort, so I figured it was at least worth mentioning the possibility.
Alternatively, we could just make sure we do similar checks on AMO before signing any third-party langpacks. I suspect that would require the less effort of the two options.
This seems like a more sensible solution - we could more easily leverage existing libraries to parse DTD string content and check that it doesn't include any untoward markup. Who would be able to do work on something like that?
I don't know at this point. Typically AMO issues are tracked on GitHub, but security bugs are tracked on Bugzilla in the addons.mozilla.org :: Security component. Presumably if we file the bug there, someone will CC the appropriate people. Either way, I can make sure it gets the attention of the people who can make that determination.
Reporter
Updated•5 years ago
Updated•4 years ago
Comment 69•3 years ago
Comment 70•3 years ago
Backed out for causing failures at browser_elementindtd.js.
Backout link: https://hg.mozilla.org/integration/autoland/rev/b154dbcf77ea1ed32b57d12c77dbbdcf7842775b
Failure log: https://treeherder.mozilla.org/logviewer?job_id=367815763&repo=autoland&lineNumber=2427
- and the lint failure: https://treeherder.mozilla.org/logviewer?job_id=367813151&repo=autoland&lineNumber=347
Comment 71•3 years ago
Comment 72•3 years ago
Backed out for causing failures at browser_elementindtd.js.
Backout link: https://hg.mozilla.org/integration/autoland/rev/a9f709011cab5f379efac6b8cf5bfe720a532e32
Push with failures: https://treeherder.mozilla.org/jobs?repo=autoland&resultStatus=testfailed%2Cbusted%2Cexception%2Cretry%2Cusercancel&revision=b5c1408b6ad487b723e5c813d53782d57058af53
Failure log: https://treeherder.mozilla.org/logviewer?job_id=367830326&repo=autoland&lineNumber=2587
Comment 73•3 years ago
Assignee
Updated•3 years ago
Comment 74•3 years ago
bugherder
Description