Closed Bug 1080301 Opened 10 years ago Closed 10 years ago

[e10s] Content process leaks from timing info changes

Categories

(Core :: Networking, defect)

Priority: Not set
Severity: normal

Tracking

RESOLVED FIXED
Tracking Status: e10s + ---

People

(Reporter: mccr8, Assigned: valentin)

References

Details

(Keywords: memory-leak, regression, Whiteboard: [MemShrink])

Attachments

(1 file)

At some point in the last week, we started leaking about 800kb in the content process on Mochitest-4 e10s tests. The leaks include documents and nsGlobalWindows. Leak checking is not currently enabled on TBPL, so this is with some other patches I have applied. Anyway, I did a bisection, and the leaks are apparently being caused by this changeset: https://hg.mozilla.org/mozilla-central/rev/afecbcdbe6d5

I suspect this may also be causing multi-megabyte leaks in other Mochitest plain test suites, but I haven't confirmed that.
tracking-e10s: --- → ?
Attached patch patch to show leaks (deleted) — Splinter Review
With this patch applied, you can reproduce the leak with:

    ./mach mochitest-plain --e10s --total-chunks 5 --this-chunk 4

Some of the patches in this stack have already landed, so applying it to trunk may be a little weird.
In Mochitest-4, various object elements are keeping the windows alive. They are from the test dom/tests/mochitest/dom-level2-html/files/document-with-applet.html. Here is one of the paths:

0x7f426878f9c0 [FragmentOrElement (xhtml) object http://mochi.test:8888/tests/dom/tests/mochitest/dom-level2-html/files/document-with-applet.html]
    --[mAttrsAndChildren[i]]--> 0x7f426bacc0c0 [FragmentOrElement (xhtml) applet http://mochi.test:8888/tests/dom/tests/mochitest/dom-level2-html/files/document-with-applet.html]
    --[[via hash] mListenerManager]--> 0x7f4269bad8f0 [EventListenerManager]
    --[mListeners event=onoverflow listenerType=3 [i]]--> 0x7f42689aaf10 [CallbackObject]
    --[mCallback]--> 0x7f426819f160 [JS Object (Function - PluginContent.prototype.handleEvent/resizeListene]
    --[fun_environment]--> 0x7f426c5ecec0 [JS Object (Block)]
    --[overlay]--> 0x7f426d9eaf80 [JS Object (Proxy)]
    --[private]--> 0x7f4268021b20 [JS Object (HTMLDivElement)]
    --[parent]--> 0x7f4268672080 [JS Object (Proxy)]
    --[private]--> 0x7f426969dd00 [JS Object (HTMLDocument)]
    --[type_proto]--> 0x7f42667ce550 [JS Object (HTMLDocumentPrototype)]
    --[type_proto]--> 0x7f4269130500 [JS Object (DocumentPrototype)]
    --[getter]--> 0x7f4269356580 [JS Object (Function - ontouchstart)]
    --[parent]--> 0x7f426939a7e0 [JS Object (Window)]
    --[CLASS_OBJECT(Function)]--> 0x7f42679bb8b0 [JS Object (Performance)]
    --[UnwrapDOMObject(obj)]--> 0x7f4269465670 [DOMEventTargetHelper ]
    --[mParentPerformance]--> 0x7f4269421a60 [DOMEventTargetHelper http://mochi.test:8888/tests/dom/tests/mochitest/dom-level2-html/test_HTMLDocument08.html]
    --[mWindow]--> 0x7f4269f53c00 [nsGlobalWindow #1009 inner http://mochi.test:8888/tests/dom/tests/mochitest/dom-level2-html/test_HTMLDocument08.html]

You can see near the end where performance stuff gets involved, though I don't know how it might be related to the actual object element that is being held alive.
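For readers who don't look at CC logs often: the last couple of edges in that path are ordinary strong C++ members, which the cycle collector can traverse. A rough sketch of their shape, with illustrative class and member names modeled on the log (this is a sketch, not actual Gecko code; the real objects are DOMEventTargetHelper subclasses):

    #include "nsCOMPtr.h"
    #include "nsPIDOMWindow.h"
    #include "mozilla/RefPtr.h"

    // Sketch of the C++ tail of the path above. Both members are edges
    // the cycle collector can see, so if the window still leaks, something
    // outside this graph must be holding one of these objects alive.
    class PerformanceSketch {
      // the --[mParentPerformance]--> edge between the two
      // DOMEventTargetHelper nodes in the log
      RefPtr<PerformanceSketch> mParentPerformance;
      // the --[mWindow]--> edge to the leaked nsGlobalWindow
      nsCOMPtr<nsPIDOMWindow> mWindow;
    };

In other words, the path itself doesn't show a broken edge; it only shows that the window is reachable from the leaked element through the performance objects.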
Hi Andrew, thanks for investigating this. I think I understand the issue. Will look into it tonight.
Flags: needinfo?(valentin.gosu)
Great! Running just the dom-level2-html directory seems to reproduce the leak for me:

    ./mach mochitest-plain --e10s dom/tests/mochitest/dom-level2-html/
I'm actually seeing a leak that looks very similar to this in Nightly as well, and it is causing ghost windows. A path much like the one in comment 2 is holding a ghost window alive.
Do you mind if I back out parts 2 and 3 while you investigate? This is a pretty bad leak on e10s.
Actually, I'm just going to go ahead and back it out for now.
Assignee: nobody → valentin.gosu
Flags: needinfo?(valentin.gosu)
The backout is good enough to close this.
Status: NEW → RESOLVED
Closed: 10 years ago
Resolution: --- → FIXED
I've got some interesting results from a try push: https://treeherder.mozilla.org/ui/#/jobs?repo=try&revision=34b1ba73199d

It seems that the leaks are intermittent, and are due to nsPerformanceTiming holding a reference to the channel. I did a try run with a dummy timed channel, and it seemed to fix the issue. Also interesting is that the leaks don't occur in the resource_timing.html tests, but in unrelated tests. I'll try to change the implementation so that nsPerformanceTiming doesn't need to keep the channels around just for the timing info.
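To make that last idea concrete, here is a minimal sketch of the snapshot approach, under the assumption that copying the timestamps up front is acceptable. The class name is hypothetical; the nsITimedChannel getters used are the real ones, taking mozilla::TimeStamp out-parameters. The point is simply that no reference to the channel survives the constructor:

    #include "nsITimedChannel.h"
    #include "mozilla/TimeStamp.h"

    // Hypothetical sketch, not necessarily the patch that landed: copy the
    // timing values out of the channel up front so the timing object never
    // has to keep the channel (and everything it points back at) alive.
    class TimingSnapshotSketch {
    public:
      explicit TimingSnapshotSketch(nsITimedChannel* aChannel) {
        if (aChannel) {
          aChannel->GetAsyncOpen(&mAsyncOpen);
          aChannel->GetDomainLookupStart(&mDomainLookupStart);
          aChannel->GetDomainLookupEnd(&mDomainLookupEnd);
          aChannel->GetConnectStart(&mConnectStart);
          aChannel->GetConnectEnd(&mConnectEnd);
          aChannel->GetRequestStart(&mRequestStart);
          aChannel->GetResponseStart(&mResponseStart);
          aChannel->GetResponseEnd(&mResponseEnd);
        }
        // Deliberately no nsCOMPtr<nsITimedChannel> member: once the
        // constructor returns, this object cannot extend the channel's
        // lifetime, so it cannot pin a document or window through it.
      }

      mozilla::TimeStamp AsyncOpen() const { return mAsyncOpen; }

    private:
      mozilla::TimeStamp mAsyncOpen;
      mozilla::TimeStamp mDomainLookupStart;
      mozilla::TimeStamp mDomainLookupEnd;
      mozilla::TimeStamp mConnectStart;
      mozilla::TimeStamp mConnectEnd;
      mozilla::TimeStamp mRequestStart;
      mozilla::TimeStamp mResponseStart;
      mozilla::TimeStamp mResponseEnd;
    };

Whatever shape the real fix takes, the invariant to preserve is the same: nothing long-lived should hold the channel just to read timestamps later.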