Closed Bug 1245121 Opened 9 years ago Closed 9 years ago

Enable JSON Viewer on Nightly

Categories: DevTools :: JSON Viewer (defect)
Priority: Not set
Severity: normal

Status: RESOLVED FIXED
Target Milestone: Firefox 47
Tracking Status: firefox47 fixed

People: Reporter: jryans, Assigned: jryans

Keywords: dev-doc-complete

Attachments: 1 file

We should at least enable it on Nightly.
Attachment #8714884 - Flags: review?(odvarko) → review+
Comment on attachment 8714884 [details]
MozReview Request: Bug 1245121 - Enable JSON Viewer on RELEASE_BUILD. r=Honza

https://reviewboard.mozilla.org/r/33255/#review30109

Looks good to me!
Honza
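For context, the attachment adjusts how the JSON Viewer's default is gated on the build type. A minimal sketch of what a build-gated pref default can look like in a preprocessed Firefox prefs file; the file, the pref name (devtools.jsonview.enabled), and the exact gating are assumptions for illustration, not the contents of the actual patch:

    // Hypothetical excerpt from a preprocessed prefs file (e.g. a devtools.js
    // prefs file). The default is flipped depending on whether this is a
    // release build, so Nightly/Dev. Ed. get the feature while release does not.
    #ifdef RELEASE_BUILD
    pref("devtools.jsonview.enabled", false);
    #else
    pref("devtools.jsonview.enabled", true);
    #endif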
Status: ASSIGNED → RESOLVED
Closed: 9 years ago
Resolution: --- → FIXED
Target Milestone: --- → Firefox 47
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
What is scarier is that this backout has made linux64 tart very unstable:
https://treeherder.mozilla.org/#/jobs?repo=fx-team&filter-searchStr=talos%20svg%20e10s&tochange=8618a85c5801&fromchange=b2d75ac5ba0f

Oh, the random balancing acts of running tests!

:jryans, any thoughts on this? I am inclined to leave the perf regressions in if that is needed to get stability.
Flags: needinfo?(jryans)
(In reply to Joel Maher (:jmaher) from comment #8)
> What is scarier is that this backout has made linux64 tart very unstable:
> https://treeherder.mozilla.org/#/jobs?repo=fx-team&filter-searchStr=talos%20svg%20e10s&tochange=8618a85c5801&fromchange=b2d75ac5ba0f
>
> Oh, the random balancing acts of running tests!
>
> :jryans, any thoughts on this? I am inclined to leave the perf regressions
> in if that is needed to get stability.

Hmm... not sure what past stability was like. It seems somewhat stable looking further ahead? I don't see any way for this change to break the talos test itself. Aren't the regressions quite large here?
Flags: needinfo?(jryans) → needinfo?(jmaher)
Going back in history: when this patch originally landed, it fixed the instability in the test:
https://treeherder.mozilla.org/#/jobs?repo=fx-team&filter-searchStr=talos%20svg%20e10s%20linux&startdate=2016-02-01&enddate=2016-02-05

And when it was backed out, the stability problems started again (previous link).

Yes, we have a lot of perf issues with this; maybe we need to debug what is going on with the tart test. One other factor is that bug 1246695 (landed Feb 10th on inbound) fixed some tart timeout/crash issues which showed up back in January. Oddly enough, it was uplifted to Aurora and then backed out for frequent talos tart timeouts!

Is it possible that bug 1246695 is conflicting with the JSON Viewer somehow?
Flags: needinfo?(oyiptong)
Flags: needinfo?(jryans)
Flags: needinfo?(jmaher)
(In reply to Joel Maher (:jmaher) from comment #10)
> Going back in history: when this patch originally landed, it fixed the
> instability in the test:
> https://treeherder.mozilla.org/#/jobs?repo=fx-team&filter-searchStr=talos%20svg%20e10s%20linux&startdate=2016-02-01&enddate=2016-02-05
>
> And when it was backed out, the stability problems started again (previous
> link).
>
> Yes, we have a lot of perf issues with this; maybe we need to debug what is
> going on with the tart test. One other factor is that bug 1246695 (landed
> Feb 10th on inbound) fixed some tart timeout/crash issues which showed up
> back in January. Oddly enough, it was uplifted to Aurora and then backed
> out for frequent talos tart timeouts!
>
> Is it possible that bug 1246695 is conflicting with the JSON Viewer somehow?

Do we know much about the stability issue itself? Looking at logs like https://treeherder.mozilla.org/logviewer.html#?job_id=7384068&repo=fx-team, it seems like the test process just crashes without saying much?

At the moment, JSON Viewer is off, so I would not expect it to be the cause of issues itself... Perhaps enabling it resolved some elusive mystery problem? At the same time, I don't believe we load any JSON files during the tart test (just actual web pages), so I would not expect the JSON Viewer to affect things there.

The file https://dxr.mozilla.org/mozilla-central/source/devtools/client/jsonview/converter-observer.js is loaded in all cases (on or off) as a process script. It checks a pref, which is currently disabled on Nightly. When the pref is off, the other modules aren't loaded. At the moment, I don't see how this pref observer could be related to tart stability.
Flags: needinfo?(jryans) → needinfo?(jmaher)
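For reference, a minimal sketch (not the actual converter-observer.js) of the pref-gated process script pattern described above: the observer is always loaded, but heavier modules are only pulled in when the pref is enabled. The pref name and module path below are assumptions for illustration.

    // Sketch of a pref-gated process script (chrome JS, e10s child process).
    const { utils: Cu } = Components;
    Cu.import("resource://gre/modules/Services.jsm");

    const PREF = "devtools.jsonview.enabled"; // assumed pref name

    function maybeInit() {
      let enabled = false;
      try {
        enabled = Services.prefs.getBoolPref(PREF);
      } catch (e) {
        // Pref not defined; treat as disabled.
      }
      if (!enabled) {
        // While the feature is off, nothing else gets loaded.
        return;
      }
      // Only now would the real converter modules be loaded
      // (hypothetical path), e.g.:
      // require("devtools/client/jsonview/converter-child");
    }

    // Re-check whenever the pref flips, and once at startup.
    const observer = {
      observe() {
        maybeInit();
      },
    };
    Services.prefs.addObserver(PREF, observer, false);
    maybeInit();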
I do understand that it is off, and that for some odd reason enabling the JSON Viewer actually stopped this test from hanging. I do wonder if we are loading .json from somewhere outside of the test, which is causing us to behave differently. This is e10s only; is it possible that the JSON Viewer has some code which is not behind the preference and we are hitting it?

I also want to see what :mconley has to say, as he helped write tart and knows a bit more about e10s.
Flags: needinfo?(jmaher) → needinfo?(mconley)
So this looks like the TART test is timing out because the test sometimes doesn't kick off once the tart.html page loads. This sounds to me like a race, and somehow the JSON Viewer caused one side to win the race more often. I'm not sure where the race is, I'm afraid. :/
Flags: needinfo?(mconley)
How do we fix this? Run this test under rr and then debug and fix it? We could get a loaner machine and go that route.

This patch clearly fixed the problem, and the problem came back on backout. While this isn't the only patch to fall prey here, could we not use what we have here to help figure out what might be going on? As it stands, with 30-40% failure rates on e10s, I am worried we will end up turning this test off for e10s; that isn't ideal.
Flags: needinfo?(mconley)
I haven't looked at the JSON Viewer code yet, but I can speak to bug 1246695: it fixes a race condition that happens when TART changes the newtab URL. It seems pretty distant from what the JSON Viewer does. But I'll try to take a look and see if I can be helpful.
Flags: needinfo?(oyiptong)
I don't think I'm going to be much help here in the short term. What's sorely needed, IMO, is a rewrite of TART so that it is easier to reason about, with e10s support baked in as opposed to tacked on. This is not a knock against anyone who's worked on the TART code (myself included!); it's just the reality of the situation: we have a test here that seems really, really touchy for poorly understood reasons.
Flags: needinfo?(mconley)
:jmaher, we'd like to try enabling this again, because we want to match what's already enabled on Dev. Ed. However, I am guessing it will cause various Talos regressions again. Is it okay to accept them? It seems like it would be better to gate on / fix up Talos regressions when we prepare to enable this beyond Dev. Ed, as this bug only wants the feature enabled for Dev. Ed and Nightly.
Flags: needinfo?(jmaher)
That seems reasonable. Can we get this in before the uplift on Monday?
Flags: needinfo?(jmaher)
Status: REOPENED → RESOLVED
Closed: 9 years ago
Resolution: --- → FIXED
Product: Firefox → DevTools