Closed Bug 1337010 Opened 8 years ago Closed 8 years ago

2.5 - 2.76% kraken (linux64) regression on push 380a46afcf858c4990b3f334da54246dfc2f1156 (Sat Feb 4 2017)

Categories

(Core :: JavaScript Engine: JIT, defect)

Version: 53 Branch
Type: defect
Priority: Not set
Severity: normal

Tracking


Status: RESOLVED FIXED
Target Milestone: mozilla54
Tracking Status
firefox-esr45 --- fixed
firefox52 --- fixed
firefox53 --- fixed
firefox54 --- fixed

People

(Reporter: jmaher, Assigned: jandem)

References

Details

(Keywords: perf, regression, talos-regression)

Talos has detected a Firefox performance regression from push 380a46afcf858c4990b3f334da54246dfc2f1156. As author of one of the patches included in that push, we need your help to address this regression.

Regressions:

  3%  kraken summary linux64 pgo    1523.43 -> 1565.51
  3%  kraken summary linux64 opt    1574.92 -> 1614.36

You can find links to graphs and comparison views for each of the above tests at: https://treeherder.mozilla.org/perf.html#/alerts?id=5022

On the page above you can see an alert for each affected platform as well as a link to a graph showing the history of scores for this test. There is also a link to a treeherder page showing the Talos jobs in a pushlog format.

To learn more about the regressing test(s), please see: https://wiki.mozilla.org/Buildbot/Talos/Tests

For information on reproducing and debugging the regression, either on try or locally, see: https://wiki.mozilla.org/Buildbot/Talos/Running

*** Please let us know your plans within 3 business days, or the offending patch(es) will be backed out! ***

Our wiki page outlines the common responses and expectations: https://wiki.mozilla.org/Buildbot/Talos/RegressionBugsHandling
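(Side note, not part of the Talos report: the 2.5 - 2.76% range in the bug summary is just the relative increase of the two Kraken summary scores above, where a higher score is worse. A minimal check:)

```cpp
#include <cstdio>

int main() {
  struct Run { const char* name; double before, after; };
  const Run runs[] = {
    {"linux64 pgo", 1523.43, 1565.51},
    {"linux64 opt", 1574.92, 1614.36},
  };
  for (const Run& r : runs) {
    // Relative increase of the Kraken summary score (lower is better).
    double pct = (r.after - r.before) / r.before * 100.0;
    std::printf("%s: %.2f%% regression\n", r.name, pct);  // prints 2.76% and 2.50%
  }
  return 0;
}
```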
:jandem, can you look into this and determine if there is a fix we can do or if we have to document this and accept it?
Component: Untriaged → JavaScript Engine: JIT
Flags: needinfo?(jdemooij)
Product: Firefox → Core
Tomorrow I'll see if I can reproduce this. I don't think there was any change on AWFY (Windows and OS X), and Kraken is very sensitive to code-alignment changes. I doubt there's much we can or should do, but I'll take a look.
I can't reproduce this locally (Linux64, browser and shell) but I'm trying some things on Try. Unfortunately my Talos jobs have been queued for a while so it's going to take some time.
I may know what's causing this, but I need to do a couple more Try pushes/retriggers to confirm. If I'm right it should be an easy fix. Stay tuned.
My Try pushes confirm that js::GenerateRandomSeed is slowing down Linux64 *content processes* on Try. I don't see this locally, so it probably has to do with sandboxing or an older kernel on the test machines. I'm changing this code in bug 1337561 to no longer call GenerateRandomSeed on every allocation; that should fix this regression as well (a rough sketch of the idea is below).
Depends on: 1337561
Flags: needinfo?(jdemooij)
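For context, here is a sketch of the general idea, not the actual patch in bug 1337561: instead of paying for an OS entropy call on every allocation, call GenerateRandomSeed() once to seed a cheap non-cryptographic PRNG and draw from that afterwards. The GenerateRandomSeed() stub and the AllocationRandom class below are illustrative stand-ins, not SpiderMonkey code:

```cpp
#include <cstdint>
#include <random>

// Stand-in for js::GenerateRandomSeed(), which gathers entropy from the OS
// (e.g. /dev/urandom or getrandom()) and is relatively expensive per call.
static uint64_t GenerateRandomSeed() {
  std::random_device rd;
  return (uint64_t(rd()) << 32) | rd();
}

// Hypothetical allocator-side helper: seed a fast xorshift-style PRNG once,
// lazily, and reuse it for every subsequent allocation instead of going back
// to the OS each time.
class AllocationRandom {
 public:
  uint64_t next() {
    if (!seeded_) {
      state_ = GenerateRandomSeed() | 1;  // one expensive call; state must be non-zero
      seeded_ = true;
    }
    // xorshift64: fast and non-cryptographic, which is fine for allocation jitter.
    state_ ^= state_ << 13;
    state_ ^= state_ >> 7;
    state_ ^= state_ << 17;
    return state_;
  }

 private:
  uint64_t state_ = 0;
  bool seeded_ = false;
};
```

With something like this, the per-allocation cost drops from a system call (which is presumably what sandboxing or the older kernel on the Talos machines makes expensive) to a few shifts and xors.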
Note that there's still a small regression according to that graph, but I think that's because bug 1334187 introduced an additional Kraken regression in the meantime...
this looks good, thanks for fixing this!
Status: NEW → RESOLVED
Closed: 8 years ago
Flags: needinfo?(jmaher)
Resolution: --- → FIXED
Assignee: nobody → jdemooij
Target Milestone: --- → mozilla54