Closed
Bug 1007774
Opened 11 years ago
Closed 10 years ago
Add a target line for performance in datazilla
Categories
(Datazilla Graveyard :: Metrics, defect, P1)
Tracking
(Not tracked)
RESOLVED
WONTFIX
People
(Reporter: hub, Assigned: hub)
References
Details
(Keywords: perf, Whiteboard: [c=automation p=3 s= u=])
Attachments
(2 files)
This is a perf Q2 goal. We need to add this target line per test in order to be able to visualize how far we are from target.
Updated•11 years ago
Priority: -- → P1
Whiteboard: [c=automation p= s= u=]
Updated•11 years ago
Updated•10 years ago
Assignee: nobody → hub
Status: NEW → ASSIGNED
Updated•10 years ago
Whiteboard: [c=automation p= s= u=] → [c=automation p=3 s= u=]
Comment 1•10 years ago
Jeads suggests adding a config file in the datazilla repo that contains the target values for each app+test+device combination. The UI would then load the config file and just plot the lines. The UI would also have a checkbox for toggling the lines on/off.
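A minimal sketch of what such a config file might look like (purely illustrative; the key names, test names, and goal values below are assumptions, not an agreed-upon schema):

```json
{
  "goals": {
    "flame": {
      "startup > moz-app-visually-complete": 1000
    },
    "hamachi": {}
  }
}
```

Keying by device and then by test name keeps each combination addressable with two lookups; apps could be folded into the test name or added as another nesting level.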
Comment 2•10 years ago
That would work. I was thinking something like that.
Updated•10 years ago
Priority: P1 → P2
Updated•10 years ago
Priority: P2 → P3
Updated•10 years ago
Severity: normal → major
Priority: P3 → P1
Comment 3•10 years ago
Here's what I think we should do: for each performance test, attach the performance goal so that it is persisted and datazilla can plot it. Then we need some changes on the datazilla side. I have a proof of concept.
Also I do need the performance goal stated.
The testsuite will have them by device.
Comment 4•10 years ago
The current data set is pure fantasy.
Attachment #8467915 - Flags: review?(eperelman)
Comment 5•10 years ago
The only launch test value that has a performance goal associated with it is hitting "moz-app-visually-complete" by 1000ms. I am unsure if there are goals for other metrics like memory or overfill.
Comment 6•10 years ago
fxOS Performance Criteria & Goals are always accessible here: https://wiki.mozilla.org/FirefoxOS/Performance/Release_Acceptance
We currently have latency for application first launch and are working to also define memory usage for that scenario.
Component: General → Metrics
Comment 7•10 years ago
Work being done in bug 1049031 will require this to be rebased.
Depends on: 1049031
Comment 8•10 years ago
Updated the PR.
Updated•10 years ago
Attachment #8467915 - Flags: review?(eperelman) → review+
Comment 9•10 years ago
Now we need to get it plotted.
Comment 10•10 years ago
Jonathan, do you think we can get this plotted on datazilla?
We have an optional "mozPerfGoal" value submitted with each perf result. We'd like to have it displayed alongside the actual perf value.
Thanks,
(landing will happen soon as gaia is closed right now)
Flags: needinfo?(jgriffin)
Comment 11•10 years ago
Comment 12•10 years ago
Or maybe :jeads you are the better contact?
Flags: needinfo?(jeads)
Comment 13•10 years ago
Jeads, any guesses as to how much work it is? If it's not trivial, Eli said he'd be willing to step in and implement the front-end side, assuming the mozPerfGoal values are making it into the datazilla db.
Flags: needinfo?(jgriffin)
Comment 14•10 years ago
We're not currently submitting mozPerfGoal to datazilla. We'll need to modify the mozperf poster to handle that, but to do that, we'll have to decide how we want to handle this.
The easiest way would require no datazilla changes, but would require a little extra work on the part of people who are interested in the goal, and that would be to submit the goal as an extra "app" that could be selected or de-selected. So, for example, when selecting "startup_>_moz-app-visually-complete" as the test, you might see the following under apps: "calendar", "calendar-goal", "camera", "camera-goal", etc.
The other option is to add the goal to the metadata we submit with each test, and modify datazilla to display it. This doesn't fit in with the current data model (see e.g., https://datazilla.mozilla.org/b2g/refdata/objectstore/json_blob/360433), so the work to support this might be non-trivial.
Comment 15•10 years ago
This might not be much work if I understand it correctly. The UI could independently ingest the config file, https://github.com/mozilla-b2g/gaia/blob/master/tests/performance/config.json, and use the device/test names to map it to the correct set of options for display in the performance graph in the datazilla b2g UI. This would require that exactly the same device/test nomenclature is used as what is sent to datazilla. If those match, this approach would work and would probably make more sense than repeatedly sending the same target value in the test data json.
Flags: needinfo?(jeads)
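A rough sketch of the lookup described above, assuming a hypothetical config shape keyed by device and then by test name (the function name and schema here are illustrative, not the real config.json or datazilla code):

```javascript
// Hypothetical sketch: look up the goal for the device/test being graphed.
// The config shape and names are assumptions, not the real config.json schema.
function goalFor(config, device, testName) {
  var deviceGoals = config.goals && config.goals[device];
  if (!deviceGoals) {
    return undefined; // no goals defined for this device
  }
  return deviceGoals[testName]; // e.g. 1000 (ms), or undefined if unset
}

var config = {
  goals: {
    flame: { "startup > moz-app-visually-complete": 1000 }
  }
};
```

If the device/test names sent to datazilla match the config keys exactly, the UI only needs this lookup plus a horizontal line series on the graph.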
Comment 16•10 years ago
I caught this while testing on hamachi, because we have an empty goal set on hamachi.
Attachment #8473101 - Flags: review?(eperelman)
Updated•10 years ago
Attachment #8473101 - Flags: review?(eperelman) → review+
Comment 17•10 years ago
Comment 18•10 years ago
The modifications to datazilla to display the goals in https://github.com/mozilla-b2g/gaia/blob/master/tests/performance/config.json#L14 have been made.
Note, the goals will only display for a device name of "flame" because that's the only entry in config.json. If the intention was to apply the goals to the flame-512MB and flame-319MB devices, just add entries for them in the goals structure in config.json and they will show up on the datazilla graph.
Also, there is a discrepancy between the test name listed in config.json and the one sent to datazilla in the json objects:
config.json: "startup event test > * > startup > moz-app-visually-complete"
json objects: "startup_>_moz-app-visually-complete"
To make these match I'm replacing the underscores with spaces:
https://github.com/mozilla/datazilla/blob/master/datazilla/webapp/static/js/b2g_apps/PerformanceGraphComponent.js#L198
This is a bit brittle; it would be great if we could match the test name in config.json exactly without having to modify it. It will work for now, but let me know if this changes in the future.
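The underscore-to-space step described above amounts to something like the following (a sketch only; the real code lives in PerformanceGraphComponent.js at the link above):

```javascript
// Sketch of the normalization described above: the json objects use
// underscores where config.json uses spaces, so convert before comparing.
function normalizeTestName(name) {
  // 'startup_>_moz-app-visually-complete' -> 'startup > moz-app-visually-complete'
  return name.replace(/_/g, ' ');
}
```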
Comment 19•10 years ago
You shouldn't even be aware of the config.json.
The test should have a value for the perf goal if there is one. Also, the json object is transformed AFTER we run the test, so we are not the ones changing these names; the datazilla submitter is. See comment 14, :jgriffin knows more about that.
Also, you should be aware that since the perf team has been reorganized (i.e. disbanded), there is nobody to own this bug / task anymore. So the work may all be pointless.
Comment 20•10 years ago
Datazilla is being deprecated this quarter.
Status: ASSIGNED → RESOLVED
Closed: 10 years ago
Resolution: --- → WONTFIX