Closed
Bug 1390092
Opened 7 years ago
Closed 7 years ago
Measure various performance times inside devtools
Categories
(DevTools :: General, defect, P2)
Tracking
(Not tracked)
RESOLVED
WORKSFORME
People
(Reporter: miker, Assigned: miker)
References
(Blocks 1 open bug)
Details
Attachments
(2 files)
I will attach the patch when I have finished untangling it.
- Mostly just updating React components... the main mechanisms are in:
  - devtools/shared/perfService.js
  - Prefs are in modules/libpref/init/all.js (see the sketch below)
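For context, a pref like this would be declared in all.js roughly as follows; the pref name comes from the patch reviewed below, while the default value shown is an assumption:

```
// Sketch of the pref declaration in modules/libpref/init/all.js.
// The pref name is taken from the patch; the default value is assumed.
pref("devtools.performance.measure.timing", false);
```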
Assignee
Comment 1 • 7 years ago
As a base we need to find a way to easily add something to **all** of our React components without monkeypatching React.
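One common pattern for this, shown here purely as a sketch (the `perfMixin` export is hypothetical, and this is not necessarily what the patch does), is to append a shared mixin at the point where components are defined rather than patching React itself:

```
// Sketch: a wrapper around React.createClass that appends a shared mixin,
// so every component picks up the perf hooks without monkeypatching React.
const React = require("devtools/client/shared/vendor/react");
const { perfMixin } = require("devtools/shared/perfService"); // hypothetical export

function createPerfAwareClass(spec) {
  spec.mixins = (spec.mixins || []).concat(perfMixin);
  return React.createClass(spec);
}
```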
Comment hidden (mozreview-request)
Assignee
Comment 3 • 7 years ago
Comment 4 • 7 years ago
I would like to suggest Alex as a reviewer of the perfService.js file.
Alex, do you mind taking a look at this file?
Flags: needinfo?(poirot.alex)
Comment 5 • 7 years ago
mozreview-review
Comment on attachment 8897012 [details]
Bug 1390092 - Measure various performance times inside devtools
https://reviewboard.mozilla.org/r/168296/#review175940
Looks good to me
Attachment #8897012 - Flags: review?(jlaster) → review+
Comment 6 • 7 years ago
mozreview-review
Comment on attachment 8897012 [details]
Bug 1390092 - Measure various performance times inside devtools
https://reviewboard.mozilla.org/r/168296/#review175968
What is the typical usage of this patch?
It looks like it is meant to be used with perf-html and its timeline.
If so, I would suggest removing "dumpMeasures", unless I'm missing some alternative usage.
https://perfht.ml/2vSBPe5
PerfService.js is quite significant compared to React itself: 50ms versus 300ms.
processValues is by far the most expensive method, followed by getEntriesByName and the timingEnabled getter.
It would be great to reduce this footprint, as the tool pollutes its own data.
We may think some steps are slower than they really are, which is the worst issue a performance tool can have :/
Also, did you play with this new feature against a performance regression that we fixed recently? Is it able to clearly spot the regression? How? By showing long measures and/or too many marks?
Finally, finding performance issues is great, but do you have plans for preventing regressions, like running things in CI? My experience is that any measure based on timing is very hard to track, but maybe we could assert a precise number of markers for a given test?
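To make that last idea concrete, here is a rough sketch of a marker-count assertion; the `openAndWaitForToolbox` helper and the expected count are hypothetical, while `performance.getEntriesByType` and the mochitest-style `is()` assertion are standard:

```
// Sketch: assert on the *number* of performance marks a user action produces,
// rather than on timings, which are too noisy for CI.
const EXPECTED_MARKS = 42; // hypothetical baseline for this action
const before = window.performance.getEntriesByType("mark").length;
await openAndWaitForToolbox(); // hypothetical helper for the action under test
const after = window.performance.getEntriesByType("mark").length;
// A sudden jump in mark count means components are re-rendering more often.
is(after - before, EXPECTED_MARKS, "action produced the expected number of marks");
```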
::: devtools/shared/perfService.js:15
(Diff revision 1)
> +
> +this._PerfService = function _PerfService(options) {
> + // For testing, so that we can use a fake Window.performance object with
> + // known state.
> + if (options && options.performanceObj) {
> + this._perf = options.performanceObj;
It looks like this is never used, if that's really the case, we should strip that.
::: devtools/shared/perfService.js:28
(Diff revision 1)
> + this.stopMeasureOnce = this.stopMeasureOnce.bind(this);
> + this.dumpMeasures = this.dumpMeasures.bind(this);
> + this.storeValue = this.storeValue.bind(this);
> +
> + this.reset();
> +};
nit: what about using ES6 classes to define PerfService?
nit2: The `_` prefix isn't really helpful; I would call the class PerfService.
::: devtools/shared/perfService.js:54
(Diff revision 1)
> + * @returns An EventTarget which is a callback that will be called when the
> + * resourcetimingbufferfull event is fired.
> + */
> + get onresourcetimingbufferfull() {
> + return this._perf.onresourcetimingbufferfull;
> + },
All three of these properties seem to be unused and should be removed (navigation, timing, onresourcetimingbufferfull),
as well as clearResourceTimings, now, setResourceTimingBufferSize, toJSON, timeOrigin, startMeasureOnce and stopMeasureOnce.
::: devtools/shared/perfService.js:57
(Diff revision 1)
> + get onresourcetimingbufferfull() {
> + return this._perf.onresourcetimingbufferfull;
> + },
> +
> + get timingEnabled() {
> + return Services.prefs.getBoolPref("devtools.performance.measure.timing");
You are accessing `timingEnabled` in many places, and calling getBoolPref isn't cheap.
What about reading its value only once, on startup, in the `_PerfService` constructor?
`this.timingEnabled = Services.prefs.getBoolPref("devtools.performance.measure.timing");`
It sounds reasonable to say that this pref requires a devtools restart.
::: devtools/shared/perfService.js:305
(Diff revision 1)
> + getNextMarkIndex(mark) {
> + for (let i = 1; i < 9999; i++) {
> + if (this.getEntriesByName(`${mark}-${i}`, "mark").length === 0) {
> + return i;
> + }
> + }
This looks like brute force, and it explains why we see getEntriesByName in the profiler results.
What about using a Map(mark => count) to implement this efficiently (and no longer having to call perf.getEntriesByName)?
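A minimal sketch of that suggestion, assuming a `_markIndexes` Map is initialised in the constructor (the field name is hypothetical):

```
// Sketch: keep a per-mark counter in a Map instead of probing
// window.performance with getEntriesByName() in a loop.
getNextMarkIndex(mark) {
  const index = (this._markIndexes.get(mark) || 0) + 1;
  this._markIndexes.set(mark, index);
  return index;
},
```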
::: devtools/shared/perfService.js:316
(Diff revision 1)
> + */
> + reset() {
> + this.once = new Set();
> + this.values = new Map();
> + this._perf.clearMarks();
> + this._perf.clearMeasures();
I don't follow why you have to clear marks and measures on PerfService instantiation.
Also, `reset` is only called from the PerfService constructor, so you may as well inline it there.
::: devtools/shared/perfService.js:356
(Diff revision 1)
> + throw new Error(`No marks with the name ${name}`);
> + }
> +
> + let mostRecentEntry = entries[entries.length - 1];
> + return this._perf.timeOrigin + mostRecentEntry.startTime;
> + },
This doesn't seem to be used and should be stripped.
::: devtools/shared/perfService.js:380
(Diff revision 1)
> + } else {
> + console.log(name, duration, value);
> + }
> + }
> + console.groupEnd();
> + },
Is this actually useful?
Whether in command-line output or in the Browser Console, it is too verbose to read.
Maybe it was useful before you registered the mixin on every component, but now that every single component emits multiple timings, it is an extremely long list of values!
::: devtools/shared/perfService.js:406
(Diff revision 1)
> + this.processValues(this.props, prevNextProps, "prop", stop);
> + this.processValues(this.state, prevNextState, "state", stop);
> + this.processValues(this.state, prevNextContext, "context", stop);
> + },
> +
> + processValues(storedValues, prevNextValues, type, stop) {
`prevNextValues` is quite a confusing name.
Shouldn't this be called `previousValues`?
What does "next" mean here?
::: devtools/shared/perfService.js:408
(Diff revision 1)
> + this.processValues(this.state, prevNextContext, "context", stop);
> + },
> +
> + processValues(storedValues, prevNextValues, type, stop) {
> + let measure = stop ? perfService.stopMeasure : perfService.startMeasure;
> + let displayName = this.getDisplayName();
Move these two lines after the `if (!prevNextValues)` check, to avoid unnecessary computation.
Also, it may be a good idea to cache the display name value, as computing it looks non-trivial.
::: devtools/shared/perfService.js:426
(Diff revision 1)
> + perfService.storeValue(mark, newValue);
> + } else if (storedValue !== newValue) {
> + let mark = `${displayName}-${type}-changed-${key}`;
> + measure(mark);
> + perfService.storeValue(mark, newValue);
> + }
You can share more code between these two branches, and also avoid the intermediate "measure" variable, which is hard to follow:
```
let action = typeof storedValue === "undefined" ? "added" : "changed";
let mark = `${displayName}-${type}-${action}-${key}`;
if (stop) {
perfService.stopMeasure(mark);
} else {
perfService.startMeasure(mark);
}
perfService.storeValue(mark, newValue);
```
Also, shouldn't we call startMeasure/stopMeasure after storeValue?
Updated • 7 years ago
Flags: needinfo?(poirot.alex)
Comment 7 • 7 years ago
It looks like this kind of data could be useful to any React application.
Did you give any thought to offering such a feature via a devtools add-on, or as a built-in devtools feature?
Assignee
Comment 8 • 7 years ago
(In reply to Alexandre Poirot [:ochameau] from comment #7)
> It looks like this kind of data could be useful to any React application.
> Did you give any thought to offering such a feature via a devtools add-on, or
> as a built-in devtools feature?
Thanks for the feedback.
I think that for now it is best as a Firefox pref because it is useful for us, but we could create an in-content version for the React add-on if the React folks are okay with that.
Comment 9 • 7 years ago
Mike and I just had a quick chat about this. Mike says his patches already make it possible for him to find React components to be optimized (via shouldComponentUpdate). So I think a good way to progress would be for Mike to use this locally to find components to optimize, file bugs and fix them. This way it gives us time to evaluate the value of this performance "framework" and make it better.
Obviously, if you're able to land this bug soon, great. If not, the code here still appears to have value, and improving the performance of our UI is the outcome we're looking for anyway.
Comment 10 • 7 years ago
Some storytelling about how you are using it will be very useful,
so do not hesitate to provide details if you submit a patch thanks to the information it provides.
Assignee
Comment 11 • 7 years ago
mozreview-review
Comment on attachment 8897012 [details]
Bug 1390092 - Measure various performance times inside devtools
https://reviewboard.mozilla.org/r/168296/#review176358
::: devtools/shared/perfService.js:305
(Diff revision 1)
> + getNextMarkIndex(mark) {
> + for (let i = 1; i < 9999; i++) {
> + if (this.getEntriesByName(`${mark}-${i}`, "mark").length === 0) {
> + return i;
> + }
> + }
What the heck was I thinking? Now using a map, great call.
::: devtools/shared/perfService.js:316
(Diff revision 1)
> + */
> + reset() {
> + this.once = new Set();
> + this.values = new Map();
> + this._perf.clearMarks();
> + this._perf.clearMeasures();
Yes, that was a relic from past iterations.
::: devtools/shared/perfService.js:380
(Diff revision 1)
> + } else {
> + console.log(name, duration, value);
> + }
> + }
> + console.groupEnd();
> + },
It is for the Browser Console.
I know it is a long list, but it is the only way to see the props and state values, so that you can target the objects passed via shouldComponentUpdate, which is vital for optimizing React's rendering.
::: devtools/shared/perfService.js:406
(Diff revision 1)
> + this.processValues(this.props, prevNextProps, "prop", stop);
> + this.processValues(this.state, prevNextState, "state", stop);
> + this.processValues(this.state, prevNextContext, "context", stop);
> + },
> +
> + processValues(storedValues, prevNextValues, type, stop) {
Sometimes logPerfStats() is called from componentWillUpdate() and sometimes from componentDidUpdate().
When it is called from componentWillUpdate() it receives the next props, next state or next context.
When it is called from componentDidUpdate() it receives the previous props, previous state or previous context.
So prevNextValues means either the previous or the next props, state or context.
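A minimal sketch of that wiring, for illustration only (the exact hook signatures and the `stop` flag are assumptions based on the description above, not the literal patch):

```
// The same helper receives "next" values before the update and
// "previous" values after it, hence the prevNext* naming.
componentWillUpdate(nextProps, nextState, nextContext) {
  // Before the update: start measures against the upcoming values.
  this.logPerfStats(nextProps, nextState, nextContext, /* stop */ false);
},

componentDidUpdate(prevProps, prevState, prevContext) {
  // After the update: stop measures against the values we just left behind.
  this.logPerfStats(prevProps, prevState, prevContext, /* stop */ true);
},
```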
::: devtools/shared/perfService.js:408
(Diff revision 1)
> + this.processValues(this.state, prevNextContext, "context", stop);
> + },
> +
> + processValues(storedValues, prevNextValues, type, stop) {
> + let measure = stop ? perfService.stopMeasure : perfService.startMeasure;
> + let displayName = this.getDisplayName();
Done and Done.
::: devtools/shared/perfService.js:426
(Diff revision 1)
> + perfService.storeValue(mark, newValue);
> + } else if (storedValue !== newValue) {
> + let mark = `${displayName}-${type}-changed-${key}`;
> + measure(mark);
> + perfService.storeValue(mark, newValue);
> + }
Agreed, that is easier to understand.
We need to call storeValue after startMeasure because startMeasure increments the index.
Comment hidden (mozreview-request)
Assignee
Comment 13 • 7 years ago
mozreview-review-reply
Comment on attachment 8897012 [details]
Bug 1390092 - Measure various performance times inside devtools
https://reviewboard.mozilla.org/r/168296/#review175968
Sorry I took so long to get to the second round of reviews, but I had a huge number of merge conflicts to fix.
We need dumpMeasures because it logs useful information to the Browser Console, like render time and any payload that was in props or state. It is a shame that console.table doesn't work from privileged code, because then we could really tidy up the output.
There is a performance hit. I can try to reduce it, but the measurements are all relative, so the slowest things revealed by the tool will still be the slowest when the tool isn't running; this isn't a huge issue... we are not accessing the DOM, so maybe we could spin these measurements off to worker threads.
Yes, we really would like to add some of these tests to DAMP.
I have now stopped changing the names as I was before, e.g. someComponent to some-component, because it wastes cycles and the original names make it more obvious that these are component names.
> It looks like this is never used, if that's really the case, we should strip that.
No, we can use this to allow tests, e.g. DAMP, to access the performance data as you suggested above.
I have just added exports.perfServiceForTesting, a version to be used in tests: it allows passing in a performanceObj (a fake window.performance).
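A rough sketch of how a test might exercise that; the shape of the fake object and the exact form of the export are assumptions, only the `performanceObj` option comes from the diff above:

```
// Sketch: inject a fake window.performance so mark/measure calls are
// recorded deterministically instead of hitting the real timeline.
const calls = [];
const fakePerformance = {
  mark(name) { calls.push(["mark", name]); },
  measure(name, startMark, endMark) { calls.push(["measure", name]); },
  getEntriesByName() { return []; },
  clearMarks() {},
  clearMeasures() {},
};
const perf = perfServiceForTesting({ performanceObj: fakePerformance });
perf.startMeasure("my-mark");
// "calls" now contains the recorded mark, which the test can assert on.
```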
> All three of these properties seem to be unused and should be removed (navigation, timing, onresourcetimingbufferfull),
> as well as clearResourceTimings, now, setResourceTimingBufferSize, toJSON, timeOrigin, startMeasureOnce and stopMeasureOnce.
My original idea was to use this instead of window.performance, so I included all of the methods, but doing it this way is a little slimmer so I am happy with that.
I have removed navigation, timing, onresourcetimingbufferfull, clearResourceTimings, now, setResourceTimingBufferSize, toJSON, timeOrigin, startMeasureOnce and stopMeasureOnce.
Comment 14 • 7 years ago
Thanks, the patch looks better; I would still need to test it again.
Now I'm especially interested in an example that demonstrates the value of this:
you finding a regression thanks to this data, and describing how you identified and addressed one performance issue.
At first I thought you were using perf-html, but it looks like you are mostly using the Browser Console?
Comment hidden (mozreview-request)
Assignee
Comment 16 • 7 years ago
A huge part of the reason for the timing mixin was automated testing using DAMP, but Alex has convinced me that timing is unreliable, so we can't really use it for tests.
That said, 99% of this patch is about providing mixins for *all* of our React components, so it is still very useful.
I will fold this patch into the patch from bug 1390093 and remove the timing stuff so it can be used with whydidyouupdate instead.
I will also quickly create a new mixin that can be used to identify exactly which properties should be added to shouldComponentUpdate().
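A minimal sketch of what such a mixin could look like, for illustration only (the mixin name and logging format are hypothetical, not the actual patch):

```
// Sketch: log which props actually changed between renders, i.e. the
// candidates worth checking in shouldComponentUpdate().
const ChangedPropsMixin = {
  componentDidUpdate(prevProps) {
    const changed = Object.keys(this.props)
      .filter(key => this.props[key] !== prevProps[key]);
    if (changed.length) {
      console.log(`${this.constructor.displayName || "Component"} changed props:`, changed);
    }
  },
};
```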
Assignee
Comment 17 • 7 years ago
I am swallowing my pride and merging this patch with the one from bug 1390093. It will include whydidyouupdate and also be a general "add a mixin to all our React components" patch.
That way both patches are well worth it.
Assignee
Comment 18 • 7 years ago
This has been merged into bug 1390092
Assignee
Comment 19 • 7 years ago
I meant 1390093
Assignee
Updated • 7 years ago
Status: ASSIGNED → RESOLVED
Closed: 7 years ago
Resolution: --- → WORKSFORME
Updated • 6 years ago
Product: Firefox → DevTools
Updated • 5 years ago
Attachment #8897012 - Flags: review?(poirot.alex)