Closed Bug 1254898 Opened 9 years ago Closed 2 years ago

Add a talos test for video performance with basic compositor

Categories

(Core :: Graphics, defect, P3)

Tracking

RESOLVED WONTFIX
mozilla50
Tracking Status
firefox50 --- fixed

People

(Reporter: jrmuizel, Unassigned)

References

(Blocks 1 open bug)

Details

(Whiteboard: [gfx-noted])

Attachments

(4 files, 22 obsolete files)

(deleted), video/mp4
(deleted), patch
(deleted), patch (avih: review+)
(deleted), patch (avih: review+)
No description provided.
Whiteboard: [gfx-noted]
Assignee: nobody → ethlin
Sounds like a good idea. I'm not familiar with video playback in Firefox, but I'm guessing we're interested in tracking decoding throughput with some interesting/representative encoding profiles (resolutions, codecs, bitrates, maybe containers)? Are we also interested in playback fidelity (decoding bugs, color management?), though that would belong more in unit tests than in perf tests. What about encoding perf? Even if we don't get to it right away, it would be useful to get some background and bounce around some ideas for approaches. Jeff? This bug is rather sparse thus far ;)
(In reply to Avi Halachmi (:avih) from comment #1)

Basically we just want to avoid regressions in our decode/presentation pipeline. I'm interested in getting something sooner rather than later, as our current performance with the basic compositor is atrocious. Something as simple as frame times for a short 720p vp9 video in windowed and fullscreen mode would be a great place to start. Testing correctness and encoding shouldn't be needed here.
Syd, Ryan mentioned you're running some youtube tests in Marionette. Care to elaborate a bit about it? (bug number, what it does, is it considered effective, etc). Thanks.
Flags: needinfo?(spolk)
Flags: needinfo?(spolk)
(In reply to Syd Polk :sydpolk from comment #4)
> https://developer.mozilla.org/en-US/docs/Mozilla/QA/external-media-tests
>
> I am still developing this document.

Thanks, I've read all of it, but as far as I can tell, this document deals exclusively with the test setup. It doesn't mention what's being tested, what the numbers represent, how results are collected or crunched, what the pass/fail criteria are (or whether there are any), etc. So what does it actually test?
It tests that video playback occurs when the given URL is loaded. The default test makes sure that the video starts, then plays 2 minutes of the video and verifies that there are no significant stalls. Other tests could be written to measure performance if the page or Firefox somehow makes those numbers available via JavaScript. However, that would be a separate effort using this framework.
I tried to add a talos test for video performance, but I ran into a problem: I couldn't get the real FPS from the compositor. Jeff, do you have any idea how to measure video performance with the basic compositor?
Flags: needinfo?(jmuizelaar)
(In reply to Ethan Lin[:ethlin] from comment #7)
> I tried to add a talos test for video performance, but I ran into a problem:
> I couldn't get the real FPS from the compositor. Jeff, do you have any idea
> how to measure video performance with the basic compositor?

Please post your WIP patch someplace so we can reference it.

Grabbing the FPS is possibly a good goal, but it won't be very useful on its own if we always end up at 60 fps (or 24 or whatever) because the system is good enough to never drop frames.

What we need here, IMO, is a test which does something like "try to play this clip as fast as possible, and tell us how fast we managed". Grabbing the fps (when playing as fast as possible) is one way, but grabbing the overall playback time is equivalent to the average frame duration or fps, and probably much easier to measure. Of course, the overall time won't show us how the durations are distributed, but since talos tests typically report a single number anyway, it might be good enough, and probably a good first goal for this test.

So the thing to figure out is how to make a clip play as fast as possible, or something which could yield results of similar quality, IMO.
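To spell out the equivalence mentioned above (this restatement is just arithmetic, not from the original comment): if a clip presents N frames in T milliseconds of wall-clock playback, then the average frame duration is T / N ms and the effective rate is 1000 * N / T fps, so the overall playback time, the average frame duration, and the average fps all carry the same information.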
I will upload a simple WIP without the score calculation first, to allow further discussion. I agree that if we play a video as fast as possible, the total playback time could serve as the performance score for the talos test. But I think the video demuxer normally presents each frame according to its dts/pts, or based on the video's fps, and may drop a frame when it's overdue. I am not sure how to play a video as fast as possible, since each frame has a timestamp. Or is there a config to play video without dropping frames?

About the fps method: it's possible to create a video with 240fps or more. There is a config, media.navigator.video.max_fr, to change Gecko's max video fps, and we also need to set ASAP mode in the talos test. So if we can get the real fps from the compositor, fps may be a workable way to measure video playback performance in Gecko.
Flags: needinfo?(jmuizelaar)
Attached patch talos test for video. (obsolete) (deleted) — Splinter Review
Uploading a WIP. The video source is copied from mochitest. The performance measurement is wrong for now; I will find another way to measure video playback performance.
Attached patch talos test for video. (obsolete) (deleted) — Splinter Review
In this patch, I try to use MozAfterPaint to get the real frame count from the compositor. But MozAfterPaint doesn't seem to work for video, so the result is wrong. Avi, do you have any idea how to play a video as fast as possible?
Flags: needinfo?(avihpit)
Attachment #8733267 - Attachment is obsolete: true
Can't we use HTMLVideoElement::MozPaintedFrames() for video paint count?
(In reply to Ethan Lin[:ethlin] from comment #11)
> Created attachment 8738858 [details] [diff] [review]
> talos test for video.

Thanks.

> Avi, do you have any idea how to play a video as fast as possible?

No. I think this would be a question for the media guys. A possible approach I have in mind is to either control the playback speed (if the video element supports that - which I don't know), or use a video with an exceptionally high frame rate which Firefox clearly shouldn't be able to handle - and somehow make the player not drop frames (i.e. force it to compose all frames and not only decode them). Are there any preferences which might help here? Or another approach? Thomas?
Flags: needinfo?(avihpit) → needinfo?(tdaede)
(In reply to Sotaro Ikeda [:sotaro] from comment #12)
> Can't we use HTMLVideoElement::MozPaintedFrames() for video paint count?

I think I can use this to get the correct frame count. I will upload another patch to use HTMLVideoElement.mozPaintedFrames as the score. Sotaro, if I want to play a video with 120fps or higher, should I change any Gecko pref? It seems to be limited to 60fps now.
Flags: needinfo?(sotaro.ikeda.g)
Attached patch talos test for video. (obsolete) (deleted) — Splinter Review
Use HTMLVideoElement.mozPaintedFrames as the score.
Attachment #8738858 - Attachment is obsolete: true
Comment on attachment 8738912 [details] [diff] [review]
talos test for video.

Review of attachment 8738912 [details] [diff] [review]:
-----------------------------------------------------------------

So, what kind of output does it produce, with some sample video etc.?

::: testing/talos/talos/tests/video/video.manifest
@@ +1,1 @@
> +% http://localhost/tests/video/video_playback.html?auto=true

I don't think I see where 'auto=true' is used... ?

::: testing/talos/talos/tests/video/video_playback.html
@@ +17,5 @@
> +
> +function videoEnded() {
> + var vdo = document.getElementById("vdo");
> + data.names.push("video");
> + data.values.push(vdo.mozPaintedFrames);

Most talos throughput tests report the average duration of a single frame (so lower is better). If possible, it's preferable to use the same approach here too. The advantage of reporting an average rather than an overall count is that we can "tune" the test case with different videos etc. and still get comparable results. Also, it's easier to understand when someone says "on my system the average for this video is 10ms/frame" rather than "it painted 1327 frames". It doesn't matter much whether the averaging is applied here or just before the tpRecordTime call, but to get the overall duration - here is probably better.
(In reply to Ethan Lin[:ethlin] from comment #14)
> I think I can use this to get the correct frame count. I will upload another
> patch to use HTMLVideoElement.mozPaintedFrames as the score.
> Sotaro, if I want to play a video with 120fps or higher, should I change any
> Gecko pref? It seems to be limited to 60fps now.

The composition fps can be changed with "layout.frame_rate". It also changes the refresh driver's frame rate.

/***
 * The preference "layout.frame_rate" has 3 meanings depending on the value:
 *
 * -1 = Auto (default), use hardware vsync or software vsync @ 60 hz if hw vsync fails.
 *  0 = ASAP mode - used during talos testing.
 *  X = Software vsync at a rate of X times per second.
 */

https://dxr.mozilla.org/mozilla-central/source/gfx/thebes/gfxPlatform.cpp#2234
Flags: needinfo?(sotaro.ikeda.g)
(In reply to Sotaro Ikeda [:sotaro] from comment #12)
> Can't we use HTMLVideoElement::MozPaintedFrames() for video paint count?

ImageContainer also exposes a dropped image count via GetDroppedImageCount(), but it is not exposed by HTMLVideoElement.

https://dxr.mozilla.org/mozilla-central/source/gfx/layers/ImageContainer.h#543
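As a concrete illustration of the counters being discussed, here is a minimal page-script sketch (illustrative only, not from any attached patch; the element id 'vdo' is an assumption, and getVideoPlaybackQuality availability depends on the Firefox version, so it is feature-detected):

  var vdo = document.getElementById('vdo');   // assumed <video id="vdo"> element

  // Gecko-prefixed counter used in this bug:
  var painted = vdo.mozPaintedFrames;         // frames actually painted by the compositor

  // Unprefixed dropped/total frame counts, where supported:
  if (vdo.getVideoPlaybackQuality) {
    var q = vdo.getVideoPlaybackQuality();
    console.log('dropped ' + q.droppedVideoFrames + ' of ' + q.totalVideoFrames + ' frames');
  }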
Attached patch talos test for video. (obsolete) (deleted) — Splinter Review
I changed the unit to ms/frame. The test plays a 120fps video at 4x in fullscreen mode. Ideally, the best possible result is 2.08 ms/frame.
Attachment #8738912 - Attachment is obsolete: true
Attachment #8740302 - Flags: review?(avihpit)
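For reference, the "ideal" figure above is just the arithmetic of that setup (not spelled out in the bug): a 120 fps clip played at 4x is effectively 480 frames per second, so the best possible result is 1000 ms / 480 frames, which is about 2.08 ms/frame.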
Attached patch Add a talos test for video. (obsolete) (deleted) — Splinter Review
Fix a small bug.
Attachment #8740302 - Attachment is obsolete: true
Attachment #8740302 - Flags: review?(avihpit)
Attachment #8740304 - Flags: review?(avihpit)
Comment on attachment 8740304 [details] [diff] [review]
Add a talos test for video.

Review of attachment 8740304 [details] [diff] [review]:
-----------------------------------------------------------------

Looks good, thanks. Some comments in no specific order:

1. I think that while we're at it, we'd probably want to test more than one view configuration and then report the numbers as sub-tests. I was thinking normal (1:1) size, full screen, double size (even if it overflows the window size), and 1.1x size. The 1.1x size covers an area very similar to 1:1, but might expose issues when scaling is involved regardless of the area size. We can leave the view modes to last, after we're confident of the basics.

2. I think the clip's own properties should be standard where possible. Specifically, the clip should probably play natively at one of 23.976/24/25/29.97/30/59.94/60 fps (rather than 120). I'd probably pick 59.94, since 60fps clips are all the rage and we should be able to handle them well. The resolution should probably be 16:9, in either 720p (1280x) or 1080p (1920x). Also, where is this test clip? How long is it? What format (container, codecs)?

3. Unless it introduces issues, I think we should aim at more than 4x (or 8x with ~60fps). So maybe a 59.94 fps clip and x20 speed? What are the implications of the speed in practice? Assuming it doesn't drop frames, I'd expect x10 and x100 to produce effectively identical results - is that indeed the case?

4. You have these prefs:

> preferences = {'full-screen-api.allow-trusted-requests-only': False,
> 'layers.acceleration.force-enabled': False,
> 'layers.acceleration.disabled': True,
> 'layout.frame_rate': 0,
> 'docshell.event_starvation_delay_hint': 1,
> 'dom.send_after_paint_to_content': False}

These seem to use ASAP mode and disable the more advanced compositor (do correct me if I'm wrong), which might indeed be what the bug title suggests we want ("basic compositor").

Jeff, is this a good configuration to test the kind of playback performance you're interested in? Any other comments on 1/2/3 above? Anything else we might want to test here which didn't come up?
Attachment #8740304 - Flags: review?(avihpit)
Flags: needinfo?(jmuizelaar)
(In reply to Avi Halachmi (:avih) from comment #21)
> Jeff, is this a good configuration to test the kind of playback performance
> you're interested in? Any other comments on 1/2/3 above? Anything else we
> might want to test here which didn't come up?

Nope. That certainly sounds thorough enough. Even the bare minimum of testing is sufficient for me.
Flags: needinfo?(jmuizelaar)
(In reply to Avi Halachmi (:avih) from comment #21)
> 1. I think that while we're at it, we'd probably want to test more than one
> view configuration and then report the numbers as sub-tests. I was thinking
> normal (1:1) size, full screen, double size (even if it overflows the
> window size), and 1.1x size.
>
> We can leave the view modes to last, after we're confident of the basics.

Okay, I will add these view modes to the test.

> 2. I think the clip's own properties should be standard where possible.
> Specifically, the clip should probably play natively at one of
> 23.976/24/25/29.97/30/59.94/60 fps (rather than 120). I'd probably pick 59.94.
>
> Also, where is this test clip? How long is it? What format (container, codecs)?

The format of the test clip in my patch is H264@1920x1080@60fps, and the duration is about 3 seconds. Do you have any suggested video clip with 59.94fps?

> 3. Unless it introduces issues, I think we should aim at more than 4x (or 8x
> with ~60fps). So maybe a 59.94 fps clip and x20 speed? What are the
> implications of the speed in practice? Assuming it doesn't drop frames, I'd
> expect x10 and x100 to produce effectively identical results - is that indeed
> the case?

The max playback rate in Gecko is 5x [1]. With a 5x playback rate, we can get 300fps from a 60fps clip or 600fps from a 120fps clip. My current patch targets 480fps (so the best possible score is about 2ms/frame). I got about 70 ms/frame on a Macbook Pro (i7 2.5G). I think 600fps should be enough for a long time, since Moore's law is over. I prefer 600fps, because this test is just about performance and a standard clip format is not necessary. What do you think?

[1] https://dxr.mozilla.org/mozilla-central/source/dom/html/HTMLMediaElement.cpp#152

> 4. You have these prefs:
> > preferences = {'full-screen-api.allow-trusted-requests-only': False,
> > 'layers.acceleration.force-enabled': False,
> > 'layers.acceleration.disabled': True,
> > 'layout.frame_rate': 0,
> > 'docshell.event_starvation_delay_hint': 1,
> > 'dom.send_after_paint_to_content': False}
>
> These seem to use ASAP mode and disable the more advanced compositor (do correct
> me if I'm wrong), which might indeed be what the bug title suggests we want
> ("basic compositor").
>
> Jeff, is this a good configuration to test the kind of playback performance
> you're interested in? Any other comments on 1/2/3 above? Anything else we
> might want to test here which didn't come up?
Flags: needinfo?(avihpit)
Attached video testsrc.480p.120fps.mp4 (deleted) —
(In reply to Ethan Lin[:ethlin] from comment #23)
> > We can leave the view modes to last, after we're confident of the basics.
>
> Okay, I will add these view modes to the test.

Only after we're happy with the rest ;)

> The max playback rate in Gecko is 5x [1].

Yeah, I noticed this too, empirically :) So I agree a 120fps clip and 5x speed would be OK for this test. Note, however, that it's possible to reach this limit with a light enough clip (and/or hw acceleration enabled) - which I've confirmed. I actually managed to get ~1000fps with a small clip @240fps. But for this test, 600fps is a reasonable limit.

> The format of the test clip in my patch is H264@1920x1080@60fps, and the
> duration is about 3 seconds.

So I got the patch with the clip and ran the test locally. IMO it's too short. There's some (expected) transitional behavior when the clip starts, which means that the first ~1-3 seconds play at a different (slower) speed compared to the rest. Also, there's the fullscreen warning/message, which seems to affect playback performance, and sometimes it starts with a throbber overlay for a second or so. All of these affect the results, and while it would be nice to measure them too, the first goal should be to measure the stable behavior after all those init things have ended.

I suggest the following:

1. Make a longer test clip, say, 1 minute. With ffmpeg you can do this to get a 480p clip @120fps (which I've attached):
> ffmpeg -f lavfi -i testsrc=rate=120 -aspect 16:9 -vf scale=848:480 -t 60 -pix_fmt yuv420p testsrc.480p.120fps.mp4

2. Shorten the fullscreen message duration with full-screen-api.warning.timeout=500 (originally 3000).

3. To further avoid penalties and inaccuracies at the measurement start/stop, I suggest starting the playback without measuring anything, waiting some seconds (4 is reasonable), then saving the current number of painted frames and the timestamp, then waiting another fixed duration (2-3 secs), and then taking the end timestamp and the current number of painted frames. Finally:

(end - start) / (paintedFramesEnd - paintedFramesStart)

This way, the measurement start/stop will not be affected by loading, initialization, fullscreen transitions, etc.

I used this approach locally and noticed that the performance is greatly affected by the following factors:

1. Scaling
2. Full screen
3. Clip dimensions

So this means we'd eventually want at least 2 sizes (I'd say 480p and 1080p) to let the numbers reflect the relation to the clip dimensions too (and also the few view modes per clip).

And finally, when I ran it in talos locally (Windows 8.1), Firefox refused to go full screen on the first cycle. If I manually click fullscreen, then the test starts and the following cycles go fullscreen automatically. To test this, I added a listener for "fullscreenerror", which indeed got fired. Not sure why it happens on first load, but we'd need to make sure it runs correctly on all OSs, especially Windows.
Flags: needinfo?(avihpit)
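A minimal sketch of the measurement approach described in the comment above (illustrative only - the element id, the constants, and the reporting call are assumptions; the actual attached patch may differ):

  var WARMUP_MS  = 4000;  // let load/fullscreen/throbber transitions settle
  var MEASURE_MS = 3000;  // fixed measurement window

  var vdo = document.getElementById('vdo');   // assumed <video id="vdo"> element
  vdo.playbackRate = 5;                       // play as fast as Gecko allows
  vdo.play();

  setTimeout(function () {
    var startTime    = performance.now();
    var paintedStart = vdo.mozPaintedFrames;

    setTimeout(function () {
      var msPerFrame = (performance.now() - startTime) /
                       (vdo.mozPaintedFrames - paintedStart);
      // Lower is better; a real talos page would report this via tpRecordTime().
      console.log(msPerFrame.toFixed(2) + ' ms/frame');
    }, MEASURE_MS);
  }, WARMUP_MS);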
Attached patch Part1. Add a talos test for video. (obsolete) (deleted) — Splinter Review
Addressed avih's comments. I removed the control bar and changed the timeout of the fullscreen notification. I also added different display modes and resolutions to the test.
Attachment #8740304 - Attachment is obsolete: true
Attachment #8741707 - Flags: review?(avihpit)
Attached patch Part2. Add 240p clips. (obsolete) (deleted) — Splinter Review
Add 240p clips.
Attachment #8741708 - Flags: review?(avihpit)
Attachment #8741707 - Attachment description: Add a talos test for video. → Part1. Add a talos test for video.
Attachment #8741708 - Attachment description: Add 240p clips. → Part2. Add 240p clips.
Attached patch Part3. Add 480p clips. (obsolete) (deleted) — Splinter Review
Add 480p clips.
Attachment #8741709 - Flags: review?(avihpit)
Attached patch Part4. Add 720p clips. (obsolete) (deleted) — Splinter Review
Add 720p clips.
Attachment #8741710 - Flags: review?(avihpit)
Attached patch Part5. Add 720p@240fps clips. (obsolete) (deleted) — Splinter Review
Add 720p@240fps clips.
Attachment #8741712 - Flags: review?(avihpit)
Attached patch Part6. Add 1080p clips. (obsolete) (deleted) — Splinter Review
Add 1080p clips.
Attachment #8741713 - Flags: review?(avihpit)
Attached patch Part1. Add a talos test for video. (obsolete) (deleted) — Splinter Review
Corrected a typo. I tried the test on Windows and fullscreen mode works fine.
Attachment #8741707 - Attachment is obsolete: true
Attachment #8741707 - Flags: review?(avihpit)
Attachment #8741722 - Flags: review?(avihpit)
Comment on attachment 8741722 [details] [diff] [review]
Part1. Add a talos test for video.

Review of attachment 8741722 [details] [diff] [review]:
-----------------------------------------------------------------

(In reply to Ethan Lin[:ethlin] from comment #31)
> I tried the test on Windows and fullscreen mode works fine.

Hmm... for some reason when I run it locally in talos it still doesn't enter full screen (and just hangs forever) until I manually right-click the clip and choose "Full screen".

How do you invoke it within talos? I'm using:
> talos --develop -e path/to/nightly/firefox.exe --tppagecycles 1 --tpcycles 1 -a basic_compositor_video

So, what do the numbers look like when you run this? Do they seem meaningful for all the clips?

::: testing/talos/talos/tests/video/video_playback.html
@@ +20,5 @@
> + ['testsrc.720p.60fps.mp4', 1280, 720],
> + ['testsrc.720p.120fps.mp4', 1280, 720],
> + ['testsrc.720p.240fps.mp4', 1280, 720],
> + ['testsrc.1080p.60fps.mp4', 1920, 1080],
> + ['testsrc.1080p.120fps.mp4', 1920, 1080] ];

1. There's no need to hardcode the clip dimensions. You can use vdo.videoWidth and vdo.videoHeight. See https://developer.mozilla.org/en-US/docs/Web/API/HTMLVideoElement .

2. I don't think we need so many test clips. I uploaded all of them so that we can experiment and then choose 2-3 clips which we'll use for the test. Unless you prefer otherwise, I think using only these three would be good enough: 240p@120fps, 480p@60fps, 1080p@60fps.

3. Let's put the clips in a directory called 'clips' or 'data'.

@@ +27,5 @@
> +var testResult = {names: [], values: []};
> +
> +function init() {
> + vdo = document.getElementById('vdo');
> + vdo.addEventListener('ended', videoEnded, false);

Instead of finishing the test when the clip ends, I think it would be better to "finish" the test some fixed duration after we started measuring (so people can use this test with their own clips too, even if they're much longer than 1 minute). 2-3 seconds of measurement is probably good enough to get stable numbers. So let's put this value as a const/var at the top, together with the warmup period (see below). And, rename videoEnded below to something like measurementEnded.

@@ +33,5 @@
> + readyToStart = true;
> + readyToNext = true;
> + });
> +
> + document.addEventListener('mozfullscreenchange', function(event) {

Is there a good reason to listen to both 'fullscreenchange' and 'mozfullscreenchange'? If there is, then please add a comment explaining it, and also, it would be nicer to have a single function which both listeners register, instead of duplicating the code. The same goes for the other places where you check for two APIs (like exitFullscreen and mozCancelFullScreen, and elsewhere, if there are more).

@@ +55,5 @@
> + } else if (vdo.mozRequestFullScreen) {
> + vdo.mozRequestFullScreen();
> + }
> + } else {
> + readyToStart = true;

If you're waiting for the clip to enter full screen before you start the test, then you should probably also wait for it to exit full screen (from the previous test) before you start the test. Preferably, we wait while the clip is paused. If we don't pause the clip, it would start playback and the fullscreen change could happen when we're already some time into the clip.

@@ +65,5 @@
> +}
> +
> +function startTest() {
> + if (!readyToStart) {
> + setTimeout(startTest, 1000);

Please add a comment explaining what problem this solves. Also, we can probably use a much shorter timeout, or just wire it directly to the fullscreenchange listener.

@@ +74,5 @@
> +
> + setTimeout(function(){
> + start = new Date();
> + paintedFramesStart = vdo.mozPaintedFrames;
> + }, 4000);

So 4000ms here is the "warmup period". Let's put this value in a var/const in the configuration section at the top. Also, in general, performance.now() is more accurate than Date. It doesn't matter much here, but it's still nicer to use it instead of Date.

@@ +78,5 @@
> + }, 4000);
> +}
> +
> +function videoEnded() {
> + var end = new Date();

We should probably pause the video too.

@@ +81,5 @@
> +function videoEnded() {
> + var end = new Date();
> + paintedFramesEnd = vdo.mozPaintedFrames;
> + var timePerFrame = (end - start) / (paintedFramesEnd - paintedFramesStart);
> + testResult.names.push(test[testIndex][0] + '_scale:' + viewMode[viewModeIndex]);

I'm not sure how the "system" would handle a test name with ":" in it. Let's change it to "_".

@@ +110,5 @@
> +
> + if (testIndex >= test.length) {
> + viewModeIndex++;
> + testIndex = 0;
> + }

Your iteration order is such that for every new test you load a different clip (since you cycle the clips, and when you exhaust them you change to the next view mode). I would feel better if we tested each clip at all the view modes and then cycled to the next clip.
Attachment #8741722 - Flags: review?(avihpit) → review-
Attachment #8741708 - Flags: review?(avihpit)
Attachment #8741709 - Flags: review?(avihpit) → review-
Attachment #8741710 - Flags: review?(avihpit)
Attachment #8741712 - Flags: review?(avihpit)
Comment on attachment 8741713 [details] [diff] [review]
Part6. Add 1080p clips.

Review of attachment 8741713 [details] [diff] [review]:
-----------------------------------------------------------------

Let's use only 3 clips. Choose the ones you think are best. See comment 32.
Attachment #8741713 - Flags: review?(avihpit)
Attachment #8741709 - Flags: review-
(In reply to Avi Halachmi (:avih) from comment #32) > Comment on attachment 8741722 [details] [diff] [review] > Part1. Add a talos test for video. > > Review of attachment 8741722 [details] [diff] [review]: > ----------------------------------------------------------------- > > (In reply to Ethan Lin[:ethlin] from comment #31) > > I tried the test on windows and the fullscreen mode is fine. > > Hmm.. for some reason when I run it locally in talos it still doesn't enter > full screen (and just hangs forever) until I manually right click the clip > and then choose "Full screen". > > How do you invoke it within talos? I'm using: > > talos --develop -e path/to/nightly/firefox.exe --tppagecycles 1 --tpcycles 1 -a basic_compositor_video I run the test by using './mach talos-test --suite video'. Does the windows command line show the detail talos commands? I could check it later when I go to the office. It make sense to hang because the test is waiting for fullscreen change. My test env is win10 with nightly. What is yours? If you load the video_playback.html directly on the browser, can it run normally? > > So, how do the numbers look like when you run this? do they seem meaningful > for all the clips? I will reply the question later when I go to the office. > > ::: testing/talos/talos/tests/video/video_playback.html > @@ +20,5 @@ > > + ['testsrc.720p.60fps.mp4', 1280, 720], > > + ['testsrc.720p.120fps.mp4', 1280, 720], > > + ['testsrc.720p.240fps.mp4', 1280, 720], > > + ['testsrc.1080p.60fps.mp4', 1920, 1080], > > + ['testsrc.1080p.120fps.mp4', 1920, 1080] ]; > > 1. There's no need to hardcode the clip dimensions. You can use > vdo.videoWidth and dvo.videoHeight. See > https://developer.mozilla.org/en-US/docs/Web/API/HTMLVideoElement . Okay. > 2. I don't think we need so many test clips. I uploaded all of them so that > we can experiment and then choose 2-3 clips which we'll use at the test. > Unless you prefer otherwise, I think using only these three would be good > enough: 240p@120fps, 480p@60fps, 1080p@60fps. Okay. > 3. Let's put the clips at a directory called 'clips' or 'data'. Okay, I will move clips. > @@ +27,5 @@ > > +var testResult = {names: [], values: []}; > > + > > +function init() { > > + vdo = document.getElementById('vdo'); > > + vdo.addEventListener('ended', videoEnded, false); > > Instead of finishing the test when the clip ends, I think it would be better > "finish" the test some fixed duration after we started measuring (so people > can use this test also with their own clips even if they're much longer than > 1 minute). 2-3 seconds of measurement is probably good enough to get stable > numbers. So let's put this value as const/var also at the top together with > the warmup period (see below). Okay, will do. > And, rename videoEnded below to something like measurementEnded. Okay. > @@ +33,5 @@ > > + readyToStart = true; > > + readyToNext = true; > > + }); > > + > > + document.addEventListener('mozfullscreenchange', function(event) { > > Is there a good reason to listen to both 'fullscreenchange' and > 'mozfullscreenchange' ? If there is, then please add a comment explaining > it, and also, it would be nicer to have a single function which both > listeners register instead of duplicating the code. I think I can leave 'mozfullscreenchange' only. For release version, only 'mozfullscreenchange' works, and for Nightly, both events work well. 
> Same goes for the other places where you check for two APIs (like > exitFullscreen and mozCancelFullscreen, and elsewhere, if there are more). > > @@ +55,5 @@ > > + } else if (vdo.mozRequestFullScreen) { > > + vdo.mozRequestFullScreen(); > > + } > > + } else { > > + readyToStart = true; > > If you're waiting for the clip to enter full screen before we start the > test, then you should probably also wait for it to exit full screen (from > the previous test) before you start the test. Preferably, we're waiting > while the clip is paused. If we're not pausing the clip then it would start > playback and the fullscreen change could happen after we're some time into > the clip. I use 'readyToNext' to wait for exiting full screen. The flow now is: 'runTest' -> 'Change to full screen' -> 'Wait till full screen' -> 'Play' -> 'End' -> 'Exit full screen' -> 'Wait till exiting full screen' -> 'Run next test'. So the play action is after it entered/exited full screen. > > @@ +65,5 @@ > > +} > > + > > +function startTest() { > > + if (!readyToStart) { > > + setTimeout(startTest, 1000); > > Please add a comment explaining what problem this solves. Also, we can > probably use a much shorter timeout, or just directly wire it to the > fullscreenchange listener. Okay, I will try to use the listener. > @@ +74,5 @@ > > + > > + setTimeout(function(){ > > + start = new Date(); > > + paintedFramesStart = vdo.mozPaintedFrames; > > + }, 4000); > > So 4000ms here is the "warmup period" Let's put this value in a var/const at > the configuration section at the top. Okay. > Also, in general, performance.now() is more accurate than Date. It doesn't > matter much here, but still nicer to use it instead of Date. Okay, good to know. I will change it. > @@ +78,5 @@ > > + }, 4000); > > +} > > + > > +function videoEnded() { > > + var end = new Date(); > > We should probably pause the video too. Okay. > @@ +81,5 @@ > > +function videoEnded() { > > + var end = new Date(); > > + paintedFramesEnd = vdo.mozPaintedFrames; > > + var timePerFrame = (end - start) / (paintedFramesEnd - paintedFramesStart); > > + testResult.names.push(test[testIndex][0] + '_scale:' + viewMode[viewModeIndex]); > > I'm not sure how the "system" would handle a test name with ":" in it. Let's > change it to "_". Okay. > @@ +110,5 @@ > > + > > + if (testIndex >= test.length) { > > + viewModeIndex++; > > + testIndex = 0; > > + } > > Your iteration order is such that for every new test you load a different > clip (since you cycle the clips and then when you exhaust them you change to > the next view mode). I would feel better if we test each clip at all the > view modes, and then cycle to the next clip. Okay, sounds good.
(In reply to Ethan Lin[:ethlin] from comment #34)
> I run the test with './mach talos-test --suite video'. Does the Windows
> command line show the detailed talos command? I can check it later when I
> get to the office.
> It makes sense that it hangs, because the test is waiting for the fullscreen
> change. My test env is Win10 with Nightly. What is yours? If you load
> video_playback.html directly in the browser, does it run normally?

Could you check 'full-screen-api.allow-trusted-requests-only' in about:config? I set it to 'false' in test.py.
> I think I can leave 'mozfullscreenchange' only. For the release version, only
> 'mozfullscreenchange' works, and for Nightly, both events work well.

Oh, interesting. Then let's leave both of them. Sounds like a good enough reason to me. Same with the other places then.

> ... So the play action happens after it has entered/exited full screen.

Ah. Hmm... I should check again then. I recall that when I tested the last patch it started playing before it exited fullscreen.

(In reply to Ethan Lin[:ethlin] from comment #35)
> Could you check 'full-screen-api.allow-trusted-requests-only' in
> about:config? I set it to 'false' in test.py.

I did, it's false.

> If you load video_playback.html directly in the browser, does it run normally?

In the same Nightly browser which I use for talos, if I run it normally, set this pref to false and load the HTML file, it works correctly (the test starts and it goes to fullscreen without a request/confirmation) and the test completes successfully.

> It makes sense that it hangs, because the test is waiting for the fullscreen
> change. My test env is Win10 with Nightly. What is yours?

It makes sense that it waits for the change, but the fullscreen request is rejected (I added a listener for the error case to test this). I'm on Windows 8.1 with the normal Nightly binary build provided by Mozilla (I didn't build it myself).

(In reply to Ethan Lin[:ethlin] from comment #34)
> Does the Windows command line show the detailed talos command? I can check it
> later when I get to the office.

Attached below is the entire log until it hangs (but then, if I make it go fullscreen with the context menu, the entire test runs and completes correctly):

> $ talos --develop -e /d/run/firefox-nightly/core~nightly-32/firefox.exe --tppagecycles 1 -a basic_compositor_video --tpcycles 1
> mozversion application_buildid: 20160414030247
> mozversion application_changeset: 91115264629dfaacf2d60d52a3eff89c18c5af0d
> mozversion application_display_name: Nightly
> mozversion application_id: {ec8030f7-c20a-464f-9b0e-13a3a9e97384}
> mozversion application_name: Firefox
> mozversion application_remotingname: firefox
> mozversion application_repository: https://hg.mozilla.org/mozilla-central
> mozversion application_vendor: Mozilla
> mozversion application_version: 48.0a1
> mozversion platform_buildid: 20160414030247
> mozversion platform_changeset: 91115264629dfaacf2d60d52a3eff89c18c5af0d
> mozversion platform_repository: https://hg.mozilla.org/mozilla-central
> mozversion platform_version: 48.0a1
> starting webserver on 'localhost:53471'
> SUITE-START | Running 1 tests
> TEST-START | basic_compositor_video
> Initialising browser for basic_compositor_video test...
> TEST-INFO | started process 9832 (d:\run\firefox-nightly\core~nightly-32\firefox.exe --no-remote -profile u:\tmp\tmplw2ukp\profile http://localhost:53471/getInfo.html)
> PROCESS | 9832 | JavaScript error: jar:file:///d:/run/firefox-nightly/core~nightly-32/omni.ja!/components/Weave.js, line 13: NS_ERROR_NOT_AVAILABLE: Component returned failure code: 0x80040111 (NS_ERROR_NOT_AVAILABLE) [nsIXPCComponents_Utils.import]
> PROCESS | 9832 | [Child 5376] WARNING: pipe error: 109: file c:/builds/moz2_slave/m-cen-w32-ntly-000000000000000/build/src/ipc/chromium/src/chrome/common/ipc_channel_win.cc, line 343
> PROCESS | 9832 | [Child 5376] WARNING: pipe error: 109: file c:/builds/moz2_slave/m-cen-w32-ntly-000000000000000/build/src/ipc/chromium/src/chrome/common/ipc_channel_win.cc, line 343
> PROCESS | 9832 | JavaScript error: jar:file:///d:/run/firefox-nightly/core~nightly-32/omni.ja!/components/Weave.js, line 13: NS_ERROR_NOT_AVAILABLE: Component returned failure code: 0x80040111 (NS_ERROR_NOT_AVAILABLE) [nsIXPCComponents_Utils.import]
> PROCESS | 9832 | __metrics Screen width/height:1920/1080
> PROCESS | 9832 | colorDepth:24
> PROCESS | 9832 | Browser inner width/height: 1010/674
> PROCESS | 9832 | __metrics
> PROCESS | 9832 | JavaScript error: chrome://browser/content/tabbrowser.xml, line 2936: TypeError: this.tabs is undefined
> PROCESS | 9832 | [Child 9580] WARNING: pipe error: 109: file c:/builds/moz2_slave/m-cen-w32-ntly-000000000000000/build/src/ipc/chromium/src/chrome/common/ipc_channel_win.cc, line 343
> PROCESS | 9832 | [Child 9580] WARNING: pipe error: 109: file c:/builds/moz2_slave/m-cen-w32-ntly-000000000000000/build/src/ipc/chromium/src/chrome/common/ipc_channel_win.cc, line 343
> TEST-INFO | 9832: exit 0
> Browser initialized.
> Running cycle 1/1 for basic_compositor_video test...
> TEST-INFO | started process 10752 (d:\run\firefox-nightly\core~nightly-32\firefox.exe --no-remote -profile u:\tmp\tmplw2ukp\profile -tp file:\t:\dev\moz\src\m-c\testing\talos\talos\tests\video\video.manifest.develop -tpchrome -tpnoisy -tpcycles 1 -tppagecycles 1)
> PROCESS | 10752 | RSS: Main: 143843328
> PROCESS | 10752 |
> PROCESS | 10752 | 1460761207146 addons.productaddons WARN Failed downloading XML, status: 0, reason: error
Attached patch Part1. Add a talos test for video. (obsolete) (deleted) — Splinter Review
Address avih's comments.
Attachment #8741722 - Attachment is obsolete: true
Attachment #8742201 - Flags: review?(avihpit)
Attached patch Part2. Add video clips. (obsolete) (deleted) — Splinter Review
Add video clips:
'testsrc.240p.120fps.mp4'
'testsrc.480p.60fps.mp4'
'testsrc.1080p.60fps.mp4'
Attachment #8741708 - Attachment is obsolete: true
Attachment #8741709 - Attachment is obsolete: true
Attachment #8741710 - Attachment is obsolete: true
Attachment #8741712 - Attachment is obsolete: true
Attachment #8741713 - Attachment is obsolete: true
Attachment #8742202 - Flags: review?(avihpit)
(In reply to Avi Halachmi (:avih) from comment #36)
> In the same Nightly browser which I use for talos, if I run it normally, set
> this pref to false and load the HTML file, it works correctly (the test
> starts and it goes to fullscreen without a request/confirmation) and the test
> completes successfully.

I think you may see the log "Request for fullscreen was denied because requesting element is not in the currently focused tab" in the browser's console window. I noticed that if I run the test and use the mouse to change window focus at the same time, the fullscreen request fails due to the focus problem. If I run the test with "./mach talos-test --suite video" without touching the mouse, then the system focus stays on the browser window and the result is correct. I guess the try server uses a similar command, so this part should be okay.
I would like to share the scores from my Macbook Pro (i7 2.5G) with different videos and scales:

Video / Scale                        ms/frame
240p.60fps.mp4_scale_fullscreen      17.25
240p.60fps.mp4_scale_1                3.87
240p.60fps.mp4_scale_1.1              3.92
240p.60fps.mp4_scale_2                8.27
240p.120fps.mp4_scale_fullscreen     24.59
240p.120fps.mp4_scale_1               3.88
240p.120fps.mp4_scale_1.1             4.41
240p.120fps.mp4_scale_2              11.19
240p.240fps.mp4_scale_fullscreen     34.48
240p.240fps.mp4_scale_1               5.89
240p.240fps.mp4_scale_1.1             7.04
240p.240fps.mp4_scale_2              15.08
480p.60fps.mp4_scale_fullscreen      31.26
480p.60fps.mp4_scale_1               13.77
480p.60fps.mp4_scale_1.1             16.76
480p.60fps.mp4_scale_2               20.55
480p.120fps.mp4_scale_fullscreen     35.30
480p.120fps.mp4_scale_1              16.76
480p.120fps.mp4_scale_1.1            17.65
480p.120fps.mp4_scale_2              22.91
480p.240fps.mp4_scale_fullscreen     40.00
480p.240fps.mp4_scale_1              16.58
480p.240fps.mp4_scale_1.1            20.69
480p.240fps.mp4_scale_2              24.19
720p.60fps.mp4_scale_fullscreen      38.48
720p.60fps.mp4_scale_1               23.26
720p.60fps.mp4_scale_1.1             19.87
720p.60fps.mp4_scale_2               21.28
720p.120fps.mp4_scale_fullscreen     41.10
720p.120fps.mp4_scale_1              27.28
720p.120fps.mp4_scale_1.1            26.09
720p.120fps.mp4_scale_2              27.28
720p.240fps.mp4_scale_fullscreen     41.68
720p.240fps.mp4_scale_1              34.10
720p.240fps.mp4_scale_1.1            30.30
720p.240fps.mp4_scale_2              30.01
1080p.60fps.mp4_scale_fullscreen     51.72
1080p.60fps.mp4_scale_1              38.47
1080p.60fps.mp4_scale_1.1            31.58
1080p.60fps.mp4_scale_2              38.46
1080p.120fps.mp4_scale_fullscreen    65.24
1080p.120fps.mp4_scale_1             46.16
1080p.120fps.mp4_scale_1.1           44.78
1080p.120fps.mp4_scale_2             42.87
(In reply to Ethan Lin[:ethlin] from comment #39)
> I think you may see the log "Request for fullscreen was denied because
> requesting element is not in the currently focused tab" in the browser's
> console window.

I do see this message, but I didn't touch the mouse after invoking the test, and the browser is definitely focused. However, the caret is at the URL bar, which seems to be enough for it to think the content isn't focused?

If, before the test starts, I click the content (i.e. after the window has opened and been focused while the caret is at the URL bar, but before the test URL has loaded), then it does go correctly to full screen.

Not sure exactly how talos is invoked, or whether the focus is at the URL bar or not, but I think that's a bug in Firefox. Or maybe we should force the focus to the content before the test starts? I think it's possible that the focus will stay at the URL bar in production runs too.
Comment on attachment 8742202 [details] [diff] [review]
Part2. Add video clips.

Review of attachment 8742202 [details] [diff] [review]:
-----------------------------------------------------------------

Joel, these are 3 test clips, ~4.5M combined. Shall we put them into the tree? I'd really have preferred to use some external place/repo for binary test data, if we had one...
Attachment #8742202 - Flags: review?(avihpit)
Attachment #8742202 - Flags: review+
Attachment #8742202 - Flags: feedback?(jmaher)
Comment on attachment 8742202 [details] [diff] [review]
Part2. Add video clips.

Review of attachment 8742202 [details] [diff] [review]:
-----------------------------------------------------------------

I don't see an issue with adding these to the tree. I assume there are no copyright issues. My only hope is that we run this test on try for all platforms and get 20+ data points to show that it is green on all of them (looking for stability in the test), and then examine the numbers to ensure we don't have a very large distribution of data.
Attachment #8742202 - Flags: feedback?(jmaher) → feedback+
Comment on attachment 8742201 [details] [diff] [review]
Part1. Add a talos test for video.

Review of attachment 8742201 [details] [diff] [review]:
-----------------------------------------------------------------

Joel - mostly test.py and talos.json, but feel free to skim the rest too.

::: testing/talos/talos/tests/video/video_playback.html
@@ +20,5 @@
> + 'testsrc.720p.60fps.mp4',
> + 'testsrc.720p.120fps.mp4',
> + 'testsrc.720p.240fps.mp4',
> + 'testsrc.1080p.60fps.mp4',
> + 'testsrc.1080p.120fps.mp4',

You forgot to remove all the unused clips.

@@ +40,5 @@
> +
> +function fullscreen(event) {
> + if ((document.fullscreenElement && document.fullscreenElement !== null) ||
> + document.mozFullScreen) {
> + startTest();

I'm not really happy with this logic. I think it's sensitive, and I _think_ this element (and the event) is not in any spec. Maybe we should just not listen to fullscreen changes and assume they happen? (i.e. still try to enter/exit fullscreen, but don't wait for the event to confirm it). Now that we have TIMING_DELAY_MS and we're waiting anyway, I think it's OK. It would also make the code flow simpler, less error prone, and easier to follow. Worst case, we ask for fullscreen and instead get 100% size - not a terrible worst case, with the advantage that the test won't hang if the API changes. What do you think?

@@ +64,5 @@
> + }
> + } else {
> + readyToStart = true;
> + vdo.setAttribute('width', vdo.videoWidth*viewMode[viewModeIndex]);
> + vdo.setAttribute('height', vdo.videoHeight*viewMode[viewModeIndex]);

Missing spaces around the *'s.

@@ +70,5 @@
> + }
> +}
> +
> +function startTest() {
> + vdo.playbackRate = 5;

We can make the 5 a const in the configuration section as well.

@@ +83,5 @@
> +
> +function measurementEnded() {
> + vdo.pause();
> + var end = performance.now();
> + paintedFramesEnd = vdo.mozPaintedFrames;

paintedFramesEnd should be a local variable; also remove the global 'end', since it's not used anywhere.
Attachment #8742201 - Flags: review?(avihpit) → feedback?(jmaher)
Joel, thanks to ethlin's analysis, we found out that the clip doesn't go to full screen if the content doesn't have focus (which is presumably a feature). This apparently depends on how the test is invoked. If invoked from mach, it does go to full screen, but if invoked with talos --develop... the focus stays at the URL bar and it does not go full screen. However, if I manually click the content area before the test URL is loaded (during the ~7 seconds of idle before the test starts), then it does go to full screen when it starts. I think this is too sensitive to depend on, and besides, it doesn't go full screen when invoked without mach... I'm thinking maybe the page loader should focus the content before the test starts. Thoughts?
Flags: needinfo?(jmaher)
(In reply to Avi Halachmi (:avih) from comment #42)
> Joel, these are 3 test clips, ~4.5M combined. Shall we put them into the tree?
> I'd really have preferred to use some external place/repo for binary test
> data, if we had one...

Does the test look at the contents of the clips? If not, why not simplify them so that they compress to something smaller?
(In reply to Jeff Muizelaar [:jrmuizel] from comment #46)
> Does the test look at the contents of the clips? If not, why not simplify
> them so that they compress to something smaller?

These are high-fps clips, and also not short (1 minute each, because we play them at 5x speed and need some playback time before the measurement starts). We could simplify the content even down to a static image (as video), but I wanted the clips to have some real data, with each frame different from the previous one, to prevent potential pipeline optimizations that skip presenting duplicate frames.

Also, they're not huge: the 240p and 480p clips are 800-900k each, and the 1080p@60fps one is 2.5M. In general these are very reasonably small test clips with some actual content. The problem is that I'm not happy with putting them into a source code repository, but apparently we don't really have an alternative...
Comment on attachment 8742201 [details] [diff] [review]
Part1. Add a talos test for video.

Review of attachment 8742201 [details] [diff] [review]:
-----------------------------------------------------------------

I am excited that you have a patch put together for this talos test. While there are a lot of comments here, I think most of them can be addressed simply. I really would like to see many more than 1 sample collected for each measurement.

::: testing/talos/talos.json
@@ +109,1 @@
> }

To make this work we will need buildbot config changes. Also, e10s is enabled by default, so the non-e10s test needs --disable-e10s. How long does this test take? Can we add it into another job?

::: testing/talos/talos/test.py
@@ +556,5 @@
> + 'layers.acceleration.disabled': True,
> + 'layout.frame_rate': 0,
> + 'docshell.event_starvation_delay_hint': 1,
> + 'full-screen-api.warning.timeout': 500}
> + filters = filter.ignore_first.prepare(1) + filter.median.prepare()

If this is just a single data point, does that produce repeatable data? Normally we have tppagecycles=25.

::: testing/talos/talos/tests/video/video_playback.html
@@ +39,5 @@
> +}
> +
> +function fullscreen(event) {
> + if ((document.fullscreenElement && document.fullscreenElement !== null) ||
> + document.mozFullScreen) {

Where would we not have fullscreenElement or mozFullScreen? Certain platforms, graphics cards, OS configs, Firefox configs?

@@ +60,5 @@
> + if (document.body.requestFullscreen) {
> + document.body.requestFullscreen();
> + } else if (document.body.mozRequestFullScreen) {
> + document.body.mozRequestFullScreen();
> + }

nit: why would we sometimes have requestFullscreen vs mozRequestFullScreen? Also, one has a capital S - is that a typo? Is this code intended to work in different browsers?

@@ +77,5 @@
> + setTimeout(function() {
> + start = performance.now();
> + paintedFramesStart = vdo.mozPaintedFrames;
> + setTimeout(measurementEnded, MEASUREMENT_MS);
> + }, TIMING_DELAY_MS);

Do we need a TIMING_DELAY_MS for the setTimeout? I really don't like these, as they end up causing intermittent problems more often than we would like.
Attachment #8742201 - Flags: feedback?(jmaher) → feedback-
(In reply to Avi Halachmi (:avih) from comment #47)
> The problem is that I'm not happy with putting them into a source code
> repository, but apparently we don't really have an alternative...

Simplifying the content - removing the changing gradient, making the changing area much smaller, and increasing the interval between keyframes - should make a big difference, and you'll still have the properties you desire without bloating the source tree.
:avih, regarding pageloader focusing before we start, that seems valid and I created bug 1266181 to work on it.
Flags: needinfo?(jmaher)
(In reply to Avi Halachmi (:avih) from comment #44)
> You forgot to remove all the unused clips.

I will remove the unused clips.

> I'm not really happy with this logic. I think it's sensitive, and I _think_
> this element (and the event) is not in any spec. Maybe we should just not
> listen to fullscreen changes and assume they happen? (i.e. still try to
> enter/exit fullscreen, but don't wait for the event to confirm it).
> What do you think?

I referred to the sample on MDN [1]. I think it's okay to assume that it will enter/exit fullscreen.

[1] https://developer.mozilla.org/en-US/docs/Web/API/Document/mozFullScreen

> Missing spaces around the *'s.

Okay, I will add spaces here.

> We can make the 5 a const in the configuration section as well.

Okay.

> paintedFramesEnd should be a local variable; also remove the global 'end',
> since it's not used anywhere.

Okay.
(In reply to Joel Maher (:jmaher) from comment #48)
> To make this work we will need buildbot config changes. Also, e10s is
> enabled by default, so the non-e10s test needs --disable-e10s.

How do I make the buildbot config changes? Do you mean adding "talos_options": ["--disable-e10s"] here?

> How long does this test take? Can we add it into another job?

The test takes about 60 seconds. Do you mean adding the test to another test suite? I am not familiar with the categories of talos tests.

> If this is just a single data point, does that produce repeatable data?
> Normally we have tppagecycles=25.

I will set tppagecycles to 25.

> Where would we not have fullscreenElement or mozFullScreen? Certain
> platforms, graphics cards, OS configs, Firefox configs?

'mozFullScreen' works fine. I tried document.body.requestFullscreen on Firefox 45 and it is undefined. I think we can just keep mozFullScreen.

> nit: why would we sometimes have requestFullscreen vs mozRequestFullScreen?
> Also, one has a capital S - is that a typo? Is this code intended to work in
> different browsers?

Basically I referred to [1]. It's not a typo. I think requestFullscreen is the standard API for all browsers, and mozRequestFullScreen is just for Firefox.

[1] https://developer.mozilla.org/en-US/docs/Web/API/Fullscreen_API

> Do we need a TIMING_DELAY_MS for the setTimeout? I really don't like these,
> as they end up causing intermittent problems more often than we would like.

:avih suggested that I skip the first 4 seconds to avoid bad data. I can do some tests to check the difference with/without the time delay. If the results are similar, maybe we don't need the time delay.
need info in comment 52.
Flags: needinfo?(jmaher)
If the test takes 60 seconds, let's not do 25 cycles of it; that will take too long. I need to know how noisy this is; maybe 12 cycles will be enough. Also, it might take longer on our automation machines vs. a local machine. Regarding scheduling this, I think we need a new job - probably call it g4. When we have more confidence in the runtime and stability, I can handle the buildbot configs (which are all out of tree).
Flags: needinfo?(jmaher)
(In reply to Joel Maher (:jmaher) from comment #54) > if the test takes 60 seconds, lets not do 25 cycles of it... The test should take 21 seconds (4 seconds playback without measuring, 3 seconds measuring, over 3 clips). Agreed that 12 cycles would be a good start, and indeed we'll need to see how noisy it ends up.
Attached patch Part1. Add a talos test for video. (obsolete) (deleted) — Splinter Review
Addressed avih's and jmaher's comments. I kept mozRequestFullScreen and removed requestFullscreen to make the code simpler. I also compared with/without TIMING_DELAY_MS locally and the results are quite similar (37.8 vs 37.4), so I removed TIMING_DELAY_MS and kept using the fullscreen event to trigger the next action.
Attachment #8742201 - Attachment is obsolete: true
Attachment #8745267 - Flags: review?(jmaher)
Attachment #8745267 - Flags: review?(avihpit)
Attached patch Part1. Add a talos test for video. (obsolete) (deleted) — Splinter Review
I uploaded the wrong patch. Corrected it.
Attachment #8745267 - Attachment is obsolete: true
Attachment #8745267 - Flags: review?(jmaher)
Attachment #8745267 - Flags: review?(avihpit)
Attachment #8745268 - Flags: review?(jmaher)
Attachment #8745268 - Flags: review?(avihpit)
Comment on attachment 8745268 [details] [diff] [review] Part1. Add a talos test for video. Review of attachment 8745268 [details] [diff] [review]: ----------------------------------------------------------------- (In reply to Ethan Lin[:ethlin] from comment #56) >I kept mozRequestFullScreen and removed requestFullscreen to make the code simpler. Prefixed APIs go away and then the test breaks, and the non-prefixed API is already on nightly, and is the future. Please keep both APIs at all places (for fullscreen entering/exiting/event). > I also compared with/without TIMING_DELAY_MS locally and the results are quite similar (37.8 > vs 37.4). So I removed the TIMING_DELAY_MS Even if it has similar results on your system, it changes the semantics of the test from "test stable playback throughput" to "test playback startup and hopefully also stable throughput". I prefer clearer semantics, but I can live with this too. OK, keep it. > and kept using fullscreen event to trigger the next action. OK. r+ after you add the non-prefixed APIs and remove the unused 'end'. ::: testing/talos/talos/tests/video/video_playback.html @@ +6,5 @@ > +<script language="javascript" type="text/javascript"> > +const MEASUREMENT_MS = 3000; > +const PLAYBACK_RATE = 5; > +var vdo; > +var start = 0, end = 0; I think this 'end' is still not used anywhere.
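To spell out the "keep both APIs at all places" request, here is a minimal sketch of the intended feature detection - standard API first, moz-prefixed fallback (this mirrors the MDN pattern rather than the exact test code):

// Enter fullscreen: prefer the unprefixed API, fall back to the moz-prefixed one.
function enterFullscreen(elem) {
  if (elem.requestFullscreen) {
    elem.requestFullscreen();
  } else if (elem.mozRequestFullScreen) {
    elem.mozRequestFullScreen();
  }
}

// Exit fullscreen the same way.
function exitFullscreen() {
  if (document.exitFullscreen) {
    document.exitFullscreen();
  } else if (document.mozCancelFullScreen) {
    document.mozCancelFullScreen();
  }
}

// Listen for both event names; the handler checks both state properties.
document.addEventListener("fullscreenchange", fullscreen);
document.addEventListener("mozfullscreenchange", fullscreen);

function inFullscreen() {
  return !!(document.fullscreenElement || document.mozFullScreen);
}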
Attachment #8745268 - Flags: review?(avihpit) → review+
@gps, Ryan told me you (rightfully) dislike binary test data in the tree, and thought we should discuss the subject with you too. We need to host 3 video clips, 4.5M combined, for talos tests. Joel is apparently OK with storing them in the tree, but I think all of us would like a better solution. Ideas?
Flags: needinfo?(gps)
Comment on attachment 8745268 [details] [diff] [review] Part1. Add a talos test for video. Review of attachment 8745268 [details] [diff] [review]: ----------------------------------------------------------------- let me run this on try and report back here.
Attachment #8745268 - Flags: review?(jmaher) → review+
You are correct: I don't like large binaries in the tree. For reference, here are the largest files in the tree currently:

$ hg files 'set:size(">1MB")' | xargs ls -l | awk '{print $5, $9}' | sort -n -r
10426848 config/external/icu/data/icudt56l.dat
9062393 testing/talos/talos/tests/kraken/test-contents.js
8937963 js/src/jit-test/lib/mandelbrot-results.js
7873600 dom/media/webspeech/recognition/models/en-US/mixture_weights
6992020 dom/media/webspeech/recognition/models/en-US/mdef
6640394 db/sqlite3/src/sqlite3.c
6528077 security/nss/lib/sqlite/sqlite3.c
5000408 js/src/octane/mandreel.js
4731391 browser/components/translation/cld2/internal/cld2_generated_quadchrome0122_16.cc
4520440 testing/web-platform/tests/media/2048x1360-random.jpg
3245621 dom/media/webspeech/recognition/models/dict/en-US.dic
2925991 js/src/tests/ecma_5/String/string-upper-lower-mapping.js
2757913 testing/web-platform/tests/media/movie_300.mp4
2344665 testing/web-platform/tests/media/movie_300.ogv
2279914 js/src/tests/ecma_6/String/normalize-generateddata-input.js
2195930 testing/web-platform/tests/domxpath/xml_xpath_tests.xml

4.5MB of video files isn't obscene and I won't put up large objections. It is desirable to avoid putting large files in the tree if they aren't actually needed by most people. Think of the casual contributor that now must download another 4.5MB for tests they will never run. (This is really a deficiency in version control tools not allowing cloning of specific directories - something Google is implementing for Mercurial at https://bitbucket.org/Google/narrowhg, so this concern goes away in the long term.)

The standard Mozilla answer for where to put random blobs is tooltool (https://wiki.mozilla.org/ReleaseEngineering/Applications/Tooltool). That's basically a glorified frontend for Amazon S3, which is highly available. The client tools require hash pinning, so you have some tampering resistance. There is even a caching layer in the client. Anyone at Mozilla can get access credentials to upload things to tooltool. And automation machines have access to tooltool servers and can download things no problem.

If this is only going to be a one-time 4.5MB addition, I'm fine with it. But if you will be adding more media or replacing existing media on an ongoing basis, I'd recommend using tooltool.
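For reference, a tooltool manifest is essentially a JSON list of hash-pinned entries which automation fetches with the tooltool client. A rough sketch of what a manifest for these clips might look like (the sizes and digests below are placeholders, not the real values):

[
  {
    "filename": "testsrc.1080p.60fps.mp4",
    "size": 2500000,
    "algorithm": "sha512",
    "digest": "<sha512 of the file>"
  },
  {
    "filename": "testsrc.480p.60fps.webm",
    "size": 600000,
    "algorithm": "sha512",
    "digest": "<sha512 of the file>"
  }
]

The fetch side is then roughly `python tooltool.py fetch -m video-clips.manifest` (manifest name made up here).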
Flags: needinfo?(gps)
hmm, this is failing quite often: https://treeherder.mozilla.org/#/jobs?repo=try&revision=ee73cd59c4ecb81613927165b03421d25a9a6f35 we need to see this green prior to landing.
(In reply to Avi Halachmi (:avih) from comment #55) > (In reply to Joel Maher (:jmaher) from comment #54) > > if the test takes 60 seconds, lets not do 25 cycles of it... > > The test should take 21 seconds (4 seconds playback without measuring, 3 > seconds measuring, over 3 clips). That was a lie, actually: since each clip runs 4 times, it would be 84 seconds (and that was before ethlin removed the initial wait).

(In reply to Joel Maher (:jmaher) from comment #63) > hmm, this is failing quite often: > https://treeherder.mozilla.org/#/ > jobs?repo=try&revision=ee73cd59c4ecb81613927165b03421d25a9a6f35 > > we need to see this green prior to landing. Indeed, I suspected it's the fullscreen/focus issue, so Joel re-pushed with a patch which focuses the content before the test starts, and it became much better: https://treeherder.mozilla.org/#/jobs?repo=try&revision=b218e192d084092c13dce7e948fdf3558dab2eaf Only two failures, on Linux, and they _seem_ unrelated to this test specifically. This means the issue is not limited to how I invoke talos locally, and it needs to be addressed, probably on the talos side with a patch similar to the one we used - either applied only to this specific test, or applied always after making sure other tests are unaffected.

The numbers look quite stable and representative too. E.g. from one of the runs (win7, e10s):

> [#0] 240p.120fps.mp4_scale_fullscreen Cycles:12 Average:10.31 Median:10.33 stddev:0.06 (0.6%) stddev-sans-first:0.06
> Values: 10.4 10.2 10.3 10.2 10.3 10.4 10.4 10.3 10.2 10.4 10.3 10.3
>
> [#1] 240p.120fps.mp4_scale_1 Cycles:12 Average:1.67 Median:1.67 stddev:0.00 (0.1%) stddev-sans-first:0.00
> Values: 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7
>
> [#2] 240p.120fps.mp4_scale_1.1 Cycles:12 Average:1.67 Median:1.67 stddev:0.00 (0.0%) stddev-sans-first:0.00
> Values: 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7 1.7
>
> [#3] 240p.120fps.mp4_scale_2 Cycles:12 Average:2.03 Median:2.03 stddev:0.01 (0.4%) stddev-sans-first:0.01
> Values: 2.0 2.0 2.0 2.0 2.0 2.0 2.0 2.0 2.0 2.0 2.0 2.0
>
> [#4] 480p.60fps.mp4_scale_fullscreen Cycles:12 Average:11.05 Median:11.03 stddev:0.07 (0.6%) stddev-sans-first:0.07
> Values: 11.1 11.0 10.9 11.0 11.0 11.1 11.1 11.0 11.0 11.0 11.0 11.2
>
> [#5] 480p.60fps.mp4_scale_1 Cycles:12 Average:3.36 Median:3.36 stddev:0.00 (0.1%) stddev-sans-first:0.00
> Values: 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4
>
> [#6] 480p.60fps.mp4_scale_1.1 Cycles:12 Average:3.37 Median:3.37 stddev:0.01 (0.2%) stddev-sans-first:0.01
> Values: 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4 3.4
>
> [#7] 480p.60fps.mp4_scale_2 Cycles:12 Average:4.58 Median:4.59 stddev:0.03 (0.8%) stddev-sans-first:0.04
> Values: 4.6 4.6 4.6 4.6 4.5 4.5 4.6 4.6 4.6 4.6 4.6 4.6
>
> [#8] 1080p.60fps.mp4_scale_fullscreen Cycles:12 Average:29.28 Median:29.28 stddev:0.26 (0.9%) stddev-sans-first:0.23
> Values: 28.8 29.1 29.1 29.7 29.4 29.4 29.1 29.7 29.1 29.1 29.4 29.1
>
> [#9] 1080p.60fps.mp4_scale_1 Cycles:12 Average:13.12 Median:13.16 stddev:0.10 (0.8%) stddev-sans-first:0.11
> Values: 13.0 12.9 13.2 13.2 13.2 13.2 13.2 13.2 12.9 13.0 13.2 13.2
>
> [#10] 1080p.60fps.mp4_scale_1.1 Cycles:12 Average:15.42 Median:15.43 stddev:0.18 (1.2%) stddev-sans-first:0.18
> Values: 15.2 15.2 15.5 15.4 15.4 15.5 15.8 15.3 15.2 15.6 15.3 15.5
>
> [#11] 1080p.60fps.mp4_scale_2 Cycles:12 Average:15.21 Median:15.23 stddev:0.16 (1.0%) stddev-sans-first:0.16
> Values: 15.1 15.2 15.5 15.4 15.1 15.2 15.3 15.0 15.2 15.2 15.4 15.0

One thing to note here is that the fullscreen results look relatively bad. It's hard to judge whether that's due to the removal of the initial wait. While it's valuable to measure the start of playback, I would feel better if we also had a measurement which starts some time after playback begins. Maybe we should take two measurements: the first like we have now (3 seconds right from the start) and another one of, say, 2 seconds right afterwards. So the timeline would look like this: t0 > 3 seconds playback > t1 > 2 seconds playback > t2, where we measure but don't stop the playback at t1, and then at t2 add another measurement relative to t1. This would hopefully keep the second measurement clean of the initial fullscreen transition effect, and we'd measure both in a single playback.
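A sketch of how that two-window measurement could look in the test page, using mozPaintedFrames (names roughly follow the test; the reporting call at the end is a placeholder):

// t0: playback (and the fullscreen transition) just started.
var start1 = performance.now();
var paintedFrames1 = vdo.mozPaintedFrames;

setTimeout(function() {
  // t1: end of the 3s "startup" window; keep playing.
  var start2 = performance.now();
  var paintedFrames2 = vdo.mozPaintedFrames;

  setTimeout(function() {
    // t2: end of the 2s "in-clip" window.
    var end = performance.now();
    var paintedFrames3 = vdo.mozPaintedFrames;

    // Average ms per painted frame for each window.
    var startupMsPerFrame = (start2 - start1) / (paintedFrames2 - paintedFrames1);
    var inclipMsPerFrame = (end - start2) / (paintedFrames3 - paintedFrames2);
    reportResult(startupMsPerFrame, inclipMsPerFrame); // placeholder for the talos reporting glue
  }, 2000);
}, 3000);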
Depends on: 1266181
this is pretty stable: https://treeherder.mozilla.org/perf.html#/compare?originalProject=try&originalRevision=b218e192d084&newProject=try&newRevision=fdc54e6c75321a797b3acbfa057c2c60a3576e5a&framework=1&showOnlyImportant=0 ^ note, the linux data is mixed with some VM's at amazon, I have manually verified just the hardware machines have very little noise On top of that WindowsXP doesn't seem to be working at all. Is this not expected to run there?
I think that is also a fullscreen problem. I will try to reproduce it locally to see if I can fix it. Or we could just skip the XP platform? avih, what do you think?
We should probably at least try to understand what the issue is with XP, and then decide. We're still supporting XP.. for now.
Attached patch Part1. Add a talos test for video. (obsolete) (deleted) — Splinter Review
Addressed avih's comments. This patch doesn't fix the XP problem; I will look into that after I have an XP environment. The try server link is here: https://treeherder.mozilla.org/#/jobs?repo=try&revision=69d58e6e5d53&selectedJob=20086452 The results can be seen in the log. Please ignore the failures; I have already fixed them in this patch.
Attachment #8745268 - Attachment is obsolete: true
Attachment #8746590 - Flags: review?(avihpit)
Comment on attachment 8746590 [details] [diff] [review] Part1. Add a talos test for video. Review of attachment 8746590 [details] [diff] [review]: ----------------------------------------------------------------- Looks good. Thanks for the patience :) So just the test name fix, and figure out what's wrong on XP. I'll wait with bug 1266181 until we understand whether or not the current one line focus patch is enough also for XP. ::: testing/talos/talos/tests/video/video_playback.html @@ +81,5 @@ > + var timePerFrame1 = (start2 - start1) / (paintedFramesStart2 - paintedFramesStart1); > + var timePerFrame2 = (end - start2) / (paintedFramesEnd - paintedFramesStart2); > + testResult.names.push('_startup_' + test[testIndex] + '_scale_' + viewMode[viewModeIndex]); > + testResult.values.push(timePerFrame1); > + testResult.names.push('_inclip_' + test[testIndex] + '_scale_' + viewMode[viewModeIndex]); Apologies, I probably wasn't accurate enough when we talked on IRC. I meant to put "_startup" and "_inclip" at the end of the name.
Attachment #8746590 - Flags: review?(avihpit) → review+
Just adding some hard-to-find info about running talos on Windows XP.
1. Start with XP SP3.
2. Install all dotnet versions (up to 4) and PowerShell (via Windows Update). Some might be redundant but I didn't find out which. Same goes for DirectX (via the web installer).
3. Use mozilla-build 2.1.0 which has python 2.7.10. (mozilla-build 2.2.0 has python 2.7.11, which I couldn't make work for talos on Windows 8, and I didn't try it on XP.)
4. In talos/requirements.txt replace psutil>=3.1.1 with psutil==3.4.2 . Seemingly the newer psutil packages (4.x) were built in a way which is incompatible with XP. 3.4.2 is the latest 3.x and seems to work on XP.
5. If there, delete the following dirs inside the talos dir: Lib, Include, Scripts, Talos.egg-info .
6. Run `python INSTALL.py` and hope for the best, or goto 5.
7. `source Scripts/activate` and then `talos --help`
Good luck.
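Condensed into commands, steps 4-7 look roughly like this (assuming the requirements line reads exactly psutil>=3.1.1 and you're inside the talos directory in a mozilla-build shell):

sed -i 's/psutil>=3.1.1/psutil==3.4.2/' requirements.txt   # step 4
rm -rf Lib Include Scripts Talos.egg-info                  # step 5
python INSTALL.py                                          # step 6
source Scripts/activate                                    # step 7
talos --help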
(In reply to Avi Halachmi (:avih) from comment #70) > Just adding some hard-to-find info about running talos on Windows XP. > Thanks. I can run the talos test on XP now. I had to install all Windows updates first, even though the system already had SP3. The test hits this error on XP: HTTP “Content-Type” of “video/mp4” is not supported. Load of media resource http://localhost:3233/tests/video/clips/testsrc.240p.120fps.mp4 failed. I found that the talos test runs the browser with a generated profile, and that profile doesn't have gmp-eme-adobe. But if I run Nightly normally, my default profile does have gmp-eme-adobe. I'm not sure how talos generates the profile. Maybe it's related to bug 1234100.
great stuff :ethlin! Here is some info about the profile used in talos (it is a fresh generated profile for each test): * preference we specify globally: https://dxr.mozilla.org/mozilla-central/source/testing/talos/talos/config.py#61 * code that builds the profile: https://dxr.mozilla.org/mozilla-central/source/testing/talos/talos/ffsetup.py#80 * here is an exception where we disable hwacceleration in e10s mode for winxp only: https://dxr.mozilla.org/mozilla-central/source/testing/talos/talos/ttest.py#65 * and a few other test specific exceptions in winxp mode: https://dxr.mozilla.org/mozilla-central/search?q=path%3Atesting%2Ftalos%2Ftalos+win_%3B&redirect=false&case=true * the code that builds the profile (mozprofile): https://dxr.mozilla.org/mozilla-central/search?q=path%3Atesting%2Fmozbase%2Fmozprofile%2Fmozprofile&redirect=true&case=true (probably not related) why would the gmp-eme-adobe preference affect winxp only? We could change the value for winxp only.
(In reply to Joel Maher (:jmaher) from comment #72) > great stuff :ethlin! Here is some info about the profile used in talos (it > is a fresh generated profile for each test): > * preference we specify globally: > https://dxr.mozilla.org/mozilla-central/source/testing/talos/talos/config. > py#61 > * code that builds the profile: > https://dxr.mozilla.org/mozilla-central/source/testing/talos/talos/ffsetup. > py#80 > * here is an exception where we disable hwacceleration in e10s mode for > winxp only: > https://dxr.mozilla.org/mozilla-central/source/testing/talos/talos/ttest. > py#65 > * and a few other test specific exceptions in winxp mode: > https://dxr.mozilla.org/mozilla-central/ > search?q=path%3Atesting%2Ftalos%2Ftalos+win_%3B&redirect=false&case=true > * the code that builds the profile (mozprofile): > https://dxr.mozilla.org/mozilla-central/ > search?q=path%3Atesting%2Fmozbase%2Fmozprofile%2Fmozprofile&redirect=true&cas > e=true (probably not related) > > why would the gmp-eme-adobe preference affect winxp only? We could change > the value for winxp only. I should have said plugin (there is no plugin, so there are no prefs). The profile from talos doesn't have the 'Primetime Content Decryption Module provided by Adobe Systems, Incorporated' plugin. I think XP doesn't include the platform decoder, so it needs the plugin to do the decoding. Per bug 1234100, Firefox 46 or later will download the plugin automatically on XP. But I think the talos test environment cannot connect to the internet, so the test will fail on XP. We can either bundle the plugin with this video test or skip the XP platform for this test. jmaher, what do you think?
Right, so we have a real problem. XP cannot play these test clips without a plugin, and it's unrealistic to let the test download the plugin, and probably we also don't want to put the plugin as part of the profile. We could change the test clips to mpeg2 or mpeg1 (need to check whatever is supported on XP in Firefox out of the box), but these are relatively outdated formats, or at least not formats which are common on today's web. They would not be good test clips in general (e.g. if we'd want to use them in future tests for decoding speed etc). Maybe the solution would be to add a single clip in a format which XP can play, and the test on XP will use only this clip?
Technically we have internet access, but we turn off the auto-downloading of things with preferences. A concern about bringing the plugin in-house and installing it ourselves is that whenever it is updated we would have to keep updating it by hand. I am fine with ignoring this on WinXP, as long as that is a final decision (final as in I don't want to be redoing the scheduling work next month). I am curious how we run unittests for this, specifically on Windows XP?
OK, if we are not running unittests, it concerns me that talos would be the only automated test we run to validate this. What :avih mentioned about a video that will play by default on XP sounds interesting - could we add a smaller clip that is tested on all platforms, but have WinXP test only that single clip?
Using VP9 will work on all platforms.
Addressed avih's comments.
Attachment #8746590 - Attachment is obsolete: true
I created an Ogg clip, and the XP platform runs only this clip. Other platforms run this clip plus the original mp4 clips. Here is the try server result: https://treeherder.mozilla.org/#/jobs?repo=try&revision=7bfb8e70ccc7&filter-searchStr=talos&selectedJob=20522592
Attached patch Part3. Add test for XP (obsolete) (deleted) — Splinter Review
Add an ogg clip with 240p@120fps for XP platform.
Attachment #8750083 - Flags: review?(jmaher)
Attachment #8750083 - Flags: review?(avihpit)
We discussed this on IRC. ethlin will try the following:
- Instead of adding a 240p clip: replace the 480p mp4 clip with an XP-compatible one.
- Instead of Ogg: use WebM (with VP9) - it's more modern, assuming it works on XP.
He's currently checking whether the output of this works on XP (it came out as VP9 and, interestingly, at almost half the size of the mp4 clip, with good visual quality):
> ffmpeg -f lavfi -i testsrc=rate=60:size=640x480:decimals=3 -t 60 -pix_fmt yuv420p testsrc.480p.60fps.webm
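The mp4 clips were presumably generated with analogous ffmpeg commands, something like the following (the exact resolutions and encoder flags used for them aren't recorded in this bug, so treat these as a guess):

ffmpeg -f lavfi -i testsrc=rate=120:size=320x240:decimals=3 -t 60 -pix_fmt yuv420p testsrc.240p.120fps.mp4
ffmpeg -f lavfi -i testsrc=rate=60:size=1920x1080:decimals=3 -t 60 -pix_fmt yuv420p testsrc.1080p.60fps.mp4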
Attached patch Part1. Add a talos test for video. (obsolete) (deleted) — Splinter Review
Replaced the 480p clip with a WebM version. Windows XP will run only this video clip.
Attachment #8750080 - Attachment is obsolete: true
Attachment #8750083 - Attachment is obsolete: true
Attachment #8750083 - Flags: review?(jmaher)
Attachment #8750083 - Flags: review?(avihpit)
Attached patch Part2. Add video clips. (obsolete) (deleted) — Splinter Review
Use these 3 clips as the test source: testsrc.240p.120fps.mp4 testsrc.480p.60fps.webm testsrc.1080p.60fps.mp4
Attachment #8742202 - Attachment is obsolete: true
Attachment #8750135 - Flags: review?(jmaher)
Attachment #8750135 - Flags: review?(avihpit)
Attachment #8750136 - Flags: review?(jmaher)
Attachment #8750136 - Flags: review?(avihpit)
Attachment #8750136 - Flags: review?(jmaher) → review+
Comment on attachment 8750135 [details] [diff] [review] Part1. Add a talos test for video. Review of attachment 8750135 [details] [diff] [review]: ----------------------------------------------------------------- Just one small nit. Also, in the commit message, add an 's' to 'talos'. ::: testing/talos/talos/tests/video/video_playback.html @@ +135,5 @@ > + > +</head> > +<body id="body" onload="init();"> > +<video id="vdo" width="100%" height="100%"> > + Your browser does not support HTML5 video. Is this message really useful?
Attachment #8750135 - Flags: review?(jmaher) → review+
Attachment #8750135 - Flags: review?(avihpit) → review+
Comment on attachment 8750136 [details] [diff] [review] Part2. Add video clips. Review of attachment 8750136 [details] [diff] [review]: ----------------------------------------------------------------- So, this is weird. I applied your patch and it seems the new webm clip is ~1.5M. When I generated it on two systems with 3 different versions of ffmpeg, it always ended up at less than 600k for the exact same command. Because we really care about binary file sizes, I've uploaded the clip which I generated (with ffmpeg git-3395ad4, which I built a few days ago, same command as in comment 82), here: https://people.mozilla.org/~ahalachmi/share/test-clips/testsrc.480p.60fps.webm . FWIW, when encoding, it showed "[libvpx-vp9 @ 00e2e180] v1.5.0". I asked ethlin to do another try push with this smaller clip, and if everything is still fine, to use this instead of the one he generated.
(In reply to Joel Maher (:jmaher) from comment #86) > Comment on attachment 8750135 [details] [diff] [review] > Part1. Add a talos test for video. > > Review of attachment 8750135 [details] [diff] [review]: > ----------------------------------------------------------------- > > just one small nit. also the commit message add an 's' to talos Okay. > > ::: testing/talos/talos/tests/video/video_playback.html > @@ +135,5 @@ > > + > > +</head> > > +<body id="body" onload="init();"> > > +<video id="vdo" width="100%" height="100%"> > > + Your browser does not support HTML5 video. > > is this message really useful? Actually nope. I'll remove it.
Addressed :jmaher's comment.
Attachment #8750135 - Attachment is obsolete: true
Update the clip with the smaller one.
Attachment #8750136 - Attachment is obsolete: true
Attachment #8750136 - Flags: review?(avihpit)
Attachment #8750324 - Flags: review?(avihpit)
:jmaher, the test category is g4 now, but I used g1 for the try push. Is there anything I need to take care of before setting checkin-needed?
Flags: needinfo?(jmaher)
When these land as g4, I will have to do the scheduling in buildbot-configs; when I see them land, I will push that through.
Flags: needinfo?(jmaher)
Comment on attachment 8750324 [details] [diff] [review] Part2. Add video clips. (carry r+: jmaher) Review of attachment 8750324 [details] [diff] [review]: ----------------------------------------------------------------- XP results (and the rest too) look good. Bug 1266181 still hasn't landed and we need it for this test. Its results seem for the most part good (no change in numbers from focusing the content on startup), but a few tests show some diff, some of them not even using the pageloader, which is quite weird. I'll discuss this with jmaher and might update the patch to focus the content conditionally instead.
Attachment #8750324 - Flags: review?(avihpit) → review+
Depends on: 1274992
Attached patch bug1254898.talos-powers-focus.patch (obsolete) (deleted) — Splinter Review
So, bug 1266181 didn't work out too well. Making the pageloader addon focus only the video test proved too troublesome, and making all pageloader tests focus the content affected the talos numbers with other tests. I ended up filing and posting patches to bug 1274992 which a) reverts the global pageloader focus, and instead b) adds focus-on-demand functionality to talos-powers (an addon which is loaded automatically on all talos tests). This means that the test itself (e.g video_playback.html here) can now request to focus the content via talos-powers. @ethanlin: after bug 1274992 lands, please apply the attached patch on top of your existing one, and then do a try push to make sure everything still works, and if it does, hopefully we can land this soon.
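On the test side the hook is small; a rough sketch of the idea, with placeholder event names rather than the actual talos-powers interface from bug 1274992:

// Placeholder names: ask the talos-powers add-on to focus the content window,
// then continue once it confirms. See the bug 1274992 patch for the real interface.
function requestContentFocus(done) {
  addEventListener("TalosPowersContentFocused", function handler() {
    removeEventListener("TalosPowersContentFocused", handler);
    done();
  });
  dispatchEvent(new CustomEvent("TalosPowersContentFocus", { bubbles: true }));
}

requestContentFocus(function() {
  // ...then enter fullscreen and start the measurement as before.
});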
Flags: needinfo?(tdaede)
(In reply to Avi Halachmi (:avih) from comment #95) > Created attachment 8755556 [details] [diff] [review] > bug1254898.talos-powers-focus.patch > > So, bug 1266181 didn't work out too well. Making the pageloader addon focus > only the video test proved too troublesome, and making all pageloader tests > focus the content affected the talos numbers with other tests. > > I ended up filing and posting patches to bug 1274992 which a) reverts the > global pageloader focus, and instead b) adds focus-on-demand functionality > to talos-powers (an addon which is loaded automatically on all talos tests). > > This means that the test itself (e.g video_playback.html here) can now > request to focus the content via talos-powers. > > @ethanlin: after bug 1274992 lands, please apply the attached patch on top > of your existing one, and then do a try push to make sure everything still > works, and if it does, hopefully we can land this soon. Thanks. I will try the patch after bug 1274992 lands.
I applied the patch and it looks good on the try server.
Attachment #8755556 - Attachment is obsolete: true
Attachment #8757743 - Flags: review?(avihpit)
Comment on attachment 8757743 [details] [diff] [review] Part3. Focus the browser for fullscreen mode. Review of attachment 8757743 [details] [diff] [review]: ----------------------------------------------------------------- Thanks.
Attachment #8757743 - Flags: review?(avihpit) → review+
Joel, what's next here? would "the system" automatically "take" the new g4 job? or is anything else needed?
Flags: needinfo?(jmaher)
please land the changes, I need a definition in talos.json for 'g4' prior to enabling the job :)
Flags: needinfo?(jmaher)
(In reply to Joel Maher (:jmaher) from comment #101) > please land the changes, I need a definition in talos.json for 'g4' prior to > enabling the job :) Ethan? :)
Flags: needinfo?(ethlin)
(In reply to Avi Halachmi (:avih) from comment #102) > (In reply to Joel Maher (:jmaher) from comment #101) > > please land the changes, I need a definition in talos.json for 'g4' prior to > > enabling the job :) > > Ethan? :) Okay, I'm going to set the checkin-needed flag.
Flags: needinfo?(ethlin)
Keywords: checkin-needed
Pushed by cbook@mozilla.com: https://hg.mozilla.org/integration/mozilla-inbound/rev/b863ad0b2ba8 Part 1. Add talos test for video playback with basic compositor. r=avih,jmaher https://hg.mozilla.org/integration/mozilla-inbound/rev/54d7a042b8b7 Part 2. Add video clips for talos video test. r=avih,jmaher https://hg.mozilla.org/integration/mozilla-inbound/rev/84f7d596ce22 Part 3. Focus browser when test start for fullscreen mode. r=avih
Keywords: checkin-needed
Reopening since the test doesn't run yet on automation (it should have been kept open). Joel still needs to do some work on it. We'll block this bug with the automation bugs which Joel will file, and RESOLVE this once the new test is actually running together with the other talos tests.
Status: RESOLVED → REOPENED
Resolution: FIXED → ---

The bug assignee is inactive on Bugzilla, so the assignee is being reset.

Assignee: demo99 → nobody
Severity: normal → S3

Basic Compositing was removed in bug 1727876.

Status: REOPENED → RESOLVED
Closed: 8 years ago2 years ago
Resolution: --- → WONTFIX