Closed
Bug 742086
Opened 12 years ago
Closed 12 years ago
Need a more comprehensive "suite" of tests for testing checkerboarding performance with Eideticker
Categories
(Testing Graveyard :: Eideticker, defect)
Tracking
(Not tracked)
RESOLVED
FIXED
People
(Reporter: wlach, Assigned: wlach)
References
Details
(Whiteboard: [eideticker:p1])
Right now the Eideticker dashboard uses taskjs.org to measure the amount of checkerboarding during a pan, and nightly.mozilla.org for testing a zoom. taskjs.org isn't necessarily a good benchmark, or at least not a good benchmark to use on its own: the reason it's slow (use of gradient fills in the background) is not the same as the reasons other sites are slow. I don't really know about nightly, but chances are it's not totally representative either.

We should add some panning/zooming tests that represent real-world browsing experiences. Right now I am thinking:

* cnn.com
* nytimes.com
* An interesting web page on wikipedia.org (the full version)

Unfortunately the former two are copyrighted, which makes importing them into the Eideticker repository annoying. :( I would really prefer to keep everything there for simplicity's sake. One option would be to synthesize a test which mimics their essential properties without using copyrighted material. Other ideas welcome.
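As a rough illustration of the "synthesize a test" idea, a page with news-site-like structure (long runs of text broken up by section headings) could be generated procedurally. Everything below is hypothetical: the function name, the layout parameters, and the ten-paragraphs-per-section rhythm are illustrative choices, not anything Eideticker actually does.

```python
import random


def synthetic_page(paragraphs=200, words_per_paragraph=60, seed=0):
    """Generate a long, text-heavy HTML page that mimics the *structure*
    of a news site (headings and paragraphs) without using any
    copyrighted content. All layout choices here are illustrative."""
    rng = random.Random(seed)  # deterministic so repeated runs render identically
    vocab = ["lorem", "ipsum", "dolor", "sit", "amet", "consectetur"]
    parts = ["<html><body>"]
    for i in range(paragraphs):
        if i % 10 == 0:
            # A heading every ten paragraphs, like article sections.
            parts.append("<h2>Section %d</h2>" % (i // 10))
        words = " ".join(rng.choice(vocab) for _ in range(words_per_paragraph))
        parts.append("<p>%s</p>" % words)
    parts.append("</body></html>")
    return "\n".join(parts)
```

A fixed seed keeps the page byte-identical between runs, which matters for a benchmark: any frame-by-frame comparison should reflect rendering changes, not content changes.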
Updated•12 years ago
Assignee: nobody → wlachance
Comment 1•12 years ago
After talking to jmaher, I think we should probably create this suite in a generic fashion that would also be usable by Talos and friends. This isn't really an Eideticker-specific bug (though the work will probably be used by Eideticker first).
Comment 2•12 years ago
I vote we specifically avoid timecube.com, even if we use it for internal testing. We should avoid testing on any page that hits a known slow path of cairo, such as very heavy gradients (like you said). We don't want our main Eideticker test to benchmark cairo slow-path performance, but rather something that better represents "typical webpages". Some candidates:

* planet.mozilla.org <-- complex, lots of text, some images. IANAL, but copyright should be okay to cache?
* a reddit comment page <-- long, lots of text
* http://en.wikipedia.org/wiki/Facebook <-- apparently a popular page according to a random source, public domain
Comment 4•12 years ago
I think we need to use timecube, specifically for checkerboarding tests, since our competition doesn't appear to checkerboard when using this page.
Comment 5•12 years ago
timecube is a reflection of skia vs. cairo and not a reflection of how good we are in general. I don't want a benchmark that isn't representative of typical web pages and where the work to improve it means working towards a skia port.

(In reply to Bob Moss :bmoss from comment #4)
> Specifically for checkerboarding tests since our competition doesn't appear to checkerboard when using this page.

That argument can be made for many web pages, especially if we're testing against browser versions that don't have async panning.
Comment 6•12 years ago
I don't understand the objection. Does this mean it doesn't matter how we perform on this page relative to our competition? I don't think users care why it checkerboards; if checkerboarding is bad, then we need to find a way to make it go away.
Comment 7•12 years ago
That page does not represent the typical web performance our users will see. So no, we should not base our general benchmark on that page. We should still track issues with it, but not benchmark against it.
Comment 8•12 years ago
I think I see the problem. What do you mean by "general benchmark"? My purpose for this test is to track issues with it.
Comment 10•12 years ago
Let's do planet.mozilla.org at a relatively high priority as well.
Comment 11•12 years ago
(In reply to JP Rosevear [:jpr] from comment #10)
> Let's do planet.mozilla.org at a relatively high priority as well.

Mirroring planet.mozilla.org is actually a bit tricky because of all the external images/assets in it. I'll try to make it happen, though.
Comment 12•12 years ago
We've done this before with our tp pageset. Let's find out how they did it. Maybe joe knows?
Comment 13•12 years ago
Alice pulls the Alexa top500 pages, culls it somewhat, and makes them work offline. She's off right now, though, so I'll CC Lukas, who at one point knew about Talos things and will likely know who to ask if she doesn't know.
Comment 14•12 years ago
I'm adding on Armen, since he helped implement tp5 and is probably more up to date on how we can deploy new talos suites than my crusty knowledge from the past.
Comment 15•12 years ago
The knowledge of creating a pageset is, afaik, in Alice's head, unless it is documented somewhere in the A-team's documentation pages. I will be gone until Wednesday the 25th and don't have any knowledge of how the pageset can be created.
Comment 16•12 years ago
Easiest thing is probably just to modify the existing tp5 pages with the extra metadata eideticker needs. Going to try that route next.
Comment 17•12 years ago
So I had a look at the tp5 pageset. I really should have done that a long time ago, as it would have saved me some effort in cleaning up/modifying the nytimes and cnn.com pages. Anyway, there's lots of good stuff in there. No planet.mozilla.org, though I imagine there's something in there with similar behaviour. It should be very straightforward to modify most/all of these to work with Eideticker.

Here's the manifest (it should be fairly easy to figure out which pages are which from it): http://people.mozilla.com/~wlachance/tp5.manifest

What should we prioritize? My thought is that wikipedia, along with maybe a few more news sites, would be the most useful.
Comment 18•12 years ago
My vote is for:

* reddit.com/www.reddit.com/index.html
* imgur.com/imgur.com/gallery/index.html
* cnn.com/www.cnn.com/index.html
* bing.com/www.bing.com (a more interesting test case than google.com)
* facebook.com (would be nice to get a real page instead of the log-in, but not worth spending much time on this)
* wikipedia.com
Comment 19•12 years ago
Added tests for reddit, imgur, and wikipedia: https://github.com/mozilla/eideticker/commit/20627a515f69666fed886bd67fb284d4fd3f6e4f

Having some issues with imgur: there seem to be problems downloading all the images. The solution is probably to move mozhttpd into its own process. For now, you can see reddit and wikipedia here:

http://wrla.ch/eideticker/dashboard/#/reddit/checkerboard
http://wrla.ch/eideticker/dashboard/#/wikipedia/checkerboard
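Setting mozhttpd's exact API aside, the "own process" idea can be sketched with the Python standard library: serve the captured pageset from a separate process so slow or blocking asset downloads never stall the harness process itself. The function names and port below are hypothetical; this is the shape of the fix, not Eideticker's implementation.

```python
import multiprocessing
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler


def _serve(port, docroot):
    # Runs in the child process: serve the captured pageset forever.
    handler = partial(SimpleHTTPRequestHandler, directory=docroot)
    HTTPServer(("127.0.0.1", port), handler).serve_forever()


def start_pageset_server(port, docroot):
    """Start the pageset HTTP server in its own process, so the test
    harness can keep driving the browser while assets download."""
    proc = multiprocessing.Process(target=_serve, args=(port, docroot),
                                   daemon=True)
    proc.start()
    return proc  # caller can proc.terminate() when the test is done
```

Marking the child as a daemon means the server dies with the harness even if a test crashes before cleanup runs.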
Comment 20•12 years ago
Filed dependent bug 747937 to deal with the image latency issue with imgur.
Updated•12 years ago
Whiteboard: [eideticker:p1]
Comment 21•12 years ago
Finally added imgur: http://wrla.ch/eideticker/dashboard/#/imgur/checkerboard
Blocks: mobile-automation
Comment 22•12 years ago
This bug seems a bit ambiguous to me. If there's a particular site we want to test with Eideticker on a regular basis, IMO we should file a separate bug for it. If no one objects, I'm going to close this bug soon. (Setting needinfo on myself as a reminder to do this.)
Flags: needinfo?(wlachance)
Comment 23•12 years ago
Closing. Please feel free to file specific bugs if there's something you want to see!
Status: NEW → RESOLVED
Closed: 12 years ago
Flags: needinfo?(wlachance)
Resolution: --- → FIXED
Updated•7 years ago
Product: Testing → Testing Graveyard