Bug 767064 (Closed): WebGL should prefer 565 (or native) on mobile for alpha-less contexts
Opened 12 years ago; closed 12 years ago
Categories: Core :: Graphics: CanvasWebGL, defect
Status: RESOLVED FIXED
Target Milestone: mozilla17
Reporter: vlad; Assignee: vlad
References: Blocks 1 open bug
Whiteboard: [games:p1]
Attachments (3 files, 1 obsolete file)
- patch (deleted): jgilbert: review+, bjacob: review+
- patch (deleted)
- patch (deleted): bjacob: review+
See bug 728524 comment 75 and 76. Currently, we end up creating an 8888 context or an 8880 one for WebGL, even on mobile. At best, this is costing us 2x memory bandwidth; at worst, we're hitting format conversions (e.g. rendering 888 -> 565) and other badness along the way.
This seems to have been caused by bug 739775. ChooseGLFormats ignores 565 unless OES_rgb8_rgba8 isn't supported, and even then it'll *still* choose 888/UNSIGNED_BYTE -- rbColor is only used for creating a MSAA renderbuffer (I'm actually confused how we create a color renderbuffer when samples = 0 and aNeedsReadBuffer = false).
I'll try to create a patch shortly.
Comment 1 • 12 years ago
Since alpha is the default, any alphaless-only solution will only affect a minority of content. Would it be acceptable to default to 4444 for contexts with alpha?
Comment 2 • 12 years ago
In fact, DITHER is enabled by default (GLES 2.0.25 page 145, table 6.11) so 4444 should give acceptable quality by default.
Comment 3 (Assignee) • 12 years ago
Two bits here:
One, it reworks GLContext::ChooseGLFormats so that it takes into account the requested bit depth. If we're on ES2 hardware and the number of color bits is <= 16 or we don't support rgb8/rgba8, we'll do 4444/565.
Two, it changes the webgl format logic to request 4444/565 if the platform screen depth is 16.
Note that on the clear test (with postMessage), 4444 is a *tiny* bit slower than 565 (like 1ms, if that). Likely due to extra compositing needed when blitting the texture to the output.
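As a rough illustration, the shape of that ChooseGLFormats change looks like the sketch below. This is not the patch itself: it is written as a free function, and the GLFormats/ContextFormat field and constant names (texColor, texColorType, rbColor, colorBits(), LOCAL_GL_*) are assumed from the review quotes later in this bug.

```cpp
// Sketch only: the real logic lives in GLContext::ChooseGLFormats. Field and
// constant names are assumed from the review comments in this bug and may
// differ from the tree.
GLFormats
ChooseFormatsSketch(const ContextFormat& aCF, bool aIsGLES2, bool aHasRgb8Rgba8)
{
    GLFormats formats;

    // On ES2, honor an explicit request for <= 16 bits of color, or fall back
    // to 16-bit formats when 8-bit renderbuffers (OES_rgb8_rgba8) are missing.
    const bool use16 = aIsGLES2 && (aCF.colorBits() <= 16 || !aHasRgb8Rgba8);

    if (use16) {
        formats.texColor     = aCF.alpha ? LOCAL_GL_RGBA : LOCAL_GL_RGB;
        formats.texColorType = aCF.alpha ? LOCAL_GL_UNSIGNED_SHORT_4_4_4_4
                                         : LOCAL_GL_UNSIGNED_SHORT_5_6_5;
        formats.rbColor      = aCF.alpha ? LOCAL_GL_RGBA4 : LOCAL_GL_RGB565;
    } else {
        formats.texColor     = aCF.alpha ? LOCAL_GL_RGBA : LOCAL_GL_RGB;
        formats.texColorType = LOCAL_GL_UNSIGNED_BYTE;
        formats.rbColor      = aCF.alpha ? LOCAL_GL_RGBA8 : LOCAL_GL_RGB8;
    }
    return formats;
}
```

The point of keeping texture and renderbuffer formats in the same branch is discussed in comment 5 below: choosing 565 for one and 8888 for the other gives inconsistent performance depending on which path actually gets used.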
Assignee: nobody → vladimir
Attachment #635458 - Flags: review?(jgilbert)
Attachment #635458 - Flags: review?(bjacob)
Updated (Assignee) • 12 years ago
OS: Windows 7 → Android
Hardware: x86_64 → ARM
Whiteboard: [games:p1]
Comment 4 • 12 years ago
Comment on attachment 635458 [details] [diff] [review]
use 4444/565 when requested on ES2, and have WebGL request 4444/565
Review of attachment 635458 [details] [diff] [review]:
-----------------------------------------------------------------
I think we should really propose a color depth hint to the WG. Something like 'full-color' which defaults to false. If true, try to use 8-bit-color backings. Else, it's up to the impl, with the goal being that we back the context with a device-appropriate bit-depth.
R- for the forcing of non-8-bit-color textures where 8-bit-color RBs are unsupported.
Hitting for BGR(A) on desktop and sized-formats for low-bit-depth desktops should probably wait until a follow-up bug.
::: gfx/gl/GLContext.cpp
@@ +1278,5 @@
> GLFormats formats;
>
> + // If we're on ES2 hardware and we have an explicit request for 16 bits of color or less
> + // OR we don't support full 8-bit color, return a 4444 or 565 format.
> + if (mIsGLES2 && (aCF.colorBits() <= 16 || !IsExtensionSupported(OES_rgb8_rgba8))) {
Do we want to use 565/4444 with ANGLE (and other upcoming desktop GLES)? I think then we should add support for 16bpp rendering on desktop GL as well, uncommon as that is.
Also this precludes support for 8888 textures on >16bit-color system, if GL lacks OES_rgb8_rgba8. While unlikely, there's no reason to preclude this.
Particularly since we just don't use renderbuffers on mobile, currently.
@@ +1301,5 @@
> +
> + if (aCF.alpha) {
> + // prefer BGRA8888 on ES2 hardware; if the extension is supported, it
> + // should be faster.
> + // XXX why don't we prefer BGRA on desktop?
To support BGRA on desktop, you can't call texImage2d with internalFormat == format == BGR(A). Just add a texColorInternal to GLFormats and make sure that while it's BGR,BGR for GLES, it's RGB,BGR for desktop. To my knowledge, there's no way to specify a texture as BGR-order on desktop GL.
Attachment #635458 - Flags: review?(jgilbert) → review-
Comment 5 (Assignee) • 12 years ago
(In reply to Jeff Gilbert [:jgilbert] from comment #4)
> Comment on attachment 635458 [details] [diff] [review]
> use 4444/565 when requested on ES2, and have WebGL request 4444/565
>
> Review of attachment 635458 [details] [diff] [review]:
> -----------------------------------------------------------------
>
> I think we should really propose a color depth hint to the WG. Something
> like 'full-color' which defaults to false. If true, try to use 8-bit-color
> backings. Else, it's up to the impl, with the goal being that we back the
> context with a device-appropriate bit-depth.
>
> R- for the forcing of non-8-bit-color textures where 8-bit-color RBs are
> unsupported.
That's true. The problem is that ChooseGLFormats doesn't have enough information; I don't really like the entire machinery, tbh. My patch at least makes it consistent -- if it'll choose a 565 renderbuffer, it'll choose a 565 texture and the same for 888/8888. Otherwise, as is the current case, you'll get vastly different performance characteristics depending on whether you end up using the texture format or the renderbuffer format. Is there any current situation where using a 565 texture instead of an 888 one isn't wanted?
Maybe ChooseGLFormats needs to be split into something with a bit more smarts, with information about what will actually be used?
> ::: gfx/gl/GLContext.cpp
> @@ +1278,5 @@
> > GLFormats formats;
> >
> > + // If we're on ES2 hardware and we have an explicit request for 16 bits of color or less
> > + // OR we don't support full 8-bit color, return a 4444 or 565 format.
> > + if (mIsGLES2 && (aCF.colorBits() <= 16 || !IsExtensionSupported(OES_rgb8_rgba8))) {
>
> Do we want to use 565/4444 with ANGLE (and other upcoming desktop GLES)? I
> think then we should add support for 16bpp rendering on desktop GL as well,
> uncommon as that is.
I doubt ANGLE exports 565/4444; we won't get it (we'll pick an 8888 config). If it does export one, well, then we'll get it if requested -- but WebGL won't, since the screen depth will be 24, so it won't request 565/4444.
> Also this precludes support for 8888 textures on >16bit-color system, if GL
> lacks OES_rgb8_rgba8. While unlikely, there's no reason to preclude this.
Yep, see above. Given the unlikely scenario, I'd rather untangle this in a followup...
> @@ +1301,5 @@
> > +
> > + if (aCF.alpha) {
> > + // prefer BGRA8888 on ES2 hardware; if the extension is supported, it
> > + // should be faster.
> > + // XXX why don't we prefer BGRA on desktop?
>
> To support BGRA on desktop, you can't call texImage2d with internalFormat ==
> format == BGR(A). Just add a texColorInternal to GLFormats and make sure
> that while it's BGR,BGR for GLES, it's RGB,BGR for desktop. To my knowledge,
> there's no way to specify a texture as BGR-order on desktop GL.
EXT_bgra? But yes, the internal format should still be RGB/RGBA, that's true. The 'format' should be BGR/BGRA. We only expose one 'texColor' in this formats struct, where we should probably expose both. (However, EXT_bgra only matters for upload, and we assume we have it given GetOptimalReadFormats() usage. We do the BGR swap in a shader anyway, so that we don't have to depend on it, but it'd be nice if our textures just had the right colors by default.)
Comment 6 • 12 years ago
EXT_bgra is core for GL 2+, but it only allows passing BGR(A) as |format| to texImage2D. If you don't use a sized |internalFormat|, GL can guess that you want it stored in BGR(A) and do so if it's capable. At their core, though, |format| and |type| just specify the format of the |data| being uploaded, while |internalFormat| designates the actual storage format.
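To make the |internalFormat| vs |format|/|type| distinction concrete, here is a minimal desktop-GL sketch. It is illustrative only and not from the patch; it assumes GL 1.2+ (or EXT_bgra) headers so that GL_BGRA and GL_UNSIGNED_INT_8_8_8_8_REV are defined.

```cpp
// Upload BGRA-ordered client data into a texture whose requested storage is
// plain RGBA8. |format|/|type| describe the bytes we pass in; the driver is
// free to store the texels however it likes.
#include <GL/gl.h>  // may also need <GL/glext.h> with pre-1.2 headers

void UploadBgra(GLsizei width, GLsizei height, const void* bgraPixels)
{
    glTexImage2D(GL_TEXTURE_2D,
                 0,                            // mip level
                 GL_RGBA8,                     // internalFormat: requested storage
                 width, height,
                 0,                            // border
                 GL_BGRA,                      // format of the client data
                 GL_UNSIGNED_INT_8_8_8_8_REV,  // type of the client data
                 bgraPixels);
}
```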
The ANGLE thing actually is a problem, I think. If someone's running a 16bpp display (for whatever reason), and we have an AA context, our MSAA draw FBO will be either 565 or 4444, even though our backing PBuffer is 888(8).
I think there are a few cases for 8-bit colors on <8-bit-color devices, such as offline rendering and full-quality screenshots. If we're doing readback anywhere, I think we want 24 or 32bpp backings there as well.
I don't understand what you mean when you say ChooseGLFormats doesn't have enough information. What more does it need?
Comment 7 • 12 years ago
I should add that the implementation can store textures however it likes, so long as it's functionally transparent. What isn't performance-transparent is that BGRA is more useful for readback, and indeed seems to reflect the way textures are stored internally on desktop GL for most drivers.
The only reason we care about BGR(A) is in case of readback. For other cases, it's irrelevant how texels are stored internally.
Comment 8 (Assignee) • 12 years ago
(In reply to Jeff Gilbert [:jgilbert] from comment #6)
> EXT_bgra is core for GL 2+, but it only allows passing BGR(A) as |format| to
> texImage2D. If you don't use a sized |internalFormat|, GL can guess that you
> want it stored in BGR(A) and do so if it's capable. At their core, though,
> |format| and |type| just specify the format of the |data| being uploaded,
> while |internalFormat| designates the actual storage format.
>
> The ANGLE thing actually is a problem, I think. If someone's running a 16bpp
> display (for whatever reason), and we have an AA context, our MSAA draw FBO
> will be either 565 or 4444, even though our backing PBuffer is 888(8).
Hrm, why will the backing pbuffer be 888? (I just checked -- ANGLE does actually expose 565, but not 4444 [no such d3d format].) We'll create it using the same config... and honestly, I don't see why we'd overcomplicate this to support someone on Windows with a 16bpp display.
> I think there are a few cases for 8-bit colors on <8-bit-color devices, such
> as offline rendering and full-quality screenshots. If we're doing readback
> anywhere, I think we also want to have 24 or 32bpp backings anyway, as well.
I absolutely agree. But I think that's beyond the scope of this specific bug -- we need a way for content to say "I want high quality" for webgl, and perhaps for others, which we don't have currently. We should figure out how to do that in another bug; I'm happy to add an experimental mozHighQuality or something to the webgl context creation attribs.
> I don't understand what you mean when you say ChooseGLFormats doesn't have
> enough information. What more does it need?
Well, it has enough info, as long as we make it consistent like my patch does -- returning 565 formats for both textures and renderbuffers, instead of returning 8-bit for one and 565 for the other.
Comment 9 (Assignee) • 12 years ago
(In reply to Jeff Gilbert [:jgilbert] from comment #7)
> I should add that the implementation can store textures however it likes, so
> long as it's functionally transparent. What isn't performance-transparent is
> that BGRA is more useful for use for readback, and indeed seems to reflect
> the way textures are stored internally on desktop GL for most drivers.
>
> The only reason we care about BGR(A) is in case of readback. For other
> cases, it's irrelevant how texels are stored internally.
It can do whatever it wants, but if the internal format is actually BGR (which it often is, due to compat with Windows and D3D), then uploading texture data in BGR, as in ThebesTextureLayer/BasicTextureImage, can provide a perf boost. Though when I looked at it last on the desktop, it ended up not mattering much, and it was simplest to just swap BGR in the shaders, which is what I ended up doing. But, for this bug, I'm happy to take out those XXX comments :D
Comment 10 • 12 years ago
Yeah, but we're not actually uploading here, so that part doesn't matter too much.
Comment 11 (Assignee) • 12 years ago
Right. So ignore the BGR issue, I'll take out the comments. :)
So, coming back though -- what do you want me to change here? I don't think that the renderbuffer issue is something that we should fix (or even can, without reintroducing this exact same bug). I'd maintain that the function should give similar RB and texture formats if it can.
ANGLE going into 16bpp mode would only happen on a windows desktop in 16bpp, which I think is going to be extremely rare, so I'd rather not try to handle things there either.
Comment 12 • 12 years ago
I don't have confidence that out of our half a billion users, none will hit this ANGLE+16bpp issue. It will cause antialiased contexts to fail to blit from the MSAA FBO into the PBuffer backing, since the color formats won't match.
The PBuffer path in ProviderEGL supports asking for 565 (and it looks like 4444 should work too), but I'm not sure if it'll work. Also, currently, we try to request 8-bit-color first, regardless of the ContextFormat. Not only that, but if mCreationFormat is 565/4444, and we create an 888(8) PBuffer anyways, we don't update either it or mActualFormat.
The minimal fix for this bug is to give ANGLE an exemption (for now), such that it always takes the >16bpp path. A follow-up bug can go through and correct this logic in ProviderEGL, and remove the exemption.
Comment 13 (Assignee) • 12 years ago
Sure, I can add an exemption for ANGLE. Simple enough :)
Comment 14 • 12 years ago
Comment on attachment 635458 [details] [diff] [review]
use 4444/565 when requested on ES2, and have WebGL request 4444/565
r- from jgilbert -> waiting for new version
Attachment #635458 - Flags: review?(bjacob)
Comment 15 (Assignee) • 12 years ago
Updated; added XP_WIN ifndef around 565 check to avoid ANGLE.
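Roughly, that exemption has the shape of the sketch below; the helper name and parameters are illustrative (the actual patch just wraps the existing 565/4444 check in GLContext.cpp behind the ifdef).

```cpp
// Sketch of the ANGLE exemption: on Windows our GL is typically ANGLE, so keep
// it on the 8888 path for now so the MSAA FBO format always matches the
// backing PBuffer (see comment 12). Names here are hypothetical.
static bool Allow16BitFormats(bool aIsGLES2, int aRequestedColorBits,
                              bool aHasRgb8Rgba8)
{
#ifdef XP_WIN
    return false;
#else
    return aIsGLES2 && (aRequestedColorBits <= 16 || !aHasRgb8Rgba8);
#endif
}
```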
Attachment #635458 - Attachment is obsolete: true
Attachment #639104 - Flags: review?(jgilbert)
Attachment #639104 - Flags: review?(bjacob)
Comment 16 • 12 years ago
Comment on attachment 639104 [details] [diff] [review]
use 4444/565 when requested on ES2, and have WebGL request 4444/565
Review of attachment 639104 [details] [diff] [review]:
-----------------------------------------------------------------
We should do BGRA (maybe BGR?) on desktop in a follow-up bug.
::: gfx/gl/GLContext.cpp
@@ +1301,5 @@
> +
> + if (aCF.alpha) {
> + // prefer BGRA8888 on ES2 hardware; if the extension is supported, it
> + // should be faster.
> + // XXX why don't we prefer BGRA on desktop?
BGRA is not a valid internal format on desktop. We could (and should) add a texColorInternal (or similar) to GLFormats, for which <texColorInternal,texColor> should be <RGB(A),BGR(A)> on GL and <BGR(A),BGR(A)> on GLES.
Our texColorType on desktop should probably be GL_UNSIGNED_INT_8_8_8_8_REV for BGRA, also. Probably doesn't matter, though.
@@ +1315,2 @@
> } else {
> + // XXX why don't we prefer BGR on desktop?
We should probably do it for symmetry's sake. (see above for why we don't do it yet)
I somehow doubt it'll be any faster, though.
Attachment #639104 - Flags: review?(jgilbert) → review+
Comment 17 (Assignee) • 12 years ago
https://hg.mozilla.org/integration/mozilla-inbound/rev/1f9c66d66df4
Nuked the BGR comments. We should file a separate bug to check on desktop, though you're probably right that it won't matter much.
Target Milestone: --- → mozilla16
Comment 18 (Assignee) • 12 years ago
Backed out -- gfxPlatform::getScreenDepth isn't implemented on some platforms, and asserts! Grrrrr. I am going to just change it to a NS_WARNING, and have it return 0 by default.
Comment 19 • 12 years ago
That would be my fault, sorry about that :)
Comment 20 • 12 years ago
Comment on attachment 639104 [details] [diff] [review]
use 4444/565 when requested on ES2, and have WebGL request 4444/565
Review of attachment 639104 [details] [diff] [review]:
-----------------------------------------------------------------
Nothing to add to Jeff's comments + the gfxPlatform issue.
Attachment #639104 - Flags: review?(bjacob) → review+
Comment 21 (Assignee) • 12 years ago
Status: NEW → UNCONFIRMED
Ever confirmed: false
Updated (Assignee) • 12 years ago
Status: UNCONFIRMED → ASSIGNED
Ever confirmed: true
Comment 22 • 12 years ago
Status: ASSIGNED → RESOLVED
Closed: 12 years ago
Resolution: --- → FIXED
Comment 23 • 12 years ago
Backed out in https://hg.mozilla.org/mozilla-central/rev/02b26fb307b4 - turns out that it takes about 12 hours to notice Android mochitest bustage, but eventually Ryan did notice that on the few times native mochitest-1 managed to run, and managed to start up, and managed not to crash, it was failing 422 WebGL mochitests every time, a la https://tbpl.mozilla.org/php/getParsedLog.php?id=13372355&tree=Firefox
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
Updated • 12 years ago
Target Milestone: mozilla16 → ---
Comment 24 (Assignee) • 12 years ago
Ugh... it might be that the webgl tests are not safe for 5/6/5 contexts, especially the ones that check for exact pixel values. This is going to be a problem :/
Comment 25 • 12 years ago
We have a nontrivial battle to win here: the current WebGL spec actually requires 8 bits per channel. So we need it changed. To that effect, we need some solid data on a few different phones. This patch helps get that data by setting some env vars (check the code) to switch between 444, 565, 888.
Comment 26 • 12 years ago
Yep, I hadn't realized it, but the spec currently requires 8 bits per color channel.
I agree that we should land this pref'd off (or similar), but be able to switch to it to demonstrate its advantages.
Status: REOPENED → NEW
Comment 27 • 12 years ago
On try, on top of Oleg's inbound push, so this should have the fast-omtc patches as well as Vlad's patch here.
https://tbpl.mozilla.org/?tree=Try&rev=2c27ac9c6e60
Comment 28 • 12 years ago
Landing this preffed off is probably a good short-term move, but we really want to get enough data to:
- either confirm it's worth it, and convince others it is
- or show that it isn't
Comment 29 (Assignee) • 12 years ago
I'll recreate this patch and add two prefs for testing:
webgl.prefer_16bit -- basically what this patch does, do a 565/444 on 16-bit displays
webgl.default_no_alpha -- unless alpha is explicitly specified, give contexts with no alpha
That'll give us a way to test content out and see what perf is like. I'd rather not do the env var route since that's a pain on Android, and makes it hard to let people do demos with the flags set if we decide we'd like to get there in the future.
Comment 30 (Assignee) • 12 years ago
Same patch, but does the 16-bit change behind a pref, and adds a new pref for default-alpha:
webgl.prefer-16bit
webgl.default-no-alpha
Both default to false, so should be no change from today.
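As a sketch of how those prefs end up being consulted when WebGL picks its backbuffer format: the helper and call site below are illustrative, not from the patch, and comment 33 below notes the pref that actually landed is "webgl.prefer-16bpp" rather than "webgl.prefer-16bit".

```cpp
// Illustrative only: read the two prefs when deciding the WebGL backbuffer
// format. Preferences::GetBool is the standard Gecko pref accessor.
#include "mozilla/Preferences.h"

// aAlphaRequested: the page explicitly asked for alpha:true in the context
// creation attributes.
static void ReadWebGLFormatPrefs(int aScreenDepth, bool aAlphaRequested,
                                 bool* aWant16Bit, bool* aWantAlpha)
{
    using mozilla::Preferences;

    // Both prefs default to false, so behavior is unchanged unless flipped.
    const bool prefer16   = Preferences::GetBool("webgl.prefer-16bpp", false);
    const bool defNoAlpha = Preferences::GetBool("webgl.default-no-alpha", false);

    // Only drop to 565/4444 when the pref is set and the screen is 16bpp.
    *aWant16Bit = prefer16 && aScreenDepth == 16;

    // Unless the page explicitly asked for alpha, honor the no-alpha default.
    *aWantAlpha = aAlphaRequested || !defNoAlpha;
}
```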
Attachment #641178 - Flags: review?(bjacob)
Updated • 12 years ago
Attachment #641178 - Flags: review?(bjacob) → review+
Comment 31 (Assignee) • 12 years ago
Target Milestone: --- → mozilla17
Comment 32 • 12 years ago
Status: NEW → RESOLVED
Closed: 12 years ago → 12 years ago
Resolution: --- → FIXED
Comment 33 • 12 years ago
(In reply to Vladimir Vukicevic [:vlad] [:vladv] from comment #30)
> Created attachment 641178 [details] [diff] [review]
> updated with prefs
>
> Same patch, but does the 16-bit change behind a pref, and adds a new pref
> for default-alpha:
>
> webgl.prefer-16bit
> webgl.default-no-alpha
>
> Both default to false, so should be no change from today.
It's actually "webgl.prefer-16bpp". ("bpp" not "bit")
Updated • 11 years ago
Blocks: gecko-games