Open Bug 1428034 Opened 7 years ago Updated 2 years ago

Apply Resist Fingerprinting Protection to WebGL's readPixels method

Categories: Core :: Graphics: CanvasWebGL (enhancement, P5)

Status: UNCONFIRMED
Tracking: firefox59 --- affected
Reporter: tjr; Assignee: Unassigned
Depends on: 1 open bug
Whiteboard: [fingerprinting] [gfx-noted] [fp-triaged]

For more details see Bug 1422890. In general, though, we want to protect WebGL's ReadPixels method to prevent a website from rendering something and then using subtle rendering differences to fingerprint a user (same as Canvas).

Like canvas, the short-term fix for this will probably be to provide a blank result and show the Canvas permission prompt. Before doing so, we should block on Bug 1428033 and then confirm that rendering differences *can* still be used to fingerprint users. (It would be great if they couldn't!)

We will also need to look at the WebGL APIs and figure out whether there are APIs besides readPixels that can be used to infer data, for example hypothetical getPixelColor or isPointOnLine methods.

Finally, it would also be good to document the path forward for making WebGL unfingerprintable entirely. Tor does not need us to get this into ESR60, but it is a hole in Firefox's RFP mode.
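To make the attack described above concrete, here is a minimal, hypothetical sketch (not taken from the bug or any known script) of how a page could combine readPixels with a hash. The function name webglFingerprint and the 64x64 size are my own choices; real fingerprinting scripts draw shaded geometry or text rather than a solid clear.

```ts
// Hypothetical sketch: turn a WebGL read-back into a compact identifier.
async function webglFingerprint(): Promise<string> {
  const canvas = document.createElement("canvas");
  canvas.width = 64;
  canvas.height = 64;
  const gl = canvas.getContext("webgl");
  if (!gl) return "webgl-unavailable";

  // Real scripts render scenes whose pixels depend on GPU/driver rounding;
  // a clear is used here only to keep the sketch short.
  gl.clearColor(0.2, 0.4, 0.6, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);

  // readPixels is the read-back call this bug wants RFP to cover.
  const pixels = new Uint8Array(64 * 64 * 4);
  gl.readPixels(0, 0, 64, 64, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

  // Hash the raw bytes (SHA-256 via SubtleCrypto) into a hex string.
  const digest = await crypto.subtle.digest("SHA-256", pixels);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```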
> making WebGL unfingerprintable entirely

This is not possible.

I do not believe WebGL rendering differences have the same magnitude of fingerprintability as fonts on canvas2d. I'm going to need some PoCs before we crimp WebGL for this.
Status: NEW → UNCONFIRMED
Ever confirmed: false
Priority: -- → P5
Whiteboard: fingerprinting → fingerprinting gfx-noted
You can find an image hash that uses WebGL's readPixels here: https://browserleaks.com/webgl

I am tentatively working on a patch for Bromite (a Chromium fork) that would address this fingerprinting vector; see the related issue here: https://github.com/bromite/bromite/issues/131

The idea is to modify the RGB color components of a few pixels with some noise, as has already been implemented for the canvas (https://github.com/bromite/bromite/blob/70.0.3538.71/patches/BRM053_Canvas-fingerprinting-mitigations-for-image-data-and-webGL.patch). (Ugly, but effective?)
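A rough illustration of that noise idea, assuming an RGBA/UNSIGNED_BYTE buffer like the one readPixels fills; the function name addPixelNoise and the 1% pixel fraction are hypothetical and not taken from the Bromite patch.

```ts
// Hypothetical sketch of the noise-based mitigation, not the Bromite patch:
// perturb the low bit of the RGB components of a small, random subset of
// pixels before the buffer is handed back to the page.
function addPixelNoise(pixels: Uint8Array): void {
  const pixelCount = pixels.length / 4;                              // RGBA quads
  const perturbCount = Math.max(1, Math.floor(pixelCount * 0.01));   // ~1% of pixels
  for (let i = 0; i < perturbCount; i++) {
    const p = Math.floor(Math.random() * pixelCount) * 4;
    for (let c = 0; c < 3; c++) {   // touch R, G, B; leave alpha alone
      pixels[p + c] ^= 1;           // flip the least significant bit
    }
  }
}
```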
FYI: It is also implemented in the extension CanvasBlocker [1][2]. I'll needinfo the author so he can help or offer any insight if required/asked, and be aware of the coming changes. I'm fairly sure he has a test for WebGL readPixels; normally they're all here [3], but I don't see a front-facing link. He's been delving into this stuff for years and is very thorough.

[1] https://addons.mozilla.org/en-US/firefox/addon/canvasblocker/
[2] https://github.com/kkapsner/CanvasBlocker
[3] https://canvasblocker.kkapsner.de/test/
Flags: needinfo?(kkapsner)
Thanks for the pointer, Simon; in the past I referred mostly to ScriptSafe (https://github.com/andryou/scriptsafe) to see what a complete anti-fingerprinting extension would do for Chromium.

I checked CanvasBlocker's source; I believe it does not use the noise-adding approach I described above but rather blocks the WebGL readPixels functionality, or returns a fake result. Interestingly, we use the same RGB noise solution for the regular canvas, although the two were developed separately.

The JavaScript source code there does not have any comments; it would have been nice to see the reasons for some of the strategic choices. The mitigation I am implementing as a Chromium patch will work inside the C++ class, so there is not much I can reuse from CanvasBlocker.

There is also an important difference between the WebGL readPixels data and canvas image data: the former uses GL color types/formats for the pixel data, and I am not sure how closely that matches the RGB formats you can use on a regular canvas; I am currently working on this aspect.

I will take a look at CanvasBlocker's tests, though; they seem quite complete. I might also want to check the other fingerprinting vectors covered there.
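To illustrate the "block it or return fake data" strategy mentioned above, here is a hypothetical page-level sketch (my own illustration, not CanvasBlocker's actual code); whether to fake or to throw corresponds to the two behaviours described in the comment.

```ts
// Hypothetical sketch of the "block or fake" approach: override readPixels on
// the prototype so callers get a blank (all-zero) buffer instead of real data.
const realReadPixels = WebGLRenderingContext.prototype.readPixels;

WebGLRenderingContext.prototype.readPixels = function (
  x: number, y: number, width: number, height: number,
  format: number, type: number, pixels: ArrayBufferView | null
): void {
  // "Fake" mode: never call realReadPixels, so a freshly allocated typed array
  // stays all zeros and every caller sees the same blank result.
  // A "block" mode would instead throw, e.g. a SecurityError DOMException.
  void realReadPixels; // kept around in case a trusted code path needs the original
};
```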
I do not expect adding random noise to be effective. It will change the output of the image, resulting in a new base64 representation and a new hash, yes. That would break naive tracking; but as soon as the trackers decide they still want to track you (which would happen if these mitigations were enabled by default or adopted by any significant user base), they could bypass it. The noise can be subtracted out by comparing multiple images, or they can ship the entire image to the server and use fuzzy image-comparison algorithms.
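A sketch of the averaging attack just described, under the assumptions that the injected noise is zero-mean and freshly drawn on every read while the underlying rendering stays stable; averageCaptures is a hypothetical name.

```ts
// Hypothetical sketch: capture the same scene N times and average per channel,
// so zero-mean noise shrinks while the device-specific signal stays put.
function averageCaptures(captures: Uint8Array[]): Uint8Array {
  const sums = new Float64Array(captures[0].length);
  for (const frame of captures) {
    for (let i = 0; i < frame.length; i++) sums[i] += frame[i];
  }
  const averaged = new Uint8Array(sums.length);
  for (let i = 0; i < sums.length; i++) {
    averaged[i] = Math.round(sums[i] / captures.length);
  }
  return averaged; // hash this instead of a single noisy capture
}
```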
@tjr, according to the papers I have read there is a certain amount of noise beyond which the techniques you describe are no longer effective; the mitigations adopted in Bromite/ungoogled-chromium currently do not add enough noise to qualify, but they could be modified to do so.

I am maintaining a collection of the papers/approaches discussed in the Bromite issue tracker here: https://github.com/bromite/bromite/wiki/Fingerprinting A more comprehensive list is probably here: https://amiunique.org/links (NOTE: although the page there says "scientific papers", I would not assume they have all been peer reviewed and accepted).

The simplest countermeasure to satisfy the invalidation criteria you laid out would be to offer different noise-addition thresholds, including an option with more noise that can defeat image-subtraction techniques and fuzzy algorithms (it is sufficient to add noise of the same order of magnitude as the standard deviation of the fuzzy algorithm to defeat it).
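As a toy illustration of that last point (my own metric and numbers, not from the cited papers): if the tracker's fuzzy comparison tolerates some mean per-channel distance, the defense only works once the injected noise pushes two same-device reads past that tolerance.

```ts
// Toy model of a fuzzy comparison with a fixed tolerance; the metric
// (mean absolute distance) and the tolerance value are illustrative only.
function meanAbsoluteDistance(a: Uint8Array, b: Uint8Array): number {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += Math.abs(a[i] - b[i]);
  return sum / a.length;
}

function fuzzyMatch(a: Uint8Array, b: Uint8Array, tolerance = 2): boolean {
  // Noise of the same order of magnitude as `tolerance` makes two reads from
  // the same device fail this check, which is the countermeasure described above.
  return meanAbsoluteDistance(a, b) <= tolerance;
}
```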
Whiteboard: fingerprinting gfx-noted → [fingerprinting] [gfx-noted] [fp-triaged]
Severity: normal → enhancement

FWIW, Tor Browser 8.5a* (not sure exactly which release) has started enabling WebGL, but they block readPixels, so TB might like this at some stage (although I don't see it landing in ESR68?). Note: in [1] the error itself is output, which adds entropy to the results.

[1] https://ghacksuserjs.github.io/TorZillaPrint/TorZillaPrint.html#canvas

8.0.*
getContext: 2d: supported, webgl: not supported, webgl2: not supported
[webgl] readPixels: webgl not supported


8.5a*
getContext: 2d: supported, webgl: supported, webgl2: supported
[webgl] readPixels: Error: Permission denied to access property "createBuffer"

[2] https://panopticlick.eff.org/

  • 8.0.* - 00000000000000000000000000000000
  • 8.5a* - undetermined

While I realize that we should do something along the lines of the description of this bug, we moved forward and are just not supporting readPixels() in RFP mode in Tor Browser for now:

https://gitweb.torproject.org/tor-browser.git/commit/?h=tor-browser-60.7.0esr-9.0-1&id=e462f9d9eb505b5e724ec64a52280c70210cf5eb

Would that be something upliftable? Or should we treat it as a stopgap for us while we try to get to a better solution for uplift than outright blocking readPixels()?

Flags: needinfo?(jgilbert)

My recommendation for Tor is to investigate software-only WebGL solutions as a holistic approach, but we can't ship that as a solution for our mainline Firefox users. Switching Windows to D3D-WARP is pretty easy, and I know MacOS has a software OpenGL driver, but I think it's in pretty poor shape. Bug 1286056 is for adding support for Swiftshader as a universal modern software WebGL fallback, so that's also something to keep an eye on.

There's no amount of noise you can add to ReadPixels that can thwart data exfil, and it's relatively easy to exfil the 4-5 bits of data that would identify the vendor and generation of the device, by my estimates. I think we should drop that idea, unfortunately, since programmable shaders make calculation differences trivial to magnify.
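For a sense of what that magnification could look like, here is a hypothetical fragment shader of the kind a PoC might use (my own sketch, not an existing exploit, and the constants are just the classic GLSL pseudo-random-hash values): it amplifies low-order floating-point differences in sin() into a hard 0/1 color per pixel, which survives small per-pixel noise added at readPixels time.

```ts
// Hypothetical amplification shader, shown as a GLSL string for use with a
// WebGL program object.
const amplifyingFragmentShader = `
  precision highp float;
  void main() {
    // The low-order bits of sin() and the large multiply differ across
    // GPUs/drivers; fract() keeps only that rounding-dominated part.
    float v = fract(sin(gl_FragCoord.x * 12.9898 + gl_FragCoord.y * 78.233) * 43758.5453);
    // step() collapses the result to a clean one-bit signal per pixel.
    gl_FragColor = vec4(step(0.5, v), 0.0, 0.0, 1.0);
  }
`;
```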

I don't think you can reasonably expect to prevent this exfil even if you disable ReadPixels (and similar) entirely, though maybe it's possible to reduce it to timing attacks.

There was a related investigation a while back to emulate minimum-precision restrictions in GLSL, but IIRC it was very expensive and might have been incomplete.

I think what's needed here is a more targeted approach, where we enumerate (and ideally proof-of-concept) fingerprinting bit leaks, and respond to them individually.

Flags: needinfo?(jgilbert)
Flags: needinfo?(bugzilla)
Severity: normal → S3