Open Bug 1034526 Opened 10 years ago Updated 1 year ago

New Sync should have an option to encrypt data with a secret that's not used for anything else and never entered into a Web page

Categories

(Firefox :: Sync, enhancement, P3)

People

(Reporter: hsivonen, Unassigned)

References

Details

(Keywords: sec-want)

In New Sync, the encryption key used for encrypting personal data such as browsing history and passwords is derived from the Firefox Accounts password. This makes New Sync weaker than Old Sync, or than Chrome Sync with the optional client-side encryption secret, because:

1) Going forward, the user may have to log in to Firefox Accounts for purposes other than Sync and, therefore, need to make the password more memorable, and thus more guessable, than an appropriate number of bits from a CSPRNG.

2) Even if Firefox Accounts login is designed never to send the original password across the network when a Firefox Accounts login page is functioning as promised, a compromised login page could exfiltrate the password without leaving any evidence of such exfiltration, unless the user happens to read the source of the page on exactly the page load that includes the rogue code.

To address these problems, the ideal solution would be to go back to the Old Sync model, but obviously that's not going to happen. So as a second-best solution, I suggest New Sync be extended by allowing the user to optionally supply a secret distinct from the Firefox Accounts password such that:

a) This additional secret is used for encrypting Sync data instead of deriving the Sync encryption key from the Firefox Accounts password.

b) This additional secret is never entered into a Web page or into UI that looks like browser chrome but is actually Web page-like on-demand code that can change from one load to the next.

c) This additional secret is not used for any other purpose.

With these changes, someone seeking to compromise the confidentiality of Sync data would have to deliver a Trojanized update for Firefox itself (or for a fully privileged add-on), which involves exposing the same level of evidence of compromise to the user's system as would be necessary to grab passwords and history from a Firefox instance that doesn't use Sync.
(The general assumption here is that entities that might seek to grab the user's data become more eager to mount an operation to do so as the level of evidence that needs to be exposed goes down; i.e., such entities are averse to getting caught.)
Hrm, it's not ideal that this bug report is about two different issues. We should have one bug per issue, that usually makes solving them much easier...
(In reply to Robert Kaiser (:kairo@mozilla.com) from comment #1)

> Hrm, it's not ideal that this bug report is about two different issues. We
> should have one bug per issue, that usually makes solving them much easier...

Well, it doesn't really make sense to fix this piecewise.
(In reply to Robert Kaiser (:kairo@mozilla.com, vacation until Aug 10) from comment #1)

> it's not ideal that this bug report is about two different issues.

I'm only seeing one issue here, what's the second one? There were two examples of problems that arise from the current design but only one proposed change, namely "don't in any way let the user's password be enough to unlock the encrypted data."

I'm not sure a) is quite right: I thought bwarner described the encryption key as still separate, but uploaded to the server "wrapped" by the user's password (otherwise there wouldn't be a difference between changing your password and "resetting" your password--both would require a full re-encrypt and re-upload). Doesn't change the thrust of this bug, but might change the solution from "need a separate key" to "don't upload the key we already have and resurrect JPAKE sharing for transfer to new machines".

https://blog.mozilla.org/warner/2014/05/23/the-new-sync-protocol/

I don't see how to do this without reintroducing the usability concerns that drove the new model, so at best we could hope for this to be an option, not the default.

Ah, were you seeing "make FxA UI part of browser chrome rather than web" as a separate issue? If the encryption key is never sent to the server and isn't derived from that password then it's irrelevant to this bug (though it would still be nice). Making that change alone is not sufficient: Fennec has native UI (bug 951304) but still suffers from the problems Henri would like fixed.
(In reply to Daniel Veditz [:dveditz] from comment #3)

> > it's not ideal that this bug report is about two different issues.
>
> I'm only seeing one issue here, what's the second one?

The summary says "not used for anything else *and* never entered into a Web page", which sounds like two different things to me.
> The summary says "not used for anything else *and* never entered into a Web page" which sounds like two different things to me.

:kaiser is correct, there are two degrees of freedom here, but IMO they aren't exactly the ones he identifies. E.g., as user stories:

1) As a user, I want to use a full-strength encryption key with Firefox Sync that isn't dependent on my Firefox Account password.

E.g., to satisfy this story, we could bring back an option for Sync that uses a randomly generated key that's never sent to the FxA server and relies on a pairing protocol to transfer the key between machines.

The second story is perhaps murkier. Here's a shot:

2) As a user, I don't want the security of my full-strength encryption key to depend on the security of hosted web content.

E.g., we could implement the pairing UI in hosted web content and not satisfy this story.
There are indeed two issues, but they are pretty closely related. The issues are:

1) As a user, I don't want the privacy of my passwords and history to depend on the Firefox Accounts password.

2) As a user, I don't want enabling Sync to make the privacy of my passwords and history depend on any code that it doesn't already depend on when Sync is not enabled. (Where a "Firefox build" counts as one lump of "code", but on-demand code loaded from a server at run time is a distinct unit of "code".)

The relation is that unless #1 is addressed, addressing #2 would require *all* Firefox Accounts login forms--not just Sync-related ones--to be built into the browser. However, there are also rememberability-vs.-entropy reasons for #1 alone, though the user can mitigate that issue on their own by sacrificing rememberability. (Whether the sync key is wrapped with the password or derived from it is a detail, since either way, exfiltrating the password means exfiltrating enough data to unwrap or derive the key.)

I tried to phrase #2 in a less murky way. Note that if you build Firefox yourself (or use a build from a Linux distro rather than Mozilla) but use Mozilla's services, #2 becomes even less murky in terms of illustrating how Old Sync provided "provider-independent security" compared to New Sync (but "build your own sync service, too, then" is not really a fix for this issue).
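The "wrapped vs. derived is a detail" point above can be sketched in a few lines. This is a minimal illustration, not FxA's actual onepw protocol: the function names, parameters, and the XOR "wrap" are invented for clarity (a real implementation would use an authenticated key-wrap algorithm). Either way, the password plus server-held data is sufficient to recover the sync key:

```python
import hashlib
import os

def derive_key(password: bytes, salt: bytes) -> bytes:
    # Model A: the sync key is derived directly from the account password.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

def wrap_key(sync_key: bytes, password: bytes, salt: bytes) -> bytes:
    # Model B: a random sync key is "wrapped" (here: XOR, illustration only)
    # with a key-encryption key derived from the same password.
    kek = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)
    return bytes(a ^ b for a, b in zip(sync_key, kek))

def unwrap_key(wrapped: bytes, password: bytes, salt: bytes) -> bytes:
    kek = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)
    return bytes(a ^ b for a, b in zip(wrapped, kek))

password, salt = b"hunter2", os.urandom(16)
sync_key = os.urandom(32)
wrapped = wrap_key(sync_key, password, salt)  # stored server-side

# In BOTH models, knowing the password (plus server-side data)
# yields the sync key, so exfiltrating the password is enough:
assert unwrap_key(wrapped, password, salt) == sync_key
```

The only practical difference between the models is that Model B lets you change the password without re-encrypting all sync data (re-wrap the same key), which is why "resetting" a forgotten password destroys the data while "changing" a known one does not.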
I agree that addressing your "2" would put awkward constraints on the FxA ecosystem if we didn't address "1". FWIW, I think these stories are murky because they are still ill-defined and we don't have common agreement on them. We have both phrased them in terms of what the user doesn't want, not what they want, and they are missing "so that" phrases, which would describe the perceived benefits to the user. I suspect that "provider-independent security" is important to you, but it's not the only issue in play here.
"... so that government coercion or a Mozilla compromise cannot be used to recover my data (short of brute-force decryption attacks)"
Keywords: sec-want
Priority: -- → P3
Flags: firefox-backlog+
Whiteboard: [fxa-waffle]
Severity: normal → enhancement
Priority: P3 → --
Adding Alex, Sync+FxA Product Manager, for context.
Alex, this is a product level question, want to chat about it and maybe move a variant of it into Projects for tracking/planning?
Flags: needinfo?(adavis)
Whiteboard: [fxa-waffle]
I've added it to the product board backlog. Let's talk about it this week to just understand effort, impact and confidence levels. https://projects-beta.growthhackers.com/cards/add-non-fxa-tied-encryption-key-option-to-sync/UglYuWiKCNvPCTnO9813zw
Flags: needinfo?(adavis)
I found myself directing folks to this bug from the comments in https://news.ycombinator.com/item?id=18446278, and it's due for an update on what our latest plans/thinking are, so I wanted to add a couple of quick notes.

The "obvious" approach of adding a second passphrase just for Firefox Sync remains off the table; as discussed a bit in [1], it's a bad user experience, and it's contagiously bad - it forces you to enter two passphrases on every device where you want to access your sync data.

What we *are* doing is working on building a great pairing experience, the first version of which is in Bug 1490671. As foreshadowed by the ever-prescient :ckarlof in Comment 5, the first version of this pairing flow is implemented largely in web content in order to get something up and running. However, it's being deliberately designed so that it will work without web content handling key material, and it has the advantage of benefiting all users, regardless of whether they care about the issues in this bug or not.

So I see a nice path from here to a world where, if you've got one Firefox connected to Sync, you can connect additional Firefoxes without having to trust dynamically-loaded web content to handle your encryption keys. That would leave this bug with the smaller problem of getting your *initial* Firefox instance syncing with a key that's separate from your account password, which seems more tractable when isolated to a single instance. I can imagine some tweaks to the login process that would accommodate this, but it would remain a question of prioritizing that (smaller and more tractable!) work relative to all the other things we've got going on.

[1] https://hacks.mozilla.org/2018/11/firefox-sync-privacy/
Continuing from https://bugzilla.mozilla.org/show_bug.cgi?id=1447656#c3:

> A brief and incomplete answer to "why not?" is that Firefox Sync is not the only consumer of Firefox Accounts authentication. It would be fine to have the login UI for Firefox Sync builtin to Firefox, but what about the login UI for addons.mozilla.com? For Pocket? It starts to hit thorny trade-offs pretty quickly when taking multiple different consumers into account.

I see nothing wrong with that. Just have a Firefox built-in service that exposes some kind of special method that only Mozilla sites can use (I guess this is already limited - or if not, then it is okay for all sites to use; a fake domain name would be possible; or some JS method that is injected into these sites, as done for AMO etc.), so you can use it for OAuth. So your OAuth provider is just built into your browser and handles it via a special about: page or whatever... I mean, authenticating to FxA should be possible for websites without having the password exposed, and with the whole service built in, it is not even exposed to _any_ web servers.
What should happen if a user tries to sign in to https://addons.mozilla.org/ in a non-Firefox browser? An answer of "they can't" is contrary to the philosophy of the open web and limits the reach of our products. An answer of "they fall back to entering their password into web content" means we've made the security properties of the system murkier, but not really any stricter.
Oh, I indeed did not consider that case. A fallback may indeed be the easiest solution; at least for Firefox users, it is then safer. (I don't know what "murkier" is supposed to mean in that context.) However, I see that this is a problem. In any case, the "content-signing" you proposed in bug 1447656 has the same problem: it likely also only works in Firefox, if implemented. So at some point, you may need a compromise here, whether that be content-signing or just including the login form.

Or... separating Firefox Sync from FxA. So maybe you use the usual website and then only get an access token? The in-browser version may then use not only this access token for encryption, but something else as well. (And yeah, I get that this issue is about exactly that.) Or, in a very similar way: Firefox somehow catches the password being entered into the FxA site and uses its own mechanism for authenticating.

So I do see the problem now. :) Generally, I think you need to implement some mechanism that must be Firefox-only. I mean, it's about Sync, which is not even available in other browsers (with the FxA account, at least). So either another secret (as proposed here), or another security layer for the login site (i.e. "content-signing" or shipping FxA with the browser itself).

The latter may be done with Subresource Integrity plus a global mechanism that verifies the integrity of the main page (or, at least, all JavaScript and CSS or so). It may also need some CSP "require-sri" directive or so... If you choose that way, it is at least based on web standards, and https://github.com/tasn/webext-signed-pages shows how this may be done. To make it easier you could, of course, instead of using PGP/public-key crypto, just hardcode the SRI hashes in Firefox. This of course binds it to the Firefox version, similarly to shipping it inside of Firefox. But yeah, some kind of security mechanism that does not depend on "web security" would be great.
As the Protonmail case has shown, this is an issue, and "e2e crypto" that relies on website-served code may not be considered real "e2e".
> (i.e. "content-signing" or shipping FxA with the browser itself). > The last thing may be done with Subresource Integrity and then a global mechanism that verifies > that integrity of the main page (or, at least, all JavaScript and CSS or so). It may also need some CSP "require-sri" or so… FWIW Firefox has existing infrastructure for this, based on a "Content-Signature" header for the page and SRI for loaded resources. But of course, this doesn't protect against Mozilla itself sending you malicious code (i.e. the "provider-independent security" mentioned in Comment 6).
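To make the SRI idea above concrete, here is a minimal sketch of computing a Subresource Integrity value, i.e. the string that goes into an `integrity="..."` attribute on a `<script>` or `<link>` tag. The example input is arbitrary; this only illustrates the hash format, not Firefox's "Content-Signature" infrastructure:

```python
import base64
import hashlib

def sri_hash(resource: bytes, algo: str = "sha384") -> str:
    # An SRI value is "<algo>-<base64 digest>" of the exact resource bytes.
    # Browsers refuse to load the resource if the digest doesn't match.
    digest = hashlib.new(algo, resource).digest()
    return f"{algo}-{base64.b64encode(digest).decode()}"

# e.g. for a script file fetched at build time:
print(sri_hash(b"alert('hello');"))
```

Hardcoding such hashes in the browser, as suggested above, would pin the page's resources to a specific release: any server-side change to the script, malicious or not, makes the hash mismatch and the load fail.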
Priority: -- → P3

How do you protect against a court order compelling Mozilla to serve malicious JS to a user in order to phish the password?

> How do you protect against a court order compelling Mozilla to serve malicious JS to a user in order to phish the password?

It's unclear to me if it's any different from getting a court order to land malicious client-side code to extract keys directly from the browser or to phish an encryption passphrase.

I think that in either case, it will be a legal question more than a technical one.

Here is a great example of how it's less of a technical problem and more of a legal one:
https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_dispute

The "obvious" approach of adding a second passphrase just for Firefox Sync remains off the table; as discussed a bit
in [1] it's a bad user experience, and it's contagiously bad - it forces you to enter two passphrases on every device where
you want to access your sync data.

As a user, I'd like to be able to make the decision whether the UX badness is unacceptably bad for me. For example, I was very happy with the PAKE pairing of the original Sync, but others deemed it a UX problem, so now it's been six years without it. :-(

(As a developer, I totally recognize that this is the usual "why don't you just add another pref?" argument.)

Although a pairing-based approach may be awkward between desktop computers in different locations, the computer+phone use case would work rather well with Firefox on the computer exposing the requisite data as a QR code and Firefox on the phone scanning the QR code. You could also hold a phone that shows a QR code in such a way that a laptop's camera sees the QR code.

In the years since Old Sync's removal, the UX pattern of scanning a QR code to establish a TOTP key in Google Authenticator has become common enough that I believe there'd be an addressable audience of security-minded users who would be able to perform pairing by QR code.

> For example, I was very happy with the PAKE pairing of the original Sync, but others deemed it a UX problem, so now it's been six years without it. :-(

And in fact we do now point people to set up sync on their android device using a QR code, but it's not used to share the kind of separate secret you and I want it to.

(In reply to Alex Davis [:adavis] from comment #21)

> > How do you protect against a court order compelling Mozilla to serve malicious JS to a user in order to phish the password?
>
> It's unclear to me if it's any different from getting a court order to land malicious client-side code to extract keys directly from the browser or to phish an encryption passphrase.
>
> I think that in either case, it will be a legal question more than a technical one.

This isn't true; you can turn it into a technical problem. If the only code that handles my password is in a browser which is compiled from source, then a government forcing Mozilla to add code is a hugely different question from asking Mozilla to serve different code in a webpage (possibly only to particular IP addresses).

The former has many people looking over the tarballs, confirming that they came from Mozilla's source repos and the commits that add these changes. The latter can be done with little to no public visibility. Furthermore, there are literally hundreds of CAs that can be compromised to issue certs for accounts.mozilla.org, such that Mozilla doesn't even need to be involved. (I acknowledge that the web PKI system is getting more secure, but it is still far behind the security of distributing Firefox.)

(In reply to Alex Davis from comment #21)

> > How do you protect against a court order compelling Mozilla to serve malicious JS to a user in order to phish the password?
>
> It's unclear to me if it's any different from getting a court order to land malicious client-side code to extract keys directly from the browser or to phish an encryption passphrase.
>
> I think that in either case, it will be a legal question more than a technical one.
>
> Here is a great example of how it's less of a technical problem and more of a legal one:
> https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_dispute

The technical differences are:
1- A checksum of a hard-coded sign-in form (the Firefox installation file) can be compared against other sources/websites.
2- In cases where, say, the Chinese government makes a deal with Mozilla concerning Chinese citizens (or my government, Iran), a signed copy of the installation file (which includes the sign-in form) would give users assurance that they would have evidence in hand if Mozilla did something illegal.
Please post your reply below (@rfkelly incorrectly considered these two requests duplicates):
https://bugzilla.mozilla.org/show_bug.cgi?id=1700275

Severity: normal → S3