Closed Bug 1236431 Opened 9 years ago Closed 9 years ago

Make the origin (site) we are on clearer and less spoofable or phishing prone

Categories

(Firefox for Android Graveyard :: General, defect)

Platform: All / Android
Priority: Not set
Severity: normal
Tracking

(firefox46 affected, firefox47 fixed, relnote-firefox 47+, fennec47+)

RESOLVED FIXED
Firefox 47

People

(Reporter: gcp, Assigned: sebastian, NeedInfo)

References

Details

Attachments

(5 files, 1 obsolete file)

Attached image How to spoof our URL bar (deleted) —
For more context, see bug 966532, bug 605206 and bug 1111729. We changed our default display to the URL, which makes it harder for pages to phish and spoof others. But the URL is dense (and often largely useless) information and often doesn't fit well on a phone. On top of that, although we highlight the domain, we currently don't do any justification of the top-level part, so a URL like photos.facebook.com.myphishsite.com can still fool the user. (Verified this is the case, see screenshot. Easiest phishing win ever.) On the web, the origin is what matters: it's what all the security mechanics work with. So let's make the origin clearer. 1) Only display the origin by default, and show the full URL when the Awesomebar is selected. This reduces visual clutter and eases user education. 2) Bold the top-level domain (as we do now) and make sure it's not pushed further right in the Awesomebar than right-justified text would be. That is, the example above should display as: [ook.com.MYPHISHSITE.COM] (caps = bold), with the subdomain pushed off to the left.
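The right-anchored display proposed in point 2) could be sketched roughly as below. `RightAnchoredHost` and `fit` are hypothetical names, not Fennec code, and a real implementation would do this at layout/ellipsizing time rather than by chopping the string, but the effect is the same: when the host is too wide, characters are dropped from the left so the registrable domain stays visible.

```java
// Sketch of right-anchored host truncation: keep the rightmost characters
// so the registrable (top-level) domain is never pushed off screen.
// Hypothetical helper for illustration only.
public class RightAnchoredHost {
    public static String fit(String host, int maxChars) {
        if (host.length() <= maxChars) {
            return host;
        }
        // Keep the rightmost maxChars characters; the subdomain
        // ("photos.facebook.com." in the phishing example) falls off the left.
        return host.substring(host.length() - maxChars);
    }
}
```

With the host from the screenshot and room for 20 characters, `fit("photos.facebook.com.myphishsite.com", 20)` keeps only the rightmost 20 characters, so "myphishsite.com" stays on screen and the fake "facebook.com" prefix is what gets cut.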
I think this sounds like a reasonable approach to help prevent phishing attacks. Also, for the non-phishing case, it does make it clearer what website you're on. Barbara, Anthony, do you think this is something we should consider?
tracking-fennec: --- → ?
Flags: needinfo?(bbermes)
Flags: needinfo?(alam)
I think that's something worth updating. I noticed Chrome shows the full URL and then animates it shrinking back to the origin/main domain. Suggestions 1) and 2) sound good, but let UX chime in. An Aha card will be created. +NI Richard Barnes to keep him in the loop.
Flags: needinfo?(bbermes) → needinfo?(richard)
I've been wanting to clean up the lengthy URLs in the URL bar (when the keyboard is inactive) for a while now, and this makes sense to me. Can I see a screenshot of this improvement, GCP? I'm especially interested in the situation where the subdomain is extra long (like in your example under point #2), and in whether or not we would still show "https://" or "http://". I know that Safari doesn't.
Flags: needinfo?(alam) → needinfo?(gpascutto)
I have no idea how you expect me to create a screenshot. My original post describes how it should look. Is there some part I can clarify? The length of the subdomain is irrelevant, as it's pushed off to the left in favor of showing the top-level domain entirely. If the top-level domain itself is very long, its early part would be pushed off as well, but there's no ideal solution if the screen is just too small to contain the top-level domain. Maybe marquee it, but that's perhaps going a bit far. We should not show HTTP or HTTPS, but simply show a lock icon if the connection is secure. The problem there is that we have another phishing vulnerability in the Awesomebar: bug 1018994.
Flags: needinfo?(gpascutto)
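The scheme-hiding rule proposed above (hide http/https, signal security with a lock icon, leave other schemes visible) can be sketched as follows; `SchemeStripper` is a hypothetical helper for illustration, not the actual Fennec implementation:

```java
// Sketch of hiding the scheme in the display URL: http/https are stripped
// (security is conveyed by the lock icon instead), while less common schemes
// like ftp:// or file:// remain visible so the user knows it isn't a regular
// website. Hypothetical helper, not Fennec code.
public class SchemeStripper {
    public static String displayText(String url) {
        if (url.startsWith("https://")) {
            return url.substring("https://".length());
        }
        if (url.startsWith("http://")) {
            return url.substring("http://".length());
        }
        return url; // other schemes stay visible
    }
}
```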
(In reply to Gian-Carlo Pascutto [:gcp] from comment #4) > I have no idea how you expect me to create a screenshot? > > My original post describes how it should look. Is there some part there that > I can clarify? Yes it does, but I hope you understand, ASCII art isn't that helpful in this case. I just think a screenshot/mock of what you are suggesting would help me (and others) avoid unnecessary confusion and back and forth. I just want to make sure we're talking about the same thing. > The length of the subdomain is irrelevant as it's pushed off > to the left in favor of showing the top level domain entirely. If the > original top level domain is very long, its early part would be pushed off > as well, but there's no ideal solution if the screen size is just too small > to contain the top level domain. Maybe marquee it, but that's perhaps going > a bit far. Why don't we just not display the subdomain at all until the user taps on the Awesomebar then? I.e. https://facebook.com NOT https://www.facebook.com http://myphishingsite.com NOT http://facebook.com.myphishingsite.com > We should not show HTTP or HTTPS, but simply show a lock icon if the > connection is secure. The problem there is that we have another phishing > vulnerability in the Awesombar: bug 1018994. I'd like to not get too prescriptive with a solution right off the bat. I know the favicon creates many issues but I'd like to here why you think we should not show HTTP or HTTPS? In bug 1144430, there's an idea to use that to denote security, since there are way too many states with the Site ID icon (lock, striked out lock, caution sign, yellow sign) that I worry our users understand.
(In reply to Anthony Lam (:antlam) from comment #5) > Yes it does, but I hope you understand, ASCII art isn't that helpful in this > case. I just think a screenshot/mock of what you are suggesting would help > me (and others) avoid unnecessary confusion and back and forth. > > I just want to make sure we're talking about the same thing. I'll exercise my paint skills a bit :-) > Why don't we just not display the subdomain at all until the user taps on > the Awesomebar then? > > I.e. https://facebook.com NOT https://www.facebook.com > http://myphishingsite.com NOT http://facebook.com.myphishingsite.com Hmm, the subdomain is sometimes relevant (bugzilla.mozilla.org, unix.stackexchange.com vs travel.stackexchange.com), so best effort seems preferable. But the extra clarity from your proposal is good, so I'm not sure about this one. How does the competition (specifically, Safari) handle it? I checked Chrome and they show the subdomain in favor of lopping off the last part of the top-level domain (which I think is very bad for phishability, see the screenshot in the original post). > I'd like to not get too prescriptive with a solution right off the bat. I > know the favicon creates many issues but I'd like to here why you think we > should not show HTTP or HTTPS? Because HTTPS or HTTP are meaningless with regards to security. Just because the connection goes over HTTPS doesn't mean it's secure. You need a valid cert (maybe it's an EV one?), no mixed content, etc.
tracking-fennec: ? → 47+
(In reply to Gian-Carlo Pascutto [:gcp] from comment #6) > Hmm, the subdomain is sometimes relevant (bugzilla.mozilla.org, > unix.stackexchange.com vs travel.stackexchange.com), so best effort seems > preferable. But the extra clarity from your proposal is good, so I'm not > sure about this one. I think we should start with just the TLD here then. Next to the Site ID icon, it already provides a good summary of where the user is, and that's what we're looking to provide at a high level. While subdomains are useful and nice to have, I think they're not always necessary, e.g. m.reddit.com. There are also a lot of edge cases (as you've correctly pointed out). I do also worry about phishing, but to keep things simple, maybe we can look at working in subdomains later on down the line. Thoughts? For later --- Perhaps there's also something we can do with certs and the like. For example, Safari shows "Mozilla Foundation" for our sites and "reddit.com" for reddit. I assume the logic looks something like this: Green lock: show name, e.g. Mozilla Foundation. Lock: show TLD, e.g. mozilla.org.
Flags: needinfo?(gpascutto)
Sure. The improvement from either solution would be big enough that the subdomain question is small in comparison. The name of the organization is typically displayed if the site has an EV certificate, which is also what leads to the green lock.
Flags: needinfo?(gpascutto)
Perfect, let's get a build to test how this feels?

Summary:
Display TLD, e.g. [favicon][icon] mozilla.org
*user taps on URL*
Display full URL, e.g. bugzilla.mozilla.org/show_bug.cgi?id=123456
+1 to only showing the top-level domain. The goal here is to know where the page is coming from. If you're on a site like bugzilla.mozilla.org, you don't need to look at the urlbar to figure that out, you'll know it from the site.
Not sure if we need this in a separate bug... but for Search results pages, can we show the search terms then? That could be more useful than "google.com" or "yahoo.com" for example. E.g. "sushi restaurants" would be in the URL bar, even before the user taps the URL bar.
(In reply to Anthony Lam (:antlam) from comment #11) > Not sure if we need this in a separate bug... but for Search results pages, > can we show the search terms then? That could be more useful than > "google.com" or "yahoo.com" for example. > > E.g. "sushi restaurants" would be in the URL bar, even before the user taps > the URL bar. No, let's not do this. This raises the same security concerns as showing a page title (something we removed a while ago).
(In reply to :Margaret Leibovic from comment #12) > No, let's not do this. This raises the same security concerns as showing a > page title (something we removed a while ago). If we know they are search results (and we do for our own engines), then I don't see a security issue with it. But it would break as soon as you go to page 2 of the results etc so there's little benefit. (Didn't you wish a separate bug had been filed for this discussion by now? :-)
Assignee: nobody → s.kaspari
Status: NEW → ASSIGNED
Attached image urlbar-just-topprivatedomain.png (deleted) —
Here are some screenshots of a WIP patch built on top of the no-favicon patch (bug 1018994). I can upload a build for you too. Just using the "top level domain" sounds easy but is actually tricky given the nature of the internet (example.com vs. example.co.uk). To build this patch I used the "Public Suffix List"[1][2] maintained by Mozilla. The guava library has an implementation to use this list (offline, no network calls) [3], and because this library is HUGE I just pulled out some bits and created a minimal helper class. Note that the Public Suffix List contains domains under which people can register a domain. This means it does not just include top level domains but also public suffixes like "blogspot.com" ("Effective Top-Level Domain"). This means:
* mail.google.com -> google.com
* www.somesite.blogspot.com -> somesite.blogspot.com
* bugzilla.mozilla.org -> mozilla.org
The guava library has a nice website describing what public suffixes are: [4]. I think this is not a disadvantage; it's actually rather nice to use the "public suffix" instead of only the TLD. Apart from that there are several other edge/special cases:
* We talked about http(s) URIs, but what about other protocols? I included some in the screenshots. For ftp:// this works nicely because the URL has the same syntax. But things like file:// URIs do not have a "host".
* Do we always want to hide the scheme (http/https/ftp/file/...) or do we sometimes want to show it? Does the user need to know that this is an FTP server and not a regular website?
* What about IPs? I included a screenshot where we just show the IP.
* What about authorities or ports, hide them too?
* How do we want to handle local hosts? This.is.not.a.public.suffix can be a valid domain in my local network.
With the patch the URL bar looks clean and nice. But it also feels a bit odd not to know exactly what page you are on. I wonder if such a change could result in a lot of negative feedback. We should test this carefully.
[1] https://publicsuffix.org/ [2] https://publicsuffix.org/learn/ [3] http://google.github.io/guava/releases/snapshot/api/docs/com/google/common/net/InternetDomainName.html [4] https://github.com/google/guava/wiki/InternetDomainNameExplained
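The public-suffix lookup described in comment #14 could be sketched as below. The hardcoded table stands in for the real Public Suffix List (thousands of entries; Guava's `InternetDomainName` or a dedicated helper does the real work in practice), and the class and method names are hypothetical:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Minimal sketch of base-domain extraction via a public suffix table.
// Algorithm: try suffixes of the host from longest to shortest; the first
// one found on the list is the public suffix, and the base ("top private")
// domain is that suffix plus one extra label to its left.
public class BaseDomain {
    // Stand-in for the Public Suffix List; note "blogspot.com" is a public
    // suffix even though it is not a TLD.
    private static final Set<String> PUBLIC_SUFFIXES = new HashSet<>(
            Arrays.asList("com", "org", "co.uk", "blogspot.com"));

    public static String get(String host) {
        final String[] labels = host.split("\\.");
        for (int i = 0; i < labels.length; i++) {
            final String suffix = String.join(".",
                    Arrays.copyOfRange(labels, i, labels.length));
            if (PUBLIC_SUFFIXES.contains(suffix) && i > 0) {
                return labels[i - 1] + "." + suffix;
            }
        }
        return host; // unknown suffix (e.g. a local hostname): show unchanged
    }
}
```

Because "blogspot.com" is matched before "com", www.somesite.blogspot.com maps to somesite.blogspot.com rather than blogspot.com, which is exactly the "effective TLD" behavior described above.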
Attachment #8714793 - Flags: feedback?(alam)
(In reply to Anthony Lam (:antlam) from comment #7) > Green lock: show name E.g. Mozilla Foundation Btw. I did not work on this yet but I want to create a patch for this too (Pulling and displaying things from the EV certificate).
>I think this is not a disadvantage but actually rather nice to use the "public suffix" instead of only the TLD. I agree; this is also what the desktop awesomebar uses, I assume. > Apart from that there are several other edge/special cases: > * We talked about http(s) uris but what about other protocols? I included > some in the screenshots. For ftp:// this works nicely because the URL has > the same syntax. But things like file:// uris do not have a "host". I don't think phishing is a concern for file:// URIs. We can eliminate HTTP (default) and HTTPS (lock, maybe green or struck out) and show the others. > * Do we always want to hide the scheme (http/https/ftp/file/..) or do we > sometimes want to show it? Does the user need to know that this is now a ftp > server and not a regular website? I think so, it would be really confusing otherwise. Not that non-HTTP matters a lot. > * What about IPs? I included a screenshot where we just show the IP. Just show the IP? Those shouldn't have the usual problems of long subdomains etc. > * What about authorities or ports, hide them too? We should show non-default ports; that's important because they're part of the origin. > * How do we want to handle local hosts. This.is.not.a.public.suffix can be a > valid domain in my local network Doesn't matter, no phishing concerns :-) > With the patch the URL bar looks clean and nice. But it also feels a bit odd > to not exactly know the page you are on. I wonder if such a change could > result in a lot of negative feedback. We should test this carefully. You can tap the Awesomebar to see the full URL, right? And arguably you couldn't really see it before either, due to space constraints. But I understand your point. I get the impression the visual design of our Awesomebar looks a bit odd now that the content is sparse. On the other hand, maybe that's room to cram features into :-)
(In reply to Gian-Carlo Pascutto [:gcp] from comment #16) > Doesn't matter, no phishing concerns :-) Yeah, those questions are mostly for UX than for security. :)
(In reply to Sebastian Kaspari (:sebastian) from comment #14) > Created attachment 8714793 [details] > urlbar-just-topprivatedomain.png > > Here are some screenshots of a WIP patch build on top of the no-favicon > patch (bug 1018994). I can upload a build for you too. > > Just using the "top level domain" sounds easy but is actually something that > is tricky given the nature of the internet (example.com vs. example.co.uk). > To build this patch I used the "Public Suffix List"[1][2] maintained by > Mozilla. The guava library has an implementation to use this list (offline, > no network calls) [3] and because this library is HUGE I just pulled some > bits and created a minimal helper class. > > Note that the Public Suffix List contains domains under which people can > register a domain. This means it does not just include top level domains but > also public suffixes like "blogspot.com" ("Effective Top-Level Domain"). > This means: > * mail.google.com -> google.com > * www.somesite.blogspot.com -> somesite.blogspot.com > * bugzilla.mozilla.org -> mozilla.org > The guava library has a nice website describing what public suffixes are: > [4]. > > I think this is not a disadvantage but actually rather nice to use the > "public suffix" instead of only the TLD. Yeah I think that's a fair point. Thanks for all the added info! > Apart from that there are several other edge/special cases: > * We talked about http(s) uris but what about other protocols? I included > some in the screenshots. For ftp:// this works nicely because the URL has > the same syntax. But things like file:// uris do not have a "host". > * Do we always want to hide the scheme (http/https/ftp/file/..) or do we > sometimes want to show it? Does the user need to know that this is now a ftp > server and not a regular website? When the URL bar is not active: I think we can visually hide http/https because these are probably the most common. 
But ftp and file, for example, we should show, because I feel they're more important in that context (correct me if I'm wrong). > * What about IPs? I included a screenshot where we just show the IP. Given the icon on the left, I don't see a need to show it. We currently don't show http:// anyway, right? > * What about authorities or ports, hide them too? I think we can show ports. There's not really a reason to change what we're doing here. Chrome treats them visually a little differently, but I think we can cover that more in bug 1242624; that's out of scope here. Not sure about authorities. I don't think I understand them enough to prescribe something different to what we're currently doing. > * How do we want to handle local hosts. This.is.not.a.public.suffix can be a > valid domain in my local network I guess anything goes in local networks? But I don't think I understand the security aspect (if any) here enough to prescribe a change to what we're currently doing. > With the patch the URL bar looks clean and nice. But it also feels a bit odd > to not exactly know the page you are on. I wonder if such a change could > result in a lot of negative feedback. We should test this carefully. Yeah, this change will likely invoke some feedback. But I'm not too worried about it. > [1] https://publicsuffix.org/ > [2] https://publicsuffix.org/learn/ > [3] > http://google.github.io/guava/releases/snapshot/api/docs/com/google/common/ > net/InternetDomainName.html > [4] https://github.com/google/guava/wiki/InternetDomainNameExplained
Comment on attachment 8714793 [details] urlbar-just-topprivatedomain.png Awesome! Comments above^ about ftp:// mostly but nothing major :)
Attachment #8714793 - Flags: feedback?(alam) → feedback+
Local hosts are more than 192.168.x.x and 10.x.x.x; they can be any valid URL string. http://kevin http://fileserv.kevin http://video.fileserv.kevin http://video.fileserv.kevin/memes/rickashtley/Never_Gonna_Give_You_Up.webm All could be valid requests on a local network.
I realized that Tab.getBaseDomain() already returns the base domain using the public suffix list (Added by wesj here: bug 857165). This will simplify the patch a lot.
Comment on attachment 8717848 [details] MozReview Request: Bug 1236431 - (Pre) ToolbarDisplayLayout: Address lint warnings and clean up code. r?mcomella https://reviewboard.mozilla.org/r/34347/#review31157 ::: mobile/android/base/java/org/mozilla/gecko/toolbar/ToolbarDisplayLayout.java:408 (Diff revision 1) > - private void updateProgress(Tab tab, EnumSet<UpdateFlags> flags) { > + private void updateProgress(Tab tab) { Add `@Nullable` here. ::: mobile/android/base/java/org/mozilla/gecko/toolbar/ToolbarDisplayLayout.java:438 (Diff revision 1) > - private void updatePageActions(EnumSet<UpdateFlags> flags) { > + private void updatePageActions() { The `flags` argument is only unused after bug 1018994, which makes this feasible. I'd prefer if this change (and the ones it causes) happens in the other bug to make the separation of concerns clearer, but I generally find rewriting commit history for cleanliness ends up being a waste of time so if you don't do it, wfm.
Comment on attachment 8717849 [details] MozReview Request: Bug 1236431 - ToolbarDisplayLayout: Only show base domain. r=mcomella https://reviewboard.mozilla.org/r/34349/#review31159 Cool beans. Did we also decide to remove url syntax highlighting for this bit? I personally think it adds a bit of visual flair to the url bar, but if decided not to do it, wfm.
Attachment #8717849 - Flags: review?(michael.l.comella) → review+
(In reply to Michael Comella (:mcomella) from comment #24) > The `flags` argument is only unused after bug 1018994, which makes this > feasible. > > I'd prefer if this change (and the ones it causes) happens in the other bug > to make the separation of concerns clearer, but I generally find rewriting > commit history for cleanliness ends up being a waste of time so if you don't > do it, wfm. I'll fold those changes into the patch in bug 1018994. (In reply to Michael Comella (:mcomella) from comment #25) > Did we also decide to remove url syntax highlighting for this bit? I > personally think it adds a bit of visual flair to the url bar, but if > decided not to do it, wfm. Yeah, the highlighting was nice but the thing we highlighted was the base domain and that's now the only thing we are showing. We would need to define a different highlighting algorithm to keep the effect.
(In reply to Sebastian Kaspari (:sebastian) from comment #26) > https://treeherder.mozilla.org/#/jobs?repo=try&revision=d39bc748b95c I expect this to break some tests. It seems like I need to at least fix testSessionHistory (we are running assertions on the text in the urlbar).
Comment on attachment 8717849 [details] MozReview Request: Bug 1236431 - ToolbarDisplayLayout: Only show base domain. r=mcomella Review request updated; see interdiff: https://reviewboard.mozilla.org/r/34349/diff/1-2/
Attachment #8717848 - Attachment is obsolete: true
(In reply to Sebastian Kaspari (:sebastian) from comment #28) > It seems like I need to at least fix testSessionHistory > (we are running assertions on the text in the urlbar). This is harder than I anticipated. We use the title (URL) to determine whether the website we expected has been loaded, and while navigating back and forth we observe the title changes. This of course stops working if the title is always just something like "mochi.test" now. So far I have come up with the following alternative approaches, but I'm not really happy with any of them: * Use ToolbarEditLayout instead of ToolbarDisplayLayout, because ToolbarEditLayout always contains the full URL. This seems easy, but since our tests also enter URLs into ToolbarEditLayout, its content is not always an indicator that we loaded something. * Add the full URL as contentDescription to ToolbarDisplayLayout. While we display the base domain, we could set the full URL as contentDescription. This is very nice for the test, but I fear that we might mess with screen readers. It's also worth noting that we already set the contentDescription of BrowserToolbar to the title of ToolbarDisplayLayout, so there would be two, possibly interfering, descriptions. * Instead of searching in views, directly access Tabs.getSelectedTab(). This will always let us inspect the full URL, and that's where the views get their values from anyway. However, this breaks the semi-blackbox UI testing we do a bit: we would no longer just interact with the UI, but poke around in the tab implementation. Michael: What do you think? Do you know a better way? I currently lean toward option 3 because it is the most reliable, but I still have hopes that there's a better solution.
Flags: needinfo?(michael.l.comella)
Comment on attachment 8717849 [details] MozReview Request: Bug 1236431 - ToolbarDisplayLayout: Only show base domain. r=mcomella Review request updated; see interdiff: https://reviewboard.mozilla.org/r/34349/diff/2-3/
Comment on attachment 8718342 [details] MozReview Request: Bug 1236431 - (Pre) ToolbarDisplayLayout: Address lint warnings and clean up code. r=mcomella I messed this up the first time. This is the previous patch but minus the flag changes that are now part of the patch in bug 1018994.
Attachment #8718342 - Flags: review?(michael.l.comella) → review+
Perhaps I've been too idealistic, but I'm a strong proponent of blackbox UI testing here: we shouldn't be validating anything the user does not have the opportunity to see/interact with because the end result is that we're testing that the UI does not regress for the user. If we tie our tests too closely to the implementation details, we're likely to break things with refactors and such. (In reply to Sebastian Kaspari (:sebastian) from comment #30) > This seems to be harder than I anticipated. We use the title (url) to > determine if the website we expected has been loaded. And while navigating > back and forth we observe the title changes. Are we trying to determine that the page has finished loading or just that the title from the page has been correctly updated? If it's the former, we probably weren't testing a good metric in the first place. :P Using the blackbox theory above, ideally we'd be scanning the page content for the changes, rather than inspecting the title. But given that we don't have full access to the page content since it's in Gecko land, it's pretty challenging! Since we were just trying to verify that the title changed before, I'm going to assume that's what we're trying to do here too. Some of my own ideas: * One part of the API that isn't likely to change are those "Content:PageShow"-style Gecko messages, including one for title update – if we can access Gecko, maybe we can latch onto them. If we can't add a Gecko listener directly, we could try to add a Java listener to BrowserApp because I think there is a similar title update broadcast * We could also be terrible and inject a module into BrowserToolbar in testing that will display the full url for these example pages. > * Use ToolbarEditLayout instead of ToolbarDisplayLayout because > ToolbarEditLayout always contains the full URL. I disagree with this one – ToolbarEditLayout isn't guaranteed to store the current site if the implementation decides to change (e.g. 
user clicks toolbar, enters new text, hits back, ToolbarEditLayout still contains the newly entered text rather than the current site). > * Add the full URL as contentDescription to ToolbarDisplayLayout. While we > display the base domain, we could set the full URL as contentDescription. > This is very nice for the test but I fear that we might mess with screen > readers. It's also worth noting that we already set the contentDescription > of BrowserToolbar to the title of ToolbarDisplayLayout. So there would be > two, maybe interfering, descriptions. So what do we want a screen reader to say here? Previously we would read out the full url but with this change, afaict, we're updating that to the abridged url. Is one more useful for users than the other? If the full url is more useful, we can use this field as our test and this is actually my favorite suggested approach (including the ones I suggested) because it tests precisely what the users are interacting with. We should also verify that the abridged url we show on screen is a subset of the content of this larger url too. > * Instead of searching in views directly access Tabs.getSelectedTab(). This > will always let us inspect the full URL and that's where the views get their > values from too. However this breaks a bit this kind of semi-blackbox UI > testing we do: With that we would not just interact with the UI anymore but > poke around in the tab implementation. I'd expect that this approach is somewhat robust – the API isn't likely to change – but there are some issues. When does the selected tab get updated? When does the title of the selected tab get updated? We can throw in some wait helpers to try to get the correct values, but the implementation may make some assumptions about how the object gets used – e.g. 
we call `Tabs.getSelectedTab` and wait for the title to change in the returned instance but instead, `Tabs.getSelectedTab` might change to return a new Tab instance each time the title is updated so we're waiting for an old, invalid reference to change the title, which it never will. A more robust approach would be to wait on a "title update" listener (especially if it's used in the main source code) since the contract is much clearer than it is for calling directly into the code. > Michael: What do you think about that? Do you know a better way? Currently I > tend to option 3 because it is the most reliable option but I still have > hopes that there's a better solution. As such, my top choices (in order of preference) are: * ContentDescription * Waiting for title-update broadcast * Touching Tabs.getSelectedTab directly
Flags: needinfo?(michael.l.comella)
Comment on attachment 8718342 [details] MozReview Request: Bug 1236431 - (Pre) ToolbarDisplayLayout: Address lint warnings and clean up code. r=mcomella Review request updated; see interdiff: https://reviewboard.mozilla.org/r/34531/diff/1-2/
Attachment #8718342 - Attachment description: MozReview Request: Bug 1236431 - (Pre) ToolbarDisplayLayout: Address lint warnings and clean up code. r?mcomella → MozReview Request: Bug 1236431 - (Pre) ToolbarDisplayLayout: Address lint warnings and clean up code. r=mcomella
Attachment #8718342 - Flags: review+ → review?(michael.l.comella)
Comment on attachment 8717849 [details] MozReview Request: Bug 1236431 - ToolbarDisplayLayout: Only show base domain. r=mcomella Review request updated; see interdiff: https://reviewboard.mozilla.org/r/34349/diff/3-4/
Attachment #8717849 - Attachment description: MozReview Request: Bug 1236431 - ToolbarDisplayLayout: Only show base domain. r?mcomella → MozReview Request: Bug 1236431 - ToolbarDisplayLayout: Only show base domain. r=mcomella
Attachment #8718342 - Flags: review?(michael.l.comella) → review+
The contentDescription approach is working fine. I tested this with TalkBack and it won't interfere: the whole toolbar is focused by the screen reader and the shortened domain is read. ToolbarDisplayLayout itself is not focusable and its contentDescription will not be read -> we can grab the contentDescription from there.
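The pattern settled on here (display only the base domain, but carry the full URL in the contentDescription so tests can query it) can be modeled in plain Java; `ToolbarModel` is a hypothetical stand-in, as the real code lives in ToolbarDisplayLayout and uses Android's View.setContentDescription:

```java
// Plain-Java model of the testing pattern: the visible title shows only the
// base domain, while the contentDescription carries the full URL. Tests can
// then assert on the exact URL without it being shown to the user.
// Hypothetical class for illustration, not Fennec code.
public class ToolbarModel {
    private String title;               // what the user sees
    private String contentDescription;  // what tests (and TalkBack) can query

    public void updateFromUrl(String fullUrl, String baseDomain) {
        title = baseDomain;
        // Not rendered on screen; kept so tests can verify the loaded page.
        contentDescription = fullUrl;
    }

    public String getTitle() { return title; }
    public String getContentDescription() { return contentDescription; }
}
```

A test would then compare getContentDescription() against the expected full URL, and can additionally check that the visible title is a substring of it, which is the "abridged url is a subset" check suggested in comment #31.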
Comment on attachment 8718342 [details] MozReview Request: Bug 1236431 - (Pre) ToolbarDisplayLayout: Address lint warnings and clean up code. r=mcomella https://reviewboard.mozilla.org/r/34531/#review31811 rb tells me I still have to review this one.
Attachment #8718342 - Flags: review+
Comment on attachment 8719540 [details] MozReview Request: Bug 1236431 - Use contentDescription to verify URL in tests. r=mcomella https://reviewboard.mozilla.org/r/34991/#review31819 Nice. ::: mobile/android/base/java/org/mozilla/gecko/toolbar/ToolbarDisplayLayout.java:262 (Diff revision 1) > + setContentDescription(strippedURL); Can you add a comment explaining what we're doing here? Be sure to mention this value is not actually seen by users (for the reasons you mentioned in your comment). ::: mobile/android/tests/browser/robocop/src/org/mozilla/gecko/tests/BaseTest.java:504 (Diff revision 1) > + if (urlDisplayLayout != null) { It's my understanding that if `urlDisplayLayout == null`, we'll throw because `!actualUrl.equals(expected)` since `actualUrl == null`. I'd rather this be explicit, i.e.: `assertNotNull("message", urlDisplayLayout);` ::: mobile/android/tests/browser/robocop/src/org/mozilla/gecko/tests/components/ToolbarComponent.java:66 (Diff revision 1) > - fAssertEquals("The Toolbar title is " + expected, expected, getTitle()); > + fAssertEquals("The Toolbar title is " + expected, expected, getUrlFromContentDescription()); Add a comment explaining why we're using the content description rather than the title. ::: mobile/android/tests/browser/robocop/src/org/mozilla/gecko/tests/components/ToolbarComponent.java:154 (Diff revision 1) > + CharSequence contentDescription = getUrlDisplayLayout().getContentDescription(); nit: `final`
Attachment #8719540 - Flags: review?(michael.l.comella) → review+
Comment on attachment 8718342 [details] MozReview Request: Bug 1236431 - (Pre) ToolbarDisplayLayout: Address lint warnings and clean up code. r=mcomella Review request updated; see interdiff: https://reviewboard.mozilla.org/r/34531/diff/2-3/
Attachment #8718342 - Flags: review+
Comment on attachment 8717849 [details] MozReview Request: Bug 1236431 - ToolbarDisplayLayout: Only show base domain. r=mcomella Review request updated; see interdiff: https://reviewboard.mozilla.org/r/34349/diff/4-5/
Attachment #8719540 - Attachment description: MozReview Request: Bug 1236431 - Use contentDescription to verify URL in tests. r?mcomella → MozReview Request: Bug 1236431 - Use contentDescription to verify URL in tests. r=mcomella
Comment on attachment 8719540 [details] MozReview Request: Bug 1236431 - Use contentDescription to verify URL in tests. r=mcomella Review request updated; see interdiff: https://reviewboard.mozilla.org/r/34991/diff/1-2/
https://hg.mozilla.org/integration/fx-team/rev/7945d91d718db79ff98e7e61b407d7c10dca0244 Bug 1236431 - (Pre) ToolbarDisplayLayout: Address lint warnings and clean up code. r=mcomella https://hg.mozilla.org/integration/fx-team/rev/e4e3639c5f670ccabc353d31d457e3ac9bd5becc Bug 1236431 - ToolbarDisplayLayout: Only show base domain. r=mcomella https://hg.mozilla.org/integration/fx-team/rev/1ae9062a16d4907c0313ce4cb5092a177e2fe31d Bug 1236431 - Use contentDescription to verify URL in tests. r=mcomella
Status: ASSIGNED → RESOLVED
Closed: 9 years ago
Resolution: --- → FIXED
Target Milestone: --- → Firefox 47
Depends on: 1249593
Depends on: 1249594
Ugh, can this rather disturbing change (at least with using the tablet UI but not limited to that) to a primary UI item be made pref-able, please? Thanks!
No longer blocks: 1243953
Depends on: 1243953
Depends on: 1250671
Release Note Request (optional, but appreciated)
[Why is this notable]:
[Suggested wording]: Show main domain (origin) for long URLs in address bar only (to prevent phishing attacks)
[Links (documentation, blog post, etc)]:
relnote-firefox: --- → ?
Depends on: 1262344
Depends on: 1271698
Hi all, I will take this feature as QA. Here is the test plan the testing is based on: https://wiki.mozilla.org/QA/Fennec/Show_main_domain_%28origin%29_for_long_URLs_in_address_bar_only
QA Contact: sorina.florean
(In reply to Sorina Florean [:sorina] from comment #52) > Hi all, I will take this feature as a QA. Here is the Test plan based on > which the testing is made: > https://wiki.mozilla.org/QA/Fennec/ > Show_main_domain_%28origin%29_for_long_URLs_in_address_bar_only Note that we are going to disable this feature for now (See bug 1268753).
(In reply to Sebastian Kaspari (:sebastian) from comment #53) > Note that we are going to disable this feature for now (See bug 1268753). Thank you, I've put myself to CC in Bug 1268753.
Depends on: 1269832
Product: Firefox for Android → Firefox for Android Graveyard