Closed
Bug 492459
Opened 16 years ago
Closed 14 years ago
[regression] Websites are no longer rendered if SSL requests for JavaScripts are blocked
Categories
(Core :: Networking: HTTP, defect)
Tracking
()
RESOLVED DUPLICATE of bug 561536
People
(Reporter: heiko, Unassigned)
References
Details
(Keywords: regression, Whiteboard: Regression in Firefox 3.0.9->3.0.10)
Attachments
(1 file)
(deleted), text/html
User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.10) Gecko/2009042809 GranParadiso/3.0.10
Build Identifier: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.10) Gecko/2009042809 GranParadiso/3.0.10
Websites like sourceforge.net, ubuntu.com, and many others, especially commercial ones, which use some of these annoying trackers like etracker.de or google-analytics.com that don't care about privacy, can't be loaded and used anymore with Firefox 3.0.10 in conjunction with Privoxy 3.0.12 and with JavaScript activated. These trackers seem to have switched from http to https lately.
If such a website is loaded, Firefox tries to load these tracker sites and doesn't time out. This has the effect that the actual website is not loaded completely. This happens, among others, on the login page of sourceforge.net. If I have JavaScript and Privoxy enabled, the login page of sourceforge.net is no longer loaded. Instead, Firefox hangs on connecting to google-analytics.com. Logging into sourceforge.net is only possible with JavaScript disabled.
This only happens since Firefox 3.0.10. With Firefox 3.0.9 and other browsers like Opera and Midori, these sites are loaded completely and shown as expected.
See also my bug report for Privoxy:
https://sourceforge.net/tracker/?func=detail&atid=460288&aid=2790091&group_id=11118
Reproducible: Always
Steps to Reproduce:
1. Start Privoxy.
2. Start Firefox, set the proxy for HTTP and SSL to 127.0.0.1:8118 and activate JavaScript
3. Go to http://sourceforge.net and click on the "Log in" link.
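For anyone who wants to reproduce this without installing Privoxy, the blocking behavior can be faked in a few lines of Python (a sketch; `run_blocking_proxy` is a hypothetical helper, not part of Privoxy or Squid): it answers any CONNECT request with the same 403 that triggers the hang.

```python
# Minimal stand-in for a filtering proxy: it answers a CONNECT request
# with 403 Forbidden, which is what Privoxy/Squid do for blocked SSL hosts.
# Hypothetical helper for reproducing the hang without a real proxy.
import socket
import threading

def run_blocking_proxy(host="127.0.0.1", port=0):
    """Accept one connection, refuse its CONNECT with a 403, then exit."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)

    def serve():
        conn, _ = srv.accept()
        request = conn.recv(4096)  # e.g. b"CONNECT ssl.google-analytics.com:443 HTTP/1.1"
        if request.startswith(b"CONNECT"):
            conn.sendall(b"HTTP/1.1 403 Forbidden\r\n"
                         b"Content-Length: 0\r\n\r\n")
        conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()  # (host, port) to enter as the SSL proxy
```

Pointing the browser's SSL proxy setting at the returned address and loading a page that references an https script should show the described stall on the affected versions.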
Comment 1•15 years ago
Duplicate of https://bugzilla.mozilla.org/show_bug.cgi?id=494893
Comment 2•15 years ago
I'm experiencing this problem too. Here's another example (I originally posted this to a support forum after 3.0.10 was released):
I have confirmed that the following is a new problem in 3.0.10 (3.0.9 works correctly). 3.0.11 continues to exhibit the problem.
I have a Squid proxy server that blocks connections to certain web sites. It seems that if Squid returns a 403 response to Firefox while it is loading an inline object over an SSL connection, Firefox will soft-hang (it stops making progress on loading the page, but other tabs work fine). Version 3.0.9 seems to be just fine with the 403: it ignores the object and moves on, completing the page rendering just fine. Here's a sample line from my Squid log where this behavior is shown in Firefox:
TCP_DENIED/403 2305 CONNECT ssl.google-analytics.com:443 - NONE/- text/html
Firefox shows "Waiting for ssl.google-analytics.com..." in the lower left corner indefinitely (no apparent timeouts occur).
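For reference, the fields of that log sample can be unpacked like this (a sketch assuming Squid's default log format, with the leading timestamp and client-address columns already trimmed as in the sample; `parse_squid_line` is a hypothetical helper):

```python
# Split the trimmed Squid access.log sample above into named fields.
# Sketch only: assumes the default "squid" logformat with the leading
# timestamp and client-address columns removed.
def parse_squid_line(line):
    parts = line.split()
    result_code, status = parts[0].split("/")  # TCP_DENIED / 403
    return {
        "result_code": result_code,   # how Squid handled the request
        "http_status": int(status),   # status code returned to the client
        "bytes": int(parts[1]),       # reply size sent to the client
        "method": parts[2],           # CONNECT for SSL tunnels
        "url": parts[3],              # host:port the browser tried to reach
    }

entry = parse_squid_line(
    "TCP_DENIED/403 2305 CONNECT ssl.google-analytics.com:443 - NONE/- text/html")
# entry["http_status"] == 403 and entry["method"] == "CONNECT": a denied SSL tunnel
```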
The HTML that seems to trigger the bug is:
<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
Again, this problem only manifests if the connection is made as a result of some inline action (maybe not all such actions) in the web page itself. If the connection is made via the web URL bar, Firefox behaves normally.
Unfortunately I've only seen this issue on secure shopping types of web sites so I cannot provide a public URL that actually shows the problem.
This issue happens on both the Windows and Linux versions of Firefox 3.0.10/11.
Comment 3•15 years ago
Heiko, could you please change the summary to something like:
"Websites are no longer rendered if SSL requests for JavaScripts are blocked by the proxy. Reproducible with at least Privoxy and Squid."
From the current summary it isn't clear that it's a Firefox
problem and I assume the reason why the status is still
UNCONFIRMED is that no Firefox developer has been interested
enough to read the report yet.
Reporter
Updated•15 years ago
Summary: Sites using trackers aren't loaded completely anymore, when using Privoxy → Websites are no longer rendered if SSL requests for JavaScripts are blocked by a proxy. Reproducible with at least Privoxy and Squid.
Version: unspecified → 3.5 Branch
Reporter
Comment 4•15 years ago
Done. Hope this helps, because this bug is pretty annoying.
Reporter
Comment 5•15 years ago
There's more evidence that this is in fact a Firefox bug.
This doesn't happen with Opera 9.64 or with Firefox versions prior to 3.5. Opera and earlier Firefox versions load and render SSL websites completely and correctly if particular JavaScripts are blocked by a proxy like Privoxy.
If this isn't fixed, Firefox will become unusable for me in the medium to long term.
Reporter
Updated•15 years ago
Summary: Websites are no longer rendered if SSL requests for JavaScripts are blocked by a proxy. Reproducible with at least Privoxy and Squid. → Websites are no longer rendered if SSL requests for JavaScripts are blocked by a proxy. Reproducible with at least Privoxy and Squid. Opera and Firefox 3.0.x are rendering those websites correctly.
Comment 6•15 years ago
I agree strongly with the need to fix this, but I'd like to disagree with the statement that versions prior to 3.5 work correctly. Versions prior to 3.0.10 work correctly based on my testing. (I make this point in case it helps accelerate finding and fixing the bug.)
Reporter
Updated•15 years ago
Reporter
Comment 7•15 years ago
I'm really wondering when this bug will be assigned and fixed.
It's so annoying that Firefox has become unusable. For my part, I've switched back to Opera. Opera is working perfectly.
Updated•15 years ago
Component: General → Networking
Product: Firefox → Core
QA Contact: general → networking
Version: 3.5 Branch → 1.9.2 Branch
Comment 10•15 years ago
Interestingly, with an SSL proxy enabled, FF 3 does not finish loading if you click on "Details".
Comment 11•15 years ago
We are having the same problem when trying to load a script from a non-existent domain such as:
https://ssl.noexist.example.com/script.js
Only a portion of the page will load when using an SSL proxy. If the proxy is turned off for SSL, the page renders fine, ignoring the missing script.
So this appears to affect not only blocked scripts, but also scripts with an erroneous URL.
Comment 12•15 years ago
CONFIRMing
Compare bug 544979.
From title: Reproducible with at least Privoxy and Squid.
Privoxy bug:
<https://sourceforge.net/tracker/index.php?func=detail&aid=2790091&group_id=11118&atid=460288>
Privoxy 3.0.16 works around it by returning an empty page and 200 OK instead of blocking requests. (That doesn't change the fact that it's a bug in Firefox, though.)
Status: UNCONFIRMED → NEW
Component: Networking → HTML: Parser
Ever confirmed: true
Keywords: regression
QA Contact: networking → parser
Summary: Websites are no longer rendered if SSL requests for JavaScripts are blocked by a proxy. Reproducible with at least Privoxy and Squid. Opera and Firefox 3.0.x are rendering those websites correctly. → [regression] Websites are no longer rendered if SSL requests for JavaScripts are blocked
Whiteboard: Regression in Firefox 3.0.9->3.0.10
Comment 13•15 years ago
Linked: Compare bug 544979
Comment 14•15 years ago
Privoxy's workaround is documented at:
http://www.privoxy.org/user-manual/config.html#HANDLE-AS-EMPTY-DOC-RETURNS-OK
Returning an empty page when blocking requests for which the client appears to expect JavaScript is done to prevent syntax-error warnings, and should be unrelated to this Firefox regression:
http://config.privoxy.org/user-manual/actions-file.html#HANDLE-AS-EMPTY-DOCUMENT
Given that changing the HTTP status code is enough to work around
the problem, I'd rather blame "Networking: HTTP" than "HTML: Parser".
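The difference between the hard block and Privoxy's workaround boils down to which reply the proxy sends for a blocked request. Sketched below with a hypothetical `proxy_reply` helper (not Privoxy's actual code):

```python
# Sketch of the two proxy behaviors: a hard block answers 403 Forbidden
# (which stalls the affected Firefox versions), while the
# handle-as-empty-doc workaround answers 200 OK with an empty body
# (which lets rendering finish). Hypothetical helper, not Privoxy code.
def proxy_reply(blocked, empty_doc_workaround):
    if not blocked:
        return None  # pass the request through unchanged
    if empty_doc_workaround:
        return (b"HTTP/1.1 200 OK\r\n"
                b"Content-Type: text/javascript\r\n"
                b"Content-Length: 0\r\n\r\n")
    return b"HTTP/1.1 403 Forbidden\r\nContent-Length: 0\r\n\r\n"
```

That only the status line differs between the hanging and the working case is what points at HTTP handling rather than the parser.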
Comment 15•15 years ago
> I'd rather blame "Networking: HTTP" than "HTML: Parser"
The network lib just reports the status code upwards (and does so properly, as you can see e.g. in XMLHttpRequest). It's the HTML parser / engine that does not continue.
Probably it simply doesn't handle 'connection refused' and HTTP 4xx errors correctly and decides to stall and wait indefinitely. That's clearly faulty, because a failed fetch is never going to finish.
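The stall described here can be pictured as broken pending-subresource bookkeeping. The toy model below (hypothetical names, not Gecko code) shows the correct rule: every terminal fetch outcome, success or error, must clear the script from the pending set, or the page never finishes.

```python
# Toy model of a page's pending-script bookkeeping (not Gecko code).
# The bug behaves as if errors never cleared the pending entry; the
# correct rule is that any terminal outcome -- 2xx, 4xx/5xx, or a
# connection-level failure -- ends the wait for that script.
class PageLoad:
    def __init__(self, scripts):
        self.pending = set(scripts)

    def on_fetch_result(self, url, status):
        """status is an int HTTP code, or None for a network error."""
        self.pending.discard(url)  # every result is terminal

    def is_complete(self):
        return not self.pending

page = PageLoad(["https://ssl.google-analytics.com/ga.js", "/app.js"])
page.on_fetch_result("/app.js", 200)                                 # success
page.on_fetch_result("https://ssl.google-analytics.com/ga.js", 403)  # blocked by proxy
# page.is_complete() -> True: the 403 ends the wait instead of stalling
```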
Comment 16•14 years ago
You're probably right.
This would also explain why the status line continues to display "Waiting for host ..." even if Firefox no longer has any connections open.
Comment 17•14 years ago
Based on comment 2, this seems a lot like bug 561536. Can anyone who sees this please check a current nightly of mozilla-central, 1.9.2, or 1.9.1 (or all three) to see whether those fix the issue for them?
> The network lib just reports the status code upwards
Are you sure? In the cases when this bug is encountered does it do that?
One other thing that would be helpful here is an HTTP log from anyone who can reproduce the problem (ideally of as small a testcase as possible).
Component: HTML: Parser → Networking: HTTP
QA Contact: parser → networking.http
Comment 18•14 years ago
Indeed, this seems to have fixed it. The description also sounds like a DUP, so marking this bug a dup.
Anybody who saw this, please also try for yourself, with a build from
http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/latest-trunk/
and report results here.
Status: NEW → RESOLVED
Closed: 14 years ago
Resolution: --- → DUPLICATE
Comment 19•14 years ago
Fix confirmed, Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.3a5pre) Gecko/20100518 Minefield/3.7a5pre