Bug 478837 - Failed HTTP parsing with SSL, gzip and keep-alive
Component: Core :: Networking: HTTP (defect)
Status: VERIFIED DUPLICATE of bug 363109
Reporter: ezyang; Assignee: Unassigned
Opened 16 years ago; closed 16 years ago
Attachments: 1 file (text/plain, deleted)
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009020911 Ubuntu/8.10 (intrepid) Firefox/3.0.6 (.NET CLR 3.5.30729)
Build Identifier: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009020911 Ubuntu/8.10 (intrepid) Firefox/3.0.6 (.NET CLR 3.5.30729)
How to reproduce:
1. Navigate to the wiki in question. Make sure Keep-alive is enabled (you will have to allow the certificate, as it is self-signed)
2. Iterate through the links on the left, especially Main Page, Community Portal and Current Events.
Expected results:
Nothing
Actual results:
After ~20 clicks, a download box appears for the page.
Video of the bug in action:
http://web.mit.edu/quentin/Public/scripts-ssl-bug.mov
We've done some extensive debugging on this issue, and here's what we've found:
1. Turning off Keep-Alive in Firefox gets rid of the error
2. Turning off SSL gets rid of the error (we CAN reproduce with a null cipher, however)
3. Turning off Gzip gets rid of the error
Inspecting the page within Firefox suggests that a gzip header was improperly placed before the HTTP headers (you see the gzip header, and then a bunch of HTTP headers in plaintext). However, looking at the network traces (we can provide instructions on how to get these, and we will also post a few as attachments, since SSL makes it difficult to look into the network packets), there does not seem to be anything abnormal about the packets.
The downloaded files tend to consist of a gzip header, some garbage text, and then HTTP headers (followed by the compressed page contents, with the real gzip header). The garbage text does not appear in any obvious way in the network traces, and there is no indication that the two-byte gzip magic sequence is sent in a packet to Firefox.
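For anyone wanting to confirm the shape of one of these downloads, here is a minimal check (a Python sketch; the file name is just an example, substitute one of the saved pages):

# Read one of the mis-downloaded pages (example file name).
data = open("index.php", "rb").read()

# A gzip stream starts with the two-byte magic 1F 8B.
print("starts with gzip magic:", data[:2] == b"\x1f\x8b")

# In the bad downloads, plaintext HTTP headers appear *after* that bogus
# gzip prefix rather than before it.
print("HTTP status line at byte offset:", data.find(b"HTTP/1.1"))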
We've been able to reproduce on Linux, Mac OS X and Windows. We have not been able to reproduce with other browsers. We can reproduce on Mozilla 1.7.13.
Some server-side notes: it has been somewhat difficult to create a simpler test script that triggers this bug. MediaWiki triggers it very regularly, but if we switch from php-cgi to mod_php, the bug goes away. We've investigated this being an issue in Apache or PHP, but the lack of egregious errors in the network traces has led us to file a bug here. We have been able to reproduce by putting our server under very high load with a simple test script, but we suspect that is a different bug. It does not seem to be tied to a particular server, and can be triggered on both virtual and physical servers.
We believe this bug has existed for over four years, and whenever we ask someone about this bug, they say, "Oh, we've seen that at some point, but never reported it."
Reproducible: Sometimes
Comment 1•16 years ago
I was able to reproduce the problem on my local box running OS X. It does not happen when using Safari, at least not during my latest tests. I'll attach such a gzipped response.
Boris, is there a way to create an HTTPS log?
Status: UNCONFIRMED → NEW
Ever confirmed: true
OS: Linux → All
Hardware: x86 → All
Version: unspecified → 1.9.0 Branch
Comment 2•16 years ago
Comment 3 (Reporter)•16 years ago
Since our packet dumps are too big to attach, I've posted them here:
http://web.mit.edu/~ezyang/Public/ssl-bug/
Some notes about the files:
index.php contains one of the standard files that is downloaded when the error
occurs.
dump.xml is the full PDML file of all HTTPS traffic during the packet dump.
output2.txt is a binary/text file of the packets in human-readable form.
marked.xml is dump.xml, but filtered for requests that failed. They were marked
during testing using X-Identifier, and are inside dump.xml, so you should be
able to get any contextual packets that were not included in marked.xml (as it
only contains packets that contained HTML headers).
marked2.txt contains marked.xml in human-readable form.
We can send you the actual pcap file privately if requested. These network
traces were done using a null cipher, so that wireshark was able to decrypt
them.
The X-Identifiers corresponding to manifestations of the bug are:
X-Identifier: 1234841957
X-Identifier: 1234841968
X-Identifier: 1234841984
X-Identifier: 1234841996
X-Identifier: 1234842021
X-Identifier: 1234842035
X-Identifier: 1234842054
X-Identifier: 1234842082
X-Identifier: 1234842097
X-Identifier: 1234842108
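A rough way to locate these marked requests in dump.xml (a Python sketch that just scans the PDML as text, assuming the X-Identifier header strings survive the export verbatim):

identifiers = [
    "1234841957", "1234841968", "1234841984", "1234841996", "1234842021",
    "1234842035", "1234842054", "1234842082", "1234842097", "1234842108",
]

# Plain-text scan of the PDML export for each marked request.
with open("dump.xml", errors="replace") as f:
    pdml = f.read()

for ident in identifiers:
    needle = "X-Identifier: " + ident
    print(needle, "found" if needle in pdml else "not found")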
Comment 4 (Reporter)•16 years ago
We've managed to reproduce on a vanilla Debian Etch install, with:
* Apache 2
* PHP (cgi)
* SSL
* MediaWiki
And with zlib compression enabled via the following php.ini setting:
zlib.output_compression = On
The new test server we set up is here: https://q.xvm.mit.edu/mediawiki
Comment 5 (Reporter)•16 years ago
At the request of hskupin, I've posted http://web.mit.edu/~ezyang/Public/ssl-bug/http-log.txt, which is the output of the latest Firefox nightly with HTTP logging enabled.
Comment 6 (Reporter)•16 years ago
One last thing: we've managed to reproduce with mod_php as well.
Comment 7•16 years ago
Jason, want to look into this?
Comment 8•16 years ago
It fails here:
1246733424[8374d10]: looks like a HTTP/0.9 response
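For context: roughly, that HTTP/0.9 fallback kicks in whenever the first bytes of a response do not look like a status line. A simplified illustration (a Python sketch, not the actual Necko code):

def looks_like_http_09(first_bytes):
    # Real HTTP/1.x responses begin with a status line such as
    # b"HTTP/1.1 200 OK".  Anything else (for example stray gzip data
    # left over from the previous response) has to be treated as a
    # headerless HTTP/0.9 body and handed straight to the consumer,
    # which in Firefox surfaces as a download dialog.
    return not first_bytes.lstrip().startswith(b"HTTP/")

print(looks_like_http_09(b"HTTP/1.1 200 OK\r\n"))  # False
print(looks_like_http_09(b"\x1f\x8b\x08\x00"))     # True: gzip magic, not a status line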
Component: Networking → Networking: HTTP
QA Contact: networking → networking.http
Comment 9•16 years ago
Right. You'll get that any time data's misplaced in the stream...
We had some similar issues with pipelining and broken servers, but this doesn't seem like the same problem.
Comment 10•16 years ago
This seems to be the same problem as in bug #363109. At the beginning of attachment #362682 there is again a gzipped "nothing":
1F8B080000000000000302000000FFFF03000000000000000000
I.e. the server sends an unexpected body. See https://bugzilla.mozilla.org/show_bug.cgi?id=363109#c12 for more details.
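For reference, those bytes really do decode to nothing; a quick check (Python, hex string copied from above):

import gzip

raw = bytes.fromhex(
    "1F8B080000000000000302000000FFFF03000000000000000000"
)
# A complete, valid gzip stream whose decompressed payload is empty.
print(gzip.decompress(raw))  # b''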
Comment 11•16 years ago
Ah, indeed. Should this just be marked duplicate, or dependent?
Edward, does fixing your server make this work ok?
Comment 12•16 years ago
(In reply to comment #11)
> Ah, indeed. Should this just be marked duplicate, or dependent?
Yes, it is a duplicate. I was searching backwards from the 0.9 response in http://web.mit.edu/~ezyang/Public/ssl-bug/http-log.txt, and the preceding response on that connection was "304 Not Modified" (line 15521). So this is exactly the same bug.
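To spell the failure mode out (an illustration only, not Necko's actual code): a 304 is defined to have no body, so the client consumes nothing after the 304's headers. If the server nevertheless appends the gzipped "nothing" from comment 10, those bytes are left sitting at the front of the keep-alive connection and become the start of the next response:

GZIPPED_NOTHING = bytes.fromhex(
    "1F8B080000000000000302000000FFFF03000000000000000000"
)

# What the misbehaving server puts on the wire for two responses that
# share one keep-alive connection.
wire = (
    b"HTTP/1.1 304 Not Modified\r\n"
    b"Content-Encoding: gzip\r\n"
    b"\r\n"
    + GZIPPED_NOTHING            # illegal: a 304 must not carry a body
    + b"HTTP/1.1 200 OK\r\n"
      b"Content-Encoding: gzip\r\n"
      b"\r\n"
      b"...gzipped page..."
)

# The client parses the 304 headers and, correctly, reads zero body bytes.
headers, _, leftover = wire.partition(b"\r\n\r\n")

# The next response on the connection now appears to start with the stray
# gzip bytes instead of "HTTP/", so it "looks like a HTTP/0.9 response".
print(leftover[:4])                    # b'\x1f\x8b\x08\x00'
print(leftover.startswith(b"HTTP/"))   # False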
Status: NEW → RESOLVED
Closed: 16 years ago
Resolution: --- → DUPLICATE
Updated•16 years ago
Status: RESOLVED → VERIFIED