Closed Bug 59464 Opened 24 years ago Closed 24 years ago
Any Transfer-Encoding header is interpreted as "chunked"
Categories: Core :: Networking, defect, P3
Tracking: Target Milestone: Future
People: Reporter: bob+mozilla, Assigned: darin.moz
When the Transfer-Encoding header is set to 'gzip', mozilla does not decompress
the content. Web pages show up blank, and once I crashed mozilla by loading a
'Transfer-Encoding: gzip' page. When using File->Save As, mozilla reports error
"Unknown Error [1 80004005]".
The value of 'gzip' for header Transfer-Encoding is defined by rfc2616
(HTTP/1.1) in sections 3.5 and 3.6. This may be a way to unambiguously fix the
gzip/Content-Encoding mess (see bug #35956) when the compression is provided by
a proxy or server (and not inherent in the original file). But of course, it
has to work first. ;)
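For illustration only (this is not Mozilla code), here is a minimal Python sketch of what a compliant recipient would do with the transfer-codings described above: undo the codings in reverse order of application, so a body sent as "gzip, chunked" is de-chunked first and then gunzipped. The function names and the simplified chunk parser (no trailers, no chunk extensions) are my own.

```python
import gzip

def dechunk(raw):
    """Remove HTTP/1.1 chunked framing (simplified: no trailers/extensions)."""
    body, pos = b"", 0
    while True:
        crlf = raw.index(b"\r\n", pos)
        size = int(raw[pos:crlf], 16)       # chunk-size line is hex
        if size == 0:
            return body
        body += raw[crlf + 2:crlf + 2 + size]
        pos = crlf + 2 + size + 2           # skip chunk data and trailing CRLF

def decode_transfer_encoding(raw, codings):
    """Undo transfer-codings listed in application order, e.g. 'gzip, chunked'."""
    for coding in reversed([c.strip().lower() for c in codings.split(",")]):
        if coding == "chunked":
            raw = dechunk(raw)
        elif coding == "gzip":
            raw = gzip.decompress(raw)
        elif coding in ("identity", ""):
            pass
        else:
            raise ValueError("unsupported transfer-coding: " + coding)
    return raw

# Example: a body that the sender gzipped, then chunked.
payload = gzip.compress(b"hello world")
chunked = hex(len(payload))[2:].encode() + b"\r\n" + payload + b"\r\n0\r\n\r\n"
print(decode_transfer_encoding(chunked, "gzip, chunked"))  # -> b'hello world'
```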
Comment 1 • 24 years ago
Reporter, is this still a problem in the latest nightlies?
Comment 2 (Reporter) • 24 years ago
Yes. Trying with build 2000121411... Mozilla now displays the binary
uncompressed gzip data (where previously it would display a blank page).
Also save-as no longer reports an error. The file created by save-as
can be decompressed successfully by gzip. Here are the headers generated
by my proxy when loading http://slashdot.org/index.pl:
Cache-Control: private
Connection: keep-alive
Date: Sat, 16 Dec 2000 20:38:28 GMT
Pragma: no-cache
Transfer-Encoding: chunked,gzip
Server: Apache/1.3.12 (Unix) mod_perl/1.24
Content-Length: 15529
Content-Type: text/html
Client-Date: Sat, 16 Dec 2000 20:38:19 GMT
Client-Peer: 64.28.67.48:80
Keep-Alive: 300
Proxy-Connection: keep-alive
It looks like Mozilla is properly decoding the 'chunked' portion of the
Transfer-Encoding header, but completely ignoring the gzip portion (note
order of keywords doesn't matter -- I've tried both). No error messages
are printed on stdout when I load a page. Only:
Document http://slashdot.org/index.pl loaded successfully
When using save-as, the following is printed on stdout:
we don't handle eBorderStyle_close yet... please fix me
moo!moo!we don't handle eBorderStyle_close yet... please fix me
result native path = /home/mcelrath/index.pl
we don't handle eBorderStyle_close yet... please fix me
************************************************************
* Call to xpconnect wrapped JSObject produced this error: *
[Exception... "[Exception... "Illegal value" code: "-2147024809"
nsresult: "0x80070057 (NS_ERROR_ILLEGAL_VALUE)" location:
"chrome://global/content/downloadProgress.js Line: 70"]
[nsIObserver::Observe]" nsresult: "0x8057001c
(NS_ERROR_XPC_JS_THREW_JS_OBJECT)" location: "<unknown>" data: no]
************************************************************
You can grab my proxy at http://draal.physics.wisc.edu/FilterProxy/. Within the
'Compress' module is an option to use the Transfer-Encoding header instead of
the Content-Encoding header.
Comment 4 • 24 years ago
Reporter, I think the bug is in your proxy.
RFC 2616 says in 14.39 TE:
If the TE field-value is empty or if no TE field is present, the only
transfer-coding is "chunked".
AFAIK, Moz never sends any TE header. Thus it does not have to be aware of any
other transfer-coding than chunked.
But you could file an RFE for support for the TE header (haven't searched if
something similar already exists).
A related bug is bug 68414.
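The RFC 2616 §14.39 rule quoted above can be sketched in a few lines (again, not Mozilla code; the function name and the decision to ignore qvalues are my own simplifications): without a TE header, "chunked" is the only transfer-coding a client is obliged to handle.

```python
def acceptable_transfer_codings(te_header=None):
    """RFC 2616 14.39: if the TE field-value is empty or no TE field is
    present, the only transfer-coding is "chunked". Simplified sketch:
    qvalues are ignored, and "trailers" is not itself a transfer-coding."""
    codings = {"chunked"}  # always acceptable
    if te_header:
        for item in te_header.split(","):
            token = item.split(";")[0].strip().lower()
            if token and token != "trailers":
                codings.add(token)
    return codings

print(acceptable_transfer_codings(None))                    # just chunked
print(acceptable_transfer_codings("gzip, deflate;q=0.5"))   # chunked + listed codings
```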
Comment 5 • 24 years ago
This bug is not completely invalid.
The least Moz should do in this case is give an appropriate error
message instead of displaying a blank page or something cryptic like
"Unknown Error [1 80004005]".
Comment 6 (Reporter) • 24 years ago
I wanted to get Transfer-Encoding to work because Content-Encoding is ambiguous
(it is not possible to tell if the encoding is inherent in the original file, or
added by a server or proxy). My proxy only implements it because I wanted to
test it. ;) I mention my proxy for the purpose of testing Mozilla in this
circumstance. (Transfer-Encoding is disabled by default) 'Content-Encoding:
gzip' is not technically correct, but is currently the only way for a proxy to
get any browser to recognize a compressed file, for the purpose of increasing
transfer speed.
I think your reading of rfc2616 is correct, and proxies should look for the TE
header (which mozilla currently does not send, nor does any browser I have ever
seen). I agree with the comment by Andreas Schneider, attached to bug 68414,
"TE should always be set to accept all compression formats Moz supports". This,
if implemented by proxies, could solve the problems mentioned in bug 35956,
assuming Mozilla *removes* encoding applied with the Transfer-Encoding header
and *preserves* encoding applied with the Content-Encoding header. Currently,
much software uses Content-Encoding to achieve both, resulting in ambiguity.
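The remove-vs-preserve distinction drawn above can be made concrete with a small sketch (my own illustration, including the hypothetical `body_to_save` helper): transfer-codings are hop-by-hop and must be undone by the recipient, while content-codings describe the entity itself and should survive Save As.

```python
import gzip

def body_to_save(body, headers):
    """Undo the transfer-coding but preserve the content-coding.
    Sketch only: handles just gzip, with pre-lowercased header names."""
    te = headers.get("transfer-encoding", "").lower()
    if "gzip" in te:
        body = gzip.decompress(body)  # hop-by-hop coding: remove it
    # "content-encoding" is deliberately left alone: a file that is
    # inherently gzipped should still be gzipped after Save As.
    return body

original = gzip.compress(b"inherently gzipped entity")  # Content-Encoding: gzip
on_wire = gzip.compress(original)                       # plus Transfer-Encoding: gzip
saved = body_to_save(on_wire, {"transfer-encoding": "gzip",
                               "content-encoding": "gzip"})
print(saved == original)  # -> True: only the transfer-coding was removed
```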
Comment 7 • 24 years ago
I filed bug 68517 "[RFE] improved support for HTTP/1.1 transfer codings".
But I suggest we keep this bug for issues with (unwanted) received transfer
codings.
Is this bug a problem with Content-Length?
Bob's proxy should not send one, but Mozilla should ignore it.
See RFC 2616, 4.4 Message Length
( http://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.4 ):
Messages MUST NOT include both a Content-Length header field and a
non-identity transfer-coding. If the message does include a non-
identity transfer-coding, the Content-Length MUST be ignored.
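The §4.4 precedence rule quoted above amounts to a short decision function. A minimal sketch (my own names; it ignores the HEAD/1xx/204/304 and multipart/byteranges cases):

```python
def message_length_strategy(headers):
    """Decide how to find the end of an HTTP/1.1 response body per
    RFC 2616 4.4. `headers` is a dict with lowercase field names."""
    te = headers.get("transfer-encoding", "").lower()
    if te and te != "identity":
        # A non-identity transfer-coding wins; Content-Length MUST be ignored.
        return "transfer-encoding"
    if "content-length" in headers:
        return "content-length"
    return "close"  # read until the connection closes

# The proxy headers in comment 2 carried both Transfer-Encoding and
# Content-Length; the transfer-coding takes precedence:
print(message_length_strategy({"transfer-encoding": "chunked,gzip",
                               "content-length": "15529"}))  # -> transfer-encoding
```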
Blocks: 68517
Comment 8 (Reporter) • 24 years ago
I have amended my proxy to not send Content-Length when there is a
Transfer-Encoding, and Mozilla still does not decode the content. (Content is
received and the binary gzip file is displayed.) So presumably this isn't a bug
with Content-Length.
I released 0.27 of FilterProxy (http://draal.physics.wisc.edu/FilterProxy/) this
morning with the fix that it looks for the TE header, and only sends
Transfer-Encoding if TE was present in the request. Ask me if you want a patch
that removes Content-Length (this fix will be in the next release though).
Comment 10 • 24 years ago
I'm not familiar with the code, but I looked at
nsHTTPServerListener::OnDataAvailable() in
http://lxr.mozilla.org/seamonkey/source/netwerk/protocol/http/src/nsHTTPResponseListener.cpp .
If I got it right, whenever we see a Transfer-Encoding header field,
we treat it as if it were "Transfer-Encoding: chunked". That would be very bad:
it would break even pages with "Transfer-Encoding: identity".
Comment 11 • 24 years ago
Yes, when I send
--8<----
HTTP/1.1 200 OK
Transfer-Encoding: identity
test
--8<----
to Mozilla, I get nothing but an occasional crash.
Same for empty "Transfer-Encoding:".
Adjusting summary.
Summary: Transfer-Encoding: gzip not working (not implemented?) → Any Transfer-Encoding header is interpreted as "chunked"
Updated (Assignee) • 24 years ago
Target Milestone: --- → Future
Comment 12 • 24 years ago
is bug 78065 a dup?
Comment 13 (Assignee) • 24 years ago
Status: NEW → RESOLVED
Closed: 24 years ago
Resolution: --- → DUPLICATE
Comment 14 (Assignee) • 24 years ago
Let me elaborate on what was fixed: we now check to make sure that "chunked"
is set in the Transfer-Encoding header before installing a chunked decoder.
We do not support any other Transfer-Encoding, which is consistent with the
fact that we don't send a TE header.
So I think we can safely close this bug.
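The fixed behaviour described in comment 14 can be sketched as a simple token check (my own Python rendering of the logic, not Mozilla's C++): install the chunked decoder only when "chunked" actually appears among the comma-separated codings, not merely because a Transfer-Encoding header exists.

```python
def should_install_chunked_decoder(transfer_encoding):
    """True only if "chunked" is one of the listed transfer-codings.
    An empty value, "identity", or "gzip" alone must not trigger it."""
    tokens = [t.strip().lower() for t in transfer_encoding.split(",")]
    return "chunked" in tokens

print(should_install_chunked_decoder("chunked"))       # -> True
print(should_install_chunked_decoder("identity"))      # -> False
print(should_install_chunked_decoder(""))              # -> False
print(should_install_chunked_decoder("chunked,gzip"))  # -> True
```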