Closed Bug 116445
Opened 23 years ago
Closed 23 years ago
gzip files expanded when downloaded via "save link as"
Categories
(Core Graveyard :: File Handling, defect, P2)
Tracking
(Not tracked)
Status: VERIFIED FIXED
Target Milestone: mozilla0.9.8
People
(Reporter: darin.moz, Assigned: bugs)
Details
(Keywords: dataloss)
Attachments
(1 file)
patch (deleted)
Description
Go to http://ftp.mozilla.org/pub/mozilla/nightly/latest/ and right-click on a
.tar.gz file. Select "save link as" and notice that the downloaded file has
been decompressed. This happens because the code invoking the download has not
called nsIHttpChannel::SetApplyConversion(PR_FALSE) as it should when
downloading files. See nsStreamTransfer.cpp:152 for an example of how this works.
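In other words, the idea is simply to flip the apply-conversion flag on the HTTP
channel before the download stream is consumed. A minimal JS sketch, assuming a
"channel" variable holding the nsIChannel behind the "save link as" request
(illustrative only, not the actual nsStreamTransfer.cpp code):

  // hypothetical: "channel" is the nsIChannel created for the save
  var httpChannel = channel.QueryInterface(Components.interfaces.nsIHttpChannel);
  // ask necko for the raw (still gzip-compressed) entity body instead of
  // decoding it per the Content-Encoding response header
  httpChannel.applyConversion = false;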
Comment 3•23 years ago
I also noticed that when I use a proxy, the downloaded file is both
ungzipped and TRUNCATED. Is there a bug on that?
Comment 4•23 years ago
Japanese users reported that this bug reproduces on MacOS 9.2 (build
2001122408) and MacOS X (0.9.7). Platform and OS should be All/All.
Updated•23 years ago
Comment 5•23 years ago
This sounds like a bug in WebBrowserPersist itself... Adding dataloss since the
file is truncated.
Assignee: law → adamlock
Keywords: dataloss
Updated•23 years ago
QA Contact: sairuh → tever
Comment 6•23 years ago
The xpfe should be specifying PERSIST_FLAGS_NO_CONVERSION as a flag to the
webbrowserpersist object. See bug 108268
Assignee: adamlock → ben
Comment 7•23 years ago
Nominating since this should be a simple fix for a dataloss bug.... Just add a
flag setting at
http://lxr.mozilla.org/seamonkey/source/embedding/components/ui/progressDlg/nsProgressDlg.js#382
(right before the try/catch).
+ webBrowserPersist.persistFlags |=
+ Components.interfaces.nsIWebBrowserPersist.PERSIST_FLAGS_NO_CONVERSION;
(if I recall the syntax for getting to interface constants correctly).
I'd attach a patch if I had a tree...
Keywords: mozilla0.9.8
Comment 9•23 years ago
*** Bug 118041 has been marked as a duplicate of this bug. ***
Comment 10•23 years ago
*** Bug 118320 has been marked as a duplicate of this bug. ***
Assignee
Updated•23 years ago
Status: NEW → ASSIGNED
Priority: -- → P2
Target Milestone: --- → mozilla0.9.8
Assignee
Comment 11•23 years ago
Comment 12•23 years ago
That specifies "no conversion" all the time, right?
I think there might be a complication when saving a web page served up by a
server that is doing content-encoding. There have always been lurking bugs
about that (or similar) situation and we should try to squash that now. I'll
have to go track down the discussion.
Maybe we just turn off encoding for non-text types? Or only turn it off for
application/zip, etc.?
Comment 13•23 years ago
One thing is that Apache will serve up gzipped files as
Content-Type: application/x-gzip (or application/gzip?)
Content-Encoding: gzip
which is, of course, wrong.... We should try to handle that case as we fix this.
:(
Reporter
Comment 14•23 years ago
bz: That case is already handled internally by necko. All we need to do is
ensure that necko is instructed not to apply content conversions. See the top
of nsHttpChannel::ProcessNormal for the code that handles Apache.
Comment 15•23 years ago
Darin, would the "no conversion ever" patch do the right thing then? If not,
under what conditions should the "no conversion" flag be set such that
gzip-encoded HTML is gunzipped but regular gzip files are not?
Reporter
Comment 16•23 years ago
If that's the behavior you want, then you need to check the Content-Type and
Content-Encoding headers. Telling necko to not apply content conversions means
that necko will simply ignore the Content-Encoding header. By default necko
honors a Content-Encoding header. You can override this behavior in your
OnStartRequest handler.
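A rough sketch of what such an onStartRequest check could look like (the header
names come from this discussion, but the listener skeleton and the "already an
archive" type test are only an illustration, not code from the tree):

  onStartRequest: function (request, context) {
    var channel = request.QueryInterface(Components.interfaces.nsIHttpChannel);
    var encoding = "";
    try {
      encoding = channel.getResponseHeader("Content-Encoding");
    } catch (e) {
      // no Content-Encoding header, so there is nothing to decode anyway
    }
    // If the document itself is an archive type (the Apache misconfiguration
    // case), keep the bytes exactly as served; otherwise let necko decode,
    // so gzip-encoded HTML still comes out as readable HTML.
    var isArchive =
        /^application\/(x-)?(gzip|compress|zip)$/.test(channel.contentType);
    channel.applyConversion = (encoding != "") && !isArchive;
  },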
Comment 17•23 years ago
Darin,
How does this interact with the cache? Is the content in the cache unzipped, or
compressed? If we ask for the data from the cache with no content encoding,
might we still get it uncompressed if it is fetched from the cache?
Reporter
Comment 18•23 years ago
The content in the cache is decoded per any transfer encodings specified by a
Transfer-Encoding header and encoded per any content encodings specified by a
Content-Encoding header.
This means that if a server sends a document with Content-Encoding: gzip, the
document in the cache will be compressed.
You have actually described a bug that existed in the old days... the compressed
state of a saved file used to depend on how it was originally fetched. Now, we
avoid any problems like that by writing the data to the cache before (possibly)
decoding an encoded document.
Comment 19•23 years ago
What is the relation between this and bug 35956?
Would it help people working on this to compare?
Comment 20•23 years ago
The checkin for bug 110135 made PERSIST_FLAGS_NO_CONVERSION a default persist
flag so it would be worth retesting now.
Even so, it's best for the client (xpfe) to be explicit and set the flags it
wants to use unless it doesn't care how the persist object saves something.
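For example, a caller that wants raw bytes written to disk could make that
explicit up front rather than relying on the default (a minimal sketch; the real
callers live in xpfe, e.g. the progress dialog code linked in comment 7):

  var persist = Components.classes["@mozilla.org/embedding/browser/nsWebBrowserPersist;1"]
                          .createInstance(Components.interfaces.nsIWebBrowserPersist);
  // Be explicit: saved data must not be re-encoded or decoded on its way to disk.
  persist.persistFlags |=
      Components.interfaces.nsIWebBrowserPersist.PERSIST_FLAGS_NO_CONVERSION;
  // ...then call persist.saveURI(...) or persist.saveDocument(...) as usual.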
Assignee
Comment 21•23 years ago
Fixed. Patch w/analysis in bug 120174.
Status: ASSIGNED → RESOLVED
Closed: 23 years ago
Resolution: --- → FIXED
Comment 22•23 years ago
Works OK in 0.9.8. Does anyone want to mark this VERIFIED, or close it?
BTW, I do understand that "Content-encoding: gzip" is wrong.
Do you guys know what is right? ": data"?
Comment 23•23 years ago
Verified that we now never unzip, even for files that we should. I'll
address that in another bug...
Status: RESOLVED → VERIFIED
Updated•8 years ago
Product: Core → Core Graveyard