Open Bug 401846 Opened 17 years ago Updated 2 years ago

Resume fails with "Download could not be saved, because the source file could not be read"

Categories

(Core :: Networking: HTTP, defect, P5)


Tracking

()

People

(Reporter: stephend, Unassigned)

References

()

Details

(Keywords: uiwanted, Whiteboard: [necko-would-take])

Attachments

(4 files, 1 obsolete file)

Build ID: Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en-US; rv:1.9a9pre) Gecko/2007102720 Minefield/3.0a9pre

Summary: "Download could not be saved, because the source file could not be read" error on Retry

Steps to Reproduce:
1. Load http://files.filefront.com/Call+of+Duty+4+Modern+Warfare+Demo/;8774322;/fileinfo.html
2. Click "Download Now"
3. Wait a few seconds, click Pause
4. Now click Resume

Expected Results: Download resumes

Actual Results: See screenshot (Note: if you resume again after the warning, we continue...)
Flags: blocking-firefox3?
Summary: Download could not be saved, because the source file could not be read → "Download could not be saved, because the source file could not be read" error on Retry
Is this because the server doesn't support resume? If that's the case, I think we want to just restart from the beginning, or if we can pre-detect that, maybe not even offer pause ...
Flags: blocking-firefox3? → blocking-firefox3+
(In reply to comment #2)
> Is this because the server doesn't support resume? If that's the case, I think
> we want to just restart from the beginning, or if we can pre-detect that, maybe
> not even offer pause ...

We do actually restart from the beginning (after we throw up the error), which makes me think that they really don't support resume; if the pre-detection is feasible, I'm happy to morph the bug to cover that.
Seems like the server is giving an entityID so we think it's resumable, but it doesn't follow up for some reason. I haven't looked into the response headers. I suppose we could fix this by using the ResumeRetry method that I added for auto-resume downloads.
Target Milestone: --- → Firefox 3 M11
That sounds like a good plan. :)
Priority: -- → P3
Firefox needs to display in the download manager whether resume is supported or not! Has anyone filed such a bug? As it is right now, there is no way of knowing whether you can pause the download without having to restart the whole thing. It could work like this: "Resume Supported" in green text, "Resume Not Supported" in red text. Or you could display all the status text in red when resume is not supported and in green when it is. I would prefer the first option, mainly because beginners would understand it better. Yes, you are right: if you cannot resume, the download manager should start over from the beginning.
I did not see such a bug filed, so I filed bug 406418, "Download Manager doesn't display if Resume is supported on server".
If the server has us thinking it's not resumable, we default back to the old pause which doesn't actually break the connection. However, if the server lies and doesn't actually support resume, we do the new resume anyway and ideally restart when we find out it doesn't actually support resume.
I suppose it might be possible for the download manager to try pausing/resuming the download at the very beginning (basically when we first get the entity id) to see if the server *really* supports resuming. Hopefully there aren't too many servers that use "one time" links that stop accepting requests a second time. Alternatively, a separate connection could be started, as if resuming a separate download, to see if that works... but then again, hopefully the server isn't limiting connections and we aren't maxed out on outgoing connections to that server to begin with...
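A minimal sketch of what such a probe request could look like, assuming a plain HTTP/1.1 GET for the first byte only (the helper name and the exact range are illustrative, not existing necko code):

#include <string>

// Ask for a single byte up front. A server that really supports resuming
// should answer 206 Partial Content with a Content-Range header; a plain
// 200 means it ignored the Range header and the entityID should not be
// trusted for pause/resume.
std::string BuildResumeProbeRequest(const std::string& host,
                                    const std::string& path) {
    return "GET " + path + " HTTP/1.1\r\n"
           "Host: " + host + "\r\n"
           "Range: bytes=0-0\r\n"
           "Connection: close\r\n"
           "\r\n";
}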
ResumeRetry doesn't help here because the action of resuming succeeds. It's that when the async reopen to resume the file happens, the server throws up its hands because it lied about being able to resume. We'll need to detect that we resumed and retry when we get a status change of failing to read from source.
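The bookkeeping described here could be as small as the following (a sketch with invented names; the real change would live in the download manager):

#include <cstdint>

// Hypothetical per-download state: remember that the last thing we did was
// resume, so a subsequent "source file could not be read" failure triggers a
// silent restart from the beginning instead of the error dialog.
struct DownloadState {
    bool resumedSinceLastData = false;   // set when Resume() is issued
    std::uint32_t restartAttempts = 0;
};

enum class FailureAction { ShowError, RestartFromScratch };

FailureAction OnSourceReadFailure(DownloadState& dl, std::uint32_t maxRestarts) {
    if (dl.resumedSinceLastData && dl.restartAttempts < maxRestarts) {
        ++dl.restartAttempts;
        dl.resumedSinceLastData = false;
        return FailureAction::RestartFromScratch;  // the server lied about resume
    }
    return FailureAction::ShowError;
}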
Edward - would you be willing to take this? You seem to know exactly how to fix it :)
Whiteboard: [needs assignee]
Edward?
(In reply to comment #4)
> I suppose we could fix this by using the ResumeRetry method that I added for
> auto-resume downloads.

This doesn't even work, because ResumeRetry also calls dl->Resume(). It checks the entityID and immediately starts a RealResume, which will fail.
http://bonsai.mozilla.org/cvsblame.cgi?file=/mozilla/toolkit/components/downloads/src/nsDownloadManager.cpp&rev=1.154&#139
http://bonsai.mozilla.org/cvsblame.cgi?file=/mozilla/toolkit/components/downloads/src/nsDownloadManager.cpp&rev=1.154&#2429

While running the debugger I noticed that the entityID contains a bad value, while inside downloads.sqlite there is no value:

(gdb) frame 0
#0  nsACString_internal::IsEmpty (this=0x41e49310) at ../../../../dist/include/string/nsTSubstring.h:181
181  in ../../../../dist/include/string/nsTSubstring.h
(gdb) p mData
$3 = 0x422a7e28 "/1473748992/"

For a real resumable download it looks like:

(gdb) p mData
$5 = 0x3d9683d8 "%227d968-282a1000-8526b040%22/673845248/Wed, 02 Jan 2008 13:47:37 GMT"

The EntityID only gets a value with the size of the download because no ETag is available:
http://bonsai.mozilla.org/cvsblame.cgi?file=mozilla/netwerk/protocol/http/src/nsHttpChannel.cpp&rev=1.324&#4713

Which parts do we have to rely on to build up the EntityID? Do we need etag, size, and lastmod? Then we should add a check that etag and lastmod aren't null.
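For reference, the two entityID values above follow the pattern escaped-ETag, then size, then Last-Modified, joined by slashes; a standalone illustration (plain std::string, not the actual nsHttpChannel code) of why a missing ETag and Last-Modified leaves only the size:

#include <cstdint>
#include <string>

// Mimics the "<escaped etag>/<size>/<last-modified>" shape seen in the gdb
// output above. With no ETag and no Last-Modified the result degenerates to
// something like "/1473748992/", which carries no validator at all, so we
// cannot tell whether the bytes on the server are still the ones we started
// downloading.
std::string BuildEntityID(const std::string& escapedEtag,
                          std::int64_t contentLength,
                          const std::string& lastModified) {
    return escapedEtag + "/" + std::to_string(contentLength) + "/" + lastModified;
}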
I can come up with a patch if someone can answer my last question.
Component: Download Manager → Networking: HTTP
Flags: blocking-firefox3+
OS: Mac OS X → All
Product: Firefox → Core
QA Contact: download.manager → networking.http
Hardware: PC → All
Target Milestone: Firefox 3 M11 → mozilla1.9 M11
Re-asking for blocking 1.9 due to the change of product.
Flags: blocking1.9?
See comment #10 about why ResumeRetry won't help us. We'll need to keep track that we tried to resume and then get a failure status change.
(In reply to comment #13)
> Which parts do we have to rely on to build up the EntityID? Do we need etag,
> size, and lastmod? Then we should add a check that etag and lastmod aren't null.

If we can accurately determine from that whether it's resumable or not, we could fix up the IsResumable function. Fixing the title just to clarify that this is about resume, not retry.
Summary: "Download could not be saved, because the source file could not be read" error on Retry → Resume fails with "Download could not be saved, because the source file could not be read"
(In reply to comment #17)
> If we can accurately determine from that whether it's resumable or not, we
> could fix up the IsResumable function.

Why is IsResumable a member of nsHttpResponseHead? Don't we need a response head to be able to resume a download? Wouldn't it be better as a member of nsHttpChannel? Then we could do something like:

if (!IsResumable())
    return NS_ERROR_NOT_RESUMABLE;

instead of:

if (!mResponseHead || (mResponseHead && !mResponseHead->IsResumable()))
    return NS_ERROR_NOT_RESUMABLE;
To summarize:

1. We can call IsResumable from within GetEntityID to know if the server supports resuming.
2. Shouldn't IsResumable be a member of nsHttpChannel?
3. IsResumable also misses the check for "mRequestHead.Method() != nsHttp::Get". Do we allow other request methods now? Or should it be fixed within IsResumable?
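A rough sketch of the shape being proposed here, with stand-in types for nsHttpChannel and nsHttpResponseHead (illustrative only, not the real necko class layout):

#include <string>

// Stand-ins for the real necko types, just to show where the check could live.
struct ResponseHead {
    std::string etag;
    std::string lastModified;
    bool acceptsByteRanges = false;  // "Accept-Ranges: bytes"
};

struct HttpChannelSketch {
    const ResponseHead* mResponseHead = nullptr;  // may be null early on
    bool mIsGet = true;  // stands in for mRequestHead.Method() == nsHttp::Get

    // A channel-level IsResumable() folds the null check, the request-method
    // check and the header checks into one place, so callers can simply write
    // `if (!IsResumable()) return NS_ERROR_NOT_RESUMABLE;`.
    bool IsResumable() const {
        if (!mResponseHead || !mIsGet)
            return false;
        if (!mResponseHead->acceptsByteRanges)
            return false;
        // require at least one validator so a changed file can be detected
        return !mResponseHead->etag.empty() || !mResponseHead->lastModified.empty();
    }
};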
Attached patch Patch v1 (without touching IsResumable) (obsolete) (deleted) — Splinter Review
With the patch we only build an EntityID if the server definitely supports resuming. I haven't touched the IsResumable function for now. If I should do that, I could come up with a follow-up patch. IMO it would be nice, so we don't have to take care of all the conditions in different places.
Assignee: nobody → hskupin
Status: NEW → ASSIGNED
Attachment #296814 - Flags: review?(darin.moz)
We shouldn't disallow resuming just because Last-Modified and ETag are missing. It could still work. Does anyone have an HTTP log?
(In reply to comment #21)
> We shouldn't disallow resuming just because Last-Modified and ETag are missing.
> It could still work.

Based on which headers can we make sure that the file hasn't changed in the meantime? With the above file we only have the file size.
Urg, the server doesn't send a Content-Range header :/ That violates RFC 2616 section 10.2.7.

As for your question, I don't think it's likely that the content would change but the file size stays the same. Perhaps we can notify the user that we aren't sure the file hasn't changed. I don't think we should flat out refuse to resume in that case.
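For reference, the Content-Range value on a 206 has the form "bytes <first>-<last>/<total>"; a small standalone check (illustrative names, not necko code) for whether a resume response actually honored the range might look like:

#include <cstdio>
#include <string>

// Returns true only for a 206 response whose Content-Range header value is
// syntactically sane, e.g. "bytes 645135-1473748991/1473748992". The server
// in this bug answers 206 without any Content-Range at all, so this would
// return false and the resume could be abandoned cleanly.
bool HasValidContentRange(int status, const std::string& contentRangeValue) {
    if (status != 206)
        return false;
    long long first = -1, last = -1, total = -1;
    if (std::sscanf(contentRangeValue.c_str(), "bytes %lld-%lld/%lld",
                    &first, &last, &total) != 3)
        return false;
    return first >= 0 && last >= first && total > last;
}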
I tried to check what wget does when resuming such a stopped download. It's interesting because it ends with a 403: Access Forbidden:

$ wget -c http://38.118.213.174/89ycy0nvxf+/pub2/Call_of_Duty_4_Modern_Warfare/Official_Demos/CoD4MWDemoSetup.exe/X6
--23:30:21--  http://38.118.213.174/89ycy0nvxf+/pub2/Call_of_Duty_4_Modern_Warfare/Official_Demos/CoD4MWDemoSetup.exe/X6
           => `X6'
Connecting to 38.118.213.174:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1,473,748,992 (1.4G) [application/octet-stream]
 0% [ ] 645,135      246.57K/s
^C

$ wget -c http://38.118.213.174/89ycy0nvxf+/pub2/Call_of_Duty_4_Modern_Warfare/Official_Demos/CoD4MWDemoSetup.exe/X6
--23:30:56--  http://38.118.213.174/89ycy0nvxf+/pub2/Call_of_Duty_4_Modern_Warfare/Official_Demos/CoD4MWDemoSetup.exe/X6
           => `X6'
Connecting to 38.118.213.174:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Retrying.

--23:30:57--  http://38.118.213.174/89ycy0nvxf+/pub2/Call_of_Duty_4_Modern_Warfare/Official_Demos/CoD4MWDemoSetup.exe/X6 (try: 2)
           => `X6'
Connecting to 38.118.213.174:80... connected.
HTTP request sent, awaiting response... 403 Access Forbidden
23:30:58 ERROR 403: Access Forbidden.

No idea if the server is really broken...
could you try with wget -S -c http://... as well? (to show the headers)
Sure. Here is the output when resuming, with the response headers shown:

--09:30:59--  http://38.118.213.174/89ycy0nvxf+/pub2/Call_of_Duty_4_Modern_Warfare/Official_Demos/CoD4MWDemoSetup.exe/X6
           => `X6'
HTTP request sent, awaiting response...
 HTTP/1.1 206 Partial Content
 Server: FileFront-DistSRV/2.73
 Connection: close
 Accept-Ranges: bytes
 Content-transfer-encoding: binary
 Content-length: 1472398257
 Content-Type: application/octet-stream
Retrying.

--09:31:00--  http://38.118.213.174/89ycy0nvxf+/pub2/Call_of_Duty_4_Modern_Warfare/Official_Demos/CoD4MWDemoSetup.exe/X6 (try: 2)
           => `X6'
Connecting to 38.118.213.174:80... connected.
HTTP request sent, awaiting response...
 HTTP/1.1 403 Access Forbidden
 Connection: close
 Content-Type: text/html
 Accept-Ranges: bytes
09:31:00 ERROR 403: Access Forbidden.
Comment on attachment 296814 [details] [diff] [review]
Patch v1 (without touching IsResumable)

For now, using IsResumable doesn't seem to be the right way.
Attachment #296814 - Attachment is obsolete: true
Attachment #296814 - Flags: review?(darin.moz)
Given that we are not sure whether this is a server or client problem, moving off the nomination list.
Flags: blocking1.9? → blocking1.9-
Summary: Resume fails with "Download could not be saved, because the source file could not be read" → Resume fails with "Download could not be saved, because the source file could not be read" (post-submit)
Summary: Resume fails with "Download could not be saved, because the source file could not be read" (post-submit) → Resume fails with "Download could not be saved, because the source file could not be read"
status update?
Shawn, in comment 28 you can already see that wget isn't able to resume the download due to a 403. I did a further test with the latest nightly from behind a proxy and got the following HTTP log:

5712[c4efb0]: max hang time exceeded!
5712[c4efb0]: nsHttpConnectionMgr::ProcessPendingQ [ci=P.proxy.xxx.de:8000]
5712[c4efb0]: STS dispatch [4241c90]
5712[c4efb0]: nsSocketInputStream::Read [this=3bfd100 count=4096]
5712[c4efb0]:   calling PR_Read [count=4096]
5712[c4efb0]:   PR_Read returned [n=1460]
5712[c4efb0]: nsSocketTransport::SendStatus [this=3bfd068 status=804b0006]
5712[c4efb0]: nsHttpTransaction::OnSocketStatus [this=2f36e40 status=804b0006 progress=56583]
5712[c4efb0]: nsHttpTransaction::ProcessData [this=2f36e40 count=1460]
5712[c4efb0]: nsHttpTransaction::ParseHead [count=1460]
5712[c4efb0]: nsHttpTransaction::ParseLine [HTTP/1.0 403 Forbidden]
5712[c4efb0]: nsHttpResponseHead::ParseVersion [version=HTTP/1.0 403 Forbidden]
5712[c4efb0]: Have status line [version=10 status=403 statusText=Forbidden]
5712[c4efb0]: nsHttpTransaction::ParseLine [Content-Type: text/html]
5712[c4efb0]: ParseContentType [type=text/html]
5712[c4efb0]: nsHttpTransaction::ParseLine [Accept-Ranges: bytes]
5712[c4efb0]: nsHttpTransaction::ParseLine [X-Cache: MISS from okd-proxy1.fzk.de]
5712[c4efb0]: nsHttpTransaction::ParseLine [Proxy-Connection: close]
5712[c4efb0]: nsHttpTransaction::HandleContent [this=2f36e40 count=1324]
5712[c4efb0]: nsHttpTransaction::HandleContentStart [this=2f36e40]
5712[c4efb0]: http response [
5712[c4efb0]: HTTP/1.0 403 Forbidden
5712[c4efb0]: Content-Type: text/html
5712[c4efb0]: Accept-Ranges: bytes
5712[c4efb0]: X-Cache: MISS from proxy.xxx.de
5712[c4efb0]: Proxy-Connection: close
5712[c4efb0]: ]

Seems that it is a server-side problem. I'll try to have another look at it when I'm at home later.
Won't make beta3; punting to beta4, at least.
Target Milestone: mozilla1.9beta3 → mozilla1.9beta4
Shawn, what we have to do here is show a better error message, something like "download cannot be resumed". Is such a message already in use? If not, I think it will be problematic due to the l10n freeze.
Whiteboard: [needs assignee]
Do we need a new string? "source file could not be read" isn't totally useful, but it's not totally wrong... and only certain troublesome sites should be causing this issue.
The decision is up to you. I'll hold off until I'm told which way Mozilla prefers.
Putting this on UE's radar then.
Keywords: uiwanted
As long as there is no decision, I'm reassigning this bug to the default assignee.
Assignee: hskupin → nobody
Status: ASSIGNED → NEW
Target Milestone: mozilla1.9beta4 → ---
Oh snap. Well, this missed the string freeze for Firefox 3 anyway, so we are hosed here... :(
At the QA workshop we found another cause for this error, although it doesn't look like it's related to this. We might want to spin it off. If you pause a download, and clear your download location of part files and/or the partial download, resume will display the same error about the source file not being present. Maybe we should check to make sure files created before the pause are present first and if not, restart the whole thing.
(In reply to comment #41)
> At the QA workshop we found another cause for this error, although it doesn't
> look like it's related to this. We might want to spin it off. If you pause a
> download, and clear your download location of part files and/or the partial
> download, resume will display the same error about the source file not being
> present. Maybe we should check to make sure files created before the pause are
> present first and if not, restart the whole thing.

New bug please - request blocking 3.0.1 and 3.1 please.
(In reply to comment #42)
<snip>
> New bug please - request blocking 3.0.1 and 3.1 please.

I filed bug 452461.
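A sketch of the pre-resume check proposed in comment 41, using std::filesystem (C++17); the function name and the idea of comparing against the bytes already transferred are assumptions, not what the download manager currently does:

#include <cstdint>
#include <filesystem>
#include <system_error>

// Before resuming, verify that the partially downloaded file is still where
// we left it and still the size we expect; if the user deleted or truncated
// it, restart from scratch instead of failing with "source file could not
// be read".
bool PartialFileStillPresent(const std::filesystem::path& target,
                             std::uintmax_t expectedBytesSoFar) {
    std::error_code ec;
    const std::uintmax_t size = std::filesystem::file_size(target, ec);
    return !ec && size == expectedBytesSoFar;
}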
I believe the error message is misleading anyway. If the error is caused by a server not supporting resume, then that's exactly what should be displayed as the error reason, not some odd error message leading to wrong conclusions.

The error is still present in the current Firefox version. I can reproduce it anytime when downloading the new Windows versions from MSDN. Well, although I don't know about the Microsoft servers' capabilities, I'm pretty sure they do support resume... Have you really checked that this is the actual cause of this error message?
Whiteboard: [necko-would-take]
Priority: P3 → P5

Mass-removing myself from cc; search for 12b9dfe4-ece3-40dc-8d23-60e179f64ac1 or any reasonable part thereof, to mass-delete these notifications (and sorry!)

Hi,
I tried to reproduce the issue in the latest Fx versions but was unable to. On my end, the download resumes as expected after pausing it for a few seconds.
@ Stephen, does this still happen on your end with a new and empty profile (see https://support.mozilla.org/en-US/kb/troubleshoot-and-diagnose-firefox-problems#w_6-create-a-new-firefox-profile) using the latest Firefox version, or can this be closed?
Thanks.

Flags: needinfo?(stephen.donner)

(In reply to Alin Ilea from comment #47)
> Hi,
> I tried to reproduce the issue in the latest Fx versions but was unable to. On
> my end, the download resumes as expected after pausing it for a few seconds.
> @ Stephen, does this still happen on your end with a new and empty profile (see
> https://support.mozilla.org/en-US/kb/troubleshoot-and-diagnose-firefox-problems#w_6-create-a-new-firefox-profile)
> using the latest Firefox version, or can this be closed?
> Thanks.

Can still reproduce using Firefox 93.0 on macOS 11.6.

Flags: needinfo?(stephen.donner)

In the process of migrating remaining bugs to the new severity system, the severity for this bug cannot be automatically determined. Please retriage this bug using the new severity system.

Severity: major → --
Severity: -- → S3