Closed Bug 100250 Opened 23 years ago Closed 15 years ago

Large image causes machine to lock up.

Categories: Core :: Graphics: ImageLib, defect, P3
Status: RESOLVED WORKSFORME
People: Reporter: pavlov; Unassigned
Keywords: crash, perf, testcase; Whiteboard: patch

Attachments (2 files, 1 obsolete file):
- image/png (deleted)
- patch (deleted)
This image still freezes my entire w2k machine with build 2001-09-13-05-0.9.4,
crashes Linux with build 2001-09-13-05-0.9.4 (incident number 35342053), and
crashes Mac with build 2001-09-13-03-0.9.4.
Reporter
Comment 1•23 years ago
Reporter
Updated•23 years ago
Reporter
Comment 2•23 years ago
Some kind of fix for this is needed; if it's just a hard limit (for now), we
should probably do it. I will look into this.
Reporter
Comment 3•23 years ago
This loads (very very very slowly) for me on Win2k. I haven't tried anywhere
else yet. The lockup on win2k you are seeing is probably video driver or
memory related.
I'm running on a 1 GHz P3 with 256 MB RAM (HP Omnibook 6000).
Reporter
Comment 4•23 years ago
Looking at talkback incident 35342053 (linux) doesn't tell me much...
Stack Signature libc.so.6 + 0x1ed41 (0x4043fd41) d1bc3b49
Platform ID LinuxIntel
Trigger Reason SIGIOT: Abort or IOT Instruction: (signal 6)
Stack Trace
libc.so.6 + 0x1ed41 (0x4043fd41)
libc.so.6 + 0x200d8 (0x404410d8)
libstdc++-libc6.1-1.so.2 + 0x2ee55 (0x40544e55)
libstdc++-libc6.1-1.so.2 + 0x2ee72 (0x40544e72)
libstdc++-libc6.1-1.so.2 + 0x2f75b (0x4054575b)
libstdc++-libc6.1-1.so.2 + 0x313c5 (0x405473c5)
nsImageGTK::Init()
gfxImageFrame::Init()
info_callback()
png_push_have_info()
...
I expect this is being caused by the call to |new|, but I am unsure why;
unless, perhaps, it is trying to throw an exception. It should just be
returning NULL, I would think. Of course we don't check for that, but nothing
in nsImageGTK::Init does anything with this memory.
Reporter
Comment 5•23 years ago
Reporter
Comment 6•23 years ago
Comment on attachment 49730 [details] [diff] [review]
Check for NULL returns from the calls to 'new'
er, typo
Attachment #49730 - Attachment is obsolete: true
Reporter
Comment 7•23 years ago
Reporter
Comment 8•23 years ago
While I don't actually think this patch will fix the crash that is being seen
on Linux, it should help some.
Reporter
Comment 9•23 years ago
marking nsbranch-. Most people aren't going to try to load huge images, and
this doesn't happen everywhere. Will try to get to it for 0.9.5 though.
Reporter
Updated•23 years ago
Target Milestone: mozilla0.9.7 → mozilla0.9.8
Comment 11•23 years ago
Hi Stuart. :-)
That image worked fine for me although it bogged down my system. Build
2001121608 - Windows XP. Funny thing is, at the same time, I'm Flattening a 1+
GB image in Photoshop. I thought I'd test that image out in Mozilla and watch it
crash. In the middle of that I felt like going to look for a bug on this and
found yours. I can't believe I didn't try this back in around late 2000 when I
did the extremely large Iframe crashes Mozilla thing. I was thinking about it.
I've been making the image for an hour now. It's a 20000x20000 RGB gradient image.
The thing I'm thinking is "How large is too large?".
Shouldn't Mozilla draw the line somewhere?
I was thinking that Libpr0n could pick a certain size - depending on the system
- that it would store in memory. After that, it should say: "Forget it" or
something.
For instance, if you have only 1GB hard drive left and 20MB mem, you aren't
going to be able to load a 1.2GB image. To try would be futile.
There are several options (maybe you can think of others):
1. Not load the image.
2. After a certain size, Mozilla could start to degrade image quality (i.e. zoom
out the image and store it at the new zoomed size to remove some of the image
information). A message could be passed to the user that Mozilla did this. The
chance a user would have a monitor big enough to see it in pure quality is
small. In the future, when this is possible - memory is low.
If just low on memory but has enough hard drive space:
3. Only show the part of the image that is in the window at one time. This might
require breaking up the image as it is passed from the decoder. Store the rest
in "scrap files". Does Mozilla currently do this?
I'll be back later when the files are finished. It has been an hour and a half now.
Comment 12•23 years ago
Here are the images I was talking about (Might not always be up):
http://mozilla.netdemonz.com/src/patchandtest/100250%20-%20large%20images/
Reporter
Updated•23 years ago
Target Milestone: mozilla0.9.8 → mozilla0.9.9
Comment 13•23 years ago
the hello world image takes a long time to load on a 450MHz linux box,
eventually it worked but not before galeon sucked down 197M of ram.
Keywords: perf
Comment 14•23 years ago
Photoshop uses a scratch disk for extremely large images. It would be nice if
Mozilla could do something like that. It could even compress an image into
blocks to store it in mem/disk. Each block would contain part of the image, and
compress individually so you don't have to decompress the whole image at once.
See also bug 122581
Comment 15•23 years ago
nominating ...
what are the chances this is gonna make 099?
Keywords: nsbeta1
Comment 16•23 years ago
Removing nsbeta1 nomination because this bug has been plussed.
Keywords: nsbeta1
Comment 17•23 years ago
Reporter
Updated•23 years ago
Target Milestone: mozilla0.9.9 → mozilla1.0
Reporter
Comment 18•23 years ago
not easily reproducible; reassessing this for nsbeta1 as minus (gagan from
pavlov's desk)
Updated•23 years ago
Keywords: mozilla1.0+
Updated•23 years ago
Keywords: mozilla1.0+ → mozilla1.0-
Reporter
Updated•23 years ago
Target Milestone: mozilla1.0 → mozilla1.1alpha
Comment 19•23 years ago
*** Bug 148060 has been marked as a duplicate of this bug. ***
Comment 20•23 years ago
From bug 148060, there's a reproducible testcase with URL
http://user.tninet.se/~yff510t/slike/wtc_satelite.jpg which crashes Mozilla
1.0RC3 on Linux (only ? I didn't crash with Win2k + trunk).
OS -> All (was: Win2k) although this last testcase seems to crash Linux only.
Comment 21•22 years ago
By the definitions on <http://bugzilla.mozilla.org/bug_status.html#severity> and
<http://bugzilla.mozilla.org/enter_bug.cgi?format=guided>, crashing and dataloss
bugs are of critical or possibly higher severity. Only changing open bugs to
minimize unnecessary spam. Keywords to trigger this would be crash, topcrash,
topcrash+, zt4newcrash, dataloss.
Severity: normal → critical
Comment 22•22 years ago
I noticed behavior like this on build 2002121215 of Mozilla 1.3a, on the
following image:
http://www.oqo.com/_images/H.jpg
It's a 4577 x 3597 pixel image, 1.2 megabytes in disk size. I found this out
from Internet Explorer 5.0 after Mozilla 1.3a froze up trying to load it.
Internet Explorer 5.0 loads the image quickly (assuming a fast Internet
connection) decompresses it completely into memory, taking about 20 MB of memory
to do so, and succeeds in displaying it. Some time after decompressing the
image, IE 5.0 seems to be able to release the memory even while keeping the
image in the browser Window, reallocating it when necessary to decompress on the
fly.
Mozilla, on the other hand, just froze up within a few seconds of clicking on
the link to load the image. I tried this several times... there were a few
instances where if I could get Mozilla to respond to the mouse click to close
the window, everything would come back and the program would keep operating as
normal. Mostly, I tended to give up and force Mozilla out of memory via "End
Task". I believe I was seeing the upper-left-hand corner of the image in the
Mozilla window, appearing probably about a minute after beginning to load the
page; I cannot be sure, however, since I was unable to scroll the page to see
any more of the image. Mozilla seemed to use perhaps as much as 30 extra
megabytes, but I can't say for sure whether that was more extra memory than
IE 5.0 allocated at its peak.
Operating System on my machine is Windows 2000.
I'm not sure whether this information correctly applies to this bug, or to bug
#9922, so I'm cross-posting this information to both bug reports.
Thanks in advance for looking at it.
--David
Comment 23•22 years ago
David: How much system RAM do you have? (That's the most important thing to
know, IMHO).
Comment 24•22 years ago
*** Bug 196387 has been marked as a duplicate of this bug. ***
Comment 25•21 years ago
Loading the URL in comment 22 in a 20030622 VACPP OS/2 1.4-latest build while
free RAM was ~28MB reduced free RAM to 512K. On restarting Mozilla with far
more free RAM, loading that image reduced free RAM by 50MB.
Updated•21 years ago
Target Milestone: mozilla1.1alpha → ---
Comment 26•21 years ago
I tried loading a large image
(http://homepages.wmich.edu/~j2ockert/LTValentine.jpg) in Firefox and my machine
became totally unresponsive. Verified in Mozilla Application Suite 1.6. This
should be fixed.
For images/other files over 2MB or whatever, perhaps it would be better not to
handle it in the browser -- the user can then use an external viewer directly,
or save it to disk first. All modern operating systems come with decent image
viewers. Also, I assume that the image isn't redrawn directly from the image
format every time -- the JPEG is probably translated into a bitmap structure of
some sort and then certain pixels are picked out. Since drawing from JPEG each
time might be too slow, and would reduce modularity (all further functions only
need to operate on the bitmap structure, regardless of the format of the image
file), perhaps the best option would be to create a bitmap structure only big
enough for the resized image. This could make redrawing when resizing the window
really slow, but that's not really an issue -- it's far better than it is right
now anyway, and I don't think people who are viewing such a large image are
expecting interruption-free browsing; after all, 3MB JPEGs don't go on web
pages, for the most part. Another option, perhaps better, would be to create a
bitmap structure only as big as the screen resolution. Then that can be scaled
to the window size. This has the added benefit of not affecting the redraw time
of images that are well-handled now. It should be relatively simple:
newDimnsnX = (imgDimnsnX > scrnDimnsnX) ? scrnDimnsnX : imgDimnsnX;
newDimnsnY = (imgDimnsnY > scrnDimnsnY) ? scrnDimnsnY : imgDimnsnY;
...
scaledImage = malloc( newDimnsnY * newDimnsnX *
    (colorBitDepth/8 + (((colorBitDepth % 8) == 0) ? 0 : 1)) );
Thanks for your time. :-)
Comment 27•21 years ago
Oops. That doesn't keep it proportional. But the changes are trivial. &c.
Comment 28•21 years ago
see also bug 166862
Comment 29•19 years ago
Testcase WFM using Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9a1) Gecko/20060526 Minefield/3.0a1 ID:2006052604
Comment 30•19 years ago
Ronald, which testcase? there's lots of dead links here.
The 8001x8001 worksforme,
Mozilla/5.0 (Macintosh; U; PPC Mac OS X Mach-O; en-US; rv:1.8.0.3) Gecko/20060426 Firefox/1.5.0.3
My machine has 2.5GB ram, we probably should test this on a low ram machine.
Comment 31•19 years ago
attachment 49728 [details] WFM with 1.5.0.3 on Linux w/ 512M RAM
Comment 32•18 years ago
does it still crash MAC?
attachment 49728 [details] is only testcase that's still available.
w2k - Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9a1) Gecko/20060421 Minefield/3.0a1
XP - Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.9a1) Gecko/20060531 Minefield/3.0a1
kills machine performance, slow but no crash
painfully slow on w2k 512MB machine - especially if image is reloaded a few times, it causes windows message "virtual memory will be increased"
not so bad on XP 1GB machine
* SM not slow (2-13 sec) and doesn't kill machine Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.1) Gecko/20060130 SeaMonkey/1.0
Comment 33•18 years ago
attachment 49728 [details] loads fine in about 2 secs for me; I'm on a
4x2.5GHz PPC, a fairly fast machine with 2.5G RAM.
Comment 34•18 years ago
I have an intel mac at home, it takes slightly longer to load
attachment 49728 [details] but loads fine in about 4 secs, 4-5 reloads ok,
no crashes, this looks ok to me.
Reporter
Updated•18 years ago
Assignee: pavlov → nobody
Status: ASSIGNED → NEW
QA Contact: tpreston → imagelib
Comment 35•17 years ago
Something is still a problem. As reported in comment 32, the initial load is OK, but a reload kills the UI for 20-30 or more secs, at *low CPU*. It does grab a bunch of memory; maybe it's the memory activity that is affecting the UI. Tested on a faster 3.2GHz machine and current trunk.
You may have to play around to make it happen. Make sure you do a reload or two. Roughly what I did...
load image
zoom in (click on image)
scroll with arrow keys
zoom out
zoom in
reload with shift+ctrl+R
repeat if needed
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9a7pre) Gecko/2007072417 Minefield/3.0a7pre
Comment 36•16 years ago
WFM.
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1b4) Gecko/20090427 Shiretoko/3.5b4
Comment 37•15 years ago
Many large-image crash bugs have been fixed since this bug was filed. If anyone is still seeing problems, please file a new bug for the specific image and OS.
Status: NEW → RESOLVED
Closed: 15 years ago
Resolution: --- → WORKSFORME
See Also: → https://launchpad.net/bugs/284051