Closed
Bug 59624
Opened 24 years ago
Closed 13 years ago
Do image loading times depend on the size of the compressed image?
Categories
(Core :: Graphics: ImageLib, defect, P3)
Tracking
Status: RESOLVED INCOMPLETE
Target Milestone: Future
People
(Reporter: tenthumbs, Unassigned)
References
Details
(Keywords: perf)
Attachments
(1 file: image/png, deleted)
PNG images always seem to load slowly so I thought I'd try to find out
why. Rather than attaching a megabyte or so of random images, here's
what I did.
First, I created a 600x600 image with Gimp and filled it in with Gimp's
plasma plugin which created a nice rgb image. I then saved it as a PPM
file. I converted the PPM to a PNG and ran the PNG through pngcrush to
shrink it as much as possible. I did "cjpeg -sample 1x1 -opt -quality
75 plasma.ppm" and "cjpeg -sample 1x1 -opt -quality 95 plamsa.ppm" to
create two JPEGs. I now had a 35k and a 134K JPEG plus a 509K PNG.
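(For anyone wanting to reproduce the setup without Gimp, here is a rough
Python/Pillow equivalent. It's only a sketch: random noise stands in for
the plasma fill, so absolute file sizes will differ, and Pillow's encoders
stand in for cjpeg and pngcrush.)

# Rough Python/Pillow equivalent of the test-image setup above.
# Assumptions: random noise instead of Gimp's plasma fill, and
# Pillow's encoders instead of cjpeg/pngcrush.
import numpy as np
from PIL import Image

rgb = np.random.randint(0, 256, (600, 600, 3), dtype=np.uint8)
img = Image.fromarray(rgb, "RGB")

# Maximally compressed PNG (roughly what pngcrush aims for).
img.save("plasma.png", compress_level=9, optimize=True)

# 4:4:4 JPEGs at two quality levels, mirroring "cjpeg -sample 1x1 -opt".
img.save("plasma-q75.jpg", quality=75, subsampling=0, optimize=True)
img.save("plasma-q95.jpg", quality=95, subsampling=0, optimize=True)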
I now created some simple HTML pages on my server. Each page just had a
tag of the form <img src="foo.jpg">. I pointed Mozilla at each page and
reloaded each page twice before I took measurements. The server was
returning 304 responses for both the page and the image so no
significant time was consumed there.
What I found is that the load time, according to either Mozilla's own
time or my stopwatch, depended on the size of the compressed image,
*not* the size of the rendered image. It even appears to be linear.
(See the attached graph.) This is very surprising.
I realize that this is just one image with only three data points and I
could be measuring something specific to my machine but, if this
phenomenon is real, then something is definitely wrong with Mozilla.
Anyone want to repeat the test?
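(If someone wants to repeat a simpler version of this, here is a minimal
decode-timing harness in Python/Pillow. Note the assumption: it times a
pure in-process decode, not a full Mozilla page load as measured above.)

# Minimal harness: decode time vs. compressed size for the three files.
# Assumes the files generated earlier; times Pillow's decoder only.
import os, time
from PIL import Image

for path in ("plasma-q75.jpg", "plasma-q95.jpg", "plasma.png"):
    size_kb = os.path.getsize(path) / 1024
    t0 = time.perf_counter()
    Image.open(path).load()          # .load() forces the full decode
    dt_ms = (time.perf_counter() - t0) * 1000
    print(f"{path}: {size_kb:.0f} KB, {dt_ms:.1f} ms")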
Here's the general impression I've received after adding some
probes and reading some code. On a single-pass image, libimg
tries to push the image to display as fast as possible. It
forces a notification every 15000/(3*width) lines. For the 600
pixel wide example used in this bug this amounts to a repaint
hitting gfx every nine lines of the image. The current gtk gfx
isn't terribly fast with images, so this hurts a lot.
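(As plain arithmetic, the cadence described above works out as follows;
this is a sketch of the quoted formula, not the actual libimg source.)

# Repaint cadence from the formula quoted above: a notification is
# forced every 15000/(3*width) rows of decoded image data.
width, height = 600, 600                 # the test image in this bug
rows_per_notify = 15000 / (3 * width)    # = 8.33, i.e. ~every 8-9 rows
repaints = height / rows_per_notify      # = 72 forced repaints per image
print(rows_per_notify, repaints)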
The question then arises as to why small images don't show all
these forced repaints. While I'm not an expert in how layout
handles repaint requests, it seems possible that the repaint
notifications from blocks pushed through libimg before
returning to the main loop are being collapsed together.
While this is a problem, it's probably not worth trying to fix
with the current libimg.
I'm not so sure gtk is the problem. The decoded image is the same size
in all cases, so a gtk issue should take constant time. Now it is true
that the JPEG decoder seems to do some buffering that the PNG decoder
doesn't. Maybe this is a problem, maybe not. It's also possible that
netlib is the bottleneck; i.e., data is coming in at a fixed rate. In
any case, I doubt it's a layout issue.
Update.
There is no longer that nice linear relationship (which was probably
bogus to begin with). Larger files are still slower than smaller ones,
but the slope for PNGs is different from that for, say, JPEGs.
Within a particular image type I can manipulate the size somewhat,
and the linear relationship still seems to hold.
I wonder how this screws up page loading tests.
gagan, can you get someone to help investigate? This is one of the ideas that
people shoot off at the performance meeting.
If we store compressed images, would that improve our page load time?
Assignee: tor → gagan
Comment 9•23 years ago
I don't understand this bug. It sounds like all it is saying is that smaller
images are faster to load than big ones. This seems like a pretty obvious "duh"
to me. If the data is spread out through the image more, and the decoder has to
do more work, then clearly it is going to take longer to load the image than if
it is all together. Am I missing something?
Reporter
Comment 10•23 years ago
No, it's about load time for different encodings of the _same_ image.
Consider a non-interlaced PNG. By choosing different compression levels
I can create files that contain identical information but have different
physical sizes; a 3-to-1 ratio, or more, between uncompressed and
maximally compressed is common. The problem was that the larger the physical size
the slower it loaded. It made no difference whether it loaded from a
server or a local file.
That's counter-intuitive: decompression is the step that should cost
time, so if anything the smaller, more compressed file ought to be the
slow one. I did
some tests with a stock libpng alone and the time difference between
reading an uncompressed and a compressed image is far too small to
explain the mozilla behavior.
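(For reference, here is that comparison sketched in Python/Pillow rather
than against raw libpng. The assumption is that re-saving the same pixels
at different compress_level settings reproduces the "identical
information, different physical size" case.)

# Same pixels at different zlib effort levels: the byte size varies
# widely, but the pure decode time should barely change.
import os, time
from PIL import Image

src = Image.open("plasma.png").convert("RGB")
for level in (0, 6, 9):                  # 0 = stored, 9 = maximum compression
    path = f"plasma-z{level}.png"
    src.save(path, compress_level=level)
    t0 = time.perf_counter()
    Image.open(path).load()              # force the full decode
    dt_ms = (time.perf_counter() - t0) * 1000
    print(f"level {level}: {os.path.getsize(path)} bytes, {dt_ms:.1f} ms")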
I also tried the same thing with JPEGs and saw the same behavior: larger
physical size = slower. Since JPEGs are usually smaller than PNGs, it
was also true that a JPEG version of a given image loaded faster than
the PNG version.
Now, it's been a while since I tested this so it may well have changed.
Updated•23 years ago
Target Milestone: --- → Future
Updated•18 years ago
Assignee: pavlov → nobody
QA Contact: tpreston → imagelib
Updated•15 years ago
Severity: normal → minor
Comment 11•13 years ago
This is an old bug but a good candidate for further investigation. We should look at how various parameters affect image decoding/loading.
Comment 12•13 years ago
Actually, I have read the comment more carefully and I've reconsidered. I'd be willing to take a look at this bug if we encounter concrete examples where images decode slower than they should, on a per-case basis. Resolving as incomplete until we encounter such examples.
Status: NEW → RESOLVED
Closed: 13 years ago
Resolution: --- → INCOMPLETE