Re: Graphics Question

by jdowdell(at)macromedia.com (John Dowdell)

 Date:  Fri, 16 Oct 1998 12:14:02 -0700
 To:  hwg-graphics(at)hwg.org
At 9:58 PM 10/14/98, Griselda Jarnsaxa wrote:
> I currently have 10 238x48-pixel graphics on one of my pages
> and was wondering if they would load faster if I made an image map
> containing all 10 graphics or are they faster separately?

As Harold noted, it could go either way. For a given page I'd make two
different versions myself, flush the browsers' caches, and test it on a
28.8 modem.

Here are some of the concerns that need to be balanced:

--  Each download requires a separate multi-step HTTP connection... first
the browser opens a connection to a server, then the server acknowledges
the request, then the file is sent, then the connection is closed. Each
connection imposes a little bit of a wait. A single connection is usually
faster than multiple connections.
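To make the per-connection cost concrete, here's a small back-of-the-envelope model in Python. The overhead and transfer numbers are made up for illustration, not measurements; the point is only that each connection pays a fixed wait before any image data flows.

```python
# Hypothetical timings, purely for illustration: each HTTP connection
# pays a fixed setup/teardown wait (open, acknowledge, send, close)
# before and after the actual image data moves.
SETUP_WAIT = 0.5      # assumed seconds of overhead per connection
TRANSFER_TIME = 4.0   # assumed seconds to move all the image data

# One file over one connection pays the overhead once...
one_connection = SETUP_WAIT + TRANSFER_TIME

# ...while ten files fetched one after another pay it ten times,
# even though the total image data is the same.
ten_sequential = 10 * SETUP_WAIT + TRANSFER_TIME

print(one_connection)   # 4.5
print(ten_sequential)   # 9.0
```

With these assumed numbers, the same data takes twice as long when split into ten sequential requests.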

--  But browsers can handle multiple connections in parallel. Because of
the way TCP/IP works, the modem is only rarely taking in data at the
maximum rate. Breaking a large graphic into four pieces usually provides
faster delivery than a single large file, all other things being equal.

--  Once you get beyond four simultaneous connections, downloads are
usually staggered in sequence. Slicing an image along guidelines usually
gives you too many tiles, because the first four tiles are sent, then
connections are negotiated for the next four, and so on.
    Rephrased: too many slices can definitely slow down the overall
transfer, because only a set number can be handled at once.

--  Because some slicing can help, but too much can hinder, be careful when
using a "slice along guidelines" approach. If you have three vertical
guides and four horizontal guides, that's four columns by five rows, so
you'll be sending twenty different slices, and this can be slow... the
browser would need to negotiate for twenty different files for the page!
    (This is why tools are moving to a "smart slicing" approach, where
different cells in an image table can be merged... it's easier to manage
fewer files on disk, and can definitely speed up the delivery of the page.)
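The guide-counting and batching above can be sketched in a few lines of Python. The four-connection limit is the figure assumed in the text; actual browsers vary.

```python
import math

# Slicing along guides: v vertical guides cut an image into v + 1
# columns, and h horizontal guides cut it into h + 1 rows.
def slice_count(vertical_guides, horizontal_guides):
    return (vertical_guides + 1) * (horizontal_guides + 1)

# With only a handful of simultaneous connections (four, as assumed
# in the text), the slices go out in staggered batches.
def download_rounds(slices, max_connections=4):
    return math.ceil(slices / max_connections)

tiles = slice_count(3, 4)       # the example: 3 vertical, 4 horizontal guides
print(tiles)                    # 20 slices
print(download_rounds(tiles))   # 5 rounds of connection negotiation
```

Merging cells ("smart slicing") shrinks the first number, which in turn shrinks the number of negotiation rounds.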

--  If you will be reusing parts of the image on different pages, or if
only a small part of the image changes on different pages, then slicing can
*definitely* help. If you set up your HTML so that slices from previous
pages are reused, then the browser can grab them from the cache, and this
is much faster than requesting them from the server again.

--  If you have an image where different parts could use different
compression schemes, then this is also a good candidate for slicing. Some
tools now let you use different compression methods for different pieces,
and this can reduce the total file size.


In your situation, you describe ten graphics, each 238 by 48 pixels. That's
a lot! In uncompressed 8-bit that would be about 112K for graphics alone.
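The arithmetic behind that figure, using the dimensions from the question (8-bit means one byte per pixel):

```python
# Ten graphics, each 238 x 48 pixels, at 8 bits (one byte) per pixel:
bytes_total = 10 * 238 * 48
print(bytes_total)                  # 114240 bytes
print(round(bytes_total / 1024))    # about 112K, before any compression
```

GIF or JPEG compression will bring the actual transfer well under that, but it's a useful upper bound.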

Suggestion: Are there recurring sections among the graphics? For instance,
if each is a piece of text atop a gradient, it might be possible to slice
out the text in each copy, letting them all share the common background.
This could noticeably reduce the total download size.

I don't know the exact design there, Griselda, but I'd second Harold's
suggestion: make variant copies, and test it. The above is some of the
theoretical background on the speed differences in the one-big-file vs
many-little-files problem... hope it's of help as you work through that
site.

jd




John Dowdell, Macromedia Tech Support, San Francisco CA US

Private email options: http://www.macromedia.com/support/priority.html
Search technotes: http://www.macromedia.com/support/search/
Entertainment on the web: http://shockrave.macromedia.com/
Luscious web graphics: http://www.macromedia.com/software/fireworks/

HWG: hwg-graphics mailing list archives, maintained by Webmasters @ IWA