That's not what I meant by "serial multiplexing". But can someone please
explain why anyone would want to do this? You don't get the bits faster by
opening multiple connections, now that persistent HTTP avoids the
inter-object delay that a full SYN/SYN-ACK/ACK handshake for every object
used to induce. Does seeing the GIFs fill in in parallel really make that
much difference, if the total page fill time is going to be the same?
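For context, a rough back-of-the-envelope sketch of the handshake savings the poster is referring to; the one-round-trip-per-exchange cost model here is a simplifying assumption, ignoring slow start and pipelining:

```python
def page_fill_rtts(n_objects, persistent):
    """Rough count of round trips to fetch n_objects over HTTP.

    Simplifying assumptions: the TCP three-way handshake costs one
    round trip, and each request/response pair costs one more.
    """
    if persistent:
        # one handshake, then every object reuses the same connection
        return 1 + n_objects
    # a fresh handshake before every single object
    return n_objects * (1 + 1)

# a page with 10 inline images plus the HTML itself
print(page_fill_rtts(11, persistent=False))  # 22 round trips
print(page_fill_rtts(11, persistent=True))   # 12 round trips
```

The point of the question stands either way: persistent connections shrink the total, but splitting the same objects across parallel connections does not shrink it further.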
You're assuming that the goal is to get _all_ the bits in the shortest
amount of time. And in some cases that is the goal. In others it is not.
By loading the images in parallel, with the initial part of each image file
arriving as a fuzzy approximation, you can see roughly where every button is
located, often know exactly what it is, and click on it as soon as you know
where to go.
The total page fill time isn't much of an issue when people are in a hurry
to move on to the next page. In about half of my surfing I never let the
pages finish. Many pages are designed with this in mind: they build their
images to load at low resolution first.
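The "low resolution first" trick is exactly what interlaced GIFs do: rows are transmitted in four passes, so an early prefix of the file already covers the whole image coarsely. A small sketch of the GIF89a row ordering:

```python
def gif_interlace_order(height):
    """Row transmission order for an interlaced GIF (GIF89a, four passes).

    Pass 1 sends every 8th row starting at row 0, pass 2 every 8th
    starting at 4, pass 3 every 4th starting at 2, and pass 4 every
    2nd starting at 1, so a partial download spans the full height.
    """
    passes = [(0, 8), (4, 8), (2, 4), (1, 2)]
    return [row for start, step in passes
                for row in range(start, height, step)]

print(gif_interlace_order(8))  # [0, 4, 2, 6, 1, 3, 5, 7]
```

After only the first pass (one eighth of the rows) a renderer can already paint a stretched, blocky version of the entire image.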
What might work better is to just make the whole page one giant image:
it could pre-load in low resolution _and_ arrive over a single connection.
But that means much _more_ total bandwidth.
There is also a psychological perception of speed when images load in
parallel. When someone complains about slowness and is given the low-MTU
solution, they often end up happier.
I still say the solution is a way to transmit multiple images in parallel
over a single connection. A multiplexer that allowed concurrency would be
one way. An image format that allowed scattered objects could be another.
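A framing multiplexer along those lines could be sketched as follows; this is a minimal illustration, not any real protocol, and the one-byte stream id, two-byte length header, and four-byte chunk size are arbitrary choices:

```python
import struct

def mux(streams):
    """Interleave several byte streams into one wire sequence,
    framing each chunk with a (stream_id, length) header."""
    frames = []
    CHUNK = 4  # deliberately tiny, to force visible interleaving
    offsets = {sid: 0 for sid in streams}
    while any(offsets[sid] < len(data) for sid, data in streams.items()):
        for sid, data in streams.items():
            off = offsets[sid]
            if off < len(data):
                chunk = data[off:off + CHUNK]
                # network byte order: 1-byte stream id, 2-byte length
                frames.append(struct.pack("!BH", sid, len(chunk)) + chunk)
                offsets[sid] = off + len(chunk)
    return b"".join(frames)

def demux(wire):
    """Reassemble the original streams from the framed sequence."""
    out, i = {}, 0
    while i < len(wire):
        sid, length = struct.unpack("!BH", wire[i:i + 3])
        i += 3
        out[sid] = out.get(sid, b"") + wire[i:i + length]
        i += length
    return out

streams = {1: b"GIF89a....", 2: b"JPEGdata"}
assert demux(mux(streams)) == streams
```

Both images make progress on every round of the loop, so a browser at the receiving end could paint fuzzy versions of each in parallel over the one connection.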