Network for Sale

> RTT doesn't drive performance, (bw*delay)-loss does.

FWIW, the heavily interactive apps are more affected by RTT than they are
by bandwidth. Network games are the new TELNET. They despise varying
latency levels, and are generally oblivious to bandwidth. Your point is
still mostly valid, in that the only thing they hate more than varying
latency is packet loss, but if the network isn't losing packets then RTT
does affect "performance" for the heavily interactive apps.

i wasn't talking about network performance in general. (and i would never,
ever recommend (bw*delay)-loss as a routing metric!) this is in the context
of so-called global server load balancing. RTT may, or may not, matter in
the decisions such a system must make ("serve this client from which proxy?").
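
As a toy illustration of that decision (nothing here reflects a real
GSLB product; the Proxy record, pick_proxy(), and the score weights are
all invented), a Python sketch where RTT is deliberately the weakest
input:

    from dataclasses import dataclass

    @dataclass
    class Proxy:
        name: str
        rtt_ms: float      # measured client<->proxy round-trip time
        load: float        # 0.0 (idle) .. 1.0 (saturated)
        loss_rate: float   # observed loss toward the client

    def pick_proxy(proxies):
        # loss dominates, then load; RTT is only a tie-breaker,
        # i.e. it "may or may not" matter to the final choice
        def score(p):
            return 1000.0 * p.loss_rate + 10.0 * p.load + 0.01 * p.rtt_ms
        return min(proxies, key=score)

    candidates = [
        Proxy("proxy-near", rtt_ms=12.0, load=0.9, loss_rate=0.0),
        Proxy("proxy-far",  rtt_ms=85.0, load=0.2, loss_rate=0.0),
    ]
    print(pick_proxy(candidates).name)  # proxy-far: lower load beats lower RTT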

When I was talking (writing) about RTT, it's the variation above
the minimal RTT which is interesting as a congestion indicator.
Still, that's the case where each node uses FIFO queueing, which
is "easily" analysed (see Vern Paxson's papers, among others).
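
A minimal sketch of that indicator in Python, assuming you already have
a stream of RTT samples; the sample values and the 5 ms flag threshold
are made up:

    def queueing_delay(rtts_ms):
        # delay above the running minimum RTT ~= time spent in FIFO queues
        min_rtt = float("inf")
        series = []
        for rtt in rtts_ms:
            min_rtt = min(min_rtt, rtt)
            series.append(rtt - min_rtt)
        return series

    samples = [40.1, 40.3, 39.8, 41.0, 45.5, 52.2, 60.7]  # invented pings
    for rtt, q in zip(samples, queueing_delay(samples)):
        flag = "   <-- queue building?" if q > 5.0 else ""
        print(f"rtt={rtt:5.1f} ms  above-min={q:4.1f} ms{flag}")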

RTT in itself is nothing; analysing RTT dynamics over time is very interesting.
But in any case, packet drop rate has an impact of a much higher order of
magnitude than RTT variation.
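
A back-of-envelope way to see that gap: compare typical jitter against
the stall from a single TCP retransmission timeout. The 200 ms RTO
floor below is only an assumption (in the ballpark of common
implementations), and the other numbers are invented:

    base_rtt_ms = 40.0
    jitter_ms   = 5.0                          # typical wobble above minimal RTT
    rto_ms      = max(200.0, 2 * base_rtt_ms)  # one drop costs at least an RTO

    print(f"extra delay from jitter:   {jitter_ms:6.1f} ms")
    print(f"extra delay from one drop: {rto_ms:6.1f} ms")
    print(f"ratio: {rto_ms / jitter_ms:.0f}x")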

But interesting RTT variation usually happens before packets get dropped.
Again, presuming FIFO queueing. Per-flow queueing may give a more blurred
picture.
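
One crude way to catch that rising-RTT phase before the first drop,
again presuming FIFO queueing (the window size, slope threshold, and
samples below are all arbitrary):

    from collections import deque

    def rising_trend(samples, window=8, slope_ms=1.0):
        # compare the mean of the newer half of the window against the
        # older half; a sustained gap suggests a queue is building
        if len(samples) < window:
            return False
        recent = list(samples)[-window:]
        half = window // 2
        old, new = recent[:half], recent[half:]
        return sum(new) / half - sum(old) / half > slope_ms

    rtts = deque(maxlen=64)
    for rtt in [40, 40, 41, 40, 42, 45, 49, 55, 62, 70]:  # invented samples
        rtts.append(rtt)
        if rising_trend(rtts):
            print(f"queue likely building at rtt={rtt} ms, before any loss")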