bandwidth. There is no end in sight to the need for more bandwidth.
If you had read what I wrote carefully, you'd see my claim was simply
that once everyone is connected at a reasonable speed, demand for
bandwidth will rise as fast as the equipment can cope with it, but no
faster. I thus already addressed your point.
In fact you can roughly calculate the maximum bandwidth needed,
whether in sight or not.
For a single system we can assume most end-users (not all, but
something like 98%) will be happy when the bandwidth they have access
to can drive their output devices at full speed. Output is one of the
two directions to consider.
We know the refresh rate of a screen, and we can calculate the
bandwidth of driving, say, audio at stereo DAT rates (an exceedingly
high standard to use, but why not). There are a few other things we
could consider, but not many.
Input is the other direction: even data going out onto the net has to
come from somewhere, so if it's coming from a hard disk, any bandwidth
beyond the speed of that disk isn't going to be needed (for that
particular data.)
Each part has a very knowable limit, and the good news is that, to an
excellent approximation, they're merely additive (we could quibble,
but let's not; I don't think quibbling would change the conclusions,
though it might at least provide some numbers for discussion.)
Add all that up and multiply by the number of end-users you're trying
to support and you're going to be pretty close to the maximum
theoretical bandwidth needed, less the switching and path metrics.
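A tiny sketch of that estimate (the per-user component rates here are illustrative assumptions drawn from the figures discussed in this post, not measurements):

```python
# Sketch of the additive capacity estimate described above.
# Component rates are illustrative assumptions, in bytes/second.
per_user = {
    "screen": 73_400_320,  # full-surface 70 Hz refresh of a 1024x1024x8 display
    "audio": 176_400,      # stereo DAT-rate sound
}

def max_theoretical_bandwidth(n_users: int) -> int:
    # Treat the per-user components as simply additive, then scale
    # by the number of end-users; switching/path overhead excluded.
    return n_users * sum(per_user.values())

print(max_theoretical_bandwidth(1))     # per-user ceiling, bytes/second
print(max_theoretical_bandwidth(1000))  # a thousand such users
```

The point of the sketch is only that every term in the sum is bounded and knowable, so the total is too.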
Perhaps it seems mind-boggling but I don't think it is. For example,
at 70 Hz and 1024x1024x8, a color screen whose entire surface is
changing on each refresh is 70Mbytes/second, roughly (add some for
overhead, 75Mbytes/second?) DAT-quality sound is 44.1KHz at 16 bits
per channel, so around 176Kbytes/second for stereo, practically in
the noise, if you'll pardon the expression, compared with the screen.
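To check the arithmetic above (a sketch using the example parameters from this post: a 70 Hz, 1024x1024 screen at 8 bits per pixel, and 44.1 KHz 16-bit stereo audio):

```python
# Screen: every pixel changing on every refresh.
refresh_hz = 70
pixels = 1024 * 1024
bytes_per_pixel = 1                # 8-bit color

screen_rate = refresh_hz * pixels * bytes_per_pixel
print(screen_rate / 2**20)         # 70.0 Mbytes/second

# Audio: DAT-quality stereo.
sample_rate_hz = 44_100
bytes_per_sample = 2               # 16 bits per channel
channels = 2

audio_rate = sample_rate_hz * bytes_per_sample * channels
print(audio_rate / 1000)           # 176.4 Kbytes/second

# The audio really is in the noise next to the screen.
print(audio_rate / screen_rate)    # roughly 0.24% of the screen rate
```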
Fool around with this sort of thing and you'll probably come up with
something around 100Mbytes/second, which seems rather high (and no
doubt is very high in practice; very few would be using even half that
at any given moment), but then so did double-digit MIPS on CPUs ten
years ago.
Now, I realize there are examples like super-computers or Lucasfilm
that have this huge ability to generate or absorb bits, but when you
add up how many of them there are compared with how many ordinary
end-users there are, even that's not all that daunting, except that
they introduce it all at more or less one point. But it wouldn't be
unreasonable for some of their super-applications to be served by
private network links; at any rate, they'd better have money to match
their tastes.
Anyhow, there's a reason why Cable TV works, as an illustrative
example. It's mostly because there's a rather precise rate at which a
television set can use screen and sound input, and the CATV system is
capable of delivering that input at just that rate. The TV never gets
bogged down waiting for input.
These things are knowable, and they are not infinite or even nearly
so. I think people are just a little boggled by the "sticker shock" of
doing the back-of-the-envelope calculations, just like 10 years ago if
you suggested that for some app to run on a desktop you'd need
100 MIPS: Impossible! Well, today that's what the cheap systems
deliver...
In my experience, what most limits reaching comfortable levels in a
technology is not so much the ability of engineers to come up with
workable designs to meet the loftier goals, but simply money. For
example, when money, big money, began pouring into the PC market,
suddenly the speeds began soaring and the prices, relatively speaking,
dropped like a rock.
(If you're about to tell me that CATV systems are broadcast systems,
so it's easier for them, then you completely and utterly misunderstand
what I am saying. Sit on your hands.)