[Nanog] ATT VP: Internet to hit capacity by 2010

> All this talk of exafloods seems to ignore the basic economics of
> IP networks. No ISP is going to allow subscribers to pull in 8 gigs
> per day of video stream. And no broadcaster is going to pay for the
> bandwidth needed to pump out all those ATSC streams. And nobody is
> going to stick IP multicast (and multicast peering) in the core just
> to deal with video streams to people who leave their TV on all day
> whether they are at home or not.

The floor is littered with the discarded husks of policies about what ISPs
are going to allow or disallow: "no servers," "no connection sharing,"
"web browsing only," "no VoIP," etc. These typically last only as long as
the errant assumptions upon which they're based remain somewhat viable.
For example, when NAT gateways and Internet Connection Sharing became
widely available, trying to prohibit connection sharing went by the wayside.

8 GB/day is less than a single megabit per second, and with ISPs selling
ultra-high-speed connections (we're now able to get 7 or 15 Mbps), an ISP
might find it difficult to defend selling a premium 15 Mbps service on
which a user can't sustain even 1/15th of that.
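
The arithmetic is easy to check. A quick back-of-the-envelope sketch in
Python (treating 8 GB as 8 * 10^9 bytes):

    # What sustained rate does 8 GB/day of video work out to?
    bytes_per_day = 8e9          # 8 GB/day, decimal gigabytes assumed
    seconds_per_day = 86400

    bps = bytes_per_day * 8 / seconds_per_day
    print("%.2f Mbps" % (bps / 1e6))   # ~0.74 Mbps, well under 1 Mbps

So a user pulling 8 GB/day is averaging less than 1/15th of a 15 Mbps pipe.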

> At best you will see IP multicast on a city-wide basis in a single
> ISP's network. Also note that IP multicast only works for live broadcast
> TV. In today's world there isn't much of that except for news.

Huh? Why does IP multicast only work for that?

> Everything else is prerecorded and thus it COULD be transmitted at
> any time. IP multicast does not help you when you have 1000 subscribers
> all pulling in 1000 unique streams.

Yes, that's potentially a problem. That doesn't mean that multicast cannot
be leveraged to handle prerecorded material, but it does suggest that you
could really use a TiVo-like device to make the best use of it: a
fundamental change away from "live broadcast" and streaming out a show in
1:1 realtime, to a model where everything is spooled onto the local TiVo
and then watched at the user's convenience.
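
To make the objection concrete, here's a toy model (not a simulation of
any real network) of how many streams a head-end must source, with and
without multicast:

    # Toy model: streams a source must emit, unicast vs. multicast.
    # Unicast sends one copy per subscriber; multicast sends one copy
    # per *distinct* channel, with routers handling the fan-out.

    def streams_needed(subscribers, channels_watched):
        unicast = subscribers
        multicast = len(set(channels_watched))
        return unicast, multicast

    # 1000 subscribers all watching one live event: multicast wins big.
    print(streams_needed(1000, [0] * 1000))    # (1000, 1)

    # 1000 subscribers each pulling a unique prerecorded stream: no win.
    print(streams_needed(1000, range(1000)))   # (1000, 1000)

Spooling to a TiVo-like box shifts the problem: a popular prerecorded
show can be pushed once, on a schedule, over a shared tree, instead of
being streamed individually on demand.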

We don't have the capacity at the moment to really deal with 1000 subs all
pulling in 1000 unique streams, but the likelihood is that we're not going
to see that for some time - if ever.

What seems more likely is that we'll see an evolution of more specialized
offerings, possibly supplementing or even eventually replacing the tiered
channel package offerings of your typical cable company, since it's pretty
clear that a-la-carte channel selection isn't likely to happen soon.

That may allow some "less popular" channels to come into being. I happen
to like holding up SciFi as an example, because their current operations
are significantly different than originally conceived, and they're now
producing significant quantities of their own original material. It's
possible that we could see a much larger number of these sorts of ventures
(which would terrify legacy television networks even further).

The biggest challenge that I would expect from a network point of view is
the potential for vast amounts of decentralization. For example, there's
low-key stuff such as the "Star Trek: Hidden Frontier" series of fanfic-
based video projects. There are almost certainly enough fans out there
that you'd see a small surge in viewership if the material was more
readily accessible (read that as: automatically downloaded to your TiVo).
That could encourage others to do the same in more quantity. These are
all low-volume data sources, and yet taken as a whole, they could
represent a fairly difficult problem were everyone to be doing it. It is
not just tech geeks who are going to be able to produce video; as the
tools become more accessible (see: YouTube), we may see things like mini
soap operas, home & garden shows, local sporting events, local politics, etc.

I'm envisioning a scenario where we may find that there are a few tens of
thousands of PTA meetings each being uploaded routinely onto the home PCs
of whoever recorded the local meeting, and then made available to the
small number N of interested parties who might then watch (0 < N < 20).

If that kind of thing happens, then we're going to find that there's a
large range of projects that have potential viewership landing anywhere
between this example and that of the specialty broadcast cable channels,
and the question that is relevant to network operators is whether there's
a way to guide this sort of thing towards models which are less harmful
to the network. I don't pretend to have the answers to this, but I do
feel reasonably certain that the success of YouTube is not a fluke, and
that we're going to see more, not less, of this sort of thing.

> As far as I am concerned the killer application for IP multicast is
> *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.

You can go compare the relative successes of Yahoo! Finance and YouTube.

While it might be nice to multicast that sort of data, it's a relative
trickle of data, and I'll bet that the majority of users have not only
not visited a market data site this week, but have actually never done
so.

... JG

As if most financial (and other mega-dataset) data was on consumer Web
sites. Think pricing feeds off stock exchange back-office systems.

> > IP multicast does not help you when you have 1000 subscribers
> > all pulling in 1000 unique streams.

> Yes, that's potentially a problem. That doesn't mean that
> multicast can not be leveraged to handle prerecorded
> material, but it does suggest that you could really use a
> TiVo-like device to make best use.

You mean a computer? Like the one that runs file-sharing
clients? Or Squid? Or an NNTP server?

Is video so different from other content? Considering the
volume of video that currently traverses P2P networks I really
don't see that there is any need for an IP multicast solution
except for news feeds and video conferencing.

> What seems more likely is that we'll see an evolution of more
> specialized offerings,

Yes. The overall trend has been to increasingly split the market
into smaller slivers with additional choices being added and older
ones still available. During the shift to digital broadcasting in
the UK, we retained the free-to-air services with more channels
than we had on analog. Satellite continued to grow in diversity and
now there is even a Freesat service coming online. Cable TV is still
there although now it is usually bundled with broadband Internet as
well as telephone service. You can access the Internet over your mobile
phone using GPRS or 3G, and wifi is spreading slowly but surely.

But one thing that does not change is the number of hours in the day.
Every service competes for scarce attention spans, and a more-or-less
fixed portion of people's disposable income. Based on this, I don't
expect to see any really huge changes.

> That may allow some "less popular" channels to come into
> being.

YouTube et al.

> I happen to like holding up SciFi as an example,
> because their current operations are significantly different
> than originally conceived, and they're now producing
> significant quantities of their own original material.

The cost to film and to edit video content has dropped
dramatically over the past decade. The SciFi channel is the
tip of a very big iceberg. And what about immigrants? Even
50 years ago, immigrants to the USA joined a bigger melting
pot culture and integrated slowly but surely. Nowadays,
they have cheap phone calls back home, the same Internet content
as the folks back home, and P2P to get the TV shows and movies
that people are watching back home. How is any US channel-based
system going to handle that diversity and variety?

> There are almost certainly enough fans out
> there that you'd see a small surge in viewership if the
> material was more readily accessible (read that as:
> automatically downloaded to your TiVo).

Is that so different from P2P video? In any case, the TiVo model
is limited to the small amount of content, all commercial, that
they can classify so that TiVo downloads the right stuff. P2P
allows you to do the classification, but it is still automatically
downloaded while you sleep.

> I'm envisioning a scenario where we may find that there are a
> few tens of thousands of PTA meetings each being uploaded
> routinely onto the home PC's of whoever recorded the local
> meeting, and then made available to the small number of
> interested parties who might then watch, where (0<N<20).

Any reason why YouTube can't do this today? Remember the human
element. People don't necessarily study the field of possibilities
and then make the optimal choice. Usually, they just pick what is
familiar as long as it is good enough. Click onto a YouTube video,
then click the pause button, then go cook supper. After you eat,
go back and press the play button. To the end user, this is much
the same experience as P2P, or programming a PVR to record an
interesting program that broadcasts at an awkward time.

> I don't pretend to have the answers
> to this, but I do feel reasonably certain that the success of
> YouTube is not a fluke, and that we're going to see more, not
> less, of this sort of thing.

Agreed.

> > As far as I am concerned the killer application for IP multicast is
> > *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.

> You can go compare the relative successes of Yahoo! Finance
> and YouTube.

Actually, Yahoo! Finance is just one subscriber to these market data
feeds. My company happens to run an IP network supporting global
multicast, which delivers the above market data feeds, and many others,
to over 10,000 customers in over 50 countries. Market data feeds are not
a mass-market consumer product, but they are a realtime firehose of data
that people want to receive right now and not a microsecond later. It is
not unusual for our sales team to receive RFPs that specify latency
requirements faster than the speed of light allows. The point is that IP
multicast is probably the only way to deliver this data, because we
cannot afford the additional latency of sending packets into a server
and back again; i.e., a CDN type of solution won't work.
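
The speed-of-light quip is easy to make concrete. A small sketch; the
fiber refractive index and the city distance are round-number
assumptions, not anyone's actual RFP figures:

    # Lower bound on one-way latency through fiber between two points.
    C_KM_PER_S = 299792.458   # speed of light in vacuum, km/s
    FIBER_FACTOR = 1.5        # assumed refractive index: light moves at ~c/1.5

    def min_one_way_ms(km):
        return km / (C_KM_PER_S / FIBER_FACTOR) * 1000

    # New York <-> Chicago is roughly 1150 km as the crow flies (assumed):
    print("%.1f ms" % min_one_way_ms(1150))   # ~5.8 ms before any router hop

Any RFP demanding less than that over such a distance is literally asking
for faster-than-light delivery, which is why every avoidable hop matters.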

It's not only nice to multicast this data, it is mission critical.
People are risking millions of dollars every hour based on the data in
these feeds. The way it usually works (pioneered by NYSE, I believe) is
that they send two copies of every packet through two separate multicast
trees. If there is too much time differential between the arrival of the
two packets, then the service puts up a warning flag so that the traders
know their data is stale. Add a few more milliseconds and it shuts down
entirely, because the data is now entirely useless. When latency is this
important, those copies going to multiple subscribers have to be copied
in the packet-forwarding device, i.e. a router supporting IP multicast.
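
For the curious, here is a minimal sketch of that A/B arbitration logic,
assuming each packet carries a sequence number; the thresholds and the
(seq, payload) shape are hypothetical, not any exchange's wire format:

    import time

    WARN_SKEW = 0.005       # assumed: seconds of A/B skew before "stale" flag
    SHUTDOWN_SKEW = 0.050   # assumed: skew at which the feed is shut down

    class FeedArbiter:
        def __init__(self):
            self.first_seen = {}     # seq -> arrival time of the first copy
            self.delivered = set()   # seqs already handed to the consumer

        def on_packet(self, seq, payload):
            """Called for every packet arriving from either multicast tree."""
            now = time.monotonic()
            if seq not in self.delivered:
                # First copy wins: deliver at once, no store-and-forward hop.
                self.first_seen[seq] = now
                self.delivered.add(seq)
                return payload
            # Second copy: the gap between the trees measures feed health.
            skew = now - self.first_seen.pop(seq)
            if skew >= SHUTDOWN_SKEW:
                raise RuntimeError("A/B skew %.3fs: data useless" % skew)
            if skew >= WARN_SKEW:
                print("WARNING: data may be stale (A/B skew %.3fs)" % skew)
            return None

Note that the arbitration happens at the receiving edge; the copying
itself is done by the routers along each multicast tree.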

Of course, consumer video doesn't have the same strict latency
requirements, hence my opinion that IP multicast is unneeded complexity
there. Use the best tool for the job.

--Michael Dillon