[Nanog] ATT VP: Internet to hit capacity by 2010

For what it's worth, I agree with Ryan Paul's summary of the issues
here:

http://arstechnica.com/news.ars/post/20080420-analysis-att-fear-mongering-on-net-capacity-mostly-fud.html

...but take it at face value.

$.02,

- - ferg

The rest of the story?

http://www.usatoday.com/tech/products/services/2008-04-20-internet-broadband-traffic-jam_N.htm

   By 2010, the average household will be using 1.1 terabytes (roughly
   equal to 1,000 copies of the Encyclopedia Britannica) of bandwidth a
   month, according to an estimate by the Internet Innovation Alliance in
   Washington, D.C. At that level, it says, 20 homes would generate more
   traffic than the entire Internet did in 1995.

How many folks remember InternetMCI's capacity crunch in the 1990s,
when it actually had to stop installing new Internet connections for
several months because it had no more capacity?

I've been on the side arguing that there's going to be enough growth to
cause interesting issues (which is very different from arguing for any
specific remedy the telcos think will be to their benefit), but the
numbers quoted above strike me as an overstatement.

Let's look at the numbers:

iTunes video, which looks perfectly acceptable on my old NTSC TV, is .75
gigabytes per viewable hour. I think HDTV is somewhere around 8 megabits
per second (if I'm remembering correctly; I may be wrong about that),
which would translate to one megabyte per second, or 3.6 gigabytes per
hour.

For iTunes video, 1.1 terabytes would be 1,100 gigabytes, or 1,100 / .75 =
1,467 hours. 1,467 / 30 = 48.9 hours of video per day. Even assuming we
divide that among three or four people in a household, that's staggering.

For HDTV, 1,100 gigabytes would be 1,100 / 3.6 = 306 hours per month. 306
/ 30 = 10.2 hours per day.
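(A quick back-of-the-envelope in Python, just to make the arithmetic
above easy to rerun -- the 0.75 GB/hour and 8 Mbps figures are the
assumptions from the paragraphs above, not measurements:)

    # Hours of video per day that would fit in 1.1 TB/month,
    # at the assumed per-hour sizes above.
    MONTHLY_GB = 1100                      # 1.1 terabytes
    DAYS = 30

    def hours_per_day(gb_per_hour):
        return MONTHLY_GB / gb_per_hour / DAYS

    itunes = 0.75                          # GB per viewable hour (assumed)
    hdtv = 8 / 8 * 3600 / 1000             # 8 Mbps -> 3.6 GB per hour (assumed)

    print(round(hours_per_day(itunes), 1)) # ~48.9 hours/day
    print(round(hours_per_day(hdtv), 1))   # ~10.2 hours/day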

Maybe I just don't spend enough time around the "leave the TV on all day"
demographic. Is that a realistic number? Is there something bigger than
HDTV video that ATT expects people to start downloading?

-Steve

Steve Gibbard wrote:

Maybe I just don't spend enough time around the "leave the TV on all day"
demographic. Is that a realistic number? Is there something bigger than
HDTV video that ATT expects people to start downloading?
  

I would not be surprised if many households watch more than 10 hours of
TV per day. My trusty old Series 2 TiVo often records 5-8 hours of TV
per day, even if I don't watch any of it.

Right now I can get 80 or so channels of basic cable, and who knows how
many channels of digital cable or satellite, for as many TVs as I can
fit in my house, without the Internet buckling under the pressure. I
assume AT&T is just saying "We use this pipe for TV and Internet, hence
all TV is now considered Internet traffic"? How many people are REALLY
going to be pulling 10 hours of HD or even SD TV across their Internet
connection, rather than just taking what is multicast from a satellite
base station by their TV service provider? Is there something
significant about AT&T's model (other than the VDSL over twisted pair,
rather than coax/fiber to the prem) that makes them more afraid than
Comcast, Charter or Cox?

Maybe I'm just totally missing something - wouldn't be the first time.
Why would TV of any sort even touch the 'Internet'? And, no, YouTube is
not "TV" as far as I'm concerned.

Once upon a time, Steve Gibbard <scg@gibbard.org> said:

iTunes video, which looks perfectly acceptable on my old NTSC TV, is .75
gigabytes per viewable hour. I think HDTV is somewhere around 8 megabits
per second (if I'm remembering correctly; I may be wrong about that),
which would translate to one megabyte per second, or 3.6 gigabytes per
hour.

You're a little low. ATSC (the over-the-air digital broadcast format)
is 19 megabits per second or 8.55 gigabytes per hour. My TiVo probably
records 12-20 hours per day (I don't watch all that of course), often
using two tuners (so up to 38 megabits per second). That's not all HD
today of course, but the percentage that is HD is going up.

1.1 terabytes of ATSC-level HD would be a little over 4 hours a day. If
you have a family with multiple TVs, that's easy to hit.

That also assumes that we get 40-60 megabit connections (2-3 ATSC format
channels) that can sustain that level of traffic to the household with
widespread deployment in 2 years and that the "average" household hooks
it up to their TVs.
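(The same sketch with the ATSC numbers quoted above, for anyone who
wants to rerun it -- 19 Mbps is the figure used in the calculation
above; the full ATSC payload rate is about 19.39 Mbps:)

    # ATSC-rate HD against a 1.1 TB/month figure.
    atsc_gb_per_hour = 19 / 8 * 3600 / 1000        # ~8.55 GB/hour
    print(round(1100 / atsc_gb_per_hour / 30, 1))  # ~4.3 hours/day
    # Two tuners recording at once roughly doubles the rate (~38 Mbps).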

Why would TV of any sort even touch the 'Internet'? And, no,
YouTube is not "TV" as far as I'm concerned.

FWIW:

http://www.worldmulticast.com/marketsummary.html

I think you're too high there! MPEG2 SD is around 4-6Mbps, MPEG4 SD is around
2-4Mbps, MPEG4 HD is anywhere from 8 to 20Mbps, depending on how much wow
factor the broadcaster is trying to give.

A typical satellite TV multiplex is 20-30Mbps for 4-8 channels, depending
on how much the broadcaster pays for higher bitrate, and thus higher quality.

Simon

Once upon a time, Simon Lockhart <simon@slimey.org> said:

The different resolutions can operate in progressive scan or interlaced
mode, although the highest 1080-line system cannot display progressive
images at the rate of 59.94 or 60 frames per second. (Such technology was
seen as too advanced at the time, plus the image quality was deemed to be
too poor considering the amount of data that can be transmitted.) A
terrestrial (over-the-air) transmission carries 19.39 megabits of data per
second, compared to a maximum possible bitrate of 10.08 Mbit/s allowed in
the DVD standard.

Ric

My DirecTiVo records wads of stuff every day, but they are the same bits
that rain down on gazillions of other potential recorders and viewers.
Incremental cost to serve one more household: pretty much zero.

There are definitely narrowcast applications that don't make sense to
broadcast down from a bird, but it also makes no sense at all to claim,
for capacity planning purposes, that every household will need a unicast
IP stream of all of its TV viewing...

-dorn

Chris Adams wrote:

Once upon a time, Steve Gibbard <scg@gibbard.org> said:

iTunes video, which looks perfectly acceptable on my old NTSC TV, is .75
gigabytes per viewable hour. I think HDTV is somewhere around 8 megabits
per second (if I'm remembering correctly; I may be wrong about that),
which would translate to one megabyte per second, or 3.6 gigabytes per
hour.

You're a little low. ATSC (the over-the-air digital broadcast format)
is 19 megabits per second or 8.55 gigabytes per hour. My TiVo probably
records 12-20 hours per day (I don't watch all that of course), often
using two tuners (so up to 38 megabits per second). That's not all HD
today of course, but the percentage that is HD is going up.

1.1 terabytes of ATSC-level HD would be a little over 4 hours a day. If
you have a family with multiple TVs, that's easy to hit.

That also assumes that we get 40-60 megabit connections (2-3 ATSC format
channels) that can sustain that level of traffic to the household with
widespread deployment in 2 years and that the "average" household hooks
it up to their TVs.

I'm going to have to say that that's much higher than we're actually
going to see. You have to remember that there's not a ton of
compression going on in that. We're looking to start pushing HD video
online, and our initial tests show that 1.5 Mbps is plenty for HD
resolutions of video. We won't necessarily be doing 60 fps or
full-quality audio, but "HD" doesn't actually define exactly what it's
going to be.

Look at the HD offerings online today and I think you'll find that
they're mostly 1-1.5 Mbps. TV will stay much higher quality than that,
but if people are watching from their PCs, I think you'll see much more
compression going on, given that the hardware processing it has a lot
more horsepower.

I've found it interesting that those who do Internet TV (re)define HD in a
way that no one would consider HD anymore except the provider. =)

In the news recently there have been some complaints about Comcast's
HD TV. Comcast has been (selectively) fitting 3 MPEG-2 HD streams into
a 6 MHz carrier (38 Mbps / 3 = 12.6 Mbps each) and customers aren't
happy with that. I'm not sure how the average consumer will see 1.5
Mbps for HD video as sufficient unless it's QVGA.

Frank

Here in Comcast land, HDTV is actually averaging around 12 megabits a
second. Still adds up to staggering numbers... :)

Steve Gibbard wrote:

> I think you're too high there! MPEG2 SD is around 4-6Mbps, MPEG4 SD is
> around 2-4Mbps, MPEG4 HD is anywhere from 8 to 20Mbps, depending on
> how much wow factor the broadcaster is trying to give.

Nope, ATSC is 19 (more accurately 19.39) megabits per second.

So why would anyone plug an ATSC feed directly into the Internet?
Are there any devices that can play it other than a TV set?
Why wouldn't a video services company transcode it to MPEG4 and
transmit that?

I can see that some cable/DSL companies might transmit ATSC to
subscribers, but they would also operate local receivers so that the
traffic never touches their core. Rather like what a cable company does
today with TV receivers in their head ends.

All this talk of exafloods seems to ignore the basic economics of
IP networks. No ISP is going to allow subscribers to pull in 8 gigs
per day of video stream. And no broadcaster is going to pay for the
bandwidth needed to pump out all those ATSC streams. And nobody is
going to stick IP multicast (and multicast peering) in the core just
to deal with video streams to people who leave their TV on all day
whether they are at home or not.

At best you will see IP multicast on a city-wide basis in a single
ISP's network. Also note that IP multicast only works for live
broadcast TV. In today's world there isn't much of that except for
news. Everything else is prerecorded and thus it COULD be transmitted
at any time. IP multicast does not help you when you have 1000
subscribers all pulling in 1000 unique streams. In the 1960's it was
reasonable to think that you could deliver the same video to all
consumers because everybody was the same in one big melting pot. But
that day is long gone.

On the other hand, P2P software could be leveraged to download video
files during off-peak hours on the network. All it takes is some
cooperation between P2P software developers and ISPs so that you have
P2P clients which can be told to lay off during peak hours, or when
they want something from the other side of a congested peering
circuit. Better yet, the ISP's P2P manager could arrange for one full
copy of that file to get across the congested peering circuit during
the time period most favorable for that single circuit, then
distribute elsewhere.
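(A minimal sketch of the kind of cooperation described above, in
Python. The client object, its pause/resume methods, and the off-peak
window are all hypothetical placeholders -- in practice the ISP would
have to publish the window somehow; nothing here is an existing API:)

    import datetime

    OFF_PEAK_START = 1   # 01:00 local (assumed window)
    OFF_PEAK_END = 6     # 06:00 local

    def in_off_peak(now=None):
        # True if the current hour falls inside the ISP's off-peak window.
        hour = (now or datetime.datetime.now()).hour
        return OFF_PEAK_START <= hour < OFF_PEAK_END

    def schedule_bulk_transfers(p2p_client):
        # Tell a (hypothetical) P2P client to lay off during peak hours.
        if in_off_peak():
            p2p_client.resume_bulk_transfers()
        else:
            p2p_client.pause_bulk_transfers()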

--Michael Dillon

As far as I am concerned, the killer application for IP multicast is
*NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.

It's certainly not reasonable to assume the same video goes to all
consumers, but on the other hand, there *is* plenty of video that goes
to a *lot* of consumers. I don't really need my own personal unicast
copy of the bits that make up an episode of BSG or whatever. I would
hope that the future has even more TiVo-like devices at the consumer
edge that can take advantage of the right (desired) bits whenever they
are available. A single "box" that can take bits off the bird or cable
TV when what it wants is found there, or request them over IP when it
needs to, doesn't seem like rocket science...

-dorn

I've found it interesting that those who do Internet TV (re)define HD
in a way that no one would consider HD anymore except the provider. =)

The FCC did not appear to set a bit rate specification for HD
television.

The ATSC standard (A/53 Part 4) specifies aspect ratios, pixel formats,
and frame rates, but not bit rates.

So AFAICT, no redefinition is necessary. If you are doing (say) 720 x
1280 at 30 fps, you can call it HD, regardless of your bit rate. If you
can find somewhere where the standard says otherwise, I would like to
know about it.

In the news recently there have been some complaints about Comcast's
HD TV. Comcast has been (selectively) fitting 3 MPEG-2 HD streams into
a 6 MHz carrier (38 Mbps / 3 = 12.6 Mbps each) and customers aren't
happy with that. I'm not sure how the average consumer will see 1.5
Mbps for HD video as sufficient unless it's QVGA.

Well, not with a 15+ year old standard like MPEG-2. (And, of course, HD
is a set of pixel formats that specifically does not include QVGA.)

I have had video professionals go "wow" at H.264 dual-pass 720p
encodings at 2 Mbps, so it can be done. The real question is, how often
do you see artifacts? And, how much does the user care? Modern
encodings at these bit rates tend to provide very good encodings of
static scenes. As the on-screen action increases, so does the
likelihood of artifacts, so selection of bit rate depends, I think, on
user expectations and the typical content being shown. (As an aside, I
see lots of artifacts on my at-home Cable HD, but I don't know their
bandwidth allocation.)

Regards
Marshall

"IP multicast does not help you when you have 1000 subscribers all pulling
in 1000 unique streams. In the 1960's it was reasonable to think that you
could deliver the same video to all consumers because everybody was the same
in one big melting pot. But that day is long gone."

... well, multicast could be used - one stream for each of the "500
channels" or whatever, and the time-shifting could be done on the
recipients' side ... just like broadcast TV + DVR today ... as long as
we aren't talking about adding place-shifting (a la SlingBox) also! The
market (or, at least in the short-to-mid term, the provider :) ) would
decide on that.

/TJ

P2P isn't the only way to deliver content overnight; content could
also be delivered via multicast overnight.

http://www.intercast.com/Eng/Index.asp

http://kazam.com/Eng/About/About.jsp

Hmm, sorry, I did not get it. IMHO multicast is useless for VOD,
correct?

marc

Michael said the same thing "Also note that IP multicast only works
for live broadcast TV." and then mentioned that p2p could be used to
download content during off-peak hours.

   Kazam is a beta test that uses Intercast's technology to download
content overnight to a user's PC via multicast.

   My point was p2p isn't the only way to deliver content overnight,
multicast could also be used to do that, and in fact at least one
company is exploring that option.

   The example seemed to fit in well with the other examples in the
thread that mentioned TiVo type devices recording content for later
viewing on demand.

   I agree that multicast can be used for live TV, and others have
mentioned the multicasting of the BBC; www.ostn.tv is another example
of live multicasting. However, since TiVo type devices today record
broadcast content for later viewing on demand, there could certainly
be devices that record multicast content for later viewing on demand.
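(For what it's worth, recording a multicast stream for later playback
is not much code at the receiving end. A minimal Python sketch, assuming
the provider multicasts on a known group and port -- the 239.1.1.1:5004
values below are placeholders, not anyone's real service:)

    import socket, struct

    GROUP, PORT = "239.1.1.1", 5004   # placeholder group/port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Join the multicast group on the default interface.
    mreq = struct.pack("4s4s", socket.inet_aton(GROUP),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    # Dump the stream to disk for time-shifted, DVR-style viewing.
    with open("recording.ts", "wb") as out:
        while True:
            out.write(sock.recv(2048))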