Network end users to pull down 2 gigabytes a day, continuously?

Increased bandwidth consumption does
not necessarily cost money on most ISP infrastructure.
At my home I have a fairly typical ISP service using
BT's DSL. If I use a P2P network to download files from
other BT DSL users, then it doesn't cost me a penny
more than the basic DSL service. It also doesn't cost
BT any more.

It does cost BT; as I said, someone pays even if it's not obvious
to the user.

The only time that costs increase is when I download
data from outside of BT's network because the increased
traffic requires larger circuits or more circuits, etc.

Incorrect: DSLAM backhaul costs money regardless of where the traffic
comes from. ISPs pay for that, and it costs more than transit.

The real problem with P2P networks is that they don't
generally make download decisions based on network
architecture.

Indeed, that's what I said. Until then, ISPs can only fix it with P2P-aware
caches; if the protocols did it themselves, they wouldn't need the
caches, though P2P efficiency may go down.

It'll be interesting to see how Akamai & co. counter this trend. At the
moment they can say it's better to use a local Akamai cluster than have
P2P taking content from anywhere on the planet. Once it's mostly local
traffic then it's pretty much equivalent to Akamai. It's still moving
routing/TE up the stack though, so it will affect the ISPs' network ops.

I have to admit that I have no idea how BT charges
ISPs for wholesale ADSL.

Hence your first assertion is unfounded.

If there is indeed some kind
of metered charging then Internet video will be a big
problem for the business model.

It is; BT Wholesale don't give it away (usage-based or capacity-based,
ISPs still have to pay).

The difference with P2P is that caching is built-in to
the model, therefore 100% of users participate in
caching.

But those caches are in the wrong place; they'd be better
at the ISP end of the ADSL.

With HTTP, caches are far from universal,
especially to non-business users.

Doesn't matter: if they're buying P2P caches, they could buy HTTP caches if they
don't have one already. The point is they're buying something.

brandon

ISPs don't pay Akamai, content owners do.

Content owners are usually not concerned with the same things an ISP's "network ops" are. (I'm not saying that's a good thing, I'm just saying that is reality. Life might be much better all around if the two groups interacted more. Although one could say that Akamai fills that gap as well. :-)

Anyway, a content provider is going to do what's best for their content, not what's best for the ISP. It's a difficult argument to make to a content provider that they should put their content on millions of end-user HDs and depend on grandma to provide good-quality streaming to Joe Smith down the street. At least in my experience.

Setting aside the issue of what particular ISPs today have to pay, the real cost of sending data best-effort over an existing network, which has spare capacity and which is already supported and managed, is surely zero.

If I acquire content while I'm sleeping, during a low dip in my ISP's usage profile, the chances are good that nobody incurs more costs that month than if I had decided not to acquire it. (For example, you might imagine an RSS feed with BitTorrent enclosures, which requires no human presence to trigger the downloads.)
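
For illustration, here is a minimal sketch of that idea in Python, using only the standard library: poll a feed and collect any BitTorrent enclosure URLs so a client can fetch them overnight. The feed URL is invented, and handing the torrents to an actual download client is left out.

    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://example.net/show/episodes.rss"   # hypothetical feed

    def torrent_enclosures(feed_url):
        """Return the URLs of BitTorrent enclosures found in an RSS feed."""
        with urllib.request.urlopen(feed_url) as resp:
            tree = ET.parse(resp)
        return [enc.get("url")
                for enc in tree.iter("enclosure")
                if enc.get("type") == "application/x-bittorrent"]

    if __name__ == "__main__":
        for url in torrent_enclosures(FEED_URL):
            print("queue for overnight download:", url)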

If I acquire content the same time as many other people, since what I'm watching is some coordinated, streaming event, then it seems far more likely that the popularity of the content will lead to network congestion, or push up a peak on an interface somewhere which will lead to a requirement for a circuit upgrade, or affect a 95%ile transit cost, or something.

If asynchronous delivery of content is as free as I think it is, and synchronous delivery of content is as expensive as I suspect it might be, it follows that there ought to be more of the former than the latter going on.

If it turned out that there was several orders of magnitude more content being shifted around the Internet in a "download when you are able; watch later" fashion than there is content being streamed to viewers in real-time I would be thoroughly unsurprised.

Joe

I may have missed it in previous posts, but I think an important point is being missed in much of this discussion: take rate.

An assumption being made is one of widespread, long-term usage. I would argue consumers have little interest in viewing content for more than a few hundred seconds on their PC. Further, existing solutions for media extension to the television are gaining very little foothold outside of technophiles. They tend to be more complex for the average user than many vendors seemingly realize. While Apple may help in this arena, there are many other obstacles to widespread usage of streaming video outside of media extension.

In entertainment, content is king. More specifically, new release content is king. While internet distribution may help breathe life into the long tail market, it is hard to imagine any major shift from existing distribution methods. People simply like the latest TV shows and the latest movies.

So, this leaves us with little more than what is already offered by the MSOs: linear TV and VoD. This is where things become complex.

The studios will never (not any time soon) allow for a subscription based VoD on new content. They would instantly be sued by Time Warner (HBO). This leaves us with a non-subscription VoD option, which still requires an agreement with each of the major studios, and would likely cost a fortune to obtain. CinemaNow and MovieLink have done this successfully, and use a PushVoD model to distribute their content. CinemaNow allows DVD burning for some of their content, but both companies are otherwise tied to the PC (without a media extender). Furthermore, the download wait is a pain. Their content is good quality: 1200-1500 kbps VC-1 (wince). It is really hard to say when and if either of these will take off as a service. It is a good service, with a great product, and almost no market at the moment. Get it on the TV and things may change dramatically.

This leaves us with linear TV, which is another acquisition nightmare. It is very difficult to acquire pass-through/distribution rights for linear television, especially via IP. Without deep pockets, a company might be spinning their wheels trying to get popular channels onto their lineup. And good luck trying to acquire the rights to push linear TV outside of a closed network. The studios will hear none of it.

I guess where I am going with all this is simply it is very hard to make this work from a business and marketing side. The network constraints are, likely, a minor issue for some time to come. Interest is low in the public at large for primary (or even major secondary) video service on the PC.

By the time interest in the product swells and content providers ease some of their more stringent rules for content distribution, a better solution for multicasting the content will have presented itself. I would argue streaming video across the Internet to a large audience, direct to subscribers, is probably 4+ years away at best.

I am not saying we throw in the towel on this problem, but I do think unicast streaming has a limited scope and short life-span for prime content. IPv6 multicast is the real long term solution for Internet video to a wide audience.

Of course, there is the other argument. The ILECs and MSOs will keep it from ever getting beyond a unicast model. Why let the competition in, right? *sniff* I smell lobbyists and legislation. :-)

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

Setting aside the issue of what particular ISPs today have to pay, the real cost of sending data best-effort over an existing network, which has spare capacity and which is already supported and managed, is surely zero.

As long as the additional traffic doesn't exceed the existing capacity.

But what happens when 5% of the paying subscribers use 95% of the existing capacity, and then the other 95% of the subscribers complain about poor performance? What is the real cost to the ISP needing to upgrade the
network to handle the additional traffic being generated by 5% of the
subscribers when there isn't "spare" capacity?

If I acquire content while I'm sleeping, during a low dip in my ISP's usage profile, the chances are good that nobody incurs more costs that month than if I had decided not to acquire it. (For example, you might imagine an RSS feed with BitTorrent enclosures, which requires no human presence to trigger the downloads.)

The reason why many universities buy rate-shaping devices is dorm users don't restrain their application usage to only off-peak hours, which may or may not be related to sleeping hours. If peer-to-peer applications restrained their network usage during periods of peak network usage so it didn't result in complaints from other users, it would probably have a better reputation.
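
As a rough sketch of the kind of self-restraint being asked for, a client could simply cap its transfer rate outside an off-peak window. The window and the rate figures below are invented for illustration; a real client would want to learn the ISP's actual busy hours rather than guess.

    from datetime import datetime

    OFF_PEAK_START = 1   # 01:00 local time (assumed quiet period)
    OFF_PEAK_END   = 7   # 07:00 local time

    def allowed_rate_kbps(now=None):
        """Return a transfer cap depending on the time of day (0 = uncapped)."""
        hour = (now or datetime.now()).hour
        if OFF_PEAK_START <= hour < OFF_PEAK_END:
            return 0      # run flat out while the network is quiet
        return 64         # trickle during the ISP's busy hours

    print(allowed_rate_kbps())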

If I acquire content the same time as many other people, since what I'm watching is some coordinated, streaming event, then it seems far more likely that the popularity of the content will lead to network congestion, or push up a peak on an interface somewhere which will lead to a requirement for a circuit upgrade, or affect a 95%ile transit cost, or something.

Depends on when and where the replication of the content is taking place.

Broadcasting is a very efficient way to distribute the same content to large numbers of people, even when some people may watch it later. You can broadcast either streaming or file downloads. You can also unicast either streaming or file downloads. Unicast tends to be less efficient for distributing the same content to large numbers of people. Then there are lots of events in the middle. Some content is only of interest to some people.

Streaming vs download and broadcast vs unicast. There are lots of combinations. One way is not necessarily the best way for every
situation. Sometimes store-and-forward e-mail is useful, other times
instant messenger communications is useful. Things may change over time. For example, USENET has mostly stopped being widely flooded through every ISP and large institution, and is now accessed on demand by users from a few large aggregators.

Distribution methods aren't mutually exclusive.

If asynchronous delivery of content is as free as I think it is, and synchronous delivery of content is as expensive as I suspect it might be, it follows that there ought to be more of the former than the latter going on.

If it turned out that there was several orders of magnitude more content being shifted around the Internet in a "download when you are able; watch later" fashion than there is content being streamed to viewers in real-time I would be thoroughly unsurprised.

If you limit yourself to the Internet, you exclude a lot of content
being shifted around and consumed in the world. The World Cup or Superbowl are still much bigger events than Internet-only events. Broadcast
television shows with even bottom ratings are still more popular than most Internet content. The Internet is good for narrowcasting, but it's
still working on mass audience events.

"Asynchronous receivers" are more expensive and usually more complicated
than "synchronous receivers." Not everyone owns a computer or spends a
several hundred dollars for a DVR. If you already own a computer, you might consider it "free." But how many people want to buy a computer for each television set? In the USA, Congress debated whether it should
spend $40 per digital receiver so people wouldn't lose their over the air broadcasting.

Gadgets that interest 5% of the population versus reaching 95% of the population may have different trade-offs.

Joe Abley said: >>(For example, you
might imagine an RSS feed with BitTorrent enclosures, which requires
no human presence to trigger the downloads.)

I think that is essentially the Democracy client I mentioned.

Great thread so far, btw.

But what happens when 5% of the paying subscribers use 95% of the
existing capacity, and then the other 95% of the subscribers complain about poor
performance?

"Capacity" is too vague of a word here. If we assume that the P2P
software can be made to recognize the ISP's architecture and prefer
peers that are topologically nearby, then the issue focuses on the
ISP's own internal capacity. It should not have a major impact on
the ISP's upstream capacity which involves stuff that is rented
from others (transit, peering). Also, because P2P traffic has its
sources evenly distributed, it makes a case for cheap local
BGP peering connections, again, to offload traffic from more
expensive upstream transit/peering.
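
As a concrete, if simplistic, sketch: if the client knew (or could be told) which prefixes are on-net, it could simply try those peers first and keep most of the traffic off paid transit. The prefix list below is invented for illustration; how the client learns it is exactly the open question.

    import ipaddress

    # Hypothetical on-net address space, e.g. learned from the local ISP
    LOCAL_PREFIXES = [ipaddress.ip_network(p)
                      for p in ("81.128.0.0/12", "217.32.0.0/13")]

    def is_local(peer_ip):
        addr = ipaddress.ip_address(peer_ip)
        return any(addr in net for net in LOCAL_PREFIXES)

    def prefer_local(peers):
        """Order candidate peers so on-net addresses are tried first."""
        return sorted(peers, key=lambda ip: not is_local(ip))

    print(prefer_local(["203.0.113.7", "81.140.22.9", "198.51.100.3"]))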

What is the real cost to the ISP needing to upgrade the
network to handle the additional traffic being generated by 5% of the
subscribers when there isn't "spare" capacity?

In the case of DSL/Cable providers, I suspect it is mostly in
the Ethernet switches that tie the subscriber lines into the
network.

The reason why many universities buy rate-shaping devices is dorm users
don't restrain their application usage to only off-peak hours, which may
or may not be related to sleeping hours. If peer-to-peer applications
restrained their network usage during periods of peak network usage so
it didn't result in complaints from other users, it would probably
have a better reputation.

I am suggesting that ISP folks should be cooperating with
P2P software developers. Typically, the developers have a very
vague understanding of how the network is structured and are
essentially trying to reverse engineer network capabilities.
It should not be too difficult to develop P2P clients that
receive topology hints from their local ISPs. If this results
in faster or more reliable/predictable downloads, then users
will choose to use such a client.

The Internet is good for narrowcasting, but it's
still working on mass audience events.

Then, perhaps we should not even try to use the Internet
for mass audience events. Is there something wrong with
the current broadcast model? Did TV replace radio? Did
radio replace newspapers?

--Michael Dillon

<snip>

I am suggesting that ISP folks should be cooperating with
P2P software developers. Typically, the developers have a very
vague understanding of how the network is structured and are
essentially trying to reverse engineer network capabilities.
It should not be too difficult to develop P2P clients that
receive topology hints from their local ISPs. If this results
in faster or more reliable/predictable downloads, then users
will choose to use such a client.

I'd think TCP's underlying and constant round-trip time measurement to
peers could be used for that. I've wondered fairly recently whether P2P
protocols do that, but I hadn't found the time to see if it was so.
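
A crude approximation of that, without any kernel help, is to time a TCP handshake to each candidate peer and prefer the quickest responders. This is only a stand-in for TCP's own RTT estimate, and the addresses and port below are placeholders, but it is the sort of measurement a client can do today with no ISP cooperation at all.

    import socket
    import time

    def connect_rtt(host, port=6881, timeout=2.0):
        """Time a TCP handshake to a peer; None if unreachable."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return time.monotonic() - start
        except OSError:
            return None

    def rank_by_rtt(peers):
        measured = [(connect_rtt(p), p) for p in peers]
        reachable = [(rtt, p) for rtt, p in measured if rtt is not None]
        return [p for rtt, p in sorted(reachable)]

    print(rank_by_rtt(["192.0.2.10", "198.51.100.3"]))   # example addresses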

Dear Sean;

Setting aside the issue of what particular ISPs today have to pay, the real cost of sending data best-effort over an existing network, which has spare capacity and which is already supported and managed, is surely zero.

As long as the additional traffic doesn't exceed the existing capacity.

But what happens when 5% of the paying subscribers use 95% of the existing capacity, and then the other 95% of the subscribers complain about poor performance? What is the real cost to the ISP needing to upgrade the
network to handle the additional traffic being generated by 5% of the
subscribers when there isn't "spare" capacity?

If I acquire content while I'm sleeping, during a low dip in my ISP's usage profile, the chances are good that nobody incurs more costs that month than if I had decided not to acquire it. (For example, you might imagine an RSS feed with BitTorrent enclosures, which requires no human presence to trigger the downloads.)

The reason why many universities buy rate-shaping devices is dorm users don't restrain their application usage to only off-peak hours, which may or may not be related to sleeping hours. If peer-to-peer applications restrained their network usage during periods of peak network usage so it didn't result in complaints from other users, it would probably have a better reputation.

Do not count on demand being geographically localized or limited to certain times of day. The audience for streaming is world-wide (for an example, see

http://www.americafree.tv/Ads/geographical.html

for a few-hour slice in the early evening EST on a Sunday - note, BTW, that this is for English-language content). The roughly equal distribution to the US and the EU is entirely normal; typically the peak-to-trough bandwidth usage variation during a day is less than a factor of 2, and frequently it disappears altogether.

Regards
Marshall

Setting aside the issue of what particular ISPs today have to pay, the real cost of sending data best-effort over an existing network, which has spare capacity and which is already supported and managed, is surely zero.

As long as the additional traffic doesn't exceed the existing capacity.

Indeed.

So perhaps we should expect to see distribution price models whose success depends on that spare (off-peak, whatever) capacity being available replaced by others which don't.

If that's the case, and assuming the cost benefits of using slack capacity continue to be exploited, the bandwidth metrics mentioned in the original post might be those which assume a periodic utilisation profile, rather than those which just assume that spare bandwidth will be used.

(It's still accounting based on peak; the difference might be that in the second model there really isn't that much of a peak any more, and the effect of that is a bonus window during which existing capacity models will sustain the flood.)
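
For anyone unfamiliar with the accounting being referred to: the usual 95th-percentile method takes the month's 5-minute samples, discards the top 5%, and bills on the highest remaining sample. A small sketch with made-up sample data shows why a flattened profile and a peaky one moving roughly the same volume can be billed very differently:

    def ninety_fifth_percentile(samples_mbps):
        # common, slightly simplified convention: 95% of samples sit at or below this
        ordered = sorted(samples_mbps)
        index = int(len(ordered) * 0.95) - 1
        return ordered[max(index, 0)]

    flat  = [100] * 8640                 # steady 100 Mb/s all month (5-min samples)
    peaky = [50] * 7776 + [500] * 864    # quiet 90% of the time, busy the other 10%

    print(ninety_fifth_percentile(flat))    # 100
    print(ninety_fifth_percentile(peaky))   # 500, despite similar total volume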

If you limit yourself to the Internet, you exclude a lot of content
being shifted around and consumed in the world. The World Cup or Superbowl are still much bigger events than Internet-only events. Broadcast
television shows with even bottom ratings are still more popular than most Internet content. The Internet is good for narrowcasting, but it's
still working on mass audience events.

Ah, but I wasn't comparing internet distribution with cable/satellite/UHF/whatever -- I was comparing content which is streamed with content which isn't.

The cost differences between those are fairly well understood, I think. Reliable, high-quality streaming media is expensive (ask someone like Akamai for a quote), whereas asynchronous delivery of content (e.g. through BitTorrent trackers) can result in enormous distribution of data with a centralised investment in hardware and network which is demonstrably sustainable by voluntary donations.

"Asynchronous receivers" are more expensive and usually more complicated
than "synchronous receivers."

Well, there's no main-stream, blessed product which does the kind of asynchronous acquisition of content on anything like the scale of digital cable terminals; however, that's not to say that one couldn't be produced for the same cost. I'd guess that most of those digital cable boxes are running linux anyway, which makes it a software problem.

If we're considering a fight between an intelligent network (one which can support good-quality, isochronous streaming video at high data rates from the producer to the consumer) and a stupid one (which concentrates on best-effort distribution of data, asynchronously, with a smarter edge) then absent external constraints regarding copyright, digital rights, etc, I presume we'd expect the stupid network model to win. Eventually.

  Not everyone owns a computer or spends
several hundred dollars for a DVR. If you already own a computer, you might consider it "free."

Since I was comparing two methods of distributing material over the Internet, the availability of a computer is more or less a given. I'm not aware of a noticeable population of broadband users who don't own a computer, for example (apart from those who are broadband users without noticing, e.g. through a digital cable terminal which talks IP to the network).

Joe

From: owner-nanog@merit.edu [mailto:owner-nanog@merit.edu] On
Behalf Of Gian Constantine
Sent: Sunday, January 07, 2007 7:18 PM
To: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a
day, continuously?

<snip>

In entertainment, content is king. More specifically, new
release content is king. While internet distribution may help
breathe life into the long tail market, it is hard to imagine
any major shift from existing distribution methods. People
simply like the latest TV shows and the latest movies.

What's new to you is very different from what's new to me.

I am very happy watching 1-year-old episodes of Top Gear, whereas
if you are located in the UK, you may consider this old news.

The story here is about the cost of storing the video content (which
is asymptotically zero) and the cost of distributing it (which is also
asymptotically
approaching zero, despite the ire of the SPs).

So, this leaves us with little more than what is already
offered by the MSOs: linear TV and VoD. This is where things
become complex.

The studios will never (not any time soon) allow for a
subscription based VoD on new content. They would instantly
be sued by Time Warner (HBO).

This is a very US-centric view of the world. I am sure there are
hundreds of
TV stations from India, Turkey, Greece, etc that would love to put their
content
online and make money off the long tail.

I guess where I am going with all this is simply it is very
hard to make this work from a business and marketing side.
The network constraints are, likely, a minor issue for some
time to come. Interest is low in the public at large for
primary (or even major secondary) video service on the PC.

Again, your views are very US centric, and are mono-cultural.

If you open your horizons, I think there is a world of content out there
that the content owners would be happy to license and sell at < 10 cents
a pop.
To them it is dead content, but it turns out to be worth
something to someone out there.
This is what iTunes and Rhapsody are doing with music. And the day of
the video is coming.

Bora

-- Off to raise some venture funds now. (Just kidding ;-)

Well, yes. My view on this subject is U.S.-centric. In fairness to me, this is NANOG, not AFNOG or EuroNOG or SANOG.

I would also argue storage and distribution costs are not asymptotically zero with scale. Well designed SANs are not cheap. Well designed distribution systems are not cheap. While price does decrease when scaled upwards, the cost of such an operation remains hefty, and increases with additions to the offered content library and a swelling of demand for this content. I believe the graph becomes neither asymptotic, nor anywhere near zero.

You are correct on the long tail nature of music. But music is not consumed in a similar manner as TV and movies. Television and movies involve a little more commitment and attention. Music is more for the moment and the mood. There is an immediacy with music consumption. Movies and television require a slight degree more patience from the consumer. The freshness (debatable :-) ) of new release movies and TV can often command the required patience from the consumer. Older content rarely has the same pull.

I agree there is a market for ethnic and niche content, but it is not the broad market many companies look for. The investment becomes much more of a gamble than marketing the latest and greatest (again debatable :-) ) to the larger market of…well…everyone.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
constantinegi@corp.earthlink.net

Please see my comments inline:

From: Gian Constantine [mailto:constantinegi@corp.earthlink.net]
Sent: Monday, January 08, 2007 4:27 PM
To: Bora Akyol
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a
day, continuously?

<snip>

I would also argue storage and distribution costs are not
asymptotically zero with scale. Well designed SANs are not
cheap. Well designed distribution systems are not cheap.
While price does decrease when scaled upwards, the cost of
such an operation remains hefty, and increases with additions
to the offered content library and a swelling of demand for
this content. I believe the graph becomes neither asymptotic,
nor anywhere near zero.

To the end user, there is no cost to downloading videos when they are
sleeping.
I would argue that other than sports (and some news) events, there is
pretty much no content that
needs to be real time. What the downloading (possibly 24x7) does is to
stress the ISP network to its max since the assumptions of statistical
multiplexing
go out the window. Think of a Tivo that downloads content off the
Internet
24x7.

The user is still paying only their flat monthly fee, and this is
"network neutrality 2.0" all over again.

You are correct on the long tail nature of music. But music
is not consumed in a similar manner as TV and movies.
Television and movies involve a little more commitment and
attention. Music is more for the moment and the mood. There
is an immediacy with music consumption. Movies and television
require a slight degree more patience from the consumer. The
freshness (debatable :-) ) of new release movies and TV can
often command the required patience from the consumer. Older
content rarely has the same pull.

I would argue against your distinction between visual and auditory
content.
There is a lot of content out there that a lot of people watch and the
content
is 20-40+ years old. Think Brady Bunch, Bonanza, or archived games from
NFL,
MLB etc. What about Smurfs (for those of us with kids)?

This is only the beginning.

If I can get a 500GB box and download MP4 content, that's a lot of
essentially free storage.

Coming back to NANOG content, I think video (not streamed but multi-path
distributed video) is going to bring the networks down not by sheer
bandwidth alone but by challenging the assumptions behind the
engineering of the network. I don't think you need huge SANs per se to
store the content either; since it is multi-source/multi-sink, the
reliability is built-in.

The SPs like Verizon & AT&T moving fiber to the home hoping to get in on
the "value add" action are in for an awakening IMHO.

Regards

Bora
ps. I apologize for the tone of my previous email. That sounded grumpier
than I usually am.

Let's see what I can do using today's technology:

According to the iTunes website they have over 3.5 million songs. Let's
call it 4 million. Assume a decent bit rate and make them average 10 MB
each. That's 40 TB, which would cost me $6k per month to store on Amazon
S3. Let's assume we use Amazon EC2 to only allow torrents of the files to
be downloaded and we transfer each file twice per month. Total cost around
$20k per month or $250k per year. Add $10k to pay somebody to create the
interface and put up a few banner ads and it'll be self-supporting.
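
Working those numbers through, with per-GB rates chosen to reproduce the figures above (roughly the published S3 pricing of the time; treat the exact rates as assumptions):

    songs         = 4_000_000
    avg_size_gb   = 0.01                      # ~10 MB per song
    storage_gb    = songs * avg_size_gb       # 40,000 GB = 40 TB

    storage_rate  = 0.15                      # $/GB-month (assumed)
    transfer_rate = 0.20                      # $/GB transferred (assumed)
    copies        = 2                         # each file transferred twice a month

    storage_cost  = storage_gb * storage_rate             # ~$6,000 / month
    transfer_cost = storage_gb * copies * transfer_rate   # ~$16,000 / month

    print(storage_cost + transfer_cost)         # ~$22k / month
    print(12 * (storage_cost + transfer_cost))  # ~$264k / year, call it $250k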

That sort of setup could come out of petty cash for larger ISPs' marketing
departments.

Of course there are a few problems with the above business model (mostly
legal), but infrastructure costs are not one of them. Plug in your own
numbers for movies and TV shows, but 40 TB for each will probably be enough.

There may have been a disconnect on my part, or at least, a failure to disclose my position. I am looking at things from a provider standpoint, whether as an ISP or a strict video service provider.

I agree with you. From a consumer standpoint, a trickle or off-peak download model is the ideal low-impact solution to content delivery. And absolutely, a 500GB drive would almost be overkill on space for disposable content encoded in H.264. Excellent SD (480i) content can be achieved at ~1200 to 1500kbps, resulting in about a 1GB file for a 90 minute title. HD is almost out of the question for internet download, given good 720p at ~5500kbps, resulting in a 30GB file for a 90 minute title.

Service providers wishing to provide this service to their customers may see some success where they control the access medium (copper loop, coax, FTTH). Offering such a service to customers outside of this scope would prove very expensive, and likely, would never see a return on the investment without extensive peering arrangements. Even then, distribution rights would be very difficult to attain without very deep pockets and crippling revenue sharing. The studios really dislike the idea of transmission outside of a closed network. Don’t forget. Even the titles you mentioned are still owned by very large companies interested in squeezing every possible dime from their assets. They would not be cheap to acquire.

Further, torrent-like distribution is a long long way away from sign off by the content providers. They see torrents as the number one tool of content piracy. This is a major reason I see the discussion of tripping upstream usage limits through content distribution as moot.

I am with you on the vision of massive content libraries at the fingertips of all, but I see many roadblocks in the way. And, almost none of them are technical in nature.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
constantinegi@corp.earthlink.net

So, kind of back to the original question: what is going to be the reaction of your average service provider to the presence of an increasing number of people sucking down massive amounts of video and spitting it back out again... nothing? throttling all traffic of a certain type? shutting down customers who exceed certain thresholds? or just throttling their traffic? massive upgrades of internal network hardware?

Is it your contention that there's no economic model, given the architecture of current networks, which would generate enough revenue to offset the cost of traffic generated by P2P video?

Thomas

Gian Constantine wrote:

My contention is simple. The content providers will not allow P2P video as a legal commercial service anytime in the near future. Furthermore, most ISPs are going to side with the content providers on this one. Therefore, discussing it at this point in time is purely academic, or more so, diversionary.

Personally, I am not one for throttling high use subscribers. Outside of the fine print, which no one reads, they were sold a service of Xkbps down and Ykbps up. I could not care less how, when, or how often they use it. If you paid for it, burn it up.

I have questions as to whether or not P2P video is really a smart distribution method for a service provider who controls the access medium. Outside of being a service provider, I think the economic model is weak, when there can be little expectation of a large scale take rate.

Ultimately, my answer is: we’re not there yet. The infrastructure isn’t there. The content providers aren’t there. The market isn’t there. The product needs a motivator. This discussion has been putting the cart before the horse.

A lot of big pictures pieces are completely overlooked. We fail to question whether or not P2P sharing is a good method in delivering the product. There are a lot of factors which play into this. Unfortunately, more interest has been paid to the details of this delivery method than has been paid to whether or not the method is even worthwhile.

From a big picture standpoint, I would say P2P distribution is a non-starter, too many reluctant parties to appease. From a detail standpoint, I would say P2P distribution faces too many hurdles in existing network infrastructure to be justified. Simply reference the discussion of upstream bandwidth caps and you will have a wonderful example of those hurdles.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

Gian Constantine wrote:

Well, yes. My view on this subject is U.S.-centric. In fairness to me, this is NANOG, not AFNOG or EuroNOG or SANOG.

I thought Québec and Mexico belonged to the North American network too.

...

I agree there is a market for ethnic and niche content, but it is not the broad market many companies look for. The investment becomes much more of a gamble than marketing the latest and greatest (again debatable :slight_smile: ) to the larger market of...well...everyone.

There is only a minority in North America who happens to be white, and
only some of them speak English.

I remember the times when I could watch Mexican TV transmitted from a
studio in Florida.

Today everything is encrypted on the satellites. We have to use the Internet
when we want something special here in Germany.

I guess Karin and I are not the only ones who do not even own a TV set.
The Internet is the richer choice.

Even if it is mostly audio (video is nasty overseas), I am sure it does
make an impact in North America. Listening on my VoIP phone is mostly
impossible now, at least overseas. I used to be able to phone overseas,
but even the landline has deteriorated because the phone companies have
switched to VoIP themselves.

Cheers
Peter and Karin

I remember the times when I could watch Mexican TV transmitted from a
studio in Florida.

If it comes from a studio in Florida then it
is AMERICAN TV, not Mexican TV. I believe there
are three national TV networks in the USA,
which are headquartered in Miami and which
broadcast in Spanish.

--Michael Dillon

I am not sure what I was thinking. Mr Bonomi was kind enough to point out a failed calculation for me. Obviously, an HD file would only be about 3.7GB for a 90-minute file at 5500kbps. In my haste, I neglected to convert bits to bytes. My apologies.
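
For reference, the corrected arithmetic; the bits-versus-bytes slip is the whole difference between the earlier 30GB figure and roughly 3.7GB:

    bitrate_kbps = 5500
    minutes      = 90

    size_bytes = bitrate_kbps * 1000 * minutes * 60 / 8   # bits -> bytes
    print(round(size_bytes / 1e9, 1))                     # ~3.7 GB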

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
constantinegi@corp.earthlink.net