Online games stealing your bandwidth

I think most people are aware that the Blizzard "World of Warcraft" patcher
distributes files through Bittorrent; however, a number of other
MMO companies (LotR, Lego) are apparently doing something similar but aren't
as upfront about it, and are installing Windows services which seed whenever
the computer is online. Game Companies Should Play Fair With P2P |
TorrentFreak <http://torrentfreak.com/game-companies-should-play-fair-with-p2p-100901/>

If you follow the links in the article, people are complaining that the LotR
process has served 70 GB in a week; others are complaining that the service
is resulting in 300 ms pings and unusable connections.
This is a very grey area; it will be interesting to see how this issue
unfolds in the long run.

<snip>

I once read an article talking about making BitTorrent scalable by
using anycasted caching services at the ISP's closest POP to the end
user. Given sufficient traffic on a specified torrent, the caching
device would build up the file, then distribute that direct to the
subscriber in the form of an additional (preferred) peer. Similar to a
CDN or Usenet, but where it was cached rather than deliberately pushed
out from a locus.
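
Something like this, in rough sketch form (all names invented; no real
tracker or BitTorrent client API is assumed):

    from collections import Counter

    POPULARITY_THRESHOLD = 50   # local join events before a torrent is cached

    class PopCache:
        def __init__(self):
            self.joins = Counter()   # infohash -> local join events seen
            self.cached = set()      # infohashes fully fetched to the POP

        def observe(self, infohash):
            """Call whenever a local subscriber is seen joining a swarm."""
            self.joins[infohash] += 1
            if (self.joins[infohash] >= POPULARITY_THRESHOLD
                    and infohash not in self.cached):
                # A real appliance would download the torrent once over
                # transit here, store it, and start seeding from the POP.
                self.cached.add(infohash)

        def extra_peer_for(self, infohash):
            """Handed to local clients as an additional (preferred) peer."""
            if infohash in self.cached:
                return ("cache.pop.example.net", 6881)
            return None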

Was anything ever standardised in that field? I imagine that with much
of P2P traffic being (how shall I put this...) less than legal, such a
cache would be of questionable legality, and the ISPs would not want to
be held liable for the content cached there?

M

I haven't played any of these things, so I don't know what they put in
the fine print, but unless LotR makes it clear that they're going to
utilize your (i.e. players of the game) bandwidth to P2P-distribute
their software, I'd call that theft and unauthorized use of a computer
network.
Are these companies not making enough in monthly subscriptions to
afford Akamai or similar CDN services to distribute their software
updates?

I personally love Bittorrent. It is wonderful for CDN - for both legal
and not-so-legal files. However, I despise the game loaders not giving
you more options to control this traffic. For example, many run as a
daemon that never stops seeding - even when you're in-game. You could
kill it manually, but it will just relaunch itself. If you remove it,
enjoy not being able to continue playing the game when it updates.

"Pando Media Booster" (which I have had to deal with), detects your
upload speed and uses about 3/4ths of it. The game installer never even
had you accept an EULA - just when PMB.exe started when it was
installed, it says on the bottom "you have accepted the EULA". EULAs
don't really mean much now of days, sadly. This ruins the network
quality, especially on large LANs, and most people don't even realize
it's using all their connection.

My personal verdict on it is: give the users the option to limit the
upload speed to whatever they want, and an option to disable it when
they don't need to download any updates; if this is done, I personally
see no problem with it. If these options are not added, I see it as
malware that should be deleted.
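
Both controls are easy to build. A minimal token-bucket sketch of a
user-set upload cap plus an off switch (hypothetical names, and the
actual send loop is omitted):

    import time

    class UploadLimiter:
        def __init__(self, max_bytes_per_sec, enabled=True):
            self.rate = max_bytes_per_sec   # user-chosen, not auto-detected
            self.enabled = enabled          # user can stop seeding entirely
            self.tokens = 0.0
            self.last = time.monotonic()

        def allow(self, nbytes):
            """Return True if nbytes may be uploaded right now."""
            if not self.enabled:
                return False
            now = time.monotonic()
            self.tokens = min(self.rate,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return True
            return False

    limiter = UploadLimiter(50 * 1024)   # e.g. cap seeding at 50 KB/s
    limiter.enabled = False              # "I don't need any updates now"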

As I said above, I have no problems with using Bittorrent as a method of
CDN - it's actually one of the best methods. But, the end-users need
more control over it.

If you read the article, you will see that Akamai is one of the
perpetrators, via the "Akamai NetSession Interface". :)

The ISP is off the hook on that one. 17 USC 512(b) specifically carves out an ISP
safe-harbor for data that's only cached on an ISP's servers due to an end
user's request. IANAL, so have somebody you pay for legal advice read 17 USC 512(b)
and tell you what they think.

Speaking to your example with Blizzard:

The Blizzard downloader does provide an option to disable P2P transfers which then downloads direct via http from Blizzard.

Yes, the update software defaults to allow P2P, but it isn't like they are forcing it upon their users. I have seen Sony do the same thing and have never seen a downloader where you couldn't disable that option if you liked.

-Kevin

So if that is true, and if you define a "news server" as a "cache" -
even though you have to buy several terabytes, make that several
petabytes, to be able to "cache" this data, along with all the network
environment to support getting data out of this "cache" - then the ISP
is completely in the clear, even though that "cache" is the sole single
point from which one can retrieve that "cached" data even years after
the data was originally put on the network, the original is gone, and
that "cache" works without anything being attached to it? :)

Oh, one just has to love these things called laws... good that they are
protecting the right parts of the distribution network, eh? ;)

Greets,
Jeroen

Speaking to your example with Blizzard:

It was not my example, I do not play Blizzard games.

The Blizzard downloader does provide an option to disable P2P
transfers which then downloads direct via http from Blizzard.

This is nice. Many other games using Pando and other such P2P
downloaders do not.

Yes, the update software defaults to allow P2P, but it isn't like they
are forcing it upon their users. I have seen Sony do the same thing
and have never seen a downloader where you couldn't disable that
option if you liked.

I am 100% sure "Pando Media Booster" does not give you the option of
disabling it. Unless they hide it very, very deep.

t

I think most people are aware that the Blizzard "World of Warcraft" patcher
distributes files through Bittorrent;

<snip>

I once read an article talking about making BitTorrent scalable by
using anycasted caching services at the ISP's closest POP to the end
user.

<snip>

Was anything ever standardised in that field?

M

IMHO,

Sooner or later our community will catch on and begin to deploy such
technology. P2P is a really elegant 'tool' when used to distribute large
files (which we all know). I expect that even the biggest last-mile
providers will lose the arms race they currently engage in
against this 'tool' and start participating in and controlling the flow of
data.

Throwing millions into technologies to thwart this 'tool', such as DPI,
only takes away from a last-mile provider's ability to offer service.
I believe this is one reason the USA lags the rest of the world in
broadband deployment.

Ultimately, I believe it will make sense to design last-mile networks to
benefit from P2P (e.g. allow end stations to communicate locally rather
than forcing traffic that could stay local through a session-based
router at a central office), and then take advantage by deploying a
scenario such as the one you've outlined to keep swarms local. Before we
do that, though, we need to cut the paranoia about this particular tool
(paranoia created by the RIAA and others), and we need to see a few more
execs with vision.
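
As a sketch, "keeping swarms local" can be as simple as a client or
tracker preferring peers inside the provider's own prefixes (the
prefixes below are just the documentation ranges, for illustration):

    import ipaddress

    LOCAL_PREFIXES = [ipaddress.ip_network(p)
                      for p in ("192.0.2.0/24", "198.51.100.0/24")]

    def is_local(peer_ip):
        addr = ipaddress.ip_address(peer_ip)
        return any(addr in net for net in LOCAL_PREFIXES)

    def order_peers(peer_ips):
        """Local peers first; the client fills its slots from the front."""
        return sorted(peer_ips, key=lambda ip: not is_local(ip))

    print(order_peers(["203.0.113.7", "192.0.2.50", "198.51.100.9"]))
    # -> ['192.0.2.50', '198.51.100.9', '203.0.113.7']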

jy

I don't recall any protocols being standardised.

Plenty of people sell p2p caches but they all work using magic, smoke
and mirrors.

Adrian

I had P4P (http://en.wikipedia.org/wiki/Proactive_network_Provider_Participation_for_P2P)
pointed out to me - but like you said, it's hardly going to be an open
standard if it gets anywhere.

M

Games? Yes a few, but...

Ever seen Skype on an open, non-NAT'ed internet connection? Capture
some netflow on a self-promoted supernode sometime.

Or seen Octoshape in action?

Jeff

Less smoky is the relatively common practice (at least in Europe) of
tech-friendly ISPs running bittorrent for "release days" of Linux and
BSD distros; Ubuntu releases especially, because they have a large
proportion of release-day installs (!) and the servers get hit hard.

How much of this is just staff doing their customers a favor is
debatable, but I know two places where it's written into SOPs for
Debian and FreeBSD major releases (about a week or two after the
release). Linux distribution by bittorrent is sometimes harder now that
more Tier 1 ISPs block or inspect P2P traffic, but with a bit of quick
fiddling you can ensure that users outside your blocks don't get served.
I'm talking installation ISOs, not the ports or packages - the rsyncs
and mirrors take care of those as normal.
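
That fiddling can be as simple as an address check before serving a
peer. An illustrative sketch (not what any real seeder ships):

    import ipaddress

    SERVED_BLOCKS = [ipaddress.ip_network("203.0.113.0/24")]  # your blocks

    def should_serve(peer_ip):
        """Seed only to peers inside our own announced blocks."""
        addr = ipaddress.ip_address(peer_ip)
        return any(addr in block for block in SERVED_BLOCKS)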

For the Debian 5.05 release I provided 700 GB+ in a week for the x86
Gnome CD alone via BT; the AMD64 CD was about twice that. Yet most of
the Debian stuff is done using Jigdo, so that's a fraction of the
actual traffic. The Debian netboot CDs seem quite popular too,
especially for exotic hardware archs. The 5.06 release is still flowing
nicely.

The last few FreeBSD releases I've pushed 500 GB each time, though I
hold them open for much longer for the less popular architectures.

The thought of it all flying round in such long circles dismays me
somewhat. There's probably a reasonable argument for temporary ISP-BT
of this stuff, as it'll save us all a tiny bit of peak and a lot of
packets, all for very little kit, space and man-hours.

The rest of the torrent users (games or copytheft) can surely be
/dev/null-ed? :) Hmm, can I smell burning torches in the distance?

Gord

I don't recall any protocols being standardised.

I don't either, though I recall Bittorrent itself supporting this at one point and pushing for ISP support rather than encryption/ISP circumvention. That was years ago; I haven't stayed current.

Plenty of people sell p2p caches but they all work using magic, smoke
and mirrors.

Seem to recall some lawsuits concerning a few of them. Even if we had ISP-supported caches, there is always the problem of getting p2p clients to support them (given they are often too busy trying to circumvent).

A good standard would be nice, though. It would at least offer a middle ground for getting support for such technology, push things back toward open-source, legitimate caching rather than lying to p2p clients, and solve many of the issues that pop up from time to time of upstreams not supporting the downstream loads - which a cache could heavily alleviate.

Jack

We had a great P2P cache from Cache Appliance. Did anybody else try
them?

Can you say anything about what size cache it was, and what amount
of bandwidth savings it produced?

There's some standardization work being done in the IETF ALTO working
group. They're looking at ways ISPs can inform P2P clients about which
peers are "better", i.e., topologically nearby.
http://tools.ietf.org/wg/alto/
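
In rough terms, the ISP publishes a cost map between groups of
addresses ("PIDs"), and a client ranks candidate peers by cost. The toy
map below is invented for illustration:

    # Cost from the client's own PID to each peer PID (lower = nearer).
    COST_MAP = {"pid-local": 1, "pid-same-metro": 5, "pid-transit": 50}

    def rank_peers(peers):
        """peers: list of (ip, pid) pairs from tracker plus ALTO lookup."""
        return sorted(peers, key=lambda p: COST_MAP.get(p[1], 100))

    print(rank_peers([("203.0.113.7", "pid-transit"),
                      ("192.0.2.9", "pid-local")]))
    # -> the local peer sorts first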

I'm less familiar with DECADE, but I believe they're working on more
directly cache-related stuff.
http://tools.ietf.org/wg/decade/

I think most people are...

<snip>

I once read an article talking about making BitTorrent scalable by
using anycasted caching services at the ISP's closest POP to the end
user.

<snip>

Was anything ever standardised in that field? I imagine that with much
of P2P traffic being (how shall I put this...) less than legal, such a
cache would be of questionable legality, and the ISPs would not want to
be held liable for the content cached there?

M

Can someone name an ISP that encourages P2P traffic?? ;)

I thought the issue was more about ISPs encouraging *responsible* P2P.