BGP4 code source

I'd like to ask an operations question that deals with SPAM that hopefully
will be on topic. Here goes:

If you decide to start filtering out SPAM by blocking it from the source,
do you end up becoming a content provider because you're controlling what
your customers have access to? If that is the case, what legal
implications arise from allowing certain news groups on your server that
contain material that is illegal (the alt.binaries.warez.* and some of the
alt.binaries.pictures.erotica.* groups come to mind)? Are we now
responsible for those groups, and everything else that comes into your
network because you've taken the effort to start controlling what your
customers can see and do? A lot of us have gotten by on the premise that
we are not content providers, but service providers who can't control what
our customers see or do, and that there is illegal material out there that
we are not liable for. I think that once we start making decisions
about what content we allow, then we are setting ourselves up to be liable
for what gets through. Just think, responsibility for porn, warez,
hate speech, harassment, etc. Could this actually be used in a court of
law? Although this is on the fringe of Network Operations as a whole, I
think it is a valid issue to be discussed on a Network Operators list
because blocking SPAM/UCE is an operational decision which might carry
some interesting legal dilemmas with it.

Joe Shaw - jshaw@insync.net
NetAdmin - Insync Internet Services

> Heh, a few legitimate customers might have been inconvenienced, but when
> abuse@[owner-of-network] bounces with an "access denied", that's all you have
> left.
>
> --
> --
> Karl Denninger (karl@MCS.Net)| MCSNet - Serving Chicagoland and Wisconsin

> I'd like to ask an operations question that deals with SPAM that hopefully
> will be on topic. Here goes:
>
> If you decide to start filtering out SPAM by blocking it from the source,
> do you end up becoming a content provider because you're controlling what
> your customers have access to?

There's a difference between being an *editor* and having a robot control
things based on the impact that a particular item has on your network.

> If that is the case, what legal
> implications arise from allowing certain news groups on your server that
> contain material that is illegal (the alt.binaries.warez.* and some of the
> alt.binaries.pictures.erotica.* groups come to mind)?

Well, we don't carry those groups because we believe we're *already* liable
(since we must reasonably know what is in the "warez" and "kiddie porn"
groups).

> Are we now
> responsible for those groups, and everything else that comes into your
> network because you've taken the effort to start controlling what your
> customers can see and do? A lot of us have gotten by on the premise that
> we are not content providers, but service providers who can't control what
> our customers see or do, and that there is illegal material out there that
> we are not liable for.

I think you better talk to some attorneys.

Your shield from legal liability only extends as far as *reasonable*
ignorance of what is going on.

That is, if a customer is on IRC and is doing dope deals, and you're unaware
of it, you're not likely to be held responsible. There's no reasonable way
for you to know what is happening, nor any effective way for you to control
that kind of activity.

HOWEVER, the "alt.binaries.warez" groups, and more particularly, the groups
used *primarily* to distribute kiddie porn are another matter. Nobody with
half a neuron firing in their head is going to be able to make an argument
in a courtroom that the group "alt.binaries.warez.ibm-pc.windows" isn't a
group designed and operated for the purpose of distributing stolen software.

Or try "alt.binaries.pictures.teen-fuck". Tell me THAT's not obvious on
its face.

You are a distributor of content when it comes to Usenet. Since the volume
is enormous, and there is no effective way for you to police all of it, it
is not reasonable to expect you to do so.

But to put your head in the sand when you KNOW what's being carried in a
given group, and further, when that group consumes several standard-deviations
of resource above the mean or median of the Usenet system as a whole (thereby
causing you to buy more resource just to have it online), IMHO you're in
trouble.

In fact, these "binaries" groups comprise a couple hundred areas (out of
30,000+ at present) but consume nearly FORTY PERCENT of the bandwidth
required by a full Usenet feed!

Try explaining to a judge how you had *no idea* that the groups for which
you had to spend 40% of your TOTAL Usenet resources to carry were being used
primarily as a conduit for distribution of illicit material.

Good luck.

> Just think, responsibility for porn, warez,
> hate speech, harassment, etc. Could this actually be used in a court of
> law? Although this is on the fringe of Network Operations as a whole, I
> think it is a valid issue to be discussed on a Network Operators list
> because blocking SPAM/UCE is an operational decision which might carry
> some interesting legal dilemmas with it.
>
> Joe Shaw - jshaw@insync.net
> NetAdmin - Insync Internet Services

The question is: ARE YOU CONSTRUCTIVELY AWARE OF IT?

Go talk to some good attorneys and take their advice. Just make sure
that you tell them the TRUTH about what's really going on in these groups
and what you know about them, because that's the guy (or gal) who's going
to be standing beside you in a courtroom if you ever have to test your
beliefs.

This is in fact an operational issue Joe, but not on the side which you
think it is. Even a common carrier (and basically nobody in this business
is one) cannot *knowingly ignore* illegal activity on their infrastructure.

I think (read 'hope') that you are reading too much into this.

Going back to non-specific cases regarding the operation and liabilities of
running a BBS, I shall cite what I feel has been done in the past and what
seems likely to remain applicable:

Cases against BBS operators for distributing illegal or offensive material seem
to be split, based solely on the level of participation which the operator
displays. For those operators that have shown a 'hands off' operational
approach, there has been little liability for content. For those operators that
take part in creating/managing their content, liability is usually complete.

In other words, the way I see this as applicable to ISPs is as follows: *service*
providers which supply TCP/IP access to the internet are liable only to ensure
their customers are aware of netiquette. *content* providers (e.g. AOL, MSN...)
are fully liable for all content which they manage/distribute.

Feedback?

cbrown@bbnplanet.com
Server Administration, GTE Internetworking.

> Or try "alt.binaries.pictures.teen-fuck". Tell me THAT's not obvious on
> its face.

Our routers haven't a clue whether or not a packet belongs to an email
message or a USENET article let alone whether it is tagged with the
above-mentioned newsgroup identifier or whether its content has been
correctly tagged by that identifier. And our news server doesn't have much
more of a clue regarding content. It essentially acts as a router for
traffic on TCP port 119 attached to a sizeable buffer cache. We are in the
business of transporting bits without regard to the semantics encoded in
those bits.
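
To make that concrete, here is a minimal sketch (Python, purely illustrative
and not a description of anyone's actual filter) of everything a port-based
classifier can see about a packet: the IP and TCP header fields. The constant
and function names are assumptions made up for the example; nothing below can
tell whether the payload is a mail message or a news article, let alone which
newsgroup it was posted to.

import struct

NNTP_PORT = 119   # standard NNTP port, as mentioned above

def classify(packet: bytes) -> str:
    """Classify a raw IPv4 packet using transport-layer headers only."""
    ihl = (packet[0] & 0x0F) * 4     # IPv4 header length in bytes
    proto = packet[9]                # protocol field
    if proto != 6:                   # 6 = TCP
        return "not TCP"
    sport, dport = struct.unpack("!HH", packet[ihl:ihl + 4])
    if NNTP_PORT in (sport, dport):
        # All we can say is "this is news traffic"; whether the article
        # inside is alt.binaries.* or rec.gardens is invisible here.
        return "NNTP"
    return "other TCP"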

Off topic opinion:

IMHO it is not possible and never will be possible to have automated
systems filter content. Filtering content is the job of human beings with
job titles like "editor" and "reviewer". Automated systems will make it
much simpler to apply the opinions of editors and reviewers, but the human
element will always be essential in any system for filtering content, and
such systems will always work best when they focus on selecting the best
content rather than on banning the worst content.

> To the best of my knowledge, we only blocked mailbombers. Someone here
> from AGIS can correct me if I am wrong.

[ snip ]

That's all right. They blocked me too, for the same thing (one complaint
per spam).

Me too..

From where I sit that's nothing new, but it's also OK - we reciprocated, and
blocked a huge number of AGIS-distributed networks (basically anything that
a spammer was connected to). Guess what? Spam volume went WAY down.

Yes, me as well, but I fat-fingered it and announced a slew
of AGIS netblocks to null0 via Sprint. Got a personal phone call
from Thpamford Wallace claiming he lost thousands and
thousands of dollars, frivolous lawsuit threats, etc. etc.

[ Disclaimer: It *really* was a stupid accident - 2 AM, tired, not
thinking straight, etc. etc. ].

Trying to stay on operational issues, I think it's bad for a
provider to block another provider unless they are being harassed,
and one email isn't harassment. I guess the same argument could
be used in reverse, i.e. AGIS allowed Thpamford to harass us.

Regards,

Oh, mercy me, I tried to stay out of this. As an internet user I have been the
party forgotten here. I do not like spam for worthless products, make-money-fast
schemes, offshore business ventures, offshore credit cards, and a host of
other lame products from people who at best have no moral or ethical qualms
about depriving others of money by any means necessary.

I would hope that the backbone providers can refuse to service businesses
that harm their customers' best interests and annoy and harass their
customers with content not solicited by the paying customer.

I hope that this perspective will aid in the formulation of a sound policy.

Henry R. Linneweh

Paul Flores wrote:

No one will want to hear me point this out... but this is precisely the
point ol' Rotundo makes in his editorial this month... and Stratton
Oakmont suggests strongly that your perception may well be correct,
Joe.

Cheers,
-- jra

Karl Denninger wrote:

> HOWEVER, the "alt.binaries.warez" groups, and more particularly, the groups
> used *primarily* to distribute kiddie porn are another matter. Nobody with
> half a neuron firing in their head is going to be able to make an argument
> in a courtroom that the group "alt.binaries.warez.ibm-pc.windows" isn't a
> group designed and operated for the purpose of distributing stolen software.
>
> Or try "alt.binaries.pictures.teen-fuck". Tell me THAT's not obvious on
> its face.

That was beaten to death many years ago. Nobody with half a neuron will
argue for filtering or removal of those newsgroups on a large scale (now, I
contend that a teeny weeny ISP can get away with it, but for large ISPs it
just doesn't work). The reason is very simple: the objectionable content
will simply move to "legitimate" newsgroups. There is no way to prevent it.
Which would make it much harder for those who don't want to see it to
filter it out. How about _encouraging_ such self-regulation instead?

Now, can we get out of the kiddie porn sandbox?

--vadim

There is no way to prevent bank robbery.

Therefore, since we cannot prevent it, it should be perfectly legal for me
to sell you a firearm with the full knowledge that you intend to use it for
the purpose of robbing a bank, or that you have previously robbed a bank (and
been convicted of same) and are therefore inclined to do so again.

The problem with your argument, Vadim, is that it doesn't wash under the
legal code of the United States. It also doesn't wash under the ethical
codes of a whole bunch of people, including me.

You can argue that it is impossible to prevent kiddie porn. You might
even be right about that. But arguing whether you are right or wrong about
whether we can prevent kiddie porn from being produced, and children from
being exploited by any measure short of 24x7 camera monitoring of every
individual in the country is immaterial to THIS issue.

The issue here is simple - what is the *right thing* to do? Whether you can
save the world by doing the right thing isn't the point, nor should it be.
I can't save the planet by myself, and perhaps none of us can save the
planet. But that does not mean that I should toss my can out the car
window, simply because others have before me and individually, my can is
immaterial to the problem of roadside waste.

It is, quite simply, WRONG for people to exploit children. It's wrong to
knowingly give people a medium by which they can easily congregate to do
this.

Yes, I know darn well that I can't stop every person who uses our IRC
server, for example, from sending out kiddie porn through it. I *KNOW*
this, because to do anything else requires that I have the ability to filter
through every piece of content that touches that IP number and port, looking
for the evil material. It's just not possible.

But if someone comes to me and tells me of a set of people who are using our
system for this purpose, I *ALSO* can't stick my head in the sand in
relation to the issue.

Anyone who thinks that these binary groups are just spam-hauses for
1-900-GET-LAID is incredibly naive. Those ads don't require 40% of the
feed.

The *reality* is that trading of stolen and illegal material is rampant in
these *particular* groups. It's a simple matter of volume; that is ALL you
have to look at in order to figure it out.

We STILL count the number of postings we are asked to file in these groups.

Here's the listing from the last TWO HOURS off the top of those logs:

228 newsgroup alt.binaries.pictures.erotica
193 newsgroup alt.binaries.warez.ibm-pc
189 newsgroup alt.binaries.pictures.erotica.oral
175 newsgroup alt.binaries.pictures.erotica.cartoons
151 newsgroup alt.binaries.pictures.erotica.orientals
150 newsgroup alt.binaries.pictures.erotica.breasts
148 newsgroup alt.binaries.pictures.erotica.amateur
146 newsgroup alt.binaries.pictures.boys
134 newsgroup alt.binaries.pictures.erotica.amateur.female
132 newsgroup alt.binaries.erotica
129 newsgroup alt.binaries.pictures.erotica.female
116 newsgroup alt.binaries.pictures.erotica.panties
115 newsgroup alt.binaries.pictures.erotica.blondes
112 newsgroup alt.binaries.games
108 newsgroup alt.binaries.pictures.erotica.redheads
108 newsgroup alt.binaries.pictures.erotica.female.anal
108 newsgroup alt.binaries.pictures.bisexuals
107 newsgroup alt.binaries.pictures.erotica.pornstar
105 newsgroup alt.binaries.pictures.erotica.pornstars
105 newsgroup alt.binaries.pictures.black.erotic.females
104 newsgroup alt.test.testing
103 newsgroup alt.sex.pictures
102 newsgroup alt.sex.erotica

Any questions?
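
For the curious, a tally like the one above can be produced with a few lines
of scripting. The sketch below is hypothetical - the log format it assumes
(lines containing "newsgroup <name>", like the listing) is illustrative and
not necessarily what any particular news server actually writes.

import re
import sys
from collections import Counter

def tally(log_lines):
    """Count how many log lines mention each newsgroup."""
    counts = Counter()
    for line in log_lines:
        m = re.search(r"newsgroup\s+(\S+)", line)
        if m:
            counts[m.group(1)] += 1
    return counts

if __name__ == "__main__":
    # Print in the same "count newsgroup name" style as the listing above.
    for group, n in tally(sys.stdin).most_common(25):
        print("%5d newsgroup %s" % (n, group))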

This is getting off-topic for NANOG - I've said my last on the issue here.

> That was beaten to death many years ago. Nobody with half a neuron will
> argue for filtering or removal of those newsgroups on a large scale
> (now, I contend that a teeny weeny ISP can get away with it, but for large
> ISPs it just doesn't work). The reason is very simple: the
> objectionable content will simply move to "legitimate" newsgroups.
> There is no way to prevent it.

Vadim,

This may be true in the US, but if you are in Canada, you must prune these
groups, particularly the kiddie porn, regardless of migration of content
or any other technical issue. And if customers complain, you must act on
them, instead of claiming some non-publisher status. I have been in way
more legal briefings than I care to admit on this issue, and it's a done
deal here in Canada. If you do not act, as some smaller ISPs have not, you
are law enforcement cannon fodder. Kiddie porn busts make the major
networks' national news here, particularly when the Internet is involved.

Eric Carroll eric.carroll@acm.org
Tekton Internet Associates

> Karl Denninger wrote:
> Here's the listing from the last TWO HOURS off the top of those logs:
>
> 228 newsgroup alt.binaries.pictures.erotica
> 193 newsgroup alt.binaries.warez.ibm-pc
> 189 newsgroup alt.binaries.pictures.erotica.oral
> 175 newsgroup alt.binaries.pictures.erotica.cartoons
> 151 newsgroup alt.binaries.pictures.erotica.orientals
> 150 newsgroup alt.binaries.pictures.erotica.breasts
> 148 newsgroup alt.binaries.pictures.erotica.amateur
> 146 newsgroup alt.binaries.pictures.boys
> 134 newsgroup alt.binaries.pictures.erotica.amateur.female
> 132 newsgroup alt.binaries.erotica
> 129 newsgroup alt.binaries.pictures.erotica.female
> 116 newsgroup alt.binaries.pictures.erotica.panties
> 115 newsgroup alt.binaries.pictures.erotica.blondes
> 112 newsgroup alt.binaries.games
> 108 newsgroup alt.binaries.pictures.erotica.redheads
> 108 newsgroup alt.binaries.pictures.erotica.female.anal
> 108 newsgroup alt.binaries.pictures.bisexuals
> 107 newsgroup alt.binaries.pictures.erotica.pornstar
> 105 newsgroup alt.binaries.pictures.erotica.pornstars
> 105 newsgroup alt.binaries.pictures.black.erotic.females
> 104 newsgroup alt.test.testing
> 103 newsgroup alt.sex.pictures
> 102 newsgroup alt.sex.erotica
>
> Any questions?

193 newsgroup alt.binaries.warez.ibm-pc is the only one I see that looks
like it meets the criteria you normally spout.

The rest are simply banned because you don't like the idea of carrying
pictures of possibly naked people.

> This is getting off-topic for NANOG - I've said my last on the issue here.

Really? Promise? That would be really nice. Perhaps we could get back
to operational issues and take this discussion to alt.prudes?

--- David Miller

This is why the spam issue has to be attacked as a theft of service issue.
Spam is identified not by its content, per se, but by such things as:

1) unauthorized relaying
2) false Reply, From and To addresses
3) false Received headers
4) invalid domains and IP addresses
5) fraudulent headers of other types
6) lack of unsubscribe capability
7) lack of explicit subscribe capability

If you decide to block spam because it violates a standard for email
headers and subscription policy, then you are not controlling content, only
the proper addressing and routing of messages. If someone sets up a
subscription mail list, solicits subscribers, and then sends out "make
money fast" messages, that is not spam.
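
As an illustration of how such checks stay on the addressing and policy side
rather than the content side, here is a hedged sketch in Python. The
heuristics and the function name are assumptions made up for this example,
not a description of how any provider actually filtered mail, and a real
check of item 4 would need DNS lookups that are omitted here.

from email import message_from_string
from email.utils import parseaddr

def header_violations(raw_message):
    """Return a list of structural problems per the criteria listed above."""
    msg = message_from_string(raw_message)
    problems = []

    # 2) false or empty From/To addresses
    for field in ("From", "To"):
        _, addr = parseaddr(msg.get(field, ""))
        if "@" not in addr:
            problems.append("missing or malformed %s address" % field)

    # 3)/5) a message with no Received trail at all looks forged or injected
    if not msg.get_all("Received"):
        problems.append("no Received headers")

    # 6)/7) bulk mail with no way to unsubscribe
    if msg.get("Precedence", "").lower() == "bulk" and not msg.get("List-Unsubscribe"):
        problems.append("bulk mail without List-Unsubscribe")

    return problems

Note that nothing above reads the body of the message; it only checks whether
the envelope and headers play by the rules.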

This is where Jack Rickard is wrong in his defense of Phil Lawlor. Jack
thinks that Phil is on the high moral ground when he says he cannot filter
based on content, but that is wrong-headed. Spam is a theft of service
issue related to the rules for the transfer of email and according to these
rules, Lawlor and AGIS deserve to be crucified. Of course, AGIS should not
filter based on content, but they should enforce rules for the proper
addressing and delivery of email and the proper management of
subscriber-initiated subscription to mailing lists. They should enforce
this on their customers instead of trying to get them to follow some high
minded collective nonsense-speak, all the while taking their dirty money.

--Kent

Ok, well, Jack's known to be reading here... comments, O Editor
Rotundus?

Cheers,
-- jr 'PSm Kent: I agree' a

I've asked him about this multiple times. He has never replied.