Network Level Content Blocking (UK)

Iljitsch van Beijnum wrote:

In this case I would suggest that it is in ISPs' best interests to get
involved with network content blocking, so that ISPs collectively become
deep experts on the subject. We are then in a position to modify these
activities in a way that is beneficial to ISPs and their customers (who
happen to be voters too).

Your assumption that blocking parts of the internet is a useful activity is flawed. The only positive effect it has is protecting users from accidentally running into material they'd rather not come into contact with. But this is done much more efficiently and effectively using commercially available filters.

I talked to some people from the Dutch equivalent of http://www.iwf.org.uk/

This was a very curious experience. What they want to achieve is protecting children from abuse. This is of course a laudable goal. But they think they can do that by ridding the internet of images depicting said abuse. There are pretty strong laws against that in the Netherlands*, but this woman thought that wasn't enough: she felt it would be good to also outlaw _text_ describing child abuse. This is really scary. If these well-intentioned but extremely dangerous people get their way, someone can end up in jail for simply writing some text.

All the while, children in known dangerous situations go on a waiting list before they can be removed from the dangerous (home) environment. So apparently, it's more important to go after the results of child abuse in the past, and maybe even go after people who only fantasize about this stuff, rather than help kids that are in danger NOW. But hey, removing kids from abusive homes costs money and results in angry parents on the news. Strongarming ISPs into taking "voluntary" action on the other hand, is free and only results in angry threads on NANOG.

I'm not one to give up my civil liberties without a struggle, but protecting kids may be important enough to make it worth giving up a few. But is it too much to ask for something that actually works in return?

* Not long ago, a man was convicted because he had 10 images of this kind on his computer. They were part of a 100000 image porn collection. His claim that the 10 images were downloaded accidentally wasn't accepted by the judge: he should have been more careful.

I agree that it will not protect children at all. Presumably there is already a large number of images in circulation (I hear figures of people holding many thousands of images), so there is no need to generate more material, which would of course involve further abuse.

So what then is the aim of the filtering?

Is it just the latest political bandwagon?

It is quite odd, really, that governments want to implement something to prevent people from breaking a law. And some posts have been right to ask: what's next? Automatic copyright/patent infringement filtering?

Obviously you've not paid much attention to what Youtube have been doing
lately..

Adrian

Well, it seems to be a standard operating procedure that anyone in a
high profile case gets accused of possessing "child porn" via
anonymous leaks from the police to the national press. (See the Forest
Gate incident - not only did they tear the guy's house apart looking
for nonexistent "chemical weapons", they "accidentally" shot him, then
they briefed the tabloids that his computer was riddled with evil
images of children. Naturally, he was never prosecuted for same.)

If any UK ISP is willing to NOT do this, you've got my business.

Why did they even go after him in the first place?
Has anybody heard of Operation Ore in the UK? It looks like a bit of a disaster; who would have thought that stolen credit card details would be used to buy illegal porn?

It is quite odd, really, that governments want to implement something to prevent people from breaking a law. And some posts have been right to ask: what's next? Automatic copyright/patent infringement filtering?

On that subject- we should probably change the language as well. Make it so that people can't even think of breaking the law because the words for such an action no longer exist. That would be doubleplusgood!

-Don

This was a very curious experience. What they want to achieve is protecting children from abuse. This is of course a laudable goal. But they think they can do that by ridding the internet of images depicting said abuse. There are pretty strong laws against that in the Netherlands*, but this woman thought that wasn't enough: she felt it would be good to also outlaw _text_ describing child abuse. This is really scary. If these well-intentioned but extremely dangerous people get their way, someone can end up in jail for simply writing some text.

"The greatest dangers to liberty lurk in insidious encroachment by men of zeal, well-meaning but without understanding."

-Justice Louis Brandeis

"Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It may be better to live under robber barons than under omnipotent moral busybodies. The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end, for they do so with the approval of their own conscience."

- C.S. Lewis

I'm not one to give up my civil liberties without a struggle, but protecting kids may be important enough to make it worth giving up a few. But is it too much to ask for something that actually works in return?

"They that would give up essential liberty for a little temporary safety deserve neither liberty nor safety."
-Benjamin Franklin, Historical Review of Pennsylvania, 1759

"Experience teaches us to be most on our guard to protect liberty when the government's purposes are beneficent."

-Justice Louis Brandeis

I am not willing to give up any of my own liberties to protect children. We already have laws that do that, and judging by the number of people arrested, they seem to work. You reach a point of diminishing returns.

At some point you have to accept that the world is a dangerous place and that bad things happen. There is a balancing point and a greater good to think about. Making everyone else's life less free does not balance out against the prospect of maybe saving a few kids. As the laws become more invasive they will eventually breed resentment and hatred for the government and fellow citizens. The end result will be civil unrest and fighting, and that helps no one. Sadly, it's already happening. Americans hate each other more than at almost any other time in our history, and the hatred is becoming vicious.

-Don

"The greatest dangers to liberty lurk in insidious encroachment by men of
zeal, well-meaning but without understanding."

-Justice Louis Brandeis

<snip>

I am not willing to give up any of my own liberties to protect children.
We already have laws that do that, and judging by the number of people
arrested, they seem to work. You reach a point of diminishing returns.

Hello,

Before *this* thread spins out of control, I would like to draw your
attention to NANOG-L AUP, available at http://www.nanog.org/aup.html ,
particularly #6: Postings of political, philosophical, and legal nature
are discouraged.

In other words, it is on-topic to discuss operational effect of filtering
- what the original post started with. It is on-topic to discuss how to
filter and comply with government or corporate mandates to filter. It is
on-topic to discuss existing logging/filtering solutions and their
operational impact.

It is "not so much" on topic to discuss legalities of filtering, but I
think most agree that it still belongs here.

It is clearly off-topic to discuss lists of British colonies, or civil
liberties, or the protection of children; there are better forums for that.

Please follow any replies to this message to nanog-futures@nanog.org

-alex (acting mlc chair)

UK ISP associations have developed a centralized blocking solution with IWF providing the decision making of what to filter. 90% of the UK broadband users accept the same "voluntary" decisions about what to filter.

On the other hand, US ISP associations have advocated for decentralized blocking solutions, leaving the decision to parents and multiple content filtering companies. US ISP associations have been active in this area
since the early 1990's, although US ISP associations seem to only last so
long before they disappear and a new association springs up.

Is a centralized filtering solution better or worse than a decentralized filtering solution?

Schools, libraries, families, etc. in the US choose which content filter
product to use, and these vary greatly in how well they work and what they
choose to filter. Since it's "voluntary," some US families choose not to
have any content filters. Other US families choose to filter much more
than others do.

Cisco, Juniper, Streamshield, NetNanny, etc. sell identical products around the world. If an ISP anywhere in the world wants to offer either a
centralized or decentralized filtering solution, the products are available. Likewise, if an individual is concerned about what his or her family sees,
products are available for use without the ISP's involvement.
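The core mechanical difference between the two models is small: a centralized scheme matches every subscriber's requests against one shared list, so a single body's decisions apply to all users. A minimal sketch of that lookup, with entirely invented blocklist entries, might look like this:

```python
# Hypothetical sketch of a centralized URL-blocklist check of the kind a
# national scheme implies: one list, maintained centrally, applied to all
# subscribers. The hostnames and paths below are invented examples.

from urllib.parse import urlsplit

# The centrally maintained blocklist: (hostname, path) pairs.
CENTRAL_BLOCKLIST = {
    ("bad.example.com", "/images/123"),
    ("bad.example.net", "/gallery"),
}

def is_centrally_blocked(url: str) -> bool:
    """Return True if the URL matches an entry on the shared blocklist."""
    parts = urlsplit(url)
    return (parts.hostname, parts.path) in CENTRAL_BLOCKLIST

print(is_centrally_blocked("http://bad.example.com/images/123"))  # True
print(is_centrally_blocked("http://news.example.org/story"))      # False
```

A decentralized product runs the same kind of check, but the list lives on the user's own machine and the user (not the ISP or a national body) decides what goes on it.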

[...]

Is a centralized filtering solution better or worse than a decentralized
filtering solution?

[...]

In my opinion, centralized is not better. With centralized filtering, who gets to decide what is
filtered and why? I don't believe a governmental entity should have to treat
its *adult* citizens like children. With decentralized filtering, it is the user's
prerogative what gets filtered and what doesn't; if users don't want
themselves or their children seeing certain content, then filtering is their
option. It also gives the user power over *WHAT* gets filtered and
*WHY*.

From my view, ISPs should continue their role of "passing the packets" and not
say what their users can or cannot view. It's when ISPs start interfering
with what their users do that we will run into legal, political and
other issues that I'm sure none of us want to see.

And with ISP-level filtering: what's to say a legitimate page won't get caught
in the filter? E.g., if a news article is covering a child pornography story,
what's to stop the filter from picking up on that and blocking the news site
due to a false positive?

IMHO, unless it's something blatantly illegal such as kiddie porn and the like, I don't think content filtering is the responsibility of the ISPs. Besides all of the conspiracy theories that are bound to surface, I think forcing ISPs to block content is a bit like forcing car makers to police what can be played on the radio. I think that giving parents the option of manually turning off porn sites would be an improvement. Although still not within the responsibility of the ISP, they are in the best place to implement such a technology. However, I don't like the idea of a mandatory global traffic-filtering initiative.

From: Sean Donelan <sean@donelan.com>
Date: 06/09/2007 04:43 PM
To: nanog@merit.edu
Subject: UK ISPs v. US ISPs (was RE: Network Level Content Blocking)

On Fri, 8 Jun 2007, michael.dillon@bt.com wrote:
> In this case I would suggest that it is in ISPs best interests to get
> involved with network content blocking, so that ISPs collectively become
> deep experts on the subject. We are then in a position to modify these
> activities in a way that is beneficial to ISPs and their customers (who
> happen to be voters too). And we are in a position to advise government
> on future actions as well. If ISPs choose not to get involved, then they
> are less likely to be listened to by government partly because they have
> less credibility and partly because they simply don't understand the
> issue and therefore fail to communicate effectively.

UK ISP associations have developed a centralized blocking solution with
IWF providing the decision making of what to filter. 90% of the UK
broadband users accept the same "voluntary" decisions about what to
filter.

On the other hand, US ISP associations have advocated for decentralized
blocking solutions, leaving the decision to parents and multiple content
filtering companies. US ISP associations have been active in this area
since the early 1990's, although US ISP associations seem to only last so
long before they disappear and a new association springs up.

Is a centralized filtering solution better or worse than a decentralized
filtering solution?

Schools, libraries, families, etc. in the US choose which content filter
product to use, and these vary greatly in how well they work and what they
choose to filter. Since it's "voluntary," some US families choose not to
have any content filters. Other US families choose to filter much more
than others do.

Cisco, Juniper, Streamshield, NetNanny, etc. sell identical products around
the world. If an ISP anywhere in the world wants to offer either a
centralized or decentralized filtering solution, the products are available.
Likewise, if an individual is concerned about what his or her family sees,
products are available for use without the ISP's involvement.

Sean Donelan wrote:

UK ISP associations have developed a centralized blocking solution with IWF providing the decision making of what to filter. 90% of the UK broadband users accept the same "voluntary" decisions about what to filter.

I have not seen any evidence presented that *any* "UK broadband users"
either *know* about or "accept" the "voluntary" decisions of their ISP,
made for them in its 'Net Nanny' role.

Could you point to the URL for this scientific polling data?

On the other hand, US ISP associations have advocated for decentralized blocking solutions, leaving the decision to parents and multiple content filtering companies. US ISP associations have been active in this area
since the early 1990's, although US ISP associations seem to only last so
long before they disappear and a new association springs up.

And that has not worked out well for us. No continuity, no effective
lobbying organization. Where, oh where, are CIX, ISP/C, et alia?

I learned of it this week from NANOG and UKNOF.

Thus spake "Kradorex Xeron" <admin@digibase.ca>

From my view, ISPs should continue their role of "passing the
packets" and not say what their users can or cannot view. It's
when ISPs start interfering with what their users do that we
will run into legal, political and other issues that I'm sure
none of us want to see.

IIRC, AOL got whacked by a court years ago because they censored some chat rooms and not others. The court held that since they censored some content, they lost their status as a common carrier and were liable for other content they didn't censor (either by intent or mistake). This was a particularly interesting case, since the implication was that ISPs who _don't_ censor content _are_ common carriers, which I don't think has otherwise been touched upon in the US.

S

Stephen Sprunk "Those people who think they know everything
CCIE #3723 are a great annoyance to those of us who do."
K5SSS --Isaac Asimov

IIRC, AOL got whacked by a court years ago because they censored some
chat rooms and not others. The court held that since they censored
some content, they lost their status as a common carrier and were
liable for other content they didn't censor (either by intent or
mistake). This was a particularly interesting case, since the
implication was that ISPs who _don't_ censor content _are_ common
carriers, which I don't think has otherwise been touched upon in the
US.

I would be most interested in a citation of this alleged case.

Perhaps you are thinking of the Stratton Oakmont case, in which a New
York court found Prodigy liable for postings in some of their
online fora. Since there is a section in the 1996 CDA specifically intended to
reverse the outcome of this case, anyone who cites this case as
relevant to the current situation in the US is just plain wrong.

Also, ISPs in the United States are not common carriers. Even the
ISPs that are owned by phone companies (which are common carriers for
their phone service) are not common carriers.

Regards,
John Levine, johnl@iecc.com, Primary Perpetrator of "The Internet for Dummies",
Information Superhighwayman wanna-be, http://www.johnlevine.com, ex-Mayor
"More Wiener schnitzel, please", said Tom, revealingly.

I think the home is the best place to implement the technology: a
power switch or a BIOS password.

Here is a true analogy. My father worked for a TV station, so you'd
think we'd have the TV on all the time, yet right up until after I
left high school, my parents wanted to limit my TV watching ...
significantly.

How did they do it ?

(a) they didn't buy a TV set and put it in my bedroom - the TV was in a
common area of the house i.e. the lounge and/or dining room

(b) they didn't allow me to watch the TV unsupervised

So what I don't understand is why parents put computers in their
children's bedrooms and don't supervise their children's Internet use.

Substituting a piece of filtering software, which won't ever do as good a
job as a parent, for parental supervision is just bad
parenting in my opinion, and not the responsibility of government or
ISPs.

Regards,
Mark.

Are you possibly conflating the general legalistic term "common
carrier" which applies to taxi cabs, UPS, trains (common carriage,
i.e., generally cannot be held responsible for the contents of what
they carry) versus telecommunications common carriers as defined under
the "Communications Act of 1934" (CA1934) which established the AT&T
monopoly?

Granted there's no agency or body which takes out a big stamp saying
COMMON CARRIER (in the common law, first sense) and sticks it on an
ISP's head (in the second sense, CA1934, that'd be the FCC.)

The common law sense is more a practical matter of law and
precedent. The Wikipedia entry has a nice brief summary:

     http://en.wikipedia.org/wiki/Common_carrier

  Internet networks are in many respects already treated like common
  carriers. ISPs are largely immune from liability for third party
  content. The Good Samaritan provision of the Communications Decency
  Act established immunity from liability for third party content on
  grounds of libel or slander. The DMCA established that ISPs which
  comply with DMCA would not be liable for the copyright violations of
  third parties on their network.

So, although it should be noted that by and large ISPs have resisted
being classified as telecommunications common carriers as specifically
defined in CA1934, they seem to be treated by the law, in practice, as
common carriers in the common law sense (i.e., they offer their carriage to
the general public and can't generally be held responsible for
content etc. any more than a taxi or FedEx could be).

It's a very important if somewhat confusing distinction. It'd be a lot
easier if we could come up with separate terms for common law common
carrier versus CA1934 telecommunications common carrier.

That said, nothing is absolute and even an unarguable common law
common carrier (say, fedex) has some duty to raise suspicions about a
package which is ticking or dripping noxious (or really any) liquid,
and to cooperate with law enforcement where properly required (e.g.,
via warrants.)

I think we're less in the realm here of who is or isn't some sort of
common carrier and more "what are the responsibilities even within
those constraints?". Even the old AT&T monopoly, probably as legally
fortressed from liability as possible (other than its broad exposure
to regulation) would be required to cooperate with properly executed
legal process.

(I suppose to be PC I have to say out loud this note is US-centric)

Keegan Holley
Network Engineer, Network Managed Services
SunGard Availability Services
Mezzanine Level MC-95
401 N. Broad St.
Philadelphia, PA 19108
215.446.1242 (office)
609.670.2149 (cell)
Keegan.Holley@sungard.com

So, although it should be noted that by and large ISPs have resisted
being classified as telecommunications common carriers as specifically
defined in CA1934, they seem to be treated by the law, in practice, as
common carriers in the common law sense ...

You're right, but the legal setup is flipped around. For a common
carrier of whatever variety, the assumption is that they're not
responsible for the contents of what they carry, and there's
exceptions for obviously dangerous goods, warrants, etc. For ISPs,
the assumption is that they're just normal businesses, and there's
laws like the CDA that give immunity for various kinds of customer
material.

R's,
John

PS:

It's a very important if somewhat confusing distinction. It'd be a lot
easier if we could come up with separate terms for common law common
carrier versus CA1934 telecommunications common carrier.

Agreed.

In article <18029.40210.45583.17990@world.std.com>, Barry Shein <bzs@world.std.com> writes

It'd be a lot easier if we could come up with separate terms for common law common carrier versus CA1934 telecommunications common carrier.

What you are looking for is probably a US equivalent of the European Union's "Mere Conduit" law. It's a relatively new (doesn't have historical baggage) law, and has the advantage of in effect being negotiated with telcos and ISPs during the drafting. Additionally there are sensible definitions for the liability of intermediaries in the circumstances of caching and hosting.

The caching clause was particularly important from an operational point of view, because the threat was that a different approach (as initially promoted by rightsholders) would have either made caches an illegal breach of copyright or mandated that cache owners get a licence from all copyright holders (perhaps through a central collecting agency set up for the purpose). Luckily[1], sense prevailed.

http://ec.europa.eu/internal_market/e-commerce/index_en.htm

Topical to the current discussion, and also of crucial operational importance: a subsequent UK proposal that sysadmins be individually licensed in order to obtain immunity[2] when reporting illegal material found on their systems was lobbied into a more general amnesty for that kind of activity.

[1] Actually, no luck involved at all, just sustained lobbying via a network of EU-based trade associations.

[2] A bit like seeing a gun in the street and, on handing it in to the police, being prosecuted for possession of an unlicensed firearm; strictly true (if having it in your hand is the definition of possession), but not in the public interest.

The Communications Decency Act, Digital Millennium Copyright Act, Electronic Communications Privacy Act and even ISPs' own Terms of Service give operators substantial protection for "Good Samaritan" acts to filter, screen, allow, disallow or report objectionable material.
In addition, under 42 USC 13032, US public electronic communication providers (also known as ISPs) have the *DUTY* to report, not monitor, child pornography to the National Center for Missing and Exploited Children. Failing to file a report could result in a fine of $50,000 or more, except that due to a quirk in how the statute was written, apparently no US government agency was given the authority to issue the fines.
In addition, under 42 USC 13032, US public electronic communication providers (also known as ISPs) have the *DUTY* to report, not monitor, child pornography to the National Center for Missing and Exploited Children. Failing to file a report could result in a fine of $50,000 or more, except due to a quirk in how the statute was written, apparently no US government agency was given the authority to issue the fines.