Nato warns of strike against cyber attackers

>> There is only so much "proper security" you can expect the average PC
>> user to do.

> Sure - but if their computer, as a result of their ignorance, starts
> belching out spam, ISPs should be able at the very least to counteract the
> problem. For example, by disconnecting that user and telling them why
> they have been disconnected. Why should it be the ISP's duty to silently
> absorb the blows? Why should the user have no responsibility here?

Primarily because the product that they've been given to use is defective
by design. I'm not even saying "no responsibility"; I'm just arguing that
we have to be realistic about our expectations of the level of
responsibility users will have. At this point, we're teaching computers
to children in elementary school, and kids in second and third grade are
being expected to submit homework to teachers via e-mail. How is that
supposed to play out for the single mom with a latchkey kid? Let's be
realistic here. It's the computer that ought to be safer. We can expect
modest improvements on the part of users, sure, but to place it all on
them is simply a fantastic display of incredible naivete.

> To carry your analogy a bit too far, if someone is roaming the streets
> in a beat-up jalopy with wobbly wheels, no lights, no brakes, no
> mirrors, and sideswiping parked cars, is it up to the city to somehow
> clear the way for that driver? No - the car is taken off the road and
> the driver told to fix it or get a new one. If the problem appears to be
> the driver rather than the vehicle, the driver is told they cannot drive
> until they have obtained a Clue.

Generally speaking, nobody wants to be the cop that makes that call.
Theoretically an ISP *might* be able to do that, but most are unwilling,
and those of us that do actually play BOFH run the risk of losing
customers to a sewerISP that doesn't.

> If the user, as a result of their computer being zombified or whatever,
> has to
>
>> "take it in to
>> NerdForce and spend some random amount between $50 and twice the cost of
>> a new computer,"

> ...then that's the user's problem. They can solve it with insurance
> (appropriate policies will come into being), or they can solve it by
> becoming more knowledgeable, or they can solve it by hiring know-how.
> But it is *their* problem. The fact that it is the user's problem will
> drive the industry to solve that problem, because anywhere there is a
> problem there is a market for a solution.

That shows an incredible lack of understanding of how the market actually
works. It's nice in theory.

We (as technical people) have caused this problem because we've failed to
design computers and networks that are resistant to this sort of thing.
Trying to pin it on the users is of course easy, because users (generally
speaking) are "stupid" and are "at fault" for not doing "enough" to
"secure" their own systems, but that's a ridiculous smugness on our part.

>> then we - as the people who have designed and provided
>> technology - have failed, and we are trying to pass off responsibility
>> for our collective failure onto the end user.

> I think what's being called for is not total abdication of
> responsibility - just some sharing of the responsibility.

I'm fine with that, but as long as we keep handing loaded guns without
any reasonably-identifiable safeties to the end users, we can expect to
keep getting shot at now and then.

>> This implies that our
>> operating systems need to be more secure, way more secure, our
>> applications need to be less permissive, probably way less permissive,
>> probably even sandboxed by default

> Yep! And the fastest way to get more secure systems is to make consumers
> accountable, so that they demand accountability from their vendors. And
> so it goes, all the way up the chain. Make people accountable. At every
> level.

Again, that shows an incredible lack of understanding of how the market
actually works. It's still nice in theory.

We would be better off short-circuiting that mechanism; for example, how
about we simply mandate that browsers must be isolated from their
underlying operating systems? Do you really think that the game of
telephone works? Are we really going to be able to hold customers
accountable? And if we do, are they really going to put vendor feet to
the fire? Or is Microsoft just going to laugh and point at their EULA,
and say, "our legal department will bankrupt you, you silly little twerp"?

Everyone has carefully made it clear that they're not liable to the users,
so the users are left holding the bag, and nobody who's actually
responsible is able to be held responsible by the end users.

>> We can make their Internet cars safer for them - but we largely haven't.

> I'm not sure that the word "we" is appropriate here. Who is "we"? How
> can (say) network operators be held responsible for (say) a weakness in
> Adobe Flash? At that level too, the consumer needs comeback - on the
> providers of weak software.

Yes, "we" needs to include all the technical stakeholders, and "we" as
network operators ought to be able to tell "we" the website operators to
tell "we" the web designers to stop using Flash if it's that big a
liability. This, of course, fails for the same reasons that expecting
end users to hold vendors responsible does, but there are a lot fewer of
us technical stakeholders than there are end users, so if we really want
to play that sort of game, we should try it here at home first.

What's good for the goose, and all that ...

... JG

>>> There is only so much "proper security" you can expect the average PC
>>> user to do.

>> Sure - but if their computer, as a result of their ignorance, starts
>> belching out spam, ISPs should be able at the very least to counteract the
>> problem. For example, by disconnecting that user and telling them why
>> they have been disconnected. Why should it be the ISP's duty to silently
>> absorb the blows? Why should the user have no responsibility here?

> Primarily because the product that they've been given to use is defective
> by design. I'm not even saying "no responsibility"; I'm just arguing that
> we have to be realistic about our expectations of the level of
> responsibility users will have. At this point, we're teaching computers
> to children in elementary school, and kids in second and third grade are
> being expected to submit homework to teachers via e-mail. How is that
> supposed to play out for the single mom with a latchkey kid? Let's be
> realistic here. It's the computer that ought to be safer. We can expect
> modest improvements on the part of users, sure, but to place it all on
> them is simply a fantastic display of incredible naivete.

I don't think that is what is being proposed. What is being proposed is
that, for this to work within the framework of current law, we must
create a chain of liability.

Let's use the example of a third party check which should be fairly
familiar to everyone.

A writes a check to B who endorses it to C who deposits it.
The check bounces.

C cannot sue A. C must sue B. B can then recover from A.

So, to make this work realistically, the end user (latchkey mom in
your example) has a computer and little Suzie opens MakeMeSpam.exe
and next thing you know, that computer is using her full 7Mbps uplink
from $CABLECO to deliver all the spam it can deliver at that speed.

Some target of said spam calls up $CABLECO and $CABLECO turns
off LatchKeyMom's service. The spam targets can (if they choose)
go after LatchKeyMom ($CABLECO would be liable if they hadn't
disconnected LatchKeyMom promptly), but, they probably won't if
LatchKeyMom isn't a persistent problem. LatchKeyMom can go
after the makers of MakeMeSpam.exe and also can go after
the makers of her OS, etc. if she has a case that their design
was negligent and contributed to the problem.

Yes, it's complex, but, it is the only mechanism the law provides
for the transfer of liability. You can't leap-frog the process and
have the spam victims going directly after LatchKeyMom's
OS vendor because there's no relationship there to provide
a legal link of liability.
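The check example amounts to a small algorithm: recovery has to walk the chain one link at a time, because each party only has a relationship with its immediate neighbor. A minimal sketch (the party names and the function are purely illustrative, not anything from actual legal practice):

```python
# A sketch of the chain-of-liability model from the check example:
# recovery walks the chain one link at a time (C -> B -> A); no party
# can leap-frog a link it has no direct relationship with.

def recovery_path(chain, claimant, target):
    """Return the sequence of claims needed for `claimant` to reach `target`.

    `chain` lists parties in the order liability flows back, e.g.
    ["A", "B", "C"]: A wrote the check, B endorsed it, C deposited it.
    Each party can only claim against its immediate predecessor.
    """
    i, j = chain.index(claimant), chain.index(target)
    if j >= i:
        raise ValueError("target is not upstream of claimant")
    # Each step is one claim: claimant pursues its predecessor,
    # who in turn pursues theirs.
    return [(chain[k], chain[k - 1]) for k in range(i, j, -1)]

# C cannot sue A directly; C sues B, and B in turn recovers from A.
print(recovery_path(["A", "B", "C"], "C", "A"))
# [('C', 'B'), ('B', 'A')]
```

The same shape covers the LatchKeyMom scenario: spam victim -> LatchKeyMom -> MakeMeSpam.exe maker or OS vendor, with no direct victim-to-vendor shortcut.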

>> To carry your analogy a bit too far, if someone is roaming the streets
>> in a beat-up jalopy with wobbly wheels, no lights, no brakes, no
>> mirrors, and sideswiping parked cars, is it up to the city to somehow
>> clear the way for that driver? No - the car is taken off the road and
>> the driver told to fix it or get a new one. If the problem appears to be
>> the driver rather than the vehicle, the driver is told they cannot drive
>> until they have obtained a Clue.

> Generally speaking, nobody wants to be the cop that makes that call.
> Theoretically an ISP *might* be able to do that, but most are unwilling,
> and those of us that do actually play BOFH run the risk of losing
> customers to a sewerISP that doesn't.

Whether anyone wants to be the cop or not, someone has to be the cop.

The point is that SewerISPs need to be held liable (hence my proposal
for ISP liability outside of a 24 hour grace period from notification).

If SewerISP has to pay the costs of failing to address abuse from their
customers, SewerISP will either stop running a cesspool, or, they will
go bankrupt and become a self-rectifying problem.
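The grace-period rule is mechanical enough to sketch. The 24-hour window and the function below are taken from the proposal in this thread only; they describe no existing law:

```python
# A sketch of the proposed rule: an ISP notified of abuse is liable only
# if it fails to act on the notification within a 24-hour grace period.
# The window, names, and semantics are illustrative assumptions.
from datetime import datetime, timedelta

GRACE = timedelta(hours=24)

def isp_liable(notified_at, acted_at=None, now=None):
    """True if the ISP sat on an abuse notification past its grace period."""
    now = now or datetime.utcnow()
    if acted_at is not None and acted_at - notified_at <= GRACE:
        return False                    # disconnected the abuser in time
    return now - notified_at > GRACE    # still open past the window

t0 = datetime(2010, 6, 10, 9, 0)
print(isp_liable(t0, acted_at=t0 + timedelta(hours=6)))             # False
print(isp_liable(t0, acted_at=None, now=t0 + timedelta(hours=30)))  # True
```

Under a rule of this shape, the SewerISP in the example either starts acting on notifications or accumulates liability until it becomes, as above, a self-rectifying problem.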

>> If the user, as a result of their computer being zombified or whatever,
>> has to
>>
>>> "take it in to
>>> NerdForce and spend some random amount between $50 and twice the cost of
>>> a new computer,"

>> ...then that's the user's problem. They can solve it with insurance
>> (appropriate policies will come into being), or they can solve it by
>> becoming more knowledgeable, or they can solve it by hiring know-how.
>> But it is *their* problem. The fact that it is the user's problem will
>> drive the industry to solve that problem, because anywhere there is a
>> problem there is a market for a solution.

> That shows an incredible lack of understanding of how the market actually
> works. It's nice in theory.

No, it shows how broken current market practice is. What we are saying is
that some relatively minor application of existing law to the computer market
would correct this brokenness.

> We (as technical people) have caused this problem because we've failed to
> design computers and networks that are resistant to this sort of thing.
> Trying to pin it on the users is of course easy, because users (generally
> speaking) are "stupid" and are "at fault" for not doing "enough" to
> "secure" their own systems, but that's a ridiculous smugness on our part.

You keep saying "WE" as if the majority of people on this list have anything
to do with the design or construction of these systems. We do not. We are
mostly network operators.

However, again, if the end user is held liable, the end user is then in a
position to hold the manufacturers/vendors that supplied them defective
systems liable. It does exactly what you are saying needs to happen,
just without exempting irresponsible users from their share of the pain,
which seems to be a central part of your theory.

If I leave my credit card lying around in an airport, I'm liable for part of
the pain up until the point where I report my credit card lost. Why should
irresponsible computer usage be any different?
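The credit-card analogy has the same mechanical shape: a capped share of the loss before the report, nothing after. The $50 cap below echoes the familiar US credit-card rule; applying it to computer misuse is purely the analogy, not anything that exists:

```python
# A sketch of the credit-card liability analogy: the cardholder bears a
# capped share of fraudulent charges made before the loss is reported,
# and nothing after. The cap and function are illustrative only.

CAP = 50.00

def user_liability(charges, reported_at):
    """charges: list of (timestamp, amount) pairs. Liability accrues only
    for charges made before the loss was reported, capped at CAP."""
    before = sum(amt for ts, amt in charges if ts < reported_at)
    return min(before, CAP)

charges = [(1, 30.00), (2, 45.00), (5, 200.00)]
print(user_liability(charges, reported_at=3))  # 50.0 (30 + 45, capped)
print(user_liability(charges, reported_at=0))  # 0 (reported before any)
```

The design point is the incentive: the user's exposure is real but bounded, so reporting promptly (or keeping the machine clean) is always cheaper than sitting on the problem.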

>>> then we - as the people who have designed and provided
>>> technology - have failed, and we are trying to pass off responsibility
>>> for our collective failure onto the end user.

>> I think what's being called for is not total abdication of
>> responsibility - just some sharing of the responsibility.

> I'm fine with that, but as long as we keep handing loaded guns without
> any reasonably-identifiable safeties to the end users, we can expect to
> keep getting shot at now and then.

This goes back to my being perfectly willing to have a licensing process
for attaching a system to a network.

I have no problem with requiring gun-safety courses as a condition of
gun ownership. I have no problem with requiring network security/safety
courses as a condition of owning a network-attached system.

>>> This implies that our
>>> operating systems need to be more secure, way more secure, our
>>> applications need to be less permissive, probably way less permissive,
>>> probably even sandboxed by default

>> Yep! And the fastest way to get more secure systems is to make consumers
>> accountable, so that they demand accountability from their vendors. And
>> so it goes, all the way up the chain. Make people accountable. At every
>> level.

> Again, that shows an incredible lack of understanding of how the market
> actually works. It's still nice in theory.

No... It shows a need for the market to change.

> We would be better off short-circuiting that mechanism; for example, how
> about we simply mandate that browsers must be isolated from their
> underlying operating systems? Do you really think that the game of
> telephone works? Are we really going to be able to hold customers
> accountable? And if we do, are they really going to put vendor feet to
> the fire? Or is Microsoft just going to laugh and point at their EULA,
> and say, "our legal department will bankrupt you, you silly little twerp"?

Yes, the game of telephone works all the time. It's how the entire legal
system of liability works in the United States. Yes, we need some legal
changes to make it work. For example, we need regulation which
prevents EULA clauses exempting manufacturers from liability for their
errors from having any force of law. What a crock it is that those clauses
actually work.

Imagine if your car came with a disclaimer in the sales agreement that
said the manufacturer had no liability if their accelerator stuck and you
plowed a field of pedestrians as a result. Do you think the court would
ever consider upholding such a provision? Never.

> Everyone has carefully made it clear that they're not liable to the users,
> so the users are left holding the bag, and nobody who's actually
> responsible is able to be held responsible by the end users.

Yes, those "we're not liable for our negligence" clauses need to be
removed from legal effect. Agreed.

Owen

> Primarily because the product that they've been given to use is defective
> by design.

Indeed. So one approach is to remove the protection such defective
designs currently enjoy.

> supposed to play out for the single mom with a latchkey kid? Let's be
> realistic here. It's the computer that ought to be safer.

Fine. Agreed. Now what mechanisms do you suggest for achieving that?
Technical suggestions are no good, because no one will implement them
unless they have to, or unless implementing them in some way improves
the product so it sells better.

> modest improvements on the part of users, sure, but to place it all on
> them is simply a fantastic display of incredible naivete.

Indeed. And certainly not something I'd advocate - at least not without
making sure that they, in turn, could pass the responsibility on.

> That shows an incredible lack of understanding of how the market actually
> works. It's nice in theory.

It would be a lot more pleasant discussing things with you if you
understood that people may disagree with you without necessarily being
naive or stupid.

> We (as technical people) have caused this problem because we've failed to
> design computers and networks that are resistant to this sort of thing.

And why did we do that? What allowed us to get away with it? Answer:
Inadequate application of ordinary product liability law to the
producers of software. Acceptance of ridiculous EULAs that in any sane
legal system would not be worth the cellophane they are printed behind.
And so forth. I know the ecosystem that arose around software is more
complicated than that, but you get the idea.

> Trying to pin it on the users is of course easy, because users (generally
> speaking) are "stupid" and are "at fault" for not doing "enough" to
> "secure" their own systems, but that's a ridiculous smugness on our part.

You're right. And again, I am not advocating that. People are always
going to be stupid (or ignorant, which is not the same thing as stupid).
The trick is to give them a way out - whether it's insurance, education
or effective legal remedy. That way they can choose how to handle the
risk that *they* represent - in computers just as in any other realm of
life.

> I'm fine with that, but as long as we keep handing loaded guns without
> any reasonably-identifiable safeties to the end users, we can expect to
> keep getting shot at now and then.

You keep stating the problem, where what others are trying to do is
frame a solution. Right now we are just absorbing the impact; that is
not sustainable, as long as the people providing the avenues of attack
(through ignorance or whatever) have no obligation at all to do better.

>> Yep! And the fastest way to get more secure systems is to make consumers
>> accountable, so that they demand accountability from their vendors. And
>> so it goes, all the way up the chain. Make people accountable. At every
>> level.

> Again, that shows an incredible lack of understanding of how the market
> actually works. It's still nice in theory.

There are whole industries built around vehicular safety. There are
numerous varieties of insurance that protect people - at every level -
from their own failures.

Where there is no accountability in a human system, failure is
practically guaranteed - whether in the form of tyranny, monopoly,
danger to life and limb or whatever. The idea of accountability and the
drive to attain it forms the basis of most legal and democratic systems,
and of uncountable numbers of smaller systems in democratic societies.
Now, what were you saying about "theory"?

> Do you really think that the game of
> telephone works? Are we really going to be able to hold customers
> accountable? And if we do, are they really going to put vendor feet to
> the fire? Or is Microsoft just going to laugh and point at their EULA,
> and say, "our legal department will bankrupt you, you silly little twerp"?

Please, read more carefully. "At every level". If the consumer is made
responsible, they must simultaneously get some avenue of recourse. Those
ridiculous EULAs should be the first things against the wall :-)

> Everyone has carefully made it clear that they're not liable to the users,
> so the users are left holding the bag, and nobody who's actually
> responsible is able to be held responsible by the end users.

Correct. That is the current situation, and it needs to be altered. On
the one hand consumers benefit because they will finally have recourse
for defective software, but with that gain comes increased
responsibility.

> Yes, "we" needs to include all the technical stakeholders, and "we" as
> network operators ought to be able to tell "we" the website operators to
> tell "we" the web designers to stop using Flash if it's that big a
> liability. This, of course, fails for the same reasons that expecting
> end users to hold vendors responsible does, but there are a lot fewer of
> us technical stakeholders than there are end users, so if we really want
> to play that sort of game, we should try it here at home first.

Try what?

Regards, K.