Nato warns of strike against cyber attackers

Yes, it's complex, but it is the only mechanism the law provides
for the transfer of liability. You can't leap-frog the process and
have the spam victims going directly after LatchKeyMom's
OS vendor, because there's no relationship there to provide
a legal link of liability.

This leads to an incredibly Rube-Goldberg-like setup to solve the
problem; even if the issue of EULAs leaving end users holding the
bag were resolved, this would not be much of an incentive for
vendors to fix the problem.

>> To carry your analogy a bit too far, if someone is roaming the streets
>> in a beat-up jalopy with wobbly wheels, no lights, no brakes, no
>> mirrors, and sideswiping parked cars, is it up to the city to somehow
>> clear the way for that driver? No - the car is taken off the road and
>> the driver told to fix it or get a new one. If the problem appears to be
>> the driver rather than the vehicle, the driver is told they cannot drive
>> until they have obtained a Clue.
> Generally speaking, nobody wants to be the cop that makes that call.
> Theoretically an ISP *might* be able to do that, but most are unwilling,
> and those of us that do actually play BOFH run the risk of losing
> customers to a sewerISP that doesn't.
Whether anyone wants to be the cop or not, someone has to be the cop.

The point is that SewerISPs need to be held liable (hence my proposal
for ISP liability outside of a 24 hour grace period from notification).

If SewerISP has to pay the costs of failing to address abuse from their
customers, SewerISP will either stop running a cesspool, or, they will
go bankrupt and become a self-rectifying problem.

In the meantime, CleanISP is bleeding customers to SewerISP, rewarding
SewerISP. And tomorrow there's another SewerISP.

>> If the user, as a result of their computer being zombified or whatever,
>> has to
>>> "take it in to
>>> NerdForce and spend some random amount between $50 and twice the cost of
>>> a new computer,"
>> ...then that's the user's problem. They can solve it with insurance
>> (appropriate policies will come into being), or they can solve it by
>> becoming more knowledgeable, or they can solve it by hiring know how.
>> But it is *their* problem. The fact that it is the user's problem will
>> drive the industry to solve that problem, because anywhere there is a
>> problem there is a market for a solution.
> That shows an incredible lack of understanding of how the market actually
> works. It's nice in theory.

No, it shows how broken current market practice is. What we are saying is
that some relatively minor application of existing law to the computer market
would correct this brokenness.

That's like saying going to the moon is a relatively minor application of
rocket science.

> We (as technical people) have caused this problem because we've failed to
> design computers and networks that are resistant to this sort of thing.
> Trying to pin it on the users is of course easy, because users (generally
> speaking) are "stupid" and are "at fault" for not doing "enough" to
> "secure" their own systems, but that's a ridiculous smugness on our part.

You keep saying "WE" as if the majority of people on this list have anything
to do with the design or construction of these systems. We do not. We are
mostly network operators.

I keep saying "we" as opposed to "them" because "we" are part of the problem,
and "they" are simply end users. "We" can (and, from past experience with
the membership of this list, do) include members of the networking community,
hardware community, software community, developers, and other related
interests. "We" have done a poor job of designing technology that "they"
can understand and just use, which is, when it comes right down to it,
all they want to be able to do.

However, again, if the end user is held liable, the end user is then in a
position to hold liable the manufacturers/vendors from whom they received
defective systems.

The hell they are. Why don't you READ that nice EULA you accepted when you
bought that Mac?

It does exactly what you are saying needs to happen,
just without exempting irresponsible users from their share of the pain
which seems to be a central part of your theory.

If I leave my credit card lying around in an airport, I'm liable for part of
the pain up until the point where I report my credit card lost. Why should
irresponsible computer usage be any different?

Because the average person would consider that to be dangerous, and the
average person would not consider opening an e-mail in their e-mail client
to be dangerous, except that it is.

>>> then we - as the people who have designed and provided
>>> technology - have failed, and we are trying to pass off responsibility
>>> for our collective failure onto the end user.
>> I think what's being called for is not total abdication of
>> responsibility - just some sharing of the responsibility.
> I'm fine with that, but as long as we keep handing loaded guns without
> any reasonably-identifiable safeties to the end users, we can expect to
> keep getting shot at now and then.

This goes back to my being perfectly willing to have a licensing process
for attaching a system to a network.

I have no problem with requiring gun-safety courses as a condition of
gun ownership. I have no problem with requiring network security/safety
courses as a condition of owning a network-attached system.

That seems a little extreme. How about just making a device that's safe for
people to use, and does what they need? Look at the fantastic inroads that
devices like the iPhone and iPad have made. Closed ecosystem, low risk, but
enough functionality that many users accept (and even love) them. Not saying
they're entirely safe, but the point to ponder is that it *is* possible to
offer devices that seem to have a lower risk factor.

>>> This implies that our
>>> operating systems need to be more secure, way more secure, our
>>> applications need to be less permissive, probably way less permissive,
>>> probably even sandboxed by default
>> Yep! And the fastest way to get more secure systems is to make consumers
>> accountable, so that they demand accountability from their vendors. And
>> so it goes, all the way up the chain. Make people accountable. At every
>> level.
> Again, that shows an incredible lack of understanding of how the market
> actually works. It's still nice in theory.

No... It shows a need for the market to change.

And that, too, shows an incredible lack of understanding of how the
market actually works. All the wishful thinking in the world does not
result in the market changing.

> We would be better off short-circuiting that mechanism; for example, how
> about we simply mandate that browsers must be isolated from their
> underlying operating systems? Do you really think that the game of
> telephone works? Are we really going to be able to hold customers
> accountable? And if we do, are they really going to put vendor feet to
> the fire? Or is Microsoft just going to laugh and point at their EULA,
> and say, "our legal department will bankrupt you, you silly little twerp"?

Yes, the game of telephone works all the time. It's how the entire legal
system of liability works in the United States. Yes, we need some legal
changes to make it work. For example, we need regulation which
prevents EULA clauses exempting manufacturers from liability for their
errors from having any force of law. What a crock it is that those clauses
actually work.

You're not going to get it. If you try, every software manufacturer in
the US is going to be up in arms, saying that it'll put them out of
business (in some cases, probably rightly so). Even limiting liability
to the cost of the product isn't going to work; the end user still gets
hung with the cost.

And no, the game of telephone doesn't work. Most consumers simply do not
have the time to pursue issues or the expertise to know they've been
screwed, which is why we see so many class action suits - the very proof
that the game of telephone you suggest doesn't work.

Imagine if your car came with a disclaimer in the sales agreement that
said the manufacturer had no liability if their accelerator stuck and you
plowed a field of pedestrians as a result. Do you think the court would
ever consider upholding such a provision? Never.

But again, computers don't "plow" a "field of pedestrians" when they get
infected, so you're really failing to offer a meaningful comparison here.
The only way you'll get a computer to do that is to toss one out a
tenth-story window above a crowded sidewalk, and even then, I doubt a
judge would hold Dell or Microsoft liable.

> Everyone has carefully made it clear that they're not liable to the users,
> so the users are left holding the bag, and nobody who's actually
> responsible is able to be held responsible by the end users.

Yes, those "we're not liable for our negligence" clauses need to be
removed from legal effect. Agreed.

I certainly agree that this is a problem, but I'm also fairly certain I
won't see it resolved in that way in my lifetime.

This doesn't seem to be a useful discussion at this point. I think we
agree that there's a problem, but I don't really see fixing the liability
laws as likely to happen, and attempting to hold people responsible for
the actions of their computers has been difficult even for the MPAA/RIAA.
What you're suggesting is even more Rube Goldberg, and as a way to address
the issue of software quality, it would appear to be a spectacular EPIC
FAIL, given the influence of the software industry. They'll simply state
that it would be a total disaster (for them), and they'll successfully
lobby any such reform into the ground.

More likely, in my opinion, is an evolution away from the "personal
computer" model we've had until now, towards a more abstract form of
computing that emphasizes network-based ("cloud") computing, where
your device simply holds a few apps and some minor configuration, but
the heavy lifting is all done elsewhere. This, too, has many issues
associated with it, but from a security standpoint, there becomes a
manageable number of parties to hold accountable when something goes
awry, and much less of a chance for users to do something unanticipated
with their devices.

Apple seems to be making inroads in that area.

As far as network operations go, it seems that the best thing to do
is to try to assist infected customers, but that's a hard cost to
swallow. I don't really see what other realistic conclusion can be
drawn, however.

... JG