What do you want your ISP to block today?

Only if it impacts the ISP, which it doesn't most of the time unless
they buy an unfortunate brand of dial-up concentrators.

Bits are bits, very few of them actually impact the ISP itself. Most
ISPs protect their own infrastructure. Routers are very good at
forwarding bits. Routers have problems filtering bits. Whether it is
spam, viruses or other attacks, it's mostly customers or end-users that
bear the brunt of the impact, not the ISP.

Impact can be more than ISP equipment getting into trouble. It can also be congestion or excessive bandwidth use because of incoming abusive traffic, or infected customers.

The recurring theme is: I don't want my ISP to block anything I do, but
ISPs should block other people from doing things I don't think they
should do.

Actually this doesn't have to be the paradox it seems to be. If we can find a way to make sure at the source that the destination welcomes the communication, we can have both.

So how long is it reasonable for an ISP to give a customer to fix an
infected computer, when you have cases like Slammer, which infected nearly
every vulnerable host on the Internet within minutes? Do you wait 72 hours?
Until the next business day? Or block the traffic immediately?

And some major ISPs seem to have the practice of letting infected
computers continue attacking as long as it doesn't hurt their own
network.

Let's first look at the reverse situation: infective traffic comes in. Customers may take the position that it is in their best interest that their ISP filters this traffic forever, so that they can't get infected, regardless of whether they patch their systems or not. But it isn't realistic to expect ISPs to do this.

First of all, because in many cases the vulnerability is in a service that also has legitimate uses. In some cases this isn't much of a problem: for instance, with the Slammer worm, blocking the affected port didn't really impact the SQL service. Or with filtering Blaster: Windows file sharing no longer works, but this isn't a public service, so the people who need it can run it over a secure tunnel of some kind. However, shutting down port 80 because an HTTP implementation has a vulnerability wouldn't be acceptable because of the collateral damage.
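For the Slammer and Blaster cases above, the ports involved are the well-known ones; here is a minimal sketch of the kind of per-worm filter table an ISP might push out. The iptables-style rule text is an illustrative assumption, not any particular ISP's practice (real deployments use vendor-specific ACLs):

```python
# Per-worm filter table. The worm/port mapping is the well-known one
# (Slammer: UDP 1434, the SQL Server resolution service; Blaster:
# TCP 135, the Windows RPC endpoint mapper). The iptables-style rule
# text below is an illustrative assumption.
WORM_FILTERS = {
    "slammer": ("udp", 1434),
    "blaster": ("tcp", 135),
}

def filter_rule(worm):
    # Build a drop rule for one worm's infection vector.
    proto, port = WORM_FILTERS[worm]
    return f"iptables -A FORWARD -p {proto} --dport {port} -j DROP"

for worm in WORM_FILTERS:
    print(filter_rule(worm))
```

Note that blocking UDP 1434 leaves TCP 1433 (the actual query port) untouched, which is why the Slammer filter had so little collateral damage.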

Then there are the issues of whether ISPs can do this at all, and of how effective it would be. If ISPs were to filter everything, forever, everywhere, maybe this would be effective, but nearly all equipment takes a performance hit when it has to filter, this usually gets worse as the filters get bigger, and there are limits to how long a filter can be. On top of that, there is the management issue: with 100k ADSL customers, you need to apply filters to 100k interfaces on hundreds of boxes. So in reality ISPs can only have a limited number of filter rules in a limited number of places. While this gets rid of most of the infective traffic for as long as the filter is in place, it doesn't really protect customers: when one customer is infected, the infection can still spread to other customers (most worms are optimized for this) unless the ISP has put filters on all customer ports. And we've seen that worms are often carried from location to location in infected laptops.
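The management-scale point above is easy to put numbers on; a back-of-the-envelope sketch (all figures illustrative):

```python
# With 100k customer interfaces and a handful of blocked ports,
# the per-interface rule count adds up fast: close to a million
# ACL entries to install and keep consistent across hundreds of
# boxes. All numbers here are illustrative.
customers = 100_000
blocked_ports = [135, 137, 139, 445]
rules_per_customer = len(blocked_ports) * 2  # one tcp + one udp rule per port
total_rules = customers * rules_per_customer
print(total_rules)  # 800000
```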

And then, when the filter rules have to go (for instance, because there is a new worm du jour), experience shows there is still some infective traffic around, however long after the initial outbreak, so at some point a vulnerable system WILL be infected.

Last but not least: if ISPs filter X worms, and then worm X+1 presents itself and proves unfilterable, things get really bad, because users were depending on ISP action to prevent infection rather than taking their own measures. This could even lead to legal problems for ISPs.

Bottom line: unless ISPs explicitly want to take on this responsibility and invest in heavier equipment and very advanced network management, the best they can do is take the edge off by implementing some filtering that allows their users a little more time to patch their systems.

Then there is the other side of the coin: infected customers. I mostly work for content hosters these days, and there the situation is slightly different from the one that access ISPs are facing, as the number of customers is much smaller and the bandwidth they have is much larger. So one customer can do much more damage by either causing congestion in the local network or by driving up the bandwidth use on external connections (which is expensive because of the usual 95th percentile billing). There have been several cases in the past year where my customers shut down the ports of infected customers of theirs (sometimes lowering the port speed to 10 Mbps is a good compromise). But since this leads to many phone calls, I can imagine that doing this for every infected customer may be a problem for ISPs with many dial/ADSL/cable customers. Also, if the bandwidth use isn't too excessive, it may not always be apparent that a customer is infected.
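The "lower the port speed to 10 Mbps" compromise above can be sketched as a rate limit. The Linux `tc` token-bucket invocation below is one real way to do it, but the interface name and the burst/latency values are illustrative assumptions; in practice hosters often do this on switch hardware instead:

```python
# Sketch: cap an infected customer's port at 10 Mbps instead of
# shutting it down. The interface name "eth1" and the burst/latency
# values are illustrative assumptions.
def rate_limit_cmd(iface="eth1", rate_mbit=10):
    # Linux token-bucket filter: caps sustained throughput at rate_mbit.
    return (
        f"tc qdisc add dev {iface} root tbf "
        f"rate {rate_mbit}mbit burst 32kbit latency 400ms"
    )

print(rate_limit_cmd())
```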

OK, here's an alternative viewpoint.

We're an ISP. I'm blocking 135 and the other netbios ports inbound on my
clients' dial-up/DSL lines because if I didn't, the lines would be useless.

Client side firewalls are great, but by the time they can do anything the
traffic is already over the line. It doesn't take much traffic at all to
overload a dial-up, and every virus flare-up has a noticeable impact on DSL
lines.

I'll unblock for a client that asks. The only one who asked, sheepishly
asked for it to be put back less than an hour later. They couldn't do
anything with the line.

It's all well and good to say how things 'should' be, but reality has a way
of not caring how things should be.

Sunday morning posting. I'm blocking these ports OUTBOUND -- TO our
clients. Their lines are being saturated by other infected hosts trying to
infect them.

... Micr0$0ft's level of engineered-in vulnerabilities and wanton
disregard for security in the name of features. ...

i can't see it. i know folks who write code at microsoft and they worry
as much about security bugs as people who work at other places or who do
software as a hobby. the problem microsoft has with software quality is that
they have no competition, and their marketing people know that ship dates
will drive total dollar volume regardless of quality. (when you have
competition, you have to worry about quality; when you don't, you don't.)

When you don't have liability you don't have to worry about quality.

What we need is lemon laws for software.

--vadim

> When you don't have liability you don't have to worry about quality.
>
> What we need is lemon laws for software.

  That would destroy the free software community. You could try to exempt
free software, but then you would just succeed in destroying the 'low cost'
software community. (And, in any event, since free software is not really
free, you would have a hard time exempting the free software community.
Licensing terms, even if not explicitly in dollars, have a cost associated
with them.)

  Any agreement two uncoerced people make with full knowledge of the terms is
fair by definition. If I don't want to buy software unless the manufacturer
takes liability, I am already free to accept only those terms. All you want
to do is remove from the buyer the freedom to negotiate away his right to
sue for liability in exchange for a lower price.

  If you seriously think government regulation to reduce people's software
buying choices can produce more reliable software, you're living in a
different world from the one that I'm living in. In fact, if all companies
were required to accept liability for their software, companies that produce
more reliable software couldn't choose to accept liability as a competitive
edge. So you'd reduce competition's ability to pressure manufacturers to
make reliable software.

  Manufacturers would simply purchase more expensive liability insurance,
raise the prices on their software, and continue to produce software that is
no more reliable.

  DS

> When you don't have liability you don't have to worry about quality.
>
> What we need is lemon laws for software.

> That would destroy the free software community. You could try to exempt
> free software, but then you would just succeed in destroying the 'low cost'
> software community.

This is a somewhat strange argument; gifts are not subject to lemon laws,
AFAIK. The whole purpose of those laws is to protect consumers from
unscrupulous vendors exploiting the inability of consumers to recognize
defects in the products _prior to sale_.

The low-cost low-quality software community deserves to be destroyed,
because it, essentially, preys on the fact that in most organizations
acquisition costs are visible while maintenance costs are hidden. This
amounts to a rip-off of unsuspecting customers; and, besides, the drive to
lower costs at the expense of quality is central to the whole story of
off-shoring and the decline of the better-quality producers. The availability
of initially indistinguishable lower-quality stuff means that the market
will engage in a "race to the bottom", effectively destroying the
industry in the process.

> (And, in any event, since free software is not really free, you would
> have a hard time exempting the free software community. Licensing
> terms, even if not explicitly in dollars, have a cost associated with
> them.)

Free software producers make no implied representation of fitness of the
product for a particular purpose - any reasonable person understands that
a good-faith gift is not meant to make the giver liable. Vendors,
however, are commonly held to imply such fitness if they offer a product
for sale, because they receive supposedly fair compensation. That is why
software companies have to explicitly disclaim this implied claim of
fitness and merchantability in their (often "shrink-wrap") licenses.

> Any agreement two uncoerced people make with full knowledge of the
> terms is fair by definition.

A consumer of software cannot reasonably be expected to be able to perform
adequate pre-sale inspection of the offered product, and therefore the
vendor has the advantage of much better knowledge. This is hardly fair to
consumers. That is why the consumer-protection laws (and professional
licensing laws) are here in the first place.

> If I don't want to buy software unless the manufacturer takes
> liability, I am already free to accept only those terms.

There are no vendors of consumer-grade software who would assume any
liability in their end-user licensing agreements. They don't have to do
that, so they don't, and doing otherwise would put them at an immediate
competitive disadvantage.

> All you want to do is remove from the buyer the freedom to negotiate
> away his right to sue for liability in exchange for a lower price.

You can negotiate if you have a choice. There is no freedom to negotiate
in practice, so the "choice" is, at best, illusory. Go find a vendor which
will sell you the equivalent of Outlook _and_ assume liability.

> If you seriously think government regulation to reduce people's software
> buying choices can produce more reliable software, you're living in a
> different world from the one that I'm living in.

It definitely helped to stem the rampant quackery in the medical
profession, and significantly improved the safety of cars and appliances. I
would advise you to read some history of fake medicines and medical
devices in the US; some of them, sold as late as the 1950s, were quite
dangerous (for example, home water "chargers" containing large quantities
of radium).

Regulation is needed to make the bargain more balanced - as it stands now,
the consumers are at the mercy of software companies because of grossly
unequal knowledge and the inability of consumers to make a reasonable evaluation
of the products prior to commencing transactions.

(I am living in a country with an economic system full of regulation, and
it is so far the best-performing system around. Are you suggesting that
radically changing it will produce better results? As you may know, what
you offer as a solution was already tried and rejected by the same
country, leaving a lot of romantic, but somewhat obsolete, notions of
radical agrarian capitalism lingering around).

> In fact, if all companies were required to accept liability for their
> software, companies that produce more reliable software couldn't
> choose to accept liability as a competitive edge. So you'd reduce
> competition's ability to pressure manufacturers to make reliable
> software.

I admire your faith in the almighty force of competition. Now would
you please explain how a single vendor of rather crappy software
came to thoroughly dominate the marketplace? (Hint: there's a thing
called network externalities).

An absolutely free market doesn't work, and that is why there are anti-trust,
securities, commercial, and consumer-protection laws - all of which were
created to address actual problems after those problems were
discovered in previously unregulated markets. "Freedom" is not the same
as "fairness", and it is fairness which lets the better-for-consumer
players get the upper hand and makes the "invisible hand" work.

For an example of a really free market, go to Russia. Over there the
businesses are often engaged in such practices (not common in places with
better enforced laws) as killing competitors or freely buying government
officials. They have way more actual freedom in choosing their business
methods - but I doubt you would want to do business there. Oh, and
consumers are also quite free not to buy from the bad guys... if they are
willing to go without food or gas or apartments, etc.

> Manufacturers would simply purchase more expensive liability insurance,
> raise the prices on their software, and continue to produce software that is
> no more reliable.

If vendors need to buy liability insurance at rates which depend
on the real quality of their products (the insurance companies are quite
able to perform risk analysis, unlike the end-users), this will make
improving quality not just a matter of nebulous and fungible
repeat-business rates and consumer loyalty, but a hard, immediate,
bottom-line-impacting factor.

--vadim

This isn't the best forum for this discussion, so this will be my last
reply.

> > When you don't have liability you don't have to worry about quality.
> >
> > What we need is lemon laws for software.

> That would destroy the free software community. You could try to exempt
> free software, but then you would just succeed in destroying the 'low cost'
> software community.

> This is a somewhat strange argument; gifts are not subject to lemon laws,
> AFAIK.

  Gifts also don't come with licensing agreements. Gifts aren't the result of
contracts, but free software is. (U.S. courts treat licensing agreements as
contracts. If there's compensation, it's not a gift.)

> The whole purpose of those laws is to protect consumers from
> unscrupulous vendors exploiting the inability of consumers to recognize
> defects in the products _prior to sale_.

  Actually, they protect consumers only against the inability to recognize
this inability. So long as consumers are aware of this inability, it poses
no threat to them.

> The low-cost low-quality software community deserves to be destroyed,
> because it, essentially, preys on the fact that in most organizations
> acquisition costs are visible while maintenance costs are hidden. This
> amounts to a rip-off of unsuspecting customers; and, besides, the drive to
> lower costs at the expense of quality is central to the whole story of
> off-shoring and the decline of the better-quality producers. The availability
> of initially indistinguishable lower-quality stuff means that the market
> will engage in a "race to the bottom", effectively destroying the
> industry in the process.

  Your argument is predicated on the premise that you know better than
someone else what they want. You use the phrase "unsuspecting customers" as if
there were no other kind. You have yet to state a problem that can't be
solved by educating the customers.

> (And, in any event, since free software is not really free, you would
> have a hard time exempting the free software community. Licensing
> terms, even if not explicitly in dollars, have a cost associated with
> them.)

> Free software producers make no implied representation of fitness of the
> product for a particular purpose - any reasonable person understands that
> a good-faith gift is not meant to make the giver liable.

  Gifts also don't come with licensing terms. Free software is not a gift,
it's a contract, and contracts have compensation on both sides.

> Vendors,
> however, are commonly held to imply such fitness if they offer a product
> for sale, because they receive supposedly fair compensation.

  You say this, but you don't believe it. If the compensation was fair, then
there would be no need to provide the consumer with any additional
protections since he hasn't paid for them. You can't have it both ways.

  Whatever offer the vendor makes, the customer may take it or leave it based
upon whether it provides the customer with value for his money. There are no
"unsuspecting customers" because the terms are not a secret.

> That is why
> software companies have to explicitly disclaim this implied claim of
> fitness and merchantability in their (often "shrink-wrap") licenses.

  Umm, makers of free software have to do this too. Even people who place
software in the public domain have to do this. This has nothing to do with
compensation and has more to do with nuisance.

> Any agreement two uncoerced people make with full knowledge of the
> terms is fair by definition.

> A consumer of software cannot reasonably be expected to be able to perform
> adequate pre-sale inspection of the offered product, and therefore the
> vendor has the advantage of much better knowledge. This is hardly fair to
> consumers. That is why the consumer-protection laws (and professional
> licensing laws) are here in the first place.

  This lack of pre-sale inspection reduces the value of software to the
purchaser. So the vendor is already paying fair compensation for this lack.
The consumer hasn't paid for this information and so isn't entitled to it.

  Again you want to have it both ways. The consumer could pay for this
knowledge if he or she wanted to. Corporate customers, for example, could
buy one copy of the software to inspect. Or they could hire outside firms to
produce software reviews. Or they could just read printed reviews. There are
any number of ways customers could obtain this information if they were
willing to pay for it.

  If we assume, arguendo, that they don't obtain this information, it follows
that they weren't willing to pay for it. But you want to force them to pay
for it whether they want it or not. And why shouldn't you, you know better
than they do, right?

> If I don't want to buy software unless the manufacturer takes
> liability, I am already free to accept only those terms.

> There are no vendors of consumer-grade software who would assume any
> liability in their end-user licensing agreements. They don't have to do
> that, so they don't, and doing otherwise would put them at an immediate
> competitive disadvantage.

  Right, that's why you, knowing better than everyone else, have to force
them to.

> All you want to do is remove from the buyer the freedom to negotiate
> away his right to sue for liability in exchange for a lower price.

> You can negotiate if you have a choice. There is no freedom to negotiate
> in practice, so the "choice" is, at best, illusory. Go find a vendor which
> will sell you the equivalent of Outlook _and_ assume liability.

  They won't, because people aren't willing to pay the costs for it. Again,
that's why you, knowing better, must force them. If you really were willing
to pay the full cost, I don't think you'd have trouble finding it. Insurance
companies will provide you insurance against software failures.

> If you seriously think government regulation to reduce people's software
> buying choices can produce more reliable software, you're living in a
> different world from the one that I'm living in.

> It definitely helped to stem the rampant quackery in the medical
> profession, and significantly improved the safety of cars and appliances. I
> would advise you to read some history of fake medicines and medical
> devices in the US; some of them, sold as late as the 1950s, were quite
> dangerous (for example, home water "chargers" containing large quantities
> of radium).

  Sure, it saves lives. But it costs lives too. How many people in the United
States died while the government dragged its heels approving labetalol?
But again, you know better than everyone else, so you can decide which lives
are important.

> Regulation is needed to make the bargain more balanced - as it stands now,
> the consumers are at the mercy of software companies because of grossly
> unequal knowledge and the inability of consumers to make a reasonable evaluation
> of the products prior to commencing transactions.

  Of course you know better than the consumers what's important to them.
Strange that they seem to buy software even without this knowledge. Maybe that's
because it's not worth the cost.

> Manufacturers would simply purchase more expensive liability insurance,
> raise the prices on their software, and continue to produce
> software that is no more reliable.

> If vendors need to buy liability insurance at rates which depend
> on the real quality of their products (the insurance companies are quite
> able to perform risk analysis, unlike the end-users), this will make
> improving quality not just a matter of nebulous and fungible
> repeat-business rates and consumer loyalty, but a hard, immediate,
> bottom-line-impacting factor.

  Of course only the big software businesses could afford this. They'd have the
muscle and savvy to defend the liability claims. So far from being a threat
to Microsoft, Microsoft would likely champion a proposal like this. How much
of their competition would go away?

  DS

David, since all your arguments are variations on "You think you know
better than anyone else what they need" (whereby you supposedly extol the
virtues of a system which you don't yourself think is the best one), I do
concur that further discussion makes no sense.

--vadim

Umm.. if you explicitly put it in the public domain, you *can't* do it. You no
longer have a way to say "by copying this, you agree not to sue us down to our
skivvies". That's why the BSD and X11 distributions had copyrights at all - public
domain would probably have served their political goals just fine except for the
inability to disclaim liability by hanging it off the copyright (which they wouldn't
have if they put it in the public domain).

I just summarized my thoughts on this topic here:
http://www.sans.org/rr/special/isp_blocking.php

Overall: I think there are some ports (135, 137, 139, 445) that
a consumer ISP should block as close to the customer as
it can.
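A minimal sketch of what "block close to the customer" could look like in rule form. The interface name and the iptables-style syntax are illustrative assumptions; access gear would use vendor-specific ACLs:

```python
# Sketch: firewall rules an access ISP might apply near the customer
# edge to block the Windows RPC/NetBIOS/SMB ports discussed here.
# The interface name "dsl0" and the iptables syntax are illustrative
# assumptions, not any particular ISP's configuration.
NETBIOS_PORTS = [135, 137, 139, 445]

def block_rules(iface="dsl0"):
    # One drop rule per port per transport protocol, toward the customer.
    rules = []
    for proto in ("tcp", "udp"):
        for port in NETBIOS_PORTS:
            rules.append(
                f"iptables -A FORWARD -o {iface} -p {proto} "
                f"--dport {port} -j DROP"
            )
    return rules

for r in block_rules():
    print(r)
```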

One basic issue is that people discussing this topic on
mailing lists like these are not average home users. Most
of us here have seen a DOS prompt at some point and know
about "Service Packs" and "Hotfixes".

If ISPs had blocked port 119, Sobig could not have been distributed
via USENET.

Perhaps unbelievably to people on this mailing list, many people
legitimately use 135, 137, 139 and 445 over the open Internet
every day. Which protocols do you think are used more on today's
Internet? SSH or NETBIOS?

Some businesses have created an entire industry of outsourced Exchange
service, which needs all their customers to be able to use those ports.

http://www.mailstreet.net/MS/urgent.asp

http://dmoz.org/Computers/Software/Groupware/Microsoft_Exchange/

If done properly, those ports are no more or less "dangerous" than
any other 16-bit port number used for TCP or UDP protocol headers.

But we need to be careful not to make the mistake of assuming that just
because we don't use those ports, the protocols aren't useful to other
people.

I just read the paper... Sounds like as an ISP, I should offer a new product:
"The Internet Minus Four Port Numbers Microsoft Can't Handle." What I can't
tell is whether this should cost more or less than "The Internet".

Matthew Kaufman

Even on Windows they can be used in a much safer fashion (although I would never attempt it for any of my stuff). It is possible to use IPSec policies on 2000 and higher to encrypt all traffic on specified ports to specified hosts/networks and block all other traffic. I bet some people are using this to join remote locations securely to each other for Windows networking with these ports and IPSec policies.

Vinny Abello
Network Engineer
Server Management
vinny@tellurian.com
(973)300-9211 x 125
(973)940-6125 (Direct)
PGP Key Fingerprint: 3BC5 9A48 FC78 03D3 82E0 E935 5325 FBCB 0100 977A

Tellurian Networks - The Ultimate Internet Connection
http://www.tellurian.com (888)TELLURIAN

There are 10 kinds of people in the world. Those who understand binary and those that don't.

> Some businesses have created an entire industry of outsourced Exchange
> service, which needs all their customers to be able to use those ports.

So should everyone else be required to keep their doors open so they can
offer the service? Who is wrong/right? Millions of vulnerable users that
need some basic protection now, or a few businesses?

Charge the same and take your 'abuse' team out for lunch on the change
you save by blocking the ports :wink:

I would think that any company that outsourced exchange services to another
entity would want either a VPN between their two offices or a direct PtP
link.
But I also know that the most logical method is not always understandable to
the pointy haired people.

william

Johannes Ullrich wrote:

> So should everyone else be required to keep their doors open so they can
> offer the service? Who is wrong/right? Millions of vulnerable users that
> need some basic protection now, or a few businesses?

That depends on whether you are buying the 100% Internet or the 99.993% Internet service.

Pete

> Even on Windows they can be used in a much safer fashion (although I would
> never attempt it for any of my stuff). It is possible to use IPSec policies
> on 2000 and higher to encrypt all traffic on specified ports to specified
> hosts/networks and block all other traffic. I bet some people are using
> this to join remote locations securely to each other for Windows networking
> with these ports and IPSec policies.

If you can explain the difference between "IPSec" and "The Web" to
an end user, and can convince them that they have "enough
Pentium" for it, you win and don't have to block the ports.

> There are 10 kinds of people in the world. Those who understand binary
> and those that don't.

ISPs should either block the mentioned ports, or send out bills in
binary.

> > Some businesses have created an entire industry of outsourced Exchange
> > service, which needs all their customers to be able to use those ports.

> So should everyone else be required to keep their doors open so they can
> offer the service? Who is wrong/right? Millions of vulnerable users that
> need some basic protection now, or a few businesses?

Sorry... "Millions of vulnerable users" are only vulnerable because those
users chose to run vulnerable systems. They have the responsibility to
do what is necessary to correct the vulnerabilities in the systems they
chose to run. I am really tired of the attitude that the rest of the
world should bear the consequences of Micr0$0ft's incompetence/arrogance.
The people who are Micr0$0ft customers should have the responsibility to resolve
these issues with Micr0$0ft. It is nice of ISPs to help when they do.

This is akin to driving a Pinto, knowing that it's a bomb, and expecting
your local DOT to build explosion-proof freeways.

Owen