Jeff Shultz wrote:
** Reply to message from Brad Knowles <email@example.com> on Fri,
25 Jun 2004 18:14:43 +0200
At least if someone in this "clearing house" sells it to the
terrorists, they will have had to work for it a bit, instead of having
us hand it to them on a silver platter, as the FCC seems to want.
Not true. If the information is forced completely into the open, then everyone knows it's insecure, and no one depends on its having been kept secret. This is a case where the more open the information is, the more secure you are -- as in most cases, which is why we have the age-old security mantra that "security through obscurity is not secure".
Do you realize that the basic element of security, the password, is
based on the entire premise you just dismissed? And yet we still use
them - and depend on the fact that they are supposed to be kept secret.
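(As a side note on the password example: in practice only the credential itself is kept secret, while the verification mechanism is public. A minimal illustrative sketch -- not anything from this thread, and the function names are my own -- using Python's standard library:)

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    # The algorithm (PBKDF2-HMAC-SHA256) is entirely public;
    # the only secret is the password itself.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse")
print(verify_password("correct horse", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))    # False
```

The point of the sketch is simply that the secrecy of a well-designed system is concentrated in the key (the password), not in the design of the mechanism.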
The problem with being totally open about infrastructure is that there
are some vulnerabilities that simply cannot or will not be fixed -
wires sometimes have to run across bridges, redundant pumping stations
are too expensive... in these cases is it not better to hide where
these vulnerabilities are?
Not really. Security through obscurity in some circumstances can
help, but rarely when it comes to something like that. When it
comes to wires crossing a bridge or pumping stations, anyone who
tries hard enough will find out pretty easily. You end up with
two groups knowing where the vulnerabilities are, the handful of
"good guys" who oversee the resources and the bad guys.
It strikes me as similar to the outcry from the locksmith community
when the vulnerabilities of various master key mechanisms were
widely published. Who knew about the vulnerabilities? The "good
guy" locksmiths who used the vulnerabilities to break into your
office when you lost your keys (and sold you the locks) all knew,
and the bad guys who broke into your office to steal stuff knew.
Who didn't know? The consumer who was unable to make an informed
decision about the security of the various choices of key-lock
mechanisms he had available.
So the problem with the pumping station or the wires over the
bridge is that a limited number of experts know, the bad
guys know, but other people who should know (the network engineer
judging the reliability of his links or the civil engineer
deciding the capacity for an emergency water tower for an
industrial site) may not understand the true vulnerability
of the system.
But that doesn't mean we shouldn't put a fence around the
pumping station or a padlock on the door just because a key is
merely "security through obscurity" by some convoluted argument.
The problem with your point is that even if the information is forced
to be completely in the open, that is no guarantee that it will be
fixed, and people _do_ depend on this stuff, regardless of its
reliability or security.
Some things cannot be and should not be "fixed." Making the
public water supply invulnerable to earthquake damage is not
practical. Individuals have the responsibility (even if most
don't exercise it) to keep a few days' supply of potable water
on hand against the inevitable, though unlikely on any given
day, event of a powerful earthquake.
Making various infrastructure invulnerable to every foreseeable,
let alone unforeseeable, attack is not practical either. But
those who are affected by the failure of any piece of
infrastructure need to know how reliable it is and plan accordingly.
Do you really think that if we publish all the insecurities of the
Internet infrastructure that anyone is gonna stop using it, or
business, government, and private citizens are going to quit depending on it?
Of course not. But they may be better able to quantify their
risks in depending on the 'Net and make contingency plans where
it is prudent. The real world is about risk management; even
the US federal government has given up on a risk avoidance
model and moved to risk management.
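(For what "quantify their risks" means concretely: the usual risk-management arithmetic is annualized loss expectancy, i.e. rate of occurrence times cost per incident. The figures below are hypothetical, purely for illustration:)

```python
def expected_annual_loss(annual_rate, single_loss):
    # Annualized Loss Expectancy (ALE) = annual rate of occurrence
    # multiplied by single-loss expectancy. Standard risk-management
    # back-of-the-envelope; inputs here are made up.
    return annual_rate * single_loss

# A 1-in-5-year outage costing $500k is a $100k/year expected loss --
# a number you can weigh against the cost of a contingency plan.
print(expected_annual_loss(0.2, 500_000))  # 100000.0
```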
Security through obscurity is not secure - but sometimes it's all you have.
But it is worse than nothing when you obscure the truth from
people who should know. If the vulnerability is exploited,
the impact is worse than if those who should have known had had
the ability to plan for the contingency.