Cisco as a "First Responder"?

Interestingly enough, there's an article on MSNBC:

http://www.msnbc.msn.com/id/9131498/

...that talks about all of the "gee whiz" tech stuff
that is getting deployed to assist in the aftermath of
Katrina:

[snip]

Among the first high-tech responders was Cisco Systems, which is setting up mobile communication kits and wiki-based networks to deal with Katrina's information overload. "Just wanted you to know that we will have 'feet on the wet street,'" Cisco's Lori Bush reported in a posting to fellow members of the National Institute for Urban Search and Rescue.

Some of the equipment, like the Cisco kits, can fit into a search-and-rescue effort instantly. Other gadgets are being put into service on the fly, in hopes of boosting the communication systems currently being used. And still others aren't yet ready for prime time but will be tested in real-world conditions.

[snip]

- ferg

...that talks about all of the "gee whiz" tech stuff
that is getting deployed to assist in the aftermath of
Katrina:

Turn walkie-talkies into VoIP devices
http://www.dingotel.com/2way/index.asp

Satellite IP modems
http://www.starband.com/residential/index.asp

Mobile WiFi mesh networks
http://www.packethop.com/products/

And there are all kinds of rechargeable battery and
fuel-cell technologies capable of powering these
devices: solar rechargers, hand-crank rechargers, etc.

In addition to the first aid kits and fire extinguishers
in your offices, why not keep a backpack or two
with this kind of technology, and get together twice
a year with your local competitors to exercise it
all?

--Michael Dillon

For those who don't know, directNIC, along with some other NOCs, is
located in a high-rise in New Orleans.

Despite everyone's warnings (including my own!), some tough-as-nails guys
from directNIC stayed behind to battle the hurricane and keep the
networks online.

[snip]
While the safety of our staff is paramount and most were safely
evacuated, a small group of key personnel stayed behind to safeguard our
data center and make sure that all of our services remained online and
stable. During this time, while we spent a great deal of effort battling
broken windows, incoming water, and flying debris, our hosting and
registration services remained online and worked flawlessly.
[/snip]

Link: http://www.directnic.com/katrina.php (Pictures too!)

The page also mentions the power shortage, and that they are running on
diesel-generated power.

I remember a week or two ago people were talking about building
redundant datacenters, off-grid power, failure mitigation, etc.

I think, if nothing else, this is at least a success story of building a
NOC that can provide critical infrastructure and survive a major
disaster.

We know from the Mississippi river floods of a few
years ago that diesel generators are not sufficient
in a major flood. The problem is that the diesel gets
burned up before the roads reopen to allow fuel
resupply. It is too early to tell whether these guys
can survive a major disaster.
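
A rough back-of-the-envelope of why an on-site tank rarely outlasts a
road closure; the tank size, load, and burn rate below are invented but
plausible assumptions, not figures from directNIC:

# Rough runtime estimate for an on-site diesel generator.
# All numbers are illustrative assumptions, not figures from the thread.

tank_gallons = 2000        # usable fuel on hand (assumed)
load_kw = 250              # average load: servers plus cooling (assumed)
gal_per_kwh = 0.07         # typical diesel genset burn, ~0.07 gal/kWh (assumed)

burn_per_hour = load_kw * gal_per_kwh            # gallons per hour
runtime_days = tank_gallons / burn_per_hour / 24

print(f"Burn rate: {burn_per_hour:.1f} gal/h")
print(f"Runtime on stored fuel: {runtime_days:.1f} days")
# 2000 / (250 * 0.07) / 24 is roughly 4.8 days, well short of how long
# roads (or floodwaters) can keep fuel trucks away.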

There is also the problem of waterborne diseases,
mosquitoes, and shift changes. The problems in
New Orleans are just beginning.

--Michael Dillon

I remember that after 9/11 the real network hits started about 3 days later, when
the diesel generators in the downtown telco hotels started running out of fuel and there
was no way to physically get fuel trucks to their location. The equipment and generators were
fine; they just ran out of fuel.

If you look at the flooding in downtown New Orleans, it looks like this might happen again there.

It makes me wonder whether part of disaster planning shouldn't be some sort of power triage: if
it looks like it won't be possible to get fuel to a datacenter after a systemwide power outage,
then instead of powering everything for a short time and then going dark, power a subset for
weeks.

Since I believe that air conditioning is a big part of the fuel expenditure, this might imply
preplanning to the extent of grouping essential equipment together in a limited area that
could be kept cool while everything else goes dark.
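
Here is a minimal sketch of that triage arithmetic, using the same
assumed burn rate as the earlier fuel estimate; the load figures are
hypothetical:

# Sketch of the "power triage" idea: shed non-essential load (and the
# cooling that goes with it) so the same fuel lasts weeks instead of days.
# Every figure below is a hypothetical assumption.

TANK_GALLONS = 2000
GAL_PER_KWH = 0.07          # assumed genset burn rate

def runtime_days(load_kw: float) -> float:
    """Days of runtime on the stored fuel at a given average load."""
    return TANK_GALLONS / (load_kw * GAL_PER_KWH) / 24

full_load_kw = 250     # everything on: all racks plus full HVAC (assumed)
essential_kw = 40      # core routing/DNS plus spot cooling only (assumed)

print(f"Everything powered:    {runtime_days(full_load_kw):.1f} days")
print(f"Essential subset only: {runtime_days(essential_kw):.1f} days")
# About 4.8 days versus about 30 days on the same tank. The catch, as noted
# above, is that the essential gear has to be physically grouped so a small
# cooled area can stay alive while the rest of the floor goes dark.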

Regards,
Marshall Eubanks

I agree with your point that we don't know whether they will last the
entire length of the ordeal.

I was mostly pointing out that they have survived the initial "brunt" of the
ordeal, which IMHO is a pretty amazing accomplishment considering that
POTS/Power/Cell have all gone down (or at least gone to hell) over there.

As far as the fuel situation goes...

[snip]
/5:04 pm/ One of our employee's uncle has some kind of huge boat and he
donated his diesel reserves to our cause. We're set for the time being
as far as that goes.
[/snip]

Not very specific, but I suppose in the case of a flood this kind of thing would be immensely useful.

It's not very applicable to the kind of disaster Marshall brought up, but in the case of a flood, moving diesel into the facility by boat seems to be a viable option, at least for the time being.

My main concern at this point is getting these guys food and water reliably. They can have all the diesel fuel in the world, but if they don't have supplies to live on, it isn't going to make any difference.

To me, this is a major area of interest as there seems to be a large amount of service convergence going on. People are moving from POTS onto VoIP, more and more formerly isolated long-distance networks are being moved onto the Internet, etc.

What kind of operating protocols are being established for critical network infrastructure points? Suppose a major earthquake were to hit San Jose and take out fiber. How would that affect Arizona or Washington... what about Japan?

Granted there are a lot of things that go into this. In a disaster situation, it's important to make sure that your machines and network continue operating, but what about provisioning to make sure you can keep NOC staff there?

But that raises the question: just _how important_ are the Internet and other networks? Should we go so far out of our way as to build NORAD-style datacenters to protect our infrastructure... or are we willing to accept a certain amount of network failure if the cost of mitigating it is over X amount?
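
One way to frame that trade-off is a simple expected-loss comparison;
the probabilities and dollar figures below are placeholders, not real
data:

# Toy expected-loss comparison for the "is hardening worth it?" question.
# All inputs are invented placeholders; plug in your own numbers.

annual_outage_probability = 0.02   # chance per year of a disaster-scale outage (assumed)
outage_cost = 5_000_000            # cost of such an outage, in dollars (assumed)
hardening_cost_per_year = 400_000  # amortized cost of the "NORAD-style" option (assumed)
residual_probability = 0.002       # outage chance even after hardening (assumed)

expected_loss_unhardened = annual_outage_probability * outage_cost
expected_loss_hardened = residual_probability * outage_cost + hardening_cost_per_year

print(f"Do nothing: expected annual loss ${expected_loss_unhardened:,.0f}")
print(f"Harden:     expected annual cost ${expected_loss_hardened:,.0f}")
# With these made-up numbers: $100,000/yr versus $410,000/yr, so the bunker
# does not pay for itself. That is the sense in which accepting a certain
# amount of network failure can be the rational choice.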

Just some food for thought.

I'm not privy to directNIC's full responsibilities, but of their public-facing
responsibilities, I'm not sure DNS administrative activities are worth
risking life and limb for.

I guess there may be a need for some updates of DNS services due to the
incident itself, or similar elsewhere, but in almost all cases this can be
overridden further up the chain of DNS authority.
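
The reason this usually works is that the delegation which actually
controls where a zone's queries go lives in the parent zone. A minimal
sketch with the dnspython library; the zone name and the gTLD server
address are assumptions for illustration:

# Minimal sketch, using the dnspython library, of why DNS problems can be
# fixed "further up the chain": the referral that matters is the one the
# parent zone's servers hand out, not what the affected operator's own
# nameservers say about themselves.

import dns.message
import dns.query

GTLD_SERVER = "192.5.6.30"   # a.gtld-servers.net, a .com parent server (assumed current)

query = dns.message.make_query("example.com.", "NS")
response = dns.query.udp(query, GTLD_SERVER, timeout=5)

# The AUTHORITY section of the referral is the parent-side delegation;
# a registry/registrar change rewrites these records and redirects the
# zone even if the original nameservers are underwater.
for rrset in response.authority:
    print(rrset)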

Granted there are a lot of things that go into this. In a disaster
situation, it's important to make sure that your machines and
network continue operating, but what about provisioning to make sure
you can keep NOC staff there?

We, as a society, do know how to solve this problem.
It's called the military. Most people live under the
delusion that an army's main job is to shoot guns and
drop bombs on people. But the reality is that the main
work of the armed forces is to mobilize people, food,
water, equipment and communications into an area where
all these things are not available. The shooting part
is incidental to the work that most people are doing.

A lot of this logistics work is done by people who
aren't on the firing line, so it gets overlooked easily.
And much of the work happens long before the shooting
starts, when people plan how to deploy stuff and then
practise doing it so they know that it will work when
the crunch comes. There is no reason why corporations
or network operators couldn't apply lessons learned
from the military.

--Michael Dillon

[...]

I guess there may be a need for some updates of DNS services due to
the incident itself, or similar elsewhere, but in almost all cases
this can be overridden further up the chain of DNS authority.

I live just a mile down the road from the ISP I work at.

Given the choice of sitting at home (no power, probably no roof) or
hiding in the NOC (a warm internal room with no windows, plus a shower
and cooking facilities) and being *paid* for it, I'll "heroically" man
the ship (as opposed to cowardly hiding at work).

I think the issue is not staying at home versus at work, but rather deciding whether
or not to follow advice to evacuate an area, where you risk becoming a
liability for other rescue and recovery workers.

I understand the need for handling some telecommunications differently from
other services.

But certainly in the Caribbean, companies like C&W have established hurricane
procedures, and will deploy staff (as the Red Cross did) at key locations close
to the affected areas, ready to step in and repair things without undue risk to
their own employees and contractors.

For organisations like registrars, there are plans in place for the loss of key
organisations (mostly intended for business failure rather than catastrophic
failure).

Obviously easy to comment with hindsight from a nice comfortable location.

There are other reasons too. People have been following NOPD police scanners and posting news that the mainstream media refuse to cover:

http://www.freerepublic.com/focus/news/1474267/posts

Some rescue services are refusing to enter due to armed thugs roaming the streets with AK-47 assault rifles, carjacking, mugging and murdering people.

Law enforcement officials have been captured on videotape participating in the looting.

It does not sound like a place any sane person would choose to go. I don't think "risking your life to protect your employer's property" is in the
job description...

-Dan

While there aren't a lot of systems that are worth the risk, they definitely do exist:
think air traffic control, hospital power generation, emergency dispatch, and relief
effort coordination. Companies suffer when they're offline, but in the above cases,
people can die. The good news is that these systems are usually better equipped and
protected than the local data center...

/John

If you're going to post a URL from *that* site (for other NANOG'ers, it's a
highly politically charged site), I am obliged to exercise equal time.

Louisiana forum:

http://www.democraticunderground.com/discuss/duboard.php?az=show_topics&forum=155

Listen to the police scanner yourself.

-Dan