Wacky Weekend: NERC to relax power grid frequency strictures

The North American Electric Reliability Corporation is planning to relax
the standards for how closely power utilities must hold to 60.00 Hz.

Here's my absolute favorite quote of all time:

  Tweaking the power grid's frequency is expensive and takes a lot of effort,
  said Joe McClelland, head of electric reliability for the Federal Energy
  Regulatory Commission.

  "Is anyone using the grid to keep track of time?" McClelland said. "Let's see
  if anyone complains if we eliminate it."

http://hosted.ap.org/dynamic/stories/U/US_SCI_POWER_CLOCKS?SITE=AP&SECTION=HOME&TEMPLATE=DEFAULT

I believe the answer to that question is contained here:

  http://yarchive.net/car/rv/generator_synchronization.html [1]

This is gonna be fun, no?

Cheers,
-- jra
[1]Please, let's not start in on the source.[2]
[2]No, really: *please*. :slight_smile:

If your definition of fun is spending a year watching an old microwave
clock lose or gain a few minutes.

I don't see how this has anything to do with syncing two generators. The
grid is in sync, and if the frequency of the grid changes (as it does
all the time) it will stay in sync. It has nothing to do with the
absolute frequency.

Perhaps I read the piece incorrectly, but it certainly sounded to *me* like
the part that was hard was not hitting 60.00, but *staying in sync with
others*...

Cheers,
-- jra

The way I read it, when they occasionally run at 59.9 Hz for a few hours
(and according to my UPS monitoring software this is a regular
occurrence), they're no longer going to run at 60.1 Hz for a while so
that the average comes out to 60.
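That averaging can be made concrete with a little arithmetic. A quick sketch (my own illustration, not from the article; it assumes a clock that counts line cycles and a hypothetical 59.9 Hz run):

```python
# Illustration (mine, not from the article): how time error builds up
# for a clock that counts line cycles, and how an offsetting run at
# 60.1 Hz cancels it.

NOMINAL_HZ = 60.0

def time_error(actual_hz: float, seconds: float) -> float:
    """Seconds of clock error accumulated; negative = clock runs slow."""
    cycles_delivered = actual_hz * seconds
    cycles_expected = NOMINAL_HZ * seconds
    return (cycles_delivered - cycles_expected) / NOMINAL_HZ

# Three hours at 59.9 Hz leaves line-synced clocks 18 seconds slow:
err = time_error(59.9, 3 * 3600)       # -18.0

# Recovering 0.1 cycles/second at 60.1 Hz takes another three hours:
recovery = time_error(60.1, 3 * 3600)  # +18.0
assert abs(err + recovery) < 1e-9
```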

Regards,
Bill Herrin

Generators all stay in sync. Generator owners have expensive devices
that sync the phase before the generator is connected to the grid. Once
a generator is connected to the grid, it will stay in sync - in fact
that is why they have the expensive devices to make sure that they are
in sync before they connect them; if they are not, it will instantly
jump into sync, which may destroy the generator.

I'm not an electrical engineer, but I do IT "Cybersecurity" for a local
utility. The electrical engineering / power utility side of the house
starts to rub off - but I say this knowing I might not have all the
terms exactly right. I follow the FERC/NERC discussions, as CIP
compliance is one of my primary job duties.

In fact, if you want to see what happens if you connect a generator out
of phase, just look into the AURORA out-of-phase circuit breaker
re-closing issues which were brought to light last year. Here are some links:
http://www.atcllc.com/oasis/Customer_Notices/ATCNetworkCustomerMeeting052411_Francis.pdf

Here's another link I read in the last week while trying to get up to
speed:
http://yarchive.net/car/rv/generator_synchronization.html

Jason Roysdon

As a matter of fact, it may destroy the generator, the housing, the building,
the dam, and more. An out-of-sync generator becomes a motor until it is
in sync. It can be a graphic and dramatic event.

Owen

This paper describes what they currently do to keep clocks accurate with
Manual Time Error Correction (which is what they are going to suspend
for a year):
http://www.naesb.org/pdf2/weq_bklet_011505_tec_mc.pdf

As I said in my last post, I'm not an EE, but I just follow some of the
topics on that side of the house.

What I gather is that Manual TEC, which is done by purposely running the
frequency away from 60Hz to correct an average deviation, can actually
cause more problems.

"NERC is investigating the possibility of eliminating Time Error
Corrections. NERC has been collecting data regarding Interconnection
frequency performance, including the number of clock-minutes during
which actual frequency dropped below the low Frequency Trigger Limit
(FTL) of 59.95 Hertz. During the period of July 2005 through March 2010,
approximately 44% of the minutes during which clock-minute actual
frequency dropped below the low FTL occurred during Time Error
Corrections when scheduled frequency was 59.98 Hertz (1,875 of the 4,234
total minutes observed below 59.95 Hertz). Upon further investigation,
it was found that almost all of those minutes (1,819 of the 1,875 total)
represented frequency deviations that would likely not have dropped
frequency below 59.95 Hertz if the scheduled frequency had been 60
Hertz. In other words, approximately 97% of the Low FTLs were of such a
magnitude that if the Time Error Correction had not been in effect, the
exceedance of the low FTL would not have occurred.

These Frequency Trigger Limits in and of themselves are only indicators
of system behavior, but the nature of their relationship to Time Error
Corrections calls into question the potential impact that Time Error
Corrections can have on frequency behavior overall. While it is
intuitively obvious that any frequency offset that moves target
frequency away from the reference point to which all other frequency
sensitive devices (such as relays) have been indexed will have a
potential impact on those devices’ performance, the industry has by and
large regarded Time Error Corrections as harmless and necessary as part
of the service it provides to its customers. However, in light of this
data, NERC’s stakeholders are now questioning whether or not the
intentional movement closer to (or in some cases, further away from) the
trigger settings of frequency-based protection devices as is evidenced
during Time Error Correction events is appropriate.

Accordingly, NERC is planning a Field Trial during which the practice of
doing Time Error Corrections will be suspended. Because of the
fundamental nature of this 60Hz signal, NERC is reaching out to various
industries to get their thoughts on whether they anticipate any problems
with the elimination of Time Error Corrections. Those industries include
appliance manufacturers, software companies, chemical manufacturers,
companies that make automation equipment, computer manufacturers, and
many others."

Source:
http://www.wecc.biz/library/Lists/Announcements/Attachments/9/June%2014%20Time%20Error%20Webinar.pdf
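For what it's worth, the percentages in the excerpt check out against the raw minute counts it quotes. A quick sanity check:

```python
# Sanity check of the percentages in the NERC excerpt, using the minute
# counts it quotes.
below_ftl_total = 4234     # clock-minutes below 59.95 Hz, Jul 2005 - Mar 2010
during_tec = 1875          # of those, minutes during a Time Error Correction
offset_caused = 1819       # would not have breached FTL at a 60 Hz schedule

share_during_tec = during_tec / below_ftl_total   # ~0.44
share_attributable = offset_caused / during_tec   # ~0.97

print(f"{share_during_tec:.0%} of low-FTL minutes fell during TEC")
print(f"{share_attributable:.0%} of those were attributable to the offset")
```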

The main point I gather is that trying to do manual Time Error
Correction actually makes the power grid less stable at times, and as
such they want to do away with it (thus making the power grid more stable).

Think of it the same as patch management risk assessment. If there are
no security or bug fixes that directly affect you or even feature
enhancements that you don't need, do you apply a patch/upgrade to
critical systems? Nah, you skip those, because we all know every
patch/upgrade carries with it risk of an unknown bug or even security flaw.

That's what they're doing here, opting to skip "patching" the time
error. They're not ignoring frequency altogether, but rather only
minding the aspects that have to do with grid stability, not your alarm
clock. This is for the better anyway; NTP/GPS/WWV/WWVH is the way to go
to keep clocks accurate, and hopefully that will be the outcome of any
consumer complaints.

I've seen claims in various forums and lists I read that they are going
to ignore, or stop caring about, the 60 Hz standard. This is incorrect.
They just aren't going to purposely deviate from the scheduled
frequency to perform manual TEC.

Mind you, they still care about why the frequency is off, and when
things are not able to quickly compensate, they want to know and be able
to pinpoint it and fix it:
http://www.nerc.com/filez/standards/Frequency_Response.html

Specifically, read this PDF:
http://www.nerc.com/docs/oc/rfwg/NERC%20Balancing%20and%20Frequency%20Control%20July%205%202009.pdf

The AP piece was focused on hype and word-spinning (I couldn't find an
AP.org link, so used one that I could find,
http://www.huffingtonpost.com/2011/06/24/clock-problems-power-grid-clock-disruptions_n_884259.html
):

"The experiment would allow more frequency variation than it does now —
without corrections."

The NERC BAL standards already hold the NERC entities to very high
frequency standards, and this will be unchanged, except for manual TEC.
All it is doing is eliminating the corrections made purely for time's
sake, which actually removes a source of deliberate frequency variation.
This may, or may not, create more average frequency variation, and that
is part of this test.

"Officials say they want to try it to make the power supply more
reliable, save money and reduce what may be needless effort."

This is the real goal, and should have been the focus of the news story
- but that doesn't make headlines.

I'm going to go shop for a new clock. I had one that used the WWV/WWVH
stations, but then they messed with DST and it was off for a few weeks
around each DST change. That forced me to pull it off the wall and
change the TZ at one time of the year to correct it, but at the other
time of the year I could not correct it (as it only had 4 TZ settings),
so I took it down. Beware "Automatic Time Set" clocks which don't really
learn the time from the WWV/WWVH stations (like the Sony ICF-C218, which
has a preset time and battery, but still uses the frequency from the
wall to maintain time). The best bet is a clock that requires batteries,
as then you know it won't get time from the power grid.

Jason Roysdon

Thank you, Jason. I did some searching before I posted that, to see if
I could locate better information, but clearly, I didn't search hard enough.

Cheers,
-- jr 'my google-fu requires 60.01Hz :-)' a

NERC's site is very hard to find info on if you don't know where to
look. Even when you've found something before, it can be hard to find
again. I run into that nearly monthly and have a document just to help
me navigate to certain areas.

http://www.nerc.com/page.php?cid=6|386
http://www.nerc.com/page.php?cid=6|386|391
http://www.nerc.com/files/NERC_TEC_Field_Trial_Webinar_061411.pdf
http://www.nerc.com/filez/Webinars/tec_webinar_061411/index.htm (webinar)

Jason Roysdon

Perhaps I read the piece incorrectly, but it certainly sounded to *me* like
the part that was hard was not hitting 60.00, but *staying in sync with
others*...

If the grid is enough bigger than any one generator, that happens
automatically; once connected, you can't get out of sync without
tripping a circuit breaker. There are some minor exceptions for DC and
VF AC generation (e.g. solar and wind respectively).

The way I read it, when they occasionally run at 59.9 Hz for a few hours
(and according to my UPS monitoring software this is a regular
occurrence), they're no longer going to run at 60.1 Hz for a while so
that the average comes out to 60.

Finer gradations than that, but yes.

Regards,
Bill Herrin

This paper describes what they currently do to keep clocks accurate with
Manual Time Error Correction (which is what they are going to suspend
for a year):
http://www.naesb.org/pdf2/weq_bklet_011505_tec_mc.pdf

As I said in my last post, I'm not an EE, but I just follow some of the
topics on that side of the house.

What I gather is that Manual TEC, which is done by purposely running the
frequency away from 60Hz to correct an average deviation, can actually
cause more problems.

"NERC is investigating the possibility of eliminating Time Error
Corrections. NERC has been collecting data regarding Interconnection
frequency performance, including the number of clock-minutes during
which actual frequency dropped below the low Frequency Trigger Limit
(FTL) of 59.95 Hertz. During the period of July 2005 through March 2010,
approximately 44% of the minutes during which clock-minute actual
frequency dropped below the low FTL occurred during Time Error
Corrections when scheduled frequency was 59.98 Hertz (1,875 of the 4,234
total minutes observed below 59.95 Hertz). Upon further investigation,
it was found that almost all of those minutes (1,819 of the 1,875 total)
represented frequency deviations that would likely not have dropped
frequency below 59.95 Hertz if the scheduled frequency had been 60
Hertz. In other words, approximately 97% of the Low FTLs were of such a
magnitude that if the Time Error Correction had not been in effect, the
exceedance of the low FTL would not have occurred.

This is an example of counting an intentional deviation as a fault, and
is really only an accounting problem (even though the triggers are
mostly fixed hardware). On the other hand, if customers can notice, it
matters. (What was the SLA again?)

Interesting... Normally slow is not a problem, since it happens mostly
when the grid is running near capacity... The "frequency runs" (when I
worked IT/comms with a power company, they were on Thursday nights after
midnight...) were almost always fast (I was involved in this in the mid
70's, mostly; our company rarely reached capacity, but could if the
weather was hot enough). The slew command has to be issued almost
simultaneously at most of the power plants in the US and Canada, in
order to avoid the throttle-hunting problem that JDA mentioned (yes,
even big plants can hunt; just ask the NE folks (twice!!!), though both
of those events were caused by circuit faults (network partitions) and
not time corrections).

Now that most clocks are run by 32 kHz crystals rather than counting
line cycles, the corrections don't matter as much. I guess the
experiment is to find out who complains...
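To put rough numbers on how much the correction matters for a cycle-counting clock versus a crystal one, a back-of-the-envelope sketch (the ppm figures are my assumptions, not from this thread):

```python
# Back-of-the-envelope drift comparison over one day (my numbers):
# a clock counting line cycles during a 59.98 Hz TEC schedule vs. a
# typical 32.768 kHz quartz clock.

DAY = 86400  # seconds

# Line-frequency clock: error scales directly with the frequency offset.
# A 59.98 Hz schedule is about -333 ppm off nominal:
line_offset_ppm = (59.98 - 60.0) / 60.0 * 1e6
line_drift = line_offset_ppm * 1e-6 * DAY  # if the offset held a full day

# A cheap 32.768 kHz crystal is commonly specified around +/-20 ppm:
xtal_drift = 20e-6 * DAY  # worst case per day

print(f"line-synced clock: {line_drift:+.1f} s/day (uncorrected)")
print(f"crystal clock:     {xtal_drift:+.1f} s/day worst case")
```

TEC schedules don't actually run for a full day, but the comparison shows why the corrections existed: uncorrected, the line clock's error dwarfs a mediocre crystal's.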

Then again, as a kid I remember So Cal Edison sending out new rollers
for phonographs (I know - what's that?) when they changed from 50 to
60 Hz. Talk about inaccurate clocks (and I don't know what they did
about those either). And most of LA's hydro was still running generators
that were built for 50 Hz even into the 70's, so they lost some
efficiency.

These Frequency Trigger Limits in and of themselves are only indicators
of system behavior, but the nature of their relationship to Time Error
Corrections calls into question the potential impact that Time Error
Corrections can have on frequency behavior overall. While it is
intuitively obvious that any frequency offset that moves target
frequency away from the reference point to which all other frequency
sensitive devices (such as relays) have been indexed will have a
potential impact on those devices’ performance, the industry has by and
large regarded Time Error Corrections as harmless and necessary as part
of the service it provides to its customers. However, in light of this
data, NERC’s stakeholders are now questioning whether or not the
intentional movement closer to (or in some cases, further away from) the
trigger settings of frequency-based protection devices as is evidenced
during Time Error Correction events is appropriate.

Accordingly, NERC is planning a Field Trial during which the practice of
doing Time Error Corrections will be suspended. Because of the
fundamental nature of this 60Hz signal, NERC is reaching out to various
industries to get their thoughts on whether they anticipate any problems
with the elimination of Time Error Corrections. Those industries include
appliance manufacturers, software companies, chemical manufacturers,
companies that make automation equipment, computer manufacturers, and
many others."

Source:
http://www.wecc.biz/library/Lists/Announcements/Attachments/9/June%2014%20Time%20Error%20Webinar.pdf

The main point I gather is that trying to do manual Time Error
Correction actually makes the power grid less stable at times, and as
such they want to do away with it (thus making the power grid more stable).

Given that you have to bump up everyone's throttle half the time (or
more) to accomplish the correction, if things are already near capacity,
you are asking for it :slight_smile:

Think of it the same as patch management risk assessment. If there are
no security or bug fixes that directly affect you or even feature
enhancements that you don't need, do you apply a patch/upgrade to
critical systems? Nah, you skip those, because we all know every
patch/upgrade carries with it risk of an unknown bug or even security flaw.

That's what they're doing here, opting to skip "patching" the time
error. They're not ignoring frequency altogether, but rather only
minding the aspects that have to do with grid stability, not your alarm
clock. This is for the better anyway; NTP/GPS/WWV/WWVH is the way to go
to keep clocks accurate, and hopefully that will be the outcome of any
consumer complaints.

Already, most "modern" consumer clocks are at least driven by
crystals, if not externally sync'd...

I've seen claims in various forums and lists I read that they are going
to ignore, or stop caring about, the 60 Hz standard. This is incorrect.
They just aren't going to purposely deviate from the scheduled
frequency to perform manual TEC.

Mind you, they still care about why the frequency is off, and when
things are not able to quickly compensate, they want to know and be able
to pinpoint it and fix it:
http://www.nerc.com/filez/standards/Frequency_Response.html

I didn't read that, but the short answer is: build more generators...
(and I don't know the politics of how the ISO's fit into this; they
complicated things a lot from when I was involved in the power biz.)

Specifically, read this PDF:
http://www.nerc.com/docs/oc/rfwg/NERC%20Balancing%20and%20Frequency%20Control%20July%205%202009.pdf

The AP piece was focused on hype and word-spinning (I couldn't find an
AP.org link, so used one that I could find,
http://www.huffingtonpost.com/2011/06/24/clock-problems-power-grid-clock-disruptions_n_884259.html
):

"The experiment would allow more frequency variation than it does now —
without corrections."

The NERC BAL standards already hold the NERC entities to very high
frequency standards, and this will be unchanged, except for manual TEC.
All it is doing is eliminating the corrections made purely for time's
sake, which actually removes a source of deliberate frequency variation.
This may, or may not, create more average frequency variation, and that
is part of this test.

It probably will eliminate most of the faster slews and some risky manual
operations.

"Officials say they want to try it to make the power supply more
reliable, save money and reduce what may be needless effort."

This is the real goal, and should have been the focus of the news story
- but that doesn't make headlines.

I'm going to go shop for a new clock. I had one that used the WWV/WWVH
stations, but then they messed with DST and it was off for a few weeks
around each DS change.

Which is cute, since there is a field in the WWVB message that gives a
countdown until the DST change, and flags the direction. One of my
"atomic" clocks gave up and just has a DST button... The other is like
yours.

That forced me to pull it off the wall and
change the TZ at one time of the year to correct it, but at the other
time of the year I could not correct it (as it only had 4 TZ settings),
so I took it down. Beware "Automatic Time Set" clocks which don't really
learn the time from the WWV/WWVH stations (like the Sony ICF-C218, which
has a preset time and battery, but still uses the frequency from the
wall to maintain time).

Booo. I'm surprised at that in this day & age (a 32 kHz crystal and the
15-bit counter it needs are actually cheaper than the analog
signal-conditioning you need to count the 60 Hz reliably).

In a message written on Fri, Jun 24, 2011 at 06:29:14PM -0400, Jay Ashworth wrote:

I believe the answer to that question is contained here:

  http://yarchive.net/car/rv/generator_synchronization.html [1]

I wouldn't use a colo that had to sync their generator to the grid.
That is a bad design.

Critical load should be on a battery or flywheel system. When the
utility is bad (including out) the load should be on the battery or
flywheel for 5-15 seconds before the generators start. The generators
need to sync to each other.

Essential load (think lighting, AC units) get dropped completely for
30-60 seconds until the generators are stable, and then that load gets
hooked back in.
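For illustration, the sequence Leo describes can be sketched roughly like this (the function and timings are hypothetical, picked from the ranges in the post; this is not any real ATS logic):

```python
# Sketch (hypothetical) of the transfer sequence described above:
# critical load rides through on battery/flywheel while the generators
# start and sync to EACH OTHER; essential load is shed and only
# reconnected once the generators are stable.

def transfer_sequence(utility_ok: bool):
    """Yield (seconds_after_outage, event) tuples for a utility failure."""
    if utility_ok:
        return  # nothing to do
    yield (0, "utility lost; critical load rides on battery/flywheel")
    yield (10, "generators started, synced to each other")   # 5-15 s window
    yield (10, "critical load transferred to generators")
    yield (45, "essential load (lighting, AC) reconnected")  # 30-60 s window

for t, event in transfer_sequence(utility_ok=False):
    print(f"t+{t:2d}s: {event}")
```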

I have never seen a generator that syncs to the utility for live, no
break transfer. I'm sure such a thing exists, but that sounds crazy
dangerous to me. Generators sync to each other, not the utility.

Most of these come in open, delayed, or closed transition models:
http://www.gedigitalenergy.com/powerquality/ATSHome.htm

For open and closed transitions you'll most certainly want to sync to
utility to transition between the two. For the delayed transition model
it'll stop at the intermediate "open" point for a configurable amount of
time during which the load is disconnected from everything (i.e. let all
the motors spin down first).

~Seth

Generators all stay in sync. Generator owners have expensive devices
that sync the phase before the generator is connected to the grid. Once
a generator is connected to the grid, it will stay in sync - in fact
that is why they have the expensive devices to make sure that they are
in sync before they connect them; if they are not, it will instantly
jump into sync, which may destroy the generator.

As a matter of fact, it may destroy the generator, the housing, the building,
the dam, and more. An out-of-sync generator becomes a motor until it is
in sync. It can be a graphic and dramatic event.

Big generators are synchronous machines, since they can also generate
reactive power. If an out-of-sync synchronous machine is connected to
the grid, there's a big "kawumm" and then the machine is either in sync
or dead. Only the angle between the rotor and the magnetic field makes
the difference between generator and motor. A synchronous motor cannot
self-start and only runs at a fixed grid frequency / RPM. An overloaded
synchronous motor suddenly stops.

Smaller generators are asynchronous (induction) machines, which can run
faster or slower than network frequency - i.e. run as generator or motor
- but they always consume reactive power. They can self-start.
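The fixed grid-frequency/RPM relationship for synchronous machines is the standard synchronous-speed formula, rpm = 120 * f / poles (a textbook result, not from this thread). It also shows why those 50 Hz-built hydro units mentioned earlier ran off-design on a 60 Hz grid:

```python
# Standard synchronous-speed formula: a synchronous machine's shaft
# speed is locked to grid frequency by its pole count.

def sync_rpm(freq_hz: float, poles: int) -> float:
    """Synchronous speed in RPM for a machine with `poles` poles."""
    return 120.0 * freq_hz / poles

print(sync_rpm(60, 2))   # 2-pole turbo-generator: 3600.0 rpm
print(sync_rpm(60, 24))  # many-pole hydro unit:    300.0 rpm
print(sync_rpm(50, 2))   # same 2-pole machine on a 50 Hz grid: 3000.0 rpm
```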

Synchronising machines to a grid is not a big problem; the bigger
problem is synchronising two disconnected grids. Some years ago in
Europe, a grid operator violated the n+1 redundancy rule when he needed
to switch off a big power line over the river Ems - to allow a big ship
to leave the shipyard. The result was a netsplit through the whole of
Europe - a lot of "big" line breakers tripped and switched off
north-west and south-east power lines. The whole European grid was split
into three parts, running at higher and lower frequencies.

Details:
http://www.bundesnetzagentur.de/SharedDocs/Downloads/EN/BNetzA/Areas/ElectricityGas/Special%20Topics/Blackout2005/BerichtEnglischeVersionId9347pdf.pdf?__blob=publicationFile

Kind regards,
   Ingo Flaschberger

Take a guess what the datacenter our equipment is currently hosted in uses. Yet another reason to be glad of a datacenter move that's coming up.

Why can't we just all use DC and be happy?

motors don't produce DC?

tesla vs edison?

human safe dc voltage requires comically large conductors for the sorts of loads we energize?

transmission loss except at very high voltages...

http://en.wikipedia.org/wiki/High-voltage_direct_current

Also depends on the operator, so ask to see their xfer switches and how
they're programmed if that's a concern. All of the non-residential
models in that link for three-phase have motor/load disconnect signaling
capability. If the operator is clued enough to use it then it's all
good: shut off signal to motor/compressor loads, phase sync and switch,
signal reconnect after delay. But if they're not... run away.

Even with the delayed transition models the "hold open" delay can be too
short and end up re-energizing the motors too quickly. There's always
plenty of ways to f*ck things up good.

~Seth

And more to the point, if you're installing 2-5MW of generation capacity,
it's not all that uncommon to make it a cogen plant, at which point yeah,
you're gonna run in sync.

Leo: note that your body text was an *attachment* for some reason; new mailer?

Cheers,
-- jra

It is my understanding also that most commercial-grade gensets have, built into the ATS logic, a rule that when utility power comes back online, the transfer back to utility power is coordinated, with the ATS driving the generator until both frequency and phase are within a user-specified range?

- mike

Because, between short-term pain and long-term gain, better ideas frequently (pardon the pun) do not make it through the free-market process?