On the topic of wind/solar energy, I think these two sentences are the key to the above article:
"Generators running at 100% load cannot have their governors respond to a drop in frequency. Generators running at 100% load cannot be controlled to increase output by AGC."
Wind and solar have had their unreliable intermittency covered up by eating the grid's pre-existing margin (dispatchable generators running at less than 100% load). At some point they've eaten so much of that margin that the grid can no longer maintain reliability, and Texas, for example, is close to that point.
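The headroom point can be made concrete with a quick sketch (illustrative numbers, not from the article): a droop governor can only raise output up to the unit's unused capacity, so a unit already at 100% load contributes nothing to frequency response.

```python
def governor_response_mw(capacity_mw, output_mw, droop, f_dev_hz, f_nom=60.0):
    """Primary response a droop governor asks for, capped by headroom.
    With 5% droop, a unit raises output by (|df|/f_nom)/droop of capacity."""
    requested = capacity_mw * (-f_dev_hz / f_nom) / droop
    headroom = capacity_mw - output_mw
    return max(0.0, min(requested, headroom))

# Hypothetical 500 MW unit, 5% droop, frequency 0.1 Hz low:
print(governor_response_mw(500, 400, 0.05, -0.1))  # ~16.7 MW, headroom available
print(governor_response_mw(500, 500, 0.05, -0.1))  # 0.0 -> at 100% load, no response
```

Run the margin down to zero across the fleet and the same arithmetic applies grid-wide.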
One particularly galling aspect is that they were allowed to "use" that margin at no cost to themselves, and now the taxpayers of Texas are having to fork out over $8B for dispatchable power plants to bring back the margin that was given to wind/solar operators for free.
Leave it to the solar and wind lobbyists to push for continuing their "temporary" free ride. California is an example of this problem, with electric rates two to three times those in states with low solar and wind penetration.
How can "20 minutes to get frequency back to 60Hz or the pre-event value, or he will need to shed load to recover frequency" possibly be true in the event of a 0.1 Hz drop in power supply frequency to 59.9 Hz? The helpful, neighboring balancing area runs at 60.0 Hz while coupled to the troubled area at 59.9 Hz? Can't happen, unless connected via DC.
And is that last example required simply to keep 100-year-old electric clocks on time?
He has twenty minutes to restore ACE to pre-event values or zero, not frequency; if I said frequency, I need to edit that. However, if the BA is the cause of an underfrequency excursion, that BA has twenty minutes to correct the issue or shed load. The entire Interconnection will see the same average low frequency because the BAs are synchronized, so the neighboring BA would see low frequency too, but it would not be that next-door BA's responsibility to fix the issue.
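For readers unfamiliar with it, the tie-line-bias form of ACE makes this responsibility assignment concrete. A minimal sketch with made-up numbers (two identical BAs, each with frequency bias B = -50 MW per 0.1 Hz, one of which trips 100 MW of generation): the causing BA shows the whole deficiency as negative ACE, while the neighbor, despite sharing the same low frequency, sits near zero.

```python
def ace(ni_actual_mw, ni_sched_mw, f_actual_hz, f_sched_hz, bias_mw_per_0p1hz):
    """Area Control Error, tie-line bias form:
    ACE = (NIa - NIs) - 10*B*(Fa - Fs), with B negative by convention."""
    return (ni_actual_mw - ni_sched_mw) - 10 * bias_mw_per_0p1hz * (f_actual_hz - f_sched_hz)

# Frequency settles 0.1 Hz low; each BA's governors add 50 MW of primary response.
# The deficient BA (lost 100 MW, recovered 50 via governors) imports 50 MW over schedule;
# the neighbor exports 50 MW over schedule.
ace_deficient = ace(-50, 0, 59.9, 60.0, -50)  # ~ -100 MW: the full loss lands here
ace_neighbor  = ace(+50, 0, 59.9, 60.0, -50)  # ~ 0 MW: low frequency, but owes nothing
print(ace_deficient, ace_neighbor)
```

The bias term credits the neighbor's governor response against its extra exports, which is why only the causing BA is on the hook to act within the twenty minutes.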
NERC ended the requirement for time error correction in 2018. WECC kept automatic time error correction in its ACE calculation because it keeps imbalance energy at a minimum. Their study found that utilities carrying large pools of imbalance energy were causing the time error problem, and the automatic correction punishes the BAs that are sloppy at meeting ACE. It was nicknamed the CAISO rule.
Well, you're not wrong; part of it is that the older steam-only plants were forced into retirement by the subsidized energy prices.