Peak Season Reliability
Some years ago I was attending a meeting between my company's Planning and Operations departments when Operations commented that our generating plants had higher reliability goals during their peak season (in our case, the summer). I realized that if the plants actually achieved higher reliability during the peak season, our Planning department would be overestimating the "Expected Unserved Energy" (EUE) during this period, because my Reliability department provided only annual estimates of each plant's future unreliability. EUE is the key component in establishing the optimal reserve margin criterion, and the substantial majority of EUE occurs during the peak season. Incorporating higher summer reliability forecasts into the planning models could therefore significantly reduce the amount of peaking generation we needed while maintaining economically optimal customer service reliability (defined as the point where the incremental cost of increased reliability equals the incremental value the customer receives as a result of that increased reliability). My Reliability department decided to investigate whether this trend was historically true and whether we could confidently predict it to continue into the future.
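The economic optimum described above can be sketched numerically: keep adding reserve margin as long as the next increment of capacity avoids more EUE value (unserved energy times the value of lost load) than the capacity costs. All numbers below are illustrative assumptions, not the company's actual data, and the exponential EUE curve is purely a stand-in for the output of a real production-cost model.

```python
import math

# Illustrative assumptions only -- not actual utility data.
CAPACITY_COST = 60_000   # $/MW-yr, assumed annualized cost of peaking capacity
VOLL = 10_000            # $/MWh, assumed value of lost load to customers
SYSTEM_MW = 30_000       # assumed system size, MW

def eue(reserve_margin_pct):
    """Toy EUE curve (MWh/yr): unserved energy falls roughly
    exponentially as reserve margin rises (illustrative only)."""
    return 200_000 * math.exp(-0.25 * reserve_margin_pct)

def optimal_reserve_margin(step=0.1):
    """Raise the margin while the reliability value of the next
    increment of capacity exceeds that increment's cost."""
    rm = 0.0
    while rm < 40:
        # Value of the EUE avoided by the next `step` points of margin.
        marginal_value = (eue(rm) - eue(rm + step)) * VOLL
        # Cost of that same increment of capacity.
        marginal_cost = CAPACITY_COST * (step / 100) * SYSTEM_MW
        if marginal_value < marginal_cost:
            return round(rm, 1)
        rm += step
    return rm
```

With these assumed inputs the crossover lands at a reserve margin in the low teens; the point of the sketch is only the stopping rule, incremental cost equal to incremental value, not the specific number.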
Data Collection and Analysis
Reliability data for each of the ~100 generating units was collected for both the system's peak season and non-peak season over the previous five years, an easy task using the North American Electric Reliability Corporation's (NERC) Generating Availability Data System (GADS). The data was then compiled for groups of units by duty type (base load, load following, cycling, and peaking), and the units' reliabilities during the peak and non-peak seasons were compared. Statistical analysis indicated a very high probability that the plants were in fact more reliable during peak seasons, rather than merely showing random variation. Furthermore, the differences were greatest for units used primarily for peaking duty and least for base-loaded units; even the nuclear units (the most base-loaded of all) exhibited the same trend, though to a lesser degree. Forecasts of future plant reliability incorporating seasonal variations could therefore be made with a high degree of confidence.
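A comparison of this kind can be sketched as a paired test on each unit's Equivalent Forced Outage Rate (EFOR) in the two seasons: is the peak-season rate systematically lower, or just randomly different? The EFOR figures below are made up for illustration (the actual study used five years of GADS data for ~100 units), and the source does not specify which statistical test was used; a paired t-test is one reasonable choice.

```python
import math
import statistics

def paired_t(peak, nonpeak):
    """Paired t statistic for (nonpeak - peak) EFOR differences.
    A large positive t suggests peak-season EFOR is genuinely lower,
    not just random variation."""
    diffs = [n - p for p, n in zip(peak, nonpeak)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample std dev of the differences
    return mean_d / (sd_d / math.sqrt(len(diffs)))

# Illustrative EFOR (%) for eight hypothetical peaking units.
peak_efor    = [3.1, 4.0, 2.7, 5.2, 3.8, 4.5, 2.9, 3.6]
nonpeak_efor = [5.0, 6.1, 4.2, 7.0, 5.5, 6.8, 4.4, 5.9]

t = paired_t(peak_efor, nonpeak_efor)
# With n - 1 = 7 degrees of freedom, a t far above ~2.4 means the
# seasonal improvement is very unlikely to be random variation.
```

Pairing by unit matters here: it removes unit-to-unit differences in baseline reliability, so the test sees only the within-unit seasonal shift.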
My Reliability Engineering department began providing the Planning department with two sets of reliability forecasts, one for the peak season and one for the non-peak season, and the Planning department modified its generation expansion optimization programs to incorporate them. The new runs showed that the economically optimal reserve margin dropped by one full percentage point with no reduction in customer service reliability. For the company's 30,000+ MW of capacity at that time, the effect was to avoid building 300 MW of new peaking capacity, and no one had to do anything differently from what they had been doing! At the time this represented a cost savings of $100 million. Following publication of our work, NERC's Generating Availability Trend Evaluation (GATE) Working Group extended the analysis to the industry with similar results. It pays for reliability engineers to keep their ears open!
1) Richwine, R.R.; Lofe, J.J. Decreasing System Peak Reserve Margin Requirements.
2) Lofe, J.J.; Bell, F.J.; Curley, G.M. Seasonal Performance Trends. NERC publication.