
UNITED STATES OF AMERICA

NUCLEAR REGULATORY COMMISSION

***

BRIEFING ON IMPROVEMENTS IN THE REACTOR OVERSIGHT PROCESS

***

PUBLIC MEETING

Room 1F-16
White Flint Building 1
11555 Rockville Pike
Rockville, Maryland

Tuesday, March 7, 2000

The Commission met in open session, pursuant to notice, at 1:00 p.m., the Honorable RICHARD A. MESERVE, Chairman of the Commission, presiding.

COMMISSIONERS PRESENT:

RICHARD A. MESERVE, Chairman of the Commission
GRETA J. DICUS, Member of the Commission
NILS J. DIAZ, Member of the Commission
EDWARD McGAFFIGAN, JR., Member of the Commission
JEFFREY S. MERRIFIELD, Member of the Commission

PARTICIPANTS:

PANEL I:

JAMES DYER, REGION III ADMINISTRATOR
SAMUEL COLLINS, DIRECTOR, NRR
WILLIAM TRAVERS, EDO
WILLIAM DEAN, INSPECTION PROGRAM BRANCH, NRR
ALAN MADISON, TASK LEADER, NRR
MICHAEL JOHNSON, CHIEF, PERFORMANCE EVALUATION & ASSESSMENT SECTION, NRR

PANEL II:

RALPH BEEDLE, SR. VP, NUCLEAR GENERATION AND CHIEF NUCLEAR OFFICER, NEI
DAVID GARCHOW, VP TECHNICAL SUPPORT, PUBLIC SERVICE ELECTRIC AND GAS
DAVID LOCHBAUM, NUCLEAR SAFETY ENGINEER, UNION OF CONCERNED SCIENTISTS
JILL LIPOTI, ASSISTANT DIRECTOR, RADIATION PROTECTION PROGRAMS, DEPARTMENT OF ENVIRONMENTAL PROTECTION, STATE OF NEW JERSEY
FRANK GILLESPIE, CHAIRPERSON, PILOT PLANT EVALUATION PROGRAM

P R O C E E D I N G S

[1:00 p.m.]

CHAIRMAN MESERVE: Good afternoon. We are meeting this afternoon to have a briefing on the revised reactor oversight process and the lessons that have been learned from the pilot effort that was underway to test that process.

As all of you know, the aim has been to revise the oversight process to provide an oversight mechanism that is more risk-informed, more objective and more focused.

It had, after some extensive efforts by the Commission before I arrived, been launched with a pilot program of over six months at nine plants. I know, not only from the materials that have been presented to us, but also from my conversations with the staff over the few months that I've been here, that there has been an enormous effort by the staff to pull this together, and before we get to the substance, I did want to express my appreciation to you, on behalf of my colleagues as well, for your diligence in pursuing this so aggressively.

The original plan was that we would commence with the initial implementation of the oversight process in April, and we look forward to your discussion today about your plans for implementation, as well as your efforts to assess what you have learned as a result of your pilot activities.

Let me turn to my colleagues and see if they have any comments they wish to make. If not, Mr. Travers, would you proceed?

DR. TRAVERS: Thank you, Chairman. Good afternoon. Certainly the development and pilot testing of the revised reactor oversight process has been a significant priority for the agency, and I'm glad to tell you that, although the Commission will have to be the judge, from my perspective the NRC staff have, working as a team across offices and the regions, been extremely successful in this effort.

Importantly, we have benefited from significant contributions from many of our external stakeholders throughout the process. As indicated in our recent SECY paper, 00-0049, we have now completed the six-month pilot of the revised process and have analyzed the results and lessons learned.

Although issues remain, and we will discuss those this afternoon, we are recommending Commission approval for initial implementation at all operating nuclear power plants. We think the new process is sound and that initial implementation will provide opportunities to further refine it.

With me at the table today are, starting from my left, Mike Johnson and Alan Madison, both from the Inspection Program Branch, NRR; Bill Dean, who is the Chief of that Inspection Program Branch; Sam Collins, the Director of NRR; and Jim Dyer, Region III Administrator.

Sam is going to continue with the briefing.

MR. COLLINS: Thank you, Bill. Good afternoon, Chairman, Commissioners. I'll just make a few brief remarks. As Bill indicated, we are here today as a result of much work and experience in the pilot program and in the development of the process itself. I would like to acknowledge, briefly, contributors to the process, which include not only the regions, who have contributed FTE for defining the process, writing the inspection procedures and implementing the pilot program, but also the Office of Research, in the risk area; the Office of General Counsel; and many of the stakeholders that will make presentations for you today.

It's important to note that this process, in piloting an oversight program, also piloted a new era, if you will, in the office's way of doing business, and that's in public forum and with stakeholder involvement and stakeholder influence.

Having said that, it is a work in progress, so to speak. There will be further refinements. We'll talk about those today during the course of the presentation, particularly in the area of those challenges that have been brought to us by our external stakeholders, and I anticipate that you'll hear those not only from the staff, but also from the other participants.

We believe we have a viable process identified, as indicated in the SECY paper. We will focus at the end of the briefing on the recommendations and the manner of going forward, and we will identify some of the future initiatives that are presently identified, but also acknowledge a caveat that through the initial implementation period, we expect to refine the process and identify other areas where the process can be made more viable and more effective.

With that, I will turn the briefing over to Bill Dean.

MR. DEAN: Thank you, Sam. Good afternoon, Chairman and Commissioners. If I could have the first slide, please.

What we intend to cover today is basically described in SECY 0049, which is a very detailed document that describes the results of the pilot program. It also talks about both near and long-term activities that will address many of these lessons learned.

We will cover highlights of that Commission paper today in our presentation. First, we want to spend a few minutes addressing where we are in our efforts to implement the revised reactor oversight process and review the key lessons learned, including the positive aspects of the process, that resulted from the pilot program.

At the end, we will discuss some of the major long-term activities associated with ongoing process improvements and summarize our recommendations regarding implementation of the revised reactor oversight process.

Before we get started, I do want to recognize that Mike, Alan and I were actually charter members of the IRAP effort way back, several years ago, and I must say that to be here before the Commission to discuss with you basically the fruits of a long labor is very rewarding for us.

Next slide, please.

This is really just a brief historical review of the major events in the history of the development of the revised reactor oversight process, and it finds us at the mid-point of this chart; that is, we have developed and refined the process through the pilot program such that we're at the point where the Commission can consider whether we should begin initial implementation.

Of particular note is the last bullet on this slide, which underscores that initial implementation really is a natural extension of the pilot program; that we need to come back before the Commission a year after we begin initial implementation and report on the results of that process, the further refinements that we will make and the lessons that we have learned.

Next slide, please. It's important in any discussion regarding the revised reactor oversight process to revisit the main objectives behind taking on substantial change. There certainly is a tremendous record of Commission direction regarding the NRC's assessment process over the years, which led to the IRAP effort I described earlier, and it embodies the characteristics described on this slide.

The NRC's four performance goals provide an overarching set of goals for this new process, while the four specific objectives -- that is, making the process more risk-informed, objective, predictable and understandable -- were specifically provided in Commission direction to the staff and were described in detail in SECY 99-007 last year.

Next slide, please.

The major part of our presentation this afternoon is going to regard the key issues and actions taken as a result of the pilot program lessons learned. This slide basically describes how we got to that point.

We conducted a six-month pilot program and were able to meet all the purposes described here. There are two items of note that I want to point out regarding this slide, and those are the last two bullets. Since the pilot program ended in November, we have continued to utilize the revised reactor oversight process at those nine sites at which we conducted the pilot program and have continued to gather additional lessons learned and insights from continuing the process at those sites.

Then lastly, the pilot program encompassed an extensive amount of internal and external feedback activities. We had a wide range of activities to get feedback from our public, state, industry and internal stakeholders, and that included the pilot program evaluation panel, which is a Federal Advisory Committee Act type committee, an independent advisory committee, if you will.

We had sometimes diverse and certainly considerable feedback, and it was a tremendous challenge on our part to accommodate all this feedback while keeping in mind the performance goals and objectives from the previous slide.

But this led us to a number of near-term refinements, some that we made during the pilot program and some that we've made since the program has ended, as well as issues to consider during initial implementation, and we will discuss those in more detail later this afternoon.

Next slide, please.

Before we get into some of the key issues, it's important to note that we learned much about the revised reactor oversight process that supports the general observation from almost all of our stakeholders that this is an improved process, by many measures, over our current oversight methodologies, and these are listed on this slide and described in detail in the Commission paper 0049.

We recognize it is not a perfect process, but it is clearly an improvement that warrants implementation at all sites to further expand lessons learned and improve its efficacy.

Before I move on, I would like to note that we have Jim Dyer, the Regional Administrator from Region III, here. I'd like to offer Jim the opportunity to make any comments regarding the results of the pilot program from his regional perspective.

MR. DYER: Overall, speaking for the four Regional Administrators, based on the pilot plant results and the changes to the program that are outlined in the Commission paper, we feel very comfortable moving forward with initial implementation of the revised reactor oversight program, with any appropriate refinements to be made during the initial implementation phase.

MR. DEAN: Thank you, Jim. We would now like to discuss some of the specific lessons learned as a result of the pilot program. First, I'll turn it over to Alan Madison, who will discuss with you lessons learned out of the performance indicators and the inspection program, and then Mike Johnson will address the significance determination process, assessment and enforcement processes.

Alan?

MR. MADISON: Thank you, Bill, and good afternoon. Lessons learned throughout the pilot identified the need to further develop and refine the guidance provided in NEI 99-02 for reporting of PIs. Most inconsistencies in reporting could be traced to misinterpretation issues and, consequently, four major revisions to the guidance were issued during the pilot to address these concerns.

NEI is in the process of issuing Revision 0 of the guidance, and we will endorse this revision for data collection purposes beginning April 1.

We will continue to assess the adequacy of the reporting guidance and collect feedback from the licensees and inspectors through a formal process involving an industry working group. In developing the PIs, the role of the barrier cornerstone PIs was intended to be fundamentally different from that of the other indicators.

Unlike other PIs, their thresholds were set as percentages of technical specification limits instead of relying on historical data analysis. In actual practice, plants operate very far below these limits and would rarely, if ever, exceed them. Consequently, these indicators serve primarily a public confidence role, to indicate how much margin these barriers provide to release of radioactive materials, as opposed to most of the other indicators, where the green-white thresholds were set to indicate deviations or outliers from nominal industry performance.
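[Illustrative note: To make the threshold scheme Mr. Madison describes concrete, the sketch below classifies a reported PI value against ordered color bands. The band names follow the oversight program's green/white/yellow/red convention, but the function and the example scram thresholds are hypothetical placeholders, not values taken from NEI 99-02.]

```python
# Minimal sketch, not program code: classify a performance indicator
# (PI) value against ordered band thresholds. Band names follow the
# oversight program's convention; the numbers below are hypothetical.

def classify_pi(value, thresholds):
    """Return the band color for a PI value; `thresholds` maps each
    non-green band to the value at which that band begins."""
    for color in ("red", "yellow", "white"):  # most to least severe
        limit = thresholds.get(color)
        if limit is not None and value >= limit:
            return color
    return "green"  # licensee response band

# Hypothetical thresholds for unplanned scrams per 7,000 critical hours.
scram_thresholds = {"white": 3, "yellow": 6, "red": 25}
print(classify_pi(2, scram_thresholds))  # -> green
print(classify_pi(4, scram_thresholds))  # -> white
```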

Some stakeholders have raised the concern that these PIs are, therefore, not meaningful. Also, several specific concerns were raised regarding varied collection methods and different tech spec requirements associated with the containment leakage PI. In addition, this lack of consistent information was exacerbated by infrequent data collection.

When combined, these concerns prompted the staff to eliminate the use of the containment leakage PI from the program. We will continue to closely monitor the remaining barrier PIs and assess their efficacy during initial implementation, while awaiting results of the effort by the Office of Research to develop indicators that may be more meaningful.

During the pilot program, all pilot plants were able to comply with the 14-day reporting requirement. However, significant industry feedback indicated this placed a large burden on licensees and contributed to occurrences of inaccurate reporting. The staff reconsidered this requirement in an attempt to balance the directive for gathering timely data to support the assessment process and public dissemination of information.

Given the concerns regarding unnecessary regulatory burden and accuracy of data, we determined that extending this reporting period to 21 days would better achieve this balance.

As we indicated we would do in SECY 99-007, we have reevaluated all performance indicator thresholds using the pilot program data and the historical data provided by non-pilot licensees on January 21 of this year. Consequently, we have made changes to about half the PIs. Almost all the changes affect the green-white threshold. We have reduced this threshold for six of the PIs and raised the threshold for two others.

For example, several of the safety system unavailability green-white thresholds have been changed back to those proposed in SECY 99-007. These original settings were based on a review of historical data from 1995 to mid-1998. We had agreed with NEI's proposal to change these thresholds for the pilot program to take into consideration industry goals established by INPO and the longer risk-informed allowed outage times that existed at some plants.

However, our ongoing analysis shows that our original settings more accurately reflected current industry performance.
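[Illustrative note: For readers unfamiliar with the indicator being discussed, the sketch below computes a safety system unavailability fraction in the simplest commonly cited form, unavailable hours over hours the system was required. The function and all numbers are hypothetical; the actual NEI 99-02 reporting rules (fault exposure hours, per-train accounting, and so on) are not modeled.]

```python
# Minimal sketch, assuming the simplest definition of the PI: the
# fraction of required hours a safety system was unavailable. The real
# reporting guidance adds rules (fault exposure, per-train counting)
# omitted here; all numbers are hypothetical.

def unavailability(planned_hours, unplanned_hours, required_hours):
    """Unavailable time as a fraction of time the system was required."""
    if required_hours <= 0:
        raise ValueError("required_hours must be positive")
    return (planned_hours + unplanned_hours) / required_hours

# One train, one hypothetical quarter: 20 h planned and 5 h unplanned
# maintenance out of 2,000 h required.
print(f"{unavailability(20.0, 5.0, 2000.0):.4f}")  # 0.0125
```

The resulting fraction is the kind of value that would be compared against the green-white threshold under discussion.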

This will be an area we will continue to monitor, and some future adjustments may be necessary to take into consideration continuing efforts to risk-inform the regulations and site-specific requirements.

We also recognize that industry has concerns regarding the manual scram performance indicator and the unplanned scram performance indicator, with regard to unintended consequences of these performance indicators. We're exploring potential changes to these PIs and we'll closely monitor these issues during initial implementation.

We also learned from the pilot program that several areas of program policy needed to be strengthened, and these are being addressed. For example, we will incorporate a process for major changes to the performance indicators in the overall program guidance. This process will mirror what was done for the development of the original performance indicators and will provide for rigorous review and testing prior to implementation.

Next slide, please.

One of the key internal issues that arose from our review of lessons learned during the pilot was that our inspectors were unsure what to document in an inspection report and questioned whether we had established an appropriate threshold. Previous guidance given to inspectors under the current oversight program already discouraged documenting minor violations, except in rare circumstances.

The Office of Enforcement has recently reissued extensive guidance in this area in the form of examples. The revised reactor oversight process incorporated this guidance and made it the threshold for documenting inspection findings. We also extended the guidance to include issues outside the normal regulatory framework. The intent of this change was to remove from the report subjective discussion, both positive and negative, of aspects of licensees' activities that could not be used in objectively assessing performance.

This essentially raised the threshold for what was discussed in inspection reports, commensurate with the philosophy of the new oversight program that there exists a band of performance for which licensees should be responsible, with minimal NRC interaction.

Many inspectors and regional managers were uncomfortable with removing the capability to document these observations and insights. Some licensees also expressed their concern with not having access to these insights from inspectors. Therefore, early in the pilot program, clarification was provided that reinforced the expectation that inspectors were encouraged to share their observations and insights with licensees, as they always have.

We have also allowed inspectors some leeway to document substantial observations that relate to important cross-cutting areas. However, there continues to be a distinct change in the inspection reports, such that the primary focus is on those issues that have some risk significance.

Balancing the scope, depth and frequency of inspections with the resources necessary to accomplish them is an area in which we gained a number of insights from the pilot program and which will be an area of great emphasis during initial implementation.

The Commission directed the staff not to establish specific efficiency goals for the new oversight process, but to determine what effort it took to implement the new process. Much of the feedback from inspectors has been that a number of the initial estimates were too low.

As the procedures were revised to reflect lessons learned, adjustments were made to some of the estimates. Additionally, we recently met with regional management to establish a better overall estimate for initial implementation to support planning activities. Of note is that the pilot program showed a distinct increase in the level of preparation needed, which can be attributed to the risk-informed nature of the program, as well as learning curve considerations.

We are collecting a substantial amount of information during initial implementation so that we can more accurately measure the resources needed to execute the program.

One of the major changes from the current program is the shift in emphasis toward engineering activities. This is the result of our review of past significant events and inspection findings. This may alter the skill set required of our inspectors. Several regions are already pursuing hiring additional staff with engineering expertise.

In addition to our experienced staff, we will use contractor resources to facilitate execution of the inspection program. This will be an area that we will assess during initial implementation and report back to the Commission on next year.

Additional attention has also been focused on fire protection expertise, and we are considering training needs for regional staff to support conducting these inspections. We will also evaluate this area during initial implementation and adjust our actions accordingly.

Finally, as in other program areas of the revised oversight program, we have identified the need to develop a more formal program change process. The objective is to provide a deliberate process which will consider the risk significance of proposed changes to the inspection program and performance indicators, and this process will take into consideration the effect of the proposed changes on the cornerstone attributes and their relationship to the important areas to measure within each cornerstone. We expect this process to be in place sometime next year.

This concludes my remarks. Next is Mike Johnson.

MR. JOHNSON: Thanks, Alan. Good afternoon. The significance determination process provides a major advance in making our process more risk-informed, objective and predictable by providing a set of tools by which inspectors and others can characterize the significance of inspection findings.

While we believe the pilot program demonstrated the potential of the SDP, the issues identified highlight the need for continued improvements.

Let me first describe several improvements we've made or will make to the reactor safety SDP. When we began the pilot, we noted that the SDPs for fire protection, containment and shutdown needed to be developed. Initial development of the fire protection SDP was completed and the SDP was exercised during the pilot program. We identified lessons learned and are refining the SDP to make it easier to use.

We developed a shutdown screening tool that will enable inspectors to screen out less significant issues and raise issues of potentially greater risk significance for a more complete analysis. We've also developed a containment SDP. We will perform a feasibility review for both the containment SDP and the shutdown screening tool prior to April and expect to have them available shortly thereafter.

The reactor safety SDP was developed to determine the significance of issues based on their impact on core damage frequency and large early release frequency.
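[Illustrative note: The risk-informed screen Mr. Johnson describes can be pictured as binning a finding by its estimated change in core damage frequency. The sketch below uses the decade boundaries commonly cited for the SDP color bands (1E-6, 1E-5 and 1E-4 per reactor-year), but it is a toy: the actual SDP also weighs large early release frequency and qualitative factors, and the function name is invented.]

```python
# Toy sketch of a risk-informed significance screen: bin an inspection
# finding by its estimated increase in core damage frequency
# (delta-CDF, per reactor-year). The decade boundaries are the commonly
# cited SDP benchmarks; LERF and qualitative factors are not modeled.

def sdp_color(delta_cdf):
    if delta_cdf < 1e-6:
        return "green"   # very low safety significance
    if delta_cdf < 1e-5:
        return "white"   # low to moderate safety significance
    if delta_cdf < 1e-4:
        return "yellow"  # substantial safety significance
    return "red"         # high safety significance

print(sdp_color(3e-7))  # -> green
print(sdp_color(2e-5))  # -> yellow
```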

The potential impact of external events, such as flooding and fires, is not reflected in the SDPs. For some plants where the influence of external events is notable, this will introduce potential non-conservatism into the SDP analysis.

I should point out that the agency's own simplified plant analysis of risk (SPAR) models have yet to be updated to incorporate external events. For initial implementation, the staff will develop a screening tool to help identify those issues that should receive further evaluation to specifically account for increased risk contribution due to external events.

To support implementation of the reactor SDP during the pilot, we developed site-specific worksheets. We visited pilot sites to verify that the plant equipment considered in the SDP as providing mitigation capability is, in fact, present.

We also ran several scenarios through the SDP and the licensees' PRAs to ensure the results are appropriate. We plan to finish issuing worksheets for all plants in the next several weeks and conduct visits to all sites within the next few months.

As a related matter, during the pilot, we compared the plant-specific reactor SDP to two pilot plant licensees' PRAs for several hypothetical findings. We found, in some instances, the SDPs underestimated risk due to the omission of certain core damage sequences. These sequences related to initiating events that consequently removed mitigation capability and, therefore, had a greater effect on the risk than the SDP had previously estimated.

We refer to these as special initiators. An example of a special initiator could be the loss of an electrical bus or a cooling water system that simultaneously causes a trip and removes mitigation equipment.
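[Illustrative note: The extra risk from a special initiator can be seen with a toy calculation: because the event both causes a trip and disables mitigation, core damage requires fewer additional failures than for an ordinary initiator. All frequencies and probabilities below are hypothetical round numbers, not plant data.]

```python
# Toy illustration of why special initiators matter. An initiating
# event that also removes one of two mitigation trains contributes far
# more to core damage frequency (CDF) than one that leaves both trains
# intact. All numbers are hypothetical.

freq_initiator = 1e-2  # initiating events per year (e.g., loss of a bus)
p_fail_train = 1e-3    # failure probability of one mitigation train

# Ordinary initiator: both independent trains must fail.
cdf_ordinary = freq_initiator * p_fail_train * p_fail_train

# Special initiator: the event itself disables one train, so core
# damage requires only the failure of the single remaining train.
cdf_special = freq_initiator * p_fail_train

print(f"ordinary: {cdf_ordinary:.1e}/yr")  # 1.0e-08/yr
print(f"special:  {cdf_special:.1e}/yr")   # 1.0e-05/yr, 1,000 times larger
```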

We will supplement the basic SDP worksheets to address special initiators. Draft special initiator worksheets are expected to be available before the end of May. Site visits by NRC risk analysts will confirm or modify these worksheets, and in the interim, the SDP screening tool will be used to identify potentially significant findings for analysis by risk experts.

I've spoken a lot about the reactor SDP, but I'd also like to point out that during the pilot, we exercised the other SDPs -- the emergency preparedness, the radiological protection, and the safeguards SDPs -- and have subsequently improved them based on lessons learned.

In addition to improving the SDP tools, we found that process improvements were needed. We found that the time expended in completing the technical evaluation of issues that progress beyond initial screening, the time spent documenting the results, and the time allowed for licensees to provide any additional information to the NRC before a final decision was reached made timely resolution difficult.

We will clarify and modify the process for handling these issues to improve our efficiency and timeliness, while not sacrificing the opportunity for licensees to provide input. We will assess the efficacy of the entire SDP process as we go forward.

Finally, the process increases the importance of the staff's ability to understand and use risk insights. We will continue to explore ways to provide additional risk knowledge and capability by increasing the number of risk analysts, but also by increasing the overall ability of the staff to make greater use of risk insights.

Relatedly, we are taking another look at our risk training to ensure it meets our expectations.

Next slide.

Two major issues were identified related to assessment. First, feedback indicated that there remain fundamentally differing views regarding how cross-cutting issues should be handled in the oversight process. One prevalent view was that declining performance in cross-cutting areas will be reflected in performance indicators and inspection results that cross thresholds; absent such performance, the NRC should not engage. This view is one of the fundamental tenets of the original program.

Others share a very different view; namely, that it's possible to have programmatic breakdowns in cross-cutting areas that do not necessarily manifest themselves in issues of sufficient significance to trip thresholds in a timely manner. Historically, plants that experienced significant performance problems evidenced early problems in cross-cutting areas.

Given the concerns regarding treatment of cross-cutting issues, we modified the guidance to allow for qualitative discussion of substantial cross-cutting issues in the mid-cycle and end-of-cycle assessment letters. However, the procedure provides that actions will not be taken for these cross-cutting concerns absent a PI or inspection finding outside the licensee response band.

A working group is being established to continue the dialogue and the work on better addressing concerns related to treatment of cross-cutting issues. This work will continue into the first year of implementation, and in the interim, we plan to proceed with the existing policy that I've just mentioned.

Secondly, feedback indicated that stakeholders recognize that deviations from the action matrix will need to occur. However, there was a widespread belief that such deviations should be rare, that they should be made in accordance with an established process, and that the deviation, along with its rationale, should be clearly documented for all stakeholders to see.

We have modified our guidance to establish these expectations.

One last area related to assessment reflects the realization that the revised reactor oversight process doesn't distinguish between findings involving regulatory non-compliance and those findings that don't involve a non-compliance, but represent an increase in plant risk as a result of deficient performance.

If such an issue were to achieve sufficient risk significance to meet the regulatory guidelines for a backfit, the staff would consider a backfit. However, it's possible for issues to cross thresholds in the oversight process but not achieve the threshold for backfit.

The staff believes that consideration of issues based on their safety significance, and not just on whether non-compliance is involved, is consistent with taking a risk-informed approach to regulation.

Therefore, we plan to apply the action matrix accordingly. However, we will ensure that all requirements for backfitting are met prior to implementing new regulatory requirements on licensees.

Next slide, please.

A primary aim of the revised oversight process was to better integrate enforcement with our other oversight activities and make it a process outcome and not a driver. During the pilot program, we specifically looked to ensure that enforcement outcomes were consistent with the SDP results and the enforcement policy, as revised, for plants participating in the pilot program.

They were, and the feedback received was generally supportive of the changes made. However, significant concerns were raised by the industry regarding the staff's application of 50.9, completeness and accuracy of information, related to PI reporting inaccuracies. We are revising the enforcement policy to address these concerns and to incorporate the remaining interim enforcement guidance used for the pilot plants into the body of enforcement policy that will be applicable to all plants.

The draft Commission paper that forwards this policy is in concurrence.

Bill?

MR. DEAN: Thank you, Michael. Next slide, please.

As a result of SECY 99-007 and 007A and the pertinent Commission briefings, the Commission issued an SRM on June 18, 1999, which approved the implementation of the pilot program. In that SRM, the Commission also asked the staff to address a number of issues.

This slide summarizes four of the key issues that were discussed in that SRM, and these are described in much more detail in SECY 00-0049, including the results of stakeholder feedback that we solicited on these issues in a Federal Register notice.

On the first issue, Mike has just spent some time talking about programmatic breakdowns and the influence of cross-cutting issues, and the fact that there is a distinct difference of opinion between what's described in the regulatory framework and what may be true in application, and Mike described what we intend to do on that on an ongoing basis.

With respect to the next two issues, in terms of overall assessment of cornerstones and inclusion of positive inspection findings, the feedback that we've received from our stakeholders, as well as ongoing review by the staff, supports the positions that we described in SECY 99-007 and 007A and reiterated in SECY 00-0049, and no changes are planned in what we intend to do in the oversight process.

Finally, there is the issue of SALP, which has been suspended for several years now as we've gone through the development of this process. We recommend the termination of the SALP process upon implementation of the revised reactor oversight process.

Next slide, please.

As I mentioned earlier, in executing the pilot program, we developed and executed a number of significant and time-consuming external and internal communication activities. We believe that we need to continue this level of communication through initial implementation.

Some of the methodologies by which we intend to continue to do this include continued public outreach activities; for example, the regions are planning, over the course of the next six months, to visit all the non-pilot sites, much as we did the pilot sites, to communicate this new oversight process to the constituents in the local vicinities of the plants.

Our external web page has been a very valuable tool in providing information to the public and other interested stakeholders. We've gotten a lot of feedback over the course of the last year about that web site, and we are undertaking a number of changes and revisions to make it a clearer, more understandable and more easily navigable web page.

One of the initiatives that the Office of Public Affairs provided us great assistance on was developing a plain language description of the oversight process. We are in the process now of revising and rewriting that NUREG, NUREG-1649, and we will be issuing it in the very near future, so that the public will have a plain language description of the new oversight process.

We will continue our regular public meetings with NEI and industry. We have held these meetings on approximately a biweekly basis to discuss ongoing issues in the spirit of cooperation that surrounded the development of the revised reactor oversight process, and we will continue those public forum interactions with industry and NEI.

Then, much as we did at the end of the pilot program, when we conducted a lessons-learned workshop that brought NRC, industry and public stakeholders together in one forum to discuss the major lessons learned and describe potential approaches for dealing with them, we intend to have a similar workshop during initial implementation, probably close to the end, to once again revisit the lessons learned that came out of initial implementation and, once again, get all stakeholder input as to where we should go forward.

Next slide, please.

In addition to working very closely with our external stakeholders, moving into initial implementation will also require continuing the substantial level of interaction that we've had with the NRC staff, both in the regions and in headquarters. We are at the point in our change management process where the regional staff is educated about the new process and ready to go, albeit with some level of skepticism.

In addition to regional initiatives to manage this change, we will conduct a variety of activities to continue to move both our regional and headquarters staff forward in embracing this major process change, and these activities are listed on this slide.

Next slide, please.

I mentioned earlier, and I think there is general agreement among all stakeholders, that the process is not perfect. Alan and Mike described some of the key near and long-term activities in place to refine the revised reactor oversight process based on the lessons learned from the pilot program.

This slide lists some of the broader activities that may proceed over the course of the next several years, in which we would utilize expertise from outside the Inspection Program Branch to accomplish some of these initiatives and also, in some cases, utilize input from external stakeholders.

The first issue there is developing additional performance indicators; for example, a containment performance indicator. This has been described over the past year in research user need memos from the Office of NRR, and these are encompassed, in part, by the research effort to develop risk-based performance indicators. We intend to continue our dialogue with the Office of Research on the efficacy of that risk-based performance indicator program and where we might be able to glean some additional performance indicators that would support the revised reactor oversight process.

On the next item, with regard to industry-wide assessment and trend evaluation, once again, we are soliciting the Office of Research's assistance. What we envision here is having a process that would provide essentially a check and balance on the revised reactor oversight process, to give us assurance of the capability and continued efficacy of that process to maintain safety.

This would include utilizing such ongoing programs as the accident sequence precursor program, initiating event studies and so on, and we're in the process of working with the Office of Research to better define how we would go about conducting this industry-wide assessment process.

The third bullet there, the oversight process self-assessment, is in the spirit of having a continuous process improvement philosophy for the revised reactor oversight process. We would envision that, on an annual basis, we would accumulate insights about the efficacy of the program, come back to the basic framework and implementing documentation for the revised reactor oversight process, review that, and make appropriate changes as necessary.

Finally, we would see all of those aforementioned items come together in an annual forum, where we would have our annual agency action review meeting and Commission briefing, which we would consider to be a three-phased approach.

One piece of that briefing would be a discussion of individual plant licensee performance, where we had plants that warranted agency-level attention. We would also have a portion of the briefing that would discuss industry-wide performance, assessment and trends, which we would get from our overall assessment of industry-wide performance. And finally, the oversight process performance and improvements based on the self-assessment conducted by the staff.

The Commission should see future correspondence on these issues over the course of the next six months or so in various SECY papers.

Next slide, please.

As Bill Travers mentioned at the outset of this presentation, the staff is ready to proceed with initial implementation. As we have just noted, there certainly are issues that we continue to work on, and we know that there will be additional refinements that will be necessary as we go through initial implementation.

As a practical matter, in order to meet the direction provided by the Commission in its June 1999 SRM, and as we described in SECY 0049, the staff has poised itself, through substantial training and planning activities, to begin initial implementation in April.

Of particular note are the ongoing efforts right now where the regions are performing their annual plant performance reviews and developing inspection plans based on the utilization of the new baseline inspection program.

I think this might be a good point to ask Jim to weigh in again with respect to the planning phase that the regions are undergoing for implementing the new process.

MR. DYER: Right now, in fact, in Region III, this week, we're undergoing the plant PPR process, outlining the proposed inspections, reviewing the performance indicators and preparing to move forward into it.

Last week, we held our final training session for the staff in Region III, which I found to be excellent.

One thing I wanted to pass on, to the credit of NRR: the quality of the training and the developments made in explaining areas such as the significance determination process have improved substantially from the initial efforts when we were briefed going into the pilot program.

So it's really evolved, and I think it's helped to prepare the regional staff for implementation.

MR. DEAN: Thank you, Jim. The other two items here are discussed in some detail in SECY 0049: the recommendation, as I mentioned earlier, to terminate the SALP process, and the solicitation of feedback from the Commission on three issues of note that Mike and Alan just described -- the role of the barrier performance indicators; the handling of issues that might be identified outside licensing basis or design basis activities, but which still carry some risk insights; and the area of cross-cutting issues, which we are continuing to review.

These are highlighted in SECY 0049, and if the Commission believes that the staff approach creates any concerns, we would certainly appreciate and ask for feedback in those areas.

With that, the staff has concluded its briefing of the Commission on its recommendations regarding the revised reactor oversight process and initial implementation, and we sit here ready to respond to your questions.

CHAIRMAN MESERVE: Thank you very much for a very helpful briefing.

Several of you have mentioned that what you envision for April is initial implementation, which obviously reflects an expectation that it will take some period of months to get everything fully operating and that there will be lots of learning and changing that may well be required as we go forward.

Nonetheless, I am rather struck by the presentation, by the number of areas in which you have uncertainty. You indicated that there are performance indicators under development and some needing further assessment. The thresholds are changing. The process for changing performance indicators is only now being worked out. The resources that would be required to undertake the inspections are something that's under continuing evaluation.

You have some concerns about the expertise in fire protection engineering that may be necessary to fully implement the program. You have additional SDPs that you need to develop for fire protection, shutdown and containment. You're working on screening tools for external events and for what you're calling special initiators. You have the whole set of SDP process timeliness issues that you need to resolve, and on top of that, you have training.

It is an incredibly ambitious agenda to get all of that pulled together, and it seems to me a substantial portion of it, or some parts of it at least, really have to be in place by April.

Are you ready?

MR. DEAN: Let me take the first shot at that. A number of the areas that you described, Chairman, are things that we have been working on over the course of this effort -- these aren't brand new issues. These are things that we've been aware of for quite some time, and we've had a number of efforts in place to develop them.

For example, I'll point to the significance determination process associated with containment. That is something that has been worked on over the course of the last nine months or so, and earlier this month we received a product from Brookhaven National Laboratory, which has been working on it, that we believe is very substantial and will be very useful. We intend to do a feasibility review of the containment SDP next week, as a matter of fact, and share that SDP with industry and allow industry to run similar scenarios through that significance determination process.

So, for example, a lot of those things you described are things that we believe we will have in place.

One of the purposes of the pilot program was, of course, to shake out what I would consider to be major issues, major flaws or potential flaws with the process. We were successful in doing that, but we also identified areas where we need to continue to refine, and expanding the sample set, if you will, from nine sites to the 68 sites across the country will certainly generate a lot more information and will allow the staff to more fully, I think, flesh out or define all the areas that need to be addressed to make this a solid process going forward.

We think, based on what we got from the pilot program, that we have developed those key lessons learned and we've fixed those issues, or they will be fixed in the very near term. But certainly there is a clear recognition that we're going to learn additional information, which is why the whole concept of initial implementation is what it is. It's an extension, the next natural step from the pilot program: now let's get all plants under the same umbrella of oversight, so we can fully explore and learn how this program is going to work with all sites under that same program and process.

CHAIRMAN MESERVE: So you really view this exercise as an extension of the pilot effort, and there are going to be changes that will be undertaken, that will have to be undertaken, over the months going forward.

MR. DEAN: Correct.

MR. COLLINS: Chairman, I would just like to respond to that as well, because it's going to be me and my counterparts in the regions who are going to go forward with implementing this. I think the pilot program results showed us we're not all the way there, but we would prefer to implement, recognizing that we have these changes coming, rather than to continue two processes, with a few pilot plants, while trying to refine everything to completion.

I think the activities and the changes recently made to the program, to accommodate some of our concerns and the holes that are in the program, are acceptable for ensuring that we can implement the program.

So I'm comfortable with it.

CHAIRMAN MESERVE: One of the concerns that's been expressed about the program is a fear that we will not get early enough notice of declining safety performance and, therefore, will not have the capacity to step in in a timely fashion.

Ms. Lipoti, who is on the second panel, has characterized the indicators, for example, as lagging indicators, as she will speak to us about when she gets here.

Is that a fair evaluation of the situation, or are you comfortable that you're going to be able to get on top of these situations in a timely fashion, before things have deteriorated significantly?

MR. DYER: Yes. I believe that that concern still exists. I believe the anxiety level, from when it was originally reported in the GAO report and more recently in our internal surveys, has been reduced because of the training, some of the experience that we have, and some of the recent changes that allow cross-cutting issues to be addressed in inspection reports and commented on in the PPR letters.

I think with respect to the performance indicators, a lot of the questions were asked considering the performance indicators alone or the inspection program alone -- would they be able, in and of themselves, to identify this declining trend?

One of the things we've been working with, at least in Region III and the other regions, is that it's a package deal. We've still got a fairly robust inspection program that's accompanying these performance indicators.

My discussions with some of the inspectors last week during the training showed their concern is about turning things over to the licensees' corrective action program once they've identified them; in other words, we can't pursue resolution on our schedule, we have to rely on the licensees' corrective action program. That still causes us some concerns about whether or not their programs are going to be robust.

But I believe we can still engage them and, in the current climate, it will be acceptable.

CHAIRMAN MESERVE:  Won't you inspect their program to assure yourself that it's adequate?

MR. DYER:  Yes, sir. We'll have a problem identification and resolution inspection on an annual basis. That's a rather significant program review. The other thing we will do is that the residents, in their ongoing inspections as they go month to month, will include the effectiveness of corrective actions as part of their review of the surveillance program or the maintenance program.

In the past, where an inspector would identify an issue, it would become the inspector's issue and he would drive it through with the licensee. Now, once the licensee puts it in their corrective action program, we let their corrective action program carry it, and there is some reluctance and skepticism about the ability of that approach to be successful.

DR. TRAVERS:  Our experience in addressing performance issues has been pretty minimal, though, in the course of this pilot. We really have had no significant experience with the need to identify, in a timely way, declining performance.

We hope that stays true for all plants at all times. Nevertheless, a healthy program has to be one that demonstrates its effectiveness for declining performance as well as good performance.

This is not a program, and we're certainly not selling it as one, that merely recognizes the increased level of performance on the part of the industry. Certainly that improvement is a fact, but we have to be able to demonstrate that what we have in place is a program that will, in fact, recognize declining performance in a timely way.

We think we've got that. We think, in large measure, the cross-cutting issues and the insights associated with the inspection program are the key to that, because, as you rightly point out, of the lagging nature of some of the performance indicators, and of some of the inspection findings as well.

Many of the cross-cutting issues, in my mind, and the insights that we glean from them are going to be key to the roll-up thinking that we do in identifying declining performance.

CHAIRMAN MESERVE:  Mr. Dyer, you had mentioned the skepticism of the staff, and I think one of the comments has been that the staff's skepticism resonates with the GAO evaluation.

MR. DYER:  Yes, sir.

CHAIRMAN MESERVE:  What is your sense of the current attitude among the inspection staff toward this program? Has acceptance grown as people have learned more about it? Are people more comfortable with it? I realize you don't have a survey, but what is your sense of the current impressions of this activity, from the inspector's point of view?

MR. DYER:  I have talked to a lot of the Region III staff about that very subject, and I believe acceptance is improving. If you go back to the timing of the GAO report, which was about March of last year, the reasons it's improving, I think, are, one, the training; two, the experience with the pilot program; and, three, improved industry performance.

Particularly in Region III, at the time of the survey about a year ago, when I was just getting to Region III, we had between a third and a half of our plants on the problem plant list, with multiple 0350 oversight panels.

So if you're talking about improved performance over the past ten years, particularly in Region III, I don't think that was a saleable feature a year ago.

Since that time, I think industry performance in Region III has improved, and that helps alleviate some of the staff's concerns, although they're still there, because we're not that far away from some poor performance. The training, and the experience that this program can identify issues, help as well.

CHAIRMAN MESERVE:  There is a sort of hook in your answer, in that there has been improved performance, but it has been, for the most part, under the former oversight program. I think the question has been that the staff, the inspectors, have been comfortable with that program, it's what they know, and they're expressing some concern as to whether they would have been able to achieve the same improvement with the new program.

And I was trying to get at whether you think that concern of the inspectors has been alleviated. You mentioned the training has been helping.

MR. DYER:  It has been reduced. It hasn't gone away. And I would be concerned if they weren't skeptical, and weren't concerned about going to a new program and whether they would be able to identify problems.

CHAIRMAN MESERVE:  Commissioner Dicus.

COMMISSIONER DICUS:  Thank you, Mr. Chairman. First of all, on slide three, on these milestones, you suggest that in June of 2001 you will report on the initial implementation of the revised reactor oversight program. I'm assuming that is a report. Are you going to come back with a recommendation that we continue it, or might you come back with a recommendation to not continue it, or am I asking you to look into a crystal ball?

MR. COLLINS:  I think the obligation we have to the Commission, at the Commission's direction, is to target full implementation, as opposed to pilot or initial implementation, at that date.

The staff believes that a report back to the Commission, of an appropriate type, would be useful to address these areas, as well as to be sure there is a clear understanding of the areas that Bill mentioned, such as the periodic reviews and the Commission's desired role in the annual meeting.

It can take whatever form the Commission desires.

COMMISSIONER DICUS:  Thank you. Slide seven. My question goes somewhat to what we've all dealt with on the performance indicators: are they where they should be, are they not where they should be, and I'll have a couple more questions about that.

But particularly the issue of reevaluating the thresholds as we go down the road. They're based upon some historical information and our own sense of what we have learned from past performance.

Do you qualify them beyond that, or is this in the going-forward part of our program?

MR. DEAN:  I'm not sure I fully understand. Are you asking whether we continue to assess and analyze the performance indicators on an ongoing basis for efficacy?

COMMISSIONER DICUS:  Right. And do you qualify it beyond that? Do you have some other concern that you haven't expressed to us?

MR. MADISON:  We're always trying to refine the performance indicator program. We're always looking for a better performance indicator and better guidance, refining the guidance and collecting feedback so that we can get better guidance out. As I mentioned earlier, we're looking at performance indicators to replace the barrier performance indicators.

We're also looking for performance indicators that don't raise unintended-consequence concerns regarding scrams -- counting manual scrams, for instance.

So we're trying to refine it. It's not that we have a major concern with the existing performance indicators as they are.

COMMISSIONER DICUS:  Okay.

MR. COLLINS:  Commissioner, I think you asked a very good question. The challenge to the industry and to the NRC for quite a period of time has been what to focus on as far as indicators of industry performance, and whether they are leading or lagging.

There was a time, I believe, when we were using indicators developed by NRC; the old office of AEOD had that role. At the same time, we were using SALP, senior management meeting, and inspection report messages. There were INPO indicators, and there were WANO indicators.

Part of the goal with this program is to stabilize our process so that it's predictable. It can be improved, but it still needs to be predictable. So there are two messages here. One is that, using the cornerstones -- and the indicators are tied to the cornerstones themselves -- we'll continue to search for refined indicators that are meaningful and agreed upon, and then we'll implement them in a way that doesn't create instability in the process.

That's the real message here. The other is that we want to collect information -- the Office of Research is working with the industry to do this -- to understand better, perhaps even at a component level, how the plants are performing, so that we can look at overall trends below the level of the oversight process and, therefore, be more far-reaching as far as indicators are concerned.

So that process will continue. We won't do it in isolation. We're going to have to do it with stakeholder and industry input. So there is a balance here between the Chairman's concern over the work that's in front of us, and completing that work and implementing it in a way that doesn't create discontinuities or instabilities in the process.

The inspectors are the ones who are primarily affected by this. They have to know what to focus on; we have to train them so that they feel confident they can discharge their duties in a meaningful way; and they have to have a forum by which they can articulate their concerns and be heard.

I think if we stick to those mandates, then we'll be fine.

MR. DEAN:  Sam, if I could add one other thing, Commissioner Dicus: with respect to changing thresholds, we don't intend to put in place a process by which we evaluate all the thresholds on an annual basis and keep changing them in response to industry performance.

We want to establish a stable set of performance indicators, and if all of industry gets below those thresholds, that's great. That may spur us to look at other areas where a suitable performance indicator could be developed, but we don't intend to be in a situation where we continue to ratchet industry toward further improvement by changing the thresholds.

COMMISSIONER DICUS:  Good response, which leads me into my next question, which happens to be on slide nine. First of all, I think we all recognize this is a work in progress. We've said that repeatedly, when we met with the ACRS last week, et cetera. So we recognize that.

I also want to congratulate the staff on what I think was a very succinct and carefully done briefing, done very, very well. I think you rehearsed it. It was very good.

I have a couple of questions on slide nine, which is good. I think it's a good idea to ask your own questions.

I think this would go to Mr. Johnson. You indicated that consideration of external events has not been reflected in the SDPs, but needs to be incorporated. Did I hear that correctly?

MR. JOHNSON:  That's correct.

COMMISSIONER DICUS:  Okay. And the other thing -- you know, obviously, I'm from the south, and I not only speak a little slower, but I listen a little slower. You indicated that there were some non-reactor safety SDPs. I got emergency preparedness, but I didn't get the other two.

MR. JOHNSON:  Containment -- I'm sorry. Emergency preparedness, safeguards, and radiological protection.

COMMISSIONER DICUS:  Okay. I thought it was the rad health one that I wanted to take a look at. And you're working on those. Do you think you might incorporate those, or do you think this is one of the things you're going to have to deal with?

MR. JOHNSON:  We actually have done a good deal of work on those. We were able to get to a point where we could exercise them during the pilot program. We've learned lessons from them, and we're at a good spot on all of those SDPs. So they should be ready to go.

MR. DEAN:  They're all in place and have been tested; we've run certain scenarios through them, and we've refined them based on lessons learned. So we feel we're in pretty good shape on those SDPs going forward.

COMMISSIONER DICUS:  All right. Cross-cutting issues. Maybe slide ten and slide 12, but it's not so much the slides; I'd just like to ask the question, because this came up as a question I asked at the ACRS briefing.

The importance of the cross-cutting issues: I know even our internal staff has expressed some concern about whether we have really gotten our hands around the cross-cutting issues. In response to one of the Chairman's questions, you indicated one of the problems we may have, and Dr. Lipoti will bring this up, with lagging versus leading performance indicators, and how important cross-cutting issues may be to getting a handle on these things.

Do you want to give me a little more feedback? Have we identified what we think are all the cross-cutting issues? And I may ask Dr. Lipoti to address that question as well, so I'll give her a heads-up for that.

But do you feel that we're there, or is that going to be a part of this learning process?

MR. JOHNSON:  I'll talk to that just a little bit. One of the things that we did at the external lessons learned workshop was talk about that very issue: whether or not we've captured what people believe are, in general, the cross-cutting issues. There was a good sense among all of the stakeholders, internal and external, that the cross-cutting issues we were talking about -- problem identification and resolution, human performance, and safety conscious work environment -- do, in fact, capture what most folks are concerned with with respect to cross-cutting issues.

I tried to illustrate in my prepared remarks that there really were two camps, and it wasn't just an NRC camp and an external stakeholder camp; there were folks mixed among those camps in terms of their perspective on how we ought to treat cross-cutting issues. It really did come down to this: in some people's views, you really do need a good early indication of cross-cutting issues as an indicator of where performance has gone bad.

From the other perspective, it's hard to know. One of the Region I inspectors said we predicted 21 of the last watch list plants. It's tough to know whether the issue you find as a cross-cutting issue is, in fact, going to end up being, for that plant, a predictor of performance, and we tend to be conservative.

So we want to make sure that we've got the right cross-cutting issues, that we've got thresholds set appropriately, and that we have an avenue to handle those cross-cutting issues when they arise, so that we can take the appropriate response.

MR. DEAN:  And to add on to what Mike says, I think the baseline inspection program reflects, in how much we inspect the various cross-cutting issues, both their importance and our degree of comfort with them.

There's a substantial amount of effort looking at problem identification and resolution activities, which we believe to be a very important process.

One of the things this program clearly does is shift a distinct burden from the NRC to the licensee in terms of pursuing and resolving issues. Whether we have the cross-cutting issues captured right, and whether we are looking at them through the inspection program in enough depth and frequency, is a good question, and I tell a lot of our regional staff, when I'm asked, that in some respects time will tell.

We think we've established a good framework. We think we've got it captured the right way, but it may take some time before we see a substantial cross-cutting issue result in PIs crossing thresholds and in risk significant inspection findings.

So this is going to be one of the things we're going to have to keep assessing and evaluating on an ongoing basis.

COMMISSIONER DICUS:  Okay. And the local skepticism -- we all understand we have that, in various areas -- but do you have a comfort level, you and your Regional Administrator counterparts, together with Sam, that we have the program in place to address this?

MR. DYER:  Yes, ma'am.

COMMISSIONER DICUS:  Okay. And then one --

MR. COLLINS:  If I can comment on that, Commissioner Dicus.

COMMISSIONER DICUS:  Okay.

MR. COLLINS:  I think our roles are compatible, but perhaps separable. The Program Office's role is to define the program and to provide the resources, the good will, and the access for the regions to go forward and implement the program.

This is where you get into the stability issue and the change issue. We've been working with the regions to identify the appropriate budget. We have benchmarked that and met with each region. We know, and we anticipate, what qualifications it will take to do the inspections.

We know we have an interim period where we have to supplement three out of the four regions, at least in the areas of fire protection and engineering, and we have contractors to do that. That's an identified issue, a transitional issue.

Jim's challenge is to take the tools that we provide and to communicate back to the program office and to the arena manager, Frank Miraglia, whether he is comfortable that the program achieves the goals of the agency; those goals are identified in the SECY paper.

We anticipate that there will be skepticism, and if I were an inspector, which I used to be, I would anticipate that that's true. These issues are interrelated: being able to communicate findings, being able to acknowledge subjective inspector instincts and write them down in an inspection report, the ability to follow your nose, if you will, as an inspector, and the ability to communicate on cross-cutting issues are all tied to the overall goal of ensuring that this program is predictable in identifying declining trends.

I think we can take those in series with the changes we have already made and provide a viable process. It needs to be proofed, and then we need to get the feedback from Jim's staff and his counterparts.

COMMISSIONER DICUS:  That will be very valuable. I have taken up more than my fair share of the time, but one more question, Mr. Chairman.

CHAIRMAN MESERVE:  Please.

COMMISSIONER DICUS:  Thank you. Is the industry ready?

MR. COLLINS:  I always hesitate to speak for the industry.

COMMISSIONER MERRIFIELD:  As you should.

COMMISSIONER DICUS:  In your opinion, is the industry ready?

MR. COLLINS:  The industry has met the milestones put forth by the tasking team in order for this process to go forward. That includes submitting the performance indicators. That includes providing the expertise and the resources necessary to support the program, both in defining it and in producing documents.

I think Mr. Beedle probably will be able to speak for NEI.

COMMISSIONER DICUS:  Yes. Mr. Beedle, you can expect the same question. You've got a heads-up.

DR. TRAVERS:  But in fairness to what we have been doing, we have been asking that question in just about every forum where we've had the opportunity to ask it, and the answer is yes.

COMMISSIONER DICUS:  Actually, I may ask the same question of Mr. Lochbaum, so he can get prepared for it as well. All right. And if we have areas of local skepticism in the industry, are we prepared to recognize that?

MR. COLLINS:  I think we have recognized that, and I think the answer is yes, there are areas, and those areas are fairly well identified. I think some of the performance indicators could have potential unintended consequences. Will an operator hesitate to scram a plant manually if he knows that's being counted as an initiating event? Will there be hesitation to embark on immediate corrective maintenance if you can wait 72 hours and call it preventive maintenance?

There are subtleties like that in the program that I think we have to evaluate as we go forward, receiving the industry input and being sure the program is getting us where we want to go without these unintended influences; training and education are the key to that.

DR. TRAVERS:  If I can just add to that for a moment. Another issue that we've heard from the industry, that's been on their minds, is a concern as to whether or not this program is going to result in the communication of the sorts of insights that they are used to getting from NRC -- from the resident inspector, from the region-based inspectors, from the branch chief, all the way up to the regional administrator.

Much of the dialogue we've had in many of our workshops has been focused on discussing that issue and why we believe those sorts of insights will, in fact, still be communicated in this new process.

COMMISSIONER DICUS:  Thank you, Mr. Chairman.

CHAIRMAN MESERVE:  Commissioner Diaz.

COMMISSIONER DIAZ:  Thank you, Mr. Chairman. I'd like to first make a short statement. I always try to say whether I'm making a statement or asking a question, although you're always wondering which one I'm making.

I really have no problem with the program as it's being proposed. I think it is as good as it can be today. That frames my impression of it.

Having said that, I do have a small concern: some of the last parts of the interactions and questions and answers with Commissioner Dicus boiled down to a single issue that I'm going to try to address, and that single issue, just not to keep you waiting, is the value of self-assessment and the corrective action program.

Now, let me start by putting some tricky questions in here. Mr. Dyer, if you are at a power plant as the senior resident inspector, dealing day to day with what is happening in the power plant, which of the three cross-cutting issues comes into your life every day? Is it human performance, is it the safety conscious work environment, or is it the corrective action program?

MR. DYER:  Actually, all three would probably come in every day: human performance in the day-to-day conduct of operations; safety conscious work environment in the nature of the kinds of problems being raised in day-to-day interaction with the staff -- I mean, the licensee staff; and certainly the corrective action program, just in the day-to-day review of the condition reports and the things that are identified, the interface with what's being fixed.

COMMISSIONER DIAZ:  So you deal with them on an equal basis as far as priorities.

MR. DYER:  As far as priorities go, I think the corrective action program is the most important of the cross-cutting issues.

COMMISSIONER DIAZ:  Okay. All right. I tend to agree with you. This is a statement now. When this program was started, it was started fundamentally as a data gathering and processing program that was essentially going to go horizontally across the issues that occur in a power plant every day, and the very first thing it was going to do was enhance our ability, through a transparent, open data processing program, to know what was happening at the power plant.

That was really the fundamental thing -- what is happening -- and not only for us, but for everybody. It was to be an open, horizontal program without multiple levels. And to me, that always ended up as a very robust corrective action program.

Whatever you do with any of these things, it has to end up in the corrective action program. The thing that we changed is that, with the corrective action program, as you well said, there was a transfer of responsibility -- somewhat of a transfer, not a complete transfer -- between us and the licensee, in which they would actually take more responsibility for what goes into the corrective action program and how issues get dispositioned.

But we always have the capability to go into it. Every day, the corrective action program is, to me, more than a cross-cutting issue. It's an every-day cross-cutting issue, and it's an every-instant cross-cutting issue. It's the one that is the beginning and the end of everything that we try to do in a power plant.

And if I have a concern, it's how we have packaged this oversight program. My concern was clearly highlighted last week when the ACRS, which is an expert body, said the oversight program consists of two things: the performance indicators and the baseline inspections.

I take objection to that, because I think that's the problem that you're hearing from inspectors and that I have been hearing. There is a robust underlying structure to the performance indicators and to the baseline program, and that very robust structure is a strong self-assessment and corrective action program, and that is at the very front.

To me, this process has three components. The first and foremost is data processing, ending in a very strong corrective action program, followed by the performance indicators and the baseline inspections; and the corrective action program should cross-cut into the performance indicators and vice versa. They should all feed back.

I think we are really not expressing the importance of the first phase and the last phase of the program, which is that a corrective action program underlies everything that we do.

And by the way, you know, I hate to admit that INPO can make a shorter statement than I can, but I'm going to read you from INPO today -- this is probably a first. INPO today reviews self-assessment and corrective action, and let me quote, Mr. Chairman.

It says, "Self-assessment and corrective action programs are vehicles for identifying and successfully implementing change." I agree. As such, these programs are important contributors to safe and reliable plant operation. They are also an essential element -- and I fully agree with this -- in the revised reactor oversight process being put in place by the Nuclear Regulatory Commission.

My statement is that I think we're doing the right things. I think we're in the right place. But I believe we cannot put the emphasis only on the fact that we have very new models with performance indicators and baseline inspections; there is also a very powerful, open self-assessment and corrective action program that will make our oversight better.

In fact, I will go out on a limb and say that if that's all we do, if that's the only thing that we do, we'll be better off than we were before.

Therefore, I'd like to get some comments back on the importance, for our inspectors and the industry, of valuing the new self-assessment and corrective action program as a major cross-cutting issue that feeds into the performance indicators; and my hope is that baseline inspections will go back and reinforce what we're doing. That was a statement, Mr. Chairman.

MR. DEAN:  I have a response, Commissioner Diaz. The importance of having a strong, robust self-assessment and corrective action program really is an underlying concept of this revised reactor oversight process, and we have been, I think, fairly aggressive in publicizing that with our stakeholders and telling licensees that this is a key element. We called it a responsibility transfer, but it has a direct implication for their capability to have an appropriate self-assessment and corrective action program, and that's recognized.

Mike talked about the whole issue surrounding cross-cutting issues and the fact that there are two camps. Embedded in our process is the recognition that if you do not have a robust self-assessment and corrective action program, then we would expect to see, over time, PIs crossing thresholds. We would expect to see risk-significant inspection findings emanating from our baseline inspection program, and those are the types of things we have been promoting on an ongoing basis.

There is not total buy-in on that concept yet, and that's why we had a discussion earlier about there still being some skepticism about that as an integrated part of this oversight process.

COMMISSIONER DIAZ:  But wouldn't a well analyzed corrective action program be a precursor indicator of declining performance?

MR. DEAN:  I think you could certainly look at the corrective action program as a font of potential leading information, and that's, I think, what we have embedded in the program: the whole concept of having the risk-informed thresholds that we have, which would allow us to provide a greater regulatory response as licensees cross performance thresholds.

I think there is a recognition that there is a band of performance, what we call the licensee response band, in which a licensee is responsible for its own issues, and it's only when we start seeing things that cross thresholds that we have to engage at a level beyond the baseline inspection program. That is a philosophy designed into the oversight process.
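[For illustration: the threshold-and-band logic Mr. Dean describes can be sketched in a few lines of Python. The indicator and the threshold numbers below are invented for the example, not actual NRC values; higher readings represent worse performance.]

    # Hypothetical sketch of the color bands described above.
    # Threshold values are invented for illustration.
    THRESHOLDS = [           # (upper bound of band, color)
        (3.0,  "green"),     # licensee response band
        (6.0,  "white"),     # increased regulatory response
        (25.0, "yellow"),    # still greater regulatory response
    ]

    def band(reading):
        """Map an indicator reading to a color band; higher is worse."""
        for upper, color in THRESHOLDS:
            if reading < upper:
                return color
        return "red"         # most significant regulatory response

    # A reading of 2.5 stays green, so only the baseline inspection
    # program applies; 7.0 crosses into yellow and draws engagement
    # beyond the baseline program.
    assert band(2.5) == "green"
    assert band(7.0) == "yellow"

[The licensee response band is the first branch: while an indicator stays green, resolution belongs to the licensee's own corrective action program.]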

MR. MADISON:  I'd like to add to that -- I'm sorry, Sam. We have also recognized this in the baseline inspection program. Ten to 15 percent of all inspection activity is focused on the corrective action program -- problem identification and resolution.

We have a major annual inspection at every site focused on the problem identification and resolution program. We've begun to establish an internal working group, and we'll eventually have both internal and external working groups, asking what different areas we need to look at in the problem identification and resolution program, what the issues are, and what standards and criteria we need to judge good programs against. INPO has begun that work in developing their principles document, which we're going to incorporate into that effort.

CHAIRMAN MESERVE:  Commissioner McGaffigan.

COMMISSIONER McGAFFIGAN:  Let me start by complimenting the staff. I think, given all the boundary conditions under which you work, you've done a tremendous job to this point. I wasn't around, and I wasn't paying attention to the NRC, in the early or mid '80s when SALP was put into place, but I can't imagine that when we put that process into place, we went through anything like the process we've gone through this time, interacting with stakeholders, and I am firmly of the conclusion that this is better than SALP -- which isn't much of a standard, as David Lochbaum would say.

Let me ask a process question to start. Given all of the issues that the Chairman talked about that are going to have to continue to be worked -- and I'm going to ask some questions about them -- is there an advantage in continuing the FACA committee, as well as the other activities you have underway? On one of the slides, and I won't try to find it, there's a lot of talk about ongoing interactions with the industry, but it seems to me that there was some real advantage in the FACA process during the pilot, and in some sense we're now piloting with 103 plants, or 101 until Cook catches up.

And so what is your answer on that?

MR. DEAN:  The pilot program evaluation panel, the FACA committee, was, I think, very valuable in that it brought together a wide variety of stakeholders -- probably the pertinent spectrum of stakeholders, as a matter of fact -- to look at the oversight process and come to some consensus about what all the information that came out of the pilot program told us about readiness to go forward. I know that Frank Gillespie will talk to you later today about the efforts of the PPEP, the pilot program evaluation panel.

We've discussed, as we go into initial implementation, the potential for a similar endeavor. Our current thinking is that toward the latter half of the first year of initial implementation, after we've had a couple of quarters of getting information out and can develop at least some trends or patterns about implementing this at all 103 plants, it would probably be a good time to revisit having such a body come together and provide an independent assessment, if you will, of what the results are telling us about the efficacy of the oversight process.

COMMISSIONER McGAFFIGAN:  My reaction, partly, is that I would think it might be better to have them involved the entire way through the process, as you're learning, rather than just pulling them back at the end to get another report card. PPEP would become IIEP -- the initial implementation evaluation panel -- but we'd have to think of a better acronym.

But it strikes me, just off the top of my head, that given all these issues that are going to be worked, and you have various timeframes to work them -- some in April, some in May, some in June, some in July, and they're going to be worked all the way through -- they won't be working off of data, but they can be working off of the various issues.

So that's just a reaction.

The significance determination process: when we started this effort, I thought there was only one, so forgive me, but we now have multiple significance determination processes. One issue is going to be consistency. I remember the conversation we had about there being 100 things, 100 inspection findings a year -- you guys may want to back out of this, but I think it's on the record -- about 100 a year that would truly enter the SDP process, and about ten big ones that would pop out and move an indicator from green to white or yellow or red.

I don't know what the numbers are today, but if you have multiple processes, you have an issue of consistency across the processes, and how is that being handled?

MR. MADISON:  That was one of the major questions and concerns raised at the lessons learned meeting -- that, and whether the inputs were equal going into the significance determination process. We did an awful lot of looking at each of these significance determination processes, both the reactor and the non-reactor ones, and comparing findings and levels of significance, to assure ourselves that a red finding in safeguards had the same weight as a red finding in a reactor SDP.

We do feel that there is consistency across those.

MR. DEAN:  In fact, I would add to Alan's point that, in the effort to achieve consistency in our process, the significance determination processes are perhaps the greatest advance we have made in developing the revised reactor oversight process, because what the SDP does is force inspectors to take their findings and lay down the assumptions they are making with respect to each finding, so that the licensee, the inspector's management, and his or her peers can look at those assumptions and judge the risk characterization of that inspection finding with those assumptions in front of them.

So it lays out the playing field, if you will, for discussing the characterization of the issue.

COMMISSIONER McGAFFIGAN:  Now, the way the SDP works, there are a lot of findings out there, and many of them are going to be green findings. But what a resident needs is a mechanism for quickly screening out the green from the potential hundred, or the ten.

My understanding is that you're trying to have screening tools for the residents. They will probably test those tools a lot initially, so that's the learning curve you talked about.

Once you enter the hundred space -- I'm using the number you gave, and I'm begging you to correct me -- it really becomes the senior reactor analyst, or somebody who is more into heavy lifting on PRAs, who is going to work that with the licensee.

Is that correct?

MR. MADISON:  Initially, there's going to be an awful lot of involvement from the SRAs, we think, but we're also trying to focus attention on training and educating the resident and senior resident inspectors so that they can begin that conversation early.

We expect that during phase two it will be the inspector talking to the licensee to gather information, with further refinement at the latter part of phase two and into the phase three review, where, yes, the senior risk analyst will probably be involved.

COMMISSIONER McGAFFIGAN:  But if you're going to get timeliness -- and Jill, Dr. Lipoti, will talk later about this being a long negotiation; you have to improve timeliness -- isn't it the case that the quicker a finding gets kicked out of the resident's world into the detailed process of deciding whether it is a truly significant finding that's going to move an indicator, the better?

So how do you make that work? If I'm a resident and I'm trying to screen something, and I think it may be one of the hundred, I'm going to pass it on. Do you want him to err in the direction of documenting it and getting it quickly onto the more detailed process, which he's probably not going to do, or do you want him to err on the side of not passing things on? Which way is the error going to run in the initial implementation?

MR. JOHNSON:  Let me try to talk to that. Initially, the resident will find an issue. They'll apply that very early screening that you talked about; it takes very little time to do that. Suppose they gauge the issue as one of the ones that passes through -- I'm not sure the hundred is the right number, but one of the ones that passes through.

There is then a phase two part of the SDP that I didn't talk about, which is the actual plant-specific worksheet. As Alan indicated, we really want the resident to be able to apply that phase two screening, and we think they can work on that during the regular interval of the inspection. In fact, we really want to get to a point where, at the end of the inspection period, in the report that gets issued, we can talk about a finding that is potentially risk significant, so we can right away begin to engage the licensee in terms of their understanding of the significance of the issue.

So the process that we're setting up really does try to drive toward a timely presentation of our concerns that are potentially risk significant -- having gone through that initial screening, then through the second screening, with the SRA looking at it and us in headquarters taking a look at it -- and then be on the docket so that there can be a response.
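[For illustration: the phased screening Mr. Johnson walks through can be summarized as a short decision flow. The Python sketch below uses invented field names and an invented risk cutoff; the actual SDP rests on plant-specific worksheets and analyst judgment, not code.]

    # Hypothetical sketch of the three-phase SDP screening flow described above.
    from dataclasses import dataclass

    @dataclass
    class Finding:
        description: str
        affects_cornerstone: bool    # simplified stand-in for the phase 1 screen
        phase2_risk_estimate: float  # simplified stand-in for the worksheet result

    def screen(finding, cutoff=1e-6):
        # Phase 1: the resident's quick screen; most findings exit here as
        # green and go to the licensee's corrective action program.
        if not finding.affects_cornerstone:
            return "green: document; licensee corrective action program"
        # Phase 2: the plant-specific worksheet, applied by the inspector
        # during the inspection period, with SRA help as needed.
        if finding.phase2_risk_estimate < cutoff:
            return "green: document, with the assumptions stated in the report"
        # Phase 3: detailed review by the senior reactor analyst and
        # headquarters, with the finding on the docket for licensee response.
        return "potentially greater than green: phase 3 risk analysis"

    print(screen(Finding("degraded pump train", True, 5e-6)))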

COMMISSIONER McGAFFIGAN:  So that's part of it -- I'll ask Jim Dyer. Listening to this, I can understand some of the trepidation in the resident corps, because there's a lot we're asking them to do that they haven't had to do in the past, isn't there? I understand the initial screening, but during the inspection, they're going to be simultaneously talking to the senior reactor analyst and filling out some of the flesh on whether this is a significant risk.

MR. DYER:  I think that will be true initially, Commissioner, but I think the true value of the SDP lies in the phase two worksheets, which are still being validated for plant-specific characteristics. Based on the training I had last week, I see those worksheets as the thing that will save our SRAs from being completely overrun with consulting for the resident inspectors on everything that comes up, and as a risk-informing training tool for the resident inspectors and the other inspectors.

With this phase two worksheet, I was very favorably impressed with the logic breakdown it took you through, which an inspector or a branch chief could follow, based on their site-specific knowledge, to come to a decision as to whether or not the finding or the event needed to pass on to phase three of the SDP.

MR. JOHNSON:  And if I could add on to that: the value of the SDP -- I've said this a number of times -- is in the process as much as it is in the result, and that's what Jim is talking about.

We could perhaps build a computer system where you could enter the finding and have it kick out a color at the back end, but the value is in the exercise the inspector goes through in applying that phase two worksheet to see what the significance of the issue is.

COMMISSIONER McGAFFIGAN:  Let me ask -- I have lots of questions, but I'll ask one more. Dr. Lipoti, in her prepared testimony, says something that I think isn't true and that you've already addressed: that the new program is prescriptive in preventing NRC inspectors from getting involved in non-safety-significant issues.

You've already said that -- I think there was some misimpression early on; there were some exit meetings that lasted ten seconds or whatever, because the residents didn't find anything.

That's all changed, right? You've made clear to the resident corps that -- I don't know exactly what a non-safety-significant issue is, but if it's a level four, which I think we defined as non-safety-significant, it goes into the CAP, the corrective action program, and gets dealt with by the licensee, but it's still something they should talk about.

Furthermore, they can even talk about, as you said earlier, cross-cutting issues, where they have something in the pit of their stomach telling them that there's something wrong.

So this is a problem that has been fixed, hasn't it?

MR. DEAN:  We've made adjustments to the process to try to address those concerns. You're correct, Commissioner: early in the process, there were some situations where residents felt they had to restrict their communication with the licensee. We were made aware of those early and got the expectation out there that we're not trying to do anything to truncate their communication.

What we're trying to do is make sure that our inspection reports capture issues at the appropriate threshold.

I do want to add one other thing, to try to help you with the numbers, or at least give you an update on the number of issues, based on the pilot program. These are approximations. We had about 100 or so green issues identified during the pilot, and about three issues that emerged as, or were originally characterized as, white issues.

So if you were to multiply that by about nine or so to reflect the full number of sites, and then by two to go from six months to a year, you're probably talking on the order of 50 to 75 issues that we might deal with on an annual basis that are more than a green issue.
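[For illustration: Mr. Dean's scale-up arithmetic, using his own round factors, works out as follows; the factors are approximations from the transcript, not precise counts.]

    # Pilot results: roughly 100 green and 3 white findings over six months
    # at nine sites.  Scaling the white findings by "about nine or so" (to
    # represent the full set of sites) and by 2 (six months to a year)
    # lands inside the 50-to-75-per-year range cited above.
    pilot_white = 3
    site_factor = 9   # Mr. Dean's rough fleet-wide multiplier
    annualize = 2     # six-month pilot scaled to a full year
    print(pilot_white * site_factor * annualize)   # 54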

COMMISSIONER McGAFFIGAN:  So that's more than you expected at one point. Based on the pilot, there could be 75 issues. That also helps explain why you're asking for more screening at the plants. If there were only 100 things in four regions that had to go through the screening process in some elaborate way, resulting in ten findings a year, that's not that big a burden: there would be 25 per region, about two a month.

But if you're talking these numbers, then it gets to be a more significant burden.

MR. DEAN:  One of the things we did, in trying to estimate resource planning for initial implementation to help the regions out, was to come up with an estimate that for each site we would have about one and a half white issues per year to deal with, and we used that as a planning assumption. It was based on the pilot program results.

COMMISSIONER McGAFFIGAN:  I've gone over my time, so I quit.

MR. COLLINS:  Commissioner, let me respond to one of your questions; I think Bill did it partially. On the issue of whether the program is prescriptive, whether it limits the inspectors: the answer is yes, and that is intended. The goal of this program was to be able to focus limited resources on risk and safety significant issues.

The value here is that the processes you mentioned tend to do that. That means we have to make the shift in focus and be able to let go of those issues that historically we did place some focus on, whether they be level five or level four violations, which unintentionally drove licensee resources because they were NRC findings.

Even though they weren't safety significant or risk significant, the licensee had to respond to them.

COMMISSIONER McGAFFIGAN:  But we still -- unfortunately, a level four violation was a violation of something.

MR. COLLINS:  Yes.

COMMISSIONER McGAFFIGAN:  We're not going to not find violations.

MR. COLLINS:  The difference is that the licensee's corrective action program will then determine the priority, and the licensee's resources will be focused according to the overall prioritization of that issue, along with the backlog. If there is a risk significant issue, we would expect the licensee's program to acknowledge that, based on the type of work that you mentioned with the SDP.

The priority of that issue would become greater, would be agreed upon, and its resolution would become the priority.

COMMISSIONER McGAFFIGAN:  Maybe it's the wording of non-safety-significant, but in the past, I think we have defined level fours in the enforcement policy as being non-safety or risk significant, yet they are level four violations and they are currently put into the corrective action program. So it may be semantic, but I would hope those things are still being found and handled through the corrective action program.

MR. COLLINS:  Yes, sir, I'm sure that is the case.

                                                                74

 1              CHAIRMAN MESERVE:  Commissioner Merrifield.

 2              COMMISSIONER MERRIFIELD:  Thank you, Mr. Chairman. 

 3    I want to add my comments about the staff, as well.  We've

 4    come a long way on this program.  There's obviously a

 5    further way to go, but I think certainly I want to add my

 6    appreciation and congratulations to the staff for, I think,

 7    a very large amount of work done in a relatively short

 8    amount of time.

 9              The first question I have, we did have the meeting

10    last week with ACRS and there were a couple issues that they

11    raised that I'd like to have you all make a brief comment

12    on.

13              The first one is that the performance indicator

14    thresholds may be so high that they serve as a disincentive

15    to improve plant performance.  That's the first issue.  I

16    think Dr. Lipoti has a similar concern.

17              The second one is that the significance

18    determination process is cumbersome and will bog down our

19    inspectors.  Mr. Lochbaum stated he believes the SDP, as it

20    currently stands, is unworkable and SECY 00-0049 has raised

21    similar concerns.  So I'm wondering if you have any comments

22    about either one of those issues.

23              MR. DEAN:  I will address the PI threshold

24    question first, Commissioner Merrifield.  Some of the

25    experiences that we've gotten from the pilot program, and

                                                                75

 1    Mr. Beedle might be able to reiterate or talk to this

 2    further, is that a number of licensees have looked at

 3    trending their performance within the green band.  In other

 4    words, they want to assure that they're not going to get

 5    their performance close to that threshold, where they would

 6    risk crossing the green-white line and drawing increased

 7    regulatory attention.

 8              And so I guess my own personal belief is that we

 9    have seen nothing in the pilot program and from our

10    discussions with industry that would give us an indication

11    that those thresholds serve as a disincentive.

12              Now, we talked earlier, Alan talked about the

13    effort that we took to try and adjust those performance

14    indicator thresholds based on the pilot program information,

15    as well as historical information, and balancing that

16    against the original thresholds that were established in

17    SECY 99-007, but we think that those actually serve as good

18    guide posts for what we believe to be appropriate licensee

19    performance, and we think that just having those thresholds

20    out there, and having public observation of how licensees

21    are doing against them, serves as a tremendous motivator

22    for licensees to perform and stay low within the green

23    band.

24              MR. COLLINS:  Commissioner, I think there is a

25    theme that runs through the ACRS presentation which is

                                                                76

 1    valid, but it's looking down the road a little bit perhaps,

 2    and that would be that if the performance indicators could

 3    be more risk-informed and plant-specific, then they would

 4    acknowledge the differences in plants, and, at that point in

 5    time, the plants would acknowledge, based on real

 6    indicators, where the value is in reducing those risks.

 7              We acknowledge that, and that is perhaps a future

 8    vision; we're working with the Office of Research, in a

 9    pilot way, to determine if that's feasible.

10              In the interim period, we are where we are and

11    we're just not at that point yet.  Although the process is

12    risk-informed, the specific indicators are not.

13              DR. TRAVERS:  The only thing I would add to the

14    question of disincentive or not is that we've been fairly

15    strident about not equating green with good.  In that

16    respect, we recognize green as an area within which

17    additional NRC attention is probably not warranted, based on

18    the risk significance.  It doesn't mean that we are trying

19    to transmit that this is a good thing.

20              When I talk to licensees, various other factors,

21    business factors really drive them to make gains in

22    performance that transcend the green zone even.  They can

23    trend based on issues being identified.

24              So I don't necessarily agree with what ACRS said

25    relative to not being able to trend.

                                                                77

 1              But we're very careful about characterizing green

 2    as the acceptable zone, not entailing any further NRC

 3    interaction, as opposed to anything else.

 4              COMMISSIONER MERRIFIELD:  In that meeting, I

 5    indicated my belief, as well, that that particular theory --

 6    that the licensee would track the performance indicators --

 7    did not correlate at all with the discussions I had with

 8    the licensees.

 9              I also asked about the significance determination

10    process.  Any comments about that one?

11              MR. DYER:  I think, Commissioner, I would just

12    re-echo what I said two weeks ago.  I might have had that

13    same understanding, but since then I went to the training

14    and saw in particular how the phase two SDP worksheets are

15    being implemented, or are intended to be implemented once

16    we get them validated, and I'm optimistic now.

18              COMMISSIONER MERRIFIELD:  In her testimony, Dr.

19    Lipoti refers to our new oversight process as a voluntary

20    program.  Mr. Lochbaum indicates that there is a perception

21    of self-regulation.  Obviously, these are concerns we ought

22    to address.

23              For the record, could you discuss what our

24    response as an agency would be if a licensee withdrew from

25    the process by not providing performance data?

                                                                78

 1              MR. MADISON:  We have developed an inspection

 2    procedure that would provide guidance to the region and the

 3    inspector on what to do if, for various reasons, the

 4    performance indicator data is either not provided or we

 5    have determined that, because of inaccuracy problems, the

 6    PI has become invalid.  Depending upon the performance

 7    indicator, we would direct them either to go out and

 8    collect the data ourselves, if that's easy, or, if not, to

 9    go back to the original documents -- the diagrams that

10    describe the important attributes and important areas to

11    measure in each of the cornerstones -- and focus the

12    inspection on those activities that the performance

13    indicator was trying to provide information on.

14              So we would replace the performance indicator with

15    inspection activities.
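
              [A minimal sketch, in Python, of the substitution
    logic Mr. Madison describes: use the performance indicator
    where it is reported and valid, otherwise collect the data
    directly or inspect in its place.  The function and field
    names are illustrative assumptions, not the wording of the
    inspection procedure itself.]

      # Fallback for a missing or invalidated performance indicator.
      def oversight_input(pi):
          if pi["reported"] and pi["valid"]:
              return f"use the reported value for {pi['name']}"
          if pi["easy_to_collect"]:
              return f"NRC collects the {pi['name']} data directly"
          # Otherwise, inspect what the indicator was meant to measure.
          return (f"inspect the cornerstone activities that "
                  f"{pi['name']} was intended to measure")

      print(oversight_input({"name": "safety system unavailability",
                             "reported": False, "valid": False,
                             "easy_to_collect": False}))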

16              COMMISSIONER MERRIFIELD:  As a follow-up to that,

17    are there any circumstances -- I mean, it would be hard for

18    me to believe that a utility would withdraw from this

19    program and stop providing that information, given the

20    extreme reaction from stakeholders and the financial

21    community.  Would you disagree with that characterization?

22              MR. DEAN:  No.

23              MR. COLLINS:  I think I agree with it.  However, I

24    believe that the program has to play out and demonstrate

25    value.  We can differ on how we define value.

                                                                79

 1              Some of that would depend on what overall

 2    inspection effort is necessary as the process is refined,

 3    and on whether the agency can legitimately substantiate that

 4    level of inspection effort given varied plant performance.

 5              Are we going to credit industry initiatives or

 6    industry audits, self-assessments, for those plants that

 7    perform in the higher areas, for example?  Do we stay

 8    coupled with our stakeholders so that the performance

 9    indicators have validity and, therefore, credibility?

10              Those areas, as Commissioner McGaffigan mentioned,

11    I think, are going to be ongoing activities, so that the

12    program stays at a level where people have confidence in it.

13              If that's true, I think the answer will be we will

14    not encounter that problem.

15              COMMISSIONER MERRIFIELD:  Okay.  Getting back to

16    the significance determination process, Dr. Lipoti made what

17    I believe is quite a damning statement, and she states that

18    the SDP turns regulating into negotiating.

19              Now, we haven't heard her testimony yet and she

20    may flesh that out, but at least the impression that we have

21    turned some of our regulatory responsibilities over to

22    licensees, would you like to respond to that?

23              MR. DEAN:  I'd like to try first.  I do take some

24    exception to that.  I think that the significance

25    determination process, as I mentioned earlier, does an

                                                                80

 1    outstanding job of laying out in front of both the licensee

 2    and the agency what are the issues that contribute to the

 3    potential risk significance of an identified inspection

 4    finding.

 5              I think what she refers to in terms of negotiation

 6    is exactly as Michael described earlier, that we want to get

 7    to a point where we describe that issue, its potential

 8    significance, and what surrounds our assessment of that

 9    significance, and then share that with the licensee, so they

10    can provide a response to us.

11              I hate to compare it to the enforcement process,

12    but there's a certain amount of due process --

13              COMMISSIONER MERRIFIELD:  I was going to try that

14    very same analogy.

15              DR. TRAVERS:  It's really just a tool and the best

16    way this tool works is with full information brought to bear

17    on how you process the information.

18              COMMISSIONER MERRIFIELD:  But ultimately, we have

19    the choice and we make the call.

20              DR. TRAVERS:  That's right.

21              MR. COLLINS:  I think the difference is, between

22    the old process and this process, that in the past, when we

23    engaged the licensees, the first order of business was

24    disagreement on the process, are we a SALP-1, are we a

25    SALP-2, why am I a problem plant, why am I not a problem

                                                                81

 1    plant, is this or is this not risk significant, why is it a

 2    level three if it doesn't have risk.

 3              The purpose of this process, as it's defined

 4    currently, is to take that off the table and engage on the

 5    issues and not on the process and once there is an agreement

 6    on what the issue is and what is risk and safety

 7    significant, then it becomes a matter of corrective action.

 8              COMMISSIONER MERRIFIELD:  My final question.  We

 9    are moving away from the SALP program, which was noted for,

10    among other things, its subjectivity and unpredictability. 

11    Those were two of the more significant charges against it.

12              In reading the paper SECY 00-0049, one of the things

13    that the staff intends to do is to deviate from the

14    specified action in the action matrix when it deems it to be

15    appropriate.

16              That strikes me as adding a level of subjectivity

17    and unpredictability, which is precisely what we're trying

18    to get away from.

19              I think that puts us on a slippery slope and I'm

20    wondering if you can outline your intentions about that.  I

21    was also wondering, if you feel strongly about that, how you

22    would react to the notion that the Commission should

23    approve any deviations from that matrix.

24              MR. DEAN:  If I can take a first shot at that. 

25    One of the lessons -- we actually had an experience in

                                                                82

 1    the pilot program that lends some credence to why there

 2    certainly needs to be, in the process, a consideration of

 3    situations where a deviation from the action matrix may be

 4    warranted -- and I'll describe the situation.

 5              At the Fitzpatrick plant, they had an issue with

 6    their high pressure coolant injection pump that resulted in

 7    the pump being declared inoperable, and in going back and

 8    trying to figure out how long the pump had been inoperable

 9    since the last time that they tested it.  That resulted in

10    a substantial unavailability time period that caused the

11    performance indicator for safety system unavailability to

12    go from a green to a white.

14              In inspecting this issue, our inspectors

15    determined that there were some performance issues

16    associated with the cause of that pump being inoperable for

17    that time period, and it ended up being characterized as a

18    white inspection finding.

19              So basically what you had was that, for the same

20    root cause, for the same reason, you had both a performance

21    indicator crossing a threshold and an inspection finding

22    that was characterized as white.  You had two issues

23    crossing thresholds for essentially the same underlying

24    cause.

                                                                83

 1              MR. COLLINS:  Double jeopardy issue.

 2              MR. DEAN:  Basically a double jeopardy issue.  And

 3    so if you were to follow the action matrix explicitly, it

 4    would tell you that you had two white issues in the same

 5    cornerstone, you had a degraded cornerstone, and that would

 6    result in a more substantial NRC interaction and regulatory

 7    response than was probably warranted by that singular issue.
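
              [A minimal sketch, in Python, of the double-counting
    problem Mr. Dean describes: counted literally, the single
    Fitzpatrick root cause produces two white inputs in one
    cornerstone and trips the action matrix's degraded cornerstone
    column.  The data structure and the merge-by-root-cause option
    are illustrative assumptions, not the staff's actual procedure.]

      from collections import defaultdict

      # Two white inputs -- a PI threshold crossing and an inspection
      # finding -- that share one underlying cause, as at Fitzpatrick.
      inputs = [
          {"cornerstone": "mitigating systems", "color": "white",
           "root_cause": "HPCI pump inoperability"},  # PI crossing
          {"cornerstone": "mitigating systems", "color": "white",
           "root_cause": "HPCI pump inoperability"},  # inspection finding
      ]

      def degraded_cornerstones(findings, merge_same_root_cause=False):
          # A cornerstone with two or more white inputs is "degraded";
          # optionally merge inputs that share a root cause first.
          whites = defaultdict(list)
          for f in findings:
              if f["color"] == "white":
                  whites[f["cornerstone"]].append(f["root_cause"])
          def count(causes):
              return len(set(causes)) if merge_same_root_cause else len(causes)
          return [c for c, causes in whites.items() if count(causes) >= 2]

      print(degraded_cornerstones(inputs))        # ['mitigating systems']
      print(degraded_cornerstones(inputs, True))  # [] -- one cause, no double jeopardy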

 8              COMMISSIONER MERRIFIELD:  That may be a fair

 9    example.  I'm just generally concerned about our tinkering

10    with that action matrix.  To the extent that there are

11    suggestions to do that, I think we need to work with our

12    stakeholders -- Mr. Lochbaum raised significant concerns

13    about this in his testimony -- and make sure that it is

14    scrutable, understandable and clear up front, so that

15    everyone is comfortable with that kind of a circumstance

16    and it does not lead us to a place where it becomes an ad

17    hoc, subjective determination.

18              DR. TRAVERS:  I think you're exactly right.  Of

19    course, the intention is to have in place a process --

20    Regional Administrator, Director of NRR -- that would set

21    into motion a review and a deliberate decision on what are

22    expected to be rare instances where we would deviate.

23              So in that sense, we fully agree that we need a

24    structured process and we need to explain it and it's really

25    our burden, we think, when we go outside that matrix.

                                                                84

 1              MR. COLLINS:  I think the intent of providing

 2    stability has actually created this need to have a kickout. 

 3    The stability is that the process should handle the vast

 4    majority of the cases, but it can't be so refined and so

 5    complex that it handles them all.

 6              So where there is an exception, there needs to be

 7    a defined process by which high levels are engaged to deal

 8    with that exception and, therefore, leave the process

 9    stable.  That's a little bit of a tradeoff here.

10              COMMISSIONER MERRIFIELD:  Thank you.

11              CHAIRMAN MESERVE:  I'd like to thank the staff for

12    a very helpful briefing.  We much appreciate obviously the

13    huge amount of effort that's gone into this and it's very

14    helpful.

15              We are going to be hearing from another panel. 

16    Let me suggest, however, that we take a very short recess

17    and return with the second panel.

18              [Recess.]

19              CHAIRMAN MESERVE:  Why don't we get underway. 

20    Before our second panel begins, let me introduce them.  Our

21    second panel is constituted of Mr. Ralph Beedle, who is the

22    Senior Vice President and Chief Nuclear Officer for the

23    Nuclear Energy Institute.  I believe Mr. David Garchow is

24    intending to be here, who is the Vice President of Technical

25    Services for Public Service Electric and Gas.  Mr. David

                                                                85

 1    Lochbaum, who is a Nuclear Safety Engineer and who is with

 2    the Union of Concerned Scientists.  Dr. Jill Lipoti, who has

 3    been mentioned several times here, is the Assistant Director

 4    of Radiation Protection Programs at New Jersey's Department

 5    of Environmental Protection.  And Mr. Frank Gillespie, who

 6    is a manager here at the NRC, but is appearing before us

 7    today in his capacity as Chairman of the pilot plant

 8    evaluation panel, about which Mr. McGaffigan asked some

 9    questions.

10              Why don't we get underway and just proceed along

11    the table.  Mr. Beedle, would you like to start?

12              MR. BEEDLE:  Thank you, Mr. Chairman and

13    Commissioners.  Before I start, I would like to go back to

14    July of 1998 and a quote by a Commissioner by the name of

15    Mr. Nils Diaz, when he said that the need to change the

16    regulatory process is not an indictment of the past, but is

17    a requirement of the future.

18              That kind of went through my mind as you quizzed

19    the staff on the development of this new oversight process

20    with a lot of questions, and I thought they were good,

21    penetrating questions, good food for thought.  I'm convinced

22    that the staff has given a lot of thought to many of the

23    questions that you asked, but I think your questions will

24    spur them on in a number of areas.

25              But the process that we're looking at is one that,

                                                                86

 1    while it's not perfect, just seems to me a whole lot

 2    better than what we had in the past with the SALP and the

 3    watch list process.  So I think it gives the staff, the

 4    licensees and the public a much better view of plant

 5    performance from the safety point of view and focuses the

 6    resources of the agency and the licensees on those things

 7    that are significant.

 8              So with that, if we could have the first slide,

 9    lessons learned from this pilot process that we've just

10    completed.  As I indicated, it's not perfect, but I think

11    we've seen a significant improvement in our ability to

12    measure the -- there was quite a bit of discussion on the

13    self-assessment and corrective action program, one of the

14    three key cross-cutting issues, and that is one in which

15    INPO has taken a major step in providing some guidelines for

16    the industry, such that we could have some consistency.

17              And as the agency focuses more on the

18    self-assessment and corrective action programs, I think

19    consistency is an important part of that evolutionary

20    process and INPO has, I think, done a good job in developing

21    that.

22              While you're looking at a guideline that was

23    issued by INPO that is not particularly voluminous,

24    underpinning that guideline is a tremendous inspection

25    evaluation effort on the part of the staff at INPO, regular

                                                                87

 1    inspections that look at all facets of the corrective

 2    action program, and I might add they have been doing that

 3    for a number of years.

 4              This is not a new area of evaluation for the

 5    Institute of Nuclear Power Operations.

 6              The third area, greater management oversight

 7    needed on data collection, and I think that is one that has

 8    been evident as we went about the development of the

 9    performance indicators for the reactor oversight process.

10              We collect, in the industry, thousands of data

11    points every month.  We collect them for FERC, we collect

12    them for INPO, we collect them for WANO, we collect them for

13    EUCG and NUMARC and everybody else, and it becomes almost a

14    routine kind of process to go collect data.

15              When you're collecting data to support a

16    regulatory process, it places a great deal of significance

17    on it that wasn't there before, and we're finding that a

18    greater degree of fidelity is necessary to deal with that. So

19    I think all in all, that's good.  It will probably improve

20    our overall performance indicator data as a result of that.

21              And then the process is more risk-informed and

22    provides for improved safety focus.  As mentioned earlier by

23    the staff, one of the major outcomes of this process over

24    the ones that we've had in the past in trying to assess

25    performance has been the significance determination process.

                                                                88

 1              It asks the question, is the situation and the

 2    condition that you're looking at today one of safety

 3    significance and if it is, you appropriately place resources

 4    on it to deal with it.  If it isn't, although it may be a

 5    violation of regulation, if it has no safety significance,

 6    then perhaps it's one that's deserving of correction

 7    ultimately, but not with the same level of intensity that

 8    you would place on one that had safety significance.

 9              Another way of saying that is that given two

10    violations, which one would you put the most significance

11    on, and the answer is the one that's safety significant.  So

12    I think that's really where our benefit is.

13              The fact that there is a wide range of safety

14    determination or significance determination processes I

15    don't think is particularly disturbing and in many respects,

16    to be expected, because of the different significance of

17    some of our programs.  Fire protection, reactor systems,

18    security programs, they all have a different nexus to

19    safety.  We need to determine what that is and I think

20    that's been well worked out, and we'll continue to refine

21    the significance determination process.

22              Next slide, please.

23              Now, industry concerns.  Before I start this

24    series of concerns, let me preface it by saying the industry

25    is in full support of this process and is ready to go

                                                                89

 1    forward with the next phase of the program.  So we'll start

 2    that out.

 3              Now, does that mean that everything is perfect? 

 4    No.  And so we'll talk a little bit about the performance

 5    indicators.

 6              To digress just a bit, Commissioner Merrifield

 7    asked the question about these performance thresholds and

 8    performance indicators and how they work.  I have never

 9    envisioned that the PIs for this oversight process

10    represented goals for the industry.  However, I would tell

11    you that if you set a green threshold at four, the industry

12    is going to try and stay on the green side of four.  If you

13    set one at two, I assure you that they will try and get

14    into the two range.

15              So to that extent, they do represent goals that

16    could, and ultimately will, cause industry performance to

17    change.  Let me just give you an example.  In the area of

18    emergency preparedness, we had a goal that talked to the

19    number of people that were trained in our emergency response

20    teams and had you taken a snapshot a year ago, I'm sure that

21    you would not find that they're all green.

22              But having defined a threshold at which green and

23    white would occur, there isn't one plant that wants to be

24    white on that particular area, so they're going to go do

25    more training in order to make sure that they satisfy

                                                                90

 1    what is perceived to be a goal on their part.

 2              We also talked about the gradation within the

 3    green area, where the plants have established internal

 4    thresholds to keep their performance well above the break

 5    point between the green and white, because they don't want

 6    to be white.

 7              There is a clear perception on the part of the

 8    industry that white invites more attention.  They don't need

 9    more attention, and so they're going to try and keep it

10    green.

11              I think the public is going to view this the same

12    way.  Regardless of how much margin to safety you have in

13    the white area, if the plant goes into the white, the public

14    is going to react to it.

15              So with regard to the PI thresholds, I think we

16    need to recognize that they do have the potential for

17    ratcheting -- and I'll use that term, since the staff used

18    it earlier -- for ratcheting the behavior of the plants,

19    and I think we need to look at that

20    behavior and ask whether or not it's the kind of behavior we

21    want.

22              Need for more stakeholder review and comment.  I

23    would encourage you to continue the engagement of

24    stakeholders, not only NEI, industry, but David Lochbaum has

25    been a major contributor to it, other stakeholders have been

                                                                91

 1    involved in this thing who bring a different perspective,

 2    and I think it's been healthy and is a major change in the

 3    way the agency has gone about this.

 4              I think the admonition from one of the

 5    Commissioners in the past has been that the earlier you

 6    engage the stakeholders, the better the product turns out to

 7    be, and I think that's borne out in this case, as well.

 8              Consistency with other regulatory requirements. 

 9    One of the problems that I think we're going to face is that

10    as we try and establish thresholds, we're going to cross

11    some of the boundaries with other programs that we have in

12    place, and let me just give you an example.

13              In our technical specifications, we have an

14    emergency AC power system allowed outage of about 3.8.  We

15    have a maintenance rule acceptable outage of 4.1, and then

16    we come in with a PI threshold of 2.5.  So the utility sits

17    here and looks and says, well, what's the right answer,

18    where do I have to be?  Well, I've got to be at 4.1 for the

19    maintenance rule, but at about 3.8 for the tech specs, and

20    at 2.5 for the PIs to satisfy that requirement.

21              They look at that and say, well, what is the

22    requirement, where does the rubber meet the road on this

23    thing, and I think we need to be careful as we set these PIs

24    that we don't set unnecessary restrictions on the operation

25    of the plant for which other programs have already evaluated

                                                                92

 1    and determined acceptable levels of performance.
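
              [A minimal sketch, in Python, of the overlapping
    limits Mr. Beedle describes, treating his 3.8, 4.1 and 2.5
    figures simply as unavailability limits on one scale.  The
    units and the sample plant value are illustrative assumptions;
    the point is that the most restrictive number, the PI
    green-white threshold, is the one that effectively binds.]

      # Three programs bound the same emergency AC power unavailability.
      LIMITS = {
          "tech spec allowed outage":    3.8,   # Mr. Beedle's figures
          "maintenance rule acceptable": 4.1,
          "PI green-white threshold":    2.5,
      }

      observed = 3.0  # hypothetical plant value: inside the tech spec and
                      # maintenance rule limits, but past the PI threshold
      for name, limit in LIMITS.items():
          status = "exceeded" if observed > limit else "met"
          print(f"{name}: limit {limit}, {status}")

      # The smallest limit is the one the utility actually has to meet.
      binding = min(LIMITS, key=LIMITS.get)
      print(f"binding limit: {binding} ({LIMITS[binding]})")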

 2              The last one on this slide is sufficient data to

 3    justify changes.  Current definitions for some of these PIs

 4    truly represented historical performance, others did not. 

 5    So we need to make sure that as we go about the process of

 6    establishing these thresholds, we do it with full knowledge

 7    of the performance and the historical behavior of that

 8    particular performance indicator.

 9              Next slide, please.

10              Need for continued checks and balances to ensure

11    consistency across the industry.  I think that's been one of

12    the major objectives in this process, is to try and get a

13    consistent application of the process to all the utilities,

14    from region to region and within the regions.

15              I think the suggestion earlier to have a group of

16    stakeholders that continues to look at the process would be

17    very helpful, and I think those meetings brought together

18    the regions in a way that they hadn't been brought together

19    in the past.  So very helpful.

21              And then the NRC process for future changes in the

22    program.  The issue of continually increasing the thresholds

23    is one that is of concern to us.  I will not back away from

24    that.  We need to make sure that we are in a position where

25    consistency over time is something that's relatively assured

                                                                93

 1    in this process.

 2              Now, that doesn't mean that we are reluctant to

 3    see change or that we're reluctant to see new and viable

 4    performance indicators, but we've got to make sure that we

 5    don't take the performance indicators we have and try and

 6    use them as a means of driving the performance of the

 7    industry.  We need to continue to keep our focus on those

 8    things that are necessary for safety and assurance of public

 9    health and safety.

10              The next is overreaction to the white inputs, and

11    I think Commissioner Merrifield, in his discussion on the

12    action matrix, made a good point.  The NRC's reaction to

13    the action matrix, the licensee's reaction to the action

14    matrix, and the public's reaction to that action matrix all

15    play together, and all of this falls by the wayside if we

16    don't try and keep this in perspective and continually

17    preach the lesson: green is an acceptable band of

18    performance, it doesn't mean good or bad, it's an

19    acceptable band of performance, and white means that you've

20    departed from the margins slightly, that you deviate from

21    the industry's normal performance and that perhaps some

22    more investigation on the part of the NRC is warranted; it

23    does not mean that you're an unsafe plant.

25              We've got to continue to remind the public of

                                                                94

 1    that, as well as ourselves.

 2              Next slide, please.

 3              Industry implementation.  As I indicated, we are

 4    ready to implement the next phase of this program.  I think

 5    the characterization as an initial implementation -- with

 6    the clear notion that we're in a continuing process of

 7    change and revision and refinement, and that we will have

 8    continued feedback and dialogue to understand how those

 9    changes take place and to get feedback from the various

10    stakeholders in the process -- is all a good and positive

11    indication of the vibrancy of this program.

12              Enhancement is a disciplined process.  I think I

13    just talked about that.  I think that is an essential

14    element of it.  And then having senior NRC officials and

15    perhaps chief nuclear officers meet on a regular basis to

16    discuss the development of the program is, I think, almost

17    crucial to its success.

18              Overall benefits of the program.  I do think it's

19    far more objective than our SALP process.  I think it's

20    predictable, certainly provides safety focus through that

21    significance determination process.  I think it's

22    understandable.  I use my son as kind of a test on some of

23    these and I say go to the NRC web site and what do you

24    think, and he says, oh, wow, that's pretty neat, you know,

25    graphs and charts and things like that, and it's fairly

                                                                95

 1    easily navigable, although I think it could be refined a

 2    tad.  But it's pretty clear.  He seems to understand that

 3    well.

 4              Better use of industry and NRC resources.  I think

 5    that's really the bottom line.  We're not embarking on this

 6    program to try and make the nuclear plant operation cheaper,

 7    but we're trying to use our resources more effectively, and

 8    from the regulator's point of view, it has to be a focus on

 9    safety.  I think the point that many of our chief nuclear

10    officers involved in the pilot program would make is that

11    this program gives them the ability to focus on the things

12    that truly represent safety for their plant, without having

13    to worry about some of the less significant issues.

15              And that's not to say that regulations aren't

16    significant, but they don't all carry the same safety

17    significance, and that gives them the ability to

18    differentiate.

19              So yes, the industry is ready and hopefully we've

20    answered that question, Commissioner Dicus.

21              CHAIRMAN MESERVE:  Good.  Thank you very much, Mr.

22    Beedle.  Mr. Garchow.

23              MR. GARCHOW:  Good afternoon, Commissioners and

24    Chairman.  My background in this process is I had the

25    opportunity to get sent to attend the initial workshop and

                                                                96

 1    it's been sort of a gift that has kept giving all the way up

 2    to today.  So I was on the NEI senior management task force.

 3              Our utility was committed to this process through

 4    NEI and I was volunteered as a senior management rep to work

 5    with the NEI and the rest of the industry, spending several

 6    days a week and a month in Washington, DC over the last

 7    year, year and a half, and I happen to be on the pilot plant

 8    evaluation panel, which I thought had an end, but in

 9    deference to Commissioner McGaffigan, that may be a gift

10    that continues to keep giving.

11              So I am giving my comments as a representative of

12    the pilot plants.  I represent Salem and Hope Creek Station,

13    and we were one site that actually had three units in the

14    pilot process, reporting indicator data for Salem Units 1

15    and 2, as well as Hope Creek.

16              So we found the new process to be an improvement

17    over SALP, as has been mentioned, and it actually shows us

18    some objective safety performance measures, both by

19    inspections and PIs focusing on those issues that truly are

20    important to safety.

21              And in the deregulated environment, it is very

22    clear that our focus has to be on safety or we're not going

23    to get the reliability out of the plants, nor are we going

24    to be economically viable.  So the new oversight process is

25    exactly consistent with the strategy we need to

                                                                97

 1    competitively and safely run nuclear power plants.

 2              We will continue at PSEG to support this process

 3    and its implementation as we go forward.  We accept our

 4    responsibility to create the healthy work environment in our

 5    plant that does allow us to find and fix our own problems. 

 6    That is consistent with our safety theme of focusing on

 7    safety, and I agree with Commissioner Merrifield's view on

 8    the corrective action program that a low-threshold,

 9    high-volume program with screening to sort issues by risk

10    is important for us to be able to operate our plants

11    safely, and, as Mr. Beedle said, we have incentive to try

12    to stay in the green, as it were.

13              We also need to focus on communication for the

14    process, and as we challenge implementation, continued

15    feedback among NEI, the industry and the NRC is crucial.  The

16    interactions have been very positive to this point in time. 

17    We need to continue that positive vein as we go forward to

18    initial implementation.

19              Relative to the performance indicators, we see

20    those as positive.  They are objective, I think they are

21    tied to the cornerstones appropriately, and, especially at

22    the white-yellow and yellow-red thresholds, they are

23    risk-informed.

24              I guess the danger would be, as we continue the

25    trend of improving industry performance, that the

                                                                98

 1    discussion of whether to reset the green-white thresholds

 2    along the way will need to be looked at carefully, to make

 3    sure that it doesn't cause some unintended consequences as

 4    we keep tightening up the green band, relative to the

 5    issues Mr. Beedle talked about on the public perception of

 6    going into white.  I don't believe the public would

 7    understand that, five years from now, we've raised the

 8    threshold and what is white then might have been green

 9    today, and I think we would have some confusion.

11              So I think we need to be very careful on

12    adjustments to the green-white threshold, while continuing

13    to do the research on the white-yellow threshold to make

14    those as performance-based as we possibly can and make sure

15    that there is an appropriate band as licensee performance

16    goes from green to white and white to yellow.  So I would

17    add that caution.

18              PI accuracy.  We've had some learnings at Salem

19    and Hope Creek, and I believe that the industry, as well as

20    we, the senior management team at the facility, need to

21    continue our vigilance in defining the process to make sure

22    that the PI data is as accurate as we can reasonably provide.

23              I think we need to clarify and really come to

24    grips with what an honest mistake that has no impact at all

25    on crossing a green-white threshold means relative to 50.9.  So

                                                                99

 1    that is still an open issue, in my mind, and we need to work

 2    through the resolution of that, because I'm not sure it's in

 3    the NRC's, the public's or the utility's best interest to be

 4    having extended discussions over 12 minutes of

 5    unavailability of a safety system, and there are examples

 6    out there where there are differences of whether it was

 7    unavailable or available for ten minutes and in the whole

 8    scheme of things, I'm not sure that's an appropriate level

 9    of discussion that we would need to have for something

10    that's really in the green.

11              That being said, we need to strive to get our data

12    collection as accurate as we possibly can.

13              Next slide.

14              The inspections during the pilot at Salem and Hope

15    Creek we found to be very thorough and we saw the

16    significance determination process as very well done.  There

17    were differences of opinion throughout the industry on the

18    SDP.  We had an opportunity to exercise the fire protection

19    SDP and we found that to be very thorough, focused on risk,

20    and I would agree, I believe with Mr. Collins, who said that

21    during the interaction with the NRC, the focus was not on

22    the process, the focus was on the true defense-in-depth that

23    existed relative to the issues that were being discussed. 

24    So we moved it away from the process and on to the actual

25    issue, as Mr. Collins said.

                                                               100

 1              We believe that the new process does reduce

 2    unnecessary burden and focuses both the NRC and the

 3    utilities towards safety, while recognizing that we still

 4    have an obligation to comply with the regulations, and, as

 5    Mr. Beedle said, it does provide a good screen.  On any given

 6    day, we have our resources working on any number of things,

 7    and this gives a good filter to make sure that we're working on

 8    those items that have the most safety significance when they

 9    come out through our corrective action program.

10              So in summary, we are committed to continue to

11    address the areas for improvement.  We've heard from many

12    stakeholders, and you'll hear more this afternoon that there

13    are areas to improve as we go into initial implementation,

14    and we need to, as I stressed, keep the open dialogue between

15    the stakeholders and the NRC through initial implementation,

16    so that we don't get misperceptions or misconceptions about

17    some of the changes that are going to occur as we roll this

18    out, and so that we can stay fairly consistent.

19              So I appreciate the opportunity to address the

20    Commission, and I can answer any questions during this

21    proceeding.

22              CHAIRMAN MESERVE:  Thank you very much.  We'll

23    come back at the end with questions.  Mr. Lochbaum.

24              MR. LOCHBAUM:  Good afternoon.  Slide two, please. 

25    I wanted to start with the bottom line, but first I wanted

                                                               101

 1    to point out that the dotted line by no means implies any

 2    uncertainty or anything like that.  That's just a standard

 3    header that we use.

 4              We recommend that the Commission adopt the revised

 5    oversight process, as it is today, industry-wide, in April

 6    of 2000, or as soon thereafter as possible.  Having said that,

 7    we recognize that various stakeholders, including UCS, have

 8    concerns that should be resolved after implementation. 

 9    We're not ready to give up on our concerns so far, but we

10    don't think any of our concerns would prevent industry-wide

11    implementation.

12              Slide three, please.

13              The reasons we like the new process are longer

14    than this list, but I want to hit the top four, the first

15    being that performance in the new program is assessed in

16    roughly 27 areas instead of four broad categories.  In

17    addition, performance is assessed roughly 30 days after a

18    92-day period instead of 180 days after a 730-day period.

19    So it's more timely.

20              In addition, the NRC response to declining

21    performance is predefined instead of being ad hoc or

22    arbitrary and lastly, or at least on this list, we think a

23    big benefit is that the performance information on all

24    plants is made available on the internet instead of some

25    information being available for some of the plants.
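
              [A minimal worked comparison of the timeliness claim
    above, using only Mr. Lochbaum's figures: a 92-day assessment
    period plus roughly 30 days to report, versus a 730-day period
    plus 180 days under the old approach.]

      # Worst-case delay from early in an assessment period until the
      # published assessment reflects it, under each approach.
      old_delay = 730 + 180   # days: two-year period plus 180-day lag
      new_delay = 92 + 30     # days: quarterly period plus ~30-day lag

      print(f"old process: up to {old_delay} days "
            f"({old_delay / 365:.1f} years)")
      print(f"new process: up to {new_delay} days "
            f"({new_delay / 30:.0f} months)")
      print(f"roughly {old_delay / new_delay:.0f}x more timely")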

                                                               102

 1              Slide four, please.

 2              Some of the concerns we have about the new process

 3    kind of fall into five categories.  One is the perception of

 4    self-regulation by the industry.  The second is concerns

 5    with the significance determination process.  The third is

 6    what we call the missing link.  The fourth is the deviations

 7    from the action matrix.  The fifth is cross-cutting areas.

 8              I'd like to point out, again, that despite our

 9    concerns in these areas, we don't think any of these issues

10    would prevent industry-wide implementation.  I guess I can't

11    stress that enough.

12              Slide five, please.

13              Dealing with the issue --

14              COMMISSIONER MERRIFIELD:  Mr. Chairman?  What do

15    you mean by missing link?

16              MR. LOCHBAUM:  I'll get to that.  It's a slide

17    coming.

18              COMMISSIONER MERRIFIELD:  Okay.  Sorry.

19              MR. LOCHBAUM:  The first issue is the perception

20    of self-regulation by the industry and we see a number of

21    problems that give us evidence or suggest the impression of

22    that. The first is that the new process depends heavily on

23    plant owner cooperation, both for performance indicator data

24    and also for the significance determination process.

25              The plant owner can voluntarily decide not to

                                                               103

 1    submit performance indicator data or can slow down or

 2    virtually stop the significance determination process, and

 3    that doesn't seem to be good, from a public perception

 4    standpoint.

 5              The second problem we've found is that throughout

 6    the program, although the involvement of the public and

 7    various stakeholders, including UCS, has been better than in

 8    the past, we still have the view that the NRC's primary

 9    stakeholder is the nuclear industry and we and other public

10    stakeholders are treated as second-class stakeholders.

11              It could be a step up from third world treatment,

12    but I guess we're aiming for separate, but equal.

13              The last concern is we feel that to date, the

14    NRC's public communications haven't been very good.  The

15    issues -- the reports that have been issued haven't been in

16    plain language or plain English.

17              I heard the previous panel, the NRC panel talk

18    about going out and scheduling regional talks and also

19    revising NUREG-1649.  It would be good if the NUREG were out

20    prior to the meetings with the public, because otherwise the

21    public has very little to look at and prepare for before

22    these meetings.

23              I think that was a problem during the pilot

24    program, that information wasn't made available to the

25    public when they were asked to come in and provide their

                                                               104

 1    views.

 2              Slide six, please.

 3              The significance determination process, or what we

 4    call "pick a color, any color, as long as it's green" -- we

 5    felt, as a member of the pilot program evaluation panel,

 6    basically that the pilot did not show that the significance

 7    determination process worked, because none of the findings

 8    that were determined to be potentially not green were ever

 9    resolved in the lifetime of the pilot program evaluation

10    panel.

11              We felt also because of the heavy reliance on

12    plant owner negotiation or participation, it would be better

13    not to use that input.  Instead, use SPAR models or other

14    means that the NRC controls to determine if something is

15    significant or not.

16              Also, the significance determination process

17    itself is fundamentally flawed, because it looks at core

18    damage frequency and large early release F, and I forget

19    what the F stands for. There are other things that can cause

20    harm to plant workers and the public and that would be spent

21    fuel pool accidents, whether criticality or loss of water,

22    leading to overheating.

23              There are also a lot of tanks on-site that contain a

24    large amount of radioactive material; if they were to be

25    vented straight to the atmosphere, it could cause a lot of

                                                               105

 1    concern.

 2              The significance determination process ranks all

 3    those as nothing, basically, and any process -- anything

 4    that's called risk-informed -- that omits any consideration

 5    of those issues seems to be flawed.

 6              We also look at the significance determination

 7    process for physical protection and in our view, that's more

 8    of a measure of terrorist performance than it is of plant

 9    owner performance.  If the NRC determines or finds, an NRC

10    inspector finds a bomb taped to the reactor vessel, but it

11    hasn't gone off yet, the significance determination process

12    will rank that, at worst, a white, perhaps a green.

13              Also, if a bomb gets inside and goes off, but

14    blows up a warehouse or something not very important, the

15    significance determination will, again, rank that, at worst,

16    a white and perhaps a green.  That, in our view, is a

17    measure of how successful the terrorist is and not how

18    successful the licensee is at preventing the bomb from getting

19    inside the plant.  So we think that's totally messed up.

20              As totally messed up as it is, it's better than

21    the old process, so we don't think it should prevent

22    industry-wide implementation.

23              COMMISSIONER McGAFFIGAN:  As he repeats this, I'm

24    not sure of the depth of --

25              MR. LOCHBAUM:  It's sincere.  I practiced all last

                                                               106

 1    week.  Slide seven deals with what we call the missing link. 

 2    There's been some -- as we attended some of the workshops

 3    and meetings over the last year, the industry has wanted to

 4    try to use the significance determination process for all

 5    NRC findings and when you apply that to findings in the

 6    physical protection area, the security area, it simply

 7    doesn't work, because PRAs and IPEs don't consider any

 8    threats from sabotage or terrorist acts.  So therefore,

 9    because the risk assessments don't include that, you can't

10    apply the results from those processes to evaluate the

11    significance of problems in those areas.

12              So we think that until you, or anybody, does risk

13    assessments that account for terrorist and sabotage acts,

14    you have to, by definition, consider physical protection

15    cornerstone problems separately from risk-informed

16    situations.

17              Slide eight, please.

18              I guess as I get older, my counting isn't as good,

19    because on the previous panel, Mr. Dean said there was one

20    deviation from the action matrix during the pilot program,

21    and I counted three: two involving Fort Calhoun and Quad

22    Cities, and then the final event involving Fitzpatrick.

23    Three deviations, at least that I know of, from the action

24    matrix over a six month period, over a limited number of

25    plants, seemed far from rare, and the reasons might have

                                                               107

 1    been very well justified, but they weren't out in the open,

 2    because they weren't very clear or well documented.

 4              So I think there is a problem in this area.  We

 5    think that any deviations from the action matrix, as

 6    Commissioner Merrifield pointed out, could be subjective and

 7    are, therefore, potential threats to safety and in any

 8    event, they are tangible threats to public confidence.

 9              Anytime a regulatory agency says they're going to

10    do one thing and does something else, the public confidence

11    has to be eroded.  It cannot be -- at best, it's going to be

12    held the same.  At worst, it's going to be eroded.  So the

13    NRC must take safety warnings from this whole process

14    seriously and not deviate from the action matrix.  That

15    defeats the whole purpose of it.

16              Slide nine is the cross-cutting areas.  As

17    important as the cross-cutting areas are -- and I kind of

18    forget which camp I was in throughout the process so far --

19    I think I understand the fundamental tenet that the

20    cross-cutting areas will manifest themselves in one of the

21    PIs or inspection findings.

22              But regardless of which camp I'm in, we think it's

23    important that NRC not handle findings in cross-cutting

24    areas via the significance determination process, because

25    that could improperly downplay safety problems.

                                                               108

 1              If an NRC inspector finds that the corrective

 2    action program is totally out to lunch on non-safety-related

 3    systems, we think that requires an extent of condition

 4    evaluation to determine if that program is broken across the

 5    board or if that was an isolated example.

 6              That extent of condition could be done by the

 7    licensee or could be done by the NRC, but simply to dismiss

 8    the issue because it's a non-safety-related system is wrong.

 9              Perhaps with the risk-informed guidance, the NRC

10    inspectors won't be looking in non-safety-related systems

11    there, we think it's responsible to pull the string and

12    follow up on them.

13    it's responsible to pull the string and follow up on them.

14              Slide ten, I go back to that bottom line and

15    basically I repeat we think despite these problems, which we

16    feel need to be fixed, but not necessarily before

17    industry-wide implementation, we think this new program can

18    and should be extended nationwide.

19              Our view is that the current program, which we

20    call the bride program, because it's something old,

21    something new, something borrowed and something blue, hasn't

22    been tested, either in a pilot or anything else.  SALP and

23    watch list have been suspended.  We're operating on an

24    interim program that's not the old, not the new.  If the

25    honeymoon ends on what we call the bride program and there

                                                               109

 1    is an event or a serious near miss at a plant, the public is

 2    not going to be real happy about this.

 3              If this is the best oversight tool that we have,

 4    we ought to start using it at every plant as soon as we can,

 5    and our view is that this is the best we have and we should

 6    use it starting next month.

 7              Thank you.

 8              CHAIRMAN MESERVE:  Thank you.  Dr. Lipoti.

 9              DR. LIPOTI:  I am reminded of the story about the

10    Emperor's new clothes and I like the material, I like the

11    cut of the suit, and I like the style, but like the little

12    boy in the story, I'm not afraid to say that the Emperor is

13    in the parade in his underwear.

14              The theory of this oversight process, with its

15    risk relationship, its quarterly reports, its

16    predictability, the web site, the visibility of the program,

17    which brings accountability, that's all wonderful.  But the

18    practice, as we have seen in the pilot, needs work and I

19    really don't want to see NRC in your underwear, so I'm going

20    to give you some comments that I think should improve the

21    program.

22              Poor Sam Collins and Bill Dean kept having to say

23    "almost all stakeholders support it," and I'm that hold-out.  And

24    I'm so pleased that all of you have read the comments.

25              One of the main problems I have with declaring the

                                                               110

 1    new oversight program an absolute triumph over the existing

 2    program is that you have decided that there is a good enough

 3    level for nuclear power plants.

 4              The threshold between the green and the white is

 5    the ad hoc good enough level and your regulatory system no

 6    longer encourages continuous improvement.

 7              You're willing, maybe for the sake of economic

 8    competition and in this era of deregulation, to accept a

 9    good enough level.  Now, my value system says that

10    continuous improvement should be encouraged and that the

11    good enough threshold combined with the economic pressure to

12    produce power at a competitive rate will instead encourage

13    the nuclear power plants to strive for the lowest green

14    parameters they can without passing into the white zone.

15              Maybe the reason I feel so strongly about this is

16    because I've been faced with the same decision to set a good

17    enough level for X-ray programs and I haven't been able to

18    bring myself to do that.  I know that there's plenty of

19    differences between regulating X-ray systems and regulating

20    nuclear power plants, but when it comes to setting

21    performance standards, performance indicators, what's good

22    enough?  Is 95 percent of the X-ray images attaining

23    quality standard good enough?  What if that still leads to a

24    physician missing a diagnosis?

25              So why not encourage continuous improvement?  Your

                                                               111

 1    direction to staff was to set the good enough thresholds and

 2    even that, the staff has only partially fulfilled.  The

 3    performance indicators need to be leading indicators, as

 4    you indicated, and so far the PIs chosen are

 5    lagging indicators.

 6              Certainly there are criteria for good performance

 7    indicators:  that they correlate to safety, equate to a

 8    risk number, be anticipatory, predict safety in the future,

 9    and accurately portray the need for additional inspection

10    resources beyond the baseline.

11              The thorough inspection identifies problems before

12    they escalate, but you need the resources to perform those

13    thorough inspections.  That green to white threshold has

14    emerged as the most important, as Mr. Beedle has stated, even

15    though it's defined as just a departure from the margin.

16              The nuclear power industry has decided that a

17    white finding is so detrimental to their image that it's

18    always contested.  When the white threshold is crossed,

19    rather than really searching through their nuclear power

20    plant for what might be an early warning that operations

21    need some additional scrutiny, during the pilot, utility

22    management has instead attacked a system that led to the

23    white finding and said go back, let's change the SDP

24    process, it must be wrong; go back, let's look at the PIs,

25    they must be wrong.

                                                               112

 1              The SDP process has been evolving at a tremendous

 2    rate.  I learned just today about this Brookhaven report

 3    that's going to influence the SDP process; it will be out in

 4    two weeks.  The definitions of PIs have been refined and

 5    what has happened with all of this change, this tremendously

 6    fast change, is that the unnecessary regulatory burden which

 7    has been lifted has turned into an unnecessary negotiating

 8    burden, where the oversight process becomes more negotiation

 9    and less regulation.

10              Is that a function of the pilot?  Perhaps it was. 

11    But in allowing the oversight process to be applied to all

12    the nuclear power plants in the United States, calling it

13    initial implementation instead of full implementation,

14    you've kind of opened the door to negotiation in the

15    regulation of all 103 plants.

16              Now, maybe in those negotiations, as Sam Collins

17    said, you'll engage on the issues and not on the process and

18    maybe that will be a good thing.  I just don't know.

19              To make this new oversight process really useful,

20    it will have to be made more rigid and less subject to these

21    lengthy arguments and substantial changes.  Yet, at the same

22    time, we know that the program, as set forth, needs

23    improvement.

24              It must be understood how the performance

25    indicators and the inspection program fit together, what

                                                               113

 1    the thresholds stand for, and how much uncertainty there is

 2    in that number.  Right now, the thresholds are chosen by a

 3    variety of different schemes and I know the Office of

 4    Research is working on a white paper.

 5              In that white paper, the risk level has to be well

 6    defined, and the rationale thoroughly explained, and an

 7    uncertainty analysis performed that addresses the

 8    uncertainty in the risk assumptions and the propagation of

 9    uncertainty through the whole process.

10              That's going to require the balance between the

11    generic PRA and the plant-specific IPE.
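
(For illustration:  a minimal Python sketch of the uncertainty
propagation Dr. Lipoti is asking for -- pushing an uncertain risk
estimate through a green/white threshold by Monte Carlo sampling.
Every number here is a hypothetical stand-in, not an NRC or PRA
value.)

      import random

      # Hypothetical green/white threshold on a risk-change metric.
      THRESHOLD = 1e-5
      TRIALS = 100_000
      random.seed(1)

      def sample_risk_estimate():
          # Lognormal spread around a nominal point estimate, standing
          # in for uncertainty in the underlying PRA/IPE assumptions.
          return 6e-6 * random.lognormvariate(0.0, 0.8)

      crossings = sum(sample_risk_estimate() > THRESHOLD
                      for _ in range(TRIALS))
      print(f"trials crossing into white: {crossings / TRIALS:.1%}")

(With input uncertainty this wide, a nominally green estimate crosses
the threshold in roughly a quarter of the trials, which is why the
rationale and the uncertainty analysis both matter.)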

12              My last point has to do with a thorough

13    explanation of the rationale for the thresholds.  That

14    explanation, that rationale has to be viewed not only as a

15    regulatory tool, but as a communication tool to make your

16    regulation understandable to the public.

17              The public will look for the explanation of your

18    indicators as their leading indicator, the public's leading

19    indicator for their concern or lack thereof about nuclear

20    power.  Excellence in regulation is what can change public

21    perception about nuclear power, allowing it to be part of

22    the energy mix for the nation.

23              So don't allow your agency to abandon a continuous

24    improvement approach for regulation and settle for a good

25    enough approach, and continue to communicate your rationale

                                                               114

 1    and your philosophy to the public.

 2              I know there were a lot of questions that were

 3    asked about the testimony I provided and I'll try to answer a

 4    few of them, if you'd like me to now, or if you'd like to

 5    wait for questions.

 6              CHAIRMAN MESERVE:  Have you completed your

 7    statement?

 8              DR. LIPOTI:  My statement is over.

 9              CHAIRMAN MESERVE:  Okay.  Why don't we come back

10    later?  Mr. Gillespie.

11              MR. GILLESPIE:  Good afternoon.  My presence on

12    the second panel, representing the pilot program evaluation

13    panel rather than speaking as part of the staff, I think

14    represents kind of a unique approach to stakeholder

15    participation in this development effort.

16              I would like to thank all the participants and we are

17    kind of a microcosm of the group here because the two Daves

18    were both on the panel.

19              I think the strength of this effort was in the

20    time and I'll say even emotional involvement on the part of

21    the members through the whole process.  We were established

22    as a Federal advisory panel, with all the various bells and

23    whistles, we kept transcripts, we kept notes, and even

24    beyond the final report, which was -- we really focused on,

25    it was supplied actually in December and I think the

                                                               115

 1    Commission was supplied a copy in December.  So it was kind

 2    of a forerunner.

 3              We actually supplied the staff with the individual

 4    members' views, packaged with that final report, and asked

 5    the staff to deal with those individual views, also.

 6              The panel was chartered to actually use the same

 7    criteria that the staff established to measure the

 8    performance of the program against, which Bill Dean went

 9    through in great detail and which are included in the paper, so

10    I'm not going to repeat each one of those.

11              But in our first meeting, we rapidly adapted and

12    went through a review with the staff of

13    those criteria and I do have to say that from stage one, the

14    staff was reasonably responsive to our comments on the

15    criteria and we did have a consensus letter report that we

16    sent to the staff to make some adjustments to the criteria

17    and virtually every recommendation was taken.

18              So that set the stage for both the panel and the

19    staff to be using basically the same yardstick as we went

20    through this process of information collection.

21              The panel recognized what it was doing, and I'm

22    going to stay away from any personal views, I will say that

23    right now, because I know Commissioner Diaz knows me, we've

24    talked before, and I'm representing kind of a consensus

25    view, and so I'm going to focus strictly on the view of the

                                                               116

 1    panel.

 2              But the panel saw itself needing to supplement its

 3    views, so we actually invited five states, in addition to

 4    Gary Wright, from Illinois, who was a permanent member of

 5    the panel, to come in and give us some additional state

 6    insights; from New Jersey, Ms. Lipoti's staff came in.  So we did

 7    get extra insights from them.  Mr. Riccio from Public

 8    Citizen came in.

 9              And we had a meeting where we allowed these people

10    to actually almost participate as panel members and ask the

11    panel itself questions, so the panel allowed itself to be

12    subjected to "why are you thinking this way" questions.

13              So I think we had a very open process and we went outside

14    where the personal views or the personal insights and

15    experience of the panel might have actually been kind of

16    limiting.  So we did try to expand, but we expanded only in

17    those areas where we felt we needed to bring extra insights

18    in.

19              We also had an inspector come in, a regional

20    branch chief, and we deliberately picked people who had

21    expressed fairly vocal opinions during the process, and so

22    we were very open with those opinions.

23              And even through all that, let me focus not on the

24    details, because everyone has kind of gone through the same

25    details, but just on the final conclusion to reinforce what

                                                               117

 1    David has kind of really already said and he kind of

 2    repeated this panel's conclusion, and we sent this to the

 3    Commission in December.

 4              The overall conclusion of the panel is that

 5    the framework provides a more objective, clear and

 6    risk-informed approach to oversight of nuclear reactors.

 7              The program should proceed to industry-wide

 8    implementation.  The panel has identified several areas that

 9    need refinement before industry implementation.  In

10    addition, industry-wide implementation will be needed to

11    gather data to judge the effectiveness of the program and to

12    allow further improvements.

13              And this was done in the context that this program

14    really does appear significantly better and uses a

15    significantly larger volume of information than our old

16    program was based on, and that was the context of our

17    recommendation.

18              Let me just cover the membership of the panel just

19    a little bit.  A Deputy Regional Administrator was on the

20    panel, three Regional Division Directors from three of the

21    regions, all the regional views were considered, the State

22    of Illinois, David Lochbaum, four pilot plant

23    representatives, all managers from those pilot plants, and I

24    think it really did represent a synthesis of views and was

25    kind of the unique part of the process where these views

                                                               118

 1    were kind of put at a single table with a single vision to

 2    come out with a consensus on things we could come to a

 3    consensus on, and that was the nature of our final report.

 4              And with that, I think that really kind of

 5    covers the essence.  It was a unique process and people had

 6    a lot of time invested in it.  And the success of the

 7    process was from the first day, we had a vision of an end

 8    product and an end date, and to maximize the usefulness of

 9    the information, we set out, I think, right from the

10    beginning, saying in December of this year -- and this was

11    starting in about June, when our first meeting was -- in December of

12    this year, we are going to have a report, even if we have to

13    lock ourselves in a hotel room, to get it to the staff, to

14    make maximum usefulness, given the staff's schedule.

15              And people did a lot of homework, a lot of extra

16    reading, and while the State of New Jersey wasn't on the

17    panel officially, Jill's staff showed up for every single

18    meeting and we let them speak up at every single meeting. 

19    So they were almost like de facto members.

20              I think the strength of the report was in the

21    consensus process and the strength of the views was in the

22    delivery of those consensus views in a timely way, and those

23    views have now been overcome with corrective actions or

24    things that the staff has done to address them.  So I think

25    going through those again at this point would not be all

                                                               119

 1    that beneficial.

 2              With that, I'll just end my statement.

 3              CHAIRMAN MESERVE:  I'd like to thank you all very

 4    much.  Mr. McGaffigan is going to have to depart, so he's

 5    asked if he could have the first crack at the

 6    questions.

 7              COMMISSIONER McGAFFIGAN:  Thank you, Mr. Chairman. 

 8    I guess I'll start with the last statement and ask other

 9    members of the panel, I think Mr. Gillespie has essentially

10    -- I've come up with a better acronym than IIEP --

11    IMPEP, implementation evaluation panel, and we already have

12    an integrated materials performance evaluation panel.  So it

13    would be NRR IMPEP and NMSS IMPEP.

14              But do the others have a view as to whether having

15    an ongoing activity, Mr. Beedle mentioned the value of

16    having the regions involved in the ongoing activity, as this

17    is initially implemented, whether we should wait, as the

18    staff suggested earlier, till towards the end of the year,

19    maybe January 2001, or whether having a panel that involves

20    the states, involves the regions, involves the public

21    interest groups, involves these additional people who can

22    come?

23              It's just a place to vent, even if you don't reach

24    consensus, and also a place to keep things on track.  Is

25    that worthwhile?  David, do you have a view?

                                                               120

 1              MR. LOCHBAUM:  I guess it did have value.  I'm in

 2    DC anyway, so I could have gone to the meetings.  So the

 3    PPEP was a nice opportunity for me, but I had access anyway.

 4              I think what I gained most from that was hearing

 5    the resident inspectors and the regional inspectors, what

 6    their views were.  Sometimes I had concerns that went away

 7    after they explained why that wasn't a problem and I thought

 8    that hearing that face to face was something I couldn't have

 9    gotten any other way, I don't think.

10              So from that standpoint, I think it was

11    beneficial.  The one thing I'd like -- I kind of represented

12    the public, but I don't speak for all the public.

13              COMMISSIONER McGAFFIGAN:  I understand.

14              MR. LOCHBAUM:  In fact, I disagree with most of

15    the people I work with on this issue, which puts me in an

16    awkward position. So I think the frequently asked questions

17    system that the staff had for the industry, if something

18    like that were set up for the public side, it was developed

19    very late, I think in January, after the comment period

20    ended.

21              If something like that were set up for other

22    public stakeholders, I think that would supplement anything

23    that was done in terms of an advisory panel.

24              COMMISSIONER McGAFFIGAN:  How about you, Dr.

25    Lipoti, do you have any view about the value of having an

                                                               121

 1    ongoing evaluation, a formal ongoing evaluation process as

 2    opposed to waiting till next January?

 3              DR. LIPOTI:  There's a lot of value in listening

 4    to the staff's views and what the inspectors themselves are

 5    finding when they try and take the theory that is beautiful

 6    and apply it to inspecting a real plant.

 7              COMMISSIONER McGAFFIGAN:  That's my view.  I'd

 8    almost love to go and watch some of this stuff play out

 9    myself.  I think Commissioner Diaz is nodding his head, too.

10              In some sense, the advisory panel is a proxy for

11    playing out some of those issues.

12              How about Mr. Beedle or Mr. Garchow?

13              MR. BEEDLE:  I think there is tremendous value in

14    being able to come together and air your concerns.  I'm not

15    suggesting we do this on a daily basis, but I think a year

16    is probably too long.  You need to do it at an interval that

17    gives you the ability to take the input and do something

18    with it that makes some sense in terms of the program's

19    development.  So it's kind of an ongoing lessons learned

20    process.  Now, I know Dave --

21              COMMISSIONER McGAFFIGAN:  Dave does not have to

22    keep on giving.  We could get a different --

23              MR. BEEDLE:  I think he'd probably volunteer.

24              COMMISSIONER McGAFFIGAN:  Okay.

25              MR. GARCHOW:  One suggestion I'd have,

                                                               122

 1    Commissioner McGaffigan, is I've never been on a Federal

 2    panel before, so I wasn't sure of all the rules, and in some

 3    respects, getting through some of the formality limited, to

 4    some extent, some of the dialogue.

 5              So I think there is value in getting the right

 6    people together and being able to hear from the

 7    NRC staff and the inspectors and the regional folks and even

 8    some of the industry folks from some of the plants that are

 9    just picking this up, so we've heard from the pilot plants,

10    but there's now a whole other category of plants that could

11    surface even some issues that had never come up before.

12              So I would say there's probably a benefit to have

13    that not be a year.  Whether the FACA process is the right

14    vehicle for that I guess would be worthy of some discussion

15    that Frank can carry on for the NRC, that I think we can

16    accomplish the same way through some workshops scheduled at

17    the right times, where people have an opportunity to provide

17    input and still get the same gain without some of the

19    formality that was around the -- what you call the PPEP

20    process.

21              MR. BEEDLE:  I think one of the things that we

22    need to continually remind ourselves of is that this

23    oversight process is the NRC's policy mechanism for dealing

24    with oversight and assessment of the plant performance.  We

25    haven't changed any of the regulations yet.

                                                               123

 1              In fact, I don't know of one regulation that was

 2    changed as a result of this.  So all those regulations that

 3    were in place two years ago are still there today and we

 4    have to abide by them, we're still governed by them.

 5              And the value that we bring here, I think, is

 6    comment and critique on a process that affects us profoundly

 7    and one to which I think we can add a dimension from an

 8    assessment point of view, because the NRC has caused us to

 9    be very conscious of our own self-assessment.

10              So I think we've got a lot of history that we

11    bring to bear in this process of assessment.  So I think

12    that's probably the value added from the licensees.

13              COMMISSIONER McGAFFIGAN:  I think part of the deal

14    of going first is I won't prolong this very long.  The value

15    that I'm trying to get, and I don't know whether a formal

16    FACA process is needed or not, but some sort of ongoing

17    evaluation and involvement of the people that David talked

18    about as being second or third class citizens, making sure

19    that -- and I think we have some in our own internal

20    stakeholders, making sure the regions and the inspectors and

21    whatever get to speak frankly their views and the public

22    stakeholders get to speak frankly their views, the states,

23    so that it doesn't appear and it isn't and it hasn't been an

24    industry and NEI thing.

25              One last question.  I want to pick up on David's

                                                               124

 1    notion of the bride program and the Emperor having no

 2    clothes that Dr. Lipoti talked about.

 3              It strikes me that if the child were looking at

 4    the SALP/watch list process, he'd say the Emperor is buck

 5    naked.  So underwear may be a significant improvement and

 6    maybe someday we'll have cloaks and look like real emperors.

 7              But I think that David has it right when he talks

 8    about the bride program.  We have this interim program that

 9    is -- we don't have anything.  We have SALP, which is gone;

10    we have the bride program, which is something good,

11    something old, whatever.

12              So I do want to put in a plug that I do think it's

13    an improvement.  We may be in our underwear, but it's a step

14    up.

15              CHAIRMAN MESERVE:  Thank you, Mr. McGaffigan, I

16    think.  I was really struck by the juxtaposition of the

17    comments by the two Davids here, that one characterized the

18    significance determination process as enormously valuable

19    and David Garchow talked about having gone through it in

20    fire protection and how that was a very helpful process, got

21    to the crux of the issues, rather than fighting over the

22    process, and how useful it was.

23              Whereas your slide six, David Lochbaum, was

24    basically -- said it doesn't work at all and we ought to try

25    something different.

                                                               125

 1              I'm trying to reconcile those two statements and

 2    I'm also wondering, the staff has admitted that there are

 3    some failings in the significance determination process and

 4    they have discussed today a whole variety of things they

 5    have underway to try to strengthen it, and I'm curious as to

 6    whether you are comfortable with that aspect of the process,

 7    David, and how it's evolving.

 8              MR. LOCHBAUM:  Yes.  I think so, because even with

 9    the process the way it is today, which we can say is not

10    perfect, the information is reported in the inspection

11    reports and it comes out fairly quickly thereafter.

12              So from our role as a monitor, we can see the

13    information and we can see the problem.  We would probably

14    have a different significance determination process that we

15    use than the staff is using.  So at least it gets us -- if

16    we believe it's a concern, we can engage either the

17    region or whatever the right direction is that needs to be

18    taken, so we get the information and we can respond to it.

19              So it's helpful to us.  The coloration means very

20    little.  The example that I go back to, what's wrong with

21    the significance determination process, is an event at Quad

22    Cities.  They had a problem with calibration of diesel

23    generators, I forget the exact mechanical problem, but it

24    impaired the performance of all the diesel generators at the

25    plant and it existed for roughly an 18 month period and it

                                                               126

 1    was -- the significance determination process looked at that

 2    and it came out green because there wasn't a loss of

 3    off-site power during that 18 month period, which is

 4    splendid, but that doesn't address whether that's --

 5              As Mr. Beedle said earlier, if you had two

 6    findings, one where all the diesel generators at

 7    the plant were broken or impaired for an 18 month period, that is

 8    a little bit more than some of the other things that came

 9    out.

10              And if the SDP allows those kinds of end results to

11    be presented, there is something wrong with it.  But even

12    with that problem, I saw the data that said the diesel

13    generators were broken, so we can do what we need to do.

14              So from our standpoint, we're given the

15    information we need to get engaged and to follow safety

16    issues.

17              CHAIRMAN MESERVE:  And I would take from the

18    staff's comments that they recognize that there were some

19    problems in it and they're trying to address them.

20              Mr. Gillespie, you mentioned in passing that the

21    report of your group had a whole series of issues that

22    it thought should be addressed

23    before implementation.  I realize you can't speak for the

24    group, because they haven't reassembled yet, but I'm curious

25    as to whether, from your personal view, those issues have

                                                               127

 1    been addressed satisfactorily enough so that implementation

 2    in April is appropriate.

 3              MR. GILLESPIE:  Yes, I believe they have and I

 4    think Dave and Dave could reinforce that.  In our report,

 5    there was about a two-page summary which tried to iterate

 6    what was pre-April, what was kind of post-April.

 7              There is one important document that has come up,

 8    it's a post-April document that I would say the staff has

 9    committed to, and that's a basis document and it's come up

10    in several forums, to try to articulate all the whys,

11    because this program has been moving so fast that we haven't

12    necessarily kept pace with recording, in a very orderly way,

13    why is this indicator exactly the way it is, why is that

14    one, why did the inspection program come out this way.

15              And it's not the how-to implementation inspection

16    procedure document, it's literally the basis.  And we had a

17    basis document when we started, but it didn't get kind of

18    maintained up to date, and that's become a very kind of, in

19    many discussions, including with ACRS, an important document

20    to focus on over the course of next year, and also before we

21    change the program at all after the next year, I think it's

22    important we write down -- and the committee found

23    this -- that basis; before you change your basis,

24    write the first basis down.

25              But that was a more extended IOU, because

                                                               128

 1    basically we have all those people participating in the

 2    program today, so we haven't lost that insight.  So that was

 3    an okay to start in April, but, boy, let's keep an eye on it

 4    over the next year, because we do need to get that reported.

 5              So I think all the near term things have been

 6    addressed by the staff.

 7              CHAIRMAN MESERVE:  Good.  Thank you.  Dr. Lipoti,

 8    you had made a whole series of forceful comments about the

 9    program, but nearly everyone else has told us it's an

10    improvement and it's something that we should go forward

11    with.

12              You haven't given us a comment as to whether you

13    agree with that.

14              DR. LIPOTI:  I hear a train.  I think it's going

15    forward.  I may be a naysayer that says, well, I think you

16    should fix a few things before it goes forward.  There is

17    really no point in my saying that.  It's going forward.

18              So now what I want to do is just offer

19    constructive criticism for improving it as it goes forward.

20              CHAIRMAN MESERVE:  Thank you.  Commissioner Dicus.

21              COMMISSIONER DICUS:  Thank you.  Let me start with

22    Mr. Beedle.  I appreciate your answering the question that

23    the industry is ready to go forward and so forth, but I

24    think you also recognize that you do have some pockets of

25    skepticism.  Are you prepared to deal with those?

                                                               129

 1              MR. BEEDLE:  I think that both the industry, as a

 2    body, and the agency reflect the effect of change and just

 3    as the agency is dealing with some apprehension and concern

 4    and how do I fit into this and what do I do and what's my

 5    role, I think the industry has the same concerns.

 6              We've got concerns over the 72 hours, we've got

 7    concerns over the 4.1 versus the 2.5 versus the 3.2 and all

 8    those things bubble up into anxiety on somebody's part.

 9              If I were to wait until everybody was absolutely

10    satisfied that everything was perfect, we'd all be dead.  We

11    just would never get there.

12              So somewhere along the line, we've got to, as my

13    dad used to say, fish or cut bait, and I think we've already

14    been told today that you don't have a process in place today. 

15    We've got kind of one foot in never never land and the other

16    foot is in this new assessment process.

17              So I think for the sake of the sanity

18    of your staff, you need to make a decision one way or the

19    other. Either go back and reinvent the SALP process or move

20    ahead with this oversight process.  I think we've already

21    heard enough about the characterization of the SALP process,

22    that we probably don't want to go there.

23              So I think this offers the best chance of giving

24    the agency the ability to effectively assess the performance

25    of the plant in an objective, predictable manner that's

                                                               130

 1    visible to the public, to the agency, and the licensees.

 2              COMMISSIONER DICUS:  Okay.  PSE&G, in your summary

 3    statement, you say continue to address the areas for

 4    improvement, but you didn't go into them, but we've heard

 5    where there are problems.  Is there anything you would add

 6    to or subtract from all the concerns that we have heard

 7    today?

 8              MR. GARCHOW:  Specifically, for PSEG, we found no

 9    issues that were different than what had led up to the PPEP

10    or the NEI process, because we were involved in both.  So we

11    had the opportunity to bring any issues we had specifically with

12    the pilot implementation at Salem and Hope Creek into both

13    of those forums and they were adequately addressed to our

14    satisfaction.  Same as Mr. Lochbaum, since we're apparently

15    now forever going to be the two Daves.  I'm still thinking

16    of what that means.

17              But I think that we have the same -- we need to

18    move forward and address the issues, as we've described.

19              COMMISSIONER DICUS:  Okay.  And, Mr. Lochbaum, you

20    had indicated that we need to do some more plain English

21    communication with the public and you did mention in your

22    comments that, for example, some of the documents going

23    forward into the meetings, so the public had those and could

24    better understand them.

25              That helped a little bit, because some of the

                                                               131

 1    reports that we send back and forth to the licensee are

 2    technical reports, from techies to techies, if you will.

 3              Were you referring to those reports, as well, that

 4    some of those needed to be put in plain English for the

 5    public's consumption?

 6              MR. LOCHBAUM:  I think, in my mind, there is a

 7    distinction between reports.  Inspection reports are meant

 8    more as vehicles between the NRC staff and the licensee,

 9    and the responses are also in that category.

10              There's a separate category of documents that are

11    meant for a broader audience, for other than the licensee. 

12    I think the plant performance reviews are now being used in

13    that category, but they're not written in that style, and I

14    think the six month -- I forget what the thing is called,

15    every six months, a letter goes out that says here is what

16    we're going to do for the next six months to you.

17              That letter is very difficult for people to

18    understand.  So I think there could be an improvement.

19              When I first joined UCS, our director of

20    communications sat down with me at the very first meeting

21    and said I know you're an engineer, but when you talk to

22    people, pretend like you're talking to your grandmother, and

23    I'm not saying I do that today, but that was some of the

24    best advice I had since joining the UCS.

25              COMMISSIONER DICUS:  We appreciate that.  Okay. 

                                                               132

 1    And would you care to make a comment on whether you think

 2    the industry is ready to go forward?

 3              MR. LOCHBAUM:  I talked to Mr. Beedle at the break

 4    and he said they were.  He's never let me down so far.

 5              COMMISSIONER DICUS:  Okay.  And, Dr. Lipoti, even

 6    though you have a lot of reservations about this program, I

 7    notice you are wearing green.  I don't have any further

 8    questions.

 9              CHAIRMAN MESERVE:  Commissioner Diaz.

10              COMMISSIONER DIAZ:  Thank you, Mr. Chairman.  I'd

11    like to observe that although I'm normally one-track minded,

12    I'm more one-track minded when I have the flu.  So I have

13    one question for everybody, it's the same question, and I

14    would like to start with Mr. Gillespie.

15              Mr. Gillespie, you must realize that although you

16    come here as a Chairman of PPEP, in reality, we know you

17    better than that.  We are going to go back and ask you for

18    some of your early knowledge of the program.

19              The question is going back to the same thing.  It

20    is the fact that I believe there is an underlying set of

21    rules and regulations that haven't changed.  There is

22    underlying value in the way we do inspections and we relate

23    them to plant performance.

24              There are a lot of things that really have not

25    changed significantly.  There is a new dimension added.

                                                               133

 1    Everybody looks at performance indicators.  Since everybody

 2    looks at them, I don't anymore.  I just let everybody look

 3    at them and I look at something else.  It's just my contrary

 4    nature.

 5              The issue becomes if the performance indicators

 6    are no good, do we have other things that really happen

 7    every day, that happen every week, that take place in the

 8    plant that provide the confidence level?

 9              So the question is, from your perspective, can you

10    quickly tell me what the role and value of the open data

11    gathering and processing is, the fact that this data

12    gathering and processing has a periodicity that never

13    existed before, and its value to a robust corrective action

14    program?

15              Do you want me to repeat the question?  No, you

16    got it.  I thought so.

17              MR. GILLESPIE:  I think the value is much in the

18    structure and many questions have been asked.  Have we got

19    the perfect set of PIs?  I think the real question is do

20    we have a good enough set of PIs to reflect a profile

21    of performance, and by a profile I mean that we have basically

22    18 performance indicators.

23              And if we had turned those into one weighted

24    indicator, we would, I think, have a flaw, because one could

25    outweigh the other.
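
(For illustration:  a tiny numeric sketch of the flaw Mr. Gillespie
describes in collapsing 18 indicators into one weighted number.  The
values and weights are hypothetical.)

      # Eighteen indicators, one badly degraded (9.0 on a scale where
      # anything above 3.0 would draw attention on its own).
      indicators = [1.0] * 17 + [9.0]
      weights = [1 / 18] * 18

      rollup = sum(w * x for w, x in zip(weights, indicators))
      print(f"weighted rollup: {rollup:.2f}")       # 1.44 -- looks fine
      print(f"worst indicator: {max(indicators)}")  # 9.0 -- clearly not

(The average washes out the one bad indicator, which is why a profile
of 18 separate indicators is more informative than a single weighted
number.)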

                                                               134

 1              COMMISSIONER DIAZ:  I'm sorry.  That's really not

 2    the question.  Let me ask it again.   You have to go back to

 3    the very fundamental level of what happens every day, how

 4    people put data into their plants, how they collect it, how

 5    that data has a periodicity, how it's upgraded, how people

 6    look at it, how it goes into the corrective action program,

 7    how that can impact on a performance indicator, how

 8    performance indicators can go back to it.

 9              So it is at the level of data gathering,

10    processing, going into the corrective action program.  What

11    is the role and value of this?  Will it get better?  Will it

12    give us more information than we had before?  Will it

13    interact on a day to day basis with the safety assessment of

14    the plant?

15              MR. GILLESPIE:  I think that that structure, from

16    the collection of the data all the way through, and the fact

17    that the data then gets publicized, with visibility comes

18    accountability, all the way down to the person who is

19    recording that number, who is doing the maintenance, who is

20    putting it into the system.

21              So I think with that accountability, and I think

22    that has some bearing on, in fact, the INPO initiatives that

23    you heard briefly on corrective action programs, that this

24    is one of the more significant impacts on safety that's

25    coming out of this program.

                                                               135

 1              Given that structure and visibility, and it goes

 2    across everything in the plant, if we've picked our PIs

 3    right.  So the exact PI isn't as important as the

 4    breadth of the PIs and the fact that it goes down to the

 5    person doing the work and then rolls up to the visibility of

 6    the utility.

 7              So I think we have a tremendous safety benefit

 8    from this structure.

 9              COMMISSIONER DIAZ:  And our role in that, an

10    everyday role?

11              MR. GILLESPIE:  Our role is verifying that the

12    data is correct, that it represents what we think it

13    represents, and that's the role of inspection.  The role of

14    inspection is to verify and to also provide independent

15    looks at the same processes and equipment and systems to say

16    that the PIs are telling us what they're supposed to be

17    telling us.

18              So it's a check and balance in the whole

19    process.  The PIs do take on a visible role, though.

20              COMMISSIONER DIAZ:  All right.  Thank you.  Ms.

21    Lipoti?

22              DR. LIPOTI:  The performance indicators help you

23    to prioritize your inspection resources.  There's no

24    question that you don't want to spend time inspecting

25    something which is already in good shape.  You want to spend

                                                               136

 1    your inspection oversight in those places where there could

 2    be a problem.

 3              But to me, one of the issues which is not measured

 4    by PI is the cross-cutting issue of the corrective action

 5    program and Bill Dean said earlier that time will tell.  My

 6    definition of time is inspection time and I think you need

 7    to put additional time on the cross-cutting issues, like the

 8    corrective action program, like human performance, like

 9    safety consciousness.

10              And I think that the inspection guidance that

11    you write for your inspectors needs to be written

12    by NRC and be a NUREG, not an NEI guidance.

13              I think you have to look at a statistically

14    significant sample of issues from the whole corrective

15    action program.  In the corrective action program that was

16    part of the pilot at Salem, they have 30,000 issues in the corrective

17    action program and they add them at 3,000 a month.

18              The inspection, which was an in-depth inspection,

19    picked out 20 to look at.  That's not a statistically

20    significant number.  Now, I know that there are additional

21    inspection resources provided by the resident

22    inspectors, that they are present and that they look at

23    corrective actions.  But I think that the inspection

24    guidance for corrective action is not yet ready.
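
(For illustration:  the sampling arithmetic behind Dr. Lipoti's
statistical-significance point, in Python.  The 30,000-issue and
20-item figures are from the testimony; the confidence level, margin
of error and worst-case proportion are textbook assumptions, not NRC
criteria.)

      import math

      population = 30_000   # corrective action issues at Salem
      z = 1.96              # 95 percent confidence
      p = 0.5               # worst-case proportion of deficient items
      margin = 0.05         # desired +/- 5 point margin of error

      n0 = z ** 2 * p * (1 - p) / margin ** 2     # infinite population
      n = n0 / (1 + (n0 - 1) / population)        # finite correction
      print(f"sample needed: {math.ceil(n)}")     # about 380, not 20

      # Margin of error implied by a 20-item sample:
      e20 = z * math.sqrt(p * (1 - p) / 20)
      print(f"20-item margin: +/-{e20:.0%}")      # roughly +/-22 points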

25              COMMISSIONER DIAZ:  Thank you.

                                                               137

 1              MR. LOCHBAUM:  I think my answer would be based on

 2    my experience working as a consultant at good performing

 3    utilities and plants that weren't in that category.

 4              I think the value of

 5    the performance indicators is in instilling the right culture

 6    or approach to safety or approach to business, because if

 7    you're doing business right on the performance indicators,

 8    you're not going to operate 180 degrees out on other areas.

 9              So that gives the plant owner and the NRC and the

10    public an insight into how performance at the plant is

11    proceeding.  If performance is in the green or the acceptable

12    category, there is a higher confidence that other areas are

13    being handled well, whereas if performance is not in those

14    categories, doubts are raised.

15              So I think that's the importance we place on the

16    performance indicators, and the reason we like so many -- not

17    too many, but a larger number rather than the four big

18    categories of SALP -- is that you can hopefully detect

19    declining performance sooner.

20              So I think that's the purpose of it and we think

21    the NRC's role is to step in when performance declines

22    are detected and remind the licensee of the necessity of

23    promptness in turning that around.

24              MR. GARCHOW:  I guess I'll take an action item

25    here to show the difference between a notification and what

                                                               138

 1    goes in our corrective action program as an Appendix B

 2    issue.  We have a low threshold, high volume system, in

 3    which any deficiency, discrepancy, or good idea gets written

 4    up in a notification.

 5              I'm not sure we even have quite 3,000 of those,

 6    but that's probably close.  Out of that, there are several

 7    hundred of those that we track, typically running

 8    three to four hundred a month, that actually make the screen into our

 9    corrective action program.

10              I can share with you in one of our regular

11    interface meetings what that is.  Jill actually hits on

12    the issue, though, and we can

13    debate the number, but the issue that gives me comfort is we

14    do surveying for safety conscious work environment, so we

15    know and I have data to show that our employees feel free to

16    raise safety issues and do, and we have a process that's

17    very low threshold, high volume.

18              Every day, those get screened by licensed operators for

19    immediate plant impact and then get screened every day by a

20    team of engineering, maintenance and ops folks to determine

21    what is the significance, what is the time we want to

22    resolve it and who is the appropriate group to resolve it.
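
(For illustration:  a minimal sketch of the two-pass screening flow
Mr. Garchow describes.  The field names, routes and due times are
hypothetical, not PSE&G's actual system.)

      from dataclasses import dataclass

      @dataclass
      class Notification:
          text: str
          immediate_plant_impact: bool
          safety_related: bool

      def screen(n: Notification) -> dict:
          # First pass:  licensed operators look for immediate plant
          # impact.
          if n.immediate_plant_impact:
              return {"route": "operations", "due_days": 1}
          # Second pass:  a multi-discipline team assigns significance,
          # resolution time and an owning group.
          if n.safety_related:
              return {"route": "corrective action program", "due_days": 30}
          return {"route": "tracking only", "due_days": 90}

      print(screen(Notification("EDG lube oil leak", False, True)))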

23              So that process is a living, dynamic process and

24    the only way that we are going to get to excellent

25    performance over the long term is to really encourage people

                                                               139

 1    to identify every good idea; some of those may or may not be

 2    in the quality program, but they're still -- we don't put

 3    the burden on the employee to make that determination.  We

 4    want it written up and submitted and I can tell you that the

 5    improvements that we've made at our facility were driven by

 6    the corrective action program and our self-assessment

 7    program and we have room to grow in that area as we continue

 8    our climb at PSE&G to raise the performance of Salem and

 9    Hope Creek.

10              So I'm convinced -- and this is the direct answer

11    to your question -- the corrective action program is the key and we

12    put the issues in wherever they come from.  So in our

13    routine interactions with the NRC inspector, if the NRC has

14    a concern in an area, we'll document it in our corrective

15    action program to go pull the string on it.  It may turn out

16    to be something significant, it may not, but the value is in

17    the dialogue and in getting it in our program.

18              We saw in the pilot where every single inspection

19    module that comes into the plant, whether it be a team

20    inspection in an area or whether it be the routine resident

21    inspector, has a portion of time, upwards of ten to 15

22    percent, where they are pulling the string on corrective

23    actions in every area on an ongoing basis, with the risk --

24    with a risk significant screening.

25              So I think it's worth reviewing that.  Whether

                                                               140

 1    that's appropriate, the pilot program did test it and we were

 2    able to see as the recipient of the program the fact that

 3    the inspectors were in the corrective action program

 4    continually with every inspection, as well as the team

 5    inspection for corrective action, which happened to be done

 6    at Hope Creek for the pilot.

 7              So we did get to see that and there were probably

 8    some areas for improvement that the NRC captured in their

 9    ability to do that inspection better.  That was one of the

10    first corrective action inspections that was done as part of

11    the pilot and I think there were several improvement areas

12    that were learned as a result of that inspection at Hope

13    Creek.

14              MR. BEEDLE:  Commissioner, I'd like to approach it

15    from two different levels.  One, and I think it has been

16    covered by Dr. Lipoti, where she talks about the data, raw

17    data and performance indicators and how that gives you some

18    sense of performance in an area and where you aren't covered

19    with a performance indicator, you do inspection.

20              But I think I'd like to take your question to a

21    different level and you talk about the day to day activity. 

22    These plants collect data of all sorts moment by moment and

23    it starts with the watchstanders and the reports that they

24    make on the performance of their equipment.  It goes to the

25    management reviews.  It's the plant oversight reviews.  It's

                                                               141

 1    the off-site reviews.  It's the self-assessments, it's

 2    condition reports, extent of condition, and all of those

 3    things roll up into the corrective action program, but

 4    they're all driven by the safety conscious work environment

 5    that's created by the management team at that facility.

 6              So when we look at these three cross-cutting

 7    issues, I think they fall into kind of two categories. 

 8    There's the process issue that we're calling self-assessment

 9    and corrective action and then there's the safety conscious

10    work environment and the human performance issue, and both

11    of those two areas are driven more by the attitude of the

12    managers than anything else.

13              I don't think you can regulate a safety conscious

14    work environment.  I think you can, though, see the evidence

15    of that and the viability of that work environment in the

16    way the corrective action program functions and that's

17    reflected in the way the plant behaves.

18              So I think we're -- a lot of data that's

19    collected, it all ends up in that corrective action program

20    and that's what drives the performance of the facility.

21              COMMISSIONER DIAZ:  And is the disposition of that

22    corrective action program important to safety?

23              MR. BEEDLE:  Absolutely.  I don't think anybody

24    would argue that it's not an important element in this

25    process of managing the facilities.

                                                               142

 1              COMMISSIONER DIAZ:  Thank you, Mr. Chairman.

 2              CHAIRMAN MESERVE:  Commissioner Merrifield.

 3              COMMISSIONER MERRIFIELD:  Thank you, Mr. Chairman. 

 4    I guess, first, I want to say I appreciate the thoughtful

 5    testimony that our witnesses have given today.  I think it's

 6    very helpful for moving forward and improving this program.

 7              Also, I would want Mr. Gillespie to transmit my

 8    appreciation to the other members of the panel who are not

 9    here today, but we obviously appreciate the strong work that

10    they've put in, and perhaps there will be a continuing effort in that

11    regard.

12              Dr. Lipoti, I want to follow up on the line of

13    where the Chairman was going.  I was reminded of the

14    testimony we had last week from the ACRS and they raised a

15    number of areas where they thought the program had some gaps

16    and as we got down into that level, I was reminded at that

17    time and made a comment that one has to struggle with the

18    perfect being the enemy of the good.

19              I was reviewing your testimony and in it you say

20    the question to be answered now -- this is on the first page

21    -- the question to be answered now is not whether the new

22    program -- new oversight process is perfect, it is not.  The

23    question is not whether the new oversight program has merit,

24    it certainly does.

25              But it is whether the new oversight program is

                                                               143

 1    better than the existing oversight program.  I think the

 2    Chairman was trying to get that out of you and I couldn't

 3    get it out of your testimony and I was wondering if you

 4    could give me an answer to that.

 5              DR. LIPOTI:  Well, page two of the testimony

 6    compares the two, and certainly the existing

 7    oversight

 8    program has little linkage to risk significance and the new

 9    oversight program has a great deal of risk significance.

10              And it helps to focus your resources, and that's a

11    good thing.  But I worry a little bit about the incredible

12    cooperation that has developed this program between NEI and

13    NRC.  I think that's unique in terms of a relationship

14    between a regulator and the regulated community.

15              I wonder if that kind of cooperation really

16    inspires public confidence.  After all, you have several

17    milestones or key milestones that you are using for a

18    measure of whether the new oversight program is good.  One

19    of them is public confidence and that one I don't think has

20    been adequately measured.

21              Certainly the existing oversight program with the

22    SALP comes out every 18 months.  The new one has quarterly

23    results that are put on the web site and the web site is

24    visible and with visibility comes accountability.  Those are

25    all good things.

                                                               144

 1              But there are still some issues that give me

 2    pause.

 3              COMMISSIONER MERRIFIELD:  What about -- you have

 4    sitting next to you David Lochbaum, who is one of our --

 5    clearly one of our most significant public interest group

 6    participants.  He talks a lot about the public.  Mr.

 7    Lochbaum has come before us at least as long as I've been a

 8    Commissioner, and probably ten or 15 times in the last 15

 9    months, and there isn't a single statement he doesn't make

10    where he talks about the public interest.

11              His bottom line was that we ought to move forward

12    with it.

13              DR. LIPOTI:  And he also qualified it by saying

14    that he's not sure he does represent the public.  And I

15    certainly wouldn't stand before you and say I represent the

16    public.  But I have had inspectors that have gone on the

17    inspections at -- and I've expended more resources on

18    looking at this new oversight process than almost anything

19    else in my program in the past nine months.  And we have

20    said that where the rubber meets the road, where the

21    inspector is out there trying to apply this process, there

22    are still very significant problems, and I'm not sure that

23    you can see them from the level of the management and the

24    theory of the new oversight program.

25              And I just want to warn you that inside this

                                                               145

 1    program, the devil is in the details and the details are

 2    very difficult and there remain many, many things that need

 3    resolution and they may not pop up to the level that comes

 4    before the Commission again. They'll be resolved at staff

 5    level.

 6              But think about whether this program, this new

 7    oversight program, gives you the comfort level that

 8    you will know where to spend the NRC resources, inspecting

 9    the right plants and in the right areas.

10              Do you feel that it really will?  And that's the

11    question that you have to ask yourself as a Commissioner.

12              COMMISSIONER MERRIFIELD:  Thank you.

13              DR. LIPOTI:  But I do want to go on to say one

14    more thing.

15              COMMISSIONER MERRIFIELD:  I've got some other

16    questions I've got for some other witnesses.

17              DR. LIPOTI:  Sorry.

18              COMMISSIONER MERRIFIELD:  Mr. Beedle, one of the

19    questions that was raised earlier today was that the new

20    oversight process is a voluntary program.  I'd like to get

21    your thoughts on that assertion and whether you think it is

22    possible a licensee may elect not to participate in the new

23    program.

24              MR. BEEDLE:  The voluntary nature of this program

25    really goes to the production of the performance indicators

                                                               146

 1    and the submission of those indicators to the NRC.  And I

 2    would remind everybody here that there is more to this

 3    program than the performance indicators.

 4              It's the performance indicators, balanced with the

 5    inspection process, which is the dominant element of it,

 6    along with an assessment mechanism that helps the inspectors

 7    and the licensees understand where the inspection process is

 8    going.

 9              So the voluntary nature of this is in the

10    submission of the data.  I don't think that any utility has

11    balked at providing data.  They all get that data in on time

12    in January for the historical look.  I have no indication or

13    belief that anybody is going to not submit the data.

14              Personally, I would think that anybody that sat

15    there and said I'm not going to provide this data to assist

16    the NRC in determining where they're going to allocate

17    resources for inspection would run the risk of having an

18    awful lot of inspection to cover those areas that the

19    performance indicators would normally cover.

20              I don't think that's in their planning.  So I

21    would expect that all of them would provide that.

22              COMMISSIONER MERRIFIELD:  Mr. Lochbaum, could you

23    give us your perspective on two comments made by ACRS in the

24    March 2 meeting?  The first one being that the performance

25    indicator thresholds may be so high that we and the

                                                               147

 1    licensees will be unable to identify trends, and the second

 2    that they are so high that they serve as disincentives to

 3    improve plant performance?

 4              MR. LOCHBAUM:  I think on the first one, even

 5    though you only get a color on any of the boxes, you can

 6    look at the underlying data and draw trends even within a

 7    green box or any box you want to.  So I think it's possible.

 8              I think, of the performance indicators, we had

 9    concerns with the containment leakage one, because it would

10    not provide trending, given a lot of the problems that

11    the previous panel talked about.

12              The remaining performance indicators we think can

13    be trended, even if they're all green.  That doesn't

14    necessarily mean you have to react to it, but I think you

15    can.

16              And could you refresh my memory on the second one?

17              COMMISSIONER MERRIFIELD:  That they are so high as

18    to serve as a disincentive to improving plant performance.

19              MR. LOCHBAUM:  I guess I don't agree with that.  I

20    think, as many other panelists have said, if you're

21    white, that's perceived to be bad, even though it's not

22    necessarily unsafe, and there's going to be peer pressure or

23    accountability to try to maintain all the indicators in the

24    green, whether they're performance indicators or NRC

25    inspection findings.

                                                               148

 1              So I think that pressure or accountability is

 2    there in the new system.  I don't think they're too high, so

 3    I guess I don't understand that view.

 4              COMMISSIONER MERRIFIELD:  Mr. Garchow, the staff

 5    has indicated that they're going to revise the process for

 6    documenting inspection findings to allow our inspectors to

 7    document observations associated with programmatic

 8    deficiencies and cross-cutting issues, even if those don't

 9    necessarily rise to the level of a performance

10    indicator crossing a threshold or a significant

11    inspection finding.

12              Do you think that's a good idea or a bad idea?

13              MR. GARCHOW:  Well, I think the answer is I have

14    no problem with it being documented.  We're getting that

15    information as a licensee by virtue of the resident exit

16    meetings, and through the pilot process we saw improvements

17    in the exit meetings for inspections.

18              They weren't perfunctory.  As the clarification

19    got down to the inspectors -- I think it was Mr. Madison

20    that indicated this -- we saw a marked difference.  They

21    actually were very good dialogues about what areas were

22    looked at, what they saw, and any observations or senses

23    that the inspector had.  Those started to come out towards

24    the end of the pilot program, and that was beneficial for my

25    staff to hear as we were assembled in an exit meeting.

                                                               149

 1              So we were getting the information, I believe,

 2    from the inspection exit meetings that are held.  Whether

 3    those comments show up in a report or not, I see no issue

 4    with them being in the report; and since I got the

 5    information from the exit meeting, I see no issue with them

 6    not being in the report either.

 7              COMMISSIONER MERRIFIELD:  Thank you.

 8              CHAIRMAN MESERVE:  I'd like to thank you all very

 9    much.  It was very helpful to have your insights on this

10    program, and we appreciate the time you've spent with us this

11    afternoon.

12              With that, we're adjourned.

13              [Whereupon, at 4:25 p.m., the briefing was

14    concluded.]

15

16

17

18

19

20

21

22

23

24

25