                                                           1
          1                      UNITED STATES OF AMERICA
          2                    NUCLEAR REGULATORY COMMISSION
          3                                 ***
          4                             BRIEFING ON
          5               REACTOR OVERSIGHT PROCESS IMPROVEMENTS
          6                                 ***
          7                           PUBLIC MEETING
          8
          9
         10                             Nuclear Regulatory Commission
         11                             Commission Hearing Room
         12                             11555 Rockville Pike
         13                             Rockville, Maryland
         14
         15                             Monday, November 2, 1998
         16
         17              The Commission met in open session, pursuant to
          18    notice, at 2:07 p.m., the Honorable Shirley A. Jackson,
         19    Chairman, presiding.
         20
         21    COMMISSIONERS PRESENT:
         22         SHIRLEY A. JACKSON, Chairman of the Commission
         23         NILS J. DIAZ, Commissioner
         24         EDWARD McGAFFIGAN, JR., Commissioner
         25         JEFFREY S. MERRIFIELD, Commissioner
                                                                       2
          1    STAFF AND PRESENTERS SEATED AT COMMISSION TABLE:
          2         JOHN C. HOYLE, Secretary of the Commission
          3         KAREN D. CYR, General Counsel
          4         WILLIAM D. TRAVERS, EDO
          5         SAM COLLINS, NRR
          6         FRANK GILLESPIE, NRR
          7         MICHAEL JOHNSON, NRR
          8         PATRICK BARANOWSKY, AEOD
          9         JAMES LIEBERMAN, Office of Enforcement
         10         BRUCE MALLETT, Div of Reactor Safety,
         11          Region II
         12         JOHN FLACK, RES
         13         RALPH BEEDLE, NEI
         14         DAVID LOCHBAUM, UCS
         15
         16
         17
         18
         19
         20
         21
         22
         23
         24
         25
                                                                       3
          1                        P R O C E E D I N G S
          2                                                     [2:07 p.m.]
          3              CHAIRMAN JACKSON:  Good afternoon, everyone.  I am
          4    pleased to welcome members of the NRC staff to brief the
          5    Commission on the progress of planned improvements to the
          6    reactor oversight process and plans for and results of an
          7    initiative to improve the NRC assessment, inspection and
          8    enforcement processes for operating commercial nuclear
          9    reactors.
         10              Before we begin, however, I would like to take a
         11    moment to recognize the return of Ms. Greta Dicus to the
         12    Commission.  While she could not be with us today, we do
         13    welcome her back.  She was missed this summer.
         14              In addition, I would like to recognize and to
         15    welcome and to introduce to you Mr. Jeffrey Merrifield to
         16    his first Commission meeting.  Commissioner Merrifield, my
         17    colleagues and I look forward to working with you.  We have
         18    a lot to do, as you will get an inkling of this afternoon.
         19              Today's meeting represents a continuation of a
         20    dialogue which has existed between the Commission and the
         21    NRC staff since 1996 when, due to concerns over the
         22    subjectivity involved in the senior management meeting
         23    process, I directed the staff's attention toward seeking an
         24    external review of that process, which was the Arthur
         25    Andersen study.
                                                                       4
          1              Since that time and pursuant to Commission
          2    direction, the staff has developed proposals to modify and
          3    to improve the entire power reactor regulatory oversight
          4    process.  Not only the senior management meeting process,
          5    but power reactor performance assessment, which includes all
          6    of the constituent pieces, including SALP, as well as the
          7    inspection and enforcement processes.
          8              The reactor oversight process is intended to
          9    independently assess reactor plant performance, to
         10    facilitate the early identification of plants which require
         11    increased regulatory attention, and to direct regulatory
         12    actions towards such plants before the reasonable assurance
         13    of public health and safety is compromised.
         14              Our ultimate goal is to attain a clear, coherent
         15    picture of performance at operating reactor facilities in a
         16    way that leads to objective, consistent and predictable
         17    regulatory actions.  Through the reduction of subjectivity
         18    that can be afforded by the use of performance indicators
          19    and through the use of risk information, it is our intention
         20    to reduce unnecessary regulatory burden to the extent
         21    possible.
         22              The staff has quite properly considered the
         23    individual components of the reactor oversight process as an
         24    integrated whole in which components of the process work
         25    synergistically to achieve our objectives.  Today the staff
                                                                       5
          1    will describe its current activities to support these
          2    objectives and also should describe any incremental
          3    improvements to the process that already have been
          4    accomplished.
          5              We welcome this update which represents an amalgam
          6    of both staff and stakeholder thoughts on the subjects
          7    obtained through a number of NRC-stakeholder interactions,
           8    culminating in a well-attended and, I am told, fruitful
          9    workshop conducted during the week of September 28.  The
         10    workshop was sponsored by the NRC and was attended by
         11    numerous representatives of the NRC, licensees, the power
         12    reactor industry, public interest groups, and congressional
         13    staff.
         14              The Commission applauds the cooperative efforts of
         15    all involved at the workshop.
         16              At the conclusion of the staff's presentations,
         17    two stakeholders will provide brief remarks on the NRC
         18    efforts concerning the assessment process.  To represent the
         19    Nuclear Energy Institute (NEI), Mr. Ralph Beedle will
         20    present remarks.  To represent the Union of Concerned
         21    Scientists, Mr. David Lochbaum will provide remarks.  And I
         22    will call them to the table at the appropriate time.
         23              Copies of the slide presentation are available at
         24    the entrances to the meeting.  So unless my colleagues have
         25    any introductory comments, Dr. Travers, please proceed.
                                                                       6
          1              MR. TRAVERS:  Good afternoon, Chairman Jackson,
          2    Commissioner Diaz, Commissioner McGaffigan, Commissioner
          3    Merrifield.  As the chairman mentioned, we are here today to
          4    discuss the status of the staff's efforts to develop
          5    improvements in the NRC's inspection, assessment and
          6    enforcement processes.
          7              With me here at the table are Sam Collins, Frank
          8    Gillespie, and Mike Johnson of NRR; Pat Baranowsky of AEOD;
          9    Jim Lieberman of the Office of Enforcement; John Flack of
         10    the Office of Research; and, of course, Bruce Mallett from
         11    Region II.
         12              You mentioned an integrated whole, Chairman.  I
         13    think the example that I would offer you is the kind of
         14    cooperative effort that we have internally put together to
         15    work on these processes, as demonstrated by the different
         16    organizations, regions, and major program offices that are
         17    involved in this effort.
         18              Since receiving the initial tasking memorandum
         19    from Chairman Jackson on August 7th, the staff has given a
         20    high priority to furthering changes that are intended to
         21    better utilize risk information, clarify NRC requirements
         22    and expectations, and improve the predictability,
         23    objectivity and timeliness of NRC decisions.
         24              Particular emphasis has been placed on addressing
         25    specific aspects of the reactor oversight program.  The
                                                                       7
          1    EDO's August 25th memorandum to the Chairman provided the
          2    staff's short- and long-term plans, including detailed
          3    milestones and deliverables for a number of the most
          4    important issues.
          5              In the two months since the initial response, the
          6    staff has increased its level of effort in order to
          7    accelerate ongoing improvements in the performance
          8    assessment, inspection and enforcement programs.  We remain
          9    substantially on track in our efforts.
         10              As you know, on September 15th the Commission
         11    approved the suspension of the systematic assessment of
          12    licensee performance, or SALP, program, for an interim period until the
         13    staff completes a review of its process for assessing
         14    licensee performance.
         15              The suspension of SALP has freed staff resources
         16    to work on this project, and, as a result, region based
         17    managers and inspectors have been able to be assigned as
         18    dedicated members on each of the teams assigned to develop
         19    the technical framework, inspection and assessment models. 
         20    This is truly an integrated effort.
         21              There has been a significant amount of interaction
         22    with stakeholders for this effort.  Chairman, you mentioned
         23    the workshop.  At that workshop we were able to achieve our
         24    goals for the workshop by obtaining alignment of the
         25    participants on the basic framework for the process and its
                                                                       8
          1    defining principles.
          2              Although significant progress has been made, it's
          3    really just the first step and a significant amount of work
          4    remains to address the details.  Between now and the middle
          5    of December there are meetings scheduled with stakeholders
          6    at the working group level nearly every week to continue to
          7    refine and add to the progress we have already made.  We
          8    view these continued interactions with stakeholders as a
          9    critical factor in developing an acceptable overall
         10    inspection, assessment and enforcement framework, and these
         11    interactions will continue to be a priority for us.
         12              As you know, we are working to provide the
         13    Commission with the results of our work, including a staff
         14    recommendation, by January.
         15              At this point, I would like to turn it over to
         16    Frank Gillespie, who is going to begin the process of
         17    describing what we have been up to.
         18              [Slides shown.]
         19              MR. GILLESPIE:  Good afternoon.  We are here this
         20    afternoon to present to the Commission a brief background
         21    review of the oversight process improvement effort completed
         22    to date, a status of current staff activities, near term
         23    goals, and to discuss long-term activities required to
         24    implement process improvements.
         25              While these efforts were originally focused on
                                                                       9
          1    improvements to the assessment process, the task has evolved
          2    to a more broadly based effort involving the close
          3    integration of inspection, assessment and enforcement.
          4              In addition, there are several other process ties
          5    to these efforts which have been recognized, such as the
          6    allegation process, licensee reporting process, and
          7    risk-informed regulation.
          8              We last briefed the Commission on April 2nd on the
          9    staff proposal which resulted from the integrated review of
         10    the assessment process (IRAP) effort.  The objective of the
         11    IRAP review was to develop a single integrated assessment
         12    process which provided greater objectivity, predictability,
         13    and scrutability.
         14              The fundamental concepts which formed the basis of
         15    the IRAP proposal were that:
         16              Inspection results provided the basis for the
         17    assessment.
         18              Inspection findings would be categorized by
          19    performance template areas and scored based on safety
          20    significance; assessment would be accomplished by totalling
         21    the scores in each template area and comparing them against
         22    thresholds; and NRC actions would be taken based on a
         23    decision model.
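
              [Illustrative sketch only:  the scoring concept just described
    can be expressed roughly as follows.  The template areas, significance
    scores, and thresholds below are invented for the example and are not
    the staff's actual model.]

        # Rough, hypothetical sketch of the IRAP scoring concept.
        from collections import defaultdict

        # Assumed safety-significance scores for inspection findings.
        SIGNIFICANCE_SCORES = {"low": 1, "moderate": 3, "high": 10}

        # Assumed per-template-area thresholds that would trigger NRC action.
        AREA_THRESHOLDS = {"operations": 15, "engineering": 15}

        def assess(findings):
            """Total the scores in each template area and flag any area
            whose total meets or exceeds its threshold.

            `findings` is a list of (template_area, significance) tuples.
            """
            totals = defaultdict(int)
            for area, significance in findings:
                totals[area] += SIGNIFICANCE_SCORES[significance]
            return {area: total for area, total in totals.items()
                    if total >= AREA_THRESHOLDS.get(area, float("inf"))}

        # Example:  repeated findings in one template area cross its threshold.
        print(assess([("engineering", "high"), ("engineering", "moderate"),
                      ("engineering", "moderate"), ("operations", "low")]))
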
         24              Since the submittal of the IRAP proposal in
          25    SECY-98-045, the staff has received feedback on a proposal
                                                                      10
          1    from the ACRS and the Commission.
          2              In a letter to the Commission dated March 13th
          3    ACRS recommended the staff take a top-down approach to
          4    developing improvements in the assessment process.
          5              In a staff requirements memorandum dated June 30th
          6    the Commission expressed concerns with the use of
          7    enforcement as a driving force for the assessment process,
          8    the quantitative scoring of PIM entries, and the use of
          9    color coding for performance ratings.  However, the
         10    Commission did approve the solicitation of public comments
         11    on the IRAP proposal.
         12              In parallel with the development and consideration
         13    of the IRAP proposal, the industry developed an independent
         14    proposal for improvement of the assessment process.  This
         15    effort, led and coordinated by the Nuclear Energy Institute,
         16    resulted in a proposal that was fundamentally and
         17    philosophically different from the IRAP proposal.
         18              This proposal took a top-down approach and
         19    established tiers of licensee performance based on
         20    maintaining barriers to radionuclide release, minimizing
         21    events that could challenge these barriers, and ensuring
         22    that systems can perform their intended function. 
         23    Performance in these tiers would be measured through
         24    reliance on high level objective indicators with thresholds
         25    set for each indicator to form a utility response band, a
                                                                      11
          1    regulatory response band, and a band of unacceptable
          2    performance.
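
              [Illustrative sketch only:  the three response bands described
    above amount to classifying an indicator value against two thresholds.
    The indicator name and the numbers below are hypothetical.]

        # Rough, hypothetical sketch of the tiered response-band concept.
        def response_band(value, regulatory_threshold, unacceptable_threshold):
            """Classify an indicator value into one of three bands.

            Below the first threshold the utility monitors and corrects on
            its own; between the thresholds the NRC increases attention;
            at or above the second threshold performance is unacceptable.
            """
            if value < regulatory_threshold:
                return "utility response band"
            if value < unacceptable_threshold:
                return "regulatory response band"
            return "unacceptable performance"

        # Example with an invented "unplanned scrams per year" indicator.
        print(response_band(2, 3, 6))   # utility response band
        print(response_band(4, 3, 6))   # regulatory response band
        print(response_band(7, 3, 6))   # unacceptable performance
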
          3              So in response to the IRAP SRM, the NEI proposal,
           4    input from the July 17th Commission meeting with public and
          5    industry stakeholders, and the July 31st hearing before the
          6    Senate, the staff set out to develop a single recommendation
          7    for improvement to the regulatory oversight process which
          8    places an appropriate regulatory burden on licensees.
          9              This recommendation is intended to preserve the
         10    core values of regulatory oversight which are to carry out
         11    the agency's mission of protection of the health and safety
         12    of the public and to do this in a risk-informed and
         13    performance-based manner, and to account for the NRC's
         14    principles of good regulation:  independence, openness,
         15    efficiency, clarity, reliability.
         16              This recommendation should further reduce the
         17    burden for good performing plants but retain the ability to
         18    provide a strong focus on those licensees with significant
         19    performance problems.
         20              The approach taken by the staff to develop a
          21    framework for regulatory oversight was a top-down
         22    approach.  The staff started with the mission of the agency
         23    and then worked down to identify those cornerstone areas
         24    which provide the foundation for meeting our mission.  The
         25    staff then identified and addressed those key issues which
                                                                      12
          1    form the defining principles to be used in the redesign of
          2    the regulatory oversight process.
          3              For the cornerstones of safety the staff is
          4    applying a set of defining principles and a risk-informed,
          5    performance-based perspective to identify what is important
          6    to measure in each cornerstone and how it can be measured. 
          7    During this process the staff identified important ties to
          8    other key processes such as enforcement, allegations,
          9    licensing, which should be addressed in the oversight
         10    framework.
         11              CHAIRMAN JACKSON:  Frank, could you go back to 4. 
         12    Have you had any discussions about the role or continuing
         13    role of what have been the elements of the oversight and
         14    assessment process, namely, SALP, PPR, SMM?  Have you come
         15    to any discussion about whether they would be retained or
         16    retained in their current form, or is that premature at this
         17    stage of the game?
         18              MR. GILLESPIE:  I think it would be premature to
         19    give you details, but if I could take you back to one of our
         20    first presentations on the key attributes, positive and
         21    negative, of those processes, some of those positive key
         22    attributes definitely are going to affect our assessment
         23    process group and how we interface, for example, with the
         24    public.  Public meetings of some frequency appear to be an
         25    important public attribute that we want to retain.  This
                                                                      13
          1    also came up when we suspended SALP.  So there are some
          2    important positive attributes to what we were doing which we
          3    would intend to factor into how we carry out this process.
          4              While working through the process to develop the
          5    cornerstones to regulatory oversight, the staff recognized
          6    the importance of both internal and external input.  As
          7    directed by the IRAP SRM, a 60 day comment period on the
          8    IRAP and cornerstone concept was completed on October 6. 
          9    The staff received 26 submittals in response to the public
         10    comment to the Federal Register notice and is reviewing and
         11    evaluating them during the continuing development process.
         12              There have been numerous public meetings with the
          13    industry, ACRS, and regional and headquarters staff to obtain
         14    feedback on development of the cornerstone concepts.  These
         15    interactions with both internal and external stakeholders
         16    are continuing throughout the development of recommendations
         17    for improvement in the oversight process.
         18              A four day workshop, as was mentioned, was
         19    conducted by the staff on September 28 through October 1 to
         20    enable the staff to interact with industry, the public, and
         21    the NRC itself, our own staff, to obtain and evaluate input
         22    on improvements to the regulatory oversight process.
         23              Over 300 people attended the workshop, with broad
         24    participation from the NRC headquarters and regional staffs,
         25    individual licensee representatives, INPO, NEI, and
                                                                      14
          1    participation from the Union of Concerned Scientists, GAO,
          2    state regulatory agencies, foreign regulatory agencies, the
          3    Office of the Inspector General, and Senate staff.
          4              There were several significant accomplishments
          5    achieved at the workshop which have contributed to the
          6    continued development of improvements to the regulatory
          7    oversight process.
          8              I use the next words very carefully and very
          9    deliberately, because there is a scaling, as you will see in
         10    this viewgraph.
         11              Workshop consensus was reached on the overall
         12    framework for regulatory oversight and the objective
         13    definitions for each cornerstone of safety.
         14              Good alignment was achieved on the defining
         15    principles for the oversight process, with two significant
         16    issues remaining open:  the integration of data and the
         17    nature of the data reporting program.  These issues will be
         18    further discussed later.
         19              CHAIRMAN JACKSON:  Have you had any discussions at
         20    all or any preliminary discussions with INPO about data that
         21    INPO collects?
         22              MR. GILLESPIE:  Not to date, but it was one of the
         23    agenda items that we just put in.  There is an INPO senior
         24    management meeting, I think next week, and this is one of
         25    the agenda items that we are suggesting.
                                                                      15
          1              Finally, significant progress was achieved in
          2    developing a process for selecting performance indicators,
          3    thresholds, and for identifying necessary inspection areas
          4    for each cornerstone.
          5              CHAIRMAN JACKSON:  Maybe you had better back up
          6    for a minute.  Can you be a little bit more explicit?  I
          7    think we all know what consensus is.  Tell us about
          8    alignment and progress.  I'm sure we are going to hear from
          9    our other stakeholders.  I'm interested in their perceptions
         10    about where we are in those areas.
         11              MR. GILLESPIE:  We had set ourselves some
         12    objectives before the meeting.  Our main objective was to
         13    try to develop a consensus on the framework, the highest
         14    level.  The framework evolved and actually changed at the
         15    meeting.  The picture, if you would, looks different today
         16    than it did before the meeting, and that has evolved.
         17              There is a general agreement on what those
         18    cornerstones are, and more importantly, a very intense
          19    discussion on the objectives that are attached to each
         20    cornerstone and what it means.  This became very, very
         21    important because it takes you to the next step.  If you
         22    have your objectives stated, what information is needed to
          23    say that you have reasonable assurance the objectives are
         24    met?
         25              Going down to alignment, our next desire was to
                                                                      16
          1    achieve alignment.  You might say that in the simplest sense
          2    this was a broad majority.  In trying to come up in a
          3    plenary session where we summarized each of the working
          4    session findings, this in general would represent about an
          5    80 percent sense of alignment.  We stated on the first day
          6    of the meeting that that is about what we were trying to get
          7    on this topic.  This was an area where it was clear that in
           8    a four day workshop you weren't going to achieve
           9    perfection.
         10              The defining principles are the basic structure
         11    that set the stage for how inspection integrates with
         12    assessment, assessment interacts with enforcement, and sets
         13    the stage for the basic assumption that you in fact believe
         14    you can set thresholds and have performance indicators.  So
         15    alignment there.
         16              Progress was made in discussions -- and now we
         17    were on about the last two days of the workshop -- on what
         18    are the performance indicators.  Generally they are
         19    perceived to be more than what is in the INPO indicators or
         20    what are indicators today, but clearly some of those are
         21    included in the indicators:  Where will we inspect?  How
         22    much will we inspect?  That clearly did not get decided, but
         23    we made significant progress, and we will be going through
         24    that a little later on.
         25              What are the kinds of things we should inspect? 
                                                                      17
          1    One of the rules we had coming out of this was there is a
          2    minimum risk-informed baseline inspection that would be done
          3    everyplace.
          4              The last one, how do you select a threshold?  What
          5    is the logic you use?  Is it risk-informed?  There are two
          6    thresholds in each indicator.  There is a threshold which is
          7    the operating threshold at the top level, where you would
          8    start to get the regulator to come in and start to take a
          9    gradually increasing action.  Then there is a threshold as
         10    might be represented in the NEI paper, a threshold of
         11    shutdown, that ultimate regulatory threshold.
          12              The focus of this meeting was really on the upper
          13    threshold; more time was spent on it than on the lower
          14    threshold.
         15              COMMISSIONER McGAFFIGAN:  On the thresholds, I'm
         16    trying to understand the concept.  We're going to have
         17    multiple items that get graded, as I understand the NEI
         18    scheme, green, white, red, in utility response space,
         19    regulatory response space, unacceptable space.  How do you
         20    integrate all of the indicators into an overall green?
         21              Is it fair for us, if a plant is sailing along in
         22    green in most indicators, but, using SALP terms, not doing
         23    so well in engineering, for us to take regulatory action in
          24    the engineering area or the operational area?
         25              I know that isn't the concept anymore; you break
                                                                      18
          1    these things down differently.
          2              Have you talked through in practice how this thing
          3    works and where the regulatory threshold is?
          4              MR. GILLESPIE:  Mike is going to cover that a
          5    little bit in assessment.  I'm really trying very hard not
          6    to prejudge where the teams are going to come out.
          7              COMMISSIONER McGAFFIGAN:  My concern, and I'll say
          8    it to the stakeholders as well, is the idea I think sort of
          9    embedded in a lot of our thinking is that this thing is
         10    going to be ready to go in January.  That's about two months
         11    from now.  If it's going to be ready to go in January or
         12    February or March, you'd already be wanting to train the
         13    people out there in the field, right?
         14              CHAIRMAN JACKSON:  There is a schedule in here.
         15              MR. GILLESPIE:  I'm going to go over the schedule. 
         16    This is a very, very important point on expectations.  Each
         17    task member is going to cover what we hope to have and how
         18    far we have gotten to deliver in January.  If you look way,
         19    way ahead to the schedule, our January deliverable is the
         20    concepts that if applied to the inspection program, if
         21    applied to the assessment program, would allow us then to
          22    move forward and then rewrite the inspection procedures, and
         23    then write the procedures on how we are going to do
         24    assessment.  So everything will not be done by January
         25    relative to implementation.
                                                                      19
          1              CHAIRMAN JACKSON:  Actually, his question segues
          2    into a question that I have.  Do you have other such
          3    workshops planned and scheduled that are as robust as the
          4    one that happened the week of September the 28th?
          5              MR. GILLESPIE:  There are none planned right now. 
          6    There are some under discussion.
          7              CHAIRMAN JACKSON:  I think that may be a good
          8    thing given the Commissioner's question.  It's something I
          9    think you ought to think about.
         10              COMMISSIONER McGAFFIGAN:  I just want to
         11    understand where we are going to be in fiscal 1999.  We are
         12    going to be trying to develop this process -- I did glance
         13    ahead at the viewgraph -- but very little of this will
         14    actually be being practiced in 1999.  So next spring we will
         15    have the typical senior management meeting, the roll-up to
         16    the senior management meeting.  You guys will do whatever
         17    you do.  Is that the thought?
         18              MR. GILLESPIE:  Yes.  This April there will be the
         19    typical senior management meeting, and we are continuing on
         20    through the process we currently have in place.  The
         21    inspection portion would be implemented between January and
         22    October, and the assessment process would be going out to
         23    June of 2000.
          24              I'd like to actually leave that until Pat talks.  Pat
          25    is going to talk about where he thinks he will be relative to
                                                                      20
          1    performance indicators.  Otherwise, I will end up giving --
          2              CHAIRMAN JACKSON:  Stealing his thunder.
          3              MR. GILLESPIE:  Yes.
          4              CHAIRMAN JACKSON:  They have it all worked out.
          5              MR. GILLESPIE:  Then we can come back.  I think
          6    expectations need to be honest and up front.
          7              CHAIRMAN JACKSON:  Okay.
          8              MR. GILLESPIE:  Going on to slide 7, which is our
          9    outline, as previously stated, we feel that we have good
         10    external and internal stakeholder consensus on the framework
         11    for an improved regulatory oversight process.
         12              This framework was developed using a top-down
         13    approach.  It starts at the highest level, with the NRC's
         14    overall mission.  This mission statement is based on the
         15    Atomic Energy Act of 1954, as amended, the Energy
         16    Reorganization Act of 1974, as amended.  The mission of the
         17    NRC as it applies to commercial nuclear power plants is to
         18    ensure that these facilities are operated in a manner that
         19    provides reasonable assurance of adequate protection of
          20    public health and safety and the environment, protects
         21    against radiological sabotage, and theft and diversion of
         22    special nuclear materials.
         23              The mission of protecting the public health and
         24    safety is a responsibility that we also share with the
         25    licensees.
                                                                      21
          1              Given this mission, the next step was to identify
          2    those aspects of licensee performance that are important and
          3    therefore merit regulatory oversight.
          4              The NRC's strategic plan identifies the
          5    performance goals to meet for nuclear reactor safety and
          6    includes:
          7              Maintain a low frequency of events that could lead
          8    to a reactor accident.
           9              Zero significant radiation exposures resulting
         10    from civilian nuclear reactors.
         11              No increase in the number of offsite releases of
          12    radioactive materials from civilian nuclear reactors that
         13    exceed 10 CFR Part 20.
         14              No substantial breakdown of physical protection
         15    that significantly weakens protection against radiological
         16    sabotage or theft or diversion of nuclear materials.
         17              These performance goals reflect those areas of
         18    licensee performance for which the NRC has regulatory
         19    responsibility in support of our overall agency mission. 
         20    These performance goals were represented in the framework
         21    structure as the strategic performance areas of reactor
         22    safety, radiation safety, and safeguards, and form the
         23    second level of the framework.
         24              For each of those strategic performance areas
         25    there are many regulatory requirements.  However, with a
                                                                      22
          1    risk-informed perspective, it was possible to identify those
          2    most important elements in each strategic performance area
          3    which formed the foundation for meeting the overall agency
          4    mission.
           5              These elements are identified as cornerstones and
           6    form the third level of the framework.  As an example, the
           7    objective of the initiating events cornerstone is to limit the
          8    frequency of events that upset plant equilibrium and
          9    challenge critical safety functions.
         10              Acceptable licensee performance in this and other
         11    cornerstones should provide reasonable assurance that the
         12    overall mission of adequate protection of the public health
          13    and safety is met.
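
              [Illustrative sketch only:  the top-down hierarchy described
    here -- mission, strategic performance areas, cornerstones -- can be
    pictured as a simple nested structure.  Only the names actually
    mentioned in this briefing are filled in; the remaining cornerstones
    are left empty rather than guessed.]

        # Hypothetical, partial sketch of the framework hierarchy.
        FRAMEWORK = {
            "mission": "reasonable assurance of adequate protection of "
                       "public health and safety",
            "strategic_performance_areas": {
                # Only the initiating events cornerstone is named at this
                # point in the briefing; others are not listed here.
                "reactor safety": ["initiating events"],
                "radiation safety": [],
                "safeguards": [],
            },
        }

        print(FRAMEWORK["strategic_performance_areas"]["reactor safety"])
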
         14              The cornerstones provide the fundamental building
          15    blocks for the regulatory oversight process and provide
         16    reasonable assurance that the overall safety mission is met. 
         17    However, there are other aspects of licensee performance
          18    such as human performance, safety conscious work environment,
          19    and problem identification and resolution, which are not captured as
         20    cornerstones but are equally important to meeting our safety
         21    mission.
         22              The staff concluded that these items and others
         23    generally crosscut the affected areas and manifest
         24    themselves as causes of performance problems.
         25              Licensee performance in these crosscutting areas
                                                                      23
          1    should therefore be dealt with in each of the cornerstone
          2    areas as contributors to performance as measured by
          3    indicators and as observed through inspection.
          4              Once the cornerstones and objectives were
          5    established, we then had the basis for determining what
          6    information was needed to provide reasonable assurance that
          7    the objectives were being achieved.  This included what
          8    performance attributes are in each cornerstone, what is
          9    important to measure for each attribute, what aspects of
         10    performance can reasonably be measured with objective
         11    indicators, what areas of performance should be measured
         12    through inspection, and what are the appropriate thresholds
         13    for NRC interaction.
         14              These cornerstones provide the foundation for
         15    improvements in inspection, assessment and the enforcement
         16    process.
         17              Once the framework was established, key issues
         18    were discussed and agreed upon which formed the defining
         19    principles for regulatory oversight.  These defining
         20    principles are essential to the continued development and
         21    improvement to the oversight process since they form the
         22    rules against which the cornerstone details will be
         23    developed.
         24              Further, these defining principles also establish
         25    the relationship between elements of the oversight process
                                                                      24
          1    such as enforcement and inspection.  These defining
          2    principles are:
          3              There will be a risk-informed baseline inspection
          4    program that establishes the minimum regulatory interaction
          5    with licensees.
          6              Thresholds can be set for licensee safety
          7    performance.
          8              Performance indicators, supplemented with some
          9    inspection, will form a rebuttable presumption for licensee
         10    assessment.
         11              A risk-informed baseline inspection program will
         12    be performed for all licensees and should cover those
         13    risk-significant attributes of licensee performance not
         14    adequately covered by performance indicators.  The
         15    inspection program will also verify the adequacy of the
         16    performance indicators and provide for event response.
         17              In most cases, inspection observations are
         18    expected to complement the performance indicator results. 
         19    However, when warranted, risk-significant inspection
         20    observations can be used to overturn the indicator results
         21    when the inspection observations develop a compelling case
         22    that the performance indicators are not accurately
         23    reflecting licensee performance.
         24              Enforcement actions taken should not be an input
         25    to the assessment process.
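
              [Illustrative sketch only:  the "rebuttable presumption"
    principle above can be expressed as a simple decision rule.  The band
    names and the boolean "compelling case" flag are assumed for
    illustration; how "compelling" is defined is still open, as the
    discussion below notes.]

        # Hypothetical sketch of PI results as a rebuttable presumption.
        def assessed_band(pi_band, inspection_band, compelling_case):
            """Start from the performance-indicator result; overturn it only
            when risk-significant inspection observations make a compelling
            case that the PIs do not reflect actual performance."""
            return inspection_band if compelling_case else pi_band

        # Example:  indicators look acceptable, but inspection makes a
        # compelling, risk-significant case to the contrary.
        print(assessed_band("utility response band",
                            "regulatory response band",
                            compelling_case=True))
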
                                                                      25
          1              CHAIRMAN JACKSON:  Frank, can you hold up one
          2    second.
          3              COMMISSIONER DIAZ:  Let's go back to these last
          4    two bullets on page 8a.  I just want to make sure you are
          5    going to eventually add some definition to the term
          6    "adequacy."  It's a broad term.  I don't know whether you
          7    mean adequacy in the whole context of what the process is or
          8    you are talking about the accuracy of the indicators in
          9    predicting, or all of the above.
         10              MR. GILLESPIE:  It's actually all of the above. 
         11    Going down to the next, "will verify the adequacy of the
         12    performance indicators," which is going to be very
          13    important:  that we are getting consistent information reported
         14    consistently with the same definitions from all licensees.
          15              The other piece, if I can take a simple example of
          16    a PI, might be total scrams.  Total scrams reflect how
          17    operators in a control room and generation operations react,
          18    but it might be that total scrams may not reflect what
          19    needs to be inspected, and that's operator reaction during
          20    an event.
         21              We do get to observe that on a simulator.  So that
         22    might suggest that while the PI as a general oversight does
         23    touch upon human and operator performance during a
         24    reactivity transient, we have to look at what is the risk
         25    significance of observing operators in an accident situation
                                                                      26
          1    on the simulator.  So while we may have a PI that has
          2    breadth, it may not have enough depth in an area that is
          3    risk significant.
          4              COMMISSIONER DIAZ:  I just wanted to point out
          5    that that word "adequacy" is an extremely important word. 
           6    It is the definition of adequacy on which the process kind of
          7    hinges.  So it is something that you might not be able to
          8    address now, but the Commission will be looking in January
          9    at how adequacy is defined.
         10              The other word is "compelling," this compelling
         11    case, which is kind of the second step.  What is a
         12    compelling case?
         13              COMMISSIONER MERRIFIELD:  If I can add, what kind
         14    of burden.  You are using compelling case.  What kind of
         15    burden of proof does that put on the person seeking to sort
         16    of overturn the indicator results?
         17              MR. GILLESPIE:  The burden is clearly going to be
         18    on the staff to overturn the indicator results, not the
         19    licensee.  "Compelling" is part of the assessment group that
         20    we are still trying to work on.  That is going to be a very
         21    interesting definition to develop.  Developing it may be
         22    more interesting than the final definition.
         23              COMMISSIONER DIAZ:  Yes.  I just want to point out
         24    that those two words need to be well defined and some kind
          25    of boundaries put around them so we can actually know what we
                                                                      27
          1    are doing.
          2              CHAIRMAN JACKSON:  Are you going to have
          3    thresholds for event response?  Have you addressed that at
          4    all?
          5              MR. GILLESPIE:  Yes, we are going to address it. 
          6    Pat is working on it.  Chairman Jackson, I think you mean
          7    the kind of event that could happen that is a high risk
          8    event might trip whatever PIs we have multiple but not push
          9    past the threshold.  Clearly we are going to have to have
         10    and will have some definition of how we deal with the
         11    exception that goes across in a high risk situation like
         12    that on multiple items.
         13              MR. COLLINS:  Commissioner Diaz, what is unique
         14    about the process that we are in, which is a little
         15    different than perhaps what we are used to historically, is
         16    that we expect not only for the indicators to evolve over
         17    time with experience, but we expect as the industry matures
         18    those indicators might actually change over a period of time
         19    in response to either aging considerations, license renewal
         20    concerns, or other challenges that are brought forth by a
         21    collection of data, which might be different than what we
         22    are actually measuring today.
         23              The second aspect which would allow that to happen
         24    is that all the information is shared.  So it is not an
         25    instance as perhaps we found ourselves in the past where we
                                                                      28
          1    are justifying a SALP assessment which has broad subjective
          2    statements with very little data and criteria.  This
          3    information will all be laid out in front of the licensees
          4    as well as the staff.  It will be scrutable on a mutual
          5    basis.
          6              In any case, the words "adequacy" and "compelling"
          7    will be the subject of joint considerations as far as what
          8    is it going to take to make this process work, both for the
          9    industry, the NRC, and for the other stakeholders.  So we
         10    would expect this development not to end when the process is
         11    put in place on trial in June of 1999, but for the
         12    stakeholder involvement and for the evolution to continue
         13    and for it to be a very scrutable process.
         14              COMMISSIONER McGAFFIGAN:  I'm a little concerned
         15    about managing performance indicators and getting overly
         16    dependent on just a bunch of numbers because they happen to
         17    be what you can measure.  Maybe this is an area that is
         18    covered by one of these categories where PIs are
         19    inadequately covered.
         20              We constantly run into cable separation issues,
         21    fire protection issues at plants.  We went looking at a
         22    couple of plants that came out of start-up that had cable
         23    separation issues in their cable spreading rooms, and they
         24    had to go in and take actions.  I don't know how you have a
         25    performance indicator for whether they have adequately
                                                                      29
          1    handled fire protection.
           2              Are these indicators that you are working with so
           3    good?  At this point in the process you should have at least
          4    an existence proof.  If you had had these indicators in play
           5    over the last five years, how good are they at predicting NRC
          6    regulatory actions?
          7              I'm not saying that all of our regulatory actions
          8    were perfect and I'm not saying all of our scoring was
          9    perfect, but are there SALP 1 plants that are in the red
         10    zone, and are there SALP 3 plants that are firmly at the top
          11    of the green zone?  What do the data tell you when you look
         12    at some of these, looking backward at the regulatory actions
         13    that we took, whether they were the right actions?
         14              MR. GILLESPIE:  You asked two questions.
         15              The first one is we are still developing specific
         16    indicators and thresholds.  So we have not taken an
         17    independent retrospective look.  But NEI has provided us
         18    with some insights on some work that they did.  So we are
         19    going to have to take a retrospective look at these various
         20    plants once we get the indicators done.
         21              The second question is partially addressed in our
         22    thought process on backup slide number 2.  If you look at
          23    mitigating systems and go through desired result, important
         24    attributes -- this was a straw man we kind of used at the
         25    meeting --
                                                                      30
          1              COMMISSIONER McGAFFIGAN:  That's going to be
          2    helpful.
          3              MR. GILLESPIE:  Yes.  You're not going to see it
          4    up there.  You really have to see it on the paper.
           5              What we came down to is a realization that
          6    performance indicators were not going to cover all the
          7    blocks.  If you look at adequate controls to maintain plant
          8    design, you will see on the left-hand side inspection,
          9    design programs.
         10              So part of our logic is asking the question, how
         11    much information do we need about high risk systems and
         12    components relative to mitigation systems, and where can
         13    that information be made available?  It is clearly not all
         14    going to be available from performance indicators.
         15              This is an illustration of design inspection
         16    needed, validation of PIs needed, and potentially some work
          17    on the licensed operator requalification program, which is personnel
         18    during accidents.
         19              COMMISSIONER McGAFFIGAN:  Where does inspecting
         20    adequate fire protection come in?
         21              MR. GILLESPIE:  In the separation?
         22              COMMISSIONER McGAFFIGAN:  Cable separation or
         23    other fire issues, where does that come in.
         24              MR. GILLESPIE:  Pat.
          25              MR. BARANOWSKY:  The way things like inspecting
                                                                      31
          1    risk-significant areas that are associated with the fire
           2    program would come in is this:  as we go through the indicators and
          3    determine what their capabilities are, we are going to
          4    identify things that they can't do, and then the
          5    risk-informed inspection program is supposed to be focused
          6    on the most significant areas that we would do our audit
          7    inspections, either ourselves or oversee what the licensee
           8    is doing.  There is not going to be an indicator that can find
           9    design flaws or tell you what the implication of design
          10    flaws is before we send an inspection team out there.  I think
         11    this is part of the information that has to be integrated
         12    with the indicators in order to perform an assessment of the
         13    licensee's performance.
         14              CHAIRMAN JACKSON:  So you are saying that the
         15    inspection program really has two purposes.  One has to do
         16    with what you call the verification or the validation of the
         17    PIs, but also separate and apart from that, to get at issues
         18    such as what the Commissioner mentioned, that a performance
         19    indicator may not so easily pick up per se.
         20              MR. BARANOWSKY:  For instance, if we take the Quad
         21    Cities high risk fire situation, the current regulatory
         22    program, without assistance from PRA, didn't find it, and
         23    the performance indicators by themselves couldn't indicate
         24    it, but it required a risk-informed look in that area to
         25    find the issue.  Then we took an appropriate regulatory
                                                                      32
          1    response to that situation.  I think we are going to see
          2    some of that.
          3              COMMISSIONER McGAFFIGAN:  So a possible goal is
          4    freeing up resources that at the moment may be focused on
          5    less risk-significant things and will allow people to do
          6    smart sampling in areas that are more risk significant.  Is
          7    that one of the goals?
          8              MR. BARANOWSKY:  NRC and licensee resources.
          9              COMMISSIONER McGAFFIGAN:  You said since you don't
          10    have the indicators yet you can't run a truth test on them.
         11    The world should know that when we had you all in the senior
         12    management meeting process adopt some of the Arthur Andersen
         13    indicators, and Arthur Andersen had selective ones in their
         14    public report without plant names attributed to them, that
         15    the Commission at least had all 105 plants.  This was not
         16    something that would have been outside the reach of NEI
         17    probably to replicate either except for one of the
         18    indicators.  So we could look at hits.
         19              The Arthur Andersen model, we could look at it; we
         20    could see how well it replicated the past and understand the
         21    differences.  And the GAO, of course, with the benefit of
         22    also seeing what we had seen, could come in and say, as
         23    Arthur Andersen said, that there had been, if anything,
         24    according to the Arthur Andersen model, a little bit of a
         25    bias to pull our punches.  Plants were getting hits reaching
                                                                      33
          1    Andersen thresholds before we took action, with only one or
          2    two in the other extreme where we took action and the hits
          3    never got very high.
          4              I hope that the Commission can have that sort of
          5    thing.  In this case, since it's a public process involving
           6    the stakeholders, you might as well all be working off the
          7    same data sheet when you decide whether this new set of
          8    indicators, whichever initial set you use, is a good way to
          9    go or not.  I would hope that we would have a backward
         10    looking look at how well these things do.
         11              MR. GILLESPIE:  Absolutely.  In fact, this is one
         12    of the reasons we feel, I don't want to say confident, but
         13    at least comfortable, and I guess I should say confident,
         14    that we can develop a set of performance indicators because
         15    of the Arthur Andersen and some of the work, without plant
         16    names, that NEI has shared with us in public meetings; that
         17    we do have a success path.  But we are going to have to
         18    select specific indicators and retrospectively look and test
         19    those exact indicators.
         20              CHAIRMAN JACKSON:  I had a question somewhere that
         21    was going to ask you that question, about how you were going
         22    to go about actually testing the indicators.  I think
         23    without some algorithm as to how the indicators get folded
         24    together to help you reach some decision threshold, you can
         25    look at whether they are red, green or blue, or one, two or
                                                                      34
           1    three until you are blue in the face.  So the real question
          2    becomes, how do you actually meld them to make some
          3    decisions?
          4              MR. COLLINS:  Chairman, I think in fairness to
          5    Commissioner McGaffigan's question, the premise about
          6    fitting the data over the new system will have to include
          7    the understanding that the new system allows for a response
          8    mechanism to kick in when the data starts to trend, which
          9    our previous process does not.
         10              When we overlay data on to the new window system,
         11    if you will, it will show trends, but it will not include
         12    what we would expect the licensee's reaction to be if that
         13    trend is declining and how that would have gone under the
         14    new system, and if it continues into the white, because the
         15    assumption would be that there is no licensee action because
         16    you are overlaying an old system on to a new system, that
         17    the NRC engages, that we engage in a way that creates a
         18    turnaround.  So we are going to have the same information
         19    overlaid on different processes, and I think we need to be
         20    fairly careful with that given the fact that there is
         21    already an end result, depending on the plants we look at.
         22              If we look at the plants which are "problem
         23    plants," it may actually be more informative if we pick a
         24    case where we know where a licensee picked up a problem
         25    early and in fact responded to it, or we picked up a problem
                                                                      35
          1    early and the licensee responded to it, which is probably
          2    not a problem plant situation but maybe a different type of
          3    performer, to see whether the curve actually reflects the
          4    improvement in performance.  So we will have to look at the
          5    case study to ensure that we are really proofing the system
          6    under the new system, under the new processes.
          7              COMMISSIONER MERRIFIELD:  At what point in the
          8    schedule do you think you will have defined those
          9    performance indicators?
         10              MR. GILLESPIE:  January.  And hopefully by January
         11    we will have defined at least the process for doing
         12    thresholds.
         13              COMMISSIONER McGAFFIGAN:  I will just interpose a
         14    comment if I could.  Typically the way this place works, if
         15    a paper is due to us in January, you've got a first draft
         16    drafted at the moment and it is probably going around in
         17    some sort of concurrence.  So you are going to have some
         18    very fast drafting and some very fast concurring, I assume.
         19              CHAIRMAN JACKSON:  Bill is working all that out.
         20              COMMISSIONER McGAFFIGAN:  Okay.
         21              MR. TRAVERS:  We're going to give you an update
         22    every couple of weeks.
         23              MR. GILLESPIE:  Going on to slide 9.  Although
         24    there was good alignment on most of the defining principles,
         25    there were still two key issues on which consensus was not
                                                                      36
          1    reached.  The first involved how indicator results,
          2    inspection observations, and information sources such as
          3    FEMA results and LERs will be integrated in the assessment
          4    conclusion.
          5              Although there was no consensus on this topic,
          6    there was good agreement that indicators and other
          7    information sources should not be artificially merged.
           8              It has also been acknowledged that while current
          9    assessment processes such as the semiannual plant
         10    performance reviews and annual senior management meetings
         11    may be able to accomplish this integration, objectivity and
          12    scrutability would then be a challenge.
         13              COMMISSIONER MERRIFIELD:  I take it that that
         14    integration will also be available in January.
         15              MR. GILLESPIE:  Yes.  We would have a proposal on
         16    how the indicators would interact with the inspection
         17    results.
         18              CHAIRMAN JACKSON:  When you say that, you mean
         19    that you intend to also be presenting some analysis
         20    methodology that would fold in the indicators and the
         21    inspection results?
         22              MR. GILLESPIE:  We are doing our best to present
         23    an analysis methodology as part of the assessment task, yes.
         24              CHAIRMAN JACKSON:  It would also be interesting to
         25    know what expected change in licensee regulatory burden you
                                                                      37
          1    would expect to see due to an improved process in how you
          2    arrive at that.
          3              MR. GILLESPIE:  Yes, Chairman Jackson.
          4              MR. COLLINS:  We may not have that one.
          5              MR. GILLESPIE:  We may not have that one for
          6    January.
          7              CHAIRMAN JACKSON:  So you want to be honest. 
          8    Truth in advertising, right?
          9              MR. COLLINS:  Right.  I think we need to
         10    understand a little better once the system is developed, and
         11    perhaps it comes with overlaying the information on it from
         12    past plants.
         13              CHAIRMAN JACKSON:  That's why you need at least an
         14    analysis methodology.
         15              MR. COLLINS:  What we hope to have, though, is a
         16    connection.  Perhaps this is at the root of your question. 
         17    A connection between how the inspection program interacts
         18    with the assessment process.
         19              CHAIRMAN JACKSON:  Right, but also how they met to
         20    arrive at some judgments.
         21              MR. COLLINS:  Yes.
         22              CHAIRMAN JACKSON:  That's what I mean by analysis
         23    methodology.
         24              MR. GILLESPIE:  Going to slide 10.
         25              COMMISSIONER McGAFFIGAN:  You didn't stop very
                                                                      38
          1    long on the second bullet on slide 9, voluntary reporting
          2    program is preferable to rulemaking.  Are we going to have
          3    performance indicators for some plants and not have them for
          4    others because it's all voluntary as to whether they bother
           5    to give them to us?  What is implied in that sentence?
          6              MR. GILLESPIE:  This is exactly that.  We are
          7    going to have to work with our stakeholders.  To do
          8    something in a very timely fashion is going to require a
          9    voluntary program.
         10              CHAIRMAN JACKSON:  This is not fair to Mr. Beedle,
         11    but it would be interesting when you come to the table if
         12    you could speak to that issue about how voluntary programs
         13    work that would cover the waterfront relative to what the
         14    regulator needs to have.
         15              MR. BEEDLE:  We'll do that.
         16              COMMISSIONER McGAFFIGAN:  I'm just perplexed as to
         17    how you have a program where say they come up with 18 --
         18    I'll just make up a number -- performance indicators, and
         19    for 43 plants we have 18; for 21 we have 16; for 22 we have
          20    10.  I'm sure we have some minimal set that we control ourselves
         21    that we have the data on.  That would be a pretty wild
         22    program.
         23              MR. GILLESPIE:  Clearly that's not the intent.
         24              MR. TRAVERS:  That's not what is envisioned, of
         25    course.  So we recognize this question of how you get what
                                                                      39
          1    you need to do that.  We are struggling now with whether we
          2    can get it voluntarily.
          3              CHAIRMAN JACKSON:  I'm going to make an
          4    advertisement that even predates myself and my colleagues,
          5    but it continued into my tenure, and that had to do with the
          6    struggle relative to the need for a reliability data rule,
          7    were we going to get the data or not, how were we going to
          8    get it, would it be disaggregated or would it be aggregated. 
          9    It's not clear to me how it ultimately turned out.  So we
         10    struggled and struggled.  When I say we, I mean the agency,
         11    for years and years and years.  If all of this is going to
         12    wreck upon the rocks of not being able to get the data
         13    through a voluntary program, then I think we are going to
         14    have to grapple with what the implications of that really
         15    are.
         16              COMMISSIONER DIAZ:  Would it be fair to say that
         17    the more comprehensive the voluntary program the less
         18    prescriptive will our requirements be?  Is there a
          19    correlation there?  If we have a very thorough, complete,
         20    comprehensive set of indicators, or whatever, I think we
         21    could say that the less prescriptive we could be.  Is that
         22    correct?
         23              MR. GILLESPIE:  That's correct.
         24              COMMISSIONER DIAZ:  So in response to Commissioner
         25    McGaffigan, there is somebody with a two-by-four sitting
                                                                      40
          1    around here.
          2              MR. COLLINS:  We are receiving the full
          3    cooperation of the industry at this point.  I think there is
          4    a mutual appreciation for the goals, as was articulated
          5    earlier, of this process, which is to allow the licensees to
          6    focus their resources on areas where they believe it's
          7    important and less overlaying of the NRC processes on top of
          8    that.
          9              CHAIRMAN JACKSON:  Theoretically, one could say
         10    that if one didn't have the indicators that one needed to
         11    make a judgment, that might trigger the need for a look
         12    before inspection above the baseline.
         13              MR. GILLESPIE:  It's a tradeoff.
         14              MR. TRAVERS:  The advantage of it is obvious. 
         15    It's up-front understanding and agreement; more or less a
          16    contract, of who's got the burden and what are these
         17    indicators telling us about performance, and how should the
         18    regulatory scheme be structured based on that.
         19              CHAIRMAN JACKSON:  So the two-by-four is send us
         20    the data or we'll send you more inspection.
         21              MR. GILLESPIE:  I think if you go back to one of
         22    our basic defining principles and the idea of an objective
          23    being stated and agreed upon for each cornerstone, the first
         24    question for each cornerstone is, what information do you
         25    need to make the judgment that there is reasonable assurance
                                                                      41
          1    this objective is met?  Whether that comes from a
          2    performance indicator or inspection, we are going into this
          3    asking that first broad question first.
          4              COMMISSIONER DIAZ:  And that's the balance between
          5    the inspection and the indicators.  How they integrate is
          6    the whole key.
          7              MR. GILLESPIE:  That's where it starts coming in,
          8    right there.
          9              The efforts completed to date and just discussed
         10    were intended to provide the framework for the regulatory
         11    oversight of commercial nuclear power plant licensees.
          12              The current scope of activities includes developing
         13    improved processes within this framework to address
         14    inspection, assessment and enforcement.
         15              As described in the August 25th Chairman tasking
         16    memorandum, the activities in these areas are being closely
         17    coordinated to ensure that the process improvements remain
         18    integrated.
         19              The work in these three areas will form the basis
         20    for the recommendations for improvements to the regulatory
         21    oversight process that will be submitted to the Commission
         22    in January of 1999.
         23              In addition to work in these three areas, there
         24    are several other regulatory oversight processes which need
         25    to be addressed and evaluated within this cornerstone
                                                                      42
          1    framework.  Most of this work is longer term in nature and
          2    will not be part of the January 1999 recommendation. 
          3    Specifically, there will be a substantial effort required to
          4    revise the inspection program documentation to support any
          5    new approach to regulatory oversight.
          6              Definitions developed within the framework for
          7    risk-significant inspection observations will need to be
          8    applied to the enforcement program to help characterize
          9    inspection findings.
         10              The allegation program needs to be evaluated for
         11    appropriate changes within the framework structure to
         12    determine whether allegations should be handled in a
         13    risk-informed manner.
         14              As previously discussed, how assessment results
         15    affect enforcement will continue to be evaluated beyond
         16    January of 1999.
         17              Based on the results of the cornerstone framework
         18    and the identification of risk-significant performance
         19    areas, changes to licensee reporting requirements may be
         20    warranted.
         21              CHAIRMAN JACKSON:  Let me ask you a couple quick
         22    questions.  Will different inspection skill sets be required
         23    to implement this program?
         24              MR. GILLESPIE:  Bruce and I have talked about
         25    that.  I'd like to let Bruce address that.
                                                                      43
          1              MR. MALLETT:  Thank you, Frank.
          2              [Laughter.]
          3              COMMISSIONER DIAZ:  That didn't sound very
          4    enthusiastic.
          5              MR. MALLETT:  We believe that there will be some
           6    skills.  They may be lined up differently than we have now. 
          7    Now in the inspection program you may have a specific skill
          8    for someone who may be in operations, may be in maintenance,
          9    or may be in electrical engineering.  I think in the future
         10    you may need a different skill.  We aren't too sure what
         11    that looks like yet.  So we will have to factor into this
         12    implementation some training aspect to develop people for
         13    those skills.
         14              As far as being able to inspect, that basic skill
         15    will be there.
         16              MR. COLLINS:  Chairman, I would say that overall
         17    we would be looking at a more performance-based approach to
         18    inspection.  In other words, as you go down through the
         19    cornerstone into the tiers, we would be looking more at the
         20    cause of performance, whether it be good performance or an
         21    area that needs attention, and we would track that back
          22    through and come to a determination of whether the actions
          23    the licensee has taken or proposes to take are
          24    appropriate.
         25              So we will need a better understanding of the
                                                                      44
          1    corrective action system and corrective action processes. 
          2    Potentially, and this is yet to be determined, a more
          3    refined skill set on human performance, because that may end
           4    up being an area that is raised to a different visibility.  And
          5    then probably better training in the risk-informed,
          6    performance-based inspection area and less specific
          7    disciplines, unless we have a specialist inspection, which
          8    would probably be reactive inspection.
          9              CHAIRMAN JACKSON:  When you talk about reporting,
         10    you mean LERs, or do you mean these performance indicators,
         11    or what?  And will the LERs be part of the assessment
         12    program?
         13              MR. GILLESPIE:  We are looking at how we are going
         14    to integrate LERs.  Right now certain things reported in
         15    LERs would be picked up as part of the indicators, and we
         16    don't want to do a double count of LERs and indicators.  So
         17    we are going to be stepping back and looking at LERs and
         18    what information is reported and how you integrate that in
         19    with the hard indicator information being reported from the
         20    licensee.  So, yes, we do mean both, the potential for a
         21    rulemaking on reporting the PIs and the potential that this
         22    could change the LER reporting rulemaking in the longer term
         23    once we refine probably the second generation of performance
         24    indicators, quite honestly.
         25              CHAIRMAN JACKSON:  You had earlier shown the slide
                                                                      45
          1    showing operator licensing and requalification.  How do they
          2    affect assessment?
          3              MR. GILLESPIE:  What we are showing there is
          4    operator licensing and requalification.  Specifically, the
          5    requalification piece is right now part of our baseline
          6    inspection program.  That came out when we were going down
          7    through the attributes.  The question was, what is not
          8    specifically covered that would be high risk by a PI?  So
          9    that dropped out as an example, and that was operator
         10    reaction under accident conditions in a control room.
         11              CHAIRMAN JACKSON:  Okay.
         12              COMMISSIONER McGAFFIGAN:  This chart was titled
         13    "Longer Term."  For some of these actions -- I'm looking at
         14    Mr. Lieberman -- the threshold for minor violations, that
         15    isn't too much beyond January, is it?  My recollection from
         16    the ongoing response to the Chairman's tasking memo is that
         17    that is an early spring deliverable from you; isn't it?
         18              MR. LIEBERMAN:  That's correct.  In the past the
         19    enforcement program drove the assessment process.  We want
         20    the assessment process to drive the threshold process, and
         21    we need that to be done.  Right after that we will be ready
         22    to go.
         23              COMMISSIONER McGAFFIGAN:  So these are longer term
         24    but quite near term in NRC time?
         25              MR. GILLESPIE:  Yes.  Early spring.  If you look
                                                                      46
          1    at our schedule, longer term tends to be before June for
          2    most milestones.
          3              CHAIRMAN JACKSON:  Is it fair to say that that
          4    second bullet under longer term relates to ensuring that
          5    there is an appropriate alignment between thresholds for
          6    inspection and thresholds for minor violations?
          7              MR. LIEBERMAN:  That's right.  We want to make
          8    sure that we are not collecting in inspection space
          9    information that we don't need for assessment and we are not
         10    enforcing things which aren't important for the bigger
         11    picture oversight issues.  We want to have these things
         12    integrated together.
         13              CHAIRMAN JACKSON:  So in addition to having
         14    assessment come ahead of the curve, it is also meant to
         15    ensure that there is an alignment here; is that correct?
         16              MR. GILLESPIE:  Yes.  If we are successful in a
         17    risk-informed baseline inspection, then inspectors in theory
         18    should not even be looking at things that we would put in a
         19    minor violation category today.
         20              CHAIRMAN JACKSON:  Okay.
         21              MR. GILLESPIE:  Slide 12.  There are currently
         22    four short-term activities in progress to develop a
         23    recommendation to the Commission on improvements to the
         24    regulatory oversight process.
         25              The technical framework group, led by Pat
                                                                      47
          1    Baranowsky, is responsible for building on the work started
          2    in the public workshop, to complete the development of the
          3    cornerstones by identifying appropriate performance
          4    indicators, establishing criteria for thresholds, and
           5    establishing the basis for risk-informed baseline inspection.
          6              The inspection task, led by Bruce Mallett, is
          7    responsible for developing a process that addresses scope,
          8    depth, and frequency of a risk-informed baseline inspection. 
          9    The scope and basis for inspection were based in part on
         10    input received from the framework group.
         11              The assessment process group, led by Mike Johnson,
         12    will determine methods for the integration of indicator and
         13    inspection data, develop criteria for NRC actions based on
         14    assessment results, and determine the best method for
         15    communication of the results to licensees and the public.
         16              The enforcement activity, led by Jim Lieberman, is
         17    working with and participating in these tasks to ensure that
         18    the enforcement process changes are properly evaluated in a
         19    framework structure and the changes to the inspection
         20    assessment program are integrated with changes to the
         21    enforcement program.
         22              All these activities are fully coordinated and
         23    integrated and consist of broad participation from all four
         24    regions, NRR, OE, Research, and AEOD.
         25              With that, I would like to turn it over to Pat
                                                                      48
          1    Baranowsky to address the technical framework.
          2              MR. BARANOWSKY:  The technical framework group, as
          3    Frank mentioned, does have representatives from a broad
          4    spectrum of the NRC's offices.  Let me mention some of the
          5    disciplines that are involved.
          6              We have people with field inspection and
          7    inspection program development background; maintenance rule
          8    implementation; performance indicator development analysis;
          9    emergency planning; health physics; security; human
         10    performance; risk assessment; and enforcement.
         11              That is a pretty broad-based group.  If you look
         12    at the cornerstones, you will see they cover pretty broad
         13    indications.
         14              We have about a dozen full-time and about a half
          15    dozen half-time staff involved in this activity.
         16              The charter for this group is to develop the
         17    details of the technical framework for a more objective,
         18    risk-informed and performance-based approach to licensee
         19    performance assessment, and to provide related bases for
         20    inspection activities.  Therefore, the information that is
         21    developed by this group will be used in the development of
         22    the risk-informed baseline inspection program and for the
         23    performance assessment tasks.
         24              The work of this group, as Frank mentioned, will
         25    follow and build on the defining principles and the
                                                                      49
          1    cornerstone development effort that was begun at the
          2    performance assessment workshop in late September of this
          3    year.
          4              Also, as was mentioned by Sam Collins, we
          5    recognize that this is really the first phase of an activity
          6    that is going to evolve over several years through
          7    implementation, feedback, and improvement of the process. 
          8    Nonetheless, it's our intent to develop with sufficient
          9    detail information that will allow the Commission to make a
         10    decision on the efficacy and direction of this new approach
         11    to licensee oversight for potential near term implementation
         12    even though there may be some future development in the
         13    years to come.
         14              COMMISSIONER DIAZ:  I'm sorry, my major concern
         15    with this paper is in some of the definitions of
         16    cornerstones.  Are you going to cover that because they are
         17    in the appendix, or should I just jump right in to it?
         18              MR. BARANOWSKY:  I think we should jump right in.
         19              CHAIRMAN JACKSON:  Backup slide 1a.
          20              COMMISSIONER DIAZ:  I'm sure we are going into this
         21    from the principle that when you look at these things you
         22    first determine what is the desirable outcome, and second,
         23    how you are going to regulate to make sure that that outcome
         24    is there.
         25              If I look at the definitions in here, I do have a
                                                                      50
          1    little problem with the way they are stated.  Let me start
          2    with initiating events:  limit the frequency of events that
          3    upset plant equilibrium.  I'm a little leery about the
          4    words, because plant equilibrium is upset in many different
          5    fashions, and those might not be initiating events.  So I
          6    would encourage the staff to look at that.
          7              Words might be something like "initiating events
          8    that create deficiencies in plant balances (reactivity, heat
           9    transfer, and coolant inventory)."  That would be very
         10    specific.
         11              I think it's important to know what we mean by
         12    upsetting plant equilibrium, because if we have a scram due
         13    to losing a transformer, say 50 percent load is gone but the
         14    plant is still responding very well, it would be an event,
          15    but it might not be an initiating event that would create
         16    a response.
         17              Fundamentally, there are three things that we are
         18    always looking at when we look at critical safety functions,
         19    and that is reactivity, heat transfer, and coolant
         20    inventories.  I don't know of any other.  Some definition on
         21    that might be appropriate to avoid people getting upset
          22    about upsetting plant equilibrium.
         23              CHAIRMAN JACKSON:  There is actually a definition
         24    of initiators in a PRA sense.
         25              MR. BARANOWSKY:  I wonder if I could respond to
                                                                      51
          1    that.
          2              COMMISSIONER DIAZ:  Please.
          3              MR. BARANOWSKY:  I hope you will be happy
          4    eventually when you see what we are putting together.  Our
          5    job is to take these bullets that were basically put on
          6    paper as a result of the workshop and detail them out just
          7    to cover the kind of concerns that you are raising.  In
          8    fact, for each of the cornerstone areas we are going to have
          9    a fairly substantial discussion of what the cornerstone is,
         10    what the performance concern is, how the performance
          11    indicators relate to those things, and what the performance
         12    indicators can't do.
         13              COMMISSIONER DIAZ:  I have absolutely no doubt
         14    that you will do that.  Again, the summary is something that
         15    people look at and they form their own images.  I think this
         16    process has to be so transparent, so well defined that some
         17    of those things are important.
         18              Quickly, because I know we are time constrained,
          19    when you go to mitigating systems, there are some things that
         20    we need to state that have to be according to our rules. 
         21    This definition still has something missing.  For example,
         22    "ensure that those systems required to prevent and/or
         23    mitigate core damage perform at a level commensurate with
         24    their safety significance."  It has to include "perform or
         25    are capable of performing," because if they are not capable
                                                                      52
          1    of performing that function even if they were not challenged
          2    by an initiating event, that might be sufficient to be a
          3    cornerstone.  A lot of our things are established on the
          4    capability to perform the function.  So "perform or capable
          5    of performing."  You have to have it.  If not, we are not
          6    compatible with other things.
          7              The same thing on barrier integrity:  "Assure that
          8    the physical design barriers protect 'or are capable of
          9    protecting.'"  In other words, the capability has to be
         10    there not only being able to do it.
         11              With just those minor corrections, your summary
         12    actually becomes very inclusive.
         13              MR. BARANOWSKY:  Thank you.
         14              COMMISSIONER DIAZ:  You're welcome.
         15              CHAIRMAN JACKSON:  Sam.
         16              MR. COLLINS:  Let me raise a fundamental issue
         17    here.  Not to resolve it here, but there is a difference in
         18    looking at the level of engagement of the regulator.  In
         19    your own words, things happen at power plants, and you do
         20    have random failures, random events.
         21              We become concerned at a different level when
         22    those random events result in actual challenges to our
          23    safety systems, as opposed to challenges to a safety system
         24    that the safety system does not function as required, as
         25    opposed to it doesn't function as required when you
                                                                      53
          1    challenge a barrier versus a barrier failing.  So it is a
          2    graduated approach.
          3              Although systems that may not have worked are a
          4    concern to us, they are not of the same concern under this
          5    scheme as those that are actually challenged.  So what we
          6    have to decide as an agency at what level we are going to
          7    engage versus at what level we are going to ensure that it's
          8    understood and that the licensee is approaching the issue
          9    appropriately.
         10              In other words, if I can paraphrase Jim, does luck
          11    count?  Is the fact that you had a potential but didn't
         12    have a circumstance that has a nexus that is close to one of
         13    our strategic goals more important?  We are still working
         14    our way through that to some extent.
         15              CHAIRMAN JACKSON:  Actually, the two things do tie
         16    in.  Commissioner McGaffigan raised the issue of cable
         17    separation and other fire protection issues.  You could
         18    argue that cable separation relates to the capability line
         19    as opposed to did they work if there were a fire, or what
         20    have you.  I think the real answer is that you have to give
         21    some specificity that relates to these fundamental barriers
         22    and you have to be clear on what this graduated approach is
         23    that you are talking about related to that.
         24              COMMISSIONER DIAZ:  I agree.  It's the specificity
         25    that will avoid the problems.  Voluntary or involuntary,
                                                                      54
           1    whatever it is, there are still some things that are low, and
          2    we still have to be able to maintain the capability to
          3    perform the function.  If we want to be specific about what
          4    grade we are going to risk inform those functions, that's
          5    fine, but you still have to have that.
          6              CHAIRMAN JACKSON:  That's a good point.  That's a
          7    good way to put it, I think.
          8              MR. BARANOWSKY:  The product that we are going to
          9    have is basically part of the paper that will come up here
          10    in January 1999 and will document the principles, bases, logic,
         11    and technical information that supports all these areas at
         12    that time.
         13              Let me go to number 14 and talk a little bit more
         14    about some of these specific tasks that we have.  We had
         15    covered some of this stuff in pieces, parts and chunks
         16    earlier as we had a question and answer session, but I will
         17    talk about a few of these.
         18              The cornerstone task, as I said, is primarily to
         19    detail out the few bullets and charts that we have, to cover
         20    the scope, key definitions, and relationship to other
         21    activities.  We are going to have operating events that are
          22    significant by themselves, how does that relate to this, and
         23    what do we do about that.  Reporting, generic issues, and so
         24    forth, all have to be looked at in terms of this framework.
         25              Enforcement philosophy as it relates to the
                                                                      55
          1    defining principles and development of the performance
          2    indicators, inspection bases and thresholds, will also be
          3    considered in this technical framework development task.
          4              The performance indicators are intended to be risk
          5    informed to the extent practical at this time.
          6              The performance indicator task involves evaluation
          7    of the performance indicators that were proposed at the
          8    workshop, and also includes the identification of other
          9    performance indicators where either there were some holes
         10    identified or limitations.
         11              However, I should note that not all the
         12    performance indicators are going to be so amenable to
         13    risk-based or risk-informed thinking.  For instance, the
         14    radiation protection cornerstone area is not so much based
         15    on risk as it is ALARA and other regulatory principles that
         16    haven't been evolved through the kind of risk analysis that
         17    is associated with initiating events and mitigating systems.
         18              In general, the PIs are meant to be a broad sample
         19    of performance in some of the more risk-significant areas
         20    and those areas that are delineated by the cornerstones.
         21              For instance, in the mitigating systems, although
         22    we are not going to look at all risk-significant mitigating
         23    systems, at this time we are thinking about four or five key
         24    systems, ones that we have some form of indication available
         25    at this time that would be easy to develop, because we are
                                                                      56
          1    working under some time constraints.  We think that would be
          2    practical, and moreover, we think that we are going to get a
          3    large chunk of what we need to get in terms of risk-informed
          4    information from that set of indicators.
          5              In the future, our work will involve developing
          6    improved indicators or additional indicators, and we will
          7    also look at that in this activity.
          8              Validation of the performance indicators is also a
          9    part, and that is related to the adequacy discussion that we
         10    had earlier.
         11              Evaluating their limitations.  Their limitations
         12    are significant in terms of the development of inspection
         13    bases.  We want to make sure that the risk-informed
         14    inspection program takes advantage of the information
         15    generated by the performance indicators but that we don't
         16    misunderstand the capabilities of performance indicators to
         17    give us relevant information.
         18              The inspection bases will include identifying
         19    areas where verification and validation needs to continue,
         20    and of course the risk-significant aspects of performance
         21    not adequately covered by the PIs.
         22              The threshold task involves the identification,
         23    definition and evaluation of the performance indicator
         24    thresholds.  These thresholds are intended to provide a
         25    clear demarcation point or points for identifying fully
                                                                      57
          1    acceptable performance, areas of declining performance, and
          2    unacceptable levels of performance.
          3              We will be evaluating the thresholds proposed by
          4    the Nuclear Energy Institute for several of the proposed
          5    indicators, and we will perform some of our own independent
          6    analyses of PI response, the benchmarking, sensitivity to
          7    risk, and that kind of thing.
          8              An important aspect of the evaluation of
          9    performance indicator response and thresholds is determining
         10    the ability of indicators to identify declining performance,
         11    allowing the staff time and the licensee time to evaluate
         12    problem areas, and initiate corrective actions before
         13    reaching an unacceptable threshold.  This would allow
         14    licensees time to do what they have to do and the NRC to
         15    implement a graded response to declining performance.
         16              In this regard, we will be considering regulatory
         17    as well as safety implications of crossing a threshold and
         18    possible mandatory actions associated with the unacceptable
         19    threshold.
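              [Illustrative aside:  the banded-threshold scheme just
    described -- fully acceptable performance, declining performance,
    and unacceptable performance -- could be sketched roughly as
    follows.  The band names, threshold values, and the example
    indicator are hypothetical placeholders, not the staff's or NEI's
    actual proposals.]

        # Hypothetical sketch of a banded indicator threshold; names and
        # values are placeholders, not actual NRC or NEI thresholds.
        from dataclasses import dataclass

        @dataclass
        class IndicatorThresholds:
            declining: float      # crossing this marks declining performance
            unacceptable: float   # crossing this marks unacceptable performance

        def classify(value: float, t: IndicatorThresholds) -> str:
            """Map an indicator value (higher = worse) onto a performance band."""
            if value >= t.unacceptable:
                return "unacceptable"
            if value >= t.declining:
                return "declining"
            return "fully acceptable"

        # Example: a made-up 'unplanned scrams per year' indicator.
        scrams = IndicatorThresholds(declining=3.0, unacceptable=6.0)
        print(classify(1.0, scrams))   # fully acceptable
        print(classify(4.0, scrams))   # declining: licensee corrective action expected
        print(classify(7.0, scrams))   # unacceptable: mandatory regulatory response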
         20              The enforcement philosophy and implementation are
         21    primarily being addressed in the performance assessment and
         22    the inspection baseline groups.  However, we are going to
         23    look at this philosophy in terms of its logical connection
         24    and consistency with the technical framework as it's
         25    developed.
                                                                      58
          1              Questions.
          2              COMMISSIONER McGAFFIGAN:  Could I go back to a
          3    question I asked earlier and was told, I think, this is the
          4    time to ask it?  Do we take regulatory action in a single
           5    area?  Even if they are green everywhere else, if they dip into
          6    the white, in that area, is that the notion?
          7              MR. BARANOWSKY:  That's the notion that is
          8    currently being proposed, although Mike Johnson is going to
          9    be looking at whether or not that is going to be our final
         10    posture and how we should look at groups of indicators
         11    changing in one way or another.  We are not planning at this
         12    point to have an integrated indicator like the one we
         13    recently developed and put out with the IRAP public comment
         14    paper.
         15              COMMISSIONER McGAFFIGAN:  If you are in the green
         16    zone and we have an inspection finding that belies the
         17    indicator, the burden of proof is on us, but we still can
         18    take regulatory action in the green area if you pass that
         19    burden?  How does that burden get manifested in terms of
         20    staff processes?  Is there a higher level of approval
         21    required to take a regulatory action if, despite the burden
         22    of proof being on the regulator, it's passed?
         23              MR. TRAVERS:  Commissioner, I think this is
         24    exactly right.  We would look on that compelling argument
         25    that we were speaking of earlier as one that would be a
                                                                      59
          1    burden on us to make if they are in the green zone, but one
          2    which we very well could make if we felt by virtue of our
          3    inspection program or any other information that we had that
          4    we needed to engage on an issue, and how we would engage on
           5    the issue would be dependent on just what the issue is
          6    the extent to which it looked to be a problem.  We could
          7    have a meeting; we could issue an order; we could do a
          8    special inspection; we could put more resource on the issue.
          9              COMMISSIONER McGAFFIGAN:  I'm just trying to
         10    understand what burden of proof means.  If I'm a licensee
         11    and I get an inspection and at the exit interview it's clear
         12    that despite sailing along in green in whatever category or
         13    set of categories this might fall under, I really do have
         14    some problems here, and I should expect regulatory action. 
         15    Do I petition to the EDO to say, I'm on top of it now, I
         16    appreciate what your staff found, but let me fix it and
         17    don't do anything, because I'm in green?  And how is this
         18    process scrutable to me, that you make a decision that
         19    despite the green, you are going to have a public meeting,
         20    you are going to have an order, you are going to do
         21    something?
         22              MR. TRAVERS:  That's part of the challenge that we
          23    have yet to develop, but the expectation would be if you are
         24    in the green that we wouldn't be in a position to engage.
         25              COMMISSIONER McGAFFIGAN:  Despite the fact you've
                                                                      60
          1    just found some stuff in an inspection report, I might leave
          2    the meeting thinking, well, I'm still in green.  Then three
          3    months later NRC takes an action and comes out of the blue. 
          4    How do we make sure that it doesn't come out of the blue,
          5    that we signal to them early on that we regard these
          6    findings of such magnitude that we may take regulatory
          7    action despite their being in the green, and that that is
          8    being considered?
          9              MR. TRAVERS:  I think engagement at its earliest
         10    phase would include dialogue with the licensee.  I think
         11    that is what is envisioned.
         12              MR. COLLINS:  It's really no different than our
         13    processes provide for now.
         14              MR. TRAVERS:  That's right.
         15              CHAIRMAN JACKSON:  So you would engage because you
         16    have dissonance between what the indicator says and what
         17    your inspection results might say.
         18              MR. TRAVERS:  Correct.  I would assume that what
         19    we would have in place -- it's early yet, and we haven't
         20    really developed the process -- but that we would have an
         21    internal process that would guide us in developing the
         22    compelling case.  We would base it on whatever information
         23    was at hand, including information from the inspection
         24    program, and so forth.  The internal process would make it
         25    one that is very carefully managed.  So again, the
                                                                      61
          1    expectation would be that things do happen at these complex
          2    plants and that a few things or normal kinds of events or
          3    issues would not result in engagement.  If we felt we needed
          4    to, we could do it, but it would be a very carefully managed
          5    process.
          6              COMMISSIONER McGAFFIGAN:  All I am suggesting is
          7    that it's probably going to have to be a relatively
          8    scrutable and transparent process from the point of view of
          9    the licensee.  We get criticized today.  One of your early
         10    viewgraphs talks about more scrutability, more transparency,
         11    et cetera.  Once we divert from the model, we can't make it
         12    inscrutable and less transparent.
         13              MR. TRAVERS:  I absolutely agree.  That's why when
         14    Frank Gillespie was speaking he was talking about developing
         15    the thresholds that we would use for that kind of
         16    engagement.
         17              CHAIRMAN JACKSON:  This is a question I want to
         18    put on the table.  If it's best addressed by Dr. Mallett,
         19    that's fine.  If not, if you could answer it now.  It
         20    actually relates to this, particularly if there is ever an
         21    area where there may be a dissonance between what a
         22    performance indicator or set of indicators seem to say
         23    versus inspection findings.
          24              Are you looking, perhaps both in indicator land
         25    and inspection land, at sample size, and that it be
                                                                      62
          1    established in a way that provides a demonstrable level of
          2    confidence?  Or is there any kind of statistical or sampling
          3    protocol that is under consideration in the selection of the
          4    number, types and thresholds of indicators as well as
          5    inspection observations?
          6              MR. BARANOWSKY:  I don't think we have a
          7    statistical sampling process in mind for the performance
          8    indicators.  The approach there is to try and capture the
          9    bulk of the risk as we understand it and to use indicators
         10    that are broadly understood to be important to risk, which
         11    means we are going to take insights that we get across the
         12    industry in selecting these indicators as opposed to being
         13    very plant specific and picking details for the indicators.
         14              There may be some plant-specific elements to the
         15    indicators such as performance thresholds that make sense in
         16    either peer group, or certain design features that would
         17    benefit from a plant-specific approach, but when it comes to
         18    the inspection program, I think we are talking about some
         19    sort of sampling.  Bruce might want to address that.
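              [Illustrative aside:  the Chairman's question about sample
    sizes that provide a demonstrable level of confidence can be made
    concrete with a standard statistical calculation.  This is generic
    textbook material, not a protocol the staff described; the
    prevalence and confidence values below are placeholders.]

        # Generic sample-size illustration; not a staff-endorsed protocol.
        import math

        def sample_size_for_detection(prevalence: float, confidence: float) -> int:
            """Smallest n such that a random sample of n inspectable items finds
            at least one deficient item with probability >= confidence, assuming
            each item is independently deficient with the given prevalence."""
            if not 0 < prevalence < 1 or not 0 < confidence < 1:
                raise ValueError("prevalence and confidence must be in (0, 1)")
            return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

        # Example: 95 percent confidence of finding at least one deficiency when
        # 5 percent of inspectable items are deficient requires about 59 samples.
        print(sample_size_for_detection(0.05, 0.95))   # 59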
         20              CHAIRMAN JACKSON:  If you are going to talk about
         21    it as part of your presentation, I'm willing to wait, but I
         22    want to be sure that you are going to talk about it.
         23              MR. MALLETT:  Okay.
         24              MR. GILLESPIE:  One other element.  If we have
         25    this compelling case, we have two problems.  One is the
                                                                      63
          1    plant-specific problem we have the compelling safety case
          2    on.  The other is the feedback loop that says this
          3    challenges having selected the right PIs and how we are
          4    using them.  So there are two elements to that when we come
          5    across it.  Our intention would be to have that feedback
          6    loop in to take on that challenge if it occurs.
          7              COMMISSIONER DIAZ:  If I might be able to confuse
          8    myself, if you think of this as you being a controller in
          9    the sense of controlling processes, I think what
         10    Commissioner McGaffigan is saying is, if any one of the
         11    inputs or the desired outcome has a significant delta or
         12    error margin from what you expect, then immediately the
         13    process gets more focused, and you might take action.
         14              There are two ways in which that could happen.  It
         15    could happen very suddenly.  All of a sudden you have
         16    inspection finding on something that shows you that you are
         17    out of whack.  Or it could be a degrading process which is
         18    slowly changing.  Either one of those could actually trigger
         19    our actions.  Is that correct?
         20              MR. GILLESPIE:  That's correct.
          21              MR. COLLINS:  One, the process would accommodate by
         22    the levels of engagement and the shift in the burden of
         23    proof.  The second would be the more extreme case where we
         24    would have a data point which is abnormal, if you will. 
         25    That would engage a scrutiny of the system as well as a
                                                                      64
          1    reaction to the issue.
          2              COMMISSIONER McGAFFIGAN:  One of the issues that I
          3    see is the performance indicators are always going to be
          4    lagging.  They will be lagging even if they are close to
          5    concurrent.  An inspection finding is here and now.  You
          6    came across something and, like you say, it may be abnormal,
          7    but the performance indicators are backward looking, at best
          8    concurrent, the ones I've seen.  An inspection report, as I
          9    say, the person was in the plant yesterday and he found such
         10    and such, and it's either a big concern or it isn't.  If it
         11    is a big concern, despite their being in the green, you have
         12    to have some mechanism for dealing with it.
         13              MR. GILLESPIE:  Right.  The big concern is
         14    probably the easier one to deal with, when it is something
         15    that is recognized as very, very significant.  It's the
         16    accumulation of small things that is going to take us a lot
         17    of thought on how to deal with building a compelling case
         18    when there is an accumulation of small things.
         19              Also, there are two thresholds we are dealing with
         20    here.  One is an operating threshold which, if successful in
         21    the system, will be set high enough to give us some, you
          22    might say, margin, accounting for a one-quarter or so lag time
         23    when the trend shows up and crosses the threshold, but yet
         24    leave a utility sufficient freedom so that they can catch a
         25    trend and reverse it themselves before we have to interdict
                                                                      65
          1    ourselves.  Much lower, we would hope, is the safety margin
          2    where much sterner action would have to be taken, and in
          3    between these there is a gradual engagement with us.
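              [Illustrative aside:  the two-threshold, graded-response idea
    just described -- an operating threshold reached first, with margin
    for the reporting lag, and a safety threshold reached only at much
    worse performance -- could be sketched roughly as follows.  The
    threshold values, quarterly data, and projection rule are
    hypothetical placeholders, not the staff's actual method.]

        # Hypothetical sketch of graded response against two thresholds,
        # with a naive trend projection to account for a one-quarter lag.
        OPERATING_THRESHOLD = 5.0   # crossed first; graded NRC engagement begins
        SAFETY_THRESHOLD = 9.0      # crossed only at much worse performance

        def project_next_quarter(history):
            """Naive projection: extend the latest quarter-over-quarter change."""
            if len(history) < 2:
                return history[-1]
            return history[-1] + (history[-1] - history[-2])

        def graded_response(history):
            latest = history[-1]                     # newest reported quarter
            projected = project_next_quarter(history)
            if latest >= SAFETY_THRESHOLD:
                return "sterner regulatory action required"
            if latest >= OPERATING_THRESHOLD:
                return "graded NRC engagement"
            if projected >= OPERATING_THRESHOLD:
                return "trend warning: licensee expected to catch and reverse it"
            return "within operating margin: no action"

        # Examples with made-up quarterly indicator values (higher = worse).
        print(graded_response([2.0, 3.0, 4.2]))   # projected to cross: trend warning
        print(graded_response([3.0, 4.5, 6.0]))   # past operating threshold: engage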
          4              COMMISSIONER McGAFFIGAN:  The plant that keeps
           5    coming to mind is D.C. Cook.  I don't think there were a lot
          6    of indicators before that inspection occurred, and then it
          7    fell off.  Mr. Lochbaum might believe, and I'm sure does,
          8    that he had concerns with the ice condenser plants earlier
          9    that hadn't been fully addressed.  That plant inspection
         10    comes along and the plant goes from non-regulatory
         11    difficulty to regulatory difficulty all at once.  The old
         12    inspection program doesn't catch it; performance indicators
         13    don't necessarily catch it.  How does it work in this new
         14    system?  It's probably in the green.
         15              CHAIRMAN JACKSON:  It's in the green if you don't
         16    look at it.
         17              COMMISSIONER McGAFFIGAN:  It's in the green,
         18    according to performance indicators, in this area where
         19    performance indicators perhaps don't capture very much.  I
         20    think you showed that backup slide earlier.  This is
         21    probably one of the areas where you have inspection still
         22    and don't rely on indicators.
         23              I think one of the things the industry is looking
         24    for is predictability in the regulator, the notion that not
         25    everybody is one day away from regulatory difficulty.  You
                                                                      66
          1    still have these hard cases.  Unless you can tell me how
           2    this new system would catch D.C. Cook without having an
          3    engineering inspection.
          4              MR. GILLESPIE:  In discussions we've had, even
          5    with NEI, we would fully expect that there are four to six
          6    things a year that seem to occur by exception that we are
          7    still going to have to deal with outside the defined
          8    performance indicator risk baseline inspection program,
          9    whether it's an event that has multiple items that still
         10    stay in the green or whether it's significant design
         11    problems.  And design is one of the areas which was
         12    crosscutting through all the cornerstones that came up
         13    needing to still be inspected.
         14              CHAIRMAN JACKSON:  That's the question.  Will the
         15    risk-informed baseline inspection program capture
         16    risk-significant design issues or not?
         17              MR. MALLETT:  Let me answer that, Frank.
         18              We are planning to do that.  What we are also
         19    planning to do is, when we get more inspectable items
         20    defined and what we want to look at, we are going to go back
         21    and benchmark some of these events that occurred or some of
         22    these issues that came up, like a D.C. Cook, to see if we
         23    have covered in our inspectable items those sort of things
         24    in the baseline program.  I'm not saying we may pick them
         25    all up, but we want to use that as a way of a self-check to make
                                                                      67
          1    sure we have been all-inclusive in our inspectable items.
          2              MR. COLLINS:  Chairman, I think the direct answer
          3    is the risk-informed core baseline inspection will probably
          4    not contain an in-depth design engineering review.  That's
          5    not to say it will not be done either as a result of the
          6    supplemental or as a result of licensees doing it
          7    themselves.  Much of the development of the inspection
          8    programs is going to depend on the role of the licensee as
          9    far as either routine reviews or corrective action as a
         10    result of findings, as well as those areas that we believe
         11    we will need to go in and delve into periodically in order to
         12    ensure that the performance indicators are giving us accurate
         13    information.  That may be different than what the
         14    risk-informed core baseline inspection is.
         15              CHAIRMAN JACKSON:  I understand the point you are
         16    making, but going back to a couple of the examples, one has
         17    to come out of this.  There is some additional effort that
         18    is not part of today's discussion having to do with
         19    definition of design basis and design basis information. 
         20    Somewhere along the line there has to be some kind of
         21    scoping or risk ranking in that arena, and you have to be
         22    able to say how you deal with those things that show up at
         23    the top of the list.
         24              It's not necessarily everything, but you have to
         25    be able to say how you are going to deal with that:  Where
                                                                      68
           1    is the information coming from?  Is it licensee
          2    self-assessment?  Is it some inspection that is or is not
          3    part of the risk informed?
          4              But if it's important, if it's high in a risk
          5    sense, do you not have to address it at some level so that
          6    you don't come along and here's a surprise that shuts a
          7    plant down for a year, two years, et cetera, because all of
          8    a sudden this was something that was discovered?  Maybe it
          9    was self-revealed, maybe not, and now it shuts the plant
         10    down for X period of time.
         11              That's one of these kind of sudden surprises that
         12    on the one hand is very unpleasant for the licensee, and on
         13    the other hand, makes us look bad if it warrants a plant
         14    being shut down for two years and we've been going along
         15    saying all the time it was fine.  So I feel somehow you have
         16    to get at that.  It's not something you can walk away from.
         17              COMMISSIONER DIAZ:  It's going back to the same
         18    thing.  We have to have the capability to perform those
         19    functions that we believe are essential to be performed. 
         20    All we are going to do is risk rank them to a point
         21    that we know which ones those are, and that's where the
         22    specificity comes in.  D.C. Cook will be captured by
         23    capability to perform.
         24              CHAIRMAN JACKSON:  Right.  This will be a
         25    performance expectation that those things that come up high
                                                                      69
          1    on that risk ranking are dealt with, but secondly, there is
          2    a question of how does it get dealt with in this process. 
          3    If it's not covered by performance indicators, if you are
          4    telling us it's not covered by a risk-informed baseline
          5    inspection program, then how is it going to be covered?
          6              It either has to be covered by a risk-informed
          7    baseline inspection program suitably defined, or by some
          8    licensee self-assessment.  Perhaps that's the way to go.
          9              But there has to be something for those things
         10    that show up at the top of the risk list that have to be
         11    dealt with.  If you don't deal with it, you've left a big
         12    hole, number one, from the point of view of safety
         13    oversight, but secondly, you're left with the big surprise, and
         14    that also is unacceptable.
         15              MR. MALLETT:  Let me make a comment.  One of the
         16    things Pat Baranowsky and I've talked about is he's going to
         17    give us a list of things he believes do not have adequate,
         18    to use your phrase, performance indicators.  We are
         19    approaching it from a different standpoint.  We're looking
         20    at everything that we believe we need to have in the
         21    inspection program first to get this baseline assessment. 
         22    Then we will modify that, depending on whether we have
         23    adequate performance indicators, and modify based on risk,
         24    we hope.  I think that will address your issue, or we hope
         25    it will.
                                                                      70
          1              CHAIRMAN JACKSON:  And thereby address his issue.
          2              MR. MALLETT:  It won't address the green coloring.
          3              CHAIRMAN JACKSON:  I'm talking about the surprise.
          4              MR. MALLETT:  It should address being all-inclusive
          5    in the program if we've done our job right.
          6              COMMISSIONER MERRIFIELD:  It seems to me a key issue
          7    is going to be the point at which you can do the benchmarking
          8    of the performance indicators.  You are going to be making a
          9    presentation to us in January of the performance indicators. 
         10    At what point will you be able to do some of that
         11    benchmarking to give us some greater assurance that you've
         12    hit the mark?  What's your time line for testing those
         13    performance indicators to determine, based on past
         14    performance, that they would have picked up the concerns
         15    that Commissioner McGaffigan has raised?
         16              MR. BARANOWSKY:  The benchmarking that is part of
         17    this effort is starting now.  The Nuclear Energy Institute
         18    has done some of their own work, and we are laying out our
         19    activities that we want to do.
         20              I just want to mention that we have done
         21    benchmarking of performance indicators that are similar to
         22    the ones we are talking about here as part of the Arthur
         23    Andersen work over the last several years.  I have a pretty
         24    reasonable belief that these things are going to have the
         25    capability of indicating poor performers.
                                                                      71
          1              There is going to be a question of whether you
          2    want to have false positives or false negatives and
          3    statistical issues like that that we have to address. 
          4    That's what we still have to delve into with these
          5    indicators that we haven't worked with day in and day out,
          6    but they are similar to ones that we have worked with in the
          7    past.
          8              CHAIRMAN JACKSON:  I think you have to present the
          9    documentary evidence to the Commission.  If it's based on
         10    the Arthur Andersen type algorithm using the indicators you
         11    come up with, you need to present that even if the names
         12    have been changed to protect the innocent.
         13              MR. BARANOWSKY:  It's going to be part of what we
         14    provide in our January paper, and I think you will see
         15    information coming out as we go to the ACRS in December.  So
         16    it will be coming up shortly.
         17              COMMISSIONER McGAFFIGAN:  The point for
         18    Commissioner Merrifield is that there are some areas that
         19    just aren't going to be covered.  You're not going to have a
         20    performance indicator for the capability of the ice
         21    condensers.
         22              MR. BARANOWSKY:  We're not saying we can do that. 
         23    That was a problem with the old system.
         24              COMMISSIONER McGAFFIGAN:  It was a problem with
         25    the old system; it's going to be a problem with the new
                                                                      72
           1    system; we're going to try as best we can to work the
          2    inspection program to fill the hole, as I understand the
          3    answer.
          4              CHAIRMAN JACKSON:  Inspection and/or other things. 
          5    It could be licensee self-assessments, or required
          6    self-assessments, or whatever.  Agreed upon
          7    self-assessments.  What you have to really lay out is how
          8    the pieces flow together.  The performance indicators, you
          9    have to know where they start and where they stop and where
         10    the inspection goes and where self-assessment comes in. 
         11    Nonetheless, you have to be assured that you have covered
         12    the waterfront.
         13              MR. GILLESPIE:  Going back to our first principle,
         14    the objective statements, what information do you need to have
         15    reasonable assurance the objective is being met?  That was the
         16    importance of the objective statements.  That is where Pat
         17    is taking off from.  Design shows up in every single
         18    cornerstone and different aspects of design.  We will have
         19    some risk-significant approaches to it.  We are trying to
         20    grapple with that problem and what comes out and how it
         21    comes out the bottom.
         22              CHAIRMAN JACKSON:  Okay.  Let's move on.
         23              MR. BARANOWSKY:  That is the end of my talk and
         24    time for John Flack from the Office of Research to tell you
         25    a little bit about risk lists.
                                                                      73
          1              CHAIRMAN JACKSON:  I was just thinking about you. 
          2    You're an SR squared A as opposed to an SRA; is that
          3    correct?
          4              MR. FLACK:  I am a risk assessment engineer, not
          5    an SRA.
          6              CHAIRMAN JACKSON:  Right.  So you are senior risk
          7    and reliability analyst.  I'm calling you an SR squared A.
          8              MR. FLACK:  Okay.  In any case, risk will be
          9    considered in all these issues, burden of proof and
         10    inspection.  So let me go on to that.
         11              Before I begin to describe how we are utilizing
         12    risk insights from both the IPE programs and PRAs that are
         13    available to support the development of a risk-informed
         14    oversight process, I'd like to highlight a few issues up
         15    front that are important to the development stage.
         16              The first is the generic versus plant specific
         17    issue.  This is really a question as to the extent to which
         18    we can capture generic risk insights in formulating a
         19    risk-informed inspection process.
         20              On the next viewgraph I'll summarize the approach
         21    we've taken to address this issue.
         22              The next issue is how we will specifically consider
         23    risk in inspection and decision making and what metric and
         24    criteria we are going to use for this.
         25              Although guidance still needs to be developed to
                                                                      74
          1    support these activities, we expect that that guidance would
          2    be consistent with Reg Guide 1.174, and that the risk
          3    information would be used in conjunction with other
          4    considerations such as defense in depth and safety margin.
          5              The third issue, treatment of items not modeled in
          6    PRA, is really to keep us aware of the fact that PRAs do not
          7    cover everything and that we will not overlook important
          8    issues like acts of commission, complex system interactions,
          9    and transition risk.
         10              Finally, the fourth key issue involves resource
         11    allocation and use of risk to prioritize and guide the
         12    inspection process.  The risk significance and the
         13    availability of the PI data that Pat Baranowsky just described
         14    will both play a factor in assessing our inspection needs.
         15              This is being addressed in the ongoing research
         16    right now, which I am about to go over on the next slide.
         17              CHAIRMAN JACKSON:  You are saying there will be a
         18    plant-specific inspection program that is tied to the actual
         19    elements of risk presented by a given facility?
         20              MR. FLACK:  We are looking at it from two
         21    perspectives, generic and plant specific.  The
         22    plant-specific aspect would probably involve more of the
         23    maintenance rule development, information from the
         24    maintenance rule, which is plant specific.  We are trying
         25    from both perspectives.  What we are trying to look at is
                                                                      75
          1    from the generic perspective what we can capture, and then
          2    what would need to be supplemented with some plant-specific
          3    insights, but I'll get into that in a minute.
          4              In fact, our first step was to identify and
           5    prioritize sources of risk and link these to the
          6    cornerstones using the generic PWR and BWR insights and
          7    plant-specific insights.  In this process we utilized the
          8    IPE insights and findings contained in the various NUREG
          9    reports as well as the IPE database to identify those
         10    contributors found to be most important by licensees.
         11              By scanning across the top ten sequences of each
         12    plant we were able to take a broad look at what is driving
         13    the risk at nuclear power facilities.  In general, these top
         14    sequences can capture 80 to 90 percent of the contributors
         15    to core damage frequency at any one plant.  Sequences that
         16    showed up in 50 percent or more of the plants were
         17    considered high and generic.
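              The generic screening just described can be illustrated with a
              short sketch; the plant and sequence names below are hypothetical,
              and the 50 percent cutoff is the one mentioned above:

        # Sketch of the screening described above: scan each plant's top
        # sequences and call a sequence type "high and generic" if it appears
        # at 50 percent or more of the plants.  Names are hypothetical.
        from collections import Counter

        top_sequences_by_plant = {
            "Plant A": ["loss of offsite power", "station blackout", "small LOCA"],
            "Plant B": ["station blackout", "ATWS", "small LOCA"],
            "Plant C": ["loss of offsite power", "small LOCA", "internal flood"],
            "Plant D": ["station blackout", "small LOCA", "loss of service water"],
        }

        def generic_sequences(top_sequences, fraction=0.5):
            """Return sequence types appearing at `fraction` or more of the plants."""
            counts = Counter(seq for seqs in top_sequences.values() for seq in set(seqs))
            n_plants = len(top_sequences)
            return sorted(seq for seq, n in counts.items() if n / n_plants >= fraction)

        print(generic_sequences(top_sequences_by_plant))
        # ['loss of offsite power', 'small LOCA', 'station blackout']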
         18              At the same time, we took a vertical slice using
         19    the Surry IPE and NUREG-1150 results to gain insights into
         20    what would not be captured using the generic approach. 
         21    Taking this approach, we found that about 50 percent of the
         22    core damage frequency for internal events could be captured
         23    generically, but that a deeper understanding of
         24    plant-specific features would be needed to capture the full
         25    range of contributors.
                                                                      76
          1              Once the risk insights were identified, they were
          2    arranged into a matrix so that we could link them to each of
          3    the cornerstones.  In a similar fashion, the risk was linked
          4    to underlying attributes which could then be used as a focus
          5    of the oversight activities.
          6              Together, these form what we call the risk matrix
          7    and a framework for bringing risk information into the
          8    process.
          9              Now that we have the risk matrix, the next step is
         10    to link the risks to the performance indicators as they are
         11    identified and developed.  The identification is still
         12    ongoing, but the above approach provides a means by which we
         13    can accomplish this task.
         14              To summarize our first phase work, we were able to
         15    capitalize on information generated by the IPE program and
         16    NUREG-1150 to formulate a risk matrix and establish an
         17    approach that links risk insights to the cornerstones,
         18    attributes and PIs as they become available.
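              One way to picture the risk matrix described here is as a simple
              mapping from risk contributors to cornerstones and underlying
              attributes; the entries below are hypothetical illustrations, not
              the staff's actual matrix:

        # Sketch of the risk matrix concept: risk contributors linked to
        # cornerstones and to underlying attributes that oversight (and, as
        # they become available, performance indicators) can focus on.
        RISK_MATRIX = {
            "station blackout": {
                "cornerstones": ["initiating events", "mitigating systems"],
                "attributes": ["emergency AC power reliability", "battery capacity"],
            },
            "small LOCA": {
                "cornerstones": ["initiating events", "barrier integrity"],
                "attributes": ["RCS pressure boundary integrity", "ECCS availability"],
            },
        }

        def attributes_for_cornerstone(matrix, cornerstone):
            """Collect the attributes linked to a cornerstone through any contributor."""
            found = set()
            for links in matrix.values():
                if cornerstone in links["cornerstones"]:
                    found.update(links["attributes"])
            return sorted(found)

        print(attributes_for_cornerstone(RISK_MATRIX, "initiating events"))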
         19              Also, we are now in a better position to put in
         20    perspective generic versus plant-specific risk information. 
         21    We plan to continue our effort to capture external events
         22    and shutdown risk and insights from other risk importance
         23    measures.
         24              We will also be looking at the application of
         25    inspection resources using risk-informed approaches where
                                                                      77
          1    PIs do not cover the area.
          2              This summarizes the research work to date.  If
          3    there are no other questions, I'll pass it on to Bruce
          4    Mallett.
          5              MR. MALLETT:  Turn to slide 17.  I believe I can
          6    still say good afternoon.  I want to provide you a
          7    perspective on what the risk-informed baseline inspection
          8    group is doing as part of their project.  We discussed some
          9    of these issues earlier, so I'll just try to highlight a few
         10    of them in the interest of time.  There are a few points I
         11    want to make.
         12              The overall objective of the project is to
         13    describe a program of how the NRC will conduct its baseline
         14    inspection program at all power reactor facilities.  We
         15    anticipate providing this in a Commission paper, which I
         16    believe, Frank Gillespie, we'll issue sometime in late
         17    December of this year or January.  The anticipation is that we
         18    will provide it to the Commission in January 1999.
         19              In establishing the project, as Pat Baranowsky
         20    indicated, we also recognized that we needed certain
         21    expertise on our inspection group.  This is a monumental
         22    effort.  We have 12 members on our inspection group team. 
         23    It consists of individuals from various groups.  We have two
         24    risk experts.  We have a senior reactor analyst on the
         25    group; we have individuals from Research.  We also have
                                                                      78
          1    representatives of the senior resident inspector program,
          2    those who are currently senior resident inspectors and those
          3    who have been in the past.  We have representatives who are
          4    currently regional inspectors in the program.
          5              We also felt it important we have individuals who
          6    are experts in each of the cornerstone areas.  For example,
          7    we have a person with expertise in radiation safety, another
          8    person with expertise in mitigating systems, and we also
          9    have a representative from the Office of Enforcement on our
         10    team.
         11              With regard to the charter and deliverables, I
         12    would first say, if you turn to slide 18, we divided the
         13    charter and tasks to reach our end product of describing
         14    this program into several key tasks.
         15              The first one is to look at the scope.  We felt it
         16    was important to first decide what the program should do
         17    overall.  We discussed here today that the purpose of a
         18    baseline inspection program is to provide an indicator of a
         19    licensee's performance, that they are operating safely, but
         20    there are two key pieces to that program.  I believe,
         21    Commissioner McGaffigan, you launched us into that
         22    discussion.
         23              One is that we will emphasize risk-informed
         24    inspections in areas where there are no clear performance
         25    indicators at this point in time.  However, we will also
                                                                      79
           1    have a baseline inspection program in areas where there are
          2    limited performance indicators.  That gets to some of the
          3    discussion we had earlier.
          4              Another key piece of the program will be where we
          5    do have performance indicators verifying that they are still
          6    providing us with the indicated results.
          7              As far as the question of sampling, I'll address
          8    that when I get down to the process attributes.
          9              When you look at the scope of the program, we
         10    first embark upon deciding what we want to inspect.  As I
         11    indicated earlier, we are calling these inspectable items. 
         12    We are providing a complete set of those.  Our plan is then
         13    to modify those, depending on the outputs from Pat
         14    Baranowsky's group as to whether there are adequate performance
         15    indicators, and depending on the experience that we've had.
         16              The next step I have down there is a basis for
         17    linking those to the NRC mission and risk.  Let me give you
         18    an example of how we plan to do that.
         19              If you take the cornerstone back on slide 7,
         20    mitigating systems, the concept is similar to the improved
         21    tech spec documentation we have out there now.  We'll have
         22    a mitigating system.  You might have a characteristic that
         23    that particular item is functional.  In other words, it's
         24    capable of performing its intended function or design
         25    function.
                                                                      80
          1              For each inspectable item we would list its basis.
          2    Let me give you an example.  If you take post-maintenance
          3    testing as an inspectable item, as a basis for that we might
          4    include why you would inspect post-maintenance
          5    testing.  We are envisioning that we would relate
          6    that to the cornerstone as to why it
          7    produces a desired result.  We might talk about whether we
          8    have a performance indicator in there and what our
          9    inspection program is going to show versus that indicator.
         10              Most importantly, as John Flack indicated, we will
         11    put in some risk information.  Right now it's only a
         12    concept, but we envision using some kind of risk hierarchy
         13    to guide the inspector to what are the important systems or
         14    components to look at when you are looking at
         15    post-maintenance testing.
         16              I recognize that's only a concept, but we wanted
         17    to give you that as an example of what we are looking at as
         18    how we might link this to mission and risk.
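              The risk-hierarchy concept described here might look something
              like the following sketch, in which a risk importance measure
              points the inspector toward the components most worth sampling;
              the component names and importance values are hypothetical:

        # Sketch of using a risk hierarchy to guide selection of inspection
        # samples (for example, for post-maintenance testing).  Component
        # names and importance values are hypothetical.
        component_importance = {
            "emergency diesel generator": 0.12,   # e.g., a Fussell-Vesely-type measure
            "auxiliary feedwater pump": 0.08,
            "service water pump": 0.05,
            "instrument air compressor": 0.01,
        }

        def inspection_sample(importance, sample_size=2):
            """Pick the highest-importance components for the inspection sample."""
            ranked = sorted(importance, key=importance.get, reverse=True)
            return ranked[:sample_size]

        print(inspection_sample(component_importance))
        # ['emergency diesel generator', 'auxiliary feedwater pump']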
         19              CHAIRMAN JACKSON:  Don't you really want to look
         20    at it kind of in a -- you could come out at the same place
         21    -- converse way, where you would look at some system, and
         22    you could ask if the performance indicator tells you what
         23    you need about the system?  Then you ask your three
         24    questions.  Is there something that the performance
         25    indicator doesn't tell you?  If it turns out to be that
                                                                      81
          1    post-maintenance testing will tell you that, then that tells
          2    you what you are going to look at.  If the indicator tells
          3    you what post-maintenance testing would tell you, you
          4    wouldn't necessarily do it except insofar as wanting to
          5    validate the indicator.  Is that correct?
          6              MR. MALLETT:  That's correct.
          7              CHAIRMAN JACKSON:  So you don't start with the idea
          8    that you need to do post-maintenance testing; you start with
          9    the systems and then you ask, what do I need to know in order
         10    to verify that the system performs its intended function?
         11              MR. MALLETT:  When you are designing what you want
         12    to look at in your program, that's correct.  You might also
         13    use it in a different method.  If a certain event came up,
         14    we envision you might go back and also use this as a way of
         15    saying, do I need to even look further into this event?
         16              If we move to the other items we put on here as
         17    key items:  address stakeholder issues.  We felt it was
         18    important early on to talk to the various internal NRC
         19    stakeholders and external stakeholders to see what the issues are that
         20    they believe need to be addressed in a baseline inspection
         21    program.  Once we have committed the concept to paper, we intend
         22    to go back and use this as a check list to say, did we
         23    address all those issues?
         24              John Flack mentioned some of those when he
         25    discussed it on slide 15.  So I won't elaborate anymore.
                                                                      82
          1              As far as process attributes, the next step is to
          2    decide how do we use this process.  We have all these
          3    inspectable items; we have their basis now; we've linked
          4    them to our mission.  How do we use this?  How do we tell
          5    the inspector how to use this?
          6              We haven't got the answer to that question yet,
          7    but some considerations we have are:  how much inspection do
          8    you need to do?  How often do you need to do it?
          9              Another thing we have considered is how we are
         10    going to put sampling -- we call it selecting inspectable
         11    items -- into the process.  Chairman Jackson, you asked that
         12    question about sampling.  We don't have it formulated yet
         13    but we do intend to include that in our description of our
         14    program.
         15              Another item we are addressing is how you can get
         16    some generic risk information and also guide the inspector
         17    to get specific risk information based on plant specificity. 
         18    How do you address type of plants, for example?
         19              CHAIRMAN JACKSON:  Mr. Flack is going to tell you
         20    that.
         21              MR. MALLETT:  He's working with us.  We are using
         22    his group heavily.  Mr. Baranowsky is also going to tell us
         23    some of that.
         24              I also have some senior reactor analysts on my
         25    group that are discussing with them and interfacing to
                                                                      83
          1    provide that result.
          2              The last item I put as a key issue.  We are also
          3    benchmarking some other agencies to see what their programs
          4    are, what they use as a baseline inspection program, and to
          5    see if we can learn any lessons from them, or issues.
          6              You did ask one other question I would like to add
          7    one other comment to, about the skills.  One skill we do see
          8    is the inspectors and managers are going to have to have
          9    more understanding than they do today about risk information
         10    and how to use risk information.  I don't know that that's
         11    necessarily a skill, but there may be some training involved
         12    in how to do that.
         13              If there aren't any more questions, I would like
         14    to turn it over to Mike Johnson, who is going to talk about
         15    the assessment group and their project.
         16              MR. JOHNSON:  Thanks, Bruce.
         17              Slide 19, please.
         18              I will discuss the role of assessment, the
         19    deliverables, and finally, the team composition.
         20              We envision that the role of assessment within the
         21    oversight framework and based on the defining principles
         22    will be to consider the results of licensee performance as
         23    measured by the objective indicators and thresholds
         24    developed by the framework group and the information that
         25    results from the implementation of a risk-informed
                                                                      84
          1    inspection program and other insights as developed by the
          2    inspection group to arrive at a view of licensee performance
          3    within the framework.
          4              Then, based on the licensee's performance, the
          5    role of the process will be to identify appropriate
          6    regulatory actions that range from conducting just the
          7    baseline, up to and including issuing an order.
          8              To communicate the assessment results along with
          9    planned regulatory actions to licensees, the public, and
         10    other stakeholders.
         11              To provide follow-up and to verify our regulatory
         12    actions to ensure that they are successful.
         13              And to provide a quality check and feedback, a
         14    process for continuous self-assessment, to ensure that the
         15    effectiveness of our other oversight processes, the
         16    inspection process, the enforcement process, continue to
         17    improve.
         18              In developing the staff's final recommendation
         19    that we will provide at the end of the year, we will
         20    consider questions such as:
         21              How do we integrate the information inputs from
         22    each of the cornerstones?
         23              At what frequency and over what interval will we
         24    roll that information up?
         25              What will be the methodology whereby we compare the
                                                                      85
          1    objective insights or the objective indicators with the
          2    insights coming from the risk-informed baseline inspection
          3    and other inspection and other insights?
          4              What does that methodology look like?
          5              What actions should be taken and what is the
          6    process with decision criteria to allow us to determine the
          7    appropriate regulatory response based on licensee
          8    performance in a manner that is scrutable and predictable? 
          9    Because we are concerned about scrutability and predictability.
         10              How should we communicate the results of the
         11    assessment and actions to the licensees, the public and other
         12    stakeholders?  This will include issues such as how do we
         13    provide an opportunity for licensee input and feedback as a
         14    part of the assessment process.
         15              What should be the relation between the assessment
         16    process and enforcement?  As we talked earlier and as Jim
         17    Lieberman will talk in a minute, we do recognize that there
         18    is a relationship between the assessment process and
         19    enforcement.  So what should that relationship be?  How will
         20    it work?
         21              How should we phase in the recommended process
         22    with our existing processes, including the senior management
         23    meeting and the other things that we do in terms of
         24    assessment today?
         25              And how will we measure the assessment process
                                                                      86
          1    post-implementation to ensure that it meets our
          2    expectations, to ensure that a year from now the process
          3    that we have recommended and we hopefully are beginning to
          4    implement does meet the success criteria that we laid out
          5    for ourselves?
          6              We believe that the assessment process will be of
          7    great interest to licensees, the public, and other external
          8    stakeholders; arguably, perhaps of more interest than even
          9    the inspection program.
         10              Because the assessment process will provide the
         11    primary communication vehicle for the agency on the
         12    performance of utilities, it will have a great ability to
         13    impact licensee activities, public awareness and confidence
         14    in the NRC and its licensees.  And as we learned with the
         15    SALP process, it could have a potential for unintended uses
         16    and consequences.
         17              The assessment process will also be of great
         18    interest to internal stakeholders who will be the process
         19    implementers.
         20              Given the importance of the process to both
         21    internal and external stakeholders, we assembled a task
         22    group of experts made up of representatives from key
         23    internal stakeholders, including the regional offices, who
         24    will be the heavy lifters in the implementation of the
         25    assessment process, as well as members from AEOD, NRR,
                                                                      87
          1    Research, and the Office of Enforcement.
          2              Participants have implemented the previous
          3    assessment processes.  Several participated in the IRAP
          4    process and understand the challenges of developing an
          5    assessment process, and all have participated in the
          6    workshop or are members of the inspection group or the
          7    framework group, and therefore understand the philosophical
          8    approach that we are embarking on and will be in a position
          9    to ensure that the assessment group activities are properly
         10    integrated with the activities of the other groups.
         11              Finally, as is important with the other groups, we
         12    have already conducted and plan to conduct several
         13    additional meetings with the industry, the public, and other
         14    stakeholders in order to get early input and involvement in
         15    our development of the assessment process.
         16              COMMISSIONER McGAFFIGAN:  Could I ask a practical
         17    question?
         18              CHAIRMAN JACKSON:  Please.
         19              COMMISSIONER McGAFFIGAN:  You talked about the
         20    transition.  Is there likely to be an annual briefing to the
         21    Commission on the four regional administrators' and the
         22    director of NRR's view as to how the plants are doing?  Is
         23    that likely to still remain part of the process?
         24              MR. JOHNSON:  First of all, let me preface this by
         25    saying that we haven't really talked about it and done the
                                                                      88
          1    development that would enable me to answer your question
          2    conclusively, but let me just tell you that it is our
          3    feeling, based on the conversations that we've had in
          4    staffing the group, that there would be some periodic
          5    briefing of the Commission on the status.
          6              COMMISSIONER McGAFFIGAN:  I also assume that,
          7    based on what I read of the stakeholder interactions, that
          8    the watch list concept may go by the boards.  I'm just
          9    gaming this, and I hope you guys do some gaming.  If I'm an
         10    enterprising reporter, how do I still -- I've got the four
         11    regional administrators in front of me and we don't have a
         12    watch list anymore but we have the discussion list, namely,
         13    the ones that they thought important enough to call to our
         14    attention, assuming we don't have 15 or 20 minutes for each
         15    of the 104 plants.
         16              CHAIRMAN JACKSON:  We might.
         17              COMMISSIONER McGAFFIGAN:  We are doing pretty well
         18    today in time.
         19              How do you end up having the trade press not
         20    report that last week plants A, B, C, D, E, and F were the
         21    focus of the Commission's deliberations as they received the
         22    annual briefing from the staff and you still have a watch
         23    list?
         24              MR. COLLINS:  I think the backdrop that we have to
         25    keep in mind is that none of this should be new news to
                                                                      89
          1    anyone other than when either the agency believes we need to
          2    take an action or we are confirming a licensee action.  Any
          3    roll-up that we would take periodically would not be for the
          4    purposes of "announcing" any action against a plant.  As in
          5    the past, with perhaps the senior management meeting as the
          6    context, this would be a review of where we have been at any
          7    given point in time and whether our actions have been effective.  It
          8    might be more of a status of what has previously been
          9    announced and implemented rather than a decision-making
         10    meeting.
         11              COMMISSIONER McGAFFIGAN:  I don't know how you
         12    control five Commissioners who are sitting here asking you,
         13    or even your own regional administrators, for that matter. 
         14    Plant X looks like it's getting into some regulatory
         15    difficulty.  I understand, Mr. Regional Administrator, you
         16    have some real concerns about X,Y,Z.  Is that going to be
         17    off limits for discussion?
         18              MR. COLLINS:  I don't pretend to control five
         19    Commissioners.  I guess what we would have to do is
         20    understand what the forum is.  I wouldn't envision that this
         21    process is focused toward the staff.  The process is focused
         22    towards having a mutual understanding of performance and
         23    ensuring that there is an entity, preferably the licensee,
         24    who is responding and reacting to those issues
         25    appropriately.  If not, then we engage, we reinforce; if
                                                                      90
          1    appropriate, we act independently.
          2              At that point, if we were to be in a meeting to
          3    discuss licensee performance, I would expect the licensee to
          4    be there discussing their performance and the reasons for
          5    why their performance is appropriate or not, and for the
          6    agency to be there to ensure that our actions are
          7    commensurate with that.  That meeting, if a meeting is
          8    warranted in that fashion, should not be delayed annually;
          9    it should be conducted when it is appropriate.
         10              Perhaps, in that context, the meeting we are
         11    talking about is more to review the process itself than it
         12    is to review licensee performance.  That's yet to be
         13    determined.
         14              COMMISSIONER McGAFFIGAN:  I believe Mr. Lochbaum,
         15    who is going to speak in a few minutes, has suggested that
         16    at that meeting we would focus, in his scheme, on which
         17    plants are doing less well and maybe which are doing better. 
         18    I'll let him speak for himself.  But that there would be a
         19    discussion of how plants specifically are performing as
         20    opposed to how our process is working.
         21              CHAIRMAN JACKSON:  You could argue that the one is
         22    a test of the other.  You can't ask the staff how they would
         23    control five Commissioners.
         24              COMMISSIONER McGAFFIGAN:  No, but I think I can
         25    ask the staff, I think you need to think about the gaming of
                                                                      91
          1    the process.  All processes are gamed and you should think
          2    about it.
          3              CHAIRMAN JACKSON:  But presumably, if one gets at
          4    the issue of surprises, or one is sailing along in good
          5    shape and all of a sudden one drops off the cliff, that's
          6    the ultimate sense in which someone can "game the process." 
          7    If it's an open, scrutable, continuously interactive and
          8    appropriate process with the licensees involved, it's not
          9    new news.
         10              COMMISSIONER McGAFFIGAN:  Most of our recent watch
         11    list meetings haven't been new news either.
         12              CHAIRMAN JACKSON:  Nonetheless, when we come out
         13    with the list, everybody watches.  There is a balance
         14    between having a process that is scrutable, objective,
         15    risk-informed, and so on, and the fact that the Commission
         16    has to be informed and it should be in an open process.
         17              Mr. Lieberman, you're on.
         18              MR. LIEBERMAN:  Turning to slide 21.  We've
         19    already made reference to enforcement as part of the
         20    development of the oversight process.
         21              As Mike Johnson and others have said, to develop
         22    the assessment process, we need to establish what regulatory
         23    actions will be taken based on the performance levels of
         24    licensees.  This will include consideration of the role of
         25    enforcement in the oversight process and what changes, if
                                                                      92
          1    any, are needed to be made in the enforcement policy.
          2              Specific enforcement issues that the staff is
          3    considering in coordination with the oversight effort
          4    include developing better guidance for the thresholds
          5    between minor violations and severity level 4 violations,
          6    reviewing severity level examples and enforcement policy to
          7    make them more risk informed, reviewing the process to
          8    determine sanctions, and evaluating the role for regulatory
          9    significance in the enforcement process.
         10              As to the threshold between minor violations and
         11    level 4 violations, we've already mentioned that in the past
         12    the enforcement process has set thresholds resulting from
         13    the inspection and assessment process.  As part of the
         14    integration effort, the threshold will be driven by the
         15    needs of the assessment process.
         16              As to severity levels, the technical framework and
         17    assessment efforts will provide insights as to what is risk
         18    and safety significant for purposes of assessment.  These
         19    insights should be considered in developing the severity
         20    level examples so that violations which are significant to
         21    threshold issues are significant to enforcement and vice
         22    versa.
         23              As to enforcement sanctions, we will be
         24    considering what changes, if any, should be made to the
         25    process for assessing sanctions based on the levels of
                                                                      93
          1    licensee performance.
          2              Finally, the issue of regulatory significance needs
          3    to be addressed.  By regulatory significance, I mean when
          4    the agency concludes that the significance of root causes
          5    and the circumstances of grouping individual severity level
          6    4 violations are greater than the actual or potential
          7    consequences, warranting their aggregation into a severity
          8    level 3 problem.
          9              The staff has not developed a final position on
         10    whether and how regulatory significance should be used in
         11    the regulatory process.  Since regulatory significance is in
         12    essence an assessment effort, the staff is proposing that
         13    the resolution of this issue be deferred until it can be
         14    integrated into the assessment process.
         15              In the meantime, the staff intends to issue an
         16    enforcement guidance memorandum and increase its oversight
         17    of cases involving regulatory significance.  For example,
         18    reactor cases involving escalated actions, which now require
         19    my approval, will require the approval of the deputy
         20    executive director for regulatory effectiveness.  In
         21    addition, we intend to continue the current efforts to
         22    ensure that each case considered for regulatory significance
         23    has a clear nexus to safety.
         24              Apart from the adjustments to the enforcement
         25    process, we have developed a proposal to address
                                                                      94
          1    non-escalated enforcement actions.  This should be delivered
          2    to the Commission very shortly.
          3              The changes for non-escalated enforcement actions
          4    have been coordinated with the oversight effort.  The
          5    proposed changes will not prejudge the outcome of the
          6    assessment and inspection improvements; they can accommodate
          7    any needed changes.
          8              COMMISSIONER McGAFFIGAN:  You are proposing to
          9    postpone this paper on regulatory significance until the
         10    assessment process is further along.  The next chart is
         11    going to tell us the schedule.  How much of a delay are you
         12    talking about?  Does it have to be already in place and
         13    being implemented?  Is it sometime soon after January?  How
         14    long do we wait to tackle this issue?
         15              MR. LIEBERMAN:  I think, Commissioner, we will be
         16    doing that in early spring.  We need to understand how the
         17    assessment process works, and then we can complete the
         18    effort on the regulatory significance.  Whether we can
         19    implement it completely may be a function of how long the
         20    assessment process takes to be completed, but we should be
         21    able to present the concept to the Commission after we
         22    understand the assessment process.
         23              CHAIRMAN JACKSON:  Have you interacted with other
         24    agencies that have an enforcement authority, such as FAA,
         25    FDA, EPA, DOJ?  Have you interfaced with state agencies,
                                                                      95
          1    with local law enforcement, with academicians who have done
          2    studies in criminal justice or civil justice?  Have you
          3    looked at SEC?
          4              MR. LIEBERMAN:  Some of all of that.  We've looked
          5    at DOT; we have gone to FAA; EPA.  I've gone to training
          6    programs and discussed issues with the academic community. 
          7    I haven't dealt that much with the states, but I read a lot
          8    of articles on enforcement in general.  So I understand what
          9    other organizations are doing in the enforcement process.
         10              CHAIRMAN JACKSON:  Where do things stand with
         11    OSHA, with what they call cooperative compliance?
         12              MR. LIEBERMAN:  The OSHA system is a little
         13    different from the NRC regulatory systems.  OSHA doesn't
         14    have licensees.  They have spot inspections.  They inspect a
         15    group of potential safety concerns in an industry, say the
         16    paper making industry in Maine.  They don't have the same
         17    degree of oversight.  The analogy that OSHA has to the NRC
         18    regulatory program on giving credit to self-assessment is
         19    not an exact one.
         20              COMMISSIONER McGAFFIGAN:  Is regulatory
         21    significance used in other agencies?  Is there a category
         22    that allows you to aggregate or do some of the things that
         23    the current enforcement policy allows?
         24              MR. LIEBERMAN:  I'm not familiar with other
         25    agencies using the specific term of regulatory significance,
                                                                      96
          1    but the concept of evaluating violations for their potential
          2    safety significance is not unique to the NRC process.  We
          3    came up with the concept of regulatory significance, the
          4    grouping of violations, in part because of the history of
          5    the civil penalty process.  Prior to 1980, our civil penalty
          6    authority provided for a maximum of $5,000 per violation. 
          7    We almost had a cash register approach to enforcement back
          8    in the 1970s.  There were so many violations and there was
          9    so much money per violation and you added it up.
         10              When we received the authority post-TMI to have
         11    $100,000 per violation and we recognized that many
         12    significant issues involved more than one violation, we came
         13    up with a concept of grouping violations together and then
         14    assessing civil penalties based on the groupings of
         15    violations.  That's where the concept of regulatory
         16    significance came along.  Other agencies have different
         17    civil penalty schemes.  So there is not necessarily a direct
         18    correlation.
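              The arithmetic behind the two approaches contrasted here can be
              sketched briefly; the violation count below is hypothetical, and
              the dollar figures are the statutory maxima cited above:

        # Sketch contrasting the pre-1980 "cash register" approach with the
        # post-TMI grouping approach.  The violation count is hypothetical.
        related_violations = 4

        # Pre-1980: up to $5,000 assessed per violation, simply added up.
        pre_1980_maximum = 5_000 * related_violations

        # Post-TMI: related violations grouped into one severity level 3
        # problem, with a civil penalty (up to $100,000 per violation under
        # the revised authority) assessed against the grouping rather than
        # each item separately.
        post_tmi_grouped_penalty = 100_000

        print(pre_1980_maximum, post_tmi_grouped_penalty)   # 20000 100000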
         19              CHAIRMAN JACKSON:  Okay.
         20              MR. GILLESPIE:  Going on to slide 22, which is our
         21    schedule.  Short-term actions are set to happen between now
         22    and January of 1999, including public meetings more than
         23    weekly with NEI.  NEI has established two subgroups to our
         24    framework and inspection groups to deal with radiation
         25    protection and to deal with safeguards.
                                                                      97
          1              We are hoping to have our proposal developed by
          2    the end of November.  This will allow us to meet with the
          3    ACRS subcommittees and ACRS full committee the first week in
          4    December so that we might get a letter from ACRS to the
          5    Commission on our proposal, and then to the Commission in
          6    December.
          7              Beyond January, with Commission approval and
          8    comment, we will develop revised enforcement guidance
          9    in the spring.
         10              Start the phase-in of both the assessment and
         11    inspection process in June.
         12              Implement risk-informed inspection baseline in
         13    October.  I emphasize this is the risk-informed baseline,
         14    because there are regional initiative inspections which will
         15    not be completely redone.
         16              Complete the phase-in of both risk-informed
         17    inspection and assessment by June of the year 2000.
         18              And then a retrospective look one year later. 
         19    Hopefully we will have established some credible objectives
         20    when we put this in place and measure ourselves against those
         21    objectives a year later.
         22              With that, we complete our presentation for this
         23    afternoon.
         24              CHAIRMAN JACKSON:  Thank you.  If there are no
         25    further questions or comments, let me thank the staff.  I
                                                                      98
          1    will make some fuller remarks at the end.  So you are
          2    excused for the moment.
          3              Let me, first of all, thank Mr. Beedle and Mr.
          4    Lochbaum for their patience and invite you to please come to
          5    the table.
          6              Good afternoon.
          7              MR. BEEDLE:  Good afternoon, Chairman Jackson,
          8    Commissioner McGaffigan, Commissioner Diaz.
          9              CHAIRMAN JACKSON:  We are pleased to have you.  We
         10    are particularly interested in how you see the overall
         11    progress to reengineering the assessment, inspection and
         12    enforcement program.
         13              MR. BEEDLE:  First, let me echo Frank Gillespie's
         14    comments somewhere around two o'clock this afternoon and
         15    wish you all good evening.
         16              COMMISSIONER McGAFFIGAN:  It isn't sunset yet.
         17              CHAIRMAN JACKSON:  It isn't sunset yet.  It's
         18    getting close.
         19              MR. BEEDLE:  Just a point of perspective.  The
         20    process that the staff described to you during the course of
         21    the last couple of hours is one that is shared by the
         22    industry because it's going to help both the industry
         23    executives and the NRC staff focus on the things that are
         24    important to safety in the operation of these plants.  In
         25    doing that, it helps us assign our resources to the things
                                                                      99
          1    from a safety point of view, that are meaningful, and frees
          2    us from a regulatory burden on things that are not safety
          3    related.  One of the objectives of this process, at least
          4    from the industry point of view, is to help focus on that as
          5    opposed to the management of the facilities.
          6              The four day workshop was really a very beneficial
          7    workshop in that it fostered a lot of communication between
          8    the NRC staff and the industry and other stakeholders that
          9    were present.  It helped us better appreciate the direction
         10    that the staff was moving on this particular issue.  So from
         11    an educational point of view, I think it was an immense
         12    success.
         13              Did we solve a lot of problems in the process of
         14    doing that?  Perhaps not, but we did reach the alignment that
         15    was discussed earlier, that there was a need to focus on
         16    safety, and with that we could define parameters that would
         17    help us understand that better, and with that we could also
         18    define some thresholds that would give us the ability to
         19    then take a look at inspection and enforcement and properly
         20    respond to that.
         21              Let me have the next slide, please.
         22              [Slides shown.]
         23              MR. BEEDLE:  You asked earlier about data.  I'd
         24    like to talk a little bit about that.
         25              First of all, the nuclear officers in the
                                                                     100
          1    community have agreed that the data is necessary for the
          2    agency in order to determine where the performance of the
          3    plant is, and it helps not only you, but it also helps the
          4    people that are managing the facility.
          5              Our thinking at this point is that that data would
          6    be provided directly to the NRC in some sort of a formatted
          7    process that would make it easy to digest and process, and
          8    it would not involve passing the data through INPO or
          9    perturb the process that INPO has got.  So it eliminates
         10    a number of the concerns.
         11              CHAIRMAN JACKSON:  It would be direct.
         12              MR. BEEDLE:  It would be direct.  Perhaps an
         13    appendage to the monthly operating report for each of the
         14    plants.
         15              Would each of the plants participate?  I think
         16    with reasonable assurance I can tell you that they would. 
         17    So I don't think that is really an issue.  Having defined up
         18    front the parameters that we are talking about, I don't
         19    think we are going to have any particular problem.
         20              If the staff comes back and says we need six more
         21    parameters, I think we might ask some questions and try and
         22    understand why.  If we could reach agreement on it, I think
         23    all the plants would then provide those additional six
         24    parameters.
         25              We expect that the number of parameters for each
                                                                     101
          1    plant would be the same.  So we wouldn't have a group of
          2    plants that would provide four and another group of plants
          3    that would provide six.  I think we are going to look at
          4    consistency across that spectrum.
          5              The three year trending curves that we have been
          6    plotting would be plotted for each one of the plants. 
          7    Histograms would display the worst value for each plant plotted
          8    against those indicators.
          9              PRA sensitivities would be run, and in the case of
         10    scram and mitigating systems, we are looking at something
         11    like two times the CDF or a delta CDF of one times ten to the
         12    minus five to set the threshold.
         13              Insights from the data that we have analyzed to
         14    date.  The indicators do provide an overall perspective on
         15    safety performance.  That is certainly our assessment.
         16              Barrier integrity indicators show strong plant
         17    performance for almost all plants.
         18              The initiating events and mitigation indicators
         19    exhibit the most detectable variations.
         20              And the unplanned plant transient indicator
         21    appears to be a reasonable leading indicator of plant
         22    performance.  Indicators do not reveal any design control
         23    problems.
         24              I would remind you that this process is one
         25    focused on risk, but it does not exclude the fact that we
                                                                     102
          1    still have tech specs and regulations and design basis
          2    requirements to adhere to.  So we are looking at assessment
          3    process, not whether or not we in fact followed the
          4    requirements of regulation.
          5              Indicators do distinguish levels of safety
          6    performance.  We see the excellent performing plants have
          7    indicators that are high in the green band; the average
          8    plants are in the low green to white band; and there do
          9    appear to be declining trends that show up in multiple
         10    indicators.  Recent watch list plants have several
         11    indicators that show up in the white zone.
         12              If I could have the backup slides, please, that
         13    show the graphs.
         14              COMMISSIONER McGAFFIGAN:  All of this information
         15    is going to be docketed.  If it comes in the monthly
         16    operating reports, it's public information.
         17              MR. BEEDLE:  That's correct.
         18              COMMISSIONER McGAFFIGAN:  So we can use it as we
         19    see fit.  We can aggregate it, et cetera.
         20              MR. BEEDLE:  You could aggregate it, but it's not
         21    our intent that you would aggregate it.  We are trying to
         22    focus on a plant's performance in those areas.
         23              CHAIRMAN JACKSON:  It's going to be plant
         24    specific.
         25              MR. BEEDLE:  Right.  We are not going to give you
                                                                     103
          1    aggregated information; it's plant specific.
          2              In this one, I know that the graph is a little
          3    hard to see, but here are some plant transients and
          4    unplanned shutdowns for a plant that has reasonably good
          5    performance.  You can see that the industry average is that
          6    solid line.  The occurrence at this plant is the dotted
          7    line.  So we've got a plant here that I think, by all
          8    rights, would be concluded to be a good performer, and this
          9    is what the operational challenges look like in plant
         10    transients.  These are transients that create a power change
         11    of greater than 15 percent.
         12              This is a plant whose performance is trending in a
         13    downward direction.  You can see that trend developing.
         14              This plant's performance has been cyclic in
         15    nature, and I think this performance indicator indicates
         16    that.  It was one that, coupled with other indicators, I
         17    think you would have probably concluded should be on the
         18    watch list.
         19              Back to the original set of slides.
         20              COMMISSIONER McGAFFIGAN:  The data you have there
         21    goes back five years.
         22              MR. BEEDLE:  Correct.
         23              COMMISSIONER McGAFFIGAN:  You are getting three
         24    year data for the plants.  Is that just a matter of
         25    resources?  Do you think three year data will be enough to
                                                                     104
          1    prove the point?
          2              MR. BEEDLE:  This is data that we had started out
          3    with sometime ago, when we were looking at this process.  We
          4    think that the three year rolling average looks like about
          5    the right data to look at.  I don't think that even in this
          6    one where you have that cyclic behavior that going back
          7    another couple of years makes all that much difference.
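              [Illustrative sketch: a minimal Python example of the kind of
          calculation behind the trend curves described here, counting
          unplanned transients with a power change greater than 15 percent
          and forming a three-year rolling average.  The sample data and
          function names are assumptions for illustration, not NEI's actual
          method or results.]

              # Hypothetical sketch: yearly transient counts and their
              # three-year rolling average, as plotted on the trend curves.

              def count_transients(power_changes, cutoff=0.15):
                  """Count events whose fractional power change exceeds the cutoff."""
                  return sum(1 for change in power_changes if abs(change) > cutoff)

              def three_year_rolling_average(annual_counts):
                  """Rolling three-year average for each year with enough history."""
                  return [sum(annual_counts[i - 2:i + 1]) / 3.0
                          for i in range(2, len(annual_counts))]

              # Hypothetical yearly counts of qualifying transients at one plant.
              annual_counts = [5, 3, 4, 2, 1]
              print(three_year_rolling_average(annual_counts))  # [4.0, 3.0, 2.33...]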
          8              In this slide we have some data that speaks to the
          9    establishment of the thresholds just to give you an idea of
         10    what we are looking at in that area.  We are looking at the
         11    base core damage frequency.  In plant A it's 1.47; in plant
         12    B it's 4.6.  So we see a different range in there.
         13              If we use a delta CDF at one times ten to the
         14    minus five, or 1E-5, and then two times the CDF
         15    as the threshold, the result in the case of scrams, where
         16    the base CDF assumed four, is that you
         17    could set a threshold using the delta of ten scrams, and if
         18    you were going to use a two times, a doubling of the CDF,
         19    you'd have to have something on the order of 14 scrams.  So
         20    that would give you the range in that white band of, say,
         21    four to ten.
         22              If you look at diesel unavailability and HPSI
         23    unavailability, you get an idea of the sensitivity of this.
         24              From where we typically see the plants operating,
         25    to get a significant change in availability as measured by
                                                                     105
          1    the core damage frequency, you are looking at a fairly
          2    significant increase in unavailability. You really have to
          3    make it a point to have your system out of service for long
          4    periods of time before you start encroaching on a safety
          5    limit in this case.
          6              Plant B, where the CDF is a little bit higher,
          7    those ranges are slightly different but not significantly
          8    different.
          9              So there is a wide spread in this data threshold
         10    that we would be looking at.
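              [Illustrative sketch: a minimal Python example of how a
          delta-CDF budget, or a doubling of the base CDF, could be turned
          into a threshold for an indicator such as scram count or diesel
          unavailability, assuming a simple linear PRA sensitivity.  The
          sensitivity coefficients and numbers below are hypothetical
          assumptions, not the plant A or plant B results on the slide.]

              # Hypothetical sketch: assume each unit of an indicator above its
              # baseline adds a fixed increment of core damage frequency (CDF).

              def indicator_threshold(base_cdf, baseline_value, cdf_per_unit,
                                      delta_cdf_limit=1.0e-5, use_doubling=False):
                  """Indicator value at which the chosen risk limit is reached.

                  base_cdf        -- baseline CDF (per reactor-year)
                  baseline_value  -- indicator value assumed in the baseline PRA
                  cdf_per_unit    -- assumed CDF increase per unit of the indicator
                  delta_cdf_limit -- allowed CDF increase (ignored if use_doubling)
                  use_doubling    -- if True, set the limit at twice base_cdf
                  """
                  allowed_delta = base_cdf if use_doubling else delta_cdf_limit
                  return baseline_value + allowed_delta / cdf_per_unit

              # Hypothetical numbers: base CDF 1.5e-5/yr, 4 scrams assumed in the
              # PRA, and an assumed 2e-6/yr of CDF per additional scram per year.
              print(indicator_threshold(1.5e-5, 4, 2.0e-6))                     # 9.0 scrams
              print(indicator_threshold(1.5e-5, 4, 2.0e-6, use_doubling=True))  # 11.5 scrams
              # Same form for an unavailability fraction, with an assumed
              # sensitivity of 5e-5/yr of CDF per unit of unavailability.
              print(indicator_threshold(1.5e-5, 0.006, 5.0e-5))                 # ~0.21 (21%)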
         11              COMMISSIONER McGAFFIGAN:  Is the idea that we
         12    would have different thresholds for different plants?  That
         13    might be a little difficult to implement.  Or is there a
         14    single threshold for a performance indicator that would work
         15    for all PWRs or all BWRs, or whatever?
         16              MR. BEEDLE:  I think we are going to end up with
         17    different thresholds, with some different parameters for the
         18    BWRs and PWRs.
         19              COMMISSIONER McGAFFIGAN:  Is it a single threshold
         20    for all plants, or does this analysis suggest that you have
         21    a different threshold for each plant?
         22              MR. BEEDLE:  I think it will be a different
         23    threshold for each plant.
         24              CHAIRMAN JACKSON:  What is consistent is whether
         25    the trigger is a specific delta in core damage frequency or
                                                                     106
          1    two times or some specific multiple of core damage
          2    frequency, or a change in the base core damage
          3    frequency.  That's the commonality of approaches.  Is that
          4    correct?
          5              MR. BEEDLE:  That's correct.
          6              We are currently working on the trend graphs and
          7    histograms and would expect that a little later this week
          8    we'll have those available for about two thirds of the
          9    plants.
         10              PRA sensitivity results will be provided sometime
         11    later this month for a representative set of plants,
         12    approximately 25 of them.  So I think we are reaching some
         13    consensus on what the data collection effort should be and
         14    what those thresholds should be.
         15              We are working closely with Frank and his various
         16    task forces on this, and we think that we are reaching
         17    agreement on some technical issues that help us understand
         18    safety at the plants.
         19              COMMISSIONER McGAFFIGAN:  One of the conversations
         20    we've had with ACRS in the past is that we probably have
         21    pretty good confidence on delta CDFs.  I'm still stuck on
         22    this notion that we might have for plant A so many safety
         23    system actuations that get you into the white zone and for
         24    plant B have a different number based on IPEs that weren't
         25    all done in a standardized way.  It's sort of taking my
                                                                     107
          1    breath away at the moment.
          2              CHAIRMAN JACKSON:  But the delta CDF --
          3              COMMISSIONER McGAFFIGAN:  The delta CDFs I can
          4    understand.  Delta CDFs is part of it.  But part of that was
          5    two times CDF.  We were using the actual number.
          6              CHAIRMAN JACKSON:  That's the point.  You have to
          7    settle on which of those is the acceptable metric, right?
          8              MR. BEEDLE:  Go back to the slide with the plant A
          9    and B table on it.  In this slide we had, for example,
         10    diesel unavailability of .61 percent.  I would argue that
         11    any plant that is getting up into the ten percent
         12    unavailability on its diesel engines is probably going to
         13    wonder what's going on with its maintenance group.  I don't
         14    think the manager of the facility is going to allow that. 
         15    Forget whether or not it's a regulatory threshold.  When we
         16    first proposed in the area of scrams that we set the green band
         17    at the level of three, we had tremendous opposition on the
         18    part of the industry.  They said, well, that's ridiculous. 
         19    We never have more than two.  Why don't we set it at two?
         20              CHAIRMAN JACKSON:  The approach that is most
         21    consistent with Reg Guide 1.174 is the delta CDF approach.
         22              MR. BEEDLE:  Right.
         23              CHAIRMAN JACKSON:  At any rate, the delta approach
         24    is the approach in Reg Guide 1.174.
         25              COMMISSIONER McGAFFIGAN:  But where he derived
                                                                     108
          1    those numbers, I thought --
          2              CHAIRMAN JACKSON:  No.  If you put the viewgraph
          3    back, it means that for that particular plant, in order to
          4    have a delta of ten to the minus five, for that particular
          5    plant theoretically it would require ten scrams, or diesel
          6    generator unavailability of about 30 percent, or HPSI
          7    unavailability of about 14-1/2.  That's what that is saying.
          8              COMMISSIONER McGAFFIGAN:  Therefore, that's where
          9    the threshold should be for the red zone.
         10              CHAIRMAN JACKSON:  For that plant.
         11              MR. BEEDLE:  That's where that bottom of the white
         12    zone, start of the red zone would be.
         13              COMMISSIONER McGAFFIGAN:  We are using delta CDF
         14    there, but at the top, to decide where the green zone/white
         15    zone interface is, the proposal is that we use the base CDF. 
         16    So it's four for that plant for scrams and .61 percent and
         17    1.81 percent, and then a different set of numbers for the
         18    other plant.  So we are using the CDF itself as a mechanism
         19    for deciding the green zone.
         20              MR. BEEDLE:  I think in the case of the green zone
         21    we are looking at perhaps CDF, but we are also looking at
         22    some history of performance of the plants in the 1990 to
         23    1994 range.
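              [Illustrative sketch: a minimal Python example of the banding
          logic being discussed, with the green/white boundary drawn from
          baseline performance or the base CDF and the white/red boundary
          drawn from the delta-CDF calculation.  The boundary values of four
          and ten scrams echo the slide; the function itself is an
          assumption, not the staff's or NEI's actual scheme.]

              # Hypothetical sketch: classify an indicator value into a color band
              # given a green/white boundary and a white/red boundary.

              def band(value, green_white_limit, white_red_limit):
                  """Return the color band for an indicator value."""
                  if value <= green_white_limit:
                      return "green"
                  if value <= white_red_limit:
                      return "white"
                  return "red"

              # Example with scram thresholds of four (green/white) and ten
              # (white/red), as in the plant A discussion above.
              for scrams in (2, 7, 12):
                  print(scrams, band(scrams, green_white_limit=4, white_red_limit=10))
              # 2 green / 7 white / 12 red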
         24              COMMISSIONER McGAFFIGAN:  Am I understanding you
         25    for scrams, three across the industry?  We're not going to
                                                                     109
          1    do three at one plant and four at another?
          2              MR. BEEDLE:  In that case, we'll probably have
          3    three across the industry.
          4              COMMISSIONER McGAFFIGAN:  For diesels, rather than
          5    11 percent versus 28 percent, it would be 10 percent across
          6    the industry?
          7              MR. BEEDLE:  I'm not sure, but I would guess that
          8    the diesel would probably be somewhere in the one percent
          9    range for the green band.  I can't imagine us putting it
         10    down at 30 percent.  It may be 20 or something like that. 
         11    It demonstrates that there is a tremendous margin between an
         12    operationally significant condition and a safety significant
         13    one.  The risk insights that we have developed over the last
         14    several years help us understand that every time you have a
         15    wing nut out of position doesn't mean that the plant is
         16    unsafe.  That's really what we need to focus on.
         17              CHAIRMAN JACKSON:  Your point still remains about
         18    having some consistency in approach.  What that consistency
         19    in approach translates into is a fundamental question.  Is
         20    it going to vary by plant, or do we want to just pick
         21    something and say that this is in fact the threshold?  I
         22    think that's a regulatory decision.
         23              MR. BEEDLE:  I think once we get all the data in
         24    here, our task forces are going to look at that and say what
         25    makes sense.  I think part of what makes sense also, you
                                                                     110
          1    have to factor in your ability to regulate that and get some
          2    consistency and standardization.
          3              COMMISSIONER McGAFFIGAN:  I'm just trying to
          4    anticipate.  Mr. Lochbaum at some point is going to pipe up
          5    and talk about his views as to how good the IPEs are and all
          6    that.  I guess he's waiting his turn patiently.
          7              MR. BEEDLE:  I would also argue that as we look at
          8    the spectrum of plants, when we see one that looks like it's
          9    an outlier, we may have to take some special action in the
         10    case of that one.  It will also point out difficulties
         11    associated with some of the PRAs or IPEs that have been done
         12    at the plants.
         13              CHAIRMAN JACKSON:  Are you done?
         14              MR. BEEDLE:  I'm finished.
         15              CHAIRMAN JACKSON:  Mr. Lochbaum.
         16              MR. LOCHBAUM:  Thank you for this opportunity to
         17    comment on the NRC's initiatives in the area of inspection,
         18    assessment and enforcement.  These important areas are the
         19    foundation of the NRC's reactor safety oversight function. 
         20    It's vital that they be as effective as possible.
         21              The staff mentioned the recent four day workshop. 
         22    I attended that workshop.  It wasn't as useful as it could
         23    have been.  The structure of that workshop was such that it
         24    would have been virtually impossible for it to result in anything
         25    but alignment.  The breakout sessions and the cornerstones
                                                                     111
          1    were determined well in advance of the workshop and really
          2    could not have been changed by the attending stakeholders.
          3              The workshop was, in my opinion, little more than
          4    a dog and pony show so that the staff could tell you today that
          5    it had met with the stakeholders and had their endorsement.
          6              In my opinion, those four days could have been
          7    better spent examining the pros and cons of NEI's proposed
          8    assessment model and its regulatory scheme.
          9              I have the following comments on the specific
         10    items discussed by the staff today.
         11              Commissioner Diaz already commented on one of the
         12    concerns I had with respect to compelling cases.
         13              I felt that the NRC staff in the past has had a
         14    major flaw in its existing program in that there is a very
         15    low threshold for compelling cases.  The staff, in my
         16    opinion, should very rarely overturn indicator results.  I
         17    agree with Mr. Gillespie that if you do overturn indicator
         18    results, that also casts doubt on the validity of your
         19    indicators.  That needs to be reexamined.  Basically, that
         20    shouldn't happen very often.
         21              In the same section, defining principles, the
         22    staff said that the assessment process results might be used
         23    to modulate enforcement actions.  We strongly feel that
         24    enforcement actions should be based exclusively on the
         25    severity of the offense.  Under no circumstances should
                                                                     112
          1    enforcement actions be increased or decreased based on
          2    assessment results.
          3              We feel that a major flaw of the current senior
          4    management meeting process, which doesn't seem to be
          5    addressed in the current plans of the staff, is that the
          6    managers spend too much time deciding who's naughty and nice
          7    and too little time figuring out what to do about the
          8    naughty ones.
          9              The primary focus of the SMM process should be to
         10    develop action plans to handle plants determined by an
         11    objective assessment process to be performing badly.
         12              The staff spoke about a risk-informed oversight
         13    process to guide its inspections.  Virtually all of the
         14    staff's efforts seem to be directed towards ensuring that
         15    they look at the right areas.  The staff needs to spend more
         16    effort on figuring out how to properly respond to their
         17    inspection findings.  We remain baffled by the current
         18    inspection process, which seldom triggers a scope expansion
         19    either on the licensee's part or on the NRC's part.
         20              The staff conducts inspections of very small
         21    samples.  The findings from those limited audits need to be
         22    placed in context, but they are not.  We think that was the
         23    problem at D.C. Cook.  The inspection that was done last
         24    August and September revealed a problem that begged for
         25    scope expansion that seemed late in coming.
                                                                     113
          1              We are also disappointed that the staff hardly
          2    ever asks the licensees to explain why they didn't find the
          3    problems first.  After all, the licensees have the burden
          4    for assuring that their facilities are maintained in
          5    accordance with safety regulations.  When the staff has
          6    evidence that a licensee may be shirking that burden, the
          7    staff needs to find out why.
          8              We hope that the revamped NRC assessment,
          9    inspection and enforcement processes will at least contain
         10    one element, namely, the ability to occasionally call some
         11    event or plant condition unsafe.  We feel that plants like
         12    Millstone Unit 3 and D.C. Cook were operating unsafely for
         13    years prior to their lengthy outages.
         14              We are not asking the NRC to agree with us on
         15    these cases, but it's crucial that the NRC have a line
         16    between safe and unsafe practices and to occasionally
         17    identify something as being unsafe.  Without such a line,
         18    you can never really adopt a meaningful risk-informed
         19    regulatory policy. Quite simply, if everything at every
         20    plant is safe, you don't know where to focus resources and
         21    attention.  Besides, it's very difficult for the public to
         22    understand why you could fine NU $2.1 million or AEP half a
         23    million dollars for safe operation of their facilities.
         24              Thank you for this opportunity to present our
         25    views.  More importantly, we appreciate the fact that you've
                                                                     114
          1    undertaken these important initiatives.
          2              CHAIRMAN JACKSON:  Thank you.
          3              Let me ask you to kind of expand a little bit on
          4    what you mean when you say that so far all the efforts seem
          5    to be looking at trying to decide what the right areas are
          6    as opposed to more effort on what to do.
          7              MR. LOCHBAUM:  For example, Mr. Collins during the
          8    early presentation talked about the annual meeting or what
          9    conceptually that might entail.  In our view, that
         10    should be talking briefly about plant performance or what
         11    the NRC's assessment is.  We agree that that's a backward
         12    looking thing.  There should be no new surprises in that
         13    because it's all based on available information.  But we
         14    think that should be complemented by looking forward at what
         15    the NRC is going to do the upcoming year to address any
         16    weaknesses that have been identified.
         17              We think the purpose of that meeting is twofold. 
         18    One, to ensure that the public and all stakeholders know
         19    what the NRC's current assessment of a plant is, but also to
         20    identify what the NRC is going to do to correct any deficiencies
         21    or weaknesses.  If you have a series of these things every
         22    year, or however periodic it is, if you keep identifying the
         23    same problem over and over again, that reflects on the
         24    regulatory staff effectiveness as well as the licensee's
         25    effectiveness.  It captures both.
                                                                     115
          1              We think too much effort is focused on grading the
          2    plants and not responding to what those grades tell you or
          3    what needs to be done to improve the low-graded plants. 
          4    That's just an example.  We seem to see this in many cases.
          5              COMMISSIONER DIAZ:  Have you done any work in
          6    trying to define or bound what unsafe means?  Is that a
          7    category of radioactive releases?  What are the things that
          8    you would work with?
          9              MR. LOCHBAUM:  I think the closest attempt that
         10    I've made to addressing that question is the presentation
         11    that I made at the NS meeting in Nashville this past summer.
         12              What I advocated there was when a plant event or a
         13    condition is found at a plant, the licensee should evaluate
         14    the as-found condition, whether it's a single event or an
         15    aggregate of many different problems, and look at that condition
         16    against all postulated design basis events, LOCA, loss of
         17    offsite power, et cetera, and see whether the 10 CFR 100
         18    limits would have been exceeded.  Starting from that point,
         19    would the public have been jeopardized had the events
         20    occurred from that degraded point?  If not, then that event
         21    poses relatively little safety risk.  It needs to be
         22    corrected, but it's not a safety issue per se because the
         23    public would have been protected even if the accident had
         24    started from that point.
         25              Occasionally you find that the 10 CFR 100 limit
                                                                     116
          1    might have been violated had the event occurred from that
          2    degraded point.  That to me is a potentially unsafe
          3    condition, and that is where everybody should be focusing
          4    their attention.  Not the other ones, but the ones where the
          5    public might have been harmed.  And also plant workers under
          6    GDC-20.  I think applying that standard to identification of
          7    as-found conditions is the way to distinguish between safe
          8    and unsafe.
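              [Illustrative sketch: a minimal Python example of the screening
          described here, checking projected offsite doses for postulated
          design basis events, evaluated from the as-found condition, against
          a 10 CFR 100 style dose limit.  The event list, dose values, and
          the 25 rem figure below are illustrative assumptions, not the
          results of any licensed analysis.]

              # Hypothetical sketch: flag an as-found condition as potentially
              # unsafe if any postulated design basis event, starting from that
              # condition, would exceed the offsite dose limit.

              def screen_as_found_condition(projected_doses_rem, dose_limit_rem=25.0):
                  """Return the events whose projected dose exceeds the limit."""
                  return {event: dose
                          for event, dose in projected_doses_rem.items()
                          if dose > dose_limit_rem}

              # Hypothetical inputs, not real analysis results.
              doses = {"large-break LOCA": 12.0,
                       "loss of offsite power": 3.5,
                       "main steam line break": 28.0}
              exceedances = screen_as_found_condition(doses)
              print(exceedances or "no design basis event exceeds the limit")
              # {'main steam line break': 28.0} -> potentially unsafe condition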
          9              COMMISSIONER DIAZ:  You say potentially unsafe. 
         10    So there is a potentially unsafe and there is an unsafe.
         11              MR. LOCHBAUM:  I agree, but the public needs to be
         12    protected even if the accident occurs.  So it is potentially
         13    unsafe, but I'm not sure in my mind that that's more than
         14    just semantics or just a technical term, because had the
         15    accident occurred at that moment, then the line would have
         16    been crossed; the public would not have been protected, and
         17    that can't happen.
         18              COMMISSIONER DIAZ:  There is a difference.
         19              MR. LOCHBAUM:  There is a difference, right.
         20              COMMISSIONER McGAFFIGAN:  What do you think the
         21    prospects are for the success of this enterprise in terms of
         22    defining performance indicators?  My recollection is that
         23    you put out a report annually that uses a performance
         24    indicator that is heavily focused on who identifies
         25    problems.  My recollection is you gave Oyster Creek high
                                                                     117
          1    marks and other folks lower marks, and whatever.  That is
          2    your favorite indicator.  Or it's an indicator.  Have you
          3    tried to insert that indicator into this process?  Does it
          4    fit in any way?  Do you think we are missing things in this
          5    NEI/NRC assessment process that is evolving?
          6              MR. LOCHBAUM:  We looked at performance
          7    indicators.  When we provided the October 2nd comments on
          8    the IRAP that were submitted, we didn't include my favorite
          9    indicator for the reason that it didn't seem to be better
         10    than the indicators that NEI was proposing.  I can't develop
         11    NEI's indicators independently, because I didn't have access to
         12    that information.  Had I had that, I probably wouldn't have
         13    used the indicator I used.
         14              I think the long-winded answer to your question is
         15    I think NEI indicators are better than what I was using, and
         16    I would prefer to continue using those.  With them being
         17    public, I shouldn't have any problem doing that.
         18              COMMISSIONER McGAFFIGAN:  Okay.
         19              CHAIRMAN JACKSON:  If this process closes that gap
         20    in terms of what the staff is going to do based on what it
         21    finds, or what the NRC is going to do based on what it
         22    finds, would that address the major part of your criticism?
         23              MR. LOCHBAUM:  I think so.  One of the things that
         24    intrigues us about the NEI process is the trending.  When
         25    you start getting into the white area, that's when the
                                                                     118
          1    regulator should get involved.  The licensee will already be
          2    trying to turn it around, but that's when the regulator
          3    should provide whatever inducements are necessary to ensure
          4    that it happens.
          5              I am also encouraged by the fact that it doesn't
          6    look like there is going to be a roll-up of however many
          7    indicators there are into one global indicator of good or
          8    bad.  I don't think that would have been entirely fruitful. 
          9    So it's good that it looks like it's not going to happen.
         10              I think the answer to the question is yes, that
         11    concept seems to be the right way to address our concerns or
         12    criticism.
         13              CHAIRMAN JACKSON:  Do you believe that the
         14    performance indicators and the risk-informed baseline
         15    inspection program will cover the waterfront?
         16              MR. LOCHBAUM:  No.  I think there will continue to
         17    be surprises.  I don't think any process will ever eliminate
         18    surprises, but I think we need to reduce the number of
         19    surprises we have.  It looks like these initiatives will go
         20    a long way toward reducing the number of surprises, and I
         21    think that's positive from that standpoint.
         22              CHAIRMAN JACKSON:  If I can paraphrase you -- you
         23    can agree or disagree -- you're basically saying that the
         24    missing element is what the regulator is going to do based
         25    on what the regulator finds.
                                                                     119
          1              MR. LOCHBAUM:  That's right.
          2              CHAIRMAN JACKSON:  And that somehow the work still
          3    hasn't, to your satisfaction, delineated safe from unsafe.
          4              MR. LOCHBAUM:  I'm still worried about the
          5    threshold for when the plants get shut down.  Mr. Beedle
          6    pointed out that, looking backward, some of the data showed
          7    that for the watch list plants, some of the indicators moved
          8    into the white zone.  It didn't look like any of them moved
          9    into the red zone, which would have necessitated a plant
         10    shutdown.  Some of those watch list plants were shut down. 
         11    Was that a right decision or a wrong decision?
         12              I need to go back and look at that.  I haven't
         13    done that, so I don't know the answer to whether this
         14    process would solve that.
         15              The one thing I was concerned about in the staff's
         16    presentation was this:  if you are operating in the green
         17    zone and a violation comes up, do you not overlook that, but
         18    do you give the licensee credit for that?  We are kind of
         19    against that.  We think that process could lead to more
         20    surprises because you tend to dismiss early indicators of
         21    problems until it becomes so bad that several indicators go
         22    into the white or things get so bad that you get the
         23    regulatory bag brought out.  We are kind of concerned about
         24    that.  We think all sanctions should be equal, depending on
         25    the offense, no matter what zone you are in at the time.
                                                                     120
          1              CHAIRMAN JACKSON:  Do you have a comment?
          2              MR. BEEDLE:  We've talked a lot about what is the
          3    regulatory response.  I would remind you that we still have
          4    tech specs and rules and regulations to follow.  This
          5    assessment process does not set any of those conditional
          6    requirements aside.
          7              The question, I guess, is really how significant
          8    is the violation.  I think we are headed toward a process
          9    that would help us understand that, and that would then
         10    determine what sort of reaction the regulator would take in
         11    response to the violation.  I think that's really what the
         12    whole point in this assessment process is all about.
         13              CHAIRMAN JACKSON:  Are you operating from the
         14    perspective that the assessment process in the end should
         15    never lead to specific regulatory action?
         16              MR. BEEDLE:  No.  I'm saying that the assessment
         17    process would help the regulator understand how to treat the
         18    violation.
         19              CHAIRMAN JACKSON:  No, no, no.  Let's leave aside
         20    violations.  I'm talking about general performance.
         21              MR. BEEDLE:  I think the general performance would
         22    be dictated by the performance indicators.
         23              COMMISSIONER McGAFFIGAN:  Do either of you have
         24    any concerns about the process whereby we are going to try
         25    to integrate these objective performance indicators with
                                                                     121
          1    inspection findings in the areas where the performance
          2    indicators are not going to provide useful information, and
          3    then any other inspection findings that we come across?  Do
          4    you have any suggestions as to how to make that process more
          5    scrutable or transparent?
          6              MR. BEEDLE:  I think there are a number of areas
          7    where these performance indicators are not going to tell you
          8    about the compliance of the facility to the rules and
          9    regulations, and I think those are part of the
         10    core inspections and baseline inspections that they
         11    discussed earlier.  I think when you find problems as a
         12    result of those inspections to supplement the performance
         13    indicators, again the test of significance is whether or not
         14    they have created problems from a safety point of view.
         15              CHAIRMAN JACKSON:  Right, but if it's risk
         16    informed in the first place, presumably one is looking at
         17    the --
         18              MR. BEEDLE:  It would help you determine where
         19    your inspection effort would be devoted.
         20              MR. LOCHBAUM:  I would agree certain inspections
         21    have to continue because they are not covered under
         22    performance indicators, like human performance and training. 
         23    Fitness for duty falls in that
         24    category as well.
         25              I also think this whole process, I don't know
                                                                     122
          1    whether it's allocated equally among assessment, inspection,
          2    and enforcement, but collectively there is a greater
          3    emphasis on corrective action programs of licensees.  I
          4    think that greater emphasis will do more to ensure there
          5    are no more surprises than anything else.  The key
          6    difference between good and bad performance is the adequacy
          7    of their corrective action process.  Everything I've heard
          8    from all three components is to ensure that that is a good
          9    corrective action process.
         10              CHAIRMAN JACKSON:  Commissioner Diaz wanted to
         11    make a comment.
         12              COMMISSIONER DIAZ:  I just want to make a comment. 
         13    Without preempting the Chairman, I really want to express
         14    how good I feel about what is going on.  I think that the
         15    staff has made a very valiant and a very intellectual effort
         16    to get out of the box and think ahead and provide us with a
         17    risk-informed framework that will serve this country better. 
         18    I want to thank also the industry and the stakeholders, and
         19    Mr. Lochbaum.  This has been fast and furious, but it has
         20    been a very good process.  I am very encouraged by the
         21    results and I look forward to seeing you all again soon to
         22    finalize it.  Thank you.
         23              CHAIRMAN JACKSON:  Thank you.
         24              Let me just say on behalf of the Commission, I do
         25    commend the staff, NEI, all the stakeholders, for working
                                                                     123
          1    together in developing improvements to the plant assessment
          2    and oversight process.  I would urge, if it is true that
          3    others are provided with a fait accompli, that there be more
          4    real opportunity for participation and influence on the
          5    process, because in the end there are many stakeholders, and
          6    we have to ensure that the public or the public surrogates
          7    have an opportunity to be involved.
          8              As the staff itself has pointed out, although we
          9    have made significant progress, and I am commending you for
         10    that, much work remains to be done.  The decision
         11    thresholds, for instance, for increasing regulatory
         12    attention and regulatory action have to be well conceived;
         13    they need to be benchmarked against historical experiences
         14    and be as easily implementable as possible.  So I would urge
         15    the staff to stay focused, and all the stakeholders to stay
         16    focused on those principles.
         17              I would like to thank two of our stakeholders, Mr.
         18    Beedle from NEI and Mr. Lochbaum from UCS, for your
         19    comments.  I appreciate the thoughtfulness that you put into
         20    them.  They will be very helpful as we go forward.
         21              Unless there are further comments, we are
         22    adjourned.
         23              [Whereupon, at 5:00 p.m., the briefing was
         24    concluded.]
         25