                  UNITED STATES OF AMERICA
                NUCLEAR REGULATORY COMMISSION
                            - - -
             BRIEFING ON PRA IMPLEMENTATION PLAN
                            - - -
                       PUBLIC MEETING
           
                              Nuclear Regulatory Commission
                              One White Flint North
                              Rockville, Maryland
           
                              Thursday, April 4, 1996
           
          The Commission met in open session, pursuant to
notice, at 10:05 a.m., Shirley A. Jackson, Chairman,
presiding.
           
COMMISSIONERS PRESENT:
          SHIRLEY A. JACKSON, Chairman of the Commission
          KENNETH C. ROGERS, Commissioner
          GRETA J. DICUS, Commissioner
           
           
           
           
STAFF SEATED AT THE COMMISSION TABLE:
          KENNETH HART, Technical Coordinator, Office of the
           Secretary
          STEPHEN BURNS, Associate General Counsel for
           Hearings, Enforcement and Administration
PRESENTERS:
          JAMES TAYLOR, EDO
          DAVID MORRISON, Director, Office of Nuclear Regulatory Research
          ASHOK THADANI, Associate Director for Inspection
           and Technical Assessment, NRR
          GARY HOLAHAN, Director, Division of Systems Safety
           and Analysis, NRR
          CARL PAPERIELLO, Director, NMSS
          EDWARD JORDAN, Director, AEOD
           
           
           
           
           
           
           
           
           
           
           
                    P R O C E E D I N G S
          CHAIRMAN JACKSON:  Good morning.  I am pleased to
welcome members of the staff to brief the Commission on the
probabilistic risk assessment implementation plan.  The plan
is intended to be a management tool to help ensure the
timely and integrated agency-wide use of PRA methods and
technologies in the agency's regulatory activities.
          During recent years the use of PRAs in regulatory
activities has continued to increase.  Recently the
Commission has tasked the staff to accelerate its efforts to
develop a standard review plan and regulatory guidance for industry and staff use in preparing and reviewing requests based partially or totally on PRA insights.
          I expect the staff to provide a discussion of this
effort, including its status as well as any anticipated
difficulties.
          In addition, the Commission would like to hear
about the status and progress being made on activities
associated with industry initiatives, including quality
assurance, in-service inspection, in-service testing, and
technical specifications.
          Also, we would be interested in the staff's
strategy or plan to integrate all of the diverse PRA
activities in a structure or framework that will ensure a
consistent and stable regulatory process.  We would be
particularly interested in comments beyond the fact that we
know there is a coordinating group at the branch chief
level.
          I and my fellow commissioners are pleased to hear
from you today.  I understand that copies of the viewgraphs
are available at the entrances to the room.
          Do any of my fellow commissioners have any opening
comments?
          COMMISSIONER ROGERS:  No, thank you.
          COMMISSIONER DICUS:  No, thank you.
          CHAIRMAN JACKSON:  Mr. Taylor, why don't you
proceed.
          MR. TAYLOR:  Good morning.  As the Commission can
see, at the table I have a cross section of all the major
technical offices.  Bill Russell was to be here.  He may be
running a bit late.  NRR is represented.  I won't introduce
all these gentlemen, because I think you recognize them.
We provided a paper and slides to the Commission on March 26.  This presentation will be in several parts. 
The first part will be on reactor programs.  That will be
given by Ashok Thadani.
          MR. THADANI:  Thank you, Jim.  Good morning.
          May I have the first viewgraph, please.
          [Slide.]
          MR. THADANI:  I will go over some of the background covering the last several months' activities.
          Gary Holahan is going to go through some of the
details of each element that is in the PRA implementation
plan, where we stand, and some of the significant issues
that have developed as part of the process that we have been
going through.  He will also summarize what our next set of
activities is.
          Next viewgraph, please.
          [Slide.]
          MR. THADANI:  The PRA implementation plan was sent
to the Commission in March 1995.  In April the staff briefed
the Commission on that implementation plan and associated
activities.  
          With Commission approval, in August 1995 the PRA
policy statement was published, which provided the
conceptual guidance on how far to proceed and what some of
the significant factors were that needed to be considered.
          In November 1995 we responded to Commission
questions on the applicability of the process that was used
in the maintenance rule implementation, whether that process
could be applied in other categories.  The staff's
conclusion was that the process could indeed be applied in several other applications.
          Subsequent to that, in November of last year, the
staff provided its framework for applying probabilistic
techniques in regulatory activities.  There were four parts to this framework.  The first was that the various regulatory applications could be grouped into different bins, so to speak.
          One bin was screening-type decisions, where one could go forward with fairly approximate studies.
          The next category was risk ranking applications, where one would divide systems, structures and components into high and low safety significance and determine what type of data would be needed.
          Finally, the third category was applications that would require very detailed analyses.  For example, if one were to modify technical specifications, particularly if one were to delete certain requirements, one would have to go through very extensive evaluations.
          That was the first step, definition of different types of applications.
          The second step in the process was to make sure we understood our regulatory requirements as far as that application was concerned and what deterministic assessments had been done in the past, taking that into account.  The third was going forward with conducting probabilistic assessments.  Finally, the fourth was integrating both the probabilistic studies and the deterministic evaluations that had been done, through some means such as perhaps an expert panel concept.  That was discussed in the framework paper that was issued in November.
          May I have viewgraph number four, please.
          [Slide.]
          MR. THADANI:  As the Chairman noted, in November
you asked that we accelerate development of the regulatory
guides and standard review plans.  
          In response to that recommendation, the staff did
provide its plans and schedules for accelerating development
of the regulatory guides and the standard review plans.  The
other element in that response was to identify that there
would in fact be close senior management attention to this
activity to make sure that we do end up with these products
on a timely basis.
          Today's meeting is going to address the elements covered in the March 26 paper that we sent to the Commission.
          We will continue to provide quarterly updates to
the Commission as well as semi-annual briefings as asked for
by the Commission.
          Viewgraph number five, please.
          [Slide.]
          MR. THADANI:  As I said, the policy statement
provided what I would call a conceptual framework that was
to be utilized in developing the implementation plan.  Key
pieces in the policy statement are described here.  
          That is, in whatever applications probabilistic techniques are used, those techniques must in fact be supported by appropriate methods as well as data.
          The decision process should use probabilistic
techniques as complementing the deterministic assessments
that have already been done.
          Finally, it was very important to make sure that
one pay close attention to the concept of defense-in-depth,
which I guess I will characterize as balance in design. 
That is, there should in fact still be multiple layers of protection so that if one were to make a mistake in one area there are still other layers of protection that are not lost.
          Another important element in the policy statement
is that at this time PRAs or such analyses are not
substitutes for meeting rules, regulations and requirements,
that those rules, regulations and requirements must be
adhered to until they are revised in a formal revision
process by the agency.  
          This issue has come up again as a result of some
information we have received.  It was important to make sure
that all sectors of the agency knew that that is what the
policy statement had indicated.  We have gone back to all
the regions and other folks at headquarters to reemphasize
this point.
          Another piece of guidance provided in the policy statement was that in the staff's application of probabilistic techniques, if the criteria are based on the safety goals or subsidiary objectives of the safety goals, they are to be applied only in generic activities, generic decisions.
          May I have the next viewgraph, please.
          [Slide.]
          MR. THADANI:  The implementation plan, as I
indicated, does go beyond the policy statement and
identifies topics, schedules, responsible organizations. 
Many of these activities in the implementation plan in fact
require joint office evaluations and development.  I am very
happy to say that these interoffice activities are going
very well and there is very good cooperation as we go
forward trying to implement these activities.
          As I said and as the Chairman said, the regulatory guide and standard review plan development had to be accelerated.  We have assigned it a high priority.  I do want to recognize the effort that a lot of people are putting in to make sure we meet those milestones.
          In the backup viewgraphs, from viewgraph four
through eight, we have the names of staff from Office of
Research as well as NRR who are working together in
developing these regulatory guides and standard review
plans.
          CHAIRMAN JACKSON:  Maybe at the end of the meeting
you can put them up like credits at the end of a program.  I
would like to see you do that.
          MR. THADANI:  I must say that there is a lot of enthusiasm.  Things are going quite well.  I am very satisfied with where we are today.  The staff meets with the line
organizations regularly.  As you noted, there is a
coordination committee consisting of branch chiefs from NRR,
Research, AEOD and NMSS who work with the staff.  I meet
with the whole group once a month to get an idea of where we
are and what some of the issues might be and to provide
assistance as I can.
          As I said, a lot of progress has been made.
          During this period some significant issues have
come up.  Gary is going to go through those, but I will give
you an example or two.
          As I said, the policy statement says to use safety goals for generic decisions.  If we have to make plant-specific decisions, should one utilize safety goals or subsidiary objectives for those plant-specific decisions?  Should these
changes be risk neutral or could they lead to some small
increment in risk?  If that is the case, what are the
criteria that say that increment is acceptable?
          We are not at this stage asking for any decision
on these issues.  We need to develop these issues further. 
Then we expect to come to the Commission for guidance on how
to proceed on those issues.
          Gary is going to cover many of these issues.  I
just wanted to say that the pilot studies are also
progressing well.  There is one pilot study that is not
going as well as we had hoped, which is in-service
inspection.  It appears that that will be delayed.  We can
discuss that as we go forward.
          In order to accelerate development of the regulatory guide and the standard review plan, it was important to go back and take another look at the whole implementation plan.  We have made certain adjustments in the implementation plan.  Some of those adjustments involve delaying completion of some other activities.  We will touch on those.
          As Mr. Taylor noted, this is not just an NRR and
Research activity.  The standard review plan and reg guide
are essentially Research and NRR activities, but the
implementation plan has a whole range of activities that are
identified, and both AEOD and NMSS have a significant part
in terms of developing those activities further.
          Gary is going to cover AEOD and NMSS activities also in terms of where we are on the accident sequence precursor program, where we are on the data rule, and also in terms of
how PRA techniques are being used in addressing high-level
waste and low-level waste issues.
          I did want to note that there is considerable international interest in this topic.  The Committee on Nuclear Regulatory Activities, CNRA, did have a number of countries participate in developing a report on regulatory approaches to PSA in OECD member countries.  That report is
complete.  If you would like copies of this, we will make
these available.
          CHAIRMAN JACKSON:  Why don't you do that.
          MR. THADANI:  It turns out that most of the
Western European countries are applying these techniques to
their decision-making process at some level.
          It was also clear at the last CNRA meeting that no
country really had any procedures and criteria for applying
these techniques in regulatory decisions.  They had used
these techniques, but there were no procedures and criteria, which is the same thing you asked us to develop quickly, a regulatory guide and standard review plan.  So there is a
group now under CNRA working on the same activity.  
          We are participating in that, and our
participation is sort of parallel to what we are doing in
developing our own reg guides and standard review plans.  I
think during this participation we may learn some things we
may want to incorporate in what we are doing.
          With that as background, Gary will go through the
plan itself.
          MR. HOLAHAN:  Slide number seven, please.
          [Slide.]
          MR. HOLAHAN:  First I will discuss the revisions to the implementation plan, then some of the accomplishments to date and ongoing activities, and conclude with actions that we are planning to take in the next six months.
          I would first like to mention that from the very
beginning the implementation plan was meant to be a living
document in the sense that we recognized that circumstances
would change, that new issues would arise, and that
priorities might change in such a way that revisions to the
plan would be necessary and in fact healthy.  We are at one
of those stages now, but this is not a one-time change.  I
expect to see other additions and revisions in the future.
          The biggest change to take place recently is
focusing much of the attention on the development of
regulatory guides and standard review plans in the reactor
area.  I will spend some time talking about those five
activities in some detail.
          One of the implications of that focus has been to
put our pilot activities in the PRA area into a slightly
different context where they are now directly supporting the
development of guidance activities, but it has also caused
us to have to rearrange some priorities.  
          Where we had in the plan the consideration of developing a standard review plan for construction and design errors and some reevaluations of NUREG-1150, those items have been deleted as low-priority items.  In addition, an item has been deferred where we had planned to address PRA issues for non-power reactors.  Because of the lower safety significance, and also because state-of-the-art methodologies don't really exist currently for non-power reactors, we thought we would give that a lower priority.  It will probably be picked up and revisited in a time frame after the development of the reg guides and the SRPs.
          In addition to some prioritization changes, a few
new tasks have been added to the plan.  I count
approximately 120 identifiable tasks in the implementation
plan at this stage.  
          The new ones are associated with some inspection
activities where we think it is important to take PRA
insights and get them into the inspection program, into the
field offices so that our inspection activities are more
risk focused.  That is being done in part in preparation for the inspections of the maintenance rule, which will begin after the rule implementation in July of this year.  There is also some new guidance on the inspection of design changes at reactors.
          In addition to the other training activities that
have taken place over the last few years, there is some
activity now to develop PRA training focused on inspectors
and what use inspectors can make of PRA and risk insights.
          Slide number eight.
          [Slide.]
          MR. HOLAHAN:  In terms of accomplishments to date, I think there has been substantial progress made on the regulatory guides and standard review plans.  In each case the scope of activities has been set out in a detailed outline of what those documents will be like.
          There has been a considerable amount of discussion
both within the staff and between the staff and industry on
the role of the pilot applications and how those will be
used in developing the regulatory guides and the SRPs.
          I think it is worth mentioning at this stage that
probably every time in the paper and virtually every time in
the slides you will see regulatory guide and SRP mentioned
together, because they are really not being treated
separately.  
          The way those are being written and the way even
the teams are being structured, one group of people is
developing both the regulatory guide, which is really
guidance to the industry as to what the NRC expects, and the
standard review plan, which is guidance to the staff as to
how to review industry applications.  Those are being done
together by the same group of people.
          There has been some progress on the PRA methods
development.  I will mention a few examples a little later.
          In terms of the IPEs, the examination for severe
accident vulnerability goes back to 1988.  I think we have
made substantial progress on that.  Submittals have been
made by all licensees.  We are well along on the reviews. 
Forty-five of the 75 reviews have been completed, meaning safety evaluation reports have been written by the staff back to the licensees.  The remaining ones will be completed this summer.
          I think we had originally planned on completing them by June.  A number of those are being re-reviewed because of some difficulties.  I think the main area where the staff had problems with some of the IPEs had to do with the treatment of human reliability in the analysis.  That is a very difficult area.  We think some of the IPEs were not up to the standards that we expected.  There is an additional review effort to get those completed.
          In addition, I will mention the common cause
failure database developed by AEOD, which I think is an
important step forward.  Common cause failures are a very
important element of PRAs.  The most likely mechanism for
losing redundant equipment is to have some hidden common
cause failure.  I think the study that has been done is an
advancement to the state of the art and it will be folded
into both the regulatory guide and the standard review plan.
          There have also been recent studies on high pressure coolant injection systems in boiling water reactors and on emergency diesel generators.
          Slide number nine, please.
          [Slide.]
          MR. HOLAHAN:  Since the last Commission briefing
the 1984 accident sequence precursor report has been issued. 
Those analyses were completed.
          CHAIRMAN JACKSON:  You mean 1994.  You said 1984.
          MR. HOLAHAN:  Excuse me.  In fact, 1982 and 1983
were also completed, but I believe 1984 was done in about
1986.
          There has also been publication of the proposed reliability data rule, which would support risk-informed regulation.
          There have been a substantial number of
improvements in the training programs and I think a
substantial increase in the number of individuals taking
those courses.
          In the materials area, there has been some progress on performance assessment, which I would describe as a PRA-related methodology, particularly in some demonstration projects in the high-level waste area.
          Slide number 10.
          [Slide.]
          MR. HOLAHAN:  With respect to the standard review
plans and the regulatory guides, I would like to give some
details as to where the staff is in that arena.
          First, we really have two types of activities
going on.  One is the development of a general regulatory
guide and standard review plan which will establish general
scope and quality guidance and expectations, which would
really apply to all applications.  
          I think it is important to have that, because in
addition to the application-specific regulatory guides, the
four examples that we are dealing with now, we expect in the
future there will be a fifth and a sixth and at some point
there may be many others.  When the general guidance is in
place, that will be helpful in allowing us to make a fifth
regulatory guide and a sixth, and I think it will establish
the expectations for all future issues.
          With respect to application-specific regulatory
guides and standard review plans, we are working on four at
the moment: in-service testing, in-service inspection,
graded quality assurance, and technical specifications.
          In each case there is a team in place.  Each team
has an action plan with schedules and milestones, and in
each case they have at least a draft outline of the
regulatory guide and standard review plan which really
establishes the scope and organization of what is to be
accomplished.  In some of the cases we are even a little
further along than that.
          With respect to these issues, there have been
numerous meetings with the ACRS and numerous meetings with
the industry.  Most of the industry meetings have been in
the context of the pilot applications and how they fit into
these activities.
          One thing I think is worth mentioning -- Mr. Thadani mentioned it earlier -- is that the one item on this list that is not consistent with the schedule we had originally laid out is the in-service inspection activity.
          We still think it's possible to meet the final schedule for the regulatory guide and standard review plan at the end of 1997.  The other cases are to be done about a year ahead of that, that is, at the end of this year.  That is not possible in this area because of some delays in the industry submittals.
          I think that is not such a serious blow to the program.  As Mr. Thadani mentioned, we are really developing guidance to cover three types of applications: screening analysis, risk ranking, and detailed analyses.  The in-service inspection activity is one of a number of examples of a risk ranking type application.  Even if the in-service inspection activity is delayed, or in fact if it were not done on this schedule, I don't think it would have a major impact on our development of the general guidance activities, because in many ways it is similar to the in-service testing activity.  The methodologies involved will be tested pretty well in the in-service testing area and also in the graded QA area.
          As I mentioned, we expect by the end of this year
to be well along with the draft guidance and have the final
ones in place at the end of 1997.
          Slide number 11, please.
          [Slide.]
          MR. HOLAHAN:  I think slide number 11 is pretty well covered, except to say that in the graded QA area I think there has been a little more progress, and there is actually a preliminary draft of a regulatory guide, beyond the detailed outline that exists in most of the other cases.
          Slide number 12.
          [Slide.]
          MR. HOLAHAN:  With respect to the pilot applications, in the motor operated valve area, which is related also to in-service testing since in-service testing covers pumps and valves, we dealt with a specific piece of valve testing, which was really a follow-up to a testing program that the NRC put in place with Generic Letter 89-10 to address valve performance under in-service conditions, in other words, with the pressures and flows from actual accident conditions.
          The Boiling Water Reactor Owners' Group had proposed a PRA-related technique for establishing priorities in valve testing, that is, which valves should be tested more often and which could be somewhat delayed.  The staff
reviewed that application and in fact issued a safety
evaluation report agreeing with their approach in February
of this year.
          The in-service testing program is a much broader
issue.  We have two pilot plants at the moment, Palo Verde
and Comanche Peak.  I think those reviews are progressing
well.  There has been a request for additional information. 
There have been a number of meetings and site visits.
          In addition, the industry documents in this area
are under review and the staff has provided comments.  I
think what is happening is industry standards are being
developed in parallel with the NRC guidance.  
          In several of these areas it is not entirely clear
how the format of the regulatory guides will end up.  In
some cases, where there is a well established and acceptable
industry standard, the staff would reference that standard
as in large part an acceptable way of addressing in-service
testing.  In other areas it may be that an industry guidance
document would play a minor role or perhaps no role at all,
or the staff might endorse an industry guidance document in
part to be supplemented with some other considerations that
the staff felt were important.
          Exactly how the industry standards and the staff
guidance fit together is part of this developmental process,
but in each case I think there is substantial industry
activity going on.  Exactly how that ends up remains to be
seen.
          CHAIRMAN JACKSON:  I am going to come back to that
one.
          MR. HOLAHAN:  Slide number 13, please.
          [Slide.]
          MR. HOLAHAN:  In the in-service inspection area
there have been a number of meetings with the industry;
there have been a number of discussions as to what were
suitable pilot plants.  Because the industry had been
developing more than one ISI technique, we wanted to make
sure that we were testing those approaches.  
          We also wanted to make sure that we were
addressing both boiling water reactors and pressurized water
reactors.  So we have come up with a collection of pilot
plants, including ANO-2, which is a Combustion Engineering
plant, Fitzpatrick, which is a boiling water reactor, and
Surry, which is a Westinghouse designed plant.  We are
expecting some industry documents by June of this year. 
That really is a part of the critical path of getting the
draft document done by early in 1997.  We will be watching
that schedule closely.
          In the graded QA area, the staff has ongoing
discussions with three utilities, South Texas, Palo Verde
and Grand Gulf.  There have been numerous site visits and
discussions.  I think that is progressing.
          This is a risk ranking type application in the sense of deciding which equipment is more important than other equipment and therefore should be given more detailed attention.  One issue is that there has to be a strong approach for having confidence that you have really identified the important equipment when you are separating it from the less important equipment.
          One of the features of a graded QA approach that
the staff is enthusiastic about is that there may be
important pieces of equipment in the plant which a risk
analysis would identify which have not traditionally been
treated as safety-related equipment in the sense that they
are not design-basis accident mitigation equipment.  It
could very well be that an additional standby pump or some
other equipment might play an important role in risk even
though it doesn't happen to fit into the deterministic
design basis of the plant.  
          A graded QA approach which uses risk insights to
identify that equipment and to give it additional attention
that it wasn't getting before is an important advancement. 
One of the things that needs to be worked out is if there is
what is called non-safety-related equipment which also turns
out to be important, what do you do with that?  What kind of
QA is appropriate for that equipment?  It doesn't fit into
the traditional QA programs, and so whether it fits in the
same category as other important safety-related equipment or
whether we should give it some special focused attention is
part of what needs to be worked out in this activity.
          MR. THADANI:  Gary, let me add to that.  I think
there are two issues.  The first issue is exactly what Gary
described.  Appendix B applies to safety-related components, systems and structures.  We all know that there are
so-called non-safety-related components that have a
significant impact on risk.  
          If we go forward with the approach of risk ranking
and we have two categories, let's say high safety
significance and low safety significance, there is no doubt
in my mind that some of the structures and components that
rise up to high safety significance would be non-safety-
related components.  If the industry has two different QA
approaches, then those non-safety-related components clearly
would have to get higher attention than they were getting
before.
          The more difficult issue, I think, is going to be
things that fall into the low safety significance category. 
It is very clear that the criterion the industry would prefer is that if there is a failure, then corrective action is taken, and that you don't need to do a lot more, because these components are not that significant.
          Our view is that no matter what, for each failure one should be able to do a thorough root cause analysis and corrective action plan.  In order to do that, one needs to have a certain amount of information available on that component.
          The second element we want to make sure of is that if a component fails, even if it is not safety significant in itself, there may be similar components within the plant in other systems, and one needs to keep such information: know where these components are located, in what systems, and so on.  What that means is that even for the so-called non-safety-significant components, certain information has to be kept so that one can in fact perform what I would call key analyses that could have a significant impact on risk.
          That is part of the debate in the review that is going on now: where is the industry going and what do we think about that?
          At this point, it seems to me that for some applications we are coming together, but we will wait and see what happens.
          MR. TAYLOR:  I'm not in the group doing this review, but I think even the industry through the years has recognized the potential risk significance of systems such as air systems, where air in some parts of the plant is fairly mundane but is used in the operation of certain safety-related or very important valves in the plant.
          That sort of recognition always gets to be important.  It doesn't mean you have to procure that equipment to all of the requirements of Appendix B, but it raises the status of that equipment, particularly the ability to supply air and a continuing supply.  We have seen this, of course, and then we have seen utilities look at air systems and say, gee, do I have sufficient capability?  Because you are always having trouble with compressors; they are out of service, and so forth.  I've seen some plants decide they will buy a backup diesel, a sort of cart-mounted diesel compressor, for air service.
          This is a sort of fertile area where you use the risk potential to take a deeper look.  It doesn't necessarily mean rebuilding the system, but giving it a lot more attention.
          MR. HOLAHAN:  In the area of the maintenance rule,
there were nine pilot site visits done in order to determine
that the implementation approach to the maintenance rule was
viable and working well.  That activity was completed and
there was a workshop conducted last summer on the subject.
          In addition, there is some inspection type
training going on because there will be a baseline
inspection basically covering all plants and their
maintenance rule implementation.
          In addition, the use of risk insights in the
maintenance rule occurs in three different areas.  One of
them has to do with identifying more or less
safety-significant equipment.  As we go along, I think we
are learning more about how to do that.  We feel that it is
important that the industry is learning along with us.
          In the implementation of the maintenance rule, a
key part of that implementation is done by an expert panel
which takes both deterministic engineering insights and risk
insights and combines those to come up with a list of more
or less risk-significant, safety-significant systems for the
maintenance rule.  So the expert panel plays an important
role in that.  
          One of the areas that we think may need additional clarification is probably an enhanced role for the expert panel, in the sense that as we learn more about what a good quality PRA really means, those questions and those issues really ought to be on the minds of the expert panel as they are deciding which systems are important enough to be given specific treatment in the maintenance rule.
          Slide number 14.
          [Slide.]
          CHAIRMAN JACKSON:  I'm going to ask you to talk a
little faster.
          MR. HOLAHAN:  Maybe I will say fewer things.
          CHAIRMAN JACKSON:  No.  You can increase the
speed, not decrease the volume.
          MR. HOLAHAN:  The last pilot application is the
technical specifications.  The CE Owners' Group has given us
a request for extension of allowable outage time of
equipment.  We are also dealing with the South Texas Project on two systems, the service water system and the emergency diesel generators.
          What we appear to be converging on is what we are
now calling a 3-tiered approach, which is specific
limitations on when a piece of equipment can be out of
service based on risk insights, and also using risk insights
to decide what other equipment would be particularly
important during that outage period of time and putting
specific controls on that other equipment.  
          A simple example.  If a plant has two diesel
generators and one is out of service, you really need to
make sure that the other diesel generator is given special
attention in that period of time.
          The third piece of this 3-tiered approach is also very important.  If a piece of equipment is going to be out of service for some extended period of time -- we have seen applications for 14-day outages or 21-day outages -- other things can occur during that period of time, and sometimes they could be driven by equipment failures that need to be dealt with or they may be just planned activities.
          We are saying the third tier in this approach would be a risk management approach based on what the maintenance rule already calls upon licensees to look at, the impact of taking equipment out of service, and to give special attention to taking equipment out of service, or finding equipment out of service, while they are in an allowable outage time, because that may complicate the situation also.
          We are working out that process.
          Slide 15, please.
          [Slide.]
          MR. HOLAHAN:  In the methods development area,
there have been two notable items in the reactor area, both
related to developments for treating errors of commission. 
This is a difficult subject.  Traditionally it has been much
easier to put probabilities on reactor operators failing to
take the proper action, but it is much more difficult to
have a technique which says, in addition to that, what else
could they do wrong?  This has been an issue that has been
recognized ever since the reactor safety study in the
mid-1970s, and I think there is some significant progress
being made in that area.
          Number 16, please.
          [Slide.]
          MR. HOLAHAN:  With respect to the individual plant
examinations and the examinations for external events, I
think there has been good progress.  As I mentioned earlier,
45 safety evaluations have been written, with the rest to be
completed by September.
          There is also a preliminary insights report.  I think that is important for looking at the overall industry and identifying what issues are important, in addition to the IPE program, which was really intended to find any plant-specific vulnerabilities.  But it is an excellent opportunity for learning broadly what areas are important.
          In the IPEEE program, the NRC's request came
afterwards.  So we are still in the process of receiving
submittals from the industry.  Five of our reviews are
completed; an additional 20 are under review and will be
completed within the next several months.
          The staff is working on a plan to complete the
additional 49 or 50 IPEEEs but to try to do it in a way 
that focuses on just the most important issues.  
          Because of resource considerations, although we spent a lot of resources on the first five and will do a pretty in-depth review of the initial submittals, we think we can learn from the first few submittals and focus our attention on those items that are most important so as to be more efficient on the remaining ones.  There is a Commission paper due in the near future which will lay out our approach for accomplishing that.
          MR. THADANI:  There are a number of issues in the past on which we have said we don't believe we need to take any action because those issues would be addressed as part of the IPEEE.  So the review process we are going to go into is to identify all the areas where we have said we were going to rely on the IPEEE evaluations, lay out those issues, go back, take a look at the IPEEEs, and review them so that we can make sure we can make decisions on those specific regulatory issues.  So the review is going to be driven by the decisions that we need to make on those issues.
          MR. HOLAHAN:  Slide number 17, please.
          [Slide.]
          MR. HOLAHAN:  The draft of the reliability data
rule was published for comment.  The comment period ends
this summer.
          A regulatory guide will be completed this month.
          We expect, as part of the rulemaking process, that there will be a public workshop and comments, and we are targeting the end of this year for putting that rule in place.
          Number 18, please.
          [Slide.]
          MR. HOLAHAN:  The accident sequence precursor program is a program to look at actual operating events, actual reactor events, and to use probabilistic risk assessment techniques to identify the most significant of those events and also what about those events was really important.  That program calculates a conditional probability of core damage given the event that did occur: how many more things would have had to fail, how much worse it would have had to be, in order to have gone to core damage.  In that sense, it is a measure of how much margin was left, how close we came to a core-damaging event.
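          [As a rough illustration of the quantity being described, using standard conditional probability rather than language from the staff's paper: for an observed precursor event E, the conditional core damage probability is
          CCDP = P(core damage | E) = P(core damage and E) / P(E),
in practice evaluated by requantifying the plant risk model with the failures and conditions observed in E assumed to have occurred; a higher CCDP means less margin remained.]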
          The report for the 1994 events was published in
December.  
          In addition, the program, which has gone on for many years, had a gap in it.  It had not previously been funded to cover the years 1982 and 1983.  There was a feeling that there was a sort of incompleteness to the program, that there might be some insights in those years, and also that it would be helpful to the ability to make any judgments about trending; it was important to fill in those gaps.  Those analyses have been completed.
          An accelerated program is in place now, so that
rather than waiting for all of the events of the year and
trying to deal with them all at once, they are being dealt
with in AEOD on an event by event basis.  So not only are
the 1995 analyses being done on an event by event basis, but
in fact some of the 1996 events are also being started. 
There will still be a compilation report.  I guess in the
AEOD annual report there will still be an annual
compilation, but there would be more of an event by event
analysis available to the staff and the industry to see what
is important.
          Slide number 19, please.
          [Slide.]
          MR. HOLAHAN:  AEOD also has a number of other
initiatives related to using risk techniques in reviewing
operating experience.
          There is a general plan to increase that activity.  The common cause database is something that I mentioned earlier, and I think it is an important advancement.
          There is also a series of safety system performance studies, which are basically equipment reliability studies that give good insights across a number of reactors and also across a period of time.  Those are dealing with high pressure coolant injection, emergency diesel generators, isolation condensers, a number of important safety systems on boiling water reactors and pressurized water reactors.
          One important element of this program is that not only does it help identify important equipment or trends in equipment reliability, it can be used as a building block for the inspection program, focusing inspection activities on equipment that has either been shown to be degrading or to be less reliable than other equipment, or in fact focusing on a particular plant or set of plants that might be out of line with their peers.
          This is an important check where the staff and industry are doing PRAs and using them in regulatory applications.  Here is actual operating experience that can be compared with the assumptions in the PRA to give a good, objective sanity check.
          Number 20, please.
          [Slide.]
          MR. HOLAHAN:  The PRA training activities are also
in the Office of AEOD.  There have been revisions,
improvements, I would say, to the curriculum in a number of
areas.  
          There are also additional items which are being
tested and are being planned to deal with configuration
management and uncertainty.  A number of important issues
are being worked into the program.
          One of the ones that I liked is the idea of having a course for technical managers, because if the implementation plan is to succeed, it cannot be only for the few experts in the few expert branches.  It has got to be a broadly understood and implemented program.
          Chairman Jackson, this goes to one of your comments in your introduction, that if risk-informed regulation is going to be an agency approach, it needs to be worked into the infrastructure of the agency, not only as programs but as an understanding on the part of the staff.
          The senior reactor analyst program is an important
program that I would like us to spend a moment on.  There
are ten senior reactor analysts in training.  They are in a
two-year training program.  They are predominantly senior
level inspectors from the regional offices.  After their
training is done they will go back to the regional offices. 
They are receiving training in probabilistic risk assessment
techniques.  They are having rotational assignments in the
branches that are dealing with PRA activities.  
          They are developing expertise to be taken back to the regional offices.  They are also developing a strong understanding of what tools are available here in headquarters, and, maybe as important as anything else, they are making contacts here so that when they go back to the regions they can form a communications link between the regional offices and headquarters.  I think that is also an important part of what I will call the risk-informed infrastructure of the agency.
          MR. THADANI:  These senior reactor analysts will
become part of the baseline inspections that will be done in
terms of follow-up to the maintenance rule implementation. 
They will participate in those inspections.
          MR. HOLAHAN:  Slide number 21, please.
          [Slide.]
          MR. HOLAHAN:  In the waste management area, on which parenthetically I might need some help, performance assessment techniques continue to be used in the high-level and low-level waste areas.
          In the high-level waste area there is basically a
three-phase program for implementing performance assessment. 
There was an initial demonstration phase, which was
completed.  Then back in October of 1995 there was
completion of the second phase, which was characterized as a
completion of the demonstration.  The third phase, which is
really an application of the methodology, is an ongoing
activity.  
          In the high-level waste area the key issues seem to be related to timing and a probabilistic treatment of timing issues, such as the time frame of interest, the period for which it is appropriate to give credit for engineered barriers, and the treatment of issues such as the evolution of site conditions with time.  So performance assessment is a probabilistic way of dealing with those difficult issues.
          In the low-level waste area there is also a
performance assessment activity.  There is a plan to publish
a branch technical position this summer.  That is about all
I know on the subject, unless Carl would like to help some.
          MR. PAPERIELLO:  We are doing it for low-level waste performance.  We have just begun to do it in SDMP performance assessment.  What you do is you vary parameters and you find out what parameters change the outcome.  So I kind of look on performance assessment, compared to PRA, as contradictory.  PRA is more of a deterministic probability where you have binary values for things, whereas for the parameters that we have in performance assessment you have a distribution function.  You look for which distributions affect the outcome and to which ones the outcome is not very sensitive.  For high-level waste it tells you whether we should worry about a particular phenomenon or whether it is not going to change the outcome very much.
          Basically, that is how performance assessment is
used in NMSS.
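          [A minimal sketch of the kind of parameter-variation study being described, assuming a purely hypothetical dose model; the parameter names, distributions, and correlation check are illustrative only and not an NMSS tool:

          import random
          import statistics

          # Hypothetical dose model (illustrative only): dose grows with
          # infiltration and solubility and shrinks with retardation.
          def dose(infiltration, solubility, retardation):
              return infiltration * solubility / retardation

          random.seed(0)
          runs = []
          for _ in range(10000):
              # Each input parameter is sampled from an assumed distribution.
              params = {
                  "infiltration": random.lognormvariate(0.0, 0.5),
                  "solubility": random.uniform(0.1, 1.0),
                  "retardation": random.lognormvariate(2.0, 1.0),
              }
              runs.append((params, dose(**params)))

          # Correlate each sampled input with the outcome to see which
          # parameter distributions the result is sensitive to.
          outcomes = [d for _, d in runs]
          for name in ("infiltration", "solubility", "retardation"):
              values = [p[name] for p, _ in runs]
              corr = statistics.correlation(values, outcomes)  # Python 3.10+
              print(f"{name}: correlation with outcome = {corr:+.2f}")

The parameters whose samples correlate strongly with the outcome are the ones worth worrying about; the others barely change the result, which is the sensitivity insight described above.]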
          MR. HOLAHAN:  Thank you, Carl.
          Slide number 22, please.
          [Slide.]
          MR. HOLAHAN:  The last two topics I would like to
cover are emerging policy issues and then what activities we
expect to be completing over the next six months.
          The emerging policy issues are discussed in a
Commission paper.  We felt it was important to give the
Commission early warning on potentially complex issues that
are coming up.  At this stage we don't feel we need
Commission guidance.  We don't have a specific proposal for
the Commission for dealing with these issues, but we thought
it was important to identify them early on.
          Mr. Thadani already mentioned the issue of the use of the safety goals in decision criteria.  I think this is an issue that the ACRS has raised with the staff.  Because risk-informed regulation will call for some decisions, it seems to me that in some sense the decision criteria that the staff has on individual issues need to be in some way informed by and consistent with the safety goals, but how that plays out is something to be developed.  We understand that the Commission wants to be involved in such a decision.
          With respect to performance-based regulation, the
relationship between performance-based regulation and
risk-informed regulation is not completely defined.  There
are some elements of performance-based regulation which are
inherent in risk-informed regulation.  For example,
equipment reliability or living PRAs by definition are
feeding back the performance of equipment into an ongoing
assessment of risk insights.  
          But there is an additional element of
performance-based regulation that we need to deal with, and
this is very much related to industry initiatives to move
towards performance measures as opposed to programmatic
requirements.  
          To the extent that focusing on performance means focusing less attention or no attention on programmatic requirements, I think the staff wants to take a cautious approach to this activity to make sure that if in fact we are using a performance-based approach in a given area, there is measurable information, that the information will give good insight as to the safety significance of activities, and that, as in the maintenance rule, any failures or poor performance do not result in unacceptable or intolerable conditions.
          The maintenance rule is a good example of a performance-based approach, because individual equipment failures can be counted and can be addressed.  So long as there
is something to measure and measuring failures is not
unacceptable, that is a reasonable approach, but there are
other areas for which programmatic requirements are probably
very important.  
          The example perhaps is a little bit extreme, but I think it is a helpful one.  In the seismic area you simply can't wait and count earthquakes and see how well the plants perform.  It simply doesn't make any sense to set aside a strong program and an inspection program where there is really nothing to measure; there is no output in the normal performance-based sense.  Either we have to find other things to measure, other surrogates for real performance in a demanding situation, or else it is more appropriate to continue to focus on the strengths and the qualities of the program that give you confidence that the diesel generators and the buildings are built to strong standards so that they are seismically capable.
          That is an issue that we need to sort out, how
much belongs with the PRA implementation plan and what is
the right mix of performance-based and programmatic
requirements.
          MR. THADANI:  I think that issue probably should
be discussed a little bit further.  Even under the
maintenance rule the criteria that the industry is setting
up are not probabilistic or numerical criteria in terms of
performance of systems and components.  For example, for pumps they might set up criteria like changes in flow or changes in vibration.  That is, there would be some engineering-based criterion that would be set up, and they would be monitoring that pump, let's say, to make sure that they don't get that condition.  That condition is a precursor to potential failure, so problems can be detected in time before they lead to failure.  That approach is going to be applied for all the components that are in the high safety significance category.
          In some rare cases, if the component performance is not acceptable, they would move those components into a category where they set up goals, go back and take a look at their programs, and ask questions: why are we seeing poor performance from this component?  They would modify their program but put that component in a category where attention is given by management.  That is called category (a)(1).  At that point there is a goal, and that goal could be numerical, but by and large we don't expect numerical goals for components even under the maintenance rule.
          MR. HOLAHAN:  The third item is related to the second, which is that the staff needs to settle, and presumably bring to the Commission, its advice on how to treat increases in risk which may be allowed as part of risk-informed regulation.  It is certainly included in the current industry guidance.
          The last item is really an implementation.
          CHAIRMAN JACKSON:  The last item you mentioned,
the risk-informed in-service testing and inspection?
          MR. HOLAHAN:  Yes.  
          CHAIRMAN JACKSON:  We are going to talk about that
if we get a chance.
          MR. HOLAHAN:  I would propose to give you a chance
by simply saying slide 23 and 24 are two pages of promises
for the next six months.  I will note that it is a long list
of significant promises.  I won't go into them in any
detail.  I won't even list them.
          CHAIRMAN JACKSON:  It sounds like you are on track to fulfill them.
          MR. HOLAHAN:  Yes, ma'am.
          CHAIRMAN JACKSON:  Let's have a few questions. 
You talked about the emerging policy issue with respect to
in-service testing and in-service inspection.  It seems that
there is this issue of the methodology for the review and
approval of changes, perhaps what someone might want to call
risk-informed changes to in-service inspection and testing
requirements.  
          I guess the question I have, and I think
Commissioner Dicus has a similar concern, is, how do you
make a finding under 10 CFR 50.55a based on the licensee
submittal alone without having the benefit of information
that you may have gotten from the pilots?  
          Put another way, is the pilot being used de facto
or being judged de facto to be an acceptable alternative by
definition which then is subject to change after the pilot?
          MR. THADANI:  If I may go back, 50.55a states that the utility should meet the ASME standards in terms of in-service testing.
          CHAIRMAN JACKSON:  Exactly.
          MR. THADANI:  The requirement may be that each
safety-related pump has to be exercised quarterly.  It may
turn out that not all pumps are equally important; some
pumps in the plant are more important than others.  
          The idea behind this approach is that the
utilities take all those components and try to develop an
understanding of the relative importance of those
components.  If some of the pumps have less safety
significance, then they could assign lower testing
frequency.  The code calls for quarterly testing, for
example.  In this case they may go to six months, a year, or
some longer time period before they test those components.
          The staff has to review that.  The code allows the
staff to grant approval if there is an alternative
approach that the licensees are using that is deemed of high
quality and is in fact acceptable to the staff.  At this
point the staff is working with the utilities -- for example,
on IST with Palo Verde and Comanche Peak -- getting into the
details of how they decided which components are more
important and less important, and what the right
frequency of testing is.
          If, after the evaluation is complete, the staff
agrees with the licensee on some modifications -- agrees
that those changes are appropriate and that the licensee's
assessment is still acceptable -- at that point that licensee
can go to the revised approach that has been reviewed and
approved by the staff.
          Our expectation is as follows.  After these two
pilots are done, once the staff says it is okay, they can go
forward.  However, our intention is to go back and revise
the regulation to allow risk-informed thinking to be built
in as part of the code.  The code committees are also
working on this issue so that they can modify the code
itself, and in the future a reference to that code will meet
the appropriate requirements.
          CHAIRMAN JACKSON:  I guess an issue has to do with
the fact that, in the meantime, to implement the
alternative testing requirements the licensee essentially
needs to be granted relief from current requirements.
          MR. THADANI:  Not at this stage.  After the staff
is done with its review.
          CHAIRMAN JACKSON:  I am saying the staff is going
to be doing a review.  
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  That would essentially grant
relief from current testing requirements.
          MR. THADANI:  Correct.
          CHAIRMAN JACKSON:  You talked about evaluations,
but the question is, what is going to be the basis for doing
those evaluations for granting the relief?
          MR. THADANI:  The basis would be essentially
negligible impact on risk.  That is the thrust of this
approach.  If you go back and do the analyses with quarterly
testing for pump X, it may not even appear in most of what I
would call important accident scenarios.  For that
particular pump, changing the frequency from three months to
six months or a year, I don't think one would even see it in
the evaluation.
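          [Illustrative sketch of the negligible-risk-impact
argument, with assumed numbers:  for a standby pump with
standby failure rate lambda and test interval T, the average
unavailability is roughly lambda*T/2, and the change in core
damage frequency from lengthening T scales with the pump's
Birnbaum importance.  The rate and importance values below
are assumptions, not figures from any plant study.]

    # Illustrative only: hypothetical standby failure rate and importance.
    lam = 1.0e-6           # standby failure rate, per hour (assumed)
    T_quarterly = 2190.0   # hours in a quarter
    T_semiannual = 4380.0  # hours in six months

    q_old = lam * T_quarterly / 2.0    # average unavailability, quarterly testing
    q_new = lam * T_semiannual / 2.0   # average unavailability, semiannual testing

    birnbaum = 1.0e-6      # assumed change in CDF per unit change in
                           # the pump's unavailability, per year
    delta_cdf = birnbaum * (q_new - q_old)

    print(f"unavailability: {q_old:.2e} -> {q_new:.2e}")
    print(f"delta CDF ~ {delta_cdf:.1e} per year")  # ~1e-9/yr for this
                                                    # low-importance pump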
          CHAIRMAN JACKSON:  Maybe it is a message as much
as a question.  Even though the staff has the capability
under 10 CFR 50.55a, I think it would be helpful for the
Commission to understand what the methodology is and what
bases you are using for making these judgments in the absence
of the input from the pilots and in the absence of the
development of the regulatory guidance, et cetera.
          MR. THADANI:  We would certainly come back.  
          One of the issues that I don't think we have total
agreement on yet, and that we are also looking at, is whether
the tests that are done are in some cases giving us all the
information that one would like to have, or whether the test
itself should be revised.
Not just the frequency issue, but are some of the testing
procedures appropriate in catching the dominant contributors
to failure of that pump?  That issue is still under
discussion.  The ASME code people are looking at that also.
          CHAIRMAN JACKSON:  Let me ask you one last
question about the pilots.  Will the pilot studies tell us
anything about the required scope and level of detail of
modeling in a PRA?
          MR. THADANI:  I think so.
          CHAIRMAN JACKSON:  Are we approaching it with
goals in mind that would allow us to get at this issue?
          MR. HOLAHAN:  Yes, absolutely.  The reason that we
are including pilot activities as part of the plans for the
SRP and the reg guide is that it is very difficult to write
review standards or general guidance in an abstract way.  It
is much better to have an actual example or numerous
examples while you are trying to establish what kind of
scope is important and what kind of quality features you are
looking for.  I think it is very helpful to have those pilot
applications in front of the staff.  I think they form an
integral part of developing guidance.
          CHAIRMAN JACKSON:  The last question for the
moment.  You mentioned your review of licensee submittals of
IPEs.  Can you say at this point whether the implementation
of your proposed risk-informed and performance-based
approaches will require licensees to upgrade their IPEs to
full-scope, Level 3 PRAs?
          MR. HOLAHAN:  I don't think I can give you a
clear-cut yes or no answer.  I think it relates to what
application is in mind.  
          CHAIRMAN JACKSON:  The real answer is you have to
work your way further through this implementation plan
before you can give us an answer.
          MR. HOLAHAN:  I will give you a guess.
          CHAIRMAN JACKSON:  Okay.
          MR. HOLAHAN:  Mr. Thadani mentioned screening type
applications, risk ranking and detailed applications.  From
what I have seen of most of the IPEs, I think they are
suitable for screening type applications.  My guess is that
many of them are good enough for risk ranking but that some
would require additional improvements.  At this stage I
wouldn't be confident in saying that many of them are good
enough for detailed applications.  I would say maybe only a
few.
          MR. THADANI:  Let me add to that.  In terms of
risk ranking, as we have indicated before, we are looking at
clarification and guidance.  Because of some of the
variability in the studies, it may be even more important to
make sure that the importance measures that are used in
getting a better understanding are carefully considered, and
the criteria that one uses in applying those importance
measures become important.
          I think we are trying to get a clearer
understanding of what the proper criterion is.  Is it a five
percent impact?  A ten percent impact?  I think what is clear
is that we need to look at these various measures and look at
the hardware that shows up in the appropriate category by
using different approaches, to get a better understanding of
what these criteria are actually doing, that is, which
components end up in the high and low importance categories.
There I think we need to do a bit more than we have done in
the past.
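          [Illustrative sketch of the kind of comparison
described above, with hypothetical values:  rank the same
components with two commonly used importance measures --
Fussell-Vesely, the fraction of core damage frequency
involving the component's failure, and risk achievement
worth, the factor by which core damage frequency rises if the
component is assumed failed -- and see how the high/low
binning shifts as the criterion moves from 5 percent to 10
percent.]

    # Hypothetical component data; FV and RAW values are illustrative.
    components = {
        "pump_A":   {"FV": 0.12, "RAW": 4.0},
        "pump_B":   {"FV": 0.07, "RAW": 1.8},
        "valve_C":  {"FV": 0.01, "RAW": 1.1},
        "diesel_D": {"FV": 0.20, "RAW": 9.0},
    }

    def high_significance(fv_threshold, raw_threshold=2.0):
        """Components binned as high safety significance under given thresholds."""
        return sorted(name for name, m in components.items()
                      if m["FV"] >= fv_threshold or m["RAW"] >= raw_threshold)

    # Moving the FV criterion from 5 percent to 10 percent drops pump_B out
    # of the high-significance bin -- exactly the sensitivity worth examining.
    print(high_significance(0.05))  # ['diesel_D', 'pump_A', 'pump_B']
    print(high_significance(0.10))  # ['diesel_D', 'pump_A']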
          CHAIRMAN JACKSON:  There are going to be these
pilot applications.  It strikes me that, in looking at
what you hope to get out of them, which we have talked about
before, you need to think about all of these areas where
you know you have questions, to see to what extent you
can get what you need.
          I am going to yield to Commissioner Rogers.
          COMMISSIONER ROGERS:  I am concerned about the
same thing.  We are talking now about applications of PRA
that I think go way beyond what we had originally thought of
a few years ago, when using them for screening was clearly a
very valuable thing to do and the licensee would gain
valuable insights by doing the PRA.  Now we are beginning
to think, well, now we can use this very powerful tool for
some other purposes.  They are very interesting purposes,
and I think it is important to look at them.
          I guess my concern follows sort of along the lines
of the Chairman's, and that is, what are the bases we use to
judge that a PRA is a good PRA?  It is all very well to say
that the risk is less or that the risk analysis shows
something, but what is the basis on which we decide that the
risk analysis itself was well done and sound?
          It relates to questions about peer review of the
PRA process itself, to what extent are we availing ourselves
of peer reviews of PRA processes in what we are doing and
what licensees are doing, and is there some codification
possible of analyzing whether a PRA has been acceptably
performed?
          It is easy for us to look at the input data to
know what the reliability database is that has been
referenced in putting numbers into the PRA, but I am
thinking about the general structure, the fault tree and
event tree structures used in performing these, and whether
there is some basis for deciding, yes, this is a really sound
job.  Or do we have to treat each one of these on an ad hoc
basis?  Is there some way that one can codify test criteria
for looking at a PRA that more or less meet the standards of
the scientific and technological community?
          CHAIRMAN JACKSON:  And will this be in the reg
guide you are developing?
          MR. HOLAHAN:  It is more than in the reg guide; it
is the reg guide.
          MR. TAYLOR:  One of the reasons we finished
NUREG-1150 was that it was, presumably, the standard.  Of
course a great deal of time was spent -- not by the agency
itself, but by those who assisted us in preparing such a
study.
          COMMISSIONER ROGERS:  Yes, but it was just for
those plants.
          MR. TAYLOR:  Right, but it set the standard.
Having a mix of both BWRs and PWRs was an attempt to set a
base standard.
          Is that not correct, Dave?
          MR. MORRISON:  That's correct.
          You have raised a good point.  What we need to do
is build on the experience and the insights that we have
gained from the IPE process, where we have a large number of
these studies, and recognize that perhaps the standard we set
then was not sufficient for people to be able to do what we
would now require as an acceptable PRA.
          MR. TAYLOR:  We are pushing even beyond the
screening.
          COMMISSIONER ROGERS:  I wondered whether even the
1150 analyses would be good enough to make some of the
decisions that we are thinking about making using PRAs now. 
They really sort of led us to see how the safety goals were
being met or not met, but now we are talking about very
detailed applications.
          MR. THADANI:  I don't think in my lifetime we will
know how to do a so-called perfect probabilistic risk
assessment where I can really believe everything that comes
out of that evaluation, because we will continue to have
questions about cognitive errors, errors of commission, and
things like that; they will always be around, and there will
always be some questions.
          I think the policy statement lays out clearly the
recognition that there are some places where one can apply
these techniques, but you can't just depend on these
techniques alone.  We have this infrastructure.  We have the
knowledge and evaluation studies that have been done up to
now through our deterministic process.  One can't just
replace that.  Rather, the value of these techniques would
be in selected areas to see if we can do better:  Have we
gone too far?  Has there been an area of perhaps what I
would call overregulation or underregulation?  
          COMMISSIONER ROGERS:  Perhaps you don't have to
take into account all of the human factors analyses for
certain types of decisions.
          Let's move on a little bit.  This question about
safety goals.  That was a pioneering effort when a
Commission put those in place.  On the other hand, they
really do relate in their initial form, if you want to use
PRA, to a level 3 PRA.  That means you have to know
something about the location of the plant, the population
distribution.  You have to do that.  Level 3 has been a big
challenge for a number of licensees.  I guess not very many
have actually gone to level 3.  There have been a few, but
mostly it has been level 2 where they terminate.  
          When we start talking about safety goals for
making some kind of regulatory decision that involves PRAs,
are you going to have to move to subsidiary goals in order
to do something meaningful here?
          MR. THADANI:  Exactly.  I think that is it. 
Uncertainties just get worse as you go on all the way out to
consequence calculations and health effects.  
          As the Commission has approved, the regulatory
analysis group put together a document on subsidiary
objectives:  core damage frequency and containment
performance in terms of early or late containment failure.
Those are criteria that we use in our generic approaches to
safety issues, rulemaking activities, and so on.  If we
follow the path we are on, those will be the criteria we
will propose.
          COMMISSIONER ROGERS:  Have we totally wrapped up
what the acceptable subsidiary goals are?
          MR. THADANI:  The Commission has approved the
use of those criteria in any new rulemaking activities and
generic activities.  We have applied those in some of the
recent regulations.  
          COMMISSIONER ROGERS:  As we proceed to deal with
the use of safety goals in regulatory decision-making, I
think we ought to be brought up to date on what the status
is of the surrogates for the level 3 statements of safety
goals.
          MR. THADANI:  We did send up a Commission paper
indicating how difficult it was to define a large early
release.
          COMMISSIONER ROGERS:  I remember that.  That's why
I'm not sure how wrapped up this is.
          MR. THADANI:  So we went to containment
performance instead, timing of containment failure as a
reflection of significant releases.
          COMMISSIONER ROGERS:  I want to say one more thing
and then we will give Commissioner Dicus a chance.
          You mentioned these expert panels that are being
used in the application of the maintenance rule.  I take it
those are licensee panels.
          MR. THADANI:  Yes.
          COMMISSIONER ROGERS:  Have you thought at all
about trying to use what we have come to so far in studying
the use of expert judgment in the high-level waste area as
some useful guidance to provide to these expert panels for
use in the maintenance rule?  It is two different sides of
the house now.  Can you take something from one and usefully
provide it to the other?
          MR. HOLAHAN:  I think we have not, but I think it
is an interesting thought we can follow up on.
          COMMISSIONER ROGERS:  Thank you.
          CHAIRMAN JACKSON:  Commissioner Dicus.
          COMMISSIONER DICUS:  I am making it unanimous that
we all have some general concerns here.  I think it is clear
that all three of us would like to hear a little bit more
back from you, not necessarily today, but sometime in the
near future, on some of these issues, particularly these
policy concerns that you have raised.  
          I would add one thing, and that is perhaps some
sequencing in how these are resolved.  It may be necessary
to have some resolution of the safety goals and of the
risk-neutral versus increases-in-risk policy issue, and to
come to some conclusions there, even before you can come to
some resolution of these applications that are coming in.  That
would be the only thing I would add.
          CHAIRMAN JACKSON:  Let me ask you two quick
follow-up questions.  I note that several guidance documents
are being prepared by industry and reviewed by the staff,
including the NEI PSA applications guide.  Can you clarify
again what the relationship is between these documents and
the staff's review of them and the guidance documents being
prepared by the staff?
          MR. HOLAHAN:  Yes.  The PSA applications guide
developed by EPRI for NEI, I consider that to be of the same
scope as what we have called the general SRP and regulatory
guide.  It is that type of document.  As part of our
development of the regulatory guide and SRP that team is
reviewing that guidance document.  If we find it to be a
complete, thorough document, then we would propose to
reference it as part of the regulatory guide.
          In our review of the last draft of that guide we
raised a number of issues.  I think it was 12 or 15.  Some
of those issues have been dealt with by NEI in the revision
to the guide, and I think some of them have not.  
          In its current form, I think there are a number of
open issues for which the staff wouldn't be satisfied with
it as a reference document.  Whether it is referenced at all
or whether the staff develops independent thoughts on the
same scope remains to be seen.  It is part of the review
process. 
          For example, in the maintenance rule area, the
regulatory guide does reference the NEI 93-01 document.
There is basically an acceptance of that as an approach.
          CHAIRMAN JACKSON:  Are you saying that you are
reviewing them with respect to their potential suitability
for the staff to endorse them?
          MR. HOLAHAN:  Yes, to reference for endorsement.
          CHAIRMAN JACKSON:  In lieu of development of our
own reg guides?
          MR. HOLAHAN:  In a practical sense, it's not in
place of it.  There will be a regulatory guide.  I would say
it is likely that if there is an endorsement it will be a
partial endorsement with remaining issues to be dealt with.
          CHAIRMAN JACKSON:  In each case we would have our
own reg guide and we would either incorporate in that a
reference and/or an endorsement as appropriate.
          MR. HOLAHAN:  Yes.
          MR. THADANI:  At this stage there are some issues
with the industry guide.  We do have some concerns and those
concerns have been identified.  When we go forward, even on
these pilots, we will try to utilize our best views on the
issues, and we will also try to see, if one were to apply the
PSA guide approach the industry put together, how different
the answers might be, to get a little better understanding of
what these differences might mean.
          MR. HOLAHAN:  There is some value to endorsing an
industry guide if you are comfortable with the quality of
it.  It has had a lot of industry input; the utilities are
more comfortable with a guide that they have tried out and
that they were involved in the development of; and it
probably is an easier and smoother implementation.  Whether
we can do that or not depends on whether we feel the issues
are adequately addressed in those documents.
          CHAIRMAN JACKSON:  I note that you state that
numerical criteria are espoused by the NEI PSA applications
guide and that some of these criteria will be tested in the
ongoing industry-initiated pilot applications.  Can you give
us an example?
          MR. THADANI:  Examples of the criteria?
          CHAIRMAN JACKSON:  Right, some criteria and how
they would be tested.
          MR. THADANI:  If you go through some results from
probabilistic safety studies, the NEI guidance document would
say that for a given change -- let's say there is a permanent
change that is to be made to the plant -- they would propose
that a delta core damage frequency of some magnitude be
acceptable.  In addition to that, they would propose that
certain importance measures be looked at.
          The value of importance measures is that they help
you a little bit in terms of the uncertainties that might
exist in these studies.  They have proposed some specific
criteria for these importance measures to be used.  We don't
necessarily agree that those are the right values to be used.
What we would try to do is use these criteria and some other
criteria to see how the results change, take a look at the
output, and then use our best judgment:  Does this seem a
better breakdown, so to speak, of what is more important and
what is less important?  The devil is in the details.
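          [Illustrative sketch, with assumed numbers, of how
such criteria might be exercised against a pilot result:
compute the change in core damage frequency for a proposed
permanent plant change and compare it, along with an
importance measure for the affected component, against
assumed acceptance values.  Neither the acceptance values nor
the plant figures below come from the NEI guide or from any
staff position.]

    # Illustrative only: hypothetical CDF estimates and assumed criteria.
    cdf_base = 4.0e-5      # core damage frequency before the change, per year
    cdf_changed = 4.3e-5   # core damage frequency after the change, per year
    fv_affected = 0.03     # Fussell-Vesely importance of the affected component

    DELTA_CDF_LIMIT = 1.0e-5   # assumed acceptance value, per year
    FV_SCREEN = 0.05           # assumed screening value

    delta_cdf = cdf_changed - cdf_base
    acceptable = delta_cdf <= DELTA_CDF_LIMIT and fv_affected <= FV_SCREEN

    print(f"delta CDF = {delta_cdf:.1e}/yr; acceptable under assumed"
          f" criteria: {acceptable}")
    # The point above: rerun with different criteria and different importance
    # measures and see whether the accept/reject answer changes.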
          CHAIRMAN JACKSON:  You are building this into your
review of the pilot applications and what you are going to
be looking for?
          MR. THADANI:  Yes.
          MR. HOLAHAN:  Yes.
          MR. THADANI:  We have already indicated that there
are some issues we are worried about in this guide.  We have
told NEI that.
          CHAIRMAN JACKSON:  You have your list of what your
information needs are that you feel you need to get out of
these pilots?
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  Have you given any thought to
how uncertainty will be dealt with in the performance-based
side of the equation?
          MR. THADANI:  I don't have a clear answer to that. 
Uncertainty is one area that I am a little uncomfortable
with.  If you look at many of the studies done, they don't
necessarily do a very good job of addressing uncertainties. 
In my view, when we get to performance-based approaches, as
you have yourself said on many an occasion, our requirements
should be clear and consistent.  
          I am not sure that one should have numerical
criteria in terms of performance.  I think we have got to
step back to something else that will tell us that if one
reaches a given threshold, it is a sign of a problem, and
then deal with that.  That is a non-numerical approach at
that point.  I think numerical approaches would be difficult.
          Commissioner Rogers was here when we had this
issue of how you know what the underlying reliability of
diesel generators is.  If you have a rigorous statistical
approach to that, then you almost cannot tolerate any
failure at all, even though the underlying reliability of the
diesel may in fact be what one wants.
          So you get into this very tough scenario.  Diesel
unreliability is on the order of 5 percent.  The
unreliability of other components is much lower.  So the
magnitude of this issue will just grow.  Those kinds of
difficulties are not identified in the implementation plan
today.  I think we need to address that issue as part of this
activity.
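          [Illustrative arithmetic for the diesel generator
point, using the 5 percent unreliability figure above and an
assumed number of test demands:  even a diesel whose true
failure probability per demand exactly meets a 5 percent
target will, more often than not, show at least one failure
in a modest test sample, so a zero-failure acceptance
criterion becomes nearly impossible to satisfy.]

    # Illustrative only: per-demand failure probability from the 5 percent
    # figure above; the number of test demands is an assumption.
    p_fail = 0.05
    demands = 20

    p_zero_failures = (1.0 - p_fail) ** demands
    p_at_least_one = 1.0 - p_zero_failures

    print(f"P(no failures in {demands} demands) = {p_zero_failures:.2f}")  # ~0.36
    print(f"P(at least one failure)             = {p_at_least_one:.2f}")   # ~0.64
    # A diesel that exactly meets the target would still be flagged by a
    # zero-failure criterion about two times out of three.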
          CHAIRMAN JACKSON:  Let me thank you very much for
a comprehensive briefing on the PRA implementation plan.  I
do want to commend you for the progress you have made to
date in this sometimes difficult area.  
          [Slide.]
          CHAIRMAN JACKSON:  You can put the credits up as I
speak.
          MR. HOLAHAN:  In addition to just the credits, we
have the actual people here.
          CHAIRMAN JACKSON:  Maybe those credits ought to
stand.  Why don't the team members stand up so we can see
who you are.
          Very good.  Now that I see you, I can encourage
you to continue to improve the PRA process and to provide
appropriate review mechanisms to ensure that PRA is used
appropriately and consistently throughout the agency.  I
know it is widely used throughout the respective offices and
so it has already become an important regulatory tool.  
          In striving to enhance the process and to ensure
its consistent use, let me reiterate four points that I
think have come out of our meeting today.
          With respect to the issue of referencing safety
goals and decision criteria, I think the point that was made
about the use of subsidiary goals -- laying those out and
clarifying where they are appropriately used -- is important.
          Second, you yourself raised the issue of
performance-based approaches and where performance measures
vice programmatic approaches are important.  It seems to me
that is an issue that you have to clarify, where systems or
applications can be appropriately binned one way or another
as opposed to necessarily trying to force everything into
one pot.
          Third, you have the IPE reviews that you are
completing and you have the industry-initiated pilots.  It
seems to me you have to put the two of them together to very
carefully consider what your lessons are, and, either
looking back or prospectively, how they will be used in
developing the reg guides as well as the standard review
plans.
          Finally, as I think came out of the discussion on
alternative approaches for reviewing ISI and IST changes,
the message is that the staff should provide the Commission
with the pros and cons of potential staff approaches and
recommendations on all of the emerging policy issues prior
to the staff taking a position.
          With that, I will ask if my fellow commissioners
have any further comments.
          COMMISSIONER ROGERS:  No, thank you.
          COMMISSIONER DICUS:  No.
          CHAIRMAN JACKSON:  We stand adjourned.
          [Whereupon at 11:50 a.m. the meeting was
adjourned.]