                  UNITED STATES OF AMERICA
                NUCLEAR REGULATORY COMMISSION
                             ***
             BRIEFING ON PRA IMPLEMENTATION PLAN
                             ***
                       PUBLIC MEETING
                             ***
           
                              Nuclear Regulatory Commission
                              11555 Rockville Pike
                              Rockville, Maryland
           
                              Wednesday, October 16, 1996
           
          The Commission met in open session, pursuant to
notice, at 2:07 p.m., the Honorable SHIRLEY A. JACKSON,
Chairman of the Commission, presiding.
           
COMMISSIONERS PRESENT:
          SHIRLEY A. JACKSON, Chairman of the Commission
          KENNETH C. ROGERS, Member of the Commission
          GRETA J. DICUS, Member of the Commission
          NILS J. DIAZ, Member of the Commission
          EDWARD McGAFFIGAN, JR., Member of the Commission
           
STAFF AND PRESENTERS SEATED AT THE COMMISSION TABLE:
          JOHN HOYLE, SECRETARY
          KAREN CYR, GENERAL COUNSEL
          JAMES TAYLOR, EDO
          EDWARD JORDAN, Director, AEOD
          CARL PAPERIELLO, Director, NMSS
          NORMAN EISENBERG, Senior Advisor, Performance
            Assessment, NMSS
          ASHOK THADANI, Associate Director for Inspection
            and Technical Assessment, NRR
          GARY HOLAHAN, Director, Division of Systems Safety
            and Analysis, NRR
          THOMAS KING, Deputy Director, Division of Systems
            Technology, RES
          JOSEPH MURPHY, Special Assistant, RES
 
                    P R O C E E D I N G S
                                                 [2:07 p.m.]
          CHAIRMAN JACKSON:  Good afternoon.  I am pleased
to welcome members of the Staff to brief the Commission on
the status of the PRA Implementation Plan.
          The PRA Implementation Plan was first issued in
August 1994, and the Staff provides quarterly written
updates and briefs the Commission semiannually.
          Previous written updates on the status of
activities in the PRA Implementation Plan were provided to
the Commission in March and June of this year.  The
Commission was last briefed on the plan in April of this
year.  The plan is intended to be a management tool that
will help ensure the timely and integrated agency-wide use
of PRA methods and technology in the Agency's regulatory
activities.
          During today's briefing, the Staff will cover its
recent accomplishments, policy issue recommendations, key
technical and process issues, and its plan for future
activities.  I am particularly interested in hearing about
progress on the PRA regulatory guides and standard review
plans, as well as how these activities are being informed by
pilot applications.  I am also interested in cross-office
integration, and my fellow Commissioners and I are looking
forward to your briefing today.
          I understand that there are copies of the
viewgraphs available at the entrances to the room.
          If no one has any additional comments, Mr. Taylor,
please proceed.
          MR. TAYLOR:  Good afternoon.  With me at the table
from several offices are Norm Eisenberg, Carl Paperiello from
NMSS, Ed Jordan of AEOD, Ashok Thadani and Gary Holahan from
NRR, Tom King, and Joe Murphy from the Office of Research. 
I think this represents a good cross-section of people who
are working on this particular area.
          I would like to preface the presentation, and I
think Ashok will bring this up again, that in order for
licensees to use PRA in regulatory applications, the design
basis and configuration management issues at their plants
must be resolved.  In other words, the plant design bases
must be clearly known and maintained.  The plant must have
been constructed in accordance with the design basis, and
the plant must be configured and operated in accordance with
our NRC requirements and license commitments.
          With those opening thoughts, I will ask Ashok
Thadani to continue.
          MR. THADANI:  Thank you, Jim.
          May I have viewgraph No. 1, please?
          Good afternoon.  We thought it was probably useful
to go through fairly quickly some of the background
information, and I will do that, hopefully, quickly.
          Then Tom King from the Office of Research will
pick up on the recent accomplishments, as well as discussing
some of the key technical and process issues and the types
of questions that we need to make sure we can address.
          Part of these key technical issues are also some
policy matters, and then Gary is going to go over the four
key policy issues, as well as the next activities that we
are going to embark on in the next few months.
          May I have the next viewgraph, please?
          I know most of you know all of this information,
but again, the final policy statement was published over a
year ago, and following the policy statement, it was clear
that we needed to lay out a more detailed set of tasks and
schedules, with a special focus, as you indicated, Chairman
Jackson, on accelerating development of regulatory guides
and standard review plans as part of the activities.
          The Staff has been providing quarterly progress
reports to the Commission and a semiannual briefing on the
progress that we have made as we go forward.
          In the March '96 status report, we identified four
policy issues, and in the SRM that came out in May of 1996,
the Commission asked the Staff to provide its
recommendations to the Commission on each of the policy
issues.
          Last week, we had the last update report, and that
does get into the issue of where we stand on a number of
activities in the Implementation Plan, and there is a
special discussion of each of the policy issues.
          In addition to that, there is an attachment in the
last update which does identify the types of technical and
process issues that we need to make sure to address.
          May I have the next viewgraph, please?
          We keep having to remind ourselves that the policy
statement has certain constraints and boundary conditions
that we have to keep in mind as we go forward, and it is
hard to capture everything on one chart, but I think this
chart does capture some of the important aspects of what is
in the policy statement. 
          The desire, clearly, is to use probabilistic
safety assessments or risk assessments, use them in all of
our regulatory activities, and the key there was all, but
then there are qualifiers, obviously, with those
applications.
          Certainly, as long as the methods are appropriate,
the database is there to support decisions in those areas,
and two important elements are that the decisions were not
to be based on risk analysis alone; that they had to
complement the traditional deterministic considerations as
well.  The idea, then, was to integrate the deterministic
and probabilistic considerations before making any final
decisions on regulatory matters.
          An issue that had to be dealt with was the issue
of uncertainties -- not just the uncertainties that one can
quantify and develop distributions for, but also issues
that are, as a matter of fact, very difficult to quantify.
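As an aside on the quantifiable kind of uncertainty: in a PRA, parameter uncertainties are commonly propagated by Monte Carlo sampling, drawing each input from its distribution and building up a distribution of the result rather than a single number. The sketch below is purely illustrative -- the two-train model and the lognormal parameters are hypothetical assumptions, not anything presented at this briefing.

```python
import random

random.seed(1)  # reproducible illustration

def sample_cdf(n_samples=10000):
    """Propagate parameter uncertainty through a toy core damage model.

    Hypothetical model: core damage requires an initiating event and
    the failure of two redundant mitigating trains.  Each parameter is
    drawn from an assumed lognormal distribution instead of being fixed
    at a point estimate, so the result is a distribution of core damage
    frequency (CDF) rather than a single value.
    """
    results = []
    for _ in range(n_samples):
        init_freq = random.lognormvariate(-7.0, 0.8)  # initiating events per year
        p_train = random.lognormvariate(-4.0, 1.0)    # failure probability per train
        results.append(init_freq * p_train * p_train)
    return sorted(results)

samples = sample_cdf()
mean_cdf = sum(samples) / len(samples)
median_cdf = samples[len(samples) // 2]
p95_cdf = samples[int(0.95 * len(samples))]
print(f"mean CDF:   {mean_cdf:.2e} per year")
print(f"median CDF: {median_cdf:.2e} per year")
print(f"95th pctl:  {p95_cdf:.2e} per year")
```

The heavy right tail (mean well above median) is the kind of uncertainty a distribution can capture; the uncertainties that resist quantification have to be handled by other means, including the defense in depth discussed next.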
          So specific focus had to be given to maintaining
defense in depth; that is, preserving the barriers that are
there, multiple barriers.
          The next element was very important: to make sure
that when the policy statement went out, the industry did
not misunderstand the statement itself.  Licensees had to
meet all current rules and regulations, even rules and
regulations which may have low safety significance.  The
idea there was that if there are insights from risk
assessments pointing out that some requirements may not be
properly in tune in terms of risk significance, the process
would be to first change the requirements, not to presume
that they didn't have to meet the requirements.
          The second part, and this is the one that Mr.
Taylor mentioned, is that the risk assessments are not very
useful if they don't really represent the plant itself.  If
the documents don't reflect the plant design and the risk
assessment is based on those documents, then, clearly, the
risk assessment doesn't really represent the plant.  To what
degree depends on the differences between the actual plant
configuration design procedures versus what is in the
documents.  So that is an important element that needs to be
recognized, and it is clearly a lesson that we have learned
from some of the Millstone activities.
          The third bullet refers to guidance the
Commission gave us in June of 1990 through an SRM,
indicating that we ought not to be using the Commission's
safety goals and the subsidiary objectives, which relate to
core damage frequency and containment performance, on a
plant-specific basis, but that they should be used in
generic matters for things like future rulemaking
activities and so on.
          One of the policy issues that you will hear about
later on is, in fact, should we use these objectives on a
plant-specific basis, but we will come back to that issue
when Gary gets into those policy issues.
          The next viewgraph, please.
          This really highlights that PRA can play a pretty
significant role in regulatory activities, and this chart
is really representing the reactor program in a very broad
scope manner.
          As our resources go down, there are budgetary
constraints.  With time, it becomes even more and more
important to focus our activities in areas that are more
important to safety.  So the idea here is to show that the
scope can be pretty broad in terms of where these techniques
can be applied.
          The reason for that is if one were to use these
techniques in conjunction with our deterministic
assessments, the end results are going to be much better
decisions, much more effective safety decisions.  There
would be obviously much more effective use of resources,
both in terms of the Agency resources, but also in terms of
the industry resources.  So, again, this chart supports the
major thrust of the policy statement that we should, in
fact, go forward and apply these approaches in our
regulatory activities.
          CHAIRMAN JACKSON:  Before you go ahead, can you
give us some sense of the status of licensees'
implementation of accident management strategies?
          MR. THADANI:  Yes.  I will give you just a general
sense.
          Many of the licensees, as you know -- most of
them -- have completed individual plant examinations.  As a
result of the IPEs, they had identified a number of
procedural enhancements that could be made, and by and
large, the ones that they identified, they have gone
forward with.  But the broad scope accident management
program that we have been working on with the industry for
some time really goes into not only prevention of
accidents, but also, following core damage events, what are
reasonable things to do, all the way into communication
with different groups.  That is broadly included under what
we call the accident management program.
          The industry, through the owners groups, has done
essentially all of the technical work.  There are minor
issues that need to be dealt with on the BWR plants, but by
and large, much of the technical work is completed, and the
utilities are now converting that information, which is
generic -- a fair amount of good technical assessment --
into their plant-specific emergency operating procedures or
guidance for technical support groups, which would be
called upon to provide guidance in case of an accident.
          The schedule currently calls for all the licensees
to have implemented accident management by December of 1998. 
Some of the licensees will have implemented accident
management early in '98, and very few late in '97.  Most of
them will do so in 1998, and the last ones, by the end of
1998, will have implemented accident management. 
          Could I have the next viewgraph, please?
          As I have said, the policy statement is to
incorporate PRA in all regulatory activities, which meant
that it was very important to capture these activities in
detail, and that is where the Implementation Plan comes in.
          In the plan, which is very comprehensive and broad
scope, there are a number of tasks.  I forget the exact
number, but certainly over a hundred distinct activities are
involved.
          Of those tasks, there are some where we have not
met the schedule or think we won't be able to meet the
schedule, so there will be some delays, but the highest
priority we have given is to regulatory guides and the
standard review plans.  There, the schedule was to get
drafts completed by the end of this year, and that is, in
fact, the schedule we're still on.
          You will hear about where we are in terms of the
pilot applications.  There have been some delays in the
projected completion dates for the pilots.
          One reason for delays is resources, but I think
that is a smaller reason.  The larger reason has been trying
to do a fairly thorough job, which means a fair amount of
information that is needed from individual pilot
participants, and in some cases, it has taken longer to get
answers to some questions, but nevertheless, the key point
is that we are getting sufficient information from these
pilots, so that we can, in fact, go ahead, get the reg guide
and the standard review plan out for public comment,
finish up the pilots perhaps even during that comment
period.
          The scope of the Implementation Plan goes well
beyond what NRR does, of course.  It includes a number of
activities that AEOD is involved in, NMSS, and of course,
Office of Research has been working with NRR on some of
these activities I have already described.
          Now, unless you have other questions, I was going
to go to Tom King, so he can get into the real substance of
the issues.
          MR. KING:  Thank you, Ashok.
          What I wanted to cover was to briefly summarize
the recent accomplishments since the last status report in
June, and then to discuss the review process and the key
technical and process issues that have come out of
developing the reg guides and SRPs to date.
          If I could have Slide 6, please.
          Slide 6 summarizes the recent accomplishments. 
The first bullet talks about the draft reg guides and SRPs,
but I think what I will do is when we get to Slide 7, we
will talk about that in more detail.
          We are continuing to review the industry-initiated
pilot applications, as Ashok mentioned, the pilot
applications in four areas, ISI, IST, QA, and tech specs. 
It involves seven or eight plants that are participating in
the pilot process.  We expect to complete those reviews and
send to the Commission a recommended decision over the next
two to eight months, starting in December with the tech
specs and then through June of next year with ISI and IST.
          CHAIRMAN JACKSON:  To what extent have those
industry-initiated pilots informed the development of the
guidance documents that you are talking about?
          MR. KING:  They have provided input.  We have gone
through and taken our list of issues that we have developed
in drafting the reg guide and SRP and looked at the pilots
as to how they were addressing those to get some feedback,
and we have actually gotten some feedback that has been
incorporated.
          CHAIRMAN JACKSON:  And it is somewhat of a lag,
also, you are saying, relative to when the final outputs of
the pilots will be available.  Is that a fair statement?
          MR. KING:  I am not sure.
          CHAIRMAN JACKSON:  Well, what I am saying is, do
you feel you have gotten all out of the industry-initiated
pilots that you can relative to how it propagates into the
development of the guidance documents?
          MR. KING:  I suspect we will probably.  As they
continue to respond to requests for additional information,
we will continue to learn some more.
          CHAIRMAN JACKSON:  Okay.
          MR. HOLAHAN:  I can remind you that I believe at
the last Commission meeting, we presented a matrix
identifying 10 or a dozen issues and which ones we learned
stuff from on the various pilots and a couple of areas where
we needed to do more work.
          CHAIRMAN JACKSON:  And those ones where you have
identified that you have learned some things from the pilot,
is that what you mean when you say these things have or --
          MR. HOLAHAN:  Yes, yes.
          CHAIRMAN JACKSON:  -- what you have learned has
been incorporated in these guidance documents? 
          MR. HOLAHAN:  Yes.
          MR. KING:  Yes.
          CHAIRMAN JACKSON:  Okay.
          MR. KING:  The backup viewgraphs in the package
you have contain some more details on the pilots in terms of the
plants and the schedules and so forth.  I wasn't going to
cover those specifically.
          The third bullet talks about the IPE and IPEEE. 
We are continuing to review in both of those areas.  We
currently have 19 IPE reviews to go until we are complete. 
We expect 16 of those 19 to be done by December.  Three will
probably carry over until next year, probably spring or so. 
Those are three where we have had problems with the IPE, and
we have requested that parts of it basically be redone.  We
are waiting for a resubmittal.
          CHAIRMAN JACKSON:  Let me ask you a question. 
These IPEs are essentially PRAs; is that correct?
          MR. KING:  Yes.
          CHAIRMAN JACKSON:  Will you be coming out of that
review with some assessment of how strongly coupled they are
to the design basis or how well known the design basis is
for those plants relative to what these IPEs, in fact, are
showing?
          MR. KING:  Not through the IPE program.  We are
not doing that.  We are not trying to go back and confirm
the design basis through the IPE program.  We would expect
licensees, in doing their IPEs, to actually reflect the
as-built and as-operated plant.  We have not checked that.
          CHAIRMAN JACKSON:  Mr. Thadani, you look like you
want to say something.
          MR. THADANI:  No.  I think that is the answer.
          As you know, we have 50.54(f) letters out now,
and depending on what results come out as a follow-up to
those letters, there may be action that we may have to take.
          CHAIRMAN JACKSON:  Okay.  I've got you.
          MR. KING:  The IPEEEs, we have 24 of those under
review.  None have been completed at this point, but we
would expect the staff evaluation reports on those to start
coming out early next year.
          The other thing I want to mention on IPEs is that
we have prepared an insights report.  A copy was sent to the
Commission last week.  We have also been going to the
regions and briefing them on the insights coming out of the
IPEs, both the generic insights and the plant-specific
insights, so they can factor them into their inspection
programs and other interactions with licensees.  So that is
continuing to go on.
          CHAIRMAN JACKSON:  I hate to keep dwelling on the
same thing, but let me ask you this question.  Based on what
you may get out of the 50.54(f) responses, the letter
responses, are you going to do some juxtaposition of any
sample of the IPEs, what comes out of that to have some
sense?  In a sense, these insights are based on acceptance
as is, right?
          MR. THADANI:  That is correct.
          CHAIRMAN JACKSON:  So is there going to be any
kind of a sampling?
          MR. THADANI:  If certain plants are identified
which may, in fact, have differences, then I think we would
go back to those plants --
          CHAIRMAN JACKSON:  And review the IPE?
          MR. THADANI:  -- and ask them --
          CHAIRMAN JACKSON:  To review.
          MR. THADANI:  -- to address those.
          CHAIRMAN JACKSON:  Redo.
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  I got your point.
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  Thank you.
          MR. KING:  The fourth and fifth bullets really
address the proposed Reliability Data Rule.  AEOD conducted
a public workshop in June and received a number of comments. 
They are continuing to look at those and work on resolution.
          In parallel, I understand industry recently
submitted some sample data to demonstrate a proposed
voluntary alternative to the data rule.
          CHAIRMAN JACKSON:  Let me ask if I may, Mr. Jordan
--
          MR. JORDAN:  Yes.
          CHAIRMAN JACKSON:  -- where do things stand with
regard to our review of that sample data?
          MR. JORDAN:  We have a dataset that represents the
data elements from the safety system performance indicator
that INPO uses, and we are applying those data elements into
our reliability data scheme.  We are still in the process of
assessment to identify what elements might be needed in
order to assure that we have train-level system
reliability.
          CHAIRMAN JACKSON:  Okay.
          MR. KING:  And the backup viewgraphs have some
additional information on the Reliability Data Rule.
          CHAIRMAN JACKSON:  Okay.
          MR. KING:  And finally, AEOD has completed
development of a PRA training guidance document,
NUREG/BR-0228, and that was issued in July.  They have also
developed a prototype PRA for Technical Managers course,
which they had a dry run on several months ago and which I
understand will be offered to the Staff in the next several
months.
          If I could have Slide 7.
          CHAIRMAN JACKSON:  I hate to do this to you, but
given that this PRA training guidance document has been
completed, how is it being used?
          MR. JORDAN:  Okay.  I can answer that.  It is the
basis for managers identifying appropriate courses for staff
members.  So it is a road map in order to provide the right
level of qualification for staff members.
          CHAIRMAN JACKSON:  So it identifies some
qualification level and associated training program for a
given function in a job that someone has?
          MR. JORDAN:  Correct.  That is correct.  So it
identifies the various levels of qualification and then the
scheme of courses that, of course, can be looked at with
respect to that individual's experience, education, and
training to pick the right courses.
          MR. KING:  Slide 7, please.
          Slide 7 gives a little more detail on the
regulatory guides and standard review plans that are being
developed to support risk-informed regulation.  The
regulatory guides are really the guidance for licensees in
terms of what their submittal should contain, and then the
standard review plan is guidance for the staff as to how to
review that submittal.
          Early in 1996, we had put together inter-office
teams to draft the reg guides and standard review plans, and
the ones being worked on are listed here.  We had also put
together an inter-office PRA coordination committee to
provide some oversight and direction to that effort. 
Overall, those activities have been working well.
          Currently, there are drafts for all of the reg
guides and SRPs.  The ISI one has slipped three months, as
noted in the SECY paper that came up, primarily because of
a late start on the pilot programs, but the others are
underway.  They are in various stages of review.  We plan
to get them to the ACRS. 
          We have also developed draft NUREG-1602, which is
a key reference document in the general reg guide in terms
of the standards for a PRA, in terms of the level of detail
and quality and so forth.
          CHAIRMAN JACKSON:  So you have laid that out?
          MR. KING:  That has been sent to ACRS for review. 
We have had numerous interactions with industry on both the
pilots and generic topics, as well as ACRS.  We have had a
number of meetings with them.  We have the next one coming
up on October 31st and then another one after that on
November 21st where we will be reviewing the reg guides, the
SRPs, the draft NUREG, and the issues that are coming out of
these things.
          If we could move on to Slide 8.
          Slide 8 shows the review process around which the
reg guides and SRPs are being developed, which we are trying
out on the pilot activities.  It is a six-step review
process that we have defined to try and provide some
consistency and structure to the evaluation and review, and
we would expect licensee submittals and the Staff review
would follow these six steps as much as possible.
          The six steps are shown on Slide 8, and the
feedback loops are shown.  We thought it would be useful to
put it in this presentation as background because, as we get
into the discussion of the technical and policy issues, this
will illustrate the sequence of the logic in the evaluation
and I think will help in understanding where the technical
and policy issues fit in the evaluation process.
          The steps, we believe, are consistent with the
PRA policy statement; they are set up such that risk
assessment complements the deterministic evaluation and
defense in depth.
          There is a step that specifically was put in on
performance monitoring, which is related to one of the
policy issues we are going to talk about, and even though
there is not a feedback loop shown, if you go through this
process, you could end up coming back to step one and
redefining the scope of your proposed change, depending on
what the outcomes of the evaluations were.
          If we could go to Slide 9.
          Pages 9 through 13 contain a list of what we call
the key technical and process issues.  These are things that
we are addressing as part of the reg guide and standard
review plan development, and they were identified as part of
drafting the reg guide, standard review plan and interaction
with the pilot projects.
          I don't plan to discuss all 27 of them, but what I
wanted to do is highlight the ones that are related to the
policy issues that are going to come up later on in the
briefing, as well as any others that are of particular
importance.
          We thought it would be useful to present these in
this briefing because they do provide some key background
regarding the six-step review process, as well as the
background for understanding the policy issues.
          CHAIRMAN JACKSON:  Let me ask you, before you go
through them, can you say to what extent these questions
will be addressed in the guidance documents being developed,
and if not, are they dependent upon the Commission
addressing the policy issues, and if they are not dependent
upon that, how are you working on answers?  So it is a
three-part question.
          MR. KING:  All of them will be addressed in some
fashion in developing the reg guides and standard review
plans.
          CHAIRMAN JACKSON:  Okay.
          MR. KING:  We are proceeding on the ones that are
related to the policy issues.  The path we are proceeding
down is consistent with what we are recommending on the
position on the policy issues.  If the Commission decides
otherwise, we will have to revisit those.
          CHAIRMAN JACKSON:  Okay.
          MR. KING:  Before I get into some of the example
issues, I did want to say a couple of things about how we
are using this list.
          We put it together for several reasons; one, to
help focus attention on the more important items, both Staff
and management attention.  Two, it is a good way to track
progress as to how close we are to getting these things
resolved, and three, it does provide, as I mentioned
earlier, a systematic way to go through and get some
feedback from the pilot plants, ask what the pilot
activities -- find out what they are doing to address
each of these things.  So it is being used in several
different ways.
          Also, I want to mention that some of these issues
have sub-elements.  We didn't list all of the sub-elements
because it would get too complicated.
          Also, some of these issues, the answers may have
-- there may be several options in the way to deal with some
of these issues, and in some cases, we will probably
recommend -- go to ACRS with some options, and we may want
to go for public comment on some options and make a final
decision after we get feedback from the public comment
process.
          So we are not planning at this point to pick just
one option for each one.  Where it makes sense to list
several options, we would plan to do that.
          Our next meeting with ACRS is going to focus on
these issues as part of reviewing the reg guide and standard
review plan.
          Let me start with page 9.  These are laid out in
accordance with the six steps.  Roman Numeral I is step one
on the flow diagram, and I-(a), what information does the
licensee need to submit to characterize the change, this
addresses right up front the point that Mr. Thadani brought
up.  Unless a licensee knows its current licensing basis
and has the plant built and operated in accordance with it,
the risk evaluation and the deterministic evaluation may
not be very useful.  So we want to establish right up front
that a
licensee has confirmed its current licensing basis and that
the plant is built and operated in accordance with it, so
that the rest of the analysis is consistent with that.
          Issue II-(b), what are the acceptance guidelines
for the deterministic evaluation, this is one we have been
struggling with quite a bit.  Again, the PRA policy
statement says PRA is to be used to complement the
deterministic evaluation.  Deterministic terms like "defense
in depth" and "design margins" and so forth are used quite a
bit, but when you go to write the standard review plan and a
regulatory guide to define exactly what is meant by those
things and what are the acceptance criteria, it gets a
little tougher.  So we have been struggling with this.  I am
not here to say we have an answer yet, but it is going to be
one item that is going to involve a lot of discussion over
the next several months.
          CHAIRMAN JACKSON:  Well, one could argue that that
is an interesting statement, but one could also say that
maybe this exercise, then, in developing a PRA framework
helps us focus on what we mean --
          MR. KING:  Yes.
          CHAIRMAN JACKSON:  -- by defense in depth and
design margin.
          MR. KING:  Yes.
          MR. HOLAHAN:  As a matter of fact, on that
subject, we have an interoffice meeting this afternoon at 4
o'clock to see if we can come a little closer to figuring
out exactly what this ought to be.
          MR. KING:  Slide 10, please.
          Item III-(g) and (i) are directly related to two
of the policy issues, the policy issues associated with
plant-specific application of the safety goals and the risk
neutral versus risk increase.  (g) is how should the
acceptance guidelines be structured, and that gets into
issues like what metrics should be used, should it be core
damage frequency, conditional containment failure
probability, large early release frequency, some other
aspect, how do we pick the values to be consistent with the
safety goal, considering the fact that we are talking
plant-specific application, how do you account for less than
full scope PRA.
          The safety goal policy statement was fairly clear
that the risk that it was talking about was from all aspects
of plant operation, and a lot of the risk analysis that is
out there, including the IPEs, are focusing on full power
operation only.
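[Editorial note: the relationship among the candidate risk metrics Mr. King lists above -- core damage frequency, conditional containment failure probability, and large early release frequency -- can be sketched as follows.  All numerical values, including the guideline, are hypothetical illustrations, not Staff criteria.]

```python
def large_early_release_frequency(cdf, ccfp):
    """LERF approximated as core damage frequency times the
    conditional probability of early containment failure."""
    return cdf * ccfp

# Hypothetical full-power-only PRA results, per reactor-year:
cdf = 4.0e-5     # core damage frequency
ccfp = 0.1       # conditional containment failure probability (early)
lerf = large_early_release_frequency(cdf, ccfp)

# Compare against an illustrative guideline value.  A full-power-only
# PRA understates total risk, since shutdown and external events are
# excluded -- the "less than full scope" point made above.
guideline = 1.0e-5
meets_guideline = lerf < guideline
```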
          CHAIRMAN JACKSON:  I have to play the devil's
advocate here again.  If we look at issue (a), what
determines the extent to which risk analysis can be used,
did the Commission's policy statement itself address that
question?
          MR. HOLAHAN:  It has.
          MR. THADANI:  Yes.  I think, in fact, that was the
very first bullet when I went through.
          CHAIRMAN JACKSON:  Right.
          MR. THADANI:  To that extent, it is supported by
methods and data and to be used as a complement.
          CHAIRMAN JACKSON:  So what you are trying to do is
pin down in some more quantitative way what that is?
          MR. THADANI:  More details and what does it really
mean, supported by methods and data, what does that mean.
          CHAIRMAN JACKSON:  I just want to go through a
couple of them, not all of them.
          MR. KING:  Sure.
          CHAIRMAN JACKSON:  With (b), where you say what
determines the required quality of the risk analysis, will
the guidance documents answer that question?
          MR. KING:  Yes.
          CHAIRMAN JACKSON:  Then, my favorite topic is (d),
how is uncertainty to be addressed.  There are two
questions.  One is, how has uncertainty been treated in the
past, in past uses of PRA insights in the regulatory
process.  So, if you could give me an answer to that, and
then the other one, which is that I note that in your SECY
96-218, the Staff indicates that it intends to use the mean
value for comparison with numerical guidelines associated
with absolute measures, such as core damage frequency, and
this is my favorite topic.
          So the question becomes how uncertainty is to be
addressed, referencing it to how it has
been addressed in the past.  Does this imply that if you
have equal mean values -- this is where the rubber meets the
road -- with big differences in uncertainty, would that lead
to the same regulatory decision?
          MR. KING:  Not necessarily.
          CHAIRMAN JACKSON:  Okay.  So can you maybe
illuminate or amplify on that a little bit?
          MR. KING:  We haven't settled exactly on how we
are going to treat uncertainty at this point either.
          CHAIRMAN JACKSON:  Okay.
          MR. KING:  The Commission safety goal policy said
use mean values in assessing against the goals.
          CHAIRMAN JACKSON:  Do you feel that is enough?
          MR. KING:  In some cases, it is enough, but it
does require a full uncertainty analysis be part of the
analysis and evaluation by the licensee and of the review by
the staff.
          MR. THADANI:  I think that this is obviously a
very tough issue.  Mean values relate to where you are able
to quantify a number of things and you are able to actually
draw some sort of distribution and so on, but there are many
elements where the uncertainties are really not quantified,
organizational and cultural issues, some other things, for
example, some of the Millstone issues.
          So there are areas where uncertainties are not
quantified, programmatic weaknesses or problems.  So what it
really boils down to is when you get an issue where let's
say a licensee wants to use these techniques, it seems to me
we are going to have to look at that specific issue and try
to use some judgment on what are some of those so-called
unquantified uncertainties and should we, in fact, use a
mean value, then.  Maybe not. 
          So it seems to me that there has to be some
balance brought into this process to recognize that we
cannot answer all the questions up front, I don't think, but
the process should allow those considerations whenever there
is an application to be made.  So I am hoping that is how we
can move forward.
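[Editorial note: the Chairman's question about equal mean values with very different uncertainties can be illustrated numerically.  The lognormal spreads below are invented for illustration; nothing here reflects an actual Staff position on how uncertainty will be treated.]

```python
import math

def lognormal_mu_for_mean(mean, sigma):
    """The mu parameter that gives a lognormal the requested
    arithmetic mean: mean = exp(mu + sigma**2 / 2)."""
    return math.log(mean) - sigma ** 2 / 2.0

def prob_exceeds(threshold, mu, sigma):
    """P(X > threshold) for X ~ Lognormal(mu, sigma)."""
    z = (math.log(threshold) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

mean_cdf = 1.0e-5              # both analyses report this mean CDF
mu_narrow = lognormal_mu_for_mean(mean_cdf, 0.5)
mu_wide = lognormal_mu_for_mean(mean_cdf, 1.5)

threshold = 1.0e-4             # hypothetical clearly-unacceptable level
p_narrow = prob_exceeds(threshold, mu_narrow, 0.5)
p_wide = prob_exceeds(threshold, mu_wide, 1.5)
# Identical means, yet the wide distribution carries orders of
# magnitude more probability of exceeding the threshold -- the mean
# value alone does not distinguish the two cases.
```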
          CHAIRMAN JACKSON:  If you look through this list
of A through I, have you identified which ones minimally
have to be answered at some level in order to develop
realistic guidance documents?  Have you answered that?
          MR. KING:  I think it is all of them.  Our intent
is all of them.
          CHAIRMAN JACKSON:  To have some answer?
          MR. KING:  To have some answer, yes.
          CHAIRMAN JACKSON:  For all of them?
          MR. KING:  Yes.
          MR. HOLAHAN:  Yes.
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  So, when you say, then, that
these draft documents are available, that means that you
have some answers to all of them relative to development of
those guidance documents?
          MR. KING:  The drafts have some answers.  Whether
there is consensus on the Staff regarding those answers is
another question.  They are under review.
          CHAIRMAN JACKSON:  That is what you mean when you
say under Staff review?
          MR. KING:  Yes.
          CHAIRMAN JACKSON:  I see.  Okay.  Now I
understand.
          COMMISSIONER ROGERS:  Well, before we leave this,
I'd like to just pursue one little aspect of it, and you
touched on it a bit, the question of quantitative measures
or lack of quantitative measures.
          It is kind of my impression that some areas of risk
analysis that are being used, particularly in the fuel cycle
facilities, are not really being carried out using
probabilistic analysis, but they are risk analyses, and it
seems to me that we are going to have to deal with that
issue of risk analyses which are not really based upon a
strictly probabilistic calculation, and nevertheless, do the
job in some way.
          This may or may not fit into the reactor area.  It
probably does to some extent, but it may be very important
in the nonreactor area.
          MR. THADANI:  Yes, yes.
          COMMISSIONER ROGERS:  So I do think that while I
am very high on numbers, I do think we have to recognize
that there are other ways of analyzing risk that are not
strictly based on probabilistic calculations, but are
something a little closer to the usual traditional
deterministic approach, yet nevertheless are a risk
analysis rather than a straight engineering calculation of
some sort.
          I hope somehow we keep that in mind here for those
situations where that is the only way to go.
          MR. THADANI:  Yes, indeed.  In fact, I was going
to say some people don't like to hear this, but I think when
you go through the process of risk analysis, the first parts
are probably the most robust in the sense of logic models. 
The event trees and fault trees, by and large, I think, are
the most robust.
          When you get into quantification is where one has
to be cautious, and no matter what application, one needs to
look hard, I think.
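[Editorial note: the logic models Mr. Thadani calls robust can be sketched in a few lines.  The gate structure below is a generic illustration with invented failure probabilities; the quantification step is, as he notes, where the real caution belongs.]

```python
def and_gate(*probs):
    """All inputs must fail (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Any input failing fails the gate (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Invented basic-event probabilities for two redundant pump trains
# and a shared support system:
pump_a, pump_b = 1.0e-3, 1.0e-3
support_power = 1.0e-4
# Top event: both pumps fail, or the shared support system fails.
top_event = or_gate(and_gate(pump_a, pump_b), support_power)
```

Note how the shared support system, not the redundant trains, dominates the result: the structure of the logic model carries that insight even before the input numbers are debated.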
          MR. HOLAHAN:  I think in many cases, this is not a
go and no-go sort of decision.  There are stages.  As the
policy statement suggests, the first question is, does the
state of the art support the kind of issue you are trying to
deal with, and I think after that, if the issue is amenable
to a probabilistic risk assessment, you still want to choose
the proper-sized tool for the job.  So, if it is a
relatively easy question and, in fact, a qualitative risk
assessment would convince you that this is a net
improvement, why go through an elaborate uncertainty
analysis to figure out how sure you are or how big it is?
          CHAIRMAN JACKSON:  It strikes me, though, that
there is kind of a baseline question that in doing
everything that you have just described, you have to
address, which is kind of -- let me see if I can articulate
it.  It is essentially saying how much do I have to know and
be able to quantify to make a judgment here, and if I want
to make a judgment somewhere else at some other level, that
it requires that much more, and if I can't get that, then
the decision-making has to be done a different way.
          Now, will the kind of guidance that you are
working your way toward allow those kinds of assessments to
be made?
          MR. HOLAHAN:  Clearly, that is our goal.  It is
early on in this process to identify what is the proper tool
for the given issue.  Can you make a certain type of
decision with a qualitative analysis?  Does it take a
quantitative analysis, but not necessarily an elaborate
uncertainty analysis, or in some cases, are we making
sufficiently complicated decisions that a full scope, full
uncertainty analysis is needed?
          I think, in each case, what you are trying to do
is to say do I have confidence in the decision that I am
making.
          CHAIRMAN JACKSON:  I appreciate what you are
saying, and I guess all I am really asking is will the
guidance documents be such that one proceeds along a path
and comes to some bifurcation point that says I can go
further down this PRA path or I can't, and if I can't, then
it kicks over into something else.  I mean, that is
presumably where you are trying to go.
          MR. HOLAHAN:  We are working on that very subject. 
As recently as yesterday's meeting, we were going to divide
up regions in which more detailed uncertainty analysis is
appropriate and where less is needed.  That is the kind of
thing that belongs in a guidance document.
          CHAIRMAN JACKSON:  Okay.
          MR. THADANI:  We had identified in an earlier
paper, actually, that generally we were looking at three
categories of applications.  One was what we called
prioritization which is, by and large, NRC activity, and
that one could go with something fairly simplified.  You
don't want to spend a lot of resources to see how to
prioritize things, but that you can use better understanding
of risk importance to make those kinds of decisions.  That
was probably the simplest type of application in terms of
the quality of analysis.
          The next one was where the decision was not really
eliminating a requirement, so to speak, but that you are
just shifting importance, so to speak, high safety
significance and medium safety significance and low safety
significance.  That would require a certain type of
analysis.
          Whereas, if you are really completely walking away
from what today's requirement might be, then one has to do a
very thorough analysis before saying that that makes sense. 
So those are the categories that we have been looking at,
and then, of course, the toughest issue, I think, is the
issue of how to deal with uncertainties in all of this.
          CHAIRMAN JACKSON:  Okay, thanks.
          MR. KING:  Just to quickly highlight item (i) at
the bottom of page 10, should the acceptance guidelines
apply to proposed changes individually or as a package, the
topic there is when someone comes in with a proposed change,
can they group changes together, look at the risk changes
due to proposed changes in tech specs versus ISI versus
graded QA and add them all up and get a net reduction or net
increase, whatever it turns out to be, or do we want to
limit it to just a single topic.  So that is the issue that
is being talked about there.
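[Editorial note: the individually-versus-as-a-package question can be put in arithmetic terms.  The three changes and their signed core-damage-frequency impacts below are invented for illustration.]

```python
# Signed CDF impacts, per reactor-year (hypothetical values):
proposed_changes = {
    "tech spec relaxation": +3.0e-7,   # small risk increase
    "ISI program change":   +1.0e-7,   # small risk increase
    "graded QA":            -5.0e-7,   # risk reduction
}

net_change = sum(proposed_changes.values())
# Judged as a package, the net effect is a risk decrease...
package_acceptable = net_change <= 0.0
# ...but judged one topic at a time, two of the three are increases.
individually_acceptable = all(
    delta <= 0.0 for delta in proposed_changes.values()
)
```

The same submittal can pass one reading of the guidelines and fail the other, which is exactly why the grouping question is a policy issue.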
          On page 11, issues associated with implementation
and monitoring, this is tied to one of the policy issues. 
This step was explicitly added in the process so that we
would use performance monitoring as much as practical to
check the assumptions and provide feedback into the
evaluation and the changes that were being made.
          If assumptions are made regarding equipment
reliability or so forth, this is a step that would hopefully
check to see whether those assumptions are holding true,
and if not, provide the appropriate feedback into the
process.
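[Editorial note: the monitoring-and-feedback step Mr. King describes amounts to checking assumed reliability against operating experience.  The screening rule and the diesel-generator numbers below are invented illustrations, not a Staff method.]

```python
def assumption_holds(assumed_failure_prob, failures, demands,
                     margin=2.0):
    """Crude screening check: the observed failure rate should stay
    within `margin` times the failure probability assumed in the
    risk evaluation.  (Not a statistical test.)"""
    observed = failures / demands
    return observed <= margin * assumed_failure_prob

# Assumed in the PRA: a diesel fails to start 1 time in 100 demands.
still_valid = assumption_holds(1.0e-2, failures=1, demands=60)
needs_feedback = not assumption_holds(1.0e-2, failures=5, demands=60)
```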
          CHAIRMAN JACKSON:  Now, a natural question that
arises is this.  Now we have a maintenance rule that just
became effective.  Presumably, each one of these questions
has to be addressed in implementing that rule for the SSCs
that we mean for it to cover.  What are the answers to those
questions within the context of the maintenance rule, and
then how does that flow into this and vice versa?
          MR. KING:  It may very well be the maintenance
rule is accomplishing this for whatever proposed change they
are making.
          CHAIRMAN JACKSON:  Well, I guess what I am trying
to say is that it strikes me that that is something you have
to come and tell us; namely, how are these four questions
being answered within the context of the maintenance rule,
and how, then, does that tie back into what you are doing
and how does what you are doing affect how these questions are
answered.
          MR. THADANI:  I think that there are two parts
that we need to be sure about.
          The first part would be depending on what
performance criteria one sets up.  If those are really
related to reliability analyses, so to speak, then, clearly,
one has to have some guidance document on how to assess and
interpret what has been done.
          As far as the maintenance rule is concerned, some
licensees may have used some reliability guidelines that may
have come out of the PRAs.  Some may not have.  That was not
strictly necessary under the maintenance rule.
          We have initiated our inspections, what we call
baseline inspections, under the maintenance rule.  As I had
indicated to you in the past, I am hoping that by February
time frame we will have done enough inspections, 10 or 12 or
some number like that, that we can probably draw some
inferences and some potentially generic insights.
          Our intention is to then step back.  What we learn
from those inspections would be considered, if it is
appropriate for these guides, but that I don't have the
answer today as to what we are going to find.
          CHAIRMAN JACKSON:  Let me, then, say this.  I am
going to be explicitly asking you this.  Since you would be
coming back in the March-April time frame to brief the
Commission again, that you come back as part of that brief
with answers to these four questions in the context of the
maintenance rule --
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  -- and how that ties into the
answers to these questions --
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  -- within the context of what
you are doing --
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  -- because it is very
important, okay?  Because first of all, we shouldn't be
going down a path relative to the maintenance rule that is
somehow different than the path we are going down in the
overall PRA Implementation Plan.
          Two, we say that the maintenance rule is our first
example of a risk-informed performance-based rule, and if it
is, then it better tie into the PRA framework that we are
developing.
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  And I understand your point
about doing these baseline inspections, but since you
indicated that sometime after the first of the year --
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  -- you will have more data,
then along around March-April, you should be able to put it
together, and you will be further along in these reviews of
your reg guides because I think this is very important. 
          MR. THADANI:  It is critical, I agree, and we will
do that.
          MR. HOLAHAN:  I think it serves the same role as
some of the pilot applications, but there are differences in
the scope and the intent of the maintenance rule versus the
general --
          CHAIRMAN JACKSON:  No, I appreciate that, and that
is, in fact, what you have to come back and tell us because,
in fact, we need to understand how the scope differs.
          MR. HOLAHAN:  Right.
          CHAIRMAN JACKSON:  We have talked about this
before within the context of the reliability data rule or
putative reliability data rule, but it is very important
because it is important in terms of consistency in how we do
things.
          MR. HOLAHAN:  Yes.
          CHAIRMAN JACKSON:  Okay, thanks.
          MR. KING:  Let me move on to Slide 12, issues
associated with integrated decision-making.  This is where
the deterministic and the probabilistic evaluations come
together and a decision has to be made.  Again, it relates
to what we talked about earlier, what are the deterministic
decision criteria.  A number of these items are directed
toward that.
          Let me just mention item (g) at the bottom of the
page, the role of 50.109.  This actually came out of one of
the pilot programs.  If the Staff has conducted the review
and feels that something else needs to be done over and
above what the licensee has volunteered to do, do we have to
follow 50.109 to get that in place.
          Counter to that or the reverse of that, we have
also discussed, and that is a licensee comes in and proposes
a change that causes some increase in risk, why shouldn't we
apply the backfit rule in a reverse way.  Is the cost
savings associated with that sufficient to justify the
increase in risk?  So that is what we have been talking
about.  I am not here to give you an answer, but that is
what that item means.
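[Editorial note: the "backfit rule in reverse" question Mr. King poses is, at bottom, a value-impact comparison.  The dollars-per-person-rem conversion and the example numbers below are placeholders for illustration, not NRC figures.]

```python
def change_is_justified(cost_saving, added_person_rem,
                        dollars_per_person_rem=2000.0):
    """Compare a proposed change's cost saving against the monetized
    detriment of the added population dose it causes."""
    detriment = added_person_rem * dollars_per_person_rem
    return cost_saving > detriment

# Hypothetical: a $500,000 saving versus 100 person-rem added dose.
justified = change_is_justified(500_000.0, 100.0)
```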
          CHAIRMAN JACKSON:  Let me just make one other
comment.  Aren't (b), (c), and (d) on here linked?  That is, if
one really had a process for addressing uncertainty, then
this issue of the extent to which the existing degree of
defense in depth should be maintained is more addressable,
as well as the issue of the margin of safety.
          The only reason I keep bringing this up, you say
it to me and I am saying it back to you.  Somehow we have
got to get our hands around the uncertainty issue, I will
put it that way, because if we don't somehow bite that
bullet and at least know where we can and cannot do
something -- I understand we cannot do it everywhere -- I
don't see how we are going to answer (c) and (d).
          MR. KING:  Slide 13 has to do with what actually
has to be part of the submittal, what documentation needs to
be submitted.  Do we need the full PRA or just some summary
that describes what was done in enough detail?
          Regardless of whether the full PRA comes in or
just some summary information, item C is a process issue. 
Will our explicit use of risk information in plant-specific
decisions now require PRAs to be put on the docket and
litigated?  It is an item that just remains to be seen at
this point.
          With that, Gary Holahan is going to talk about the
four policy issues.
          MR. HOLAHAN:  Could I have Slide No. 14, please? 
The four policy issues that we identified are shown here,
the role of performance-based regulations, the use of the
safety goals or guidance, the decision process derived from
the safety goals on the plant-specific basis, whether
increases in risk should be allowed at all or under what
circumstances increases are appropriate, and then something
of a process question on how should changes in the ISI and
IST program be --
          CHAIRMAN JACKSON:  Mr. Holahan, you know I can't
let you slide.  How clearly do you feel the Staff knows what
performance-based regulation means, how clearly do you feel
you know, and what degree of concurrence is there on a
definition?  If not, how do you go about -- what are you
doing to clarify that?
          MR. HOLAHAN:  Well, I think there is not a
unanimity of understanding as to what the definition of
performance-based regulation is.
          I have seen lots of different definitions.  I
think we understand common features that performance-based
regulations have.
          I remember that we were told that
performance-based fire protection requirements are being put
in place around the world.  So a number of the staff met
with the National Institute of Standards and Technology.
They were involved with those things, and I guess they
confirmed our view that each country, each application has a
slightly different definition, but with some common
elements.
          So I think we are starting out with maybe buzz
words, but we are developing guidance documents which I hope
will clarify the situation.
          CHAIRMAN JACKSON:  Is there utility, too, and
have you been able to garner any input from other types of
industries or regulatory bodies that have gone at this?  Are
we on the cutting edge?
          MR. HOLAHAN:  Well, I think there are some other
areas.  We have had some input from the industry.  There is
an industry white paper on the subject.
          A number of recent PRA conferences have identified
this as an issue, and so it has been discussed.  We
discussed it with the ACRS.
          CHAIRMAN JACKSON:  But is it all the nuclear
people talking to each other as opposed to --
          MR. HOLAHAN:  I would say, largely, it is.  In the
fire protection area, obviously, it goes well beyond the
nuclear area, but there are probably numerous other
industries that we haven't fully tapped.
          CHAIRMAN JACKSON:  Okay.
          MR. HOLAHAN:  Can I have Slide 15, please?
          The issue on Slide 15 being the role of a
performance-based regulation in the PRA Implementation Plan,
the Staff identified three options.  I guess it is also
important to note that as a result of the strategic
assessment, there is, in fact, a paper on the subject which
also identifies three, I would say, similar, not identical
options.
          So the Staff's moving ahead on these options, I
think, is also tied to the decision in the strategic
assessment arena.
          The first option we identified basically is to
continue our current practice, and our current practice
being what Tom King showed, which, in fact, is to have
developed what we called step four, which is as part of
risk-informed regulation, actually searching out
opportunities, in a plant application, for monitoring
information that would validate the assumptions that went
into a risk analysis or the assumptions that went into a
deterministic engineering analysis.  So the first option is
to continue with that process.
          The second option is a bit more aggressive in that
it would, within the context of the PRA Implementation Plan,
solicit additional areas in which the industry was
interested in pursuing examples.
          The third option would, in effect, be to create
something akin to the PRA Implementation Plan, which you
could name the performance-based regulation implementation
plan and collect together all those related topics and sort
of give it a life of its own.
          Staff has recommended option one, but I think it
is fair to say that option one with a leaning towards option
two because there is some receptiveness to additional
initiatives, and the Staff did send a letter earlier this
year to NEI suggesting that at least some additional options
as a learning or pilot-type experience would be appropriate.
          We have discussed this issue with the ACRS.  At
the bottom of the page, you will see a quote from their
August letter.  I think they were definitely very supportive
of doing at least what the Staff had recommended; that is,
to find a constructive place in each risk-informed decision
for a performance-based strategy to be included as a
verification or validation step, but I think the important
thing at this stage is that we are going ahead at least with
option one, and we need to do that for the development of
the regulatory guides and the SRPs.  If the strategic
assessment process should ask the Staff to do more, I think
it is highly unlikely that it will be asked to do less.
          So I think in the context of the reg guide and the
SRP, it is fairly clear what we should be doing.  What is
not entirely clear and is a policy issue for the Commission
to decide is how much more should we do.
          COMMISSIONER McGAFFIGAN:  What was the response
from NEI to your letter?  Did they have initiatives that
they would like to --
          MR. HOLAHAN:  I don't recall them coming back with
a specific example.  It seemed to me that it was between the
time when they sent us a draft of their white paper on
performance-based regulation and when they finalized it.  So
I might say they were at least encouraged enough to go
forward and finalize their views.  I don't think they have
identified specific examples to follow up since then.
          MR. THADANI:  I think it is important to make a
point.  The PRA Implementation Plan is really focused on
risk-informed activities, and in some cases, where some
reliability guidelines are developed or to be used, then the
Implementation Plan can give guidance on how one would
assess that.
          Frankly, for many systems where you establish very
high reliability, steam generator tubes, the reactor coolant
pressure boundary, the pressure vessel itself, many of the
components, the expected reliability is very high, and so
the performance criterion one establishes cannot be first
failure.  It cannot be that.  It has to be something else. 
It has to be some engineering consideration that goes in,
how much thickness of a pipe, how much thickness can you
afford to lose.
          PRA can tell you how important that component is,
and that is an important part.  That is the risk-informed
part, but the performance-based part is non-numerical
because of the confidence with which one is trying to
ascertain the reliability of a component.
          That is why I think we need to be very clear on
what is it that we mean by performance-based because, in
many cases, going forward in the risk analysis approach
cannot answer some of the concerns that we might have.  It
is that element of the performance-based aspect that would be
very difficult for this plan to address.  It requires a lot
of thoughtful experience and understanding.
          Really, that is why we said in our letter to NEI
that we need to learn from experience and we need to move a
little slower in this area is basically what we said.  It is
not to say that we shouldn't go in this direction, but let's
make sure we know where we are going, learn from whatever experience
we have as we go forward.
          MR. HOLAHAN:  I think, as also pointed out in the
Commission paper, resources are an important element here. 
We are trying to maintain an aggressive schedule on the reg
guide and the SRP development, and there is a concern of
diverting resources from that activity if we take on a
little bit more than we can manage all at once.
          Can I have Slide No. 16, please?
          The second issue relates back to, as Mr. Thadani
mentioned, the June 15, 1990 SRM in which the Commission
instructed the staff not to use the safety goals for
plant-specific purposes, but to use them in a generic
decision-making, and that policy was, I would say, restated
in the PRA policy statement, but I think it was restated in
the context that the Staff needs to come back to the
Commission if it proposes to do otherwise than the 1990
directions.  I think we have read it not to be an absolute
prohibition, but if we develop this to be a worthwhile idea,
we need to bring it back for the Commission's
approval to go forward.
          COMMISSIONER ROGERS:  Well, if I could just say
something?
          MR. HOLAHAN:  Yes.
          COMMISSIONER ROGERS:  It seems to me, as I recall,
when the Commission first took that position back in 1990,
PRA was not a very attractive method of making regulatory
decisions around here, and there was considerable resistance
to it up until later than that, I believe.  So, at that
time, the Commission felt that we weren't on very solid
grounds in going beyond a kind of generic approach to using
the safety goals, but in the meantime, the kind of
development of PRA for nuclear applications and the data
that have developed have given us a great deal more
confidence that you can begin to think about the possibility
of using the safety goals themselves in some way for
plant-specific applications.
          I can't speak for the whole Commission, but it
certainly seemed to me that what we were saying there was to
open the door to that possibility cautiously and not just
say that it is still locked.
          MR. HOLAHAN:  Okay.  In fact, we developed two
options.  One option would be to develop guidelines for
plant-specific decisions and to have those guidelines
derived from safety goals and the subsidiary objectives. 
The second would be to derive plant-specific guidelines, but
to try to preserve the generic national average nature of
the safety goals by coming up with a scheme for relating an
individual plant regulatory decision, the effect it would
have on the nationwide risk.
          What we found is that the second approach, in
addition to being rather complicated, since one would have
to have a full understanding of the risk assessment at all
plants to make a decision on any plant, also raises a number
of rather complicated social and policy issues about whether
it is appropriate to make a risk decision at one plant where
you are averaging out that local effect nationwide.
          CHAIRMAN JACKSON:  Right.  It is a lot harder to
raise an industry average than an individual number.
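[Editorial note: the arithmetic behind the Chairman's remark.  With roughly 100 plants at an invented uniform baseline, doubling the risk at a single plant moves the fleet average by only one percent.]

```python
n_plants = 100
baseline_cdf = 4.0e-5                  # invented, per reactor-year

fleet_average = baseline_cdf           # uniform fleet
# Double the core damage frequency at one plant only:
shifted_average = (
    (n_plants - 1) * baseline_cdf + 2.0 * baseline_cdf
) / n_plants
relative_shift = shifted_average / fleet_average - 1.0  # one percent
```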
          MR. HOLAHAN:  Yes.  So I think it is fair to say
we found the second option untenable.
          The first option is very much desirable in the
sense that if we are going to use risk assessment in the
decision-making process, certainly the output from the
individual calculation seems to be a natural part of the
calculation to use in that decision process.
          CHAIRMAN JACKSON:  Mindful of addressing all the
issues we have been talking about all afternoon.
          MR. HOLAHAN:  Absolutely, yes.  Yes.
          CHAIRMAN JACKSON:  Okay.
          MR. HOLAHAN:  With a full understanding of scope
and uncertainties, et cetera, but as we say, without having
a locked door.  
          So the Staff is recommending option one which
would be a change in Commission policy.
          MR. THADANI:  If I might, I think it is important
to note this is, in a way, one of the difficulties in pilots
because the Staff is pushing to get a lot of information,
and in some cases, that means a fair amount of additional
work on the part of those volunteer pilot licensees, and
that has caused some delays, trying to generate that
information or the need, the discussion back and forth as to
why is it really needed to generate this information.  We
will wait and see how it all works out, but currently, we
have a number of outstanding questions to those licensees.
          MR. HOLAHAN:  I think there is also a related
aspect to it, and that is, in some sense, the genie is out
of the bottle already.
          We know that the maintenance rule is being
implemented in many cases with licensees using
plant-specific risk assessment.
          CHAIRMAN JACKSON:  That is why I asked you about
the implementation and the monitoring part.
          MR. HOLAHAN:  Yes.
          CHAIRMAN JACKSON:  That is exactly why you have to
have that issue addressed.
          MR. HOLAHAN:  Yes.  And it is clear that licensees
are making other day-to-day prioritization and other
decisions based on the results of risk analysis.
          So, to confine the Staff to not use a similar
approach in plant-specific decision-making, it seems -- I
think, as Commissioner Rogers suggested, perhaps that had a
basis at the time, but I think that time has gone.
          CHAIRMAN JACKSON:  We heard you, Mr. Holahan.
          MR. HOLAHAN:  And the ACRS appears not only to
agree with us, but I think they were considerably ahead of
us on this issue, encouraging this view for several years, I
think.
          Can I have the seventh slide?
          The seventh slide is really contingent upon the
answer to the sixth, and that is, if you are going to use
the results of a risk analysis in plant-specific decisions,
should those results, in effect, be a proof that no risk
change has occurred or only an improvement has occurred, or
should increases be allowed under some circumstances?
          I think what is fair to say is this is a policy
matter to the extent that explicit changes in risk would be
identified and approved because, in an unquantified way, I
think it is clear that the Staff, through the normal license
amendment process, under some circumstances where we feel it
is appropriate and the Commission's regulations are met,
does allow small risk increases.
          This would say that in the risk-informed
regulatory context, we would consciously, knowingly, and
with some numerical analysis make such a decision.  So,
basically, the two options that we have identified are to
allow small increases under certain circumstances and the
regulatory guidance and the review plan would be the
guidance document to identify how small is small and what
are those circumstances, or we could say no, it is not
appropriate, you should use your risk analysis to hold the
plant risk at some value where you think they currently are.
          We looked at the pros and cons of these options. 
The Staff has recommended the first to allow increases under
certain circumstances.  We think that is appropriate.  We
think we can identify how small is small and what is
appropriate.  The reg guides will help to balance any small
changes with deterministic engineering margins to give us
additional confidence that the decision we are making really
makes sense.
          The ACRS spent some time also reviewing this topic
and also agreed with the staff.  I would say, as a matter of
principle, we haven't brought an example to the ACRS yet as
to how those guidelines would be developed.
          COMMISSIONER DICUS:  I want to ask you a question
about this.  I don't necessarily disagree at least on the
surface with the recommendation, but if you were to, for
example, allow a small risk increase on one circumstance and
later there is another circumstance and then later another
one, is there a way to follow these small risks that have
accumulated, to have a point in time where you say --
          MR. HOLAHAN:  Yes.
          COMMISSIONER DICUS:  -- and you track this and you
know this.
          MR. HOLAHAN:  We will certainly address that
issue.
          One of the techniques we have considered for that
-- and in fact, it is fair to say that the industry in its
PSA application guide, I think, has addressed it to a
certain extent in saying that a plant would develop a
baseline risk analysis and then any changes that it made,
either risk increases or decreases, at either a certain time
interval or when a next major change would be anticipated,
the analysis would be updated, in effect, if they had moved
closer to some ultimate goal, that that would be reflected
and understood before the next change would be made.
          CHAIRMAN JACKSON:  At the risk of preaching to the
choir, let me just reference Mr. Taylor's beginning comments
and Dr. Thadani's comments; that it still tracks back to the
licensing/design basis issues because if you don't know what
you are building the PRA on and if changes aren't
appropriately captured and documented as of now, then it is
very difficult to talk about moving forward in terms of
looking at how a risk profile of an individual plant may
change; that these are inextricably linked issues.  Do you
disagree?
          MR. THADANI:  No, not at all.
          MR. TAYLOR:  We agree 100 percent.
          MR. THADANI:  Totally.
          In addition to that, I would note that you tasked
us, the Staff.  The Staff should track cumulative changes
for those individual plants.  So in the Implementation Plan,
we have an activity.  Not only do we have the expectation that
the PRA should reflect the plant design and operation; that
the plant should track if they are going to be using that
tool -- they need to keep track of cumulative effects, but
that the Staff will also be tracking that information.
          MR. KING:  Commissioner Dicus, in our list of 27
issues, it was item III-(e) on page 10, the issue you
brought up.  We didn't miss it.
          [Laughter.]
          COMMISSIONER DICUS:  You are on top of it.
          MR. HOLAHAN:  Let the record show the choir says
amen.
          Could I have Slide 18, please?
          The fourth policy issue was a little different
from the first three in that it is more of a procedure and
process question than a technical or real technical type
policy matter, and that is where we are considering
risk-informed changes in the in-service inspection and
in-service testing programs, how should those be treated in
the context of the current regulations. 
          We identified three options.  One would be to
consider them exemptions to the current requirements in 10
CFR 50.55a.  If we were to review these as being quite
different from previous changes and the kind of alternatives
the Staff has allowed in the past, then it would
be most appropriate to treat those as exemptions.
          The second option would recognize the fact that
the regulation currently allows for authorized alternatives
in Section 50.55a(a)(3)(i), and the third option would be to
defer any such changes until the national consensus
standards, the ASME standard process had actually adopted
those changes.
          We have looked at these options, and we have
considered whether the risk information and the kind of
decision we would be making would be consistent and
appropriate, similar to decisions we have made before, and
our recommendation is to treat the code alternatives, to
treat the ISI and IST alternatives to the normal code
requirements as authorized alternatives under that element
of the regulations. 
          However, we think that carrying those for a long
period of time as authorized alternatives is probably not
the clearest and the best approach.  So, in parallel with
that, we would be working with the ASME to move these
alternative approaches into the national codes and make them
a part of the -- that would draw them into the normal
coverage of the regulations. 
          MR. THADANI:  I might just note again, on that
one, the Staff is looking.  For example, in ISI, there are
two approaches that are being looked at.  One approach is
one for which ASME is the sponsor.
          MR. HOLAHAN:  I think in that context, if we were
to approve both options, then, perhaps, the ASME part would
be taken care of because the regulations refer to the ASME
code, but perhaps a rule change would be appropriate to
reference the other methodology.
          CHAIRMAN JACKSON:  Is this a technical
recommendation or a legal?
          MR. HOLAHAN:  This is really a legal and
procedural matter.
          CHAIRMAN JACKSON:  I mean, this is including the
legal staff analysis of this?
          MS. CYR:  We concluded that, in our understanding
of where they have looked at alternatives in the past, that
it is an alternative under the 50.55a(a)(3).
          MR. THADANI:  Correct.  Otherwise, we couldn't
proceed under that.
          CHAIRMAN JACKSON:  But greater clarification would
come from this dual path.
          MS. CYR:  Ultimately to adopt that.  Now there
will be an approved alternative that the ASME adopted as a
consensus standard.  Then you would want to reflect that
essentially as your main part of your --
          MR. THADANI:  That is right.
          MR. HOLAHAN:  And largely, because this was a
legal and procedural matter, we did not ask the ACRS to
comment on it.
          I will cover the last two slides quickly.  We have
a number of activities over the next six months.  We have
meetings with the ACRS, October, November, and December. 
Those are largely focused on the regulatory guide and the
standard review plans.
          We are still striving to issue the reg guides and
the SRPs by December 31st.  We have a couple of major
activities to go in order to achieve that.  I think both the
ACRS views and the CRGR in November will be challenges to
the Staff to get those out on the current schedule.
          We will be continuing our review over the pilot
applications, with the IST pilot in March of '97 and the
technical specification pilot at the end of this year.
          We will be moving over the next six months, as Tom
King mentioned, to complete the IPE reviews.  The draft IPE
insights report is, in fact, to the Commission, and I guess
it will be sent out for public comment shortly.
          MR. KING:  Yes.
          MR. HOLAHAN:  I think we already covered the
reliability data rule as an ongoing activity.
          CHAIRMAN JACKSON:  When is that evaluation
expected to be completed?  When are you going to be
completed?
          MR. JORDAN:  I will ask Pat Baranowsky to give me
advice.
          MR. BARANOWSKY:  I believe we are going to try to
have something to the Commission giving the status of that
evaluation either late February or early March.
          CHAIRMAN JACKSON:  So March of '97?
          MR. BARANOWSKY:  '97, yes.
          CHAIRMAN JACKSON:  Okay.
          MR. HOLAHAN:  Of course, having developed the
number of new training programs, we will be continuing to
use those.
          May I have the twentieth slide, please?
          This is just a summary of our next commitments to
the Commission, with a December update and a briefing plan
for next April.  I think by the time we prepare the December
update, we will have a much clearer view of where we stand
with respect to the regulatory guide and the standard review
plan because we will have been through the ACRS and CRGR and
we will know how close we are to having a version available
for public comment.
          CHAIRMAN JACKSON:  Okay, thank you.
          Commissioner Rogers?
          COMMISSIONER ROGERS:  Just in the December
briefing, do you have essentially a topics list for that yet
as to what you think you will be discussing?
          MR. HOLAHAN:  I think the current plan is we would
have briefing -- you mean the Commission briefing?
          COMMISSIONER ROGERS:  Yes.
          MR. HOLAHAN:  I think the Commission briefings
have been set on six-month intervals.
          So, although we would produce an update of the
report, right now there is not a --
          CHAIRMAN JACKSON:  The report comes in three-month
intervals.  The briefings are six months.
          COMMISSIONER ROGERS:  Right, but just what will be
the emphasis of that?
          MR. HOLAHAN:  Just looking at the topics that are
ongoing, I would say the regulatory guide and the standard
review plans would be the dominant issues.
          COMMISSIONER ROGERS:  When will you be able to
talk to us a little bit about how we expect to use PRA in
inspections?
          MR. HOLAHAN:  I think we could do that at almost
any stage since it is sort of in the process.  We have heard
some progress.
          COMMISSIONER ROGERS:  Well, I would be very
interested in hearing that because the standard review plan
is important, but how do we expect to actually employ the
use of PRA out in the field, particularly, for instance,
with resident inspectors?
          MR. THADANI:  Actually, we have started to move
slowly in that direction.
          CHAIRMAN JACKSON:  Why don't you speak to it
specifically in your briefing to the Commission.
          MR. THADANI:  We will do that.  We will do that.
          MR. HOLAHAN:  And there are two or three examples,
at least two, in the current Commission paper.
          The one thing that I would say, the thing that I
am most optimistic about is the senior reactor analyst
program where we have taken about 10 of the experienced
senior inspectors largely from the field offices and put
them in two-year training programs for PRA, and that looks
like it is working very effectively.  That is a mechanism
for getting experienced inspectors with risk insights,
putting them back into the regional offices to be the local
experts.
          COMMISSIONER McGAFFIGAN:  I got stuck back on page
13.  I kept going back to it.  If the answer to VI-(c) is
yes, that the licensee's PRA would be put on the docket and
subject to litigation, what are the implications of that for
this whole effort?
          MR. HOLAHAN:  Well, at first, it might make the
licensees a little bit reluctant to submit those, but I
think, in practice, what is likely to happen is not that the
whole PRA is subject to litigation any more than every code
analysis and every code run shows up in the litigation, but
the information that is extracted from it and summarized and
is actually in the licensing decision process.  Frankly, if
that element of the analysis is really what is at least in
part convincing the Staff that this is a good regulatory
decision to make, then I think it ought to be subject to
public scrutiny.
          CHAIRMAN JACKSON:  Commissioner Dicus, any
questions?
          COMMISSIONER DICUS:  No.
          CHAIRMAN JACKSON:  Commissioner Diaz?
          COMMISSIONER DIAZ:  Yes.  I just have a comment. 
When I read these documents, I was sure I was confused.  Now
your pages 10 to 13 assure me that I have good cause to be
confused because there are a lot of good questions in there.
          However, looking back a little bit over the
history, to be able to make an informed risk decision, I
think we already realize we have got to take a risk, and I
think we took a risk with the maintenance rule.  Is that
correct?  Was it something we did that we weren't sure how
it was going to come out and we took a risk?  And I think
that's great.
          In all these studies, sometime soon we are going
to have to come up with another risk.  We are going to have
take a risk, and that is going to be an informed risk to
implement risk-informed performance-based decision.
          Has the Staff identified any particular area where
we are closer to a definite answer to say we are going to be
able to do this at a definite time?  I mean, is any of the
multiple series of issues resolved to a point that we can
say in a year we can do that or whatever?
          MR. HOLAHAN:  I think it is fair to say that the
pilot activities are the areas that we are getting the most
experience, and I would say I think we are pretty
optimistic in each of those.
          Certainly, perhaps with the ISI, in-service
inspection area, it requires a bit more technological
development than the others, but I think the Staff is
optimistic about using these approaches and coming to
agreement with the industry on the in-service testing.  I
think there is very good reason to think that, certainly,
within a year we will be ready to deal with technical
specification changes, and I think in the graded QA area, we
will come to an understanding of the appropriate uses.
          One of the difficulties in that area is it is hard
to quantify the value of doing quality assurance, but
certainly, the use of risk input in deciding what is more or
less important equipment in the plant, I am very optimistic
that all of those will be successes.
          COMMISSIONER DIAZ:  Is the critical path dependent
on the database that you have established and on how to
track that database?
          MR. HOLAHAN:  I guess I don't see the data role as
absolutely essential in the sense that nothing can be done
without it.
          COMMISSIONER DIAZ:  I see.
          MR. HOLAHAN:  I think it is a very important
element.  I think it would reduce the uncertainties.  It
would make it much more practical to be making decisions.
          Perhaps you can't make as good a decision and maybe,
therefore, you can't make as certain a decision or maybe you
need to put in a little extra margin if you don't have as
much data, but I am still optimistic that improved decisions
can be made.
          COMMISSIONER DIAZ:  Certainly, if the rule is
imposed, we will be able to track it.
          MR. HOLAHAN:  Yes.
          MR. THADANI:  Yes. 
          If I may expand on what Gary has said,
Commissioner Diaz, actually we at the Agency are using these
techniques today in many of our decisions.  In fact, we have
a regulation that is called the backfit rule that calls for
us to make two determinations before we can impose any new
generic requirement.
          The first one is that it should lead to
substantial improvement in safety, and then we do use the
Commission's subsidiary objectives derived from the safety
goals and do, do risk analysis to see how the issue might
relate to risk before we would go back and do cost benefit
analysis, which is the second element of the backfit rule.
          As far as we go forward, we are even today using
what I would call risk insights to complement our
deterministic evaluations and some of the changes that the
licensees come in and propose in terms of technical
specification changes.
          What we do not do today, we do not have fixed
numerical criteria or what I would call the infrastructure,
the regulatory guide standard review plans, but we do use
risk insights in these decisions as we go forward.  In fact,
it is an important element that we need to consider.
          So I want to be sure that you know that we are
actually using these concepts in these decisions, but we
just don't have the fixed criteria and we don't have
guidance on how far to review these things called quality
methods and so on.  It is sort of ad hoc today, I would say.
          COMMISSIONER DIAZ:  I understand. 
          CHAIRMAN JACKSON:  I would like to thank the Staff
for what has been actually a very informative briefing on
the Agency's PRA activities.  Thank you.
          We commend you, in fact, for the progress you have
made to date in this sometimes difficult area.  I know some
of you have lost a few hairs along the way, but at the same
time, we encourage you to continue to improve the process
and to provide appropriate review mechanisms to ensure that
PRA is used appropriately in our regulatory processes. 
Clearly, PRA has become an important tool of the regulatory
process, and therefore, we have to strive to enhance the
process, when necessary, but to ensure its consistent use
where appropriate, and that is where the development of what
you call, Dr. Thadani, the infrastructure is very important.
          MR. THADANI:  Yes.
          CHAIRMAN JACKSON:  It is also the reason for the
fact that I am explicitly asking you to address the
implementation and monitoring issues within the context of
the maintenance rule and so forth in the next briefing.
          We, the Commission, owe you decisions on the
policy issues as soon as possible; for example, the one we
have been discussing, the use of the Commission safety goals
for plant-specific applications. 
          Then, as long as we understand what is really
needed or how far one can go in the use of these
methodologies for a given decision, then I think we will
start out on more solid ground.
          So, unless my fellow Commissioners have any
further comments, we are adjourned.
          [Whereupon, at 3:42 p.m., the briefing concluded.]