                  UNITED STATES OF AMERICA
                NUCLEAR REGULATORY COMMISSION
                             ***
             BRIEFING ON PERFORMANCE ASSESSMENT
                PROGRESS IN HLW, LLW AND SDMP
                             ***
                       PUBLIC MEETING
                             ***
                              Nuclear Regulatory Commission
                              Commission Hearing Room
                              11555 Rockville Pike
                              Rockville, Maryland
           
                              Thursday, May 15, 1997
           
          The Commission met in open session, pursuant to
notice, at 2:07 p.m., the Honorable SHIRLEY A. JACKSON,
Chairman of the Commission, presiding.
COMMISSIONERS PRESENT:
          SHIRLEY A. JACKSON, Chairman of the Commission
          KENNETH C. ROGERS, Member of the Commission
          GRETA J. DICUS, Member of the Commission
          EDWARD McGAFFIGAN, JR., Member of the Commission
          NILS J. DIAZ, Member of the Commission
           
STAFF AND PRESENTERS SEATED AT COMMISSION TABLE:
          JOHN C. HOYLE, Secretary
          KAREN D. CYR, General Counsel
          JOSEPH CALLAN, EDO
          MALCOLM KNAPP, NMSS
          JOHN GREEVES, NMSS
          NORM EISENBERG, NMSS
 
                    P R O C E E D I N G S
                                                 [2:07 p.m.]
          CHAIRMAN JACKSON:  Good afternoon, ladies and
gentlemen.  Today, the Commission will be briefed by the NRC
staff on its performance assessment program, which covers
three technical areas that are of great importance to the
Commission.  These areas are low-level radioactive waste
disposal, high-level radioactive waste disposal and site
decommissioning.
          The staff made it clear at last year's Commission
briefing on this subject that developing a performance
assessment model in any one of these three technical areas
is a complex and challenging task.  I remember your very
informative briefing, Mr. Eisenberg.
          However, the development of high-quality
performance assessment models for low- and high-level waste
and site decommissioning would enable the Commission to
obtain significant quantitative and qualitative input for
making risk-informed regulatory decisions on these matters. 
But we also understand that performance assessment is more
than risk assessment.
          The Commission is looking forward to hearing the
new developments in the performance assessment program as it
relates to radioactive waste disposal and SDMP sites.  If
none of my colleagues have opening comments, Mr. Callan, why
don't you proceed.
          MR. CALLAN:  Thank you, Chairman.  Good afternoon.
          Chairman, you covered the points I was going to
make in my opening remarks.  I will just introduce those at
the table.  With me are Mal Knapp, the deputy director of
NMSS, John Greeves, the director of the Division of Waste
Management and, as you introduced him, Chairman, Norman
Eisenberg, the senior advisor for performance assessment who
works for John Greeves in his division and, as last year,
Norm Eisenberg will be the principal briefer.
          Norm?
          MR. EISENBERG:  Okay, thank you very much.
          If we could go to slide two, this is an outline of
the briefing.  I will begin by defining performance
assessment, just to get us all on an even footing.  Second,
because of the Commission focus on PRA, I will discuss the
similarities and differences between PRA and performance
assessment, which we feel is the manifestation of PRA in
waste management.
          Third, I will discuss for each of the Division of
Waste Management program areas the PA program recent
accomplishments and limitations that we have.  And, finally,
I will summarize.
          Performance assessment is a type of systematic
safety analysis that explores for a waste facility what can
happen, how likely it is and what the impacts of the
occurrence are.  In this regard, the performance assessment
is consistent with the Kaplan-Garrick risk triplet used to define
risk.  Performance assessment integrates information, number
one, across a wide variety of disciplines.  We go from
inside near the waste package all the way out to the far
field in the biosphere so we include disciplines such as
corrosion science, geochemistry, radionuclide transport,
hydrology, heat transfer, rock mechanics, the list goes on
and on.  In addition, PA integrates information across
program areas.  For example, design information, site
characterization information, analytical studies and, of
course, our bottom line is regulatory compliance.
          The term performance assessment as used in the
Division of Waste Management encompasses a broad range of
quantitative analyses applied to waste disposal facilities
and we try to match these analyses to the need. 
Deterministic bounding analyses are used most often but
probabilistic analyses are used for complex facilities or
issues like the high-level waste repository.
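          [Note:  the Kaplan-Garrick risk triplet referred to above
defines risk as the set of answers to "what can happen, how likely
is it, and what are the consequences."  In the usual notation,

          R = \{ \langle s_i, p_i, x_i \rangle \}, \qquad i = 1, \ldots, N,

where s_i is a scenario, p_i its likelihood, and x_i the resulting
consequence.]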
          CHAIRMAN JACKSON:  Let me ask you a question.  I
don't want to de-track you but perhaps Dr. Knapp or
Mr. Greeves can answer this question.  Can you give us some
examples of actual regulatory uses that have been made, if
there have been any, of performance assessment results and
do you have regulatory guidance documents in the performance
assessment area that, in fact, you make use of?
          MR. KNAPP:  I will turn to these gentlemen to talk
a little bit about the documents but you give me an
opportunity to talk about something we did when I was active
in this area over 15 years ago.  And that was when the
Department of Energy was actively investigating the Hanford
site, known as BWIP.  And we used performance assessment as
a basis for debating with them and I believe reaching
conclusions that we preferred over the analysis of
groundwater at BWIP.  That was a very early application of
some embryonic things that Norm has subsequently developed
in detail.  But that is one that comes to mind.  Although
you might not call it a formal regulatory use, in our
interactions with the Department of Energy in high-level, it
was a very useful tool.
          CHAIRMAN JACKSON:  It is coming to resolution on
some technical issues.
          MR. KNAPP:  Exactly.  I am sure they could have
other examples but I will turn to John to talk about it.
          MR. GREEVES:  I will try and be brief and give you
a couple of examples.  One, Norm is going to talk about the
branch technical position in low-level waste.  That is an
example of guidance level use in a regulatory format.
          Another one is, as you know, with the
decommissioning rule, there is a whole set of guidance that
needs to lay underneath of that.  The staff is working on
that guide presently and our goal is to have it available in
a timely way.  So that is another example.
          And a third, it might not be quite the example you
were looking for but DOE has completed eight performance
assessments on ten of their sites and they have a
headquarters review group that is, in fact, performing a
regulatory function at this point in time in terms of
reviewing the performance assessment that was conducted at
the site.
          I am not as familiar as I would like to be with
that process but I am told it has many of the elements that
we are using and I will just finish with one.  We have got
the West Valley project facing us in the future so I see
these tools being applicable to that in a regulatory
environment where we have a direct role.
          CHAIRMAN JACKSON:  Could you speak to the issue of
the regulatory guidance documents and the extent to which
they either exist or are being developed?
          MR. GREEVES:  As far as the regulatory guidance
documents, the principal one that I would point to is the
branch technical position.  We put out a draft of that in
'94, we have been working on it since and you are going to
hear Norm talk about it in terms it is about ready to go out
the door.
          Norm, can you add any other regulatory guidance?
          MR. EISENBERG:  Well, there is the --
          MR. GREEVES:  The high-level waste material, the
expert elicitation documentation.
          MR. EISENBERG:  Right.  Then regulatory decisions
have been made on specific cases.  I don't know if you
recall but last year we talked a little bit about the Curtis
Bay facility and as I understand it, either in a few weeks
or a few weeks ago they had a public meeting because they
are going to take the site off the list.  So there are
real-world regulatory decisions being made.
          CHAIRMAN JACKSON:  Thank you.
          MR. EISENBERG:  Okay, I think we are on slide
four.
          Some points of comparison between performance
assessment and PRA.  Both are types of safety system
analysis and have very similar analytic structures.  I will
say a little bit more about that later.
          Although PRA is used as a complement to
deterministic requirements for reactor regulation,
performance assessment is used to demonstrate compliance
with regulatory requirements for waste facilities.  In
simple cases, it may just be a simple deterministic
analysis.  Both performance assessment and PRA generally
treat the same types of uncertainties:  those related to model
parameters, the models themselves, and future states of the
system, or scenarios.
          Performance assessment and PRA integrate risks
from likely and unlikely events and both methodologies are
adaptable to the nature of the problems studied.  However, I
think, the differences from site to site for commercial
nuclear reactors are much less than the differences from site
to site for waste facilities.  There is a lot more
variability.
          CHAIRMAN JACKSON:  Are there places where the
actual models overlap?  I mean, have you ever used similar
models or the same types of models?
          MR. EISENBERG:  Certainly in some cases,
especially in the area of doses, certainly the fundamental
methodologies are similar.  However, they must be adapted
for the case at hand so we couldn't, for example, use the
CRAC code to analyze the volcano extruding waste into the
atmosphere from a repository because a very important part
of the problem was the interaction of the waste with the
magma and the dynamics of the ash plume migration.  That is
something that isn't in CRAC so it wouldn't do a good job on
that.
          CHAIRMAN JACKSON:  I guess what I am really
asking, more in terms of kind of probabilistic
distributional assumptions, updating the distributions and
how things are parameterized, that kind of thing.
          MR. EISENBERG:  I think there may not be a whole
lot of similarity in that regard.  As I will point out
later, both methods use Latin hypercube sampling in order to
do a whole --
          CHAIRMAN JACKSON:  Which I am going to ask you to
define.
          And, lastly, two quick questions, you know, is most of
our performance assessment work done in house?  And how many
experts such as yourself do we have on staff?
          MR. EISENBERG:  For high-level waste, of course,
we share resources with the center and they make a great
many contributions to our efforts.  For low-level waste, we
have used some other contractors to help us but there was a
large effort internally.  For SDMP work, we are also getting
outside contractor help but a lot of work is being done in
house.  In fact, the case work is largely done in house.
          MR. GREEVES:  I'd like to point out that a number
of the staff sitting here with us are the people we rely on
in terms of doing this performance assessment, right here.
          CHAIRMAN JACKSON:  So we better not have anything
happen to them.
          MR. GREEVES:  We can't afford to have anything
happen to them.  I would say there is a large amount of the
work being done in house and I would also like to point out
that research is doing performance assessment.  They are
developing some tools for us.  So there is a lot going on
and I am pleased with it, I would like to see more of it. 
But we will do what we can with the resources we have.
          CHAIRMAN JACKSON:  So six people?
          MR. EISENBERG:  No, I think Keith McConnell is the
section leader for performance assessment and I think there
are nine or ten people in the section.  So that is a core
group.  But then we take advantage of talents elsewhere on
the staff as needed.
          COMMISSIONER DIAZ:  And the center, do we do
approximately 50 percent of the work in house? 
Approximately, a ballpark?
          MR. GREEVES:  I think it is more than half.  That
would be my assessment.  I could get back to you with a
better answer.
          CHAIRMAN JACKSON:  Thank you.
          MR. GREEVES:  It is basically the center is the
main contractor we use and, actually, we are trying to use
them in all three of these areas.
          CHAIRMAN JACKSON:  Okay.  Commissioner McGaffigan?
          COMMISSIONER McGAFFIGAN:  Could I ask in
comparison with other agencies, you mentioned DOE had done
some performance assessments.  Does EPA use performance
assessment?  You know, their NORM sites, the coal ash sites,
that sort of thing, or any other area?  Do they look at the
sort of -- develop the sort of models you guys use here?
          MR. GREEVES:  Again, my information is somewhat
limited.  Maybe Dr. Knapp would like to add to it.  But,
yes, they are doing things.  In fact, one of the tools we
are working on, EPA participates in the funding process for
one of those we are developing in house.  So I know they do
some of it.  Carl Paperiello speaks about it frequently, he
likes some of their codes.  But, Mal, you want to add to
that?
          MR. KNAPP:  I would say there is not an
unreasonable amount of overlap in terms of what we try to
do.  That has, again, gone on for years.  But the codes are
different.
          In general, I would say that EPA's codes tend to
be a little more generic and a little less site specific
than ours.  I would argue that ours tend to be a little more
realistic than theirs are but I suspect if there were an EPA
representative to my right --
          CHAIRMAN JACKSON:  We would hear it the other way.
          MR. KNAPP:  Exactly.  But the codes, we have a
number of codes that are somewhat in common that differ a
little because of our different missions and I think we talk
about them and occasionally debate them enthusiastically.
          As a matter of fact, yesterday in a meeting of
ISCORS, the group on radiation standards, we were talking
about how we were going to reconcile some of the differences
in codes.
          CHAIRMAN JACKSON:  Did I hear you say that EPA
funds some work together with us?
          MR. GREEVES:  We just had a briefing today on a
code development process that research has a lead on it and
it has been funded by a number of entities, including DOE,
EPA and NRC.  That funding profile looks a little bit like
the budget cycle in the last five or six years but it is a
valuable tool that I think we could be back giving you more
information on when it becomes more useful.
          COMMISSIONER McGAFFIGAN:  Could I clarify whether
realistic was a synonym for conservative or stringent?  I
have heard in this NORM case the EPA code is a code that we
would love -- well, I am not sure we would love to use
because we don't think it is realistic, we think it is too
liberal in its assumptions.
          MR. KNAPP:  I don't have a simple answer for that. 
It would depend on the code and the particular assumptions. 
Some, we view as more realistic.  Of those cases where we
feel we are more realistic, in some cases we think that EPA
may be nonconservative.  Some of our concerns, and I will
look to Norm and John to correct me, but in low-level codes,
we would say there are some areas where their more generic
codes developed earlier were less conservative.  But I
wouldn't say we are putting conservatism in.
          If you would like, I would argue that we are
realistic and they may be nonconservative.  But that might
be for two or three variables in the code and the fourth
variable it might be either way.  It is just there is no
simple answer but I would certainly ask either of them to
correct me or elaborate.
          MR. EISENBERG:  I think that's right.  There is
not an across-the-board, simple relationship.
          CHAIRMAN JACKSON:  Okay, why don't you proceed.
          MR. EISENBERG:  I think we are on slide five.
          I would like to talk a little bit about the
approaches in PA and PRA.  There are many shared approaches,
the structure of the analysis, both have a risk focus. 
Latin hypercube sampling was adopted by the PRA folks.  It
was developed in the waste program.  Latin hypercube
sampling is a type of stratified sampling.
          Instead of doing strict Monte Carlo -- well, there
will be a slide coming up, two slides, where we will talk
about doing sampling or propagating uncertainties for
consequence models.
          CHAIRMAN JACKSON:  Would it be better to wait,
then?
          MR. EISENBERG:  Okay, let's wait.  You convinced
me.
          Certainly, also, the categorization and treatment
of uncertainties.  However, there are some fundamental
differences between the systems analyzed in waste and the
systems analyzed in PRA for reactors.  There are differences
in approaches because of the differences in systems.
          For example, waste systems are largely continuous
and their components degrade in a continuous fashion.
          CHAIRMAN JACKSON:  Actually, so do reactors but
they are treated discretely.
          MR. EISENBERG:  Yes.
          CHAIRMAN JACKSON:  Some of them do.  That's my
statement.  You don't have to agree.
          Sorry to throw you off, but go on.
          MR. EISENBERG:  The waste systems have engineered
and natural components whereas the reactor is a largely
engineered system with natural events possibly impinging on
it.  The waste facility is often large and dispersed with
many similar components like waste packages, while the
reactor is a single system with major failure modes
affecting the entire system.
          For example, a single leaking waste package in a
repository of 20,000 may not be a major thing.  If you have
a loss of coolant in the reactor vessel, that's a problem.
          The mission time for the reactor, say 40 years, is
long compared to the time of development of the
consequences of a reactor accident, say hours to days;
whereas, for a waste system, the mission time, say 10,000
years, is comparable to the time of development of
consequences, which is also thousands to tens of thousands
of years.  Thus, for the waste facility, one failure mode,
say waste package corrosion, will be overlapped by other
failure modes such as an earthquake.
          For a reactor, these multiple events occurring
together are so unlikely they are generally left out of the
analysis with good cause.
          Finally, the waste facilities are largely passive
while the reactor has many active redundant safety systems.
          CHAIRMAN JACKSON:  Let me stop you for a second. 
You are going to talk about how uncertainties are accounted
for in the decisionmaking process at the same time when we
talk about the Latin hypercube sampling.  Is that what you
promised?
          MR. EISENBERG:  One slide later.
          CHAIRMAN JACKSON:  Okay.  And let me just ask one
last question.  Can you talk a little about how passive
systems are treated probabilistically in performance
assessment and would you venture a statement as to whether
that approach would also work for passive reactor systems?
          MR. EISENBERG:  We tend to treat the passive
systems and their behavior in the consequence analysis
because that is really the essence of the waste system
behavior.  We treat some -- but, of course, we include
uncertainties as we will see in a minute.
          But in terms of probabilistic treatments in terms
of conditions that occur or don't occur, these are treated
similar to the way external events are for the reactor
analysis.
          We are working on a problem and I hope we are
successful trying to develop importance measures for the
waste system.  Since we don't have a strong embedding in
fault tree and event tree analysis, we can't take full
advantage of the current methods, the pressure vessel type
importance, things like that, and we are trying to develop
some other methods.  If we are successful, they might be
applicable to some of the passive systems in the reactor
business also but we are not sure we will succeed.
          CHAIRMAN JACKSON:  Thank you.
          MR. EISENBERG:  Okay.  The next figure shows the
sequence of analysis for PRA and performance assessment and,
as you can see, they are quite similar.  The components of
the analysis are similar but they are not identical and if I
could just take one as an example, the source term analysis
in a level two PRA deals with phenomena relating to
migration of radionuclide material from a damaged core to
outside the containment structure and those phenomena could
include high-temperature chemical reactions of the corium,
plate-out inside the containment building, and leakage
through penetrations in the building.
          For performance assessment, the facility source
term involves or could involve corrosion of the waste
package, chemical conditioning of the water coming in
contact with the waste, dissolution of the waste, and all of
that occurs at relatively low temperatures compared to what might go
on inside a reactor vessel.  So the structure is quite
similar but the components we use in each facet are
different.
          Okay, here comes -- I can't avoid it any longer.
          This slide attempts to show how a linked chain of
performance assessment consequence models with uncertain
inputs produces a distribution of performance for the
system.  In other words, this is variability.  Treatment of
uncertainty in the inputs is propagated to a distribution of
outputs giving you a measure of uncertainty in the output.
          Just for example, model A could represent the
source term with uncertain inputs for things like corrosion
rate, the solubility limit, flux into the waste package. 
Model B might be transport of radionuclides into
groundwater with uncertain inputs for porosity, permeability
and groundwater flux.  Model C could represent biosphere
transport with uncertain variables representing foodstuff
intake, irrigation rate, variables like that.
          By sampling these input parameters repetitively,
one runs the whole chain of models and gets an estimate of
performance related to that particular choice.  When you do
this hundreds or thousands of times, you get a distribution
of performance for the entire system.  One way to do that is
to do strict Monte Carlo sampling where the input parameters
are chosen randomly.
          We used a method called Latin hypercube sampling
which partitions each distribution into segments that are
equally probable and we sample from them without
replacement.  So we have a whole routine to go through in
order to generate samples for the inputs that give us these
variable outputs.  The advantage is that you are assured of
covering the entire probabilistic regime with the
appropriate probability weights in a much more economical
fashion than doing strict Monte Carlo sampling which, of
course, is going to sample a whole lot around the mean.
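          [Note:  a minimal Python sketch of the sampling scheme just
described -- stratified (Latin hypercube) samples of uncertain inputs
propagated through a chain of consequence models.  The model forms,
parameter names, and ranges are illustrative assumptions, not the
staff's actual codes.]

import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_params, rng):
    # Split [0, 1) into n_samples equally probable bins for each parameter,
    # draw one point per bin, then shuffle each column independently.
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_params))) / n_samples
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])
    return u

# Hypothetical chained consequence models (A: source term, B: transport, C: biosphere).
def model_a(corrosion_rate, solubility):
    return corrosion_rate * solubility        # release rate from the waste package

def model_b(release, permeability):
    return release * permeability             # flux reaching groundwater

def model_c(flux, intake):
    return flux * intake                      # surrogate for individual dose

n = 400
u = latin_hypercube(n, 4, rng)
# Map the stratified uniforms onto assumed input distributions (illustrative).
corrosion = 10.0 ** (-6.0 + 2.0 * u[:, 0])    # log-uniform over two decades
solubility = 10.0 ** (-5.0 + 2.0 * u[:, 1])
permeability = 10.0 ** (-3.0 + 2.0 * u[:, 2])
intake = 0.5 + u[:, 3]                        # uniform on [0.5, 1.5)

dose = model_c(model_b(model_a(corrosion, solubility), permeability), intake)
print("mean dose:", dose.mean(), "  95th percentile:", np.percentile(dose, 95))

          [With the same number of runs, strict Monte Carlo would simply
draw u = rng.random((n, 4)); the stratification is what spreads the
samples over the whole probability range.]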
          CHAIRMAN JACKSON:  Do you perform sensitivity
studies?
          MR. EISENBERG:  Once we have the distribution of
outputs and we have the distributions of inputs, we can then
look for correlations which translate into sensitivities. 
It is kind of a global type of sensitivity rather than
sensitivity in a particular point in this multi-dimensional
parameter space.
          CHAIRMAN JACKSON:  And how do you identify the
dominant contributors to risk for a given scenario?
          MR. EISENBERG:  We have so far looked at these
correlations and the ones that come up as having the
greatest influence or the highest effect on the output are
deemed the ones that are most important.
          I should say in that regard that the same variable
might appear as an input to different models or the inputs
to the models might be correlated.  So in what I just said,
the irrigation rate is related to the rainfall.  Well, the
infiltration of water into the waste package is related to
the rainfall too.  So you have these correlations and they
have to be taken care of and considered in the sampling.
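          [Note:  a sketch of the correlation-based sensitivity screen
just described, continuing the example above and reusing its input and
dose arrays; rank (Spearman) correlation is used here as one assumed
choice of correlation measure.]

from scipy.stats import spearmanr

inputs = {"corrosion": corrosion, "solubility": solubility,
          "permeability": permeability, "intake": intake}
for name, values in inputs.items():
    rho, _ = spearmanr(values, dose)          # correlation of each input with the output
    print(f"{name:12s} rank correlation with dose: {rho:+.2f}")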
          CHAIRMAN JACKSON:  What is the role of expert
opinion?  And then I am going to defer to Commissioner
McGaffigan.
          MR. EISENBERG:  Well, I would say expert judgment
is used to generate the distributions, the probability
distributions for the various parameters, the PDFs.  I think
it is -- you could almost take it as a given that in the waste
business these parameters are rarely, if ever, measured
directly.  These are almost always inferential measurements
so that if you are interested in the porosity of a geologic
unit, one thing you can do is take a piece of core, take it
into the lab, push water through it and see what its
permeability is.
          Well, you can also pump water into it and see how
much water you can pump into it with a given back pressure. 
That is another way.  So there are all different ways to
approach these things and these various lines of evidence
have to be integrated and expert judgment plays a strong
role --
          CHAIRMAN JACKSON:  -- role in that integration.
          MR. EISENBERG:  And interpreting these fundamental
data into the inputs for the performance assessment.
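          [Note:  one common way the kind of expert judgment described
above is turned into an input PDF is to fit a distribution to elicited
quantiles.  A minimal sketch, assuming a lognormal fit to an elicited
median and 95th percentile; the parameter name and numbers are
illustrative, not elicited values.]

import numpy as np
from scipy.stats import lognorm

median, p95 = 0.10, 0.25                 # elicited values for, say, effective porosity

mu = np.log(median)                      # lognormal median = exp(mu)
sigma = (np.log(p95) - mu) / 1.645       # 95th percentile = exp(mu + 1.645 * sigma)
porosity = lognorm(s=sigma, scale=np.exp(mu))

samples = porosity.rvs(size=400, random_state=0)   # feeds a sampling scheme like the one above
print("fitted 5th/50th/95th percentiles:", porosity.ppf([0.05, 0.5, 0.95]))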
          CHAIRMAN JACKSON:  Commissioner McGaffigan.
          COMMISSIONER McGAFFIGAN:  I am not sure this is
the right time to ask this but Latin hypercube sampling
versus Monte Carlo, what biases get introduced?  I mean, I
understand Monte Carlo.  I haven't studied Latin hypercube.
          If you have the same graphs for the various models
and you run it, how close do they come to each other?
          MR. EISENBERG:  If you are able to take enough
samples, and there is an art to determining how many
samples, and in fact sometimes what we do is we run 200 and
then we run 400 and if the answer doesn't change very much
we say, well, 200 is good enough.
          But the answer is, if done properly, there should
be no biases introduced because you are using the same
probability distributions; you are just using an economical
way of sampling.
          CHAIRMAN JACKSON:  It's the sampling, right.
          MR. KNAPP:  One of the ways I think of it is if
you use Monte Carlo with the same numbers as Latin
hypercube, there is a risk that when you generate your final
response surface there might be some holes in it.  Just by
virtue of where you picked your initial parameters there is
an area where you don't have very much data.
          Whereas, by Latin hypercube, you would tend to get
a better distribution among your input parameters so you
would have more confidence in the response surface for the
same number of runs.  And runs can be in the -- not now as
much as they used to be.  But they can be expensive.  So
anything you can do to run fewer runs and still have high
confidence is very valuable.
          That was the basis for the development of
Latin hypercube.
          CHAIRMAN JACKSON:  But now with workstations --
          MR. KNAPP:  You know, it is interesting that 20
years ago we had models that were right on the ragged edge
of what workstations could do, and 10 years ago as well.  So even
today, as the models are developed with, I think, some very
good results from what has been done in the last year or so,
Latin hypercube is still a valuable asset.
          CHAIRMAN JACKSON:  Let me ask you two last
questions.
          Can you describe the peer process for your
performance assessment program?  Peer review.
          MR. EISENBERG:  Well, of course, let me talk about
high-level waste.  There, we have usually a team of
performance assessment analysts plus other required
disciplines that are doing the study.  Quite often or
universally before it goes out it is distributed to
other staff that have not been involved for internal peer
review as well as the normal management review.
          We have always issued our performance assessments
for review in public and we have published papers on it in the
peer-reviewed literature.
          CHAIRMAN JACKSON:  And perhaps either Dr. Knapp or
Dr. Greeves, you know, NMSS has done considerable work on
expert elicitation in the high-level waste program and so
you have obviously developed a strong knowledge base.  Have
you passed along any of this to NRR and Research?
          MR. KNAPP:  My understanding is that we have and
are but I would certainly turn to John and Norm to talk
about specifics.
          MR. EISENBERG:  Yes.  First of all, when we were
developing the BTP on expert elicitation, we passed it
around to all the other offices in the agency and we
certainly have made it available to them and they indicate
that they use it as it seems to apply in their work and we
of course participate in the PRA coordinating committee and
the subject comes up there and is discussed.
          MR. GREEVES:  The real key is DOE is using this
process, as they indicated this morning.  And it has, I
think, been a valuable tool.
          CHAIRMAN JACKSON:  Thank you.
          MR. GREEVES:  I think Norm probably has that later
in a slide, too.
          MR. EISENBERG:  Okay, if I could just make one
more point on slide seven --
          CHAIRMAN JACKSON:  You're taking a chance.
          [Laughter.]
          MR. EISENBERG:  I would say that a distribution of
the output is produced and it may be some normalized
releases or individual dose, whatever.  But because the
uncertainties are large for many waste facilities, some
realizations will exceed the regulatory limit, almost
always.  This means that the staff has to provide reasonable
but protective limits for compliance and must use
appropriate statistical criteria to determine compliance,
given that the performance of the system is represented as a
distribution.
          For example, in the low-level waste PA BTP,
compliance is based on the mean of the distribution of dose
plus there is a cap on the ninety-fifth percentile of the
dose distribution.  So that is the kind of thing we will be
facing here and, I expect, for decommissioning in the
future.
          Slide eight.
          The evolution of PA as a programmatic tool can be
described in four stages.  Method development began in the
mid-'70s with the same group at Sandia National Labs that
was doing the pioneering work on the reactor safety study. 
From that, we got insights into the repository system and it
helped to formulate Part 60.
          Second, we entered a demonstration of capability
phase in the mid-'80s to early '90s.  It helped to identify
R&D needs and the need for integration across the various
disciplines.
          We are now in a mode of applying PA to high-level
waste.  It is an integrated technical basis for interactions
with the Department of Energy.  It is an input to rule
development and it is helping to set NRC program priorities.
          Another stage that we entered very recently is that the
high-level waste tools and methods have been adapted for
other waste applications and those include things like the
low-level waste performance assessment working group which
started in 1990 and continues to now, the development of the
PA branch technical position on low-level waste,
demonstration of the test case and some applications in SDMP
which began in 1995.
          Slide nine, please.
          Performance assessment supports the Division of
Waste Management mission in decommissioning, low-level waste
and high-level waste.  For decommissioning, the goal is
evaluation of options for remediation and decommissioning. 
For most cases, simple analyses suffice.  For complex sites,
we perform analyses to support the NEPA process.
          For low-level waste, we are providing guidance and
support for state regulators and are attempting to maintain
an NRC review capability.
          For high-level waste, our focus is the proposed
Yucca Mountain repository, of course.  Two main areas of
activity are analyses to support high-level waste
regulations including those for interactions with EPA and
interactions with DOE on important stages of the program,
viability assessment, recommendation of the site to the
President and, of course, licensing.
          Slide 10, please.
          Now, I am going to begin the description of
performance assessment in the three Division of Waste
Management program areas.  For decommissioning performance
assessment, we try to fit the analysis method to the problem
at hand.  As I have said before, simple analyses for simple
problems, complex tools when we have to.
          We do a probabilistic treatment, to the extent
appropriate, of the source term which, as I mentioned before, is
highly variable for the decommissioning sites, of
environmental transport and of the dose calculations.  The
decommissioning PA, because it is involved in NEPA, has to
consider chemical as well as radiological effects.
          Slide 11.
          Recent progress and plans in decommissioning PA. 
We have a draft methodology for performance assessment
applied to SDMP sites, we got a draft methodology in January
and will be briefing the ACNW next week.  This methodology
includes the ability to evaluate sites under the new
decommissioning rule, that is, those sites which will not be
releasable for unrestricted use.
          We have a very preliminary analysis of the no-
action alternative for Sequoyah Fuels and Sequoyah is being
used as a test case to evaluate this draft methodology.
          We have published a draft EIS for the Shieldalloy
site in 1996.  There is a public meeting scheduled for
September and a preliminary final environmental impact
statement for July.  This is an example of the failure of
institutional controls.
          We have a preliminary analysis for Parks Township
with a plan to publish the draft EIS in July of '97 and the
final in March of '98.  That analysis includes a
probabilistic analysis of the well location for an intruder.
          MR. GREEVES:  Which the staff is conducting.  It
is the staff making these calculations.
          MR. EISENBERG:  Slide 12.
          Now going to the scope for low-level waste PA, we
developed methods to treat uncertainty, especially the
propagation of parameter uncertainty, developed process
level models to describe the performance of various
components and, for low-level waste, the unique thing
is the engineered components such as cells and covers.
          We have a flexible overall performance assessment
methodology.  It is an iterative approach that links site
characterization, design and performance assessment. 
Individual dose is the compliance end point and to date it
has been applied only to hypothetical sites and designs.
          Recent progress and plans, the draft BTP on low-
level waste performance assessment will be ready for
public -- issued for public comment momentarily.  We
have assisted in the reviews of the Nebraska low-level waste
state regulatory program and plan to participate in the
IMPEP review for Texas in June and provided assistance as
called on by states for their low-level waste regulatory
program.
          Slide 14.
          The scope for high-level waste PA, we have to
treat both the undisturbed repository and disrupted
repository with associated probabilities.  We have to
consider the entire chain of consequence models, if you will
recall the earlier chart.  In some decommissioning work you
can get away with a subset if you make very conservative
bounding assumptions about pieces of the system.  In high-
level waste, we have to look at everything.
          We do a probabilistic treatment for model
parameters and future states which, of course, lead to
scenario classes, things like climate change, earthquakes
and volcanism.
          The potential regulatory changes, that is a new
standard based on the current law or on proposed
legislation, may reorder the importance of subsystems and
issues.  For example, currently proposed legislation and the
NAS recommendations both would treat human intrusion as a
separate stylized calculation whereas the current standard
causes it to be incorporated into the distribution of total
system performance.  This reemphasizes the need for flexible
quantitative performance assessment methods.
          Slide 15.
          We have two slides of recent progress and plans.
          In May 1996, we had a technical exchange on DOE's
latest total system performance assessment, TSPA-95, which
resulted in general agreement on the importance of
infiltration and establishing the basis -- having a strong
basis for mixing depth assumptions in the dilution analysis.
          We have reached general agreement with DOE on the
use of expert elicitation.  The branch technical position
was published in November of '96.  DOE adopted the BTP for
their work, as they indicated in their statement that they
gave today.
          NRC and center staff have been observing the
ongoing expert elicitations on volcanism, seismic hazard,
unsaturated flow and other topics as they are progressing.
          CHAIRMAN JACKSON:  Now that the DOE has come out
of the mountain, they have completed their principal
tunneling, I don't know what the status is of the various
alcoves within the ESF but have you been able to use any of
the site-specific data in your models?
          MR. EISENBERG:  Yes.  Maybe the best example is
that there has been a long-running controversy over what the
infiltration rate is in the Yucca Mountain repository and
whether the flow from the surface is localized in fractures
or whether it is spread out in the matrix, and the chlorine-36
measurements in the tunnel seem to indicate that indeed
there are areas where the flow is focused.
          CHAIRMAN JACKSON:  So you have been able to
actually fold that into your model?
          MR. EISENBERG:  Yes.  And as more information is
available, they are the principal processors of the
information.  But as it becomes available to us, we fold it
into our models.
          They have also been gathering information on the
structure of the geology in the region, as evidenced also
underground, and that has also been folded in.
          Okay.  Slide 16.
          In another area of progress, we have
provided analyses to EPA for evaluating the implementability
of draft rules.  We had information exchange meetings last
spring.  Some summary analyses of this implementability
nature were published in the annual report for high-level
waste which you heard about yesterday and Wes Patrick spoke
to that.
          We expect to more fully document these analyses in
a NUREG coming this year.
          Finally, we have a more user-friendly total system
performance assessment code.  It facilitates use by a
broader segment of the staff.  We have a beta testing
version that was delivered on March 17.  It is under review. 
We are currently running the TSPA code, or total system
PA code -- we call it the TPA code -- on a Sun workstation.  It
was formerly run only on the Cray supercomputer and we
anticipate that this rapid local response will facilitate
the analysis.
          CHAIRMAN JACKSON:  This is the code developed by
the center?
          MR. EISENBERG:  Well, it was developed jointly by
the center and the NRC staff.
          Okay, moving into the summary and the look
forward, first some generic points.
          Guidance on the use of performance assessment,
which is, as I claimed, the waste management version of PRA,
will consider the complexity, the safety issues, the
availability of the data and the capabilities of the
licensees and I believe that is consistent with the PRA
implementation plan.
          We will continue a program of PA training for the
NRC staff.
          We have teamed new hires with experienced PA
staff.  The staff as well as the tools which are the
computer codes and the computer facility are essential
ingredients to provide a technical basis for making risk-
informed regulatory decisions in the entire waste management
program.
          Declining funding is a challenge which we are
trying to address by the use of more powerful computing
tools and enhanced staff training.
          Now, for some specifics, in decommissioning, we
have tried to achieve a degree of optimization by applying
the staff experience in PA, both high level and low level,
to the complex decommissioning sites.  Under the current
regulatory regime and the current schedule, PA is providing
analyses for about a third of the complex sites requiring
site-specific environmental impact statements.  The
remainder of the sites are on backlog.
          Low-level waste --
          CHAIRMAN JACKSON:  They are on backlog in terms of
your being able to provide the analysis.
          MR. EISENBERG:  That's right.
          CHAIRMAN JACKSON:  Because of your resources.
          MR. GREEVES:  Let me jump in a little bit.  As you
know, there are a large number of these sites.  Fortunately,
not all of them are the "large" sites but nominally we have
about 13 large sites.
          We are working aggressively on Parks Township, you
have heard about that, we have briefed you on that.  The
Sequoyah Fuels site, we are actively looking at making that
a trial run on the test case.  Another one, the West Valley
site is going to be challenging us early on.
          The point is, we have 14 of these sites and we are
working aggressively on three.  I don't know how we would
handle any more than that at any one point in time.  So
these others are there.  The owners of these sites are
developing their plans.  If they all come in at once, that's
the problem.  So we are pretty thin.
          CHAIRMAN JACKSON:  You can say that again for the
record.
          MR. GREEVES:  The staff is thin.  Stretched.
          [Laughter.]
          MR. GREEVES:  I would be happy to go further.
          CHAIRMAN JACKSON:  No, that's fine.  I think
you've gone far enough.
          [Laughter.]
          MR. EISENBERG:  On low-level waste, we plan to
respond to comments on the BTP and finalize it as resources
permit, pursuant to the direction-setting issue number five.
          COMMISSIONER DICUS:  Could I ask you a question
about that?  Given the fact that "as resources permit" is
a qualifier, is that on track or is that going to be
delayed?
          The states, I understand, were rather critical
about that.
          MR. GREEVES:  Yes, there are some states that have
been critical of it.  I think the view is mixed.  But even
at the recent low-level waste forum meeting, a number of the
states said, where is it.  In fact, they would love to have
had us come in, some of them, would love to have us come out
and brief at this forum meeting.
          But when you are working on these sites over here
and people are putting in extra hours, it is hard to keep up
with the BTP.  I would have hoped that we could have gotten
it out before this and called it a product.  It is right
there in terms of going out through the door.
          But an example, to answer your question, there was
also the test case which we wanted to go with the BTP.  We
can't get the test case done.  The documentation is showing
how to implement the branch technical position.
          CHAIRMAN JACKSON:  How far behind are you on that,
roughly?
          MR. GREEVES:  In which?
          CHAIRMAN JACKSON:  When do you see yourself
getting to the point of --
          MR. GREEVES:  I don't see us finishing the test
case.
          CHAIRMAN JACKSON:  You don't see finishing it at
all?
          MR. GREEVES:  Keith?
          CHAIRMAN JACKSON:  Can you talk about the test
case again, Dr. Greeves?  They want to --
          MR. GREEVES:  Could I ask Keith to come up to the
table?  He is more familiar with it --
          CHAIRMAN JACKSON:  Sure.
          MR. McCONNELL:  My name is Keith McConnell.  I am
the section leader for performance assessment.  I would say
right now we are several months behind on the BTP itself.  The
original SRM that we got asked us to come back to the
Commission in August of this year and we are just about
ready to go out for public comment, as I have said, and we
are looking at a 90-day public comment period.  We expect,
as people are aware, a significant number of comments and
some tough issues to address before we come back to both the
ACNW and the Commission.
          The test case, we have used it, we have used the
results of it quite a bit but we are not staffed to do the
documentation.  I would say that extends into fiscal year
'98.
          MR. GREEVES:  This is a program that had 10 to 20
FTE back in the early '90s and with the DSI-5, we are at
four FTE and, frankly, the West Valley site came in on top
of us and it is a real challenge.  So it is going to have
first bidding in terms of these kind of resources.
          Mal, do you want to --
          MR. KNAPP:  The only thing I would say is I don't
want to belabor issues which we visited in strategic
assessment, recognizing things like low-level waste sites by
and large are being developed in agreement states,
recognizing what our limitations are and what we are trying
to do.
          I think what we have here is responsive to the
spirit of the SRM on DSI-5.  That is the best I can do.
          COMMISSIONER McGAFFIGAN:  The test case, is that a
real case or what do you mean by a test case?  I am just not
familiar.  Is it a low-level waste site, a real site, that
you are applying the BTP to?
          MR. McCONNELL:  It is a hypothetical site.
          COMMISSIONER McGAFFIGAN:  It is a hypothetical
site?
          MR. McCONNELL:  It is a hypothetical site for
humid conditions with a realistic source.
          MR. KNAPP:  Understand it is hypothetical
deliberately because, were we to pick a real site, there
would be implications with our results.
          CHAIRMAN JACKSON:  Okay.
          MR. EISENBERG:  Okay.  Slide 19.
          Clearly, in high-level waste we have moved from a
demonstration of PA capability to heavy usage of it in
support of programmatic goals.  We will invest in
refinements to our computing capability, especially our
total performance codes, only to the extent that such
refinements are expected to have a significant impact on
performance.
          Our near-term focus for performance assessment is
developing a technical basis for the new high-level waste
rules.  Early feedback to DOE on the total system
performance assessment for the viability assessment, we have
been doing this.  We have been attending abstraction
workshops and expert elicitations.  We plan a technical
exchange with DOE, as you heard earlier today, in July on
their approach for the total system performance for
viability assessment and we expect to receive the PA for
viability assessment in September of '98.  At the requested
budget level, we plan to use PA to help prioritize our KTIs.
          CHAIRMAN JACKSON:  Commissioner Rogers?
          COMMISSIONER ROGERS:  Yes.
          To what extent are licensees able to use
performance assessment for decommissioning, particularly
those sites which are not the biggies that represent an
organization with a lot of capability and a lot of
resources?  Is it a tool that is actually useful for
licensees to adopt or is it just something that we have to
sort of retain for our own purposes and share basic
conclusions from it with licensees?
          In other words, can they -- you know, it seems to
me this has taken us a long time and a lot of hard work to
get where we are and I don't quite see how a garden variety
licensee could do it at all and so how do we couple that
into the activities of licensees other than to evaluate
their plans when they have gotten them together?  Is there
any way this can be used to assist them in analyzing their
own sites?
          MR. EISENBERG:  Well, remember that we are using
performance assessment in a programmatic sense that covers a
wide variety of analytic techniques and at various levels of
complexity so I would tend to agree that, for some
facilities, it is inappropriate and the licensees would not
use these very complex tools nor would they try to maintain
a capability to do so.
          However, for some of the sites, they are doing
very complex analyses because that is what the problem calls
for.
          COMMISSIONER ROGERS:  I guess what I am trying to
get at, is there any way that this very specialized
expertise that we are developing here now could somehow or
other be partly digested and fed to licensees so that they
could use bits and pieces of it?  In other words, through
some kind of guidance or some technical reports or something
that they could actually use?
          MR. GREEVES:  Let me try for a minute.
          This is what I view as a graded process.  There
are a number of licensees out there now that use the RESRAD
approach.  RESRAD is a code, it's out there, DOE does
briefings, people go around the country.  And our staff uses
RESRAD.  I think what Norm was portraying was, use the
simplest tool you can and if you can satisfy the criteria
then you're out.
          So we have sites where we run the RESRAD code, the
licensees, they are capable of running that, many of them,
if they hire the right consultant.  So that one is a fairly
first-level type of approach.
          Separately, we mentioned earlier on, we got this
decommissioning rule that we are expecting to deal with and
we are putting in place guidance on how to deal with that. 
So the goal is, within a period of time, that we would have
some tools out there, and you probably heard about the D&D
code that would allow people to determine what the
concentration is, if they have a single isotope and is there
a way we can come up with a concentration for that.  And
then it gets more complicated as you go on.  So I think we
have between three and four levels.
          Most of what Norm was talking about here today was
the fourth level, the third and the fourth level.  It is
more complicated.  The probabilistic distributions, et
cetera.  There are a couple of levels above that that we do
need to get out to the licensees that I think, you know,
within the next year we will have some of that.  There are a
few tools like RESRAD out there now and, also, under the
action plan, the licensee could come in and use a 30
microcurie per gram uranium criterion.  He doesn't have to do a
performance assessment.  What he has to do is go in and do a
survey and show he is under 30 microcuries per gram of
uranium or 10 microcuries per gram of thorium.  So it is
what I call a graded approach to a regulatory process and I
think it will be better after we get this guidance out that
is coming underneath --
          CHAIRMAN JACKSON:  So you are working on this?
          MR. GREEVES:  Yes, we are.  And I would expect the
next time we brief you, we will include that.
          CHAIRMAN JACKSON:  Have more to say?
          MR. GREEVES:  Excuse me?
          CHAIRMAN JACKSON:  You will have more to say on
that?
          MR. GREEVES:  Yes.
          CHAIRMAN JACKSON:  Commissioner Dicus.
          COMMISSIONER DICUS:  I understand that there have
been some differences in the use of PA between DOE and NRC,
particularly like waste package lifetime.  I wonder if those
differences still exist and, if so, are they particularly
significant?  Are we going to try to resolve them?
          MR. EISENBERG:  Well, of course, we are trying to
resolve them.  This is part of the meetings that we attend
with them and the technical exchanges that we have with
them.
          I think, and Chairman Jackson asked the question
this morning, how the PAs for DOE and NRC compare.
          I would say that there are three areas that need
to be discussed in terms of comparability.  There are the
results of the performance assessment.  That is, the
estimates of performance of the system.  There is the
overall methodology and approach and how comparable those
are.  Then there are the specifics of the models and
parameters and assumptions that are used to describe the
various components of the system, so I think there are those
three levels.
          In terms of overall performance, generally we
have, and it is probably because it goes with the territory,
come up with higher doses, worse performance for the system,
than DOE.  Right now, based on the last analyses available,
it is one to two orders of magnitude for the average dose in
the undisturbed case.  So that is one answer.
          A second aspect of the answer is that the methods
that are used are quite comparable.  They share an awful lot
in common.  There is a little wrinkle regarding the
treatment of scenarios and how disruptive events are treated
but I expect we will be working on ironing that out with
them.
          The real differences start to arise in the various
areas of assumptions and parameter ranges and models and
one of the significant differences based on their most
recent analysis -- and, of course, we expect to see some
changes in the one for the viability assessment -- is that
they placed a lot of reliance on matrix diffusion.  That is,
they assumed that the flow in the fractures and radionuclide
transport in fractures were tied very heavily and integrated
with the flow and transport in the matrix.  That had the net
result of slowing down the migration of the radionuclides.
          We didn't.  We didn't assume -- we weren't so
positive.  So that is a difference that we need to see
worked out.  There has been an historical difference in
assumptions on infiltration rates and, as you heard this
morning, we are coming together.  They seem to be moving up
into the area where we have been.
          Another area of concern is the potential for
dilution and how much dilution you can take credit for at
Yucca Mountain and how the entire process works and the one
thing we also feel needs to be worked on is the consequences
of volcanism.  We have made a lot of progress in closing on
the probabilities but we need to maybe look some more at
consequences.
          So there are certain areas where there are
differences and we expect to continue to pursue that.
          CHAIRMAN JACKSON:  Commissioner Diaz.
          COMMISSIONER DIAZ:  On something that you said,
you said there is one to two orders of magnitude difference
between the dose assessments?
          MR. EISENBERG:  Between the expected dose in the
undisturbed repository case.
          COMMISSIONER DIAZ:  Can you give me an idea of
where they lie in absolute value?
          MR. EISENBERG:  TSPA-95 at five kilometers had
four-tenths of a millirem and NRC doing an analysis for one
of these EPA analyses got 23 millirem.
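          [Note:  using the figures just quoted, 23 divided by 0.4 is
about 58, or roughly 1.8 decades, consistent with the "one to two
orders of magnitude" characterization above.]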
          CHAIRMAN JACKSON:  That is a large difference.
          Commissioner McGaffigan.
          COMMISSIONER McGAFFIGAN:  You may regret making
the last remark you did on slide 7 but the essence of your
remark was when you are dealing with the performance measure
you end up with the distribution and you end up having to
regulate the mean and the 95 percent confidence interval. 
That set me to thinking about last week's briefings from NRR
and the various documents we are about to put out where we
are dealing with issues like 10 to the minus 6 probability
of core damage frequency and what that really means.
          Is there interaction between -- we were looking
for things like 95 percent confidence intervals.  In fact, I
think those words were used by the Chairman last week.
          Is it appropriate to use some of these same
techniques in NRR space that you are using in NMSS space?
          That may be the Chairman's question.
          CHAIRMAN JACKSON:  No, you didn't.  You are
following on my earlier question about sharing.
          MR. EISENBERG:  Well, certainly John Austin has
been participating in PRA coordinating committee and we have
been following what has been going on over there.  But
remember --
          CHAIRMAN JACKSON:  They have been following what
you have been doing I think is really more the question.
          MR. EISENBERG:  We go more directly to the issue
which is what is the dose.  You are using surrogates like
large early release frequency and things like that.
          CHAIRMAN JACKSON:  Right, but I think the message
is you are dealing with this issue of what it means to
regulate the mean at a certain confidence interval.  And
that is what we have been pressing in reactor space and I
think that may be -- he and I talk back and forth, but --
          COMMISSIONER McGAFFIGAN:  That is what I am trying
to get at.  It sounds like the technique you have come up
with here may have some application there in terms of
telling us something about confidence around means.
          MR. GREEVES:  This is part of the branch technical
position and we expect to get some comments on this too. 
The low-level waste branch is the one that is going to go
out and we may learn something through the comment process.
          COMMISSIONER McGAFFIGAN:  NRR might learn
something too.
          CHAIRMAN JACKSON:  Okay, well, thank you very
much.
          The Commission wishes to thank you for an
excellent and a very informative briefing.  Mr. Eisenberg,
you are setting very high standards here.  You can never
fall from this particular perch on your performance
assessment program.  And, as I had indicated earlier, these
are areas of great importance to the Commission.  Evaluation
of long-term performance of low-level waste disposal, high-
level waste disposal and site decommissioning, as you have
illustrated so amply is not a simple task.  But it would
appear that, based on today's briefing, you are really
making real progress, and we have a much better sense of that, on
developing models that should allow us to characterize site
performance.
          I am particularly struck by the synergy that seems
to have developed between the low-level waste program and
the SDMP program and you are to be commended for that and
that appears to be an excellent approach.  It is useful in
both areas and they can play off each other.
          So the Commission encourages you to continue to
develop this program and particularly as you develop the
ability to do these assessments on other platforms and, as
you have just heard, to interact and share the knowledge you
have gained in this area both with others within the NRC who
are developing PRA models as well as on the outside, to the
extent it makes sense through the appropriate regulatory
guidance.
          These kinds of interactions among our staff can
improve the final products for all that are involved in
these developmental efforts and allow us to potentiate our
resources when we have a lot of work, as you have outlined,
Dr. Greeves, on our plates but are in a budgetarily and
programmatically constrained system that has to be
optimized.
          So you might consider further the development of
base line regulatory guidance even beyond your branch
technical position as well as the simple perhaps modular
pieces of your codes or other products that could be used by
licensees to expedite the processes, particularly as they
relate to decommissioning.
          Thank you.
          Unless we have further comments, we are adjourned.
          [Whereupon, at 3:13 p.m., the briefing was
adjourned.]


