UNITED STATES NUCLEAR REGULATORY COMMISSION
Protecting People and the Environment
1
1 UNITED STATES OF AMERICA
2 NUCLEAR REGULATORY COMMISSION
3 ***
4 BRIEFING ON PERFORMANCE ASSESSMENT PROGRESS
5 IN HLW, LLW, AND SDMP
6 ***
7 PUBLIC MEETING
8 ***
9
10 Nuclear Regulatory Commission
11 One White Flint North
12 Rockville, Maryland
13 Tuesday, June 30, 1998
14
15 The Commission met in open session, pursuant to
16 notice, at 2:05 p.m., Shirley A. Jackson, Chairman,
17 presiding.
18
19 COMMISSIONERS PRESENT:
20 SHIRLEY A. JACKSON, Chairman of the Commission
21 GRETA J. DICUS, Commissioner
22 NILS J. DIAZ, Commissioner
23 EDWARD McGAFFIGAN, JR., Commissioner
24
25
2
1 STAFF AND PRESENTERS SEATED AT THE COMMISSION TABLE:
2 JOHN C. HOYLE, Secretary
3 KAREN D. CYR, General Counsel
4 L. JOSEPH CALLAN
5 MALCOLM KNAPP
6 NORMAN EISENBERG
7 MICHAEL BELL
8 WILLIAM OTT
3
1 P R O C E E D I N G S
2 [2:05 p.m.]
3 CHAIRMAN JACKSON: Good afternoon, ladies and
4 gentlemen.
5 Today the Commission will be briefed by the NRC
6 staff on its performance assessment program which covers
7 three technical areas that are of great interest and
8 importance to the Commission. These are low-level
9 radioactive waste disposal, high-level radioactive waste
10 disposal, and site decommissioning.
11 The staff briefs the Commission annually on the
12 topic of performance assessment. The Commission was last
13 briefed by the staff on this subject on May 15th of last
14 year.
15 The staff made it clear at last year's Commission
16 briefing that developing a performance assessment model in
17 any one of these three technical areas is a complex and
18 challenging task.
19 However, the development of high quality
20 performance assessment models for low- and high-level waste
21 and site decommissioning would enable the Commission to
22 obtain significant quantitative and qualitative input for
23 making risk-informed, performance-based regulatory decisions
24 on these matters.
25 So we look forward to hearing about the new
4
1 developments that have occurred in the past year in the
2 performance assessment program particularly as it relates to
3 radioactive waste disposal and the decommissioning of sites.
4 Unless my colleagues have anything to add,
5 Mr. Callan, are you leading off?
6 MR. CALLAN: Yes, Chairman. Thank you, and good
7 afternoon, Commissioners.
8 Present at the table with me today are Mal Knapp,
9 to my right, the acting director of NMSS; Mike Bell, to my
10 far left, who is the chief of the Performance Assessment and
11 High-Level Waste Integration Branch, NMSS; to my far right
12 is Bill Ott, the acting chief of the Waste Management Branch
13 in Research; and the primary briefer this afternoon will be
14 Norm Eisenberg, just like he was last year. He set a high
15 standard last year.
16 CHAIRMAN JACKSON: That's his reward.
17 [Laughter.]
18 MR. CALLAN: He's the senior advisor for
19 performance assessments in the Division of Waste Management,
20 NMSS.
21 Norm.
22 [Slide.]
23 MR. EISENBERG: Thank you very much. Good
24 afternoon.
25 CHAIRMAN JACKSON: Good afternoon.
5
1 MR. EISENBERG: If I could have slide 2, which is
2 an outline of the presentation.
3 [Slide.]
4 MR. EISENBERG: I will begin as usual by defining
5 performance assessment to set a context for this briefing.
6 Second, I will discuss three current issues in
7 performance assessment, and for each issue I will describe
8 the issue and the staff's approach to resolving the issue.
9 For two of the issues I will describe examples to
10 illustrate both the issue and the approach that the staff
11 has to resolving it.
12 Third, for each of the Division of Waste
13 Management program areas I will describe the performance
14 assessment program, including recent accomplishments and
15 planned activities.
16 As I have mentioned in the past, Division of Waste
17 Management has performance assessment activities in
18 high-level waste, low-level waste, and decommissioning.
19 Then I will briefly touch on support for
20 performance assessment from the Office of Nuclear Regulatory
21 Research, and finally, I will summarize.
22 [Slide.]
23 MR. EISENBERG: Performance assessment is a type of
24 systematic analysis that explores three questions for a
25 waste facility:
6
1 What can happen?
2 How likely is it?
3 What are the consequences of the occurrence?
4 Performance assessment integrates information
5 across a wide variety of disciplines that are required to
6 analyze the performance of a waste facility. These could
7 include such diverse fields as corrosion science,
8 geochemistry, hydrology, heat transfer, rock mechanics.
9 In addition, performance assessment integrates
10 information across different program areas. For example,
11 design, site characterization, and the analysis used to
12 examine safety.
13 The term "performance assessment" as used in the
14 Division of Waste Management encompasses a broad range of
15 quantitative analyses that are applied to waste disposal
16 facilities. The analyses are tailored to the
17 need. We go from deterministic bounding analyses, which are
18 used most often, to probabilistic analyses, which are used
19 on the most complex facilities and issues.
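The contrast drawn here between deterministic bounding analyses and probabilistic analyses can be sketched in a few lines. This is an illustrative toy model, not any NRC code; the single-pathway dose formula and every parameter range are assumed purely for demonstration:

```python
import random

def dose(release_rate, dilution_factor, dose_conversion):
    """Annual dose (mrem/yr) for a single, hypothetical pathway."""
    return release_rate * dilution_factor * dose_conversion

# Deterministic bounding analysis: every parameter at its conservative bound.
bounding_dose = dose(release_rate=2.0, dilution_factor=0.05, dose_conversion=100.0)

# Probabilistic analysis: sample the uncertain parameters over assumed ranges.
random.seed(0)
samples = [dose(random.uniform(0.5, 2.0), random.uniform(0.01, 0.05), 100.0)
           for _ in range(10_000)]
mean_dose = sum(samples) / len(samples)

# The bounding result envelopes every sampled result, at the cost of realism.
assert bounding_dose >= max(samples)
print(f"bounding: {bounding_dose:.1f} mrem/yr, probabilistic mean: {mean_dose:.1f}")
```

The bounding number is cheap to compute and defensible, while the sampled distribution shows how much conservatism it carries; that trade is why the simpler analysis is "used most often" and the probabilistic one is reserved for the most complex facilities and issues.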
20 CHAIRMAN JACKSON: I've asked you this kind of
21 question before, but now I will put a twist. How does
22 performance assessment compare with dynamic PRA?
23 MR. EISENBERG: To the degree that I understand
24 dynamic PRA, there are many similarities. The dynamic PRA
25 is trying to look at components and subsystems that are
7
1 normally not included in a standard PRA and look at their
2 response to the damage states that are produced by a given
3 fault or initiating event. This is where the focus is
4 primarily in performance assessment, looking, if you will,
5 at the level 3 aspect of a PRA rather than the level 1 and
6 level 2.
7 Our analysis of scenarios is really quite
8 simplistic compared to the complex logic trees and diagrams
9 that you have in PRAs because we don't have a complex piece
10 of machinery; we have a different kind of system.
11 Our focus is primarily on what I believe the
12 dynamic PRAs are attempting to focus on.
13 CHAIRMAN JACKSON: Thank you.
14 One other question. Maybe you can speak to this
15 as you go through because I think I know the answer in the
16 high-level waste program. Is there a role for expert
17 panels, or are they necessary once you get away from
18 high-level waste kind of issues?
19 MR. EISENBERG: When you say get away from
20 high-level waste, do you mean other kinds of waste or other
21 areas?
22 CHAIRMAN JACKSON: When we talk about low-level
23 waste disposal and site decommissioning.
24 MR. EISENBERG: I am sure there is room for the
25 use of expert elicitation and informal expert judgment
8
1 throughout the waste programs because the information is
2 often soft; there is not a large amount of data in a lot of
3 the areas; and you need to evaluate it.
4 CHAIRMAN JACKSON: Thank you.
5 Commissioner.
6 COMMISSIONER McGAFFIGAN: The output of a
7 performance assessment and high-level waste or
8 decommissioning, or whatever, is a number for total
9 effective dose to an average member of a critical group. Is
10 that what we are trying to get?
11 MR. EISENBERG: It certainly is the focus on
12 high-level waste in our current code efforts. I think I
13 have to hastily add that we fully intend to look at
14 intermediate outputs from different parts of the system. If
15 other measures of performance might be of interest, even if
16 they are not strictly speaking required for regulatory
17 judgment, we might want to look at those also. You always
18 design the tool to fit the need, and if the regulatory need
19 is to get the total effective dose, then that is how we
20 design.
21 COMMISSIONER McGAFFIGAN: What other intermediate
22 outputs do you have in mind that might not be regulatory but
23 might be of interest?
24 MR. EISENBERG: We might be interested, for
25 example, in the time that the waste packages start to fail.
9
1 We might be interested in the time for certain radionuclides
2 to traverse the saturated zone, for example.
3 COMMISSIONER McGAFFIGAN: If you took the TEDE
4 over a period to the average member of the critical group and
5 plotted it over time, these would be intermediate results to
6 getting a result at some date.
7 MR. EISENBERG: These certainly are all
8 incorporated in the end result but they give us an idea of
9 how the system works and how the different parts contribute,
10 which is an important part of making a regulatory judgment.
11 COMMISSIONER McGAFFIGAN: To the extent you are
12 doing deterministic bounding analyses for most issues, does
13 that lead to conservatism, and how much conservatism does
14 that lead to?
15 MR. EISENBERG: There is a lot of discussion later
16 in the briefing about conservatism. Maybe we could wait.
17 Certainly conservatism is a way to simplify the analysis and
18 to do bounding when it's appropriate.
19 COMMISSIONER McGAFFIGAN: Is vulcanism something
20 that is deterministic or probabilistic?
21 MR. EISENBERG: We are treating it
22 probabilistically.
23 COMMISSIONER DIAZ: Since the time domain came
24 into question, looking at your definition of dynamic PRA, it
25 seems to me that the definition that you are using does not
10
1 really consider putting time as an independent variable,
2 which some of the new PRAs do. You are still keeping time
3 as a dependent variable; is that correct?
4 MR. EISENBERG: I would say no. We track the
5 evolution of the repository through time. So we look at the
6 behavior of each component in the system as a function of
7 time. One of the things that we have to do is to roll up or
8 convolve, for example, the output of the various waste
9 packages into the transport and the geosphere to look at the
10 effect of the geosphere. That is very much a time dynamic
11 situation.
12 COMMISSIONER DIAZ: Right. But it has two
13 independent variables at any one time. You can do it like
14 we do a point time analysis. The other one you have two
15 independent variables. I think that is the key difference
16 in what some people are calling dynamic PRA.
17 MR. EISENBERG: I'm not sure I could answer the
18 question.
19 COMMISSIONER DIAZ: All right. Let it go.
20 CHAIRMAN JACKSON: Why don't we go on.
21 [Slide.]
22 MR. EISENBERG: We thought we would articulate
23 three current issues in performance assessment. I will
24 discuss the staff's approach to resolving these issues and
25 in two cases, as I said, provide examples of the staff's
11
1 approach. The issues are:
2 How can the staff optimize its efficiency by choosing
3 the most appropriate analytical tool for the regulatory task
4 at hand?
5 Second, how can the staff eliminate or greatly
6 reduce unnecessary conservatism in regulatory analyses while
7 simultaneously assuring adequate protection of public health
8 and safety.
9 Issue three is, how can the staff employ a
10 risk-informed, performance-based approach in framing
11 regulations, guidance and procedures so that flexibility is
12 provided to licensees?
13 Now I would like to go ahead and describe the
14 staff's approaches to these three issues.
15 [Slide.]
16 MR. EISENBERG: The first issue is how to optimize
17 efficiency by choosing analytic tools most appropriate to
18 the task.
19 We tailor our tools to the requirements of the
20 performance assessment. First, we have different kinds and
21 types of tools for each of the programmatic areas.
22 For high-level waste we have what I would think is
23 the most complex and detailed level of modeling.
24 For low-level waste, because the regulatory
25 structure is different and the problem is different, we have
12
1 less complexity -- for example, there is no substantial heat
2 generation by the waste -- with more flexibility in treating
3 aspects of modeling and in treating uncertainty.
4 For decommissioning there is a diverse range of
5 contamination, complexity and site conditions. For example,
6 it can go from a very complex site involving several
7 radionuclides to a very simple site involving a single
8 radionuclide.
9 In addition, within each program area we vary the
10 level of detail and complexity in the modeling approach so
11 that it's commensurate with the importance of the aspect
12 being modeled.
13 As an example, in high-level waste groundwater
14 flow is given a lot of attention because it has such a
15 pervasive influence on the performance of the repository.
16 The migration of gaseous radionuclides is given relatively
17 less attention because the dose potential for those nuclides
18 is small.
19 [Slide.]
20 MR. EISENBERG: Moving along to the second issue,
21 how do we assure adequate protection of public health and
22 safety while eliminating unnecessary conservatism in
23 regulatory analyses?
24 We first define, evaluate and consider
25 uncertainties in the regulatory decisions.
13
1 We first identify the uncertainties. Some of the
2 uncertainties are quantified; others are evaluated
3 qualitatively.
4 Finally, we factor uncertainties in the decision,
5 and we need to consider the degree and type of uncertainties
6 and the impact of the uncertainty and also the facility's
7 operation on public safety.
8 CHAIRMAN JACKSON: Is your approach consistent
9 with the approach outlined in the generic reg guide on PRA,
10 Reg Guide 1.174?
11 MR. EISENBERG: I can't answer that.
12 [Slide.]
13 MR. EISENBERG: To move along and talk about how
14 we treat conservatism, we first need to say a little bit
15 about uncertainty in performance assessment. There are five
16 different kinds of uncertainties described in this slide.
17 These are not necessarily mutually exclusive sets, but let
18 me say a few words about each one.
19 Parameter uncertainty relates to the parameters
20 used in models to describe consequences. Examples include
21 corrosion rate, solubility limit, flux of water into the
22 waste package, porosity; all the dose parameters such as
23 foodstuff intake and irrigation rate factor into these kinds
24 of parametric uncertainties.
25 Disruptive scenario uncertainty relates to the
14
1 inability to determine whether a disruptive event will or
2 will not occur. Usually one determines a fixed set of
3 scenarios for consideration and each scenario has an
4 associated probability of occurrence.
5 I will say that one of the things we do in our
6 high-level waste performance assessment is that the time of
7 occurrence of the particular event is taken as a random
8 variable, and therefore in that sense we do not have a fixed
9 time evolution, but from one realization to another it will
10 change.
11 Exposure scenario uncertainty relates to the
12 inability to predict accurately the behavior of humans in
13 the future. Often a stylized set of exposure scenarios are
14 established by the regulator. For example, an intruder is a
15 stylized scenario in low-level waste. The license
16 termination rule is another example that permits development
17 of site-specific exposure scenarios.
18 Model uncertainty. In this context I mean
19 alternative conceptual model uncertainty. It relates to the
20 uncertainty in the choice of a model to describe the
21 performance of the waste facility. Often different models
22 may have different degrees of support but they will produce
23 different estimates of performance. So that is a measure of
24 uncertainty.
25 Another way to try to quantify the uncertainties
15
1 is to attach subjective probability or credibility to each
2 alternative conceptual model if one wants to.
3 Finally, there are programmatic factors that
4 produce uncertainties, things such as QA, management
5 effectiveness, and adequacy of recordkeeping. These all
6 have an influence on safety but are very difficult to
7 quantify.
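The earlier point about treating the time of occurrence of a disruptive event as a random variable can be illustrated with a short Monte Carlo loop. Every number here, the analysis period, the annual event probability, and the dose model itself, is assumed for illustration only:

```python
import random

random.seed(1)
PERIOD = 10_000     # assumed analysis period, years
ANNUAL_P = 1e-4     # assumed annual probability of the disruptive event

def peak_dose():
    """One realization: the event's time of occurrence is itself sampled."""
    event_time = random.expovariate(ANNUAL_P)   # exponential waiting time
    if event_time > PERIOD:
        return 0.1                              # undisturbed-case dose, mrem/yr
    # Earlier disruption releases more inventory before decay (illustrative).
    return 0.1 + 5.0 * (1.0 - event_time / PERIOD)

doses = [peak_dose() for _ in range(20_000)]
mean_dose = sum(doses) / len(doses)
disrupted = sum(d > 0.1 for d in doses) / len(doses)
print(f"mean peak dose: {mean_dose:.2f} mrem/yr; "
      f"fraction of realizations disrupted: {disrupted:.2f}")
```

Because the event time changes from one realization to the next, there is no fixed time evolution: each realization traces a different history, and the resulting distribution of outcomes carries the scenario probability with it.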
8 COMMISSIONER DIAZ: Excuse me. I am trying to
9 understand the role of risk information in your performance
10 assessment. You are going to use risk information to
11 develop a risk-informed regulatory framework or to set
12 levels for performance thresholds, or both?
13 MR. EISENBERG: Both.
14 COMMISSIONER McGAFFIGAN: The Environmental
15 Protection Agency just went through the waste isolation
16 pilot plant certification. Did they use performance
17 assessment type capabilities?
18 In that case, I think it was 15 millirem TEDE.
19 They didn't have a groundwater issue because there was no
20 potable groundwater, but they had to figure out whether it
21 met a 15 millirem TEDE. The intruder scenario, was that
22 legislated away?
23 MR. EISENBERG: They had scenarios like drilling
24 into a brine pocket that spurted waste and brine out of the
25 repository.
16
1 COMMISSIONER McGAFFIGAN: Did they use techniques
2 similar to what you use?
3 MR. EISENBERG: Yes, I would say so.
4 COMMISSIONER McGAFFIGAN: Do you talk to EPA about
5 how their performance assessment worked in that case?
6 MR. EISENBERG: Yes. As a matter of fact we
7 commented on the criteria that they published for evaluating
8 the performance assessment. We talk to them frequently
9 about it. We have observed their activities. Several staff
10 members have been involved in observing the WIPP activities
11 for a long time.
12 COMMISSIONER McGAFFIGAN: When you deal with
13 parameters do you use a range of values with a probability
14 assigned, or do you use a single value and do sensitivity
15 analysis on whether if you vary off of that point you get
16 significantly different results?
17 MR. EISENBERG: We use a variety of techniques.
18 For the things which we think are quite important we prefer
19 to use a probability distribution and examine through a
20 formal type of sensitivity analysis what the impact is.
21 However, most of these models have more variables than you
22 would ever want to have to deal with. So the ones that are
23 either not very important or that are judged to be
24 relatively easy to fix we go ahead and fix those. We don't
25 want to do things like do sensitivity analyses to see how
17
1 the variation of gravitational constant is going to --
2 COMMISSIONER McGAFFIGAN: I wouldn't advise that
3 either.
4 MR. EISENBERG: There are some that are quite
5 important that we want to focus on and others that for a
6 variety of reasons we may decide to just fix.
7 COMMISSIONER McGAFFIGAN: If the gravitational
8 constant changes we've got bigger problems.
9 CHAIRMAN JACKSON: It depends on which planet you
10 are on.
11 [Laughter.]
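The practice just described, sampling the important parameters, fixing the unimportant ones, and checking the result with a sensitivity measure, might look like the following toy sketch. The dose model and distributions are invented, and a plain correlation coefficient stands in for the staff's formal sensitivity techniques:

```python
import random

random.seed(2)
N = 5_000
INTAKE = 0.5   # judged unimportant, so fixed at a point value

flows = [random.lognormvariate(0.0, 1.0) for _ in range(N)]  # important: sampled
sols = [random.uniform(0.1, 0.2) for _ in range(N)]          # sampled, narrow range
doses = [f * s * INTAKE for f, s in zip(flows, sols)]        # toy dose model

def pearson(xs, ys):
    """Plain correlation coefficient as a crude sensitivity measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(f"flow vs dose:       {pearson(flows, doses):.2f}")   # dominates the result
print(f"solubility vs dose: {pearson(sols, doses):.2f}")    # minor contributor
```

A parameter whose correlation with the output is near zero is a candidate for fixing at a point value, which keeps the number of sampled variables manageable.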
12 [Slide.]
13 MR. EISENBERG: To further the discussion on page
14 8, I'd like to say what I mean by conservatism. As far as
15 I'm concerned it's the choice for any area of the various
16 types of uncertainties that I have previously mentioned that
17 would produce numerical results that underestimate the good
18 performance of a facility. For most of our cases this means
19 that the calculated doses are higher for greater
20 conservatism.
21 Often the analysis is simplified by making
22 conservative assumptions. For example, by choosing a
23 bounding value for a parameter rather than dealing with the
24 full range of variability. This can save time and money.
25 Some uncertainties are expressed quantitatively in
18
1 the analysis. As we just discussed, parameter uncertainty
2 can be propagated through the chain of models. Other
3 uncertainties are not quantified but a conservative approach
4 to their treatment should be factored into decisions because
5 those uncertainties and those conservatisms are there.
6 Different stakeholders may have a different view
7 of how much conservatism might be in a particular analysis
8 and that may not correspond to the staff's view. This is
9 just par for the course, I think.
10 So the question is, how should the staff balance
11 the cost in terms of the analysis and the results of the
12 conservatisms against public safety and the confidence in
13 the decisions?
14 I would like to now go to the next slide and see
15 how it comes out.
16 [Slide.]
17 MR. EISENBERG: Unfortunately, I chose to draw two
18 of these lines. One is green and one is blue, but I can't
19 tell the difference looking at the monitor.
20 There are at least two points to be made from
21 these figures.
22 Number one, these analyses involve quantified and
23 unquantified uncertainties. Both should be considered in
24 the decision making.
25 Secondly, the manner in which the decisions
19
1 incorporate the various kinds of uncertainties can have a
2 substantial effect on the cost of regulation both to the
3 licensee community and also to the staff.
4 Those are the two points I'm trying to get across.
5 I have to state a couple of caveats. Number one, this is a
6 schematic drawing which is not based on an actual analysis.
7 This is just one portrayal of what might occur. The
8 relationship between the screening analysis and the
9 site-specific analysis could be completely different in a
10 specific case.
11 The upper graph represents the dose distribution
12 obtained from a screening analysis in which less data are
13 available. So you will see the spread in the dose
14 calculated is much broader than in the lower figure.
15 The dose limit is in red.
16 If the decision is made on the mean dose, which is
17 the green line, the decision on this particular site would
18 be to release the site. If the decision were made on the
19 95th percentile dose, it would exceed the dose limit, and
20 the decision would be to do more analysis or perhaps go
21 ahead and take out some of the contamination or take
22 contaminated concrete away, to actually move material.
23 The lower graph represents the dose distribution
24 obtained from a site-specific analysis in which more data
25 presumably are available. So the spread in the calculated
20
1 dose is narrow.
2 Note that for this hypothetical example both the
3 mean dose and the 95th percentile dose are both below the
4 dose limit. So in this case we would definitely release the
5 site.
6 Also note that the mean dose is shown as being
7 smaller than in the screening case. This is because we
8 presume that because you have site-specific data you can
9 reduce some of your modeling conservatisms and have a more
10 realistic, less conservative model. So the whole analysis
11 shifts downward.
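The two hypothetical curves on the slide can be reproduced with a short script. The dose limit, the distributions, and the decision rule below are assumed for illustration and do not come from any actual analysis:

```python
import random

random.seed(3)
DOSE_LIMIT = 25.0   # assumed limit, mrem/yr

def percentile(values, p):
    ordered = sorted(values)
    return ordered[int(p / 100 * (len(ordered) - 1))]

# Screening analysis: little data, so a broad, conservatively shifted spread.
screening = [random.lognormvariate(2.5, 0.8) for _ in range(10_000)]
# Site-specific analysis: more data narrows the spread and shifts it down.
site_specific = [random.lognormvariate(2.0, 0.3) for _ in range(10_000)]

for label, doses in [("screening", screening), ("site-specific", site_specific)]:
    mean = sum(doses) / len(doses)
    p95 = percentile(doses, 95)
    decision = "release" if p95 < DOSE_LIMIT else "more analysis or remediation"
    print(f"{label}: mean={mean:.1f}, 95th={p95:.1f} -> {decision}")
```

With these assumed numbers the screening mean falls under the limit while its 95th percentile does not, so a percentile-based decision rule forces the site-specific step, which is exactly the situation portrayed on the slide.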
12 CHAIRMAN JACKSON: Commissioner.
13 COMMISSIONER McGAFFIGAN: This isn't quite as
14 theoretical as you lay out. I think it probably describes
15 some of the discussion that we've had in recent months about
16 the D&D; code with the staff. The D&D; code, which is this
17 new Sandia code that has been developed for decommissioning
18 purposes, as I understand it, it builds in sort of 95th
19 percentile parameter values. You plug in and you get a
20 number. You don't get a range under D&D;, right? You get a
21 number. Not quite 95th percentile.
22 MR. EISENBERG: The code as currently configured
23 operates with some default parameters. For a full range of
24 parameter distributions characteristic of the U.S. it will
25 yield a 90th percentile of dose. Let me hastily add that
21
1 you could --
2 COMMISSIONER McGAFFIGAN: Do a site specific.
3 MR. EISENBERG: You could change any or all of the
4 parameters, number one. This would be the case only if you
5 chose to use the default parameters. Or, as the staff is
6 planning to investigate, we could put a Monte Carlo driver
7 ahead of the code and do the full distributions on whatever
8 parameters we wanted to explore.
9 CHAIRMAN JACKSON: Right.
10 COMMISSIONER McGAFFIGAN: This is something that
11 is going to have to happen over a period of time. At the
12 moment D&D produces a 90th percentile result with the
13 default parameters on what is probably a fairly broad
14 distribution. If you then do a site-specific analysis, and
15 in many cases you will want to, you can narrow the
16 parameters.
17 As I understand it, the staff is saying a
18 screening tool should be more conservative than a tool that
19 is used for a final regulatory decision. If the screening
20 tool produces a curve that is way over on the left with 95th
21 percentile and indeed the 99th percentile way below the dose
22 limit, then okay, not to bother. That one is
23 decommissioned. But if it is producing a curve like the one
24 you show at the top, you are saying you want to have the
25 licensee do a more detailed analysis with more information.
22
1 You'll have a lot of dialogue the next couple of
2 years. I'm not trying to do it today. The question is how
3 expensive that is and how frequently it has to be done and
4 are we overdoing it. I think that's a dialogue that is
5 occurring. I am just highlighting that it is occurring.
6 MR. EISENBERG: That's correct. That in fact was
7 my punch line for this slide. The staff is currently
8 grappling with how to balance these factors and how to make
9 the appropriate decisions, for example, for choosing default
10 parameter sets when you are considering only a parameter
11 uncertainty, when in fact we know that we have other kinds
12 of uncertainties involved in making the regulatory decision.
13 CHAIRMAN JACKSON: How would the Monte Carlo
14 driver help you here?
15 MR. EISENBERG: For example, if we replaced a few
16 variables on a site-specific basis, we could then do a Monte
17 Carlo analysis and generate a distribution such as here and
18 compare it with the dose standard rather than relying on a
19 predetermined limit, that is, a 90th percentile type dose
20 limit. Actually those parameters no longer guarantee
21 you that the resulting distribution will give you the 90th
22 percentile if you did a full Monte Carlo analysis.
23 CHAIRMAN JACKSON: That's right.
24 MR. EISENBERG: It's really a tool that would
25 enable us to understand more about how the system worked at
23
1 a particular site.
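The "Monte Carlo driver ahead of the code" idea can be sketched generically. The deterministic function below is a stand-in, not the real D&D code; the parameter names, defaults, and site distribution are all invented:

```python
import random

DEFAULTS = {"conc": 5.0, "pathway_factor": 2.0, "depth_to_water": 1.0}

def screening_code(params):
    """Stand-in for one deterministic screening run (NOT the real D&D code)."""
    return params["conc"] * params["pathway_factor"] / params["depth_to_water"]

def monte_carlo_driver(code, samplers, n=5_000, seed=4):
    """Sample selected inputs; everything else keeps its default value."""
    random.seed(seed)
    results = []
    for _ in range(n):
        params = dict(DEFAULTS)
        for name, sampler in samplers.items():
            params[name] = sampler()
        results.append(code(params))
    return results

# Replace one variable with a site-specific distribution, e.g. measured depth.
doses = monte_carlo_driver(screening_code,
                           {"depth_to_water": lambda: random.uniform(2.0, 10.0)})
print(f"default (conservative) run:  {screening_code(DEFAULTS):.1f}")
print(f"mean over site distribution: {sum(doses) / len(doses):.2f}")
```

The driver never touches the deterministic code itself; it only feeds it sampled inputs and collects the outputs into a distribution that can be compared against the dose standard directly, rather than against a predetermined percentile-based limit.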
2 COMMISSIONER DIAZ: In practicality, if you ever
3 get your 95th percentile below the dose limit in any case,
4 whatever the distribution is, you will then have reason to
5 say we don't need to do any further analysis; is that
6 correct?
7 MR. EISENBERG: Right.
8 CHAIRMAN JACKSON: You were going to say
9 something, Dr. Knapp?
10 MR. KNAPP: I was just going to note that as we
11 move towards more site-specific information on the D&D
12 code, some of these could be very inexpensive, because some
13 of these things, for example, are variables such as distance
14 to water table or soil type, which can be determined by a
15 call to your local county agent or by a quick measurement of
16 the depth to groundwater. There could be, if you like,
17 pencil sharpening that could be very inexpensive. So it
18 does not imply a great deal of resources would be needed if
19 in fact it did not necessarily meet the standard at the
20 first screening evaluation. But it's quite correct to say
21 that these are things we will wrestle with over the next two
22 years.
23 CHAIRMAN JACKSON: Is it feasible to expect the
24 licensees to be capable of utilizing these performance
25 assessment codes in decommissioning of sites given the site
24
1 complexities and the complexity of the codes?
2 MR. KNAPP: I'll offer an answer and then perhaps
3 Norm may wish to correct me.
4 CHAIRMAN JACKSON: Mr. Greeves wants to answer
5 also.
6 MR. KNAPP: Okay.
7 MR. GREEVES: There's a lot of meat on this slide.
8 I will point out that there is a range of licensees out
9 there; there is a set of licensees. We are talking to the
10 regions in terms of the payoff in this because they have the
11 large majority of cases to deal with.
12 There is a set out there that want the simple
13 number. They want the 5 picocurie per gram number. They
14 don't want to fool with this code business. So that set of
15 licensees would like that criteria. For that nuclide they
16 want to know how many picocuries per gram I can leave on
17 this site; I want to be out of here.
18 By the way, if they are a little bit above that
19 number, they aren't going to want to run this code. They
20 are going to say get another shovel out; let me get out of
21 here; I don't want to argue with the NRC over 5 versus 10
22 picocuries per gram; I'll take another 100 cubic feet out of
23 here and be done with this.
24 There is that set of licensees. Then there is
25 another set who want to take advantage of this because they
25
1 aren't talking about a few cubic yards; they are talking
2 about large amounts of material or large buildings to
3 decontaminate. They're going to want to come in and have
4 this conversation with Norm, the staff you see here behind
5 the table about, okay, I didn't pass the screening criteria,
6 but I'm going to use D&D or I'm going to use RESRAD, and
7 here is what I did; will you accept that?
8 So there is a set of licensees that can do that.
9 Then there is probably another set that are much
10 more complex, and there are a handful of entities out there
11 that can do that. So it's a spectrum of activities out
12 there.
13 I think over the next two years, working with the
14 licensee community, Research and the Decommissioning Board
15 that I think you have either seen in one of our papers or
16 you will hear about, we want to set that process up. This
17 was the paper that was sent up to you in March, I believe.
18 That's what we want to achieve over the next couple of
19 years. And it's needed. The full range of these licensees
20 need an answer, and that is what the staff you see in front
21 of you are working towards.
22 I hope I have answered part of your question.
23 CHAIRMAN JACKSON: Thank you.
24 MR. EISENBERG: I think we can move on to slide
25 10.
26
1 [Slide.]
2 MR. EISENBERG: The third issue is, how can the
3 staff employ a risk-informed, performance-based approach in
4 framing regulations, guidance and procedures so that
5 flexibility is provided to the licensee?
6 In general we will follow a three-step process.
7 One, we will reduce or eliminate prescriptive
8 requirements.
9 Second, we will use the results of PA to provide
10 risk information.
11 Third, and quite specifically, we will use PA to
12 compare calculated system performance to the objective
13 regulatory criteria.
14 [Slide.]
15 MR. EISENBERG: As an example, we look to the
16 high-level waste arena and our approach to drafting a new
17 regulation for high-level waste.
18 First, we are removing the quantitative subsystem
19 performance requirements.
20 Second, we are evaluating various quantitative
21 methods to demonstrate implementation of a multiple barrier
22 concept.
23 We have developed and proposed importance measures
24 for the repository system pursuant to a recommendation of
25 the ACNW.
27
1 A measure of importance is indicated by the change
2 in system performance if the functions of a barrier are
3 neutralized.
4 Finally, we intend to be flexible and allow DOE to
5 propose its own quantitative measures for demonstrating an
6 effective implementation of multiple barriers.
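The importance measure described here, the change in system performance when a barrier's function is neutralized, can be illustrated with a toy multiplicative barrier model. The barrier names echo the discussion, but every number below is assumed:

```python
def system_dose(barriers, source_dose=1000.0):
    """Toy total-system model: each barrier attenuates the dose by its factor."""
    dose = source_dose
    for factor in barriers.values():
        dose *= factor
    return dose

BARRIERS = {"waste_package": 0.01, "cladding": 0.5, "geosphere": 0.1}
baseline = system_dose(BARRIERS)

for name in BARRIERS:
    neutralized = dict(BARRIERS, **{name: 1.0})       # turn this barrier "off"
    importance = system_dose(neutralized) / baseline  # dose ratio without it
    print(f"{name}: dose rises {importance:.0f}x when neutralized")
```

A barrier whose removal barely moves the bottom line contributes little under this measure; the flexibility described above would let DOE propose a different quantitative measure so long as it demonstrates the same multiple-barrier point.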
7 CHAIRMAN JACKSON: Can you give us a qualitative
8 statement about how many differences there are between NRC
9 codes and DOE and with EPA? Are we all on different planets
10 in terms of how these computations are done?
11 MR. EISENBERG: Are you speaking strictly in terms
12 of high-level waste?
13 CHAIRMAN JACKSON: With high-level waste I'm
14 interested in DOE; in the others I'm interested in EPA.
15 MR. EISENBERG: No, I don't think we are on
16 different planets at all. Certainly DOE and we are taking a
17 very similar approach to modeling repository performance.
18 We generally model the same components. We may have
19 differences about their capability to perform. An example
20 might be DOE wants to take credit for the cladding of the
21 spent fuel. We have incorporated that into our modeling in
22 a very limited way because we have more doubts about its
23 survivability.
24 Similarly, DOE has been taking credit, as you will
25 see later, for something called matrix diffusion, which is
28
1 the communication between flow in fractures and flow in the
2 matrix of the rock. We think this is a process that maybe
3 won't buy them very much. So we are not modeling things
4 quite the same way.
5 So there are differences, but I think there are
6 more differences in the treatment of topics rather than in
7 the overall approach.
8 In decommissioning of low-level waste I think
9 everybody is pretty much looking at things the same way.
10 The bottom line is dose. The question is which pathways do
11 you include in a particular code and analysis and how do you
12 treat various processes, but they are very similar.
13 CHAIRMAN JACKSON: Commissioner.
14 COMMISSIONER McGAFFIGAN: The performance
15 assessment you have been developing for purposes of looking
16 at Yucca Mountain and what DOE did with WIPP, is WIPP easier
17 to model than Yucca Mountain because it's salt? Are there
18 differences? I know you have been following it, but you
19 haven't done it. Is a large salt formation easier to model
20 and does it reduce uncertainties compared to rock
21 formations?
22 MR. EISENBERG: I would say just off the top of my
23 head that salt is probably easier to model. It turns out
24 that unsaturated flow is a very complicated flow system and
25 it is difficult to model all the processes that can occur.
29
1 Salt by comparison, I think, is relatively simple.
2 MR. KNAPP: I can throw in some very old
3 information. When we started in this business nearly 20
4 years ago we did a lot of work in salt. By comparison to
5 unsaturated flow in fractured media salt was much simpler.
6 MR. EISENBERG: Let me add one more thing on page
7 11. One of the things that the Nuclear Waste Technical
8 Review Board is urging DOE to consider is alternative design
9 features. An approach such as outlined here would allow
10 evaluation of the merit of these different design features.
11 [Slide.]
12 MR. EISENBERG: Now begins the status update of
13 performance assessment in the three Division of Waste
14 Management program areas. For each area I will describe the
15 progress and plans that have occurred over the past year.
16 The first one is for decommissioning. We have a
17 framework and methodology that has been developed. It is
18 being tested and enhanced by the Office of Research.
19 We are developing a standard review plan to
20 implement the license termination rule. This is where we
21 are working out the details of what codes to use and how to
22 use them and what distributions and what the appropriate
23 approach to screening is.
24 Dose modeling is obviously a key aspect of that
25 particular activity.
30
1 We are coordinating the guidance with the ongoing
2 casework to minimize any changes in the future.
3 The casework is another thing that we are involved
4 in. It is either proceeding or awaiting submittals by
5 licensees.
6 A Decommissioning Management Board has been formed
7 which provides oversight and coordination for activities in
8 the decommissioning area, and it involves membership from
9 NMSS, the Office of Research, NRR, and the regions also are
10 participating.
11 CHAIRMAN JACKSON: Commissioner McGaffigan has a
12 question.
13 COMMISSIONER McGAFFIGAN: How is this all coming
14 together? You have the casework. In individual cases that
15 we were previously working in the SDMP, which uses these
16 figures that John Greeves talked about, 5 picocuries per
17 gram, or whatever, are we looking at that from the point of
18 view -- even if they are in SDMP as is allowed by our rule,
19 using SDMP criteria, how would it work under the Subpart E
20 criteria? We probably can't require it of licensees if it's
21 not a regulatory requirement.
22 MR. EISENBERG: You mean ones that have already
23 been --
24 COMMISSIONER McGAFFIGAN: Ones that are casework
25 that is ongoing, that are going to be cleaned up to SDMP
31
1 criteria, as is allowed if they submit their plan by October
2 of this year. How do you learn from those sites so that's a
3 benefit to the longer term program which is going to be all
4 done under Subpart E and yet devote the resources honorably
5 to get them to decommission?
6 MR. EISENBERG: One of the approaches is to have
7 the project managers for these areas come in and brief this
8 dose modeling group on the kind of activities that are
9 ongoing and the decisions that are being made so that there
10 will be a two-way communication; they can be warned if it
11 looks like there is going to be something that would be in
12 gross disagreement with a decommissioning under the new rule
13 and at the same time, so that the guidance for implementing
14 the new rule can be crafted, taking advantage of the lessons
15 learned, if you will, from the ongoing cases.
16 COMMISSIONER McGAFFIGAN: One of the later
17 documents says you are going to have this standard review
18 plan by FY-2000, which is about the time they will have two
19 years' experience with the document we are about to put out.
20 I think it's already out there on the Web, the various
21 guidance, the reg guides.
22 You are also talking about interim guidance
23 sooner. How soon will that interim guidance be available?
24 MR. EISENBERG: For example, one piece of guidance
25 that we expect to get out is for building contamination, to
32
1 come out with, if you will, very simple surface
2 contamination criteria for release of buildings. We expect
3 to come out with that in the late summer or early fall. As
4 pieces are completed by the dose modeling group or other
5 parts of the standard review plan development, we expect to
6 put out those pieces.
7 There is a whole suite of guidance that is kind of
8 out there. There are pieces of the manual chapter; there
9 are handbooks; there are NUREG/BRs; there are NUREGs; there
10 are branch technical positions. Many of these will have to
11 be updated, revised or discarded. I believe there is a
12 Commission paper that you've asked for that is in the works.
13 It is coming to you soon. I can't answer all the questions
14 right now, but I'm sure that will answer a lot of them.
15 CHAIRMAN JACKSON: Commissioner.
16 COMMISSIONER DIAZ: I guess in this slide you are
17 referring to SDMPs practically exclusively, right? This
18 refers to site decommissioning?
19 MR. EISENBERG: That's correct, site
20 decommissioning; materials licensees.
21 COMMISSIONER DIAZ: Have you considered the issue
22 of clearance of materials and how it would impact site
23 decommissioning at all?
24 MR. GREEVES: I think you have a paper on
25 clearance of materials.
33
1 CHAIRMAN JACKSON: Right.
2 MR. GREEVES: You have that paper separately.
3 They are related. I think a lot more energy needs to go
4 into the clearance paper. These are mostly focused on the
5 license termination rule issues, which actually in some ways
6 is a little simpler to deal with. It gives us a cleaner
7 target to look at.
8 Let me add. You asked the question, is it only
9 SDMP? First, we owe you a paper. You are going to get it
10 shortly. I would say what you see in front of you is the
11 SDMP-like sites, and there are a few more complicated sites
12 that this applies to other than the SDMP. So it's basically
13 the complicated sites.
14 Norm mentioned to you that we are trying to
15 consolidate the guidance. One of the things that we need
16 are these screening tables that the regions could use to
17 release sites. If you come up with a 5 picocurie per gram
18 table, the licensee can see that number, the regional staff
19 can see that number, and they can disposition sites quickly.
20 So it's the full set of those issues.
21 [Slide.]
22 MR. EISENBERG: Slide 13 talks about low-level
23 waste. We had few resources in this area this year.
24 We did participate in an IMPEP review.
25 Also we were able to respond to specific requests
34
1 from Illinois and Nebraska. Not mentioned on the slide is
2 that we commented to the Department of Interior on a
3 sampling protocol for Ward Valley.
4 The main operation for the future is to revise the
5 draft technical position on low-level waste performance
6 assessment based on the input from Agreement States and the
7 public in FY99.
8 [Slide.]
9 MR. EISENBERG: Next is high-level waste.
10 A major focus of activity and achievement this
11 year has been the development and use of a total system
12 performance assessment code which we call a TPA code. We
13 have performed sensitivity analyses at a total system and
14 subsystem level, which has helped to reprioritize KTIs and
15 sub-issues. It has been a major factor in integrating
16 performance assessment with other high-level waste
17 activities. For example, a PA staffer was assigned to each
18 KTI team to work with them and integrate their other
19 activities with the involvement with the code.
20 It has proved to be a basis for interactions with
21 DOE on their total system performance assessment code and
22 results that they are using for the viability assessment.
23 This is a user friendly code with a large
24 interdisciplinary users group, 10 to 15 NRC staff members
25 and about the equivalent number at the Center for Nuclear
35
1 Waste Regulatory Analyses.
2 We are currently revising the code for the TSPA-VA
3 review. Some of the things that we are looking at are some
4 design features that DOE has thrown into the mix. We are
5 always in an ongoing evaluation to reduce excess
6 conservatism.
7 In the future we plan to improve the code for the
8 license application review.
9 [Slide.]
10 MR. EISENBERG: Another major activity this year
11 has been development towards a draft rule for high-level
12 waste disposal at Yucca Mountain, the site-specific rule.
13 The strategy was formulated and it was accepted by
14 the Commission.
15 The staff is employing a risk-informed,
16 performance-based approach.
17 ACNW has endorsed the approach for multiple
18 barriers.
19 Currently we are preparing a draft rule package as
20 directed in the Commission's SRM.
21 [Slide.]
22 MR. EISENBERG: The main purpose of this next
23 slide is to show the hierarchical nature of the rule and
24 other guidance planned for development in high-level waste.
25 At the top you have the total system performance
36
1 standard.
2 Then you have in a tier below that the subsystems
3 and a tier below that components of the subsystems, and
4 below that very detailed phenomena, processes, and related
5 technical issues.
6 Guidance could be developed up and down the
7 diagram. The main point here is that the main and central
8 feature of the rule is the use of overall risk criteria.
9 The other requirements would be treated in subsidiary
10 guidance, not in the main rule.
11 [Slide.]
12 MR. EISENBERG: Another important area of
13 accomplishment in the past year in high-level waste has been
14 our interactions with DOE on their performance assessment
15 for the viability assessment.
16 We have had three technical exchanges on the dates
17 that are indicated.
18 There are several positive aspects of DOE's
19 approach.
20 If I could just point out a couple. One was the
21 increased use of performance assessment to focus site
22 characterization activities. NRC has long advocated that
23 DOE adopt such an approach, and it looks like they are
24 moving strongly in that direction.
25 Another positive aspect is that they have
37
1 recognized as a key issue the support they can muster for
2 claims of longevity for the C-22 material proposed as the
3 corrosion resistant material in their waste package.
4 There are a few questions that remain:
5 There is the consistency and transparency of the
6 analysis.
7 Credit for new and enhanced engineering features,
8 and as I mentioned before, credit for matrix diffusion.
9 And a longstanding issue has been the weighting of
10 alternative conceptual models.
11 CHAIRMAN JACKSON: Last year I asked you about the
12 use of site-specific data from Yucca Mountain in your
13 performance assessment models. Are you able to have that
14 data for the development of your own models?
15 MR. EISENBERG: We are using the data that DOE has
16 published. They have a whole protocol where they gather the
17 data and compile it under QA, and they don't let us have that much
18 access to it until they have gone through part of the
19 process. But by and large I believe we are getting good
20 access to their data and we are using it.
21 COMMISSIONER DIAZ: Going back to the issue of the
22 difference between NRC and DOE, I remember that we had some
23 numbers last year in the dose assessment that were two
24 orders of magnitude higher than DOE. We got 23 millirems
25 and DOE had 0.4. That seems to be a significant difference.
38
1 Are those being reconciled? You said we are in the same
2 plane. The same plane is the same order of magnitude, a
3 factor of two difference?
4 MR. EISENBERG: I think you are still likely to
5 see some significant differences because of the extremely
6 long lifetime that DOE is presuming that their waste
7 packages will survive. In our analysis we are not as
8 optimistic.
9 COMMISSIONER DIAZ: Two orders of magnitude
10 different?
11 MR. EISENBERG: I wouldn't want to say right now.
12 Their results are in flux and our results are in flux.
13 Rather than trying to guess the difference between two
14 moving targets, I'd rather pass.
15 CHAIRMAN JACKSON: Commissioner McGaffigan.
16 COMMISSIONER McGAFFIGAN: When will the targets
17 quit moving? Is the viability assessment going to be
18 submitted later this year, on schedule?
19 MR. EISENBERG: Sometime this fall as far as we
20 know.
21 COMMISSIONER McGAFFIGAN: It will be submitted to
22 the Congress or to the President? Remind me how the process
23 works.
24 CHAIRMAN JACKSON: The viability assessment is a
25 congressional request.
39
1 COMMISSIONER McGAFFIGAN: Will we have seen it in
2 advance or will we see it only as it goes to the Congress,
3 and if we are asked to comment on it, how quickly would we
4 be able to comment on it?
5 MR. BELL: We have arrangements with the
6 Department of Energy that we will be able to review drafts
7 as it is being prepared. As a matter of fact, the whole
8 chapter on their TSPA they have committed to provide us in
9 draft form at the same time they provide it to their peer
10 review panel, which actually should happen next month. We
11 think we are getting good visibility.
12 On the schedule, they continue to tell us that the
13 DOE staff will get it to whoever is acting as Secretary of
14 Energy on schedule at the end of the fiscal year but that
15 it's essentially a political decision by that person as to
16 whether it will be released immediately or whether some
17 other process will take place.
18 One of the department's concerns is they have
19 received a letter from the state requesting that the
20 department go through some sort of a public process before
21 they transmit it to Congress.
22 CHAIRMAN JACKSON: Okay.
23 [Slide.]
24 MR. EISENBERG: On the next slide there are just
25 two smaller items.
40
1 We have completed the first version of the issue
2 resolution status report for the performance assessment
3 methodology. This brings together performance assessment
4 considerations from a wide variety of disciplines.
5 We have also developed some proposed importance
6 measures for the geologic repository system. There are
7 several that have been proposed. They are undergoing peer
8 review and we are evaluating how to use them in a regulatory
9 context, especially for evaluating the implementation of a
10 multiple barrier approach.
11 [Slide.]
12 MR. EISENBERG: Research support for performance
13 assessment. One of the great contributions that the Office
14 of Research made was to develop the high-level waste
15 performance assessment methodology. If I could say, I was involved in
16 that work when I worked in the Office of Research.
17 CHAIRMAN JACKSON: You can say that.
18 [Laughter.]
19 MR. EISENBERG: Their current focus is on generic
20 radionuclide transport which has a focus on decommissioning
21 of material sites. They are developing a flexible user
22 friendly framework to implement a performance assessment
23 methodology and they are trying to insert in that some
24 enhanced process models.
25 If I could speak to two of the enhancements. One
41
1 is to add a mechanistic treatment of sorption and the other
2 is to look specifically at radioactive slag as a source term
3 in decommissioning. It is a not infrequent source term
4 which is not that easy to treat.
5 [Slide.]
6 MR. EISENBERG: To close things up, a few generic
7 points.
8 The applications of performance assessment are
9 tailored to fit the problem. This includes the magnitude of
10 the hazard, the complexity of the safety issues, the
11 availability of data, and the capabilities of the licensees.
12 We want to allow appropriate flexibility while
13 ensuring safety through a risk-informed, performance-based
14 regulation. Again, PA is the waste program's equivalent of
15 PRA.
16 Declining resources are a continuing challenge, and
17 this is being addressed by the use of more advanced computing
18 tools, both hardware and software, enhanced staff training,
19 and a focus on what we consider to be the most important
20 issues.
21 [Slide.]
22 MR. EISENBERG: To summarize what is coming up in
23 the three program areas.
24 For decommissioning, development of the standard
25 review plan as guidance for implementing the license
42
1 termination rule is a key item expected by fiscal year 2000.
2 As I mentioned before, interim guidance would be issued
3 sooner, as available, and we are coordinating that with the
4 ongoing casework.
5 For low-level, as I said before, we will revise
6 the draft BTP on the low-level waste performance assessment
7 methodology.
8 [Slide.]
9 MR. EISENBERG: High-level waste is likely to be a
10 large focus for our activities. In the performance
11 assessment in high-level waste we are using it to identify
12 vulnerabilities of the repository. It helps us structure
13 the flow of information into our decision making; it helps
14 to prioritize the key technical issues and therefore
15 provides assistance to management; and it provides insights
16 into the development of the site-specific Yucca Mountain
17 rule.
18 We are striving for appropriate improvements in
19 the capability.
20 The near-term focus will be to use technical
21 insights for the rule, to provide timely feedback to DOE,
22 and to prepare for the TSPA-VA review. If we get it, we
23 understand we have two months after the formal receipt to
24 give our comments to the Commission. So we are trying very
25 hard to get in a state of readiness for that endeavor.
43
1 CHAIRMAN JACKSON: Thank you.
2 Commissioner McGaffigan.
3 COMMISSIONER McGAFFIGAN: I want to go back to
4 Commissioner Diaz' question about the two orders of
5 magnitude. You mentioned the difference in the lifetime of
6 the waste packages. You and DOE must have been talking
7 about that issue for a couple of years now. Is it that you
8 think that the waste package lifetime distribution that they
9 have is incredible or that a waste package might be able to
10 be designed to that but it isn't designed yet? Is it a
11 matter of cost, how much they want to invest in the waste
12 package, or do you think that any amount of cost they will
13 not get as long a lifetime as they are projecting?
14 MR. EISENBERG: I guess there are two aspects of
15 it as far as I'm concerned. We have some of the other staff
16 here. They might want to contribute.
17 One aspect is that we feel strongly that they have
18 to consider all possible environments in the future
19 appropriately weighted by their probability to evaluate the
20 performance of the waste packages.
21 We are not sure that they have taken sufficient
22 account, for these long-lived waste packages, of the effects of
23 the ongoing seismicity in the Yucca Mountain region and the
24 kinds of environments produced by rock fall or the casing
25 for the tunnel falling in and how those might damage the
44
1 waste packages.
2 For a not so long-lived waste package this doesn't
3 turn out to be a very important feature, but the longer the
4 waste package is around the more subject it is to some of
5 these destructive environments.
6 That is one area where we think we would like to
7 see a slightly different approach on the part of DOE.
8 The other is, and this is a very difficult issue,
9 we have very limited data on the performance of these
10 materials when we are trying to project it. In some cases
11 DOE is claiming 60,000-year lifetimes, but even for the
12 10,000-year performance period these are extraordinary
13 periods of time for engineered materials. I certainly would
14 question the ability to project that far into the future and
15 know that things will perform that way.
16 I think there is a question about whether all the
17 uncertainties have been incorporated into their projections
18 of waste package performance.
19 CHAIRMAN JACKSON: The Commission would like to
20 thank the staff for an excellent and very informative
21 briefing. Mr. Eisenberg, you do such a wonderful job, we
22 will look for you same station, same time next year.
23 [Laughter.]
24 CHAIRMAN JACKSON: As I said earlier, the
25 technical areas that this performance assessment program
45
1 covers are of great importance to the Commission, as you can
2 tell by the degree of the discussion.
3 The evaluation of the long-term performance of
4 low-level waste disposal, high-level waste disposal, and
5 site decommissioning is not simple. Each time we hear from
6 you we have a better sense of the complexities.
7 It would appear that based on today's briefing the
8 staff is making excellent progress on developing models that
9 should allow us to characterize site performance in the long
10 term.
11 I am particularly struck by the synergy that now
12 appears to be working between the low-level waste program
13 and the SDMP program, and that is the kind of synergy we
14 like to see, and that appears to be an excellent approach.
15 It's useful in both areas and one can play off of the other.
16 I also was encouraged to hear about the
17 involvement of Research in developing usable models. I
18 think that is very useful.
19 So the Commission encourages you to continue to
20 develop the performance assessment program, to interact, and
21 to share the knowledge gained in this program with others in
22 the NRC who are developing PRA models. Maybe you could
23 almost claim that you helped the people doing dynamic PRA to
24 make it dynamic. These types of interactions among our
25 technical staff can only improve the final products for all
46
1 that are involved.
2 Commissioner Dicus didn't want me to say anything,
3 but we are going to miss her for a short time. I think we
4 should thank her for the service she has given us to this
5 point.
6 [Applause.]
7 CHAIRMAN JACKSON: But you better hurry up and
8 come back.
9 With that, we are adjourned.
10 [Whereupon, at 3:10 p.m., the briefing was
11 adjourned.]