
United States Nuclear Regulatory Commission

Briefing on Status of Nuclear Reactor Safety

Commissioners Conference Room

One White Flint North
11555 Rockville Pike
Rockville, Maryland

Wednesday, January 17, 2001

9:30 a.m.



	Commissioners

		RICHARD MESERVE, Chairman
		JEFFREY MERRIFIELD
		NILS DIAZ
		GRETA DICUS
		EDWARD McGAFFIGAN

	Staff

		ANNETTE L. VIETTI-COOK, Secretary
		KAREN D. CYR, General Counsel

	Also Present

		KEN RAGLIN, Associate Director
		Training and Development
		Human Resources

		ASHOK THADANI, Director
		Office of Nuclear Regulatory Research

		HUB MILLER
		Region I Administrator

		FRANK MIRAGLIA, Deputy
		Executive Director for Operations

		DR. WILLIAM TRAVERS
		Executive Director for Operations

		SAM COLLINS, Director
		Office of Nuclear Reactor Regulation

		FRANK CONGEL, Director
		Incident Response Operations
	P R O C E E D I N G S
									9:31 a.m.
		MR. MESERVE:  The Commission meets this morning to hear from the staff on the status of programs in the Nuclear Reactor Safety Arena.  This is the second of the briefings that the Commission has held in the Arena format.
		The briefing is focused on reactor issues, but we are looking not only at the activities in NRR but also at a variety of the other entities within the Commission that impact the safety of nuclear reactors.  Those, of course, include research, training, regional activities, incident response and many others.
		This is obviously an area of prime importance to the agency.  There's a lot that has been
going on, and we very much look forward to hearing from you.
		Let me turn to my colleagues and see if anyone would have an opening statement.  
		(No response)
		MR. MESERVE:  If not, Dr. Travers, you may proceed.
		DR. TRAVERS:  Thank you, Mr. Chairman, and good morning.
		As you've indicated, we're glad to be here today to give you a status report in the Reactor
Safety Arena.  
		Accordingly, we're here to highlight the achievements from the past fiscal year and to describe current and planned initiatives, particularly those involving organizational management and reactor oversight.  We also plan to discuss some of the key challenges that we face in the upcoming fiscal year.  As you are aware, within the Nuclear Reactor Safety Arena, there are a number of activities associated with facility licensing and renewal, inspection, enforcement and assessment, investigations, incident response, and safety research, among others.
		Implicit in all of these activities is the need we all recognize for timely and effective technical training of the NRC staff.
		Today, you're going to hear from several of the key NRC staff managers who are playing a vital role in our efforts to integrate and meet the agency's goals and measures associated with the activities in these areas.
		Key players in this, of course, are Frank Miraglia, the Deputy Executive Director for Reactor Programs; Sam Collins, the Director of the Office of Nuclear Reactor Regulation; and Hub Miller, our Regional Administrator, NRC Region I.
		Frank Congel is here as the Director of Incident Response Operations, Ashok Thadani,
Director of the Office of Nuclear Regulatory Research, and, lastly, Ken Raglin, who is the Associate
Director for Training and Development in the Office of Human Resources.
		With that, let me turn it over to Frank.
		MR. MIRAGLIA:  Thank you, Bill.
		Good morning, Mr. Chairman, Commissioners.  I'd like to give you a broad overview of our
performance for fiscal year 2000 in the Nuclear Reactor Safety Arena.
		All strategic goals and measures were met in fiscal year 2000.  Those goals were no reactor accidents, no deaths due to acute radiation exposure, no events at the reactors resulting in significant radiation exposures, no acts of radiological sabotage, and no events resulting in releases of significant amounts of radioactive materials to the environment.
		In addition, all performance measures were met as well, and those include no more than one event identified as a significant precursor.  In fiscal year 2000, we had no such events.  There were no statistically significant adverse trends in industry performance in the reactor area.  There was no event resulting in exposure that exceeded regulatory limits.
		We had a goal of no more than three releases to the environment that exceeded regulatory limits.  That was the goal, and the performance for the year 2000 was zero.  There were no significant breakdowns in physical security resulting in weaknesses in protection against radiological sabotage.
		We had a goal to complete all our license renewal application reviews within 30 months.  Our goal was two, and we performed at that level.  We met two.
		In addition to the strategic goal measures and performance goal measures, there are output
measures, and you'll hear about the performance in each of the areas in briefings to follow.
		In addition, there will be a discussion of the performance evaluations and self-assessments.  Performance evaluations are a higher degree of review that we have committed to in the Strategic Plan to measure the effectiveness of our programs.
		Within the Strategic Plan, we have indicated that we would conduct four to five major program reviews, one in each strategic arena and one in the corporate management strategy, over a three-year period.  This would coincide with the triennial update of the Strategic Plan.
		One is scheduled in the Nuclear Reactor Safety Arena for the year 2001, and that's the one on the inspection oversight process.  That review is scheduled for Commission discussion in June of 2001, and it will assess the implementation, prioritize lessons learned, and recommend program adjustments.  That's a significant activity that has been underway since the year 2000.
		In addition, each office has some self-assessments at the office level in each of these
areas, and some of those will be discussed and covered in the program reviews to follow.
		In addition, there has been significant discussion, in terms of criticisms of our Strategic Plan, about the alignment of our performance goals, strategic goals, and measures, specifically that not all of the measures are performance and outcome measures as opposed to output measures.
		We have recognized this, in response to GAO and other internal reviews, and we have a number of activities underway to try to improve on that process.
		In terms of our validation and verification of the measures, we've assigned an SES manager to each of the measures to assure development of the appropriate data for assessing our performance.  The goal is to generally use existing databases, and we are developing new venues in a number of areas to improve on those types of processes.
		Next slide, please.  Turning to the Key Challenges, we have a number of challenges within
the reactor arena.  As mentioned, the oversight process is a challenge, the implementation of that
continues to be a challenge, and I think that's progressing well. 
		We have continued progress in terms of the risk- and performance-based activities that we previously briefed the Commission on in two briefings in December.
		In addition, another key challenge that we have is communications.  Communication covers a broad area.  We need to continually articulate to our staff internally and to our external stakeholders the kinds of changes we're making, the basis for those changes, and that those changes are predicated on maintaining our safety performance goals.
		I think sometimes the other goals of reducing unnecessary regulatory burden, improving effectiveness and efficiency, and increasing public confidence are seen as perhaps not fully consistent with maintaining the safety goals, and it's a challenge internally as well as externally to articulate what we're doing and why we're doing it.
		The staff has had many, many outreach efforts in terms of public involvement, reaching out to our stakeholders and involving them through comments on rules, guidance, and workshops; there has been significant outreach in this effort.
		In addition to improved communications, we've developed communication plans for a number of key areas.  Each communication plan provides guidance internally so that there is an understanding vertically within our organization of what we're trying to do in each of the programs, and so our staff can communicate at all levels about the objectives of those programs.
		In terms of the reactor arena, we have the communications plan on the oversight process.  We're developing orientation plans with the regulations.  We are also developing plans on the oversight assessment and the allegations program, just to mention a few.
		In addition, the Commission heard at the EDO staff meeting last week, and also at the Materials Arena briefing, about the challenge of acquiring and maintaining a highly-qualified staff.  That is an agency challenge, and it's a challenge within the reactor arena as well in terms of our ability to recruit, train, and retain staff.
		I'm sure you'll be hearing a little bit more about the training activities, and you heard about the recruitment efforts.  The recruitment effort has significant support in the reactor arena, and some of our more recent successes have come with the support of our regions as well as NRR, and the Commission heard some of that at the EDO briefing.
		With that, I'd like to turn to Sam Collins to discuss the licensing and the other aspects of the arena.
		Sam?
		MR. COLLINS:  Thank you, Frank.
		Good morning, Mr. Chairman, Commissioners.  Could I have Slide 5, please?
		I'm pleased to be here this morning to represent the Office of Nuclear Reactor Regulation team and the executive, leadership, and operating levels.
		Our goal in the next two slides is to provide you the context of performance for fiscal year
2000 as well as to delve into some future activities, including self-assessment and key challenges in the
go-forward sense, to talk briefly about our status in those areas.
		The next presenter will be Hub Miller, who will talk about the application of the NRR
Programs in the regions as a representative of regional administrators.
		Focusing on Slide 5, we're talking here about key output measures, and those are articulated in the performance plan throughout this initiative.  Those are very specific and measurable in terms of licensing actions, licensing actions being those including amendments, exemptions, and relief requests.  The target was 1,500, and the actual of 1,574 exceeded it.
		An additional output measure is the age of the licensing action inventory.  We have one-year and two-year goals.  In those areas, the one-year goal is 95 percent, and the actual was 98 percent completed within one year.  The two-year target is 100 percent completed in less than two years, and we achieved that goal of 100 percent.
		Other licensing tasks:  the target was 800 of those specific tasks to be completed, and actually 1,100 were completed.  We were able to over-achieve in that area due to a shift in resources that was decided by the Leadership Team, utilizing resources freed up by the delay in fiscal year 2000 of the anticipated license renewal application.
		So, the Leadership Team took those resources that were preprogrammed for license renewal and targeted them towards the backlog of licensing tasks, and I think that's an example of the dynamics that are available under performance-based management and the Leadership Team's willingness to meet those challenges.
		An additional output measure is licensing exams.  The target here is a little softer, and it's essentially to meet the licensees' demand in that area, although we do have some assumptions that the licensees will gravitate towards performing their own examinations in this area.  We have had mixed success there, but we did meet the actual demand of 352 initial and 292 general fund exams, although that was lower than the estimates of 565 and 400, respectively.  So, that's a planning assumption area that we'll have to improve on for the next year.
		License renewal applications:  we did meet the targets.  We met the milestones.  We completed the reviews within 30 months.  Calvert Cliffs, as you know, was issued on March 23rd of 2000, and Oconee was issued on May 23rd of 2000.
		We do have challenges in those areas, and we'll discuss those on the next slide.  Currently, we're working on ANO-1 and Turkey Point, and those are on schedule.
		Under the Major Accomplishment Area, I'll cover some general topics, and some of these
cascade into future initiatives and challenges themselves, particularly in the areas as we move forward
with risk-informed regulation to the extent that's practical within the regulatory arena.
		We have a number of infrastructure improvements.  We have a number of investments in those
infrastructure improvements, and we have budgeted initiatives throughout the years of 2001 through 2004 to
address not only program areas but to address internal improvements in the Office of Nuclear Reactor
Regulation.
		We've been able to do that by making some assumptions as far as efficiencies within our programs, and I know one that's been discussed previously, particularly with Commissioner McGaffigan, was the efficiencies assumed in the out-years with less interval.
		In the infrastructure area, we're looking at risk-informing Part 50 as we move forward.  The Office of Nuclear Reactor Regulation is responsible for Option 2 and for coordinating with the Office of Research, which has the lead on Option 3, and as you know, Option 1 is the continuation of the current license amendment and exemption processes for risk-informed initiatives.
		Guidance on risk-informed decision-making.  Risk-informed license amendments, as I mentioned, and exemptions; we completed 48 of those this past year.  Design basis information regulatory guidance.  The event-reporting rule, which is soon to go into force, which covers the 50.72 and 50.73 reporting requirements.
		The alternate source term regulatory guidance, and the maintenance rule, which is the
revised rule, effective November 28th of 2000.
		Under the Organizational Effectiveness Accomplishments, I'd like to acknowledge that we
continue with the focus-based management scheme.  We're into the quality management system arena now, using
performance management modeling, including multiple tools within the organization.
		We have basic work in oversight that's being done, based on a clear set of expectations, measures, and indicators, and some of those continue to be under development.
		We're identifying goals for the three levels in the organization, including the executive, the leadership, and the operating levels, and we're focusing on discipline in the planning and oversight area.
		Turning, I believe, to Slide 6, I'd like to move the discussion into specific areas, including self-assessment activities.  As Frank mentioned, NRC is to conduct program evaluations, one in each strategic arena and one in the corporate management strategy area, over the next three-year period, to coincide with the triennial update of the Strategic Plan.
		For fiscal year 2000 through 2003, one evaluation is planned in the Nuclear Reactor Safety Arena, and that's of the revised oversight process.  That also continues to be one of our future challenges in that area.
		Process improvements include the 2.206 petition process for fiscal year 2001, and that's one
that's well known to the Commission, and it involves stakeholders in addition to public involvement in
those areas.
		For the fiscal year 2000 self-assessment area, as a result of the Executive Leadership Review activities, we are reviewing the utilization of the reactor licensing improvements, including the assessment of the previous 30 percent efficiency assumption I mentioned in that area, as well as rulemaking and general administration.
		We provided for a unique lessons-learned process, including an independent review by the Office of Research and also an internal review conducted by the Office of Nuclear Reactor Regulation, and those have been published.
		We're also reviewing contractors, particularly in light of the challenge that we have with conflict of interest with some of our contractors and with finding the right type of technical resources and timely expertise.  These are emerging issues in the area of contract resources.
		We're getting to the area of Key Challenges.  I mentioned achieving the process efficiencies in license renewal.  We have an expected increase in applications:  five later this year in fiscal year '01, four in fiscal year '02, three -- excuse me -- six in fiscal year '03.  That's up from the assumption of two, and more have been identified as emerging.
		Continuing into the out-years, the number approaches eight, although not all of those have been identified as of yet.
		Risk-informing the physical protection requirements:  as you know, there are a number of cross-cutting areas with Part 73.55, including areas that fan out into the NMSS arena.  Those include identification of target sets, getting into the definition of adversary characteristics, and the application of the program towards ISFSIs, the fuel storage areas, as well as approvals, and a number of those will be emerging as Commission policy issues as we move forward in the application of improvements in those areas.
		Frank mentioned workforce planning, which is a significant challenge for us in the corporate management strategies area.  42 percent of the NRR technical staff are eligible for some type of retirement, including the early option.  77 percent of the Senior Executive Service is eligible for some type of retirement in the Office of Nuclear Reactor Regulation.
		Our leadership level, which is composed of division directors, has performed an internal
analysis of recruiting and retention.  We have that document under advisement.  
		There was a meeting ongoing this morning between the executive leadership level of NRR, in coordination with the Human Resources area, to provide for additional initiatives in the recruitment and retention area.  I can talk about those, if you like.
		Aligning activities to outcomes, we have a number of infrastructure issues that we're dealing with.  Cost center initiatives have increased for fiscal year '01.  IT initiatives, which tend to be areas focused around the application of technical support, including electronics, are also eligible for cost center initiatives, which means increasing accountability for the program offices for the application of IT.
		We have increased resources for the public confidence initiatives, while resources for inspection and assessment declined due to the maturity, I would say, of the development of the oversight process after the first of the year.
		License renewal resources increased.  Productivity improvements occurred in the area of licensing actions and licensing tasks due to the elimination of the backlog, and resources in those areas declined.
		Maintaining safety is paramount among the four performance goals.  Hub Miller, representing the regional administrators, will talk to that area.  We do not budget for emergent issues in those areas.  In other words, we do not budget for that response.  We budget for programmatic reviews.
		So, the application of the programs by the regions, in coordination with the other offices, to ensure that they can maintain safety has a large influence on the office's ability not only to continue to define our programs but also to continue to move forward in our missions.
		In the revised oversight process, in looking at any statistically-significant adverse
trends, right now, those trends are positive.  When you look at the input in the Office of Research and
those trends that are being tracked by the Office of Nuclear Reactor Regulation, the industry is doing a
good job of maintaining safety and actually improving on the number of indicators that we historically
track, recognizing that there's a wide scope of the things that are tracked.
		Under the New Initiative areas, as we move the organization forward, we will have challenges in the area of the potential for advanced reactors.  The technical reviews in the Office of Research show that this is on-going, and we have the lead in the technical provisions, also in any aspects of advanced siting, and we also have under current review, as you know, Phase 2 of the AP-1000 reactor design.
		I mentioned the safeguards.  Steam generator regulatory improvements are a potential policy issue for the Commission that's emerging.  In the fuels area, we have tritium and MOX fuel applications.  Right now, NMSS probably has the lead in this area.  We're looking at the potential for the McGuire Station to have four test assemblies, depending on the DOE fabrication and the submittal of the license amendment, and that's projected for 2001.  We may have irradiation taking place around 2003 or so.
		Deregulation and industry consolidation:  we submitted a paper in December which deals with some of the aspects and influences of deregulation and industry consolidation, and there's a broad outline there to cover, which includes not only technical areas and program areas but also the corporate arena, the structure of the offices in that consolidation, and the focus of resources to support those initiatives.
		Finally, I'd like to mention the decommissioning rulemaking and the transfer of licenses, which have a tendency to evolve with the sophistication of the industry, and there are new challenges emerging in those areas, not only with decommissioning funds but also in the context of the structure and the financial aspects of license renewals.
		So, with that broad overview, I'd like to move next to Hub Miller, who represents the
regions, and I'd be glad to respond to questions after the presentations.
		Thank you.
		MR. MILLER:  Thank you, Sam, Mr. Chairman, Commissioners.
		I'm going to speak this morning about several separate but related areas:  inspection, assessment, enforcement, and investigations.  Before I talk about specific output measures, let me start by talking briefly and broadly about two major accomplishments, the initial implementation of the Reactor Oversight Program and the policy changes in the enforcement area.
		Obviously because we are just entering Phase II in this first year of assessing the
implementation of the oversight program, it's premature to talk about final outcomes, but I think it's fair
to say at this point that the specific things that we have done to ready ourselves and to begin
implementation of the oversight program have been a significant accomplishment.
		On the programmatic side, with the Reactor Inspection Branch taking the lead and a lot of support from the regions, we have the basic framework of the program established, and the detailed guidance and inspection procedures were issued.
		There have been countless meetings with stakeholders, both internal to the agency and external, meetings to explain the program and to get feedback.  An example is the meetings that were held at all plants across the country before the program was fully implemented.
		These were meetings to describe, in plain words, the program and what we're attempting to do in this program, and to get feedback on the initiative.
		Each week, again coordinating with NRR, we have held meetings with the licensees and other
stakeholders in each region on a regional basis twice during this process, first before we started the
process and then after implementation.
		Training has been completed.  Formal training has been provided for all regional people and,
of course, the people involved in implementing the program.
		Speaking of actual implementation, I have to say this has been challenging.  As we expected, scheduling and completing the inspections that were called for, the baseline and the supplemental inspections, those called for by the action matrix, has been a significant undertaking.
		A great deal of coordination has to occur among the many branches in the regions and the agency, and I have to say among regions, in many cases, to make that happen.
		The most challenging aspect of it, I think, is the performance of the numerous team inspections that are a part of the program.
		I think, also, we expected the kind of sweeping changes that are involved in this program to at least at the start raise issues and questions of interpretation, as people for the first time have used the new inspection procedures and exercised the significance determination process and the action matrix.  Those have arisen, and I think I can say personally that in the time I've been at the Commission, I've never seen the level of coordination between the regions and Headquarters that has occurred here, appropriately so.
		I think that has been a significant part of our program getting off to a good start.  On top
of this, I have to say, and Sam mentioned it, we are still responsible for responding to events in each
region, and we've had an opportunity to do that.
		We've had special situations, like the start-up of the EC Compliance and Issue 1, and events that you can point to have been challenging, but we have kept up with that.  Again, this is resource-sharing.  We have a major inspection going on as we speak.  The team leader is a staff person from Region 4, and there are members from the other regions, and I think it goes to how we meet our objectives and goals and make adjustments as needed to meet the goals.
		In the enforcement area, two things have come up.  We have eliminated the concept of aggregating items of lesser significance to escalate enforcement, eliminated the regulatory significance concept, and focused more on risk.
		The second thing is that in April of 2000, we modified the enforcement policy to conform to the Reactor Oversight Program.  In the reactor arena, we no longer talk in terms of enforcement severity levels but rather the significance determination process, and I think we can say at this point there's a lot of consistency between the inspection and assessment and enforcement arenas.
		Next slide.  Speaking to output measures, first, inspections.  The measure here really relates to the performance of inspections that are called for, and as I mentioned, at this point we've been performing all our baseline and supplemental inspections.
		The output measure in the area of assessment is the performance of the mid-cycle reviews, the mid-cycle reviews of all reactors, making sure they're being done in a timely way, and also, in cases where the action matrix calls for them, quarterly updates of those on time.
		Allegations is a big part of inspection and assessment.  There's been a small decrease in
the past year or so in the number of cases, but it remains a significant part of our inspection assessment
effort.
		There's been some increased activity in the area of wrongdoing, but speaking to the output measure, the goal was 180 days on average, and the current time is 137 days.
		Going to enforcement, again a bit of a decreasing trend.  The output measure in this area is tracking timeliness on escalated enforcement.  Here, I'm speaking of, under the old program, Severity Level 3 and above violations.
		In the new program, for issues that rise through the significance determination process to a greater-than-green finding, we are doing much better than the goal.  The goal is 120 days on average, and the current average is 78 days.
		Investigations:  a very heavy caseload.  There are several output measures.  The first is average time to close.  The goal is nine months.  We're doing those now in five months.  Speaking also to the backlog of older items, the goal is nine percent older than 12 months, and the current percentage is seven percent.  So, you can see we're doing better than the goal.
		A great deal of coordination goes on between the regions and the field offices, given the significant caseload, to the benefit of our effort in that area.
		On self-assessment, Sam mentioned just a couple of things about self-assessments.  I want to speak a bit more in a moment about the oversight assessment being performed, but I think a couple of things that, speaking from the regional perspective, are very valuable to us are the audits that are done in the allegations area each year.
		The field offices know of and benefit from the audits.  Each region is performing self-assessments.  Just to name one that we've done recently in our region:  an assessment of the implementation of the new enforcement policy changes.
		Coming to the next slide, which talks about challenges looking forward, the most significant to talk about is the assessment that is beginning of this first year of implementation of the program.
		First, detailed assessment measures were established.  Groups are forming.  The groups are made up of regional people and Headquarters, looking at both individual issues that have arisen, like how to deal with cross-cutting issues in the new program, the level of documentation in inspection reports, and the like, as well as rolling up all of the data that has been collected through feedback forms as inspectors have performed inspections, and modifying the program, inspection procedures, and guidance.
		This will be a heavy activity in January and February.  There are both internal and external lessons-learned workshops.  All of this will be rolled up at the end of June in a Commission paper.
		Staffing.  Staffing is always a critical issue.  It's a critical issue now.  Certainly as we
move to this new program, several arenas continue to train people on the program itself.  It's still
evolving.  It will undoubtedly evolve for some time.
		Also very important, training and making sure people have the skills to do this program. 
Just to give you an example, fire protection.  We have to be sure we have the people, the tools, the
skills, to do that inspection program effectively.
		There's a task force that has formed jointly among the regions with Headquarters and NRR,
and they're examining program guidance for inspector qualifications, and, lastly, I think all of us are
very much in a hiring mode, and we need to replenish the staff with top talent.
		There's another aspect of that, and that is training those people, the matriculation process, and retaining those individuals.  So all regions, as Sam mentioned, are in a heavy effort of training and providing the skills to be effective, and we in the region have an interim program developed to assure that folks who come to us have very high skills and are given the needed attention and mentoring and the like to be effective.
		Communications, Frank mentioned at the beginning.  I won't say anything more than that it
continues to be a significant activity as change occurs in really several arenas. 
		The GAO study last year reported a certain skepticism about the new program.  I would say that we have come a long way, I believe, in terms of having people buy in to this program and support it, but that's a continuing effort and a continuing challenge.
		The last thing I want to mention is the area of enforcement.  Two things to mention here.  There's a task force that has been formed to address issues that have arisen over the way we handle our employment discrimination cases.
		This is a group led by Phil Archer, with OGC, regional people, and program offices, and numerous meetings have been held with input from stakeholders, meetings in six cities actually, cities that were in fact targeted in some cases for having personnel event history and traffic in this area.
		We have to reach out and get input and examine options, and at this point, the task force is on target to deliver an assessment to the Commission in June.
		Just briefly, one last item, and it has to do with examining alternative dispute resolution, looking at its potential advantages.  That will be completed by the end of September.
		MR. MESERVE:  We'll now turn to Mr. Congel for his presentation.
		MR. CONGEL:  Good morning, Chairman, Commissioners.  
		I'm pleased to be here this morning to describe to you the program accomplishments by my
group for the past year.
		Before I get into Slide Number 10, I'd like to just quickly reflect on our output measure, the output measure for the IRO, as reflected in the Emergency Response Performance Index.  That index is composed of seven parameters that we believe are critical to measuring the capabilities that we maintain for this agency's response requirements and responsibilities.
		As an example, I won't go through all seven, but I'd point out that response organization staffing is one of the principal parameters.  It's a reflection of the level of readiness we have.  We have designated positions with a goal of having at least three people who are trained and qualified for each position.  We have exceeded that.  There are actually only a couple that are at three; most are at four, five, or more.
		I also want to point out that this index reflects both Headquarters as well as regional capabilities.  We frequently have to trade responsibility for being ready.  More recently, with Region 4, it resulted in our staff here going out to back them up in case anything happened at a Region 4 plant.  Conversely, if we had a snow day here, one of the regional offices would back us up.
		We have spent extra time also to ensure that Region 4 is capable of essentially replicating
the capabilities here, and in any case, all of these seven parameters are combined linearly to provide a
measure of effectiveness.
		When we established this about three years ago, we set a goal of 90 percent based on seeing what kind of level we were attaining.  We've been meeting 95 to 99 percent over the years; it was 95 percent last year, and hopefully 99 percent coming in.
		So, this will provide a continuing challenge for this agency to perform in an outstanding manner in this area.
		Now, I'll go to my first slide, Slide 10.  Over the past year, the principal major accomplishments are listed here.  Of course, the Y2K response effort was substantial.  I just find it remarkable that something of that magnitude has actually faded so quickly, but there was, as you all know, an agency-wide effort placed in this, and I would have to say the extra effort paid off in terms of it turning out to be a relatively smooth transition.
		What we have done, of course, is learn from that, and we'll get into, in a few moments, how we maintain a continuing efficiency and response capability in my organization at all times.
		But just as a reminder, we not only established Region 4 in a very effective way as a back-up capability, but we also utilized that experience, combined with the agency's requirements, to have a continuity of operations plan.
		Overall, what we did for Y2K continues.  That's particularly notable in our communications
capabilities as well.  
		The one-voice initiative is another major effort that is on-going.  It was a result of an accident that happened in Japan over a year ago now, and what we found out from that experience is that the existing infrastructure, which is oriented toward protecting U.S. citizens, did not work in as effective a way when we had a distant accident that didn't have a direct impact on the United States but nevertheless generated substantial interest in what possibly could happen here, given that we have the same technology.
		As a result of that event and the activation of the response center, we learned very rapidly that EPA had lead responsibility and did its job, mainly activating the radiation sensors around the country to ensure protection of the citizens.  However, questions remained from news sources about what was going on technically and what we had here in the United States.
		In fact, there's a meeting with the FRPCC going on right now, where a proposal in front of them is being discussed, but we are working with the other departments and agencies so that, when an issue like this comes up again, we're coordinating beyond what the role of EPA currently is.
		I would say we're roughly halfway through that, and there is awareness now that didn't exist
a year ago.
		A major effort that was initiated over the past several years and is continuing now has to do with the Presidential Decision Directives.  Since the bombing in Oklahoma City in 1995, there have been four PDDs issued that put direct responsibility on us as well as other federal agencies to be ready for what are called unconventional threats to national security.
		There are efforts involved here that go well beyond what we had in the past.  There are multiple -- there is no single point of contact to coordinate this, and it continues to be a real challenge to ensure that we fulfill our statutory responsibilities for public health and safety at our licensed facilities but also are able to function in an environment where there are new responsibilities on our law enforcement agencies.
		I believe that we've made substantial progress.  We conducted two initial exercises with our
counterpart federal agencies, law enforcement ones, over the past year, and we are well on our way to
another major exercise this year.
		I'd point out that we submitted a Commission paper approximately a year ago with an estimated schedule and have been able to exceed what we anticipated doing within this time frame.  We hope to be able to continue with this same progress.
		The ultimate measure, of course, of our readiness is how we respond to real actual events, something other than an exercise, and, of course, the most notable one in this time period was Indian Point 2.  We looked at our activation times, decision-making times, all of the parameters that go into our performance index, and found that we indeed met, if not exceeded, them.
		We also had smaller events to respond to, not as major as Indian Point, but all of them were also subject to the same test, with various levels of activation of our response center as well as the corresponding regional response centers.
		Next slide, please.  This refers to self-assessment activities.  My organization performed a self-assessment about three years ago.  All of the principal findings were implemented either as they came out of the report, before it was completed, or shortly thereafter.
		The principal efforts continuing as a result of this self-assessment are in areas where we can improve efficiencies, and data is essential in terms of the budget structures that we're in now.  I'd like to just point out a couple of examples where we have continued to do that.
		The International Nuclear Event Scale is a process in which all of the cooperating agencies in the world participate, and we're looking into having the manual submissions move to an Internet basis, similar to the system that was utilized during Y2K.
		The state outreach program is one that is very popular, and we continue to get requests that exceed our capability to respond.  What we are doing with state outreach is looking into ways in which we can combine the training, combine meetings, and combine the different subjects, so that we can make maximum use of this unique communication link that we have, which is an essential one, and at the same time improve the manner in which we interact.
		There is nothing that substitutes for direct, face-to-face-type interactions.
		Overall, utilization of the agency's web page as well as use of Internet communication is one of the places where we're spending a substantial amount of time in order to promote efficiency.
		Interaction with FEMA during a national event through an Internet-based system, for example, has been utilized a number of times, and this will be something that we'll utilize again.  The old way we did it was with fax machines.
		The Diagnostic Evaluation Program was in place when I arrived at the agency and continued in a support role for some number of years.  This past year, we were able to link it directly with the new Reactor Oversight Program, so that the features of the diagnostic evaluations that have proved very valuable in the past were preserved but integrated with an existing program, providing a nice smooth flow, with oversight associated with NRR's responsibilities, within this already-established program.
		The Commission was informed of this relatively recently, but it was another efficiency area where we wanted to preserve the safety functions performed.
		In addition to the key challenges I've just mentioned, particularly with the PDDs, we have had and continue to have a challenge of assuring and ensuring that we have our direct access lines to all of our licensee sites.
		There was and continues to be difficulty in transitioning from the FTS-2000 to the FTS-2001 system.  We have developed our own separate contract with AT&T, the original provider of the direct access lines, because the transition did not take place within the originally-established time schedule.
		We're still working on that.  We regard it as a key part of our capability to respond, and
there has been no loss of continuity in this system at all as a result really of my staff's substantial
effort here.
		I mentioned the PDDs.  We're working with the FBI and other law enforcement agencies, but principally the FBI, and that has presented a number of challenges.  Their structure is different from ours.  The orientation of a law enforcement agent is different from ours and has presented a challenge to our technical staff.
		I believe that the interactions we're having, particularly with their field offices, are proving to be fruitful, and I look forward to the end of this fiscal year resulting in a plan in place that we can fully rely on if one of these terrorist-based events occurs.
		And with that, I will entertain questions at the end of the presentation.
		Thank you.
		MR. MIRAGLIA:  We'll turn to Ashok Thadani.
		MR. THADANI:  Thank you, Frank.
		Good morning.  May I have Slide Number 12, please?
		I will very briefly cover the Office of Nuclear Regulatory Research accomplishments and the
challenges that we face in the reactor environment.
		We slightly exceeded our measure of 45 products; we've had 47.  By the very nature of a researcher's work, most issues tend to be one of a kind, but I have divided our accomplishments into four areas to give a sense of what we're involved in.
		First area is analytical evaluations.  The second area relates to experimental work.  The
third is programmatic, and the fourth one is cooperative efforts.
		I will give some illustrative examples to give you a sense of the areas.  In the analytical area, we're continuing to improve the agency's analytical tools to increase our efficiency and effectiveness and be able to conduct more realistic analyses.
		We're now consolidating multiple codes and moving to a modern approach.  At the same time, we're trying to improve the fidelity of these codes.  Last year, we also qualified a severe accident code, which was modified to be able to do this.
		We also consolidated three codes, TRAC-B, TRAC-P, and RAMONA, into a single TRAC code.  We for the first time did a pilot 3-D project for pressurized water reactors.  We think this is particularly important as we go forward, having to do with issues such as MOX fuel and so on.
		Another example under the analytical area is the evaluation of operating experience that the office conducts.  As you know, this is a function that was formerly the responsibility of the Office of Analysis and Evaluation of Operational Data.
		A key element of the work here, as Frank mentioned, is that the Strategic Plan has a goal of no more than one significant precursor, and it is the Accident Sequence Precursor Program that evaluates and attempts to understand the risk significance of operational events.
		Another area of the operational experience evaluation has to do with understanding reliability of the more important systems.  We've conducted a number of studies.  This is an on-going effort, continuing to see if there are trends in terms of the systems and the frequency of various events.
		We also conduct some specialized studies, such as issues related to design errors and their importance.  As well, this year we completed a study of valves, to see if they have issues that deserve attention.
		These evaluations have additional value in that they identify important areas for inspection activities and reviewing submittals, as well as for understanding the trends in terms of maintaining safety and performance.
		In the experimental area, I will just discuss one example of our accomplishments.  The example I want to talk about has to do with coatings and emergency coolant system performance.
		Containment coatings applied during construction are expected to last for forty years and through a loss-of-coolant accident as well.  Qualified protective coatings in containments have actually failed during the design life cycle, and this has raised concerns.
		Failure of such coatings could result in a debris source which could transport to and impact
emergency system sump performance.  Specifically, the accumulation of debris on sump screens or strainers,
as the case may be, could increase the resistance across the screens and thus reduce the net positive
suction head to the emergency pumping system that takes the suction.
		We divided the research into two areas.  The first part is coatings, and the second part is sump performance.  The coatings part is essentially complete.  The coatings research will provide data on the coatings debris characteristics, which we will use in dealing with potential sump problems.
		Results will also be used to support or revise an ASTM coating standard for nuclear power plant safety coatings.  The sump performance research will be used to understand the transport paths to the sump and will include the sensitivity of the net positive suction head margin to debris accumulation on the sump screens.
		That is because we believe that there will be some plant-specific issues, with the geometry bearing on the transport to the sump.  Finally, of course, we will conduct regulatory analyses to determine what actions are appropriate.
		In the area of programmatic considerations of risk-informed regulation, the Commission has been briefed on a number of occasions.  I believe you know all the major issues, so I will not go over those.
		I just want to say a couple of words about risk-based performance indicators.  We're looking
at the technical feasibility to see if we can develop a set of performance indicators that can provide an
expanded understanding of contribution to risk from operational considerations.
		We are working with the Office of Nuclear Reactor Regulation and internal stakeholders to
ensure that whatever concerns they may be looking at, we consider.
		We owe the Commission a paper this summer, as well as a briefing, and we will provide that paper in the future.
		Another program area has to do with generic safety issues.  Over the last two years, we've placed significant attention on this matter, and I believe the results reflect this increased attention.  We were able to resolve six generic safety issues last year.
		The fourth area relates to cooperative agreements.  Again, in our attempt to be more efficient and partially compensate for some of the budget reductions we've experienced, we have increased our cooperation both with domestic organizations, such as the Department of Energy and the Electric Power Research Institute, and with the international community, and we have signed a number of additional agreements for cooperative research.
		Chart Number 13, please.  Research has been involved in several self-assessments; I'll very quickly cover four of them.  The first has to do with generic issues processing.  We developed a draft management directive in October of 1999, which we believe provides more efficient procedures for handling and evaluating generic issues, and we are actually conducting some trial cases using the revised process.
		We will discuss the lessons learned with the Advisory Committee in March, and we will finalize the management directive in June of this year.
		The second area has to do with regulatory effectiveness assessments.  As part of this effort, we assessed the effectiveness of two regulations, the station blackout regulation and the anticipated transient without scram regulation.
		These assessments are based on a comparison of the expectations and the determinable outcomes.  Basically, for the station blackout activity, we concluded that the risk reduction expectations were achieved and that industry and NRC costs to implement these rules were reasonable.
		There were a few areas that were identified for NRC attention as a part of this evaluation.
		We also have had an on-going effort at evaluating research programs.  You might recall that in 1997 there was a Research Review Committee which evaluated research programs to provide guidance to the Director of Research and provided its recommendations to the Commission as well.
		That function has now been taken over by the ACRS, which conducted an annual review of research programs and provided a report.
		As part of this evaluation, we also have a Research Effectiveness Review Board, composed of senior managers from Research, NRR, and NMSS.  We also have a regional representative as part of this board.  They look at the effectiveness of our internal processes, quality, timeliness, and research audits, and again we owe a paper to the Commission, which we hope to complete in the next two to three months.
		Another evaluation, this time by an external panel, is being conducted by a panel of experts from industry, public interest groups, and the Department of Energy.  They are particularly looking at the role and direction of research and some of the issues that we may have to deal with.
		The panel is chaired by former Commissioner Kenneth Rogers.  They have prepared a draft proposal on Phase 1 of their evaluation.  To get into more details under Phase 2, meetings are set for January 24th and 25th of this month.  They expect to complete their report in March of this year.
		That brings me to some of the key challenges that the office faces.  Once again, I will not go into those challenges, because I think you know them very well.
		Another area where I think it is becoming more likely that we'll be facing some of these challenges has to do with advanced reactors and advanced technologies, new technologies.
		In terms of advanced reactors, the long-life reactor technology, the PBMR, in fact, is something we have to deal with in the not-too-distant future.  At the agency, we have little expertise or experience in this technology.
		Even internationally, many of the people who were actively engaged have retired, and we're going to be struggling to retrieve some key technical information.
		We are preparing a plan on the PBMR which we hope to send to the Commission in about four to six weeks.
		Obviously, you know about the potential challenges for the AP-1000 and other advanced designs.  There will be some unique issues, I believe, in terms of scaling key design functions, and we'll be challenged in terms of our ability to perform analyses, for example, for that design.
		In the advanced technologies, there are several areas, but there are two areas I want to highlight.  One has to do with continuing changes in digital controls, and the other with the industry's moving forward to newer fuel and cladding designs.
		There are important barriers in terms of limiting any releases.  We need to make sure that we have an adequate database as the industry moves forward.
		The last challenge, which I personally think is probably the biggest challenge, is the threat to our infrastructure, which really is related to the points I've been making in terms of our capability to deal with the major issues, new designs, new technologies, and so on.
		From a research perspective, infrastructure means a highly-skilled staff, key experimental facilities, and up-to-date analytical tools.  It is in my view difficult to have one without the other two.  I think one needs to have all three key elements.
		At the Office of Research, we have a particular challenge.  49 percent of the staff is actually eligible for retirement.  I think it will be a challenge to recruit highly-skilled, well-recognized specialists as we move forward, but we are working on that.
		Over the past several years, as you know, there's been a decline in terms of the facilities available to us.  I think there is danger that some additional facilities will be shut down over the next two to three years.
		In fact, recently, we were informed by the University of Michigan that they're going to shut down their reactor.  This is going to be particularly challenging for us, because that reactor was modified to be able to do special studies for us in terms of looking at pressure vessel materials.
		They are able to take big pieces of material and look at the fluence without repeated cycles.  That is, they promised that they'll operate the reactor continuously, so the temperature and cycling situation will be proper.
		If they shut down, and I believe now they have made the decision to shut down, we're going to be looking to see what our alternatives are, because we do need to make sure that we have data in terms of the regulatory requirements for this material for long-term operation.
		So, in summary, I see the issue of infrastructure as probably the biggest challenge to the Office of Nuclear Regulatory Research, and we are trying to compensate for some of these challenges, as I said previously.
		Thank you.
		MR. MIRAGLIA:  We'll go next to Mr. Raglin.
		MR. RAGLIN:  Good morning.  Slide 14, please.  I'd just like to highlight a couple of major points.  This will be on technical training for the reactor arena.
		There's one key output measure and some secondary measures.  The key output measure is that the numbers and types of courses provided can meet at least 90 percent of the cumulative needs identified by the offices and regions.
		There are some secondary measures for quality, effectiveness, and efficiency that are included in those slides.
		I would like to reinforce one thing that Hub Miller mentioned a little earlier in the
presentation, and that is the training in support of the initial implementation of the revised reactor
oversight process.
		We were able to accomplish that training while providing the regularly-expected set of technical training courses, and this was not a trivial effort.  We trained 438 staff members in, I think, 12 courses, two of which were given in each of Regions 1, 2 and 3, three in Headquarters, one in Region 4, and two at the Technical Training Center.
		This required very close coordination between HR, NRR and the regions to keep the content
updated as the program continued to evolve, even while the training was in progress.
		Among the secondary measures for technical training, quality is measured based on the percentage of course evaluations that are satisfactory or better.  Effectiveness is measured by the percentage of students who pass the exams, for courses which have exams.  Efficiency is based on the enrollment in terms of actual versus capacity.  All of these measures were exceeded for reactor technical training.
		Slide 15, please.  There are a couple of key challenges, mainly in two areas.  One has to do with the qualification and training requirements for reactor program inspectors in general, and the other has to do with risk training in support of the reactor arena.
		Hub Miller mentioned the Manual Chapter 1245 task group, which has been created and is working and will continue to work until about the end of the summer.
		This task group is sponsored by NRR and has active participation from each region, NRR, and HR.  Additionally, there is a steering group consisting of SES managers from HR, NRR, and Region 2.  This group has met in various locations.  They have or will have met in every regional office, here in Headquarters a few times, and at the Technical Training Center, and as such, they've taken the opportunity to gain the perspectives of regional and Headquarters managers up through office directors and regional administrators.
		This is a comprehensive effort, and part of the work is to develop competency requirements and associated knowledge, skills and other attributes for the inspection staff, to assess the current inspector qualification requirements, refine the objectives of Manual Chapter 1245, and consider new or revised training requirements in the context of the revised reactor oversight process.
		The group will be making a number of recommendations by about late summer, and then we will
have the challenge of incorporating those recommendations, modifying the program as necessary to provide
the reactor technical training needed for the future.
		Another key challenge has to do with risk training for some selected regional personnel.  There is a need to develop a small cadre of people within each region who can assist the regional senior reactor analysts in working through the probabilistic significance determination processes.
		As such, the Office of NRR, the regions and HR are again working in concert to develop a
program which will allow this, and we generally anticipate that three to four people from each region will
attend this same formal training that has previously been done to qualify the senior reactor analysts.
		HR will therefore be providing a second series of these sequenced PRA courses to support
that effort for FY 2001.  We generally expect to do the same thing for FY 2002.
		With that, I'll pass it back.
		MR. MIRAGLIA:  Slide 16 is a brief summary.  We've met our strategic and performance goals,
and we're using a disciplined planning and oversight assessment process to address the key challenges. 
		That completes the staff's prepared presentations, Mr. Chairman, Commissioners.
		MR. MESERVE:  Thank you very much.
		It's apparent from the presentations you've made this morning that you have a sweeping set
of responsibilities and a very aggressive set of programs you have underway.  I'd like to thank you very
much.
		Let me turn to Commissioner Dicus first for any questions.
		MS. DICUS:  Okay.  Let me ask a couple of generic questions and then maybe one or two specific ones very quickly.
		Communications has been talked about quite a bit, and you brought up the issues with it and how you're dealing with it.  It's one of the key challenges, which I think we can all agree applies throughout the various offices involved in this arena and maybe even extends to the training.
		I know one of the things that we've talked about in the past with the Revised Reactor Oversight Program is being sure that we have buy-in all the way through the staff, and you mentioned the qualifications of inspectors and things of that nature, the challenge of doing team inspections, a little bit different from what we've done.
		Do you want to elaborate on how you feel we're really getting down to the last person in the arena to get the buy-in, to have the training, to feel a comfort level?  Anyone?
		MR. MIRAGLIA:  We've had many internal communications plans relative to that subject.  Hub addressed it briefly in his remarks.
		Sam, do you want to add to the perception that we have?  We haven't conducted a follow-up survey in terms of getting statistical data, but in terms of our interactions with the staff at counterpart meetings and regional communications with staff, I think the initial implementation broadened the understanding of the program.
		The completion of the training gave a level of understanding of the program.  I think our sense is that a survey would show a significant improvement.
		MR. MILLER:  Yeah.  We spent a lot of time in the field, and obviously one of the things
we're looking for is our people, you know, accepting the program and doing it with some enthusiasm, and my
feeling is that as they have gotten into it, and it has been challenging, the acceptance and the support
for it has gone up.
		They've seen the results.  They've seen the positive results.  They're inspectors, and they
are questioning, and, so, they need to continue to question and make sure that it's the best program we can
possibly produce, but I think acceptance has improved.
		MS. DICUS:  Okay.  That's good to hear.  Did you want to add to it?
		MR. COLLINS:  Thank you.  I think this is a very good area.  It's a challenge both internally and externally, among the NRC staff themselves, the licensees, who are one of the direct stakeholders, and the interested and affected public.
		We're approaching it in various ways.  There's internal staff training which Ken and our
regions are sponsoring.  Bill Dean and his team have done numerous briefings within the regions at
counterpart meetings.  There's been specific training and updates to the revised process as changes ensue
as a result of the initial implementation done at each region.
		They have visited a number of selected sites in each region throughout the revised oversight process to sit down with the resident inspectors, who are significant stakeholders, and with licensees to receive feedback.
		My impression, and, John Johnson, you keep me honest, is that we do intend to do another
survey to ensure that we do have buy-in or understand the issues from stakeholders.  We have the web site,
which not only depicts the on-going status of the program implementation but also allows for questions to
be asked through the interactive web site, and the results of those frequently-asked questions are posted
on the web site to ensure that the clarifications and the issues are available to all stakeholders.
		In a broader sense, under new initiatives in the Office of NRR, we have budgeted resources in the area of converting the staff to a risk-informed mindset, if you will, to ensure that we continue to progress in that area, and also staff outreach and change management.
		Those two areas are budgeted with resources, FTE and contract dollars, to provide for
expertise through the next three years.
		MR. MIRAGLIA:  I think Sam's last comments are indicative of the challenges not only within the context of the reactor oversight process but across all the changes that are on-going, and we need to continue.  It's going to be a continuing challenge, and I'm not sure that the challenge is ever going to go away.
		As the degree of change occurs, we're obligated to explain internally and externally the
reasons for the change and how it's consistent with our mission and provides the training and the
understanding both internally and externally.  So, it's going to be a constant challenge.
		MR. THADANI:  If I may just give you a sense, I think research has some unique aspects that deserve some mention.
		In addition to the various plans, such as the risk-informed regulation communication plan and so on, we periodically hold a number of meetings with our own staff to make sure that they have an understanding of what some of the initiatives are and how they relate to the work that the office does, including the various external assessments that are going on.
		In addition to that, it has become increasingly important for us to have a continuing dialogue with the Electric Power Research Institute, the Department of Energy and some other organizations in this country, to share with them what our views are on major issues, where we might be heading, and where they are going, to see if there are opportunities for cooperation.  So, it's an important element for us.
		MS. DICUS:  And it's clear that everyone, all of us, need to understand exactly where we fit
in with the Strategic Plan, what our job does to support that, and also the performance goals and the
output measures ultimately.
		One other thing, kind of along this line.  Since we've gone to the arenas, this is the first briefing that we've had on the reactor side of the house, with multiple offices involved, and certainly we had the one for the materials side of the house.
		Are there any lessons learned or any challenges or opportunities that you've seen among the
various offices interacting in this arena?
		MR. MIRAGLIA:  I think a number of those have been alluded to at the briefing today.
		Many of the initiatives are ones where the lead would be in a particular office.  Frank Congel mentioned the PDDs.  That was an integration not only across the reactor arena but also the materials arena.
		So, I think that's fostering the communications across the agency.  The challenge continues
to be how do you integrate the competing needs within the arena, and then integrate all of the arenas, and
that's going to be a continuing challenge, and that's because of the budget realities that exist.
		Sam, you are going to add to that.
		MR. COLLINS:  I think that is an insightful question.  The more our programs cut across program offices, and the more we depend on arenas to provide for continuity, clarity and predictability, which is a concern of our stakeholders and, at least three years ago, is what drove a lot of our new initiatives, the more we have to be fairly tightly coupled, have the same philosophical view, and be willing to share leadership and support roles.
		Examples of that would be the role of the Office of Enforcement in defining and supporting
the significance determination process and the action matrix as well as their role to provide for support
for OI findings in discrimination cases.
		Frank provides for the role of the Incident Response, which is heavily dependent on the
availability and the expertise of the program offices.
		The arena managers provide for that.  My view, looking up at that landscape, is that through
the role of Frank and Bill and the arena managers, they keep everybody closely coupled.
		At the executive level of the offices themselves, there needs to be continued emphasis on the willingness to share information and challenges, and we have those.  Those are real issues as you cut across offices, and the Research Review Board is one way to deal with those.
		We had a meeting yesterday, for example, with HR on workforce planning, and as we continue
with these specific initiatives, the challenge in the future will be to be sure that the lead office who
defines the program keeps the implementers, if you will, engaged and involved and provides for feedback.
		The cost center initiatives under the agency, I think, are going to further stress those
areas that you mentioned, Commissioner Dicus, and we're moving in that direction.
		MS. DICUS:  Okay.  One quick final question.  You mentioned that you met all of your timeliness goals with the response to IP-2.  Are there any lessons learned from the IP-2 response?
		MR. CONGEL:  There are two things.  Sam, sitting next to me, was one of the initial decision-makers, and he can add some firsthand oversight, but we learned something from all of our responses, regardless of whether they're an exercise or the real thing.
		The guideline that we have is that after the initial call is received, we'd like to make a decision on what the agency response will be, whether it's an escalated response, within 30 minutes.
		The other goal is that after you've made a decision to respond, we want to have staff in the operations center ready to perform those functions within 60 minutes.  As I recall, with Indian Point 2, the timeliness with that one was pretty easy because the severity of the event lent itself to a rather straightforward decision.  So, that was easily made.
		It was approximately 8:00 or 9:00 in the evening, and I got there after Sam, I remember.  We looked, and we were staffed up in our usual time, as we have done in unannounced practices, in the 50- to 55-minute range.
		MS. DICUS:  Okay.  What about lessons learned?  Were there any?  Have there not been?
		MR. CONGEL:  The principal lesson learned that came out of that was in terms of communications, the manner in which we described the accident in the public arena.  The glitch that I remember very well was the statement about off-site radioactivity releases, and we have had a number of sessions since then, because if you talk in absolute terms, essentially something did come up.  There was an operation that was performed in terms of internalizing the release.
		Rather quickly, there was a release that took place for a while.  You can't say zero, but it certainly was something insignificant.  We learned a lot about how we have to communicate this in the future.  We worked very closely with Bill Meecher and his staff on that.
		MR. MILLER:  Commissioner, can I add a few things, having been involved in that from about 8:00 one night until 6:00 the next afternoon, when we secured from our heightened monitoring in the region?
		The decision-making went well.  We went into standby, and I was in conversation with Sam and with Frank's people, and we made timely decisions.  We made good, timely decisions with respect to shifting from standby back to regional monitoring.
		There are always things that you learn from the point of view of how the ops center, the incident response center, works in the region, and in fact we learn almost more from those events than we do from exercises, so there are a lot of detailed things that we typically learn from that.
		With respect to the issue of the release, there is a lesson that we learned, I think, industry-wide, and I was with Luiz yesterday, in fact, speaking to plant managers of Region 1 and Region 2 and talking about this.
		That is, the need to condition stakeholders, the off-site response people, to the much more likely situation that if there is an event that involves an alert, as this one did, it is far more likely that it will be something minor, and that if there is a release, it will be small, as opposed to the very significant events that they normally practice on.
		It's a matter of conditioning people off-site so that when they hear about a release,
they're not instantly reacting as if it's the thing that we practiced on, and there was a fair amount of
that.
		So, that's an industry-wide learning, I think, that has been passed around.  I know
ConEdison has attempted to pass that on.  That's less something for us, you might say, but it's, I think,
one of the most significant things that came out of that event.
		MS. DICUS:  I may need to dust off my reality and exercises speech. 
		Thank you.
		MR. MESERVE:  Commissioner Diaz?
		MR. DIAZ:  Thank you, Mr. Chairman.  I have one question with two parts that will be addressed to Sam, to Frank, to Ashok, and to Ken, and it addresses, you know, the issues of this past year and this coming year.
		We have always been talking about, and it's clear in here, that we're always looking for key process improvements, and by key, I mean something that would rank high if a significance determination process were applied to it, something that will really catch your attention.
		So, for this past year, I want each one of you to tell me one key significant improvement or change, okay, that contributed to increased efficiency, effectiveness and realism in the agency, and for the coming year, one similar issue that has some policy implications that the Commission needs to be aware of and that will be able to increase efficiency, effectiveness and realism.
		So, last year, this year.  Sam, would you like to take the lead on that?
		MR. COLLINS:  Yes.  Thank you.  I think I have some choices here, but I'll focus on --
		MR. DIAZ:  Have to be big now.
		MR. COLLINS:  Have to be big?
		MR. DIAZ:  Big.
		MR. COLLINS:  Okay.  For last year, I would focus on the license renewal process.  Let me explain the efficiency, effectiveness and realism aspects of your question.
		This gets into a question that Commissioner McGaffigan asked during Bill King's Materials Arena Briefing, too, about stretch goals.
		I believe the original prospect for license renewal was five years, and we're down from 36 months to 30 months now.  We're at 25 months without a hearing.
		Additionally, not only have we met that goal time-wise, but there are the initiatives from Chris Grimes and his team, with the issuance of the Generic Aging Lessons Learned Report and the Standard Review Plan.  Although there are challenges that have been expressed by the industry as far as the use and the depth of those documents, I believe that they will further improve our ability to achieve the 30-percent efficiencies that we're challenged with in the out-years.
		MR. DIAZ:  Okay.  I'm sorry.  We realize that, but, you know, what was the key process improvement that actually led you to achieve that outcome?
		MR. COLLINS:  All right.  Well, I would point to those products themselves, especially the Generic Aging Lessons Learned Report, which is an optional document created for the purpose of forwarding information to the industry and providing standards, if you will, for follow-on plants.
		The Standard Review Plan is a more typical document.  However, I think its composition would lend itself towards achieving those efficiencies and that effectiveness.
		Realism is a scoping issue in license renewal, and I would say that the ability to limit the scope of license renewal to those areas that are specifically age-related degradation, with the input of the industry and the stakeholders, speaks to that point.
		For next year, 2001, I could talk about the revised oversight process, but perhaps I'll focus on the security and safeguards area and talk briefly about the integrated rulemaking on Part 73.55, which the Commission is well aware of and on which the staff is challenged.  This is an area that's integrated with OGC support and Materials support from Bill King's office, as well as other stakeholders.
		It cuts across a number of areas, including the OSRE and the SPA.  I would say that the efficiency and the effectiveness and the realism in a performance-based area come from the redefinition of the OSRE Program, the on-site testing, and our ability to achieve that with the input of stakeholders, although certainly that's still an evolving area.
		More important is the transition to the SPA, which is the industry initiative for testing.  I think that will be another efficiency, with an effectiveness yet to be seen and to be tested, and it will include realism.  The first of those is scheduled some time in the March time frame, although we believe it may be a little later than that.
		NEI is a large stakeholder in that SPA initiative as far as sponsoring the pilot plants.
		The definition -- I don't want to get into a lot of detail, but the definition of the
adversary characteristics as far as efficiency, effectiveness and realism, I think, is an accomplishment
that will move forward.
		The spent fuel pool and the SPC and the working group report having to do with security are yet to play out.  The report is in front of the Commission now.  I think we have that to look forward to, but the intent of that is driven by the initial exemption request, and the Commission's tasking of the staff to do that study is meant to provide for efficiencies and realism in the handling of spent fuel.
		MR. DIAZ:  Okay.  All right.
		MR. COLLINS:  Did I wear you down?
		MR. DIAZ:  Yeah.  No, no.
		MR. COLLINS:  No?
		MR. DIAZ:  I will see you later.
		MR. CONGEL:  I think you have, Sam.
		MR. DIAZ:  Key process improvements, last year or on-going, and this year, its policy
implications.
		MR. COLLINS:  I would focus on the OSRE.
		MR. CONGEL:  No.  You're done.
		MR. COLLINS:  I'm done.  Thank you.
		MR. CONGEL:  You're out.  I believe mine is pretty straightforward.  The thing that has provided the major challenge over the past year has been upgrading our interactions with law enforcement agencies.
		I certainly touched on it briefly in my presentation, but you have to picture that we went from a rather mature program, where we had the infrastructure with our fellow federal agencies for a wide range of what we consider conventional types of accidents, into an arena that involved an evolutionary process within our society itself, new major responsibilities on law enforcement agencies, and, of course, the same requirements thrust upon us.
		The challenge was to make use of the existing infrastructure to the extent possible and practical, and at the same time to learn and grow with the lessons that were being applied in an area for which we don't have a lot of internal expertise.
		I believe that we were able to meet most of the challenges in that time frame by making use of the expertise that existed within NRR and particularly NMSS, while at the same time communicating the new needs in a changing environment.
		MR. DIAZ:  Okay.  This year?
		MR. CONGEL:  Okay.  Next year is really a continuation of that.  We had two drills that were learning processes with law enforcement agencies last calendar year.  What we plan to do this coming year, and we're waiting for feedback from the Commission on our proposed exercise schedule, is to build on what we've learned, to replace what I consider a conventional exercise, namely one with the long-term infrastructure, with a new one at Palo Verde that is based on the lessons learned we've had in the past, and then proceed to a point where we have a degree of comfort that is close to the kind of capabilities that we currently feel confident with using our standard approaches.
		MR. DIAZ:  Okay.  Thank you.  Key process improvements this past year, this coming year.
		MR. MILLER:  One thing that we did in Region 1, and I think other regions did similar things, was to set up a work control center to help facilitate this planning process and to schedule all of the inspections that have to be done.  It has been very challenging, and we've met the goals so far, and I think much of that is the result of the tools that were established.
		We developed some tools actually for tracking inspection hours as a part of that, and I think I'm a little bit like Frank.  I see this coming year as being a continuation of that same thing, you know, the effort to implement the program effectively, and at the same time to do this assessment roll-up for the first year and respond to these special situations.
		MR. DIAZ:  Okay.
		MR. THADANI:  For this year, I will go back to the example I used, generic safety issues.  As I said, we've revised the process.  Both the initial assessment and the prioritization process have been changed.
		We now have an interoffice group that takes a look at any identified issue, making a decision on whether it is important enough to pursue at all, and if it is, then going forward with a fairly abbreviated prioritization process, and then during the resolution phases, we have increased management attention.
		I have monthly meetings on the status of the generic safety issues, what the challenges are, and how we might want to deal with those.  I think it's worked pretty well, and we're going to continue to do that.
		I'm going to go to a somewhat different example for the future, because I think in the end it achieves the same objective, and that has to do with, as I said, our moving towards more realistic analytical tools.  In addition to that, there are some aspects that I think are very important.
		An example of that is the use of a graphical user interface approach, wherein the people who use these codes can be more effective and less prone to make mistakes with the availability of tools such as the graphical user interface.
		We hope to complete that next year, and I expect that that's going to lead to some
efficiencies.
		MR. DIAZ:  Thank you.  Ken?
		MR. RAGLIN:  For this year, the most significant thing that I can think of was the integration of the Training and Development Web Site, coupled with the PeopleSoft Training Administration module, coupled with On-Line Registration.  These are three IT-related things.
		When the former Technical Training Division of the former AEOD was combined with the Office
of Human Resources, we basically started with a web site that had two different structures added together. 
It's now totally integrated, same look and feel, seamless to the user.
		By shifting to the PeopleSoft Training Administration module, we've gone from the three different training recordkeeping systems that were used in the past to one, and a side benefit of that is that it opens up opportunities for appropriately-designated people in each of the offices to access the training records, get training profiles on employees and so forth.
		Finally, we've gone to on-line registration through the use of the web site.  That
eliminates the need for paper processing from the employee to the supervisor, and it automatically works
through e-mail in something that acts a little bit like work flow as far as the registration goes.
		For the future, it's my opinion that the most significant thing that we can and should do is to transition some part of the technical training, where appropriate, and some part of other training, where appropriate, to some sort of distance learning methodology.
		It's always difficult for people to go on travel to attend training courses, particularly to attend relatively short ones.  So, I believe that's certainly an area where we should and could pursue distance learning, including web-based training and other possibilities.
		MR. DIAZ:  Thank you.  For time purposes, I'm not going to go back and ask what were the
policy implications of these future things, but it might be something that when you see me the next time,
you might want to bring up.
		Thank you.
		MR. MESERVE:  Commissioner McGaffigan?
		MR. McGAFFIGAN:  Thank you, Mr. Chairman.  I want to join the Chairman in commending the
staff for getting a lot done over the past year and recognizing that they have an awful lot more to do.
		My first question goes to Frank, and it's just a presentational one.  When I first saw the
graphs, and I looked at your challenges, I was going to say, well, geez, where are all the other
challenges, and then they all were in the later sections.
		I mean, if I were looking in your Graph 4 at the major challenges facing this arena in terms of what might drive budgets and whatever, I would have had Sam's challenge of trying to, you know, deal with license renewal, which he's talked about several times.
make improvements in it.  Those are two huge undertakings that we keep -- whenever we see something --
whenever we do letters to someone, we tend to stick them -- the Commission keeps sticking them in the
front, and we get this boilerplate.
		So, I think all the challenges have been identified by the various other folks, but why did
you highlight these three as opposed to the challenges that sort of individuals identified later?
		MR. MIRAGLIA:  I think the key challenge is identifying all the challenges.  I think as an
arena, these are integrated challenges.  This is what faces the arena.
		Sam's challenge is my challenge, and in fact it flows to other areas.  I think a key
challenge is balancing and integrating the competing priorities, and part of that challenge is the
Commission direction, and that changes the balance, and the key is how do you integrate these competing
needs and competing priorities within the resource constraints for the reactor area?
		MR. McGAFFIGAN:  You're a good predictor of my next question.  How is all of this -- I mean, if I were trying to convey to an external stakeholder -- I see Mr. Beetle in the audience and others -- just how tight are things for you all in trying to address these multiple challenges within tight budgets, with events that are not budgeted for, as Mr. Collins suggested IP-2 was not budgeted for --
		MR. COLLINS:  Absolutely.
		MR. McGAFFIGAN:  How tight are things?  How is morale?  I mean, I gave a speech at the Reg
Info Conference about -- I think it was last year or the year before, about the seven simultaneous miracles
you guys were trying to pull off, and how pressed you were to do that.
		Do you continue to feel stressed, and can you, within the tight budgets that we have, pull
all this off?
		MR. MIRAGLIA:  I think the planning process is giving us the ability to say what can go and what work we are doing that we shouldn't be doing.  I think we have lots of competing demands.
		I think during the course of the presentations, you heard of several things where we responded to either increasing or decreasing workload.  There has been a slight decrease in allegations.
		Sam indicated that renewals didn't come in on the timing expected.  So, we had resources, and we applied those resources to licensing actions, but we're limited in how much we can do because of the size of the pie.  I mean, you can only get so many slices out of the pie.
		MR. COLLINS:  We think things are tight, but we think the planning, budgeting and performance management process that we're using is an improvement on the sort of global process that we were using, and it gives us an opportunity.
		It gives us an opportunity to come to the Commission with the benefit of this sort of integrated thinking, such that we can make arguments for or against pushing the envelope on our budgetary needs.
		So, we haven't optimized that process yet, in my thinking, and I think we're making progress, and we're looking forward to going through it once again with the goal and objective of integrating this kind of thinking into each of the arenas and ultimately across arenas, so that we can come to you and say, here's the key work, here's what we think we need to do, and here's what we think we need to do it in terms of resources.
		MR. McGAFFIGAN:  Could I ask Mr. Congel just briefly?  It's again a budgetary issue.
		The transition from FTS-2000 to FTS-2001 didn't work, and we had to do some extraordinary things.  How much did that cost us that wasn't budgeted?
		I know we're going to take it out of the hide and figure out how to do it, because it was vital that we do it, but do you have a guesstimate as to how much that cost us?
		MR. CONGEL:  Well, yes.  There are two aspects to it.  In terms of handling the transition, it took more in terms of FTE than I had ever anticipated.  We had people both from IT as well as my group working on it.
		In terms of the efforts expended, though, we were able to negotiate a contract on our own, separate from GSA.  I couldn't predict this three months ago, but based on the fact that there's been only one transition to FTS-2001, one site, we are ahead of the budget proposal that I had made.
		So, financially, it's not going to cost us any more than I had estimated.  It was just more staff work than I had anticipated.
		MR. MIRAGLIA:  But there is cost in terms of FTE.
		MR. CONGEL:  Yes.
		MR. McGAFFIGAN:  One challenge that wasn't mentioned, and that may be down a little bit in the weeds, I'll just mention to you because we've been thinking about international issues on the Commission, and that is getting ready for the 2002 Nuclear Safety Convention Review Conference and writing the country report.  I forget -- I think NRR has the lead, but it's probably arena-wide and maybe even across the agency.
		What resources do you envision dedicating to that, and what sort of policy issues are going to have to be revisited in order to get ready for that review conference, which is about 15 months away?
		MR. MIRAGLIA:  My recollection is that the resource numbers were three or four FTE.  We have the experience, Commissioner McGaffigan, of the previous report -- although we had ratified, we went through the process of gathering the information, and I think we see a learning process in that.
		So, it's within the context of the planning assumptions, and it's a planned-for activity, and I think it may be in the international arena.
		MR. McGAFFIGAN:  It's because of some of the skepticism that the Chairman faces when he goes to interim meetings, which presumably the team that goes to that review conference is going to have to be ready for.  You know, if you read the French magazine Control that Mr. LaCoste puts out every two months, there's obviously skepticism about risk-informed regulation.
		They're watching our revised reactor oversight process with interest but with skepticism.  In the latest issue of Control, they had the Swiss regulator bemoaning the fact that his regulatory system had been tweaked in the direction of our old process just as we were moving to a new process, which looked a lot, in his view, like his old process.
		So, I think you're going to get a fair amount of policy interaction with other countries as
they critique us, and we probably should take that very seriously.
		MR. COLLINS:  I think that's true.  Of course, Janice Stone Lee with the Office of
International Programs is the overall coordinator of this effort for the agency in concert with the
International Panel, which she chairs, and the program offices represent the technical resources.
		We have discussed this topic, and there are two challenges.  One is the update of the U.S. report, and the other is the review of the reports of the other member states.
		The types of issues that we're looking at and the range of expertise run from those who are very familiar with our programs, so they can update our report and the status of the industry, to the overall context of policy and program issues which ultimately will be represented at the IAEA itself, and that's potentially a two-to-three-week assignment for a number of individuals in that location.
		MR. McGAFFIGAN:  Up to and including the EDO himself, if Mr. LaCoste is the person
representing France.
		MR. COLLINS:  There are various opportunities there.
		MR. McGAFFIGAN:  Right.
		MR. COLLINS:  It has been budgeted for through International Programs.  We're taking a wait-and-see attitude, depending on the grouping of plants, and through Janice, ultimately that might be a policy issue on which the Commission determines it has an appropriate say as to attendance.
		MR. McGAFFIGAN:  The output measures -- you briefly mentioned the need to keep revising those.  For instance, the operator licensing one, where we had to give fewer exams than were planned, may well be a success rather than a failure, because we met the need, and it may mean that the rule is working, and people are indeed doing their own exams.
		But what is the process for revising them?  The other one that comes to mind, you know, is that the Commission by fiat a couple of summers ago said a hundred percent in two years, 95 percent in one year, just because we were pulling numbers out of the hat, and we wanted to improve.
		But at some point, you should probably seize those numbers yourself, and given that you're getting 98 percent in one year and a hundred percent in two, maybe the goal should be a six-month goal or a nine-month goal or a 15-month goal or an 18-month goal, not trying to push you to perfection but to better align with, you know, your actual work efforts.
		What is the process for thinking about those sorts of performance indicators, those sorts of output measures?
		MR. COLLINS:  That is actually a very disciplined process that involves the quarterly reviews, which result in the operating plan and the operating plan updates.
		On a quarterly basis, the Leadership Team, which is composed of the division directors in the Office of NRR, meets to look at the data, and the data ranges from resources utilized -- time, people and money -- through the product lines against the performance goals, which are usually outcomes and outputs.
		There is a standard that's established based on the operating plan and the performance goals
below the level that we've talked here.  It goes down to the individual products.
		There are out-of-standard ranges that are applied to each one of those.  Every time there's an out-of-standard condition, it's reviewed by the Leadership Team to determine what influences are at work.  It can be out-of-standard high as well as low.
		If we're overachieving in one area but under-achieving in another, then that means that we
need to move resources to stay within the bands.  We look at those trends overall, and those trends are
rolled up in many cases as adjustments to the targets, to the indicators themselves, and it's usually done
on a semi-annual or an annual basis.
		MR. MIRAGLIA:  And that's done on the operating plan level, and then, as arena managers, we
have quarterly reviews to talk about the out-of-standard conditions with respect to performance and
strategic goals.
		MR. McGAFFIGAN:  I'm glad that's all going on.  The only question I have, and it may be a presentational question, is this.  We do in the budget submission each year include output measures, which really sound like a one-time snapshot of what we really intend to do, but those are the ones that we are sort of held to in some sense the next year, when I'd sort of pull out last year's output measures and say, well, geez, why did you do this, why did you do that?
		We may want to put in a little bit of an explanation in the future, that this is a dynamic process, where, because of budget reality and the need to balance things, these goals are reviewed throughout the year and in some cases are overachieved and in some cases underachieved, but we do so in order to make the best use of our --
		MR. MIRAGLIA:  I think that's pretty much understood within the GPRA process, in that there are goals that are set, and there are two outcomes.  You either meet the goal or you don't meet the goal.  If you meet it by a mile, you might want to ask yourself whether those are reasonable kinds of goals, and if you don't meet the goal, whether there are contributing factors that influence that, and those are acceptable kinds of things.
		So, I think they are goals, and they are measures to say where you are performing with respect to those kinds of goals, and it's a feedback process.  The whole strategic planning process that feeds into the GPRA has that feedback.  It's a closed circle.
		We haven't gotten through all of those processes in a robust way for all of the arenas, so we're on a constant learning curve.  Those things are being improved as we go along, and that feedback has to occur.
		MR. McGAFFIGAN:  As I say, I agree with everything you guys are saying, absolutely, but I
just hope everybody else is on the same sheet so that we're not arbitrarily held --
		DR. TRAVERS:  Interestingly enough, this is part of the training that all of the managers in the agency got in association with this process, which included this emphasis on revisiting and re-evaluating and changing, where appropriate, these larger issues.
		MR. McGAFFIGAN:  Thank you.
		MR. MESERVE:  Mr. Merrifield?
		MR. MERRIFIELD:  Thank you very much, Mr. Chairman.  The first question I want to direct
towards Frank Miraglia.
		In the presentation, we did have on Page 4 a notion of the issue of communications.  One of the things that we are very concerned about, obviously, is the cornerstone of increasing public confidence.
		While it wasn't specified in the slides at all, obviously I expect it's the staff's interpretation that the public comments are in concert with a lot of the efforts that are underway.
		In visits I've had with some of our inspectors, I know there's an effort to try to get out
and meet more with local government and community leaders.
		I know that we've had outreach to Congress.  I don't know the extent to which perhaps we've
informed people in Congress how they can have access to our new performance indicators, but that may be
something worth exploring.
		The question is:  What are we doing within all of this to try to increase public confidence?  Are there some areas you would point to where we've had some success this year?
		And in looking toward next year, what are some areas where we can enhance this important area?
		MR. MIRAGLIA:  I think the answer is yes, and I think we've had mixed success.  I think the Commission has heard that there's a view and a perception out there that we haven't done enough and we could do more.
		I think the reactor oversight process is one where, as Hub talked about, there were numerous meetings before implementation; we went out to the regions and invited local people and local officials to understand the process.
		The PPR meetings resulted in meetings out there.  Again, those are public.  We've had workshops in terms of initial implementation.  We've had a workshop within each region with the affected utilities, open to the public as well, to ask what the implementation issues are, and things of that nature.
		We'll have additional workshops, taking all of the input from these initial meetings in terms of implementation issues, how we're going to prioritize them, what the most important changes are, and how we should structure these types of changes, and those workshops, as Hub indicated, are going to be taking place in the January-February time period.
		So, that is a model of what we've used in reactor oversight, and I think in terms of feedback, and Hub could add to this, in terms of meeting with local officials and providing the transparency, with the web site Sam has indicated is out there, I think the feedback is that the local officials and state officials seem to --
		MR. MILLER:  I've got good feedback from those meetings that we held in connection with the
start-up.  The web site has been talked about by people that I've been in touch with, and the ability to
look at and see the performance of the plant, I think, has had a positive impact.
		Most important of all, of course, is licensees maintaining a high level of performance.  If
their performance is not there, I assure you it's very difficult to maintain public confidence, and we've
got at least one situation where that exists.
		But I think these various initiatives, the best I can tell, have paid off.
		MR. MIRAGLIA:  It goes without saying, though, that that was an extraordinary effort, and it has resource implications.  Part of what we do, and how we decide what kind of outreach and how much outreach to do, is again to use our four goals, starting with maintaining safety, and to make those trade-offs.
		DR. TRAVERS:  If I could -- and I'll be very brief.  I'm sorry.
		I think you've highlighted an area where we find it very difficult to assess, at least quantitatively, our success or lack thereof.  I think the best thing that making public confidence one of our strategic goals has done is to plant in the front of our brains, in all of the areas that we have responsibility for, the question of what we can do or try to do to enhance public confidence.
		We may miss the mark on occasion, but it's something that we're explicitly attempting to do and think about as we go forward in all of these programs.
		MR. MERRIFIELD:  Thanks for your comment.  I think it should be at the forefront of our
minds.  I agree with that, and obviously continued innovative thinking on all of our parts makes sense.
		Sam Collins.  A little different issue.  You talk about meeting a lot of your goals in terms of outputs, in terms of making sure that licensing actions are done in a timely fashion, and I think the staff has done a very good job overall in that area.
		We have had some areas that I would term our "legacy" issues -- DPOs, 2.206 petitions, and other actions that have been out there from some time ago.
		Are you confident that we have identified those such actions that are out there and have a
plan to address those so that we -- going down the line, we don't have any surprises about things that have
been hanging out or requests that have been outstanding for a significant amount of time?
		MR. COLLINS:  As far as the existing backlog of work --
		MR. MERRIFIELD:  Yes.
		MR. COLLINS:  -- in those areas, specifically 2.206 petitions and DPVs, which are the office's responsibility, DPOs being the EDO-level responsibility, I am confident that those are tracked.
		There are some improvements that are warranted.  Some of those are based on the OIG report, for example, on the DPO/DPV process.  2.206 is an initiative that we have underway.  It's in the proofing process now, but there have been significant changes to that process in the past year or so.
		The challenge that we have, I think, on-going is ensuring that the stakeholders who use
those processes understand the expectations of them, and I believe that's an area where we need to continue
to interface.
		Our work planning center, which is one of our premier initiatives, is meant to further refine that ability within the Office of NRR, all the way down to workforce planning.  So, we expect to get better in that area, but in those areas that you mentioned, I am confident in saying that we know what we have.
		MR. MERRIFIELD:  We -- the next one is to Bill Travers.  We had a variety of comments today,
talking about the number of our most mature workers who are qualified to retire, who can retire at this
point.
		We've had some discussions lately about trying to get people in the pipeline, new workers,
and to the extent that we may be able to get people from other agencies.
		One of the things we haven't discussed is being innovative in terms of retaining some of
that knowledge, you know.  I know that there are a number of staff members who I've met who are over the
age of 65, who are still here, who are contributing very valuable work here to the agency.
		Have we thought at all about using some innovative ways, whether it's telecommuting or
job-sharing or things of that nature, to try to retain in a more limited capacity some of those workers who
are eligible to retire, more mature workers, but who can continue to contribute valuable work product to
our efforts?
		DR. TRAVERS:  It's among the things that we're looking at.  Certainly, we have had some
experience with work-at-home in specific instances, and across government, there's an initiative to look at
where that sort of thing can make sense, and where it can be effectively utilized by an agency.
		We're looking at a host of measures that speak to the issue, the very important one that you
raised, of retention.  Clearly, it doesn't do you much good, even if you're bringing in high-quality
people, if you don't have the work place and the programs that support your being able to retain them.
		Included in that, in my own view, is job satisfaction, training.  Are you actively engaging
your workforce, vesting them with responsibility and the tools to carry out that work?
		I think we are, in connection with a mandate from the Chairman and the Commission, looking at a number of other things that we might do, and I think your entreaty to look particularly at how we might retain the experience of older NRC employees who might be utilized in some limited fashion is well taken, and I agree with you, because what we're looking at today, as you've underscored, is a lot of experience potentially walking out the door, and if there is some innovative approach that we could use to retain that, even if it's on some limited basis, we ought to try to explore that.
		MR. MERRIFIELD:  No.  It's clear.  A person's age is not a limitation on what they can contribute, and certainly for our older workers, we shouldn't limit ourselves in that respect.
		MR. McGAFFIGAN:  Commissioner Merrifield, could I just very quickly?
		MR. MERRIFIELD:  Sure.
		MR. McGAFFIGAN:  I think there may be some statutory problems.  The person would want to
retire.  So, they'd be getting their retirement check if they've got 30 years civil service, and then when
they come in and work for us, it can affect their retirement check, and if we want -- this may be a
governmentwide initiative, but at some point, we might have to look at -- all the technical agencies face
it, but in order to retain some of these folks, there may be some statutory relief that has to be requested
through OPM, but I'm not sure.
		I just vaguely remember that there are these caps or these reductions in retirement pay if
you come in and also work at your old agency.
		MR. COLLINS:  That's true.  That is a barrier for re-employment rights, particularly if an individual takes full retirement.  However, the Committee on Age Discrimination, which we have met with recently, and which I think all of the offices are involved in, would say that there are two roles for those individuals.
		One is to capture corporate knowledge and to move that into the organization, particularly with the new hires, as far as lessons learned and on-the-job training, and the second is as a resource, which they are.  There are some administrative barriers, maybe even statutory barriers, after retirement, but in today's environment, where it's very difficult in some cases to obtain quality contractors who do not have conflicts of interest, that's an untapped resource at this time.
		MR. MERRIFIELD:  The Commissioner makes a good point.  There may be statutory limitations.  It would be useful to have the staff think about that, and if there are things we need to do with the assistance of Congress, I agree with you.
		Ashok Thadani.  You mentioned that there are a lot of new innovative technologies out there that we need to be thinking about.
		To what extent -- let me phrase this a little differently.  Do you think we're doing the best job that we can in terms of capturing or understanding what is going on in the international arena, so that we may take advantage of what some of our international counterparts are doing, or is there more that we need to do?
		MR. THADANI:  I would say we're doing a reasonable amount, particularly in terms of the new technologies and the new designs in the international arena, and we're staying fairly close in terms of the IAEA activities, for example, on the PBMR, looking at current regulations, their application, and trying to develop some safety guides and so on.
		We're also working closely, of course, with the NEA's Committee on the Safety of Nuclear Installations, and there, NEA is now considering an effort to try to develop the best understanding they can of what sort of technical information would be necessary to deal with potential safety issues.
		That means to first define what the safety issues are, and we're staying fairly close and
engaged with them, and I indicated earlier that we'll be sending up a plan to the Commission, and we'll be
discussing some of these issues in that plan as well.
		MR. MERRIFIELD:  Okay.  That's helpful.  One editorial comment.  You made mention of the Michigan reactor and the resources we have to depend on.  That was news to me.
		I would opine that perhaps if there are areas where you think the Commission ought to be
made aware or where we can make Congress aware of some concerns that we have about infrastructure, I think
that's important information we may want to transmit.
		Let me -- my follow-up question goes to Hub Miller.  You talked a little bit about how we have eliminated the aggregation of issues leading to escalated enforcement and made some changes in terms of the ways we go about doing enforcement.
		Given those changes, do you believe that there has been any reduction in the responsiveness of licensees or any downplaying of the significance of what they focus on in some of these issues, given the fact that we are treating it differently in enforcement space?
		MR. MILLER:  I see no -- nothing on the surface.  One of the big issues, of course, as we
move forward in the new program is the question of how we deal with cross-cutting issues; that is, issues
that may not rise to the level of a green or white finding, but which, if you look across the various
cornerstones, are an issue and an indicator.
		I think we've got to sort through that and evaluate, you know, what the right thing to do
is, get input from stakeholders.  I think it's too early to tell really what the effect is there.
		MR. MERRIFIELD:  But would it be fair to characterize that you haven't seen any reduction in
safety as a result of reductions --
		MR. MILLER:  That's right.
		MR. MERRIFIELD:  -- in escalated enforcement?
		MR. MILLER:  That's right.  I see none.
		MR. COLLINS:  Commissioner Merrifield, if I can add?  The goal of the oversight process and the enforcement policy, as Hub has articulated, is to provide for that risk-informed safety view, ensuring that the issues that are most important in a risk and safety sense are put before the licensee and that their corrective action program prioritizes the disposition and the corrective action, such that there are no undue and perhaps unintended consequences of the NRC driving the use of the industry's resources in those areas.
		I think the regions have a pretty good view of how that's working, and to date, I believe,
we can say we've met that goal in an interim way.  I see Alan out there.  But I believe that's one of the
areas that we need to address after the first year of implementation to ensure that the Commission has that
information in the aggregate sense.
		MR. MERRIFIELD:  Mr. Chairman, I just want to make one closing comment, and that is, I've
only done so in private, but I would like to publicly thank Hub Miller for a significant amount of work
that he and members of his staff have conducted relative to the difficulties we've had with Indian Point 2. 
That has resulted in significant interest on the part of our stakeholders, on the part of media and on the
part of Congress.
		I think Hub has done an outstanding job of responding to those in a straightforward and
honest manner and has certainly helped this agency in terms of our commitment to public information.
		Thank you.
		MR. MESERVE:  Yes.  We're certainly all indebted to you for your efforts.
		MR. MILLER:  A lot of help from these folks, I assure you.
		MR. MESERVE:  In light of the time, I'll be very brief.  I just have a couple of hopefully very short questions and maybe some comments.
		One, directed at Sam, and I won't turn this into a question, is that you have emphasized, I think appropriately, the fact that you have these efforts to attain efficiencies in the processing of the license renewal applications, and that simultaneously, you're trying to maintain some deadlines.
		This is a singularly important activity for the Commission, that we process these applications in a thorough, complete and efficient manner, and if it should happen that the efficiencies that we all expect and hope you will attain prove to be a problem, we would expect you to come to the Commission promptly, so that we would have an opportunity to assist you in being sure that those issues are addressed.
		For Ashok, I have a few hopefully short questions.  You had mentioned this issue, which I was not familiar with, of the containment coatings and the possibility that debris might foul the sumps and compromise the capacity to respond in a loss-of-coolant accident.
		You didn't close the loop completely for me as to whether you are satisfied that this is something we're going to be able to deal with in the normal course, or whether this is a high-priority issue that we should be addressing more swiftly than we otherwise would.
		MR. THADANI:  It is -- Chairman, it is a high-priority issue, and we have been very
aggressive in trying to make sure we have adequate information in terms of moving forward to resolve this
issue.
		I believe that we are moving at appropriate speed on this issue.  It gets a great deal of
attention, and I'm pretty confident that we will be able to finish the evaluations and reg analysis next
year.
		MR. MIRAGLIA:  In terms of an immediate safety issue, I think that this issue has been looked at on a number of occasions, and the coatings issue has raised a nuance in terms of transport and that kind of thing, so there's not an immediate safety need that needs to be addressed.  I think that was the direction of the question.
		MR. THADANI:  No.  There are two parts.  We are talking about, to begin with, a low probability of an initiating event, and there are still questions, which is the reason for our wanting to get additional information, about whether one can actually transport significant amounts of debris to actually impact the NPSH of these pumps, and, so, we're not certain yet that the end result is necessarily going to be new requirements.
		DR. TRAVERS:  This issue, I should point out, has resulted in interactions between us and
our licensees.  Information has been provided by licensees.  All of that has been factored into Frank's
conclusion, and it's basically a staff conclusion that there isn't an immediate safety issue at hand.
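		(For readers unfamiliar with the NPSH concern discussed in the exchange above, the following is a minimal illustrative sketch, not material presented at this briefing.  Every numerical value and parameter name is a hypothetical assumption, chosen only to show how added debris head loss at a containment sump screen can erode the available net positive suction head margin of a recirculation pump.)

# Illustrative Python sketch; all values are hypothetical, not data from this briefing.
# Shows how debris head loss at a containment sump screen reduces the available
# net positive suction head (NPSH) margin of a recirculation pump.

RHO_WATER = 1000.0   # water density, kg/m^3 (simplified)
G = 9.81             # gravitational acceleration, m/s^2

def npsh_available_m(p_containment_pa, p_vapor_pa, static_head_m,
                     piping_loss_m, screen_loss_m):
    """NPSH available at the pump suction, in meters of water."""
    pressure_head_m = (p_containment_pa - p_vapor_pa) / (RHO_WATER * G)
    return pressure_head_m + static_head_m - piping_loss_m - screen_loss_m

# Hypothetical post-LOCA conditions (every number below is an assumption).
NPSH_REQUIRED_M = 4.0                                       # hypothetical pump requirement
clean = npsh_available_m(101_325, 84_500, 4.0, 1.0, 0.2)    # clean sump screen
fouled = npsh_available_m(101_325, 84_500, 4.0, 1.0, 2.5)   # debris-laden screen

print(f"Margin, clean screen:        {clean - NPSH_REQUIRED_M:+.1f} m")
print(f"Margin, debris-laden screen: {fouled - NPSH_REQUIRED_M:+.1f} m")

		(The sign flip in the second margin is the qualitative concern: enough transported debris could, in principle, drive the available NPSH below what the pump requires.)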
		MR. MESERVE:  Ashok, you mentioned this University of Michigan reactor and the importance of the fluence measurements on materials for reactor pressure vessels, and that there's apparently a unique facility there.
		Are there alternatives that are available to us?
		MR. THADANI:  We don't think we have an alternative in this country, but I have to confirm
that.  I believe we're still exploring that option.
		It's quite likely that we will have to go to an international facility if we want to try and
get this information.  There are a couple of international facilities that may provide this information.
		MR. MESERVE:  There's an international source for this information?
		MR. THADANI:  We believe there is, and we're exploring those options now.  In fact, Chairman, we were just notified about the decision by the university to shut down the reactor.  That was, I believe, two or three weeks ago.
		MR. MESERVE:  Did we have an understanding with the University of Michigan about continuity
of availability of that reactor?
		MR. THADANI:  Well, this was one place where they made the modifications that they had to
make, and their Safety Committee supported that.
		Their operating costs are quite significant.  Our contribution is quite small in terms of
the piece of work that they do for us, and we cannot support the overall operating costs.  They have
discussed this matter with the Department of Energy.
		I believe their decision to shut down is final, as I understand it.
		MR. MESERVE:  Mr. Raglin, you'd made a point about the need for advanced training and a special focus on training for regional staff in the area of risk assessment.  You didn't say anything about Headquarters, and you've touched on the issue of training the entirety of our staff as a vehicle for building skills and being able to maintain the capacity to deal with issues as they arise.
		Do you see any unique issues with regard to our Headquarters staff in connection with either
risk assessment training or related areas?
		MR. RAGLIN:  I don't think I meant to limit it to just the regional people.  I believe there
will also be some Headquarters participants in the same program that will broaden the number of staff
members who have risk training roughly equivalent to that of the senior reactor analysts.
		Over the last two or three years, there have been some initiatives sponsored by NRR that led to the development and implementation of certain PRA courses for inspectors.  We've had a fairly broad implementation of the PRA for Technical Managers Course, which included Headquarters managers as well as regional managers, and then, just stepping outside the nuclear reactor safety bounds for a minute, there are some significant risk assessment training initiatives with NMSS right now that bring into play risk assessment as applied to NMSS plans to proceed with risk-informed regulation, as appropriate, of its activities.
		So, there are many things that are going on.  At the same time, we're maintaining the same
core curriculum of PRA training that we've had available for the last few years.
		On top of that, there may be some specialized instances where individual staff members
attend externally-sponsored risk training of one type or another.
		MR. COLLINS:  Mr. Chairman, just to elaborate on Ken's response, our highest-ranking new initiative is to convert the staff to a risk-informed mindset, and the details are to fully integrate risk-informed principles into everyday staff activities.  The Executive and Leadership Team has programmed 10 FTE and $665,000 over the next three years to support that goal.
		So, we'll be working closely with HR and other sources to implement those resources.
		MR. MESERVE:  Good.  Well, let me, on behalf of the Commission, thank you all for a very
helpful presentation this morning.  
		It's clear, as we look at the wide range of things that you've done over the past year, that this has been a year of very significant accomplishments for which you all are to be commended.
		It's also clear that you have great challenges ahead in the coming years, and we want to be as helpful to you as possible in ensuring that every year you're able to report having met your goals.  So, we want to be able to commend you for your work, and we want to spur you on to greater accomplishments in the future, and I'd like to thank you all again.
		With that, we stand adjourned.
		(Whereupon, at 11:51 a.m., the briefing was adjourned.)