
UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

+ + + + +

BRIEFING ON
RESULTS OF THE AGENCY ACTION REVIEW MEETING
(REACTORS)

+ + + + +

THURSDAY, JULY 19, 2001

+ + + + +

ROCKVILLE, MARYLAND

+ + + + +


           The Commission met at the Nuclear Regulatory Commission, One
White Flint North, Room 1F14, 11555 Rockville Pike, Rockville, Maryland,
at 9:30 a.m., Richard A. Meserve, Chairman, presiding. 
PRESENT:
RICHARD A. MESERVE, Chairman
GRETA JOY DICUS, Commissioner
EDWARD McGAFFIGAN, JR., Commissioner
JEFFREY S. MERRIFIELD, Commissioner


P R O C E E D I N G S

9:26 a.m.

		CHAIRMAN MESERVE:  On behalf of the Commission, I'd like to
welcome everyone for today's briefing on the results of the Agency Action
Review Meeting for Reactors.  As I think everyone knows, we proceeded with
the initial implementation of the revised oversight process in April of
2000.  This is an extraordinarily important activity because the inspection
of our reactors is crucial to the fulfillment of the Agency's mission. 
And the revised oversight process reflected a very significant change in
the way in which the Commission was undertaking its business.
		The Agency Action Review Meeting is one of the final stages
of each year's evaluation of the process and involves the staff's wrap up
of its evaluation of how the fleet is performing and that is then to be
followed by this Commission meeting at which the results of that activity
are to be reported to the Commission.  So this is a very important
meeting.  It's really our initial opportunity to survey the process as it
has proceeded.  We will have a Commission meeting
tomorrow in which the evaluation of the initial implementation will be
discussed, but in a certain sense the proof is in the pudding, so that
this is a meeting, I'm sure that will touch to some extent on issues that
we probably also will be discussing tomorrow.
		With that, unless one of my colleagues has a comment, why
don't we proceed?
		Dr. Travers?
		DR. TRAVERS:  Thank you, Mr. Chairman, good morning to you
and the Commission.  We are glad to be here to discuss with you and brief
you on the results of the first Agency Action Review Meeting.  We've got a
great new acronym, AARM, or something like that; I guess we'll call it
AARM.  It was just held in Region II in Atlanta a few weeks ago.
		As you know, the AARM is an integral part of the
new revised reactor oversight process and it is conducted to achieve a
number of objectives that are defined in a new draft NRC management
directive.  They include, but are not limited to, reviewing the
Agency actions that have been taken for plants with significant
performance problems, basically to ensure that the coordinated courses of
action that have been developed and which are being implemented for
licensees of concern are appropriate, to confirm that the ROP is meeting
the NRC's strategic goals, to ensure, as you indicated, Mr. Chairman, that
the trends in the industry and licensee performance are recognized and
addressed and some others that we'll probably get into the discussion
today.
		This annual meeting essentially replaces the Senior
Management Meeting process conducted under our former reactor oversight
program, but it has a number of important differences from that initial
process that we'll discuss this morning.
		One difference I'd like to highlight though in opening this
meeting is that while we conduct this meeting certainly to review the
performance of specific nuclear power plants and to assess whether or not
the activities that we have developed in response to those performance
issues are appropriate, much like we did in the senior management meeting,
our expectation going into the AARM meeting is that it's very unlikely
that we would identify significant differences in our assessment of
performance or for that matter significant differences in the approach
that we've already decided to take and in fact, in most cases are
implementing.  This is fundamentally because the new reactor oversight
process is viewed as a more predictable and open continuum of reactor
assessment over the course of any given year.  Things like performance
indicators posted quarterly on the web, letters that describe where in the
action matrix any particular plant is, issued when those conclusions are
reached, and our ability to discuss internally the actions that are being
taken in response to performance problems over the course of the year all
add, we think, to this notion of a continuum of assessment, and they make
the meeting that we hold to reaffirm where we are much more likely to
produce a reaffirmation than a significant change in approach.  So that's
what I see as a significant difference from the senior management meeting
process that we had for years where, in fact, we made critical decisions
about changing classification or in some cases our regulatory actions.  So
it is a fundamental point I'd like to highlight.
		I'd also like to point out that a wide range of topics were
reviewed and discussed during the AARM not the least of which were lessons
learned from the first year of implementing the reactor oversight process. 
We have, as you point out, a separate meeting to discuss that tomorrow and
we'll provide some detailed discussions of our own assessment of that
first year of initial implementation.
		Additionally, I should point out that we took the
opportunity having the senior managers gathered to talk about a number of
wide ranging management topics that are currently before us within
the Agency.  During today's briefing, however, we will focus mainly on
specific plant performance reviews and our new power reactor industry
trending program.  The Trends Program is described in detail in
SECY-01-0111, which was issued late last month, and with that I'll introduce
the folks who are joining me here at the table.
		Mike Johnson is the Chief of the Reactor Inspection
Program.  Jon Johnson is Deputy Director of the Office of Nuclear Reactor
Regulation.  Of course, Bill Kane is my Deputy for Reactor Programs.  Jim
Dyer and Hub Miller are the Regional Administrators of Regions III and I,
respectively.  So with that, I'd like to begin the meeting by turning it
over to Jon Johnson.
		MR. JON JOHNSON:  Thank you, Bill.  Good morning, Chairman,
Commissioners.  As you're aware and has been indicated the staff recently
completed the first year of the initial implementation of the new reactor
oversight process.  
		As an integral part of that process the senior managers met
in our Region II Office in late June and completed the first Agency Action
Review Meeting.  This meeting essentially replaces the Senior Management
Meeting of the old process which was last conducted in May of 2000.  In
addition to elimination of the Senior Management Meeting, the SALP process
and Plant Performance Reviews have been replaced by the new Reactor
Oversight Process Assessment Program.  We believe this is more integrated,
objective, risk informed and predictable.
		The purpose of today's briefing is to inform the Commission
of the results of this Agency Action Review Meeting, but first I'd like to
provide some background on the assessment process and some key events that
have occurred prior to meeting today.  
		May I have slide 2, please?
		(Slide change.)	
		MR. JON JOHNSON:  The assessment process is described in
detail in Inspection Manual Chapter 0305.  Each Regional Office conducted
end-of-cycle reviews using the most recent performance indicators and
inspection findings for the past 12 months.  These assessments were
conducted to analyze licensee performance from inspection reports and
performance indicators, to confirm NRC actions and to allocate resources. 

		In addition to these end-of-cycle reviews, mid-cycle
reviews were completed in November of 2000 and quarterly reviews were
conducted in the intervening periods when indicators or inspection
findings crossed thresholds.  Supplemental inspections were scheduled to
evaluate those performance issues that caused the PIs or inspection
findings to be
greater than green and therefore had at least low safety significance.
		End-of-cycle summary meetings were conducted at the
conclusion of those reviews with the Director of NRR for those plants
whose performance over the past
annual assessment cycle was in a degraded cornerstone or a multiple
repetitive degraded cornerstone column.
		The regional staff also presented the results for those
plants that were considered to have substantial cross cutting issues. 
Based on the results of these meetings, annual assessment letters were
issued to all plants at the end of May.  Those letters contained a
discussion of the plant performance for a 12-month period, focusing on
risk significant performance indicators or inspection findings,
substantial cross cutting issues and as well, included a summary of the
Agency actions and the licensee actions.  They also included detailed
inspection plans.
		In addition, public meetings have been held or will be
shortly over the next few days with each licensee to discuss the results
of this assessment.  These meetings were conducted on site or in the
vicinity of the site so they were accessible to members of the public.
		Lastly, the first Agency Action Review Meeting was held
from June 26th through 28th in our Region II offices in Atlanta.  As
described in Manual Chapter 0305 this meeting is an integral part of the
evaluative process used by the Agency to ensure operational safety
performance.  This meeting was chaired by the Executive Director for
Operations, Dr. Travers and was attended by the NRC senior managers.
		The remainder of this morning's presentation will focus on
the conduct and results of that meeting.  Could I have slide 3, please?
		(Slide change.)
		MR. JON JOHNSON:  The inaugural June 2001 Agency Action
Review Meeting was conducted in accordance with Draft Management Directive
8.14.  The staff is also currently developing lessons learned from this
process, but the initial feedback has been positive.
		This directive describes the meeting as having four
distinct purposes.  The first three are related to the reactor oversight
process.  The first is to review Agency actions resulting from the
performance reviews for those individual plants that had significant
performance problems.  The second part is to review the industry trends
analysis and the third part is to review the staff's self-assessment of
the reactor oversight process.
		Plant performance was reviewed at stages leading up to the
Agency Action Review Meeting, but the discussions at the meeting were
limited to those plants whose performance placed them in either the
multiple/repetitive degraded cornerstone column or the unacceptable
performance column.
		The second piece of the reactor oversight process is to
discuss industry trends.  This is a joint program between NRR and the
Office of Research and as the EDO indicated, this process was described
in a recently issued Commission paper.  The NRC uses
selected indicators to monitor trends in industry performance as a measure
of success of the Agency's efforts to meet the performance goal of
maintaining safety.
		Mr. Michael Johnson will discuss the industry trends
program and the results after the Regional Administrators have completed
their plant performance discussions.
		The last piece of the ROP related portion is the staff's
self-assessment of the oversight process.  These results were also
discussed at the Agency Action Review Meeting.  We do not plan to go over
that in detail as we indicated.  We have a meeting set up tomorrow to do
that in detail.  However, I would like to note that the Agency Managers
concluded that the ROP was successful in enabling the Agency to oversee
the performance of reactor licensees including bringing forward those
plants whose performance warranted increased attention.
		In addition to the ROP topics, the AARM provides a forum
for senior managers to discuss emerging technical and policy issues. 
These discussions were held on the second day of the 3-day meeting.  A
suggestion that came out of the meeting was to consider separating these
non-ROP related topics and to consider having a separate designated
meeting.
		Slide 4, please.
		(Slide change.)
		MR. JON JOHNSON:  Two plants were discussed during this
first Agency Action Review Meeting.  Indian Point in Region I was
discussed because it met the criteria of being in the multiple/repetitive
degraded cornerstone column.  In a few moments, Mr. Hub Miller, the
Regional
Administrator will discuss plant performance, including the NRC and
licensee actions that have been taken to address the performance concerns.
		For a different reason, D.C. Cook, Units 1 and 2, in Region
III were also discussed due to their unique transition into the ROP.  Mr.
Jim Dyer, the Regional Administrator will provide a status
briefing shortly.
		Finally, I'd like to remind us that as part of the ROP
assessment process in Manual Chapter 0305, all other operating reactors
were reviewed during the end-of-cycle meetings and actions were taken in
accordance with the action matrix.  The staff's reviews of those reactors
were documented in the assessment letter sent to the licensees the last
week of May.
		Under the ROP which was implemented industry-wide in April
2000, the level of oversight and actions taken were determined by the
action matrix as the problems were identified.  We don't wait until the
Agency Action Review Meeting to take the necessary actions.  The purpose
of the AARM was to confirm the staff actions as opposed to deciding what
action should have been taken or ranking the plants.
		With that, Mr. Hub Miller will now discuss the performance
of Indian Point II.
		MR. MILLER:  Good morning, Mr. Chairman, Commissioners. 
Over the past year while operating in a manner that preserved public
health and safety, Indian Point II has been in the multiple/repetitive
degraded cornerstone column of the reactor oversight action matrix.  The
degraded
cornerstones are associated principally with performance problems revealed
by an August 1999 reactor trip with electrical system complications and a
February 2000 steam generator tube failure.
		The plant was shut down much of the assessment period to
replace steam generators.  It was restarted in late December of last year
and reached full power in late January.  A number of equipment problems
and personnel errors impacted on operations during this period.  This
included, for example, a turbine trip with complications that occurred
shortly after restart.  However, since that time, the plant has operated
continuously on line.
		Over the assessment period, we conducted a variety of
inspection and oversight activities consistent with the action matrix and
program guidance.  During the extended shutdown and restart phases,
numerous inspections were performed to ensure that steam generator
replacement and associated testing activities were conducted adequately
and to assure that the plant and station personnel were ready for plant
restart.
		In addition to baseline inspections, a number of special
inspections were performed to assess emergent issues and events such as
design control issues that surface before restart and the turbine trip
shortly after start up.
		Following the action matrix, an extensive supplemental team
inspection was performed to independently assess the breadth, depth and
root causes of performance deficiencies at the facility.  Using inspection
procedure 95003, the 14-member team spent 3 weeks on-site in the
January-February time frame evaluating licensee corrective action
processes and assessing further whether acceptable margins of safety
exist.  
		The 95003 team concluded that the facility was being
operated safely.  However, it found a number of problems that are similar
to those identified during previous inspections and events.  These
included issues in the areas of design control, human and equipment
performance, problem identification and resolution and emergency
preparedness.  The team noted that while progress was being made, it was
slow and limited in some areas.  
		Importantly, one area of improvement relates to better
alignment between ConEd's business and performance improvement plans. 
ConEd's response to the inspection captures well the nature of the
performance problems that exist and in broad outline describes actions
needed to address them.
		In order to verify effectiveness of corrective actions,
particularly given past problems in following through on improvement
plans, several focused inspections and special oversight activities are
planned through the end of the year beyond the baseline.  Regulatory
performance meetings have been held throughout the last year to monitor
licensee performance improvement efforts and we will continue these
meetings and be sensitive to the effect any license transfer will have on
the business plan and related supporting initiatives.
		We expect that, by the end of the year, through these
planned inspections and other oversight activities, we will be able to
judge whether the
station has substantially addressed identified performance weaknesses.
		Significant staff effort and management attention was aimed
over the past year at addressing public and external stakeholder interest
and concerns.  It has been extensive and very intense at times.  We
conducted numerous public meetings and meetings with the licensee in open
forum.  There were 13 meetings this past year, 9 of which were held
locally in the vicinity of the site.  
		Consistent with the action matrix, these included
regulatory performance meetings that I convened and the annual end of
cycle meeting held recently on site which was led personally by Dr.
Travers.  Very significant, too, was Chairman Meserve's tour of the site
after the 95003 inspection in April.  We frequently briefed government and
elected officials at all levels, federal, State and local to keep
stakeholders informed of our activities and to receive input.
		Similar to how we coordinated technical and safety issues,
we employed an inter-office communications coordination group to help in
handling this extremely challenging aspect of our activities and this
worked quite well.
		We will continue these special communication efforts.
		Finally, at the Agency Action Review Meeting, senior
managers were briefed on NRC actions and licensee performance.  The senior
managers concluded that actions taken and those planned are appropriate,
that they are consistent with the reactor oversight program guidance and
that no additional actions are warranted at this time.
		MR. JON JOHNSON:  Thank you, Hub.  Before turning the
discussion over to Mr. Jim Dyer, I'd like to remind everyone that D.C.
Cook was not under the revised reactor oversight process during
this first year due to its extended shutdown.
		Jim?
		MR. DYER:  Thank you, Jon.  As Jon just stated during the
recent Agency Action Review Meeting, the transition of D.C. Cook Units 1
and 2 to the reactor oversight program was discussed.  The implementation
of the reactor oversight program at D.C. Cook was delayed because in April
2000 both units were shut down, with Manual Chapter 0350 oversight of the
restart activities.

		Subsequently, Unit 2 started up in June of 2000 and has
operated well.  Unit 1 started up in December 2000 after steam generators
were replaced.  Shortly after start-up, Unit 1 experienced some problems
that caused power transients, and the licensee initiated corrective
actions to reduce those challenges.
		After the start up of each unit, the reactor oversight
program guidance was used for NRC inspection activities under the
oversight and direction of the Manual Chapter 0350 Panel and the licensee
began accumulating performance indicator data. 
		In May of 2001, D.C. Cook Units 1 and 2 performance was
evaluated with the other Region III plants during the end-of-cycle
reviews.  Performance indicator data submitted by the licensee in April
revealed a white performance indicator for unplanned power changes on Unit
1 and incomplete data for five performance indicators on both units.
		Inspection findings were in the licensee response column,
however, issues with corrective action backlogs and maintenance rule
implementation were of some concern.
		A supplemental inspection under Inspection Procedure 95001
has been performed to address the white performance indicator on Unit 1
and augmented inspection hours are being applied to the baseline
inspections for the areas covered by the incomplete performance
indicators.
		Additionally, NRR Region III and the licensee are
developing appropriate methods to report the incomplete performance
indicator data.
		After reviewing the post-start up operational performance
of both units, the success of the licensee's improvement programs and the
results of the end-of-cycle review, the Manual Chapter 0350 Panel
concluded that enhanced oversight was no longer needed for D.C. Cook and
recommended that the oversight activities be terminated.  After
consultation with the Deputy Executive Director for Reactor Programs, and
the Director of Office of Nuclear Reactor Regulation, I closed out the
Manual Chapter 0350 oversight of D.C. Cook on June 7th, 2001.
		At the Agency Action Review Meeting, we confirmed that the
transition activities for D.C. Cook to the reactor oversight program were
appropriate. 
		This concludes my presentation.
		MR. JON JOHNSON:  Thank you, Jim.  That concludes our
discussion of individual plant performance.  I'd now like to turn the
briefing over to Michael Johnson, Branch Chief from the Inspection Program
Branch for discussion on industry-wide trends and results to date.
		MR. MICHAEL JOHNSON:  Thank you, Jon.  Good morning, Chairman,
Commissioners.  May I have the next slide, please?
		(Slide change.)
		MR. MICHAEL JOHNSON:  As mentioned earlier, in addition to
providing a discussion of the plants in the multiple/repetitive degraded
cornerstone column of the action matrix and the unacceptable performance
column of the action matrix, the Agency Action Review Meeting also
includes a review of industry trends and any actions planned or taken
based on those trends.
		The trending process that we developed and the results to
date are documented, as was mentioned earlier, in SECY-01-0111 and were
reviewed at the Agency Action Review Meeting.
		In future briefings on the Agency Action Review Meetings
we'll focus primarily on reviewing the results from the previous year. 
However, for this briefing, I think it's appropriate for us also to
describe the process, including the background, how we plan to communicate
the results, and planned future enhancements to the process.
		Next slide, please.
		(Slide change.)
		MR. MICHAEL JOHNSON:  Before I describe the process, let me
just say that the staff recognizes that monitoring industry trends can
provide valuable insights.  The Agency has historically monitored trends
and the results have been reported annually to Congress in the Performance
and Accountability Report and in the Budget Estimate and Performance Plan
Report to OMB.  
		In reaching previous determinations regarding industry
trends, the staff has used indicators from the former Office for Analysis
and Evaluation of Operational Data (AEOD) and the Accident Sequence
Precursor (ASP) Program.  Those trend graphs have been published in
various NUREGs, including the Information Digest, and the ASP results have
been provided in various Commission papers and also in various NUREGs.
		Although we've monitored and reported on those trends and
we've taken actions based on the insights from those trends, the use of
industry trends until now has not been part of an integrated and
structured process.  
		Next slide, please.
		(Slide change.)
		MR. MICHAEL JOHNSON:  And so, in conjunction with revising
the reactor oversight process and completing the first year of initial
implementation, we developed a more systematic means for monitoring
industry performance in order to enable us to confirm that reactor safety
performance is being maintained.  As the slide points out, the industry
trends program is not intended to supplant the ROP or other processes such
as generic issues process that enable the Agency to oversee safety
performance of plants and to take actions to address plant-specific or
generic concerns, rather, the trending process is intended to complement
those various processes.
		Next slide, please.
		(Slide change.)
		MR. MICHAEL JOHNSON:  Briefly stated, the objectives of the
program are to monitor selected indicators that provide insights
regarding industry-wide safety performance, to assess the results, to
provide for implementation of appropriate action based on the trends that
we identify and to communicate the results along with actions to our
stakeholders.
		Next slide.
		(Slide change.)
		MR. MICHAEL JOHNSON:  In developing the program, we looked
first at existing sources of information to provide an initial set of
indicators with the expectation that the trends program will continue to
evolve.  We started with the ex-AEOD performance indicators and the ASP
results.  We will add the ROP PIs.  We'll add other indicators as I'll
discuss in a little bit.  But as we add those indicators, they'll be
qualified for use in the program.
		Secondly, we looked for indicators that are based on
quantitative industry-wide data and that relate to safety performance.  I
should point out that the indicators chosen are not intended to be all
inclusive.  There are many, many things that can be trended, in fact, that
are being trended by us and by other folks.  The indicators that we've
chosen for the program, we believe, however, are indicative or reflective,
I should say, of current performance trends in the industry.
		We also wanted the program to be focused on the
identification of long-term trends to minimize the impact of short-term
variations due to things such as the operating cycle and seasonal
variations and random fluctuations. 
		And so although we'll keep track of those short-term trends
and we'll look to see if they tell us anything, we'll use long-term trend
data to make a determination regarding trends.
		Finally, we believe that determination of whether there is
a statistically significant adverse trend should be objective and
transparent and once we've determined that we have a statistically
significant adverse trend we ought to evaluate and take actions as
appropriate to address those trends and you'll see that that is reflected
in the process.
		Next slide, please.
		(Slide change.)
		MR. MICHAEL JOHNSON:  I'm going to briefly describe the
process.  There are three primary activities associated with the process. 
They involve identifying the trend, evaluating its significance and
determining and implementing Agency response to those trends.  First, to
determine the trend, the program uses common statistical techniques to fit
a trend line to each of the indicators.  And we recalculate that trend
line each year.  And an improving or flat trend line indicates,
obviously, no adverse trend.  A degrading trend line would be considered a
statistically significant adverse trend.  We would evaluate the causes of
that as I'll talk about in a minute, but also we would report that
statistically significant adverse trend to Congress and to our other
stakeholders.
		We built in an added feature to the process to enable us to
react to a single point that potentially represents performance that is an
abrupt change from previous performance.  Based on historical performance,
we computed the limits, the upper limits that should contain future values
within a 95 percent confidence level.  We talk about a 95 percent
prediction limit associated with those graphs in the Commission paper and
so in addition to evaluating long-term trends, we will look for a single
point that falls outside of that prediction limit and we will investigate
that single point as it occurs.
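		[The trend-line fit and 95 percent prediction limit described
above can be sketched in code.  This is a minimal illustration only, not
the staff's actual statistical model, which is not specified in this
briefing; the function names are hypothetical, and the use of ordinary
least squares and a Student-t prediction limit is an assumption.]

```python
import numpy as np
from scipy import stats

def fit_trend(years, values):
    """Fit a least-squares trend line and return its slope and p-value.

    A flat or improving (here, decreasing) slope indicates no adverse
    trend; a degrading slope with a small p-value would be treated as
    a statistically significant adverse trend."""
    slope, intercept, r, p_value, stderr = stats.linregress(years, values)
    return slope, p_value

def prediction_limit(values, confidence=0.95):
    """One-sided upper prediction limit for the next annual value,
    computed from historical values (assumes roughly normal scatter).

    A new data point above this limit would represent an abrupt change
    from previous performance and would be investigated."""
    n = len(values)
    mean = np.mean(values)
    sd = np.std(values, ddof=1)
    t = stats.t.ppf(confidence, df=n - 1)
    return mean + t * sd * np.sqrt(1 + 1 / n)
```

		[Recalculating the trend line each year, as described, simply
means rerunning `fit_trend` on the updated series; a single new point is
compared against `prediction_limit` of the history up to that point.]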
		Furthermore, if obvious trends emerge during the year, we
would not wait until the end of the year, but we'll take action on a real
time basis to understand what is causing those trends.
		Once we determine that we have a statistically significant
adverse trend, we will evaluate it.  We'll conduct an initial analysis to
determine if the data is being unduly influenced by a small number of
outliers.  If it is being unduly influenced by a small number of outliers,
our determination would be that that is not indicative of an industry-wide
trend and we would focus our actions on those specific outliers.  
		If the trend is not being unduly influenced by a small
number of outliers, that is, if the trend truly is a broad, widespread
industry trend, we would conduct a broader review of data, such as data
from the LERs and data from inspection results, to determine the extent
of the
issue and any potential root causes.
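		[The outlier screen described here can also be sketched.
Again, this is an illustrative sketch under stated assumptions, not the
staff's method: the function names are hypothetical, plant-level data is
modeled as a simple year-by-plant array, and "unduly influenced" is
approximated as the trend disappearing once the few largest contributors
are removed.]

```python
import numpy as np
from scipy import stats

def significant_adverse_trend(years, values, alpha=0.05):
    """True if the series has a degrading (increasing) slope whose
    p-value falls below alpha."""
    slope, intercept, r, p_value, stderr = stats.linregress(years, values)
    return slope > 0 and p_value < alpha

def driven_by_outliers(years, plant_values, n_outliers=3):
    """Decide whether an industry-wide adverse trend is the work of a
    few outlier plants.

    plant_values has shape (n_years, n_plants); the industry indicator
    for each year is the sum across plants.  If dropping the n_outliers
    largest-contributing plants makes the adverse trend disappear, the
    trend is attributed to those outliers rather than the industry."""
    industry = plant_values.sum(axis=1)
    if not significant_adverse_trend(years, industry):
        return False                         # no adverse trend to explain
    totals = plant_values.sum(axis=0)
    keep = np.argsort(totals)[:-n_outliers]  # drop biggest contributors
    trimmed = plant_values[:, keep].sum(axis=1)
    return not significant_adverse_trend(years, trimmed)
```

		[If the check returns True, the response would focus on the
specific outlier plants; if False, the broader review of LER and
inspection data described above would follow.]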
		Finally, the staff will determine the appropriate action in
response to the industry trends using the Agency's processes for dealing
with generic issues.  That would include assigning the issue to the
appropriate branch.  It would involve engaging
senior managers.  It would involve initiating interaction with the
industry, all intended to help us determine what the appropriate actions
are to address that particular trend.
		Trends could result in us requesting industry groups or
owners' groups to provide utility information.  It could result in
industry initiatives.  It may also result in generic correspondence or
general safety inspections to address the trend.  
		In addition, depending on the issue the trend may also be
addressed as part of the generic issue process by the Office of Research.
		Next slide, please.
		(Slide change.)
		MR. MICHAEL JOHNSON:  With respect to communicating the
results of the trending program, we plan to publish the trend graphs on
the external web as they are developed each quarter.  We will report the
results annually in support of the Agency Action Review Meeting and brief
you on those trends and our planned actions in these meetings as they occur
each year.  
		Finally, we'll continue to provide the results in the NRC
performance and accountability report and in the budget estimate and
performance plan.
		Next slide, please.
		(Slide change.)
		MR. MICHAEL JOHNSON:  We're happy to report that, based on
the trend process that we have, we have not found any statistically
significant adverse trends in industry performance, either based on the
AEOD indicators or the ASP indicators. 
In fact, using the ASP program, there were no significant precursors and
there were declining trends in the frequency and the significance of
precursors from 1993 to 1999 and although we have insufficient data using
just the ROP indicators and the insights from the ROP, we did not in
looking at the ROP identify issues that would indicate to us that we have
a statistically significant adverse trend.
		Next slide, please.
		(Slide change.)
		MR. MICHAEL JOHNSON:  I indicated that we anticipate that
the trending program will continue to evolve.  We already know of several
sources of data that when updated, will enhance the trends program.  For
example, Research is updating the initiating events data.  That is the
data that was in the old NUREG 5750 and they're updating data for
reliability studies and we believe those will provide valuable additions
to our trending program.
		In addition, the staff may find other indicators and if we
do, we'll add them to the trending program in a structured and considered
way.
		Finally, we will look for ways to make the process more risk-informed, more objective and more predictable.  Just as the ROP has clear thresholds and pre-established ranges of action, we will work with Research's Operating Experience and Risk Analysis Branch to develop risk-informed thresholds for industry trends and more clearly defined actions for us to take when those thresholds are crossed.
		Thank you.
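The threshold approach just described can be sketched in a few lines of code.  This is a minimal illustration only; the indicator names and threshold values below are hypothetical assumptions, not actual NRC figures.

```python
# Hypothetical sketch of a risk-informed threshold check, in the spirit of
# the ROP's pre-established action ranges described above.  The indicator
# names and threshold values are illustrative assumptions, not NRC figures.

# Per indicator: (green/white boundary, white/yellow boundary).
THRESHOLDS = {
    "scrams_per_plant_year": (1.0, 3.0),
    "safety_system_unavailability": (0.02, 0.05),
}

def classify(indicator, value):
    """Map an industry-trend indicator value to an action band."""
    low, high = THRESHOLDS[indicator]
    if value < low:
        return "green"    # routine monitoring, no added action
    if value < high:
        return "white"    # increased regulatory attention
    return "yellow"       # pre-defined agency action required

print(classify("scrams_per_plant_year", 0.6))   # -> green
```

The point of pre-established bands is that the agency response is predictable from the indicator value alone, regardless of how any fitted trend curve behaves.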
		DR. TRAVERS:  Mr. Chairman, that concludes our presentation.  One of the things that has been pointed out in this briefing, and that we'll discuss again with you tomorrow, is that the continued assessment of the ROP is going to include an assessment of this meeting, its makeup, and how it ought to proceed.  Having conducted the first one, we're already beginning some thought and discussion on how it ought to be configured the next time.
		As a function of timing, I think the next one actually occurs in March of next year, so it will be in line with the briefing on the purpose of the report.
		That's all we have.  Thank you.
		CHAIRMAN MESERVE:  I'd like to thank you for a very helpful
discussion.  
		Commissioner McGaffigan, would you like to proceed with any
questions?
		COMMISSIONER McGAFFIGAN:  Let me just start where we finished, on these statistically significant adverse trends in the paper, SECY-01-0111.  I'm a little concerned -- I took just enough statistics and advanced statistics to be dangerous, what I needed to be a physicist, and probably the Chairman had the same experience -- you fitted a bunch of exponential curves here, and that reflects this tremendous improvement in safety performance over the last decade.  But the way you've set the industry up, if they flatten out, rather than continue to decay exponentially, they will have a statistically significant adverse trend even under your definition, even though all they will have done is flatten out at a truly excellent level; they just won't have continued to proceed to zero.
		Is that a flaw in the statistics, or would you refit the curve if you faced this?  Curves can be exponential to a point, and then it doesn't have to be a single function.  You can have a flat-line function.  You could reevaluate the curve.  I look at these curves in the appendix here, and everything has been nicely decaying exponentially to zero, but if they flatten out -- if, for example, in significant events per plant year they bop up to where they were in 1997 -- you guys would be raising red flags and saying oh my gosh, they are way outside their 95 percent confidence band, even though they would be doing historically pretty darn well.  They just wouldn't have continued to decay exponentially.  So, as I say, is there a definitional issue there?
		MR. MICHAEL JOHNSON:  Let me just try to answer that and
then I'll get help from folks who perhaps have more of a statistical
background than I do.
		We actually talked about this issue a little bit at the Agency Action Review Meeting.  And it is important, as you say, to fit the right curve, the right function, to the data that we're trying to analyze.  We recognize that there is, as we go further out, an inherent flaw in the approach that we're taking, and that's why, on that last slide when I talked about future development, we think the salvation for this process in the long run is to identify risk-informed thresholds, so that we're not focusing on trends and what the curve and the data are doing, but on thresholds that we would look to be crossed for the Agency to take action.  That's the long-term solution.
		We are, in fact, looking at the data.  We've been carefully fitting the curves to make sure that as the functions shift over -- if it's no longer exponential, if it's a linear function -- we do that in a way that --
		COMMISSIONER McGAFFIGAN:  Just watch that, because it looks like you've been fitting exponential decay curves so far.
		The other issue in this paper I just want to bring up: you talked about the ASP program results through 1999.  In 2000, it looks like we had four events from Figure 2 on page 20 -- four events, at least preliminarily, in the 1 to 9.9 x 10^-4 range, which is high compared to previous years, if I'm reading the table right.  So do you have any comment?  We were doing pretty well in the 1993 to 1999 time period, but is there any significance to this one blip up in 2000, where we seem to have four events?
		MR. MICHAEL JOHNSON:  Unfortunately, Commissioner, I'm
probably not the best person to answer that question.  The way we
developed this input to the industry trends process is we take this
directly from the analysis that the Office of Research does and those are
the results that we are presenting.  And in fact, in a report just issued
by the Office of Research where they considered that particular year,
their analysis of that data was that there weren't any statistical --
		COMMISSIONER McGAFFIGAN:  Somebody has popped up to the
microphone, so perhaps --
		DR. BARANOWSKI:  Dr. Baranowski, Chief of the Operating Experience and Risk Analysis Branch; we do the accident sequence precursor program.  We've looked at the preliminary data, and although it's up a little bit, we looked at the statistical significance and it doesn't come above the 95 percent line, nor does it change the trend line at this point -- but we haven't completed the data either.  And the issue raised earlier about the exponential fit is correctly described by Commissioner McGaffigan.  We are looking at developing pure thresholds as opposed to fitting exponential curves out into the future when things do flatten out into the tails.
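The statistical point in this exchange -- that a series which merely flattens out will eventually register as adverse against an extrapolated exponential fit -- can be illustrated with a short sketch.  The data below are invented for illustration and are not NRC indicator values.

```python
import math

# Sketch of the curve-fitting issue discussed above: data that decline
# exponentially through 1999, fitted by least squares on the log scale,
# then extrapolated.  A series that simply flattens at its 1999 level
# ends up well above the extrapolated curve, even though performance is
# still historically good.  All numbers here are invented for illustration.

years = list(range(1993, 2000))
events = [3.2, 2.5, 1.9, 1.5, 1.2, 0.9, 0.7]  # hypothetical events/plant-year

# Ordinary least squares on ln(events):  ln(y) = a + b * (year - 1993)
t = [y - 1993 for y in years]
ln_y = [math.log(v) for v in events]
n = len(t)
tbar, ybar = sum(t) / n, sum(ln_y) / n
b = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, ln_y)) / \
    sum((ti - tbar) ** 2 for ti in t)
a = ybar - b * tbar

def predicted(year):
    """Extrapolate the fitted exponential trend to a given year."""
    return math.exp(a + b * (year - 1993))

flat_value = events[-1]              # the series merely flattens at 0.7
assert b < 0                         # the fitted trend is declining
assert flat_value > predicted(2002)  # flat tail sits above the extrapolation
```

A fixed, risk-informed threshold -- act only if events per plant-year exceed a pre-set value -- would not flag the flattened series, which is the long-term approach the staff describes.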
		COMMISSIONER McGAFFIGAN:  Okay.  Let me switch to a different line of questioning, and Luis may need to head to the microphone.  I'm a little concerned about the Farley Unit 2 situation.  We sent a letter in June to the licensee saying that we had made a preliminary determination of a yellow finding in the OSRE conducted last September.  That is still preliminary, but if it turns out to be sustained through the process that you'll go through in the next month, they will have had, for several quarters during this past assessment period, multiple degraded cornerstones, and will be a Column 4 plant.  
		I'm a little concerned about the speed with which we acted on what is a pretty important SDP finding.  I know the lack of speed in getting there is a generic issue we'll talk about tomorrow, and justice delayed oftentimes is justice denied.  But as I understand the process, Farley is firmly in Column 1 at the moment.  In the latest quarter, if I click on the webpage, Farley is a Column 1 plant.  That will not change, perhaps, this year.  They have another OSRE in September.  If they do well on that, they may well remain a Column 1 plant all the way through the three quarters of this assessment period and next year never be discussed.  And it's a very peculiar situation we're in, where for several quarters -- I guess the second, third and fourth quarters of this past assessment period -- they may have belonged in either a multiple degraded or a single degraded cornerstone situation.  So what do we do with Farley?  I know you had a meeting with them.
		MR. REYES:  Luis Reyes, Regional Administrator for the NRC Region II Office in Atlanta.
		Commissioner, let me try to answer first from a programmatic point of view and then come back to the specific example.  
		The Revised Oversight Program requires us to analyze and assign a risk value to any finding that the staff may have on any subject matter.  So the program, as presently structured, if you have a finding and you assign a risk value to it that gets to white, yellow or red, then requires us to go back to when the condition existed.  So you could have a situation at a plant, any plant, where you have a finding today; it may be risk significant, and it goes back in history several years, for the time it existed.  If, in that period of time, the plant had other issues which would put that plant in a multiple or a degraded cornerstone, then it could get into the situation you're talking about, which puts it in the fourth column.
		We're taking a hard look at the ROP on how to address that, because it may or may not be the best way, four or five years later, to go and do all the actions that the matrix requires when all the previous issues have been resolved to the satisfaction of the staff and you only have one remaining issue.
		COMMISSIONER McGAFFIGAN:  In the case of Farley, they had some white indicators in the mitigating systems cornerstone that got them into that degraded cornerstone.  Those have turned green.
		MR. REYES:  Correct.
		COMMISSIONER McGAFFIGAN:  So that's largely behind you.  They have this OSRE from last September where they have a preliminary yellow, maybe a final yellow, and there's another OSRE planned for September.  But under the Action Matrix, some time this fall, long after they were in the multiple degraded area, we will do a 95003 inspection unless we deviate from the Action Matrix, and that may or may not make a lot of sense, depending on your judgment as to what the situation at Farley really is.
		MR. REYES:  Yes.  The key thing is that, first of all, immediately when a situation comes up -- in this case, the safeguards issue -- we look into making sure it's taken care of, and the licensee has been in the process of doing that.  The resolution in this case is an OSRE, because it happens to be that kind of finding, and we scheduled it that way ahead of time.  That's in process.
		Now the question is, when we finish assigning the risk to that finding in the next few weeks, if indeed it gets you into the fourth column, then we have a decision to make.  If we don't follow the matrix verbatim, we will have to go to the program office and the EDO and recommend a deviation for the particular situation.  We haven't gotten to that point, but those are the options under the program.  In the next few weeks we'll finalize a meeting with the licensee, we'll finalize the risk assignment to the finding, and we'll come to that crossroad where we either do what's required by the Action Matrix or come to the EDO and explain why we think something else makes sense.
		COMMISSIONER McGAFFIGAN:  So the EDO seems 
-- well?
		DR. TRAVERS:  You are exactly right.  There's a temporal
disconnection between the actual classification and the actions that may
be either underway already or completed already, including our actions to
oversee from our perspective what we think needs to be done.
		So the process has a little issue within it that we're
looking at, but it also has, we think, the flexibility for us to address
it in a way that we can come to you and say it makes sense.
		COMMISSIONER McGAFFIGAN:  Do we need a timeliness goal for SDP determinations?  If I'm in NRR and Mr. Johnson is the Deputy Director, I've got timeliness goals for licensing actions.  I've got timeliness goals for enforcement actions in which I have to concur.  Timeliness goals here, timeliness goals there.  And rulemakings -- the Commission is demanding that rulemaking X or rulemaking Y be placed before it in a finite period of time.  I used to work for Senator Bingaman; the thing he wasn't monitoring was probably the thing that didn't get done.  We honestly knew that this was a likely yellow finding, I believe, last October, under the revised SDP for the system protection area, but it took us from October of last year, when I was told this orally, until June before we hit them with a letter.  And I understand that the group of people who worked on security have been writing security papers for us, preparing rulemakings, doing all sorts of things, so I think they had more to say grace over than probably any other part of this Agency.  They also have KI and other interesting issues to work on. 
		But I wonder whether we shouldn't have something that drives timeliness, especially for an SDP finding that could move somebody two columns to the right in the Action Matrix -- whether we shouldn't have a timeliness goal in that area.
		MR. JON JOHNSON:  We agree.  We can do a lot better on timeliness, and we have taken a lot of actions to improve it, including training inspectors on how to implement the SDP evaluation process, but we agree, we can do much better. 
		We also have nonsecurity issues.  We have some difficult fire protection issues that we're struggling with that are getting old, and we need to get on those and resolve them from a risk standpoint; they're very difficult.  We agree we need some timeliness goals.  We do track timeliness for enforcement -- if we're taking enforcement actions, we have some specific timeliness goals for those -- and we are considering a tracking system.  But we agree, we need to track the timeliness.  And as Luis said, under our program this could happen even with an item that is not untimely.  We could identify an old design issue of some kind that was in existence in a previous year or years, and we have to go back and ask what we would have done differently.  What Luis described is in our program, and the decision would have to be made: do we want to do a large team inspection or not?  And if not -- if we didn't want to implement the Action Matrix -- we would have to go back to the EDO.
		COMMISSIONER McGAFFIGAN:  That's not unique to this
program, of course.  That's always been a --
		MR. MICHAEL JOHNSON:  And if I could just add --
		MR. KANE:  I don't think it inhibits us from doing the right thing, but I agree that we do need to establish these timeliness goals.  Certainly, as we said, the program is one we're just into.  We're learning some of the nuances about it, and that's obviously one of them.
		MR. MICHAEL JOHNSON:  I was just going to add -- not to say for the third time what's already been said -- we have timeliness goals and we need to do a better job of meeting them, and we're going to talk more tomorrow about the complications with respect to the SDP and some of the challenges ahead for us in meeting those goals.  Notwithstanding that, I did want to leave you with the perspective that even if we had an SDP that got to a very timely result, we're always going to have this temporal disconnect, if you will, that's going to cause us to go back and relook at the actions that we took -- for example, with respect to performance indicators, if we get a resubmitted performance indicator result, as does happen on occasion.  We have to go back and look at what actions we took and readjust those.  You're always going to be looking back at the previous quarter, with what you thought you knew and the findings that you had, to make sure you ended up in the right place.
		We think it's important that the program not have us react
to preliminary findings.  We really do want to make sure that we've
reached the final determination with respect to our significance before we
take action, but we recognize that we need to do that in a timely way so
that we're not looking at these extended periods where we've had this
issue linger.
		COMMISSIONER McGAFFIGAN:  Thank you, Mr. Chairman.
		DR. TRAVERS:  If I could just add 30 seconds.  I just want to add that we think this process is flexible enough that it doesn't constrain us, even though everything that everyone here has said is operative -- that is, we ought to have timeliness goals, and we're certainly learning through doing.  But it's also noteworthy, I think, that at Farley we were in a position to take the actions that we thought were appropriate, and we didn't feel constrained within the process, which under the Action Matrix would cause us to look at some additional special inspections.
		CHAIRMAN MESERVE:  Commissioner Merrifield.
		COMMISSIONER MERRIFIELD:  Thank you, Mr. Chairman.  In the July 17th memo, on page 1, you reiterated that only the plants with significant performance problems are discussed at the AARM, and you define those as plants whose performance has resulted in their being placed in either the multiple/repetitive degraded cornerstone or the unacceptable performance columns, Columns 4 and 5.
		Now, Commissioner McGaffigan has gone into this in some detail, but I'm wondering: having gone through this process in the AARM, do we have sufficient focus on Column 3, the degraded cornerstone column?  Obviously that wasn't discussed in the AARM.  Do you think it should have been, and are you comfortable that it wasn't?  Are we looking in the right places in that grouping?
		DR. TRAVERS:  I think we're still rolling up experience, but I think our view is that we had the right focus for the AARM.  When we talk about discussion plants, we sort of focused on the ones in that column.  We don't necessarily feel constrained from discussing other issues amongst ourselves, as we've indicated.  It turns out that at this meeting we didn't get into a discussion of Column 3 plants, but in the main that's because we had had previous discussions at end-of-cycle meetings and roll-ups between the Regional Administrators and the Offices of Nuclear Reactor Regulation and Research.  And so we were comfortable going into this meeting that where we were was the appropriate focus.
		COMMISSIONER MERRIFIELD:  So there were other
opportunities, it's not as if it's a one shot deal.  There are other
opportunities to discuss not only Column 3, but also obviously other
plants, in Column 1 and Column 2 --
		DR. TRAVERS:  You're exactly right.  I call them
opportunities and you're correct to use that word.  But if you look at the
process, in fact, there's a required point along the course of any given
year where plants that have performance issues are discussed among senior
management.  We think that's a good --
		MR. MILLER:  If you look at Millstone Unit 2, for example, as a degraded cornerstone plant, we followed the process and did a 95002 inspection, which is a level above the baseline.  There were the end-of-cycle discussions within the region, and then at the end of that there's a discussion with Sam and Jon and others at NRR of selected plants like that.  So it has worked quite well in terms of focusing on plants that have something beyond the baseline, and I think we can say that we have had appropriate focus and discussion.
		COMMISSIONER MERRIFIELD:  I think it's important to put that in context, only because one might take from the discussions that the AARM, to the outside, is the only time in a year we're going to be taking a look at where these plants stand.  In fact, what you're doing in the AARM is looking, obviously, at those plants with the greatest degree of significance, but there is other periodicity through the year in which we're reviewing all the plants, to our satisfaction or dissatisfaction with their performance.
		MR. DYER:  Commissioner, I think also -- and to sort of address Commissioner McGaffigan's earlier concern too -- at the end-of-cycle roll-up meeting where the Regional Administrator has discussions with Sam, and certainly in the case of Region 3, we talked about the degraded cornerstone plants and whether there was any work in progress or any indication that, by the time we got to the Agency Action Review Meeting or downstream, we might be in the Column 4 or Column 5 areas.  Do we have any work in progress?  So we briefed him on those plants as well as the others.
		COMMISSIONER MERRIFIELD:  But just for information's sake, those roll-up meetings, in fact, discuss all of the plants in your region, when you meet with Sam and Jon.  It's an opportunity.
		MR. DYER:  It's an opportunity, yes sir.
		COMMISSIONER MERRIFIELD:  In the memo, in the narrative relative to Indian Point 2, it states that the senior managers discussed the means to ensure that established licensee performance improvement plans would be continued following a potential -- underline potential -- operating license transfer to Entergy.  
		Could you share a little bit more about the outcome of that discussion, and are you, in fact, comfortable with the direction in which things are going right now?
		DR. TRAVERS:  We did discuss that and I'll let Hub address
your question, Commissioner.
		MR. MILLER:  This is a question that has come up a lot at public meetings and other places, and we've been pretty consistent in our answer, and that is that our process does focus us on the performance issues independent of who the owner is.  Having said that, it is significant that a number of the issues that ConEd has identified, and a number of the initiatives in the improvement programs -- in the area of design, for example -- are going to span out over several years, and so there does become a concern about what the future owner would do.  And so for that reason we have identified this as something we would meet with Entergy on, following the transfer if there is a transfer.  We have already had Entergy at regulatory performance meetings -- one in April, for example, on design -- where they remarked that they are committed to addressing those issues.  We do not expect Entergy to have exactly the same processes that ConEd has, but the question is whether the programs are fundamentally designed to address the broad issues.  We will continue, through the periodic management meetings that I talked about and through our inspections, to determine whether or not there is a significant change.  I think the bottom line is that the program is structured such that, through our inspections and oversight activities, we will have a good handle on the direction.
		COMMISSIONER MERRIFIELD:  I know and appreciate, Hub, that you and I have discussed the situation at Indian Point 2 in a variety of circumstances.  And although I have not visited there recently, I was there in the last fiscal year.  Let me ask the question this way: do you think the Commission as a whole is providing you and your staff with the resources necessary to do the type of oversight you think is appropriate at Indian Point 2?
		MR. MILLER:  Yes.  It's been tight this past year.  We've gotten help from the other regions.  I think we've learned a lot, and we fed that back to Jon, Mike Johnson and others on what to expect if there is a multiple degraded cornerstone.  Indian Point was unique also in that we had the steam generator replacement project placed on top of it, and we had this intense public interest, which had a very large impact on us from a management point of view.  And so the short answer is yes, we got the help we needed.  We had to defer one inspection at one other plant; that was the only casualty of it.  So I would say we were able to do it, and we're going to learn from this and factor it into future budgeting and the like.
		COMMISSIONER MERRIFIELD:  In the June 22nd paper to the Commission on development of an industry trends program, you indicate on page 6 that the staff is mindful that trends in individual indicators may be considered in the larger context of their overall risk significance.  And then it goes on, on page 7, to provide a hypothetical example in which there may be an increase in automatic scrams, but overall risk may decline because of improved performance in other areas, such as safety system availability.  I've got sort of two reactions to that, and I'm interested in any comments you may have.  The first one is, when we make that kind of comparison -- we say, well, an increase in scrams may be okay because there are other mitigating factors and it may be of lesser risk significance -- that gets us into that trade-off between where we are risk-informed and where we are risk-based in our thinking.  We don't worry about scrams because we did a risk comparison and it's not so concerning.  That's issue number one.
		Issue number two is, I think, that where we see a trend, we've got an obligation, because of our public confidence concerns, to engage with the industry, to communicate with our stakeholders, and to make it transparent that where we see a trend, we're going to share it with people.  
		I didn't know if you had any reaction to those two
particular thoughts.
		MR. MICHAEL JOHNSON:  Those are good points, Commissioner.  I really don't.  In fact, the way that we built the process today is to say that we're going to call it a statistically significant adverse trend if we see that increase in scrams, for example.  It's just that in the evaluation of what we're going to do -- what actions we may take -- we'll look in the broader context, to be risk-informed, to make our decision about at what level we should engage and how we should engage to correct that particular problem.  
		This is another example of why we think it's important for us to develop risk-informed thresholds, to the extent we're able, for the individual indicators, to get out of that uncomfortable situation that you point out.
		COMMISSIONER MERRIFIELD:  Okay, thank you.  Last question in this regard: there were a variety of other issues discussed at the Senior Managers meeting, one of which was the SES candidate report on communication.  I know this is an effort the EDO had tasked our SES candidates to conduct.  I'm wondering if you could comment at all on the report.  I know you're going to be making some more formalized comments on it, but perhaps you could share the flavor of the discussions about those particular recommendations.
		DR. TRAVERS:  I'd be glad to; thanks for the question.  In fact, the whole team recognizes, going forward, the continuing importance of internal communications.  At the outset, I'd have to say that the effort the candidates put in is probably the hallmark, from my standpoint -- it maybe sets the standard for future SES candidate classes -- because of the scope and depth of what they did and the insights, which in my view transcend the specific recommendations that you can glean from this report.
		Just a few days ago I participated with Mike Johnson and the rest of his class in a roll-out of this report and its recommendations to the entire NRC staff.  It was broadcast to all the regions, to all the sites, the Technical Training Center, and so forth.  And what I told the group assembled about our consideration of the recommendations and the report going forward is that, fundamentally, we certainly accept the spirit in which all of these recommendations and insights were gathered.  The management team, in fact, is in the process of seeing what more we can do.  We think we're doing more than we have in the past in the area of internal communications, and some of the discussion we had at the AARM was a recap of some of what we're doing today, and have perhaps instituted relatively recently, and some ideas that we have for going forward.
		I explained my own view to the staff that I think we're
doing a lot, but that we can do more.  And frankly, we're reviewing the
report to see both in the near term and perhaps in the longer term what we
can do to enhance our internal communications.
		Specific to some of the recommendations -- whether you develop a champion for the Agency or distribute that function among the management team -- we'll have to see, because there are implications for budget and resources attendant to some of the recommendations in there.  But generally, the report and the insights you can glean from it, I think, put the management team in a much better place for understanding and reacting appropriately to what is a legitimate continuing concern of our staff regarding internal communications.
		I did emphasize in my discourse with the staff something the candidates emphasized, and that is the shared responsibility that we all have -- staff and the management team -- for internal communications, to make it work in a way that optimizes the situation internally.
		So we are looking, as I said, at the short term, and I expect to come out with something that proposes some near-term actions that we can take.  I personally am looking to do some things that I'm not doing today, and I expect to announce some of those.  But in the longer term, we'll work into some further discussions with the senior management team, and certainly the Commission, as to what we ought to do.
		COMMISSIONER MERRIFIELD:  Thank you, Mr. Chairman.
		CHAIRMAN MESERVE:  Thank you, Bob.  I have a few questions for Hub Miller about Indian Point 2.  Both in your annual assessment letter and in your briefing this morning, you indicated that, from your perspective, progress had been slow.  I think we all have to live with the circumstances that have existed in the past; obviously we wish they had not occurred, and we have to learn from them.  But perhaps for me the most troubling aspect of what you said was that you're not seeing the improvement occurring at the rate at which you would hope to see it.
		I wonder if you'd share with us your views on whether, in fact, things are getting better, whether the rate of improvement is getting better or not, and what the causes of the difficulty are.
		MR. MILLER:  We did not say that it was not at the level that we would want to see it.  I say that because, in some respects, it's almost predictable, given the nature of the issues that existed at the site, and given what was on the plate of the licensee last year regarding steam generator replacement -- which was something undertaken on very, very short notice -- that there would be slow progress.  
		Now, it depends upon what time frame you're looking at.  If I look back over the last several years, I would say it isn't and has not been what it should have been.  If I look back over the last year, I would say it's slow.  That's just telling it the way it is.  It's somewhat understandable, given the challenges that have been placed on the plant.  
		I think one of the significant things -- and I highlighted this in my remarks -- was the alignment with the business plan.  What that means is that not only are these plans on paper, but there's funding associated with them, and I think that gets to one of the past significant problems at the station.  It led to what I think was a failure to follow through on a number of the performance improvement efforts, because if you look at the issues that were documented in the 95003 report and the issues as they were described here two years ago, they're very, very similar.  I think that commitment to funding these things -- putting them on a solid foundation, where you can track them to the budget -- has made a big impact.
		So we'll see.  One of the areas where the progress was very limited was design.  For dealing with design-related issues, the company laid out, at the first meeting that we asked for following the 95003 inspection, a quite comprehensive program of reconstituting a number of calculations and developing road maps for the designers to use with the calculations.  So I hope that answers your question.  I think it's been slow, but much of at least the last year or so is somewhat understandable, given what they've been addressing.
		CHAIRMAN MESERVE:  Is it your expectation that the alignment of the business plan with the need for action removes the barrier that has caused the difficulty in having corrective actions proceed efficiently?
		MR. MILLER:  It removed one of the major barriers.  One of
the other things that was talked about in the 9503 inspection was the
importance of the plans that underlie the broad things talked about in the
business plan.  
		The company has set priorities.  I mean, they have identified a number of things that they really simply must do first and put less emphasis on other things that need to be done, but not immediately.  The implementing plans are very important.  In our end-of-cycle letter, and in fact in the 9503 inspection report as well, we focused on that as an important area for us to be focused on in our inspections and in these meetings that I talked about.
		I'll give you an example.  Again, in the area of design control, I mentioned targeted inspections.  We've got two inspections that we are conducting: an initial inspection looking in detail at the scoping of what they plan in that area, which I believe is happening some time within the month, and one later in the year where we will look in detail at the actual implementation of the efforts in that area.
		CHAIRMAN MESERVE:  Good.  Thank you.  This is a question for Mr. Dyer.  I know that you're performing supplemental inspections in certain areas where you're unable to use the performance indicators yet, and I'm curious as to when you expect that you will be able to move to the full performance indicator suite for D.C. Cook.
		MR. DYER:  Chairman, there were five performance indicators for which we were doing this.  One of the performance indicators, for emergency plan drilling, will be fully reported at the next quarter.  So there are four in the area of mitigating systems, and they actually require 12 quarters of historical data.  So if we fully time out, it will be quite a while.  One of the things I alluded to in my presentation was that we're working with the licensee to figure out how we recapture that time.  Conceivably we could go all the way back to when they were operating prior to 1997, or it may be more appropriate to capture and review the minimal amount of time we have toward the 12 quarters, so that's in progress right now.  We should have an answer as to where it's going by the next quarter's data reports.
		CHAIRMAN MESERVE:  Are you learning things from our augmented inspections, which you're doing because you don't have performance indicators to rely on, that you would not have learned from the performance indicators?
		MR. DYER:  No.  Well, we are having findings.  They've all been green findings that we refer to the licensee.  The insights -- it's a level of assurance.  We're doing increased walk downs of systems.  We don't have reliability information on the safety systems, so the inspection program is complementing the missing performance indicator data.  We're just ensuring on a greater frequency that the systems are operating the way they should be.
		CHAIRMAN MESERVE:  This is a question for Mike Johnson.  I think you ended with a slide that would suggest that the performance trend data is on the web, but I think you indicated when you talked that you plan to put it on the web.
		MR. MICHAEL JOHNSON:  That's correct.
		CHAIRMAN MESERVE:  When do you expect to be able to do
that?  I think this is the kind of information in which there will be
great public interest and this ought to be sped along and I'm curious as
to where we are in that.
		MR. MICHAEL JOHNSON:  We are nearly ready.  We anticipate putting them on the external web in August.  We wanted to wait until the Commission paper had gone up and until we had this briefing on the process before putting that up.  So in August we will be putting the quarterly industry trend information on the external web.
		CHAIRMAN MESERVE:  Let me just make a comment -- take it for whatever value it has.  I've been struck, on visiting some plants, to see that at least some of the licensees have a huge number of things that they're trending and evaluating that far exceeds anything that we're doing.  Some of it is for their internal purposes, and we ought to be encouraged by that.  But it does seem to me that there may be some value in probing into the things that, for their own reasons, the industry is trending that might be of value to us as well.  That's another source of trend information; they may have thought of some things that we haven't thought of, and there may be a source of data or a source of ideas there about things we maybe should be universally trending that we might learn from.
		MR. MICHAEL JOHNSON:  Thank you.  We are working, and continue to work, very closely with the industry as part of the NRC-industry ROP working group.  We have briefed them on the process and gotten their insights.  We've looked at the INPO and WANO indicators, for example, in picking the suite of indicators that we've chosen.  We did the same thing for the ROP indicators.  But your point is well taken and we'll continue to work on that.
		CHAIRMAN MESERVE:  Thank you.  Commissioner Dicus?
		COMMISSIONER DICUS:  Thank you.  Some of these questions seem like they'd be more appropriate to ask tomorrow when we get into talking about the ROP, but these two are so linked, let me get into a couple of things.
		As I understand what we're doing with industry trends, we're fundamentally dealing with, if I can use the term, averaging.  We're looking at our fleet of plants, and we're looking at things that we consider important, as I understand the trending.
		But have we considered, and you can educate me on this point, whether our trending should be design specific?  Should we separate BWRs from PWRs, and are we doing that?  Or are our trends more fleet-wide averaging?
		MR. JON JOHNSON:  Well, I think your point is a good one.  One of the things we want to trend industry performance for is not just to look at an individual plant's performance, but also to look at our programs.  As an example, take grid stability, or losses of off-site power: an individual plant may still be in a low-risk situation, and that would be fine, but if we stand back and look at a large number of these, do we need to do something different?  Do we need to do some rulemaking or something like that?
		On a plant-specific basis, one area where we have taken
into account some plant-specific information is on the radiation exposure
data, recognizing that some of the boiling water reactors have a larger
challenge, let's say, and so in the assessment of their performance we've
taken that into account.  But we continue to look at these indicators and
we're not done and we don't feel like we have the perfect set yet, so we
need to continue to look at that.
		COMMISSIONER DICUS:  So it's a work in progress.
		MR. JON JOHNSON:  Yes.
		COMMISSIONER DICUS:  Fair enough.  Just reassure me on a point.  When we talk about a statistically significant adverse trend, by definition we're saying it's something that has regulatory concern or significance.  Could we get into a trend that we may have identified, and may be calling that, but that fundamentally doesn't have a safety or significant regulatory impact?
		MR. JON JOHNSON:  I might be able to answer some of that in detail, but we don't want to trend things unless they really relate to safety in the first place.  As the Chairman indicated, the utilities trend many things, economics and so forth, and we want to focus on things that are safety significant.
		COMMISSIONER DICUS:  That's the reassurance that I wanted
from it.
		On trending, are we basing it more on indicative findings,
rather than predictive findings?  Do we have any trending that would be
predictive?
		MR. JON JOHNSON:  We haven't been able to come up with very good predictive indicators.  We've been looking at indicators for a number of years, and we're very good at looking at the past, I guess, and coming up with some, and we continue to look for those, but the ones we have now are still looking at the past.
		I know that we have some requests in to the Office of Research to help us look at the possibility of more risk-based indicators, and they're still evaluating that.  We want to get indicators for some areas where we don't have them, such as shutdown operations and containment, and we would like to get some predictive indicators, but I don't think we've really been successful in doing that yet.
		COMMISSIONER DICUS:  But you are looking at it?
		MR. MICHAEL JOHNSON:  And if I can add to that, just to remind us: when we started the ROP and were trying to figure out what indicators we would use, we came across a couple of indicators, one being safety system functional failures, which is in today's ROP PIs, and the other the transient performance indicators.  When we did the benchmarking, comparing them to the plants that we actually put on the watch list, and the industry agreed those were plants that had performance issues, those two indicators tended to have a good correlation with those plants.  And so Jon is right, we don't really have a suite of indicators that we would point to as being predictive, but those two seem to have a strong correlation: if you had a plant that was having problems with respect to those indicators, you had a plant that you needed to look at, and that's why we captured those in the ROP PIs and they exist.
		COMMISSIONER DICUS:  One final question. Obviously, our
trending is for our industry and the people that we have the
responsibility to regulate.  Have you looked at international trending and
international programs and made any comparisons?
		MR. JON JOHNSON:  We have requested that information.  We
specifically -- we have a lot of interface with international programs on
specific component problems.  As an example, the recent concerns we have
with control rod drive cracking, we've been working with our international folks, especially the French, to see their experience with that cracking.
		We have had some initial discussions about the grid.  When we meet with them we ask questions about that, but I think we can do a lot more in that area.
		COMMISSIONER DICUS:  Thank you.
		CHAIRMAN MESERVE:  Commissioner Merrifield has just
indicated to me he has a quick question.
		COMMISSIONER MERRIFIELD:  Yes.  Thank you, Mr. Chairman.  I just wanted to get an answer for the record.
		In public meetings I've had over the course of the last six months, I've been frequently asked about the impact of the situation in California relative to the San Onofre and Diablo Canyon plants, and while I knew that wasn't specifically an issue for the Agency Action Review Meeting, my yes-or-no question, directed to Dr. Travers, is: are we comfortable with the activities undertaken by the licensees at those two facilities, that they are doing what is necessary to maintain the safety of those reactors?
		DR. TRAVERS:  The short answer is yes, we are and we've
been taking a number of additional opportunities to scrutinize the level
of performance at those facilities, including efforts by Ellis and other
senior management.
		COMMISSIONER MERRIFIELD:  Thank you, Mr. Chairman.
		CHAIRMAN MESERVE:  Good.  I'd like to thank the staff very much for a very helpful presentation.  The inspection activities, as I indicated at the outset, are central to the Agency.  This is a very important meeting as sort of the culmination of what I know reflects an enormous amount of work, and very important work, by the staff.
		I'd like to thank you very much.  With that, we're
adjourned.
		(Whereupon, at 10:46 a.m., the meeting was concluded.)