                  UNITED STATES OF AMERICA
                NUCLEAR REGULATORY COMMISSION
                             ***
        BRIEFING ON STATUS OF NRC OPERATOR LICENSING
              INITIAL EXAMINATION PILOT PROCESS
                             ***
                       PUBLIC MEETING
           
                              Nuclear Regulatory Commission
                              Commissioners Conference Room
                              One White Flint North
                              11555 Rockville Pike
                              Rockville, Maryland
           
                              Tuesday, June 18, 1996
           
          The Commission met in open session, pursuant to
notice, at 10:02 a.m., the Honorable SHIRLEY A. JACKSON,
Chairman of the Commission, presiding.
           
COMMISSIONERS PRESENT:
          SHIRLEY A. JACKSON,  Chairman of the Commission
          KENNETH C. ROGERS, Member of the Commission
          GRETA J. DICUS, Member of the Commission
           
STAFF AND PRESENTERS SEATED AT THE COMMISSION TABLE:
           
John C. Hoyle, Secretary
Karen D. Cyr, General Counsel
James Milhoan, NRR
Frank Miraglia, NRR
Bruce Boger, NRR
Stuart Richards, NRR
                    P R O C E E D I N G S
                                             [10:02 a.m.]
          CHAIRMAN JACKSON:  Good morning, everyone.  The
purpose of this meeting is for the NRC staff to brief the
Commission on the results of the operator licensing initial
examination pilot process.
          The Commission previously consented to the staff's
proposal to initiate a pilot process to examine the
feasibility of revising the operator licensing program to
allow facility licensees to draft the written examinations
and operating tests that the NRC administers as part of
initial operator licensing.
          The Commission directed the staff to carefully
consider the experience gained from the pilot program before
considering full implementation.
          The staff provided its recommendation regarding
this program to the Commission in SECY 96-123.  The
Commission now must make a decision on whether to fully
implement the new examination process on a voluntary basis
and consider the pursuit of rule-making to require all power
reactor licensees to implement the new program. 
          Before the Commission makes its decision, it is
important for us to review the results of the pilot program
and hear not only the potential gains to be made and
achieved by full implementation, but also to weigh any
potential negative consequences of changing a program of
such importance, that we have successfully performed for
quite some time.
          And I would say that in reading the SECY paper,
96-123, it leaned very much on the gain side and there was
essentially very little in terms of the consequences.  So it
would be important today that the balance be presented.
          So the Commission is very interested in the pilot
results, especially a comparison, and there was an
interesting comparison between historical pass/fail rates
and those experienced under the pilot program.  
          The Commission would also like to understand what
controls are planned, because, again, the paper was a bit light
on that, to ensure that a challenging test of initial
licensed operator applicants continues to take place.  
          Finally, a discussion on the staff's position that
a rule change would be necessary to implement the revised
program, if it is considered acceptable, is warranted.
          So we'll look forward to a full discussion of all
aspects of the proposed revision to the operator licensing
program with you today.
          Now, I understand that copies of your presentation
materials are available at the entrance to the meeting.  Do
any of my fellow Commissioners have any comments at this
point?  Mr. Milhoan.
          MR. MILHOAN:  Good morning.  I think you have
certainly stated the purpose of the meeting and I will not
repeat that.  With me at the table today is Frank Miraglia,
Deputy Director of NRR; Bruce Boger, Director, Division of
Reactor Controls and Human Factors; and, Stu Richards, Chief
of the Operator Licensing Branch.
          Bruce will lead off the first part of the
presentation.  Bruce.
          MR. BOGER:  Good morning.  I'd like to spend a few
moments with you to review some background information to
set the stage for Stu Richards, who will really talk about
the pilot exam process.  Can I have the slide numbered two,
please?
          [Slide.]
          MR. BOGER:  As a result of our favorable
experience with the transfer of examination responsibilities
to facility licensees as part of the requalification
examination process and the potential for us to save about
$3 to $4 million in resources, the staff felt that we ought
to embark on a path to consider alternate approaches to the
exam process.  As you indicated, Chairman Jackson, we
outline those in SECY 95-75.
          Our underlying authority to conduct examinations
stems from the Atomic Energy Act.  It requires us to
determine the qualifications of candidates, to prescribe
uniform conditions for licensing, and to issue licenses as
appropriate.  10 CFR Part 55 contains our guidance and our rules; it
establishes the requirements for applications and the
contents of examinations.  It specifies that a written
examination and an operating test be administered, but it
does not state who will create the exam or who will write,
administer, and grade the exams.
          NUREG 1021 is the NUREG that provides the guidance
to the examiners.  Specifically, it provides procedures for
examiners to use and follow to prepare and conduct exams. 
Historically, these examiner standards have been used by the
NRC and contract examiners.  However, facility licensees
have had considerable experience with them and are quite
familiar with the standards.
          As you indicated, we did receive an SRM indicating
that we could go ahead with the pilot process and we are
here today to talk about the final implementation.
          CHAIRMAN JACKSON:  The hoped-for final.
          MR. BOGER:  Yes, ma'am.  When we considered the
exam process a little over a year ago, we had several
objectives in mind and I thought we might go over those just
to see where we stand.  We saw this as an opportunity to
improve our efficiency and maintain effectiveness if we
maintained a high level of NRC involvement in the process
and also to get the people that have the most knowledge of
the plant design and operations, the facility licensees, to
contribute more in the exam process.  At the same time, that
would allow us to reduce our reliance upon contractors.
          Our intent was to remain consistent with the act
and Part 55 and, in fact, we originally considered that a
change to Part 55 would not be necessary because the rule
was silent with respect to who administers the exams and
also because we determined that it was not a backfit.
          However, we now believe that rule-making is
appropriate because we want it to be a mandatory process
across all licensees and also because it's a task that has
been historically performed by the government, by the NRC,
and that would require rule-making or issuance of orders. 
We decided to stay away from the issuance of orders because
that would lead to individual orders for each plant and we
felt that rule-making would be the cleaner and the clearer
way of establishing that regulation.
          We wanted to make sure that the process was not --
changes to the process were not apparent to the candidate. 
We didn't want the test-taker to suffer any consequences
because of the change in the examination process. 
Accordingly, the exam format, the exam content and the level
of difficulty are expected to remain unchanged from the
current process.  
          One of our objectives was to implement the process
early in 1997.  In view of the aspects of rule-making, we
will have to delay that somewhat.  So it will take a little
longer transition for us if we do, in fact, proceed into a
full implementation.
          Stu has the results of the pilot program that you
requested.
          MR. RICHARDS:  Slide four, please.  
          [Slide.]
          MR. RICHARDS:  I'm Stu Richards.  I'm the Chief of
the Operator Licensing Branch.  The first slide talks a
little about the genesis of the program, the pilot process. 
After briefing the Commission on the proposed changes to the
program in April of 1995, we formed a team of examiners to
define the proposed new process.  All four regions were
represented on the team.  The results of the team's efforts
were used to issue Generic Letter 95-06 in August of 1995.
          Generic Letter 95-06 outlined the pilot process to
the industry and solicited participants on a strictly
voluntary basis.  We advertised the pilot program.  We
scheduled it to run from October of '95 to March of 1996 and we
followed up on the Generic Letter with a public workshop in
September of 1995.  During the workshop, we explained the
process and we fielded questions from the industry.
          We were fortunate to have 20 sites volunteer to
participate.  I might add I think we had a lot of help from
NEI on that.  Between October 1st of 1995 and April 5th of
'96, we administered 22 exams; that is, of those 20 sites,
there were two sites that had two exams.  
          The exams were administered without contractor
assistance.  All four regions participated and all the
reactor vendors were represented in the sample size.
          We had 95 reactor operator candidates and 92
senior reactor operator candidates tested as a part of the
pilot process.  
          CHAIRMAN JACKSON:  The slide says 54.
          MR. RICHARDS:  54, I'm sorry.  You're right.  
          [Slide.]
          MR. RICHARDS:  Slide five starts an overview of
the process and then lays out the facilities'
responsibilities under the pilot program.  In accordance
with the pilot program, facility licensees drafted the
initial operator licensing written exams and the operating
tests, and submitted them to the regional offices for review and
approval about 30 days before they were scheduled to be
given.
          To maintain the uniformity required by the Atomic
Energy Act, the staff expected that the examinations would
be drafted in accordance with our examiner standards and the
supplemental information provided in Generic Letter 95-06. 
We did not consider any alternative testing methods.
          The Generic Letter also contained the
supplementary criteria in addition to what was already in
the examiner standards.  The criteria were added recognizing
that this was a change; no longer were we preparing the
exams, but now the utilities were.  So additional criteria
were added to ensure the integrity of the examination
process, to maintain the consistency of the examinations,
and to limit predictability to the candidates.
          Examples of the additional criteria that were
added: to minimize a potential conflict of interest in
writing exams, facility employees who played a substantial
direct role in training the license applicants were not
permitted to write the licensing examinations or tests. 
Because the NRC does not regulate examination banks and
some utilities maintain question banks, we placed limits on
drawing questions out of the examination banks.  We allowed
up to 50 percent of the questions to be drawn from the
utility's question bank.  We allowed another 40 percent to
be drawn as long as those questions were substantially
modified, and required ten percent of the questions to be
new.
          Limits were also placed on the degree to which
examination questions could be duplicated from examinations
or quizzes that the candidates had seen during their
training process or questions being duplicated from the last
two NRC exams administered at that facility.
          Additionally, the licensees were required to state
the source of each written examination item
and, if it was given at the facility in the past, to state
when it was last given, so we could recognize if this was a
question that the candidates had been exposed to in the
past.
          CHAIRMAN JACKSON:  Let me just ask.  How many of
these criteria that were in this Generic Letter then were
migrated to or remained part of NUREG 1021?  Because it seemed
from the SECY paper that, in fact, you were proposing
relaxations along each of these lines that you said were
part of the pilot. 
          MR. RICHARDS:  I think it's really a mix.  There
were some areas where, in discussions with the industry, we
feel it was appropriate to relax, not significantly, but
there were other areas where we added more restrictions.  So
I think, on the whole, the steps we're taking should ensure the
integrity of the exam.
          CHAIRMAN JACKSON:  Can you walk us through?
          MR. RICHARDS:  It's largely covered on a later
slide, if it would be all right to wait until then.
          CHAIRMAN JACKSON:  Sure.
          MR. RICHARDS:  And if that slide doesn't answer
all the questions, we'll add to it at that point, if that's
all right.
          CHAIRMAN JACKSON:  Yes.
          MR. RICHARDS:  The last bullet there talks about
the utility administering and grading the written
examinations.  Upon approval by the NRC staff of the written
examination, the utility was allowed to administer the exam. 
You should note that the examination is a 100-point multiple
choice test.  It's not an essay test.  So the answer key is
developed before the test is administered. 
          Administering the exam is largely placing it in
front of the candidate, ensuring that the appropriate
security measures are maintained.  There can be questions
asked during the exam, but there are guidelines on how to
address those questions or whether they should even be
fielded at all.  So it's largely an administrative function.
          The licensee, after administering the test and
grading the test, again, it's a multiple choice test, they
submit the results to the regional offices with any
recommended changes in the answer key.  Then the regional
office would review the results, check the grading, review
any changes recommended by the facility, and then make the
final decisions on whether those changes should be made and
approve the final grading.
          CHAIRMAN JACKSON:  Is the grading machine grading
or is it done by individuals?
          MR. RICHARDS:  It varies.  I think some sites do
it by machine.  Some do it by hand.  I think it's mostly
by hand right now.  
          [Slide.]
          MR. RICHARDS:  Slide six, please.  This slide is
an overview of the NRC's participation in the pilot process. 
As stated before, the examinations, once drafted by the
utilities, were submitted to the regional offices for review
and approval.  We did not restrict the regional offices on
how many changes that they could ask the facility to make.
          Basically, we told the regional examiners that we
wanted this examination to be on par with an examination
that we would administer and were adamant that we would not
administer any examinations that did not meet those criteria.
          The examiners were instructed to
focus on the content and construction of the written exam,
that being the format and the level of knowledge and
difficulty, rather than on the technical accuracy.  One of
the things we felt we could save some time on here was when
we write the exam, frequently the utilities will review it
ahead of time and they'll comment that, well, this valve
number is not right or we don't use that terminology.
          We expected that, because the facilities were
writing the exam, all the terminology and the technical
details would be correct, but we did focus on the level of
difficulty and format and the psychometric principles of the
examination. 
          We did administer all the operating tests.  This
is the simulator portion of the test and the plant walk-
throughs.  This is a change from our intent when we briefed
the Commission back in the spring of '95.  At that time, we
still hadn't worked out the details on what we proposed to
do, but I think the discussion was more along the lines of
there being parallel grading, where the facilities would
actually administer the operating tests, be grading their
own candidates, and the NRC examiners would be doing
parallel grading or some kind of an oversight inspection.
          When we got our team of examiners together and
looked at the pros and cons of that, we felt that, one, we
had about the same number of resources that it would take to
do one or the other.  We felt it would be more independent
for us to do that portion of the exam.  Additionally, the
industry was generally opposed to the concept of doing
parallel grading.  So we decided that we would go with the
NRC examiners performing, in the field, the operating
portion of the test.
          Of course, we graded those tests in accordance
with the existing examiner standards.  So that portion of
the test, once the exam is written by the facility, has not
substantially changed.  
          As noted before, we did review the facility
grading of the written exams.  Management and the regions
continue to review the examination results and we were
ultimately responsible for making the final licensing
decisions.  We did administer the appeal process that was
preexisting before the pilot process was put into effect. 
The operators had the opportunity to ask for an informal
review if they should fail the exam.  The process calls for
the candidates to submit a package to the regional office.
          The regional office has an opportunity to consider the
information and, if it contends that the failure is still
valid, then the package comes to headquarters.  The process
presently has us forming an appeal panel of three examiners
from an uninvolved region or regions.  They consider the
candidates' contentions and then they provide a
recommendation to my branch.  We put together a final
package and Mr. Boger is the final decision-maker on whether
to uphold that failure or overturn it.
          Slide seven, please.  
          [Slide.]
          MR. RICHARDS:  Overall, we felt that the
examinations drafted by facility licensees, subject to review
and changes where appropriate by the NRC staff, were as
effective as examinations written by our contractors and
administered in the traditional manner.
          We felt that -- 
          CHAIRMAN JACKSON:  Let me ask you a question in
terms of pass rates.  How many total SRO examinees were
there during the pilot period on SRO exams?
          MR. RICHARDS:  It's in the table attached to the
Commission paper.  The total number was 85 written and 85
operating tests.  
          CHAIRMAN JACKSON:  Is that the number of pilot
tests?
          MR. RICHARDS:  Yes.
          CHAIRMAN JACKSON:  I'm asking for the total for
'95 for all tests.
          MR. RICHARDS:  For 1995.
          CHAIRMAN JACKSON:  That's right.  
          MR. RICHARDS:  In 1995, I believe we gave 386
examinations.  
          CHAIRMAN JACKSON:  Okay.  And what about operating
exams?  
          MR. RICHARDS:  I don't have the breakdown between
written and operating.  Typically, that 386, the vast
majority of those are going to be people taking both the
written and the operating.  The difference by a few numbers
is in the case where a candidate passes one part of the test
and fails the other.  Typically, we will waive the part that
they passed, and so they only have to retake the part that
they failed.  
          So there's a few odds and ends that way that they
don't add to the same number.
          On the effectiveness issue, we noted that some of
the written -- well, we felt that the as-written
examinations that were administered to the candidates were
comparable in quality and level of difficulty to those that
we traditionally give.  We did note that a number of the
examinations that the facilities submitted required
substantial involvement by the NRC examiners and re-work in
order to get that examination up to the standards that we
felt were acceptable.
          CHAIRMAN JACKSON:  I had a question for you anyway
on that one.  Can you square for me, when you say that they
were comparable, with the substantial number that required
significant work; those don't seem to track.
          MR. RICHARDS:  The examinations that were put in
front of a candidate for an exam we felt were comparable.
          CHAIRMAN JACKSON:  But not as written.  
          MR. RICHARDS:  Not as written.  The examinations
that we received from the facilities, there were some that
were fairly well written.  There were a few that,
particularly in the written exam, we felt were actually
harder than what we typically would have produced.  But
there were others that we felt missed the mark.
          CHAIRMAN JACKSON:  Did you actually keep
statistics on how many needed to be changed and what were
the major kinds of changes?  
          MR. RICHARDS:  I can talk to the major kinds of
changes.  Actually, there were a few that were not done very well
that stick in mind and there were a few that were very well
done generally that stick in mind and then the rest kind of
fall in the middle of the pack and they were in need of some
work, but not too far off the mark.
          CHAIRMAN JACKSON:  Did you keep those statistics?
          MR. RICHARDS:  I can give you the names of the
plants.  I can't tell you ten percent at the bottom and two
percent at the top.  We only had 22 facilities that -- or 22
exams with 20 facilities that participated.
          MR. BOGER:  Is your question directed at the types
of problems that we saw on the exams and whether -- 
          CHAIRMAN JACKSON:  It's twofold.  It's kind of the
-- how widespread or not, what the problems were, whether they
were concentrated at a few plants, and then, yes, then among
those, where you had to make -- have the change and work
with the licensees, what were the kinds of problems you
found.
          MR. RICHARDS:  There were four or five facilities
where it required substantial involvement on our part in
order to get the exam to where we felt it was ready to go. 
The kind of problems that we noticed on the written
examination, probably the number one issue was that too many
of the questions were at too easy of a level, simple memory
level questions.  Sometimes the questions were not
psychometrically sound.  We have instructions on how to
construct questions and in a number of cases, the questions
weren't constructed the way we felt they were appropriate or
that they should be.  But number one was that too many of
the questions were too simple.
          On the operating exams, the exams split into the
simulator portion of the test and then the walk-through
portion.  I think generally the simulator examinations were
close to the mark.  In some cases, we felt they might be a
little bit too easy, but generally they weren't too bad.
          In the walk-through portions, this is where you
ask the candidate on a one-on-one basis to perform some kind
of a task, a surveillance or a system lineup, we felt that
some of those job performance measures, as they're called,
were too simple.  Also, once the job performance measure is
conducted, there are two prescripted questions
asked of the candidate.  Those
questions are supposed to challenge the candidate's depth of
knowledge in the area.
          A number of times, we felt that those questions
were too simple, a simple look-up question where you would
go to the tech specs or a procedure and read it, and those
are not the type of questions we expected to see. 
          Any more on that?
          COMMISSIONER ROGERS:  Just a general comment that
multiple choice questions are a lot harder to write
unambiguously than most people recognize.  Even today, on SAT
exams, which get put through all kinds of reviews, people
turn up alternative answers that are not regarded as the
correct answer but that are, in fact, correct; they are an
acceptable answer.
          So it's a very tough area to deal with and I was
just curious as to how you felt the licensees were dealing
with that.  I know that our experience has probably kind of
honed our ability to write multiple choice questions, but
how do you feel licensees dealt with that?
          MR. RICHARDS:  Again, it's a mixed bag, but I
think it's an area that the industry as a whole will need to
gain experience in.  I mentioned before that the
psychometric construction of the questions was a problem in
a number of cases and that's exactly the challenge of
writing a multiple choice question.  You need to have, one,
a hopefully higher level of knowledge, not simply a memory
level question, and, secondly, you need to have distractors
that are reasonably close to being correct, but clearly
wrong at the same time, and that makes a good multiple
choice question very difficult to write.
          We did have that problem with a number of exams. 
As we address later, though, we feel that the industry is
learning quickly.  It is a steep learning curve, but we feel
that with some additional experience, that they will be able
to write those questions as well as we do.
          COMMISSIONER ROGERS:  Do you mean that they are
learning very fast or that there's a tough challenge?
          CHAIRMAN JACKSON:  A tough hill.
          MR. RICHARDS:  Well, it's a tough hill.  I think
that -- the sense I got was that they are learning quickly. 
We did have two sites that did two exams and in both cases,
they maintained that the second time around was quite a bit
easier than the first. 
          There's the challenge of understanding what we
expect in the writing of questions and then there's the
challenge of just understanding the entire process, which is
quite detailed and strictly controlled.
          So for somebody who thinks that you're going to
sit down and write an examination in a few days time, they
have the wrong concept.
          MR. BOGER:  We've had over ten years' experience in
writing the questions and have provided instructions to the
examiners, training sessions and the like, and I think the
utilities got into the mode of reviewing our questions
instead of having to create the questions, and there's a
difference in that area.  But it was positive that we
thought people were seeing the light as we went through the
program.  
          CHAIRMAN JACKSON:  And now we're going to switch
it.  We'll be the reviewers.
          MR. BOGER:  Right.  Yes, ma'am.  
          MR. MIRAGLIA:  But the key to that, Madam
Chairman, is NRC involvement.  In your opening remarks, you
talked about controls and Stu indicated some of the controls
that we tried to institute above 1021, since we are having
the utilities write the exam, and it's the NRC involvement
that I think is the key to that control and we'll have to
maintain that level of involvement to have confidence in
those examinations. 
          MR. RICHARDS:  Unless there are further questions,
the next item speaks to a point that I believe, Chairman
Jackson, you mentioned in your opening remarks, and that's
the pass rates.  We felt that the pass rates on the exams
compare with the pass rates experienced on past NRC
examinations.  
          Those pass rates are contained in the Commission
paper in the last table attached.  Actually, the pass rates
from the pilot examinations were a little bit lower than
what we've seen in fiscal year '95.  
          Generally, as I've said before, we felt the
examinations were as challenging as the ones that we had
given traditionally in the past and the pass rates seem to
bear that out.
          CHAIRMAN JACKSON:  But that was after your
involvement. 
          MR. RICHARDS:  Yes, ma'am.  We feel that our
involvement is very important in getting the proper product
in front of the candidate.  
          The staff believes that the new examination
process can be implemented with the same level of direct NRC
resources that are currently allotted to the operator
licensing program.  We estimated that we would spend about
370 hours, on average, for each pilot examination and when
we were done, the average came out to be about 350 hours per
examination.  
          There is quite a range from the low to the high,
depending upon how well the licensee did in writing the
initial product.  
          We did not ask for feedback from the facilities on
exactly how many hours it took them to write their products,
but the feedback that we got from NEI was that it was taking
roughly 400 to 600 staff hours to prepare the examinations
by the utilities.  This is somewhat higher than we would
have expected and it's higher than it would have taken us to
write the exams, but, again, we feel there is a steep
learning curve and with experience that that number will
come down and the industry will become more efficient.
          In talking with the industry, it appears that they
feel that they would get better at it, also.
          Slide eight.  
          [Slide.]
          MR. RICHARDS:  This slide speaks a little bit to
the lessons learned from the pilot process.  The industry
generally agrees with the changes that the staff proposes to
make to the licensing process.  In February of this year, as
I think was previously mentioned, the staff issued for
public comment and for industry comment a draft revision to
the examiner standards.  The revision incorporated the
lessons learned up to that point through the pilot process.
          The Nuclear Energy Institute submitted comments
and recommendations on behalf of the nuclear industry.  In
addition, two facility licensees provided additional
comments which generally mirrored the NEI comments.
          CHAIRMAN JACKSON:  The NEI comments, do you have
any way of judging how many -- the universe of licensees who
actually -- that actually represented?
          MR. RICHARDS:  I do know that NEI formed a working
group on this issue and, in fact, we met with the working
group on two occasions.  The NEI representative on that
group is here today and I believe there were ten or 12
utilities represented on that group.
          So I feel that their comments -- the NEI comments
incorporated those received from the working group.  We did
have a number of utilities, a large number, represented at
our workshop in September.  We spoke to the issue at the
regulatory information conference.  The regions have talked
probably to all utilities almost on a one-on-one basis and
had training managers meetings in the last year.  So there's
been a lot of opportunity to receive feedback.
          CHAIRMAN JACKSON:  Have you solicited the industry
to know that they also support the rule-making to impose
this on all licensees?
          MR. RICHARDS:  No, we did not, and the reason for
that is that that was a change that came about late in the
process.  We felt that because we had told the industry that
we intended to implement this as an administrative change,
to go back in the recent times and to talk to the industry
would have been, in effect, sharing something that was pre-
decisional at that point, because we hadn't really shared it
publicly.  
          So I don't understand or I really can't speak to
the industry's viewpoint on rule-making.  
          CHAIRMAN JACKSON:  Please go on.  
          MR. RICHARDS:  The major comments that we received
from NEI and the two facilities are summarized on a later
slide, which I will cover in a minute.
          We already addressed the fact that the facility
learning curve was fairly steep, but with time, we feel that
they will gain experience and become more efficient at
producing the exams.  I think as they gain experience and
become more efficient, that will also reduce the amount of
time that we have to spend working with them to produce the
exams. 
          I might note that if the change is approved, the
staff is also anticipating participating in a national
workshop that NEI would arrange later in the year to, again,
cover the finer points of how to construct an NRC exam and
some of the lessons learned from the pilot process.
          One of the lessons that we did learn was to
increase emphasis on the technical accuracy of the
examinations.  As I had mentioned before, we had hoped that
because the facilities were writing the exams, that the
exams would be technically accurate when received. 
Unfortunately, there were cases where that was not so.
          We have taken some corrective action in the final
examiner standards to address that and we're encouraging the
licensees to take more deliberate steps to validate the
examinations before they provide it to us.  We're also going
to ask the regional examiners to validate or check a
sampling of the technical accuracy of the questions on the
examinations as part of their review process.
          COMMISSIONER ROGERS:  For every exam.
          MR. RICHARDS:  Every exam, yes.  One of the
additional items that we did learn out of the pilot process
that involved technical accuracy has to do with the number
of appeals that we got.  The appeal rate jumped up
substantially during the pilot.  We had roughly ten appeals
out of the 21 failures, where the candidate
appealed the decision.
          Based on that appeal rate, we decided to take some
additional actions to make sure that we were making the
right licensing decisions.  We're encouraging the licensees
to provide to us the candidate comments after the written
examination is complete.  Typically, after the examination
is over, the facility will provide comments on the written
examination, asking that different answers be accepted or
questions be deleted because of the experience of having
given the exam. 
          It wasn't clear to us that because the licensees
were now writing it, that they were as open to accepting
comments from the candidates.  So for the future, we would
ask that the facilities provide us not only the facility
endorsed comments on the examination, but they also provide
us the comments that they receive from their candidates,
even if they don't endorse those comments.  So hopefully we
would get that feedback early on rather than getting that
through the appeal process.
          CHAIRMAN JACKSON:  What happened with those
appeals?
          MR. RICHARDS:  We have two of the appeals that are
still in the process of being reviewed.  But of the ten
failures to date, two of them were overturned and resulted
in licenses being issued.  I might add that one of the two
that were overturned was overturned at the region and
basically it was the result of an examiner making an error
in the way he administered part of the test and it was
really not related to the pilot process. 
          There was a third appeal where the candidate had
failed both the written exam and the operating exam, and we
at headquarters had overturned one part of the examination,
that being the operating exam, but we sustained the failure
on the written examination.  We count that as a half.  So of
the ten appeals to date, two and a half have been
overturned.
          Unless there are other questions on lessons
learned.  The next slide, slide nine, please, speaks to our
coordination with the industry.  
          [Slide.]
          MR. RICHARDS:  I mentioned before that we have
communicated regularly with the industry generally through
NEI.  I might note, again, that last September, we did have
a workshop with the industry here in Rockville before we
kicked off the pilot process, where we described the process
and answered their questions.
          At that point, in January, midway through the
pilot program, we attended a public meeting with NEI and
their working group to get feedback from the industry on the
process at that point and also to share our views.
          After completing the pilot program, we conducted
another public meeting with NEI and other industry
representatives to review their comments and
recommendations.  At this point, the draft examiner
standards had been issued for public comment, so they had
the benefit of that to provide us their feedback on. 
Because the pilot examinations had been completed, we had
full benefit of our feedback to share with them.
          As noted earlier, we do plan to have a workshop
later in the year if we go forward with this process.
          There was a formal comment period.  We issued the
draft for public comment in February.  We put the entire examiner
standards on the worldwide web.  I already mentioned that we
received three responses, one from NEI and two from the facility
licensees.  
          COMMISSIONER ROGERS:  But as you pointed out, that
was not on the basis of it being a rule.  Is that right?
          MR. RICHARDS:  That is correct.  At the time it
went out, the examiner standards were in a format assuming they
would be implemented on a mandatory basis.  And in the
Federal Register, as a matter of fact, we specifically asked
for feedback on the burden on the industry and the three
comments we got back spoke to specific details in the
examiner standards, but we didn't get any comments back
about the burden.
          The last bullet on the slide talks about some
specific industry concerns and hopefully this will address
some of the questions you had earlier, Chairman Jackson. 
          CHAIRMAN JACKSON:  When you go through it, could
you explain what was in NUREG 1021 relative to this, when we
were giving the exams, and as you talk about the concerns,
what proposed changes in the NUREG address that
concern and what vulnerabilities you see in making that
change. 
          MR. RICHARDS:  All right.  The first issue is
probably the largest issue we've faced from day one, and
clearly this is the biggest issue of the industry.  It
involves who can participate in writing the examination. 
Originally, the Generic Letter that kicked off the pilot
program said that if you had substantial involvement in
instruction of the candidates, that you could not be
involved in writing the examinations.
          Now, in the Generic Letter, there was an out.  It
said if this is too hard on you, come and talk to us and
we'll accept, on a case-by-case basis, other arrangements,
recognizing that we couldn't foresee every situation.  But
generally speaking, we limited the people who authored the
exams to people who were not substantially involved.  At the
time, we did not define what substantially involved meant.
          We had a lot of dialogue with the industry on this
and for the larger utilities, this isn't very much of a
concern.  They have enough of a trained staff that they have
people to draw on without too much difficulty.  But
generally speaking, for the smaller utilities, they may only
have four or five people who are involved in licensing
initial classes and they don't have a lot of experienced
people elsewhere in the training staff to draw on, so they felt
that this was a burden.
          The industry argues, and I think with some merit,
that because of all the other restrictions we've added to
the way the examination is constructed -- where you can draw
the questions from, the fact that there is a sample plan
that has to be developed that basically locks you into
examining in various areas -- that it's very difficult to bias
the test in a significant way.
          There is some merit, I believe, to that argument. 
But nonetheless, on our side, the concern was we didn't want
people who were teaching candidates, even unconsciously,
writing the exam with knowledge of what
they had presented to the candidates, because the test
should be an independent judgment of those candidates'
ability to be licensed.
          What we proposed to do, because of the numerous
restrictions we have placed on the construction of the
examination, is to allow the facility to have one individual
who was involved in a substantial way participate in
constructing the exam.  That individual would not be allowed
to write questions in an area in which they instructed and
their total involvement in the class would be limited to 15
percent of the scheduled instruction time in the classroom
and an additional five percent in the simulator, for a total
involvement of 20 percent.
          Additionally, I'd have to go back and check on the
numbers, but I believe that if somebody has had less
involvement than 40 hours in instructing, we would allow a
limited number of those people to participate, recognizing
that in a lot of these classes, you might have an engineer
come over and teach a class for a day or somebody come out
of the plant and spend the day teaching the class.  We don't
want to completely eliminate all of those people from
participation.  So we said if you have less than, say, 40
hours of involvement, I believe the number is, those people
can participate in an unlimited number, but, again, they
can't write questions in the areas in which they instructed.
          That's our proposal to address that area.
          CHAIRMAN JACKSON:  What are the vulnerabilities?
          MR. RICHARDS:  Well, again, the vulnerability
is that the one individual whom you have instructing,
substantially involved -- the person who may have up to 20
percent involvement -- could somehow introduce a bias into the
exam.  But I believe that we have addressed that by
precluding them from writing questions in the areas in which
they participated in the instruction and by requiring that a
sample plan be drawn up.  And I didn't mention this before,
but we're going to require that the sample plan be defined
by somebody who has had no involvement in the instruction of
the class.
          So the base document that the examination
construction is drawn from is the sample plan; that is, it is
written by somebody who has not participated.  Then there is
the NRC review; the pilot examinations demonstrated that when
we get these examinations, we're going to recognize questions
that we feel are overly simplistic.  The examiners, I think,
are in a good position, with their experience, to recognize an
examination that, on the whole, is felt to be too easy.
          So we put a lot of faith in the examiners in
ensuring that the proper level of difficulty is included in
that examination.  
          The second item addresses a burden that the
industry felt was excessive with regard to defining the
history of examination questions.  We put limits on where
they can draw these questions from, but I think as the
industry gains experience, they'll probably get good at
sharing their examination banks to increase efficiency,
working together on this.  We see an opportunity for a lot
of gains in efficiency that way. 
          But at the same time, you want to make sure that
the examination is not predictable to the candidate.  So we
want to know where the questions come from.  This issue
basically had to do with the industry feeling that we were
asking for an extensive amount of information on the history
of each question.  We really hadn't intended that.  We just
wanted to know if it came out of a bank, whether -- has it
been seen by the class or not, if it was modified from the
bank, what the original question looked like, and if it came
from outside the utility, where did it come from.
          For instance, a utility may go to a facility
elsewhere that has a similar power plant and take the NRC
exam that was administered six months before and convert it
into their product.  Well, if they're going to do that, we
want to know that because there's a chance that the
candidates may have had access to that old exam at another
site.  So for that reason, we do want to know the history
of where all these questions come from, but we think we've
clarified our requirements in that area to demonstrate that
it's not that burdensome on the industry.
          CHAIRMAN JACKSON:  Are you then asking for the
same thing?
          MR. RICHARDS:  Yes.  We think it's the same
material we've always asked for.  It's the material we need
to ensure that the questions aren't predictable to the
candidates. 
          CHAIRMAN JACKSON:  Whereas in the first area, the
revision you've made of NUREG 1021 is actually a relaxation
in terms of the restrictions on exam authors.
          MR. RICHARDS:  It's a relaxation specifically with
regard to who can participate, but, again, I think we've
inserted some additional safeguards to address that; for
example, the sampling plan being written by somebody who was
not involved in the instruction of the class at all.  That
was something we had not spelled out previously.
          CHAIRMAN JACKSON:  I'm sorry.  Go ahead.
          COMMISSIONER ROGERS:  Just one observation.  It's
a little off to the side, but I think we ought to keep it in
mind.  That is the -- we're concerned here about the
validity of the examinations and that there not be any
opportunity to pass an exam without knowing the material
that's appropriate.  I think it's well to keep in mind some
thought with respect to the relationship between instruction
and examination.  
          One of the negatives of having the people who teach
make out the exams is that they will be teaching to the exams. 
And it isn't so much that they're prepping the students for
the exam that's a problem; the problem is that other
things are left out that might be important but aren't
necessarily on the exam.
          And so the best quality instruction covers a lot
more material than will ever be tested on any one exam and
one of the big difficulties in having people who teach also
make the exams is if they are participating continuously in
this process, they may unconsciously be teaching to the
material which they know is on the exam, whether they quite
recognize that or not.  That's why it's very good to separate
the people doing the teaching from those making out the exam.
          MR. RICHARDS:  I might add that once somebody
becomes involved in developing the examination, one of the
requirements is that you are no longer involved in the
instruction of the candidates.  So typically a utility may
start this process as much as 90 days ahead of time and at
that point, that person is removed from any involvement with
that class.
          The next bullet talks to the duplication of items
from an audit exam.  An audit exam is typically an
examination that the facility either writes themselves
presently or has a contractor write it for them that they
give their candidates towards the end of the instruction and
just prior to the examination.  It's a screening tool
typically or a tool that demonstrates where they have weak
areas.  The examination is intended typically to look just
like an NRC exam.
          One of our original criteria was that there would
be no questions repeated from the audit exam on the
examination written by the facility.  The industry's comment
on that was, well, we, the NRC, don't apply that same
standard to our exams right now.  If we have a few questions
that happen to show up on both examinations, it's the luck
of the draw, we're not going to change our test.  Their
comment was that if the people developing the audit
examination and the people developing the facility-developed
NRC exam are independent and a few questions happen to be
common to both written exams, just as under the rules we
apply to ourselves, why should they have to come up with
more questions at the last minute. 
          We felt that that was an agreeable comment, up to
a limit, and our proposal is to say that's acceptable up to
five percent.  That's not uncommon.  You might say, well,
that's about the chance of a question showing up on both exams,
because one of the sources for building questions is a
utility's examination bank.  If you happen to go to the same
question area, it can happen.
          The next comment had to do with facilities who
have closed examination banks.  Some facilities do not allow
candidates the opportunity to study from the examination
banks that they maintain.  The industry felt that in those
cases, the ability to draw questions out of those banks
ought to be treated differently.
          We agree that that's a valid argument, but we feel
that that whole issue is a complex issue.  We would have to
get into security of those closed banks and how you ensure
that the candidates don't have access, and our decision was
originally and is now to maintain or to consider all the
banks as being open, recognizing that some facilities do
control their banks.
          We had two commenters who recommended that the
facilities be able to use site-specific task lists when
defining the contents of the exams.  We haven't discussed it
in any detail here today, but there is a fairly rigid
procedure for building an examination.  Part of that
procedure is to make sure that you sample across many
different areas of knowledge.  
          We have a document called a Knowledge and
Abilities Catalog and it lays out a large number of items
that reactor operators, senior reactor operators are
required to know in order to do their job.  The test items
are based on drawing these so-called K&As out of these
catalogs.  The licensees would like to be able to substitute
their own site-specific task lists for those knowledge and
ability catalog items, but our conclusion is that on a
wholesale basis, that would be inappropriate, but that on a
task-by-task basis, that may be appropriate. 
          So if there is some specific area of an
examination that a utility wants to examine and they would
like to substitute that, in a few cases, before it would be
required by the typical sampling plan, we would find that
acceptable, but that would be discussed in the review with
the NRC examiner.  
          The last comment we received from one of the
utilities was that the process did not appear to allow for
utilities who did not want to draft their own examinations
to ask the NRC to draft those exams for them.  
          Originally, of course, our intent was to implement
this on a mandatory basis.  Of course, here today, we're
proposing to implement it on a voluntary basis for those
utilities who do not want to draft their exams.  They would
be able to ask us to do it and we would do that consistent
with resources.  But over the long term, we don't feel that
that would be a predictable way to go without knowing how
many utilities would want us to draft exams and how many
would want to do their own.
          It would make the scheduling of resources and the
planning of resources very difficult.  So we feel it's
important in the long term to either have them do it on a
mandatory basis or not.
          CHAIRMAN JACKSON:  And you feel it's not a
backfit.
          MR. RICHARDS:  No, I don't feel that it's a
backfit.  We talked with the Office of the General Counsel
extensively on this issue.  The backfit rule, 50.109, the
entry conditions discuss what defines a backfit for entering
50.109.  Basically, it says if you're going to change the
facility in a physical way, if you're going to require a
change in the procedures to operate the facility, or require
a change in the staffing, then you must consider
the change to be a backfit.
          What we propose to do does not change the
physical plant and does not involve a
procedure for operating the facility; our examiner standards
are written strictly to produce an examination, which is
really not related to the operation of the facility.  And it
does not impact the facility organization, in that all
facilities already have training organizations and they have
staff who already evaluate their candidates prior to taking
them up for an NRC exam.  
          COMMISSIONER ROGERS:  But it might require them to
add more people in that organization.
          MR. RICHARDS:  Well, there's the legal question of
whether you meet the backfit rule, 50.109, and in consulting
with OGC, the answer is it does not constitute a backfit
under 50.109.
          The issue of whether it increases the burden on
the utility is a little bit different.  I agree that it
shifts the burden of providing the examinations to the
utility.  Our view is that the utility is already paying for
that burden.  We go with the contractors generally now and
have contractors write these examinations, and that bill is
passed on directly to the facility that asks for the exam. 
          This process would allow the facility to go to the
same contractors or other contractors and, I think likely
for the same amount of money or less, have an exam produced
for them.  If they feel they have the resources internally
to produce the test in a more efficient way, they have that
option.  They can work cooperative agreements with other
utilities to share resources.
          Additionally, it provides them the opportunity to
have more input into the technical content of the exam on
the front end.
          So we see this as resource-neutral for the
utilities and perhaps if they're efficient at it or good at
the way they go about it, resulting in a resource reduction
for them.
          CHAIRMAN JACKSON:  Have you actually got that
input from NEI or from the industry?
          MR. MIRAGLIA:  No, we haven't to that degree,
Madam Chairman.  Commissioner Rogers, you raised the issue. 
It could be perceived as, and may be initially and maybe in the
long term, an imposition of burden.  That's why the
staff has come to the conclusion that if we do this in a
mandatory way, it should be either by rule or order, because it
is, and that would put it through the process of proposed
rule, comment, and fully expose and indicate what those
concerns may or may not be.
          So it is that potential and perceived imposition
of burden that we're addressing through the proposal of
changing the rule in order to make it mandatory.  Until such
time, we would continue it in a voluntary way.
          MR. RICHARDS:  From the informal feedback we have
received from some of the utilities that participated in the
pilot process, I think some of the facilities' views on the
process depended upon how well they did and the effort they
put into it.
          But some facilities came out very positive.  They
felt that it wasn't an overburdensome thing to write the
exam.  Of course, those were the facilities that generally
did a good job.  For some of the facilities, it turned out to be somewhat of a struggle.  Having gotten through the process,
they seemed willing to go along because they recognized that
there was that learning curve, but once it was overcome,
they saw the opportunity to be more efficient rather than
paying for the contractors through us.
          There were a few facilities, however, that felt
that the burden was significant and it may have impacted the
results of the examination and probably felt that they
shouldn't have volunteered.  But I think, as a whole, in talking with NEI and the industry working group through the meetings we've had with them, the feeling is generally positive: they think they can write these exams; they're in the training business; they are much more familiar with the plant and the rest of the materials.  They should be in a better position to write them more efficiently, and I think, generally, they want to take it on.
          CHAIRMAN JACKSON:  So OGC agrees that this was not
a backfit, even in the procedural -- with respect to
operating procedures.  That's what you're saying.
          MR. RICHARDS:  Yes, ma'am.  The Commission paper
has about three-quarters of a page specifically addressing
those aspects.  That was written with a heavy input from
OGC, and, of course, they concurred on the Commission paper.
          CHAIRMAN JACKSON:  Why don't you finish.
          MR. RICHARDS:  The last slide, please, number ten. 
          [Slide.]
          MR. RICHARDS:  The planning milestones for the
future.  I might note that the EDO's memorandum of April 12 of this year, sent to the Commission, indicated that we intended to continue to use the pilot process through the
end of this calendar year, and we have solicited volunteers
to continue that process.  So we are doing that.
          With the Commission's approval, our intent is to issue Revision 8 to the examiner standards.  That would provide all the changes that we have considered necessary to date.  We would issue that revision and implement it six months after the date on which it's published.  It's been our tradition to give the industry six months to acclimate themselves to a revision and to prepare to carry out the examinations in accordance with Revision 8.
          Of course, it's a voluntary program at this point.  So that would be for those utilities that wanted to volunteer under Revision 8 to write their own exams.  We would also go out with a supplement to Generic Letter 95-06, in which we would describe the lessons learned from the pilot examination process and formally solicit volunteers to continue to write their exams.
          We are also asking the Commission to approve the staff's pursuing rule-making to make the writing of examinations by the power facilities mandatory in the future.
          That concludes my prepared remarks.
          MR. BOGER:  In that regard, we would like the
Commission to consider this SECY paper as our submittal of a
formal rule-making plan.  That was not explicit in the paper
itself.
          CHAIRMAN JACKSON:  Has there been unanimous
support from headquarters and the regions regarding this
initiative?
          MR. RICHARDS:  It depends on who you want to count
as unanimous.  I don't think that if we had asked every
examiner in the nation if they felt this was the way to go,
they would all say yes.  
          On the other hand, we did have feedback --
          CHAIRMAN JACKSON:  What is the breakdown?  Can you
give us a rough idea?
          MR. RICHARDS:  I think that, given the circumstances, we feel that -- we entered into this process change because of the resource issue, and with the resource issue being considered, the examiners recognized the changes.  Largely, I would say 90 percent of the people that participated in the pilot program would say that the examinations that were administered after our involvement were as effective as the ones we administered.
          There are feedback forms that were filled out by
examiners that say that the process was more effective or
the examination was harder than what we would have typically
written.  But I think the large majority of the feedback was
that the new process is as effective.
          I don't think anyone was dissatisfied with the old
way of writing exams.  We thought we were doing a good job
and we were effective and sometimes change is hard.  But I
think from the feedback forms, probably 80 to 90 percent of
the people felt that this new process was as effective, as
efficient or more efficient, and, in some cases, more
demanding.
          CHAIRMAN JACKSON:  Your paper states that the
revised process could be implemented at all power reactor
facilities with existing NRC resources allocated to the
operator licensing program and then it further states that
an initial resource investment would have to be made to
train additional NRC employees as examiners.  
          Have you quantified what that additional
investment would be?
          MR. RICHARDS:  Right now, the budget plan, which I think is still in the draft stage and has not been sent to the Commission, is looking at, for fiscal year '97, an additional seven FTE that would be provided in the way of inspection support out of headquarters to allow additional inspectors in the regions to be qualified as examiners.
          Our intent here is to increase the examiner pool
in the regions to provide flexibility for the scheduling of
exams.  It does not require more resources to actually carry
out the program.  But because we've always depended upon our
contractors to be our surge tank of resources for the highs and lows in exam demand, we recognize that that surge tank
is no longer going to be available.  So in order to have
that surge tank of examiners, we have to have those in the
regions.
          In order to have those people in the regions, we have to train up more people.  We have to be in a better position for when examiners move on to new jobs or leave the agency.  So we have to staff up with more people, and there are seven FTE, I think, in the draft budget in '97 and then an additional five beyond that in '98.
          But we see that as a short-term investment until
we staff up to a higher level and then we anticipate that
additional support dropping off and returning to a
maintenance level of examiners.  
          MR. MIRAGLIA:  It's basically a transition from contractors, Madam Chairman, to in-house capabilities and talent.  As Stu has indicated, the contractors were sort of the surge tank.
          Operator examiners in the regions have been dual-qualified in order to be examiners.  There is a qualification program.  So it takes time to build up those kinds of capabilities within the staff and the program.  So what we have is a transition to get that in-house capability built up and reliance on contractors decreased in a transitional kind of way, and that's what that --
          CHAIRMAN JACKSON:  So you're going to try to be
revenue-neutral.  You talk about savings of three to four
million in contractor funds.
          MR. MIRAGLIA:  In terms of the contractors, yes. 
The contractor numbers will go down and the staff
capabilities in terms of number of examiners and inspectors
qualified to give operating exams would go up.
          MR. MILHOAN:  That will be addressed as part of the budget process, reflecting the Commission's decision in the budget review process, whatever that decision would be.
          CHAIRMAN JACKSON:  Would sufficient resources then
be available in-house on a moment's notice if we had to
assess a licensee's program if we thought it was perhaps
deficient?  
          MR. MIRAGLIA:  I think the answer to that question is yes, because, consistent with what we do at re-qual, we have an inspection module, and I think our expectation is, as Stu explained, that the controls are in the NRC involvement on the front end.  We also recognize that while we're in this voluntary mode, we're going to have to try to respond to utilities' requests for exams, and the question, depending upon the availability of resources, is how responsive they could be.
          They need to give enough notice such that we can
plan and utilize resources perhaps from headquarters or
other regions to support those kinds of things.  So it's a
planning and resource application issue in that kind of
context.
          CHAIRMAN JACKSON:  Do you imagine any role or does
INPO accreditation of training programs play any role in any
of this?
          MR. MIRAGLIA:  I would say that we wouldn't expect
any changes to that program, because our goal and objective
is to make this exam as effective as if we were doing it. 
That's not to say that given the change, they might look at
things differently, but we have not had any indication of
that at this point in time, nor would it necessarily be
expected.  
          MR. BOGER:  INPO has, likewise, created a task
force, if you will, to take a look at the issue.  So they
might be able to respond to it.  To date, they haven't come
forth in any way to say that they would like to volunteer to
sponsor a program or anything like that.
          CHAIRMAN JACKSON:  We were talking for a long time about instructors being involved in making up exams, and there are two sides to it.  One may be instructors teaching to an exam, and that's a clear vulnerability.  Or the converse, that if there's too much involvement, the exams could too closely reflect what an instructor would teach.
          But, again, I ask you, the paper was very thin on
pros and cons.  Are there any vulnerabilities that the
Commission needs to know about today?  
          MR. MIRAGLIA:  I think that Stu tried to address those vulnerabilities in terms of having instructors that were involved in the process.  We probably -- it could be perceived as a relaxation from the going-in position, as expressed in the 95-08 Generic Letter, but there were countervailing provisions to try to limit those kinds of vulnerabilities.  And I think the key is the NRC involvement in the review of the process.
          MR. BOGER:  And certainly each candidate is
receiving an operating test from an NRC examiner.  So that
way, we will have -- 
          CHAIRMAN JACKSON:  Repeat that again.
          MR. BOGER:  Each candidate will receive an
operating test from an NRC examiner.  So there will be one-
on-one contact with each candidate and an NRC examiner, both
in the simulator setting and the plant walk-through.
          MR. MIRAGLIA:  That might be worth emphasizing; maybe it was implied and not said explicitly.  As Stu
mentioned, when we first conceived of the program, the
operating test would also be administered with kind of a
parallel grading line examiner.  But the way the program was
implemented, it's just a written piece, with the NRC
involvement at the front end that's administered.  Then they
prepare the operating exam, but we administer everything
else and do the walk-down.  So there is independent eyeball-
to-eyeball contact by an NRC examiner with each candidate,
as well.  So that's a difference.
          COMMISSIONER ROGERS:  I think this has been a very
interesting presentation and I feel quite comfortable with
the fact that this will lead to equally demanding tests of
operators, and I don't really see it as offering too much in
the way of -- or anything that I can see in the way of a
diminution of safety.
          Nevertheless, there is a public perception
question that I think has to be addressed here and I think
we have to think very hard and take whatever steps we can to
try to make clear to anyone who is concerned and might be
concerned about a diminution in safety as a result of this
that we really have gotten on top of it and do not believe
that there is any diminution in safety, if that is, in fact,
the case.
          So I think some real thought has to be given to
how to present this to the public, because I can see it as a
sensitive issue.
          MR. MILHOAN:  I understand.  
          CHAIRMAN JACKSON:  Commissioner Dicus.
          COMMISSIONER DICUS:  No questions.  
          CHAIRMAN JACKSON:  I'd like to thank you for your
briefing of the Commission.  You've presented a lot of
information that indicates that the staff is concerned with
changes to the operator licensing program and has carefully measured the pilot program to assess the impact.
          All the changes sound positive.  I will just say, and this is a personal comment, that I, frankly, did find the SECY paper thin with respect to providing that comfort to the public, in the sense of truly laying out not only the pros, and the pros for the industry, but also what the vulnerabilities are, and laying out how those vulnerabilities are specifically addressed in the proposed changes or in what remains in NUREG-1021.
          I think that would have been helpful to the Commission, not only in terms of the evidentiary record on which to ask the Commission to make a decision, but also in addressing the public perception piece.  So I think there are questions that have been raised, or that kind of a balance, that, as part of a submittal for the record, you need to present, because the Commission wants to ensure that all positive and negative aspects of the program have been considered and fully understood before you take steps to permanently change the operator licensing program, and that the appropriate controls have been put into place relative to the changes in the NUREG.
          It's clear that you're aware of the sensitive nature, and the Commission is particularly aware of the sensitive nature, of changing a process as important as initial licensing.  I know we do -- we have already changed the requalifications and we're talking about the control on operators.  So, again, it's important that the Commission fully understand the expected benefits, be able to weigh them against any vulnerabilities of a change like this, and understand the extent of the industry sign-off in this regard.
          I think the Commission, even in spite of your comment, Mr. Boger, would benefit from a more complete discussion of whether a rule change is appropriate and from looking again at the question of the backfit.  And since you're talking about a proposed rule-making, I think you owe it to the Commission to present something that specifically addresses that.
          Unless my fellow Commissioners have any further
comments, we're adjourned.
          [Whereupon, at 11:17 a.m., the briefing was
adjourned.]