This is the accessible text file for GAO report number GAO-05-728 
entitled 'Aviation Safety: FAA Management Practices for Technical 
Training Mostly Effective; Further Actions Could Enhance Results' which 
was released on September 7, 2005. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Requesters: 

September 2005: 

Aviation Safety: 

FAA Management Practices for Technical Training Mostly Effective; 
Further Actions Could Enhance Results: 

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-728]: 

GAO Highlights: 

Highlights of GAO-05-728, a report to congressional requesters: 

Why GAO Did This Study: 

One key way that the Federal Aviation Administration (FAA) makes air 
travel safer is to inspect the manufacture, operation, and maintenance 
of aircraft that fly in the United States. To better direct its 
resources, FAA is shifting from an inspection process that relied on 
spot-checks of compliance with regulations to one that evaluates 
operating procedures and analyzes inspection data to identify areas 
that pose the most risk to safety (called system safety). While FAA 
believes the new approach requires some technical knowledge of 
aircraft, Congress and GAO have long-standing concerns over whether FAA 
inspectors have enough technical knowledge to effectively identify 
risks. 

GAO reviewed the extent that FAA follows effective management practices 
in ensuring that inspectors receive up-to-date technical training. In 
addition, GAO is reporting on technical training that the aviation 
industry provides to FAA. 

What GAO Found: 

For its technical training, FAA follows many of the effective 
management practices for training that GAO has advocated and is 
improving its efforts in others. (See below.) In planning, FAA has 
linked technical training efforts to its goal of safer air travel and 
has identified technical proficiencies needed to improve safety 
inspectors’ performance in meeting this goal. It plans to better relate 
training to job tasks and is in the early stages of developing an 
approach to set priorities for new courses and course revisions. 

FAA Mostly Follows Effective Management Practices for Its Technical 
Training: 

Element: Practices in planning training efforts; 
Extent followed: Mostly. 

Element: Practices in developing training curriculum and courses; 
Extent followed: Mostly. 

Element: Practices in delivering training; 
Extent followed: Partially. 

Element: Practices in evaluating training efforts; 
Extent followed: Mostly. 

Source: GAO. 

[End of table] 

In developing technical courses, FAA has a structured process aimed at 
ensuring that courses meet performance objectives. It allows inspectors 
and others to identify the need for new training courses and to aid in 
developing courses. FAA is developing an initiative to systematically 
identify specific technical competencies and training requirements for 
inspectors. 

In delivering courses, FAA offers a wide array of technical courses 
from which inspectors can select to meet job needs. From GAO’s survey 
of FAA’s inspectors, we estimate that only about half think that they 
have the technical knowledge needed for their jobs. FAA officials told 
us that inspectors’ negative views stem from their wanting to acquire 
proficiencies that are not as crucial in a system safety environment. 
GAO also estimates that 28 percent of inspectors believe that they get 
the technical training that they request. However, FAA’s records show 
that FAA approves about 90 percent of these requests, and inspectors 
are making good progress in receiving training. Over half of the 
inspectors have completed at least 75 percent of technical training 
that FAA considers essential. 

In evaluating courses, FAA continuously assesses technical training 
through end-of-course evaluations and surveys of inspectors and 
supervisors. FAA is developing an approach to measure the impact of 
training on FAA’s mission goals, such as reducing accidents. This is a 
difficult task. 

Technical and Other Training Enables FAA to Inspect a Wide Variety of 
Aircraft: 

[See PDF for image] 

[End of figure] 

What GAO Recommends: 

Within the context of an overall system safety approach, GAO recommends 
that FAA take several actions, including systematically assessing 
inspectors’ technical training needs. FAA officials generally agreed 
with the contents of this report and agreed to consider GAO’s 
recommendations. 

www.gao.gov/cgi-bin/getrpt?GAO-05-728. 

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Gerald Dillingham at 
(202) 512-2834 or DillinghamG@gao.gov. 

[End of section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

Strategic Planning Activities Generally Reflect Effective Practices and 
Focus on Reducing a Large Gap in System Safety Knowledge: 

FAA Follows Effective Management Practices in Developing Individual 
Courses but Recognizes the Need to Develop a Unified Curriculum: 

FAA Provides Extensive Support for Delivering Training; However, Many 
Inspectors Believe Improvements Could Help Them Do Their Jobs More 
Effectively: 

Although FAA Uses Several Approaches to Evaluate Technical Training 
Provided, Assessing Impact on Performance Remains to Be Done: 

Industry Provides Much of FAA's Technical Training; Additional 
Safeguards Needed to Prevent Real or Apparent Conflicts of Interest: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendixes: 

Appendix I: Inspector-Reported Travel for Technical Training: 

Appendix II: Additional Details on Training Data and Selected Inspector 
Survey Responses: 

Appendix III: Scope and Methodology: 

Appendix IV: GAO Contact and Staff Acknowledgments: 

Related GAO Products: 

FAA Safety Inspector Training: 

Human Capital: 

Related FAA Training: 

Tables: 

Table 1: Types of Inspectors, Responsibilities, and Numbers, as of 
April 2005: 

Table 2: Extent That FAA Followed Effective Management Practices in 
Planning for Training: 

Table 3: Extent That FAA Followed Effective Management Practices in 
Developing Courses: 

Table 4: Extent That FAA Follows Effective Management Practices in 
Delivering Technical Training: 

Table 5: Percent of Inspectors Completing Essential Technical Courses: 

Table 6: Average Number of Technical and Nontechnical Training Courses 
Taken, Fiscal Years 2002 through 2004: 

Table 7: Extent That FAA Followed Effective Management Practices in 
Evaluating Its Training Program: 

Table 8: Number of Memoranda of Understanding and Fleets Enrolled as 
Part of the Aircrew Designated Examiner Program and Agreements with 
Training Centers: 

Table 9: Numbers of Inspectors Trained under Aircrew Designated 
Examiner Program and Agreements with Training Centers, Fiscal Years 
2002 through 2004: 

Table 10: Percent of Essential Courses That Are Technical in Nature: 

Table 11: Percent of Inspectors Completing Essential Courses: 

Table 12: Average Number of Technical Training Courses Taken Outside of 
Requirements, Fiscal Years 2002 through 2004: 

Table 13: Inspectors' Views on Extent to Which They Currently Have 
Enough Technical Knowledge to Do Their Jobs: 

Table 14: Inspectors' Views on Extent to Which Requested Technical 
Training Is Approved: 

Table 15: Inspectors' Views on Whether Availability of Courses Helped 
or Hindered Their Ability to Take Requested Technical Training: 

Table 16: Inspectors' Views on Whether Availability of Funds Helped or 
Hindered Their Ability to Take Requested Technical Training: 

Table 17: Inspectors' Views on Whether Management's Determination of 
Need Helped or Hindered Their Ability to Take Requested Technical 
Training: 

Table 18: Inspectors' Views on Whether Inspection Workload Helped or 
Hindered Their Ability to Take Requested Technical Training: 

Table 19: Inspectors' Views on the Degree to Which Technical Training 
Is Delivered in a Timely Manner: 

Table 20: Inspectors' Views on the Extent That They Receive Technical 
Training Prior to Scheduled Oversight Activities: 

Table 21: Percent of Technical Training Provided by Industry as 
Reported by FAA, Fiscal Years 2002 through 2004: 

Table 22: Inspectors' Views on the Extent to Which Technical Training 
Opportunities Exist Closer to Their Work Location: 

Table 23: Experts Consulted for Our Work: 

Figures: 

Figure 1: FAA's Safety Inspections Cover a Wide Range of Activities: 

Figure 2: FAA Safety Inspector Training Roles and Responsibilities: 

Figure 3: FAA's Structured Approach for Course Development: 

Figure 4: FAA Inspectors Receiving Training in a Classroom Setting: 

Figure 5: Inspectors Responding that to a Great or Very Great Extent 
They Currently Have Enough Technical Knowledge to Do Their Jobs: 

Figure 6: Extent to Which Requested Technical Training Is Approved: 

Figure 7: Inspectors' Views on Factors Hindering Their Ability to Take 
Requested Technical Training: 

Figure 8: Inspectors' Views on the Extent to Which Technical Training 
Is Delivered in a Timely Manner: 

Figure 9: Inspectors' Views on the Extent to Which They Received 
Technical Training Prior to Scheduled Oversight Activities: 

Figure 10: Percent of Technical Training Provided by Industry as 
Reported by FAA, Fiscal Years 2002 through 2004: 

Figure 11: Number of Weeks Inspectors Reported Spending on Travel for 
Technical Training within the Past 12 Months: 

Figure 12: Inspectors' Views on the Extent to Which Technical Training 
Opportunities Exist Closer to Their Work Location: 

Abbreviations: 

FAA: Federal Aviation Administration: 

ATOS: Air Transportation Oversight System: 

Letter September 7, 2005: 

The Honorable Ted Stevens, Chairman: 
The Honorable Daniel K. Inouye, Co-Chairman: 
Committee on Commerce, Science and Transportation: 
U.S. Senate: 

The Honorable Don Young, Chairman: 
The Honorable James L. Oberstar, Ranking Member: 
Committee on Transportation and Infrastructure: 
House of Representatives: 

The Federal Aviation Administration's (FAA) overarching goal for 
technical training is to improve aviation safety. One key way that FAA 
makes air travel safe for the public and the movement of goods is to 
inspect the manufacture, operation, and maintenance of aircraft that 
fly in the United States. To do so, about 3,700 FAA inspectors perform 
hundreds of thousands of inspections annually.[Footnote 1] Carrying out 
these inspections has become more challenging with the rapid growth in 
the number and type of aircraft in use and their increasing technical 
sophistication. 

Concerns about the quality of inspections heightened after the 
investigation of the 1996 crash of ValuJet flight 592 revealed 
deficiencies in FAA's inspection system. In response, FAA began to make 
fundamental changes in its approach to inspections. Traditionally, FAA 
aviation safety inspectors relied on their expertise to conduct 
inspections that spot-checked manufacturing processes, aircraft 
operations, and aircraft maintenance for compliance with regulations. 
FAA is transitioning to a risk-based system safety approach to 
inspections that requires inspectors to apply data analysis and 
auditing skills to identify, analyze, assess, and control the potential 
hazards and risks of flying and to prevent accidents.[Footnote 2] While 
we have endorsed FAA's move toward a system safety approach to 
inspections, congressional oversight committees and we have had long-
standing concerns over whether FAA inspectors have sufficient knowledge 
of increasingly complex aircraft, aircraft parts, and systems to 
effectively identify safety risks. 

The Vision 100-Century of Aviation Reauthorization Act, enacted in 
December 2003, requires that we report on FAA's actions to ensure that 
inspectors receive up-to-date training on the latest technologies. We 
call this technical training, although this use of the term "technical" 
differs somewhat from FAA's use of the term.[Footnote 3] Consistent 
with the act, this report focuses on the extent to which FAA follows 
effective management practices for (1) planning, (2) developing, and 
(3) delivering up-to-date technical training, and (4) ensuring that 
technical training for inspectors contributes to improved performance 
and results. It also discusses the degree to which the aviation 
industry provides technical training to FAA safety inspectors and 
discusses the safeguards in place to help preclude the appearance of or 
an actual conflict of interest when inspectors receive certain kinds of 
training from a regulated entity. Finally, as required by the act, the 
report provides information on the amount of travel required of 
inspectors in receiving technical training. (See app. I.) 

This report focuses on how FAA ensures that its inspectors possess the 
technical proficiency they need to do their jobs through following 
effective management practices and whether inspectors are receiving the 
technical training that FAA has determined is essential for its 
inspectors.[Footnote 4] We did not attempt to assess the technical 
proficiency that FAA's workforce requires (and will require in the near 
future) and compare it with the proficiency that currently exists. 
Because of the diversity and size of the inspector workforce and the 
wide variety of aircraft technologies that FAA is responsible for 
overseeing, this type of assessment would have been a massive 
undertaking and would be more properly done by FAA. We also did not 
attempt to compare the technical training received by inspectors with 
the tasks and activities that inspectors perform. FAA's inspector 
activity database contains tens of thousands of task and activity 
records, and the manner in which these records are stored did not allow 
us to electronically sort and analyze the data. However, to provide 
some insight into these two issues, we did discuss these issues with 
FAA officials and surveyed FAA's inspectors on their views, as 
described below. 

To assess whether FAA follows effective management practices regarding 
technical training, we compared FAA's management of its inspector 
technical training efforts with effective management practices outlined 
in our 2004 guide for assessing strategic training activities in the 
federal government and determined the extent to which FAA followed the 
relevant elements of this guidance.[Footnote 5] In addition, we 
analyzed FAA documents pertaining to planning, developing, delivering, 
and evaluating inspector training and discussed these activities with 
FAA officials involved in inspector training and the management of 
inspection programs at FAA headquarters in Washington, D.C., and the 
FAA Training Academy in Oklahoma City. To examine the training 
provided, including technical training, we analyzed FAA data on 
training courses taken by inspectors from 2002 through 2004 and FAA's 
evaluation of technical training courses during that period. We 
discussed technical training with safety inspectors and their 
supervisors at 7 of FAA's approximately 130 field locations. The 
locations were chosen to represent the range of FAA inspection 
responsibilities. We also conducted a self-administered electronic 
survey, posted on the World Wide Web, of a stratified random sample of 
FAA safety inspectors to obtain their views about their technical 
proficiency and the technical training they receive. We received 
usable responses from 79 percent of the inspectors surveyed. This 
report does not contain all the results from the survey. The survey and 
a complete tabulation of the overall results (excluding results by type 
of inspector, which are too voluminous to present) can be viewed at 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-704SP]. Finally, 
we obtained further perspective on FAA's training curriculum through 
semistructured interviews with 16 experts from the aviation industry 
and the field of aviation education who were selected on the basis of 
having extensive background and knowledge of the technical areas 
covered by FAA inspections. As part of our review, we assessed internal 
controls and the reliability of FAA's data on the amounts and types of 
training received that are pertinent to this effort. We determined that 
the data elements were sufficiently reliable for our purposes. We 
conducted our work from March 2004 through July 2005 in accordance with 
generally accepted government auditing standards. (See app. III for 
additional information on our scope and methodology.) 

Results in Brief: 

FAA has made training an integral part of its safety inspection system, 
which in recent years has emphasized risk analysis techniques over 
individual inspector technical knowledge of aircraft, aircraft parts, 
and systems. FAA has generally followed several effective management 
practices for planning, developing, delivering, and assessing the 
impact of its technical training for its aviation safety inspectors, 
although some practices have yet to be fully implemented. Regarding 
planning for technical training, for example, FAA's training efforts 
for the most part follow effective management practices and are 
intended to support its goals for improving aviation safety, and they 
largely focus on effectively implementing a system safety approach to 
inspections. According to FAA, it has identified gaps in several of the 
competencies required to conduct system safety inspections, including 
risk assessment, data analysis, auditing, and systems thinking, and the 
agency is currently working to address these gaps. In FAA's view--
although it recognizes the importance of inspectors staying up to date 
with changes in aviation technology--the competencies needed for system 
safety inspections are the most critical for inspectors, and the gaps 
in these competencies are much larger than gaps in technical skills and 
competencies relating to the production, operation, and maintenance of 
aircraft. In addition, FAA Office of Aviation Safety officials said 
that inspectors do not need a substantial amount of technical training 
because inspectors are hired with a high degree of technical knowledge 
of aircraft and aircraft systems, and they can sufficiently keep 
abreast of many of the changes in aviation technology through FAA and 
industry training courses and on-the-job training. Nevertheless, FAA 
plans to identify specific technical competencies and training 
requirements as part of a process intended to better relate training to 
the job tasks of each inspector specialty.[Footnote 6]

FAA also for the most part follows effective management practices for 
developing its inspector technical training curriculum. For example, 
FAA integrates the development of courses with overall strategies to 
improve performance and to meet emerging demands. In this regard, FAA 
develops courses that support changes in inspection procedures 
resulting from regulatory changes or agency initiatives, such as the 
implementation of the system safety approach to inspections. FAA will 
also consider developing training courses that are requested by 
inspectors and managers. FAA also works to match the training delivery 
approach with the nature of the material presented to best meet 
inspector and agency needs--such as delivery at a central location in 
FAA's Training Academy in Oklahoma City, Oklahoma; in multiple 
locations closer to field offices; or through computer-based 
instruction. While following many effective practices in this area, FAA 
has not systematically identified the technical skills and competencies 
each type of inspector needs to effectively perform inspections. As a 
result, technical courses are developed on an ad hoc basis rather than 
as part of an overall curriculum for each inspector specialty. FAA has 
recognized this problem and is developing an initiative that will 
systematically assess whether the complete array of training for each 
inspector specialty meets performance requirements. 

In delivering technical courses, FAA has followed effective management 
practices to differing degrees. For example, FAA has established clear 
accountability for ensuring that inspectors have access to technical 
training, has developed a way for inspectors to choose courses that 
meet job needs and further their professional development, and offers 
a wide array of technical and other courses. However, inspectors are 
for the most 
part dissatisfied with the technical training they receive. From an 
analysis of the survey, we estimate that only about half of FAA's 
inspectors think that they have the technical knowledge needed to do 
their jobs,[Footnote 7] only about one-third are satisfied with the 
technical training they have recently received, and less than half 
believe that they get to take the technical training that they request. 
However, our analysis of FAA training data indicates that FAA has 
approved about 90 percent of the technical courses requested by 
inspectors, and inspectors in general are making good progress in 
completing the technical training essential for their positions (77 
percent of the inspectors have completed at least half of their 
essential courses, and 46 percent have completed at least 80 percent of 
their essential courses). In addition, according to the survey, we 
estimate that only 23 percent of FAA's inspectors think that they 
receive technical training in time to do their current job. FAA's 
records do not allow us to assess the timeliness of training. FAA 
officials told us that inspectors' negative views on their technical 
knowledge and the training they received stem from their not accepting 
FAA's move to a system safety approach. That is, the inspectors are 
concerned about acquiring individual technical proficiency that is not 
as crucial in a system safety/risk management environment. Given that 
it has not completed assessing whether training for each inspector 
specialty meets performance requirements, FAA is not in a position to 
draw definitive conclusions concerning the adequacy of inspector 
technical training. 

FAA for the most part follows several effective management practices in 
evaluating individual technical training courses. For example, it 
continuously assesses technical training through participant end-of-
course evaluations and surveys of inspectors and supervisors that focus 
on the application of skills and knowledge to the job. FAA also 
requires that each training course receive a systematic evaluation 
every 3 years to determine if the course is up to date and relevant to 
inspectors' jobs, although training officials noted that many courses 
have yet to undergo such an evaluation. However, FAA has taken limited 
action to evaluate the overall impact of its technical training on 
inspector performance in achieving mission goals, such as reducing 
accidents. Although FAA surveys its employees on their attitudes 
regarding many aspects of their employment, including the extent to 
which they were able to apply agency training to their jobs and perform 
their jobs effectively, it is not able to isolate inspectors' responses 
from those of its other employees. Moreover, the survey does not ask 
employees to differentiate between the types of training they receive, 
such as technical and nontechnical training. Experts on training in 
government agencies emphasize the importance of using an approach to 
evaluating training that goes beyond individual course evaluations and 
includes such indicators as the amount of learning that occurs from 
training programs and their organizational impact. However, training 
experts acknowledge that isolating performance improvements resulting 
from training programs is difficult for any organization. 

FAA has increasingly relied on the aviation industry to provide 
technical training in fiscal years 2002 through 2004. In fiscal year 
2004 (latest data available), industry delivered nearly half of FAA's 
technical training. Although FAA pays for most of the technical 
training that industry provides, from fiscal years 2002 through 2004 
about 17 percent of industry-provided technical training was supplied 
to FAA in exchange for an in-kind service, such as delegating authority 
to conduct inspections (called quid pro quo arrangements), with some 
apparently limited additional training supplied at no cost to the 
agency. To a large degree, FAA has established safeguards to help 
preclude actual or apparent conflicts of interest, such as 
executing agreements with aviation industry training providers it 
regulates outlining the conditions under which it will accept training 
for in-kind service or at no cost. However, FAA has not included 
provisions covering its enforcement and oversight authority in all 
agreements with aviation industry training providers. In addition, two 
regional officials said that their regions accept free training on a 
limited basis outside the formal agreements with the training 
providers; one of these officials identified 57 instances over the past 
5 years in which inspectors received free training from aircraft 
manufacturers or operators. Because these opportunities generally arise 
at the local office level, whether such an offer is reviewed by legal 
counsel depends on the office manager, the manager's understanding of 
FAA policy, and a judgment about whether a specific training 
opportunity raises any concern that warrants legal review. 

Although FAA has followed effective management practices in many areas 
in providing technical training to its safety inspectors, we are making 
several recommendations aimed at, among other things, improving FAA's 
identification of gaps in inspectors' technical knowledge that relate 
to their jobs, better aligning the timeliness of training to when 
inspectors need the training to do their jobs, gaining inspectors' 
acceptance for changes made or planned to their training, and ensuring 
that the acceptance of training from aviation industry providers does 
not limit FAA's enforcement authority or pose a real or potential 
conflict of interest. 

In commenting on a draft of this report, the Department of 
Transportation generally agreed with the information that we presented 
and agreed to consider our recommendations. However, the department 
expressed the view that we should have considered, as positive 
responses, the views of inspectors who responded to survey questions as 
"moderate extent," along with those who responded to a "great extent" 
or "very great extent." The extent scale that we used in our survey 
represents a unidirectional scale. As such, it is possible to interpret 
any point along that scale, other than "no extent," as positive, 
depending upon how a question is worded. Generally, we presented 
information in the report with both "very great extent" and "great 
extent" combined to represent the clearly most positive responses. 

Background: 

Ensuring the safety of the nation's aviation system is the shared 
responsibility of FAA and the aviation industry. Aircraft manufacturers 
are responsible for building safe aircraft. Aircraft operators are 
responsible for the safe maintenance and operation of aircraft. FAA is 
responsible for, among other things, certifying that the manufacture of 
aircraft and aircraft parts meets FAA standards, encouraging the 
development of new aviation technologies, and conducting periodic 
inspections to ensure continued compliance with safety regulations. 
Within FAA, the Office of Aviation Safety (1) directs and manages 
aviation safety through inspection (called surveillance by FAA) and 
oversight programs; (2) creates and amends standards and policies; and 
(3) certifies that aircraft, manufacturers, maintenance services, and 
individuals who operate aircraft meet FAA safety standards before they 
carry out their activities (called certification). 

FAA's 3,700 inspectors are located in more than 130 offices throughout 
the world. About 3,000 of these are front-line inspectors. These 
inspectors specialize in conducting inspections of various aspects of 
the aviation system, such as aircraft and parts manufacturing, aircraft 
operation, aircraft airworthiness, and cabin safety. (See table 1.) 

Table 1: Types of Inspectors, Responsibilities, and Numbers, as of 
April 2005: 

Inspector type: Air carrier operations; 
Areas of responsibility: Responsible for evaluating airmen (pilots, 
aviators, or aviation technicians) for initial and continuing 
qualifications, airmen training programs, equipment, and facilities, 
and aircraft operations for adequacy of facilities, equipment, and 
procedures to ensure the safe operation of the aircraft. Air carrier 
inspectors are responsible for evaluating pilots, dispatchers, air 
carriers, and similar operators; 
Number: 908. 

Inspector type: Air carrier maintenance; 
Areas of responsibility: Focuses on evaluating mechanics and repair 
stations for initial and continuing certification and mechanic training 
programs. Examines the overall aircraft maintenance program, including 
the development of maintenance manuals and the procedures for repairing 
aircraft and their components. Inspects aircraft and related equipment 
for airworthiness. Air carrier inspectors evaluate maintenance programs 
of air carriers and similar operators; 
Number: 831. 

Inspector type: General aviation operations; 
Areas of responsibility: Duties are similar to air carrier operations 
inspectors, with the exception that general aviation operations 
inspectors are responsible for evaluating pilots, flight instructors, 
air taxis, and similar operators; 
Number: 636. 

Inspector type: General aviation maintenance; 
Areas of responsibility: Duties are similar to air carrier maintenance 
inspectors, with the exception that general aviation maintenance 
inspectors evaluate maintenance programs of air taxis and similar 
operators; 
Number: 571. 

Inspector type: Air carrier avionics; 
Areas of responsibility: Responsible for inspecting aircraft 
electronics and related systems for airworthiness and for evaluating 
avionics technicians, repair stations, and technician training 
programs. Air 
carrier inspectors conduct surveillance and oversight of air carriers 
and similar operators; 
Number: 341. 

Inspector type: Aircraft certification; 
Areas of responsibility: Administers and enforces safety regulations 
and standards for the production and/or modification of aircraft. 
Evaluates and oversees the plants that build or assemble aircraft. 
Inspects prototype or modified aircraft, aircraft parts, and avionics 
for conformity with design specifications and safety standards. Issues 
certificates for all civil aircraft; 
Number: 187. 

Inspector type: General aviation avionics; 
Areas of responsibility: Duties are similar to air carrier avionics 
inspectors, with the exception that general aviation avionics 
inspectors conduct surveillance and oversight of air taxis, travel 
clubs, and similar operators; 
Number: 180. 

Inspector type: Cabin safety; 
Areas of responsibility: Serves as a resource and technical authority 
on cabin safety requirements, such as verifying that emergency 
equipment is onboard the aircraft, as they relate to activities 
affecting civil aviation; 
Number: 60. 

Total; 
Number: 3,714. 

Source: GAO summary of FAA information. 

[End of table]

Some inspectors, such as operations and airworthiness inspectors, 
further specialize according to the type of aircraft and aircraft 
operators they oversee. Other inspectors, such as general aviation 
inspectors, are responsible for inspecting a wide range of aircraft, 
such as those used for agriculture, air taxi service, industry, and 
pleasure. (See fig. 1.) In addition, they inspect flight instructors. 
Some air carrier inspectors are assigned to one of the 16 carriers that 
are currently subject to the Air Transportation Oversight System (ATOS) 
program, which is intended to identify safety problems through risk 
analysis,[Footnote 8] while other air carrier inspectors are 
responsible for overseeing the operations of several smaller carriers 
in a geographic area. 

Figure 1: FAA's Safety Inspections Cover a Wide Range of Activities: 

[See PDF for image]

Note: As a workforce, FAA inspectors conduct a wide variety of 
inspections, including ensuring that pilots are qualified to operate 
air carrier and general aviation aircraft and inspecting air carrier 
and general aviation aircraft for safety. 

[End of figure]

FAA requires that candidates for safety inspector positions have 
extensive technical qualifications and experience, which is usually 
gained during careers in the aviation industry. For example, 
prospective manufacturing inspectors need experience in and knowledge 
of industrial technologies. Similarly, operations inspectors need pilot 
licenses to fly specific makes and models of aircraft; maintenance 
inspectors need to have certifications to repair the aircraft's 
airframe and power plant; cabin safety inspectors need extensive 
experience in aircraft cabin safety procedures; and avionics inspectors 
need extensive experience in servicing an aircraft's avionics system, 
which includes radar and other electrical systems. 

To supplement the skills inspectors bring with them from their previous 
careers in the aviation industry, FAA provides inspectors with 
extensive training in federal aviation regulations; inspection and 
investigative techniques; and technical skills, such as flight training 
required for operations inspectors. The services within FAA's Office of 
Aviation Safety that are responsible for conducting inspections of 
aircraft operators and aircraft repair stations[Footnote 9] (Flight 
Standards) and manufacturers (Aircraft Certification) have each 
established training units that develop curricula and specific courses 
for inspectors. Most of the regulatory, inspection, and investigative 
courses are taught by FAA instructors at the FAA Training Academy in 
Oklahoma City, Oklahoma. Much of the technical training (training that 
enhances skills concerning the production, maintenance, and operation 
of aircraft, aircraft parts, airworthiness, and systems) is contracted 
out to vendors, such as flight schools. (App. I provides information on 
the amount of inspector training provided by FAA and vendors.) FAA is 
also making increased use of nonclassroom training delivery methods, 
such as computer-based instruction, Web-based training, interactive 
video training, and correspondence courses. Inspectors also receive 
extensive on-the-job training, particularly when they are first hired. 
FAA spent an average of $43 million per year on inspector training 
activities in fiscal years 2002 through 2004 and plans to spend $41 
million in fiscal year 2005. 

The Flight Standards and Aircraft Certification training divisions have 
training priorities, which are set by the Associate Administrator for 
Aviation Safety. Training needs are determined on the basis of each 
inspector's job. Acquiring training is a shared responsibility between 
inspectors 
and their supervisors. (See fig. 2.) Each year inspectors and their 
supervisors meet to decide which training inspectors will request in 
the coming year. The inspectors are expected to choose training that 
will fulfill their mandatory training requirements in areas such as 
basic aircraft accident investigation, air carrier airworthiness, 
aviation safety inspector job functions, and data analysis and related 
skills needed to perform system safety inspections. Inspectors can also 
request training that they believe will further their professional 
development. FAA's directorate, regional, and headquarters offices then 
compile and fund the training requests. Headquarters training officials 
coordinate with the FAA academy and other training vendors to deliver 
the training. Inspectors provide feedback to headquarters and academy 
officials through surveys and course evaluations. 

Figure 2: FAA Safety Inspector Training Roles and Responsibilities: 

[See PDF for image] 

[End of figure] 

FAA's transition to the ATOS system safety concept represents a major 
change in the way the agency operates as it shifts the oversight 
emphasis from the traditional methods of inspection to identifying and 
assessing risks to safety. Under the traditional or compliance 
approach, inspectors rely upon random inspection activities, such as 
observing aircraft parked at departure gates. When applying the system 
safety approach, inspectors develop comprehensive surveillance plans 
for each air carrier. Developing the plans requires using existing 
safety data, risk indicators, and the inspector's knowledge of the 
operations to determine the priority and frequency of inspection 
activities. The 
resulting comprehensive surveillance plan includes a series of 
inspection tasks to determine whether an airline has systems in place 
to ensure safety and a second series of inspections to verify that the 
airline is actually using those systems. 

FAA has taken steps to introduce concepts used in ATOS into its 
traditional oversight process for the air carriers not in the ATOS 
program. In November 1999, FAA instructed its inspectors to begin 
adjusting planned inspections for new air carriers on the basis of 
evaluations of areas of potential safety risks.[Footnote 10] In 2002, 
inspectors were instructed to perform safety risk evaluations of all 
other non-ATOS carriers using ATOS risk assessment principles as part 
of their inspections. However, the inspections of the non-ATOS carriers 
are still based on a determination of whether air carriers are 
complying with regulations rather than whether air carriers' systems 
are operating effectively.[Footnote 11] According to FAA, in the 
transition to the system safety concept, safety inspectors are learning 
new skills, such as data analysis, risk assessment, computer 
operations, auditing, systems thinking, and interpersonal skills. 
Inspectors will continue to need technical expertise in avionics, cabin 
safety, operations, maintenance, and aircraft production and design, 
and may need training in composites, basic accident investigation, and 
nondestructive inspections courses. FAA has concluded that it will take 
a significant training effort to develop and maintain both the system 
safety approach as well as the technical competencies. 

Strategic Planning Activities Generally Reflect Effective Practices and 
Focus on Reducing a Large Gap in System Safety Knowledge: 

FAA's strategic planning acknowledges the central importance of 
aviation safety inspectors and defines their role as mission critical. 
In its planning activities for training, FAA has, for the most part, 
followed effective management practices by developing strategic 
approaches to training that have established broad training priorities 
for inspectors, among other things. (See table 2.) 

Table 2: Extent That FAA Followed Effective Management Practices in 
Planning for Training: 

Effective management practice: Ensures training goals and related 
performance measures and targets are consistent with overall mission 
and goals; 
Extent followed: Fully. 

Effective management practice: Ensures human capital professionals work 
in partnership with agency leadership in addressing agency priorities, 
including training, in strategic and annual performance planning 
processes; 
Extent followed: Fully. 

Effective management practice: Determines skills and competencies its 
workforce needs to achieve current and emerging agency goals and 
identifies gaps--including those training strategies can help address; 
Extent followed: Mostly. 

Effective management practice: Identifies appropriate level of 
investment for training and prioritizes funding so that the most 
important training needs are addressed first; 
Extent followed: Partially. 

Effective management practice: Ensures agency strategic and tactical 
changes are promptly incorporated into training efforts; 
Extent followed: Fully. 

Source: GAO. 

[End of table]

Establishing training goals and performance measures that further 
overall agency goals. One of the goals of the inspector training 
program is to provide the training required to support inspectors in 
FAA's transition to a system safety approach, which furthers the 
agency's goal of increased safety. Both Flight Standards and Aircraft 
Certification have 
developed training initiatives to support this training goal. Flight 
Standards also has eight training initiatives focused on improving 
aviation safety in general, each of which has related performance 
measurements or targets along with strategies, time lines, and resource 
estimates.[Footnote 12] In addition, to support the FAA safety goal, 
the Office of Aviation Safety, which contains both the Flight Standards 
and Aircraft Certification services, measures what the services have 
done to prepare their workforce to operate in a system safety 
environment. This measurement includes information on how the services 
have developed and delivered training, redesigned existing courses, and 
validated inspector competencies. 

Human capital professionals partnering with agency leadership. In June 
2003, Flight Standards established a human capital council that brings 
senior managers together with training officials to oversee all human 
capital efforts and, among other things, to establish priorities that 
will both maintain existing inspector technical competencies and 
develop new and emerging system safety competencies. In Aircraft 
Certification, the 
manager responsible for training programs is involved in the service's 
annual planning process and also participates in weekly meetings of 
senior level managers. The training manager keeps the training 
development staff informed of new or changing priorities that could 
affect the training program. 

Determining gaps in workforce skills and competencies. Both Flight 
Standards and Aircraft Certification conducted human capital analyses 
for their aviation safety inspectors that revealed significant gaps in 
needed competencies and skills as FAA continues its implementation of 
a more risk-based system safety approach to oversight. In Flight 
Standards, the analysis involved a team of 
FAA senior managers and FAA subject-matter experts.[Footnote 13] The 
team reviewed the existing competency requirements for inspectors and 
then determined which competencies should be modified or added over the 
next 5 years. The list of competencies compiled was then reviewed by 
another group of subject-matter experts, primarily program managers, 
who estimated the relative importance of the competencies (existing and 
new) in the next 5 years as well as the gaps between the current 
workforce's actual and needed level for each competency. The largest 
critical competency gaps for the inspectors in Flight Standards 
included (1) risk assessment, (2) data analysis, (3) systems thinking, 
and (4) designee and industry oversight.[Footnote 14] Technical 
proficiency training was the only competency that Flight Standards did 
not identify as having a critical competency gap, and the list of 
competencies for Flight Standards field inspectors issued in February 
2005 does not include technical proficiency. FAA Office of Aviation 
Safety officials said that inspectors do not need a substantial amount 
of technical training courses because inspectors are hired with a high 
degree of technical knowledge of aircraft and aircraft systems, and 
they can sufficiently keep abreast of many of the changes in aviation 
technology through FAA and industry training courses and on-the-job 
training. Flight Standards officials said that the list of competencies 
contains items that cut across all inspection specialties and that it 
will be the role of individual curriculum oversight teams to identify 
the technical skills and competencies for each inspector specialty. 
Flight Standards officials said they will establish these teams as part 
of an effort to develop specific curricula for each inspector 
specialty (see the following section on developing training 
activities). 

Aircraft Certification subject-matter experts who manage inspectors 
identified four similar critical competency gaps for the implementation 
of system safety for its manufacturing inspectors. The gaps included 
(1) business and management, (2) data analysis/risk assessment, (3) 
systems thinking skills, and (4) designee oversight. Aircraft 
Certification officials noted that these are skills that its inspectors 
need to perform their primary inspection function, which is ensuring 
that manufacturers meet design specifications for aircraft parts and 
components. Inspectors do this by inspecting the processes and quality 
assurance systems involved in aircraft and parts manufacturing. 

Prioritizing funding for training activities. Currently in Flight 
Standards, requests for course development projects come from the 
operational policy divisions to the training division. The training 
division then works closely with these individual policy divisions 
(such as the Air Transportation Division or the Aircraft Maintenance 
Division) to develop or revise the courses they request. However, the 
existing process does not explicitly consider which course development 
projects are most critical. During fiscal year 2005, Flight Standards 
plans to develop an approach that will consider the organizational 
factors necessary to prioritize requests for new courses and revision 
of current courses, including exploring ways to engage senior 
management. According to FAA officials, a curriculum oversight steering 
committee will provide strategic direction and prioritization for the 
service's training needs. Aircraft Certification already employs a 
process that prioritizes training activities on the basis of three 
factors: impact on aviation safety, inspector job functions, and the 
needs of the customer. Training division officials meet each year to 
establish training priorities and to determine the resources needed to 
meet these priorities. According to these officials, they are guided by 
FAA strategic plans and direction they receive from operational program 
managers. 

Promptly incorporating strategic and tactical changes into training and 
development. Flight Standards has recognized the need to quickly deploy 
its training in order to reduce the time lag between the 
identification of training needs and the delivery of training as 
inspector performance requirements change. Currently, Flight Standards 
has quarterly and semiannual training program reviews between the 
training division, the Flight Standards organizations that sponsor the 
training, and the FAA 
academy to discuss the sponsoring organizations' training needs. It is 
also the responsibility of staff who oversee individual courses, known 
as course mentors, to ensure courses reflect changes in FAA policies 
and procedures or new developments in aviation technologies. In 
Aircraft Certification an executive-level mentor is selected from its 
management team for each course and is responsible for managing the 
development of new courses and the updating of existing courses to 
respond to changes in FAA policies and priorities. We did not attempt 
to assess the extent to which FAA incorporates strategic and tactical 
changes into its inspector training curriculum. 

FAA Follows Effective Management Practices in Developing Individual 
Courses but Recognizes the Need to Develop a Unified Curriculum: 

FAA has for the most part followed effective management practices for 
developing individual safety inspector courses. (See table 3.) These 
practices, such as establishing guidelines that call for the formation 
of course development teams for each new course and that require each 
team to follow a series of progressive course development steps, are 
aimed at enhancing course quality and ensuring that the content of the 
course meets the intended course goals and performance objectives. 
However, FAA has not systematically identified technical training needs 
because it develops courses on a course-by-course basis rather than as 
part of an overall curriculum framework. 

Table 3: Extent That FAA Followed Effective Management Practices in 
Developing Courses: 

Effective management practice: New courses developed to meet emerging 
demands and improve performance; 
Extent followed: Fully. 

Effective management practice: Course development teams enable 
stakeholders to provide input; 
Extent followed: Fully. 

Effective management practice: Guidelines provide progressive course 
development steps with ongoing evaluation at each step; 
Extent followed: Fully. 

Effective management practice: Merits of different course delivery 
methods are considered; 
Extent followed: Fully. 

Effective management practice: Criteria used for decisions regarding 
outside training providers; 
Extent followed: Fully. 

Effective management practice: Analysis of training needs and course 
development linked to overall curriculum approach[A]; 
Extent followed: Partially. 

Source: GAO. 

[A] This management practice is not specifically identified in our 
assessment guide. However, a management approach that assesses training 
needs holistically rather than on a course-by-course basis can provide 
for a more systematic assessment of whether and how training will help 
meet organizational needs. 

[End of table]

FAA Follows Many Effective Management Practices for Developing 
Technical Courses: 

FAA follows many effective management practices for developing 
technical training courses for Flight Standards and Aircraft 
Certification safety inspectors. 

Ensuring new courses meet emerging demands and improve performance. At 
the very beginning of any new course development effort, Flight 
Standards and Aircraft Certification validate the need for new aviation 
safety inspector training by discussing (1) the facts that indicate the 
need for training, (2) the desired outcome of the training in terms of 
performance, and (3) the target audience of inspectors who will receive 
the training. Before any substantial course development activities 
occur, Flight Standards and Aircraft Certification training guidelines 
require that a task analysis be conducted. The purpose of the task 
analysis is to identify essential tasks, knowledge, and skills needed 
for effective safety inspector job performance. FAA then uses this task 
analysis as the basis for determining the scope, content, and 
sequencing of training topics for each new course. 

Flight Standards and Aircraft Certification have also created different 
ways for field and headquarters personnel to request the development of 
new aviation safety inspector training courses when a new training need 
has emerged. Field personnel who see a need for safety inspectors to 
perform a new task or acquire new knowledge can submit training 
development requests. In addition, officials in Flight Standards and 
Aircraft Certification can propose new training courses for inspectors 
when regulatory changes occur or when new FAA initiatives, such as the 
system safety approach for aviation safety inspectors, create a need 
for additional inspector knowledge and skills. Those proposing new 
courses must describe how the proposed course will contribute to FAA's 
mission, explain the inspector knowledge and skills that will be 
acquired by taking the course, define the target audience for the 
proposed course, and describe the impact on the inspector workforce if 
the course is not developed. 

Enabling qualified personnel to participate as stakeholders. When 
Flight Standards and Aircraft Certification begin to develop a new 
course, a training development team is formed and different policy, 
technical, and training personnel participate in team activities 
throughout new course development. As a result, each member of the 
course development team has different skills and unique perspectives 
that he or she can contribute to course development. Each course 
development team has a course mentor whose role is to work with other 
team members through all course development stages to ensure that the 
content of the course meets the intended course goals and performance 
objectives. In addition, Flight Standards and Aircraft Certification 
encourage course development teams to include subject-matter experts. 
According to FAA, these subject-matter experts, who can be FAA 
employees or even outside consultants, can improve the quality and 
accuracy of the new course because they have specific knowledge and 
experience in one or more course topics. For example, Flight Standards 
is now developing a new course for the advanced avionics "glass 
cockpit" displays that are increasingly being used by air carrier and 
general aviation operators. (See sidebar.) The expertise and knowledge 
of the course development team for the new advanced avionics glass 
cockpit display course were significantly enhanced by subject-matter 
experts who were assigned to the team and had experience approving 
aircraft equipped with these advanced avionics displays. 

Besides the course mentor and subject-matter experts, other key team 
members on course development teams include instructional systems 
designers who provide expertise in training design and course 
developers who write the actual lesson plans for the new course. 

Experts outside of FAA can also provide input on course development in 
many technical subjects. For example, FAA established a partnership 
with universities and affiliated industry associations and businesses 
throughout the country to form Centers of Excellence, which conduct 
aviation research in a number of areas including advanced materials, 
aircraft emissions, and airworthiness. The General Aviation Center of 
Excellence, formed in 2001, has conducted research on aircraft seat-
restraint systems, increasing aircraft landing safety, and aircraft de-
icing. In addition to their technical expertise, many universities and 
private sector companies in the aviation industry have substantial 
experience conducting aviation training and education programs. For 
example, Embry-Riddle Aeronautical University, which is the lead 
university for the General Aviation Center of Excellence, has been in 
the aviation training industry since 1926 and, in addition to pilot 
and maintenance training, offers more than 30 degree programs, 
including 
programs in engineering, aviation management, and aviation safety 
science. FAA already contracts with Embry-Riddle for some inspector 
flight training and recently expanded the number of training locations 
with another General Aviation Center of Excellence program. Airlines 
also have substantial experience offering pilot, crew, and maintenance 
training. FAA receives input from training providers like Embry-Riddle 
on course development as part of contracted training courses. FAA 
officials who work with Centers of Excellence said that there could be 
more opportunities for the agency to utilize the technical and aviation 
training expertise of the Centers of Excellence in developing its 
inspector training program. 

Using a structured approach for course development. Flight Standards 
and Aircraft Certification both use a structured course development 
approach that calls for progressive course development steps and 
ongoing evaluation of the training at each step. This approach provides 
course development teams with a description of activities that should 
occur at each step, which helps to ensure that lesson plans, course 
materials, and course delivery methods enable the student to meet 
course objectives and increase job performance. (See fig. 3.) 

Figure 3: FAA's Structured Approach for Course Development: 

[See PDF for image] 

[End of figure] 

Flight Standards and Aircraft Certification have also built quality 
controls into their course development guidelines by requiring 
evaluation at each of these course development steps. Generally, after 
each stage in the course development process described in figure 3, the 
course development team reviews the work that occurred up to that 
point. For example, once the course developer has created lesson plans 
and any course materials, all members of the course development team 
review and make suggestions for revising them. Both Flight Standards 
and Aircraft Certification evaluate and test newly developed training 
in the final stages of course development. For example, Flight 
Standards conducts an operational tryout to see how effective the 
course is, with actual course instructors teaching the new course 
lesson plans and team members acting as observers. New FAA courses are 
tested again in the final prototype stage when instructors, with 
observers, teach the course in front of students from the course's 
planned target audience. These students provide feedback on each lesson 
in the course in such areas as clarity of objectives, appropriateness 
of the level of instruction, and the usefulness of training materials. 
These quality checks are aimed at ensuring that lesson plans flow 
smoothly and support the course objectives. 

Considering different approaches for presenting courses. Flight 
Standards and Aircraft Certification generally make decisions on the 
delivery approach or method to use for aviation safety inspector 
courses in the initial stages of course development. For example, the 
course development team will consider factors such as the complexity of 
the topic, how soon the course is needed, and how many students will 
need the training. In the case of training a large number of inspectors 
on a relatively simple topic or a quick refresher course, a short self-
paced computer-based or Web-based training course might be selected. 
Because developing and delivering a classroom course can take months, 
self-paced training can also be used when knowledge or information 
needs to reach a large number of students quickly. 

However, when a course requires interaction and hands-on learning and 
it covers a lengthy or complex topic, the course development team could 
decide that a classroom format followed by practical exercises is the 
most suitable delivery method. (See fig. 4.) For example, in developing 
the glass cockpit course discussed above, the course development team 
considered several factors, including the complexity and rapid growth 
of the technology, as well as the fact that relatively few students 
have had a chance to become familiar with glass cockpit systems. The course 
development team then decided to use a combination of classroom and 
practical exercises as the primary delivery methods. Under this course 
format, students participate in classroom lecture and discussion 
sessions for the introductory lesson on glass cockpit technology. The 
students then have practical exercises on flight simulators with glass 
cockpit displays to integrate and reinforce the knowledge gained in the 
classroom. 

Figure 4: FAA Inspectors Receiving Training in a Classroom Setting: 

[See PDF for image] 

Note: When covering a technical or complex subject, FAA will often use a 
classroom format that allows for group interaction and practical 
exercises. 

[End of figure] 

Using criteria for decisions on outside training providers. Flight 
Standards and Aircraft Certification have developed and apply criteria 
for deciding whether to use outside training providers for their new 
aviation safety inspector courses. For example, one criterion is 
whether FAA or an outside training provider has more technical 
expertise. Generally, FAA will use its own instructors to teach many of 
the introductory courses that inspectors receive when they first join 
FAA. This is because many of these courses familiarize the new safety 
inspector with inspector responsibilities and job functions and describe 
aviation regulations, and FAA is usually the most appropriate training 
provider to cover these topics. However, some private sector companies 
that concentrate in a particular aviation technology will have more 
expertise than FAA. For 
example, because an outside training provider has more specialized 
technical knowledge in composite materials,[Footnote 15] Flight 
Standards contracts with this provider to deliver composites and 
composites repair training. 

FAA Plans to More Systematically Identify Technical Training Needs in 
Developing Its Inspector Training Curriculum: 

As discussed above, FAA's course development activities follow many 
effective management practices for developing individual courses, but 
the agency has not yet systematically identified its inspectors' 
overall training needs to ensure that the curriculum addresses the 
unique needs of each type of inspector. However, FAA is developing a 
specific training curriculum for each type of inspector. 

Flight Standards recognizes that it manages courses as individual 
components and that it needs to develop courses and address training 
needs for each of its inspector specialties as part of an overall 
curriculum. In addition, our survey indicates that only 27 percent of 
inspectors said that the current set of FAA-recommended training 
courses for each inspector type captures the training needed to do 
their jobs to a great or very great extent.[Footnote 16] Flight 
Standards recognizes that for curriculum transformation to work 
effectively, a strategy for curriculum management, as opposed to course 
management, needs to be clearly articulated. In response, Flight 
Standards is developing a new performance-based training initiative 
with the goal of systematically assessing the complete array of 
training to ensure it meets the performance requirements of the many 
specialties, disciplines, and positions in Flight Standards' ranks. In 
an effort to implement a more curriculum-based approach that addresses 
different inspector training needs, the curriculum transformation plan 
recommends creating curriculum oversight teams for each type of 
inspector made up of representative inspectors from the field and from 
headquarters. In contrast to the current course development teams, 
which focus on individual courses, these curriculum oversight teams 
would be responsible for the overall curriculum for 
each type of inspector, including defining training requirements and 
ensuring that curriculum and course content are current and consistent 
with Flight Standards policy and practices in the field. The Flight 
Standards steering committee is responsible for chartering these 
curriculum oversight teams and approving the curriculum they develop 
for each inspector type. Flight Standards estimates that it will 
complete implementation of its curriculum transformation plan in 2008. 
We believe that, if effectively implemented, these approaches would 
allow Flight Standards to develop a more systematic method for 
identifying training needs and provide a curriculum that is more 
relevant to different types of inspectors and their needs. 

While Flight Standards is in the first stages of implementing its new 
curriculum-based approach to training, Aircraft Certification has 
recently taken steps to revise existing courses and develop new courses 
within an overall curriculum approach. For example, it formed a 
curriculum study team and completed a proposed curriculum for its 
manufacturing aviation safety inspectors and has revised inspector 
courses and other aspects of training according to its new curriculum 
plan. Because Aircraft Certification has only one type of aviation 
safety inspector, a permanent curriculum study team may not be 
necessary. Flight Standards, in contrast, has avionics, maintenance, 
and operations inspectors for both general aviation and air carriers, 
as well as other inspectors, such as cabin safety inspectors. 

FAA Provides Extensive Support for Delivering Training; However, Many 
Inspectors Believe Improvements Could Help Them Do Their Jobs More 
Effectively: 

FAA recognizes that effective delivery of quality inspector training is 
crucial to the success of the agency's mission to obtain industry 
compliance with safety standards and promote the continuing safety of 
air travel. FAA has generally followed effective management practices 
for training deployment to help ensure effective delivery of training, 
but improvements could be made. (See table 4.) Experts from the 
aviation and academic communities whom we consulted generally agreed 
that, for the most part, the courses FAA offers meet current and 
emerging technical needs. However, many inspectors question whether the 
training they receive is sufficient to provide them with the technical 
knowledge needed to perform their jobs. 

Table 4: Extent That FAA Follows Effective Management Practices in 
Delivering Technical Training: 

Effective management practice: Clearly delineates accountability for 
achieving agency training goals; 
Extent followed: Fully. 

Effective management practice: Uses a suitable and timely process for 
selecting inspectors for technical training given inspectors' current 
duties and existing skills; 
Extent followed: Partially. 

Effective management practice: Fosters an environment that is conducive 
to learning; 
Extent followed: Fully. 

Effective management practice: Takes steps to encourage employee buy-in 
to goals and priorities of technical training; 
Extent followed: Partially. 

Source: GAO. 

[End of table]

FAA Generally Follows Several Effective Management Practices for 
Delivering Technical Training: 

FAA generally follows several of the effective management practices 
that are important for delivering technical training. 

Clearly delineating accountability for ensuring access to technical 
training. According to FAA officials, FAA program and training 
officials have developed a list of mandatory and recommended courses 
for each inspector position. These training lists contain some 
technical courses but focus mainly on courses involving the 
fundamentals of the inspection process (such as courses covering 
inspection of automation systems, compliance and enforcement 
procedures, and system safety concepts) and job tasks for each safety 
inspector specialty. FAA inspection program managers note that the 
recommended course lists are not more prescriptive for technical 
training because the need for technical training depends on the 
specific types of aircraft and equipment with which inspectors work. 
Thus, decisions on technical training needs are mainly the 
responsibility of the individual inspectors and their immediate 
supervisors, in accordance with FAA guidance that provides decision-
tree criteria for approving training requests. After inspectors and 
their supervisors agree on inspectors' technical training requests, 
regional, headquarters, and academy training executives determine which 
courses will be taught. From there, FAA training divisions work with 
the FAA Academy to implement the training by developing course 
schedules and inspector quota allocations. 

Using a suitable process for selecting inspectors for technical 
training. FAA's automated training request process provides inspectors 
with the opportunity to plan for, request, and be selected for the 
technical training necessary for their positions. The Flight Standards 
and Aircraft Certification lists of technical and other courses 
essential for each inspector position, as well as other courses that 
are available to further inspectors' professional development, are 
available for review and planning by the inspectors and their 
supervisors.[Footnote 17] The training system contains information on 
all the training courses previously completed by the inspectors and 
outlines their progress toward meeting training requirements. With the 
supervisor's guidance and approval, each year inspectors request 
training courses reflecting training needs related to the inspector's 
position, office inspection activity, and succession planning. However, 
both FAA and its inspectors recognize the need for more timely 
selection of inspectors for technical training, a topic discussed later 
in this report. 

Fostering an environment conducive to learning. FAA has provided many 
of the elements necessary to promote inspectors' learning of technical 
material, including allowing them time away from work to receive 
classroom or computer-based training. FAA also has invested in 
technologies such as computer-based and interactive-video training that 
help meet the demand for technical and other training. Similarly, FAA 
has moved some training closer to the inspector duty locations to 
facilitate and encourage training attendance and has begun 
experimenting with bringing training to the duty locations when 
appropriate. FAA's on-the-job training program also gives inspectors 
hands-on experience with the aircraft and components for which they are 
responsible. In addition, FAA streamlined the process for acquiring 
training opportunities that arise on short notice, such as when 
inspectors are assigned a new aircraft type to inspect. Finally, FAA's 
training management system allows inspectors to schedule available 
technical training courses tailored to their individual needs. 

Acting to obtain inspector buy-in for training goals and priorities. 
While believing that its inspectors have sufficient technical knowledge 
to perform inspections, FAA has recognized the need to facilitate 
communication between inspectors and management in order to gain 
inspector buy-in for a training program emphasizing system safety over 
technical courses. Currently, FAA primarily depends on its local office 
managers (the inspectors' supervisors) to communicate training goals 
and priorities to the inspectors, mostly during the annual training 
planning process. According to FAA training officials, this information 
is also disseminated to inspectors in strategic training plans and 
other guidance on training. FAA officials further note that inspectors 
have opportunities to communicate their views on training in course 
evaluations and employee surveys. 

Nevertheless, FAA recognizes the need to increase communication between 
inspectors and management with respect to the training program. Flight 
Standards recognizes that without inspector buy-in, safety inspectors 
will not be able to effectively execute system safety oversight; such 
buy-in is therefore critical to Flight Standards' success. FAA is 
concerned that inspectors have not fully 
bought into the system safety approach to inspections. In an attempt to 
gain support for and understanding of the system safety approach and 
the ways in which inspectors will be affected by the change, Flight 
Standards plans to host focus groups with management and inspectors, 
conduct individual interviews with all Flight Standards employees, and 
create an outreach and communication team to foster better 
understanding between FAA management and the inspectors. Similarly, 
Aircraft Certification has planned a number of steps to increase 
communication between management and the inspector workforce, including 
facilitated focus groups, individual interviews, and more effective 
employee feedback mechanisms. These actions may well be needed because, 
as discussed in the following section, inspectors are generally 
dissatisfied with the technical training that they receive. 

Inspectors Are Generally Dissatisfied with the Technical Training That 
They Receive: 

Although FAA has followed or is taking steps to follow many of the 
effective management practices in planning, developing, and delivering 
technical training, inspectors expressed widespread dissatisfaction 
with this training. Inspector dissatisfaction covered three areas: (1) 
having insufficient technical knowledge to do their jobs, (2) not being 
able to take training they say they needed, and (3) not receiving 
training in time to do their jobs. 

One possible explanation for this seeming contradiction is that, 
although FAA generally employs sound approaches for putting its 
technical training in place, its actual delivery falls short--the 
latter being the view of the bulk of its inspector workforce. We were 
not able to assess this possible explanation because, as discussed at 
the beginning of this report, we had no practical way to assess the 
amount of training necessary for inspector proficiency or the 
timeliness of the training provided. Another possible explanation is 
that the technical training that FAA provides meets the current and 
future needs of the agency to a large degree and its inspectors have 
unrealistic expectations about technical training. This is the view of 
FAA, and its reasons are discussed later in this section. 

Having Sufficient Technical Knowledge and Training: 

On the basis of our survey, we estimate that only about half of FAA 
inspectors believe, to a great or very great extent, that their 
technical knowledge is sufficient to enable them to do their jobs 
properly.[Footnote 18] (See fig. 5.) This belief varies somewhat among 
inspector specialties. Some inspectors--such as those who specialize in 
cabin safety and aircraft certification--told us that, to a great or 
very great extent (78 percent and 68 percent, respectively), they have 
enough technical knowledge to do their jobs.[Footnote 19] On the other 
hand, only a third of air carrier avionics inspectors told us that they 
currently have sufficient technical knowledge to do their 
jobs.[Footnote 20]

Figure 5: Inspectors Responding that to a Great or Very Great Extent 
They Currently Have Enough Technical Knowledge to Do Their Jobs: 

[See PDF for image] 

Note: See table 13 in appendix II for additional results. 

[End of figure] 

One reason for the disparity of views concerning technical knowledge 
among inspectors of different specialties could be their perceived need 
for specialized knowledge. For example, cabin safety inspectors noted 
that much of their knowledge of the cabin environment comes from 
previous experience with airlines and from on-the-job experience. 
Similarly, according to FAA, Aircraft Certification inspectors bring 
with them a high degree of technical knowledge, gained in previous 
careers in the aviation industry; and typically these inspectors need 
less technical training than other types of inspectors. Our analysis of 
training received confirms that aircraft certification and cabin safety 
inspectors receive less technical training than other inspector 
specialties. Additionally, as shown above, they are the most satisfied 
of all inspector specialties that they have the technical knowledge 
needed to do their jobs. In contrast, avionics inspectors--who were 
the least satisfied that they have received enough technical training 
to do their jobs--indicated that they believe they require specialized 
knowledge of the avionics systems they inspect. Inspector training data 
show that these inspectors receive the most technical training of all 
inspector specialties. (This topic is discussed in more detail later in 
the report. See table 6.) However, from our survey, they believe that 
they need more. 

Our survey also indicates that most inspectors believe that the 
technical training they have recently received has not greatly 
contributed to their ability to perform inspections.[Footnote 21] 
Specifically, we estimate that about 35 percent of the inspectors 
believe that the technical training that they received in the last 2 
years helped them do their current jobs to a great or very great 
extent.[Footnote 22] The results ranged from a high of 39 percent for 
air carrier operations inspectors to a low of 23 percent for general 
aviation avionics inspectors.[Footnote 23] The higher percentage for 
operations inspectors could be attributed to the fact that they are 
required to take flight training on an annual basis, whereas other 
inspector specialties such as avionics, maintenance, and cabin safety 
do not have similar requirements for annual training. In comments 
included with their surveys, inspectors expressed opinions on whether 
they have sufficient training to do their jobs. Of the 240 inspectors 
who took the time to write narrative responses about the sufficiency of 
training, 31 offered positive comments, 105 were strongly negative, and 
another 119 had weaker negative comments. In addition, 37 inspectors 
indicated they found themselves inspecting aircraft or components they 
had not been trained on.[Footnote 24]

Our survey also indicates that inspectors believe that most of the 
technical knowledge they possess was gained in their previous careers 
in the aviation industry. For inspectors who said that to a great or 
very great extent they have sufficient technical knowledge to do their 
jobs, we estimate that 80 percent also noted that the knowledge and 
skills they brought to FAA from their previous careers contributed, to 
a great or very great extent, to this technical knowledge.[Footnote 25] 
We estimate that lower percentages of inspectors from this group rated 
technical training from FAA instructors (25 percent) and aviation 
industry sources (41 percent) as contributing, to a great or very great 
extent, to the technical knowledge needed to perform their 
jobs.[Footnote 26] Our analysis of survey responses indicates that the 
amount of time since inspectors left their careers in the aviation 
industry was not a factor in inspectors' views about their job-related 
technical knowledge. Newer inspectors were no more likely than longer-
tenured inspectors to say that to a great or very great extent they 
have enough technical knowledge of the aircraft, systems, or operations 
they inspect to do their jobs. 

FAA officials indicated to us that inspectors will always believe they 
need more training. These officials further stated that inspectors 
need only enough technical knowledge of aircraft, systems, and 
components to be effective: they need to know enough to ask the right 
questions, recognize potential problems, and understand issues that 
arise. Full proficiency with the 
aircraft and components is not necessary. However, FAA officials 
indicated that inspectors believe that full or near full proficiency is 
necessary. An FAA official attributed inspectors' views about the 
perceived insufficiency of technical training to many of them not fully 
accepting the agency's transformation to a system safety approach to 
inspections with its emphasis on risk analysis over technical 
knowledge. The traditional inspection system relied to a great extent 
on an individual inspector's technical expertise to identify safety 
problems with operations or aircraft systems. Although the system 
safety approach requires inspectors to have an understanding of 
aircraft and aircraft systems, it is more important for them to have 
the skills to analyze data to identify vulnerabilities in aircraft 
operators' and manufacturers' systems for ensuring safety. FAA 
officials believe that as inspectors gain experience with system 
safety, they will better understand the more limited role technical 
knowledge plays in this inspection approach. 

Most of the inspector training experts we consulted on the inspector 
technical training curriculum generally agreed with FAA's position. 
Seven of the 10 experts we contacted told us that the technical courses 
that FAA offers sufficiently cover existing and emerging technical 
areas.[Footnote 27] In particular, they noted that the flight training 
for operations inspectors was adequate for performing flight checks. 
However, two experts were concerned that there was not enough training 
in advanced aviation technologies for maintenance and avionics 
inspectors. These experts thought that maintenance and avionics 
inspectors should have periodic refresher training that would allow 
them to become more familiar with changes in the aircraft and systems 
they deal with during inspections. The experts did not comment on 
whether individual inspectors receive all the technical training 
necessary for their positions since this would have required an 
extensive, detailed review of training records. 

Most representatives from airlines we contacted were at least 
moderately satisfied that FAA inspectors have sufficient technical 
knowledge and training.[Footnote 28] On the basis of their experience 
with FAA inspectors, 19 of the 23 airline representatives we consulted 
said that to a great extent (7 responses) or moderate extent (12 
responses) FAA inspectors had the technical knowledge to fulfill 
inspection responsibilities. Regarding training, 16 of the 23 thought 
that to a very great extent (1 response) or moderate extent (15 
responses) FAA inspectors have the technical training to fulfill these 
responsibilities. 

Being Able to Take Requested Training: 

Also reflecting the disparity in views between inspectors and FAA 
management concerning technical training, many inspectors indicate that 
they are not greatly encouraged to take technical training and that to 
a large extent they do not get all the technical training they request. 
However, inspectors' views are not supported either by FAA training 
request data or by progress made in taking training courses deemed 
essential by FAA.[Footnote 29] On the basis of our survey, we estimate 
that less than half (43 percent) of inspectors think their supervisors 
encourage them to request the technical training needed to do their 
current jobs.[Footnote 30] We also estimate that about 28 percent of 
FAA's inspectors believe, to a great extent or very great extent, that 
they receive the technical training that they request.[Footnote 31] 
(See fig. 6.) If we include responses citing receiving requested 
training to a moderate extent, we then estimate that about 49 percent 
of inspectors overall believe that they receive the training they 
request at least to a moderate extent.[Footnote 32]

Figure 6: Extent to Which Requested Technical Training Is Approved: 

[See PDF for image] 

Note: See table 14 in appendix II for additional results. 

[End of figure] 

Our survey also indicates that inspectors with longer tenures at FAA 
have more difficulty getting technical training they request than 
inspectors who have recently been hired. According to the survey, of 
those inspectors with 10 or more years with FAA, we estimate that 55 
percent said that requested technical training was approved to some or 
no extent as compared to an estimated 35 percent of inspectors who had 
been with FAA 3 years or less.[Footnote 33] This may be due in part to 
newer inspectors tending to request essential courses, which are likely 
to be approved, whereas more experienced inspectors tend to request 
courses outside of FAA requirements, which are more likely to be 
denied. As discussed earlier, 
FAA records did not allow us to assess the merits of the inspectors' 
views. 

FAA's data on the extent to which requested courses were taken do not 
support the inspectors' contention that they do not receive requested 
training. According to Flight Standards data, for fiscal years 2003 
through 2005, on average FAA approved about 90 percent of requested 
technical training and has approved a similar percentage for upcoming 
fiscal year 2006. (See fig. 6.) FAA officials note that these data do 
not include late course cancellations that occur after the training 
schedule for the year is set. Though we were unable to obtain similar 
data from Aircraft Certification on course cancellations or denials, 
its courses are occasionally cancelled or changed; in fiscal year 2005, 
Aircraft Certification cancelled one course. 
Officials told us that if a training course essential for an inspector 
to perform his or her job is cancelled, the inspector will be placed in the 
next available course. Aircraft Certification inspectors represent 
about 5 percent of the inspector workforce (excluding supervisors, 
managers, and others in the aviation safety inspector job series who do 
not perform front-line inspections). 

According to FAA, the agency tries to accommodate inspectors' requests 
for technical training to the extent possible. However, an inspector's 
request for technical training may be denied because (1) the 
inspector's need for the course was not adequately justified based on 
the inspector's current position, (2) the inspector had already 
completed a similar course, or (3) insufficient funding was available. 
Officials also said that, infrequently, a technical course requested by 
an inspector may be cancelled due to low enrollment or because its 
content is outdated. According to the officials, when a requested 
course is cancelled, inspectors can request it the next time it is 
offered, and in the meantime they can choose a replacement course from 
the list of courses for their position. Inspectors' views on why they 
did not get training that they requested corresponded somewhat with the 
reasons that FAA cited. We estimate that about 54 percent of the 
inspectors believe that lack of funds hindered or greatly hindered 
their ability to get requested technical training.[Footnote 34] (See 
fig. 7.) Inspectors cited other reasons somewhat less frequently: about 
36 percent cited availability of courses, 28 percent cited impact on 
their workload, and 27 percent cited management's determination about 
the need for them to attend the course as hindering or greatly 
hindering their ability to receive the training they 
requested.[Footnote 35] Because it was impractical to investigate the 
reasons why thousands of training requests were granted or denied, we 
were not able to reconcile inspectors' views with FAA data. 

Figure 7: Inspectors' Views on Factors Hindering Their Ability to Take 
Requested Technical Training: 

[See PDF for image] 

Note: See tables 15 through 18 in appendix II for additional results. 

[End of figure] 

According to our analysis of inspectors' training records, in addition 
to receiving most of the technical training they request, most 
inspectors are making good progress in taking the technical training 
FAA considers essential for their jobs. Our analysis of FAA training data[Footnote 
36] indicates that over half of the inspectors have completed at least 
75 percent of their essential technical training courses for their 
positions.[Footnote 37] (See table 5.) In addition, more than three-
quarters have finished at least half of these essential technical 
courses. However, only 20 percent of air carrier avionics inspectors 
have completed 75 percent of their technical courses. Avionics 
inspectors have the most technical training requirements, due to the 
complexity of the aircraft components they inspect. 

Table 5: Percent of Inspectors Completing Essential Technical Courses: 

Type of inspector: Air carrier avionics; 
Percent of inspectors completing at least 75 percent of technical 
courses: 20; 
Percent of inspectors completing at least 50 percent of technical 
courses: 69. 

Type of inspector: Air carrier maintenance; 
Percent of inspectors completing at least 75 percent of technical 
courses: 46; 
Percent of inspectors completing at least 50 percent of technical 
courses: 82. 

Type of inspector: Air carrier operations; 
Percent of inspectors completing at least 75 percent of technical 
courses: 36; 
Percent of inspectors completing at least 50 percent of technical 
courses: 74. 

Type of inspector: Cabin safety; 
Percent of inspectors completing at least 75 percent of technical 
courses: 59; 
Percent of inspectors completing at least 50 percent of technical 
courses: 85. 

Type of inspector: General aviation avionics; 
Percent of inspectors completing at least 75 percent of technical 
courses: 52; 
Percent of inspectors completing at least 50 percent of technical 
courses: 78. 

Type of inspector: General aviation maintenance; 
Percent of inspectors completing at least 75 percent of technical 
courses: 53; 
Percent of inspectors completing at least 50 percent of technical 
courses: 84. 

Type of inspector: General aviation operations; 
Percent of inspectors completing at least 75 percent of technical 
courses: 69; 
Percent of inspectors completing at least 50 percent of technical 
courses: 88. 

Type of inspector: Aircraft certification[A]; 
Percent of inspectors completing at least 75 percent of technical 
courses: N/A; 
Percent of inspectors completing at least 50 percent of technical 
courses: N/A. 

Type of inspector: All inspectors; 
Percent of inspectors completing at least 75 percent of technical 
courses: 46; 
Percent of inspectors completing at least 50 percent of technical 
courses: 80. 

Source: GAO analysis of FAA data. 

Note: N/A = not applicable. 

[A] Aircraft certification inspectors have no essential technical 
courses according to the definition of technical training used in this 
report. Aircraft Certification considers all training related to the 
inspection process as technical training. 

[End of table]

There are several reasons that may explain why inspectors have not 
completed most or all of their essential training--technical and other. 
First, a significant portion of the inspector workforce is relatively 
new to the agency and would thus not be expected to have completed the 
essential training. In fact, FAA data show that 28 percent of the 
inspectors have been employed by FAA for less than 5 years. Second, 
inspectors change specialties, which can affect their training 
requirements. Third, FAA allows inspectors to substitute prior 
experience for some essential technical training courses, and such 
substitutions are not always reflected in inspector training records. 
Finally, because the lists of essential training courses have been 
developed only within the past few years, some inspectors may not have 
had time to complete new essential courses. FAA officials emphasized 
that, especially in Flight Standards, training is carried out over the 
course of an inspector's career rather than occurring primarily at the 
beginning of the career. According to Aircraft Certification officials, 
although their service's inspectors receive training over the course of 
their careers, they receive the majority of their training within the 
first year on the job. This early training emphasizes the skills needed 
to perform inspections. 

Overall, according to our analysis, inspectors have taken an average of 
3.4 technical training courses from fiscal years 2002 through 2004, or 
about one per year. (See table 6.) Avionics and maintenance inspectors 
have taken more technical training on average. Generally, these 
inspectors require more technical training than other specialties 
because they often inspect several different models of aircraft. 

Table 6: Average Number of Technical and Nontechnical Training Courses 
Taken, Fiscal Years 2002 through 2004: 

Type of inspector: Air carrier avionics; 
Technical courses: 4.8; 
Nontechnical courses: 9.6. 

Type of inspector: Air carrier maintenance; 
Technical courses: 3.9; 
Nontechnical courses: 9.6. 

Type of inspector: Air carrier operations; 
Technical courses: 2.6; 
Nontechnical courses: 9.7. 

Type of inspector: Cabin safety; 
Technical courses: 1.2; 
Nontechnical courses: 8.5. 

Type of inspector: General aviation avionics; 
Technical courses: 4.0; 
Nontechnical courses: 8.8. 

Type of inspector: General aviation maintenance; 
Technical courses: 4.0; 
Nontechnical courses: 9.9. 

Type of inspector: General aviation operations; 
Technical courses: 3.1; 
Nontechnical courses: 9.6. 

Type of inspector: Aircraft certification; 
Technical courses: 1.2; 
Nontechnical courses: 5.4. 

Type of inspector: All inspectors; 
Technical courses: 3.4; 
Nontechnical courses: 9.4. 

Source: GAO analysis of FAA data. 

[End of table]

Receiving Training in a Timely Manner: 

With the rapid development of aircraft and aircraft components, 
especially aircraft avionics, a training delivery mechanism that is 
responsive to these changes is critical. For the most part, FAA 
inspectors are dissatisfied with the timeliness of the technical 
training they receive. We estimate that only 20 percent of 
inspectors believe to a great or very great extent that they have 
received technical training in time to do their jobs.[Footnote 38] (See 
fig. 8.) No more than one-third of any type of inspector thought that 
technical training was timely to a great or very great extent, and none 
of the general aviation avionics inspectors who responded to our survey 
thought that this was so.[Footnote 39] Avionics are the most rapidly 
changing technological components of aircraft, which could account for 
this result. As discussed at the beginning of this report, FAA's 
records did not allow us to assess the extent to which inspectors 
received training before they conducted inspection activities related 
to that training. 

Figure 8: Inspectors' Views on the Extent to Which Technical Training 
Is Delivered in a Timely Manner: 

[See PDF for image] 

Note: See table 19 in appendix II for additional details. 

[End of figure] 

Similarly, we estimate that only about 23 percent of all inspectors 
indicated that they always or frequently received technical training on 
the equipment they were to inspect prior to scheduled inspection 
activities.[Footnote 40] (See fig. 9.) No more than 35 percent of 
inspectors in any specialty responded this way. In comments supplied 
with their surveys, many inspectors expressed the view that FAA is slow 
to react to changes in industry technology and slow to develop courses 
in response to the changes. 

Figure 9: Inspectors' Views on the Extent to Which They Received 
Technical Training Prior to Scheduled Oversight Activities: 

[See PDF for image] 

Note: Approximately 4 percent of inspectors responded that they had no 
basis to judge or did not know. See table 20 in appendix II for 
additional details. 

[End of figure] 

FAA has recognized the need to provide training on a timelier basis and 
has taken some actions that have yet to be fully implemented. One of 
the goals of Flight Standards is to establish a way to ensure that 
training is current and well designed, can be tailored to the needs of 
the individual employees, and is administered in a fast and flexible 
way in response to changing needs. Flight Standards plans to improve 
training delivery by taking advantage of new delivery mechanisms, 
increasing utilization of vendors where appropriate, and streamlining 
training programming and scheduling to reduce the lag time between the 
identification of training needs and the delivery of training. In 
addition, Flight Standards recently instituted a process to 
continuously monitor courses and to update their content when changes 
in FAA policy or aviation technology warrant doing so. Ensuring that 
course content is up to date and that courses are available when needed 
is an important aspect of delivering timely training. In fiscal year 
2005,[Footnote 41] Flight Standards developed 5 new courses, revised 16 
existing courses, and completed 13 course evaluations.[Footnote 42] 
Similarly, Aircraft Certification officials indicated that they have 
been evolving toward a more integrated approach to training delivery, 
mixing classroom training with Web-based technologies, on-the-job 
training, and additional job aids, in part to provide more timely 
training. These officials also noted that Aircraft Certification 
has a long history of providing just-in-time training when new work 
processes or job-related information must be disseminated to 
inspectors quickly. 

Although FAA Uses Several Approaches to Evaluate Technical Training 
Provided, Assessing Impact on Performance Remains to Be Done: 

For the most part, FAA has followed--or has begun to implement--
effective management practices in evaluating its efforts to provide 
technical training to inspectors and ensuring that this training leads 
to improved performance and results. (See table 7.) For example, it 
continuously assesses technical training through participant end-of-
course evaluations and surveys of inspectors and supervisors that focus 
on the application of skills and knowledge to the job. While FAA's 
evaluation efforts provide information about these areas, these 
assessments have not measured the impact of training on FAA's mission 
goals, such as reducing accidents. Isolating improvements in mission 
performance that are a result of training programs is difficult for any 
agency. 

Table 7: Extent That FAA Followed Effective Management Practices in 
Evaluating Its Training Program: 

Effective management practice: Systematically plans for and evaluates 
the effectiveness of training and development efforts; 
Extent followed: Mostly. 

Effective management practice: Uses the appropriate analytical 
approaches to assess its training and development programs; 
Extent followed: Mostly. 

Effective management practice: Uses appropriate performance data 
(including qualitative and quantitative measures) to assess the results 
achieved through training and development efforts; 
Extent followed: Partially. 

Effective management practice: Incorporates evaluation feedback into 
the planning, design, and implementation of its training and 
development efforts; 
Extent followed: Fully. 

Effective management practice: Incorporates different perspectives 
(including those of line managers and staff, customers, and experts in 
areas such as financial, information, and human capital management) in 
assessing the impact of training on performance; 
Extent followed: Mostly. 

Effective management practice: Assesses the benefits achieved through 
training and development programs; 
Extent followed: Partially. 

Source: GAO. 

[End of table]

FAA Mostly Follows Several Effective Practices for Evaluating Technical 
Training: 

FAA has taken several actions to evaluate the effectiveness of its 
technical training efforts. Collectively, the actions generally cover 
the effective management practices cited in table 7 by continuously and 
systematically evaluating the technical courses FAA provides for 
inspectors. In performing these evaluations, FAA has focused primarily 
on obtaining inspectors' and, to some extent, their supervisors' views 
on individual courses. FAA requires that participant evaluations be 
distributed after each training course for inspectors. The evaluations 
ask participants to rate the extent to which the course and course 
material (e.g., workbooks, slides, labs, and tests) met objectives as 
well as the extent to which the instructor provided assistance. 
According to FAA, participants return the evaluations 95 percent of the 
time. FAA also sends surveys to inspectors and their supervisors 90 to 
180 days after course completion to obtain their perspectives on 
whether the course was needed and the extent to which the inspector is 
applying new skills and knowledge to the job. FAA reports that since 
the inception of post-course surveys, the return rate from inspectors 
and supervisors has ranged from 49 to 50 percent. The post-course 
survey results from the six most highly attended technical courses in 
the last 2 years reflected generally positive responses. These findings 
suggest that survey respondents generally think that the individual 
technical courses they received helped them in their jobs. Results from 
both the participant course evaluations and post-course surveys are 
automated and are available to training officials. In addition, 
Flight Standards training officials said that they assess all 
complaints concerning a course and discuss the issues identified with 
the Flight Standards office that sponsors the course. 

According to FAA, it is the responsibility of the mentor for each 
training course to use the information from the participant evaluations 
and post-course surveys as well as other tools to determine if courses 
are meeting their objectives and enhancing inspectors' ability to do 
their jobs. In February 2005, Flight Standards established a policy 
that its course mentors evaluate each course for which they are 
responsible at least every 3 years using a standardized approach. 
According to the policy, course mentors should review the results of 
participant evaluations and post-course surveys as well as personally 
sit in on the course to determine if it is still current and is 
meeting objectives. Flight Standards has a performance plan initiative 
to track the completion of planned course evaluations. Flight Standards 
began training mentors on these and other mentor responsibilities and 
procedures in April 2005, and some course mentors have already begun 
thorough evaluations of their courses. Flight Standards officials said 
that prior to this date, participant evaluations and post-course 
surveys were used by its Quality Assurance Branch in its annual course 
reviews and were routinely reviewed by FAA Academy course managers and 
their supervisors to update or improve courses. However, because Flight 
Standards had not assigned specific individuals to be responsible for a 
particular course, some requests for updates were not tracked. 
According to Flight Standards officials, with the implementation of the 
course mentor program, each course will now have a point of contact for 
all course improvements and updates. 

Aircraft Certification is implementing a new approach for evaluating 
courses that officials believe will provide course mentors and other 
training officials more comprehensive information on technical courses 
sponsored by each office. The approach is based on the work of Dr. 
Robert O. Brinkerhoff, in particular his Success Case Method and High 
Impact Learning approach.[Footnote 43] This approach helps to increase 
and demonstrate organizational results from learning. With the Success 
Case Method, post-course surveys are used to gauge the extent of 
reported application of learning and are then validated through 
personal interviews with selected course participants. Using such 
interviews, Aircraft Certification seeks to determine whether and how 
much training was actually transferred to the job and if there was an 
impact on the organization as a result of the training. The High Impact 
Learning methodology allows for up-front evaluation prior to 
development or revision of courses to ensure that the objectives of the 
proposed training will lead to organizational impact through "impact 
mapping" of course objectives to organizational goals. Results of 
Success Case Method learning evaluations will be available to the 
course mentor, course managers at FAA's Training Academy, and program 
officials at FAA headquarters. Aircraft Certification began 
implementing the evaluation tool in spring 2005; thus far, it has 
prototyped the tool with one course and plans to apply it to all its 
technical courses within 2 years. 

FAA has also surveyed employees for their views on training in general, 
and one of these surveys will lead to revisions to the overall 
inspector curriculum, according to FAA. Every 2 years, FAA surveys all 
of its employees about many aspects of their employment, including the 
training they receive. This employee attitude survey asked employees 
about the extent to which they received the training they needed to 
effectively perform their jobs and whether they have been able 
to apply that training. However, the survey does not ask employees to 
differentiate between the types of training they receive, such as that 
relating to inspection processes or technical skills. In addition, 
although FAA isolates responses according to employee's work location-
-such as headquarters and Flight Standards and Air Certification field 
offices--it does not ask respondents their position, so inspectors' 
responses cannot be identified. As a result, although the survey can be 
useful for FAA's workforce as a whole, it is not as useful for 
isolating safety inspectors' attitudes about their technical training. 

In order to obtain information on inspector training in particular, in 
August 2004, Flight Standards conducted a separate survey of 51 field 
inspectors and 8 field and 3 headquarters inspection program managers 
that revealed inspector dissatisfaction with several aspects of 
training. Because of its limited nature, Flight Standards recognizes 
that the survey does not necessarily represent the views of the entire 
inspector workforce. Among the 51 field inspectors--the only large 
group surveyed--21 percent agreed or strongly agreed that they received 
the training they needed, 10 percent that training is current and 
technically up to date, and less than 20 percent that training supports 
current or future job requirements 
in Flight Standards.[Footnote 44] Flight Standards officials expressed 
concern about inspectors' negative views toward the training they 
receive. Officials said that as part of their plans for a curriculum 
for inspectors, they will identify and implement measures necessary to 
monitor course satisfaction and content currency, and the data they 
gather from this monitoring will provide the basis for continuous 
course improvement. According to Aircraft Certification officials, 
their office has not undertaken any similar surveys of field 
inspectors. 

FAA officials said that they also encourage inspectors to submit 
suggestions for revising existing courses or adding new courses to 
provide training in the technical skills not covered in the current 
curriculum. According to Flight Standards training program officials, 
inspector suggestions for new courses are placed in a pool of potential 
new courses, which are reviewed by program staff on the basis of need 
and the availability of funds. Suggestions for course revisions are now 
reviewed by course mentors as part of the course evaluation 
process.[Footnote 45]

Although FAA has a process for inspectors to make recommendations 
regarding technical courses, our survey indicated that many inspectors 
are either not aware of or do not take advantage of this process. 
According to our survey, we estimate that 55 percent of the inspectors 
believe that, to some or no extent, they have had an opportunity to 
recommend new courses for their position.[Footnote 46] In addition, 49 
percent thought that, to some or no extent, they had an opportunity to 
recommend new content in existing courses for their position.[Footnote 47]

FAA Lacks Comprehensive Data on How Technical Training Contributes to 
Improved Performance and Results: 

The analytical approach FAA employed for evaluating technical training 
programs emphasized individual course evaluations and employee surveys 
that collect useful, but limited, information on the effectiveness of 
technical training courses. According to experts on training in 
government, agencies should adopt a balanced, multilevel approach to 
evaluating their training and development efforts. One commonly 
accepted model is the Kirkpatrick model, which consists of five levels 
of assessment.[Footnote 48] The first level measures the training 
participants' reaction to, and satisfaction with, the training program 
or planned actions to use new or enhanced competencies. The second 
level measures the extent to which learning has occurred because of the 
training effort. The third level measures the application of this 
learning to the work environment through changes in behavior that 
trainees exhibit on the job because of the training or development 
program. The fourth level measures the impact of the training program 
on the agency's program or organizational results. Finally, the fifth 
level--often referred to as return on investment--compares the benefits 
(quantified in dollars) with the costs of the training and development 
program. 

As discussed earlier, the course evaluations and surveys FAA uses to 
evaluate its technical and other training programs for inspectors 
cover, to some extent, the first three levels of assessment. Aircraft 
Certification has taken the first steps in evaluating the impact of 
training on organizational results (the fourth level) by linking course 
objectives to organizational goals, and Flight Standards is in the 
initial stages of implementing a process to assess the return on 
investment from the courses in its training program. Experts 
acknowledge that isolating performance improvements resulting from 
training programs and the cost-effectiveness of these programs is 
difficult for any organization. Federal agencies, such as FAA, have to 
consider the feasibility and cost-effectiveness of conducting these in-
depth evaluations, along with budgetary and staffing circumstances that 
may limit the agencies' ability to complete such evaluations. The 
challenge of performing evaluations of the impact and cost-
effectiveness of its training efforts is great for FAA. Along with 
undertaking these evaluations, FAA must also determine how its ongoing 
shift to a system safety approach to inspections is affecting its 
organizational goals of reducing accidents and increasing the overall 
safety of flying. 

Industry Provides Much of FAA's Technical Training; Additional 
Safeguards Needed to Prevent Real or Apparent Conflicts of 
Interest: 

Over the past 3 years, the aviation industry has provided about 40 
percent of the technical training for FAA's safety inspectors. To a 
limited degree, inspectors have received training from the aviation 
industry in exchange for in-kind services or at no cost to FAA. 
Although FAA has taken steps to address concerns over possible real or 
apparent conflicts of interest resulting from receiving this training, 
it has not consistently applied these policies. 

FAA Contracts with the Aviation Industry for Much of the Technical 
Training Provided to Inspectors: 

FAA provides technical training to its inspectors either through the 
FAA Academy (with courses taught in Oklahoma City or at FAA regional 
locations) or through contracts with outside training providers, 
including some in the aviation industry that FAA regulates. Under FAA's 
gift authority, some of this industry training is provided at no cost 
to the agency. Technical training provided by the aviation industry 
includes: 

* pilot training;

* aircraft maintenance training;

* training covering inspection technologies and procedures; and: 

* training on aircraft systems, structures, and components. 

FAA has increasingly relied on industry to provide technical training 
to its inspectors over the past 3 fiscal years. In fiscal year 2004 
(latest data available), industry delivered nearly half of FAA's 
technical training. (See fig. 10.) Industry-provided training occurs 
most frequently for air carrier and general aviation operations 
inspectors because it is often more economical to have flight training 
provided by an outside vendor than for FAA to maintain or lease its own 
aircraft for this purpose. 

Figure 10: Percent of Technical Training Provided by Industry as 
Reported by FAA, Fiscal Years 2002 through 2004: 

[See PDF for image] 

Note: Numbers reported as a percent of total FAA- and industry-provided 
training for each type of inspector. See table 21 in appendix II for 
additional details. Because aircraft certification inspectors receive 
limited technical training, the percent of this training provided by 
industry can vary widely from year to year. 

[End of figure] 

Under Limited Circumstances, FAA Receives Training in Exchange for In-
Kind Services: 

In addition to paying industry to provide technical training for 
inspectors, FAA employs two arrangements by which inspectors obtain 
training from the aviation industry in exchange for in-kind services. 
In return for this training, FAA 
delegates certain regulatory authority to qualified employees of the 
entity being overseen (called quid pro quo arrangements).[Footnote 49] 
Both programs apply to Flight Standards operations inspectors called 
aircrew program managers and training center program managers. FAA's 
Aircraft Certification service does not have any equivalent 
arrangements by which inspectors receive training in exchange for in-
kind services. 

Under the aircrew designated examiner program, FAA delegates certain 
pilot certification authority and responsibility, under the supervision 
of FAA inspectors, to pilots of major passenger-carrying airlines, 
cargo-only carriers, and regional airlines.[Footnote 50] In exchange, 
FAA inspectors receive training in airline-specific programs and 
procedures, conducted in airline-specific aircraft or simulators, at no 
cost to the agency. Airlines benefit from the increased flexibility of 
being able to certify their own pilots and not having to arrange and 
schedule certification by an FAA inspector. FAA also benefits from this 
flexibility because delegating the certification activities increases 
the capacity and efficiency of its oversight and management activities. 
In addition, because the training received is airline-specific, it 
further enhances inspectors' knowledge of the aspects of the airlines' 
operations that they are responsible for overseeing. The aircrew 
designated examiner program originated in 1982 as an agreement between 
a single airline and FAA, stemming from FAA's inability to meet 
industry's increasing demand for certification specialists. Each 
agreement between FAA and an airline is governed by a memorandum of 
understanding that outlines the reasons for establishing the specific 
aircrew designated examiner agreement, lists the aircraft types 
involved, and contains an overview of how the program requirements will 
be met by both parties. This arrangement was approved by the FAA ethics 
officer and was reviewed by the Department of Transportation Inspector 
General. 

FAA does not keep central records of training received through these 
arrangements, and it was not practical for us to gather the data from 
over 100 FAA field locations. Therefore, we asked FAA's nine regional 
office officials to contact their respective flight standards district 
offices and certificate management offices to compile these data. Some 
of the data provided to us were incomplete, sometimes lacking inspector 
names or specific dates. Because of the many remote locations that 
gathered this information for us, it was not practical for us to 
independently verify the completeness or accuracy of these data. 

Overall, FAA regional office-supplied data indicate that FAA has 
memoranda of understanding with 61 airlines and 42 training centers 
under the two programs, encompassing about 300 fleets in total. (See 
table 8.) 

Table 8: Number of Memoranda of Understanding and Fleets Enrolled as 
Part of the Aircrew Designated Examiner Program and Agreements with 
Training Centers: 

Aircrew designated examiner program; 
Number of memoranda of understanding: 61; 
Number of fleets: 141. 

Training centers; 
Number of memoranda of understanding: 42; 
Number of fleets: 162. 

Total; 
Number of memoranda of understanding: 103; 
Number of fleets: 303. 

Source: GAO analysis of FAA-supplied data. 

Note: This information may not be complete and was not independently 
verified. (See text.) 

[End of table]

For the aircrew designated examiner program, FAA regional officials 
indicated that the agency has memoranda of understanding with 61 of the 
134 airlines, and these agreements cover 141 aircraft fleets. An 
average of about 175 training activities took place per year from 
fiscal years 2002 through 2004, representing nearly 30 percent of all 
technical training received by air carrier operations inspectors. (See 
table 9.) 

Table 9: Numbers of Inspectors Trained under Aircrew Designated 
Examiner Program and Agreements with Training Centers, Fiscal Years 
2002 through 2004: 

Aircrew designated examiners: Number of inspectors trained; 
2002: 111; 
2003: 114; 
2004: 155; 
Average per year: 127. 

Aircrew designated examiners: Number of training activities; 
2002: 155; 
2003: 170; 
2004: 200; 
Average per year: 175. 

Training centers: Number of inspectors trained; 
2002: 43; 
2003: 45; 
2004: 59; 
Average per year: 49. 

Training centers: Number of training activities; 
2002: 51; 
2003: 52; 
2004: 73; 
Average per year: 59. 

Source: GAO analysis of FAA-supplied data. 

Note: This information may not be complete and was not independently 
verified. (See text.) 

[End of table]

Similar to the aircrew designated examiner program, FAA's Flight 
Standards office also employs memoranda of understanding with private, 
FAA-certified training centers that provide training, testing, and 
pilot certification services to commercial and private pilots 
throughout the United States. Under these agreements, certain training 
center employees may be certified by FAA to serve as designees provided 
they meet FAA requirements. On behalf of FAA, these training center 
designees certify commercial and private pilots as qualified to operate 
an aircraft. FAA assigns one or more inspectors to each training 
center. The inspector is responsible for FAA regulatory management and 
oversight of the training center through periodic inspections of 
training center equipment, training courses, course materials, and 
instructors. As part of the agreement granting designee authority to 
the employee of the training center, the FAA inspector receives 
aircraft-specific training from the training center at no cost to the 
agency. This training benefits FAA by increasing inspector knowledge 
and familiarization with the actual equipment being inspected, thereby 
providing more effective oversight of the training center. In addition, 
FAA does not have to use inspectors to certify the individual pilots. 
The training center benefits from having its own employees 
authorized to certify the pilots attending training at the center, 
rather than having to schedule and wait for FAA inspectors to 
accomplish the certifications. 

According to FAA regional officials, FAA has agreements with 42 of the 
approximately 50 training centers across the United States that include 
162 aircraft simulator fleets. From the data FAA regional offices 
supplied, we determined that an average of 59 instances of training per 
year occurred under these arrangements from fiscal years 2002 through 
2004. (See tables 8 and 9.) 

In total, the technical training provided by industry sources in 
exchange for in-kind services through these two sets of arrangements 
accounts for approximately 17 percent of all industry-provided training 
and approximately 7 percent of all technical training provided to 
inspectors. 

The memoranda of understanding described above were formalized, in 
part, to eliminate actual conflicts of interest or the appearance of 
such conflicts. The U.S. Government Standards of Ethical Conduct 
generally preclude federal employees from accepting gifts, including 
training, from those whom they regulate. One exception is anything paid 
for or secured by the government under contract. FAA considers the 
granting of 
check airmen authority to designated examiners to be payment in-kind 
for the training received by FAA inspectors. The purpose of the 
memoranda of understanding for the two arrangements discussed above is 
to outline the nature of the payment in-kind that FAA will provide to 
eliminate the appearance that FAA is receiving a service for free. The 
memoranda of understanding address the conflict-of-interest issue by 
explicitly outlining the duties and responsibilities of both FAA and 
the operator employees who are party to the agreement, and they also 
outline the specific nature of the in-kind exchange. 

For the aircrew designated examiner program, a sample memorandum of 
understanding was written into FAA guidance in 1989. In 1997, a 
Department of Transportation Office of the Inspector General report 
expressed concerns that the aircrew designated examiner program might 
entail a conflict of interest by precluding FAA from enforcing its 
safety regulations.[Footnote 51] As a result, FAA altered the memoranda 
of understanding by eliminating any language that could be construed as 
limiting FAA's oversight authority and by specifically adding language 
to the contrary. 

Unlike the memoranda of understanding for the aircrew designated 
examiner program, the memoranda of understanding between FAA and 
training centers do not contain a provision stating that FAA will take 
enforcement action against any individual who violates any regulation. 
When we brought this to their attention, FAA officials indicated that 
the absence of this enforcement language in the training center 
memoranda of understanding was most likely the result of a simple 
oversight. As a result of our inquiry, FAA officials told us that they 
are revising their guidance to incorporate this enforcement language 
for future arrangements under a memorandum of understanding. Although 
this action will address any concerns about future arrangements, it 
does not make enforcement authority explicit under existing 
arrangements. 

Some Inspectors Receive Free Training without Getting Approval from FAA 
Legal Counsel: 

Some safety inspectors have received training opportunities from 
aircraft manufacturers or operators that they regulate at no cost to 
FAA and without providing an in-kind service in exchange. FAA requires 
that any such free training opportunities be reviewed by FAA legal 
counsel, at the regional or headquarters level, to determine the 
propriety of accepting the training. However, some Flight Standards 
inspectors have received free training for which FAA gave them training 
credit without prior approval from legal counsel. According to Aircraft 
Certification officials, the service's inspectors do not receive 
training credit 
for free training offered by aircraft manufacturers, although both 
Aircraft Certification and Flight Standards inspectors often audit 
classes, at no charge to FAA, without receiving credit. 

As mentioned above, U.S. Government Standards of Ethical Conduct 
generally preclude federal employees from accepting gifts, including 
training, from those they regulate. FAA generally does not accept 
offers of gifts unless there is some recognized need or acceptance 
will result in cost savings or other benefits in carrying out its work. 
FAA is allowed, however, to receive free training from the aviation 
industry in limited circumstances as a gift under the FAA 
Administrator's gift acceptance authority. Under this authority, the 
FAA Administrator can accept any gift of services in carrying out 
aviation duties and powers. FAA's Chief Counsel concluded in 1988 that 
FAA may accept free training if the session is necessary for the 
employee to perform his or her responsibilities with respect to the 
provider's projects and the information cannot be obtained from 
another source. Before inspectors can receive free training, they must 
obtain approval from FAA legal counsel, either at the regional or 
national level. 

Because FAA does not keep central records of free training received, 
and it was not practical for us to gather these data ourselves, we 
asked FAA's nine regional 
offices to request and compile this information for us. In response, 
two of the nine regions indicated that they accept free training on a 
limited basis. One region cited 57 instances over the past 4 years in 
which training provided free of charge was accepted and credited to 
inspectors' personnel records. We were able to independently verify only 
12 of these instances because some of the data lacked specific dates, 
some lacked inspector names, and some of the training was presented as 
completed but with no specific information at all. For the second 
region, although it could not identify specific records of such 
training, an official indicated that perhaps five instances of free 
training were accepted over the past 3 years. 

FAA's policy regarding acceptance of free training is not well known or 
uniformly applied by its regional offices. Though some regions 
indicated that the acceptance of free training of any type is not 
allowed under any circumstances, other regions were unsure how the 
policy for acceptance of this type of training is applied. Regarding 
the two regions that told us they had accepted free training, an 
official from the first region indicated that sometimes the regional 
legal counsel's office was asked to comment on the propriety of the 
training and sometimes not. In cases where legal counsel determined the 
training was improper, according to the official, the training was not 
accepted. An official from the second region indicated that the region 
took advantage of free training opportunities when normal FAA channels 
for obtaining the same training were slow or difficult; accepting this 
training therefore became necessary if inspectors were to receive it at 
all. As in the first region, this official indicated that legal counsel 
was sometimes contacted for an opinion on the propriety of accepting a 
specific instance of training and sometimes not. 
Because these opportunities generally arise at the local office level, 
whether such an offer is reviewed by legal counsel depends on the 
office manager, the manager's understanding of the FAA policy, and a 
judgment about whether a specific training opportunity raises any 
concern that should be reviewed by legal counsel. 

Both the government standards of ethical conduct and FAA policy address 
the propriety of accepting gifts and free training, but FAA has not 
clearly communicated this policy and the processes for accepting free 
training to its regional offices. None of the regions we contacted were 
able to cite any specific, relevant policy guidance governing this 
issue. Several of the FAA officials we interviewed cited "verbal 
policy" from FAA headquarters and a general, long-standing 
understanding that acceptance of such training is not allowed. Other 
regional officials indicated that although acceptance of free training 
is generally to be avoided due to conflict-of-interest considerations, 
they would treat each occurrence separately and likely consult with the 
regional legal counsel for an opinion on the propriety of accepting 
free training. In fact, one region supplied us with an opinion from the 
regional legal counsel, stating that as a general rule, FAA has long 
held that the agency must pay for its own training. The document goes 
on to say that it is permissible to accept an opportunity to audit 
such a class but warns that an inspector is not to consider it formal 
training. Many regional and headquarters officials we spoke with 
indicated that it is common practice for FAA inspectors to audit 
training in this manner, for informational purposes and not for formal 
FAA training credit. On the basis of our survey, we estimate that 
about 37 percent of inspectors have, in the past 2 years, attended or 
inspected a technical training course offered by an airline or 
manufacturer for which they did not receive credit.[Footnote 52]

FAA headquarters officials agreed that the FAA order governing the 
acceptance of gifts and the government's standards of ethical conduct 
address the broad issue of gift acceptance. However, our work indicates 
that these policies may not be clearly and uniformly understood by the 
FAA regional offices. 

Conclusions: 

In providing training to its inspectors, FAA follows many of the 
effective management practices we have outlined in our guide for 
assessing training and development efforts in the federal government. 
In doing so, FAA has put in place thoughtful, structured processes for 
linking training to strategic goals, identifying and developing courses 
to improve individual and agency performance, actively encouraging and 
supporting technical training, ensuring that inspectors have 
opportunities to receive this technical training, and obtaining 
inspectors' and their supervisors' views on the extent to which 
technical training affects job performance. FAA also recognizes the 
need for improvements, including (1) systematically assessing 
inspectors' needs for technical and other training, (2) better timing 
of technical training so that inspectors receive it when it is needed 
to perform their jobs, and (3) better linking the training provided to 
achieving agency goals of improving aviation safety. FAA has begun to 
act in these areas, and we believe that, if effectively implemented, 
the actions should improve the delivery of training and ultimately help 
lead to fewer aviation accidents, fatalities, and injuries. Therefore, 
it is important for FAA to follow through with its efforts. 

FAA's plans for inspector training are premised on the assumption that 
inspectors currently have enough technical proficiency overall and that 
future training efforts should be geared toward closing gaps in 
proficiencies that the agency has determined inspectors require for 
system safety inspections, such as risk assessment and data analysis. 
However, FAA has not convinced inspectors of the merits of its 
approach, nor has it systematically identified inspectors' training 
needs for conducting system safety inspections. Inspectors instead 
believe that they are not receiving all the training they need to stay 
current with rapidly changing aviation technologies, and many expressed 
strong concerns on this issue. Therefore, it is essential that, as FAA 
continues to implement a system safety inspection process, it work 
closely with inspectors to demonstrate the benefits of the system 
safety approach, explain how inspectors' technical and other training 
needs will be met, and show how aviation safety will benefit from that 
approach. 

Finally, FAA has recognized that the manufacturers and operators of 
aircraft and aircraft systems can be the best source of much of the 
technical training for its inspectors. While FAA pays for most of the 
training its inspectors receive from aviation sources, some of this 
training is provided at no cost or in exchange for in-kind services. 
However, because FAA keeps only scattered records on the extent to 
which such training occurs, we cannot tell how widespread it is or 
whether FAA legal counsel reviewed each training activity for 
propriety. FAA has not clearly communicated its policy on the 
acceptance of training without charge, and, as a result, some FAA 
regions have accepted training that was not approved and that could 
pose conflict-of-interest issues, or the appearance of such a conflict, 
for the agency. 

Recommendations for Executive Action: 

We are making five recommendations, three involving technical training 
and two involving industry-provided training. Regarding technical 
training, we recommend that the Secretary of Transportation direct the 
FAA Administrator to complete the following two actions, which are 
either planned or in the early stages of development or implementation: 

* To ensure that inspector technical training needs are identified and 
met in a timely manner, the Administrator should systematically assess 
inspectors' technical training needs, increase inspector involvement in 
the decision-making process for assessing the need for courses, 
including the need for more training for maintenance and avionics 
inspectors to familiarize them with recent changes in aviation 
technology, and ensure the technical curriculum meets those needs. The 
Administrator should also take the actions needed, including developing 
guidelines for inspectors, supervisors, and training managers, to 
ensure that technical training is requested and delivered closer to the 
time it is needed to help inspectors perform their jobs. 

* With a view toward maximizing the contributions of training to 
furthering FAA's safety mission, FAA's training organizations should 
determine the feasibility of developing measures of the impact of 
inspector training, including technical training, on achieving 
organizational goals. 

Third, to gain better acceptance from the inspector workforce for 
changes being made and planned for the inspector training curriculum, 
we recommend that the Secretary of Transportation direct the FAA 
Administrator to increase the focus of its training efforts on how 
system safety/risk management will improve inspections and aviation 
safety. 

Fourth, we recommend that the Secretary of Transportation direct the 
FAA Administrator to ensure that all existing and future memoranda of 
understanding pertaining to training received in exchange for in-kind 
services contain language stating that the agreement does not preclude 
FAA from fulfilling its oversight and enforcement role. 

Finally, to preclude situations where the provision of free training by 
the aviation industry may create a conflict of interest or result in 
the appearance of such a conflict, we recommend that the Secretary of 
Transportation direct the FAA Administrator to review its policies on 
the acceptance of free training from the aviation industry to 
ensure they are understood by inspectors, supervisors, managers, and 
regional counsel; implement a process for monitoring field office 
compliance with these policies; and follow up on any noncompliance. 

Agency Comments and Our Evaluation: 

We provided a draft of this report to the Department of Transportation 
and received comments from FAA officials, including its Deputy 
Associate Administrator for Aviation Safety. FAA generally agreed with 
the report's findings and agreed to consider our recommendations. The 
FAA representatives appreciated the report's positive recognition of 
the agency's efforts to provide safety inspectors with the technical 
training 
they need to effectively accomplish their mission. 

FAA officials suggested that we modify how we grouped our presentation 
of findings from our survey of inspectors. Specifically, they 
maintained that our analysis of results should have counted "moderate 
extent," along with "very great extent" and "great extent," as a 
positive response because inspectors would have viewed a "moderate 
extent" response positively. Thus, in FAA's view, combining the 
"moderate extent" responses with "great extent" and "very great extent" 
responses would more accurately reflect the respondents' intent. The 
extent scale that we used in our survey represents a unidirectional 
scale. As such, it is possible to interpret any point along that scale, 
other than "no extent," as positive, depending upon how a question is 
worded. Generally, we presented information in the report with both 
"very great extent" and "great extent" combined to represent the 
most clearly positive responses. The combination of "very great extent" 
and "great extent" responses was intended to give FAA a clearer 
understanding of inspectors' perceptions and guidance as to where the 
application of its efforts is likely to have the greatest effect. 
Although this approach served our purposes best, there are naturally 
multiple ways in which one might combine response categories. As such, 
we have provided detailed results showing responses for each question 
by each response category in appendix II and the e-supplement to this 
report. 

The officials also noted that we defined technical training for the 
purpose of this report differently from what FAA considers to be 
technical training for inspectors. While these officials appreciated 
that our report recognizes the differences between the two definitions, 
they said that the different definitions account for some disparity 
between the percentage of training FAA considers achieved and the 
percentage shown in the draft report. For example, the 
department considers the use of computer automation tools a critical 
element of an inspector's ability to provide effective and efficient 
safety oversight. Because the Vision 100-Century of Aviation 
Reauthorization Act required that we focus on training in the latest 
aviation technologies--which we termed technical training--we did not 
include courses such as the use of computer tools in our assessment. 
Nevertheless, our draft and final reports acknowledge the importance of 
other training provided to inspectors, particularly training in skills 
relating to system safety and risk assessment. 

The department also provided several clarifying comments and technical 
corrections, which we have incorporated in this report as appropriate. 

We are sending copies of this report to congressional committees and 
subcommittees with responsibilities for transportation safety issues; 
the Secretary of Transportation; the Administrator, FAA; and the 
Director, Office of Management and Budget. We will also make copies 
available to others upon request. This report will be available at no 
charge on the GAO Web site at [Hyperlink, http://www.gao.gov]. 

If you have any questions about this report, please contact me at (202) 
512-2834 or [Hyperlink, dillinghamg@gao.gov]. Contact points for our 
Offices of Congressional Relations and Public Affairs may be found on 
the last page of this report. Staff who made key contributions to this 
report are listed in appendix IV. 

Signed by: 

Gerald L. Dillingham, Ph.D.: 
Director, Physical Infrastructure Issues: 

Appendixes: 

[End of section]

Appendix I: Inspector-Reported Travel for Technical Training: 

The Vision 100-Century of Aviation Reauthorization Act required that we 
report on the amount of travel required of Federal Aviation 
Administration (FAA) inspectors in receiving training. To attempt to 
accomplish this requirement, we asked FAA to provide us with 
information on the number of times each inspector was in travel status 
for training, the location of the training, and the duration of the 
trips. FAA was not able to readily provide this information, citing 
limitations of its databases that track inspector travel. FAA told us 
that it is able to access and review individual inspector travel 
records, but its information systems are not set up to compile and 
analyze inspectors' training-related travel data as a whole. In 
part, this information is not readily available because the data are 
stored in multiple databases, and the information is recorded 
differently, depending on how the training is arranged and budgeted. 
Further, FAA officials told us that (1) these data would be extremely 
time consuming to collect and compile and (2) a manual search for 
location of training would be necessary in some cases. On the basis of 
our inquiries, we concluded that it was not unreasonable for FAA to 
lack an easily accessible, comprehensive set of travel data. 

Thus, to obtain information on inspectors' travel for training, we used 
our survey of aviation safety inspectors (conducted in late 2004). We 
asked inspectors to tell us how many weeks they were on travel status 
for technical training in the past 12 months. On the basis of our 
survey, we estimate that inspectors spend an average of about 3.1 weeks 
per year on travel status for technical training.[Footnote 53] (See 
fig. 11.) We found that an estimated 54 percent of inspectors were on 
travel status for 1 to 3 weeks, and 27 percent spent 4 weeks or more on 
travel for technical training.[Footnote 54] About 19 percent of 
inspectors spent no time on travel status for technical training in the 
past year.[Footnote 55]

Figure 11: Number of Weeks Inspectors Reported Spending on Travel for 
Technical Training within the Past 12 Months: 

[See PDF for image] 

[End of figure] 

On average, Flight Standards inspectors spent more time on travel for 
technical training than did Aircraft Certification inspectors, 
according to our analysis of survey responses. Flight Standards 
inspectors spent an average of approximately 3.2 weeks over the past 
year, and Aircraft Certification inspectors were on travel for training 
for approximately 2 weeks.[Footnote 56]

The Vision 100-Century of Aviation Reauthorization Act contained a 
"Sense of the House" that stated that, if possible, FAA inspectors 
should be allowed to take training at the location most convenient for 
the inspector. As part of our survey, we asked the inspectors the 
extent to which there are opportunities for FAA to offer or contract 
for technical training closer to the inspectors' work location. 
According to our survey, we estimate that approximately 13 percent of 
inspectors indicated that such opportunities existed to a great or very 
great extent.[Footnote 57] (See fig. 12.) However, more than 
one-third of inspectors indicated they did not know if such training 
opportunities existed.[Footnote 58] We did not attempt to verify 
inspectors' views on opportunities for nearby technical training. 

Figure 12: Inspectors' Views on the Extent to Which Technical Training 
Opportunities Exist Closer to Their Work Location: 

[See PDF for image] 

Note: See table 22 in appendix II for additional details. 

[End of figure] 

[End of section]

Appendix II: Additional Details on Training Data and Selected Inspector 
Survey Responses: 

Tables 10 through 22 provide additional inspector training data as well 
as additional detail on inspectors' views on FAA technical training, as 
discussed earlier in this report. The survey results, exclusive of 
inspector specialty breakouts, can be found at [Hyperlink, 
http://www.gao.gov/cgi-bin/getrpt?GAO-05-704SP]. 

The stratified random sample of FAA inspectors was designed to have an 
overall margin of error of plus or minus 4 percentage points at a 95 
percent level of confidence. Due to nonresponse, the actual overall 
margin of error is plus or minus 4.6 percentage points. The individual 
types of FAA inspectors represent strata in the sample. The precision 
of results within each stratum is less than the overall precision of 
population-level estimates. Estimates for each individual type of 
safety inspector (stratum level) have margins of error greater than 4.6 
percentage points. Estimates are more precise for strata that have a 
larger number of responding inspectors than for strata with fewer 
responding inspectors. 

For tables in this appendix that provide results of our survey of 
safety inspectors, we present both the estimated percentage of those 
responding in a certain way to each question and the confidence 
interval associated with that estimate. For example, in table 13, we 
report on the percentage of general aviation inspectors who responded 
to a great extent that they have enough technical knowledge to do their 
job as 39 (31-46). This means that we estimate that 39 percent of all 
general aviation inspectors believe this to a great extent. Had we 
surveyed the population of all general aviation inspectors, we are 95 
percent confident that the percentage responding "to a great extent" 
to this survey question would lie between 31 and 46 percent. The 
confidence interval reflects the sampling error that 
corresponds to the estimate of 39 percent. The tables associated with 
our survey in this appendix provide the number of respondents within 
each row. In some cases, the numbers are small because FAA has 
relatively few of these types of inspectors. See appendix III for more 
information on how we conducted our survey. 
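
The following short Python sketch illustrates how a 95 percent 
confidence interval of this kind can be approximated for a single 
survey percentage. It is an illustration only, assuming simple random 
sampling and a normal approximation; it does not reproduce the 
stratified, weighted estimation behind the intervals reported in this 
appendix, and the function name and example values are ours. 

import math

def approx_95ci(p_hat, n):
    # Approximate 95 percent confidence interval for a sample
    # proportion using the normal approximation. Illustrative only.
    se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error
    margin = 1.96 * se                       # z value for 95 percent
    return p_hat - margin, p_hat + margin

# Example: 39 percent of the 132 responding general aviation inspectors
# (the unweighted sample size in table 13) answered "to a great extent."
low, high = approx_95ci(0.39, 132)
print("approximately {:.0f} to {:.0f} percent".format(low * 100, high * 100))
# Prints roughly 31 to 47 percent, close to the 31-46 interval in table
# 13; the published interval also reflects survey weights and other
# adjustments that this sketch omits.

[End of example] 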

Table 10: Percent of Essential Courses That Are Technical in Nature: 

Type of inspector: Air carrier avionics; 
Number of essential technical courses: 14; 
Number of essential courses overall: 28; 
Percent of essential courses that are technical: 50. 

Type of inspector: Air carrier maintenance; 
Number of essential technical courses: 8; 
Number of essential courses overall: 22; 
Percent of essential courses that are technical: 36. 

Type of inspector: Air carrier operations; 
Number of essential technical courses: 1; 
Number of essential courses overall: 10; 
Percent of essential courses that are technical: 10. 

Type of inspector: Cabin safety; 
Number of essential technical courses: 3; 
Number of essential courses overall: 12; 
Percent of essential courses that are technical: 25. 

Type of inspector: General aviation avionics; 
Number of essential technical courses: 12; 
Number of essential courses overall: 26; 
Percent of essential courses that are technical: 46. 

Type of inspector: General aviation maintenance; 
Number of essential technical courses: 8; 
Number of essential courses overall: 23; 
Percent of essential courses that are technical: 35. 

Type of inspector: General aviation operations; 
Number of essential technical courses: 2; 
Number of essential courses overall: 13; 
Percent of essential courses that are technical: 15. 

Type of inspector: Aircraft certification; 
Number of essential technical courses: 0; 
Number of essential courses overall: 7; 
Percent of essential courses that are technical: 0. 

Type of inspector: All inspectors; 
Number of essential technical courses: 48; 
Number of essential courses overall: 141; 
Percent of essential courses that are technical: 34. 

Source: GAO analysis of FAA data. 

[End of table]

Table 11: Percent of Inspectors Completing Essential Courses: 

Type of inspector: Air carrier avionics; 
Percent of inspectors completing at least 75 percent of essential 
courses: 13; 
Percent of inspectors completing at least 50 percent of essential 
courses: 81. 

Type of inspector: Air carrier maintenance; 
Percent of inspectors completing at least 75 percent of essential 
courses: 25; 
Percent of inspectors completing at least 50 percent of essential 
courses: 83. 

Type of inspector: Air carrier operations; 
Percent of inspectors completing at least 75 percent of essential 
courses: 30; 
Percent of inspectors completing at least 50 percent of essential 
courses: 79. 

Type of inspector: Cabin safety; 
Percent of inspectors completing at least 75 percent of essential 
courses: 73; 
Percent of inspectors completing at least 50 percent of essential 
courses: 88. 

Type of inspector: General aviation avionics; 
Percent of inspectors completing at least 75 percent of essential 
courses: 30; 
Percent of inspectors completing at least 50 percent of essential 
courses: 81. 

Type of inspector: General aviation maintenance; 
Percent of inspectors completing at least 75 percent of essential 
courses: 36; 
Percent of inspectors completing at least 50 percent of essential 
courses: 86. 

Type of inspector: General aviation operations; 
Percent of inspectors completing at least 75 percent of essential 
courses: 47; 
Percent of inspectors completing at least 50 percent of essential 
courses: 88. 

Type of inspector: Aircraft certification; 
Percent of inspectors completing at least 75 percent of essential 
courses: 9; 
Percent of inspectors completing at least 50 percent of essential 
courses: 94. 

Type of inspector: All inspectors; 
Percent of inspectors completing at least 75 percent of essential 
courses: 30; 
Percent of inspectors completing at least 50 percent of essential 
courses: 84. 

Source: GAO analysis of FAA data. 

Note: See text for a discussion of why inspectors may not have 
completed essential courses. 

[End of table]

Table 12: Average Number of Technical Training Courses Taken Outside of 
Requirements, Fiscal Years 2002 through 2004: 

Type of inspector: Air carrier avionics; 
Average number of technical courses: 1.5. 

Type of inspector: Air carrier maintenance; 
Average number of technical courses: 1.7. 

Type of inspector: Air carrier operations; 
Average number of technical courses: 2.3. 

Type of inspector: Cabin safety; 
Average number of technical courses: 0.2. 

Type of inspector: General aviation avionics; 
Average number of technical courses: 1.4. 

Type of inspector: General aviation maintenance; 
Average number of technical courses: 1.8. 

Type of inspector: General aviation operations; 
Average number of technical courses: 2.6. 

Type of inspector: Aircraft certification; 
Average number of technical courses: 1.2. 

Type of inspector: All inspectors; 
Average number of technical courses: 1.7. 

Source: GAO analysis of FAA data. 

[End of table]

Table 13: Inspectors' Views on Extent to Which They Currently Have 
Enough Technical Knowledge to Do Their Jobs: 

Percent (confidence interval). 

Type of inspector: Air carrier; 
Unweighted sample size: 231; 
Very great: 13% (Confidence interval: 9-18); 
Great: 40% (Confidence interval: 34-46); 
Moderate: 33% (Confidence interval: 27-39); 
Some: 12% (Confidence interval: 9-17); 
None: 1% (Confidence interval: 0-4). 

Type of inspector: Air carrier avionics; 
Unweighted sample size: 46; 
Very great: 7% (Confidence interval: 2-17); 
Great: 28% (Confidence interval: 17-42); 
Moderate: 37% (Confidence interval: 24-52); 
Some: 28% (Confidence interval: 17-42); 
None: 0% (Confidence interval: 0-6). 

Type of inspector: Air carrier maintenance; 
Unweighted sample size: 88; 
Very great: 8% (Confidence interval: 3-15); 
Great: 44% (Confidence interval: 35-54); 
Moderate: 35% (Confidence interval: 26-45); 
Some: 9% (Confidence interval: 4-17); 
None: 3% (Confidence interval: 1-9). 

Type of inspector: Air carrier operations; 
Unweighted sample size: 79; 
Very great: 20% (Confidence interval: 13-30); 
Great: 41% (Confidence interval: 31-51); 
Moderate: 30% (Confidence interval: 21-40); 
Some: 9% (Confidence interval: 4-16); 
None: 0% (Confidence interval: 0-4). 

Type of inspector: Cabin safety; 
Unweighted sample size: 18; 
Very great: 28% (Confidence interval: 11-50); 
Great: 51% (Confidence interval: 30-71); 
Moderate: 11% (Confidence interval: 2-31); 
Some: 11% (Confidence interval: 2-31); 
None: 0% (Confidence interval: 0-15). 

General aviation: General aviation; 
Unweighted sample size: 132; 
Very great: 10% (Confidence interval: 6-16); 
Great: 39% (Confidence interval: 31-46); 
Moderate: 36% (Confidence interval: 29-44); 
Some: 15% (Confidence interval: 10-22); 
None: 0% (Confidence interval: 0-2). 

General aviation: General aviation avionics; 
Unweighted sample size: 22; 
Very great: 0% (Confidence interval: 0-13); 
Great: 45% (Confidence interval: 25-67); 
Moderate: 36% (Confidence interval: 18-58); 
Some: 18% (Confidence interval: 6-39); 
None: 0% (Confidence interval: 0-13). 

General aviation: General aviation maintenance; 
Unweighted sample size: 56; 
Very great: 13% (Confidence interval: 6-23); 
Great: 41% (Confidence interval: 29-53); 
Moderate: 36% (Confidence interval: 24-49); 
Some: 11% (Confidence interval: 4-21); 
None: 0% (Confidence interval: 0-5). 

General aviation: General aviation operations; 
Unweighted sample size: 54; 
Very great: 11% (Confidence interval: 4-22); 
Great: 33% (Confidence interval: 22-47); 
Moderate: 37% (Confidence interval: 25-49); 
Some: 19% (Confidence interval: 10-31); 
None: 0% (Confidence interval: 0-5). 

Aircraft certification; 
Unweighted sample size: 25; 
Very great: 20% (Confidence interval: 7-39); 
Great: 48% (Confidence interval: 29-67); 
Moderate: 28% (Confidence interval: 13-48); 
Some: 4% (Confidence interval: 0-19); 
None: 0% (Confidence interval: 0-11). 

All inspectors; 
Unweighted sample size: 388; 
Very great: 12% (Confidence interval: 9-16); 
Great: 40% (Confidence interval: 35-45); 
Moderate: 34% (Confidence interval: 29-38); 
Some: 13% (Confidence interval: 10-16); 
None: 1% (Confidence interval: 0-2). 

Source: GAO survey of FAA inspectors. 

Note: The data in this table represent the responses from inspectors to 
the following question, "To what extent do you currently have enough 
technical knowledge about the aircraft, systems, or operations you 
inspect to do your present job?" For more detail about the estimates 
and the corresponding confidence intervals (numbers in parentheses), 
please see the text at the beginning of this appendix. Some of the row 
percentages will not add up to 100 percent due to rounding. See figure 
5 for visual illustration. 

[End of table]

Table 14: Inspectors' Views on Extent to Which Requested Technical 
Training Is Approved: 

Percent (confidence interval). 

Type of inspector: Air carrier; 
Unweighted sample size: 231; 
Very great: 8% (Confidence interval: 5-12); 
Great: 23% (Confidence interval: 18-28); 
Moderate: 24% (Confidence interval: 19-30); 
Some: 28% (Confidence interval: 22-33); 
None: 13% (Confidence interval: 9-18); 
Don't know: 4% (Confidence interval: 2-7). 

Type of inspector: Air carrier avionics; 
Unweighted sample size: 46; 
Very great: 4% (Confidence interval: 1-14); 
Great: 22% (Confidence interval: 12-35); 
Moderate: 24% (Confidence interval: 13-38); 
Some: 37% (Confidence interval: 24-51); 
None: 9% (Confidence interval: 3-19); 
Don't know: 4% (Confidence interval: 1-14). 

Type of inspector: Air carrier maintenance; 
Unweighted sample size: 88; 
Very great: 3% (Confidence interval: 1-9); 
Great: 24% (Confidence interval: 16-33); 
Moderate: 27% (Confidence interval: 19-37); 
Some: 28% (Confidence interval: 20-38); 
None: 11% (Confidence interval: 6-19); 
Don't know: 6% (Confidence interval: 2-12). 

Type of inspector: Air carrier operations; 
Unweighted sample size: 80; 
Very great: 14% (Confidence interval: 7-23); 
Great: 21% (Confidence interval: 13-31); 
Moderate: 21% (Confidence interval: 13-31); 
Some: 24% (Confidence interval: 15-34); 
None: 17% (Confidence interval: 10-27); 
Don't know: 2% (Confidence interval: 0-8). 

Type of inspector: Cabin safety; 
Unweighted sample size: 17; 
Very great: 12% (Confidence interval: 2-33); 
Great: 34% (Confidence interval: 17-54); 
Moderate: 24% (Confidence interval: 8-47); 
Some: 18% (Confidence interval: 6-40); 
None: 6% (Confidence interval: 0-25); 
Don't know: 5% (Confidence interval: 0-22). 

General aviation: General aviation; 
Unweighted sample size: 132; 
Very great: 5% (Confidence interval: 2-9); 
Great: 19% (Confidence interval: 13-26); 
Moderate: 17% (Confidence interval: 12-24); 
Some: 30% (Confidence interval: 22-37); 
None: 24% (Confidence interval: 17-31); 
Don't know: 6% (Confidence interval: 3-11). 

General aviation: General aviation avionics; 
Unweighted sample size: 22; 
Very great: 5% (Confidence interval: 0-22); 
Great: 5% (Confidence interval: 0-22); 
Moderate: 18% (Confidence interval: 6-39); 
Some: 36% (Confidence interval: 18-58); 
None: 27% (Confidence interval: 11-49); 
Don't know: 9% (Confidence interval: 1-28). 

General aviation: General aviation maintenance; 
Unweighted sample size: 56; 
Very great: 7% (Confidence interval: 2-17); 
Great: 18% (Confidence interval: 9-30); 
Moderate: 20% (Confidence interval: 11-32); 
Some: 29% (Confidence interval: 18-41); 
None: 20% (Confidence interval: 11-32); 
Don't know: 7% (Confidence interval: 2-17). 

General aviation: General aviation operations; 
Unweighted sample size: 54; 
Very great: 2% (Confidence interval: 0-9); 
Great: 26% (Confidence interval: 15-39); 
Moderate: 15% (Confidence interval: 7-26); 
Some: 28% (Confidence interval: 17-41); 
None: 26% (Confidence interval: 15-39); 
Don't know: 4% (Confidence interval: 1-12). 

Aircraft certification; 
Unweighted sample size: 25; 
Very great: 4% (Confidence interval: 0-19); 
Great: 16% (Confidence interval: 5-35); 
Moderate: 28% (Confidence interval: 13-48); 
Some: 16% (Confidence interval: 5-35); 
None: 28% (Confidence interval: 13-48); 
Don't know: 8% (Confidence interval: 1-24). 

All inspectors; 
Unweighted sample size: 388; 
Very great: 6% (Confidence interval: 4-9); 
Great: 21% (Confidence interval: 17-25); 
Moderate: 22% (Confidence interval: 18-26); 
Some: 28% (Confidence interval: 24-32); 
None: 18% (Confidence interval: 14-21); 
Don't know: 5% (Confidence interval: 3-8). 

Source: GAO survey of FAA inspectors. 

Note: The data in this table represent the responses from inspectors to 
the following question, "To what extent have the technical training 
courses you requested been approved?" For more detail about the 
estimates and the corresponding confidence intervals (numbers in 
parentheses), please see the text at the beginning of this appendix. 
Some of the row percentages will not add up to 100 percent due to 
rounding. See figure 6 for visual illustration. 

[End of table]

Table 15: Inspectors' Views on Whether Availability of Courses Helped 
or Hindered Their Ability to Take Requested Technical Training: 

Percent (confidence interval). 

Type of inspector: Air carrier; 
Unweighted sample size: 231; 
Greatly helped: 4% (Confidence interval: 2-7); 
Helped: 24% (Confidence interval: 18-29); 
Neither helped nor hindered: 27% (Confidence interval: 22-33); 
Hindered: 30% (Confidence interval: 24-36); 
Greatly hindered: 8% (Confidence interval: 5-12); 
No basis to judge: 5% (Confidence interval: 3-8); 
Don't know: 2% (Confidence interval: 1-4). 

Type of inspector: Air carrier avionics; 
Unweighted sample size: 46; 
Greatly helped: 4% (Confidence interval: 1-14); 
Helped: 22% (Confidence interval: 11-35); 
Neither helped nor hindered: 28% (Confidence interval: 17-42); 
Hindered: 35% (Confidence interval: 22-49); 
Greatly hindered: 9% (Confidence interval: 3-20); 
No basis to judge: 0% (Confidence interval: 0-6); 
Don't know: 2% (Confidence interval: 0-10). 

Type of inspector: Air carrier maintenance; 
Unweighted sample size: 88; 
Greatly helped: 3% (Confidence interval: 1-9); 
Helped: 27% (Confidence interval: 19-37); 
Neither helped nor hindered: 20% (Confidence interval: 13-30); 
Hindered: 34% (Confidence interval: 25-43); 
Greatly hindered: 9% (Confidence interval: 4-17); 
No basis to judge: 2% (Confidence interval: 0-7); 
Don't know: 3% (Confidence interval: 1-9). 

Type of inspector: Air carrier operations; 
Unweighted sample size: 79; 
Greatly helped: 4% (Confidence interval: 1-10); 
Helped: 22% (Confidence interval: 13-32); 
Neither helped nor hindered: 34% (Confidence interval: 24-44); 
Hindered: 25% (Confidence interval: 17-36); 
Greatly hindered: 6% (Confidence interval: 2-14); 
No basis to judge: 9% (Confidence interval: 4-17); 
Don't know: 0% (Confidence interval: 0-4). 

Type of inspector: Cabin safety; 
Unweighted sample size: 18; 
Greatly helped: 6% (Confidence interval: 0-24); 
Helped: 16% (Confidence interval: 5-36); 
Neither helped nor hindered: 28% (Confidence interval: 11-50); 
Hindered: 12% (Confidence interval: 2-31); 
Greatly hindered: 17% (Confidence interval: 5-38); 
No basis to judge: 22% (Confidence interval: 8-43); 
Don't know: 0% (Confidence interval: 0-15). 

General aviation: General aviation; 
Unweighted sample size: 131; 
Greatly helped: 3% (Confidence interval: 1-7); 
Helped: 18% (Confidence interval: 12-26); 
Neither helped nor hindered: 34% (Confidence interval: 27-42); 
Hindered: 25% (Confidence interval: 18-32); 
Greatly hindered: 9% (Confidence interval: 5-15); 
No basis to judge: 8% (Confidence interval: 4-14); 
Don't know: 2% (Confidence interval: 0-5). 

General aviation: General aviation avionics; 
Unweighted sample size: 21; 
Greatly helped: 0% (Confidence interval: 0-13); 
Helped: 10% (Confidence interval: 1-29); 
Neither helped nor hindered: 33% (Confidence interval: 15-56); 
Hindered: 29% (Confidence interval: 12-51); 
Greatly hindered: 14% (Confidence interval: 3-35); 
No basis to judge: 14% (Confidence interval: 3-35); 
Don't know: 0% (Confidence interval: 0-13). 

General aviation: General aviation maintenance; 
Unweighted sample size: 56; 
Greatly helped: 4% (Confidence interval: 1-12); 
Helped: 23% (Confidence interval: 13-36); 
Neither helped nor hindered: 38% (Confidence interval: 26-49); 
Hindered: 21% (Confidence interval: 12-34); 
Greatly hindered: 7% (Confidence interval: 2-17); 
No basis to judge: 5% (Confidence interval: 1-14); 
Don't know: 2% (Confidence interval: 0-9). 

General aviation: General aviation operations; 
Unweighted sample size: 54; 
Greatly helped: 4% (Confidence interval: 1-12); 
Helped: 17% (Confidence interval: 8-29); 
Neither helped nor hindered: 31% (Confidence interval: 20-45); 
Hindered: 28% (Confidence interval: 17-41); 
Greatly hindered: 9% (Confidence interval: 3-20); 
No basis to judge: 9% (Confidence interval: 3-20); 
Don't know: 2% (Confidence interval: 0-9). 

Aircraft certification; 
Unweighted sample size: 25; 
Greatly helped: 8% (Confidence interval: 1-24); 
Helped: 20% (Confidence interval: 7-39); 
Neither helped nor hindered: 36% (Confidence interval: 19-56); 
Hindered: 4% (Confidence interval: 0-19); 
Greatly hindered: 20% (Confidence interval: 7-39); 
No basis to judge: 12% (Confidence interval: 3-30); 
Don't know: 0% (Confidence interval: 0-11). 

All inspectors; 
Unweighted sample size: 387; 
Greatly helped: 4% (Confidence interval: 2-6); 
Helped: 22% (Confidence interval: 18-25); 
Neither helped nor hindered: 30% (Confidence interval: 26-35); 
Hindered: 27% (Confidence interval: 23-31); 
Greatly hindered: 9% (Confidence interval: 7-12); 
No basis to judge: 7% (Confidence interval: 5-9); 
Don't know: 2% (Confidence interval: 1-3). 

Source: GAO survey of FAA inspectors. 

Note: The data in this table represent the responses from inspectors to 
the following question, "Have the following factors helped or hindered 
your ability to take the technical training you requested to do your 
current job? Factor: Availability of courses." For more detail about 
the estimates and the corresponding confidence intervals (numbers in 
parentheses), please see the text at the beginning of this appendix. 
Some of the row percentages will not add up to 100 percent due to 
rounding. See figure 7 for visual illustration. 

[End of table]

Table 16: Inspectors' Views on Whether Availability of Funds Helped or 
Hindered Their Ability to Take Requested Technical Training: 

Percent (confidence interval). 

Type of inspector: Air carrier; 
Unweighted sample size: 229; 
Greatly helped: 4% (Confidence interval: 2-8); 
Helped: 10% (Confidence interval: 6-14); 
Neither helped nor hindered: 21% (Confidence interval: 16-26); 
Hindered: 28% (Confidence interval: 22-33); 
Greatly hindered: 24% (Confidence interval: 18-29); 
No basis to judge: 7% (Confidence interval: 4-11); 
Don't know: 7% (Confidence interval: 4-10). 

Type of inspector: Air carrier avionics; 
Unweighted sample size: 46; 
Greatly helped: 9% (Confidence interval: 3-20); 
Helped: 6% (Confidence interval: 2-16); 
Neither helped nor hindered: 15% (Confidence interval: 7-28); 
Hindered: 39% (Confidence interval: 26-54); 
Greatly hindered: 17% (Confidence interval: 8-30); 
No basis to judge: 0% (Confidence interval: 0-6); 
Don't know: 13% (Confidence interval: 5-25). 

Type of inspector: Air carrier maintenance; 
Unweighted sample size: 87; 
Greatly helped: 3% (Confidence interval: 1-9); 
Helped: 16% (Confidence interval: 9-25); 
Neither helped nor hindered: 22% (Confidence interval: 14-31); 
Hindered: 24% (Confidence interval: 16-34); 
Greatly hindered: 22% (Confidence interval: 14-31); 
No basis to judge: 6% (Confidence interval: 2-12); 
Don't know: 7% (Confidence interval: 3-14). 

Type of inspector: Air carrier operations; 
Unweighted sample size: 78; 
Greatly helped: 4% (Confidence interval: 1-11); 
Helped: 5% (Confidence interval: 2-12); 
Neither helped nor hindered: 23% (Confidence interval: 15-33); 
Hindered: 26% (Confidence interval: 17-36); 
Greatly hindered: 28% (Confidence interval: 19-39); 
No basis to judge: 10% (Confidence interval: 5-19); 
Don't know: 4% (Confidence interval: 1-10). 

Type of inspector: Cabin safety; 
Unweighted sample size: 18; 
Greatly helped: 0% (Confidence interval: 0-15); 
Helped: 6% (Confidence interval: 0-24); 
Neither helped nor hindered: 17% (Confidence interval: 5-38); 
Hindered: 33% (Confidence interval: 15-56); 
Greatly hindered: 28% (Confidence interval: 12-51); 
No basis to judge: 16% (Confidence interval: 5-36); 
Don't know: 0% (Confidence interval: 0-15). 

General aviation: General aviation; 
Unweighted sample size: 131; 
Greatly helped: 3% (Confidence interval: 1-7); 
Helped: 8% (Confidence interval: 4-13); 
Neither helped nor hindered: 16% (Confidence interval: 11-23); 
Hindered: 30% (Confidence interval: 22-37); 
Greatly hindered: 29% (Confidence interval: 22-36); 
No basis to judge: 10% (Confidence interval: 6-16); 
Don't know: 5% (Confidence interval: 2-9). 

General aviation: General aviation avionics; 
Unweighted sample size: 22; 
Greatly helped: 0% (Confidence interval: 0-13); 
Helped: 9% (Confidence interval: 1-28); 
Neither helped nor hindered: 18% (Confidence interval: 6-39); 
Hindered: 18% (Confidence interval: 6-39); 
Greatly hindered: 36% (Confidence interval: 18-58); 
No basis to judge: 14% (Confidence interval: 3-34); 
Don't know: 5% (Confidence interval: 0-22). 

General aviation: General aviation maintenance; 
Unweighted sample size: 56; 
Greatly helped: 2% (Confidence interval: 0-9); 
Helped: 11% (Confidence interval: 4-21); 
Neither helped nor hindered: 13% (Confidence interval: 6-23); 
Hindered: 32% (Confidence interval: 21-45); 
Greatly hindered: 30% (Confidence interval: 19-43); 
No basis to judge: 5% (Confidence interval: 1-14); 
Don't know: 7% (Confidence interval: 2-17). 

General aviation: General aviation operations; 
Unweighted sample size: 53; 
Greatly helped: 6% (Confidence interval: 1-15); 
Helped: 4% (Confidence interval: 1-12); 
Neither helped nor hindered: 19% (Confidence interval: 10-31); 
Hindered: 32% (Confidence interval: 21-46); 
Greatly hindered: 25% (Confidence interval: 14-37); 
No basis to judge: 13% (Confidence interval: 6-25); 
Don't know: 2% (Confidence interval: 0-9). 

Aircraft certification; 
Unweighted sample size: 25; 
Greatly helped: 12% (Confidence interval: 3-30); 
Helped: 4% (Confidence interval: 0-19); 
Neither helped nor hindered: 8% (Confidence interval: 1-24); 
Hindered: 28% (Confidence interval: 13-48); 
Greatly hindered: 28% (Confidence interval: 13-48); 
No basis to judge: 12% (Confidence interval: 3-30); 
Don't know: 8% (Confidence interval: 1-24). 

All inspectors; 
Unweighted sample size: 385; 
Greatly helped: 4% (Confidence interval: 3-7); 
Helped: 9% (Confidence interval: 6-12); 
Neither helped nor hindered: 18% (Confidence interval: 15-22); 
Hindered: 29% (Confidence interval: 24-33); 
Greatly hindered: 26% (Confidence interval: 22-30); 
No basis to judge: 8% (Confidence interval: 6-11); 
Don't know: 6% (Confidence interval: 4-9). 

Source: GAO survey of FAA inspectors. 

Note: The data in this table represent the responses from inspectors to 
the following question, "Have the following factors helped or hindered 
your ability to take the technical training you requested to do your 
current job? Factor: Availability of funds." For more detail about the 
estimates and the corresponding confidence intervals (numbers in 
parentheses), please see the text at the beginning of this appendix. Some 
of the row percentages will not add up to 100 percent due to rounding. 
See figure 7 for visual illustration. 

[End of table]

Table 17: Inspectors' Views on Whether Management's Determination of 
Need Helped or Hindered Their Ability to Take Requested Technical 
Training: 

Percent (confidence interval). 

Type of inspector: Air carrier; 
Unweighted sample size: 230; 
Greatly helped: 7% (Confidence interval: 4-11); 
Helped: 29% (Confidence interval: 23-34); 
Neither helped nor hindered: 33% (Confidence interval: 27-39); 
Hindered: 15% (Confidence interval: 11-20); 
Greatly hindered: 7% (Confidence interval: 4-11); 
No basis to judge: 6% (Confidence interval: 4-10); 
Don't know: 3% (Confidence interval: 1-6). 

Type of inspector: Air carrier avionics; 
Unweighted sample size: 46; 
Greatly helped: 4% (Confidence interval: 1-14); 
Helped: 37% (Confidence interval: 25-51); 
Neither helped nor hindered: 30% (Confidence interval: 19-44); 
Hindered: 15% (Confidence interval: 7-28); 
Greatly hindered: 6% (Confidence interval: 2-16); 
No basis to judge: 2% (Confidence interval: 0-11); 
Don't know: 4% (Confidence interval: 1-14). 

Type of inspector: Air carrier maintenance; 
Unweighted sample size: 88; 
Greatly helped: 7% (Confidence interval: 3-14); 
Helped: 28% (Confidence interval: 20-38); 
Neither helped nor hindered: 38% (Confidence interval: 28-47); 
Hindered: 15% (Confidence interval: 8-23); 
Greatly hindered: 7% (Confidence interval: 3-14); 
No basis to judge: 3% (Confidence interval: 1-9); 
Don't know: 2% (Confidence interval: 0-7). 

Type of inspector: Air carrier operations; 
Unweighted sample size: 79; 
Greatly helped: 8% (Confidence interval: 3-15); 
Helped: 25% (Confidence interval: 17-36); 
Neither helped nor hindered: 31% (Confidence interval: 21-40); 
Hindered: 15% (Confidence interval: 8-24); 
Greatly hindered: 8% (Confidence interval: 3-15); 
No basis to judge: 10% (Confidence interval: 5-19); 
Don't know: 4% (Confidence interval: 1-10). 

Type of inspector: Cabin safety; 
Unweighted sample size: 17; 
Greatly helped: 6% (Confidence interval: 0-25); 
Helped: 28% (Confidence interval: 14-46); 
Neither helped nor hindered: 18% (Confidence interval: 6-40); 
Hindered: 25% (Confidence interval: 10-46); 
Greatly hindered: 6% (Confidence interval: 0-25); 
No basis to judge: 17% (Confidence interval: 5-38); 
Don't know: 0% (Confidence interval: 0-16). 

General aviation: General aviation; 
Unweighted sample size: 132; 
Greatly helped: 6% (Confidence interval: 3-11); 
Helped: 23% (Confidence interval: 17-31); 
Neither helped nor hindered: 28% (Confidence interval: 21-35); 
Hindered: 19% (Confidence interval: 13-26); 
Greatly hindered: 13% (Confidence interval: 8-19); 
No basis to judge: 8% (Confidence interval: 4-14); 
Don't know: 2% (Confidence interval: 1-6). 

General aviation: General aviation avionics; 
Unweighted sample size: 22; 
Greatly helped: 0% (Confidence interval: 0-13); 
Helped: 27% (Confidence interval: 11-49); 
Neither helped nor hindered: 14% (Confidence interval: 3-34); 
Hindered: 23% (Confidence interval: 8-44); 
Greatly hindered: 18% (Confidence interval: 6-39); 
No basis to judge: 14% (Confidence interval: 3-34); 
Don't know: 5% (Confidence interval: 0-22). 

General aviation: General aviation maintenance; 
Unweighted sample size: 56; 
Greatly helped: 9% (Confidence interval: 3-19); 
Helped: 23% (Confidence interval: 13-36); 
Neither helped nor hindered: 34% (Confidence interval: 22-47); 
Hindered: 16% (Confidence interval: 8-28); 
Greatly hindered: 13% (Confidence interval: 6-23); 
No basis to judge: 4% (Confidence interval: 1-12); 
Don't know: 2% (Confidence interval: 0-9). 

General aviation: General aviation operations; 
Unweighted sample size: 54; 
Greatly helped: 6% (Confidence interval: 1-15); 
Helped: 22% (Confidence interval: 13-35); 
Neither helped nor hindered: 28% (Confidence interval: 17-41); 
Hindered: 20% (Confidence interval: 11-33); 
Greatly hindered: 11% (Confidence interval: 4-22); 
No basis to judge: 11% (Confidence interval: 4-22); 
Don't know: 2% (Confidence interval: 0-9). 

Aircraft certification; 
Unweighted sample size: 25; 
Greatly helped: 4% (Confidence interval: 0-19); 
Helped: 20% (Confidence interval: 7-39); 
Neither helped nor hindered: 20% (Confidence interval: 7-39); 
Hindered: 28% (Confidence interval: 13-48); 
Greatly hindered: 12% (Confidence interval: 3-30); 
No basis to judge: 16% (Confidence interval: 5-35); 
Don't know: 0% (Confidence interval: 0-11). 

All inspectors; 
Unweighted sample size: 387; 
Greatly helped: 6% (Confidence interval: 4-9); 
Helped: 26% (Confidence interval: 22-30); 
Neither helped nor hindered: 30% (Confidence interval: 26-35); 
Hindered: 17% (Confidence interval: 14-21); 
Greatly hindered: 9% (Confidence interval: 7-13); 
No basis to judge: 8% (Confidence interval: 5-10); 
Don't know: 3% (Confidence interval: 1-5). 

Source: GAO survey of FAA inspectors. 

Note: The data in this table represent the responses from inspectors to 
the following question, "Have the following factors helped or hindered 
your ability to take the technical training you requested to do your 
current job? Factor: Management's determination of your need for the 
course." For more detail about the estimates and the corresponding 
confidence intervals (numbers in parentheses), please see the text at 
the beginning of this appendix. Some of the row percentages will not 
add up to 100 percent due to rounding. See figure 7 for visual 
illustration. 

[End of table]

Table 18: Inspectors' Views on Whether Inspection Workload Helped or 
Hindered Their Ability to Take Requested Technical Training: 

Percent (confidence interval). 

Type of inspector: Air carrier; 
Unweighted sample size: 231; 
Greatly helped: 1% (Confidence interval: 0-4); 
Helped: 5% (Confidence interval: 3-9); 
Neither helped nor hindered: 58% (Confidence interval: 52-64); 
Hindered: 21% (Confidence interval: 16-25); 
Greatly hindered: 7% (Confidence interval: 4-11); 
No basis to judge: 6% (Confidence interval: 4-10); 
Don't know: 2% (Confidence interval: 1-4). 

Type of inspector: Air carrier avionics; 
Unweighted sample size: 46; 
Greatly helped: 0% (Confidence interval: 0-6); 
Helped: 9% (Confidence interval: 3-20); 
Neither helped nor hindered: 63% (Confidence interval: 48-76); 
Hindered: 15% (Confidence interval: 7-28); 
Greatly hindered: 9% (Confidence interval: 3-20); 
No basis to judge: 2% (Confidence interval: 0-10); 
Don't know: 2% (Confidence interval: 0-10). 

Type of inspector: Air carrier maintenance; 
Unweighted sample size: 88; 
Greatly helped: 2% (Confidence interval: 0-7); 
Helped: 7% (Confidence interval: 3-13); 
Neither helped nor hindered: 65% (Confidence interval: 55-74); 
Hindered: 15% (Confidence interval: 9-23); 
Greatly hindered: 5% (Confidence interval: 1-11); 
No basis to judge: 6% (Confidence interval: 2-12); 
Don't know: 1% (Confidence interval: 0-6). 

Type of inspector: Air carrier operations; 
Unweighted sample size: 79; 
Greatly helped: 1% (Confidence interval: 0-6); 
Helped: 3% (Confidence interval: 0-9); 
Neither helped nor hindered: 48% (Confidence interval: 38-59); 
Hindered: 29% (Confidence interval: 20-40); 
Greatly hindered: 9% (Confidence interval: 4-17); 
No basis to judge: 7% (Confidence interval: 3-15); 
Don't know: 2% (Confidence interval: 0-8). 

Type of inspector: Cabin safety; 
Unweighted sample size: 18; 
Greatly helped: 0% (Confidence interval: 0-15); 
Helped: 0% (Confidence interval: 0-15); 
Neither helped nor hindered: 62% (Confidence interval: 39-81); 
Hindered: 17% (Confidence interval: 5-38); 
Greatly hindered: 0% (Confidence interval: 0-15); 
No basis to judge: 22% (Confidence interval: 8-43); 
Don't know: 0% (Confidence interval: 0-15). 

General aviation: General aviation; 
Unweighted sample size: 132; 
Greatly helped: 2% (Confidence interval: 0-5); 
Helped: 6% (Confidence interval: 3-11); 
Neither helped nor hindered: 55% (Confidence interval: 47-63); 
Hindered: 18% (Confidence interval: 13-25); 
Greatly hindered: 11% (Confidence interval: 6-17); 
No basis to judge: 8% (Confidence interval: 4-13); 
Don't know: 1% (Confidence interval: 0-4). 

General aviation: General aviation avionics; 
Unweighted sample size: 22; 
Greatly helped: 0% (Confidence interval: 0-13); 
Helped: 5% (Confidence interval: 0-22); 
Neither helped nor hindered: 59% (Confidence interval: 37-78); 
Hindered: 5% (Confidence interval: 0-22); 
Greatly hindered: 23% (Confidence interval: 8-44); 
No basis to judge: 9% (Confidence interval: 1-28); 
Don't know: 0% (Confidence interval: 0-13). 

General aviation: General aviation maintenance; 
Unweighted sample size: 56; 
Greatly helped: 0% (Confidence interval: 0-5); 
Helped: 13% (Confidence interval: 6-23); 
Neither helped nor hindered: 57% (Confidence interval: 45-69); 
Hindered: 14% (Confidence interval: 7-25); 
Greatly hindered: 7% (Confidence interval: 2-17); 
No basis to judge: 7% (Confidence interval: 2-17); 
Don't know: 2% (Confidence interval: 0-9). 

General aviation: General aviation operations; 
Unweighted sample size: 54; 
Greatly helped: 4% (Confidence interval: 1-12); 
Helped: 0% (Confidence interval: 0-5); 
Neither helped nor hindered: 52% (Confidence interval: 39-64); 
Hindered: 28% (Confidence interval: 17-41); 
Greatly hindered: 9% (Confidence interval: 3-20); 
No basis to judge: 7% (Confidence interval: 2-17); 
Don't know: 0% (Confidence interval: 0-5). 

Aircraft certification; 
Unweighted sample size: 25; 
Greatly helped: 4% (Confidence interval: 0-19); 
Helped: 0% (Confidence interval: 0-11); 
Neither helped nor hindered: 56% (Confidence interval: 36-74); 
Hindered: 8% (Confidence interval: 1-24); 
Greatly hindered: 12% (Confidence interval: 3-30); 
No basis to judge: 20% (Confidence interval: 7-39); 
Don't know: 0% (Confidence interval: 0-11). 

All inspectors; 
Unweighted sample size: 388; 
Greatly helped: 2% (Confidence interval: 1-3); 
Helped: 5% (Confidence interval: 3-8); 
Neither helped nor hindered: 57% (Confidence interval: 52-62); 
Hindered: 19% (Confidence interval: 15-23); 
Greatly hindered: 8% (Confidence interval: 6-12); 
No basis to judge: 7% (Confidence interval: 5-10); 
Don't know: 1% (Confidence interval: 0-3). 

Source: GAO survey of FAA inspectors. 

Note: The data in this table represent the responses from inspectors to 
the following question, "Have the following factors helped or hindered 
your ability to take the technical training you requested to do your 
current job? Factor: The impact on your workload of the time commitment 
required for the training." For more detail about the estimates and the 
corresponding confidence intervals (numbers in parentheses), please see 
the text at the beginning of this appendix. Some of the row percentages 
will not add up to 100 percent due to rounding. See figure 7 for visual 
representation. 

[End of table]

Table 19: Inspectors' Views on the Degree to Which Technical Training 
Is Delivered in a Timely Manner: 

Percent (confidence interval). 

Type of inspector: Air carrier; 
Unweighted sample size: 229; 
Very great: 1% (Confidence interval: 0-4); 
Great: 20% (Confidence interval: 15-25); 
Moderate: 27% (Confidence interval: 21-32); 
Some: 38% (Confidence interval: 32-44); 
None: 14% (Confidence interval: 10-18). 

Type of inspector: Air carrier avionics; 
Unweighted sample size: 46; 
Very great: 0% (Confidence interval: 0-6); 
Great: 15% (Confidence interval: 7-28); 
Moderate: 15% (Confidence interval: 7-28); 
Some: 56% (Confidence interval: 43-70); 
None: 13% (Confidence interval: 5-24). 

Type of inspector: Air carrier maintenance; 
Unweighted sample size: 87; 
Very great: 0% (Confidence interval: 0-3); 
Great: 17% (Confidence interval: 10-26); 
Moderate: 37% (Confidence interval: 27-46); 
Some: 28% (Confidence interval: 19-38); 
None: 18% (Confidence interval: 11-28). 

Type of inspector: Air carrier operations; 
Unweighted sample size: 78; 
Very great: 4% (Confidence interval: 1-10); 
Great: 27% (Confidence interval: 18-38); 
Moderate: 20% (Confidence interval: 13-30); 
Some: 40% (Confidence interval: 30-50); 
None: 9% (Confidence interval: 4-17). 

Type of inspector: Cabin safety; 
Unweighted sample size: 18; 
Very great: 0% (Confidence interval: 0-15); 
Great: 6% (Confidence interval: 0-24); 
Moderate: 44% (Confidence interval: 24-66); 
Some: 34% (Confidence interval: 16-56); 
None: 16% (Confidence interval: 5-36). 

General aviation: General aviation; 
Unweighted sample size: 132; 
Very great: 2% (Confidence interval: 1-6); 
Great: 13% (Confidence interval: 8-19); 
Moderate: 22% (Confidence interval: 16-29); 
Some: 51% (Confidence interval: 43-59); 
None: 12% (Confidence interval: 7-18). 

General aviation: General aviation avionics; 
Unweighted sample size: 22; 
Very great: 0% (Confidence interval: 0-13); 
Great: 0% (Confidence interval: 0-13); 
Moderate: 27% (Confidence interval: 11-49); 
Some: 55% (Confidence interval: 33-75); 
None: 18% (Confidence interval: 6-39). 

General aviation: General aviation maintenance; 
Unweighted sample size: 56; 
Very great: 2% (Confidence interval: 0-9); 
Great: 16% (Confidence interval: 8-28); 
Moderate: 21% (Confidence interval: 12-34); 
Some: 46% (Confidence interval: 34-59); 
None: 14% (Confidence interval: 7-25). 

General aviation: General aviation operations; 
Unweighted sample size: 54; 
Very great: 4% (Confidence interval: 1-12); 
Great: 15% (Confidence interval: 7-26); 
Moderate: 20% (Confidence interval: 11-33); 
Some: 54% (Confidence interval: 41-66); 
None: 7% (Confidence interval: 2-17). 

Aircraft certification; 
Unweighted sample size: 25; 
Very great: 8% (Confidence interval: 1-24); 
Great: 24% (Confidence interval: 10-44); 
Moderate: 28% (Confidence interval: 13-48); 
Some: 32% (Confidence interval: 16-52); 
None: 8% (Confidence interval: 1-24). 

All inspectors; 
Unweighted sample size: 386; 
Very great: 2% (Confidence interval: 1-4); 
Great: 18% (Confidence interval: 14-21); 
Moderate: 25% (Confidence interval: 21-29); 
Some: 42% (Confidence interval: 37-47); 
None: 13% (Confidence interval: 10-16). 

Source: GAO survey of FAA inspectors. 

Note: The data in this table represent the responses from inspectors to 
the following question, "During your FAA career, to what extent have 
you received technical training in a timely manner-meaning receiving 
training in time to do your current job?" For more detail about the 
estimates and the corresponding confidence intervals (numbers in 
parentheses), please see the text at the beginning of this appendix. 
The percentage of inspectors responding "do not know" was 2 percent or 
less; these results are not presented. Some of the row percentages will 
not add up to 100 percent due to rounding. See figure 8 for visual 
representation. 

[End of table]

Table 20: Inspectors' Views on the Extent That They Receive Technical 
Training Prior to Scheduled Oversight Activities: 

Percent (confidence interval). 

Type of inspector: Air carrier; 
Unweighted sample size: 231; 
Always: 4% (Confidence interval: 2-8); 
Frequently: 18% (Confidence interval: 14-23); 
Occasionally: 31% (Confidence interval: 26-37); 
Rarely: 29% (Confidence interval: 23-34); 
Never: 14% (Confidence interval: 10-19); 
No basis: 3% (Confidence interval: 1-6). 

Type of inspector: Air carrier avionics; 
Unweighted sample size: 46; 
Always: 0% (Confidence interval: 0-6); 
Frequently: 7% (Confidence interval: 2-17); 
Occasionally: 26% (Confidence interval: 15-40); 
Rarely: 61% (Confidence interval: 46-74); 
Never: 7% (Confidence interval: 2-17); 
No basis: 0% (Confidence interval: 0-6). 

Type of inspector: Air carrier maintenance; 
Unweighted sample size: 88; 
Always: 6% (Confidence interval: 2-12); 
Frequently: 11% (Confidence interval: 6-19); 
Occasionally: 35% (Confidence interval: 26-45); 
Rarely: 22% (Confidence interval: 14-31); 
Never: 24% (Confidence interval: 16-33); 
No basis: 2% (Confidence interval: 0-7). 

Type of inspector: Air carrier operations; 
Unweighted sample size: 79; 
Always: 5% (Confidence interval: 2-12); 
Frequently: 29% (Confidence interval: 20-40); 
Occasionally: 29% (Confidence interval: 20-40); 
Rarely: 23% (Confidence interval: 14-33); 
Never: 9% (Confidence interval: 4-17); 
No basis: 5% (Confidence interval: 2-12). 

Type of inspector: Cabin safety; 
Unweighted sample size: 18; 
Always: 5% (Confidence interval: 0-21); 
Frequently: 28% (Confidence interval: 12-51); 
Occasionally: 40% (Confidence interval: 21-61); 
Rarely: 11% (Confidence interval: 2-31); 
Never: 11% (Confidence interval: 2-31); 
No basis: 5% (Confidence interval: 0-21). 

General aviation: General aviation; 
Unweighted sample size: 132; 
Always: 2% (Confidence interval: 1-6); 
Frequently: 19% (Confidence interval: 13-26); 
Occasionally: 27% (Confidence interval: 20-35); 
Rarely: 33% (Confidence interval: 26-41); 
Never: 11% (Confidence interval: 7-17); 
No basis: 7% (Confidence interval: 3-12). 

General aviation: General aviation avionics; 
Unweighted sample size: 22; 
Always: 0% (Confidence interval: 0-13); 
Frequently: 5% (Confidence interval: 0-22); 
Occasionally: 18% (Confidence interval: 6-39); 
Rarely: 36% (Confidence interval: 18-58); 
Never: 23% (Confidence interval: 8-44); 
No basis: 18% (Confidence interval: 6-39). 

General aviation: General aviation maintenance; 
Unweighted sample size: 56; 
Always: 4% (Confidence interval: 1-12); 
Frequently: 20% (Confidence interval: 11-32); 
Occasionally: 27% (Confidence interval: 16-39); 
Rarely: 32% (Confidence interval: 21-45); 
Never: 14% (Confidence interval: 7-25); 
No basis: 4% (Confidence interval: 1-12). 

General aviation: General aviation operations; 
Unweighted sample size: 54; 
Always: 2% (Confidence interval: 0-9); 
Frequently: 24% (Confidence interval: 14-37); 
Occasionally: 31% (Confidence interval: 20-45); 
Rarely: 33% (Confidence interval: 22-47); 
Never: 4% (Confidence interval: 1-12); 
No basis: 6% (Confidence interval: 1-15). 

Aircraft certification; 
Unweighted sample size: 25; 
Always: 4% (Confidence interval: 0-19); 
Frequently: 28% (Confidence interval: 13-48); 
Occasionally: 32% (Confidence interval: 16-52); 
Rarely: 20% (Confidence interval: 7-39); 
Never: 16% (Confidence interval: 5-35); 
No basis: 0% (Confidence interval: 0-11). 

All inspectors; 
Unweighted sample size: 388; 
Always: 4% (Confidence interval: 2-6); 
Frequently: 19% (Confidence interval: 15-23); 
Occasionally: 30% (Confidence interval: 26-34); 
Rarely: 30% (Confidence interval: 26-34); 
Never: 13% (Confidence interval: 10-17); 
No basis: 4% (Confidence interval: 3-7). 

Source: GAO survey of FAA inspectors. 

Note: The data in this table represent the responses from inspectors to 
the following question, "In thinking about the timing of when you 
received technical training during your FAA career, how often did the 
following situations apply? Situation: Technical and/or equipment 
training was received prior to scheduled oversight/surveillance 
activities." For more detail about the estimates and the corresponding 
confidence intervals (numbers in parentheses), please see the text at 
the beginning of this appendix. Some of the row percentages will not 
add up to 100 percent due to rounding. See figure 9 for visual 
representation. 

[End of table]

Table 21: Percent of Technical Training Provided by Industry as 
Reported by FAA, Fiscal Years 2002 through 2004: 

Type of inspector: Air carrier: Air carrier avionics; 
2002: Number: 181; 
2002: Percent[A]: 30; 
2003: Number: 193; 
2003: Percent[A]: 42; 
2004: Number: 218; 
2004: Percent[A]: 53. 

Type of inspector: Air carrier: Air carrier maintenance; 
2002: Number: 475; 
2002: Percent[A]: 39; 
2003: Number: 313; 
2003: Percent[A]: 38; 
2004: Number: 323; 
2004: Percent[A]: 53. 

Type of inspector: Air carrier: Air carrier operations; 
2002: Number: 282; 
2002: Percent[A]: 54; 
2003: Number: 341; 
2003: Percent[A]: 57; 
2004: Number: 332; 
2004: Percent[A]: 55. 

Type of inspector: Air carrier: Cabin safety; 
2002: Number: 9; 
2002: Percent[A]: 39; 
2003: Number: 7; 
2003: Percent[A]: 23; 
2004: Number: 6; 
2004: Percent[A]: 33. 

Type of inspector: Air carrier: Subtotal; 
2002: Number: 947; 
2002: Percent[A]: 40; 
2003: Number: 854; 
2003: Percent[A]: 45; 
2004: Number: 879; 
2004: Percent[A]: 54. 

Type of inspector: General aviation: General aviation avionics; 
2002: Number: 77; 
2002: Percent[A]: 24; 
2003: Number: 65; 
2003: Percent[A]: 33; 
2004: Number: 59; 
2004: Percent[A]: 33. 

Type of inspector: General aviation: General aviation maintenance; 
2002: Number: 240; 
2002: Percent[A]: 30; 
2003: Number: 160; 
2003: Percent[A]: 28; 
2004: Number: 148; 
2004: Percent[A]: 33. 

Type of inspector: General aviation: General aviation operations; 
2002: Number: 167; 
2002: Percent[A]: 41; 
2003: Number: 248; 
2003: Percent[A]: 56; 
2004: Number: 222; 
2004: Percent[A]: 40. 

Type of inspector: General aviation: Subtotal; 
2002: Number: 484; 
2002: Percent[A]: 32; 
2003: Number: 473; 
2003: Percent[A]: 39; 
2004: Number: 429; 
2004: Percent[A]: 37. 

Type of inspector: Aircraft certification; 
2002: Number: 28; 
2002: Percent[A]: 34; 
2003: Number: 27; 
2003: Percent[A]: 41; 
2004: Number: 3; 
2004: Percent[A]: 8. 

Total; 
2002: Number: 1,459; 
2002: Percent[A]: 37; 
2003: Number: 1,354; 
2003: Percent[A]: 42; 
2004: Number: 1,311; 
2004: Percent[A]: 46. 

Source: GAO analysis of FAA data. 

Note: See figure 10 for visual representation. 

[A] As a percent of total FAA- and industry-provided training for each 
type of inspector. (An illustrative calculation follows the table.) 

[End of table]
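
The percentages in footnote [A] are computed as the industry-provided 
share of the combined FAA- and industry-provided course totals; in 
formula terms: 

\[
\text{Percent} = 100 \times \frac{\text{industry-provided courses}}
{\text{FAA-provided courses} + \text{industry-provided courses}}
\]

For example, in fiscal year 2004 the 1,311 industry-provided courses 
represented 46 percent of the combined total, implying roughly 
1,311 / 0.46, or about 2,850 courses in all, of which about 1,540 were 
FAA provided. Because the reported percentages are rounded, these 
back-calculated totals are approximate. 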

Table 22: Inspectors' Views on the Extent to Which Technical Training 
Opportunities Exist Closer to Their Work Location: 

Percent (confidence interval). 

Type of inspector: Air carrier; 
Unweighted sample size: 231; 
Very great: 4% (Confidence interval: 2-7); 
Great: 11% (Confidence interval: 7-15); 
Moderate: 5% (Confidence interval: 3-9); 
Some: 13% (Confidence interval: 9-18); 
None: 30% (Confidence interval: 24-35); 
Don't know: 38% (Confidence interval: 32-44). 

Type of inspector: Air carrier avionics; 
Unweighted sample size: 46; 
Very great: 2% (Confidence interval: 0-10); 
Great: 9% (Confidence interval: 3-20); 
Moderate: 2% (Confidence interval: 0-11); 
Some: 24% (Confidence interval: 13-37); 
None: 26% (Confidence interval: 15-40); 
Don't know: 37% (Confidence interval: 24-52). 

Type of inspector: Air carrier maintenance; 
Unweighted sample size: 87; 
Very great: 3% (Confidence interval: 1-9); 
Great: 7% (Confidence interval: 3-14); 
Moderate: 8% (Confidence interval: 4-15); 
Some: 10% (Confidence interval: 5-18); 
None: 26% (Confidence interval: 18-36); 
Don't know: 45% (Confidence interval: 35-54). 

Type of inspector: Air carrier operations; 
Unweighted sample size: 80; 
Very great: 5% (Confidence interval: 2-12); 
Great: 16% (Confidence interval: 9-25); 
Moderate: 4% (Confidence interval: 1-10); 
Some: 11% (Confidence interval: 6-20); 
None: 34% (Confidence interval: 24-44); 
Don't know: 30% (Confidence interval: 20-39). 

Type of inspector: Cabin safety; 
Unweighted sample size: 18; 
Very great: 0% (Confidence interval: 0-15); 
Great: 6% (Confidence interval: 0-24); 
Moderate: 0% (Confidence interval: 0-15); 
Some: 6% (Confidence interval: 0-24); 
None: 37% (Confidence interval: 22-54); 
Don't know: 51% (Confidence interval: 32-70). 

General aviation: General aviation; 
Unweighted sample size: 132; 
Very great: 2% (Confidence interval: 1-6); 
Great: 9% (Confidence interval: 5-15); 
Moderate: 10% (Confidence interval: 6-16); 
Some: 18% (Confidence interval: 12-24); 
None: 26% (Confidence interval: 19-33); 
Don't know: 35% (Confidence interval: 28-43). 

General aviation: General aviation avionics; 
Unweighted sample size: 22; 
Very great: 5% (Confidence interval: 0-22); 
Great: 0% (Confidence interval: 0-13); 
Moderate: 14% (Confidence interval: 3-34); 
Some: 5% (Confidence interval: 0-22); 
None: 23% (Confidence interval: 8-44); 
Don't know: 55% (Confidence interval: 33-75). 

General aviation: General aviation maintenance; 
Unweighted sample size: 56; 
Very great: 2% (Confidence interval: 0-9); 
Great: 7% (Confidence interval: 2-17); 
Moderate: 11% (Confidence interval: 4-21); 
Some: 11% (Confidence interval: 4-21); 
None: 27% (Confidence interval: 16-39); 
Don't know: 43% (Confidence interval: 31-55). 

General aviation: General aviation operations; 
Unweighted sample size: 54; 
Very great: 2% (Confidence interval: 0-9); 
Great: 15% (Confidence interval: 7-26); 
Moderate: 7% (Confidence interval: 2-17); 
Some: 30% (Confidence interval: 19-43); 
None: 26% (Confidence interval: 15-39); 
Don't know: 20% (Confidence interval: 11-33). 

Aircraft certification; 
Unweighted sample size: 25; 
Very great: 4% (Confidence interval: 0-19); 
Great: 0% (Confidence interval: 0-11); 
Moderate: 12% (Confidence interval: 3-30); 
Some: 20% (Confidence interval: 7-39); 
None: 28% (Confidence interval: 13-48); 
Don't know: 36% (Confidence interval: 19-56). 

All inspectors; 
Unweighted sample size: 388; 
Very great: 3% (Confidence interval: 2-5); 
Great: 10% (Confidence interval: 7-13); 
Moderate: 7% (Confidence interval: 5-10); 
Some: 15% (Confidence interval: 12-18); 
None: 28% (Confidence interval: 24-32); 
Don't know: 37% (Confidence interval: 32-41). 

Source: GAO survey of FAA inspectors. 

Note: The data in this table represent the responses from inspectors 
to the following question, "To what extent are there opportunities for 
FAA to offer or contract for technical training, such as recurrency 
training, closer to your work location that is currently held at a 
central location far from your work location?" For more detail about 
the estimates and the corresponding confidence intervals (numbers in 
parentheses), please see the text at the beginning of this appendix. 
Some of the row percentages will not add up to 100 percent due to 
rounding. See figure 12 for visual representation. 

[End of table]

[End of section]

Appendix III: Scope and Methodology: 

To assess the extent to which FAA followed effective management 
practices in planning for, developing, and delivering up-to-date 
technical training, and ensuring that the technical training for 
inspectors contributes to improved performance and results, we 
identified key elements for assessing effective training and 
development efforts in the federal government using our recent guide on 
this subject.[Footnote 59] We identified the elements of this guidance 
that were most relevant to the training activities at FAA for aviation 
safety inspectors and then determined the extent to which FAA followed 
these practices. In determining the extent to which FAA followed a 
practice, we used the following scale: "fully" indicated that in our 
judgment all or virtually all aspects of the practice were followed; 
"mostly" indicated that more than half were followed; "partially" 
indicated that less than half were followed; and "not followed" 
indicated that few or no aspects of the practice were followed. For 
each element, we obtained information from FAA on its plans and 
activities and compared this information with the published criteria. 
We discussed this information with FAA training and program officials 
to gain their perspectives. In addition to gaining an understanding of 
these plans and activities generally, we applied the elements in our 
training guidance to two emerging technologies (glass cockpits and 
composite materials) and determined how training needs in these areas 
were incorporated into training courses for FAA inspectors. 

We supplemented these activities in several ways to gain additional 
perspectives of inspector technical training needs and FAA's efforts to 
meet these needs. First, we collected materials from and interviewed 
FAA managers, supervisors, and inspectors at 7 of approximately 130 
locations across the United States where FAA inspections take place. 
These locations represented a mix of FAA inspector responsibilities for 
air carrier and general aviation operations and maintenance, new 
aircraft certifications, and oversight of manufacturing facilities. 
Second, we discussed technical training needs and FAA's actions with 
senior management of the Professional Airways System Specialists, the 
collective bargaining unit for air safety inspectors. Third, we sought 
the advice of two sets of experts, one to provide advice on the overall 
design of our study and a second to help us assess FAA's technical 
training curriculum and the extent to which FAA ensures that safety 
inspectors receive needed technical training. (See table 23.) We 
selected these experts on the basis of their knowledge of FAA safety 
inspectors and aviation technologies. We also sought the views of 23 
member airlines of the Air Transport Association and the Regional 
Airline Association on the technical training of inspectors. Fourth, we 
visited the FAA Training Academy in Oklahoma City, Oklahoma, and Embry-
Riddle Aeronautical University in Daytona Beach, Florida, to learn more 
about how courses are delivered to inspectors. Fifth, we reviewed 
National Transportation Safety Board recommendations concerning FAA 
safety inspector technical training. Lastly, we reviewed our studies 
and those of the Department of Transportation's Inspector General 
concerning inspector training and human capital issues. (See the 
Related Products section of this report for a list of our products.) 

Table 23: Experts Consulted for Our Work: 

Design experts: Mr. Gary Kiteley, Executive Director, Council on 
Aviation Accreditation; 
Curriculum experts: Mr. Brian Finnegan, President, Professional 
Aviation Maintenance Association. 

Design experts: Mr. Kent Lovelace, Chairman and Professor, Department 
of Aviation, University of North Dakota; 
Curriculum experts: Mr. David Lotterer, Vice President of Technical 
Services, Regional Airline Association. 

Design experts: Dr. Thomas Q. Carney, Professor and Department Head, 
Department of Aviation Technology, Purdue University; 
Curriculum experts: Mr. Basil Barimo, Vice President, Operations and 
Safety and Mr. Mont J. Smith, Director, Safety; Air Transport 
Association of America. 

Design experts: Mr. Anthony J. Broderick, Independent Aviation Safety 
Consultant; 
Curriculum experts: Mr. David Wright, Director of Training, Aircraft 
Owners and Pilots Association, Air Safety Foundation. 

Curriculum experts: Mr. Theodore Beneigh, Professor, Aeronautical 
Science; Mr. Charles Westbrooks, Assistant Professor, Aeronautical 
Science; Mr. Fred Mirgle, Director, Aviation and Avionics Training; Mr. 
Neill Fulbright, Associate Program Coordinator, Avionics Line 
Maintenance; Embry-Riddle Aeronautical University. 

Curriculum experts: Mr. Walter Desrosier, Vice President, Engineering 
and Maintenance; Mr. Jens Hennig, Manager, Operations; Mr. Gregory 
Bowles, Manager, Engineering and Maintenance; General Aviation 
Manufacturers Association. 

Curriculum experts: Dr. Michael Romanowski, Vice President, Civil 
Aviation and Mr. Ronald R. Baker, Jr., Manager, Civil Aviation 
Programs; Aerospace Industries Association. 

Curriculum experts: Ms. Sarah MacLeod, Executive Director and Mr. Paul 
Hawthorne, Vice President, Operations; Aeronautical Repair Station 
Association. 

Source: GAO. 

[End of table]

To determine the type and amount of technical and other training that 
FAA inspectors receive, we obtained course descriptions from FAA and 
data from the Flight Standards training management system database and 
spreadsheets from Aircraft Certification for fiscal years 2002 through 
2004. FAA officials indicated their belief that a study of inspector 
technical training should encompass training records over the whole of 
the inspectors' careers. However, because the Vision 100-Century of 
Aviation Reauthorization Act asked us to study up-to-date training on 
the latest technologies, we analyzed only the most recent 3 fiscal 
years of data. The data that we obtained included (1) essential and 
recommended courses by type of inspector; (2) training completed by 
each inspector; and (3) inspector specialty, location, and date of 
employment. We then calculated the amount of inspector training 
completed by course category and by type of inspector. In addition, we 
used the training records and course requirements to determine the 
extent to which inspectors have completed essential FAA courses. 
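
For illustration, the following Python sketch groups completed-course 
records by inspector type and course category and computes each 
inspector's share of completed essential courses. The field names, 
course identifiers, and record layout are hypothetical; they do not 
reproduce the structure of FAA's training management system. 

from collections import defaultdict

# Hypothetical completion records: (inspector_id, inspector_type,
# course_id, course_category).
completions = [
    ("I001", "Air carrier maintenance", "C101", "technical"),
    ("I001", "Air carrier maintenance", "C205", "nontechnical"),
    ("I002", "General aviation operations", "C101", "technical"),
]

# Hypothetical essential-course lists by inspector type.
essential = {
    "Air carrier maintenance": {"C101", "C205", "C310"},
    "General aviation operations": {"C101", "C150"},
}

# Amount of training completed, by inspector type and course category.
by_type_and_category = defaultdict(int)
for _, itype, _, category in completions:
    by_type_and_category[(itype, category)] += 1

# Share of essential courses each inspector has completed.
essential_taken = defaultdict(set)
for iid, itype, course, _ in completions:
    if course in essential.get(itype, set()):
        essential_taken[(iid, itype)].add(course)

for (iid, itype), courses in sorted(essential_taken.items()):
    share = 100 * len(courses) / len(essential[itype])
    print(f"{iid} ({itype}): {share:.0f}% of essential courses completed")

print(dict(by_type_and_category))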

To assess the reliability of the training data, we (1) interviewed 
knowledgeable agency officials about the data, (2) performed electronic 
testing of relevant data fields for obvious errors in accuracy and 
completeness, and (3) collected and reviewed documentation from data 
system managers about the data and the systems that produced them. We 
determined that the data were sufficiently reliable for the purposes of 
this report. 
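
A minimal sketch of this kind of electronic testing, using invented 
field names and date ranges rather than FAA's actual data layout, might 
flag records with missing values or implausible dates: 

from datetime import date

# Hypothetical training records with the kinds of fields described
# above: inspector specialty, location, employment date, and course
# completion date.
records = [
    {"specialty": "Air carrier avionics", "location": "SEA",
     "hired": date(1998, 5, 1), "completed": date(2003, 7, 15)},
    {"specialty": "", "location": "BWI",
     "hired": date(2001, 3, 10), "completed": date(1999, 1, 1)},
]

FY_WINDOW = (date(2001, 10, 1), date(2004, 9, 30))  # fiscal years 2002-2004

def obvious_problems(record):
    """Return a list of obvious accuracy or completeness problems."""
    problems = []
    if not record["specialty"]:
        problems.append("missing specialty")
    if not FY_WINDOW[0] <= record["completed"] <= FY_WINDOW[1]:
        problems.append("completion date outside fiscal years 2002-2004")
    if record["completed"] < record["hired"]:
        problems.append("course completed before date of employment")
    return problems

for i, rec in enumerate(records):
    print(f"record {i}: {obvious_problems(rec) or 'no obvious problems'}")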

To gather information about inspectors' perspectives on the technical 
training available to them, we conducted a Web-based survey of a 
representative sample of FAA safety inspectors. The survey asked a 
combination of questions that allowed for open-ended and close-ended 
responses. We drew a stratified random probability sample of 496 
inspectors from the population of 2,989 aviation safety inspectors 
across the United States.[Footnote 60] We stratified the population 
into 12 groups on the basis of the type of work the inspector 
performed. Each sample element was subsequently weighted in the 
analysis to account statistically for all the members of the 
population. 
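
In general terms, each sampled inspector in a stratum carries a weight 
equal to the number of inspectors that he or she represents in that 
stratum, and the overall estimate combines the 12 stratum estimates in 
proportion to stratum size. The standard formulas for a stratified 
sample, shown here only as a simplified illustration rather than a 
description of the exact estimation procedures used, are: 

\[
w_h = \frac{N_h}{n_h}, \qquad
\hat{p} = \sum_{h=1}^{12} \frac{N_h}{N}\,\hat{p}_h ,
\]

where N_h is the number of inspectors in stratum h, n_h is the number 
sampled from that stratum, N = 2,989 is the total population, and 
\hat{p}_h is the estimated proportion within the stratum. 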

Because we followed a probability procedure based on random selection, 
our sample is only one of a large number of samples that we might have 
drawn. Since each sample could have provided different estimates, we 
express our confidence in the precision of our particular sample's 
results as a 95 percent confidence interval (e.g., plus or minus 4.6 
percentage points). This is the interval that would contain the actual 
population value for 95 percent of the samples we could have drawn. As 
a result, we are 95 percent confident that each of the confidence 
intervals in this report will include the true values in the study 
population. The percentage estimates for all survey respondents have a 
margin of error of plus or minus 4.6 percentage points. However, the 
margins of error for the subgroup estimates are larger, ranging from 
plus or minus 9.7 to plus or minus 20.0 percentage points. Survey 
estimates presented as comparisons between groups are 
statistically significant when the 95 percent confidence intervals do 
not overlap. 
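
As a rough illustration of where a figure such as plus or minus 4.6 
percentage points comes from, consider the margin of error for a simple 
random sample of n = 392 respondents drawn from a population of 
N = 2,989, using the most conservative proportion (p = 0.5) and a 
finite population correction. This simplified calculation ignores the 
stratification and weighting described above and is shown only to 
indicate the general magnitude: 

\[
ME \approx 1.96 \sqrt{\frac{p(1-p)}{n}} \sqrt{\frac{N-n}{N-1}}
   = 1.96 \sqrt{\frac{0.25}{392}} \sqrt{\frac{2{,}597}{2{,}988}}
   \approx 0.046 .
\]

Subgroup margins of error are larger simply because the subgroup sample 
sizes are smaller, and two subgroup estimates are treated as 
statistically different only when their 95 percent confidence intervals 
do not overlap. 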

The surveys were conducted using self-administered electronic 
questionnaires accessible on the Internet through a secure Web browser. 
We sent e-mail notifications to 496 inspectors, beginning on December 
4, 2004. We then sent each potential respondent a unique password and 
username to ensure that only members of the target population could 
participate in the survey. The initial version of the questionnaire 
that was posted on December 4, 2004, did not include three questions. A 
revised version was posted on December 14, 2004, before most 
respondents had answered the questionnaire. Because approximately one-
quarter of the respondents did not answer these three new questions 
(questions 9, 20, and 25d), these results are not included in the 
report. To encourage respondents to complete the questionnaire, we sent 
a subsequent e-mail message to further prompt each nonrespondent 
approximately 2 weeks after the initial e-mail message. We sent 
nonrespondents two more notices and closed the survey on February 4, 
2005. Of the 496 inspectors whom we surveyed, we received 392 useable 
responses (79 percent). 

In addition to these sampling errors, the practical difficulties in 
conducting surveys of this type may introduce other types of errors, 
commonly referred to as nonsampling errors. For example, questions may 
be misinterpreted, or the respondents' answers may differ from those of 
the inspectors who did not respond. We took steps to reduce these 
errors. 

Finally, we pretested the content and format of the questionnaire with 
safety inspectors at local FAA offices in Baltimore, Los Angeles, and 
Seattle. During the pretests we asked the inspectors questions to 
determine whether (1) the survey questions were clear, (2) the terms 
used were precise, (3) the questionnaire placed an undue burden on the 
respondents, and (4) the questions were unbiased. We made changes to 
the content and format of the final questionnaire on the basis of the 
pretest results. 

To determine the amount of training FAA receives from the aviation 
industry, we analyzed the training records of all FAA safety 
inspectors. We obtained FAA's course numbering and categorization 
system and used it to determine whether individual courses were 
provided by FAA or by the aviation industry. We computed the total 
number of technical courses attended by FAA inspectors from fiscal 
years 2002 through 2004 and identified those provided by the aviation 
industry. We discussed our results with FAA training officials. See the 
discussion above for our actions to assess the completeness and 
reliability of these data. 
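
The classification step can be pictured with a short Python sketch. The 
prefix-to-provider mapping below is purely hypothetical, since this 
report does not reproduce FAA's course numbering and categorization 
system: 

# Hypothetical mapping from course-number prefix to training provider;
# FAA's actual numbering scheme is not reproduced here.
PROVIDER_BY_PREFIX = {"21": "FAA", "25": "FAA", "70": "industry"}

courses_attended = ["21001", "70310", "25442", "70125", "70310"]

totals = {"FAA": 0, "industry": 0}
for course in courses_attended:
    provider = PROVIDER_BY_PREFIX.get(course[:2])
    if provider is not None:
        totals[provider] += 1

industry_share = 100 * totals["industry"] / sum(totals.values())
print(totals, f"industry share: {industry_share:.0f}%")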

To determine the amount of training safety inspectors received from 
industry either (1) in return for in-kind services or (2) for free, we 
reviewed training records and interviewed FAA headquarters officials 
and regional officials about FAA policies for accepting these types of 
training. We also asked about procedures used when such training is 
requested, including steps taken to ensure that any real or apparent 
conflict-of-interest issues are addressed. FAA does not keep separate 
records of these two types of training, and these data cannot easily be 
identified from the central training data files. Therefore, we instead 
interviewed officials at FAA's nine regional offices and requested 
these training data from them. Subsequently, we used the safety 
inspector training records to validate some of these data. We relied on 
FAA's nine regional office officials to contact over 100 Flight 
Standards and Aircraft Certification offices to collect these data for 
fiscal years 2002 through 2004. Some regions indicated that their 
offices did not keep full records. Other regions provided us with 
incomplete data records, sometimes without names or specific dates. 
Because of the large number of offices from which the data were 
gathered, it was not practical for us to independently verify the 
completeness or accuracy of these data. As a result, we cannot be sure 
that the information FAA supplied includes all industry-provided 
training received for the 3 fiscal years. 

[End of section]

Appendix IV: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Gerald L. Dillingham, Ph.D. (202) 512-2834: 

Acknowledgments: 

In addition to the contact named above, James Ratzenberger, Assistant 
Director; Carl Barden; Nancy Boardman; Brad Dubbs; Alice Feldesman; Jim 
Geibel; Kim Gianopoulos; David Hooper; Michael Krafve; Ed Laughlin; 
Donna Leiss; Jean McSween; Minette Richardson; and Sandra Sokol made 
key contributions to this report. 

[End of section]

Related GAO Products: 

FAA Safety Inspector Training: 

National Aerospace System: Reauthorizing FAA Provides Opportunities and 
Options to Address Challenges. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-473T] 
Washington, D.C.: February 12, 2003. 

Major Management Challenges and Program Risks: Department of 
Transportation. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-108] 
Washington, D.C.: January 1, 2003. 

Aviation Safety: FAA's New Inspection System Offers Promise, but 
Problems Need to Be Addressed. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-99-183] 
Washington, D.C.: June 28, 1999. 

Aviation Safety: Weaknesses in Inspection and Enforcement Limit FAA in 
Identifying and Responding to Risks. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-98-6] 
Washington, D.C.: February 27, 1998. 

Aviation Safety: FAA Oversight of Repair Stations Needs Improvement. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-98-21] 
Washington, D.C.: October 24, 1997. 

Aviation Safety: Targeting and Training of FAA's Safety Inspector 
Workforce. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-RCED-96-26] 
Washington, D.C.: April 30, 1996. 

FAA Technical Training. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-94-296R] 
Washington, D.C.: September 26, 1994. 

FAA Budget: Important Challenges Affecting Aviation Safety, Capacity, 
and Efficiency. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-RCED-93-33] 
Washington, D.C.: April 26, 1993. 

FAA Budget: Key Issues Need to Be Addressed. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-RCED-92-51] 
Washington, D.C.: April 6, 1992. 

Aviation Safety: Commuter Airline Safety Would Be Enhanced with Better 
FAA Oversight. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-RCED-92-40] 
Washington, D.C.: March 17, 1992. 

Aviation Safety: FAA Needs to More Aggressively Manage Its Inspection 
Program. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-RCED-92-25] 
Washington, D.C.: February 6, 1992. 

Aviation Safety: Problems Persist in FAA's Inspection Program. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-92-14] 
Washington, D.C.: November 20, 1991. 

Serious Shortcomings in FAA's Training Program Must Be Remedied. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-RCED-90-91] 
Washington, D.C.: June 21, 1990. 

Serious Shortcomings in FAA's Training Program Must Be Remedied. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-RCED-90-86] 
Washington, D.C.: June 6, 1990. 

Staffing, Training, and Funding Issues for FAA's Major Work Forces. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-RCED-90-42] 
Washington, D.C.: March 14, 1990. 

Aviation Training: FAA Aviation Safety Inspectors Are Not Receiving 
Needed Training. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-89-168] 
Washington, D.C.: September 14, 1989. 

FAA Training: Continued Improvements Needed in FAA's Controller Field 
Training Program. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-89-83] 
Washington, D.C.: March 29, 1989. 

FAA Staffing: Recruitment, Hiring, and Initial Training of Safety-
Related Personnel. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-88-189] 
Washington, D.C.: September 2, 1988. 

Aviation Safety: Measuring How Safely Individual Airlines Operate. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-88-61] 
Washington, D.C.: March 18, 1988. 

Aviation Safety: Needed Improvements in FAA's Airline Inspection 
Program Are Underway. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-87-62] 
Washington, D.C.: May 19, 1987. 

FAA Work Force Issues. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-RCED-87-25] 
Washington, D.C.: May 7, 1987. 

Human Capital: 

Human Capital: A Guide for Assessing Strategic Training and Development 
Efforts in the Federal Government. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-546G] 
Washington, D.C.: March 1, 2004. 

Human Capital: Selected Agencies' Experiences and Lessons Learned in 
Designing Training and Development Programs. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-291] 
Washington, D.C.: January 30, 2004. 

Human Capital Management: FAA's Reform Effort Requires a More Strategic 
Approach. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-156] 
Washington, D.C.: February 3, 2003. 

A Model of Strategic Human Capital Management. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-02-373SP] 
Washington, D.C.: March 15, 2002. 

Human Capital: A Self-Assessment Checklist for Agency Leaders. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/OCG-00-14G] 
Washington, D.C.: September 1, 2000. 

Human Capital: Design, Implementation, and Evaluation of Training at 
Selected Agencies. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-GGD-00-131] 
Washington, D.C.: May 18, 2000. 

Human Capital: Key Principles From Nine Private Sector Organizations. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/GGD-00-28] 
Washington, D.C.: January 31, 2000. 

Related FAA Training: 

Federal Aviation Administration: Challenges for Transforming into a 
High-Performing Organization. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-770T] 
Washington, D.C.: May 18, 2004. 

Aviation Safety: Data Problems Threaten FAA Strides on Safety Analysis 
System. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/AIMD-95-27] 
Washington, D.C.: February 8, 1995. 

Aircraft Certification: New FAA Approach Needed to Meet Challenges of 
Advanced Technology. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-93-155] 
Washington, D.C.: September 16, 1993. 

Aviation Safety: Progress on FAA Safety Indicators Program Slow and 
Challenges Remain. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/IMTEC-92-57] 
Washington, D.C.: August 31, 1992. 

Aviation Safety: FAA's Safety Inspection Management System Lacks 
Adequate Oversight. 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/RCED-90-36] 
Washington, D.C.: November 13, 1989. 

[End of section] 

(540089): 

FOOTNOTES

[1] In addition, FAA delegates about 90 percent of its safety 
inspection activities to about 13,600 private persons and 
organizations, known as designees. The designees augment FAA's 
inspection workforce by allowing inspectors to concentrate on what FAA 
considers to be the most critical safety areas. For example, while 
designees conduct routine functions, such as approvals of aircraft 
technologies that the agency and designees have had previous 
experience with, FAA inspectors focus on new and complex aircraft 
designs or design changes. For an assessment of the designee programs, 
see GAO, Aviation Safety: FAA Needs to Strengthen the Management of Its 
Designee Programs, GAO-05-40 (Washington, D.C.: Oct. 8, 2004). 

[2] System safety is a multidisciplinary, integrated, and comprehensive 
regulatory approach using engineering and management principles, 
criteria, and techniques to identify and mitigate high-risk areas. When 
FAA uses it in the oversight of airlines, the system safety approach 
covers every aspect of an airline's operations, from the design of the 
hardware to the culture and attitudes of the airline's personnel. The 
approach calls for a systematic review of an airline's policies and 
procedures to ensure that they incorporate such basic safety principles 
as clear lines of responsibility and written documentation. According 
to FAA, the approach allows it to concentrate and target inspector 
resources where there is the greatest safety risk. The success of a 
system safety approach to regulation depends on comprehensive safety 
data, sophisticated analysis tools, and a workforce well trained in 
risk assessment, auditing, systems thinking, and communications. 

[3] In addition to training involving aviation technologies (such as 
use of new materials in aircraft and aircraft electronic systems), FAA 
includes in its definition of technical training topics such as 
inspector job skills, risk analysis, data analysis, and training in 
software packages, such as spreadsheets. Our use of the term 
"technical" is limited to aviation technologies. 

[4] We considered all training that FAA classifies as either mandatory, 
position essential, or continuing development as essential training. 
Mandatory training is required for all newly hired inspectors, and 
previous experience may not be substituted for it. Position essential 
training is training or a skill that is required based on an 
inspector's current position (e.g., training required for maintenance 
inspectors). To determine which courses were technical, we reviewed the 
description for each course taken from 2002 through 2004 and determined 
whether it was primarily technical in nature, within our use of the 
term. 

[5] GAO, Human Capital: A Guide for Assessing Strategic Training and 
Development Efforts in the Federal Government, GAO-04-546G (Washington, 
D.C.: Mar. 1, 2004). 

[6] FAA inspectors specialize in conducting inspections of various 
aspects of the aviation system, such as aircraft and parts 
manufacturing, aircraft operations, aircraft airworthiness, and cabin 
safety. See the background section of this report for more information 
on inspector specialization. 

[7] Because of the statistical survey techniques we employed in 
surveying FAA's inspectors, we are 95 percent confident that the 
results we present are within 4.6 percentage points of the results that 
we would have obtained if we had surveyed all 3,000 front-line 
inspectors. That is, we are 95 percent confident that had we surveyed 
all inspectors, between 48 and 57 percent of them would have told us 
that, to a great or very great extent, they have the technical 
knowledge to do their jobs. All percentage estimates from the survey 
have a margin of error of plus or minus 4.6 percentage points or less, 
unless otherwise noted. 

Throughout the survey we used a 5-point scale (very great, great, 
moderate, some, and no extent). For the most part, we report on the 
degree to which inspectors expressed their views to a very great or 
great extent because we believe that "a moderate extent" does not 
represent a strong positive or negative view and does not represent a 
level of performance to which a high-performing organization should 
aspire. 

[8] The goal of ATOS is to identify safety trends in order to spot and 
correct problems at their root cause before an accident occurs. This 
program now allows FAA inspectors to look at an airline as a whole, to 
see how the many elements of its operations--from aircraft to pilots to 
maintenance facilities to flight dispatch to cabin safety--interact to 
meet federal standards. The program will ultimately encompass all of 
the approximately 120 American airlines that operate in the United 
States at any given time. 

[9] A repair station is an FAA-certified maintenance facility that is 
authorized to perform maintenance or alterations on U.S.-registered 
aircraft. 

[10] Air carriers are considered new entrants (or new air carriers) for 
their first 5 years of operation. 

[11] We will issue a report on FAA's oversight of non-ATOS carriers 
later this year. 

[12] In addition, Flight Standards has recently moved forward with a 
curriculum transformation strategy that will fundamentally change how 
its training program is managed. Acknowledging that its training 
activities tend to be fragmented, sometimes working at cross purposes, 
and sometimes leaving major gaps, Flight Standards' transformation 
strategy calls for a transition from the current course management 
structure to one that is curriculum based. The new structure will 
integrate individual training courses into a logical curriculum for 
each type of inspector that incorporates the specialty needs of 
inspectors. Flight Standards believes this new approach will cover 
inspectors' technical training needs, including their training on 
current technologies. Under this transformation strategy, training 
curriculum oversight teams will be formed for each inspector specialty 
to ensure that those inspectors receive the appropriate technical 
training. 

[13] We did not attempt to assess how Flight Standards conducted its 
analysis because it could not locate documentation associated with it. 

[14] Flight Standards officials said that the office is validating the 
inspector competencies. 

[15] Composite materials are materials that when combined are stronger 
than the individual materials by themselves. The benefits of using 
composite materials in aviation include light weight, durability, and 
corrosion resistance. 

[16] The 95 percent confidence interval for this estimate is from 23 to 
32 percent. 

[17] About 34 percent of all essential courses are technical; this 
share ranges from 0 percent for aircraft certification inspectors to 50 
percent for air 
carrier avionics inspectors. See table 10 in appendix II for additional 
results. 

[18] The 95 percent confidence interval for this estimate is from 48 to 
57 percent. 

[19] The 95 percent confidence interval for the cabin safety inspector 
responses is from 62 to 94 percent. The 95 percent confidence interval 
for the aircraft certification inspector responses is from 51 to 85 
percent. 

[20] The 95 percent confidence interval for this estimate is from 22 to 
48 percent. The survey also asked inspectors about their knowledge of 
the automated systems used in their jobs, such as ATOS or the 
Performance Tracking and Reporting Subsystem, because these are 
important tools for the system safety approach to inspections. We 
estimate that about 46 percent of inspectors believe, to a great or 
very great extent, that they have enough knowledge of automated systems 
to do their jobs. The 95 percent confidence interval for this estimate 
is from 42 to 51 percent. 

[21] FAA notes that its course evaluations show that 78 percent of 
its employees report that training has improved their job performance. 
However, its survey results are not comparable to ours. First, FAA's 
results represent responses to evaluations for all courses, both 
technical and nontechnical. In addition, FAA's figure includes 
inspectors who responded that FAA training greatly improved, improved, 
or somewhat 
improved their job performance. In analyzing the results of our survey, 
we did not include the third category, as it does not represent a 
strong endorsement for the results of FAA training. 

[22] The 95 percent confidence interval for this estimate is from 30 to 
39 percent. FAA officials stressed that training occurs over the span 
of a career and cautioned that asking inspectors' views about 2 years 
of experience would present a distorted view. According to our 
analysis, FAA inspectors have been with the agency an average of 9.3 
years; according to our survey, inspectors have been in their current 
position 
an average of 5.3 years. We recognize FAA's concern. However, it is not 
reasonable to expect inspectors to recall their views on training 
received over a large time span, as doing so could lead to unreliable 
results. In addition, since this report focuses on FAA's current 
actions to ensure up-to-date technical training, we believe it is more 
useful to measure inspectors' views about the training that they are 
receiving or have recently received. 

[23] The 95 percent confidence intervals for these estimates are from 
29 to 49 percent and from 6 to 39 percent, respectively. 

[24] Our survey provided the opportunity for inspectors to relate 
anything they wanted us to know about technical training. Some 
inspectors submitted both positive and negative comments. 

[25] The 95 percent confidence interval for this estimate is from 76 to 
84 percent. 

[26] The 95 percent confidence intervals for these estimates are 21 to 
29 percent and 36 to 45 percent, respectively. 

[27] See appendix III for a list of these experts. Experts commented in 
the area of their expertise. 

[28] We contacted representatives from the airlines that belong to the 
Air Transport Association and the Regional Airline Association to 
obtain their perspectives on FAA inspector technical training. 

[29] See footnote 4 for how we defined essential training. 

[30] The 95 percent confidence interval for this estimate is from 38 to 
48 percent. 

[31] The 95 percent confidence interval for this estimate is from 23 to 
32 percent. 

[32] The 95 percent confidence interval for this estimate is from 45 to 
54 percent. 

[33] The 95 percent confidence intervals for these estimates are 47 to 
64 percent and 27 to 43 percent, respectively. 

[34] The 95 percent confidence interval for this estimate is from 50 to 
59 percent. 

[35] The 95 percent confidence intervals for these estimates are 32 to 
41 percent, 23 to 32 percent, and 23 to 31 percent, respectively. 

[36] We analyzed the training records for FAA's approximately 3,000 
front-line inspectors only. The analysis did not include supervisors, 
managers, and others in the aviation safety inspector job series who do 
not perform front-line inspections. 

[37] About 81 percent of the inspectors have completed at least half of 
their essential courses, both technical and nontechnical. (See table 11 
in app. II.) In addition to the essential courses, most inspectors were 
also able to take other technical training courses that they, their 
supervisors, and FAA management have determined are related to their 
jobs. Inspectors averaged 1.7 technical courses outside of their list 
of essential courses over the past 3 fiscal years. (See table 12 in 
app. II.)

[38] The 95 percent confidence interval for this estimate is from 16 to 
24 percent. 

[39] The maximum value for the upper end of the 95 percent confidence 
interval for all other inspectors is 49 percent. The 95 percent 
confidence interval for general aviation avionics inspectors is between 
0 and 13 percent. 

[40] The 95 percent confidence interval for this estimate is from 19 to 
26 percent. Inspector dissatisfaction with the timeliness of training 
delivery is not limited to technical training. Inspectors also 
expressed concern about the timeliness of automation training. 
According to our survey, about 29 percent of inspectors indicated, to a 
great or very great extent, that this type of training was received in 
time to do their current job. The 95 percent confidence interval for 
this estimate is from 24 to 33 percent. 

[41] As of June 2005. 

[42] Flight Standards estimates that in addition it will develop 7 new 
courses, revise 9 existing courses, and complete 9 course evaluations 
by the end of fiscal year 2005. 

[43] Robert O. Brinkerhoff and Anne M. Apking, High Impact Learning: 
Strategies for Leveraging Business Results from Training, (Cambridge: 
Perseus, 2001) and Robert O. Brinkerhoff, The Success Case Method, (San 
Francisco: Berrett-Koehler Publishers, Inc., 2003). 

[44] We do not present the results of the 8 field and 3 headquarters 
managers because of the small numbers and because some did not answer all 
questions. 

[45] According to data provided by Flight Standards, more than 60 
requests for revised, updated, or new courses have been submitted 
since 2000, with more than 33 of them submitted since 2003 (plus a 
possible 11 more that were undated). Of all the requests submitted, 23 
were for technical courses. 

[46] The 95 percent confidence interval for this estimate is from 50 to 
60 percent. 

[47] The 95 percent confidence interval for this estimate is from 44 to 
54 percent. 

[48] Donald L. Kirkpatrick, Evaluating Training Programs: The Four 
Levels (San Francisco: Berrett-Koehler Publishers, Inc., 1994). 
Kirkpatrick developed a widely recognized four-level model for 
evaluating training and development efforts. The fourth level is 
sometimes split into two levels, with the fifth level representing a 
comparison of costs and benefits quantified in dollars. This fifth 
level, return on investment, is attributed to Jack Phillips, and 
education and training seminars often teach the two methodologies 
together. 

[49] In addition, FAA receives a very limited amount of free training 
from aircraft manufacturers during the development and deployment of a 
new or reconfigured aircraft type. FAA inspectors and representatives 
from the aircraft manufacturer initially certify each other by flying 
in the new aircraft. During these initial flights, the FAA inspector 
works closely with a manufacturer's test pilot to learn to operate the 
aircraft while also identifying any special training or operating 
requirements that will be necessary to certify future pilots. The 
criteria for pilot certification on the unique characteristics of the 
aircraft are also identified during this process. As this is 
accomplished, the FAA inspector learns to fly the aircraft at no cost 
to FAA. These instances account for fewer than 10 training sessions 
per year for FAA inspectors, according to FAA. 

[50] These qualified pilots become designated check airmen, who are 
then permitted to conduct flight checks or instruction in an airplane 
to certify that other air carrier pilots are properly trained and able 
to fly the aircraft. 

[51] U.S. Department of Transportation, Office of Inspector General, 
Free Industry Flight Training of Inspectors: Federal Aviation 
Administration, AV-1998-042 (Washington, D.C.: Dec. 9, 1997). 

[52] The 95 percent confidence interval for this estimate is from 33 to 
42 percent. 

[53] The 95 percent confidence interval for this estimate is from 2.7 
to 3.5 weeks. 

[54] The 95 percent confidence intervals for these estimates are from 
50 to 59 percent and 22 to 31 percent, respectively. 

[55] The 95 percent confidence interval for this estimate is from 15 to 
23 percent. 

[56] The 95 percent confidence intervals for these estimates are from 
2.8 to 3.6 weeks and 1.0 to 3.0 weeks, respectively. 

[57] The 95 percent confidence interval for this estimate is from 9 to 
16 percent. 

[58] The 95 percent confidence interval for this estimate is from 32 to 
41 percent. 

[59] GAO-04-546G. 

[60] Our population included only those inspectors who actively 
participate in inspection activities as part of their regular job 
duties. It did not include managers, supervisors, or inspectors 
detailed to headquarters or regional offices. FAA employs a total of 
approximately 3,700 safety inspectors. 
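
The survey estimates cited in these notes are reported with 95 percent 
confidence intervals. As a rough illustration only, the short Python 
sketch below shows how such an interval can be computed for a survey 
proportion under a normal approximation and a simple random sample; the 
sample size and point estimate used here are hypothetical, and GAO's 
actual methodology (for example, any stratification or finite-
population corrections) may differ. 

    import math

    def proportion_ci(p_hat, n, z=1.96):
        # Normal-approximation 95 percent confidence interval for a proportion.
        se = math.sqrt(p_hat * (1.0 - p_hat) / n)  # standard error of the estimate
        return p_hat - z * se, p_hat + z * se

    # Hypothetical figures for illustration: an 80 percent estimate from 400 responses.
    low, high = proportion_ci(0.80, 400)
    print("95% CI: {:.0%} to {:.0%}".format(low, high))  # prints roughly 76% to 84%

With these hypothetical inputs, the computed interval happens to 
resemble the one reported in note 25; the intervals reported above 
reflect GAO's own sample design. 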

GAO's Mission: 

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office

441 G Street NW, Room LM

Washington, D.C. 20548: 

To order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm

E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director,

NelliganJ@gao.gov

(202) 512-4800

U.S. Government Accountability Office,

441 G Street NW, Room 7149

Washington, D.C. 20548: