A Guide to Auditing
Defense Acquisition Programs
Critical Program Management Elements, September 1998
Prepared by: Acquisition and Technology Management Directorate, Office
of the Assistant Inspector General for Auditing, Department of Defense
What's New: TRAINING AID FOR DOD AUDITORS IN PLANNING,
EXECUTING, AND REPORTING FOR ACQUISITION AUDITS, August 2005
INTRODUCTION
Background
In June 1988, the Inspector General, DoD,
implemented a standard approach to performing audits of major acquisition
programs. We refer to audits accomplished using that approach as Critical
Program Management Elements audits or PME audits. Initially, the overall
objective of a PME audit was to determine whether a major acquisition
program was on the right track.
In July 1993, we updated the PME audit
guide to reflect changes to the DoD Acquisition Directives and Instructions
(the DoD 5000 series), which were revised in February 1991. In March 1996,
OUSD(A&T) again revised the DoD 5000 series. This revision to the
PME approach reflects the latest issuance of the DoD 5000 series.
DoD 5000 Series
The latest issuance of the 5000 series incorporates acquisition reform
initiatives and new laws and policies that have been enacted since the
last update. Those laws and initiatives include Federal Acquisition Streamlining
Act (FASA) of 1994 and the institutionalization of Integrated Product
Teams (IPTs). The revised 5000 series also incorporates automated information
system life-cycle management policy and procedures, which were previously
covered in the DoD Directive 8120.1 and DoD Instruction 8120.2.
The revised 5000 series separates mandatory
policies and procedures from discretionary practices. DoD Directive 5000.1
establishes guiding principles for all Defense acquisitions. DoD Regulation
5000.2-R specifies mandatory policies and procedures for Major Defense
Acquisition Programs (MDAPs) and Major Automated Information Systems (MAISs),
and specifically where stated, for other than MDAPs or MAISs. DoD Regulation
5000.2-R also serves as a general model for other than MDAPs and MAISs.
The Defense Acquisition Deskbook describes the discretionary information
to which program managers and other participants in the Defense acquisition
process can turn for assistance in implementing guiding principles and
mandatory procedures.
Objectives of a PME Audit
As mentioned above, the objective of the original PME guide was to determine
whether a major acquisition program was on the right track. While that
is still an important objective, our objectives now revolve around identifying
and focusing on the high risk areas or areas of special interest of a
particular program. We no longer have the luxury of expending extensive
audit resources on all critical program management elements of a program.
In the past, we performed detailed audit work on all program management
elements and reported on each element in the report. In an effort to streamline
and make the acquisition audit process more responsive, the audit team
must tailor its guide.
Our objectives also include determining
whether a program is implementing smart business practices and is making
full use of the flexibilities available through acquisition reform initiatives.
The Acquisition Reform Implementation section of this guide contains suggested
audit steps specifically geared to gauging the status of implementation
of Acquisition Reform efforts within a program. Those steps and other
segments of this guide address many of the major focus areas that OUSD(A&T)
set forth as underlying Defense Acquisition Reform. In addition, we continue
to look for "best practices" or those things that a program is doing especially
well that could be held up as a model or a lesson learned for other programs.
We hope that this guide will help us achieve the above objectives
and further establish credibility with our audit clients.
Preparing for a PME Audit
Thorough preparation is essential to the effective execution of a PME
audit. Excellent preparation will save audit and client resources, and
enhance our credibility with our audit clients. Before initiating a PME
audit, the audit team should take several steps to ensure that the audit
objectives will be accomplished effectively and efficiently. Those steps
include meeting with key program officials, reviewing program assessments
that various organizations have done on the program, and reviewing program
reports and other information.
Key Officials.
Several months before the initiation of the audit,
the audit program director or audit project manager should meet with the
program manager, the program executive officer, and the OSD action officer
for the program. That meeting will start a dialogue on how the audit team
can make the audit as useful as possible and focus on areas in which they
can add the most value. The OSD Comptroller point of contact for the program
is another excellent individual to meet with before or in the early stages
of the audit survey. That individual frequently has an understanding of
the key issues that the program is facing.
Program Assessments.
Before initiating an audit, the audit team should become familiar with
all assessments that other organizations have done. We discuss some of
those key assessments below:
- Defense Acquisition Board (DAB). The DAB is the DoD senior
level forum for advising the USD(A&T) on critical decisions regarding
ACAT ID programs. The auditors should determine when the next DAB
review is planned and obtain the Acquisition Decision Memorandum (ADM)
from the last DAB review. In addition, the auditors should obtain
the "Blue Book" from the last DAB review. The "Blue Book" will set
forth the baseline established for the program at the last milestone
review and also issues and problems that were raised at the review.
- OIPT Integrated Assessment. The Overarching Integrated Product Team
(OIPT) leader for ACAT ID programs
provides an integrated assessment to the DAB at major program reviews
and milestone decision reviews using information gathered through
the IPT process. It focuses on core acquisition management issues
and takes account of independent assessments that OIPT members normally
prepare. The auditors should obtain the OIPT integrated assessment
from the last DAB or milestone review.
- Joint Requirements Oversight Council (JROC) Review Procedures.
The JROC reviews all deficiencies that may necessitate development
of major systems before any consideration by the DAB at Milestone
0. The JROC validates an identified mission need, assigns a joint
potential designator for meeting the need, and forwards the Mission
Need Statement (MNS) with JROC recommendations to USD(A&T). The
auditors should obtain a copy of the MNS and determine whether the
JROC is playing a continuing role in validating key performance parameters
in program baselines before a DAB.
Periodic Reports.
Before initiating the audit, the audit team should also review key program
documentation and reports. Part 6, DoD 5000.2-R, describes mandatory reports
that must be prepared periodically to provide acquisition executives and
Congress with adequate information to oversee the acquisition process
and make necessary decisions. The Defense Acquisition Executive Summary
(DAES) is a key report for gaining an understanding of program issues
and concerns and is the principal mechanism for tracking programs between
milestones reviews. The DAES is the vehicle for reporting program assessments,
unit cost, current estimates of the APB parameters, status reporting exit
criteria, and vulnerability assessments. The DAES and OSD DAES Assessments,
which contain the OSD assessment of the DAES, will give key indicators
of program progress and issues by function from both the Service Program
Manager and OSD analyst perspective. Copies of these reports can be obtained
at the initial meetings with key program officials. Auditors should analyze
the DAES information and interview the functional OSD analysts concerning
a program before visiting the program office during the audit. That will
enable the auditor to focus on potential issue areas and to apply a more
"tailored" audit approach. See the Periodic Reporting section of this
guide for other reports to request and audit steps relating to those reports.
The discretionary section of the Defense
Acquisition Deskbook has tables that list mandatory as well as discretionary
items for each milestone. Those tables will be helpful in determining
what program information should be available. When assigned areas to review
in a PME audit, the auditor should thoroughly study the relevant
sections of the Deskbook. The Deskbook also provides access to the most
current directives and references.
Engaging Integrated Product Teams
Early in the survey phase of a PME audit,
the audit team should become familiar with the IPT structure within the
program. IPTs are now an integral part of the Defense acquisition
oversight and review process. Engaging the IPT early in the audit process
will help the audit team identify critical issues and also issues that
are already being effectively worked. Several types of IPTs exist. For
each ACAT ID and IAM program, the Overarching IPT (OIPT) provides assistance,
oversight, and review as the program proceeds through the acquisition
life-cycle. The Working-Level IPTs (WIPTs) focus on a particular topic,
such as cost/performance, testing, or contracting. An Integrating IPT
(IIPT) will coordinate WIPT efforts and cover all topics not otherwise
assigned to another IPT. The IPT section of this guide sets forth possible
audit steps to use in evaluating how well the IPT process is working for
a program.
Conducting a PME Audit
This PME audit manual presents a general approach
for audits of acquisition programs. It should not be construed as a checklist
or as the audit guide for a particular system. Rather, this manual is
simply a tool that the audit team can use to identify the
key high risk areas of an acquisition program needing audit or review
and to develop a "tailored" audit guide for a specific system. The
audit team should develop a survey program, and a subsequent audit program
if necessary, based on this manual, information known about the program,
prior audit experience on similar programs or topics, and any changes
in DoD or Military Department policies, procedures, and practices made
since this manual was published.
We grouped the following steps into categories
that, for the most part, parallel the categories identified in the DoD
5000 Series. In order to streamline our audits as much as possible, we
realize that we must apply "risk management" to the audit process. Auditors
should use the information obtained about the program in the research
and preannouncement efforts to assist them in determining whether significant
problems may exist in a particular area and in determining how much effort
should be put into the area. By no means are all audit steps required
for all systems. We are no longer required to review and comment on all
areas of a program. Instead, we want to focus on those areas of a program
that are experiencing problems and in which we can add the most value.
The auditor's judgment will come into play in determining what steps to
perform for a particular program. Also, certain steps may only be applicable
to Major Defense Acquisition Programs (MDAPs).
Unlike previous editions of the PME guide,
this guide does not have separate sections for the various acquisition
phases. However, if a step is only applicable to a particular acquisition
phase (e.g., engineering and manufacturing development), we have indicated
that in parentheses after the step. Auditors are cautioned to remember
that the response to a particular audit step will be different depending
on what acquisition phase the system is in. For example, a program in
program definition will have a logistics plan that is not as detailed
as that of a program in the production phase. This guide is also being maintained
"on-line" so that it can be continually revised as auditors make suggestions
for improving PME audits.
Reporting on a PME Audit
The reporting process for a PME audit has also
changed. In order to be more responsive to our audit clients, we want
to strive to report out on significant issues as we find them so that
management can take appropriate action as soon as possible. In the past,
we frequently waited and issued a single report with multiple findings
at the end of a 10- or 11-month period. Just as DoD cannot afford
a 15-year acquisition cycle when the comparable acquisition cycle in the
commercial sector is 3 to 4 years, we can no longer wait a year before
issuing an audit report that addresses significant issues.
Electronic Version of PME Guide
An electronic version of the PME Guide is available at the Inspector
General, DoD website at www.dodig.mil/audit/pmeguide.html.
Lessons Learned and Suggestions for Updates,
Changes and Corrections
Each audit will result in information, ideas, methodologies, and other material
that others can use in performing their audits. A lessons learned feedback
form is available [temporarily off-line]. Anyone who has lessons learned
or other ideas that might be of interest can submit them electronically.
Submissions can include suggested problem areas or other issues that for some
reason were not in the report, as well as draft policy changes. Also, suggestions
to make this manual more responsive to the auditor's needs can be submitted
on the feedback form.
Other Audits
Acquisition audit reports that have issues
similar to ones being addressed during a PME audit are a good source of
information. Links are being provided to existing audit reports with findings
related to specific PME topics.
PROGRAM DEFINITION
The purpose of program definition is to translate broadly stated mission
needs into operational requirements from which specific performance specifications
are derived. Acquisition programs may be initiated in response to a specific
military threat, economic benefits, new technological opportunities, or
other considerations. The audit steps in this section will help to ensure
that programs are well-defined and carefully structured in order to obtain
a balance of cost, schedule, and performance; available technology; and
affordability constraints. The purpose of reviewing program definition
is to:
- Determine whether the program was initiated to satisfy a specific
military threat, economic benefits, new technological opportunities,
or for other considerations.
- Verify that deficiencies in the current capability exist and that
non-materiel changes would not adequately correct the deficiencies.
- Validate that the program performed an analysis of alternatives.
- Review program stability through assessment of program affordability
and determination of affordability constraints.
- Ensure that the system program definition clearly assessed the system's
operational effectiveness, operational suitability, and life-cycle
cost reduction.
Intelligence Support:
Intelligence support of Defense acquisitions will be prepared and maintained
throughout the life of the system. Validation of intelligence documentation
is required to ensure that a system is able to satisfy the mission in
its intended operational capacity throughout its expected life. DoD Regulation
5000.2-R states that acquisition programs initiated to satisfy a specific
military threat shall be based on authoritative, current, and projected
threat information.
Questions/Steps for Consideration:
a. Determine whether the program is a major program; if not, refer to
the Defense Acquisition Deskbook for steps to assess the threat information.
b. Determine if the program has an updated and validated threat that
reflects changes in environment.
- Has the environment changed significantly - enough to call into
question the need for the program?
- Was an assessment done identifying the operational threat environment,
the threat to be countered, the system specific threat, reactive threat,
and technologically feasible threat?
c. Has an assessment been performed to determine the risk of not meeting
the requirements necessary to thwart the threat? (see the Risk
Management section of this guide)
Command, Control, Communications, Computers,
Intelligence, Surveillance, and Reconnaissance (C4ISR):
One of the most comprehensive parts of the Intelligence Support area is
the Command, Control, Communications, Computers, Intelligence, Surveillance,
and Reconnaissance (C4ISR) Support. DoD 5000.2-R states that a C4I support
plan must be prepared for programs that interface with C4I systems. In
accordance with CJCSI 3170.01 (formerly CJCS MOP 77), C4ISR requirements
should be reviewed and updated as necessary at every milestone decision
and whenever operational or intelligence requirements change.
Questions/Steps for Consideration:
a. What C4I systems do the weapon systems/programs interface with?
b. Have the specific interfaces been defined and are they reasonable?
c. Assess the C4I support plan to determine if the C4ISR requirements
were prepared in accordance with CJCSI 3170.01 and also to ensure that
the following areas are adequately addressed:
(1.) system description,
(2.) employment concept to include targeting, battle damage assessment,
and bomb impact assessment,
(3.) operational support requirements to include C4I, testing, and training,
(4.) interoperability and connectivity characteristics,
(5.) management, and
(6.) scheduling concerns.
d. Was there functional participation in developing the plan?
e. Have the C4ISR requirements been reviewed and updated as necessary?
(1.) at every Milestone decision, and
(2.) if the concept of operations or intelligence requirements change.
f. Determine the effect/impact if these actions were not properly accomplished.
Requirements Evolution:
DoD Regulation 5000.2-R requires the DoD Components to document deficiencies
in current capabilities and to develop a mission need statement (MNS)
describing the new capabilities required. System performance objectives
and thresholds will be developed from and remain consistent with the initial
broad statements of operational capability. DoD Regulation 5000.2-R states
that requirements shall be refined at successive milestone decision points
as a consequence of cost-schedule-performance trade-offs during each phase
of the acquisition process.
Questions/Steps for Consideration:
a. Does the MNS identify, describe, and support the mission deficiency
and are these tied back to the need for the program? If it does not, what
is the justification for the program?
b. How does the MNS correlate with Mission Area Analysis (MAA)? Discuss
the results of the MAA. What does it show?
c. Were non-materiel changes (doctrine and tactics) adequately considered
to correct the deficiencies?
d. Were potential materiel alternatives identified?
e. Identify other system/program impacts, such as information warfare,
that could drive the mission need or influence its direction.
f. Has the user or user's representative documented the program Measures
of Effectiveness (MOEs) or Measures of Performance (MOPs) and minimum
acceptable requirements in an Operational Requirements Document (ORD)?
g. Does the ORD include thresholds and objectives that consider the results
of the Analysis of Alternatives (AoA) and the impact of affordability
constraints? Has program management addressed Cost as an Independent Variable
(CAIV) early on in the acquisition?
h. Has the program developed Key Performance Parameters (KPPs) and were
they validated by the Joint Requirements Oversight Council (JROC)?
i. Have those key performance parameters been included in the Acquisition
Program Baseline (APB)?
Analysis of Alternatives:
DoD Regulation 5000.2-R requires that an analysis of alternatives be prepared
and considered at appropriate decision reviews.
- ACAT I Programs - prepared by the DoD Component at program initiation
(usually Milestone I),
- ACAT IA Programs - prepared by the PSA for consideration
at Milestone 0, and
- ACAT ID and ACAT IAM Programs - the DoD Component Head or designated
official shall coordinate the analysis with USD(A&T) or USD(C3I) staff, the JCS
or PSA staff, and the Director, PA&E.
Questions/Steps for Consideration:
a. Did the program complete the AoA before the program initiation decision
(usually Milestone I)?
b. Was the AoA comprehensive and what were the results?
c. Was the analysis updated for Milestone II to examine cost performance
trades? What were the results and did the analysis include CAIV?
d. If the program is an ACAT 1A, did the Program Manager (PM) incorporate
the analysis into the cost/benefit element structure?
e. Is there a clear link between the AoA, system requirements, and system
evaluation MOEs? If there is not, what is the impact?
Affordability:
Affordability is the ongoing assessment of a program to ensure that it
is being executed within DoD planning and funding guidelines, has sufficient
resources identified and approved in the Future Years Defense Program
(FYDP), and is managed based on accurate cost and manpower data. Affordability
decisions are made throughout the entire acquisition cycle. DoD Regulation
5000.2-R states that the assessment of program affordability and determination
of affordability constraints are intended to ensure greater program stability.
Questions/Steps for Consideration:
a. Was the program's funding profile provided to the Defense Acquisition
Board (DAB) or Major Automated Information Systems Review Council (MAISRC)?
b. Programmed funds are based on estimates. Determine if the estimates
are reasonable.
c. Does the program have funding over the FYDP? If funds were not programmed,
was notification provided? What further action will be taken?
d. Does the program have sufficient resources, including manpower?
e. Did the Cost Analysis Improvement Group (CAIG) review the program
(for ACAT I programs only), and did the CAIG report state that the program
was affordable? (Note: Military Departments may have similar requirements
for ACAT IC and ACAT II programs.)
f. Is the program reasonably funded based on the proposed cost of the
program?
g. Were any program deficiencies documented and presented at the program
review?
h. Determine whether the Cost/Performance Integrated Product Team reviewed
the cost and benefit data and whether the data were sufficiently accurate.
i. Assess the reasonableness of schedule estimates.
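Several of the affordability questions above reduce to a mechanical comparison of programmed funds against the cost estimate. The Python sketch below illustrates one way an auditor might organize that comparison; the fiscal years and dollar amounts are hypothetical, invented for illustration, and are not drawn from any actual FYDP.

```python
# Illustrative sketch only: comparing a hypothetical FYDP funding profile
# against a hypothetical program office cost estimate, by fiscal year.

def funding_shortfalls(programmed, estimated):
    """Return {year: shortfall} for years where programmed funds
    fall below the cost estimate (amounts in $ millions)."""
    return {
        year: estimated[year] - programmed.get(year, 0.0)
        for year in estimated
        if programmed.get(year, 0.0) < estimated[year]
    }

# Hypothetical FYDP profile vs. program office estimate ($ millions)
programmed = {1999: 120.0, 2000: 135.0, 2001: 150.0, 2002: 150.0}
estimated = {1999: 118.0, 2000: 142.0, 2001: 149.0, 2002: 163.0}

for year, gap in sorted(funding_shortfalls(programmed, estimated).items()):
    print(f"FY{year}: programmed funds short by ${gap:.1f}M")
```

A shortfall flagged this way is only a starting point; the auditor would still determine whether the estimate itself is reasonable and whether notification of the unprogrammed requirement was provided.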
Supportability:
DoD Regulation 5000.2-R states that supportability encompasses far more
than the logistics elements.
Questions/Steps for Consideration:
a. How do the supportability factors relate to the system's operational
effectiveness, operational suitability, and life-cycle cost reduction?
b. Look at the sustainment costs and determine if they are reasonable.
c. Has the program considered support and sustainment in the design?
Are these costs reasonable?
d. Are "back end" sustainment costs receiving "up front" design attention?
e. Is the program using life-cycle cost models that include estimates
for operational and support elements like unit level consumables, training,
expendables, depot maintenance, and mission personnel?
f. Has the program analyzed the possibility of using commercial dual
use technology? Is the analysis documented?
g. If the program determined that commercial dual use technology cannot
be used, determine why.
PROGRAM STRUCTURE
Program structure identifies management elements that are necessary to
structure a sound, successful program. The elements address what the program
will achieve (program goals); how the program will be developed and/or
procured (acquisition strategy); how the program will be evaluated against
what was intended (test and evaluation); and what resources will be needed
for the program (life-cycle resource estimates). Properly tailored program
strategies form the basis for sound management and include innovative
ways to achieve program success.
Defining the elements necessary to structure a successful weapons system
program can be accomplished by:
- establishing program goals,
- formulating a program development and production acquisition strategy
to execute goals,
- measuring program success and progress by evaluating test results,
and
- estimating life-cycle resources that concurrently result in an effective
and efficient product
Program Goals: DoD Regulation
5000.2-R states that program goals, identified as thresholds and
objectives, shall be established for the minimum number of cost,
schedule, and performance parameters that describe the program.
For years the non-defense sector has successfully developed and produced
high-quality products that fully meet or exceed customer needs, while
also meeting specific, predetermined cost targets for those products.
The thrust of "cost as an independent variable" (CAIV) adapts these successful
practices to meet DoD needs.
The CAIV process recognizes that needed military capabilities are, in
most cases, best expressed as end results, which actually are the aggregate
combination of numerous more detailed parameters. With rare exception,
there are multiple sets of detailed specifications that can be combined
to attain the desired end results, so that any one item can be varied
significantly so long as compensating adjustments are made elsewhere
in the system.
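The point that many combinations of detailed parameters can satisfy the same aggregate end result can be shown with a small sketch. The two design parameters, their candidate values, and the aggregate requirement below are entirely hypothetical.

```python
# Illustrative sketch of the CAIV trade space: several combinations of
# detailed parameters satisfy the same aggregate end result, so any one
# parameter can be relaxed if another compensates. All values hypothetical.
from itertools import product

range_nmi = [400, 500, 600]      # candidate range values (nautical miles)
payload_lb = [1000, 1500, 2000]  # candidate payload values (pounds)

# Hypothetical aggregate end result: range x payload >= 750,000 nmi-lb
feasible = [
    (r, p) for r, p in product(range_nmi, payload_lb)
    if r * p >= 750_000
]
print(feasible)  # every combination meeting the aggregate requirement
```

In this toy trade space, five of the nine combinations meet the end result, which is precisely why cost can be treated as an independent variable: the program can select the cheapest feasible combination rather than fixing each detailed parameter in advance.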
Questions/Steps for Consideration:
a. What are the thresholds and objectives of the program?
- Determine who established the thresholds and objectives. Were they
user sanctioned? Were the thresholds and objectives independently
validated?
b. Who determined the minimum number of cost, schedule, and performance
parameters, and were these minimums supported and independently validated?
c. Does the acquisition strategy have aggressive, achievable cost objectives
and are they being managed by the Program Manager taking into account
out-year resources and process improvements? Have these objectives been
validated?
d. Is there an approved Acquisition Program Baseline (APB) for the acquisition?
- Does the APB comply with the DoD Regulation 5000.2-R guidance for
cost, schedule, and performance and is it supportable?
- How many times has the baseline been modified, and what caused each
breach? What issues led to the breach, and have those
issues been adequately addressed?
- Determine what the program has done in terms of risk management
to prevent further breaches. (See the Risk Management section)
e. Obtain the Acquisition Decision Memorandum (ADM) to determine the
exit criteria established by the Milestone Decision Authority (MDA). Are
the exit criteria being met, and how is the program progressing toward
meeting the exit criteria before the next Milestone decision?
- Using the most recent Defense Acquisition Executive Summary (DAES)
compare the extent of performance and progress with ADM exit criteria
requirements.
- Validate the DAES data by determining how the reported information
was derived.
f. Has a Cost/Performance Integrated Product Team been established to
examine and analyze cost-performance tradeoffs? When does it meet, and
what has it accomplished? Is action being taken on its recommendations?
g. Determine whether the program is successfully implementing the CAIV
acquisition reform initiative.
- For a new program, is CAIV being used from the program's onset?
- For an existing program in the later acquisition stages, are CAIV
concepts being retrofitted?
- Have cost, schedule, and performance trade-offs been made, and where
required, have they been accepted by designated approving authorities?
h. Were incentives to meet or exceed cost, schedule, and performance objectives
placed in the development/production contract?
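The baseline questions in step d lend themselves to a simple screen of current estimates against APB thresholds. The sketch below is illustrative only: the parameter names and values are invented, and a real APB carries both an objective and a threshold for each parameter rather than the single limit shown here.

```python
# Illustrative sketch only: flagging potential APB breaches by comparing
# hypothetical current estimates against hypothetical APB thresholds.
# For simplicity every threshold here is a not-to-exceed value.

apb_thresholds = {
    "unit cost ($M)": 45.0,
    "IOC (fiscal year)": 2003,
    "weight (lb)": 12_000,
}
current_estimate = {
    "unit cost ($M)": 47.5,
    "IOC (fiscal year)": 2003,
    "weight (lb)": 11_500,
}

breaches = {
    param: (current_estimate[param], limit)
    for param, limit in apb_thresholds.items()
    if current_estimate[param] > limit
}
for param, (est, limit) in breaches.items():
    print(f"Potential breach: {param} estimated at {est}, threshold {limit}")
```

Because the DAES reports current estimates of the APB parameters, a screen of this kind can be run against each quarterly DAES submission; the reported data should still be validated by determining how the figures were derived.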
Acquisition Strategy: The acquisition
strategy serves as the evolving roadmap for program execution from program
initiation through post-production support. Essential elements include,
but are not limited to, sources, risk management, cost as an independent
variable, contract approach, management approach, environmental considerations,
and source of support, as well as other major initiatives described
in the acquisition regulations, such as critical events.
Questions/Steps for Consideration:
a. Is the acquisition strategy commensurate with the level of risk in
the program?
b. How is the acquisition strategy incorporating acquisition reform initiatives,
and does the acquisition strategy address multi-year funding, component
breakout, and other contracting alternatives for reducing costs?
Commercial and nondevelopmental items:
A Commercial Item (CI) is any item which
evolves from or is available in the commercial marketplace and will be
available in time to satisfy the user's requirement. Services such as
installation, maintenance, and training for these items may also be obtained
for Government use.
A Nondevelopmental Item (NDI) is one that was developed exclusively at
private expense and sold to multiple state and local governments. The
NDI can be bought from any of the above sources and used "as is." An NDI
can also be an item bought from the above sources that requires minor
modification before it is operationally effective.
In general, the DoD policy is to use commercial and nondevelopmental
items whenever possible. Market research and analysis are required to determine
whether commercial and nondevelopmental items are available to satisfy the
requirements of the acquisition.
Questions/Steps for Consideration:
a. In compliance with DoD Regulation 5000.2-R and the Federal Acquisition
Regulation (FAR), did the Program Manager, in his acquisition strategy,
exhaust all commercial and nondevelopmental sources of supply before commencing
a new program?
b. Did the acquisition strategy encourage offerors to employ dual use
technologies for defense-unique items?
c. Did the Program Manager consider industrial capability in the acquisition
strategy by addressing:
- program stability for industry to invest, plan, and bear risk;
- industrial capability to design, develop, produce, support, and
if appropriate, restart the program; and
- preservation of industrial capabilities.
Risk Management:
To effectively tailor a program, one needs to understand the risks present
in the program and to develop a plan for managing these risks. DoD policy
calls for the continual assessment of program risks, beginning with the
initial phase of an acquisition program, and the development of risk management
approaches before the decision to enter each subsequent phase. A risk management
program should be established to identify and control performance, cost,
and schedule risks. The risk management program must include provisions
for eliminating risk or reducing it to an acceptable level.
The application of risk management processes (planning, assessment, handling,
and monitoring) is particularly important during Phase 0 of any program,
when various program alternatives are evaluated, CAIV objectives are established,
and the acquisition strategy is developed. All of these activities require
acceptance of some level of risk and the development of plans to manage
risk.
As a program evolves into subsequent phases, the nature of the risk management
effort changes, with new assessments building on previous assessments.
Risk areas will become more specific as the program becomes more defined.
Integrated Product Teams play a key role in risk management activities.
Questions/Steps for Consideration:
a. Did the Program Manager apply a formal risk management program to
abate identified cost, schedule, and performance risks?
b. Determine if the risk management plan is effectively controlling identified
risks as well as identifying new risks as the program evolves through
the acquisition process.
- Does the risk management program define high, moderate, and low
risk?
- Identify those tasks that have been labeled as high risk.
- Has a risk abatement plan been established for those tasks identified
as high risks? Does the risk abatement plan appear reasonable?
- Review tasks that have been rated moderate risks to determine if
they have been rated properly in accordance with the definitions of
high and moderate risk (often, borderline high risk tasks will be
labeled as moderate risks)
c. Are Integrated Product Teams actively involved in the identification
of risks and the establishment of risk abatement plans?
d. Do any of the actions on the risk abatement plan exceed the authority
of the program manager?
e. Determine if there are high risk tasks that the program manager has
not identified.
f. Review the Cost Performance Report for tasks that are behind schedule
or over cost.
g. Review reports of testing, such as component testing, to identify
failures that require major redesigns.
Contracting: TOC
Contracting is the way the Government acquires items and services from
contractors. A contract describes the work requirements, specifications, costs, deliverables,
administration, restrictions, and limitations between the Government and
contractors. The selection of contractual sources and contract requirements
must be well thought out and tailored to accomplish stated objectives
while ensuring an equitable sharing of program management risks. Poorly
planned and executed contracts result in delays, higher costs, wasted
resources, and an increased opportunity for fraud.
Federal Acquisition Regulation, Part 6, "Competition Requirements," states
the requirement that if other than full and open competition is going
to be used in awarding a contract, justification for use of other than
full and open competition must be approved. That justification should
contain sufficient facts and rationale to demonstrate why full and open
competition would not be possible in this particular situation and include
a description of the market research conducted.
Note: Any reference to a "pre-negotiation memorandum" may actually refer
to what is commonly termed a "Business Clearance".
Questions/Steps for Consideration:
a. Does the acquisition strategy discuss contract types for current and
succeeding acquisition phases, risk sharing, and incentives for contractors
to decrease risks?
b. Is the contract type appropriate for the work required and the risk? Did
selection and award of the contract follow applicable policies, practices, and procedures?
(Auditor's Note: Sources of information for this question include
the Federal Acquisition Regulation, the Defense Federal Acquisition Regulation
Supplement, source selection guidelines, legal reviews, and other reviews by pertinent
organizations.)
c. Does the contract Statement of Work (and other referenced clauses)
provide performance specifications, minimum specifications, manufacturing
standards, etc. that are consistent with the ORD requirements and with
program technical, cost and schedule baselines?
d. Where necessary, has the acquisition strategy allowed for advance
procurement?
e. To what extent has the acquisition been streamlined?
f. Is the contract based on performance specifications rather than on military
standards/specifications?
g. What was the extent of competition in the contracting process?
- How is the prime contractor maintaining competition in the program
through its subcontractors?
- How is the prime contractor flowing down streamlining initiatives
to the subcontractors?
- If other than full and open competition, was justification documented
and approval obtained?
h. Determine issues/concerns relating to any undefinitized contractual
actions.
- Were a large number of undefinitized contractual actions
left undefinitized for an excessive amount of time?
- Was there an excessive number of changes to the requirements of
the system?
i. Are commercial specifications being identified for inclusion in the
contract? If not, does the file documentation reflect the basis for the
use of military specifications?
j. Evaluate the contract planning and the acquisition strategy. What
evidence is there that market research was conducted to determine
the availability of commercial item(s) to satisfy the government's requirements?
k. Is there evidence that the Program Management Office has addressed
CAIV as a consideration in its trade-off analysis as required by DoD 5000.2-R,
paragraph 3.3.3? (Note: Cost must always be a consideration in every
source selection.)
l. Is there evidence to suggest that the Contracting Officer has determined
that the price at which that contract was awarded is fair and reasonable?
Contract Management: TOC
Contract management spans the timeframe from contract award through close-out or
termination of the contract. When a contract is awarded, a post-award
strategy is developed within the terms and conditions of the contract.
Once the strategy is defined, the Government analyzes contractor data
and contract performance data to assess the contractor's
performance. Based on these analyses, performance indicators are generated
that may affect post-award strategies. Upon receipt of evidence of physical
completion or termination of the contract, close-out activities are performed.
Questions/Steps for Consideration:
a. Does the acquisition strategy make maximum use of the Defense Contract
Management Command (DCMC) personnel at contractor facilities?
b. Have MOAs been signed between the PM and DCMC establishing their defined
functions?
c. Have technical representatives been assigned by the PM to the contractors'
facilities? How do their duties and responsibilities compare to those
of DCMC representatives?
d. In the development of the acquisition strategy, has DCMC shared relevant
information concerning the contractor's operation and performance with
the PM and the principal contracting officer?
e. Determine whether a Contractor Management Council exists.
f. Have DCMC personnel been integrated into pertinent IPTs? If not,
what is the impact?
Joint and Reciprocal Programs: TOC
A joint program is any acquisition system, subsystem, component, or technology
program whose strategy includes funding by more than one
DoD Component (even if one Component is only acting as an acquisition
agent for another Component). DoD policy is to consolidate and co-locate
joint programs to the maximum extent practicable.
Questions/Steps for Consideration:
a. To what extent does the acquisition strategy discuss the potential
for enhancing reciprocal defense trade and cooperation and joint program
management? Check the Analysis of Alternatives Report (AOA) to determine
if it was discussed.
b. If the program is a joint program, determine if the program is being
effectively managed by both parties. Look at the stability of the funding
and the jointness of testing.
Life-Cycle Support: TOC
Questions/Steps for Consideration:
a. How does the acquisition strategy address life-cycle support concepts?
Based on the acquisition strategy and projected use of the program, do
the concepts appear reasonable?
b. In accordance with DoD policy, does the acquisition strategy maximize
the use of contractor-provided long-term, total life-cycle logistics support?
- If not, and DoD rather than the contractor will logistically support
the acquisition, was a waiver prepared by the
program manager and approved by the Milestone Decision Authority for
the exception?
c. Where logistic support will be contractor provided, have arrangements
been made by the program manager to access the original equipment manufacturer's
technical database to compete out-year support for the acquired weapon
system?
Life-Cycle Resource Estimates: TOC
(Note: See Page 27, Life-Cycle Cost)
Questions/Steps for Consideration:
a. Determine if a life-cycle cost estimate was prepared for the last
milestone review.
b. Does the life-cycle cost estimate include all elements of cost for
the life of a weapon system such as the cost of hazardous materials?
c. Assess the life-cycle cost estimate for reasonableness.
Warranties: TOC
The principal purpose of a warranty in a Government contract is to delineate
the rights and obligations of the contractor and the Government for defective
items and service and to foster quality performance. Generally, a warranty
should provide (1.) a contractual right for the correction of defects
notwithstanding any other requirement of the contract pertaining to acceptance
of the supplies or service by the Government; and (2.) a stated period
of time or use, or the occurrence of a specified event, after acceptance
by the Government to assert a contractual right for the correction of
defects. The benefits to be derived from the warranty must be commensurate
with the cost of the warranty to the Government.
Questions/Steps for Consideration:
a. To what extent in the acquisition strategy were warranties discussed
for weapon system logistic support?
b. What plans are in place to ensure that the warranty is cost effective
and functional?
c. Do the program office and logistics user have procedures and practices in
place to take advantage of the warranty?
d. Is there a method of tracking warranty items to determine systemic
or quality problems needing an overall fix?
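A tracking method of the kind asked about above can be as simple as a screen of warranty claim records for parts with repeated failures. The following Python sketch is illustrative only; the part numbers and the claim-count threshold are assumptions, not from this guide.

```python
from collections import Counter

# Illustrative screen of warranty claims for systemic problems.
# The threshold and part identifiers are hypothetical.

def systemic_candidates(claims: list[str], threshold: int = 3) -> list[str]:
    """Return part numbers whose warranty-claim count meets the threshold."""
    counts = Counter(claims)
    return sorted(p for p, n in counts.items() if n >= threshold)

claims = ["pump-12", "valve-07", "pump-12", "pump-12", "seal-03"]
print(systemic_candidates(claims))  # ['pump-12']
```

A part that clears the threshold is a candidate for an overall fix rather than repeated individual warranty repairs.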
Test and Evaluation: TOC
Test and evaluation programs shall be structured to integrate all developmental
test and evaluation (DT&E), operational test and evaluation (OT&E),
live-fire test and evaluation (LFT&E), and modeling and simulation
activities conducted by different agencies as an efficient continuum.
All such activities shall be part of a strategy to provide information
regarding risk and risk mitigation, to provide empirical data to validate
models and simulations, to permit an assessment of the attainment of technical
performance specifications and system maturity, and to determine whether
systems are operationally effective, suitable, and survivable for intended
use.
Test and evaluation planning shall begin in Phase 0, Concept Exploration.
Both developmental and operational testers shall be involved early to
ensure that the test program for the most promising alternative can support
the acquisition strategy and to ensure the harmonization of objectives,
thresholds, and measures of effectiveness (MOEs) in the ORD and TEMP.
Test and evaluation planning shall address MOEs and measures of performance
(MOPs) with appropriate quantitative criteria, test event or scenario
description, resource requirements (e.g., special instrumentation, test
articles, validated threat targets, validated threat simulators and validated
threat simulations, actual threat systems or surrogates, and personnel),
and identify test limitations.
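As an illustration of evaluating test results against quantitative criteria, the sketch below compares measured values with minimum acceptable thresholds of the kind drawn from an ORD or TEMP. The measures and values here are invented for the example.

```python
# Hypothetical measures of performance and ORD-style minimum thresholds.
THRESHOLDS = {
    "range_km": 150.0,
    "hit_probability": 0.85,
}

def evaluate(results: dict[str, float]) -> dict[str, bool]:
    """Return pass/fail for each measure against its minimum threshold."""
    return {m: results[m] >= t for m, t in THRESHOLDS.items()}

print(evaluate({"range_km": 162.0, "hit_probability": 0.81}))
# {'range_km': True, 'hit_probability': False}
```

Keeping the criteria explicit and quantitative, as the policy above requires, makes it straightforward to trace each test event back to a specific ORD requirement.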
Questions/Steps for Consideration:
Test Planning TOC
a. Is the Test and Evaluation Master Plan (TEMP) being prepared and does
the plan ensure that all requirements will be tested?
- Have the developmental and operational testers been involved in
the test and evaluation planning process?
- Are the measures of effectiveness and measures of performance consistent
with the Operational Requirements Document and does the test and evaluation
plan address MOEs and MOPs with the appropriate quantitative criteria,
test event, and scenario description and resource requirement?
- Does the TEMP ensure that testing is accomplished against a realistic
threat environment? If not, how has the program office assessed the
risk and identified risk mitigation actions?
- Have the pass/fail criteria for the system been clearly identified?
- Determine whether the minimum acceptable performance specified in
the ORD was used to establish test criteria for operational test and
evaluation.
- Determine if the TEMP identifies the resources needed to execute
the test plan.
- Does the test plan address all system components to include hardware,
software and human interfaces that are critical to demonstrating performance
against the specifications and requirements in the ORD?
- Does the TEMP focus on the overall structure, major elements, and
test objectives that are consistent with the acquisition strategy?
- Has the test and evaluation program been approved in writing by
the appropriate officials? Was a TEMP approved at Milestones I, II,
and III?
Models and Simulation TOC
b. Does the TEMP list all models and simulations to be used?
c. Have all models and simulations to be used been verified, validated,
and accredited? If not, have risks been assessed and risk mitigated?
d. Have the pass/fail criteria for the system been clearly identified?
Test Plan Changes TOC
e. What significant changes to the test plan have been made? What is
the impact of those changes?
- Have additional resources been added if needed?
- Have risks resulting from changes been assessed and risk mitigation
actions identified?
- If the mitigation actions affect how or when the system can be
used in a threat environment (i.e. operational use), have the users
approved?
Miscellaneous TOC
f. Was the Military Department independent operational test activity
involved in the development of the TEMP?
g. Is adequate funding for testing reflected in the budget?
h. Does the test plan address all system components to include hardware,
software, and human interfaces that are critical to demonstrating performance
against the specifications and requirements in the ORD?
i. Has the Director of Operational Test and Evaluation provided Congress
with the Beyond Low-Rate Initial Production Report and the Live Fire Test and Evaluation
Report? Are the results of testing consistently reflected in program documents?
j. If major modifications have been made to the system, has the Follow-On
Test and Evaluation been scheduled?
k. Are test results accurately reflected in program documentation? Do
test results to date show sufficient progress toward meeting program requirements
and goals?
l. Are design changes and corrections being made based on the results
of the testing?
PROGRAM DESIGN TOC
The purpose of program design is to establish the basis for a comprehensive,
structured, integrated, and disciplined approach to the life-cycle design
of major weapons and automated information systems.
Integrated Product and Process Development TOC
Integrated Product and Process Development is a management technique
that simultaneously integrates all essential acquisition activities through
the use of multidisciplinary teams to optimize the design, manufacturing,
and supportability processes. IPPD facilitates meeting cost and performance
objectives from product concept through production including field support.
The key IPPD tenet is teamwork through Integrated Product Teams (IPTs).
The Program Manager (PM) shall employ the concept of IPPD throughout
the program design process to the maximum extent practicable. The IPPD
management process shall integrate all activities from product concept
through production and field support, using multidisciplinary teams to
simultaneously optimize the product and its manufacturing and supportability
to meet cost and performance objectives. It is critical that the processes
used to manage, develop, manufacture, verify, test, deploy, operate, support,
train people, and eventually dispose of the system be considered during
program design.
Note: If all major program issues disclosed through your audit work on
other sections have been identified and are being aggressively addressed
through IPTs, audit effort can be scaled back. Issues disclosed by your
audit work that are not being addressed by the IPTs could form the basis
for audit findings and recommendations.
Questions/Steps for Consideration:
a. Meet with IPT leaders to discuss your audit objectives and to make
arrangements for:
- Obtaining minutes and issue status listings from previous meetings
- Attending one or more IPT meetings
- Interviewing selected IPT members
b. Determine if the program has an IPT structure in place. If so, do
the Program Office and the contractors have documentation regarding the
IPT structure, to include:
- Hierarchy of IPTs
- IPT mission and functions statements
- Meeting Schedules
- Permanent and "as needed" IPT members
c. Determine if the IPT missions and member responsibilities are clearly
defined and documented and if there are contract provisions relating to
contractor use of and participation in IPTs. If so, assess these provisions;
are they being adequately implemented?
d. Is the IPT:
- working as an effective means of communicating and reaching consensus,
- employing the appropriate organizations, functional disciplines
and expertise,
- aggressively addressing major issues, and
- formulating and approving the critical acquisition documentation.
e. Based on their experience with IPTs, what, if anything, would IPT members
like to change?
f. Are issues disclosed by the audit being addressed by the IPTs or are
they taking too long to be resolved?
Systems Engineering: TOC
The Program Manager shall ensure that a systems engineering process is
used to translate operational needs and/or requirements into a system
solution that includes the design, manufacturing, test and evaluation,
and support processes and products. The systems engineering process shall
establish a proper balance between performance, risk, cost, and schedule,
employing a top-down iterative process of requirements analyses, functional
analysis and allocation, design synthesis and verification, and system
analysis and control.
Questions/Steps for Consideration
a. What is the systems engineering process?
b. Is the systems engineering process structured, disciplined, and documented
for the program (both from the government program office and contractor
perspectives)? For example, does the Program Office have an internally
developed process, or does it use available DoD models (such as the Systems Engineering
and Integrated Product Development Capability Maturity Models) to develop
and manage its systems engineering program?
c. Was the systems engineering process implemented as part of an overall
IPPD approach using multidisciplinary teamwork (IPTs)?
d. Assess whether technical risks are clearly identified and managed.
(See Page 12, Risk Management)
e. Are functional and physical interfaces easily identifiable and continually
reviewed to ensure they are compatible, interoperable, and capable of
integration?
f. Determine whether performance metrics were established to measure
how well the technical development and design are evolving.
g. Does a structured review process exist in the Statement of Work to
demonstrate and confirm completion of required accomplishments?
h. Determine if management policy stresses program and delivery schedules
(schedule driven) rather than accomplishment of objectives (event driven).
i. Do operational requirements drive the design, rather than state-of-the-art
capabilities?
j. Determine if detailed design requirements evolved with the design
effort.
k. Are software progress reviews part of the periodic project reviews?
l. Is there a process in place that addresses:
(1.) How the Program Office will work with the user to establish and
refine requirements?
(2.) Whether the user is involved and satisfied with the refinement of requirements?
(3.) Traceability between user and design requirements?
(4.) Requirements for all of the following: hardware, software, facilities,
personnel, procedures, technical data, personnel training, verification
matrices, spares, repair parts, consumables, interfaces, and life cycle
cost considerations?
m. Are requirements continually reviewed and assessed throughout the
design process, including interfaces?
n. Has the Program Office implemented Cost-as-an-Independent Variable?
Have cost/performance trade-offs been considered? (see Page 10, Program
Goals)
o. Is the user involved and satisfied with the refinement of requirements?
p. To what extent is analysis, modeling & simulation, demonstration,
and testing used to verify the design?
q. Determine whether design policy guidelines were established only after
contract award, rather than being included during source selection.
r. Has a configuration management process been established?
- Assess configuration control plans. Do the plans tailor requirements,
and incorporate subcontractor products? Do the plans outline procedures
to adequately manage and control the system configuration?
- Has an integrated system been developed to capture and control the
technical baseline?
s. Have performance metrics been established to measure how well the
technical development and design are evolving?
t. Have interface controls been established to ensure all internal and
external interface requirements are properly recorded and communicated?
Manufacturing and Production TOC
The producibility of the system design shall be a priority of the development
effort. Design engineering efforts shall focus on concurrent development
of producible designs, capable manufacturing processes, and process controls
to ensure requirements satisfaction and minimize manufacturing costs.
Existing manufacturing processes shall be used
whenever possible. When new manufacturing capabilities are required, flexibility
(i.e., sensitivity to rate and product configuration) shall be considered.
Full rate production of a system shall not be approved until the system's
design has been stabilized, the manufacturing processes have been proven,
and the production facilities and equipment are in place (or are being
put in place). (see DoD 5000.2-R, Part 4, Paragraph 4.3.1).
Significant and unresolved issues may remain from government and contractor
design reviews. Principal among these reviews are the Preliminary Design
Review (PDR), Critical Design Review (CDR), Functional and Physical Configuration
Audits (FCAs and PCAs), and Production Readiness Review (PRR) or similar
production readiness assessment.
Questions/Steps for Consideration
Technical Data TOC
a. Review the contract to determine the level of technical data
required. Has the contractor provided the level of technical data required?
b. Is a Technical Data Package complete or expected to be complete at
the time of the final PRR or production readiness assessment prior to
start of production?
c. Determine who owns the rights to the technical data package.
- Does the program office have access to the technical data package?
What access is available?
- Is the required level of data ordered consistent with the program
office's acquisition and logistics support plan?
- Was the supporting activity consulted when determining the technical
data requirements?
- Were responsibilities for monitoring and closing these action items
assigned?
- Will DCMC provide on site verification of action item closures?
d. Determine whether an integrated data management plan was developed.
- Determine whether the plan lays out the technical data requirements.
- If a plan was not developed, determine why not.
e. Identify subcontractors and suppliers. Has the prime contractor required
subcontractors to identify risks and potential workarounds to help ensure that
subcontractors can provide timely delivery of acceptable material in sufficient quantities
at a reasonable cost?
Preliminary Design Review TOC
f. Review results of Preliminary Design Review (PDR).
g. Determine critical issues arising from the PDR.
h. Evaluate whether the critical PDR issues have been resolved or if
an effective plan is in place to resolve them prior to scheduled Critical
Design Review (CDR).
Critical Design Review TOC
i. If CDR has not yet been performed, evaluate the planning for CDR.
j. What key tasks must be accomplished to be ready for CDR? Has the PM
established internal exit criteria for CDR?
k. Does scheduling of the CDR seem reasonable to support LRIP considering
what is to be accomplished prior to CDR and the amount of time scheduled
between CDR and LRIP?
l. After CDR has occurred, determine what critical issues were identified.
m. Determine whether reasonable plans have been established for timely
resolution of CDR issues prior to scheduled LRIP.
Production Readiness Review TOC
n. Has a PRR (or similar production readiness assessment) plan
been established? Does the plan provide for consideration of:
- design instability,
- changes in technical requirements,
- quantity and funding fluctuations,
- long-lead times to procure critical parts,
- availability of facility and equipment, and
- availability of special tooling and equipment?
o. Does the contract require PRRs or similar production readiness assessments?
If so:
- Will the reviews be performed incrementally? Incremental reviews
allow for more timely discovery of problems.
- Will incremental reviews be performed in an event-based manner,
i.e., incremental PRRs held after completion of the Critical Design Review
and prior to long-lead procurement for Low-Rate Initial Production
(LRIP), prior to the start of LRIP, and prior to the start of full-rate production?
p. Has an IPT been established or planned to support the PRR (or
production readiness assessment) process? (The IPT may be described in
the PRR plan, if one has been completed.) Note: The IPT should include
persons involved in the following disciplines: engineering and design,
manufacturing, testing, contracting, and logistics. It should also include
persons from the Defense Contract Management Command (DCMC) and the contractor.
q. Evaluate the results of any completed PRR (or other production
readiness assessment). Did it follow the plan? Were corrective actions
taken in a timely manner?
- Were responsibilities for monitoring and closing action items assigned?
- Are action items being addressed?
- Will DCMC provide on-site verification of action items?
Manufacturing Plan TOC
r. Determine whether the production and manufacturing plan has
been updated at the milestone decision points.
s. Determine whether the production and manufacturing plan covers
producibility, risk-reduction efforts, and plans for proofing new or critical
manufacturing processes, manufacturing feasibility, and industrial base.
t. Determine whether the plan identifies Government-Furnished Equipment
and its integration into the production process.
u. Were manufacturing and production people involved early on in the
design and development process?
v. Determine whether manufacturing engineering tasks were defined and
worked beginning early in EMD.
w. Was manufacturing planning accomplished concurrent with the product
design process?
x. Determine whether schedule and resource planning provided for validating
the suitability of any new manufacturing processes that will be required.
y. Has a producibility program been developed as part of the systems
engineering effort? (See Page 21, Systems Engineering)
z. Has the Government tested, or will it test, the manufacturing process sufficiently
to ensure that the process is capable of achieving producibility requirements?
aa. Has the adequacy of manufacturing plant capacity been considered?
Critical Materials and Long Lead Time Materials TOC
bb. Determine whether long lead material scheduling will preclude or
increase the expense of producibility redesigns that may be required to
support production.
cc. Are alternatives being assessed to minimize the use of strategic
or critical materials?
Miscellaneous Question TOC
dd. Determine if production equipment, special test equipment, and special
tooling have been identified in terms of specifications and quantities.
Quality: TOC
Quality products and services are necessary for the success of military
operations, as well as to successful system development and production.
The quality of a product or service is determined by how well
it meets the requirements and satisfies the customer's needs at a reasonable
cost. The goal of an effective acquisition program is to acquire goods
and services that meet or exceed requirements, faster and at less
cost. The emphasis on quality has evolved dramatically over the years.
The shift in thinking now emphasizes development of quality products through
design and associated processes. The key to success is to prevent quality
problems through sound processes up front and early on as opposed to finding
them later and performing rework.
Questions/Steps for Consideration
a. Determine whether excessive quality problems were found during development
and test. What is the planned corrective action?
b. Was environmental stress screening (ESS) at the highest practical
level of assembly, using temperature and vibration, planned before the
start of production? ESS should be conducted in accordance with DoD 4245.7-M.
c. Has the PM allowed contractors flexibility to use their preferred
quality management process that meets program objectives?
d. Does the quality management process include:
(1.) establishment of capable processes?
(2.) monitoring and control of critical processes and product variation?
(3.) Feedback mechanisms for field performance?
(4.) An effective root cause analysis and corrective action system?
(5.) Continuous process improvement?
e. Do quality staff report independently to higher management?
f. Are representatives from quality staff included on IPTs?
Acquisition Logistics: TOC
DoD Regulation 5000.2-R requires acquisition programs to establish
logistics support concepts early in the program and refine them throughout
the development process. The regulation also requires life-cycle costs
to play a key role in the overall selection process and support concepts
for new and future weapon systems. The Program Office should conduct acquisition
logistics management activities throughout system development to ensure
the design and acquisition of systems can be cost-effectively supported
and to ensure that these systems are provided to the user with the necessary
support infrastructure for achieving the user's peacetime and wartime
readiness requirements. (Auditor's Note: for additional information on
supportability issues, see page 8.)
Questions/Steps for Consideration
a. Were supportability analyses conducted at program initiation and throughout
program development?
b. Determine whether data requirements were consistent with the planned
support concept.
c. Determine if supportability analyses were performed as an integral part
of the systems engineering process beginning at program initiation and
continuing throughout program development.
d. Determine if the Program Office allowed contractors the maximum flexibility
in proposing the most appropriate supportability analyses.
e. Determine if data requirements were consistent with the planned support
concept and represent the minimum essential to effectively support the
fielded system. (See Page 23, Technical Data)
f. Determine if government data requirements for contractor-developed
support data were coordinated with the data requirements of other program
functional specialties to minimize data redundancies and inconsistencies.
Life-Cycle Cost: TOC
For all ACAT I and IA programs, a life-cycle cost estimate shall be
prepared by the program office in support of program initiation (usually
milestone I) and all subsequent milestone reviews. For ACAT I programs,
the MDA may not approve entry into engineering and manufacturing development
or production and deployment unless an independent estimate of the full
life-cycle cost of the program and a manpower estimate for the program
have been completed and considered by the MDA. Historically, much
attention is paid to the initial cost of an acquisition program,
but little to sustainment costs. Between 50 and 75 percent of
total program costs are in the sustainment area. (Auditor's Note: For
additional information relating to Life-Cycle Cost, see Page 17, Life-Cycle
Resource Estimates.)
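The sustainment share cited above can be checked with simple arithmetic once a phase-level cost breakdown is available. The sketch below uses invented figures, and the phase names are assumptions, not a prescribed cost element structure.

```python
# Hypothetical life-cycle cost breakdown, in illustrative $B.
def sustainment_share(costs: dict[str, float]) -> float:
    """Fraction of total life-cycle cost in operations and support."""
    total = sum(costs.values())
    return costs["operations_and_support"] / total

lcc = {
    "development": 2.0,
    "production": 6.0,
    "operations_and_support": 12.0,
    "disposal": 0.5,
}
share = sustainment_share(lcc)
print(f"sustainment share: {share:.0%}")  # prints 'sustainment share: 59%'
```

A share outside the historical 50 to 75 percent range is not necessarily wrong, but it is a prompt to ask whether all sustainment cost elements were included in the estimate.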
Questions/Steps for Consideration
a. Determine if an independent logistics support cost estimate was performed
and a manpower estimate was completed and kept current to reflect appropriate
changes.
b. Has the acquisition logistics life cycle cost estimate been updated
to reflect estimate changes?
c. Were all the acquisition logistics life-cycle costs included for a
particular acquisition phase?
d. Was an independent review of the life-cycle cost estimate performed?
Who performed it?
e. Determine whether the life-cycle cost estimate is based on specific
acquisition logistic program objectives.
f. Determine whether the cost estimates were consistent with those used
in the analysis of alternatives.
g. Assess the use of attrition trade studies as an element of the life-cycle
costing of a major weapon system. These studies are cost models that measure
the impact of adding various safety technologies to reduce the number
of attrition units needed to replace units consumed by accidents.
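A minimal sketch of such an attrition trade study, using entirely notional assumptions (fleet size, unit cost, loss rates, and the per-unit cost of the safety modification are all hypothetical):

```python
# Hypothetical attrition trade-study model. All numbers are assumptions
# for illustration, not data from any actual program.
def attrition_trade(fleet_size, unit_cost, base_loss_rate,
                    safety_cost_per_unit, loss_rate_reduction):
    """Compare the cost of a fleet-wide safety upgrade against the
    cost of the attrition units it avoids over the fleet's life."""
    losses_without = fleet_size * base_loss_rate
    losses_with = fleet_size * base_loss_rate * (1 - loss_rate_reduction)
    avoided_replacement_cost = (losses_without - losses_with) * unit_cost
    upgrade_cost = fleet_size * safety_cost_per_unit
    return avoided_replacement_cost - upgrade_cost  # positive = upgrade pays off

# Notional case: 400 aircraft at $50M each, 5% lifetime loss rate, and a
# $0.5M-per-aircraft safety modification that cuts losses by 30%.
net_saving = attrition_trade(400, 50.0, 0.05, 0.5, 0.30)
print(f"Net life-cycle saving: ${net_saving:.0f}M")
```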
Integrated Logistics Support Plan: TOC
The Integrated Logistics Support Plan (ILSP) is the principal logistics
document for an acquisition program and serves as a source document for
summary information required in various milestone documents. This formal
planning document is kept current through the program life and sets forth
the plan of operational support, provides a detailed Integrated Logistics
Support program to fit with the overall program, and provides decision-making
bodies with the necessary logistics information to make sound decisions
in system development and production.
Questions/Steps for Consideration
a. Assess the Program Office's ILSP. Was the plan appropriately updated
or revised at milestone decisions or for major modifications, upgrades,
or changes in quantities to the weapon system?
b. Assess the adequacy of the plan's outline for schedules, procedures
and actions necessary to deploy a new system.
c. Review the ILSP to ensure it is current and includes all the necessary
logistic support requirements.
Maintenance Plan and Implementation: TOC
The maintenance plan is a description
of maintenance considerations and constraints for the system under development.
The plan is one of the principal elements of the ILS and should establish
maintenance concepts and requirements for the lifetime of the system.
A preliminary maintenance concept is developed and submitted as part of
the preliminary system operation concept for each alternative solution
candidate by the operating command with the assistance of the implementing
and supporting commands.
Questions/Steps for Consideration:
a. Was a maintenance plan developed?
- Did the plan show the responsibilities and requirements of the Program
Office and the operating sites where the system will be deployed?
b. Determine if the maintenance plan prepared describes the maintenance
concepts and maintenance requirements for the life of the system.
- Determine if simulation models were used in deciding on the maintenance
concept. If models were used, determine whether the criteria used in the
model are logical and meet the parameters of the maintenance plan.
- Determine if the maintenance concept is consistent with the deployment
concept of the weapon system.
Deployment: TOC
The deployment process is designed to turn over newly acquired or
modified systems to users who have been trained and equipped to operate
and maintain the system. All elements of the integrated logistic support
must be in place at deployment with the exception of those for which interim
contractor support is available. When properly planned and executed, deployments
result in high unit readiness, reduced costs, and less logistical turmoil,
and help establish the reliability of the new system.
Questions/Steps for Consideration:
a. Determine if there is a comprehensive, coordinated deployment plan
containing realistic lead times and supported by adequate funds and staff.
b. Determine if the ILS manager established a management information
system to assist the deployment planning and implementation process. If
a system was developed, determine whether it covers all aspects of deployment
and operational support.
c. Determine if an IPT deployment working group was established. As a
minimum, the group should have members from the user and supporting commands.
Environmental, Safety and Health: TOC
Environment, Safety and Health
(ES&H) analyses cover a broad range of topics directly related to
the well-being of persons, places and things that may be affected by a
weapon system over its life cycle (concept to disposal). DoD 5000.2-R,
part 4, section 4.3.7 provides specific requirements covering five areas
which form the principal technical and management disciplines of ES&H:
- Compliance with the National Environmental Policy Act,
- Periodic review of environmental regulations for the purpose of
analyzing the impact on the program's cost, schedule and performance,
- Establishing a system safety and health hazards program,
- Establishing a hazardous materials management program, and
- Establishing a pollution prevention program.
All programs, regardless of acquisition category, must comply with ES&H
requirements. Further, Program Offices must conduct their ES&H program
in accordance with applicable federal, state, interstate, and local environmental
laws and regulations, Executive Orders, treaties, and agreements. These
requirements constitute an external constraint, beyond the PM's control,
on system design, construction, modification, testing, operation, support,
maintenance, repair, demilitarization, and disposal. Often the environmental
regulations prescribe both what must be done and how to do it. Complying
with these regulations is very costly during production and even more so
later, during the operation and support of the system. For this reason,
environmental regulations should be fully evaluated early in the program
and then periodically re-evaluated to determine their impact on program
cost, schedule, and performance.
ES&H analyses must be conducted to integrate ES&H issues into
the systems engineering process and to support development of the Programmatic
ESH Evaluation (PESHE). The first audit step in evaluating any program
is to determine which ES&H requirements apply to that particular program
and whether those requirements have been incorporated throughout the life
of the program (program documentation, requirements, and contractor support).
Questions/Steps for Consideration:
a. Determine if an IPT was established to review and maintain oversight
of the environmental aspects of the program.
b. Determine if the PESHE was initiated early in the program.
- Determine whether the ES&H program is an integral part of integrated
product and process development teams.
- Does the Program Manager have an aggressive attitude toward ES&H
management issues? For example, determine whether ES&H is an agenda
item at contractor program status briefings.
- Does the system contractor have a strong ES&H management history
or a structured ES&H management program?
- Are ES&H considerations integral components of the design process?
c. Determine whether the Defense Contract Management Command has closely
monitored the contractor's ES&H program or identified numerous problem
areas.
d. Determine if the program office is coordinating with the OSD Joint
Acquisition Pollution Prevention Group, Joint Group on Acquisition Pollution
Prevention (JG-APP), or OSD/Service environmental offices.
e. Has the Program Office estimated and planned for life-cycle costs
associated with environmental compliance? Have ES&H resource and cost
estimates been based on experience with similar systems?
f. Has the Program Office considered the effect of environmental compliance
on schedule and performance?
g. Did the Program Office consider international laws, treaties, and
host nation obligations for systems that will be deployed overseas?
h. Have system safety and health hazards been identified and assessed?
i. Has the decision to accept risks associated with an identified hazard
been documented? Have "serious" risk hazards been approved at the Program
Executive Officer level?
j. Has the Component Acquisition Executive approved the acceptance of
high risk hazards?
k. Have all participants of joint programs approved the acceptance of
high risk hazards?
l. Is the Program Office complying with applicable Federal and DoD safety
and health standards for military-unique operations? (Note: Suggested
criteria include Executive Order 12196 and DoD Instruction 6055.1.)
m. Is the PM considering safety trade-offs? Assess the use of attrition
trade studies as an element of life-cycle costing of major weapon systems.
These studies are cost models that measure the impact of adding various
safety technologies to reduce the number of attrition units needed to
replace units consumed by accidents.
Hazardous Materials: TOC
The PM is required to establish a hazardous material management and
analysis program to consider eliminating and reducing the use of hazardous
materials. The hazardous material analyses support the systems engineering
process.
The hazardous material management strategy and requisite hazardous material
analyses ensure that the program investigates methods for eliminating
and reducing the use of hazardous materials over the systems life cycle.
Questions/Steps for Consideration:
n. Has the Program Office established a Hazardous Material Management
Program (HMMP)?
o. Does the HMMP focus on eliminating and reducing the use of hazardous
materials instead of only managing pollution created?
p. Does the HMMP include plans for identifying, minimizing use, tracking,
storing, handling, and disposing of hazardous materials?
q. Is a database maintained of environmentally unsafe materials?
Demilitarization and Disposal: TOC
Demilitarization is the act of destroying the military offensive or
defensive advantages inherent in certain types of equipment or material.
The term encompasses mutilation, scrapping, melting, burning, or alteration
designed to prevent further use of this equipment or material for its
originally intended military or lethal purpose, and it applies equally to
material in unserviceable or serviceable condition. Such equipment or
material has been screened through the Inventory Control Point and declared
Excess, Surplus, or Foreign Excess.
Disposal is the process of redistributing, transferring, donating, selling,
abandoning or destroying property. Extreme care must be exercised in the
disposal of property that is dangerous to public health and safety.
Questions/Steps for Consideration:
r. Does a reasonable estimate exist for demilitarization & disposal
costs? (See Page 27, Life-Cycle Cost)
s. Have cost-effectiveness analyses been conducted on the selection, use,
and disposal of hazardous materials?
Pollution Prevention: TOC
The PM must establish a pollution prevention program to help minimize
environmental impacts and the life-cycle costs associated with environmental
compliance. This is best done early in the program by recognizing and
avoiding the creation of pollutants and environmental impacts that have
to be managed.
When designing, manufacturing, testing, operating, maintaining and disposing
of systems, all forms of pollution should be prevented or reduced at the
source whenever feasible. Pollution that cannot be prevented or recycled
should be treated in an environmentally safe manner. Disposal or other
release to the environment should be employed only as a last resort and
must be conducted in an environmentally safe manner and in compliance
with all applicable statutes and regulations. In developing work statements,
specifications and their product descriptions, PMs must consider elimination
of virgin material requirements, use of recovered materials, reuse of
products, life-cycle cost, recyclability, use of environmentally preferable
products, waste prevention and disposal methods.
Questions/Steps for Consideration:
t. Did the Program Office establish a pollution prevention program?
- Does the program help minimize environmental impact?
- Does the program consider life-cycle costs?
u. Is pollution that cannot be prevented or recycled treated in an
environmentally safe manner?
v. Is the disposal or other release of pollution into the environment
employed only as a last resort?
w. Were pollution prevention requirements incorporated into contract
documents?
x. In accordance with Executive Order 12873, does the Program Office
consider the following:
(1) Elimination of virgin material requirements?
(2) Use of recovered materials?
(3) Reuse of products?
(4) Life-cycle cost?
(5) Recyclability?
(6) Use of environmentally preferable products?
(7) Waste prevention?
(8) Disposal?
y. Determine if a Programmatic Environmental Analysis (PEA) was developed.
z. Verify that environmental considerations were addressed during the
acquisition logistics process.
Open System Design TOC
The objective of this area is to determine if program managers are
effectively implementing the open systems (OS) approach in systems
acquisition. The requirement for open systems architecture has been
incorporated into DoD acquisition policy and guidance. In July 1996,
the Open Systems Joint Task Force for the Principal DUSD(A&T) surveyed
60 program offices. The results were:
- One half were well aware of open systems
- One third fully understood open systems
- One quarter were fully implementing open systems
Audit steps concerning open systems should be performed together with
the steps annotated in the Software Engineering section, since open system
design criteria are a consideration of software engineering. See Change
3 to DoD 5000.2-R for details.
Questions/Steps for Consideration:
a. Is the program office implementing an open systems plan and using
open systems as part of the acquisition strategy?
b. Did the request for proposal specify use of an open system approach?
c. Has the program office documented their approach for measuring the
level of openness at the system, subsystem, and component level?
d. Does the plan cover all aspects of the system including logistics
support?
e. Are IPTs addressing the use of open systems?
f. Was implementation of open systems a factor in evaluating contractor
proposals?
g. What areas of the system design were most amenable to the use of the
open systems approach?
h. Does the contract define how the open system approach will be used
in developing the system?
i. Is the program office working with the contractors to ensure that
an open systems approach is used throughout the acquisition process?
j. Can the program manager and contractor staffs provide:
- documentation on how the use of the open system approach affected
system design;
- documentation on both open (non-proprietary) and closed (proprietary)
interfaces and protocols within the system;
- a plan for certifying contractor compliance with agreed-to open
systems interfaces.
k. Has the program office and the contractor quantified the life-cycle
costs and benefits of using an open system approach in terms of cost,
performance, logistics support, interoperability, and reuse?
l. What factors have inhibited the use of an open systems approach?
m. Have specifications and standards adopted by industry standards bodies
or market standards been used for functional and physical system interfaces?
n. For C4I systems, information systems, and systems that interface with
C4I systems, was the mandatory guidance in DoD Joint Technical Architecture
(JTA) used?
Software Engineering: TOC
Software shall be managed and engineered using best processes and
practices that are known to reduce cost, schedule, and performance risks.
It is DoD policy to design and develop software systems that are based
on systems engineering principles to include:
- Developing software systems architectures that support open systems
concepts; exploit commercial off-the-shelf (COTS) computer systems products;
and provide for incremental improvements based on modular, reusable,
extensible software;
- Identifying and exploiting software reuse opportunities, Government
and commercial, before beginning new software development;
- Selection of a programming language in the context of the systems
and software engineering factors that influence overall life-cycle costs,
risks, and potential for interoperability. Additional guidance is contained
in ASD(C3I) Memorandum, April 29, 1997;
- Use of DoD standard data. Additional guidance is contained
in DoDD 8320.1;
- Selecting contractors with domain experience in developing
comparable software systems, a successful past performance record, and
a demonstrable, mature software development capability and process;
and
- Use of a software measurement process in planning and tracking
the software program, and to assess and improve the software development
process and associated software product.
- Ensuring that information operations risks have been assessed
(DoDD S-3600.1).
- Ensuring software is Year 2000 compliant.
Questions/Steps for Consideration:
The Computer Resources Life-Cycle Management
Plan (CRLCMP) TOC
a. Determine if the CRLCMP was approved before award of the Engineering
and Manufacturing Development (EMD) contract.
b. Determine if the CRLCMP is updated to reflect changes in the program.
c. Does the CRLCMP cross-reference to the Integrated Logistics Support
Plan (ILSP)?
d. Does the CRLCMP address factors that drive requirements for software
such as system interfaces; interoperability; communication functions;
human interface; the anticipated urgency of change; and requirements for
safety, security, and reliability?
e. Is the Computer Resources Life-Cycle Management Plan being followed?
f. In your opinion, is the Computer Resources Life-Cycle Management Plan
adequate for a program in this stage of the acquisition cycle?
Strategy and Requirements TOC
g. How important and what is the magnitude of software in the system?
(Note: this assessment may determine the extent of your additional
work in this area.)
h. Are the acquisition strategy and the software development strategies
consistent?
- Does the program's acquisition strategy consider evolutionary and
incremental models of software development?
i. How well are the software requirements understood? Is the program's
acquisition strategy consistent with the level of understanding and anticipated
volatility of requirements? (Phase 0)
j. Does the software system architecture support open systems concepts?
(see Page 32, Open Systems)
k. Is the program exploiting COTS products?
l. Has the program office provided for incremental improvements based
on modular, reusable, extensible software? Were software reuse opportunities
identified and exploited before development began?
m. Determine who is defining the requirements and writing, reviewing,
and approving the specifications.
n. What programming language(s) are being used? Why were those
selected?
o. Assess how the program office is considering information warfare risks.
(see Page 5, C4ISR)
Software Evaluation and Management TOC
p. Has the program office identified a person or persons responsible
for the management/oversight of the software aspects of the program?
- Do those individuals have sufficient background and training to
perform that function?
- Have they taken the basic, intermediate or advanced courses in software
acquisition management? (Note: several Defense Science Board studies
have identified software acquisition management training as a weak
spot in overall program office management.)
q. Are software reviews conducted with customer (Government) participation?
r. Are software metrics being used to effect the necessary discipline
of the software development process and assess the maturity of the software
product? (Note: Software metrics is one management tool that should
be used and available for review. Metrics are quantifiable indices used
to compare software products, processes, or projects or to predict their
outcomes. Metrics can monitor requirements, predict development resources,
track development progress, and explain maintenance costs. Risk management
is another major effort which should be established for effective control
of software development.)
s. What other management tools is the program office using to review
the software development effort?
t. Is software being managed and engineered using best practices and
has the program office considered best practices in software acquisition
management, such as software capability evaluations or the practices recommended
by the DoD's Software Acquisition Best Practices Initiative? (Note:
The Air Force's Software Technology Support Center has published "Guidelines
for Successful Acquisition Management of Software-Intensive Systems: Weapon
Systems; Command and Control Systems; Management Information Systems."
The Guidelines are also available through the Acquisition Deskbook CD-ROM
software and on-line. Those guidelines provide program management guidance
for a software acquisition that can also be used to determine whether
adequate controls are in place.)
u. Determine whether hardware and software development is being
coordinated. (Note: Consider reviewing and comparing hardware and
software development schedules.)
v. Assess the extent of schedule slips, cost overruns, requirements creep,
test problem reports and software specification changes. (Note:
High numbers may indicate that the program has problems in the software
area.)
w. Has the program office developed an adequate process for monitoring
and validating the contractor's performance as the software is developed
and integrated through the system?
x. What steps has the program office taken to ensure that the contractor
is capable of developing the necessary software?
y. Determine the procedures used by the contractor to handle and correct
software-related problems. (See Page 33, Software Engineering)
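The note on software metrics in item r above can be illustrated with a small sketch. The metric names, values, and thresholds below are assumptions chosen for demonstration, not measures mandated by the 5000 series:

```python
# Illustrative software-metrics tracking sketch. Metric names, data,
# and review thresholds are hypothetical, not DoD-prescribed values.
metrics = [
    # (build, requirements_count, open_defects, sloc_completed)
    ("Build 1", 520, 310, 40_000),
    ("Build 2", 560, 190, 85_000),
    ("Build 3", 610, 240, 110_000),
]

baseline_reqs = metrics[0][1]          # requirements at the first build
for build, reqs, defects, sloc in metrics:
    creep = (reqs - baseline_reqs) / baseline_reqs
    density = defects / (sloc / 1000)  # open defects per KSLOC
    # Flag builds whose requirements creep or defect density exceeds
    # the (assumed) review thresholds of 10% and 2.5 defects/KSLOC.
    flag = " <-- review" if creep > 0.10 or density > 2.5 else ""
    print(f"{build}: requirements creep {creep:+.0%}, "
          f"{density:.1f} defects/KSLOC{flag}")
```

Tracking a handful of such indices across builds is one way a program office can watch requirements creep and product maturity, as the note suggests.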
Software Configuration Management Plan TOC
z. Assess the adequacy of the program office's software configuration
management process.
aa. If an actual plan exists, review it to ensure that it reflects the
most recent strategy.
bb. Determine how changes to the software are controlled.
cc. Determine whether problems were identified during physical
and functional configuration audits. Have these problems been corrected?
(Note: Consider reviewing pertinent meeting minutes or reports.)
Verification and Validation TOC
dd. Will an independent verification and validation effort be performed
on the system? If not, document the reasons.
ee. Determine who the independent verification and validation agent is
for the program office.
ff. Document the level of effort applied to the verification and validation
effort. Has the program office budgeted for the proper amount of work?
Critical Design Review TOC
gg. Were critical design reviews performed on the hardware and the software?
Did the reviews cover the entire system?
hh. If any portion of the system was omitted, what was the reason?
ii. Did the program office observe and attest to the completeness of
the critical design reviews?
Test and Evaluation TOC
jj. Determine whether the software is being tested and whether
Government representatives are observing the tests. Are user/customer
representatives participating on the software test team?
kk. There is a major responsibility to test and evaluate at both the
component level and the system level during EMD to demonstrate that
the program meets stated requirements. Does the program office have the
technical expertise to judge the contractor's work? If it lacks this
expertise, is it relying on someone else for that evaluation?
ll. Relating to test and evaluation, how does the program office ascertain
software and system maturity in determining readiness to proceed? In particular,
what software maturity criteria have been established and what mechanism
is the program office using to track progress towards meeting those criteria?
A software test program must demonstrate the requirements allocated to
the software have been met. This is in addition to the system level, integrated
development test and evaluation program. Due to the nature of software
development, the software test program usually takes additional resources
and becomes a separate, traceable effort within the program.
Software Support TOC
mm. Has the post deployment software support strategy been articulated?
nn. Who was involved in the software support planning?
Year 2000 (Y2K) Issue: TOC
The Year 2000 (Y2K) problem, also known as the Century Date Change,
arises because for the past several decades developers of computer systems
and applications have used two digits to represent the year, such as "98"
for 1998, to conserve electronic data storage and reduce operating costs.
With the two-digit format, however, the year 2000 is indistinguishable
from 1900, 2001 from 1901, and so forth. As a result, legacy systems and
some recent systems may abort or produce erroneous data once the year
2000 arrives. Y2K presents a management challenge: fixing fielded systems
will require placing the individuals able to make the corrections where
the systems have been deployed, which in turn requires integrated planning
by acquisition managers and procurement personnel at all levels.
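The two-digit ambiguity, and the common "windowing" repair, can be shown in a few lines. The pivot value of 50 is an assumption for illustration; each system had to choose its own window:

```python
# The classic two-digit-year ambiguity, and the common "windowing"
# repair: interpret two-digit years against a pivot. The pivot value
# (here 50) is an illustrative assumption, not a mandated choice.
def expand_year(yy, pivot=50):
    """Map a two-digit year to a four-digit year using a pivot window."""
    return 1900 + yy if yy >= pivot else 2000 + yy

# Naive two-digit arithmetic fails across the century boundary:
assert 0 - 98 == -98    # "00" minus "98" looks like minus 98 years
# Windowed years compare correctly:
assert expand_year(0) - expand_year(98) == 2    # 2000 - 1998
print(expand_year(98), expand_year(0), expand_year(49), expand_year(50))
```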
To assess a program's plans for addressing the Y2K problem, please refer
to Appendix B for possible audit questions or steps that should be considered.
Reliability, Maintainability, and Availability: TOC
Reliability, maintainability
and availability have a direct impact on both operational capability and
life cycle costs and therefore are important considerations for the warfighter.
The ability to successfully complete a mission is directly dependent on
the weapon or automated information system performing that mission without
experiencing a mission critical failure (reliability). An item that cannot
be repaired in a timely or efficient manner (isn't maintainable) will
unnecessarily consume warfighting resources and can decrease that warfighter's
ability to initiate and complete a mission. Warfighting capability is
also dependent on the weapon or automated information system being ready
to undertake a mission when called upon to do so (availability). From
the life-cycle cost perspective, it is widely recognized that the qualities
of reliability, maintainability, and availability result in reduced
life-cycle costs. Operating and support costs of a weapon or
automated information system are major elements of life cycle cost and
typically overshadow the development costs. The reliability, maintainability,
and availability characteristics of a system are major drivers of operating
and support costs.
The PM shall ensure that reliability, maintainability, and availability
activities are established early in the acquisition cycle to ensure that
operational requirements are met and life-cycle ownership costs are
reduced. The RAM requirements
shall be based on operational requirements and life-cycle cost considerations;
stated in quantifiable, operational terms; measurable during developmental
and operational test and evaluation; and derived from and directly
supportive of system readiness objectives. Maintainability requirements shall address
servicing, preventive, and corrective maintenance. Availability requirements
shall address the readiness of the system.
The PM shall plan and execute RAM and availability design, manufacturing
development and test activities such that equipment used to demonstrate
system performance prior to production reflects the mature design. Demonstrations
shall use production representative systems (or as near as possible) and
actual operational procedures (e.g., actual technical orders, spare parts,
tools, support equipment, and personnel with representative skill levels).
(For general guidance and information on reliability, see the "RADC
Reliability Engineer's Toolkit"; see the PAC-3 report, findings C and D,
for examples of shortfalls in a reliability program.)
The objective of this section of the PME audit is to determine whether
there are any indications that the RAM requirements will not be met or
the objective and projected RAM growth cannot be achieved.
Questions/Steps for Consideration:
RAM Objectives and Thresholds TOC
a. Were firm thresholds and objectives established for each system RAM
parameter?
b. Determine if RAM objectives are derived from and directly support
the system readiness objectives.
RAM Testing TOC
c. Determine whether a Test, Analyze and Fix (TAAF) program or process
was planned during EMD.
d. Were the verifications accomplished under natural or induced
conditions that were at least as severe as the design requirements?
e. Determine what tests have been conducted or planned under environmental
stresses to disclose weak parts and workmanship defects. Are the environmental
conditions realistic, and do they correlate directly with the design
requirements? What steps have been taken to ensure that recurrence of
failures due to weak parts and workmanship is precluded by specific quality
control provisions in the production contracts?
Corrections TOC
f. Was an effective failure reporting, analysis, and corrective action
(FRACA) system implemented?
g. Is verification of design changes prior to LRIP permitted by schedule?
h. Determine whether the contractor has a written procedure in
the system documentation to correct RAM problems that are identified.
i. Determine whether a maintainability data collection, analysis,
and corrective action system is in place.
Contract TOC
j. Have the RAM requirements been translated into contractual requirements?
k. Were specific quality assurance provisions developed for inclusion
in the production contracts that will preclude recurring defects resulting
from faulty parts or poor workmanship?
l. Determine the source of the RAM data to be used during engineering
and manufacturing development.
Data TOC
m. Determine if a maintenance task analysis has been completed. (See
Page 8, Supportability)
n. Determine if the government plans to procure a product warranty. If
so, determine what type of warranty will be procured. What analyses have
been performed to determine the cost effectiveness of obtaining a warranty?
o. Determine if separate estimates have been provided for each mission
environment.
p. Determine if goals for average Mean Time Between Failure (MTBF) are
correlated to readiness goals such as operational availability.
q. Determine if goals for average time to restore equipment have been
correlated to readiness goals such as operational availability.
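Items p and q ask whether MTBF and restore-time goals are correlated to operational availability. The textbook steady-state relationship, Ao = MTBF / (MTBF + MDT), is a simplification, and the figures below are notional rather than drawn from any program:

```python
# Steady-state operational availability:
#     Ao = MTBF / (MTBF + MDT)
# where MDT (mean downtime) includes repair time and logistics delay.
# This is a textbook simplification; the numbers below are notional.
def operational_availability(mtbf_hours, mdt_hours):
    return mtbf_hours / (mtbf_hours + mdt_hours)

ao = operational_availability(mtbf_hours=200.0, mdt_hours=10.0)
print(f"Ao = {ao:.3f}")   # 200 / 210 -> Ao = 0.952
```

The same relationship lets an auditor sanity-check whether stated MTBF and restore-time goals are even consistent with the readiness goal they are supposed to support.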
Human Systems Integration: TOC
A comprehensive management and technical strategy
for human systems integration shall be initiated early in the acquisition
process to ensure that: human performance; the burden the design imposes
on manpower, personnel, and training (MPT); and safety and health aspects
are considered throughout the system design and development processes.
(See DoD 5000.2-R, Part 4, Paragraph 4.3.8.)
Human factors engineering requirements shall be established to develop
effective human-machine interfaces, and minimize or eliminate system characteristics
that require extensive cognitive, physical, or sensory skills; require
excessive training or workload for intensive tasks; or result in frequent
or critical errors or safety/health hazards. The capabilities and limitations
of the operator, maintainer, trainer, and other support personnel shall
be identified prior to program initiation (usually Milestone I), and refined
during the development process. Human-machine interfaces shall comply
with the mandatory guidelines for all C4I systems, automated information
systems, and weapons systems that must interface with C4I systems or automated
information systems, as defined in the DoD Technical Architecture Framework
for Information Management, Version 2.0, June 30, 1994.
Reports, plans, and program decisions made by the HSI communities outside
the acquisition infrastructure (e.g., manning documents and personnel
occupational specialty decisions) must reflect and, to every extent possible,
be reflected in program design decisions, trade-offs, risk assessments,
and test results.
Questions/Steps for Consideration:
a. Review the MANPRINT Assessment for the most recent milestone decision.
Follow-up on any outstanding issues in this report pertaining to human
factors.
b. Were significant issues and concerns raised by OSD or Service Human
Systems Integration Community during the review process leading to and
supporting the most recent program milestone decision?
c. What efforts have been initiated to review interfaces that require
extensive cognitive, physical, or sensory skills that should be evaluated
for human factors engineering changes?
d. What effort has been initiated to identify and assess frequent
or critical human performance errors?
e. Assess whether the Human Factors Engineering Program provided
a baseline for improving system performance, reducing personnel and training
requirements, reducing ownership costs, and reducing or eliminating critical
human performance errors?
Interoperability: TOC
Compatibility, interoperability, and integration
are goals that must be addressed for all acquisition programs. These goals
should be determined and validated during the generation of the requirements.
Questions/Steps for Consideration:
a. Have interoperability and compatibility requirements been specified
and validated as part of the requirements generation process for the system?
b. For C4I systems, were interoperability requirements established in
compliance with DoD 4630.5, DoDI 4630.8, and CJCSI 6212.01A? (see Page
5, C4ISR)
Survivability: TOC
Unless waived by the Milestone Decision Authority, mission critical
systems, regardless of ACAT, shall be survivable to the threat levels
anticipated in their operating environment. System survivability from
all threats found in the various levels of conflict shall be considered
and fully assessed as early as possible in the program, usually during
Phase I.
Questions/Steps for Consideration:
a. Do system requirements include survivability against threat levels
anticipated in the operational environment?
- Compare most recent System Threat Analysis Report (STAR) against
the approved Operational Requirements Document (ORD).
- Determine whether the STAR has been updated to support the prior
milestone review and if it has been validated by the Defense Intelligence
Agency.
- Does the ORD include requirements for survivability against each
of the various threats described in the STAR?
- Are survivability capabilities in the ORD expressed in terms of
measurable and quantitative objective and threshold performance requirements
that can be validated through test and evaluation?
b. Have Key Performance Parameters (KPPs) relating to survivability been
extracted from the ORD and included in the Acquisition Program Baseline
(APB)? Note: Survivability KPPs should be included in the APB if
failure to meet threshold performance requirements would be so significant
that it would be cause to reassess the need for the weapon system based
on its ability to meet user needs. (See Section 2.3 of DoD 5000.2-R.)
c. If you find shortfalls in the planned system survivability, have these
shortfalls been approved by the Service or OSD milestone decision authority
after application of cost versus performance trade-offs? Have risk assessments
of the impact of the shortfalls been made?
d. Will planned and performed testing adequately assess survivability
and validate the critical system survivability characteristics defined
in the ORD and the APB? (See Page 18, Test and Evaluation)
e. Do available reports on the results of completed test and evaluation
show significant system or subsystem failures, which relate to survivability?
If so, have viable plans been established to address the failures before
any upcoming system production decision? (See Page 18, Test and Evaluation)
f. Does planned system design (as documented in system contract specifications)
fully address survivability requirements of the system, as documented
in the ORD and has a risk assessment been made to determine the impact
of not meeting the survivability goals?
Work Breakdown Structure: TOC
The Work Breakdown Structure (WBS) is the basis for communications
throughout the acquisition process. A WBS defines a defense system in
product terms -- hardware, software, services, data, facilities -- and
relates them in a family tree that displays the relationships of the product(s)
to each other and to the end product.
The WBS is a product of the systems engineering process. As new concepts
take form, the WBS helps program management organize, define and display
the emerging program. Just as a program evolves during various phases,
so too does its unique WBS. But all programs, however different their
end products may be, share common elements, for example, program management,
data, and training. After the Program WBS has been developed to reflect
both its unique and common elements, it becomes the basis for the Contract
WBS--the discretionary extension by the contractor from the Program WBS
to include all the products for which a given contractor is responsible.
In summary, the work breakdown structure is a useful tool as program
managers engage in planning and controlling the program. It relates the
various work efforts or parts of the program to the overall products or
whole system. It is the foundation for:
- Program and technical planning through Systems Engineering
- Contractor's work effort
- Schedule definition
- Cost estimation, accountability, tracking and budget formulation
- Progress status reporting and problem analysis
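The family-tree relationship described above can be sketched as a small nested structure. The element names below follow the illustrative aircraft-system breakdown in MIL-HDBK-881 and are examples only; a real program WBS is tailored to the program.

```python
# A minimal sketch of a program WBS as a family tree (nested dict).
program_wbs = {
    "1.0 Aircraft System": {                # Level 1: the end product
        "1.1 Air Vehicle": {                # Level 2: unique major element
            "1.1.1 Airframe": {},           # Level 3: subsystems
            "1.1.2 Propulsion": {},
            "1.1.3 Avionics": {},
        },
        "1.2 Training": {},                 # common elements shared by
        "1.3 Data": {},                     #   most programs
        "1.4 Program Management": {},
    }
}

def print_wbs(tree, indent=0):
    """Display the family tree, one element per line."""
    for element, children in tree.items():
        print("  " * indent + element)
        print_wbs(children, indent + 1)

print_wbs(program_wbs)
```

The contract WBS would extend the branch a given contractor is responsible for (here, the air vehicle) to lower levels, leaving the program-level structure intact.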
Program offices shall tailor a program WBS for each program using the
guidance in MIL-HDBK-881. MIL-HDBK-881 shall be cited in solicitations
and contracts "for guidance only" in extending the program WBS to develop
the complete contract WBS.
Assessing the WBS is a good way for the auditors to understand the program.
Generally, a finding in this area is rare, but review of the area may
lead to a more significant issue such as failure to address a key program
element.
Questions/Steps for Consideration:
a. Has a work breakdown structure been created to map normal processes
to meet requirements imposed by the Program Office? Did the Program
Office tailor its program work breakdown structure using MIL-HDBK-881?
b. Does the work breakdown structure provide a framework for the following:
(1) Program and technical planning?
(2) Cost estimating?
(3) Resource allocation?
(4) Performance measurements?
(5) Status reporting?
c. Does the work breakdown structure include hardware, software, services,
data, and facility requirements?
d. Does the work breakdown structure relate the elements of work to each
other and to the end product?
e. Was overall system software that will facilitate operation and maintenance
of computer systems and applications software called out at an appropriate
work breakdown structure level?
f. Are functional cost elements (engineering, tooling, quality control
& manufacturing) erroneously represented as work breakdown structure
elements?
Contractor Performance: TOC
Unless waived by the Milestone Decision Authority or designated representative,
compliance with the EVMS criteria is required on significant contracts
and subcontracts within all acquisition programs. Significant contracts
are defined as DoD development contracts and subcontracts with a value
of $70 million or more and production contracts and subcontracts with
a value of $300 million or more (in FY 1996 constant dollars).
The following exceptions apply:
o Compliance with the criteria is not required on contracts and subcontracts
that are firm-fixed price, time and materials contracts, and contracts
consisting mostly of level-of-effort work; or
o If there are situations involving significant contracts where the application
of the criteria is not believed necessary (as decided by the MDA) such
as:
oo follow-on contracts with mature production programs that are not experiencing
cost and schedule problems and where no significant changes to the product
are anticipated.
oo contracts to acquire items directly from the production lines that
manufacture commercial items.
Key Contractor Performance Documents that include contractor performance
information include but are not limited to:
- Defense Acquisition Executive Summary
- Contractor Performance Reports
- Memorandum of Agreement between DCMC and the Program Office
- Work Breakdown Structure
- Statement of compliance with C/SCSC or EVMS criteria
- Surveillance Plan/Agreement
Questions/Steps for Consideration
a. Review the Contractor Performance section of the Defense Acquisition
Executive Summary (DAES). Is it rated yellow or red?
Integrated Baseline Review: TOC
The primary emphasis of the integrated baseline review is comprehensive
planning and integration and substantiation of the validity of the performance
measurement baseline. The objectives of integrated baseline reviews are
to (1) reduce the number of government reviews and (2) improve the understanding
and use of cost performance data by contractor and government managers.
Program managers are requested to conduct an integrated baseline review
within six months after contract award to assure the accuracy of the contractor's
planning and budgeting activities.
Questions/Steps for Consideration:
a. Obtain and review the last integrated baseline review.
b. Did the Program Office conduct an integrated baseline review (IBR)
within 6 months of the contract award or of major changes to the existing
contract? If not, why?
c. Was the staff adequately trained to conduct the IBR to ensure that
the information generated from the review is a useful management tool?
d. Did the IBR include participation from the Program Office, contractor,
and DCMC?
e. Does the Contractor Performance Measurement monitor actively participate
in IBRs?
f. Did the IBR identify action items and have all action items been resolved?
g. What risk areas were identified and has management taken appropriate
actions to mitigate those risks?
h. Has the Program Manager rebaselined the program several times within
an acquisition phase? If so, why?
CPR Analysis: TOC
Questions/Steps to Consider:
a. Obtain and review 3 months of CPR data from the CPM monitor, program
office and OSD. CPR data should provide some insight into the contractor
performance and identify potential risk areas and challenges for the Program
Manager. (Note: AFMCPAM 65-501 is an extremely useful audit resource
for developing a specific audit approach to assess the contractor's performance.)
Key indicators for assessing and analyzing contractor performance include:
o Cost Efficiency/Variance
o Schedule Efficiency/Variance
o Estimate-at-Completion
o To-Complete-Performance-Index
o Cost Performance Index
o Schedule Performance Index
o Budget-at-Completion
o Management Reserve
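All of the indicators listed above derive from three quantities reported on the CPR (budgeted cost of work scheduled, BCWS; budgeted cost of work performed, or earned value, BCWP; and actual cost of work performed, ACWP) plus the budget at completion (BAC). A sketch of the standard earned-value relationships follows; the dollar figures are assumptions for illustration only.

```python
# Illustrative CPR figures in $ millions (assumed, not from any program).
bcws, bcwp, acwp, bac = 100.0, 90.0, 110.0, 1000.0

cost_variance = bcwp - acwp          # negative means a cost overrun
schedule_variance = bcwp - bcws      # negative means behind schedule
cpi = bcwp / acwp                    # Cost Performance Index
spi = bcwp / bcws                    # Schedule Performance Index
eac = bac / cpi                      # a simple CPI-based Estimate-at-Completion
tcpi = (bac - bcwp) / (bac - acwp)   # efficiency needed to finish at BAC

print(f"CV={cost_variance:+.1f}  SV={schedule_variance:+.1f}")
print(f"CPI={cpi:.2f}  SPI={spi:.2f}  EAC={eac:.1f}  TCPI={tcpi:.2f}")
```

A CPI below 1.0 paired with a TCPI well above 1.0, as in this example, is a classic warning sign: the contractor would have to perform far more efficiently than it has to date to finish within budget.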
The Program Office or DCMC may have software available (Windows and DOS-based)
that can help to perform most of the audit analysis (i.e., Performance
Analyzer).
Generally, this section will not lead to a finding, but it is an excellent
way of assessing the status of the program and identifying other program
problems such as cost overruns, schedule delays, and performance problems.
b. Determine whether there are any significant inconsistencies among
the analyses. If so, why?
c. Are there any cost or schedule variances? Why?
d. Use a range of different methodologies in assessing the contract EAC.
e. Are the contractor and program office in agreement regarding the contract
EAC?
f. How much has the EAC grown since the contract award and is it reasonable?
g. If the EAC increased, what were the causes for the EAC increase?
h. What methodologies and assumptions were used to determine the EAC?
Do they appear reasonable?
i. Discuss assumptions with DCMC and Program Office staff before making
any specific conclusions about the contractor's performance.
j. Determine the cost and schedule variances for at least the level 2
cost accounts. (More detailed analysis may be necessary based on the results
of the initial analysis.) Determine if there are any significant cost
or schedule variances and the reasons for those variances.
k. How much management reserve was set aside for the contract?
l. How is management reserve controlled?
m. How have management reserve funds been applied toward the contract?
n. How much of the work is level of effort? Is this reasonable? (Auditor's
Note: An excessive amount of level-of-effort work could indicate an ill-defined
program and may also mask true program status.)
Earned Value Management: TOC
Earned Value Management provides a basis for responsible decision
making by both contractor and DoD personnel. It produces objective, reliable
and auditable data at a summary level, which reflects cost, schedule and
technical accomplishments. This measurement tool allows government and
contractor personnel to assess work progress providing each with valuable
trend data for informed decision making.
Questions/Steps to Consider:
a. Does the Contractor Performance Measurement (CPM) monitor actively
participate in EVMS reviews?
b. Determine whether the contractor's Earned Value Management System
(EVMS) is in compliance with EVMS criteria.
c. Is the contractor required to comply with EVMS criteria?
d. Is the contractor certified as compliant with EVMS criteria? If so,
when and have there been any significant changes to the contractor's system
since it was certified?
e. Has the DCMC office and/or contractor established a surveillance plan/agreement?
f. Does the memorandum of agreement between the DCMC and the Program
Office clearly define the roles, responsibilities, and reporting requirements
of the DCMC? (See Page 16, Contract Management)
g. Has the memorandum of agreement been updated within the last 12 months
and tailored to reflect the phase of the acquisition process?
h. Has DCAA assisted with any surveillance monitoring or issued any reports
related to the Program? Did they identify any issues and how were they
resolved?
Standardization Documentation: TOC
Preference shall be given to
specifications and standards developed under the Defense Standardization
Program. DoD 5000.2-R authorized the publication of DoD 4120.3-M, which
describes the Defense Standardization and Parts Management Program.
Questions/Steps for Consideration:
a. Determine whether the Program Office gave preference to specifications
and standards developed under the Defense Standardization Program.
b. Did the Program Office keep abreast of standardization initiatives?
c. Were the following questions explicitly considered when deciding whether
or not to standardize?
- Is physical uniformity a minimum essential requirement or necessary
for ease of operation and safety?
- Will the end item be used in a variety of applications?
- Would training for operation, maintenance, or repair be improved
by standardization?
d. Is the Program Office using DoD 4120.3-M as guidance for standardization-related
issues?
e. Has the Program Office assessed the DoD Military Specification Reform
Homepage to obtain the most current information on standardization initiatives?
f. Has the system contractor recommended the use of standard materials,
parts, components, and other items?
g. Did the Program Office decide not to standardize when the following
conditions existed:
- The technology is unstable and the Program Office would like to
take advantage of technological advances?
- The primary goal of the system is to satisfy specific user preferences?
- Standardization will unacceptably inhibit design flexibility and
innovation?
Metric System: TOC
The metric system of measurement is to be used for all acquisitions
requiring new designs.
Questions/Steps for Consideration:
a. Is the metric system being used for all system elements requiring
new design?
b. If not, has a waiver been granted by the Milestone Decision Authority
based on the best interest of the Government?
Program Protection: TOC
The program must identify those parts of the acquisition which require
special consideration to prevent unauthorized disclosure or inadvertent
transfer of important technology. Planning for program protection should
begin early in the acquisition process.
Question/Step for Consideration:
Have elements of the program that require protection from unauthorized
disclosure or transfer of technology been identified?
Electromagnetic Environmental Effects
(E3) and Spectrum Management: TOC
Electric and electronic systems should be designed to be compatible
with all other electric or electronic equipment and systems within
the operational environment.
Questions/Steps for Consideration:
a. Have electronic systems been designed to be mutually compatible with
all other electronic equipment within the expected operational environment?
b. Does the system emit or receive hertzian waves? If so, was spectrum
supportability determined as required by OMB Circular A-11?
Program Management
TOC
Questions/Steps for Consideration:
a. Evaluate the adequacy of SPO staffing (training, stability, numbers).
b. Is there stability in the program manager and deputy program manager positions?
c. Is the program office placing an emphasis on training and educating
its staff on changes in acquisition, new initiatives, and implementing
policies?
d. Evaluate the adequacy of support to the SPO from matrix organizations
within the Military Department.
e. Determine whether there are problems in interfaces with system users,
DCAA, DCMC, etc.
f. Determine whether there are laws, rules, and regulations that the
SPO finds especially onerous. Give the SPO a chance to voice its opinions
whether or not we agree with them.
g. What is the program doing that is especially good and perhaps could
be held up as a model or lesson learned for other programs?
h. Determine whether there are any MILCON projects buried in the appropriations
for the weapons systems. The normal budget process and MILCON review do
not identify and scrub any construction that gets funded with RDT&E
and/or procurement programs. (Source: E-Mail from D. Steensma)
Implementing
Acquisition Reform TOC
Acquisition reform is a key focus within the Department of Defense. Although
the following questions can be asked as a separate objective, many of
the questions should be considered within the other sections.
Questions/Steps for Consideration:
a. How is the program aware of or keeping abreast of acquisition reform
initiatives?
b. How has the program made full use of the flexibilities available since
acquisition reform?
c. Does the program focus on performance-based contracting versus specification-based?
d. What is being done differently as a result of acquisition reform that
was not being done before? Find out exactly what initiatives management
is taking to "reengineer" or streamline their processes in light of acquisition
reform.
e. Are the lowest levels of the acquisition organization aware of the
initiatives? Does it appear that acquisition reform has flowed down to
all levels of the organization?
f. Specifically, relating to business process improvement--how are they
doing as far as the following initiatives:
(1) electronic commerce/electronic data interchange
(2) MILSPEC reform
(3) Single Process Initiative
(4) Integrated Product Teams (see audit steps for IPTs)
(5) Earned Value Management
(6) Simulation, Test, and Evaluation Process (STEP)
(7) CAIV
g. Throughout the audit, look for positive things that the program is
doing in the acquisition reform arena that could be an example to other
programs.
h. What metrics or measures of effectiveness has the program office set
up to measure the success of its acquisition initiatives?
i. If the program is estimating savings from the acquisition reform initiatives,
assess the reasonableness of the estimates.
Appendix A: Documentation to be Reviewed
TOC
Operational Requirements Document (ORD). TOC
The operational requirements document provides performance requirements
necessary to meet an operational need and defines the desired characteristics
and capabilities for the proposed new system. It is first prepared by
the user or a user representative during Phase 0 for use in Phase I and
is used to update the program baseline and develop contract specifications
during each subsequent acquisition phase. The ORD is required for all
programs, is validated and approved by the operational validation authority,
and is included in the Acquisition Program Baseline. The ORD identifies required
performance capabilities, provides the number of systems/subsystems needed,
addresses integrated logistics support requirements and discusses infrastructure
support and interoperability issues.
Analysis of Alternatives (AOA). TOC
The AOA is an evaluation of the advantages and disadvantages of alternatives
being considered to satisfy a requirement, to include the sensitivity
of each alternative to possible changes in key assumptions or variables.
The analysis will aid decision makers in judging whether or not any of
the alternatives offer sufficient benefit to be worth the cost.
Acquisition Strategy. TOC The
acquisition strategy establishes a framework within which detailed acquisition
planning and program execution are to be accomplished. It describes how
DoD will acquire a major system. The acquisition strategy will describe
the relationship of the essential elements of the program such as management,
technical, resource, testing, training, deployment, support, safety, procurement
and contracting.
Mission Need Statement (MNS). TOC
The mission need statement documents deficiencies in current capabilities
and opportunities for taking advantage of current technology. The MNS
is required for all potential materiel acquisitions and describes the
need in broad operational terms not in system specifics. It is submitted
to the operational validation authority who is responsible for its validation
and approval. The approved MNS is forwarded to a milestone decision authority
for a Milestone 0 decision.
Acquisition Decision Memorandum (ADM). TOC
The Acquisition Decision Memorandum documents the decision made by
the Defense Acquisition Board during Milestone Reviews. It provides written
direction to the services and is signed by the Office of the Under Secretary
of Defense (Acquisition and Technology). Typically the ADM will:
- approve the initiation of a new program and entry of the program into
Phase I;
- approve proposed changes or modifications to the acquisition strategy;
- establish program-specific exit criteria that must be accomplished
before a program can move from one phase to the next; and
- identify affordability constraints derived from the planning, programming
and budgeting system.
Test and Evaluation Master Plan. TOC
The Test and Evaluation Master Plan (TEMP) documents the overall structure
and objectives of the test and evaluation program. It provides a framework
within which to generate detailed test and evaluation plans and it documents
schedule and resource implications associated with the test and evaluation
program.
Defense Acquisition Executive Summary (DAES). TOC
The DAES is the vehicle for reporting to the Under Secretary of Defense
(Acquisition and Technology) (USD(A&T)) program assessments, unit
costs, current estimates of the Acquisition Program Baseline parameters,
status reporting of exit criteria, and vulnerability assessments. Its
purpose is to highlight potential and actual program problems before they
become significant.
Selected Acquisition Report (SAR). TOC
The SAR provides the status of total program cost, schedule, and performance,
as well as program unit cost and unit cost breach information. In the
case of joint programs, the SAR includes such information for all joint
participants. Each SAR also includes a full life-cycle cost analysis for
the reporting program and its predecessor.
Appendix B: Audit Questions/Steps
Relating to the Year 2000 Issue TOC
a. Does the program have a plan for identifying, assessing, validating/testing
for Y2K compliance?
b. What are the Y2K issues facing the program?
c. What resources are required for Y2K compliance? Is funding available?
d. Has the program identified interfaces with other systems and if those
systems are Y2K compliant? Were MOAs obtained for all external interfaces?
e. Do system/program contracts have the Y2K clause? If not, are there
actions being taken to have a clause incorporated?
f. What is the certification process? Is the Y2K compliance certification
conducted by an independent third party?
g. Depending on the status of the program/system, what contingency plans
have been made?
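Much of the checklist above comes down to how a system interprets two-digit year fields. One common remediation technique was windowing, sketched here; the pivot value of 50 is an assumption, since each remediated system chose and documented its own.

```python
def window_two_digit_year(yy, pivot=50):
    """Map a two-digit year to a four-digit year using a fixed window:
    values below the pivot are read as 20xx, the rest as 19xx.
    """
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return 2000 + yy if yy < pivot else 1900 + yy

print(window_two_digit_year(5))   # 2005
print(window_two_digit_year(99))  # 1999
```

An auditor reviewing certification evidence would look for the documented pivot, consistency of the window across interfacing systems, and test cases spanning the rollover (e.g., 99 to 00).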