Note: The DAG does not reflect the changes in DoDI 5000.02. Work to update the content is in progress and will be completed as soon as possible.

DEFENSE ACQUISITION GUIDEBOOK
Chapter 4 -- Systems Engineering

4.2.8. Technical Reviews and Audits Overview

For DoD weapon systems development, a properly tailored series of technical reviews and audits provides key points throughout the life cycle to evaluate significant achievements and assess technical maturity and risk. DoDI 5000.02, Enclosure 4 presents the statutory, regulatory, and milestone requirements for acquisition programs. Properly aligning the technical reviews to support knowledge-based milestone decisions streamlines the acquisition life cycle and saves taxpayer dollars. As a companion to DoDI 5000.02, see the OUSD(AT&L) memorandum, “Expected Business Practice: Document Streamlining – Program Strategies and Systems Engineering Plan,” dated April 20, 2011.

Technical reviews and audits allow the Program Manager and Systems Engineer to jointly define and control the program’s technical effort by establishing the success criteria for each review and audit. A well-defined program facilitates effective monitoring and control through increasingly mature points (see Technical Maturity Point table in DAG section 4.2.1. Life-Cycle Expectations).

Technical reviews of program progress should be event driven and conducted when the system under development meets the review entrance criteria as documented in the SEP. Systems engineering (SE) is an event-driven process based on successful completion of key events, as opposed to arbitrary calendar dates. As such, the SEP should discuss the timing of events in relation to other SE and program events. While the initial SEP and Integrated Master Schedule document the expected timing of various milestones (such as the overall system Critical Design Review (CDR)), the plan should accommodate, and be updated to reflect, changes to the actual timing of SE activities, reviews, and decisions.
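The event-driven gating described above can be illustrated with a minimal sketch (not part of the DAG itself): a review is ready to convene only when every SEP-documented entrance criterion is satisfied, regardless of the calendar date. All class and criterion names below are hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class EntranceCriterion:
    """One SEP-documented entrance criterion (hypothetical model)."""
    description: str
    satisfied: bool = False


@dataclass
class TechnicalReview:
    """A technical review gated on its entrance criteria, not a date."""
    name: str
    criteria: list = field(default_factory=list)

    def ready_to_convene(self) -> bool:
        # Event driven: readiness depends on criteria completion only.
        return all(c.satisfied for c in self.criteria)


cdr = TechnicalReview("System CDR", [
    EntranceCriterion("Detailed design complete for all system elements", satisfied=True),
    EntranceCriterion("All element-level CDRs closed", satisfied=False),
])
print(cdr.ready_to_convene())  # False until every criterion is satisfied
```

The point of the sketch is that schedule dates never appear in the readiness check; updating the SEP and Integrated Master Schedule follows from when criteria are actually met.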

Figure 4.2.8.F1 provides the end-to-end perspective and the integration of SE technical reviews and audits across the acquisition life cycle.

Figure 4.2.8.F1. Weapon System Development Life Cycle


Properly structured, technical reviews and audits support the Defense Acquisition System by:

  • Providing a disciplined sequence of activities to define, assess, and control the maturity of the system’s design and technical baseline, reducing risk over time
  • Facilitating an accurate technical assessment of the system’s ability to satisfy operational requirements established in capability requirements documents
  • Providing a framework for interaction with the Joint Capabilities Integration and Development System (JCIDS) and Planning, Programming, Budgeting, and Execution (PPBE) processes
  • Providing a technical assessment and assurance that the end product fulfills the design and process requirements

Successful development of a complex weapon system requires a knowledge-based approach. Increasing levels of knowledge are a natural consequence of design maturation; however, successful programs establish a deliberate acquisition approach whereby major investment decision points are supported by requisite levels of knowledge. The Government Accountability Office’s (GAO) study on Assessments of Selected Weapons Programs (GAO-12-400SP) provides quantitative evidence to affirm this best practice.

Figure 4.2.8.F1 illustrates the notional sequence of technical reviews and audits. It also provides typical timing associated with the acquisition phases. Technical reviews should occur when the requisite knowledge is expected and required. The guidance provided in DAG sections 4.2.9. through 4.2.17. defines the entrance and exit criteria for the level of maturity expected at each technical review and audit. OSD established the expected reviews and audits for each acquisition phase in the outline for the Systems Engineering Plan (SEP). These policy and guidance documents provide a starting point for the Program Manager and Systems Engineer to develop the program’s unique set of technical reviews and audits. Tailoring is expected to best suit the program objectives (see DAG section 4.1. Introduction). The SEP captures the output of this tailoring and is reviewed and approved to solidify the program plan.

Programs that tailor the timing and scope of these technical reviews and audits to satisfy program objectives increase the probability of successfully delivering required capability to the warfighter. Technical reviews provide the forum to frame important issues and define options necessary to balance risk in support of continued development.

The technical baseline (including the functional, allocated, and product baselines) established at the conclusion of certain technical reviews informs all other program activity. Accurate baselines and disciplined reviews serve to integrate and synchronize the system as it matures, which facilitates more effective milestone decisions and ultimately provides better warfighting capability for less money. The technical baseline provides an accurate and controlled basis for:

  • Managing change
  • Cost estimates, which inform the PPBE process and ultimately the Acquisition Program Baseline (APB)
  • Program technical plans and schedules, which also inform the APB
  • Contracting activity
  • Test and Evaluation efforts
  • Risk analysis and risk balancing
  • Reports to acquisition executives and Congress

The Program Manager and the Systems Engineer need to keep in mind that technical reviews and audits provide visibility into the quality and completeness of the developer’s work products. These requirements should be captured in the contract specifications or Statement of Work. The program office should consider delivering the SEP with the Request for Proposal (RFP) and having it captured in the contractor’s SE Management Plan (SEMP); this best practice also should include delineating entrance criteria and associated design data requirements needed to support the reviews. The configuration and technical data management plans should clearly define the audit requirements.

For complex systems, reviews and audits may be conducted for one or more system elements, depending on the interdependencies involved. These incremental system element-level reviews lead to an overall system-level review or audit (e.g., PDR, CDR, or PRR). After all incremental reviews are complete, an overall summary review is conducted to provide an integrated system analysis and capability assessment that could not be conducted by a single incremental review. Each incremental review should complete a functional or physical area of design. This completed area of design may need to be reopened if other system elements drive additional changes in this area. If the schedule is being preserved through parallel design and build decisions, any system deficiency that leads to reopening design may result in rework and possible material scrap.
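The incremental roll-up described above can be sketched as a simple status aggregation: the overall system-level review convenes only after every element-level review is complete. The element names here are hypothetical examples, not drawn from the DAG.

```python
# Hypothetical element-level review status for a complex system.
# The system-level CDR can be held only when all incremental
# element-level CDRs are complete.
element_reviews = {
    "propulsion CDR": True,
    "avionics CDR": True,
    "airframe CDR": False,  # design area still open
}


def system_review_ready(element_status: dict) -> bool:
    """System-level review readiness is the conjunction of all
    incremental element-level review completions."""
    return all(element_status.values())


print(system_review_ready(element_reviews))  # False: airframe CDR still open
```

In practice a completed design area may be reopened if another element drives changes into it, so the status of a "complete" element review can regress before the overall summary review.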

Test readiness reviews (TRRs) are used to assess a contractor’s readiness for testing configuration items, including hardware and software. They typically involve a review of earlier or lower-level test products and test results from completed tests, and a look forward to verify that the test resources, test cases, test scenarios, test scripts, environment, and test data have been prepared for the next test activity. TRRs typically occur in the Engineering and Manufacturing Development (EMD) and Production and Deployment (P&D) phases of a program.

Roles and Responsibilities

For each technical review, a technical review chair is identified and is responsible for evaluating products, determining that the review criteria are met, and determining that action items are closed. The Service chooses the technical review chair, who could be the Program Manager, Systems Engineer, or an independent subject matter expert selected according to the Service’s guidance. This guidance may identify roles and responsibilities associated with technical reviews and audits. It also may specify the types of design artifacts required for various technical reviews. In the absence of additional guidance, each program should develop and document its tailored design review plan in the SEP.

The following notional duties and responsibilities associated with the Program Manager and Systems Engineer should be considered in the absence of specific Service or lower level (e.g., System Command or Program Executive Officer) guidance:

The Program Manager is responsible for:

  • Co-developing with the Systems Engineer the technical objectives of the program that guide the technical reviews and audits
  • Co-developing with the Systems Engineer the earned value credit derived from the review
  • Approving, funding, and staffing the planned technical reviews and audits; documenting this plan in the SEP and applicable contract documents
  • Ensuring the plan includes independent subject matter experts to participate in each review (maintaining objectivity during these reviews with respect to satisfying the pre-established review criteria)
  • Ensuring the plan provides timely and sufficient data to satisfy the statutory and regulatory requirements of DoDI 5000.02
  • Controlling the configuration of each baseline and convening configuration steering boards when user requirement changes are warranted. This can lead to an unscheduled gateway into the Functional Capabilities Board (FCB) and JCIDS process not identified in Figure 4.2.8.F1 above.

The Systems Engineer is responsible for:

  • Co-developing with the Program Manager the technical objectives of the program that guide the technical reviews and audits
  • Developing and documenting the technical review and audit plan in the SEP, carefully tailoring each event to satisfy program objectives and SEP outline guidance associated with the minimum technical reviews and audits. Technical review checklists are available on the DASD(SE) website.
  • Ensuring the plan is event based with pre-established review criteria for each event, informed by the knowledge point objectives in Table 4.2.1.T1
  • Identifying the resources required to support the plan, paying particular attention to the importance of the integrating activity leading up to the official review and audit. See Figure 4.2.8.F2.
  • Ensuring technical reviews and audits are incorporated into the IMP and IMS
  • Coordinating with the Chief Developmental Tester to provide at each technical review: reliability growth progress to plan/assessments, DT&E activities to date, planned activities, assessments to date, and risk areas
  • Ensuring the status of applicable design considerations is provided at each technical review
  • Establishing technical reviews and audits and their review criteria in the applicable contract documents (e.g., Statement of Work, IMP)
  • Monitoring and controlling execution of the established plans
  • Coordinating with the appointed technical review chairperson on the technical review plans and supporting execution of the technical reviews
  • Assigning responsibilities for closure actions and recommending to the chairperson and Program Manager when a system technical review should be considered complete (see Figure 4.2.8.F2)

Figure 4.2.8.F2. Technical Review Process


The Program Manager and Systems Engineer should identify key stakeholders who have an interest or role in the review, which may include:

  • Technical review chairperson
  • Program Executive Office
  • Contracting Officer
  • Defense Contract Management Agency (DCMA) and Defense Contract Audit Agency (DCAA)
  • Product Support Manager
  • Product Improvement Manager/Requirements Officer
  • End-user Community
  • Chief Developmental Tester
  • Interdependent Acquisition Programs
  • Business Financial Manager
  • Deputy Assistant Secretary of Defense for Systems Engineering (DASD(SE))
  • Service Technical Leadership such as chief engineers
  • Independent Subject Matter Experts

Review Criteria

Specific review criteria are provided in each technical review and audit section below. These criteria should be achieved and all action items closed before a technical review is considered complete. The Systems Engineer may want to consider the technical review-specific checklists available at DAU’s website as a resource.

Contract incentives are frequently tied to completion of technical reviews. The developer may have a strong incentive to call the review complete as soon as possible. The review chairperson and Systems Engineer should exercise best judgment in an objective, informed manner to ensure the reviews are not prematurely declared complete.
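Because contract incentives can pressure early closure, the completion rule above is worth stating precisely: a review is complete only when all review criteria are achieved AND every action item is closed. A minimal sketch, with hypothetical helper and field names:

```python
def review_complete(criteria_met, action_items) -> bool:
    """A technical review is complete only when every review criterion
    is achieved and every action item is closed (hypothetical helper)."""
    return all(criteria_met) and all(item["closed"] for item in action_items)


# One open action item is sufficient to keep the review open,
# even if all review criteria have been achieved.
items = [
    {"id": "AI-01", "closed": True},
    {"id": "AI-02", "closed": False},
]
print(review_complete([True, True], items))  # False: AI-02 remains open
```

The conjunction matters: satisfied criteria alone do not justify declaring the review complete while action items remain open.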

