
9.7.6. Information Assurance Testing

An integral part of the overall T&E process is the T&E of IA requirements. DoDI 5000.02, “Operation of the Defense Acquisition System,” dated December 8, 2008, directs the conduct of IA T&E during both DT&E and OT&E. To ensure IA testing adequately addresses system IA requirements, the PM must consider IA requirements that protect and defend information and information systems by ensuring their availability, integrity, authentication, confidentiality, and non-repudiation. This includes providing for restoration of information systems by incorporating protection, detection, and reaction capabilities. DoDI 8500.2, “Information Assurance (IA) Implementation,” dated February 6, 2003, specifies baseline IA controls for DoD systems. PMs should ensure adequate testing of all applicable IA controls prior to testing in an operational environment or with live data, except for those programs requiring testing in an operational environment. In consultation with the PM or Systems Manager, the Designated Approving Authority (DAA) determines which programs require testing of IA controls in an operational environment. In addition to baseline IA controls, some capabilities documents (e.g., ICD, CDD, and CPD) may also specify unique IA requirements, such as a specific level of system availability. PMs may also identify additional IA requirements as a result of the risk management process, or as directed by the DoD Components. They should also consider the impact of the DoD Information Assurance Certification and Accreditation Process (DIACAP) on the system's overall T&E cost and schedule.

Prior to conducting operational tests, programs must receive an “Interim Authorization to Operate” or “Authorization to Operate” from the cognizant DAA, followed by a corresponding “Authorization to Connect” (ATC) from the system or network manager providing the system connection (e.g., DISA).

Significant C&A activities and events should be visible on the integrated test schedule to ensure appropriate coordination of events. The DoD Component IA program regularly and systematically assesses the IA posture of DoD Component-level information systems, and DoD Component-wide IA services and supporting infrastructures, through combinations of self-assessments, independent assessments and audits, formal testing and certification activities, host and network vulnerability or penetration testing, and IA program reviews. The planning, scheduling, conducting, and independent validation of conformance testing should include periodic, unannounced in-depth monitoring and provide for specific penetration testing to ensure compliance with all vulnerability mitigation procedures, such as the DoD information assurance and vulnerability assessment or other DoD IA practices. Testing ensures the system's IA capabilities provide adequate assurance against constantly evolving threats and vulnerabilities.

PMs should consider the re-use and sharing of information to reduce rework and cycle time. The DoD memorandum “DoD Information System Certification and Accreditation Reciprocity,” dated June 11, 2009, mandated a mutual agreement among participating enterprises to accept each other’s security assessments in an effort to reuse IS resources and/or accept each other’s assessed security posture for the timely deployment of IS critical to attaining the Department’s strategic vision of net-centricity. Additionally, the DOT&E memorandum, “Procedures for Operational Test and Evaluation (OT&E) of Information Assurance in Acquisition Programs,” dated January 21, 2009, contains the OT&E strategy for IA assessment, addressing the test process, identification of required IA test resources and funding, and a reference to the appropriate threat documentation. For more information, see DAG Section 7.5.

9.7.7. Interoperability Testing

All IT & NSS must undergo joint interoperability testing and evaluation for certification prior to fielding, in accordance with section 2223 of Title 10 USC; DoDI 5000.02; DoDD 4630.05, “Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS),” dated April 23, 2007; DoDI 4630.8, “Procedures for Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS),” dated June 30, 2004; CJCSI 3170.01H; and CJCSI 6212.01F, “Interoperability and Supportability of Information Technology and National Security Systems,” dated March 21, 2012. This includes IT & NSS compliance with technical standards, Net-Ready Key Performance Parameters (NR-KPP), solution architectures, and spectrum supportability requirements. Compliance with joint interoperability test certification requirements remains a continuous process throughout a system’s life cycle. JITC bases a joint interoperability test certification on test and evaluation results from operationally realistic test configurations as well as joint and coalition environments. It then provides input to the MDA and PM for a fielding decision. The PM must plan, program, budget, execute, and provide resources according to agreed-to costs, schedules, and test plans. Interoperability requirements impact a program’s schedule and costs, so PMs must provide adequate time and funding for Interoperability and Supportability (I&S), NR-KPP, test certification, and Spectrum Supportability Risk Assessments (SSRA). Additional information can be found in DAG Section 7.6.4.

Joint interoperability certification testing involves system-of-systems and family-of-systems simulated/live events, and verifies the actual net-centric interoperability characteristics. Additionally, certification testing validates the capability’s interoperability, ensuring it is sufficient to support a fielding decision. As with most other aspects of a system, PMs should consider net-readiness early in design and test planning. The PM should include the strategy for evaluating net-readiness in the TEMP. One important aspect includes developing a strategy for testing each system in the context of the system-of-systems or family-of-systems architecture in which the system operates.

Early assessments and testing opportunities reduce interoperability risk as well as minimize the impact of interoperability requirements on schedule and program costs. Early identification and resolution of interoperability issues minimizes negative impact to the joint, multi-national, interagency, and Warfighter community. Interoperability testing of all IT & NSS follows the NR-KPP development process. Net-ready attributes determine specific measurable and testable criteria for interoperability, and operationally effective end-to-end information exchanges. The NR-KPP identifies operational, net-centric requirements with threshold and objective values that determine its measures of effectiveness (MOEs) and measures of performance (MOPs). Architectures provide a foundation to effectively evaluate the probability of interoperability and net-centricity. The NR-KPP covers all communication, computing, and electromagnetic spectrum requirements involving information elements among producer, sender, receiver, and consumer. Information elements include the information, product, and service exchanges. These exchanges enable successful completion of the Warfighter mission or joint business processes. The NR-KPP is a mandatory KPP for all program increments.
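The role of threshold and objective values in making criteria measurable and testable can be sketched in code. The helper below is purely illustrative; the function name and the example values are assumptions, not drawn from any capability document:

```python
def rate_measure(measured: float, threshold: float, objective: float,
                 higher_is_better: bool = True) -> str:
    """Rate a measured MOP/MOE value against its threshold and objective.

    Illustrative only: real NR-KPP criteria come from the capability
    document (ICD/CDD/CPD), not from this hypothetical helper.
    """
    if not higher_is_better:
        # Negate so that larger values are always "better"
        # (e.g., latency, where lower raw values are better).
        measured, threshold, objective = -measured, -threshold, -objective
    if measured >= objective:
        return "meets objective"
    if measured >= threshold:
        return "meets threshold"
    return "below threshold"

# Example: message completion rate, 95% threshold and 99% objective
print(rate_measure(0.97, threshold=0.95, objective=0.99))  # meets threshold
# Example: end-to-end latency (lower is better), 500 ms threshold, 200 ms objective
print(rate_measure(350.0, threshold=500.0, objective=200.0, higher_is_better=False))
```

The point of the sketch is that once thresholds and objectives are stated numerically, a test event either produces data that satisfies them or it does not, which is what makes the NR-KPP testable.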

JITC acts as the DoD organization responsible for joint interoperability testing and net-readiness certifications. Statute requires JITC to provide a system Net-Ready certification evaluation memorandum to the Director, Joint Staff J-8, throughout the system life cycle and regardless of acquisition category. Based on net-readiness evaluations and other pertinent factors, the Joint Staff J-8 issues a Net-Ready System Certification memorandum to the respective DoD Components as well as developmental and operational test organizations in support of the FRP Decision Review. JITC collaborates with the PM and lead DT&E organization during development of the TEMP, recommending interoperability T&E measures to ensure I&S testing satisfies all requirements during DT&E, OT&E, or IA T&E events. PMs should include JITC as a member of the T&E WIPT and ensure they participate in TEMP development. JITC’s philosophy leverages test results from planned test events or exercises to generate the necessary data for joint test and net-ready certifications; combining valuable resources, eliminating redundancy, and ultimately ensuring one test. JITC evaluates the operational effectiveness of information exchanges using joint mission threads in an operational environment. JITC establishes processes to ensure operational tests include operationally mission-oriented interoperability assessments and evaluations using common outcome-based assessment methodologies to test, assess, and report on the impact interoperability and information exchanges have on a system’s effectiveness and mission accomplishment for all acquisitions, regardless of ACAT level.

9.7.8. Software Test and Evaluation (T&E)

Software is a rapidly evolving technology that has emerged as a major component in most DoD systems. Within the DoD acquisition domain, the following are essential considerations for success in testing software, including a security-focused code audit/analysis as part of the Software Development Life Cycle (SDLC), in accordance with the “Application Security and Development” Security Technical Implementation Guide (STIG), dated June 3, 2012:

  • The T&E strategy should address evaluation of highest risk technologies in system design and areas of complexity in the system software architecture. The strategy should identify and describe:
    • Required schedule, materiel and expertise,
    • Software evaluation metrics for Resource Management, Technical Requirements and Product Quality, including Reliability,
    • Types and methods of software testing to support evaluation in unit, integration and system test phases across the life cycle,
    • Data and configuration management methods and tools,
    • Models and simulations supporting software T&E including accreditation status.
  • A defined T&E process consistent with and complementing the software and system development, maintenance and system engineering processes, committed to continuous process improvement and aligned to support project phases and reviews, including an organizational and information flow hierarchy.
  • Software test planning and test design initiated in the early stages of functional baseline definition and iteratively refined with T&E execution throughout allocated baseline development, product baseline component construction and integration, system qualification and in-service maintenance.
  • Software T&E embedded with and complementary to software code production as essential activities in actual software component construction, not planned and executed as follow-on actions after software unit completion.
  • Formal planning when considering reuse of COTS or GOTS, databases, test procedures and associated test data that includes a defined process for component assessment and selection, and T&E of component integration and functionality with newly constructed system elements.
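The practice of embedding T&E with code production, rather than deferring it until after unit completion, can be sketched with a unit test written alongside the routine it checks. The routine and tests below are hypothetical illustrations, not from any DoD program:

```python
import unittest


def parity_byte(data: bytes) -> int:
    """XOR parity over a message payload (hypothetical example routine)."""
    result = 0
    for b in data:
        result ^= b
    return result


class ParityByteTest(unittest.TestCase):
    """Unit tests authored alongside the routine, not as a follow-on action."""

    def test_empty_payload(self):
        self.assertEqual(parity_byte(b""), 0)

    def test_known_payload(self):
        # 0x0F XOR 0xF0 == 0xFF
        self.assertEqual(parity_byte(bytes([0x0F, 0xF0])), 0xFF)

    def test_self_cancel(self):
        # Any byte XORed with itself cancels to zero.
        self.assertEqual(parity_byte(bytes([0xAA, 0xAA])), 0)

# Run with: python -m unittest <module name>
```

Tests like these accumulate into the unit-level portion of the test phases listed above, and the same cases can later be reused during integration and system qualification.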

Medical devices and systems must comply with the SEP, in terms of Health Insurance Portability and Accountability Act (HIPAA) and DIACAP information protection procedures and measures. These procedures and measures ensure the software complies with the security standards specified in the Health Insurance Portability and Accountability Act of 1996 (Public Law 104-191) as well as Subtitle D of the Health Information Technology for Economic and Clinical Health (HITECH) Act, Title VIII of Division A and Title IV of Division B of the American Recovery and Reinvestment Act of 2009 (Public Law 111-5). Most medical devices will require IM/IT testing and validation of information security protocols. Given that requirement, programs should start test planning as early as possible. Programs must also validate FDA clearance prior to any medical software implementation.

9.7.9. Post Implementation Review (PIR)

Subtitle III of Title 40 of the United States Code (formerly known as Division E of the Clinger-Cohen Act) requires that Federal Agencies ensure that outcome-based performance measurements are prescribed, measured, and reported for IT (including NSS) programs. DoDI 5000.02 requires that PIRs be conducted for MAIS and MDAP programs in order to collect and report outcome-based performance information. The T&E community will participate in the planning, execution, analysis, and reporting of PIRs, whose results will be used to confirm the performance of the deployed systems and possibly to improve the test planning and execution for follow-on increments or similar systems. For further information, refer to the Acquisition Community Connection or Chapter 7.

9.7.10. System-of-Systems (SoS) Test and Evaluation (T&E)

SoS testing can reveal unexpected interactions and unintended consequences. T&E of SoS must not only assess performance against desired capability objectives, but must also characterize the additional capabilities or limitations that arise from unexpected interactions. The SoS concept should include the system in the broadest sense, from mission planning to sustainment. SoS is a new and evolving area for development, acquisition, and T&E. For further information, refer to the Systems Engineering Guide for Systems of Systems, dated August 2008.

9.7.11. Reliability Growth Testing

Reliability growth testing supports improvements in system and component reliability over time through a systematic process of stressing the system to identify failure modes and design weaknesses. The emphasis in reliability growth testing is on finding failure modes. The reliability of the system is improved, or experiences growth, as the design is modified to eliminate failure modes. The reliability growth testing approach is sometimes referred to as Test-Analyze-Fix-Test (TAFT). A successful reliability growth program depends on a clear understanding of the intended mission(s) for the system, including the stresses associated with each mission and mission durations, and configuration control. Reliability growth testing should be a part of every development program and used to provide input to predicted sustainment needs and the reliability KSA. In addition, the results should be used in developing a realistic product support package. For further information, see the DoD Guide for Achieving Reliability, Availability, and Maintainability, dated August 3, 2005, and associated template. For more information, read DTM 11–003, “Reliability Analysis, Planning, Tracking, and Reporting,” dated December 2, 2011.
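The growth pattern that TAFT produces is commonly modeled with the Crow-AMSAA (power-law) model, in which expected cumulative failures follow N(t) = λt^β, so cumulative MTBF, t/N(t), grows with test time whenever β < 1. A minimal sketch of those relationships, with illustrative parameter values only (not from any program):

```python
def cumulative_failures(lam: float, beta: float, t: float) -> float:
    """Expected cumulative failures N(t) = lam * t**beta (power-law model)."""
    return lam * t ** beta


def cumulative_mtbf(lam: float, beta: float, t: float) -> float:
    """Cumulative MTBF = t / N(t); grows with test time when beta < 1."""
    return t / cumulative_failures(lam, beta, t)


def instantaneous_mtbf(lam: float, beta: float, t: float) -> float:
    """Instantaneous MTBF = 1 / (lam * beta * t**(beta - 1))."""
    return 1.0 / (lam * beta * t ** (beta - 1.0))


# Illustrative parameters only: lam = 0.5, beta = 0.6 (growth, since beta < 1)
for hours in (100.0, 500.0, 1000.0):
    print(hours,
          round(cumulative_mtbf(0.5, 0.6, hours), 1),
          round(instantaneous_mtbf(0.5, 0.6, hours), 1))
```

In practice, λ and β are estimated from observed failure times during growth testing; a β estimate below 1 indicates the fix-and-retest cycle is actually eliminating failure modes rather than merely logging them.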

9.7.12. Evaluation of Test Adequacy

Operational Test and Evaluation adequacy encompasses both test planning and test execution. Considerations include the following:

  • Realistic combat-like conditions
  • Equipment and personnel under realistic stress and operations tempo
  • Threat representative forces
  • End-to-end mission testing
  • Realistic combat tactics for friendly and enemy
  • Operationally realistic environment, targets, countermeasures
  • Interfacing systems
  • Articles off production line preferred
  • Production representative materials and process
  • Representative hardware and software
  • Representative logistics, maintenance manuals
  • Sample size
  • Size of test unit
  • Threat portrayal
  • Properly trained personnel, crews, unit
  • Supported by typical support personnel and support package
  • Missions given to units (friendly and hostile)
  • Production representative system for IOT&E
  • Adequate resources
  • Representative typical users

9.7.13. Medical Materiel T&E

The acquisition and management of medical materiel must ensure quality, availability, and economy in meeting the clinical requirements of the Military Health System (MHS). Medical programs, by nature, consist almost exclusively of GOTS, COTS, and non-developmental items (NDI); with the participation of other government agencies (e.g., the FDA), they follow an acquisition strategy similar to that of other programs. PMs must not disregard T&E of COTS, NDI, and GFE. The operational effectiveness, operational suitability, and operational capabilities of these items and any military-unique applications must be tested and evaluated before an FRP or fielding decision. The ITT will plan to take maximum advantage of pre-existing T&E data to reduce the scope and cost of government testing.

The PM governs medical materiel procurement as a program with significant oversight, consisting of performance-based requirements composed by an IPT or a high performance team (HPT). Whether Joint or Service-specific, medical materiel must be cleared by the FDA for use, if applicable, and must comply with the FDA’s rules governing manufacturing. Medical devices must also comply with the SEP in terms of the HIPAA and DIACAP information protection procedures and measures.

PMs, Joint and Service procurement agencies, Service/Defense Agency T&E activities, and other governmental organizations assist with development of operational testing and performance evaluation criteria for medical materiel evaluation, for both developmental and non-developmental programs, as stipulated in DoDI 6430.02, “Defense Medical Materiel Program,” dated August 17, 2011. Testing of medical devices, due to the reliance on COTS items, may not involve the rigorous DT&E imposed on other systems. Unless the device was developed for military use, PMs normally limit DT&E to airworthiness and environmental testing to ensure the device does not fail under the austere or harsh conditions imposed by operational environments or interfere with the aircraft’s operating environment. Programs can integrate this testing with, or perform it alongside, operational testing events to determine the operational effectiveness and operational suitability of the device. Often, this “usability” question can identify the difference between various devices of like construction or capability.

Lead DT&E test organizations can perform medical item testing, as delineated by the individual Service/Defense Agency, and may not require the approval or input of the Service/Defense Agency OTA. Defer to Service/Defense Agency guidelines for these processes.

9.7.14. FY 2012 National Defense Authorization Act (NDAA) Section 835

Based on the FY 2012 NDAA, Section 835, a Chief Developmental Tester will be designated for MDAP and MAIS programs. PMs for MDAP programs shall designate a government test agency as the Lead DT&E organization. All of these designations shall be made as soon as practical after the Materiel Development Decision (MDD). They shall be maintained until the program is removed from OSD T&E oversight or as agreed.

The Chief Developmental Tester position shall be filled by a properly qualified member of the Armed Forces or a full-time employee of the DoD. The Chief Developmental Tester shall be in a T&E acquisition-coded position, designated as a Key Leadership Position, assigned or matrixed to the MDAP or MAIS program office, unless otherwise specified within the TEMP. The Chief Developmental Tester for a program shall be responsible for coordinating the planning, management, and oversight of all DT&E activities; maintaining insight into contractor activities; overseeing the T&E activities of other participating Government activities; and helping the PM make technically informed, objective judgments about contractor and Government T&E planning and results.

The Lead DT&E organization shall be separate from the program office. The Lead DT&E organization shall be responsible for providing technical expertise on T&E issues to the Chief Developmental Tester; conducting DT&E activities as directed by the Chief Developmental Tester; assisting the Chief Developmental Tester in providing oversight of contractors; and assisting the PM and Chief Developmental Tester in reaching technically informed, objective judgments about contractor and Government T&E planning and results.

9.8. Best Practices

Best practices as derived from lessons learned are available and continuously updated at the DAU Best Practices Clearinghouse.

9.9. Prioritizing Use of Government Test Facilities for T&E

Programs shall use DoD Government T&E capabilities and invest in Government T&E infrastructure unless an exception can be justified as cost-effective to the Government. PMs shall conduct a cost-benefit analysis (CBA) for exceptions to this policy and document the assumptions and results of the CBA in an approved TEMP before proceeding.
