Defense Acquisition Guidebook (Version 7)
9.7 Special Topics


9.7.1. Network-Centric Operations

Implementation of the Department's transformation strategy, which calls for shifting to an information-age military, will result in fewer platform-centric and more net-centric military forces. This shift requires increased information sharing across networks. The net-centric concept applies to a DoD enterprise-wide information management strategy that includes not only military force operations but also all defense business processes, such as personnel actions, fuel purchase and delivery, commodity buying, deployment and sustainment activities, and acquisition and development. Key tenets of the strategy include: handle information only once; post data before processing it; let users access data when it is needed; collaborate to make sense of data; and diversify network paths to provide reliable and secure network capabilities.

The shift away from point-to-point system interfaces to net-centric interfaces has implications for the T&E community. The challenge for the test community will be to represent the integrated architecture in the intended operational environment for test. Furthermore, the shift to net-centric capabilities will evolve gradually, no doubt with legacy point-to-point interfaces included in the architectures. PMs, with PEO support, are strongly encouraged to work with the operating forces to integrate operational testing with training exercises, thereby bringing more resources to bear for the mutual benefit of both communities. It is imperative that the T&E community engage the user community to ensure that test strategies reflect the intended operational and sustainment/support architectures and interfaces within which the intended capabilities are to be tested and evaluated.

9.7.2. Modeling and Simulation in T&E

For T&E, appropriately applied M&S is an essential tool for achieving both an effective and efficient T&E program. T&E is conducted in a continuum of Live, Virtual, and Constructive (LVC) environments. DoD Components have guidelines for the use of M&S in acquisition, especially T&E; these guidelines are intended to supplement other resources.

The PM should have an M&S subgroup to the T&E WIPT that develops the program's M&S strategy, which should be documented in the program's SEP and the TES/TEMP. Some DoD Components require planning for M&S to be documented in a separate M&S Support Plan. This M&S strategy will be the basis for program investments in M&S. M&S should be planned for utility across the program's life cycle, and modified and updated as required to ensure utility as well as applicability to all increments of an evolutionary acquisition strategy.

A program's T&E strategy should leverage the advantages of M&S, and M&S planning should address which of the many possible uses of M&S the program plans to execute in support of T&E. M&S can be used in planning to identify high-payoff areas in which to apply scarce test resources, and rehearsals using M&S can help identify cost-effective test scenarios and reduce the risk of failure. During the conduct of tests, M&S may provide adequate surrogates for stimulation when it is too impractical or too costly to use real-world assets; this is particularly likely for capability testing, testing a system that is part of a system-of-systems, hazardous or dangerous tests, tests in extreme environments, or testing the system's supportability. M&S can also be used in post-test analysis to help provide insight and to interpolate or extrapolate results to untested conditions.

To address the adequacy and use of M&S in support of the testing process, the program should involve the relevant OTA in planning M&S to ensure support for both DT and OT objectives. This involvement should begin early in the program's planning stages.

An initial goal for the T&E WIPT is to assist in developing the program’s M&S strategy by helping integrate a program’s M&S with the overall T&E strategy; plan to employ M&S tools in early designs; use M&S to demonstrate system integration risks; supplement live testing with M&S stressing the system; and use M&S to assist in planning the scope of live tests and in data analysis.

Another goal for the T&E WIPT is to develop a T&E strategy identifying ways to leverage program M&S, which could include how M&S will predict system performance, identify technology and performance risk areas, and support determination of system effectiveness and suitability. For example, M&S should be used to predict sustainability or key system attribute (KSA) drivers. The T&E WIPT should encourage collaboration and integration among the various stakeholders to enhance suitability (see section 5.2.3).

A philosophy for the interaction of T&E and M&S is the model-test-fix-model approach: use M&S to predict system performance, operational effectiveness, operational suitability, and survivability or operational security; then, based on those predictions, use tests to provide empirical data that confirm system performance and that refine and further validate the M&S. This iterative process can be a cost-effective method for overcoming limitations and constraints on T&E. M&S may enable a comprehensive evaluation, support adequate test realism, and enable economical, timely, and focused tests.
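The model-test-fix-model cycle can be sketched as a simple calibration loop. The sketch below is illustrative only: the performance model, the drag coefficient, the observed data, and the accreditation tolerance are hypothetical assumptions, not part of the Guidebook.

```python
def model_predict(speed_kts: float, drag_coeff: float) -> float:
    """Hypothetical performance model: predicted range (nm) at a given speed."""
    return 1000.0 / (1.0 + drag_coeff * speed_kts)

def model_test_fix_model(test_points, drag_coeff=0.010, tolerance=0.02, max_iter=20):
    """Model-test-fix-model: refine the model until predictions match test data.

    test_points -- list of (speed_kts, observed_range_nm) from live testing.
    Returns the calibrated coefficient and the worst remaining relative error.
    """
    for _ in range(max_iter):
        # "Test": compare model predictions against empirical observations.
        errors = [(model_predict(s, drag_coeff) - observed) / observed
                  for s, observed in test_points]
        worst = max(abs(e) for e in errors)
        if worst <= tolerance:          # model validated within tolerance
            return drag_coeff, worst
        # "Fix": adjust the coefficient in the direction that reduces error
        # (over-prediction of range implies drag is modeled too low).
        mean_err = sum(errors) / len(errors)
        drag_coeff *= (1.0 + mean_err)
    return drag_coeff, worst

# Notional empirical test data: (speed in knots, observed range in nm).
observations = [(10.0, 880.0), (20.0, 790.0), (30.0, 715.0)]
coeff, residual = model_test_fix_model(observations)
```

The loop captures the essence of the philosophy: predictions drive test design, empirical results feed back into the model, and the cycle repeats until the model is credible enough to support evaluation in conditions that were not tested.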

Computer-generated test scenarios and forces, as well as synthetic stimulation of the system, can support T&E by creating and enhancing realistic live test environments. Hardware-in-the-loop simulators enable users to interact with early system M&S. M&S can be used to identify and resolve issues of technical risk, which require more focused testing. M&S tools provide mechanisms for planning, rehearsing, optimizing, and executing complex tests. Integrated simulation and testing also provides a means for examining why results of a physical test might deviate from pre-test predictions. Evaluators use M&S to predict performance in areas impractical or impossible to test.

All M&S used in T&E must be accredited by the intended user (PM or OTA). Accreditation can only be achieved through a rigorous VV&A process as well as an acknowledged willingness by the user to accept the subject M&S for their application requirements. Therefore, the intended use of M&S should be identified early so resources can be made available to support development and VV&A of these tools. The OTA should be involved early in this process to gain confidence in the use of M&S and possibly use them in support of OT. DoDI 5000.61, "DoD Modeling and Simulation (M&S) Verification, Validation, and Accreditation (VV&A)," dated December 9, 2009, provides further guidance on VV&A.

The following is provided to help the M&S subgroup to the T&E WIPT think through the planning process to best incorporate M&S into the testing process. Additional guidance for M&S is available in section 4.5.8 and in the “Modeling and Simulation Guidance for the Acquisition Workforce,” dated October 2008.

  • Document the intended use of models and simulations:
    • Decisions that will rely on the results of the M&S.
    • The test objectives/critical operational and sustainment issues the models and simulations will address.
    • The requirements for the use of the M&S.
    • Risk of use of M&S.
  • Identify all M&S intended to support T&E including (but not limited to):
    • Type: LVC simulations, distributed simulations and associated architecture, federates and federations, emulators, prototypes, simulators, and stimulators;
    • Suitability of model use: Legacy systems, new developments, and modified or enhanced legacy M&S;
    • Management of M&S: Developed in-house, Federally Funded Research and Development Centers (FFRDC), industry, academia, and other Federal or non-Federal government organizations;
    • Source: COTS and GOTS M&S;
    • Facilities: hardware-in-the-loop, human-in-the-loop, and software-in-the-loop simulators; land-based, sea-based, and air- and space-based test facilities;
    • Threat models, simulations, simulators, stimulators, targets, threat systems, and surrogates;
    • Synthetic countermeasures, test beds, environments, and battle spaces;
    • M&S whether embedded in weapon systems, implemented as stand-alone systems, or integrated with other distributed simulations; and
    • Test assets, test planning aids, and post-test analysis tools that address other than real time characteristics.
  • Infrastructure needed to conduct the test(s), to include networks, integration software, and data collection tools:
    • Provide descriptive information for each M&S resource:
      • Title, acronym, version, date;
      • Proponent (the organization with primary responsibility for the model or simulation);
      • Assumptions, capabilities, limitations, risks, and impacts of the model or simulation;
      • Availability for use to support T&E; and
      • Schedule for obtaining.
  • Identify the M&S data needed to support T&E:
    • Describe the input data the M&S needs to accept;
    • Describe the output data the M&S should generate;
    • Describe the data needed to verify and validate the M&S; and
    • Provide descriptive information for each data resource:
      • Data title, acronym, version, date;
      • Data producer (organization responsible for establishing the authority of the data);
      • Identify when, where, and how data was or will be collected;
      • Known assumptions, capabilities, limitations, risks, and impacts;
      • Availability for use to support T&E; and
      • Schedule for obtaining.
  • For each M&S and its data, describe the planned accreditation effort based on the assessment of the risk of using the model and simulation results for decisions being made:
    • Explain the methodology for establishing confidence in the results of M&S;
    • Document historical source(s) of VV&A in accordance with DoDI 5000.61; and
    • Provide the schedule for accreditation prior to their use in support of T&E.
  • Describe the standards (both government and commercial) with which the M&S and associated data must comply; for example:
    • IT standards identified in the DoD IT Standards Registry (DISR);
    • Standards identified in the DoD Architecture Framework Technical Standards Profile (TV-1) and Technical Standards Forecast (TV-2);
    • M&S Standards and Methodologies (requires registration/login);
    • Data standards; and
    • VV&A standards:
      • IEEE Std 1516.4™-2007, IEEE Recommended Practice for VV&A of a Federation—An Overlay to the High Level Architecture Federation Development and Execution Process;
      • IEEE Std 1278.4™-1997 (R2002), IEEE Recommended Practice for Distributed Interactive Simulation - VV&A;
      • MIL-STD-3022, DoD Standard Practice for Model & Simulation VV&A Documentation Templates, dated January 28, 2008.
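As one way to make the checklist concrete, the descriptive information it asks for could be captured in a simple record per M&S resource. The structure and sample values below are illustrative assumptions only, not a mandated DoD schema; the resource named is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class MSResource:
    """Descriptive information for one M&S resource supporting T&E,
    mirroring the checklist bullets above (illustrative field names)."""
    title: str
    acronym: str
    version: str
    date: str
    proponent: str                       # organization with primary responsibility
    assumptions: list = field(default_factory=list)
    limitations: list = field(default_factory=list)
    risks: list = field(default_factory=list)
    available_for_te: bool = False       # availability for use to support T&E
    acquisition_schedule: str = ""       # schedule for obtaining the resource

# Notional entry -- every value here is a made-up example.
resource = MSResource(
    title="Joint Engagement Model", acronym="JEM", version="4.2",
    date="2010-06-01", proponent="Hypothetical Program Office",
    assumptions=["clear-weather propagation only"],
    limitations=["no electronic-attack effects modeled"],
    risks=["unvalidated above 40,000 ft"],
    available_for_te=True,
)
```

Recording each resource in one consistent form makes it easier for the M&S subgroup to track availability, VV&A status, and known limitations across the T&E program.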

M&S can also be used to dry-run a test, assessing the sensitivity of the test variables to the response variable being used and evaluating system operational effectiveness, operational suitability, survivability, or operational security before live test resources are committed.

9.7.3. Validation of Threat Representations (targets, threat simulators, or M&S)

To ensure test adequacy, OT should only incorporate validated and accredited threat representations unless coordinated with DOT&E.

The following are the recommended validation guidelines:

  • Threat representation validation supports the objective of ensuring that threat representations meet DT&E and OT&E credibility requirements. Validation of threat representations is defined as "the baseline comparison of the threat to the threat representation, annotation of technical differences, and impact of those differences on testing."
  • Validation of threat representations is typically conducted by the DoD Component responsible for the threat representation and culminates in a validation report which documents the results. DOT&E approves the DoD Component-validated reports.
  • Only current, DIA- or DoD Component-approved threat data should be used in the validation report. Specifications pertaining to the threat representation should accurately portray its characteristics and may be obtained from a variety of sources including the developer and/or government-sponsored testing. For new developments, validation data requirements should be integrated into the acquisition process to reduce the need for redundant testing.
  • Incorporation of an Integrated Product and Process Development (IPPD) process for new threat representation developments is recommended. The objective of the IPPD process is to involve DOT&E and its Threat Systems Office (TSO) early and continuously throughout the validation process. DoD Component organizations responsible for conducting threat representation validation should notify DOT&E of their intent to use an IPPD process and request DOT&E/TSO representation at meetings and reviews, as appropriate. The DOT&E representative will be empowered to provide formal concurrence or non-concurrence with these validation efforts as they are accomplished. After the IPPD process, DOT&E will issue an approval memorandum concurring with the threat representation assessment.
  • When a WIPT is not used, draft threat representation validation reports should be forwarded to the TSO for review. The TSO will provide recommendations for corrections, when necessary. Final reports are then submitted by the TSO for DOT&E approval.
  • DOT&E approval confirms that an adequate comparison to the threat has been completed. It does not imply acceptance of the threat test asset for use in any specific test. It remains the responsibility of the OTA to accredit the test resource for a specific test, and of DOT&E to determine whether the threat test resource is adequate.

These guidelines do not address the threat representation verification or accreditation processes. Verification determines compliance with design criteria and requires different methods and objectives. Accreditation, an OTA responsibility, determines the suitability of the threat representation in meeting the stated test objectives. The data accumulated during validation should be the primary source of information to support the accreditation process.

9.7.4. Mission-oriented Context

A mission-oriented context to T&E means being able to relate evaluation results to an impact on the warfighters' ability to execute their mission-essential tasks. Including mission context during test planning and execution provides for a more rigorous test environment, and allows for the identification of design issues that may not be discovered in a pure developmental test environment. The results of testing in a mission-oriented context will allow these issues to be addressed earlier in the development phase of a component or system. Additionally, testing in a mission-oriented context will allow the developmental evaluators to predict system performance against the COIs evaluated in OT&E.

Testing in a mission-oriented context will also allow the OTA to participate earlier in the development cycle and use the results of integrated tests to make operational assessments. Integrated planning of tests is a key element in this process. This allows the data to be used by the developmental community to better predict system performance and allows the OTA to potentially reduce the scope of IOT&E while still providing an adequate evaluation of the COIs.
