CONTEXT AND PROTOCOLS FOR PERFORMANCE
February 1999
This document describes the context and protocols associated with that portion of the Office of Security Evaluations' oversight program dealing with the performance testing of protective forces. It sets forth basic guidelines, procedures, and responsibilities for planning, conducting, and evaluating protective force performance tests that are part of the Office of Security Evaluations' formal oversight activities. Although it describes Security Evaluations' overall approach and basic requirements for implementing its performance testing philosophy, this document does not provide or prescribe detailed procedures for performance test planning or conduct. There are recognized differences among the various protective forces, physical facilities, and security interests within the DOE community; these differences require a flexible approach to the application of testing and evaluation techniques. While this document describes common guidelines and procedures applicable to most performance testing Security Evaluations will require, it does not restrict the types of performance tests Security Evaluations may conduct or the manner in which they are conducted. Within the general testing philosophy expressed in this document, Security Evaluations will conduct the types of performance tests it deems necessary, using the procedures it deems necessary and appropriate, to accomplish its oversight mission. This document supersedes the Office of Security Evaluations' May 1990 Guidelines and Procedures for OSE Protective Force Performance Tests.
Performance tests have been an important part of the activities of Security Evaluations and its predecessor organizations since the inception of formal oversight of safeguards and security in the Department of Energy (DOE). Performance testing continues to play a significant role in Security Evaluations' oversight activities. The most appropriate and useful method of evaluating a protective force's ability to perform certain routine and emergency duties in its operating environment is to observe it performing those or similar duties under controlled, and sometimes simulated, conditions; that is, in performance tests. Security Evaluations' performance tests range in complexity from simple demonstrations of a single individual skill to major integrated tests involving an entire protective force shift operating with other elements of a facility's security system. Historically, artificialities driven largely by operational limitations and safety concerns have influenced and often constrained performance testing activities, particularly large-scale, complex tests and those involving firearms and force-on-force action. In recent years, requirements spurred by increased safety concerns have resulted in more formal, prolonged, and detailed planning and more stringent guidelines for conducting performance tests that involve firearms of any kind. Consequently, the appropriate site organizations must now play a much larger part in planning and conducting performance tests associated with oversight activities. Notwithstanding the larger role of site organizations, performance tests conducted for Security Evaluations' oversight purposes must be planned, conducted, and evaluated in accordance with the protocols established herein and in a manner that promotes achievement of appropriate oversight goals.
Adversary Team. Players who act in the roles (as indicated by the prescribed scenario) of adversaries during performance tests. May be composed of Composite Adversary Team (defined below) members or personnel from other sources. May also include Insiders (defined below).

Composite Adversary Team. A designated team of DOE Security Police Officers drawn from throughout the DOE complex to support Security Evaluations' performance tests by acting as members of the adversary team.

Controller. An individual assigned responsibilities to assist in the control of a performance test. Such responsibilities generally include enforcing rules of conduct, safety rules, and other control measures, as well as ensuring the timely and proper accomplishment of specific scenario events. Controllers are normally trained and certified to perform their duties, and are normally provided by organizations at the inspected site and its responsible DOE operations office. Under appropriate circumstances, Security Evaluations Evaluators may also perform Controller duties.

Engagement Simulation System (ESS). Equipment consisting of weapons-mounted laser transmitters and laser sensors mounted on potential targets (e.g., personnel, vehicles, buildings). ESS permits accurate assessment of weapons effects during simulated hostile engagements. Synonymous with MILES (defined below).

Evaluator. An individual who is assigned responsibility for formally evaluating the performance of security system elements during a performance test. For oversight activities, Security Evaluations provides the Evaluators from its pool of personnel who have been trained and certified as Controller/Evaluators.

Insider. A person from the inspected facility who is assigned to assist the Adversary Team to the best of his/her ability. For purposes of a performance test, an insider is considered to be part of the Adversary Team.
Insiders may be either active or passive, depending upon DOE threat guidance, the Site Safeguards and Security Plan, the position occupied by the Insider, and the details of the scenario being tested. The normal definitions of active and passive Insiders, as applied to Security Evaluations' performance tests, are:
Limited Scope Performance Test (LSPT). A performance test designed to evaluate specific skills, equipment, or procedures. The events of an LSPT may be interrupted to facilitate data gathering, and the events may be directed or redirected by Security Evaluations personnel in order to achieve certain evaluation goals. Although used as a data collection method for input to various rated topics, LSPTs are not themselves rated. An LSPT may or may not involve the use of ESS/MILES, live fire, and/or role players or an adversary team.

Major Performance Test. A large-scale performance test that is usually enhanced by the use of ESS/MILES and is designed to test the ability of protective force skills, procedures, and equipment to deal with a scenario threatening a DOE security interest. A major performance test may also evaluate other aspects of a security system (e.g., alarm systems, barriers). Major performance tests always employ an Adversary Team. Although each test includes a planned scenario, major performance tests involve considerable free play. The data collected during major performance tests are used as input for various rated topic areas, but the performance tests themselves are not individually rated.

Multiple Integrated Laser Engagement System (MILES). Equipment consisting of weapons-mounted laser transmitters and laser sensors mounted on potential targets (e.g., personnel, vehicles, buildings). MILES permits accurate assessment of weapons effects during simulated hostile engagements. Synonymous with ESS (defined above).

Observer. An individual who observes a performance test, but who is not a Player and has no responsibilities for controlling the test or evaluating Player performance.

Performance Test Window (also Exercise Window). The portion(s) of the performance test process when scenario activities may be executed and elements of the protection system are being evaluated.
The window is normally opened when all Players, Controllers, and other participants are in place and ready to begin and all administrative, logistical, and safety requirements for testing have been met. The window is normally closed when test objectives have been met, further useful scenario activity is unlikely, or some other event requires test termination. All evaluated scenario activity takes place during the performance test window, and no activities taking place before or after the window are considered part of actual performance test (scenario) play. When multiple scenarios or multiple iterations of a scenario are conducted, each scenario or iteration has a distinct window. Performance test windows are normally used only for tests employing force-on-force types of activities.

Player. An active participant in a performance test. May be a member of the site protective force, other Federal agencies, local law enforcement agencies, (role-playing) site employees, or the Adversary Team.

Safety Coordinator (also Safety Controller, Safety Officer). An individual responsible for ensuring that performance test plans satisfactorily address safety-related DOE policy issues and site-specific safety concerns. Responsible for identifying and mitigating hazards associated with the performance test area and planned scenario/test activities so that the test can be conducted with realism and a reasonable level of risk. Safety Coordinators are assigned by Security Evaluations, the responsible DOE field element, and the facility contractor safety organization, as necessary. Safety Coordinators are Trusted Agents (defined below) and are subject to confidentiality requirements.

Senior Controller. An individual, responsible to the Test Director, who controls performance test preparations and conduct, and to whom all Controllers report. Normally provided by a facility contractor organization, usually the security or protective force organization.
Security Evaluations designates a co-Senior Controller (usually a support contractor specialist) to work with the site's Senior Controller to ensure that appropriate test objectives are met.

Test Director. An individual who is assigned overall authority and responsibility for planning and conducting a performance test. Normally provided by a facility contractor organization, usually the security or protective force organization. Security Evaluations designates a co-Test Director (a senior Federal staff member) to work with the site's Test Director to assist in achieving a realistic, safe, and valid test.

Trusted Agent. In general, neutral individuals whose involvement in the planning, coordination, or conduct of a performance test results in knowledge about test or scenario events that must be kept confidential in the interests of test validity. Primary Trusted Agents are usually assigned by the site protective force contractor (and the responsible DOE field office, if appropriate) to assist Security Evaluations in developing, validating, and implementing scenario events and other test parameters necessary to achieve test objectives. The primary Trusted Agent(s) must have the authority to approve scenario events and test parameters on behalf of their organizations. Other individuals involved in test planning, coordination, or approval, such as test planners, safety coordinators, and building managers, who thereby gain some level of knowledge regarding a test are also considered Trusted Agents and are expected to keep all test-related information confidential.
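The performance test window defined in the glossary follows a simple gating rule: nothing counts as scenario play before the window opens or after it closes. The sketch below illustrates that rule in code. It is purely an illustration of the concept; the flag names and events are assumptions, not prescribed procedure.

```python
# Minimal sketch of performance-test-window gating. All field names,
# conditions, and sample events are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class TestWindow:
    players_ready: bool = False
    controllers_ready: bool = False
    admin_logistics_safety_met: bool = False
    is_open: bool = False
    evaluated_events: list = field(default_factory=list)

    def open(self):
        # The window opens only when all participants are in place and
        # administrative, logistical, and safety requirements are met.
        if (self.players_ready and self.controllers_ready
                and self.admin_logistics_safety_met):
            self.is_open = True
        return self.is_open

    def record(self, event):
        # Only activity inside the window counts as scenario play.
        if self.is_open:
            self.evaluated_events.append(event)

    def close(self, reason):
        # Closed when objectives are met, further useful play is
        # unlikely, or some other event requires termination.
        self.is_open = False
        self.close_reason = reason

w = TestWindow()
w.record("pre-window adversary movement")   # ignored: window not yet open
w.players_ready = w.controllers_ready = True
w.admin_logistics_safety_met = True
w.open()
w.record("adversary breach attempt")        # evaluated: inside the window
w.close("test objectives met")
w.record("post-window activity")            # ignored: window closed
print(w.evaluated_events)                   # only the in-window event remains
```

Each scenario or iteration would use its own instance, mirroring the rule that each has a distinct window.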
Security Evaluations uses performance testing to collect data on the capabilities of site protective forces and other security system elements as they relate to the protection of DOE security interests. To develop useful and valid information, the controlled conditions under which performance tests are conducted must be as realistic as possible, and any necessary constraints and artificialities must be designed to have as neutral an effect on player performance as possible. To meet the objectives of the oversight process, Security Evaluations has established the following general guidelines for performance testing conducted for oversight purposes:
Most facilities and protective force organizations now have specific, approved local procedures for performance tests or exercises involving the use of ESS/MILES equipment. It is common for the planning, coordination, and approval process for performance tests to commence up to several months before a test date and involve many formal steps and milestones. Whenever possible, Security Evaluations will observe these local requirements. The time on site and familiarity with site facilities, procedures, and personnel required to plan a major performance test while observing local requirements make it impractical for Security Evaluations personnel to play the primary role in detailed planning and test control. Major responsibility for detailed test planning and conduct therefore falls to the inspected site. Performance testing associated with oversight activities is a cooperative effort of Security Evaluations and the inspected facility, in which Security Evaluations establishes expectations and participates in the planning process, the appropriate site organization(s) accomplish detailed planning and test conduct, and Security Evaluations provides specific logistical and control support and evaluates performance. Major responsibilities regarding personnel, planning, scenario development, test conduct and control, evaluation, and logistics are described below. Security Evaluations will provide the following personnel:
The inspected facility will be expected to provide the following personnel:
Security Evaluations will have the following planning responsibilities:
The inspected facility will have the following planning responsibilities:
Security Evaluations, including the adversary team, will have primary responsibility for developing the adversary attack scenario, including target selection and specific adversary actions. Security Evaluations will coordinate scenario development and scenario events with the site's primary Trusted Agent(s). The inspected facility is expected to assist in scenario development by:
Security Evaluations will have the following responsibilities during performance test conduct:
The inspected facility will have the following test conduct responsibilities:
Security Evaluations will prepare and provide certified Evaluators to observe and formally evaluate the performance of the elements of the site's security system being tested. Security Evaluations will determine the number of Evaluators to be used, their physical locations during the test, and the evaluation criteria to be used. The inspected facility may, at its discretion and for internal purposes, concurrently collect its own evaluation data (normally using facility-supplied Controllers as Evaluators) independent of Security Evaluations' Evaluators. Security Evaluations will have the following logistics responsibilities:
The inspected facility will be responsible for providing all test-related logistical support except that provided by Security Evaluations (see above). Logistics requirements are normally similar to those associated with internal site performance tests, including but not limited to:
All equipment (radios, etc.) and transportation required by Controllers.
The provision of emergency services (fire, medical, maintenance) as required by local procedures and/or the approved test plan.
The provision of food/drink to test participants, if required.
If available, Control Net radios for Security Evaluations Evaluators.
If available, protective force (net) radios for selected Security Evaluations Evaluators.
Planning/training facilities for adversary team use before and after the performance test.
If necessary, vehicles for adversary team use during test play and/or for pre-test preparations.
The role of the adversary team is to play the part of whatever adversary is postulated for a specific test. Adversary types may run the full gamut of the DOE Design Basis Threat and could range from a disgruntled employee to a highly trained, well-equipped, state-sponsored terrorist organization. The adversary team is used to simulate, as closely as possible (within the constraints imposed by available time and equipment, safety considerations, and available skills), the actions of the postulated adversary. Individual members of the adversary team, or the team in aggregate, are not required to possess all of the skills or knowledge that the adversary they are representing (e.g., a terrorist cell) might possess. The adversary team is not being tested during Security Evaluations' performance testing activities, and members of the adversary team are not required to personally demonstrate some of the skills (e.g., explosives, electronic systems, pilot, parachutist) attributed to the role(s) they are playing. However, to achieve as much realism as possible during testing, the adversary team will be required to physically perform or simulate the actions associated with a specific scenario (for example, explosive breaching operations). Within the control and safety parameters established for the test, the adversary team will actually perform the normal physical and tactical activities (such as movement, communication, and employment of smoke and simulated small arms, grenades, and mines) required to accomplish their assigned mission.

6.2 Scenario Planning Responsibilities

The adversary team will be assigned a target and a mission by Security Evaluations test planners. They may also be provided specific instructions regarding such things as the methods and tactics, weapons, or equipment they are to employ, when such specific instructions are important to test objectives.
Within the bounds of such guidelines, the adversary team is free to develop specific plans to accomplish their mission. These plans are subject to approval by Security Evaluations planners (co-Test Director, co-Senior Controller) in cooperation with the facility's primary Trusted Agent(s). ("Approval" review, which is an informal process, considers safety, realism, fairness, and "do-ability" from a test control standpoint.) When the facility has provided a person to play the role of an Insider, that Insider will be considered a part of the adversary team and will fully participate in the team's information gathering and planning process.

6.3 Intelligence Gathering and Reconnaissance

The scope of information potentially available to adversaries, as characterized by DOE's generic threat guidance, is practically unlimited because of the capabilities of modern intelligence-gathering equipment and techniques and the long timelines often available for collection. However, due to time and resource constraints, the adversary team has very limited opportunities to develop information for planning and conducting its missions. Consequently, the following guidelines will be followed regarding information that is provided to the adversary team and that the adversary team is allowed to collect. The adversary team will be provided with any unclassified information it requests, including information concerning the facility, target, and site operations. The adversary team will normally be provided with classified information only if the scenario involves Insider assistance or if a pathway to specific classified information has been identified. Data deemed to be classified, but available through unclassified routes, will also be available to the adversary team. If an individual is provided to play the part of an Insider, that individual will normally provide classified information known to him or her or reasonably obtainable by him or her.
If an Insider is postulated but is not provided, classified information will be provided to the adversary team but will be limited to information that the specific type of insider would have or could obtain. In certain cases, particularly when no actual insider is provided, a member of the adversary team may be provided an unrestricted tour of the performance test area (including security areas, buildings, target areas, etc.) to partially compensate for terrain information that could be developed over time or with the assistance of an insider. During the planning phase, adversary team members may observe the performance test area from areas generally accessible to the public and from controlled areas that can be accessed without significant chance of detection. Such observations will be conducted overtly so as not to raise alarm if detected. Such observations will be coordinated as necessary through the primary Trusted Agent(s), and any appropriate notifications will be made so as to avoid the possibility of a security incident should any of the team members be observed and reported. NOTE: If members of the adversary team are detected while engaged in pre-test activities, such as conducting surveillance or validating aspects of the planned scenario, such detection will not affect the protective force posture during the performance test. In addition, observation of authorized adversary team movements prior to the opening of the performance test window will not be used to alert or reposition the protective force. As the test window opens, the protective force will be in its normal operational configuration, with no increased or heightened state of readiness.
The capabilities attributed to the adversary team for Security Evaluations' performance tests will be within the scope of the Design Basis Threat and appropriately approved local threat statements (if any) unless otherwise agreed. Security Evaluations may use all adversary skills, weapons, equipment, and other attributes that can be inferred from the Design Basis Threat. Security Evaluations will develop and periodically update a list of weapons, ammunition, explosives, and other equipment that will be considered a part of the adversary team's inventory. The list will be based on a sample of items generally available in the world. It will be supported by applicable data, obtained from authoritative sources, on accuracy, lethality, destructive force, reliability, etc. Security Evaluations will use those data as the basis for weapons/explosives performance during testing. The list will be published separately as Appendix A to this document, and may be classified.
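An inventory list of this kind pairs each item with the supporting performance data and its source. One way such a record could be structured is sketched below; the field names and sample values are hypothetical illustrations only, not the actual Appendix A content, which may be classified.

```python
# Illustrative record layout for an adversary-inventory entry.
# Field names and the sample values are assumptions for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class InventoryItem:
    name: str
    category: str          # e.g., "weapon", "ammunition", "explosive"
    accuracy_notes: str    # summarized accuracy data
    lethality_notes: str   # summarized lethality/destructive-force data
    reliability_notes: str # summarized reliability data
    data_source: str       # the authoritative source backing the figures

item = InventoryItem(
    name="generic assault rifle",
    category="weapon",
    accuracy_notes="effective at typical engagement ranges",
    lethality_notes="standard rifle-caliber effects",
    reliability_notes="high under field conditions",
    data_source="open-source reference data",
)
print(item.category)
```

Keeping the source alongside each figure supports the document's requirement that weapons/explosives performance during testing be grounded in authoritative data.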
Security Evaluations' goal in selecting performance test participants (Players) is to test a group that is representative of the protective force, and to select participants in a manner that is free from bias. In the case of limited scope tests, the preferred method is to select a random sample from the appropriate population. (The appropriate population may be the entire protective force, the special response team, alarm station operators, etc., depending on the test.) For major performance tests, an entire protective force shift, or that portion of a shift working at the targeted facility, may participate. The specific shift(s) tested will depend upon a number of factors, including the date and time of the test and the established shift work schedule. Security Evaluations will remain flexible in working with the inspected facility to schedule participants so as to minimize schedule disruption and overtime costs. However, Security Evaluations stresses several factors:
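The bias-free random selection preferred for limited scope tests can be sketched in a few lines of code. This is a minimal illustration under stated assumptions, not a prescribed procedure; the roster labels and sample size are hypothetical.

```python
import random

def select_participants(roster, sample_size, seed=None):
    """Draw a bias-free random sample of test participants from a roster.

    Every member of the appropriate population (the entire protective
    force, a special response team, alarm station operators, etc.) has
    an equal chance of selection, so the draw is free from bias.
    """
    if sample_size > len(roster):
        raise ValueError("sample size exceeds population")
    # A seedable generator makes the draw repeatable and auditable.
    rng = random.Random(seed)
    return rng.sample(roster, sample_size)

# Hypothetical roster used purely for illustration.
shift = [f"SPO-{n:03d}" for n in range(1, 41)]
picked = select_participants(shift, 5, seed=42)
print(picked)  # five distinct members, each equally likely a priori
```

Recording the seed alongside the result would let an inspected facility verify after the fact that the sample was drawn fairly.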
Participant safety is an important consideration in planning and conducting Security Evaluations' performance tests. Security Evaluations includes a Safety Coordinator as part of its test planning team, whose responsibility is to work with assigned facility Safety Coordinators to identify and help mitigate risks associated with testing activities. Realism is also critical to performance testing and must be preserved to the extent possible. The types of activities being tested often involve inherent risks, such as those associated with operating vehicles, running, negotiating barriers, working in an environment posing various radiological and industrial hazards, and using small arms. However, risk should be minimized while achieving the necessary levels of realism. Security Evaluations' goal is to achieve a reasonable balance so that meaningful tests can be safely conducted.
The following Department of Energy documents contain information of specific pertinence to the general subject of performance testing/exercises.