ExpectMore.gov


Detailed Information on the
National Nuclear Security Administration: Science Campaign Assessment

Program Code 10003405
Program Title National Nuclear Security Administration: Science Campaign
Department Name Department of Energy
Agency/Bureau Name National Nuclear Security Administration
Program Type(s) Research and Development Program
Capital Assets and Service Acquisition Program
Competitive Grant Program
Assessment Year 2005
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 91%
Program Management 83%
Program Results/Accountability 72%
Program Funding Level
(in millions)
FY2008 $286
FY2009 $323

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Continue to improve the responsiveness of the Nuclear Weapons Complex Infrastructure by coordinating program activity with the Complex Transformation Strategy Record of Decision, including transfer of Pit Certification from the Pit Campaign to Science Campaign.

Action taken, but not completed
2007

Continue to improve the efficiency and cost effectiveness of the Nuclear Weapons Complex by integrating the program requirements into development of the new Office of Defense Programs National Level Work Breakdown Structure (WBS) plan.

Action taken, but not completed

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Improving the coordination of activities across multiple NNSA programs aimed at nuclear weapons activities, particularly research and development work, across the six NNSA campaigns.

Completed
2006

Strengthening procedures to hold contractors accountable for cost, schedule, and results by expanding the linkage of Contractor awards to performance evaluation.

Completed

Nuclear weapon activities are primarily executed by contractors operating Government-owned facilities. The program needs continued improvement in holding contractors accountable and seeking efficiencies to save costs.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Cumulative percentage of progress in the development of the Quantification of Margins and Uncertainties (QMU) methodology to provide quantitative measures of confidence in the performance, safety and reliability of the nuclear weapons stockpile.


Explanation:

Year Target Actual
2004 10% 10%
2005 25% 25%
2006 40% 40%
2007 55% 55%
2008 70% Complete
Long-term Outcome

Measure: Cumulative percentage of progress in replacing key empirical parameters in the nuclear explosive package assessment with first principles physics models assessed by validation with experiment.


Explanation: By 2020, use modern physics models in assessment calculations to replace the major empirical parameters affecting energy balance, boost initial conditions, the amount of boost, secondary performance, and weapons output.

Year Target Actual
2007 36% 36%
2008 42% 42%
2009 50%
2010 60%
2011 63%
2012 66%
2013 69%
2014 69%
Long-term Outcome

Measure: Cumulative percentage of progress towards completing the Dual-Axis Radiographic Hydrotest Facility (DARHT) to provide data required to certify the safety and reliability of the U.S. nuclear weapons stockpile.


Explanation:

Year Target Actual
2004 16% 16%
2005 25% 25%
2006 60% 70%
2007 80% 95%
2008 100% 100% (Complete)
Long-term Outcome

Measure: Readiness, measured in months, to conduct an underground nuclear test, as established by current NNSA Policy.


Explanation:

Year Target Actual
2003 Baseline 36
2004 30 30
2005 24 24
2006 24 24
2007 24 24
2008 24-36 24-36
2009 24-36
Annual Output

Measure: Annual percentage of hydrodynamic tests completed in accordance with the National Hydrodynamics Plan to support the assessment of nuclear performance.


Explanation:

Year Target Actual
2004 Baseline 60%
2005 75% 75%
2006 75% 75%
2007 75% 75%
2008 75% 75%
Long-term Outcome

Measure: Cumulative percentage of progress towards creating and measuring extreme temperature and pressure conditions for the FY 2013 stockpile stewardship requirement.


Explanation:

Year Target Actual
2003 56% 56%
2004 63% 62%
2005 68% 68%
2006 70% 70%
2007 70% 70%
2008 75% 75%
Long-term Outcome

Measure: Cumulative percentage of progress towards achievement of key extreme experimental conditions of matter needed for predictive capability for nuclear weapons performance. (New measure, added February 2008)


Explanation: By 2015, achieve a greater-than-unity value of the average of the ratio of achieved conditions to needed conditions.

Year Target Actual
2007 13% 13%
2008 18% 18%
2009 25%
2010 35%
2011 55%
2012 75%
2013 85%
2014 90%
2015 100%
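The target stated in the explanation above (by 2015, the average achieved-to-needed ratio exceeds unity) can be written compactly as follows; the notation here (an index i over N key experimental conditions, with C denoting a condition value) is introduced for illustration and is not part of the source:

```latex
% Average of the achieved-to-needed condition ratios must exceed unity by 2015
\frac{1}{N}\sum_{i=1}^{N}\frac{C_i^{\mathrm{achieved}}}{C_i^{\mathrm{needed}}} > 1
```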
Annual Efficiency

Measure: Annual average cost per test, expressed in terms of thousands of dollars, of obtaining plutonium experimental data on the Joint Actinide Shock Physics Experimental Research (JASPER) facility to support primary certification models.


Explanation:

Year Target Actual
2004 Baseline Baseline
2005 $405K $405K
2006 $380K $380K
2007 $360K $360K
2008 $340K $340K
2009 $340K

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The Science Campaign has a clear purpose: to develop improved capabilities to assess the safety, reliability, and performance of the nuclear package portion of weapons without further underground testing; enhance readiness to conduct underground nuclear testing if directed by the President; and develop essential scientific capabilities and infrastructure. The supporting objectives of the Science Campaign are to: (1) Develop knowledge, tools, and methods to assess with confidence the performance of a nuclear weapon without further underground testing; (2) Develop materials and technologies required to solve stockpile problems; (3) Maintain the readiness of the National Nuclear Security Administration (NNSA) to conduct nuclear testing as directed by the President; and (4) Develop and maintain essential scientific capabilities and facilities in support of NNSA missions. This purpose and these objectives provide key support to the DOE Defense Strategic Goal (protect our national security by applying advanced science and nuclear technology to the Nation's defense) and the NNSA General Goal (ensure that our nuclear weapons continue to serve their essential deterrence role by maintaining the safety, security, and reliability of the U.S. nuclear weapons stockpile).

Evidence: DOE Strategic Plan, September 2003; NNSA Strategic Plan, November 2004; NNSA Strategic Planning Guidance for FY 2007-11, March 2005; President's FY 2006 Budget and NNSA FY 2006-2010 Future-Years Nuclear Security Program (FYNSP), Feb 2005; FY 2006-11 Science Campaign Program Plan; and individual FY 2006 Science Campaign Sub-Program Implementation Plans.

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: Absent underground nuclear testing, development of a certification methodology and predictive capabilities of the accuracy required for certification are fundamental to the success of science-based nuclear weapons stockpile stewardship. Detailed needs will include experimentally validated models of the physical properties and processes involved in predicting the performance of aged and rebuilt nuclear weapons. These models will be required for incorporation in Advanced Simulation and Computing (ASC) codes that will be used to assess the impact of problems discovered in the stockpile or of refurbishment decisions. The experimental capabilities provided by the Science Campaigns are critical for the validation of integrated ASC code predictions and for the assessment of stockpile issues.

Evidence: NNSA Strategic Plan; NNSA Strategic Planning Guidance; Defense Programs Strategic Vision for 2030, Feb 05; FY 2006 President's Budget/NNSA FYNSP; Science Campaign Program Plan; and individual Science Campaign Implementation plans.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The Atomic Energy Act of 1954, as amended, delegates control of nuclear weapons activities to the DOE/NNSA, forbidding other Federal agencies from performing nuclear weapons work. Absent underground nuclear testing, the Science Campaign provides the unique lead effort within NNSA for the development and maintenance of a science-based nuclear weapons stockpile. It is also the only program tasked with maintaining the readiness to conduct underground nuclear tests, should the President so direct. Annually, the NNSA Planning, Programming, Budgeting, and Evaluation (PPBE) process ensures no duplication of effort among the NNSA programs. Program planning documents are required to include descriptions of linkages to other NNSA work to show required integration and avoid duplication of efforts within NNSA. The NNSA draws on DoD, industry, and academia for contributing technology and research as appropriate.

Evidence: Title 10 of the Code of Federal Regulations (10 CFR), Energy; FY 2006 President's Budget/NNSA FYNSP; NNSA PPBE documents on the PPBE web site; integration strategy for Campaign 02 (described in the Dynamic Materials Properties Program and Implementation Plan for FY 2004, October 24, 2003 on page 15); University Partnerships Major Technical Efforts (MTEs) in Science Campaign Implementation Plans; and DOE/DoD Joint Munitions Program Memorandum of Agreement.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The four Science Campaign activities were designed to comprehensively address the areas of physics required to provide the models and experimental basis to support certification, namely: (1) Primary Physics/Assessment Technology, (2) Stockpile Dynamic Materials Properties, (3) Advanced Radiography, and (4) Secondary Physics/Certification and Nuclear Systems Margins. In order to prioritize the work within these campaigns, a new certification methodology is being developed called "Quantification of Margins and Uncertainties (QMU)." The sources of uncertainty in weapons performance are prioritized to ensure that higher priority problems are addressed before those of lesser priority. The grouping of the 4 activities into an overall Science Campaign improves our management flexibility to address priorities.

Evidence: JASON reviews of Radiography, QMU, and Pit Lifetimes; 2001 and 2003 Reports to Congress of the Panel to Assess the Reliability, Safety, and Security of the United States Nuclear Stockpile ("Foster Panel"); 1999 and 2000 DoD Performance, Analysis, and Evaluation Assessment of Test Readiness; 2003 and 2005 Test Readiness Reports to Congress; and 2002 Enhanced Test Readiness Cost Study released to Congress March 2003.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The principal beneficiaries of the efforts of the Science Campaign will be the laboratory designers who will use the tools and knowledge developed to support specific stockpile activities funded through Directed Stockpile Work (DSW). QMU will be a major aid in prioritizing certification efforts. The Campaign's Level 1 and 2 milestones are paced to support the corresponding DSW effort and are documented in the Science Campaign Program Plan and Implementation Plans. Planned Science Campaign funding and priorities pass to the national laboratory technical organizations that accomplish the program of work as specified in Work Authorizations (WAs), developed through agreement with the Management & Operating (M&O) contractors on program and technical priorities. The program tracks expenditures monthly at the sub-program level using its official Budget and Reporting (B&R) classification codes and the DOE Financial Information System. Periodic reviews track work progress to confirm that the level of effort is consistent with the planned schedule and obligated funding. These procedures also apply to the University Partnerships component of the program, through competitive processes, where appropriate.

Evidence: President's FY 2006 Budget/NNSA FYNSP; FY 2004 and FY 2005 NNSA Budget Summaries; Science Campaign Program and Implementation Plans; monthly financial reports; monthly Work Authorizations; program review records; NNSA DP Quarterly Program Reviews (QPRs) and the Administrator's Annual Science Campaign Program Review; the Stewardship Science Academic Alliances Program (defined and announced in Solicitation # DE-PS03-01SF22349, with the selection process described in the "Evaluation and Selection Plan" issued March 3, 2001); review of the WSU/ISP grant renewal, documented in the DNCFA created on January 10, 2003; and peer review of the University of Nevada/Las Vegas cooperative agreement, 2003, and other subsequent reviews, which are properly documented.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Science Campaign has a limited number of long-term measures to meaningfully address progress in achieving the program purpose. They are: (1) By 2010, complete development of the Quantification of Margins and Uncertainties (QMU) methodology to apply quantitative measures of confidence in the performance, safety, and reliability of the nuclear weapons stockpile; (2) By 2008, complete the Dual-Axis Radiographic Hydrotest Facility (DARHT) to support hydrotests required for nuclear weapons certification; (3) By the end of FY 2006, achieve an 18-month underground nuclear test readiness; (4) Annually, complete at least 75% of all scheduled hydrodynamic tests in accordance with the National Hydrodynamics Plan to support the assessment of nuclear performance; and (5) By 2008, reduce the average cost of obtaining plutonium experimental data on the Joint Actinide Shock Physics Experimental Research (JASPER) Facility to 80% ($340K) of the FY 2004 baseline cost ($426K).

Evidence: FY 2006 President's Budget/NNSA FYNSP; current Science Campaigns Milestones in the Office of Defense Programs (DP) Milestone Reporting Tool (MRT); FY 2006-11 Science Campaign Program Plan; and individual FY 2006 Science Campaign Implementation Plans.

YES 9%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Developing the tools and models required to achieve the required level of accuracy for the prediction of nuclear performance without underground nuclear testing is a very ambitious but critical goal to meet national security requirements.

Evidence: FY 2006 President's Budget/NNSA FYNSP; DP Priorities; current Science Campaigns Milestones in the DP MRT; FY 2006-11 Science Campaign Program Plan; and individual FY 2006 Science Campaign Implementation Plans.

YES 9%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The program has a limited number of annual measures to meaningfully assess progress in achieving the program's long-term measures. For FY 2007, they are: (1) Complete 55% of progress toward development of the new QMU methodology; (2) Complete 80% of progress towards completion of the DARHT; (3) Maintain 18-month readiness to conduct an underground nuclear test; (4) Complete 75% of the scheduled hydrodynamic tests; and (5) Achieve a JASPER per-test cost that is 85% ($360K) of the FY 2004 baseline cost. For additional year targets, see the PART Measures Tab.

Evidence: FY 2006 President's Budget/NNSA FYNSP; current Science Campaigns Milestones in the DP MRT; FY 2006-11 Science Campaign Program Plan; and individual FY 2006 Science Campaign Implementation Plans.

YES 9%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: Multiple-year funding baselines are established and adjusted for inflation. The annual performance targets are ambitious. Any deviations from the baseline are identified and explained in the annual NNSA Planning, Programming, Budgeting, and Evaluation (PPBE) process. Performance baselines are established to support current priorities using national-level Level 1 milestones, which in turn are supported by programmatic Level 2 milestones. Milestones are maintained in the Office of Defense Programs (DP) Milestone Reporting Tool (MRT). Progress on these milestones is tracked during Quarterly Program Reviews (QPRs), as well as during periodic program technical reviews. The work of the individual Science Campaign activities is to expand the technical capabilities and expertise needed to assess and maintain the nuclear weapon stockpile. Thus, our knowledge- and capability-oriented targets are moderate- to high-risk endeavors. Our level-of-effort activities incorporate performance targets that are more easily quantified and represent lower-risk endeavors.

Evidence: FY 2005 President's Budget/NNSA FYNSP; current Science Campaigns Milestones in the DP MRT; FY 2005-10 Science Campaign Program Plan; and individual FY 2005-07 Science Campaign Implementation Plans.

YES 9%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The long-term and annual goals of the Science Campaign are agreed to jointly between the NNSA and the NNSA national laboratories then reviewed and recommitted to annually. The NNSA PPBE process explicitly ties Science Campaign planning, programming, and budgeting to the work to be performed towards meeting these long-term program goals and aligns it with other NNSA program needs and progress. Contracts for the operation of the national laboratories also tie performance assessments to these long-terms goals. Significant support is additionally required from Bechtel Nevada, at the Nevada Test Site, whose support to the Science Campaign is evaluated on a semi-annual basis as part of their performance-fee assessment.

Evidence: NNSA Strategic Plan; FY 2006 President's Budget/NNSA FYNSP; FY 2006-11 Science Campaign Program Plan; individual FY 2006 Science Campaign Implementation Plans; NNSA PPBE documents on the NNSA web site; and contract statements of work (e.g., Appendix F measures in the University of California contracts for LANL and LLNL).

YES 9%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: A broad and comprehensive set of independent technical, program, and budget reviews is conducted periodically to assure that key performance areas are meeting expectations and to provide feedback for improvement. Ad hoc reviews are conducted in areas of special concern. These reviews often result in changes to program activities, schedules, and performance expectations. Program implementation and experimental plans are then modified and updated as appropriate. The U.S. Government Accountability Office (GAO) is currently conducting an audit of the Science Campaign.

Evidence: University of California Laboratory Division Review Committees; U.S. Strategic Command Strategic Advisory Group Stockpile Assessment Team (SAGSAT) reviews; JASON reviews, including the Review of Petawatt Laser Needs, 2002; Defense Nuclear Facilities Safety Board (DNFSB) reviews for safe operations; the Report of the Panel to Assess the Reliability, Safety, and Security of the United States Nuclear Stockpile ("Foster Panel"); National Academy Review of High Energy Density Physics; peer-reviewed competition for Financial Assistance Awards for Defense Science University programs; DoD Program Analysis and Evaluation review of Test Readiness; Nuclear Weapons Council Standing and Safety Committee reviews; annual budget reviews; and annual peer review of the programs supported by the cooperative agreements with the University of Nevada at Las Vegas and Reno, University of Washington, Carnegie Tech, Cornell, and others. DARHT II project reviews and annual reviews of DSW and ASC include input from the Science Campaign.

YES 9%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The program's budget request is explicitly tied to the accomplishment of its annual targets and long-term goals. The long-term goal is linked to the annual targets that cascade into detailed technical milestones and baselines. Resource trade-offs are evaluated based on their impact on the goal and targets. These performance-based budgeting decisions are used to develop budget requests, thus allowing for the resource needs to be presented in a complete and transparent manner in the budget. In addition, the program has its own Budget & Reporting (B&R) structure to track resource expenditures independently. During execution, program and financial performance can be corporately monitored and assessed to allow results to influence future budget requests. All direct and indirect costs to attain the performance results for the Science Campaign are reported in the B&R categories. The cost for the Science Campaign Federal employees is carried in a separate NNSA program direction account, as required by the Congress.

Evidence: FY 2006 President's Budget/NNSA FYNSP; NNSA PPBE Guidance Documents located on the NNSA web-site; NNSA FY 2006 Program Decision Memorandum; and NNSA Strategic Plan.

YES 9%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The realignment of four campaigns under the Science Campaign "Program" umbrella was done to better coordinate and manage these activities. Program management has taken steps to adjust strategic planning to better meet stockpile stewardship needs and priorities. The FY 2006 PPBE process reflected lessons learned from the first year's experience. The NNSA campaign approach has aligned specified campaigns to support the Directed Stockpile Work (DSW) effort to manage the nuclear weapons stockpile. This aligns the NNSA budget with the mission. It also requires the establishment of long-term technical goals to meet the goals of stockpile stewardship. At this point, significant weaknesses are still apparent, particularly with regard to the coordination of activities across multiple programs, such as the Science Campaign, Engineering Campaigns, and DSW. A notable example is the contributions of these programs to the evaluation of pit lifetimes. As a result, an NNSA HQ Pit Lifetime Working Group (PLWG) has been chartered to help coordinate these efforts across campaigns. The results of this effort should help to develop new approaches to address deficiencies in strategic planning for cross-cutting issues in the future. The new QMU primary certification methodology will also aid in future planning.

Evidence: NNSA Strategic Plan; NNSA PPBE Guidance documents; FY 2006 President's Budget/NNSA FYNSP; NNSA HQ PLWG Charter; FY 2006-11 Science Campaign Program Plan; and individual FY 2006 Science Campaign Implementation Plans.

NO 0%
2.CA1

Has the agency/program conducted a recent, meaningful, credible analysis of alternatives that includes trade-offs between cost, schedule, risk, and performance goals, and used the results to guide the resulting activity?

Explanation: After thorough technical and program reviews, for instance, NNSA decided to limit Advanced Hydrotest Facility efforts to technology development only, curtailing expensive conceptual-design activities until the full need for a potential facility is justified. Funding was shifted to enable fuller participation of the Los Alamos National Laboratory (LANL) in primary assessment work, vital both to the health of the national program and to the LANL programs in primary assessment, which were underfunded. Further high-impact requirements of the Science Campaigns that would require new facilities or capabilities are under review and study, and decisions to go forward will be based on clearly demonstrated need.

Evidence: JASON review of Radiography and Radiography report to the Congress; FY 2005 President's Budget/NNSA FYNSP; and decision to shut down the PHERMEX accelerator at LANL. The 2003 review of the LLNL experiments at NTS resulted in a decision to reduce funding for subcritical experiments in favor of more JASPER shots. At LANL, proton radiography work was reduced in order to provide adequate funding for the DARHT second axis.

YES 9%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?

Explanation: While the goals of the Science Campaign in support of the nuclear weapons program are unique, a number of measures are taken to ensure that technical performance is on par with the highest standards of the scientific community. For instance, at least one major programmatic area is reviewed by JASON every year. The University of California (UC) President's external advisory panel, composed of top practitioners from the academic science community, reviews every relevant program at both UC-operated laboratories on a semi-annual basis. Additionally, major problems have been chosen for review on a recurring basis by the National Academy of Sciences and various panels established by the DOE and the Office of Science and Technology Policy (OSTP). The program has a strong collaboration with the United Kingdom (UK) through the Joint Working Group process. Performance of programs is additionally reviewed on a frequent basis by the Nuclear Weapons Council (NWC) and its Standing and Safety Committee (SSC) and by the Commander of U.S. Strategic Command through the Strategic Advisory Group Stockpile Assessment Team (SAGSAT). The DOE/DoD Joint Munitions Program Memorandum of Understanding (MOU) results in sharing of R&D of mutual benefit, with semi-annual program reviews.

Evidence: U.S.-UK Joint Working Group proceedings; Stewardship Science Academic Alliances solicitation, # DE-PS03-01SF22349; JASON reports; NWC, NWC SSC, and SAGSAT meeting minutes; and periodic symposia and site visits that are used to review efforts and strengthen collaborations with the National Laboratories. The design laboratories, and, to a lesser extent, the plants and engineering laboratories, act to compete with and peer review one another. All of the MTEs under the Science Campaign result in the generation of peer-reviewed journal publications.

YES 9%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: Overall NNSA budget priorities are established and coordinated during each annual PPBE process cycle. Within the Science Campaign, priority is given to facility operation and technical R&D work to meet requirements set by DSW. Scientific priorities are further guided by the Quantification of Margins and Uncertainties (QMU) methodology, with the greatest weight given to those items that contribute the most to reducing uncertainty in performance assessments.

Evidence: NNSA PPBE Guidance Documents located on the NNSA web-site; NNSA FY 2006 Program Decision Memorandum; annual contractor budget submittals; FY 2006-11 Science Campaign Program Plan; and individual FY 2006 Science Campaign Implementation Plans.

YES 9%
Section 2 - Strategic Planning Score 91%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Individual Federal Science Campaign Program and Sub-program Managers are in close communication with laboratory management for tracking program goals, budget, performance, and issues that arise. NA-10 QPRs review milestone accomplishment. Program technical reviews and campaign reviews are held on a periodic basis, and the performance of key events such as important hydrotests, subcritical experiments, and Joint Actinide Shock Physics Experimental Research (JASPER) shots is tracked and results reviewed, as available.

Evidence: QPR briefings; records of technical program review presentation materials; NA-10 MRT reports; meeting minutes; status reports; annual reports for university grants programs; and annual reviews of Academic Alliance centers.

YES 8%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Performance evaluations for Federal managers include measures and criteria that hold them accountable for program results and effective management, and NNSA issues work authorizations, based on program input, that reference specific performance measures to hold site M&O contractors accountable for accomplishing the scope and schedule within the allocated funds. Nevertheless, there is significant room for improvement. Annual M&O contractor evaluations are based on the work authorizations and expected performance, and award fees and/or award contracts are tied to performance. For example, the LANL management contract is being competed in order to address issues associated with business practices and management at the Laboratory.

Evidence: Appendix F measures in the UC contract; M&O contractor performance and evaluation plans; and the 2003 NNSA Director memorandum indicating NNSA's commitment to establish the criteria for the competition of the University of California contract.

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds and work requirements are linked in program planning documents. During execution, funds are distributed by Federal Managers to Contractors via Work Authorizations (WAs) and Approved Funding Plans (AFPs). Monthly reports of obligations versus distributions are tracked closely. Unspent funds at the end of the year have been within acceptable parameters identified by DOE. Laboratory-level resource analysts report program execution results monthly for review by program management. Program management tracks expenditures at the sub-program level using its official Budget and Reporting classification codes and the DOE Data Warehouse/Financial Information System.

Evidence: FY 2006 President's Budget/NNSA FYNSP; Science Campaign FY 2006-11 Program Plan and individual FY 2006 Implementation Plans; NNSA annual Budget Summaries; monthly NNSA financial reports; WA and AFP files; and program review briefings.

YES 8%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: While most of the expenditure in the Science Campaign is considered reasonably cost effective, comparisons are difficult because the work involves unique R&D activities and special costs associated with hazards such as high explosives, high-power lasers, and special nuclear materials, performed by M&O contractors. Some facility costs can be evaluated and roughly compared to the costs of similar facilities elsewhere; however, as a rule this has not been done to date. The productivity of experimental facilities can also be monitored and improved, a function of the user committees established for operating facilities. A cost comparison has, for example, been made between the Los Alamos Neutron Science Center (LANSCE) and the UK ISIS facility. Additionally, international programs provide for sharing of experimental results, which allows for cost reduction through avoidance of duplication. This type of activity needs to be conducted and documented in a more consistent manner in order to claim success.

Evidence: The design laboratories, and to a lesser extent the plants and engineering laboratories, compete with and peer review one another. The laboratories, independently of NNSA HQ, make major procurements on a competitive basis. There is a continuing effort to improve the balance of small-scale, relatively inexpensive experiments (e.g., JASPER) with larger-scale experiments to maximize value.

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The NNSA PPBE process facilitates this integration of activity prior to and throughout budget development. Program and Implementation Plans for each Defense Programs element have dedicated sections on integration with other programs. Quarterly Program Reviews for Defense Programs cover all programming elements, including the Science Campaign, to ensure coordination between the interdependent programs. Science Campaign program management coordinates with managers of the Pit Manufacturing & Certification Campaign, Advanced Simulation and Computing (ASC) Campaign, and DSW in many areas. In fact, the major deliverables of the Science Campaign are for DSW and ASC. Other examples include the hydrodynamic test facilities, coordinated through the National Hydrodynamics Plan; collaboration on common concerns such as nuclear material shipping; and collaboration on experimental programs for the determination of materials properties. The Pit Lifetime Working Group was set up within NNSA HQ and led by Science Campaign staff specifically to address coordination between programs.

Evidence: NNSA PPBE documents on the NNSA web site; National Hydrodynamics Plan; shipping issue papers/meeting minutes; PLWG Charter and review documents; FY 2006-11 Program Plan; and individual FY 2006 Science Campaign Implementation Plans.

YES 8%
3.6

Does the program use strong financial management practices?

Explanation: The Science Campaign is covered by DOE's financial management policies, procedures, and practices, which meet all statutory requirements. Accounting services for NNSA are provided by DOE and are free of material internal control weaknesses. DOE's financial statements have received a clean audit opinion in 8 of the last 9 years. The Science Campaign Program Manager and staff review accounting reports monthly to monitor obligations and costs for all projects and sites. Day-to-day NNSA operations are supported through the NNSA PPBE process, which requires the integration of financial and performance management information systems at each phase.

Evidence: DOE Financial Management Orders; monthly NNSA financial reports; and NNSA PPBE Guidance Documents located on the NNSA web-site.

YES 8%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The combining of the four Science Campaigns into a single program increased management control and flexibility. The NNSA has reengineered its Federal procedures and work force in an effort to conduct its mission more effectively and efficiently. Meaningful efforts include implementation of an Integrated Construction Program Plan to select and prioritize supporting construction projects; implementation of the PPBE process to improve performance-based budgeting; and NNSA federal workforce realignment. The Office of Defense Programs also addressed program integration through the formation of the Office of Program Integration, NA-13. In the past FY, additional program efforts include JASON reviews of QMU and primary lifetime activities, and workshops on primary physics, plutonium lifetimes, and high explosives to address primary certification.

Evidence: Science Campaign FY 2006-11 Program Plan; NNSA PPBE Guidance Documents located on the NNSA web-site; FY 2006 President's Budget/NNSA FYNSP; Integrated Construction Program Plan for the President's FY 2006 Budget, March 23, 2005; and NNSA of the Future (http://hq.na.gov/future/) web site.

YES 8%
3.CA1

Is the program managed by maintaining clearly defined deliverables, capability/performance characteristics, and appropriate, credible cost and schedule goals?

Explanation: Principal facility efforts recently completed or brought to operational status under the Science Campaign include DARHT commissioning, JASPER construction and commissioning, and ATLAS relocation. Each of these activities has had baseline costs, schedules, and deliverables.

Evidence: DOE Manual 413.3-1, Project Management for the Acquisition of Capital Assets, March 28, 2003; DARHT Commissioning Plan; ATLAS Relocation Plan; FY 2006 President's Budget/NNSA FYNSP; FY 2006-11 Science Campaign Program Plan; and individual FY 2006 Science Campaign Implementation Plans.

YES 8%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: Financial Assistance Awards are made in accordance with standard DOE practices and consistent with 10 CFR 600. The Stewardship Science Academic Alliances (SSAA) Program was administered through a competitive solicitation process that included a rigorous technical merit review. Grants and Cooperative Agreements are monitored on an annual basis and are subject to a triennial DOE merit review process.

Evidence: SSAA Solicitation # DE-PS52-05NA25930, November 15, 2004; SSAA Evaluation and Selection Plan, November 2004; 10 CFR 600; and Selection Statements of August 20, 2002, and May 28, 2003.

YES 8%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: For the SSAA Program: (1) as part of the award/contract process, universities must provide a statement of work for the duration of their project; (2) recipients of Financial Assistance Award grants and cooperative agreements are required to provide pre-print copies of all technical publications submitted to journals, conference proceedings, etc.; (3) recipients of cooperative agreements are subject to annual site visits; and (4) attendance is requested at the SSAA Technical Conference. In addition, all Financial Assistance Award recipients (including those in the SSAA Program) holding grants and cooperative agreements are required to file annual and final project progress reports with their NNSA Technical Program Manager. Individual campaign sub-program managers also provide oversight through periodic program reviews, site visits, and program progress presentations to NNSA program managers and senior staff. The program management staff are technically qualified in their assigned areas of oversight.

Evidence: Procurement package and contractual agreement for Financial Assistance Award recipients; Reporting Checklist for SSAA recipients; and Statement of Substantial Involvement for recipients of Cooperative Agreements. Peer reviews are conducted for all of the campaign activities, and representatives from NNSA also attend the division reviews for all of the divisions funded under the Science Campaign at Los Alamos and Lawrence Livermore National Laboratories. Periodically, presentations of resulting R&D projects and results are made at HQ to NNSA staff.

YES 8%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: All recipients of Financial Assistance Awards are required to submit their Annual Progress Reports and Final Reports in the electronic format database on the DOE OSTI (Office of Scientific and Technical Information) web-site (https://www.osti.gov/elink/). This web-site has open read access for the public.

Evidence: Procurement package and contractual agreement for Financial Assistance Award recipients; Reporting Checklist for SSAA recipients; Statement of Substantial Involvement for recipients of Cooperative Agreements; and DOE OSTI web-site (https://www.osti.gov/elink/).

YES 8%
3.RD1

For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Explanation: The NNSA Science Campaign matches funding and work requirements in its planning documents. The program allocates funds to the MTE level within contractor and laboratory sites using the AFP and WA processes. The Program Manager and staff monitor costs via review of monthly financial reports. Spend plan information has also been added into the DP MRT for review at QPRs. Periodic program technical reviews and reports also provide a basis for allocation modifications throughout the FY.

Evidence: Science Campaign FY 2006-11 Program Plan; individual Science Campaign FY 2006 Implementation Plans; WA and AFP files; DP MRT files; and program review records.

YES 8%
Section 3 - Program Management Score 83%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The program (e.g., QMU and test readiness) is generally on track to meet its long-term performance goals. JASPER is being used to perform experiments using plutonium. The DARHT 2 facility has exhibited some high-voltage breakdown problems in accelerator cells, for which a reconstruction plan is now in place. Compact radiography development continues to meet stated technical objectives. Proton radiography efforts demonstrate continual improvement in capability at LANSCE and Brookhaven National Laboratory. ATLAS relocation has been completed. The hydrotest program has made significant improvements but is still short of achieving performance goals, particularly due to the LANL security- and safety-related stand-down.

Evidence: DARHT construction reports; JASPER reports; proton radiography reports; MOU reports; and program reviews. See Measures Tab for baseline.

LARGE EXTENT 11%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program met all but one of its annual performance goals in FY 2004 and is generally on track to meet its FY 2005 performance goals and supporting milestones. Materials science studies are on track and delivering data annually as projected; primary diagnostic development work is achieving its technical goals; subcritical experiments have occurred as scheduled; and plans for DARHT II reconstruction are now approved, although this project and other work are still recovering from the LANL stand-down.

Evidence: DP MRT; University of California contract Appendix F annual appraisal reports; DARHT reports; and periodic program reviews, including quarterly reviews of progress on milestones. See Measures Tab for FY 2003, 2004, and 2005 results.

LARGE EXTENT 11%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The majority of Science Campaign work is non-repetitive scientific research and development performed by M&O contractors. The standard measure of such work is technical merit review, and the program has generally scored well in this area. The operation of scientific facilities is a major cost, and the efficiency with which they support experiments can be evaluated and improved. The National Hydrodynamics Plan and the experimental plan for dynamic materials and high-energy-density physics establish aggressive performance targets for these facilities. A principal concern is the increasing cost of Environment, Safety, & Health (ES&H) compliance at nuclear and other high-hazard facilities. Significant impacts in FY 2004 and into FY 2005 have been the LANL safety and security stand-down, with its associated costs and delays, and the Superblock work stoppage at LLNL; hence the change from Large Extent to Small Extent in this measure. The capital acquisition/facility aspects of the program meet all DOE expectations for cost effectiveness.

Evidence: Scientific and technical progress is reviewed in, for example, the LANL DX, LANSCE, MST, and NMT division reviews and the LLNL PAT and DNT directorate reviews. Milestones for the progress of operations at all major facilities (e.g., ATLAS, JASPER, DARHT, LANSCE) are documented, as are quarterly progress reports and DOE Manual 413.3-1 requirements. In addition, the RTBF program also tracks milestones for these facilities.

SMALL EXTENT 6%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: While the work performed by the Science Campaign in support of nuclear weapons programs is unique, its technical quality is broadly recognized, and federal programs across the government routinely consult the National Laboratories. Scientific facility operations are on par with those of similar scientific facilities at other research organizations, as assessed by reviewers drawn from the senior management of those organizations.

Evidence: DoD Program Analysis and Evaluation Reviews; JASON reviews; Nuclear Weapon Council and Standing Safety Committee reviews, STRATCOM SAGSAT reviews, pit certification plan review, and predictive capability review.

YES 17%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The external reviews generally support the goals and approach of the Science Campaign.

Evidence: University of California contract Appendix F annual appraisal reports; JASON reviews; and 2001 and 2003 Report to Congress of the Panel to Assess the Reliability, Safety, and Security of the United States Nuclear Stockpile ("Foster Panel").

YES 17%
4.CA1

Were program goals achieved within budgeted costs and established schedules?

Explanation: The most relevant efforts in the Campaign are the facility projects, which continue to perform on schedule and at the estimated cost. The R&D activities in the program have, to a very large extent, achieved their annual performance targets within budgeted costs and established schedules, and schedules have rarely slipped. Proton radiography efforts demonstrate continual improvement in capability at LANSCE and Brookhaven National Laboratory. ATLAS relocation and commissioning was completed, and the long-term Science Campaign goals are on track. Dynamic Materials milestones are again on track, with the exception of delays caused by the recent LANL safety- and security-related stand-down. ES&H costs, especially those related to the handling of high explosives and special nuclear materials, have escalated at a faster-than-anticipated rate. While the DARHT second axis has encountered high-voltage breakdown, a plan is now in place to correct the problems and move forward.

Evidence: University of California contract Appendix F annual appraisal reports; quarterly construction reports; monthly financial reports; DARHT construction reports; M&O Contractor annual award fee evaluation process; and program reviews/reports.

LARGE EXTENT 11%
Section 4 - Program Results/Accountability Score 72%


Last updated: 01092009.2005FALL