ExpectMore.gov


Detailed Information on the
National Nuclear Security Administration: Advanced Simulation and Computing (ASC) Assessment

Program Code 10000076
Program Title National Nuclear Security Administration: Advanced Simulation and Computing (ASC)
Department Name Department of Energy
Agency/Bureau Name National Nuclear Security Administration
Program Type(s) Research and Development Program
Assessment Year 2007
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 100%
Program Management 100%
Program Results/Accountability 73%
Program Funding Level
(in millions)
FY2008 $575
FY2009 $562

Ongoing Program Improvement Plans

Year Began: 2006
Improvement Plan: Continue to maximize the use of resources while avoiding redundancy at the three national nuclear weapons laboratories.
Status: Action taken, but not completed

Year Began: 2007
Improvement Plan: Continue to improve the responsiveness of the Nuclear Weapons Complex Infrastructure by coordinating program activity with the Complex Transformation Strategy Record of Decision.
Status: Action taken, but not completed
Comments: NNSA Portion of FY 2010 President's Budget

Year Began: 2007
Improvement Plan: Continue to improve the efficiency and cost effectiveness of the Nuclear Weapons Complex by integrating program requirements into development of the new Office of Defense Programs National Level Work Breakdown Structure (WBS).
Status: Action taken, but not completed

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Long-term Outcome

Measure: The cumulative percentage of simulation runs that utilize modern ASC-developed codes on ASC computing platforms, as measured against the total of legacy and ASC codes used for stockpile stewardship activities


Explanation: Adoption of modern ASC codes will enable a responsive simulation capability for the nuclear weapons complex. This measure is meant to show how quickly ASC codes are being adopted by the user community in place of legacy codes. (An illustrative calculation follows the target table below.)

Year Target Actual
2006 Baseline 50%
2007 63% 63%
2008 72% 72%
2009 80%
2010 85%
2011 90%
2012 95%
2013 100%
2014 100%
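
This measure reduces to a simple ratio of modern-code runs to all stockpile stewardship runs. A minimal sketch in Python, using hypothetical run counts and a hypothetical helper name rather than program data (the Significant Finding Investigations measure further below has the same ratio form):

    # Minimal sketch of the adoption measure, using hypothetical run counts
    # (not program data). The SFI measure below has the same ratio form.
    def adoption_percentage(modern_runs, legacy_runs):
        """Percent of stockpile stewardship simulation runs using modern ASC codes."""
        return 100.0 * modern_runs / (modern_runs + legacy_runs)

    # Example: 630 modern-code runs out of 1,000 total matches the FY 2007 actual of 63%.
    print(adoption_percentage(630, 370))  # 63.0
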
Long-term Outcome

Measure: The cumulative percentage reduction in the use of calibration "knobs" to successfully simulate nuclear weapons performance.


Explanation: Reduced reliance on calibration will ensure the development of robust ASC simulation tools. These tools are intended to enable the understanding of the complex behaviors and effects of nuclear weapons, now and into the future, without nuclear testing.

Year Target Actual
2006 Baseline 2%
2007 8% 8%
2008 16% 16%
2009 25%
2010 30%
2011 35%
2012 40%
2013 45%
2014 50%
2018 100%
Long-term Outcome

Measure: The cumulative percentage of nuclear weapon Significant Finding Investigations (SFIs) resolved through the use of modern (non-legacy) ASC codes, measured against all codes used for SFI resolution


Explanation: Demonstrates how valuable the ASC tools are for meeting the needs of the weapon designers and analysts by documenting the impact on closing Significant Finding Investigations.

Year Target Actual
2006 Baseline 10%
2007 25% 25%
2008 37% 37%
2009 50%
2010 60%
2011 65%
2012 70%
2013 80%
2014 85%
Long-term Efficiency

Measure: The cumulative percentage reduction in simulation turnaround time while using modern ASC codes


Explanation: To show code efficiency by demonstrating that simulation time decreases as the ASC codes mature. (An illustrative calculation follows the target table below.)

Year Target Actual
2006 Baseline 6%
2007 7% 7%
2008 13% 13%
2009 13%
2010 15%
2011 20%
2012 27%
2013 34%
2014 42%
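
This efficiency measure compares benchmark turnaround times against the FY 2006 baseline. A minimal sketch in Python, assuming hypothetical benchmark timings (not program data):

    # Minimal sketch of the turnaround-time measure, assuming hypothetical
    # benchmark timings (not program data).
    def cumulative_reduction_pct(baseline_hours, current_hours):
        """Cumulative percent reduction in turnaround time versus the FY 2006 baseline."""
        return 100.0 * (baseline_hours - current_hours) / baseline_hours

    # Example: a benchmark calculation that took 100 hours at baseline and
    # 87 hours now corresponds to the FY 2008 actual of 13%.
    print(cumulative_reduction_pct(100.0, 87.0))  # 13.0
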

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The Advanced Simulation and Computing (ASC) Campaign has a clear purpose: to provide leading-edge, high-end simulation capabilities to meet weapons assessment and certification requirements, including weapon codes, weapons science, computing platforms, and supporting infrastructure. ASC serves as the computational surrogate for nuclear testing to determine weapon effects, so ASC simulations play an essential role in studies of a Reliable Replacement Warhead, support the development of a Responsive Infrastructure, make possible interdiction/identification/attribution of nuclear threats, and support the transformation of the nuclear weapons complex consistent with the National Nuclear Security Administration (NNSA) Complex 2030 strategy.

Evidence: The FY 2008 President's Budget/National Nuclear Security Administration (NNSA) FY 2008-2012 Future-Years Nuclear Security Program (FYNSP), Feb 07; FY 2006-2011 Advanced Simulation and Computing (ASC) Campaign Program Plan (PP); and ASC FY 2007 and FY 2008 Program Implementation Plan (PIP).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The specific need that the ASC Campaign addresses is to replace underground testing with a science-based computer simulation environment. The ASC Campaign provides the simulation capabilities necessary to maintain, assess, and certify the safety, performance, and reliability of the U.S. nuclear stockpile in the absence of underground nuclear testing. Prior to the 1992 testing moratorium, regular nuclear tests allowed scientists and designers to account for unknown physics processes in the computer simulations by adjusting the calculations to fit the test-based observed behavior. Because measurements in nuclear explosion conditions are quite challenging, experimental data typically reflects gross behaviors rather than details on specific physical or chemical processes. The length scales involved range from the nuclear scale to the weapon's scale or roughly 15 orders of magnitude. This is equivalent to the range of sizes from the thickness of a human hair up to our distance from the sun. The ability to perform nuclear tests allowed forgoing the microscopic understanding of a wide variety of physical processes involved in favor of much simpler computer models that could be used with the best available supercomputers to help design, modernize, and maintain the stockpile. Now, without nuclear testing, weapons scientists and engineers must rely on computers to simulate the aging processes and their impact on our weapon systems and any required modifications.
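
As a rough arithmetic check of the scale comparison, using assumed representative lengths rather than figures from the assessment itself:

    import math

    # Rough check of the "15 orders of magnitude" comparison, using assumed
    # representative lengths (not figures from the assessment itself).
    nuclear_scale_m = 1e-15   # approximate nuclear length scale
    weapon_scale_m = 1.0      # approximate full-weapon length scale
    hair_thickness_m = 1e-4   # approximate thickness of a human hair
    earth_sun_m = 1.5e11      # approximate Earth-Sun distance

    print(math.log10(weapon_scale_m / nuclear_scale_m))  # 15.0 orders of magnitude
    print(math.log10(earth_sun_m / hair_thickness_m))    # ~15.2 orders of magnitude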

Evidence: The Atomic Energy Act of 1954, as amended; FY 1994 National Defense Authorization Act [Public Law (P.L.) 103-160]; Title 32 of the National Defense Authorization Act for FY 2000 (P.L. 106-65); Department of Energy (DOE) Strategic Plan, Sep 06; NNSA Strategic Plan, Nov 04; Office of Defense Programs (DP) Strategic Vision for 2030, Feb 05; President's Budget/NNSA FYNSP; and ASC Program Plan.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The ASC Campaign has the unique responsibility to create the simulation tools that underpin a more rigorous scientific methodology to assess and maintain confidence in the U.S. nuclear weapons stockpile, using simulation capabilities based upon advanced weapons codes and high-performance computing platforms. This means that ASC must enable the assessment of nuclear weapon safety, performance, and reliability far beyond original design lifetimes in a manner fundamentally different from what was done since the weapons were produced. The new ASC Business Model and national work breakdown structure provide federal managers visibility to avoid unnecessary duplication among the laboratories or between ASC and other agency efforts. For example, an ASC subprogram has the unique responsibility to create nuclear weapon simulation capabilities through the development of advanced integrated weapons codes. Another subprogram delivers high-performance computing platforms that utilize the weapons codes to meet weapons assessment & certification requirements. ASC is vertically integrated to include all aspects of simulation and computing so that this key capability is effective and efficient in how it supports the nuclear weapons complex. The ASC Campaign leverages its activities with other high-performance computing efforts including the DOE Office of Science's Advanced Scientific Computing Research Program and the Defense Advanced Research Projects Agency High Productivity Computer Systems Project. This type of activity minimizes unnecessary duplication by the government.

Evidence: The President's Budget/NNSA FYNSP; Atomic Energy Act of 1954, as amended; the Energy Reorganization Act of 1974, P.L. 93-438; FY 1994 National Defense Authorization Act (P.L. 103-160); Title 32 of the National Defense Authorization Act for FY 2000 (P.L. 106-65); DOE Strategic Plan; NNSA Strategic Plan; DP Vision for 2030; and ASC Program Plan.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The Advanced Simulation and Computing Campaign evolved from the merging of the Accelerated Strategic Computing Initiative and Stockpile Computing programs, in order to balance the needs of today and tomorrow. In FY 2006, with the drafting of the FY 2006 Implementation Plan, the ASC Campaign completed a major programmatic overhaul that evaluated its elemental components and how they fit together to meet the programmatic mission and customer requirements. For FY 2007, the Budget and Reporting categories for ASC were changed to reflect the new Work Breakdown Structure that supports the evolving ASC roadmap for future activities. The program is confident that these actions and the evolving program enhance ASC and minimize any flaws in how it supports the Stockpile Stewardship Program. Peer review and periodic reevaluation of program priorities and emphasis are used to maintain a proper focus on the needs of the complex and stockpile.

Evidence: The President's Budget/NNSA FYNSP; Defense Programs Program Management Manual (PMM), Rev 1, Jan 05; NNSA Planning, Programming, Budgeting, and Evaluation (PPBE) process input; ASC Campaign Program Plan; ASC Campaign PIP; annual internal assessment reports and periodic program reviews; ASC Campaign Work Breakdown Structure (WBS); and ASC Business Model, Jul 05.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The reengineered ASC Campaign is structured to maintain careful focus on customer-supplier relationships and avoid redundancies within the program. The ASC Program Manager coordinates with the Directed Stockpile Work and other NNSA campaigns during the annual Planning, Programming, Budgeting, and Evaluation process to identify requirements. He then plans allocation of funds on a multi-year basis, using the NNSA Future-Years Nuclear Security Program. The ASC budget and schedule are allocated annually, using the Approved Funding Program and Work Authorization processes, based on programmatic targets/milestones and after consulting with senior Stockpile Stewardship Program leadership, ASC Subprogram Directors, and laboratory executives. During the budget execution year, he monitors the program schedule and cost status and, periodically, briefs the status to the other programs and receives briefings from them. There are no unintended subsidies. To remain effectively targeted, the program and three national weapon laboratories also take into consideration the comments and critiques of review panels including the ASC Predictive Science Panel, the independent JASON (academic/industry advisory body to the Department of Defense), and the National Academies of Science.

Evidence: President's Budget/NNSA FYNSP; ASC Program Plan; ASC PIPs; NNSA Business Operating Policy Letter (BOP)-001, NNSA Planning, Programming, Budgeting, and Evaluation (PPBE) Process and NNSA PPBE documents on the NNSA web site; Approved Funding Program (AFP) and Work Authorization (WA) files; and National Academies of Science (NAS), JASONs, and the ASC Predictive Science panel files.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: From FY 2002-FY 2006, the ASC Campaign maintained 4-5 long-term performance measures consistent with the program's goal. During FY 2006 and FY 2007, based on a revised roadmap for the future, the ASC Campaign developed four new indicators to focus on more specific areas of campaign activity. The ASC leadership has established challenging end-state targets that will continue to drive the ASC program toward delivering comprehensive, science-based simulations. The new long-term measures are: 1) By 2013, the ASC-developed modern integrated simulation codes will be used for 99% of all simulations on ASC computing platforms (demonstrates adoption of modern ASC codes to enable a responsive simulation capability for the nuclear weapons complex); 2) By 2015, 100% of the major calibrations affecting weapon performance simulations will be replaced by science-based, predictive phenomenological models (demonstrates reduced reliance on calibration to ensure the development of robust ASC simulation tools that will enable the understanding of the complex behaviors and effects of nuclear weapons in an environment without nuclear testing); 3) By 2013, ASC codes will be the principal tools for resolution of 100% of the Significant Finding Investigations (impact on Significant Finding Investigation closures demonstrates the focus of ASC work on the needs of weapons designers and analysts); 4) By 2013, achieve 50% reduction in turnaround time, as measured by a series of benchmark calculations, for the most heavily used ASC codes (demonstrates ASC code efficiency by showing that simulation time decreases as ASC codes mature).

Evidence: The President's Budget/NNSA FYNSP; ASC Program Plan; ASC Vision, Aug 04; ASC Roadmap, 2006; and PART Measures Tab.

YES 10%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The new targets are the best indicators of campaign success, consistent with the new roadmap and major focus areas. Given the history of success with the previous measures and the overall challenge of delivering improved predictive capability to the weapons complex, the targets and timeframes of these new long-term measures are both challenging and ambitious. However, comprehensive planning and reviews, as well as progress to date, indicate that they are realistic and achievable, with proper execution and support. Specific challenges include: 1) Development of the improved physics and material science necessary to conceptualize and build improved models; 2) Coordination and communication between modelers and code developers to implement new models into the codes; and 3) Having adequate computing capacity to verify and validate the models as stand-alone entities and the codes with the new models incorporated. The ASC Campaign maintains an annual list of milestones that serve as internal performance targets of achievement.

Evidence: The President's Budget/NNSA FYNSP; ASC Roadmap; ASC Program Plan; ASC PIPs; status of program-related milestones on DP MRT; and PART Measures Tab.

YES 10%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The ASC Campaign had a limited number of specific annual performance measures that guided its success to date. Based on the ASC Roadmap, they were revised to demonstrate future progress toward achieving the new long-term measures discussed in 2.1. The new annual measures track how the program can improve scientific understanding and its application to the Stockpile Stewardship Program. Building on baselined FY 2006 data, the measures for FY 2009 are: 1) Increase usage of the ASC-developed modern integrated simulation codes to 80% of total code usage for simulations; 2) Replace 25% of the major calibrations affecting weapon performance simulations by science-based, predictive phenomenological models; 3) Designers and analysts will use ASC codes as the principal tools in resolving 50% of the Significant Finding Investigations; and 4) Achieve 26% reduction in turnaround time for the most heavily used ASC codes.

Evidence: The President's Budget/NNSA FYNSP; ASC Program Plan; and ASC Roadmap. Technical programmatic and site milestones support these targets. For additional year targets, please see PART Measures Tab.

YES 10%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: For each of the annual performance measures discussed in 2.3, a set of annual targets from 2007 on exists, along with a 2006 baseline. Technical programmatic and site milestones support these measures from 2007 on. The detailed implementation is as ambitious as the measures themselves; however, this ambition is needed to support the overall transformation of the nuclear weapons complex into a more responsive infrastructure.

Evidence: The President's Budget/NNSA FYNSP; ASC milestone status in the DP MRT; ASC Program Plan-Appendix B; ASC Roadmap; and ASC PIPs.

YES 10%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Commitment to the ASC Campaign near- and long-term goals exists at all levels of Headquarters and the three National Weapon Laboratories. The internal performance measures (milestones) are developed and agreed upon through negotiations between the Headquarters ASC program management and laboratory ASC program executives to support both the current and future stockpile. For example, Level 1 (national) Milestones are structured to deliver advanced ASC physics and engineering capabilities to support the W76 Life Extension Program and certification, and develop a 100 teraflops computing environment. All milestones are developed by ASC leadership in conjunction with the managers of other NNSA programs and the supporting laboratory technical scientific and engineering program staff and users. The Level 1 Milestones follow an approval process through the Deputy Administrator for Defense Programs. Level 2 (programmatic) Milestones are developed under the direction of individual ASC subprogram directors working with program leads at the laboratories, associated with the ASC performance indicators, and approved by the ASC Program Manager. There is similar commitment on behalf of Headquarters and the laboratories to improve ASC management, specifically along the lines articulated in the ASC Business Model document. The Budget and Reporting codes for FY 2007 were changed to reflect the new ASC Work Breakdown Structure. With this change, all obligations and costs are aligned with individual ASC products, thereby clearly showing the return that ASC stakeholders get for the investment in ASC. Further commitment is demonstrated by laboratory ASC program executives as they restructure their organizations in line with the subprograms discussed in the ASC Business Model. Although more work is necessary for full implementation of the ASC Business Model, evidence is clear that progress has been made.

Evidence: ASC Program Plan; ASC PIPs; ASC Business Model; ASC Roadmap; Level 1 Milestone Review; ASC milestone status in the DP MRT; campaign-related DP Quarterly Performance Review (QPR) briefings; DP Joule quarterly input for FY 2006 and FY 2007; FY07 ASC Budget & Reporting Codes; ASC Work Breakdown Structure; and ASC Tri-laboratory and Headquarters (HQ) Organization.

YES 10%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The recurring, independent evaluations used by the ASC Campaign are currently those by the Predictive Science Panel, which reviews the weapon codes at Lawrence Livermore and Los Alamos National Laboratories. The Predictive Science Panel is a committee of knowledgeable scientists from academia and industry that peer reviews the quality of science that is implemented in the codes, evaluates weaknesses, and makes suggestions for next steps. The focus of the Panel is on the rate at which the quality of science is improved in the codes and the predictability of the resulting computer simulations. There is an additional review panel at Sandia National Laboratories that focuses attention on the engineering applications. Every two years, an independent review panel considers the software quality policies and procedures at all three laboratories. As capability computer platforms are delivered to the laboratories, a review panel is convened to evaluate the general availability environment of the machines. The ASC Campaign contracted for an independent evaluation of the costs to site and operate capability-class computer platforms. ASC has been reviewed in past years by the JASONs (advising body to the Department of Defense), Blue Ribbon Panels, the DOE Inspector General, and the U.S. Government Accountability Office.

Evidence: DOE Inspector General (IG) Audit Report No. CR-L-0204, 5 Apr 02; JASON 1996; Blue Ribbon 1999/2000; U.S. Government Accountability Office (GAO) 1998/1999; Platform Review agendas; Level 1 Alliance External Annual Reviews; ASC Purple Level 1 Review, Dec 06; Contract # DE-AC52-05NA26698, Capability Siting Study; ASC-ASCR National Academy study 2004; and ASC JASON study 2004. (ASCR stands for Advanced Scientific Computing Research and is the advanced computing program in the DOE Office of Science.) Predictive Science Panel reviews are held every six months at alternating physics laboratories.

YES 10%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Each NNSA program is managed using long-term performance goals with annual targets that cascade seamlessly from the DOE/NNSA Strategic Plans, and also link to supporting milestones, deliverables, and the budgets for program performers. The NNSA program performance measures are the framework for resource allocation decisions made in the Programming and Budgeting Phases of the annual Planning, Programming, Budgeting, and Evaluation process. Annually, the NNSA performance-planning-budgeting decisions are documented in the Administrator's Final Recommendation and used to develop the budget requests during the Budgeting Phase. Program and financial performance for each measure is corporately monitored and assessed during Budget Execution and the Planning, Programming, Budgeting, and Evaluation Phase. All direct and indirect costs to attain the performance results for the ASC Campaign are reported in the DOE Budget & Reporting categories. The cost for the NNSA Federal employees is carried in a separate NNSA program direction account, as required by the Congress. In the FY 2008-2012 President's Budget, NNSA continued to use a budget request format that includes the current 3 years plus 4 additional years of performance and budget information for each program in the mainline budget justification document.

Evidence: The President's Budget/NNSA FYNSP; ASC Program Plan; DOE Strategic Plan; NNSA Strategic Plan; ASC Strategy, 2004; ASC Roadmap; DOE/NNSA Budget and Reporting Code System; NNSA PPBE Guidance Documents located on the NNSA web-site; NNSA FY 2004 and FY 2005 Program Decision Memoranda, and FY 2006, 2007, and 2008 NNSA Administrator's Final Recommendations.

YES 10%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The ASC Campaign reengineering has been conducted in a detailed and comprehensive manner--2004 ASC Strategy, 2005 ASC Business Model, and 2006 ASC Roadmap--all focused on the needs of the Stockpile Stewardship Program. The reviews mentioned in 2.6 have guided this evolution. Assessment and comparison between ASC efforts and those of other high-performance computing organizations take place through periodic communications in a variety of formats. There are technical conferences and seminars during which results are presented, e.g., Supercomputing Conferences. Technical publications and newsletters (e.g., IEEE, ACM, Science, and HPCWire) are reviewed for comparisons. Another format for collaboration is actively participating in community working groups such as the Department of Defense High Productivity Computing System, the Executive Branch's Office of Science and Technology Policy High End Computing Revitalization Task Force, and the Defense Department Integrated High End Computing. ASC is a member of the National Coordination Office for Information Technology Research and Development with staff leading the File Systems and Input/Output Roadmap activity under the High End Computing section. In addition, ASC collaborates with the DOE Office of Science and its laboratories, including Argonne, Lawrence Berkeley, and Oak Ridge. ASC members sit on review panels of other High Performance Computing organizations and invite colleagues of the DOE Office of Science, Department of Defense High Productivity Computing Office of Modernization, National Security Agency, and National Science Foundation to sit on various ASC review panels. Collaboration also occurs with universities. These collaborations provide excellent opportunities to compare and contrast the ASC Campaign with other organizations involved in the high end computing community and redirect planning and resources accordingly.

Evidence: NNSA FY 2008-2011 Stockpile Stewardship Program, 25 Apr 07; FY 2007-2011 NNSA Stockpile Stewardship Program Overview, 13 Nov 06; ASC Strategy; ASC Business Model; ASC Roadmap; ASC Program Plan; Interagency working groups, meetings, and conferences attended by program staff (these include Networking and Information Technology Research and Development meetings and the High End Computing Revitalization Task Force working groups); Purple (Computer) Level 1 Milestone Review; Red Storm (Computer) Program Review; National Energy Research Scientific Computing Center Review, May 05; Pacific Northwest National Laboratory Computational Science Initiative; and National Security Agency Advisory Panel on Advanced Computing Systems.

YES 10%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?

Explanation: Assessment and comparison between ASC efforts and those of other high-performance computing organizations take place through periodic communications in a variety of formats. There are technical conferences and seminars during which results are presented, e.g., Supercomputing Conferences. There are also technical publications and newsletters (e.g., IEEE, ACM, Science, and HPCWire). Another format is collaboration, which includes actively participating in community working groups such as the Department of Defense High Productivity Computing System and the Defense Department Integrated High End Computing. ASC is a member of the Office of Science and Technology Policy High End Computing and Networking and Information Technology Research and Development Interagency Working Groups, with staff leading the File Systems and Input/Output Roadmap activity under the High End Computing section. In addition, ASC collaborates with the DOE Office of Science and its laboratories, including Argonne, Lawrence Berkeley, and Oak Ridge. ASC members sit on review panels of other Agencies' High Performance Computing organizations and invite colleagues of the DOE Office of Science, Department of Defense High Productivity Computing Office of Modernization, National Security Agency, and National Science Foundation to sit on various ASC review panels. Collaboration also occurs with universities. These collaborations provide excellent opportunities to compare and contrast the ASC Campaign with other organizations involved in the high end computing community.

Evidence: Interagency working groups, meetings, and conferences attended by program staff (these include Networking and Information Technology Research and Development meetings and the High End Computing Interagency Working Groups); Purple (Computer) Level 1 Milestone Review; Red Storm (Computer) Program Review; National Energy Research Scientific Computing Center Review, May 05; Pacific Northwest National Laboratory Computational Science Initiative; and National Security Agency Advisory Panel on Advanced Computing Systems. Also see 2.7 explanation.

YES 10%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: The ASC Campaign utilizes a multi-tiered prioritization process to guide budget requests and funding alternatives. First, any decision must be consistent with and supportive of the overall move toward enhancing the predictive capability for users at the laboratories, as described in the ASC Strategy and Roadmap documents. Second, Federal funding is directed to the Products of ASC (as defined by the ASC Business Model) based on requirements for that Product and the demonstration of progress. Third, the NNSA Planning, Programming, Budgeting, and Evaluation process then causes ASC requirements to be prioritized with other Defense Programs and NNSA requirements. Priorities are reevaluated during the budget execution year, and resource adjustments are made using the Work Authorization and Approved Funding Program processes.

Evidence: ASC Vision; ASC Strategy; ASC Roadmap; ASC Business Model; ASC Program Plan; ASC PIPs; program-related Work Authorization (WA) and Approved Funding Program (AFP) files; program semiweekly teleconferences and quarterly face-to-face meetings (the ASC Executives bring together the interests of their respective laboratories, which are then worked through as a committee effort incorporating the issues, concerns, and national priorities contributed by the HQ representative to reach consensus on the ASC direction); and NNSA PPBE documents located on the NNSA web site.

YES 10%
Section 2 - Strategic Planning Score 100%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Progress on milestones is reported to Headquarters by the three National Weapons Laboratories and the ASC Program Manager on a quarterly basis through the Defense Programs Milestone Reporting Tool, specific deliverables, and periodic reports. In the Defense Programs Quarterly Program Reviews, ASC management reports milestone performance and budget execution progress to the senior Defense Programs leadership, other program managers, and the Management and Operating contractors. In addition, Headquarters program directors have periodic meetings, both in person and by telephone, to keep abreast of progress and to chart future directions. Headquarters program management members also attend periodic technical project reviews with universities/industry vendors and the labs to verify that work is progressing according to the contracts. Acquisition of capability platforms adheres to the NNSA Office of the Chief Information Officer Project Execution Model for Information Technology Investments; DOE Order 413.3A, Program and Project Management for the Acquisition of Capital Assets; and Office of Management and Budget A-300 reporting processes to provide management visibility into project performance.

Evidence: ASC Program Plan; ASC milestone status in the DP MRT; campaign-related DP QPR briefings; Quarterly Progress Reports; DOE Guide to IT Capital Planning and Investment Control (CIPC), Sep 06; NNSA Chief Information Officer (CIO) Project Execution Model for Information Technology (IT) Investments; DOE Order 413; Program Meetings Summary; OMB A-300; and associated quarterly reports.

YES 12%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Performance measures are represented annually in federal managers' individual measures of performance. Attainment of performance targets and milestones are included in annual Management and Operating contractor Performance Evaluation Plans. Funding and work scope & schedule are detailed in the execution year Program Implementation Plan and provided to the laboratories through the Approved Funding Program/Work Authorization processes. Costs are monitored monthly by program management against the budget. Performance is monitored using a set of targets and measures, and is monitored on a quarterly basis. Attainment of performance measures is included in annual individual performance appraisals and in site Management and Operating evaluations and resultant contract fee awards.

Evidence: Program-related AFP/WA files/documents; monthly DOE/NNSA Integrated (Financial) Management System (I-MANAGE) Data Warehouse (IDW) Standard Accounting and Reporting System (STARS) reports; ASC PIPs; FY 2006 and FY 2007 Management and Operating (M&O) Site Performance Evaluation Plans (PEPs); FY 2006 M&O site evaluations; and individual federal personnel measures of performance (MOPs) and annual appraisals.

YES 12%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: Planned allocation of funds, by subprogram and by site, for the budget execution year is detailed in the annual program implementation plan. Funding is actually allocated on a timely basis using the DOE/NNSA Approved Funding Program and Work Authorization processes. Budget execution is reported and analyzed monthly by laboratory resource analysts and reviewed by ASC Executive Management. The program management tracks actual costs monthly, using the DOE official Budget and Reporting classification codes and the DOE Integrated (Financial) Management System and Standard Accounting and Reporting System. In keeping with the reengineering of the ASC Campaign and to institutionalize the ASC Work Breakdown Structure, ASC leadership revised the Budget and Reporting codes for FY 2007 to reflect the new structure.

Evidence: DOE Integrated Financial Management System (I-MANAGE); current DOE Budget and Reporting (B&R) code structure; monthly DOE/NNSA I-MANAGE Data Warehouse (IDW) Standard Accounting and Reporting System (STARS) reports; Program sweep 2000; program-related AFP and WA files; internal audits (1999, 2000, 2002, early-2003, & late-2003) performed by ASC staff; ASC Work Breakdown Structure (WBS); and ASC Business Model.

YES 12%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The ASC Campaign has evaluation procedures and criteria to ensure that ASC deliverables meet the requirements of our customers and satisfy our Stockpile Stewardship responsibilities in a cost-effective manner. A guiding principle for ensuring cost-effective ASC procurements is to promote a healthy U.S. supercomputing industry through competition among multiple, financially viable vendors. The high risk and low revenue contributions to corporate bottom lines that supercomputing sales and services provide make this a challenge. Doing business with several vendors leads us to believe we are contributing to the success of a competitive industry. The current ASC Efficiency Measure demonstrates the real cost effectiveness that can be anticipated in the computer-rich environment with the correct amount of program emphasis. It replaces one that was effective but broader in scope. Also, in Jul 06, ASC completed a study that considered the total cost of ownership of Linux/Open Source software.

Evidence: The PART Measures Tab contains the current ASC Efficiency Measure. An example of the ASC evaluation process concerns the ASC Purple and Roadrunner (computer) platforms - see the Requests for Proposals; Synopsis of superior supercomputing ideas whose companies couldn't build a viable business--Dead High Performance Computing Companies (slide courtesy of Burton Smith, Cray Research); Open Source Software Development Acceleration Request for Information; and Study, "Total Cost of Ownership of Linux/Open Source Software," Jul 06.

YES 12%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The ASC Campaign collaborates both with other DOE Offices and other federal agencies (e.g., the DOE Office of Science and the Defense Advanced Research Projects Agency High Productivity Computer Systems project) and programs in technical areas such as system software, storage systems, visualization, and platform architecture. Coordination takes place under the auspices of the Department of Defense-DOE/NNSA Memorandum of Understanding and the Networking and Information Technology Research and Development Working Group. In addition, annually, the NNSA Planning, Programming, Budgeting, and Evaluation process enables coordination across all programs in NNSA/Defense Programs on a multi-year basis.

Evidence: Networking and Information Technology Research and Development Supplement to the FY 2008 President's Budget Request; President's Budget/NNSA FYNSP; ASCI Technology Prospectus 2001; Krell-administered Computational Science Graduate Fellowships; PPBE documents; ASCI Response to DOE IG Audit, 5 Apr 02; and Multi-agency Memorandum of Understanding, Summer 03.

YES 12%
3.6

Does the program use strong financial management practices?

Explanation: The Program is covered by DOE's financial management policies, procedures, and practices that meet all statutory requirements. The DOE provides the accounting services for the NNSA and the financial management processes are free of material internal control weaknesses. The Budget and Program staff review cost expenditure reports on a monthly basis to monitor obligations and costs for all projects and sites to ensure that funding is spent as planned. Adjustments are made using the DOE/NNSA Approved Funding Program.

Evidence: President's Budget/NNSA FYNSP; DOE Financial Management Orders; DOE/NNSA IDW STARS financial reports; program-related WA and AFP files; and NNSA PPBE Guidance Documents located on the NNSA web-site.

YES 12%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: Over the last three years, the ASC Campaign management has performed a significant reexamination of the program's management structure and processes. A reengineering of ASC ensued. This management reengineering has resulted in a new ASC Business Model and work breakdown structure based on defined products and the customer requirements for those products. Whereas past management of the ASC Campaign stressed the need for consensus decision making, current decision making is focused on meeting the Nuclear Weapons Complex user requirements for the products of the program in support of the NNSA goal of transforming the complex into a more agile, cost-effective, and responsive organization.

Evidence: ASC Strategy, 2004; ASC Business Model, 2005; ASC Roadmap, 2006; ASC WBS; current Complex 2030 Transformation Concept; and ASC Program Plan.

YES 12%
3.RD1

For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Explanation: The program's allocation of funds does support and maintain program quality. By applying the work breakdown structure to the FY 2007 and FY 2008 Program Implementation Plans (for planned expenditures) and to the DOE Budget and Reporting codes (for expenditure reporting), Federal managers have the necessary transparency into ASC processes to use product performance data to assess their program's quality. Prior to the new ASC Business Model, this effort was accomplished through the use of milestone review panels, quarterly progress reports, site visits by Headquarters program management, and frequent communications with laboratory program managers. However, with the incorporation of the new ASC Business Model, feedback from ASC customers will be used to evaluate program progress on an annual basis. The focus on products enables any evaluation of the program to address specific components of what the ASC Campaign delivers for its budget. The specific process of how customer feedback will be considered is under development.

Evidence: ASC Business Model, 2005; ASC WBS; ASC Program Plan; ASC PIPs; DOE B&R Codes; and DOE/NNSA IDW STARS financial reports. The ASC Executives, representing HQ and the weapons laboratories, ensure that the quality of the ASC programmatic work is upheld at the laboratories and as a national program.

YES 12%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: Prior to its recent reengineering, the ASC Campaign had 4-5 annual performance measures, supported by program milestones, that it consistently achieved en route to long-term goal accomplishment. Besides annual PART self-assessments, the program has reported this progress through the DOE Joule/Performance and Accountability Report process (performance measures) and the Defense Programs Milestone Reporting Tool (programmatic milestones). The FY 2006, FY 2007, and FY 2008 President's Budget and the FY 2007-2011 and FY 2008-2012 Stockpile Stewardship Program documents include the performance accomplishments as well as a list of significant annual accomplishments made that are part of the groundwork for moving toward the long-term programmatic goal of predictive simulation. The revised measures contained herein implement the program reengineering effort and focus on that goal. Although new, they are based on solid program accomplishments to date, have a firm FY 2006 baseline, and are on schedule for attainment in FY 2007. As an example of the program's demonstrated progress, the following is a quote from a Predictive Science Panel (a committee of knowledgeable scientists formed by the weapons laboratories to review ASC progress annually and to suggest potential research directions) report: "In its first decade, the ASC program has a number of tremendous achievements of which all concerned can be very proud."

Evidence: President's Budget/NNSA FYNSP; FY 2005 and FY 2006 DOE Performance and Accountability Reports; DOE 1st and 2nd Quarter, FY 2007 Consolidated Quarterly Performance Reports; NNSA Performance Report for 2006; FY 2007-2011 and FY 2008-2012 Stockpile Stewardship Programs (SSPs); Review Reports- IG 2001, Milestone review panels; ASC milestone status in the DP MRT; 2006 Predictive Science Review Panel Report; Nonnuclear Predictive Science Review Panel; and Level 1 Alliance Reviews. Also see PART Measures Tab.

YES 20%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Although the ASC Campaign has had its challenges along the way, this program has experienced great success in achieving its annual goals as part of its mission to support the Science-based Stockpile Stewardship Program. The program has accomplished its 10-year, 100 teraflops commitment with the ASC Purple computer system by IBM at Lawrence Livermore National Laboratory. In the past year, the Cray Red Storm computer system (Sandia National Laboratories) was upgraded to perform at over 100 teraflops. Other accomplished performance goals include new understanding of primary physics achieved through operation of ASC Purple, application of ASC technology to enable a feasibility study of the first Reliable Replacement Warhead design, and responsive analysis on emerging threats, e.g., in North Korea. Performance reporting through the DOE Joule indicates that the program achieved its code-based performance targets according to schedule. Program milestones supporting the annual performance goals have generally been accomplished on time. Currently, all FY 2007 performance goals have a FY 2006 baseline. While they are on track for accomplishment, progress is slow.

Evidence: President's Budget/NNSA FYNSP; platform review committee meetings; DOE FY 2005 and 2006 PARs; NNSA Performance Reports for 2005 and 2006; and ASC milestone status in DP MRT. Also, see PART Measures Tab.

LARGE EXTENT 13%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: From its inception, the program has developed cost-conscious strategies to deal with resource limitations, especially for products and services available directly from the private sector. Within the Computational Systems and Software Environment sub-program, funding priorities seek to encourage multiple vendors and the advancement of commercial-off-the-shelf development (vice building custom solutions). The Blue Gene/P and Blue Gene/Q research and development projects (led by LLNL) with the DOE Office of Science exemplify how the program mitigates costs by sharing some of the development costs and risks. The Blue Gene/L computer platform was used to address pressing plutonium questions and enable much more accurate estimates of the changes this material goes through as a function of its age. In addition to its modeling contribution, the IBM Blue Gene/L computer (LLNL) architecture was explored to address escalating power and space costs. To maintain a focus on improving efficiencies and cost effectiveness, the ASC Campaign has developed an efficiency indicator that relates the dollars required to procure, manage, and maintain the ASC platforms to the computing capability they deliver. The envisioned result of the program's various investments is a lower dollars-per-petaflops cost; however, this has not yet been demonstrated over a long enough period of time to determine lasting efficiencies. The current ASC Efficiency Measure demonstrates the real cost effectiveness that can be anticipated in the computer-rich environment with the correct amount of program emphasis. It replaces one that was effective but broader in scope.
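
A minimal sketch of such an indicator in Python, assuming hypothetical cost and capability figures and a hypothetical helper name (not program data):

    # Minimal sketch of a dollars-per-petaflops indicator, assuming hypothetical
    # cost and capability figures (not program data).
    def dollars_per_petaflops(procurement_usd, operations_usd, peak_petaflops):
        """Total platform cost divided by delivered peak computing capability."""
        return (procurement_usd + operations_usd) / peak_petaflops

    # Example: a $200M platform with $50M in operating costs delivering
    # 0.1 petaflops (100 teraflops) costs $2.5B per petaflops.
    print(dollars_per_petaflops(200e6, 50e6, 0.1))  # 2500000000.0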

Evidence: President's Budget/NNSA FYNSP; ASC Program Plan--Executive Summary, and Appendix A, Level 1 Milestones; DOE FY 2005 and FY 2006 PARs; NNSA Performance Reports for 2005 and 2006; and PPBE Budget documents. Also, see PART Measures Tab.

SMALL EXTENT 7%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: Although no other government program has the exact purpose of the ASC Campaign, the general goal of providing a better product for analysis of a complex technical operation is comparable to some other government programs. ASC program management monitors the execution of other DOE/NNSA programs and non-DOE efforts with related processes to gauge progress and to improve internal accomplishments. This subjective comparison indicates that the program is doing well. Although the mission to provide the computational science and computer simulation tools necessary for understanding various behaviors and effects of nuclear weapons may be unique to ASC, one applicable indicator through which the ASC Campaign might compare to other similar programs could be the adoption of ASC-originated or sponsored technologies. With the number of Red Storm and IBM Blue Gene/L (LLNL) computer product orders, the High Performance Storage System product, not to mention several visualization technologies and user software tools, the ASC Campaign fares well in comparison to other programs, from both cost and capability standpoints.

Evidence: Network and Information Technology Research and Development Blue Book for FY 2008 and Top 500 list; DOE FY 2005 and 2006 PARs; NNSA Performance Report for 2005 and 2006.

LARGE EXTENT 13%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The ASC Campaign was recently evaluated by the Predictive Science Panel (a committee of knowledgeable scientists formed by the weapons laboratories to review ASC progress annually and to suggest potential research directions) meeting. The following two quotes are from Predictive Science Panel reports: "The strategic plans both for hardware acquisition and for code development (including evolution away from old legacy codes) have worked very well, and the labs are to be congratulated for their great successes here." "In its first decade, the ASC program has a number of tremendous achievements of which all concerned can be very proud." The ASC Campaign was also recently evaluated by the independent JASONs. Based on the reviewers' feedback, the ASC Campaign is meeting its mission requirements; however, there are areas for improvement and the program is making changes in order to address them. An example of such modification to the program includes the new Platform Acquisition Strategy. In addition, the Alliance Centers are reviewed annually by external review panels.

Evidence: Predictive Science Panel 2006 Review report; JASON FY 2005 Report; Review reports - IG 2001; FY 2004 Sandia Milestone Review Panel report; Predictive Science Panels; Non-nuclear Predictive Science Panel; ASC Purple Level 1 Review; and Level-1 Alliance Reviews.

YES 20%
Section 4 - Program Results/Accountability Score 73%


Last updated: 01092009.2007FALL