ExpectMore.gov


Detailed Information on the
Fusion Energy Sciences Assessment

Program Code 10000096
Program Title Fusion Energy Sciences
Department Name Department of Energy
Agency/Bureau Name Department of Energy
Program Type(s) Research and Development Program
Competitive Grant Program
Capital Assets and Service Acquisition Program
Assessment Year 2003
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 90%
Program Management 66%
Program Results/Accountability 80%
Program Funding Level
(in millions)
FY2008 $295
FY2009 $315

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Implementing the recommendations of expert review panels, especially two major National Academies studies, as appropriate.

Action taken, but not completed. In response to recommendations, (1) OFES received the final report from the Low Temperature Plasmas Workshop in September 2008. Based on the report recommendations, for the first time, low temperature plasmas were included in the requests for proposals for the Plasma Science Centers and for the NSF/DOE Partnership in Basic Plasma Physics. (2) FESAC formed a panel to address the charge to identify the research opportunities in HEDLP. The panel's report is expected to be delivered to FESAC in 2009.
2006

Developing strategic and implementation plans in response to multiple Congressional requirements.

Action taken, but not completed. The National Research Council report on the review of the U.S. ITER science participation planning process was submitted to OFES in July 2008. OFES is analyzing the report for use in its planning for U.S. ITER science participation.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Re-engaging the advisory committee in a study of how the program could best evolve over the coming decade to take into account new and upgraded international facilities.

Completed. The Fusion Energy Sciences Advisory Committee (FESAC) has completed this activity. Its report, entitled "Priorities, Gaps, and Opportunities: Towards a Long-Range Strategic Plan for Magnetic Fusion Energy," dated October 2007 (DOE/SC-0102), was transmitted to Dr. Orbach on November 19, 2007.
2006

In cooperation with other Office of Science programs, develop and deploy a modern, streamlined, and cost-effective information management system for tracking the university grant proposal review and award process. This system should be in place by the end of FY 2007.

Completed. DOE (including SC) uses GRANTS.GOV, as mandated, to receive all proposals; tracking the award of financial assistance agreements is handled through the DOE Procurement Acquisition Data System (PADS).
2007

Develop new annual measures to track how efficiently the program's facilities are operated and maintained on a unit-per-dollar basis by July 2007.

Completed. The review of NSTX was held in July 2008. The reviews for all three major facilities have now been completed. In the future, OFES plans to review all three facilities every few years.
2007

Participate in the development of a plan, due to OMB by March 1, 2007, to address serious deficiencies in the program's Exhibit 300 business cases for capital projects.

Completed. Methods to address OMB concerns were submitted to OMB on 2/20/2007.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Progress in developing a predictive capability for key aspects of burning plasmas using advances in theory and simulation benchmarked against a comprehensive experimental database of stability, transport, wave-particle interaction, and edge effects. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Progress in demonstrating enhanced fundamental understanding of magnetic confinement and in improving the basis for future burning plasma experiments through research on magnetic confinement configuration optimization. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Progress in developing the fundamental understanding and predictability of high energy density plasma physics. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Annual Efficiency

Measure: Average achieved operation time of the major national fusion facilities as a percentage of the total planned operation time.


Explanation: This annual measure assesses the reliability and dependability of the operation of the scientific user facilities. Many of the research projects undertaken at the Office of Science's scientific user facilities take a great deal of time, money, and effort to prepare and regularly have a very short window of opportunity to run. If the facility is not operating as expected, the experiment could be ruined or critically set back. In addition, taxpayers have invested millions of dollars in these facilities. The greater the period of reliable operations, the greater the return on that investment for the taxpayers. An illustrative calculation is sketched after the table below.

Year Target Actual
2001 >90% 100%
2002 >90% 94%
2003 >90% 81%
2004 >90% 100%
2005 >90% 100%
2006 >90% 100%
2007 >90% 100%
2008 >90% 100%
2009 >90%
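The arithmetic behind this measure is straightforward; the sketch below shows one way it could be tracked. The facility names and hour figures are hypothetical, and averaging the facilities with equal weight is an assumption not stated in the assessment.

```python
# Minimal sketch only: facility names and hours are hypothetical, and the
# unweighted average across facilities is an assumption (the assessment does
# not specify the aggregation method).

def average_operation_time_pct(facilities):
    """Average achieved operation time as a percentage of planned operation time."""
    ratios = [achieved / planned for planned, achieved in facilities.values()]
    return 100.0 * sum(ratios) / len(ratios)

# Hypothetical (planned hours, achieved hours) for the three major facilities.
fy_hours = {
    "DIII-D": (2000, 1950),
    "Alcator C-Mod": (1200, 1180),
    "NSTX": (1400, 1300),
}

print(f"Average achieved operation time: {average_operation_time_pct(fy_hours):.1f}% (target: >90%)")
```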
Annual Efficiency

Measure: Cost-weighted mean percent variance from established cost and schedule baselines for major construction, upgrade, or equipment procurement projects.


Explanation: This annual measure assesses whether the major construction projects are adhering to their specified cost and schedule baselines. Adhering to the cost and schedule baselines for a complex, large-scale science project is critical to meeting the scientific requirements for the project and to being good stewards of the taxpayers' investment in the project. The Office of Science has a rigorous process in place for overseeing the management of these large-scale, complex scientific projects and has been recognized, both inside government and by private organizations, for the effectiveness of this process. An illustrative calculation is sketched after the table below.

Year Target Actual
2001 <10%, <10% -6%, -6%
2002 <10%, <10% +5%, 0%
2003 <10%, <10% 0%, 0%
2004 <10%, <10% +5%, +5%
2005 <10%, <10% -5%, -4%
2006 <10%, <10% 3%, 2%
2007 <10%, <10% >10%, >10%
2008 <10%, <10% cancelled for FY08
2009 <10%, <10%
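As a hedged illustration of how a cost-weighted mean percent variance might be computed, the sketch below weights each project's percent variance by its baseline cost. The project figures are hypothetical, and the specific weighting scheme is an assumption; the assessment does not define it.

```python
# Minimal sketch only: project baselines and variances are hypothetical, and
# weighting each project's variance by its baseline cost is an assumption.

def cost_weighted_mean_variance(projects):
    """Mean percent variance from baseline, weighted by each project's baseline cost."""
    total_cost = sum(cost for cost, _ in projects)
    return sum(cost * variance for cost, variance in projects) / total_cost

# Hypothetical (baseline cost in $M, percent variance from baseline) pairs.
cost_variances = [(90.0, 4.0), (30.0, -2.0)]        # cost variance per project
schedule_variances = [(90.0, 6.0), (30.0, 0.0)]     # schedule variance per project

print(f"Cost variance:     {cost_weighted_mean_variance(cost_variances):+.1f}% (target: <10%)")
print(f"Schedule variance: {cost_weighted_mean_variance(schedule_variances):+.1f}% (target: <10%)")
```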

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The mission of the Fusion Energy Sciences (FES) program is to advance plasma science, fusion science, and fusion technology--the knowledge base needed for an economically and environmentally attractive fusion energy source.

Evidence: FY04 Budget Request (www.mbe.doe.gov/budget/04budget/index.htm). Public Law 95-91 that established the Department of Energy (DOE).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The Fusion Energy Sciences program goals are designed to address the scientific and technology issues facing fusion energy development: (1) plasma chaos, turbulence, and transport; (2) magnetic configuration stability, reconnection, and dynamo; (3) plasma sheaths and boundary layers; (4) wave-particle interaction in plasmas; and (5) materials and technology engineering.

Evidence: FY04 Budget Request. National Research Council (NRC) report "Plasma Science". Fusion Energy Sciences Advisory Committee (FESAC) "Report on the Integrated Program Planning Activity for the DOE Fusion Energy Sciences Program" (www.ofes.fusion.doe.gov/More_HTML/FESAC_Charges_Reports.html).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: FES is unique in funding fusion research for energy purposes. The program is coordinated with NNSA's inertial confinement fusion program. FES also provides support for research in plasma science, which is coordinated with the National Science Foundation (NSF) program.

Evidence: Program funds all dedicated fusion energy research, and a significant share of the plasma physics research in the U.S. Coordinated planning with NNSA in inertial fusion. MOUs and joint solicitations with NSF.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The FES program is based on competitive merit review, independent expert advice, and community planning, an approach that has proven efficient and effective. However, a Committee of Visitors (COV) has yet to validate the merit review system.

Evidence: FESAC, NRC reviews and reports (www.ofes.fusion.doe.gov/More_HTML/FESAC_Charges_Reports.html, www.ofes.fusion.doe.gov/FusionDocs.html). Program files.

YES 20%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: FESAC ensures that input from the fusion research community is regularly gathered to assess the priorities, projects, and progress of the program. Peer review is used to assess the relevance and quality of each project.

Evidence: FESAC, NRC reviews and reports (www.ofes.fusion.doe.gov/More_HTML/FESAC_Charges_Reports.html, www.ofes.fusion.doe.gov/FusionDocs.html). Program files.

YES 20%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation:  

Evidence:  

NA  %
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation:  

Evidence:  

NA  %
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: While not comprehensive, the three key long-term measures focus on outcomes and are meaningful indicators of progress in fusion and plasma physics. The three long-term measures reflect critical areas of uncertainty as identified in the FESAC and NRC reports. The program has defined "successful" and "minimally effective" performance milestones for each measure, and an external panel will assess interim program performance on a triennial basis, and update the measures as necessary. It is inappropriate for a basic research program such as this one to have a quantitative long-term efficiency measure.

Evidence: National Research Council (NRC) reports "Plasma Science" and "Frontiers in High Energy Density Physics". Fusion Energy Sciences Advisory Committee (FESAC) "Report on the Integrated Program Planning Activity for the DOE Fusion Energy Sciences Program" (www.ofes.fusion.doe.gov/More_HTML/FESAC_Charges_Reports.html). A description of the "successful" and "minimally effective" milestones, and an explanation of the relevance of these measures to the field, can be found on the SC Web site (www.sc.doe.gov/measures).

YES 10%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: FESAC has reviewed the new long-term measures for this program and found them to be ambitious and meaningful indicators of progress in key fields. The external reviews described in 2.1 will update the measures, targets, and timeframes on an interim basis.

Evidence: Letter from FESAC chair regarding review of long-term measures.

YES 10%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: Meeting the facilities construction and operations efficiency measures helps ensure that the program provides the capabilities the scientific community needs to make discoveries directly connected to the long-term measures.

Evidence: FY04 Budget Request. Website with further information (www.sc.doe.gov/measures).

YES 10%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: All of the annual measures include quantifiable annual targets. Baseline data (FY01 and FY02) is included in the attached measures sheet to verify that the annual measures are ambitious, yet realistic.

Evidence: FY04 Budget Request. Website with further information (www.sc.doe.gov/measures). Construction variance target of <10% comes from OMB Circular A-11, especially Capital Programming Guide supplement.

YES 10%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: A limited FY03 audit by the DOE Inspector General (IG) found that "performance expectations generally flowed down into the scope of work at the national laboratories." A recent FES program solicitation included links to the program's goal documents, but future solicitations should explicitly include the PART measures.

Evidence: Program files. Memo from the DOE IG to the Director of the Office of Science. Example of recent research solicitation (www.science.doe.gov/grants/Fr03-19.html). PPPL contract (www.pppl.gov/common_pages/doe_pu_contract.html, Appendix B).

YES 10%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: All research projects undergo Merit Review. Grants are reviewed triennially. Construction projects are reviewed quarterly. FESAC evaluates all aspects of the FES program. In addition to evaluating whether FES has achieved its goals in a timely fashion, it recommends how the program should be modified to improve its performance. The President's Council of Advisors on Science and Technology (PCAST) and the National Research Council (NRC) have reviewed aspects of the program. The program should initiate a Committee of Visitors (COV) review effort to provide a process validation and detailed portfolio quality check.

Evidence: SC Merit Review guidelines (www.sc.doe.gov/production/grants/merit.html). Program files, including facility peer reviews and Lehman reviews. FESAC review reports on materials and theory (www.ofes.fusion.doe.gov/More_HTML/FESAC_Charges_Reports.html). SEAB, PCAST, and NRC reports (www.ofes.science.doe.gov/FusionDocs.html).

YES 10%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: DOE has not yet provided a budget request that adequately integrates performance information.

Evidence:

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: New performance goals and targets have been developed in coordination with OMB, and FESAC will be engaged in reviewing them. The program has not yet produced a new [Congressionally-requested] Administration strategic vision for the program given the decision to join ITER, and should do so as soon as all relevant advisory committee studies are complete. The program should initiate a COV process to help identify research program strengths and weaknesses for strategic planning purposes.

Evidence: FESAC development report (www.ofes.fusion.doe.gov/More_HTML/FESAC/Dev.Report.pdf). FES plans to develop an Administration plan once the current NRC review of burning plasma physics is complete (www7.nationalacademies.org/bpa/projects_bpac.html). 1996 FES program strategic plan (wwwofe.er.doe.gov/FusionDocuments/StrategicPlan.pdf).

YES 10%
2.CA1

Has the agency/program conducted a recent, meaningful, credible analysis of alternatives that includes trade-offs between cost, schedule, risk, and performance goals and used the results to guide the resulting activity?

Explanation: FESAC recently provided advice to the program on the burning plasma effort, including the various options for pursuing a burning plasma experiment. A Lehman review of the ITER project cost estimate was conducted prior to the ITER decision. The justification provided to OMB for the NCSX project lacks a meaningful alternatives analysis.

Evidence: FESAC burning plasma report (www.ofes.fusion.doe.gov/More_HTML/FESAC/Austinfinalfull.pdf). Lehman report on ITER cost basis (ofes.fusion.doe.gov/News/ITERCostReport.pdf). NRC interim report on burning plasma program (ofes.fusion.doe.gov/News/BPAC_Letter_final_ns_122002.pdf). Program files, including predecisional Exhibit 300 for NCSX submitted to OMB.

YES 10%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: This is a basic R&D program, and the question is intended for industry-related R&D programs.

Evidence:  

NA 0%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: FESAC and NAS recommendations identify strategic priorities, and the FES budget requests prior to the ITER decision closely followed FESAC guidance.

Evidence: 1995 National Research Council (NRC) "Plasma Science" report (www.nap.edu/catalog/4936.html). FESAC reports on "Integrated Program Planning" and "Priorities and Balance" (www.ofes.fusion.doe.gov/More_HTML/FESAC_Charges_Reports.html).

YES 10%
Section 2 - Strategic Planning Score 90%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The program collects and acts upon performance information including weekly facilities reports, quarterly grantee progress reports, annual facility program advisory committee reports, and annual contractor performance assessments. Additional project performance information is collected via Lehman reviews. Research performance data from individual grantees and national labs is collected and assessed via peer review as a type of standardized quality control at the individual grant level. However, there is not yet a systematic process, such as regular COV evaluations, that conducts research portfolio quality and process validations. While DOE IG contracts with an outside auditor to check internal controls for performance reporting, and the IG periodically conducts limited reviews of performance measurement in SC, it is not clear that these audits check the credibility of performance data reported by DOE contractors.

Evidence: Program files, including Lehman reviews, action items based on contractor performance reports, weekly facility reports, and program advisory committee reports.

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: Senior Executive Service (SES) and Program Manager Performance Plans are directly linked to program goals. The Management and Operations contracts for the Labs and Facilities include performance measures linked to program goals. Research funding requirements ensure consideration of past performance.

Evidence: 10 CFR 605 (www.science.doe.gov/production/grants/605index.html). Program and personnel files, including reviews and actions on poorly performing efforts at Los Alamos National Lab and Univ. of Texas. Performance-based fee arrangements in PPPL contract (Appendix B at www.pppl.gov/common_pages/doe_pu_contract.html). Statistics of PI renewals.

YES 8%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Using DOE's monthly accounting reports, SC personnel monitor progress toward obligating funds consistent with an annual plan that is prepared at the beginning of the fiscal year to ensure alignment with appropriated purposes. SC programs consistently obligate more than 99.5% of available funds.

Evidence: Program files. Audit reports.

YES 8%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: SC is currently undergoing a reengineering exercise aimed at flattening its organizational structure and improving program effectiveness. The program collects the data necessary to track its "efficiency" measure on facility operations.

Evidence: SC reengineering information (www.screstruct.doe.gov). Program files on facility operations.

YES 8%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: FES reviews and coordinates research activities with NNSA's Inertial Confinement Fusion program. FES jointly sponsors research support for basic plasma physics with NSF.

Evidence: Joint program plans and reviews with NNSA. MOU with NSF for joint funding and oversight of plasma physics facility at UCLA. Joint solicitation with NSF (www.nsf.gov/pubs/2002/nsf02184/nsf02184.htm).

YES 8%
3.6

Does the program use strong financial management practices?

Explanation: SC staff execute the FES program consistent with established DOE budget and accounting policies and practices. These policies have been reviewed by external groups and modified as required to reflect the latest government standards.

Evidence: Various Departmental manuals. Program files. Audit reports.

YES 8%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: SC is currently re-engineering to improve program management efficiency. The FES program is reviewing the establishment of formal Committee of Visitors reviews for FY04. Program action on Lehman review findings is critical to the success of construction projects.

Evidence: SC reengineering information (www.screstruct.doe.gov). Program files, including Lehman review of NCSX; actions taken in response to review of Tritium Systems Test Assembly at Los Alamos; review and corrective management actions at PPPL after NSTX coil failure.

YES 8%
3.CA1

Is the program managed by maintaining clearly defined deliverables, capability/performance characteristics, and appropriate, credible cost and schedule goals?

Explanation: The FES program documents the capabilities and characteristics of new facilities in conceptual design reports that are reviewed by FESAC and by independent Lehman reviews. Progress is tracked quarterly through program and Lehman reviews.

Evidence: Program files, including Lehman report on NCSX critical decision review, and program milestones for DIII-D user facility. Predecisional Exhibit 300 for NCSX.

YES 8%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: First-time grant applications are encouraged in all Requests for Proposals. FES has a specific solicitation for the Outstanding Junior Investigator (OJI) program, in which awards are made to young, non-tenured faculty. "Merit Review" guides all funding decisions. However, the quality of the research funded via this process has not yet been validated by a COV.

Evidence: For FY 2002, FES received 169 proposals: 73 new, 41 for renewals, and 55 for supplements. Of these, 26 new proposals, 40 renewals, and 52 supplements were approved. Thus, FES funded 36% of new research applications.

NO 0%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: In addition to receiving grantee progress reports, program managers stay in contact with grantees through email and telephone and conduct program reviews and site visits.

Evidence: Program files, including progress reports, and on-site review reports.

YES 8%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: In accordance with DOE Order 241.1A, the final and annual technical reports of program grantees are made publicly available on the web through the Office of Scientific and Technical Information's "Information Bridge". However, program-level aggregate data on the impact of the grants program is not adequately communicated in the annual DOE Performance and Accountability report.

Evidence: DOE Order 241.1A. Information Bridge (www.osti.gov/bridge/). FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf).

NO 0%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: The funds for research programs and scientific user facilities at the Federal Labs are allocated through a limited-competition process analogous to the unlimited process outlined in 10 CFR 605. FES publishes its own specific grant guidelines and manages the execution of the research program very closely. Solicitations for labs are somewhat targeted, though unsolicited work (typically defined as "inherently unique") is not competed. However, the quality of the research funded via this process has not yet been validated by a COV.

Evidence: FES grant and merit review procedures (www.ofes.fusion.doe.gov/Grant/Grants.html). 10 CFR 605. (www.science.doe.gov/production/grants/605index.html) Program files. Example of lab solicitation (www.science.doe.gov/grants/LAB03_19.html).

NO 0%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation:  

Evidence:  

NA  %
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation:  

Evidence:  

NA  %
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation:  

Evidence:  

NA  %
Section 3 - Program Management Score 66%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: FESAC will evaluate progress toward the long-term performance measures every three years. External reports have found good scientific progress, though for the ultimate energy goal, critics question the credibility of the fusion community in continually promising "30 years to commercial fusion power."

Evidence: FESAC reports (www.ofes.fusion.doe.gov/More_HTML/FESAC/Dev.Report.pdf). NRC quality assessment (www.nap.edu/books/0309073456/html). Article in the July 20, 2002, edition of "The Economist."

LARGE EXTENT 13%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: FES met roughly half of its annual performance goals in FY02, though one missed target was due to a programmatic decision.

Evidence: FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf): "mixed results" in SC6-2 and SC7-6 goals.

LARGE EXTENT 13%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: For construction efficiency, the Electron Cyclotron Heating upgrade at DIII-D was more than 10% behind schedule for FY02. The National Spherical Torus Experiment (NSTX) facility has recently experienced serious operational difficulties, and it is not expected to meet its originally scheduled operating time for FY03.

Evidence: FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf): "mixed results" for the efficiency measure on facility construction. Program files, including program review of NSTX coil failure.

LARGE EXTENT 13%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: FESAC, NRC, and PCAST reviews and interactions with foreign governments compare this program favorably to similar programs overseas. The FES program accounts for only 15% of the worldwide fusion program by funding, and expert panels find a disproportionately large impact made by the U.S.

Evidence: NRC report (www.nap.edu/books/0309073456/html/). PCAST report (www.ofes.fusion.doe.gov/More_HTML/PDFfiles/PCAST.pdf). FESAC reports (www.ofes.fusion.doe.gov/More_HTML/FESAC_Charges_Reports.html).

NA  %
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: FESAC, on a rotating schedule, reviews the major elements of the FES program. These reviews examine scientific progress, assess the scientific opportunities, and recommend reordering priorities based upon existing budget profiles. The program's performance has received generally positive marks from external panels of the National Research Council and the President's Council of Advisors on Science and Technology. The NRC report found that the fusion community is too isolated, and that this impacts its effectiveness.

Evidence: Burning Plasma Physics and Theory were reviewed by FESAC in 2001 (www.ofes.fusion.doe.gov/More_HTML/FESAC_Charges_Reports.html). External reports by PCAST, NRC, and SEAB (www.ofes.fusion.doe.gov/FusionDocs.html).

YES 20%
4.CA1

Were program goals achieved within budgeted costs and established schedules?

Explanation: NCSX, the only new large project in FES, has not yet been baselined. The problems at NSTX (see Question 4.3) are a potential concern for ITER, since one reason given for the coil failure on the much smaller NSTX project was the inadequate number of qualified engineers at the Princeton Plasma Physics Laboratory (PPPL).

Evidence: Program files, including Lehman review of NSTX coil failure. FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf). FY04 Annual Performance Plan (www.mbe.doe.gov/budget/04budget/index.htm).

YES 20%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA  %
Section 4 - Program Results/Accountability Score 80%


Last updated: 01092009.2003FALL