ExpectMore.gov


Detailed Information on the
High Energy Physics Assessment

Program Code 10000104
Program Title High Energy Physics
Department Name Department of Energy
Agency/Bureau Name Department of Energy
Program Type(s) Research and Development Program
Competitive Grant Program
Capital Assets and Service Acquisition Program
Assessment Year 2003
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 70%
Program Management 66%
Program Results/Accountability 87%
Program Funding Level
(in millions)
FY2007 $732
FY2008 $689
FY2009 $805

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Implementing the recommendations of past and new external assessment panels, as appropriate.

Action taken, but not completed The 2007 Committee of Visitors (COV) issued its report and the Office of High Energy Physics (OHEP) is preparing its response; OHEP has already implemented 6 of the 18 recommendations from the 2007 COV. Remaining items are pending new staff hires and/or management direction on implementation. At the May 2008 HEPAP meeting, P5 will present its report on the US HEP strategy in stringent budget scenarios.
2005

Developing a strategy and implementation plan for particle accelerator research and development, including a potential international linear collider.

Action taken, but not completed In Sep 2008 OHEP will conduct a review of advanced particle accelerator R&D (AARD) performed at laboratories. The review committee will also be asked for input on appropriate milestones and priorities for the OHEP AARD implementation plan. The first element of the implementation plan for the AARD test facility is the CD-1 review which is moving forward. HEP has worked with its international funding agency partners and the Global Design Effort to revise the international plan for ILC R&D.
2007

Develop new annual measures to track how efficiently the program's facilities are operated and maintained on a unit-per-dollar basis by July 2007.

Action taken, but not completed As part of the Fermilab S&T review (June 2008), the operations of the Tevatron will be reviewed. As part of the SLAC institutional review (July 2008), the B-Factory D&D will be reviewed.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Engaging the National Academies to help develop a realistic long term plan for the program that is based on prioritized scientific opportunities and input from across the scientific community.

Completed The long-term plan for HEP emphasizes discovery at the energy frontier along with coordinated programs in particle astrophysics and neutrino research, as recommended by the National Academies. Details of the plan are available in the FY2008 HEP budget request and the accompanying 5-year plan.
2006

In cooperation with other Office of Science programs, develop and deploy a modern, streamlined, and cost-effective information management system for tracking the university grant proposal review and award process. This system should be in place by the end of FY 2007.

Completed DOE (including SC) uses GRANTS.GOV, as mandated, to receive all proposals; tracking the award of financial assistance agreements is through the DOE Procurement Acquisition Data System (PADS).
2007

Develop a unified, action-based strategy for SC-wide collaboration in accelerator and detector R&D (including advanced accelerator concepts) by March 1, 2007.

Completed Strategy submitted to OMB on 2/26/2007.
2007

Participate in the development of a plan, due to OMB by March 1, 2007, to address serious deficiencies in the program's Exhibit 300 business cases for capital projects.

Completed Methods to address OMB concerns issued to OMB on 2/20/2007.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Progress in measuring the properties and interactions of the heaviest known particle (the top quark) in order to understand its particular role in the so-called "Standard Model" of particle physics. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Progress in measuring the matter-antimatter asymmetry in many particle decay modes with high precision. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Progress in discovering or ruling out the Standard Model Higgs particle, thought to be responsible for generating mass of elementary particles. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Progress in determining the pattern of the neutrino masses and the details of their mixing parameters. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Progress in confirming the existence of new supersymmetric (SUSY) particles, or ruling out the minimal SUSY "Standard Model" of new physics. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Good
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Progress in directly discovering, or ruling out the existence of, new particles which could explain the cosmological "dark matter." An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Good
2009 Excellent
2012 Excellent
2015 Successful
Annual Output

Measure: Deliver within 20% of baseline estimate a total integrated amount of data (in inverse picobarns [pb-1]) to the CDF and D-Zero detectors at the Tevatron.


Explanation: In High Energy Physics, the amount of data taken by the detectors is a good indicator of progress. The events that researchers are interested in are rare, so the more events they record, the more likely they are to record what they are interested in studying. Increasing the amount of data can be technically challenging but is also critically important to the facility users. This measure tracks how we actually perform versus what we plan to deliver. The target numbers are heavily influenced by the schedule of experiments and the expected program budget. Investments in the most basic areas of research, such as physics, not only spark our imagination and advance our human curiosity about the universe in which we live; historically, these investments have also paid handsome dividends in terms of new technologies that have raised our standard of living and even extended our life expectancy. Examples include cell phones and satellite TV, Magnetic Resonance Imaging (MRI), lasers (for levels, CD players, or eye surgery), the World Wide Web, and the computers that get you there fast.

Year Target Actual
2002 80 83
2003 225 240
2004 240 331
2005 390 598
2006 675 621
2007 800 1311
2008 800
2009 1200
Annual Output

Measure: Deliver within 20% of baseline estimate a total integrated amount of data (in inverse femtobarns [fb-1]) to the BABAR detector at the Stanford Linear Accelerator Center (SLAC) B-factory.


Explanation: In High Energy Physics, the amount of data taken by the detectors is a good indicator of progress. The events that researchers are interested in are rare, so the more events they record, the more likely they are to record what they are interested in studying. Increasing the amount of data can be technically challenging but is also critically important to the facility users. This measure tracks how we actually perform versus what we plan to deliver. The target numbers are heavily influenced by the schedule of experiments and the expected program budget. Investments in the most basic areas of research, such as physics, not only spark our imagination and advance our human curiosity about the universe in which we live; historically, these investments have also paid handsome dividends in terms of new technologies that have raised our standard of living and even extended our life expectancy. Examples include cell phones and satellite TV, Magnetic Resonance Imaging (MRI), lasers (for levels, CD players, or eye surgery), the World Wide Web, and the computers that get you there fast.

Year Target Actual
2001 25 25
2002 35 42
2003 45 40
2004 45 117
2005 50 54
2006 100 100
2007 130 90
2008 220
2009 discontinued
Annual Efficiency

Measure: Achieve less than 10% for both the cost-weighted mean percentage variance from established cost and schedule baselines for major construction, upgrade, or equipment procurement projects.


Explanation: This annual measure assesses whether the major construction projects are adhering to their specified cost and schedule baselines. Adhering to the cost and schedule baselines for a complex, large-scale science project is critical to meeting the scientific requirements for the project and to being good stewards of the taxpayers' investment in the project. The Office of Science has a rigorous process in place for overseeing the management of these large-scale, complex scientific projects and has been recognized, both inside government and by private organizations, for the effectiveness of this process.

Year Target Actual
2002 <10%, <10% +1.4%, -2.1%
2003 <10%, <10% +3.1%, -3.4%
2004 <10%, <10% +1.0%, -2.0%
2005 <10%, <10% +2.0%, -1.0%
2006 <10%, <10% <1%, <1%
2007 <10%, <10% 1.0%, <1.0%
2008 <10%, <10%
2009 <10%, <10%
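The cost-weighted mean percentage variance reported above can be computed as a baseline-cost-weighted average of each project's percentage variance. A minimal sketch, assuming that standard definition; the function name and sample portfolio are hypothetical, not drawn from the assessment:

```python
def cost_weighted_mean_variance(projects):
    """Cost-weighted mean percentage variance across a project portfolio.

    `projects` is a list of (baseline_cost, percent_variance) pairs.
    Weighting by baseline cost means large projects dominate the average,
    so a small project running far over budget cannot swamp the portfolio
    figure, and a large one cannot hide behind small ones.
    """
    total_cost = sum(cost for cost, _ in projects)
    return sum(cost * variance for cost, variance in projects) / total_cost

# Hypothetical portfolio: baseline costs in $M, cost variances in percent.
portfolio = [(100.0, +2.0), (50.0, -1.0), (25.0, +4.0)]
print(cost_weighted_mean_variance(portfolio))  # 1.428... — well under the <10% target
```

The same calculation applies separately to cost and schedule variances, which is why the targets and actuals above appear in pairs.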
Annual Efficiency

Measure: Achieve greater than 80% average operation time of the scientific user facilities (the Fermilab Tevatron and the Stanford Linear Accelerator Center (SLAC) B-factory) as a percentage of the total scheduled annual operating time.


Explanation: This annual measure assesses the reliability and dependability of the operation of the scientific user facilities. Many of the research projects undertaken at the Office of Science's user facilities take a great deal of time, money, and effort to prepare, and regularly have a very short window of opportunity to run. If the facility is not operating as expected, the experiment could be ruined or critically set back. In addition, the taxpayers have invested millions of dollars in these facilities. The longer these facilities operate reliably, the greater the return on investment for the taxpayers.

Year Target Actual
2002 >80% 87%
2003 >80% 83%
2004 >80% 89%
2005 >80% 73%
2006 >80% 78.4%
2007 >80% 82%
2008 >80%
2009 >80%
Annual Output

Measure: Deliver within 20% of baseline estimate the total integrated amount of data (in protons-on-target) to the MINOS detector using the NuMI facility.


Explanation: In High Energy Physics, the amount of data taken by the detectors is a good indicator of progress. The events that researchers are typically interested in are rare, so the more events they record, the more likely they are to record what they are interested in studying. Increasing the amount of data can be technically challenging but is also critically important to the facility users. This measure tracks how we actually perform versus what we plan to deliver. The target numbers are heavily influenced by the schedule of experiments and the expected program budget.

Year Target Actual
2006 1.0 x 10^20 1.01 x 10^20
2007 1.5 x 10^20 1.9 x 10^20
2008 2.0 x 10^20
2009 2.7 x 10^20
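The "within 20% of baseline" criterion used by the data-delivery measures above reduces to a simple ratio check. A minimal sketch, assuming a symmetric tolerance band around the target (the source does not say whether over-delivery counts against the measure, and the function name is hypothetical):

```python
def within_tolerance(target, actual, tolerance=0.20):
    """True if `actual` falls within `tolerance` (a fraction) of `target`.

    Assumes a symmetric band: both under- and over-delivery beyond the
    tolerance fail the check, which is one plausible reading of the measure.
    """
    return abs(actual - target) <= tolerance * target

# FY2006 NuMI row from the table above: target 1.0e20, actual 1.01e20 protons-on-target.
print(within_tolerance(1.0e20, 1.01e20))  # True — within the 20% band
print(within_tolerance(1.0e20, 0.75e20))  # False — 25% below baseline
```

As the explanation in 2.4 notes, the tolerance exists partly to keep facilities from unwisely stressing hardware near the end of the fiscal year to hit a number.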

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The mission of the High Energy Physics (HEP) program is to understand the universe at a more basic level by investigating the elementary particles that are the fundamental constituents of matter and the forces between them.

Evidence: FY 2004 Budget Request (www.mbe.doe.gov/budget/04budget/index.htm). Public Law 95-91 that established the Department of Energy (DOE).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The HEP program addresses several key questions: Can we realize Einstein's dream of a unified description of fundamental particles and forces in the universe? Where is the fundamental particle that endows all other particles with their masses? Are there additional or 'hidden' dimensions of space-time? What are the masses of the neutrinos, and what is their role in the universe? Why is there more matter than anti-matter in the universe? What is the nature of the dark matter and the dark energy, which together make up more than 95% of the universe?

Evidence: FY04 Budget Request/Annual Performance Plan. High Energy Physics Advisory Panel (HEPAP) Long-Range Plan (doe-hep.hep.net/hepap_reports.html). Portions of the HEP program address: the National Research Council (NRC) reports "Physics in a New Era: An Overview"; "Connecting Quarks with the Cosmos: Eleven Science Questions for the New Century"; and "Astronomy & Astrophysics in the New Millennium" (www7.nationalacademies.org/bpa/BPA_Reports.html).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: The Office of Science (SC) HEP program is the principal source of federal funding for basic, long-term High Energy Physics research and much of particle astrophysics and cosmology research.

Evidence: About 90% of U.S. High Energy Physics research is supported by the HEP program. Much of the remaining portion is supported by the National Science Foundation and is coordinated through HEPAP, a joint advisory committee.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The HEP program is based on competitive merit review, independent expert advice, and community planning. However, a COV has yet to validate the merit review system.

Evidence: HEPAP reviews and reports. (doe-hep.hep.net/hepap_reports.html). Program files.

YES 20%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: HEPAP ensures that input from the high energy physics research community is regularly gathered to assess the priorities, projects, and progress of the program. Peer review is used to assess the relevance and quality of each project.

Evidence: HEPAP reviews and reports. (doe-hep.hep.net/hepap_reports.html). Program files.

YES 20%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation:  

Evidence:  

NA  %
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation:  

Evidence:  

NA  %
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The six long-term measures, listed in priority order, reflect the key scientific drivers that the U.S. high energy physics community has outlined for the field for roughly the next decade. The program has defined "successful" and "minimally effective" performance milestones for each measure, and an external panel will assess interim program performance on a triennial basis, and update the measures as necessary. It is inappropriate for a basic research program such as this one to have a quantitative long-term efficiency measure.

Evidence: HEPAP Long-Range Plan (doe-hep.hep.net/hepap_reports.html). National Research Council (NRC) reports "Physics in a New Era: An Overview"; "Connecting Quarks with the Cosmos: Eleven Science Questions for the New Century"; and "Astronomy & Astrophysics in the New Millennium" (www7.nationalacademies.org/bpa/BPA_Reports.html). A description of the "successful" and "minimally effective" milestones, and an explanation of the relevance of these measures to the field can be found on the SC Web site (www.sc.doe.gov/measures).

YES 10%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: HEPAP has reviewed the long-term measures for this program and found them to be ambitious and meaningful indicators of progress in the field. The external reviews described in 2.1 will update the measures, targets, and timeframes on an interim basis.

Evidence: Letter from HEPAP chair regarding review of long-term measures.

YES 10%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: The quantitative annual output measures for facility construction and operations, and the data delivery goals for the two primary accelerators, serve as proxies for progress: the efficient, on-cost, and on-schedule delivery of scientific data from these large facilities is a critical resource for the continuing scientific discoveries that are directly connected to the program's long-term goals.

Evidence: FY04 Budget Request, previous GPRA reports. Website with further information, including explanation of units for data delivery measures (www.sc.doe.gov/measures).

YES 10%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: All of the annual measures have baseline data (FY01 and/or FY02) that demonstrate that the targets are ambitious, yet realistic. Based on past experience with the data delivery measures, a 20 percent tolerance is used to guard against facilities unwisely stressing hardware near the end of the fiscal year.

Evidence: FY04 Budget Request, previous GPRA reports. Construction variance target of <10% comes from OMB Circular A-11, especially Capital Programming Guide supplement.

YES 10%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: A limited FY03 audit by the DOE Inspector General (IG) found that "performance expectations generally flowed down into the scope of work at the national laboratories." For individual grantees, HEP uses general solicitations that do not explicitly include program goals.

Evidence: Memo from the DOE IG to the Director of the Office of Science. M&O contract performance evaluation provisions (Fermilab, www.fnal.gov/directorate/documents/DOE_Contract/appendixb.html; SLAC, www-group.slac.stanford.edu/bsd/contract/). Most recent general renewal solicitation (www.science.doe.gov/grants/Fr03-02.html).

NO 0%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: All research projects undergo merit review, ongoing grants are reviewed triennially, major facilities are reviewed annually, and construction projects are reviewed quarterly. While the program conducts a great number of reviews of its construction projects and of facility operations (notably the Tevatron at Fermilab), the portfolio-level reviews of the research program conducted by HEPAP have typically covered the lab program only and have lacked sufficient scope and depth. HEP is working to begin a Committee of Visitors (COV) review process for the program, and hopes to review the first program element in 2003.

Evidence: SC Merit Review guidelines (www.sc.doe.gov/production/grants/merit.html). Project reviews by advisory bodies (doe-hep.hep.net/general_reports.htm). HEPAP reports (doe-hep.hep.net/hepap_reports.html). Program files, including Lehman review reports, and post-meeting summary letters from HEPAP chair to DOE and NSF.

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: DOE has not yet provided a budget request that adequately integrates performance information.

Evidence:

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: New long-term and annual performance goals and targets have been developed in coordination with OMB. A new COV process is being organized, with the first program element review to occur in 2003. The new Particle Physics Project Prioritization Panel ("P5") report is expected in September 2003, though the Panel is only looking at a select number of new projects. HEP does not yet have independent reviews or a program strategic plan that considers new and ongoing projects, early project R&D, and facility operations within the context of the research program.

Evidence: COV charge letter from DOE to HEPAP chair. HEPAP Long-Range Plan and 20-year facilities plan (doe-hep.hep.net/hepap.html). P5 Report due September 2003 (doe-hep.hep.net/p5/index.html).

YES 10%
2.CA1

Has the agency/program conducted a recent, meaningful, credible analysis of alternatives that includes trade-offs between cost, schedule, risk, and performance goals and used the results to guide the resulting activity?

Explanation: One-of-a-kind research facilities are not amenable to the same type of alternatives analysis as other capital asset investments. A recent Lehman review of the Tevatron complex considered cost, schedule, risk, and performance issues within the effort. The analysis provided to OMB in the predecisional Exhibit 300s is frequently not meaningful.

Evidence: Program files, including Lehman reviews and Exhibit 300s. Summary of recent Tevatron review (doe-hep.hep.net/HEPAP/Jul2003/Lehman_HEPAP.pdf).

YES 10%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: This is a basic R&D program, and the question is intended for industry-related R&D programs.

Evidence:  

NA 0%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: Although not visible outside DOE, internal SC budget formulation practices include a priority ranking process. The HEPAP long range plan identified strategic priorities for the U.S. particle physics community. Priorities for specific large projects will be independently evaluated by the Particle Physics Project Prioritization Panel ("P5"). HEPAP recommended a 20-year facilities plan for DOE as a part of the SC strategic planning process.

Evidence: HEPAP Long-Range Plan and 20-year facilities plan (doe-hep.hep.net/hepap.html). P5 Report due September 2003 (doe-hep.hep.net/p5/index.html).

YES 10%
Section 2 - Strategic Planning Score 70%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: A great deal of project performance information is collected via Lehman facility construction and operations reviews, annual lab reviews, etc., and management changes are made in response to these reviews. The program collects performance data from individual grantees and national labs, and uses peer review as a type of standardized quality control at the individual grant level. However, there is not yet a systematic process, such as regular COV evaluations, that conducts research portfolio quality and process validations. While DOE IG contracts with an outside auditor to check internal controls for performance reporting, and the IG periodically conducts limited reviews of performance measurement in SC, it is not clear that these audits check the credibility of performance data reported by DOE contractors.

Evidence: Program files, including Lehman reviews and subprogram reviews. Reporting requirements for grants (www.science.doe.gov/production/grants/605-19.html).

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: Senior Executive Service (SES) and Program Manager Performance Plans are directly linked to program goals, and several high level management changes were recently carried out, partially in response to ongoing problems at the Tevatron. The Management and Operations contracts for the Labs and Facilities include performance measures linked to program goals. Research funding requirements ensure consideration of past performance.

Evidence: 10 CFR 605 (www.science.doe.gov/production/grants/605index.html). Program and personnel files, including consequences for underperforming lab and university research, grant renewal statistics, and implications for performance-based fee for the Fermilab contractor.

YES 8%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Using DOE's monthly accounting reports, SC personnel monitor progress toward obligating funds consistent with an annual plan that is prepared at the beginning of the fiscal year to ensure alignment with appropriated purposes.

Evidence: SC programs consistently obligate more than 99.5% of available funds. Program files. Audit reports.

YES 8%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: SC is currently undergoing a reengineering exercise aimed at flattening organizational structure and improving program effectiveness. The program collects the data necessary to track the two "efficiency" measures for facility construction and operations management.

Evidence: SC reengineering information (www.screstruct.doe.gov).

YES 8%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The HEP program is well coordinated with similar programs at NSF and NASA through joint advisory and assessment groups (HEPAP and SAGENAP) and joint oversight groups (JOGs) for specific projects. The program jointly funds a range of international and interagency projects.

Evidence: HEPAP (doe-hep.hep.net/hepap.html) and SAGENAP (doe-hep.hep.net/general_reports.htm). JOG Minutes. International agreements with Europe, Japan, and China. MOU with National Science Foundation for HEPAP and the Large Hadron Collider in Europe. Implementing agreement with NASA for primary instrument on the GLAST mission. Early planning process for a potential joint dark energy mission.

YES 8%
3.6

Does the program use strong financial management practices?

Explanation: SC staff execute the HEP program consistent with established DOE budget and accounting policies and practices. These policies have been reviewed by external groups and modified as required to reflect the latest government standards.

Evidence: Various Departmental manuals. Program files. Audit reports.

YES 8%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: SC is currently reengineering to improve program management efficiency. A Committee of Visitors (COV) process is being implemented. A layer of management above HEP was removed. Several management changes were recently made, partially in response to ongoing problems at the program's largest facility.

Evidence: SC reengineering information (www.screstruct.doe.gov). SC reorganization memoranda.

YES 8%
3.CA1

Is the program managed by maintaining clearly defined deliverables, capability/performance characteristics, and appropriate, credible cost and schedule goals?

Explanation: Facility critical decision points are documented and reviewed via an independent Lehman review, and occasionally via an assessment by HEPAP or SAGENAP. Progress for ongoing efforts is tracked quarterly through program and Lehman reviews. The Tevatron luminosity upgrade was not "projectized," and this was a key problem that is finally being addressed.

Evidence: Program files, including Lehman reports and program peer reviews. SAGENAP reviews (doe-hep.hep.net/general_reports.htm). Exhibit 300s.

YES 8%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: First-time grant applications are encouraged in all Requests for Proposals. In addition, new or first-time scientists apply for funding through the Outstanding Junior Investigator award program. "Merit Review" guides all funding decisions. However, the award and merit review process has not yet been validated by a COV.

Evidence: In FY 2002, the HEP program funded 15 new research grants out of a total of 160 grants. Several of the new grants for junior investigators are incorporated as new "tasks" within existing grants.

NO 0%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: In addition to grantee reports, program managers stay in contact with grantees through email and telephone, conduct program reviews, video conferences and site visits, and have grantees participate in independent reviews of other projects.

Evidence: HEPAP and SAGENAP reports (doe-hep.hep.net/general_reports.htm). Program files, including site visits and reviews.

YES 8%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: In accordance with DOE Order 241.1A, the final and annual technical reports of program grantees are made publicly available on the web through the Office of Scientific and Technical Information's "Information Bridge". However, program-level aggregate data on the impact of the grants program is not adequately communicated in the annual DOE Performance and Accountability report.

Evidence: DOE Order 241.1A. Information Bridge (www.osti.gov/bridge/). FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf).

NO 0%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: Priorities are determined in accord with guidance from the HEPAP Long-Range Plan, and construction projects are reviewed regularly. Unsolicited field work proposals from the Federal Labs are merit reviewed, but not competed. The funds for research programs and scientific user facilities at the Federal Labs are allocated through a limited-competition process analogous to the unlimited process outlined in 10 CFR 605. However, the quality of the research funded via this process has not yet been validated by a COV.

Evidence: HEPAP long range plan (doe-hep.hep.net/lrp_panel/index.html). SC Merit Review procedures (www.sc.doe.gov/production/grants/merit.html, www.science.doe.gov/production/grants/605index.html) Program files, including example of merit review for lab work.

NO 0%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation:  

Evidence:  

NA  %
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation:  

Evidence:  

NA  %
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation:  

Evidence:  

NA  %
Section 3 - Program Management Score 66%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: HEPAP will evaluate progress toward the new long-term performance measures every three to five years. HEPAP reports discuss exciting recent discoveries in several areas of particle physics. Ongoing challenges and uncertainties in reaching expected luminosity levels at the Tevatron (currently the world's highest-energy particle accelerator) may continue to present barriers to mid-term scientific progress for much of the program.

Evidence: HEPAP long range plan (doe-hep.hep.net/lrp_panel/index.html). Post-meeting summary letters from HEPAP chair to DOE/NSF managers. Summary of recent Tevatron review (doe-hep.hep.net/HEPAP/Jul2003/Lehman_HEPAP.pdf).

LARGE EXTENT 13%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: HEP met most of its annual performance goals in FY02; the one schedule slip, on the Large Hadron Collider project, was due to international partners. It appears that the BABAR detector at SLAC's B-Factory might miss its luminosity goal for FY03.

Evidence: FY02 Performance and Accountability Report (www.mbe.doe.gov/ stratmgt/doe02rpt.pdf). FY04 Annual Performance Plan (www.mbe.doe.gov/budget/04budget/content/perfplan/perfplan.pdf).

YES 20%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: The recent history of tracking the two "efficiency" measures for facility construction and operation management shows that, on average, the program continues to meet expectations.

Evidence: Program files.

YES 20%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: High energy physics is, by its very nature, an integrated worldwide effort, which makes comparison to similar programs in other countries questionable at best. For this reason, an international benchmarking study has not been done.

Evidence: 50% of collaborators at BaBar, CDF, and D-Zero experiments in U.S. are foreign. Half of collaborators on SuperK experiment in Japan are from the U.S. The U.S. has a significant stake in the Large Hadron Collider being built in Europe.

NA  %
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: HEPAP reviews of scientific progress in the program, though somewhat superficial, have found good research performance except for the Tevatron, and that shortfall was due in part to expectations mismanaged by HQ and FNAL. Recent performance of the Tevatron accelerator (Run-II) has been a concern; a recent Lehman review found decent progress, with many key hurdles for the project stretching through 2004. DOE-run reviews of laboratory programs include outside researchers and have generally found good results.

Evidence: HEPAP reports (doe-hep.hep.net/hepap.html). Post-meeting summary letters from HEPAP chair to DOE/NSF managers. Program files, including lab peer reviews. Summary of recent Tevatron review (doe-hep.hep.net/HEPAP/Jul2003/Lehman_HEPAP.pdf).

YES 20%
4.CA1

Were program goals achieved within budgeted costs and established schedules?

Explanation: NuMI/MINOS has maintained its new baseline cost and schedule since the 2001 rebaselining. All three components of the U.S. contribution to the LHC project have maintained cost and schedule, though CERN has delayed the official completion of the LHC project. The Gamma-ray Large Area Space Telescope (GLAST/LAT) project, a collaborative venture with NASA, has maintained its baseline cost and schedule, though the recent departure of France as a partner causes concern. There are positive signs for the Tevatron complex, but significant technical and managerial hurdles remain to meeting cost and schedule "baselines" once the effort is finally "projectized" in early 2004. Since "finding the Higgs" was a major driver in the past several HEP budget requests, the program should be held to this standard until it advances more realistic expectations.

Evidence: Lehman review reports for NuMI/MINOS, GLAST/LAT and US LHC projects (doe-hep.hep.net/general_reports.htm). Program files. Exhibit 300s. Summary of recent Tevatron review (doe-hep.hep.net/HEPAP/Jul2003/Lehman_HEPAP.pdf).

LARGE EXTENT 13%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA  %
Section 4 - Program Results/Accountability Score 87%


Last updated: 09062008.2003SPR