ExpectMore.gov


Detailed Information on the
Biological and Environmental Research Assessment

Program Code 10000080
Program Title Biological and Environmental Research
Department Name Department of Energy
Agency/Bureau Name Department of Energy
Program Type(s) Research and Development Program
Competitive Grant Program
Capital Assets and Service Acquisition Program
Assessment Year 2003
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 89%
Program Management 66%
Program Results/Accountability 87%
Program Funding Level
(in millions)
FY2007 $480
FY2008 $544
FY2009 $569

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Develop new annual measures to track how efficiently the program's facilities are operated and maintained on a unit-per-dollar basis by July 2007.

Action taken, but not completed Reviews have been scheduled for EMSL (Fall 2008), JGI (Fall 2008), and ARM (CY 2010)

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Implementing the recommendations of past external panel reviews of the program's research portfolio and management practices.

Completed The first round of COVs and the actions resulting from them have been completed. The second round has been initiated.
2005

Engaging the National Academies in an independent assessment of the scientific basis and business case for the program's microbial genomics research efforts.

Completed In May 2006, the National Academies issued Review of the DOE's Genomics: GTL Program. The report recommended that plans for new research facilities be reshaped to produce results earlier and more cost-effectively, and that the GTL facilities be focused not on particular technologies but on research underpinning particular applications: bioenergy, carbon sequestration, or environmental remediation. The full report is available at http://DOEGenomestoLife.org.
2005

Reviewing operations of user facilities, and improving discrimination in identifying open user facilities versus collaborative research facilities.

Completed All 3 BER facilities have been reviewed and a schedule of triennial reviews established.
2006

In cooperation with other Office of Science programs, develop and deploy a modern, streamlined, and cost-effective information management system for tracking the university grant proposal review and award process. This system should be in place by the end of FY 2007.

Completed DOE (including SC) uses GRANTS.GOV, as mandated, to receive all proposals; tracking the award of financial assistance agreements is through the DOE Procurement Acquisition Data System (PADS).

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Life Sciences - Provide the fundamental scientific understanding of plants and microbes necessary to develop new, robust, and transformational basic research strategies for producing biofuels, cleaning up waste, and sequestering carbon. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Climate Change Research - Deliver improved scientific data and models about the potential response of the Earth's climate and terrestrial biosphere to increased greenhouse gas levels for policy makers to determine safe levels of greenhouse gases in the atmosphere. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Environmental Remediation - Provide sufficient scientific understanding such that DOE sites would be able to incorporate coupled physical, chemical and biological processes into decision making for environmental remediation and long-term stewardship. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Annual Output

Measure: Increase the rate and decrease the cost of DNA sequencing --- Increase by 10% the number (in billions) of high quality (less than one error in 10,000) bases of DNA from microbial and model organism genomes sequenced the previous year, and decrease by 10% the cost (base pair/dollar) to produce these base pairs from the previous year's actual results.


Explanation: To unlock an organism's genetic code, we sequence its genome: the As, Ts, Cs, and Gs in the correct order. Thanks to investments in process improvements and next-generation technologies, the rate at which we can sequence the DNA of microbes, microbial communities, and plants relevant to DOE energy missions has been steadily increasing, which accelerates the pace of discovery. This measure tracks our progress in increasing the rate of sequencing. A great deal of excitement in the life sciences centers on the promise of genomics. Microbial and plant genomics hold great promise for novel approaches to the Department's most challenging mission needs. We are working toward developing plants and microbes that might generate biofuels such as hydrogen and ethanol, sequester carbon dioxide, and transform chemical or radioactive waste. Sequencing the genomes of these plants and microbes provides the fundamental "parts list" that is a critical first step toward harnessing the biochemical and metabolic potential of these living organisms to solve tough energy and environmental challenges. The faster we can sequence these organisms, the more of them we can sequence, and thus the greater our base of knowledge with which to experiment, understand, and exploit the abilities of plants and microbes. (A short sketch of the target arithmetic follows the table below.)

Year Target Actual
2001 >4 5.8
2002 >10 12.7
2003 >14 18
2004 >20 25
2005 >28 33.61
2006 >30 32.72
2007 >40, 644 39, 714
2008 >42.8, >785
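The annual targets above follow directly from the measure's definition: each year's volume target (billions of high-quality bases) and cost-efficiency target (base pairs per dollar) are set roughly 10% above the prior year's actual results. A minimal sketch of that arithmetic in Python; the function name and rounding are illustrative, not part of the program's methodology:

    def next_year_targets(prev_actual_gb, prev_actual_bp_per_dollar):
        # Per the measure's definition, each target is ~10% above the
        # previous year's actual result: a 10% increase in sequence volume
        # and a 10% increase in base pairs produced per dollar.
        return prev_actual_gb * 1.10, prev_actual_bp_per_dollar * 1.10

    # Example: the FY2007 actuals (39 Gb and 714 bp/$) imply FY2008 targets
    # of roughly >42.9 Gb and >785 bp/$, close to the ">42.8, >785" row above.
    print(next_year_targets(39, 714))  # approximately (42.9, 785.4)
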
Annual Output

Measure: Improve climate models -- Develop a coupled climate model with fully interactive carbon and sulfur cycles, as well as dynamic vegetation to enable simulations of aerosol effects, carbon chemistry and carbon sequestration by the land surface and oceans and the interactions between the carbon cycle and climate.


Explanation: Understanding the complexity of the global climate is critical to predicting how it might respond to human activity. The primary way to experiment in this area is through complex climate models run on our fastest computers. The more parameters a model contains, and the more those parameters are coupled to each other, the closer the model comes to the real climate. To build the model, an enormous amount of field data is collected about many different aspects of the climate, and sophisticated software is developed to incorporate the data into a model. An interagency working group coordinates the efforts of many research programs across the Government to ensure that work is complementary and not unnecessarily duplicative. The Department's climate efforts focus on several critical aspects of the climate that also draw on our core capabilities: the carbon and sulfur cycles, the effects of aerosols, and atmospheric chemistry and radiation (such as the role of clouds). This measure tracks our progress toward incorporating our research and field data into interagency models. Scientists believe they see evidence of human impact on the climate that could have very serious repercussions for our standard of living, and there have been several international efforts to regulate the human activity believed responsible for climate change. Without strong, realistic climate models we cannot predict climate change with any certainty, nor can we assess the effectiveness or impact of regulations.

Year Target Actual
2001 Consistency Consistency
2002 Resolution Resolution
2003 New Model New Model
2004 Test Bed Test Bed
2005 5 Yr Sim, 3 Submodels 4 Yr Sim, 3 Submodels
2006 Validate cloud sims Cloud sims validated
2007 Parameterize clouds Parameterize clouds
2008 Control sim report
2009 subcontinental sim
Annual Output

Measure: Determine scalability of laboratory results in field environments -- Determine the dominant processes controlling the fate and transport of contaminants in subsurface environments and develop quantitative numerical models to describe contaminant mobility at the field scale.


Explanation: Cold War era production of nuclear weapons has left complex mixtures of contaminants in the subsurface at many DOE sites. Removing these contaminants or lowering their concentrations to national standards of environmental quality is often a technically challenging problem with few practical solutions. The Office of Biological and Environmental Research (BER) funds research on the coupled physical, chemical and biological processes that affect contaminant transport within the subsurface at DOE sites. This research develops methods to detect contaminants in the subsurface, quantitatively describe how they move, and devise novel treatment methods to remove or prevent further spreading of contamination. A critical element of this research is whether laboratory descriptions of contaminant mobility actually scale to the field setting. By funding a combination of projects examining key aspects of contaminant fate and transport at varying scales of measurement, including several field scale research sites, BER hopes to decipher the complex processes controlling contaminant mobility and develop science-based remediation strategies to intercept, immobilize or remove contaminants from the subsurface at DOE sites.

Year Target Actual
2002 Sequence Sequence
2003 Identify Identify
2004 Modeling Modeling
2005 Bioremediation test Bioremediation test
2006 Predictive model Predictive model
2007 Quantify processes Quantify processes
2008 ID critical pathways
2009 geophys. techniques
Annual Efficiency

Measure: Average achieved operation time of the scientific user facilities as a percentage of the total scheduled annual operation time.


Explanation: This annual measure assesses the reliability and dependability of the operation of the scientific user facilities. Many of the research projects undertaken at the Office of Science's scientific user facilities take a great deal of time, money, and effort to prepare and often have a very short window of opportunity to run. If a facility is not operating as expected, the experiment could be ruined or critically set back. In addition, taxpayers have invested millions or even hundreds of millions of dollars in these facilities; the longer the period of reliable operations, the greater the return on that investment. (A brief sketch of how this percentage is computed follows the table below.)

Year Target Actual
2001 >90% 98%
2002 >90% 97%
2003 >90% 97%
2004 >90% 98%
2005 >90% 100%
2006 >95% 97%
2007 >98% 102%
2008 >98%
2009 >98%
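As a clarifying sketch, assuming the straightforward reading of the measure (achieved operating hours divided by total scheduled operating hours), the percentage can be computed as below. The hours shown are hypothetical, and the interpretation that values above 100% reflect facilities running beyond their scheduled hours is an assumption, not stated in the assessment.

    def achieved_uptime_pct(achieved_hours, scheduled_hours):
        # Reliability reported as achieved operating time as a share of the
        # total scheduled annual operating time, expressed as a percentage.
        return 100.0 * achieved_hours / scheduled_hours

    # Hypothetical example: a facility scheduled for 5,000 hours that actually
    # delivers 5,100 hours reports 102%, which is one way the FY2007 actual
    # could exceed 100%.
    print(achieved_uptime_pct(5100, 5000))  # 102.0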

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The mission of the Biological and Environmental Research (BER) program is to advance environmental and biomedical knowledge that promotes national security through improved energy production, development, and use, and that contributes to international scientific leadership.

Evidence: FY04 Budget Request (www.mbe.doe.gov/budget/04budget/index.htm). Public Law 95-91 that established the Department of Energy (DOE). The BER Mission has been validated by the Biological and Environmental Research Advisory Committee (BERAC).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: BER supports fundamental research across a broad range of the biological and environmental sciences including: (1) biotechnology solutions for clean energy, carbon sequestration, and environmental cleanup, (2) low dose radiation research to underpin risk protection and cleanup standards, (3) high throughput DNA sequencing for DOE and National needs, (4) understanding the response of the Earth system to different levels of greenhouse gases in the atmosphere, (5) developing and demonstrating novel solutions to DOE's most challenging environmental problems, and (6) developing innovative radiopharmaceuticals for diagnosis and treatment of human disease and novel imaging instrumentation/technologies to visualize and measure biological functions.

Evidence: BERAC reviews (www.sc.doe.gov/ober/berac/Reports.html).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: BER supports long-term, fundamental, high risk research relevant to DOE missions. The BER program is well coordinated with similar programs across the Federal government including: the US Climate Change Science Program (CCSP), the National Institutes of Health (NIH), the Environmental Protection Agency, the National Science Foundation (NSF), and DOE Energy and Environmental Management programs.

Evidence: Program reviews (BERAC, National Academy, JASON). Joint program plans including: climate (USGCRP - Annual publication of Our Changing Planet); genomics/structural biology [www.sc.doe.gov/ober/berac/final598.html]; low dose radiation; Bioengineering [www.becon1.nih.gov/becon.htm].

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The BER program is based on competitive merit review, independent expert advice, and community planning, an approach that has proven efficient and effective. However, a Committee of Visitors (COV) has yet to validate the merit review system.

Evidence: BERAC reviews and reports. Program files.

YES 20%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: BERAC ensures that research community input is regularly gathered to assess the priorities, projects, and progress of the program. Peer review is used to assess the relevance and quality of each project.

Evidence: BERAC reviews and reports. Program files.

YES 20%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation:  

Evidence:  

NA  %
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation:  

Evidence:  

NA  %
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The three key long-term measures focus on key scientific research outcomes and are meaningful indicators of progress in each of the three main program areas. The program has defined specific quantitative "successful" and "minimally effective" performance milestones for each measure, and an external panel will assess interim program performance on a triennial basis, and update the measures as necessary. It is inappropriate for a basic research program such as this one to have a quantitative long-term efficiency measure.

Evidence: Advisory committee reports discuss the key scientific drivers for the breadth of BER's diverse research portfolio (www.science.doe.gov/production/ober/berac/Reports.html). A description of the specific "successful" and "minimally effective" milestones, and an explanation of the relevance of these measures to the field can be found on the Office of Science (SC) Web site (www.sc.doe.gov/measures).

YES 11%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: BERAC has reviewed the new long-term and annual measures for this program and found them to be ambitious and meaningful indicators of progress. The external reviews described in 2.1 will update the measures, targets, and timeframes on an interim basis.

Evidence: Letter from BERAC chair regarding review of long-term and annual measures.

YES 11%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: The facilities measure, the sequencing-rate measure, and the climate model improvements should provide the capabilities that the scientific community needs to make discoveries directly connected to the long-term measures. The measure on the scalability of field results is key to the success of the long-term measure for Environmental Remediation. The climate and environmental remediation measures are not trendable; their annual primary targets evolve continually and cannot be predicted more than one budget year in advance.

Evidence: FY04 Budget Request. Website with further information, including explanation of non-trendable measures and targets (www.sc.doe.gov/measures).

YES 11%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: Half of the annual measures include quantifiable annual targets. The other half include specific annual scientific targets. Baseline data (FY01 and FY02) verify that the quantifiable annual measures are ambitious, yet realistic.

Evidence: FY04 Budget Request. Website with further information (www.sc.doe.gov/measures).

YES 11%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: A limited FY03 audit by the DOE Inspector General (IG) found that "performance expectations generally flowed down into the scope of work at the national laboratories." BER targeted solicitations explicitly include program goals; however, the new measures from 2.1/2.3, once adopted, should also appear in future solicitations.

Evidence: Memo from the DOE IG to the Director of the Office of Science. M&O contract performance evaluation provisions (Web-accessible examples include: Oak Ridge National Lab, www.ornl.gov/Contract/UT-BattelleContract.htm; and Lawrence Berkeley National Lab, www.lbl.gov/LBL-Documents/Contract-98/AppFTOC.html). Solicitation examples (www.science.doe.gov/grants/Fr03-05.html, www.science.doe.gov/grants/Fr03-13.html)

YES 11%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: All research projects undergo Merit Review. Grants are reviewed triennially. Major facilities are reviewed annually. Construction projects are reviewed quarterly. BERAC evaluates all aspects of the BER program every 2-5 years. JASON reviews of specific programs are used. Several large pieces of the BER portfolio are also reviewed by outside panels as part of interagency programs. Even though the FY04 PART process did not require the initiation of a Committee of Visitors (COV) review process, BER is in the process of establishing a COV because the previous external reviews have not provided a process validation and detailed portfolio quality check.

Evidence: SC Merit Review guidelines (www.sc.doe.gov/production/grants/merit.html). BERAC reviews of climate change research, bioremediation program units, Free Air Carbon-dioxide Enrichment (FACE), and Atmospheric Radiation Measurement Unmanned Aerial Vehicles (ARM UAV) (www.sc.doe.gov/ober/berac/Reports.html). Program files, including Lehman review reports and JASON reviews. Letter to BERAC chair on creation of COV process, schedule for reviews, and conflict of interest issues.

YES 11%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: DOE has not yet provided a budget request that adequately integrates performance information.

Evidence:

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: New performance goals and targets have been developed in coordination with OMB. BER participated in the drafting of a new SC strategic plan. BERAC has produced forward-looking reports on various aspects of the program, including most recently the Genomes to Life effort. BER participates in interagency planning groups on topics such as genomics and climate change, including the recent strategic plan for the U.S. Climate Change Science Program. BER is initiating a COV process to help in identifying research program strengths/weaknesses for strategic planning purposes.

Evidence: SC strategic plan has yet to be officially provided to OMB for review. BERAC reports, e.g., structural biology, Genomes to Life, and the NABIR program (www.sc.doe.gov/ober/berac/Reports.html). Climate change documents; both governmental and National Academy of Sciences (www.usgcrp.gov, dels.nas.edu/ccgc).

YES 11%
2.CA1

Has the agency/program conducted a recent, meaningful, credible analysis of alternatives that includes trade-offs between cost, schedule, risk, and performance goals and used the results to guide the resulting activity?

Explanation: The program did not have any construction or upgrade projects of sufficient scale during FY02, so no analyses were necessary.

Evidence:  

NA 0%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: This is a basic R&D program, and the question is intended for industry-related R&D programs.

Evidence:  

NA 0%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: Although not visible outside DOE, internal SC budget formulation practices include a priority ranking process. The program occasionally solicits prioritization recommendations from BERAC, though the program has a difficult time prioritizing across its diverse portfolio. BER typically appears to make priority-based decisions during program execution.

Evidence: Genomes to Life (doegenomestolife.org) is a priority of both BERAC and BER. A recent BERAC assessment of Biosphere 2 determined that its science capability was not a priority for the program (www.science.doe.gov/production/ober/berac/Biosphere_2.pdf). Charge letter to BERAC chair asking for recommendations on priorities for the atmospheric sciences program.

YES 11%
Section 2 - Strategic Planning Score 89%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Performance information is collected for a number of program elements (e.g., the amount and quality of DNA sequence determined and the spatial resolution of improved climate models), and BERAC conducts retrospective analyses of broad program impacts. Project performance information is collected via Lehman reviews. The program collects performance data from individual grantees and national labs, and uses peer review as a type of standardized quality control at the individual grant level. However, there is not yet a systematic process, such as regular COV evaluations, that validates research portfolio quality and review processes. While the DOE IG contracts with an outside auditor to check internal controls for performance reporting, and the IG periodically conducts limited reviews of performance measurement in SC, it is not clear that these audits check the credibility of performance data reported by DOE contractors.

Evidence: JGI data (www.jgi.doe.gov). Climate models (www.ccsm.ucar.edu). BERAC program reviews (www.science.doe.gov/production/ober/berac/Reports.html). Program files, including JASON studies, and Lehman review of "Mouse House."

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: Senior Executive Service (SES) and Program Manager Performance Plans are directly linked to program goals. The Management and Operations contracts for the Labs and Facilities include performance measures linked to program goals. Research funding requirements ensure consideration of past performance. All renewal requests are subject to competitive peer review, including earmarked projects after the first year.

Evidence: Program and personnel files. For performance-based fee adjustments on M&O contracts, see evidence for question 2.5. Grant rules for renewals (www.science.doe.gov/grants/#GrantRules).

YES 8%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Using DOE's monthly accounting reports, SC personnel monitor progress toward obligating funds consistent with an annual plan that is prepared at the beginning of the fiscal year to ensure alignment with appropriated purposes. SC programs consistently obligate more than 99.5% of available funds.

Evidence: Program files. DOE-wide audit reports.

YES 8%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: SC is currently undergoing a reengineering exercise aimed at flattening organizational structure and improving program effectiveness. The program collects the data necessary to track its one "efficiency" measure for facility operation management.

Evidence: FY04 Budget Request/Annual Performance Plan. SC reengineering information (www.screstruct.doe.gov).

YES 8%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The program, by its nature as a smaller player in almost everything it funds, is well coordinated with similar programs across the Federal government, including the USGCRP, NIH, EPA, NSF, and DOE Energy and Environmental programs. This coordination and cooperation includes joint planning and priority setting as well as joint solicitations, most recently cost-sharing a new beamline at the Stanford Synchrotron Radiation Laboratory with NIH.

Evidence: Program and expert reviews detail coordination (e.g., www.sc.doe.gov/ober/berac/State%20of%20BER.pdf). Joint program planning with other agencies, especially for efforts such as the Human Genome Project and the U.S. global climate change program (www.ornl.gov/TechResources/Human_Genome/home.html, www.usgcrp.gov). Recent joint interagency solicitations (www.sc.doe.gov/grants/Fr03-04.html, www.sc.doe.gov/grants/Fr03-07.html)

YES 8%
3.6

Does the program use strong financial management practices?

Explanation: SC staff execute the BER program consistent with established DOE budget and accounting policies and practices. These policies have been reviewed by external groups and modified as required to reflect the latest government standards.

Evidence: Various Departmental manuals. Program files. Audit reports.

YES 8%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: SC is currently reengineering to improve program management efficiency. BER has worked with OMB to improve performance evaluation. Even though it was not recommended during the FY04 PART process, BER is organizing a new COV process under the auspices of BERAC.

Evidence: SC reengineering information (www.screstruct.doe.gov). Letter to BERAC chair on creation of COV process, schedule for reviews, and conflict of interest issues.

YES 8%
3.CA1

Is the program managed by maintaining clearly defined deliverables, capability/performance characteristics, and appropriate, credible cost and schedule goals?

Explanation: The BER program documents the capabilities and characteristics of new facilities in conceptual design reports that are reviewed by BERAC and independent Lehman Reviews. Progress on the one construction project is tracked quarterly through program and Lehman reviews.

Evidence: Conceptual Design Reviews. Program files, including facility peer review on FACE, and Lehman report on the program's single construction project (Laboratory for Comparative and Functional Genomics, bio.lsd.ornl.gov/mgd).

YES 8%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: First-time grant applications are encouraged in all Requests for Proposals/Applications, and BER has a much higher percentage of new awards than other SC programs. Merit review guides all funding decisions, and targeted solicitations ensure that a larger share of research dollars is fully competed. However, the quality of the research funded via this process has not yet been validated by a COV. Also, BER has seen increasing Congressional earmarking in recent years, and this "research", totaling almost $100 million in FY 2004, does not go through any merit-based competitive review process.

Evidence: On average, BER funds 30% of new research applications. For calendar year 2001, BER received 495 new applications and 82 requests for renewals of currently funded projects. (www.sc.doe.gov/ober/ober_top.html) Targeted solicitations (universities: www.science.doe.gov/grants/closed03.html; labs: www.science.doe.gov/grants/clolab03.html).

NO 0%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: In addition to grantee progress reports, program managers stay in contact with grantees through email and telephone, program reviews, and site visits.

Evidence: Program files, including travel logs and progress reports.

YES 8%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: In accordance with DOE Order 241.1A, the final and annual technical reports of program grantees are made publicly available on the web through the Office of Scientific and Technical Information's "Information Bridge". However, program-level aggregate data on the impact of the grants program is not adequately communicated in the annual DOE Performance and Accountability report.

Evidence: DOE Order 241.1A. Information Bridge (www.osti.gov/bridge/). FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf).

NO 0%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: The funds for research programs and scientific user facilities at the Federal labs are allocated through a limited-competition process analogous to the unrestricted process outlined in 10 CFR 605, though BER funds very little work through this mechanism. More so than other SC programs, BER competes the lab research grants by developing a large number of targeted (rather than general) solicitations. However, the quality of the research funded via this process has not yet been validated by a COV.

Evidence: SC Merit Review procedures. (www.sc.doe.gov/production/grants/merit.html) 10 CFR 605. (www.science.doe.gov/production/grants/605index.html). Targeted solicitations (universities: www.science.doe.gov/grants/closed03.html; labs: www.science.doe.gov/grants/clolab03.html).

NO 0%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation:  

Evidence:  

NA  %
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation:  

Evidence:  

NA  %
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation:  

Evidence:  

NA  %
Section 3 - Program Management Score 66%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: BERAC will evaluate progress toward the new long term performance measures every three years, but no external reviews that address progress toward program goals (either past ones or the new ones proposed in the "measures" tab) are available to date other than the generally positive BERAC reviews.

Evidence: BERAC reports, especially the 2001 assessment of the entire program (www.er.doe.gov/production/ober/berac/Reports.html).

LARGE EXTENT 13%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Although all but one of the annual performance measures for FY05 are new, BER hit over half of the targets for all of its former annual GPRA measures. The genome target was missed because of a programmatic decision to focus on completing DOE's piece of the human genome according to an accelerated interagency plan.

Evidence: FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf). FY04 Annual Performance Plan (www.mbe.doe.gov/budget/04budget/content/perfplan/perfplan.pdf).

LARGE EXTENT 13%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: The recent history of tracking the one "efficiency" measure for facility operation management shows that the program continues to meet or exceed expectations.

Evidence: Program files, including facilities usage data.

YES 20%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: The program is highly integrated with the activities of other agencies, and typically plays a relatively smaller--but important--leveraging role in interagency ventures: no other program with the range of activities (i.e., environmental remediation, climate change, life sciences, medical applications) and mission focus of BER exists in the world. Partly because of the highly integrated nature of BER, no expert panel comparison of performance (either with other agencies or countries) has been conducted at the program-wide level as would be appropriate for the PART.

Evidence: Internal government planning reviews to assess the strongest aspects of each agency. BERAC reports (www.er.doe.gov/production/ober/berac/Reports.html). BER role in human genome project, etc.

NA  %
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: BERAC, on a rotating schedule, reviews the major elements of the BER program against plans and scientific opportunities. The entire BER program was positively reviewed by BERAC in 2001, though this review did not have great depth. Other expert groups, such as JASON, also review pieces of BER as needed. However, BER needs a COV process to fill gaps in the normal BER review process.

Evidence: BERAC review reports (www.sc.doe.gov/ober/berac/Reports.html). Program files, including facility peer reviews and JASON reports.

YES 20%
4.CA1

Were program goals achieved within budgeted costs and established schedules?

Explanation: Construction of the Laboratory for Comparative & Functional Genomics at Oak Ridge, to be completed in FY 2003, is on schedule and within cost.

Evidence: Program files, including 04/30/02 Lehman review report.

YES 20%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA  %
Section 4 - Program Results/Accountability Score 87%


Last updated: 09062008.2003SPR