Program Evaluation: EERE Resource Documents

Listed below are evaluation resource documents for the Office of Energy Efficiency and Renewable Energy (EERE).

(1)  Project Manager’s Guide to Managing Impact and Process Evaluation Studies. August 2015.
Prepared by: Yaw O. Agyeman, Lawrence Berkeley National Laboratory & Harley Barnes, Lockheed Martin

This report provides a step-by-step approach to help managers of EERE evaluation projects create and manage objective, high-quality, independent, and useful impact and process evaluations. It provides information to help with the following:

- Determine why, what, and when to evaluate;
- Identify the questions that need to be answered in an evaluation study;
- Specify the type of evaluation(s) needed;
- Hire a qualified independent third-party evaluator;
- Monitor the progress of the evaluation study;
- Implement credible quality assurance (QA) protocols;
- Ensure the evaluation report presents accurate and useful findings and recommendations;
- Ensure that the findings get to those who need them; and
- Ensure findings are put to appropriate use.

(2) A Framework for Evaluating R&D Impacts and Supply Chain Dynamics Early in a Product Life Cycle. 2014.
Prepared by: Gretchen Jordan (360 Innovation LLC), Jonathan Mote (George Washington University), Rosalie Ruegg (TIA Consulting Inc.), Thomas Choi (Arizona State University), and Angela Becker-Dippmann (Pacific Northwest National Laboratory)

This report provides a framework for evaluation of R&D investments aimed at speeding up the pace of innovation and strengthening domestic manufacturing and supply chains. The framework described in this report provides a view of dynamics unfolding in the "black box of innovation" during early phases of the product life cycle. This early period of focus can be contrasted with the long-term period of impact evaluation that seeks to measure ultimate results. The framework helps users understand, measure, and enhance the ingredients and early processes that will determine long-term impact. It adds analysis of product value chain networks to the evaluators' toolbox as a means of assessing early changes in a targeted product's domestic supply chain and value chain. The framework identifies core progress and impact metrics for analyzing changes in a product value chain, and it provides an approach for assessing DOE attribution in detail if warranted and feasible.

(3) Evaluating Realized Impacts of DOE/EERE R&D Programs. 2014 Final Report.
Prepared by: Rosalie Ruegg (TIA Consulting Inc.), Alan C. O'Connor (RTI International), and Ross J. Loomis (RTI International)

This document provides guidance for evaluators who conduct impact assessments to determine the "realized" economic benefits and costs, energy and environmental benefits, and other impacts of EERE R&D programs. The approach described in this guide focuses primarily on realized results and the extent to which they can be attributed to the efforts of an R&D program.

(4) Overview of Evaluation Methods for R&D Programs. 2007.
Prepared by: Rosalie Ruegg and Gretchen Jordan (Sandia National Laboratories)

This booklet introduces managers to a variety of methods for evaluating R&D programs. It provides an overview of 14 evaluation methods that have proven useful to R&D program managers in federal agencies. Each method is briefly defined, its uses are explained, its limitations are listed, examples of successful use by other R&D managers are provided, and references are given.

(5) Impact Evaluation Framework for Technology Deployment Programs: An approach for quantifying retrospective energy savings, clean energy advances, and market effects. 2007. (Main Report).
Prepared by: John H. Reed (Innovologie LLC), Gretchen Jordan (Sandia National Laboratories) and Edward Vine (Lawrence Berkeley National Laboratory)

This document describes a framework for evaluating the retrospective impact of technology deployment programs. Program managers and evaluators in federal, state, and local governments and in public entities and institutions are increasingly accountable for delivering and demonstrating the results of their programs. The impact framework helps program managers and evaluators develop and implement better and more cost-effective evaluations. The seven-step process, generic templates, generic evaluation questions, and evaluation designs that make up the framework can be used to develop powerful and meaningful impact evaluations to refine programs, increase program effectiveness, make the tough decisions to drop ineffective program elements, and develop credible evidence that communicates the value of the program to stakeholders. An important focus of the framework is increasing understanding of the linkages between program outputs and short-term and long-term outcomes (impacts). It simplifies and enriches the process of describing and developing measures of target-audience response to program outputs, designing sound evaluations, and taking credit for all effects that are attributable to the program. Created for EERE, the framework can be applied to most deployment programs in a broad array of disciplines.

(6) Impact Evaluation Framework for Technology Deployment Programs: An approach for quantifying retrospective energy savings, clean energy advances, and market effects. (An Overview and Example). 2007.
Prepared by: John H. Reed (Innovologie LLC), Gretchen Jordan (Sandia National Laboratories) and Edward Vine (Lawrence Berkeley National Laboratory)

This is a twelve-page overview of, and an application example for, the larger report "Impact Evaluation Framework for Technology Deployment Programs: An approach for quantifying retrospective energy savings, clean energy advances, and market effects."

(7) EERE Peer Review Guide. 2004.
Prepared by: Sam Baldwin (DOE), Jim Daley (DOE), Jeff Dowd (DOE), David Howell (DOE), John Ryan (DOE), Alan Schroeder (DOE), Frank Wilkins (DOE), and Gretchen Jordan (Sandia National Laboratories)

The peer review guide describes steps to plan, design, and implement external peer reviews. This guide provides managers and staff with guidance in establishing a formal in-progress peer review that provides intellectually fair expert evaluation of EERE R&D and supporting business administration programs.

(8) "Value of Program Evaluation" Case Study Series: Sponsored by DOE

(9) A Proposed Methodology to Determine the Leverage Impacts of Technology Deployment Programs. 2008.
Prepared by: James L. Wolf for the U.S. Department of Energy, Washington, D.C.

This report describes a methodology for estimating leverage that is relevant and useful to technology deployment programs in EERE. The recommended methodology is intended to inform the development of new standards for calculating leverage that would be deemed valid and defensible not only within EERE but also across public-sector technology deployment programs generally.

(10) Stage Gate Review: