
Dr. Anne Petersen
Deputy Director
National Science Foundation

Testimony
Before the House Science Committee
July 10, 1996

Mr. Chairman, Congressman Brown, and members of the Committee, I appreciate the opportunity to appear today to discuss efforts underway at the National Science Foundation to implement the Government Performance and Results Act (GPRA). NSF fully supports the GPRA aim of improving agency accountability, and we welcome the opportunity to inform the public more fully about the outcomes of investments in research. Fully implementing the law presents some new challenges to the Foundation, but we have been exploring the possibilities of the Act for three years, and we are confident that we will be ready for full implementation in Fiscal Year 1999.

In the fall of 1993, NSF volunteered to participate in a number of GPRA pilot projects, which are still active. We have also participated in inter-agency discussions of the implications of GPRA for research agencies, and in the preparation of the NSTC white paper on this topic. The more time we have spent in these discussions, the more aware we have become of NSF's distinctive role in the federal research enterprise, and of its unique position as a research agency in the GPRA process. Our fellow agencies have specific missions which define long-term outcomes for their research activities, while NSF has the mission of providing the intellectual capital that enriches, and may sometimes transform, all those missions. These factors have contributed to the level of attention and care we have given to formulating our response to this Act.

Earlier, we provided the Committee with documentation on our preparations for GPRA implementation, and we have therefore not included that detailed information in this testimony. We are preparing case studies of our experience with GPRA strategic planning and performance measurement for the Office of Management and Budget, and will forward a copy when they are in final form.

A RESULTS-ORIENTED AGENCY

Neither "performance" nor "results" is a new topic of interest at NSF. By the very nature of its work, NSF is steeped in performance information. Our research and education investments are made through a rigorous process of merit review with peer evaluation. One important element in those decisions is the performance competence of the applicant -- in the case of research, the investigator's record of past research accomplishments, including communication of findings and sharing of data and other research products. Other criteria focus on expected results: new discoveries or fundamental advances; new or improved technology or assisting in the solution of societal problems; and contributing to the Nation's research, education, and manpower base. When NSF grantees return to us for further support, they include information about results of prior work in their applications, and we require reviewers to take this information into account in decisions about further funding. With larger, more complex activities that stretch out over longer periods of time, our level of attention to performance and results is correspondingly higher. Centers, for example, generally go through an intensive review in their third and sixth years, in addition to regular site visits and evaluation of annual reports. Facilities receive similar regular reviews and recompetition. Finally, our portfolio of education activities receives intensive monitoring for results in addition to regular program evaluation.

Just as "performance" and "results" are familiar territory for NSF, strategic planning is by no means a new concept for us. Our mission was set in our organic act, and every year, we translate that mission into a set of projected investments in the form of our budget request. In formulating the plans represented in the budget, we consult broadly, both with the research and education communities and with other partner institutions, such as state and local governments, industrial firms, nonprofit organizations, academic institutions, professional societies, international organizations, and other federal agencies. This consultation happens in a number of ways: for example, in the process of merit review, which involves more than sixty thousand reviewers annually; in our eight major advisory committees and the other panels that report through them; and through the National Science Board, our governing body. This rich network of consultation is the steering mechanism that keeps the Foundation's activities, from broad allocations of funds through the design and implementation of new programs to the selection of specific projects, moving in directions that are widely seen as fruitful for the nation.

GPRA, however, does ask us to summarize and present the results of these information-gathering and consultative processes differently than we have in the past. Much of the performance information we now collect deals with immediate results at the project level. For GPRA we will be using that information to look at our set of investments as a whole.

PERFORMANCE PLANNING

GPRA's most profound challenges for NSF lie in the area of planning. Under the Act, NSF is required to submit a strategic plan that covers a five-year period and will be updated every three years. GPRA further requires an annual performance plan, derived from the strategic plan, which sets quantitative targets for performance in a particular fiscal year. In research, it is vitally important that we be able to follow through on unanticipated breakthroughs. Such changes in direction are not consistent with goals tied to specific research directions. We have therefore used the required strategic plan as an opportunity to state our general objectives and strategies. The plan articulates three goals:

  • Enable the U.S. to uphold a position of world leadership in all aspects of science, mathematics, and engineering.
  • Promote the discovery, integration, dissemination, and employment of new knowledge in service to society.
  • Achieve excellence in U.S. science, mathematics, engineering, and technology education at all levels.

It also identifies four core strategies: develop intellectual capital, strengthen the physical infrastructure, integrate research and education, and promote partnerships. While we have phrased these goals and strategies to be enduring and long-term, we also expect to be able to adjust them as needed based on changing conditions. Our current strategic plan was published in 1995, and our budget submissions to Congress for FY 1996 and FY 1997 both reported and were shaped by its contents.

Performance plans, however, present NSF with a particular challenge. For GPRA reporting, we have designated four key program functions -- research projects, facilities, education and training, and administration and management -- and have begun to present our budget along these lines. These categories were chosen in large part because they shared common performance expectations, even though the activities they encompass are spread across our organizational units. Using those common performance expectations, we have experimented with performance planning for all four key program functions. Our GPRA pilot projects, along with our ongoing efforts in educational program evaluation, have taught us important lessons in this area. We have learned, for example, that some of those areas are easier to address than others. We have some quantitative indicators of performance in education, facilities, and administration and management, but our experience has also demonstrated the difficulty of setting quantitative performance goals for research. Two factors in particular make research performance difficult to express in quantified form.

First is the time factor. Under GPRA, a performance goal is intended to refer to the results of the agency's budget allocation for a particular year. But the outcomes of NSF-supported activities typically build over long periods, and specific uses appear at unpredictable intervals. The results that appear in a given fiscal year -- both discoveries and other benefits for society -- are therefore often the interactive products of funding over several preceding years. For example, NSF-supported research on geohazards increases the capability of predicting major catastrophic events by building understanding of their interface with ongoing phenomena and processes. NSF supports this area through investigator-initiated research, centers (both focused and broad-ranging), integrated international programs, and rapid-response scientific teams. One example of the payoff from this investment appeared when Mount Pinatubo erupted in 1991. Kelvin Rodolfo, an NSF-supported investigator, was able to build on the base of knowledge about geohazards to accurately map the flows of dangerous hot mud slides, saving many lives and allowing the timely evacuation of Clark Air and Subic Naval Bases. If GPRA had been in effect, Rodolfo's accomplishment could have been reported for 1991, but it clearly reflected much more than an FY 1991 investment. Like many of NSF's activities, the investment in geohazards research does not fit neatly into fiscal year boxes for the purposes of setting performance goals.

Second, GPRA calls for quantitative performance targets. But setting numeric targets on performance indicators for research and education activities has clear potential to do more harm than good. In research, for example, there is a set of well-established quantitative performance indicators, but they reflect only short-term outputs, not the longer-term outcomes that GPRA, and indeed NSF's mission itself, expects us and our grantees to focus on. If we focused the attention of NSF staff and grantees on output indicators -- say, publications in peer-reviewed journals -- we would deflect them from the more fundamental goals we want them to strive for: extending the frontiers of knowledge, recruiting and educating the country's best minds for research, and carrying the excitement of discovery into the classroom. Rather than achieving the purposes of GPRA, this approach would defeat them.

For these reasons, we have proposed to use the alternative format for performance goals available under GPRA, where appropriate. This alternative allows us to state descriptive goals, while challenging us to determine in an independently verifiable way whether we have met them. For many of our own activities, as opposed to those of our grantees, more conventional performance goals will be possible. When we are putting a new program in place, for example, we will be able to articulate implementation goals. We are also experimenting with indicators for the merit review process, such as indicators related to support for young investigators and diversity in the reviewer pool. In these areas, too, we are seeking not just any measure but the right measures: the ones that keep us focused on making effective investments in research and education on behalf of the nation. Here, too, the indicators need to balance quantity and quality.

PERFORMANCE REPORTING

The use of the alternative format for performance goals has allowed us to make headway on designing a system for performance assessment that will be both efficient and effective. First, we are addressing the problem of the long-term, unpredictable payoffs from research and education investments by translating our performance goals into standards that can be applied to our portfolio retrospectively on a multiple-year basis. Using these standards, we can learn in a systematic way from past experience in order to plan better for the future, but on a longer time scale than GPRA's annual cycle. When we examine our investments in a given field, we can ask near-term questions about the management of the merit review process and grants program, along with longer-term questions about major advances in the field over the past decade and how NSF's grantees have contributed to and capitalized on them. Answering questions like these will require compilation of information from the program level up, and the process will therefore involve all of NSF's professional staff and advisory committees, up to the level of the National Science Board. Examination of how past funding strategies have paid off will thus provide inputs to a wide range of decisions about the deployment of resources in the future.

Second, descriptive performance goals will allow us to use quantitative performance information in its appropriate context. For example, for our facilities portfolio, we have developed measures of operational performance, but efficiency alone is not the goal of these investments. When we use descriptive goals to focus on the research payoffs from facilities, we can factor the efficiency indicators appropriately into an overall assessment of performance and results. Similarly, our core strategy of developing intellectual capital leads to different output indicators in the different fields of research we support; these differences must be taken into account in interpreting indicators. In computer science and engineering, for example, software is a more important immediate product of the research process than publications; in the social sciences, books are more important than journal articles. The expected longer-term outcomes of our programs also vary. In computer science and engineering, new industries may develop out of given lines of research in a period of ten to fifteen years, while in environmental research, the building of international partnerships may be a more appropriate and productive expectation. By building reports of performance up from lower levels of our organization using goals-based performance standards, we can summarize NSF's overall performance within the framework of our strategic plan without forcing units into Procrustean targets on a limited set of indicators. We intend to use major accomplishments alongside appropriate quantitative indicators as the basis for judgments about whether we are meeting our goals, and we expect to include such accomplishments in our performance reports.

GPRA requires independent verification of performance, whether descriptive or quantitative goals are used. To accomplish this in relation to our descriptive goals, we are planning to make use of our existing advisory structure, while recognizing that we will need some additional inputs to integrate across the more focused assessments. We are also improving the base of information that assessment panels will have available by redesigning our project reporting system. In both these areas, we have taken as a goal not to increase the burden on the research community. We believe we will be able to achieve that goal, first, by tapping into the power of information technology, and second, by paring back on current functions in order to make room for the increased attention to performance and results.

SUMMARY

In short, GPRA-type strategic planning has already had a positive impact within NSF. We are developing a system for performance reporting that draws on current practice while strengthening the base of information about performance and linking it more directly to the goals of our strategic plan. As you can see, implementing GPRA in the research and education enterprise is a difficult and challenging undertaking. NSF, along with our external community, is moving forward to implement the Act in ways that are responsive to its goals and objectives while at the same time reflecting the realities of the basic research and education process.

Thank you for this opportunity to testify on GPRA. I would be happy to answer any questions.

See also: Hearing Summary.

 
