ExpectMore.gov


Detailed Information on the
Science and Engineering Centers Programs Assessment

Program Code 10004404
Program Title Science and Engineering Centers Programs
Department Name National Science Foundation
Agency/Bureau Name National Science Foundation
Program Type(s) Research and Development Program
Competitive Grant Program
Assessment Year 2006
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 100%
Program Management 100%
Program Results/Accountability 87%
Program Funding Level
(in millions)
FY2008 $258
FY2009 $251

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Monitoring performance against recently established efficiency targets and making appropriate management or other adjustments to improve its performance.

Completed This improvement plan should be deleted, since it duplicates the plan to develop annual performance measures (percentage of non-academic partners) that can be monitored more readily. The Centers PART has only one measure relating to non-academic partners.
2006

Ensuring increased timeliness of yearly project reports from award recipients.

Completed On Nov. 18, 2006, changes were implemented in the Project Reports System that enable NSF to monitor and enforce timely submission by PIs of annual and final project reports. Annual reports are due 90 days prior to the report period end date and are required for all standard and continuing grants and cooperative agreements. Final reports are due within 90 days after expiration of the award. Policy documents have been updated to reflect the changes.
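The two 90-day rules above amount to simple date arithmetic. The following is a minimal illustrative sketch only; the function names and example dates are hypothetical and are not drawn from NSF's actual Project Reports System.

# Illustrative sketch of the two reporting deadlines described above.
# Names and dates are hypothetical, not part of any NSF system.
from datetime import date, timedelta

def annual_report_due(period_end: date) -> date:
    """Annual reports are due 90 days prior to the report period end date."""
    return period_end - timedelta(days=90)

def final_report_due(award_expiration: date) -> date:
    """Final reports are due within 90 days after the award expires."""
    return award_expiration + timedelta(days=90)

# Example: a reporting period (and award) ending June 30, 2007
print(annual_report_due(date(2007, 6, 30)))  # 2007-04-01
print(final_report_due(date(2007, 6, 30)))   # 2007-09-28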
2006

The program will develop annual performance measures that it can monitor more readily.

Completed This measure refers to collecting data from all NSF-supported centers in time to report results in PARTWeb. NSF has responded by requiring centers to submit data by that reporting deadline. All centers make concerted efforts to include non-academic partner institutions, including industrial partners, in their activities and programs, as appropriate. A true measure of success would report results separately for each type of centers program; the PART measure, however, is an aggregate one.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: External Advisory Committee validation that NSF achieves the Centers Program objective to enable people who work at the forefront of discovery to make important and significant contributions to science and engineering knowledge.


Explanation: Assessment by the Advisory Committee for GPRA Performance Assessment of "significant achievement" against NSF's Ideas strategic outcome goal indicator: enabling people who work at the forefront of discovery to make important and significant contributions to science and engineering knowledge.

Year Target Actual
2002 Success Success
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success Success
2009 Success
2010 Success
2011 Success
2012 Success
Long-term Outcome

Measure: External Advisory Committee validation that NSF achieves the Centers Program objective to encourage collaborative research and education efforts across organizations, disciplines, sectors, and international boundaries.


Explanation: Assessment by the Advisory Committee for GPRA Performance Assessment of "significant achievement" in encouraging collaborative research and education efforts across organizations, disciplines, sectors, and international boundaries.

Year Target Actual
2002 Success Success
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success Success
2009 Success
2010 Success
2011 Success
2012 Success
Long-term Outcome

Measure: External Advisory Committee validation that NSF achieves the Centers Program objective to foster connections between discoveries and their use in the service of society.


Explanation: Assessment by the Advisory Committee for GPRA Performance Assessment of "significant achievement" in fostering connections between discoveries and their use in the service of society.

Year Target Actual
2002 Success Success
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success Success
2009 Success
2010 Success
2011 Success
2012 Success
Long-term Outcome

Measure: External Advisory Committee validation that NSF achieves the Centers Program objective to provide leadership in identifying and developing new research and education opportunities within and across S&E fields.


Explanation: Assessment by the Advisory Committee for GPRA Performance Assessment of "significant achievement" in providing leadership in identifying and developing new research and education opportunities within and across S&E fields.

Year Target Actual
2002 Success Success
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success Success
2009 Success
2010 Success
2011 Success
2012 Success
Annual Output

Measure: Percentage of partner institutions in the Centers Program that are non-academic institutions (including industry, state, local, and other Federal agencies), demonstrating connections between discoveries and their use in the S&E community and in the service of society.


Explanation: The Centers Program contributes significantly to the NSF goal of connecting discovery to learning, innovation, and service to society. The NSF considers partnerships as enabling the movement of ideas, tools, and people across the different sectors of the economy. Partnerships between academic and non-academic institutions increase the likelihood that the movement of ideas, center-developed tools, and center-trained people between the public and private sectors will be established, nurtured, and maintained. Maintaining current high levels of participation by non-academic institutions under the uncertainties of the economy and the unpredictable nature of research will require significant effort; the Centers Program will work to sustain the established level of participation. (The arithmetic behind this measure is sketched after the table below.)

Year Target Actual
2003 >=60% 61%
2004 >=60% 65%
2005 >=60% 68%
2006 >=60% 63%
2007 >=60% 58%
2008 >=60% 59%
2009 >=60%
2010 >=60%
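As a minimal illustration of the arithmetic behind this aggregate measure, the sketch below computes the percentage from a list of partner records. The sector categories and partner records are hypothetical examples, not NSF data.

# Illustrative sketch: percentage of Centers Program partners that are
# non-academic. Categories and records below are hypothetical.
NON_ACADEMIC_SECTORS = {"industry", "state", "local", "federal"}

partners = [
    {"name": "Partner A", "sector": "industry"},
    {"name": "Partner B", "sector": "academic"},
    {"name": "Partner C", "sector": "state"},
]

non_academic = sum(1 for p in partners if p["sector"] in NON_ACADEMIC_SECTORS)
pct_non_academic = 100.0 * non_academic / len(partners)
print(f"{pct_non_academic:.0f}% non-academic partners")  # 67% in this toy example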
Annual Efficiency

Measure: Percentage of decisions on pre-proposals that are merit reviewed and available to Centers Program applicants within five months of pre-proposal receipt or deadline date.


Explanation: Timely availability of decisions on pre-proposals allows the STEM research and educational communities to more effectively develop full proposals and allows for the in-depth merit review of a limited number of full proposals that have a high likelihood of success. The efficiency measure combines two years of data to form a rolling average that takes into account the Centers Program cycle of competitions, which are generally every two or three years. Considering the complexity of large-scale organized cutting-edge research and education efforts and the two-to-three-year cycle of Centers Program competitions, maintaining a five-month decision time for 85 percent of the pre-proposals that are processed helps ensure that only the most meritorious center proposals are selected for funding. (A sketch of the rolling-average calculation follows the table below.)

Year Target Actual
2005 0% 93%
2006 85% 89%
2007 85% 93%
2008 85% 73%
2009 85%
2010 85%
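The Explanation above states that this measure combines two years of data into a rolling average. One plausible reading, sketched below, averages each year's percentage with the prior year's; whether NSF instead pools raw pre-proposal counts across years is not specified here, so treat this purely as an assumption-labeled illustration using the actuals from the table above.

# Illustrative sketch of a two-year rolling average of the timeliness
# percentage. The averaging rule is an assumption; the yearly inputs are
# the actuals reported in the table above.
def two_year_rolling_average(pct_by_year):
    """Average each year's percentage with the prior year's, when available."""
    return {
        year: (pct + pct_by_year.get(year - 1, pct)) / 2
        for year, pct in sorted(pct_by_year.items())
    }

actuals = {2005: 93.0, 2006: 89.0, 2007: 93.0}
print(two_year_rolling_average(actuals))
# {2005: 93.0, 2006: 91.0, 2007: 91.0}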

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the Centers Program is clearly stated. The purpose of the Centers Program is to enable academic institutions along with their non-academic partner institutions to integrate ideas, tools, and people on scales that are extensive enough to significantly impact important S&E fields and cross-disciplinary areas through large-scale organized efforts. The Centers Program consists of programs that exploit opportunities in science, engineering, and technology in which the complexity of the research problems or the resources needed to solve them require the advantages of scope, scale, change, duration, equipment, facilities, and students that can only be provided by an academic research center. The Centers Program includes the Chemical Bonding Centers (CBCs), Engineering Research Centers (ERCs), Materials Research Science and Engineering Centers (MRSECs), Nanoscale Science and Engineering Centers (NSECs), the National Center for Ecological Analysis and Synthesis (NCEAS), Science and Technology Centers (STCs), and Science of Learning Centers (SLCs).

Evidence: Evidence of the purpose of the Centers Program may be found in the following references: The NSF FY 2003-2008 Strategic Plan (pp. 15-18) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201); The National Science Foundation Act of 1950, as amended, under the "Functions of the Foundation" (42 U.S.C. 1862, Sec. 3. (a)(5)) (http://www.washingtonwatchdog.org/documents/usc/ttl42/ch16/sec1862.html); NSF Authorization Act of 2002 (P.L. 107-378, Section 2) (http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=107_cong_bills&docid=f:h4664enr.txt.pdf); "Senior Management Integration Group, June 21, 2005, Principles of National Science Foundation Research Centers" (Internal document). The following program solicitations provide evidence of a clear purpose for the Centers Program: (1) Chemical Bonding Centers, Phases I and II (http://www.nsf.gov/pubs/2004/nsf04612/nsf04612.htm) (http://www.nsf.gov/pubs/2006/nsf06558/nsf06558.htm); (2) Engineering Research Centers http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04570; (3) Materials Research Science and Engineering Centers http://www.nsf.gov/pubs/2004/nsf04580/nsf04580.htm; (4) Nanoscale Science and Engineering Centers http://www.nsf.gov/pubs/2000/nsf00119/nsf00119.pdf; (5) National Center for Ecological Analysis and Synthesis http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=13450&org=OII; (6) Science and Technology Centers http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf03550; (7) Science of Learning Centers http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05509

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The Centers Program responds directly to the national need to advance cutting-edge science and engineering through disciplinary and interdisciplinary research, the integration of research and education, and the transfer of knowledge and/or technology in service of society through large-scale organized efforts. These needs cannot be met fully under traditional individual investigator, small group, or instrumentation awards. The Centers Program makes available the resources to exploit opportunities in science, engineering, and technology in which the complexity of the research problem requires large-scale organized efforts provided by an academic research center and its partner academic and non-academic partner institutions. The Centers Program focuses on investigations at the frontiers of knowledge, at the interfaces of disciplines, or with fresh approaches to the core of disciplines. The Centers Program has the following long-term goals (1) enable people who work at the forefront of discovery to make important and significant contributions to S&E knowledge; (2) encourage collaborative research and education efforts across organizations, disciplines, sectors and international boundaries; (3) foster connections between discoveries and their use in the service of society; and (4) provide leadership in identifying and developing new research and education opportunities within and across S&E fields.

Evidence: The Centers Program addresses specific problems, interests, or national needs that require large-scale organized efforts and that cannot be met fully under traditional individual investigator, small group, or instrumentation awards. For example, the Engineering Research Centers focus on the need for long-term collaborations between universities and industry to address next-generation and transformative research topics of interest to industry; these collaborations bring together faculty, students, and industry researchers in real-world technology and product development and produce engineering graduates prepared to succeed in industry. The Nanoscale Science and Engineering Centers are created through partnerships among universities, industry, government laboratories, and other sectors to address complex, interdisciplinary challenges in nanoscale science, engineering, and technology. The National Center for Ecological Analysis and Synthesis provides opportunities for scientists from different disciplines to collaborate on such issues as the massive and accelerating loss of biotic diversity, global change, habitat decline and fragmentation, over-exploitation of natural resources, and the pollution of air, water, and soil. The issues addressed by the Centers Program are well documented. Evidence can be found here: The NSF FY 2003-2008 Strategic Plan (pp. 15-18) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201); The National Science Foundation Act of 1950, as amended, under the "Functions of the Foundation" (42 U.S.C. 1862, Sec. 3. (a)(5)) (http://www.washingtonwatchdog.org/documents/usc/ttl42/ch16/sec1862.html); NSF Authorization Act of 2002 (P.L. 107-378, Section 2) (http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=107_cong_bills&docid=f:h4664enr.txt.pdf). Additional information is available on center websites: Chemical Bonding Centers, Phase I: http://www.nsf.gov/news/news_summ.jsp?cntn_id=104340&org=CHE&from=news; Engineering Research Centers: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5502&org=EEC; Materials Research Science and Engineering Centers: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5295&org=DMR&from=home and http://www.mrsec.org/home/; Nanoscale Science and Engineering Centers: http://www.nano.gov/html/centers/nnicenters.html; National Center for Ecological Analysis and Synthesis: http://www.nceas.ucsb.edu; Science and Technology Centers: http://www.nsf.gov/od/oia/programs/stc/; Science of Learning Centers: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5567&from=fund

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The Centers Program is designed not to be redundant of any other Federal, state, local, or private effort. The NSF uses several mechanisms to avoid duplication of efforts and spending. In order to ensure interagency collaboration, and guided by OMB/OSTP Research and Development Budget Priorities, the NSF has substantial representation on the National Science and Technology Council interagency working groups and subcommittees. For example, NSF leads interagency efforts such as the National Nanotechnology Initiative (NNI), thus ensuring interagency coordination for the Nanoscale Science and Engineering Centers. In addition, when appropriate, NSF participates in disciplinary and interdisciplinary workshops during which experts discuss and address new research areas, such as the establishment of the Chemical Bonding Centers. NSF Program Officers, mostly from the Chemistry Division, whose portfolios support research in the chemical sciences, attend the annual meeting of the Federal Interagency Chemistry Representatives (FICR). Program managers from 10-15 funding agencies are typically in attendance to hear formal presentations on major science and education themes in their respective portfolios and to engage in informal discussions. The National Center for Ecological Analysis and Synthesis is the only institution in the world solely dedicated to the application of analysis, synthesis, and collaboration using existing data to address important questions in ecology and its allied disciplines. Furthermore, NSF Senior Management and the National Science Board review the Centers Program portfolio and the NSF portfolio for balance among individual investigators, collaborative groups, facilities, and centers. In contrast to "mission agencies" (e.g., NIH-biomedical, NASA-space, DOD-defense), NSF is the only Federal agency charged with promoting the progress of science and engineering research and education in all S&E fields and disciplines. NSF's unique relationship with the scientific and engineering communities and its competitive grant mechanisms strongly position the agency to carry out center programs specifically designed to advance large-scale organized cutting-edge research and enhance the nation's S&E education capability. Individual centers address unique aspects of S&E and do not duplicate the mission or purposes of the other centers in the Program.

Evidence: NSF has specific, statutory authority to evaluate the status and needs of the various sciences and engineering and to consider the results of this evaluation in correlating its research and educational programs with other Federal and non-Federal programs. See the National Science Foundation Act of 1950, as amended, under the "Functions of the Foundation" (42 U.S.C. 1862, Sec. 3. (a)(5)) (http://www.washingtonwatchdog.org/documents/usc/ttl42/ch16/sec1862.html). Evidence of coordination within the scientific and engineering community and non-duplication of efforts may be found in the multi-agency National Nanotechnology Initiative (NNI), in which NSF supports fundamental research through Nanoscale Science and Engineering (NSE) centers as well as single investigator projects. The National Nanotechnology Initiative Strategic Plan includes a schedule for the creation of NSE centers across five Federal agencies and draws on the work of 17 topical workshops that focused on areas of nanotechnology applications, societal implications, and regional, state, and local initiatives, and on the recommendations of the Research Directions II Workshop that identified cross-cutting themes and priority research opportunities. Five Federal agencies participate in NNI, coordinating their efforts through a Nanoscale Science, Engineering, and Technology Subcommittee of the Committee on Technology, National Science and Technology Council. For more information on the NSEC program and other NSE research, see the following websites: http://www.nano.gov; www.nseresearch.org; and www.nsf.gov/nano/. Each of the center programs has its own programmatic goals and disciplinary and/or interdisciplinary foci. For example, the Science of Learning Centers support efforts to expand the frontiers of knowledge on learning of all types. The Chemical Bonding Centers address major long-term chemical research problems, while the Engineering Research Centers tackle problems whose solutions could advance fundamental knowledge and engineered systems. Evidence of non-duplication within the Centers Program can be found in the following documents that describe each program's foci and goals: (1) Chemical Bonding Centers, Phases I and II http://www.nsf.gov/pubs/2004/nsf04612/nsf04612.htm http://www.nsf.gov/pubs/2006/nsf06558/nsf06558.htm; (2) Engineering Research Centers http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04570; (3) Materials Research Science and Engineering Centers http://www.nsf.gov/pubs/2004/nsf04580/nsf04580.htm; (4) Nanoscale Science and Engineering Centers http://www.nsf.gov/pubs/2000/nsf00119/nsf00119.pdf; (5) National Center for Ecological Analysis and Synthesis http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=13450&org=OII; (6) Science and Technology Centers http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf03550; (7) Science of Learning Centers http://www.nsf.gov/pubs/2005/nsf05509/nsf05509.htm

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The Centers Program is free from major design flaws that would limit the program's effectiveness or efficiency in meeting its defined objectives and performance goals. The Centers Program implements extensive review and oversight to respond to potential problems and is designed to anticipate problems that could limit the program's effectiveness or efficiency. For example, the Centers Program carries out the following practices for extensive review: (1) a merit review process that has been recognized as a best practice for administering R&D programs; (2) the professional judgment of trained NSF Program Officers knowledgeable in their respective fields; (3) triennial external Committee of Visitors (COV) reviews to ensure effectiveness and efficiency of program operations and management; (4) review by the National Science Board; and (5) commissioned studies conducted by the National Academies and other groups to ensure center programs address the needs of the S&E community. For oversight, NSF program officers use annual progress reports with measures of progress and achievement and annual or biennial site visits to (1) monitor center maturation and accomplishments and (2) recommend annual continuations, renewals, or discontinuations to NSF senior management. In 2005, the National Science Board affirmed the design of the Centers Program. It concluded that the Centers Program reflected a balanced portfolio and should continue the periodic review by the NSB, the re-competition of centers, and the centers' management practices. Moreover, the Centers Program makes improvements based on recommendations received from its independent reviewers and results from third party studies and advisory committees. This follows the guidance provided in the R&D criteria, as outlined in the Office of Management and Budget/Office of Science and Technology Policy Guidance Memo.

Evidence: Evidence of the effectiveness of the Centers Program's design may be found in the following reports: The 2005 Advisory Committee for GPRA Performance Assessment (AC/GPA) report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210). The FY 2005 Report of the National Science Board (NSB) on the NSF Merit Review Process (NSB-06-21) (http://www.nsf.gov/nsb/documents/2006/0306/merit_review.pdf). July 2005 OMB/OSTP Guidance Memo (http://www.ostp.gov/html/budget/2007/ostp_omb_guidancememo_FY07.pdf). Board Guidance for National Science Foundation Centers Programs, NSB-05-162, Attachment 4 (pp. 13-15) (http://www.nsf.gov/nsb/meetings/2005/1130/maj_actions.pdf); also found in the Approved Minutes, Open Session, 389th Meeting, NSB-05-166, Appendix C (pp. 18-20) (http://www.nsf.gov/nsb/meetings/2005/1130/open_minutes.pdf), and in annual reports and annual or biennial site visit reports (Internal documents). Additional evidence of the effectiveness of the Centers Program design can be found in the following reports by third parties: (1) The Impact on Industry of Interaction with Engineering Research Centers 1997 http://www.sri.com/policy/csted/reports/sandt/erc/contents.html; (2) The Impact on Industry of Interaction with Engineering Research Centers Repeat Study 2004 http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf; (3) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf; (4) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing) http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0446470; (5) An Assessment of the National Science Foundation's Science and Technology Centers Program, Committee on Science, Engineering, and Public Policy (National Academy of Sciences, National Academy of Engineering, Institute of Medicine, 1996) http://darwin.nap.edu/books/0309053240/html/; (6) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996); (7) National Science Foundation's Science and Technology Centers: Building an Interdisciplinary Research Paradigm, A Study by a Panel of the National Academy of Public Administration (National Academy of Public Administration, 1995); (8) A Multi-Method Analysis of the Social and Technical Conditions for Interdisciplinary Collaboration, D. Rhoten, Final Report for NSF - BCS-0129573 (2003) (http://www.nceas.ucsb.edu/fmt/doc?/nceas-web/center/renewal2005, item #10); (9) Report of a Workshop on New Mechanisms for Support of High-Risk and Unconventional Research in Chemistry (http://www.mrl.uiuc.edu/docs/nsfgmwfinal.pdf)

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The design of the Centers Program is effectively targeted such that resources directly support the program's purpose and reach intended beneficiaries. The Centers Program effectively targets its resources to achieve its purposes by relying upon several mechanisms. First, each program solicitation contains a clear statement of purpose in the context of the particular activity and the audience to which it is directed. Second, NSF utilizes multiple strategies to effectively target the S&E community and the public through (1) workshops conducted by specific disciplinary communities, (2) outreach visits by NSF staff, (3) websites of center programs and centers, (4) targeted outreach through MyNSF, an electronic communications system that alerts self-identified individuals to specific opportunities, (5) NSF professional staff participation in professional conferences, and (6) opportunities for members of the S&E community and the public to access and use center resources. For example, the ISI (Institute for Scientific Information) Essential Science Citation Index reported that the National Center for Ecological Analysis and Synthesis was in the top 1 percent of more than 38,000 institutions in terms of total citations in the field of Environment/Ecology.

Evidence: The effectiveness of the Centers Program's design to target its resources to address program purposes and beneficiaries is demonstrated through the following examples. To ensure open access and to inform individuals interested in proposing ideas of opportunities to do so, all Center Program solicitations contain clear statements of the corresponding activity's purpose and context and are available online (http://www.nsf.gov/funding/). Targeted outreach is accomplished through several means including MyNSF, an electronic communications system that alerts self-identified individuals to specific opportunities. NSF staff members also conduct outreach activities at numerous regional and national professional conferences and during visits to academic institutions located throughout the Nation. Several dozen NSF staff members also participate in outreach at the NSF's biennial Regional Grants Conference. (http://www.nsf.gov/bfa/dias/policy/outreach.jsp). The FY 2005 Report of the NSB on the NSF Merit Review Process (NSB 06-21) indicates that NSF's extensive external merit review process ensures that funding is awarded to those proposals that are of high scientific and technical merit, strongly contribute to broader impact, and best address the goals and objectives of the Centers Program (http://www.nsf.gov/nsb/documents/2006/0306/merit_review.pdf). The National Center for Ecological Analysis and Synthesis is an example of how the Centers Program reaches its intended beneficiaries. The ISI Essential Science Citation Index reported that the National Center for Ecological Analysis and Synthesis was in the top 1% of more than 38,000 institutions in terms of total citations in the field of Environment/Ecology. Presentations on the characteristics of NSF Centers are targeted to audiences who have interests in submitting proposals to a center program, such as the presentation found in the following document. (http://www.nsf.gov/od/oia/presentations/bu/nsfcenteroverview.ppt). Additional information is available for intended beneficiaries (S&E communities, educators, the public, policy-makers) on center websites. (1) Chemical Bonding Centers, Phase I: http://www.nsf.gov/news/news_summ.jsp?cntn_id=104340&org=CHE&from=news (2) Engineering Research Centers: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5502&org=EEC (3) Materials Research Science and Engineering Centers: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5295&org=DMR&from=home and http://www.mrsec.org/home/ (4) Nanoscale Science and Engineering Centers: http://www.nano.gov/html/centers/nnicenters.html (5) National Center for Ecological Analysis and Synthesis: http://www.nceas.ucsb.edu (6) Science and Technology Centers: http://www.nsf.gov/od/oia/programs/stc/ (7) Science of Learning Centers: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5567&from=fund

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Centers Program has four specific long-term performance measures that focus on outcomes, reflect the program purpose, and are listed in the Measures Tab. These measures are drawn from the specific Centers Program objectives set forth under the Ideas strategic outcome goal in the NSF Strategic Plan FY 2003-2008. They are: (1) enable people who work at the forefront of discovery to make important and significant contributions to S&E knowledge; (2) encourage collaborative research and education efforts across organizations, disciplines, sectors and international boundaries; (3) foster connections between discoveries and their use in the service of society; and (4) provide leadership in identifying and developing new research and education opportunities within and across S&E fields. Unlike other types of research, fundamental research does not take place in a predictable fashion, nor does it seek to achieve specific outcomes identified at the outset of the research. Complicating the situation is the fact that research does not proceed in a linear fashion from fundamental research to applied research to emerging technology research to technology development to product development. In many of NSF's centers, fundamental research is conducted in tandem with applied research and emerging technology research, each feeding the others. As a result, assessing center research is fraught with methodological challenges. NSF relies on expert review in the form of the Committee of Visitors process, directorate-level Advisory Committees, the AC/GPA, and third-party evaluations of specific types of centers that address specific aspects of center outcomes and impacts.

Evidence: Long-term performance measures may be found in the Measures Tab. Additional information may be found in the following: The NSF Strategic Plan FY 2003-2008: NSF goals, pp. 9-11; Centers Program's role in the Ideas goal, pp. 15-18; Appendix A describing NSF's performance assessment mechanisms, pp. 27-29 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201). The FY 2005 Performance and Accountability Report, Ideas performance measures (listed as Indicators, a more accurate term for them than Measures), pp. II 19-25 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). The following report describes the goals of the Chemical Bonding Centers in terms of expected contributions to science: Report of a Workshop on New Mechanisms for Support of High-Risk and Unconventional Research in Chemistry (http://www.mrl.uiuc.edu/docs/nsfgmwfinal.pdf). Several center programs have commissioned third party evaluations or targeted studies to assess outcomes and impact. The following documents provide descriptions of long-term performance measures that focus on outcomes and are aligned with the purposes of the center programs. (1) The Impact on Industry of Interaction with Engineering Research Centers 1997 http://www.sri.com/policy/csted/reports/sandt/erc/contents.html; (2) The Impact on Industry of Interaction with Engineering Research Centers Repeat Study 2004 http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf; (3) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf; (4) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing) http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0446470; (5) An Assessment of the National Science Foundation's Science and Technology Centers Program, Committee on Science, Engineering, and Public Policy (National Academy of Sciences, National Academy of Engineering, Institute of Medicine, 1996) http://darwin.nap.edu/books/0309053240/html/; (6) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996); (7) A Multi-Method Analysis of the Social and Technical Conditions for Interdisciplinary Collaboration, D. Rhoten, Final Report for NSF - BCS-0129573 (2003) (http://www.nceas.ucsb.edu/fmt/doc?/nceas-web/center/renewal2005, item #10).

YES 10%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The Centers Program has ambitious targets ("significant achievement") and timeframes for its long-term measures as described in the NSF Strategic Ideas Goal and as found in the Measures Tab. The Centers Program targets and timeframes are ambitious by nature, focusing on investigations at the frontiers of knowledge, at interfaces of disciplines and/or incorporating fresh approaches to the core of disciplines through large-scale organized efforts. NSF has demonstrated "significant achievement" in this area as assessed by the external Advisory Committee for GPRA Performance Assessment (AC/GPA). In addition to the AC/GPA, external advisory committees regularly assess the goals and timeframes for the Centers Program to ensure that they are appropriately ambitious and that the process to assess the targets and timeframes promotes continuous improvement consistent with NSF goals and objectives. The COV process is the primary mechanism for external evaluation at the division or program level. Several center programs commission third party evaluations and targeted studies to ensure that targets and timeframes are ambitious and inform continuous improvement consistent with center programs and NSF goals and objectives.

Evidence: Ambitious targets and timeframes for long-term measures may be found in the Measures Tab. Other evidence is found in the reports that may be accessed at the following web sites: (1) Performance Assessment Information (http://www.nsf.gov/about/performance/). (2) Annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). (3) FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). (4) Information concerning individual center programs may be found in their COV and AC reports (http://www.nsf.gov/od/oia/activities/cov/covs.jsp) and in their solicitations (http://www.nsf.gov/publications/index.jsp?org=NSF&archived=false&hideSelects=false&pub_type=Program&nsf_org=NSF&x=9&y=10). Additional evidence can be found in the following documents produced by third party evaluations and targeted studies: (1) The Impact on Industry of Interaction with Engineering Research Centers 1997 http://www.sri.com/policy/csted/reports/sandt/erc/contents.html; (2) The Impact on Industry of Interaction with Engineering Research Centers Repeat Study 2004 http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf; (3) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf; (4) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing) http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0446470; (5) An Assessment of the National Science Foundation's Science and Technology Centers Program, Committee on Science, Engineering, and Public Policy (National Academy of Sciences, National Academy of Engineering, Institute of Medicine, 1996) http://darwin.nap.edu/books/0309053240/html/; (6) National Science Foundation's Science and Technology Centers: Building an Interdisciplinary Research Paradigm, A Study by a Panel of the National Academy of Public Administration (National Academy of Public Administration, 1995); (7) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996); (8) A Multi-Method Analysis of the Social and Technical Conditions for Interdisciplinary Collaboration, D. Rhoten, Final Report for NSF - BCS-0129573 (2003) (http://www.nceas.ucsb.edu/fmt/doc?/nceas-web/center/renewal2005, item #10).

YES 10%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The Centers Program has two specific annual performance measures, shown in the Measures Tab, which can demonstrate progress toward achieving the program's long-term goals and the agency's strategic goals. These are: (1) the percentage of decisions on pre-proposals that are merit reviewed and available to Centers Program applicants within five months of pre-proposal receipt or deadline date, and (2) the percentage of partner institutions in the Centers Program that are non-academic institutions, demonstrating connections between discoveries and their use in the S&E community and in the service of society. These annual measures help ensure effective and efficient operation of the Centers Program in meeting its long-term goals. Achievement of the annual measures indicates the selection and utilization of opportunities for cutting-edge research and discovery, the integration of research and education, and the exchange of knowledge and/or technology across different sectors of the economy for the benefit of society.

Evidence: Specific annual performance measures demonstrating progress toward achieving long-term goals may be found in the Measures Tab.

YES 10%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The Centers Program has baselines and ambitious targets for its annual measures. Baselines and targets are set with the prospect of significant achievement in discovery across the frontiers of S&E, connected to learning, innovation, and service to society. Baselines for the annual performance measures are obtained from internal NSF sources and from Centers Program annual reporting data. Ambitious targets for the annual measures are shown in the Measures Tab. In all instances, the Centers Program has met NSF performance standards.

Evidence: Performance measures can be found in the Measures Tab. Additional information regarding baselines and targets may be accessed at Performance Assessment Information (http://www.nsf.gov/about/performance/). Annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp) and the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par) provide specific information. The COV reports provide information about the performance of individual programs (http://www.nsf.gov/od/oia/activities/cov/covs.jsp).

YES 10%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: All participants in the Centers Program commit to and work toward achieving the program's long-term and annual goals. The Centers Program obtains this level of commitment by assuring that all program descriptions, announcements, solicitations, and cooperative agreements are consistent with these goals. The Centers Program employs external (peer) merit review to select proposals that demonstrate commitment to the center programs' goals, requires awardees to submit template-based annual and final progress reports that are subject to NSF program officer approval as a prerequisite for continuation and/or renewal support, and requires the use of Cooperative Agreement (CA) Terms and Conditions. Cooperative Agreement Terms and Conditions specify the roles and responsibilities of the awardee institution, its partners, and NSF. Cooperative Agreements state that the continuation of funding is contingent on annual performance of the commitments described in the CAs. Continuing annual support for centers is based upon annual progress described in annual reports submitted by grantees, which are subject to review and approval by NSF Program Officers. NOTE: Some center programs require the use of the standard NSF annual reporting template, while other centers (e.g., ERCs, NCEAS, NSECs, SLCs, STCs) have reporting requirements that are tailored for site visits and Program Officers' post-award management responsibilities. In many cases (e.g., ERCs, MRSECs, NCEAS, SLCs, STCs) annual performance also is assessed through annual or biennial site visits conducted by external panels of experts. For many of the centers, strategic plans are required that describe activities, responsibilities, outputs, and outcomes and form the basis of the assessment of annual performance (e.g., ERCs, NSECs, SLCs, and STCs). Moreover, final project reports must be submitted after the awards end. To receive subsequent awards, all applicants are required to include in their new proposals descriptions of the results of previous NSF support, which are considered in the merit review process. No subsequent awards can be made to applicants unless a Program Officer has approved final project reports for all previous awards.

Evidence: Evidence that all partners commit to and work toward program goals includes annual and final reports, site visits (if required), and center-specific Cooperative Agreement Terms and Conditions; the general form can be accessed at the following: (http://www.nsf.gov/pubs/gc1/ca102.pdf). Cooperative Agreement Terms and Conditions (1) are tailored to the type of center; (2) specify expectations (annual and long-term goals with reference to the roles and responsibilities of the grantee institutions, their partners, and NSF); (3) state that the continuation of funding is based on annual performance; and (4) are revised annually based on evidence that the centers are/are not meeting annual goals. Specific center program Cooperative Agreement Terms and Conditions can be found in the following: (1) Materials Research Science & Engineering Centers: http://www.nsf.gov/pubs/policydocs/nsf04580.pdf; (2) Nanoscale Science & Engineering Centers: http://www.nsf.gov/pubs/policydocs/nsf05543.pdf; (3) Science of Learning Centers: http://www.nsf.gov/pubs/policydocs/nsf03573.pdf; (4) Science & Technology Centers: http://www.nsf.gov/pubs/policydocs/nsf03550.pdf. NOTE: The Engineering Research Centers (ERCs) Cooperative Agreement template is not a web document, but is available if requested. The Chemical Bonding Centers (CBC) Cooperative Agreement Terms and Conditions are forthcoming for Phase II. The National Center for Ecological Analysis and Synthesis Cooperative Agreement is a Word document and is available, if requested. Some centers (ERCs, NSECs, SLCs, STCs) work from strategic plans that include the roles and responsibilities of partner institutions. Annual reports present annual achievements in the context of their strategic plans. A sample schematic for an ERC's strategic plan can be found at: http://chaffee.qrc.com/nsf/eng/ercweb/help/rpt_guide/ERC_Strategic_Concept_Figure_Generic_2003.pdf. ERCs have separate objectives for the fundamental research plane, the enabling technology plane, and the engineered systems plane. These objectives are translated into a milestone chart showing actual and expected achievements. Strategic plans may evolve over the award period based on achievements, technical constraints, priority modifications suggested by reviewers or NSF, etc. Science and Technology Centers (STCs) have their strategic plans on their websites. The STC Class of 2005/2006 is required to participate in strategic planning training sessions and develop draft strategic plans that are finalized and approved. Science of Learning Centers (SLCs) are required to develop strategic plans assisted by an NSF program officer and a management consultant associated with the SLCs. Evidence that all partners must commit to and work toward annual and long-term goals of the program also can be found in the Centers solicitations. They are found on the NSF website (http://www.nsf.gov/funding/) and all reference NSF's two merit review criteria (http://www.nsf.gov/pubs/1999/nsf99172/nsf99172.htm), which align with the goals of the Centers Program. Other evidence may be found in the FY 2005 Report to the NSB on the NSF Merit Review Process (NSB 05-119) (http://www.nsf.gov/nsb/documents/2005/0930/merit_review.pdf).

YES 10%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: External committees conduct regularly scheduled, high quality independent evaluations of Centers Program activities for the purposes of program improvement and the assessment of program effectiveness and its relevance to a problem, interest, or need (e.g., COV, AC/GPA, Directorate AC, annual and biennial site visits, PI meetings, and third party evaluations). Committee of Visitors (COV) reviews provide NSF with external expert judgments in two areas: (1) assessments of the quality and integrity of program operations and program-level technical and managerial matters pertaining to proposal decisions; and (2) comments on how the results generated by awardees have contributed to the attainment of NSF's mission and strategic outcome goals. COV reviews are conducted every three years. Advisory Committees, which meet several times per year, review Directorate performance and COV reports. The AC/GPA assesses performance on an NSF-wide basis for the Strategic Outcome Goals on an annual basis. In addition, several center programs schedule annual or biennial site visits conducted by external panels of experts during the life of the centers (e.g., ERCs, MRSECs, NCEAS, NSECs, SLCs, and STCs). At the beginning of the renewal process, intensive external reviews address the broader issues of resources, priority setting, and program effectiveness in meeting the goals of the center programs, the Centers Program, and the priorities of the Foundation. Several center programs sponsor workshops and PI meetings that assist in identifying emerging areas and knowledge gaps; encourage communication among centers; provide opportunities for new centers' staff to learn from staff at more mature centers; communicate results of independent evaluations; and inform center staff about changes based on study findings. By nature of the level and duration of support, center programs commission third party evaluations (e.g., studies of the STCs and MRSECs by the National Academies, program evaluations of the ERCs and STCs by independent consulting firms) or targeted studies to assess accomplishments and/or impact (e.g., NCEAS). These studies inform NSF Program Officers, senior management, and center staff of programmatic strengths, weaknesses, opportunities, and problems. NSF's approach to evaluation was highlighted by GAO as an "evaluation culture--a commitment to self-examination, data quality, analytic expertise, and collaborative partnerships." In addition, the NSF Office of Inspector General reported that NSF makes good use of the COV reports in determining its performance to meet strategic goals under the Government Performance and Results Act.

Evidence: GAO highlighted NSF's approach to evaluation as an "evaluation culture--a commitment to self-examination, data quality, analytic expertise, and collaborative partnerships." The NSF Office of Inspector General reported that NSF makes good use of COV reports in determining its performance to meet strategic goals under GPRA. Some independent evaluations are: (1) Program Evaluation: An Evaluation Culture and Collaborative Partnerships Help Build Agency Capacity (GAO-03-454, May 2, 2003) (http://www.gao.gov/new.items/d03454.pdf). (2) Audit of NSF's Committees of Visitors, NSF Office of Inspector General, OIG-03-2-013, September 25, 2003, pp. 7-8. (3) Annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). (4) COV reports and NSF responses (http://www.nsf.gov/od/oia/activities/cov/). (5) Annual or biennial site visit reports by disciplinary experts (Internal documents), along with National Science Board Guidance on NSF's portfolio balance of centers, principal investigators, and facilities, NSB-05-162, Attachment 4 (pp. 13-15) (http://www.nsf.gov/nsb/meetings/2005/1130/maj_actions.pdf) and the Approved Minutes of the 389th NSB Meeting (http://www.nsf.gov/nsb/meetings/2005/1130/open_minutes.pdf). (6) The Impact on Industry of Interaction with Engineering Research Centers 1997 (http://www.sri.com/policy/csted/reports/sandt/erc/contents.html); (7) The Impact on Industry of Interaction with Engineering Research Centers Repeat Study 2004 (http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf); (8) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities (http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf); (9) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing) (http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0446470); (10) An Assessment of the National Science Foundation's Science and Technology Centers Program, Committee on Science, Engineering, and Public Policy (National Academy of Sciences, National Academy of Engineering, Institute of Medicine, 1996) (http://darwin.nap.edu/books/0309053240/html/); (11) National Science Foundation's Science and Technology Centers: Building an Interdisciplinary Research Paradigm, A Study by a Panel of the National Academy of Public Administration (National Academy of Public Administration, 1995); (12) A Multi-Method Analysis of the Social and Technical Conditions for Interdisciplinary Collaboration, D. Rhoten, Final Report for NSF - BCS-0129573 (2003) (http://www.nceas.ucsb.edu/fmt/doc?/nceas-web/center/renewal2005, item #10); (13) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996).

YES 10%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The NSF's annual budget request is explicitly tied to accomplishment of its annual and long-term performance goals, and its resource needs are presented in a complete and transparent manner. The Foundation's performance structure provides the underlying framework for its request, with each major NSF organization (i.e., each Directorate and Office) tying its budget directly to this performance framework. This is further documented in the performance summary included in each organization's chapter of NSF's Budget Request to Congress, which ties its budget directly to the PART activities. To present resource needs clearly and transparently, the NSF budget displays resource requests by structural component and by performance goal. This presentation is based on consultations over the past year with key Congressional and OMB staff, and it also incorporates recommendations from the 2004 report on NSF by the National Academy of Public Administration. The purpose of this budget presentation is to highlight the matrix structure that NSF employs, with the major organizational units each contributing to the goals and investment categories established in the NSF Strategic Plan. This revised presentation contains additional information on the portfolio of investments maintained across NSF, including the Centers Program.

Evidence: Evidence showing that budget requests are tied to accomplishment of the annual and long-term goals of the Centers Program includes the following items. (1) The Executive Branch Management Scorecard (http://www.whitehouse.gov/results/agenda/scorecard.html). (2) The NSF FY 2007 Budget Request to Congress; see the performance section in each directorate's summary, e.g., pp. 45-48 for the Directorate for Biological Sciences and pp. 84-88 for the Directorate for Engineering (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=budreqfy2007). (3) The National Science Foundation Governance and Management for the Future (National Academy of Public Administration, Order Number 04-07) (http://www.napawash.org/resources/news/news_4_28_04.html) and (http://71.4.192.38/NAPA/NAPAPubs.nsf/17bc036fe939efd683251004e37f4/23f8c16a35c73b6485004b4a4f?). (4) Specific Centers Program objectives and their relationship to NSF strategic goals are found in NSF's strategic plan (pp. 15-18, National Science Foundation Strategic Plan FY 2003-2008, http://www.nsf.gov/pubs/2004/nsf04201/FY2003-2008.pdf).

YES 10%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: Strategic planning deficiencies of the type and scope that would jeopardize the success of the Centers Program have not been identified. To ensure that such deficiencies do not arise, the Centers Program, like all programs across the NSF, utilizes the COV, AC, and AC/GPA processes, which provide valuable constructive feedback on an ongoing basis concerning areas in which NSF strategic planning can be strengthened. In response, the Foundation takes steps to address those issues. Each division or office prepares an annual update describing key actions that have been taken to implement the recommendations of the previous COV report. In addition, for the Centers Program, site visit and reverse site visit reports identify emerging problems. In response to the recommendations made by site reviewers, the cognizant program officers and the Foundation, along with center directors, take steps to address the issues.

Evidence: Evidence demonstrating Centers Program strategic planning may be found in the following documents: COV reports and NSF responses for the Centers Program components (http://www.nsf.gov/od/oia/activities/cov/) and the Annual AC/GPA report (http://www.nsf.gov/about/performance/acgpa/index.jsp). A description of the relationship of specific Centers Program objectives to NSF strategic goals is found in NSF's strategic plan (pp. 15-18, National Science Foundation Strategic Plan FY 2003-2008, http://www.nsf.gov/pubs/2004/nsf04201/FY2003-2008.pdf). Examples of meaningful steps taken by the Centers Program to improve strategic planning within individual center programs include the development of materials and/or activities for NSF program officers and center personnel to develop/improve knowledge and skills in the design, assessment, and implementation of strategic plans. The following references provide examples of (1) materials and activities building the capacity of program officers and center personnel to develop/improve and implement strategic planning and (2) strategic plans guiding center performance. (1) Each new ERC must prepare a draft strategic plan during the first months of the award for NSF review, followed by a half-day meeting with the center leadership and ERC program officers focused on improving the document and plans for implementation. At the center of each plan is a three-plane chart that lays out the relationships between the fundamental research objectives, the emerging technology objectives, and the systems-level goals. ERCs have separate objectives for the fundamental research plane, the enabling technology plane, and the engineered systems plane. These objectives are translated into a milestone chart showing actual and expected achievements. Strategic plans may evolve over the award period based on achievements, technical constraints, priority modifications suggested by reviewers or NSF, etc. A sample schematic for an ERC's strategic plan three-plane chart can be found at: http://chaffee.qrc.com/nsf/eng/ercweb/help/rpt_guide/ERC_Strategic_Concept_Figure_Generic_2003.pdf. (2) Beginning with the Science and Technology Centers Class of 2005/06, all new centers must prepare a strategic plan, which the cognizant NSF program officers review and ultimately finalize. To assist new centers and their NSF program officers in understanding the strategic planning process and how to prepare a strategic plan, the leaders of new centers and the NSF program officers are required to participate in strategic planning training workshops. Once approved, a strategic plan is placed on the center's website (http://www.nsf.gov/od/oia/programs/stc/). The experience of the ERC and STC program officers is that working with new center staff to develop and review each new center's draft strategic plan results in much improved plans that guide center progress.

YES 10%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?

Explanation: The Centers Program uses regularly scheduled independent evaluations, such as COVs and Directorate ACs, and commissions studies conducted by other independent groups (e.g., the National Academies) to assess and compare the potential benefits of the Centers Program to those of programs with similar goals. NSF's investments in the Centers Program address unique national STEM research and education needs that are not under the purview of the more mission-specific federal, state, or local agencies and that cannot be addressed through the individual investigator mode in the disciplines supported by the NSF. Investing in the center mode of support ensures that research requiring large-scale organized efforts, which is impossible or infeasible under traditional modes of support, is conducted by providing resources in a planned, organized, and focused way (e.g., research on large systems, research centered on major experimental capability, or research requiring extensive coordination). By virtue of the levels of support and duration, NSF senior management reviews and compares the Centers Program to ensure a balanced portfolio and the vitality of the Nation's S&E enterprise. The National Science Board has been clear that the research conducted by NSF's centers must be research that is not possible through individual investigator or group awards. The Office of Science and Technology Policy, the National Science and Technology Council, the National Science Board, OMB, the Congress, and other policy-making bodies regularly review NSF's investments in centers in the context of the overall Federal investment in science and engineering. In areas where research content may overlap, NSF coordinates its activities with those of mission agencies in order not to duplicate efforts and to ensure that each agency supports those efforts most appropriate to its charter.

Evidence: Examples of independent evaluations of sufficient scope and quality are included in COV reports and NSF responses (http://www.nsf.gov/about/performance/advisory.jsp) and in annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). The Centers Program assesses and compares the potential benefits of individual center programs through the evaluation process. For example, NSF leads interagency working groups for the National Nanotechnology Initiative (NNI). An important component of NNI is the establishment and support of Nanoscale Science and Engineering Centers (NSECs), which are designed to complement and not duplicate other efforts currently supported by NSF or other public or private means. The Engineering Advisory Committee also considers Engineering Research Centers as an integral part of its overall review of Directorate efforts. Additional evidence is found in annual reports and annual or biannual site visit reports (internal documents). Documentation of third-party evaluation of the unique contributions of center programs can be found in the following documents: (1) The Impact on Industry of Interaction with Engineering Research Centers 1997 (http://www.sri.com/policy/csted/reports/sandt/erc/contents.html); (2) The Impact on Industry of Interaction with Engineering Research Centers Repeat Study 2004 (http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf); (3) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities (http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf); (4) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing) (http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0446470); (5) An Assessment of the National Science Foundation's Science and Technology Centers Program, Committee on Science, Engineering, and Public Policy (National Academy of Sciences, National Academy of Engineering, Institute of Medicine, 1996) (http://darwin.nap.edu/books/0309053240/html/); (6) National Science Foundation's Science and Technology Centers: Building an Interdisciplinary Research Paradigm, A Study by a Panel of the National Academy of Public Administration (National Academy of Public Administration, 1995); (7) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996); (8) Board Guidance for National Science Foundation Centers Programs and NSF Report to the NSB, NSB-05-162, Attachment 4 (pp. 13-15) (http://www.nsf.gov/nsb/meetings/2005/1130/maj_actions.pdf); (9) Approved Minutes, Open Session, 389th Meeting, NSB-05-166, Appendix C (pp. 18-20) (http://www.nsf.gov/nsb/meetings/2005/1130/open_minutes.pdf); (10) NSF-88-35, "Report of the National Science Board Committee on Centers and Individual Investigator Awards," p. I-vii.

YES 10%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: A prioritization process is used to formulate the specific budget requests and guide funding decisions for the Centers Program. This process develops both NSF's overall highest priorities and individual programmatic priorities. In developing priorities for individual STEM activities, information on the following factors is obtained: (1) NSF's highest funding priorities listed in the FY 2007 Budget Request, especially strengthening the core and addressing major national challenges identified by the Administration; (2) needs and opportunities identified by COV and AC review; (3) new frontiers and topics of major impact that are identified by the scientific community, e.g., through workshops; and (4) important emerging areas for which large numbers of highly ranked proposals are received. Senior management integrates that information, prioritizes budget requests within and among programs, and determines funding levels for Centers Program activities. Acknowledging that funding modes cover a continuous spectrum from individual investigators through groups to centers, the National Science Board concluded that numerical targets should not be specified for funding any of these modes, but that the balance among them should be determined by the requirements of the research and education. NSF management and program officers, with the assistance and support of advisory committees, should maintain an appropriate balance based on the Foundation's long-range planning and budgetary process. Based on historical investment in these modes and extensive discussion about funding modes, NSB guidance suggests a balance in which approximately 6-8 percent of the total NSF budget supports centers. At the programmatic level, the Centers Program relies on the merit review process to prioritize proposals for funding decisions; final funding decisions also include consideration of NSF's core strategies and maintenance of a diverse portfolio.

Evidence: Evidence demonstrating a prioritization process to guide budget requests and funding decisions may be found in the following references: (1) Board Guidance for National Science Foundation Centers Programs, NSB-05-162, Attachment 4 (pp. 13-15) (http://www.nsf.gov/nsb/meetings/2005/1130/maj_actions.pdf). NOTE: See p. 15 for National Science Board instructions to NSF regarding the distinction and balance between centers and individual investigator or small group research and the proportion of the NSF budget that should go to centers. (2) The NSF Strategic Plan FY 2003-2008 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201). (3) Performance Assessment Information (http://www.nsf.gov/about/performance/). (4) The NSF FY 2007 Budget Request to Congress (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=budreqfy2007). (5) The Grant Proposal Guide (NSF 04-23) (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg). (6) COV reports (http://www.nsf.gov/od/oia/activities/cov/). (7) National Science Board reports, minutes, and agendas (http://www.nsf.gov/nsb/). Additional evidence demonstrating a prioritization process guiding investment in the Centers Program may be found in the following references: (8) For more information on the Chemical Bonding Centers (CBCs), see the Report of a Workshop on New Mechanisms for Support of High-Risk and Unconventional Chemistry (http://www.mrl.uiuc.edu/docs/nsfgmwfinal.pdf). (9) For more information on the NSECs program and other Nanoscale Science and Engineering (NSE) research, see the following websites: http://www.nano.gov; www.nseresearch.org; and www.nsf.gov/nano/.

YES 10%
Section 2 - Strategic Planning Score 100%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Centers Program regularly collects timely and credible performance information from key program partners and uses it to adjust program priorities, make decisions on resource allocations, and improve program management and performance. Evidence relating to the use of credible performance information may be found in COV reports, AC reports, AC/GPA reports, and site visit reports conducted by external review panels. In addition, center programs collect performance information from NSF awardees through (1) annual reports, submissions to a web-based database, and annual meetings with grantees; (2) site visits and reverse site visits; and (3) third-party evaluations and targeted studies. The reporting requirements of the NSF Centers Program are developed specifically to allow continuing oversight and tracking of accomplishments. Most of the centers include partnerships among multiple universities and non-academic institutions. The partner institutions must provide relevant data to the awardee institution before the data set or annual report can be submitted on behalf of the center. The performance information is used for taking corrective action, adjusting program priorities, and making recommendations for the continuation of funding. Several center programs have commissioned program monitoring, independent program evaluations, and targeted studies to assess program management and results (e.g., ERCs, MRSECs, NCEAS, NSECs, STCs). GPRA and PART performance data are collected and assessed by an external panel of experts and verified and validated by an independent, external consulting firm.

Evidence: Performance data and information are included in the following sources: (1) COV reports and NSF responses (http://www.nsf.gov/od/oia/activities/cov/covs.jsp). (2) The annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp). (3) The FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). (4) Directorate Advisory Committee minutes (e.g., http://www.nsf.gov/bio/advisory.jsp; http://www.nsf.gov/geo/advisory.jsp; http://www.nsf.gov/cise/advisory.jsp). (5) Internal documents such as awardee annual reports, site visit and reverse site visit reports, submissions to center-specific performance indicator databases (e.g., NSECWeb, ERCWeb), annual STC Cohort Reports, and ad hoc reports as requested (e.g., STCs, MRSECs, NCEAS). Information from these data sources provides a historical encapsulation of center-specific, cohort-based, and center-program-wide outputs. Since setting objectives and reasonable expectations is challenging for a portfolio with diverse and highly focused research areas, data are used to indicate reasonable annual accomplishments (e.g., publications, enrollments, presentations, patents, and leveraged funding). Examples of independent third-party program evaluations, monitoring, and targeted studies of centers include: (1) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing); (2) An Assessment of the National Science Foundation's Science and Technology Centers Program (1996) (http://darwin.nap.edu/books/0309053240/html/); (3) The Impact on Industry of Interaction with Engineering Research Centers 1997 (http://www.sri.com/policy/csted/reports/sandt/erc/contents.html); (4) The Impact on Industry of Interaction with Engineering Research Centers Repeat Study 2004 (http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf); (5) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities (http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf); (6) A Multi-Method Analysis of the Social and Technical Conditions for Interdisciplinary Collaboration, D. Rhoten, Final Report for NSF - BCS-0129573 (2003) (http://www.nceas.ucsb.edu/fmt/doc?/nceas-web/center/renewal2005, item #10); (7) National Science Foundation's Science and Technology Centers: Building an Interdisciplinary Research Paradigm, A Study by a Panel of the National Academy of Public Administration (1995); (8) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996).

YES 9%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: All NSF awardees, contractors, and NSF Program Officers must meet specific reporting and financial record keeping requirements and are held accountable for cost, schedule, and performance results. The Centers Program awardees must adhere to the terms of their cooperative agreements and are held accountable for meeting annual and final reporting requirements, post-award monitoring requirements, and financial record keeping requirements. The Program holds the awardee accountable for sound financial management, timely billing, cost controls, the schedule of milestones, and satisfactory performance as documented in annual reports and assessed through site visits and reverse site visits. Sub-grantees are similarly held accountable to NSF by grantees. NSF Program Officers monitor all reports and require corrective action from an awardee when necessary. Continuation of funding is subject to successful performance to date and correction of deficiencies identified in the past. Centers can be and have been terminated for insufficient performance and financial mismanagement. Continuing support at a reduced level because of poor performance is an alternative option. In proposals for new support, all applicants for NSF funding are required to include reports on the results of previous NSF support. Such past performance information is considered in the merit review process. The efforts of NSF staff managing the Centers Program are reviewed by their supervisors and by appropriate Committees of Visitors. Individually, staff performance plans are directly linked to NSF's strategic goals.

Evidence: Evidence demonstrating that federal managers and program partners are accountable for cost, schedule, and performance results may be found in COV reports (http://www.nsf.gov/od/oia/activities/cov/covs.jsp); awardee annual reports; site visit and reverse site visit reports; Advisory Committee reports, including the NSF Advisory Committee for GPRA Performance Assessment (AC/GPA) FY 2005 AC/GPA Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210&org=NSF; page 26); the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par; page II-41); and the General Cooperative Agreement terms and conditions (http://www.nsf.gov/pubs/gc1/ca102.pdf). Center-specific Cooperative Agreement terms: (1) Nanoscale Science and Engineering Centers: http://www.nsf.gov/pubs/policydocs/nsf05543.pdf; (2) Science and Technology Centers: http://www.nsf.gov/pubs/policydocs/nsf03550.pdf; (3) Materials Research Science and Engineering Centers: http://www.nsf.gov/pubs/policydocs/nsf04580.pdf; (4) Science of Learning Centers: http://www.nsf.gov/pubs/policydocs/nsf03573.pdf. NOTE: The Engineering Research Centers cooperative agreement template is not a web document, but is available if requested. The Chemical Bonding Centers Cooperative Agreement Terms and Conditions are not yet written for CBC, Phase II. See also Federal Cash Transaction Reports (http://www.nsf.gov/pubs/2006/nsf0601/pdf/09.pdf) and annual performance evaluations of NSF staff and Program Officers.

YES 9%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: The NSF routinely obligates its funds in a timely manner, and funds are monitored to ensure that they are spent for the intended purposes; the Centers Program is no exception. NSF also has pre- and post-award internal controls to reduce the risk of improper payments. Accurate reporting of grant fund expenditures is a primary requirement for continued NSF support. A study conducted by PricewaterhouseCoopers assessing the risk of erroneous payments found no erroneous payments. Beginning in FY 2004, the NSF incorporated erroneous payments testing of awardees into the on-site monitoring program. Program Officers managing individual centers and cooperative agreement specialists in the Division of Acquisition and Cooperative Support monitor the expenditure of funds, carryover, and the justification for the requested budget for the upcoming year. Program officers can monitor center expenditure rates using the Federal Cash Transaction Reports (FCTR). In addition to these agency-wide policies and practices, the Centers Program employs added financial management oversight. To increase the likelihood that deficiencies are brought to light and addressed in a timely manner, awards under the Centers Program are closely examined and monitored.

Evidence: Data and information demonstrating that funds are obligated in a timely manner and spent for the intended purpose are included in Federal Cash Transaction Reports, a clean opinion on financial statements for the past 8 years, and the following reports: (1) FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). (2) NSF carryover presented in the NSF budget requests to Congress (http://www.nsf.gov/about/budget/). (3) The Risk Assessment and Award Monitoring Guide (www.nsf.gov/about/contracting/rfqs/dcca_060018/Risk%20Assessment%20Guide%20for%20Post%20Award%20Monitoring%20Site%20Visits.pdf). (4) The FY 2005 Federal Cash Transaction Report (http://www.nsf.gov/pubs/2006/nsf0601/pdf/09.pdf). (5) The PricewaterhouseCoopers NSF FY 2001 Risk Assessment for Erroneous Payments. (6) Internal documents: site visit and reverse site visit reports and annual reports.

YES 9%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The Centers Program has a number of procedures to measure and achieve efficiencies and cost effectiveness in program execution. See the Measures Tab for a measure of annual efficiency; the measure is an indicator of how the program achieves efficiencies and cost effectiveness in program execution. The Centers Program has developed procedures to reduce workloads on partners, on NSF, and on the reviewer community. The Centers Program promotes efficiency by awarding larger and longer-duration awards that allow the research community to spend more time conducting research and less time preparing proposals. As an example, MRSECs transitioned from four-year to six-year awards to allow their research community to spend more time conducting research. Centers Program awards require detailed review to ensure the best investments in research and education. The Centers Program review procedures minimize workloads on proposers, NSF staff, and reviewers by using a pre-award process consisting of pre-proposal review, full proposal review, and site/reverse site visits. Only a small number (~30 percent) of the most promising proposals pass from the pre-proposal to the full proposal stage and from the full proposal to the site visit stage. This process ensures that in-depth merit review is carried out only on the small number of proposals that have a high likelihood of success. The NSF's management of the Centers Program employs a variety of cost-efficient techniques, maximizing the use of modern technology (e.g., replacing in-person meetings with videoconferences when appropriate) and outsourcing some activities (e.g., third-party evaluations, strategic planning). For example, the STC program contracts with an outside firm to train and provide technical assistance to new grantees.

Evidence: Evidence of the annual efficiency measure and a description of procedures to achieve cost effectiveness may be found in the Measures Tab. Data and information on the cost effectiveness of the competitive selection and external (peer) merit review of the Centers Program are located in program management plans (internal documents, available on request), the NSF Enterprise Information System, and the following documents: (1) Data and information on organizational excellence in the NSF Advisory Committee for GPRA Performance Assessment (AC/GPA) FY 2005 AC/GPA Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210&org=NSF; page 26); (2) Annual Performance and Accountability Reports (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par); (3) COV reports (http://www.nsf.gov/od/oia/activities/cov/covs.jsp); (4) NSF Strategic Plan FY 2003-2008 (pages 27-29, http://www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf); (5) NSF Grant Proposal Guide (http://www.nsf.gov/pubs/gc1/gc1_605.pdf); (6) The Risk Assessment and Award Monitoring Guide (www.nsf.gov/about/contracting/rfqs/dcca_060018/Risk%20Assessment%20Guide%20for%20Post%20Award%20Monitoring%20Site%20Visits.pdf); (7) Federal Cash Transaction Reports (http://www.nsf.gov/pubs/2006/nsf0601/pdf/09.pdf). Evidence of competitive, external (peer) merit review for center programs can be found in the following documents: (1) Chemical Bonding Centers, Phases I and II (http://www.nsf.gov/pubs/2004/nsf04612/nsf04612.htm; http://www.nsf.gov/pubs/2006/nsf06558/nsf06558.htm); (2) Engineering Research Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04570); (3) Materials Research Science and Engineering Centers (http://www.nsf.gov/pubs/2004/nsf04580/nsf04580.htm); (4) Nanoscale Science and Engineering Centers (http://www.nsf.gov/pubs/2000/nsf00119/nsf00119.pdf); (5) National Center for Ecological Analysis and Synthesis (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=13450&org=OII); (6) Science and Technology Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf03550); (7) Science of Learning Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05509).

YES 9%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The Centers Program collaborates and coordinates effectively with related research and education programs within NSF and with other agencies and the private sector on mutually related interests. Internal to the NSF, several center programs are managed by groups of program officers from across divisions or directorates/offices to ensure coordination within centers (e.g., ERCs, NSECs, SLCs, STCs) and with related programs. Some centers receive split funding from two or more NSF programs; in general, NSECs receive split funding from the MPS and ENG directorates. External to the NSF, and in order to ensure interagency collaboration and to respond to guidance by the OMB/OSTP Research and Development Budget Priorities, the NSF has substantial representation on the National Science and Technology Council interagency working groups as well as their subcommittees. For example, NSF leads interagency working groups for the National Nanotechnology Initiative (NNI), thus ensuring interagency coordination for the Nanoscale Science and Engineering Centers (NSECs). In addition, when appropriate, representatives from industry or other agencies at the federal, state, and local levels participate in the merit review process or are members of post-award monitoring teams. For example, the Science and Technology Centers program routinely includes NASA employees among its annual site visitors to monitor the progress of the Integrated Space Weather Modeling Center at Boston University. Also, in some cases centers receive financial support from related programs across government or the private sector. As an example, twenty-five industrial companies have cooperated with ERCs and have co-sponsored the University of Michigan's ERC for Reconfigurable Manufacturing Systems.

Evidence: Evidence relevant to demonstrating the Centers Program's coordination and collaboration with similar programs may be found in the following sources: (1) internal documents, such as internal administrative manuals and management plans for several types of centers (e.g., ERCs, SLCs, STCs), annual and final reports, center-specific indicator databases (NSECWeb, ERCWeb), and site visit and reverse site visit reports; (2) representation on NSTC interagency committees and subcommittees; (3) the letter from OMB/OSTP on Administration Research and Development Budget Priorities; (4) the National Nanotechnology Initiative (NNI), a collaboration among Federal agencies and departments led by NSF and of which NSECs are a component (http://www.nano.gov/html/gov/home_gov.html); (5) award announcements, e.g., the NSF/SRC ERC for Environmentally Benign Semiconductor Manufacturing (http://www.nsf.gov/pubs/stis1996/pr9618/pr9618.txt). Additional information can be found in Centers Program solicitations: (1) Chemical Bonding Centers, Phases I and II (http://www.nsf.gov/pubs/2004/nsf04612/nsf04612.htm; http://www.nsf.gov/pubs/2006/nsf06558/nsf06558.htm); (2) Engineering Research Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04570); (3) Materials Research Science and Engineering Centers (http://www.nsf.gov/pubs/2004/nsf04580/nsf04580.htm); (4) Nanoscale Science and Engineering Centers (http://www.nsf.gov/pubs/2000/nsf00119/nsf00119.pdf); (5) National Center for Ecological Analysis and Synthesis (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=13450&org=OII); (6) Science and Technology Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf03550); (7) Science of Learning Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05509).

YES 9%
3.6

Does the program use strong financial management practices?

Explanation: The financial management practices of the Centers Program are consistent with the strong financial management practices that led to the NSF being the first federal agency to receive a 'green' rating for financial management on the President's Management Agenda scorecard; NSF continues to maintain that rating. NSF has received a clean opinion on its financial audits for the last 8 years. NSF is committed to providing quality financial management to all its stakeholders. NSF honors that commitment by preparing annual financial statements in conformity with generally accepted accounting principles in the U.S. and then subjecting the statements to independent audits. As a federal agency, NSF prepares the following annual financial statements: Balance Sheet, Statement of Net Cost, Statement of Changes in Net Position, Statement of Budgetary Resources, and Statement of Financing. Supplementary statements are also prepared, including Budgetary Resources by Major Accounts, Intragovernmental Balances, Deferred Maintenance, and Stewardship Investments. In addition to these agency-wide policies and practices, the Centers Program employs additional financial management oversight. To increase the likelihood that deficiencies are brought to light and addressed in a timely manner, awards under the Centers Program are closely examined and monitored. Due to several risk factors, including large award size, long award period, and the involvement of numerous sub-awardees, a significant number of awards are considered "high risk" by the Cost Analysis and Audit Resolution Branch (CAARB) of the NSF's Division of Institution and Award Support. Annually, CAARB, together with the Division of Grants and Agreements (DGA), selects a subset of organizations for on-site reviews. Representatives from CAARB and DGA review general management practices and accounting systems while on-site. In addition, reported expenses are reconciled to the general ledger, and other targeted areas are reviewed as appropriate. NSF ensures that these organizations address any deficiencies brought to light during the visit. Some types of centers with customized annual report and renewal proposal formats, such as the ERCs, include financial data describing annual expenditures to monitor spending rates and to alert program officers of the need to request a CAARB review if necessary. CAARB staff members routinely give presentations to ERCs' financial and administrative managers at the ERC annual meeting.

Evidence: Evidence of NSF's strong financial management practices may be found in the following sources: (1) Executive Branch Management Scorecard (http://www.whitehouse.gov/results/agenda/scorecard.html); (2) performance and management assessments (http://www.whitehouse.gov/omb/); (3) FY 2005 Performance and Accountability Report (PAR) (http://www.nsf.gov/pubsys/ods/getpub.cfm?par); (4) FY 2005 AC/GPA Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210); (5) General Cooperative Agreement terms and conditions (http://www.nsf.gov/pubs/gc1/ca102.pdf). Additional documentation can be found in center-specific Cooperative Agreement terms: (1) Nanoscale Science and Engineering Centers (http://www.nsf.gov/pubs/policydocs/nsf05543.pdf); (2) Science and Technology Centers (http://www.nsf.gov/pubs/policydocs/nsf03550.pdf); (3) Materials Research Science and Engineering Centers (http://www.nsf.gov/pubs/policydocs/nsf04580.pdf); (4) Science of Learning Centers (http://www.nsf.gov/pubs/policydocs/nsf03573.pdf). NOTE: The Engineering Research Centers cooperative agreement template is not a web document but is available if requested. Chemical Bonding Centers Cooperative Agreement Terms and Conditions are not yet written for CBC, Phase II. Internal documents provide additional information: ERC and NSEC annual reporting guidelines and performance indicators database documentation; the new center start-up presentation section on financial management and reporting requirements; CAARB financial audit reports for individual centers; and National Science Board and Directorate Review Board reviews of center award recommendations.

YES 9%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: No management deficiencies have been identified. The Centers Program receives and acts on advice on how to improve program management from external experts including: (a) the Advisory Committee for GPRA Performance Assessment (AC/GPA), (b) Committees of Visitors (COVs) and Directorate Advisory Committees, (c) annual site visits and reverse site visits, and (d) third-party evaluations and targeted studies. The AC/GPA concluded that the NSF demonstrated "significant achievement" for the Merit Review indicator of the Organizational Excellence Strategic Outcome Goal. The COVs perform external program assessments every three to five years by conducting detailed reviews of the materials associated with proposal actions, assessing the integrity and efficiency of the processes for proposal review, assessing program management, and providing recommendations for improving effectiveness and efficiency. The standard COV report is organized by the COV Template, a series of questions focusing on program management and process. Following receipt of the COV report, the NSF staff responds to any management deficiencies and outlines the steps the agency will take to address the problems. The COV report and the NSF response are reviewed by the relevant Directorate Advisory Committees and by the cognizant NSF Assistant Director and the Senior Management Group. For example, a COV suggested moving MRSECs from four-year to six-year awards, and the program implemented the recommendation. In addition, Directorate Advisory Committees review program management on a regular schedule. Annual and biennial site visits assess all facets of the centers, including center management, which is essential to the success of centers. Third-party evaluations include assessments of program design, implementation, and outcomes as well as program management. In addition, the Foundation conducts an annual review to assess administrative and financial systems and procedures to ensure that effective management controls are in place and that any deficiencies are identified and addressed.

Evidence: Reports indicating no significant management deficiencies in the Centers Program are as follows: the NSF Advisory Committee for GPRA Performance Assessment (AC/GPA) FY 2005 AC/GPA Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210&org=NSF); annual Performance and Accountability Reports (http://www.nsf.gov/pubsys/ods/getpub.cfm?par); the NSF Business Analysis; and all COV reports, NSF responses, and Directorate Advisory Committee reviews of COV reports (http://www.nsf.gov/od/oia/activities/cov/covs.jsp). Subchapter 300 of the Proposal and Award Manual (http://www.inside.nsf.gov/pubs/pam/pam1205/toc.htm) and the Grant Policy Manual (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpm) provide additional information regarding the COV process and the role of GPRA. Additional information regarding management planning procedures is available to NSF staff via documents such as the Proposal and Award Manual (http://www.inside.nsf.gov/cgi-bin/getpub?pam) and annual reviews of the NSF's internal accounting and administrative controls (http://www.nsf.gov/pubs/2006/nsf0601/pdf/05c.pdf). Information about the management of center programs can be found in the following third-party studies: (1) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing); (2) An Assessment of the National Science Foundation's Science and Technology Centers Program (1996) (http://darwin.nap.edu/books/0309053240/html/); (3) The Impact on Industry of Interaction with Engineering Research Centers 1997 (http://www.sri.com/policy/csted/reports/sandt/erc/contents.html); (4) The Impact on Industry of Interaction with Engineering Research Centers Repeat Study 2004 (http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf); (5) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities (http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf); (6) A Multi-Method Analysis of the Social and Technical Conditions for Interdisciplinary Collaboration, Hybrid Vigor Institute (2003); (7) National Science Foundation's Science and Technology Centers: Building an Interdisciplinary Research Paradigm, A Study by a Panel of the National Academy of Public Administration (1995); (8) STC Ad Hoc Reports (special requests); (9) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996); (10) internal documents for center programs, including site visit reports and reverse site visit reports.

YES 9%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: Centers Program awards are based on the NSF's competitive merit review process, which includes external peer review using the two standard NSB-approved criteria as well as program-specific criteria. All activities from the pre-proposal stage to the making of the awards rely upon NSF's competitive merit review process, which includes external peer evaluation of pre-proposals, new and renewal proposals, and, in the case of the Centers Program, annual continuations of funding. Prior to the selection of appropriate mail reviewers and/or review panel members and/or site visit team members, all pre-proposals and full proposals are carefully reviewed by an individual expert and/or a multidisciplinary group of scientists, engineers, or educators serving as NSF program officer(s). Each center proposal undergoes multiple levels (3 to 4) of external review, with each stage being conducted by NSF and external experts in the particular field(s) of research and education addressed in the proposal. Only a limited number of pre-proposals move to the full proposal stage. Reviewers use the NSB-approved criteria that address the "Intellectual Merit" and the "Broader Impacts" of the proposed effort. Some solicitations contain additional criteria specific to programmatic objectives, such as the "added value" of the center mode for the proposed research, the integration of research and education, the strengthening of partnerships between academic and non-academic partners (e.g., industry), and the justification for and validity of NSF investment in the proposed center. After the external reviewers prepare their reviews and make recommendations, the NSF Program Officers prepare their recommendations to fund or decline each proposal. Depending on the size of the award, the cognizant Assistant Director, the NSF's Senior Management, and, in some cases, the National Science Board must approve award recommendations. In the case of continuing funding, program officers can recommend full funding, reductions in funding, or the addition of conditions for follow-on funding.

Evidence: Evidence demonstrating that awards are made through a clear competitive process is included in the following sources: (1) The NSF FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). (2) The NSB Policy on Recompetition (http://www.nsf.gov/nsb/documents/1997/nsb97224/nsb97224.txt). (3) The Report to the National Science Board on the National Science Foundation's Merit Review Process Fiscal Year 2005 (NSB-06-21) (http://www.nsf.gov/nsb/documents/2006/0306/merit_review.pdf). (4) NSF Merit Review Criteria (http://www.nsf.gov/pubs/1999/nsf99172/nsf99172.htm). (5) The NSF Advisory Committee for GPRA Performance Assessment (AC/GPA) FY 2005 AC/GPA Report, pages 50-54 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210&org=NSF). (6) COV reports and NSF responses (http://www.nsf.gov/od/oia/activities/cov/). (7) Internal documents, including program officer recommendations; recommendation packets for Directorate, NSF, and NSB review; and e-jacket panel review records. The merit review process and criteria are described in Centers Program solicitations: (1) Chemical Bonding Centers, Phases I and II (http://www.nsf.gov/pubs/2004/nsf04612/nsf04612.htm; http://www.nsf.gov/pubs/2006/nsf06558/nsf06558.htm); (2) Engineering Research Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04570); (3) Materials Research Science and Engineering Centers (http://www.nsf.gov/pubs/2004/nsf04580/nsf04580.htm); (4) Nanoscale Science and Engineering Centers (http://www.nsf.gov/pubs/2000/nsf00119/nsf00119.pdf); (5) National Center for Ecological Analysis and Synthesis (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=13450&org=OII); (6) Science and Technology Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf03550); (7) Science of Learning Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05509).

YES 20%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The NSF Centers Program uses a multifaceted array of oversight practices including desk reviews, monthly teleconferencing, annual reports, site visit reviews, reverse site visits, an IT-enabled project management system, and annual awardees' meetings. The program's award oversight is tailored for each award and type of center, the terms and conditions of which are included in the cooperative agreement. Continuing support is based upon progress toward the goals of the center and the center program, as documented in required annual reports and, in several cases, annual site visit reports that are subject to review and approval by NSF Program Officers before additional funds are released. NSF adheres to the oversight standards used by all Federal agencies. Using these standards allows NSF to benefit from awardee compliance with all relevant OMB Circulars regarding annual audits and compliance with other Federal regulations regarding the use of Federal funds. The Single Audit Act and cognizant audit agencies mandate significant oversight functions for grant and contract recipients. This law and the concomitant oversight audit and review activities provide baseline oversight procedures that govern all grant recipients. As a system, NSF's oversight mechanisms provide sufficient knowledge of awardee activities to monitor and understand how funds are utilized by awardees. Moreover, several center programs have commissioned program evaluations or targeted studies of the centers to document and assess grantee activities. For example, the STC program commissioned a program evaluation that was conducted by the National Academies. The ERCs commissioned a study of the impact on industry of interaction with the centers. The National Academies is conducting a study of the MRSEC program that is anticipated for release in the fall of 2007. The NCEAS supported a targeted study examining the social and technical conditions for interdisciplinary collaborations.

Evidence: Evidence demonstrating sufficient oversight practices is included in the following sources: COV reports (http://www.nsf.gov/od/oia/activities/cov/); the Report to the NSB on the NSF Merit Review Process - FY 2005 (http://www.nsf.gov/nsb/documents/2005/0930/merit_review.pdf); Performance and Accountability Reports (http://www.nsf.gov/pubsys/ods/getpub.cfm?par); the Risk Assessment and Award Monitoring Guide; clean audit opinions; the President's Management Agenda (PMA) Scorecard for Financial Management (http://www.whitehouse.gov/results/agenda/scorecard.html); and internal documents that include annual reports, site visit reports, reverse site visit reports, context statements, program officer recommendations, recommendation packets for Directorate, NSF, and NSB review, and e-jacket panel review records. Information on the documentation and assessment of awardee activities by third-party studies is found in the following sources: (1) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing); (2) An Assessment of the National Science Foundation's Science and Technology Centers Program (1996) (http://darwin.nap.edu/books/0309053240/html/); (3) The Impact on Industry of Interaction with Engineering Research Centers 1997 (http://www.sri.com/policy/csted/reports/sandt/erc/contents.html); (4) The Impact on Industry of Interaction with Engineering Research Centers Repeat Study 2004 (http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf); (5) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities (http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf); (6) A Multi-Method Analysis of the Social and Technical Conditions for Interdisciplinary Collaboration, Hybrid Vigor Institute (2003); (7) National Science Foundation's Science and Technology Centers: Building an Interdisciplinary Research Paradigm, A Study by a Panel of the National Academy of Public Administration (1995); (8) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996); (9) STC Ad Hoc Reports (special requests).

YES 9%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The Centers Program collects awardee performance data on an annual basis from all NSF-supported centers in the Centers Program and makes the information available to the public in a transparent and meaningful manner. Through the submission of annual reports using either the generic electronic NSF annual reporting format on FastLane or a customized reporting template (e.g., MRSECs, NCEAS, NSECs, SLCs, STCs), and through site or reverse site visits, the required information is collected, archived, and analyzed. Annual reports are available to the public either through NSF or on the websites of individual centers (e.g., STCs). In addition, awardees are required to collect and submit specific activity, personnel, and performance data to a customized database for specific types of centers (e.g., ERCs and NSECs). Tables and graphic displays of the aggregated data are presented during PI meetings and on websites dedicated to those centers. NSF, including the Centers Program, makes results available on the Discoveries area of the NSF website, through press releases, through the annual Performance and Accountability Report, an annual brochure on Performance Highlights, and the annual Report of the Advisory Committee for GPRA Performance Assessment, and through performance data located on the NSF website or individual center websites. NSF's Cooperative Agreement General Conditions require that technical results of NSF-supported research be published in open literature such as peer-reviewed journals and professional meeting proceedings. Each center in the NSF's Centers Program has a website that features research, education, and accomplishments in knowledge/technology transfer. Some of the awardees distribute electronic or hard-copy newsletters to various stakeholders and the general public.

Evidence: Evidence demonstrating that performance data are collected from grantees and made available to the public in a transparent and meaningful manner includes the following sources: (1) NSF Discoveries website (http://www.nsf.gov/discoveries/); (2) news releases (http://www.nsf.gov/news/news_list.cfm?nt=2); (3) FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par); (4) FY 2005 AC/GPA Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210); (5) center-level URLs that have current information about awards (ERC URLs: http://www.erc-assoc.org/centers.htm) and annual reports on individual center websites (Science and Technology Centers: http://www.nsf.gov/od/oia/programs/stc/); (6) The NSF FY 2007 Budget Request to Congress (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=budreqfy2007); (7) center-level reports, such as the University of Michigan College of Engineering NSF Engineering Research Center for Reconfigurable Manufacturing Systems Report to Industry: Innovations that Shape the Manufacturing Industry and Innovations and Inventions (2004) and The Economic Impact on Georgia of Georgia Tech's Packaging Research Center (2004), along with the General Cooperative Agreement Terms (http://www.nsf.gov/pubs/gc1/ca102.pdf); (8) presentations of aggregated data during PI meetings (e.g., NSECWeb, ERCWeb).

YES 9%
3.RD1

For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Explanation: Not applicable to the Centers Program since all funding is allocated via the competitive grants process.

Evidence:

NA 0%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The Centers Program has demonstrated adequate progress in achieving its long-term performance goals, as determined by an external expert panel, the Advisory Committee for GPRA Performance Assessment (AC/GPA), and as shown in the Measures Tab. Since FY 2002, the AC/GPA has determined that the accomplishments in all the indicators for the Ideas category, of which the Centers Program is a major component, demonstrate "significant achievement." Those indicators are: (1) enabling people who work at the forefront of discovery to make important and significant contributions to S&E knowledge; (2) encouraging collaborative research and education efforts across organizations, disciplines, sectors, and international boundaries; (3) fostering connections between discoveries and their use in the service of society; and (4) providing leadership in identifying and developing new research and education opportunities within and across S&E fields.

Evidence: The AC/GPA has consistently determined that NSF has demonstrated significant progress towards meeting the long-term performance goals related to the Ideas Strategic Outcome Goal, which includes the Centers Program, as found in the Measures Tab and as indicated in the FY 2005 Report of the Advisory Committee for GPRA Performance Assessment (AC/GPA), pages 26-35 (http://www.nsf.gov/pubs/2005/nsf05210/nsf05210.pdf). A description of the relationship of specific Centers Program objectives to the NSF Strategic Goals is found in the NSF's Strategic Plan (pp. 15-18, National Science Foundation Strategic Plan FY 2003-2008, http://www.nsf.gov/pubs/2004/nsf04021/FY2003-2008.pdf). The long-term outcome measures of the Centers Program are based on these objectives. In addition, the NSF FY 2005 Performance and Accountability Report, pages I-10 to I-11 (Management's Discussion and Analysis) and II-41 to II-45 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par), provides evidence, as do the independent COV evaluations and the NSF responses to those evaluations (http://www.nsf.gov/od/oia/activities/cov/covs.jsp). Additional evidence that the Centers Program has demonstrated progress in achieving its long-term performance goals is included in the following sources developed by independent third parties: (1) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing); (2) An Assessment of the National Science Foundation's Science and Technology Centers Program (1996) (http://darwin.nap.edu/books/0309053240/html/); (3) The Impact on Industry of Interaction with Engineering Research Centers 1997 (http://www.sri.com/policy/csted/reports/sandt/erc/contents.html); (4) The Impact on Industry of Interaction with Engineering Research Centers Repeat Study 2004 (http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf); (5) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities (http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf); (6) A Multi-Method Analysis of the Social and Technical Conditions for Interdisciplinary Collaboration, Hybrid Vigor Institute, Final Report for NSF - BCS-0129573 (2003) (http://www.nceas.ucsb.edu/fmt/doc?/nceas-web/center/renewal2005, item #10); (7) National Science Foundation's Science and Technology Centers: Building an Interdisciplinary Research Paradigm, A Study by a Panel of the National Academy of Public Administration (1995); (8) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996); (9) STC Ad Hoc Reports (special requests); (10) internal documents, including site visit reports, reverse site visit reports, annual reports, and reports from annual PI meetings.

YES 20%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The Centers Program achieves its annual performance goals. These include: (1) the percentage of merit-reviewed pre-proposal decisions made available to Centers Program applicants within five months of the pre-proposal receipt or deadline date, and (2) the percentage of partner institutions in the Centers Program that are non-academic institutions, demonstrating connections between discoveries and their use in the S&E community and in the service of society.

Evidence: Evidence showing that the Centers Program meets its annual performance goals can be found in the Measures Tab and internal documents and NSF databases.

LARGE EXTENT 13%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Centers Program has demonstrated improved efficiencies and cost effectiveness in achieving its program goals. A key area of activity has been the implementation of extensive, in-depth external (peer) merit reviews of these large-scale organized S&E efforts, which ensure that only the most meritorious proposals are selected for funding. Centers Program competitions are held every two to three years, dependent upon Foundation priorities, S&E needs, and the availability of funding. Within this context, the Centers Program responds to NSF's objective to operate a credible, efficient merit review system by subjecting Centers Program proposals to several rounds of external merit review before awards are made, starting with a pre-proposal stage. On average, approximately 33 percent of the submitted pre-proposals move to the full proposal stage of development and merit review. This process ensures that in-depth merit review is carried out on only a limited number of full proposals that have a high likelihood of success and that only the most meritorious center proposals are selected for funding. The implementation of this strong multi-stage merit review process contributes significantly to the efficiency and cost effectiveness of the Foundation in the selection of awards for science and engineering efforts that require extensive resources. This commitment to a strong merit review process is reflected in post-award management practices. Individual centers are required to develop strategic plans and submit documentation of annual performance, and they are subject to annual or biennial site visits by external site review teams that hold individual centers accountable for the accomplishment of their annual goals, thereby helping to ensure the success of the centers.

Evidence: Evidence demonstrating improved efficiencies and cost effectiveness in achieving program goals can be found in the multi-stage merit review process as documented in the NSF Enterprise Information System. On average, only one third of Centers Program applicants who submit pre-proposals are invited to submit full proposals. Efficiencies in the merit review and processing of pre-proposals are demonstrated by the Centers Program through maintaining a five-month time-to-decision (invite/do not invite full proposals) goal of 85 percent, as shown in the Measures Tab. Additional evidence of the competitive, multi-stage external (peer) merit review process can be found in internal documents (e.g., program management plans and documentation of decisions at each level of the review process) and the following program solicitations: (1) Chemical Bonding Centers, Phases I and II (http://www.nsf.gov/pubs/2004/nsf04612/nsf04612.htm; http://www.nsf.gov/pubs/2006/nsf06558/nsf06558.htm); (2) Engineering Research Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04570); (3) Materials Research Science and Engineering Centers (http://www.nsf.gov/pubs/2004/nsf04580/nsf04580.htm); (4) Nanoscale Science and Engineering Centers (http://www.nsf.gov/pubs/2000/nsf00119/nsf00119.pdf); (5) National Center for Ecological Analysis and Synthesis (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=13450&org=OII); (6) Science and Technology Centers (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf03550); (7) Science of Learning Centers (http://www.nsf.gov/pubs/2005/nsf05509/nsf05509.htm).

LARGE EXTENT 13%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: The performance of the Centers Program compares favorably to other government and private programs with similar high-level purposes and goals, while uniquely addressing needs that are not under the purview of mission-oriented federal, state, or local agencies. Since NSF is the only Federal agency charged with promoting the progress of S&E research and education across all fields and disciplines, its investments in the Centers Program provide a principal source of Federal support for academic research centers in S&E and enable them to integrate people, ideas, and tools on scales that are large enough to significantly impact important S&E fields and cross-disciplinary areas. No other entity, government or private, addresses such far-reaching and diverse purposes related to the enhancement of the Nation's cutting-edge research and education in STEM. Centers enable different sectors of the economy to develop or enhance partnerships to jointly address a problem or area of scientific or engineering interest through large-scale organized efforts. The Centers Program encompasses centers comprised of partnerships of academic and non-academic institutions, the nature of which is determined by the requirements of the research and education. Several centers in the Centers Program respond to Congressional mandates. For example, NSF leads interagency working groups for the National Nanotechnology Initiative (NNI); an important component of NNI is the establishment and support of Nanoscale Science and Engineering Centers (NSECs), which are designed to complement and not duplicate other efforts currently supported by NSF or other public or private means. In addition, NSF's centers have on occasion been co-funded by industry. An example from the private sector is the University of Michigan's ERC for Reconfigurable Manufacturing Systems: twenty-five industrial companies contributed to the center's founding and continue to support it.

Evidence: Evidence demonstrating that the Centers Program compares favorably to other similar programs while addressing specific and existing problems can be found in the following sources: NSF FY 2003-2008 Strategic Plan, pages 16-18 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201); the FY 2005 Performance and Accountability Report, pages I-10 to I-11 and II-41 to II-45 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par); COV reports (http://www.nsf.gov/od/oia/activities/cov/); AC reports; information on the NNI, available at http://www.nano.gov/; and NSF centers' websites, which describe the specific and existing problems being addressed by each center. Internal documentation includes annual reports, site visit reports, and/or reverse site visits/annual reviews. Evidence of Centers being distinctive relative to awards to individual researchers is included in COV reports, which discuss the appropriateness of the Center mode for centers in the divisions covered by those reports (http://www.nsf.gov/od/oia/activities/cov/covs.jsp). Justification for the Center mode of funding is found in the NSF publication Principles of National Science Foundation Research Centers, authored by NSF's Senior Management Integration Group, June 21, 2005.

YES 20%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Independent reviews by Committees of Visitors and other external groups, such as Advisory Committees, the National Science Board, and national S&E organizations, find that the Centers Program is effective and achieves desired results. In addition to COV and AC reviews, the 2005 performance evaluation conducted by the Advisory Committee for GPRA Performance Assessment (AC/GPA), which includes the Centers Program, concluded that NSF demonstrated "significant achievement" in all objectives for the Ideas strategic outcome goal. The AC/GPA added that "NSF accomplishments in the IDEAS strategic outcome goal have advanced the frontiers of discovery and hold considerable promise for addressing important societal concerns." Targeted studies of specific center programs by independent third parties concluded that center programs have produced and/or facilitated high-quality, world-class research; fostered scientific collaborations; and created interactions between industry and academia as a means of transferring knowledge between the public and private sectors. For example, the Institute for Scientific Information's Essential Science Citation Index reported that the National Center for Ecological Analysis and Synthesis was in the top one percent of more than 38,000 institutions in total citations in the field of Environment/Ecology. An early National Academies study of the Science and Technology Centers (STC) Program by the Committee on Science, Engineering, and Public Policy found that the STCs are "producing high-quality world-class research that would not have been possible without a center structure and presence, and ... that the design of the STC program has produced an effective means for identifying particularly important scientific problems that require a center mode of support." A recent study of the Engineering Research Centers found that the benefits of participation and the impact on the private sector involved knowledge transfer: "access to new ideas, know-how, or technologies." A follow-up study of the ERC Program found that knowledge transfer continued to be a benefit and has had an impact on private sector firms involved in the centers. NOTE: The weight of this question has been increased to reflect the importance NSF places on the conduct of independent evaluations to support program improvement and evaluate program effectiveness.

Evidence: Evidence demonstrating that the program is effective and achieving results can be found in the following sources: FY 2005 AC/GPA Report, page 26 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210&org=NSF); NSF annual Performance and Accountability Reports, including the FY 2005 Performance and Accountability Report, pages I-10 to I-11 and II-41 to II-45 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par); and COV reports and NSF responses (http://www.nsf.gov/od/oia/activities/cov/). Evidence supporting the finding that center programs are effective and achieving results can be found in the following third-party studies: (1) Assessment of and Outlook for NSF's Materials Research Laboratory Program (National Academy of Sciences, ongoing); (2) An Assessment of the National Science Foundation's Science and Technology Centers Program (1996) (http://darwin.nap.edu/books/0309053240/html/); (3) The Impact on Industry of Interaction with Engineering Research Centers (1997) (http://www.sri.com/policy/csted/reports/sandt/erc/contents.html); (4) The Impact on Industry of Interaction with Engineering Research Centers, Repeat Study (2004) (http://www.sri.com/policy/csted/reports/sandt/documents/ERC2004REPORT.pdf); (5) The Impact of Engineering Research Centers on Institutional and Cultural Change in Participating Universities (http://www.sri.com:8000/policy/csted/reports/sandt/documents/ERCCulturalImpact.pdf); (6) A Multi-Method Analysis of the Social and Technical Conditions for Interdisciplinary Collaboration, Hybrid Vigor Institute, Final Report for NSF BCS-0129573 (2003) (http://www.nceas.ucsb.edu/fmt/doc?/nceas-web/center/renewal2005, item #10); (7) National Science Foundation's Science and Technology Centers: Building an Interdisciplinary Research Paradigm, a study by a panel of the National Academy of Public Administration (1995); (8) STC ad hoc reports (special requests); and (9) An Evaluation of the National Science Foundation Science and Technology Centers (STC) Program, Abt Associates, Inc. (1996). (10) Internal documents include site visit reports, reverse site visit reports, annual reports, and reports from annual PI meetings.

YES 20%
Section 4 - Program Results/Accountability Score 87%


Last updated: 01/09/2009 (2006 Fall assessment cycle)