ExpectMore.gov


Detailed Information on the
Support for Research Institutions Assessment

Program Code 10002324
Program Title Support for Research Institutions
Department Name National Science Foundation
Agency/Bureau Name National Science Foundation
Program Type(s) Research and Development Program
Competitive Grant Program
Assessment Year 2004
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 100%
Program Management 100%
Program Results/Accountability 78%
Program Funding Level
(in millions)
FY2008 $150
FY2009 $153

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2008

NSF will continue to develop ways and measures to monitor its efforts to broaden participation in this and other programs.

Action taken, but not completed

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2004

The program will improve performance targets and will continue to improve monitoring of performance against those targets.

Completed This improvement action overlaps with the second improvement action and therefore is duplicative. NSF is taking steps to broaden participation from underrepresented groups and diverse institutions, especially in programs aimed at undergraduate students.
2004

External committees of visitors are continuing targeted reviews of the components of the program.

Completed Two Committees of Visitors (COVs) were convened in FY2003 for Research Institutions programs and one COV was convened in FY2005. COV Reports as well as NSF responses are posted on NSF's website. COVs conduct thorough program reviews, assess outcomes, and make recommendations on program and management improvements. This is NSF's primary method for conducting assessment at the program and portfolio level.
2004

The Budget provides funding to continue this program's current effectiveness in enhancing science and engineering education.

Completed
2005

All Institutions programs will ensure increased timeliness of yearly project reports from investigators.

Completed On Nov. 18, 2006, changes were implemented in the Project Reports System that enable NSF to monitor and enforce timely submission of annual and final project reports by PIs. Annual reports are due 90 days prior to the report period end date and are required for all standard and continuing grants and cooperative agreements. Final reports are due within 90 days after expiration of the award. Policy documents have been updated to reflect the changes.
2006

NSF will develop new ways and measures to monitor its efforts to broaden participation in this and other programs.

Completed This is part of NSF's Stewardship Goal - Broadening Participation. One significant step already underway is the development of a searchable reviewer database with demographic data, which will broaden and diversify the reviewer pool for proposals. Other recommendations concern training for staff and panelists on implicit bias, enhancing tracking mechanisms, and including a broadening participation performance indicator in staff evaluations.
2007

NSF continued its focus on strengthening and expanding broadening participation activities involving research institutions.

Completed

Program Performance Measures

Term: Annual; Type: Efficiency

Measure: For 70 percent of proposals submitted to the Education and Human Resources Directorate (EHR), be able to inform applicants about funding decisions within six months of proposal receipt or deadline, or target date, whichever is later, while maintaining a credible and efficient merit review system.


Explanation: Because the program category "Research Institutions" no longer exists under NSF's new Strategic Plan, data on the measures associated with the PART Program can no longer be tracked. However, because the PART Program corresponds to several programs administered by the EHR Directorate, the Foundation has adopted a Directorate-wide measure in its place for FY 2007 and beyond.

Year Target Actual
2001 - 71%
2002 - 74%
2003 - 80%
2004 70% 83%
2005 70% 76%
2006 70% 74.10%
2007 70% 82%
2008 70% 76%
2009 70%
2010 70%
Term: Long-term; Type: Outcome

Measure: External validation by the Advisory Committee for GPRA Performance Assessment that the Institutions program has made "significant achievement" in activities to attract and prepare U.S. students to be highly qualified members of the global S&E workforce.


Explanation: Assessment by the Advisory Committee for GPRA Performance Assessment of the impact of Institutions on the development of the S&E workforce (through means such as providing opportunities for international study, collaborations, and partnerships).

Year Target Actual
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success Success
2009 Success
2010 Success
2011 Success
2012 Success
Term: Annual; Type: Output

Measure: Increase the percentage of proposals submitted to the Education and Human Resources Directorate (EHR) programs from academic institutions not in the top 100 of NSF funding recipients.


Explanation: Because the program category "Research Institutions" no longer exists under NSF's new Strategic Plan, data on measures associated with the PART Program can no longer be tracked. However, because the PART Program corresponds to several programs administered by the EHR Directorate, the Foundation has adopted a Directorate-wide measure in its place for FY 2007 and beyond.

Year Target Actual
2001 - 73%
2002 - 66%
2003 - 70%
2004 71% 68%
2005 72% 71%
2006 73% 64.84%
2007 73% 64%
2008 73% 61%
2009 73%
2010 73%
Term: Long-term; Type: Outcome

Measure: External validation by the Advisory Committee for GPRA Performance Assessment that the Institutions program has made "significant achievement" in developing the Nation's capability to provide K-12 and higher education faculty with opportunities for continuous learning and career development in science, technology, engineering, and mathematics.


Explanation: Assessment by the Advisory Committee for GPRA Performance Assessment of the impact of Institutions in integrating research and education, promoting scientific literacy, and contributing scientific knowledge for use in national policy decisions.

Year Target Actual
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success Success
2009
2010
2011
2012

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of NSF's investments in Institutions is to "enable colleges, universities and other institutions to attract increased numbers of students to science and engineering (S&E) fields and enhance the quality of S&E education at all levels." This statement of purpose is derived directly from the statutes that govern the Foundation. The NSF Act of 1950 authorizes and directs NSF to support and strengthen science education programs at all levels. Other statutes, notably the Education for Economic Security Act, have expanded this authority to include the following objectives: public understanding of science and technology, faculty enhancement, student education and training, instructional development and instrumentation, and materials development and dissemination. These purposes have since been further expanded and clarified in the recently enacted NSF Authorization Act of 2002.

Evidence: Relevant information concerning the Institutions program purpose may be found in the NSF Strategic Plan FY 2003-2008 (www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf), the National Science Foundation Act of 1950, 42 USC 1861 et seq., the NSF Authorization Act of 2002 (P.L. 107-368), and the Education for Economic Security Act, 20 USC 3912.

YES 20%
1.2

Does the program address a specific and existing problem, interest or need?

Explanation: The national imperative for NSF's investments in Institutions is addressed in Paragraph 2 of Section 3 (Policy Objectives) of the NSF Authorization Act of 2002: "To increase overall workforce skills by ... (E) expanding science, mathematics, engineering, and technology training opportunities at institutions of higher education." In addition, Section 8, paragraph 7 authorizes the Science, Mathematics, Engineering, and Technology Talent Expansion Program, and Section 21 amends the Scientific and Advanced-Technology Act of 1992, which authorizes the Advanced Technological Education program. There are also provisions in Section 18 for reports to Congress on gender differences in the careers of science and engineering faculty and on Minority-Serving Institution funding.

Evidence: Evidence of the Institutions program addressing an existing problem may be found in the NSF Authorization Act of 2002 (P.L. 107-368).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: NSF has specific, statutory authority to evaluate the status and needs of the various sciences and engineering and to consider the results of this evaluation in correlating its research and educational programs with other Federal and non-Federal programs. NSF is the only federal agency charged with promoting the progress of science and engineering research and education in all fields and disciplines. As such, NSF's activities through its investments in Institutions uniquely address needs that are not under the purview of mission-oriented federal, state or local agencies. Several of the Institutions program's activities respond to Congressional mandates.

Evidence: Evidence demonstrating the unique role of NSF's Institutions program may be found in the Science and Engineering Equal Opportunities Act, 42 USC 1885. In addition, the Advanced Technological Education activity was established in the Scientific and Advanced-Technology Act of 1992 (P.L. 102-476) and the Science, Mathematics, Engineering, and Technology Talent Expansion Program (STEP) was established in the NSF Authorization Act of 2002 (P.L. 107-368).

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: NSF's investments in Institutions rely upon the competitive merit review process, NSF program officers, and Committees of Visitors (COVs) to ensure program effectiveness and efficiency. Merit review by peers has been recognized as a best practice for administering R&D programs. Independent reviews by COVs and other external groups (e.g., Advisory Committees, the National Science Board, the National Academy of Sciences/National Research Council, the President's Council of Advisors on Science and Technology, and the American Association for the Advancement of Science) provide additional scrutiny of the portfolio and program goals and results. This follows the guidance provided in the R&D Criteria, as outlined in the OMB/OSTP Guidance Memo.

Evidence: Relevant information regarding the Institutions program effectiveness may be found in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par) and in the FY 2003 Report on NSF Merit Review System (www.nsf.gov/nsb/documents/2004/MRreport_2003_final.pdf).

YES 20%
1.5

Is the program effectively targeted, so that resources will reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: NSF's investments in Institutions rely upon two mechanisms to ensure that the activity is effectively targeted and that funding directly addresses the purpose. First, the solicitations for each activity contain a clear statement of the activity's purpose in the context of the particular activity. Then, the merit review process ensures that funding is awarded to proposals that best address the activity's purpose. An example of aligning program purposes with intended beneficiaries is found in ADVANCE. The goal of ADVANCE is to increase the representation and advancement of women in academic science and engineering careers. Institutional Transformation Awards support academic institutional transformation to promote the increased participation and advancement of women scientists and engineers in academe.

Evidence: Excerpts of program purposes aligned with Institutions: The Course, Curriculum and Laboratory Improvement (CCLI) program seeks to improve the quality of STEM education based on educational research and empirical data concerning needs and opportunities in undergraduate education and effective ways to address them. Grants for the Department Level Reform of Engineering Education enable departmental and larger units to reformulate, streamline, and update engineering degree programs, develop new curricula for emerging engineering disciplines, and meet the emerging workforce and educational needs of U.S. industry. The Instructional Materials Development (IMD) program seeks to enhance the science, technology, and mathematics content knowledge, thinking skills, and problem-solving abilities of all preK-12 students, regardless of background, ability, or plans for future education.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: Specific long-term performance measures for NSF's investments in Institutions are listed in the 'Measures' tab. These are drawn from the objectives set forth in the NSF Strategic Plan FY 2003-2008, and they encompass NSF's commitment to broadening participation in science and engineering and to strengthening the U.S. workforce in science, technology, engineering and mathematics.

Evidence: Performance measures can be found in the Measures tab. Additional information about the assessment of performance may be found in the NSF Strategic Plan FY 2003-2008 (www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf).

YES 9%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Ambitious targets and timeframes are set under the 'Measures' tab.

Evidence: Ambitious targets and timeframes for the Institutions program are included in the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubs/2004/nsf04216/nsf04216.pdf) and in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par).

YES 9%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The program has identified a number of quantitative annual measures, shown in the 'Measures' tab, that relate directly to the agency's strategic goals.

Evidence: Performance measures can be found in the Measures tab.

YES 9%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: Baselines are obtained from internal NSF sources. Ambitious targets are set under the 'Measures' tab.

Evidence: Performance measures can be found in the Measures tab. Additional information regarding baselines and targets may be found in NSF's Enterprise Information System and in annual and final project reports.

YES 9%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The key partners for NSF's investments in Institutions both commit to and work toward the goals of the program. This commitment is ensured through the mechanisms described in the response to Q1.5, namely the combination of the program purpose being expressed in program solicitations and the selection of awards through the merit review process. NSF then ensures that its partners are working toward the goals of the program via two mechanisms: 1) continuing support is based upon annual progress reports submitted by Principal Investigators and reviewed by NSF program officers; and 2) to receive subsequent awards, all applicants are required to report on the results of previous NSF support, which is then considered in the merit review process.

Evidence: Relevant information may be found in annual and final project reports and in the grant conditions. For example, STEM Talent Expansion Program (STEP) project goals must be to increase the total number of students (U.S. citizens or permanent residents) obtaining science, technology, engineering, and mathematics (STEM) degrees at institutions with baccalaureate degree programs, or completing associate degrees in STEM fields or credits toward transfer to a baccalaureate STEM degree program at community colleges; all STEP proposals must include specific numerical targets for these increases (www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf04529).

YES 9%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: Regular evaluations bring about improvements and influence program planning. Each NSF activity is reviewed by a Committee of Visitors once every three years. GAO highlighted NSF's approach to evaluation as an 'evaluation culture--a commitment to self-examination, data quality, analytic expertise, and collaborative partnerships.' Advisory Committees review Directorate performance, and the Advisory Committee for GPRA Performance Assessment assesses performance on NSF-wide Strategic Goals. NSF and external experts conduct site visits to major activities. In fall 2002, the Education and Human Resources Directorate instituted a series of community portfolio reviews to assess directorate-wide impact in critical areas, e.g., mathematics education in grades K-12 and the transition to college-level mathematics. NOTE: Increased weight reflects the importance NSF places on independent evaluations to support improvements and evaluate effectiveness.

Evidence: Independent evaluations are critical to NSF performance assessment. Examples include: the Advanced Technological Education program evaluation (www.wmich.edu/evalctr/ate/); the Course and Curriculum Development Program: Evaluation Report (www.nsf.gov/pubsys/ods/getpub.cfm?nsf9839); the Final Report on the Evaluation of the National Science Foundation's Instructional Materials Development Program (Abt Associates, 2000); and the Instructional Materials Development Dissemination and Implementation Site Evaluation, Final Report (WestEd/Abt Associates, 2003). Other formal curriculum evaluations have been conducted by the Department of Education (mathematics and science) and AAAS (middle school science). Additional evidence includes applied research studies, including dissertations, on student learning outcomes of NSF instructional materials; Committee of Visitors reports and NSF responses; and Advisory Committee reports.

YES 18%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Performance information is used by managers to make informed decisions and is incorporated into NSF's budget requests to the Congress. The NSF FY 2005 Budget Request to Congress was built around the R&D Criteria, thereby highlighting specific performance information for NSF's investment portfolio. The budget also clearly presents the resource request and outlines the activities that will be supported with the funds. In the FY 2005 Budget Request, NSF displays the full budgetary cost associated with the new program framework defined in the NSF Strategic Plan FY 2003-2008.

Evidence: The FY 2005 NSF Budget Request to Congress presents the long-term goals of the Institutions program and the resources needed in a complete and transparent manner (www.nsf.gov/bfa/bud/fy2005/pdf/fy2005.pdf). Additional information may be found in the NSF Strategic Plan FY 2003-2008 (www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf).

YES 9%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: Committees of Visitors (COVs) provide a valuable mechanism for identifying and addressing planning-related issues. COVs provide feedback on each activity's goals and overall effectiveness, and steps to address weaknesses are identified. The FY 2003 COV for Course, Curriculum and Laboratory Improvement (CCLI) noted that the number of proposals from community colleges continues to be far less than the needs of these colleges. In response, NSF has provided funding to the Council for Resource Development to include sessions about NSF programs in its regional meetings. Additional activities include making use of American Association of Community Colleges meetings and publications; conducting more outreach to community colleges, as well as assuring that community college representatives are invited to presentations organized and normally attended by other types of institutions; and making a greater effort to invite more community college faculty to review proposals for the CCLI program.

Evidence: The program implements recommendations from outside evaluations as evidenced in Committee of Visitors reports and NSF responses.

YES 9%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: NSF's investments in Institutions address national science, technology, engineering, and mathematics (STEM) workforce and education needs that are not addressed in the same ways by the more mission-specific federal, state, or local agencies. The NSF investments in Institutions focus on STEM workforce and education issues using a research and development strategy, with a strong emphasis on research and evaluation of the program and its projects, and on the institutional engagement of STEM disciplinary departments with STEM education institutions and departments. This process uses external review at several levels: Advisory Committees, the National Science Board Committee on Education and Human Resources, and external evaluations of the program.

Evidence: The Institutions program assesses and compares the potential benefits of its efforts through the evaluation process. Evidence of this assessment can be found in the National Science and Technology Council Subcommittee on Education and Workforce Development, the National Science Board (NSB) Report on National Workforce Policy, The Science and Engineering Workforce: Realizing America's Potential (NSB 03-69), the National Science Board reports on diversity in the scientific and technological workforce, Advisory Committee reports, and external evaluations.

YES 9%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: The program relies on the external merit review system to prioritize proposals for final funding decisions. These final decisions include consideration of NSF's core strategies and maintaining a diverse portfolio. NSF staff work to employ a rigorous prioritization process for developing the NSF budget requests and determining funding decisions. For budget requests, each of the activities within the program provides input to senior management about past performance and future needs. Senior management integrates that information, prioritizes budget requests within and among programs, and determines funding levels, all of which is reviewed by the National Science Board.

Evidence: Relevant information regarding the prioritization process may be found in the NSF Strategic Plan FY 2003-2008 (www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf) and in the NSF FY 2005 Budget Request to Congress (www.nsf.gov/bfa/bud/fy2005/pdf/fy2005.pdf). Additional information regarding funding decisions may be found in the Grant Proposal Guide (www.nsf.gov/pubsys/ods/getpub.cfm?gpg).

YES 9%
Section 2 - Strategic Planning Score 100%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Performance information is collected from NSF grant recipients via interim, annual, and final project reports. Site visits to larger projects are also used to collect performance information. Committee of Visitors reviews and recommendations are used to improve program performance. Process-related or quantitative goals, such as proposal dwell time (the time from proposal receipt to funding decision), are monitored via the agency's Enterprise Information System (EIS). All of these assessments inform management practices. NSF programs collect high-quality performance data relating to key program goals and use this information to adjust program priorities, make decisions on resource allocations, and make other adjustments in management actions. GPRA performance data are verified and validated by an independent, external consulting firm.

Evidence: Evidence relating to the use of credible performance information may be found in Committee of Visitors reports (internal documents) and Advisory Committee reports, including the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubs/2004/nsf04216/nsf04216.pdf). Data are collected through annual, interim, and final project reports (internal documents), the GPRA module of the Enterprise Information System (EIS), annual contract performance evaluations, and site visit reports (internal documents). Principal Investigators are able to post up-to-date information about their project activities and results through the Division of Undergraduate Education's (DUE) Project Information Resource System (PIRS), in addition to the information reported via FastLane in their project reports.

YES 9%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: NSF awardees must meet annual and final reporting requirements as well as financial record keeping requirements. Performance is monitored by NSF program officers and funds can be withheld pending satisfactory project performance. The efforts of NSF staff are reviewed by their supervisors and by Committees of Visitors (COV). Corrective actions are taken as needed to assure accountability. NSF staff monitor cost, schedule and technical performance and take corrective action when necessary. Individual performance plans are directly linked to NSF's strategic goals.

Evidence: Federal managers and program partners are held accountable through cooperative agreements or contracts and through annual performance evaluations of NSF employees and program officers. Relevant evidence may be found in annual and final project reports, the NSF General Grant Conditions (www.nsf.gov/home/grants/gc102.pdf), and Committee of Visitors reports.

YES 9%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: NSF routinely obligates its funds in a timely manner. NSF also has pre-award internal controls to reduce the risk of improper payments. Beginning in FY 2004, NSF incorporated erroneous payments testing of awardees into its on-site monitoring program. When this testing is complete, it will provide NSF with information about awardees' usage of NSF funding.

Evidence: Evidence of the agency's financial obligations may be found in the NSF FY 2001 Risk Assessment for Erroneous Payments, data on NSF carryover (found in the NSF Budget Request to Congress), the Risk Assessment and Award Monitoring Guide, NSF's clean opinion on its financial statements for the past six years, and the Statement of Net Costs.

YES 9%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: NSF is a leader in the vigorous and dynamic use of information technology to advance the agency mission. IT improvements support more timely and efficient processing of proposals. The NSF-wide priority of increasing award size and duration enhances efficiency because larger, longer awards allow the research community to spend more time conducting research and less time preparing proposals. The Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) limits the number of proposals from a single institution. Such limits require an institution to reach an institution-wide consensus about the range of efforts it believes it can undertake to increase the number of its students graduating with science, technology, engineering, or mathematics majors. They also allow for higher success rates and maximize interdisciplinary collaboration.

Evidence: Procedures to measure and achieve efficiencies are found in a number of documents: the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par); Committee of Visitors reports; the NSF Strategic Plan FY 2003-2008 (www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf); the NSF Grant Proposal Guide (www.nsf.gov/pubsys/ods/getpub.cfm?gpg); and the Science, Mathematics, Engineering and Technology Talent Expansion Program (STEP) solicitation (www.nsf.gov/pubsys/ods/getpub.cfm?ods_key=nsf04529).

YES 9%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The Institutions program promotes partnerships, including collaboration with other agencies, industry, and national laboratories for projects of mutual interest, as well as international collaboration. NSF regularly shares information with other agencies and participates in coordination activities through OSTP and the National Science and Technology Council. Policy guidance provided by the National Science Board incorporates perspectives from related programs and investments. The Instructional Materials Development (IMD) activity works collaboratively with NASA, NOAA, and EPA in Global Learning and Observations to Benefit the Environment (GLOBE). IMD has been adapted to create a national presence for nanoscale science in grade 7-12 classrooms through linkages with the nanoscience priority area, which cuts across all NSF disciplinary research directorates. Mechanisms are established for split funding between NSF directorates.

Evidence: Evidence relevant to demonstrating the Institutions program's coordination and collaboration with similar programs may be found in management plans (internal documents).

YES 9%
3.6

Does the program use strong financial management practices?

Explanation: NSF uses strong financial management practices. NSF was the first federal agency to receive a 'green light' for financial management on the President's Management Agenda scorecard and has received a clean opinion on its financial audits for the last six years. NSF is committed to providing quality financial management to all its stakeholders. It honors that commitment by preparing annual financial statements in conformity with generally accepted accounting principles in the U.S. and then subjecting the statements to independent audit. As a federal agency, NSF prepares the following annual financial statements: Balance Sheet, Statement of Net Cost, Statement of Changes in Net Position, Statement of Budgetary Resources, and Statement of Financing. Supplementary statements are also prepared, including Budgetary Resources by Major Accounts, Intragovernmental Balances, Deferred Maintenance, and Stewardship Investments.

Evidence: Evidence of NSF's strong financial management practices may be found in the President's Management Agenda, in the results of NSF financial audits, and in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par).

YES 9%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: Committees of Visitors (COV) regularly provide feedback on programmatic and management-related concerns. In addition, the Foundation conducts an annual review to assess administrative and financial systems and procedures to ensure that effective management controls are in place and that any deficiencies are identified and addressed.

Evidence: Reports indicating no significant management deficiencies in the Institutions program include the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par), the NSF Business Analysis, Committee of Visitors reports, Advisory Committee reviews of COV reports, and IG reports and NSF responses.

YES 9%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: All activities rely upon NSF's competitive, merit review process that includes external peer evaluation. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF program officer, and usually by 3-10 other persons outside NSF who are experts in the particular field represented by the proposal. Competitive merit review, with peer evaluation, is NSF's accepted method for informing its proposal decision process. The NSB-approved criteria address the "Intellectual Merit" and the "Broader Impacts" of the proposed effort. Some solicitations contain additional criteria that address specific programmatic objectives. NOTE: The weight of this question has been increased to reflect the relative importance of merit review in verifying the relevance, quality, and performance of NSF's investments.

Evidence: Evidence of grants awarded through a clear competitive process may be found in the FY 2003 Report on NSF Merit Review System (www.nsf.gov/nsb/documents/2004/MRreport_2003_final.pdf). Additional information may be found in the Enterprise Information System data (internal) and in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par).

YES 18%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: NSF has a formal Award Monitoring and Business Assistance Program (AMBAP), based on a financial and administrative risk assessment of NSF awardee institutions, that focuses on award oversight, including desk and on-site monitoring, and on providing assistance to awardees. AMBAP is a collaborative effort between NSF administrative and financial managers and technical staff and NSF program managers, working with their awardee counterparts. Oversight mechanisms are currently sufficient. NSF's capacity to provide adequate oversight depends on the resources available for salaries and expenses; current resource levels reduce NSF's ability to perform the level of oversight it deems desirable. NSF is using technology and creative approaches, such as teleconferencing, videoconferencing, and reverse site visits, to enhance performance oversight within current resource constraints. NSF maintains scientific oversight of all awards through annual and final project reports, and funds are tracked via reporting systems to ensure that they are used for their designated purpose.

Evidence: Oversight activities demonstrating sufficient knowledge of grantee activities may be found in Committees of Visitors reports; quarterly, annual, and final project reports; directorate reviews; the FY 2003 Report on NSF Merit Review System; the Risk Assessment and Award Monitoring Guide; clean audit opinions; the President's Management Agenda Scorecard for financial management; site visit reports; and workshops and grantee meetings (e.g., the Advanced Technological Education Principal Investigator Meeting; the Course, Curriculum and Laboratory Improvement Invention and Impact Conference (www.ccliconference.com); and the Instructional Materials Development Principal Investigator Meeting (www.agiweb.org/education/nsf2004/index.html)).

YES 9%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: NSF Grant General Conditions require that results of NSF-supported research be published in the open literature and that NSF support is appropriately referenced/cited. NSF's annual Performance and Accountability Report and its annual Budget Request contain highlights of NSF-supported research. Principal Investigators provide annual progress reports to NSF that are examined and approved/disapproved by the program directors. Information is made available to the public on the numbers of proposals and numbers of awards as well as, for each award, the name of the principal investigator, the awardee institution, amount of the award, and an abstract of the project. The Budget Internet Information Site (BIIS) contains extensive information on awards and funding trends. In addition, the Division of Undergraduate Education's (DUE) Project Information Resource System (PIRS) is integrated with FastLane's Project Reports System.

Evidence: Grantee performance data is collected annually as evidenced in the annual Performance and Accountability Reports (http://www.nsf.gov/pubsys/ods/getpub.cfm?par); the annual Budget Request to Congress; the Grant General Conditions (http://www.nsf.gov/home/grants/gc102.pdf); and the Budget Internet Information Site (http://dellweb.bfa.nsf.gov/). The Division of Undergraduate Education's (DUE) Project Information Resource System (PIRS) is integrated with FastLane's Project Reports System. When a Principal Investigator submits an annual, final, or interim project report to NSF via FastLane, some sections of the project report are made accessible to the public through PIRS (https://www.ehr.nsf.gov/pirs_prs_web/search/).

YES 9%
3.RD1

For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Explanation: NSF programs are administered as competitive grant programs, so this question does not apply.

Evidence:  

NA 0%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: NSF relies on external evaluation to determine whether it is achieving its long-term objectives. Since FY 2002, the NSF Advisory Committee for GPRA Performance Assessment (AC/GPA) has served as the focal point for these activities. Input is derived from numerous sources, including Committees of Visitors, Principal Investigator annual and final project reports, and summaries of substantial outcomes ('nuggets') from funded research. The AC/GPA has determined that the accomplishments under the People goal demonstrate "significant achievement" toward annual and long-term performance goals. In addition, component activities of the Institutions program undergo third-party evaluations. Collectively, these evaluations provide further evidence that the programs are demonstrating adequate progress in achieving their long-term performance goals.

Evidence: Evidence demonstrating progress in meeting the long-term goals of the Institutions program may be found in the Measures tab, in annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par), in the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubs/2004/nsf04216/nsf04216.pdf), in annual and final project reports, and in third-party evaluations of component activities of the program.

LARGE EXTENT 11%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program achieves its annual performance goals.

Evidence: Evidence demonstrating achievement of Institutions' performance goals may be found in the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubs/2004/nsf04216/nsf04216.pdf), in annual and final project reports, in Committee of Visitors reports, and in the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par).

LARGE EXTENT 11%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The NSF-wide priority of increasing award size and duration enhances efficiency because larger, longer awards allow the research community to spend more time conducting research and less time preparing proposals. Independent reviews by Committees of Visitors and other expert panels provide additional scrutiny of portfolio and program goals, ensuring effectiveness and operational efficiency. Where appropriate, activities limit the number of proposals accepted from a single institution, which yields higher success rates and more interdisciplinary collaboration within submitting universities than would otherwise be possible. Several activities use pre-proposals to improve efficiencies in the merit review process and reduce the burden on researchers and reviewers. Increases for high-priority graduate fellowships and traineeships have required reallocations across the People activities, which has affected the ability of Institutions to make consistent improvements in award size and duration.

Evidence: Cost effectiveness and efficiencies for the Institutions program are shown in the Measures tab. Evidence may be found in several documents including the annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par), program solicitations, and the NSF FY 2005 Budget Request to Congress (www.nsf.gov/bfa/bud/fy2005/pdf/fy2005.pdf).

SMALL EXTENT 6%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: NSF's activities through its investments in Institutions address national science, technology, engineering, and mathematics (STEM) education and workforce needs that are not addressed by the mission agencies. Because of its recognized effectiveness, aspects of NSF's investments in Institutions are often emulated by other programs in government and the private sector (e.g., rigorous pilot- and field-testing of K-12 instructional materials, alignment of education programs with relevant state and professional society content standards). The NSF activities also create a nationwide response to the goals of the program.

Evidence: Other federal agencies have implemented similar programs with guidance from NSF. Evidence of the Institutions program relationship with similar programs may be found in Committee of Visitors reports, Advisory Committee reports, and the Enterprise Information System data (internal) for split funding of projects.

YES 17%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Independent reviews by Committees of Visitors and other external groups provide scrutiny of the portfolio and program goals and results. A number of education programs across the Education and Human Resources Directorate support a growing number of merit-reviewed research projects studying the effectiveness of Instructional Materials Development and associated curricula. Research products are contributing to growth in the knowledge base on student learning and curriculum implementation. The Advisory Committee for GPRA Performance Assessment wrote that NSF demonstrated "significant achievement" with respect to its GPRA Strategic Outcome Goals for People. In reaching this determination, the committee specifically considered indicators that matched the objectives used here for Institutions. NOTE: The weight of this question has been doubled to reflect the importance of independent evaluation in verifying the relevance, quality, and performance of NSF's investment in Institutions.

Evidence: Evaluations of the Institutions program may be found in the Advisory Committee for GPRA Performance Assessment Report (www.nsf.gov/pubs/2004/nsf04216/nsf04216.pdf), annual Performance and Accountability Reports (www.nsf.gov/pubsys/ods/getpub.cfm?par), annual and final project reports, and Committees of Visitors reports and NSF responses. For middle-school mathematics instruction, see the growing list of doctoral dissertations that assess middle-school curriculum efforts and inform future needs in mathematics education (showmecenter.missouri.edu/showme/dissertations.htm).

YES 33%
Section 4 - Program Results/Accountability Score 78%


Last updated: 01/09/2009 (2004 Fall assessment)