ExpectMore.gov


Detailed Information on the
Construction and Operations of Research Facilities Assessment

Program Code 10001145
Program Title Construction and Operations of Research Facilities
Department Name National Science Foundation
Agency/Bureau Name National Science Foundation
Program Type(s) Research and Development Program
Capital Assets and Service Acquisition Program
Competitive Grant Program
Assessment Year 2003
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 100%
Program Management 100%
Program Results/Accountability 90%
Program Funding Level
(in millions)
FY2008 $509
FY2009 $535

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2003

Strengthen project management, including monitoring of performance against performance targets.

Completed Significant steps are: project management courses for NSF program officers; increasing the staff of the Large Facilities Office; and conducting numerous site visits. The Facilities Subcommittee of the Advisory Committee on Business and Operations has issued a report with process improvement recommendations. Improvements have also been made in the Facilities Reporting System, as discussed in NSF's Proud To Be documents.
2003

Continue to strengthen performance targets.

Completed Significant steps are: project management courses for NSF program officers; increasing the staff of the Large Facilities Office; and conducting numerous site visits. The Facilities Subcommittee of the Advisory Committee on Business and Operations has issued a report with process improvement recommendations. Improvements have also been made in the Facilities Reporting System, as discussed in NSF's Proud To Be documents.
2005

Complete program management reforms, including agency facilities oversight.

Completed Significant steps are: project management courses for NSF program officers; increasing the staff of the Large Facilities Office; and conducting numerous site visits. The Facilities Subcommittee of the Advisory Committee on Business and Operations has issued a report with process improvement recommendations. Improvements have also been made in the Facilities Reporting System, as discussed in NSF's Proud To Be documents.
2005

Take necessary steps to ensure that at least 90% of its facilities keep scheduled operating time lost under 10%.

Completed In FY2006, 95% of operational facilities kept scheduled operating time lost to less than 10%.
2006

For all MREFC facilities, keep negative cost and schedule variances to less than 10 percent. The facilities are ALMA, EarthScope, IceCube, SODV, and USAP-SPSM.

Completed NSF has established a rigorous process to identify cost, contingency, and schedule in order to ensure that cost, schedule, and risks are well defined for all phases -- construction planning, design, construction, commissioning, and operations -- prior to the request for funding to Congress.
2007

Senior management reviewed operational oversight requirements as defined in Cooperative Agreements or Contracts for current NSF-supported large facilities, and implemented NSF-wide changes in oversight of large facility operations in FY 2008.

Completed NSF senior management reviewed data on contract or cooperative agreement language. Topics included requirements for annual reviews, the existence of performance goals and metrics, long-range maintenance plans, and safety and cybersecurity standards and reporting requirements.
2007

A Business Systems Review (BSR) is conducted at least once per 5-year award cycle for all large facilities in construction or operations in order to assure compliance with government regulations and NSF policies and procedures.

Completed

Program Performance Measures

Term Type  
Annual Efficiency

Measure: Percent of operational facilities that keep scheduled operating time lost to less than 10%


Explanation: Investments in the operation of state-of-the-art facilities and platforms. The measure in FY 2001 and FY 2002 was based on keeping operating time greater than 90%; results reported here are restated in terms of the present measure.

Year Target Actual
2002 - 84%
2003 - 87%
2004 90% 89.7%
2005 90% 100%
2006 90% 95%
2007 90% 94%
2008 90% 100%
2009 90%
2010 90%
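The measure above reduces to a simple proportion over the facility portfolio. A minimal illustrative sketch (the facility downtime figures below are invented, not taken from the assessment):

```python
# Illustrative only: percent of operational facilities that kept scheduled
# operating time lost below 10%. Downtime fractions are hypothetical.
def percent_meeting_target(lost_time_fractions, threshold=0.10):
    """Return the share (in percent) of facilities whose lost scheduled
    operating time is under the threshold."""
    meeting = sum(1 for lost in lost_time_fractions if lost < threshold)
    return 100.0 * meeting / len(lost_time_fractions)

# Hypothetical portfolio: 19 of 20 facilities under 10% lost time -> 95%,
# the same shape as the FY 2006 result reported above.
portfolio = [0.02] * 19 + [0.12]
print(percent_meeting_target(portfolio))  # 95.0
```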
Long-term Outcome

Measure: External advisory committee (AC/GPA) finding of "significant achievement" that facilities enable discoveries or enhance productivity of NSF research or education communities.


Explanation: Leadership in the development, construction, and operation of major, next-generation facilities.

Year Target Actual
2001 - Success
2002 - Success
2003 - Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success Success
2009 Success
2010 Success
2011 Success
2012 Success
Long-term Outcome

Measure: External advisory committee (AC/GPA) finding of "significant achievement" that NSF has partnerships to support and enable development of large facilities.


Explanation: Expand opportunities for access to state-of-the-art S&E facilities.

Year Target Actual
2001 - Success
2002 - Success
2003 - Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success Success
2009 Success
2010 Success
2011 Success
2012 Success
Annual Efficiency

Measure: For all facilities in the Major Research Equipment and Facilities Construction (MREFC) account, keep cost and schedule variances to ten percent or less of the approved performance baseline.


Explanation: This goal was established in the Construction and Operations of Research Facilities PART assessment conducted in FY 2003. Through FY 2006, the goal applied to as many as 11 construction projects, and the target was for 90 percent of projects to stay within 10 percent of the approved project plan. In FY 2007, the goal was revised to apply to only the five projects named above, and the target was set at 100 percent.

Year Target Actual
2001 - 84%
2002 - 48%
2003 - 88%
2004 90% 100%
2005 90% 79%
2006 90% 72.7%
2007 100% 90%
2008 100% 80%
2009 100%
2010 100%
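The measure compares each project's actual cost and schedule against its approved performance baseline, with a 10-percent tolerance. A hedged sketch of how that test might be applied (project names and figures below are invented for illustration):

```python
# Illustrative only: flag MREFC projects whose cost or schedule overrun
# exceeds 10% of the approved performance baseline. Data are invented; the
# sign convention is simplified (overrun expressed as a positive fraction).
def within_baseline(baseline_cost, actual_cost,
                    baseline_months, actual_months, limit=0.10):
    cost_overrun = (actual_cost - baseline_cost) / baseline_cost
    sched_overrun = (actual_months - baseline_months) / baseline_months
    # Only overruns count against the target; underruns pass automatically.
    return cost_overrun <= limit and sched_overrun <= limit

projects = {
    "Project A": (100.0, 104.0, 48, 50),  # ~4% cost, ~4% schedule overrun
    "Project B": (200.0, 230.0, 60, 62),  # 15% cost overrun -> fails
}
passing = [name for name, p in projects.items() if within_baseline(*p)]
print(passing)  # ['Project A']
```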

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: NSF's Facilities program reflects the parts of NSF's mission directed at programs to strengthen scientific and engineering research potential and to support the development and use of computers and other scientific methods and technologies. The NSF mission ("To promote the progress of science; to advance the national health, prosperity, and welfare; and to secure the national defense, and for other purposes.") is clear and unambiguous, and there is consensus of program purpose among interested parties.

Evidence: National Science Foundation Act of 1950 (www.nsf.gov/home/about/creation.htm); NSF Strategic Plan (www.nsf.gov/pubs/2001/nsf0104/start.htm)

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: NSF's Facilities program supports large, multiuser facilities, which allow researchers access to unique, state-of-the-art facilities that are necessary to advance U.S. capabilities required for world-class research. It also includes small facilities. This program addresses a critical need for tools to support basic research at universities and colleges.

Evidence: * Recent reports, such as that prepared by the National Science Board's (NSB) Taskforce on Science and Engineering Infrastructure (www.nsf.gov/nsb/documents/2003/start.htm), as well as Committee of Visitors (COV) reports and community workshops, support NSF's role in capacity building. (COVs assess approximately one-third of NSF programs each year and review performance over the previous three years. See the FY 2002 Performance and Accountability Report for a schedule of program evaluations.) * GEO Advisory Committee endorsement of the GEO Facilities Plan is an example of this support (www.geo.nsf.gov/geo/adgeo/fac_lrp/facilities_plan.pdf). * NAS Study: Neutrinos and Beyond, New Windows on Nature

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: NSF supports unique facilities to enable research and education activities across the span of disciplines for which the Foundation has responsibility. In contrast, other federal agencies support research focused on specific missions. NSF has a responsibility to consider what large facilities are needed to maintain the nation's leadership in science and engineering. NSF consults with other agencies to avoid duplication and cooperates with other agencies and with international partners in constructing facilities.

Evidence: * The September 2001 report of the National Committee on Organization and Management of Research in Astronomy and Astrophysics recommended that "the National Science Foundation's astronomy and astrophysics responsibilities should not be transferred to NASA." The rationale for this recommendation was based on a thorough analysis of NSF activities in ground-based astronomy and the conclusion that NSF is the appropriate agency to sponsor ground-based astronomy and astrophysics (books.nap.edu/books/0309076269/html/3.html#pagetop). * NSF serves as the lead agency for the NITRD initiative, provides interagency leadership for the National Nanotechnology Initiative (www.nano.gov), and coordinates with the National Science and Technology Council in other areas. * NSF provides the majority of support for ground-based astronomy, the Academic Research Fleet, and facilities at universities, colleges, and other non-profit organizations. * Proposals to this and other NSF programs must identify other agency funding/requests to ensure no unnecessary duplication.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: NSF relies on the competitive merit review process, the NSF Program Officers in their oversight capacity, and Committees of Visitors (COVs) to ensure that facilities are effectively serving their intended communities, and to recommend changes to improve program effectiveness and efficiency. These measures ensure that supporting the acquisition and operation of infrastructure is the most efficient method of facilitating the science in question. Many facilities have "user groups" that communicate regularly with NSF and facilities managers. Merit review by peers has been recognized as a "best practice" for administering R&D programs. Independent reviews by COVs and external groups (e.g., National Research Council, PCAST) provide additional scrutiny of the portfolio and program goals.

Evidence: * FY 2002 Report on the NSF Merit Review Process (www.nsf.gov/nsb/documents/2003/merit_report 2002 final.doc) * FY 2002 Performance and Accountability Report (www.nsf.gov/pubs/2003/nsf03023/pdf/chapter4.pdf) * COV Reports * R&D Investment Criteria

YES 20%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: NSF supports unique facilities to enable research and education activities across the span of disciplines for which the Foundation has responsibility. The peer review process for access to specific facility resources and/or time ensures effective targeting of funding so that results of investments will reach the intended beneficiaries. Committees of Visitors ensure relevance to community needs. In most cases, the National Science Board reviews facility awards to ensure that they are appropriately supportive of NSF's mission.

Evidence: * COV Reports * NAS Study: Neutrinos and Beyond, New Windows on Nature * NRC 2001 Report: Astronomy and Astrophysics in the New Millennium. * Workshop Reports * NSB Report: Science and Engineering Infrastructure for the 21st Century: the Role of NSF * NAS Decadal Review of Astronomy: Astronomy and Astrophysics in the New Millennium (2001)

YES 20%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation:  

Evidence:  

NA  %
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation:  

Evidence:  

NA  %
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Facilities Program is a subset of the Tools Strategic Goal -- providing "broadly accessible, state-of-the-art and shared research and education tools." This reflects the parts of NSF's mission directed at programs to strengthen scientific and engineering research potential, and to support the development and use of computers and other scientific methods and technologies.

Evidence: * NSF Revised Strategic Plan (www.nsf.gov/pubsys/ods/getpub.cfm?nsf0104) * NSF annual GPRA Performance Plans (www.nsf.gov/od/gpra). * A limited number of Tools performance indicators pertain directly to facilities.

YES 9%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Facilities that enable discoveries or enhance productivity of NSF research or education communities: The target of "significant achievement" requires external assessment of facility outcomes based on knowledge of science achievement on a world-wide stage. Partnerships to support and enable development of large facilities: Partnerships require major negotiations with international partners in times of economic uncertainty and must represent "significant achievement" in the view of external assessors.

Evidence: * Advisory Committee for GPRA Performance Assessment (ACGPA) Reports * FY 2004 Budget Request to Congress, Chapters on Tools and the MREFC Account. * FY 2002 Performance and Accountability Report

YES 9%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: Each year, performance indicators that demonstrate progress toward achieving the long-term Facilities goal are delineated in the annual GPRA performance plan. There is also an annual cost and schedule goal for construction and upgrade of facilities and an annual goal related to facility operations.

Evidence: * In FY 2002, committees of external experts determined that NSF had demonstrated significant achievement for all of the annual performance indicators for the TOOLS goal, which includes facilities. * NSF was successful in achieving two of the four goals related to the construction/upgrade and operations of facilities projects. See the NSF FY 2002 Performance and Accountability Report (www.nsf.gov/pubs/2003/nsf03023/start.htm) for additional details.

YES 9%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: Baselines have been established for annual performance measures, and targets for facility performance are reviewed annually. Performance targets are ambitious but commensurate with the budget environment. In addition to program measures, individual projects also set performance targets.

Evidence: * NSF GPRA Performance Reports * FY 2002 Performance and Accountability Report * FY 2004 Budget Request to Congress, Chapters on Tools and the MREFC Account * FY 2004 GPRA Performance Plan

YES 9%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: All partners commit to and work toward the goals of the program. Purpose, responsibilities, and requirements for all partners are spelled out in Cooperative agreements for facilities. These Cooperative agreements specifically require annual reports on progress relative to the project's construction/upgrade or operations goals, as relevant. Memoranda of Understanding (MOUs) and Memoranda of Agreement (MOAs) exist between NSF and partnering organizations.

Evidence: * Annual / Final Project Reports. * GPRA Reporting Requirements in Cooperative Agreements for Facility Awards.

YES 5%
2.6

Are independent and quality evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: Evaluations are conducted regularly at multiple levels in order to inform program improvements and influence program planning. Each program at NSF is reviewed once every three years by a COV. Advisory Committees review and approve COV reports. As of FY 2002, the Advisory Committee for GPRA Performance Assessment makes use of COV reports in its assessment of performance for each Tools indicator applicable to facilities on an NSF-wide basis. NSF conducts workshops, and various facilities have been reviewed by external entities such as the NAS. NSF staff and external experts conduct site visits at NSF-supported facilities. All these activities inform NSF senior management and contribute to development of plans for the agency. (The weight of this question has been increased to reflect the importance NSF places on the conduct of independent evaluations to support program improvements and evaluate effectiveness.)

Evidence: * COV reports and NSF responses. * AC reports, including the Advisory Committee for GPRA Assessment (AC/GPA) report (Fall 2002). * External reviews. * Community workshops. * Annual site visits that include external reviewers for facilities.

YES 18%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Performance information is incorporated into NSF's budget requests. The FY 2004 justification was built around the R&D Criteria, thus highlighting specific performance information for NSF's investment portfolio. Continued funding for facilities is contingent on satisfactory progress and performance with respect to previously established metrics. The budget also clearly presents the resource request for each program and outlines activities supported with the funds. The FY 2004 Request provided full budgetary costing by the program framework in use at that time (Strategic Goals and Directorates). In FY 2005, NSF will display full budgetary cost associated with the new program framework defined in the Revised GPRA Strategic Plan. Facilities submit annual progress reports, and Program Officers conduct site visits with external experts. In contrast to the 2004 PART assessment for TOOLS, in which linkages were not all well defined, direct linkages exist for the Facilities program -- i.e., the MREFC Account and other major facilities.

Evidence: * Detailed plans for MREFC projects and other major facilities are included in the FY 2004 Budget Request to Congress (www.nsf.gov/bfa/bud/fy2004/toc/htm). * Budget submissions to OMB at multiple levels outline performance changes. * NSF's Budget Request to Congress contains milestones for MREFC projects. * Full budgetary costing for MREFC and for Tools is included in the FY 2004 Budget Request to Congress. * Capital Asset Plans. * Site Visit reports * Annual Reports

YES 5%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: NSF solicits public feedback on the agency's goals and planning processes as part of each independent (external) assessment of agency activities. Steps to address specific weaknesses are identified and implemented.

Evidence: * COVs address deficiencies and the program must respond. These reports and responses are reviewed by Advisory Committees for acceptability. * AC reports. * Selection of the Deputy Director, Large Facility Projects. He will coordinate NSF management and oversight activities for all facilities. * FY 2002 Performance and Accountability Report * Inspector General Reports and NSF Responses

YES 9%
2.CA1

Has the agency/program conducted a recent, meaningful, credible analysis of alternatives that includes trade-offs between cost, schedule, risk, and performance goals and used the results to guide the resulting activity?

Explanation: Cost/benefit analysis and risk management are aspects of the planning and decision-making processes when facility investments are considered. Alternative approaches, including cost and risk and utility for research, are considered in advance of project initiation. Research and development is conducted to support these choices and the decision-making process. Design studies examine tradeoffs between different concepts, such as selection of alternate sites (e.g., ALMA) and technical design (e.g., Gemini).

Evidence: Considerations of alternatives are apparent in: * Facility-specific benefit and risk analysis * Requirements from Large Facility Projects guidelines * Committee on the Organization and Management of Research in Astronomy and Astrophysics (COMRAA) Report * R&D prototyping * Site selection process for facilities * FY 2004 Budget Request to Congress * Examples of specific projects for which alternative approaches were considered: * Atacama Large Millimeter Array (ALMA) * Laser Interferometer Gravitational Wave Observatory (LIGO) * Gemini

YES 9%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: Prior to initiating support for new activities, workshops and external reviews are typically conducted to ensure that scientific opportunities justify the facility expenditure, and that supporting the acquisition and operation of infrastructure is the most efficient method of facilitating the science in question. NSF senior management reviews and compares opportunities of competing projects and selects from them, forwarding them for subsequent review and approval to the NSB. Interagency and international agreements and understandings are active and on file for most facilities projects, demonstrating the commitment of NSF to non-duplication and efficient and effective coordination of efforts.

Evidence: * NAS Studies * Workshops * COVs * Merit Review Process * MOUs and MOAs * Advisory Committee Reports * Example of interagency coordination -- High Energy Physics Advisory Panel (HEPAP) between NSF and DOE.

YES 9%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: NSF investments in Major Research Equipment and Facilities Construction have a documented prioritization process. For example, the MREFC Guidelines have been updated over the past year, and the Guidelines will continue to be a living document. Priorities for MREFC were explicitly provided in the FY 2004 Budget Request, as was a discussion of the process. Other facility investments are prioritized utilizing workshops, community-based planning efforts, and with advice from established Advisory Committees. In addition, external groups such as the National Academies provide prioritized recommendations.

Evidence: Examples of documentation include: * MREFC chapter in FY 2004 Budget Request * MREFC Prioritizing Guidelines * Community Planning Documents * GEO Facilities Plan * NAS Studies * AC Reports * High Energy Physics Advisory Panel reports * NAS Decadal Review of Astronomy: Astronomy and Astrophysics in the New Millennium (2001)

YES 9%
Section 2 - Strategic Planning Score 100%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: NSF programs collect high-quality performance data relating to key program goals and use this information to adjust program priorities, make decisions on resource allocations, and make other adjustments in management actions. NSF facilities are unique, and information gathering can vary by facility. All facilities provide annual or more frequent progress reports on operations. NSF also conducts external annual reviews for programs that involve interagency and/or international partners. Program Officers monitor and collect information through weekly to monthly scheduled meetings with facilities managers and appropriate financial, managerial, and scientific staff. This oversight provides current and timely performance information that is meaningful to NSF program management. In agency construction programs, collection of performance data and monitoring can occur as frequently as daily. Collection is accomplished through formal channels of communication with interagency and/or international partners via weekly, quarterly, semiannual, or annual reviews. External reviews are provided at least annually.

Evidence: * Examples of COV reports: FY 2002: Ship Operations; Astronomy facilities; Materials Research facilities. FY 2003: NCAR; Physics facilities. * AC reports, including the AC/GPA report. * GPRA Facilities Reports. * Annual Project Reports. * Enterprise Information System (EIS) data -- GPRA module. * Annual contract performance evaluations. * Site visit reports.

YES 8%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: Facilities are subject to GPRA performance reporting requirements. NSF's contracts and Cooperative Agreements specify expected cost, schedule and performance results. These agreements can be (and have been) terminated in cases where the awardee is unable to meet the terms of the award instrument. NSF Program Officers monitor cost, schedule and technical performance and take corrective action when necessary. NSF has established policies and procedures that require program managers to report to senior management all deviations on cost and schedule. Deviations on cost that are greater than 10% must also be reported to the National Science Board.

Evidence: * Cooperative agreements or contracts for Facilities. * Annual performance evaluations of NSF employees * COV reports * Annual and final reports * GPRA Facilities Performance Reports * A number of facilities have been terminated or phased out based on performance

YES 8%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: NSF, including the facilities program, routinely obligates its funds in a timely manner. A study conducted by PricewaterhouseCoopers found no erroneous payments. NSF's grant monitoring activities assure that the funds are used for their intended purpose.

Evidence: * NSF FY 2001 Risk Assessment for Erroneous Payments * Data on NSF Carryover, found in the NSF's Budget Requests to Congress * Risk Assessment and Award Monitoring Guide * Clean opinion on financial statements for past 5 years

YES 8%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: In most cases, NSF's facilities are unique or one-of-a-kind and not available commercially; hence direct comparisons are not generally possible. In instances where facility capability may be commercially available, cost comparisons, including lease/purchase analysis per OMB Circular A-94, are conducted to determine the most efficient and effective method of providing the required capability. As a result, NSF employs a number of acquisition strategies, including direct purchase/construction, lease, and fixed-duration contract, in providing facility services. Cost-efficiency example: Daily operations costs for the Academic Research Fleet have been analyzed and compared to those of similar Navy and NOAA ships; NSF costs were found to be comparable.

Evidence: * FY 2002 Performance and Accountability Report * COV reports * The Academic Research Fleet report (www.geo.nsf.gov/oce/pubs/fleetrev.html)

YES 8%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: Facility construction projects and operations are coordinated with other federal programs as well as with international partners. For example, the Large Hadron Collider (LHC) is a collaborative international project under construction at the CERN laboratory in Geneva, Switzerland. The U.S. is involved in the construction of two particle detectors: A Toroidal LHC Apparatus (ATLAS) and the Compact Muon Solenoid (CMS). A total of 34 international funding agencies participate in the ATLAS detector project, and 31 in the CMS detector project. NSF and DOE are providing U.S. support. CERN is responsible for meeting the goals of the international LHC project.

Evidence: * Examples of facilities with other federal and international partners include: Large Hadron Collider; Ocean Drilling Program (ODP/IODP); Atacama Large Millimeter Array (ALMA); High Performance Instrumented Airborne Platform for Environmental Research (HIAPER). * Mathematical and Physical Sciences coordinated activities

YES 8%
3.6

Does the program use strong financial management practices?

Explanation: NSF's facilities program uses strong financial management practices. NSF is currently the only federal agency to receive a "green light" for financial management on the PMA scorecard. NSF has received a clean opinion on its financial audits for the last 5 years.

Evidence: * Executive Branch Management Scorecard (website) * Results of NSF financial audits (website)

YES 8%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: NSF has taken several steps to address identified deficiencies in management and oversight. In response to the OIG FY 2002 Management Challenges, NSF has begun updating its policies and procedures to strengthen the management and oversight of large facility projects. NSF's improvements to facilities management and oversight have included: * Developing a Large Facility Projects Management and Oversight Plan, with OIG input sought during its development. This plan provides comprehensive guidelines and procedures for all aspects of facilities planning, management, and oversight; * Appointing a Deputy Director for Large Facility Projects; * Revising goals for facilities to use earned value management practices to evaluate performance, and redesigning the data collection module in FastLane to incorporate these changes; and * Providing continuing long-term senior executive attention to NSF's management challenges and reforms through the Management Controls Committee, which is chaired by the NSF Chief Financial Officer. In NSF's FY 2002 Performance and Accountability Report, the OIG confirms that NSF has taken important first steps toward addressing its facilities-management challenges.

Evidence: * Selection of Deputy Director, Large Facility Projects, who will coordinate NSF management and oversight activities for facilities. * Large Facility Projects Management and Oversight Plan (September 2001) * NSF FY 2002 Performance and Accountability Report * The NSF Academy provides management coursework. * Booz Allen Hamilton contract for a multi-year business analysis. * COVs address deficiencies and the program must respond; these reports and responses are reviewed by Advisory Committees for acceptability. * Revised goals for facilities that use earned value management practices to evaluate performance; the data collection module in FastLane incorporates these changes.

YES 8%
3.CA1

Is the program managed by maintaining clearly defined deliverables, capability/performance characteristics, and appropriate, credible cost and schedule goals?

Explanation: Construction projects are managed using annual cost and schedule goals as well as through "earned value". Facilities that have transitioned to an operations mode have annually defined deliverables.

Evidence: * Large Facility Project Guidelines * GPRA performance goals * Annual / Final Project Reports

YES 8%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: NSF facilities support is allocated through a competitive process that uses merit review. Although many facility operation grants are renewed, continuation of support is based on a merit-reviewed proposal. As a result of NSB guidance to periodically recompete facility grants, NSF considers whether an expiring grant should be recompeted, and the default is to do so barring extenuating circumstances. (The weight of this question has been increased to reflect the importance of external merit review in validating the quality of this basic research program.)

Evidence: * NSB Policy on Recompetition * FY 2002 Report on the NSF Merit Review System * NSF Performance and Accountability Reports * Enterprise Information System (EIS)

YES 20%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: See Item 3.1 for current oversight mechanisms. Oversight mechanisms are currently sufficient, but projects are beginning to exceed NSF's capacity to provide adequate oversight. This was raised as a management challenge in FY 2002, and NSF is addressing the increased oversight requirements in A&M budget plans. NSF is using technology, such as teleconferencing and videoconferencing, to enhance performance oversight within current resource constraints. In FY 2002 NSF established a formal Award Monitoring and Technical Assistance Program (AM&TAP), based on financial and administrative risk assessment of NSF awardee institutions and with a primary focus on on-site monitoring. Consistent with NSF's existing award administration process, AM&TAP is a collaborative effort between administrative and financial managers/technical staff and NSF program managers.

Evidence: * COV reports * Quarterly / Annual and Final Project Reports. * Directorate Reviews * MREFC Panel Review * FY 2002 Report on the NSF Merit Review System * NSB Review * Consultants and external review committees * Annual reviews * Risk Assessment and Award Monitoring Guide * Facilities Management and Oversight Guide

YES 8%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: Annual performance data on facilities construction and operations are available through past GPRA Performance Reports and the combined Performance and Accountability Report. These reports are publicly available.

Evidence: * GPRA Performance Reports * Performance and Accountability Report * FY 2004 Budget Request

YES 8%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: All NSF programs are administered as competitive grant programs.

Evidence:  

NA 0%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation:  

Evidence:  

NA  %
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation:  

Evidence:  

NA  %
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation:  

Evidence:  

NA  %
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: NSF achieved its FY 2002 GPRA goal for TOOLS -- providing "broadly accessible, state-of-the-art and shared research and education tools."

Evidence: * FY 2002 Performance and Accountability Report. * AC/GPA

YES 15%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: NSF achieved 2 of the 4 GPRA goals for Facilities Construction and Operations in FY 2002. Goals achieved: (1) Annual Construction and Upgrade Expenditures: Of the 28 construction and upgrade projects supported by NSF in FY 2002, 26 (93%) were within 110% of annual expenditure plans. (2) Construction and Upgrade Total Cost (for projects initiated after 1996): Two projects were completed in FY 2002, one of which had been initiated prior to 1996. Goals not achieved: (1) Meeting Annual Schedule Milestones: Of the 27 construction and upgrade projects NSF supported, 13 (48%) met all annual schedule milestones, compared to the goal of 90%. (2) Operating Time: Of 31 reporting facilities, 26 (84%) kept unscheduled downtime below 10% of total scheduled operating time, compared to the goal of 90%. In FY 2003, NSF will combine cost and schedule performance into a single goal. The revised goals are calculated using the Earned Value technique, a project management tool for measuring progress that recognizes that cost or schedule data alone can lead to distorted perceptions of performance.
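
The Earned Value technique combines cost and schedule data by comparing the budgeted value of work actually performed against both the plan and actual spending, avoiding the distortions that either measure alone can produce. A minimal illustrative sketch of the standard indicators, using hypothetical figures rather than NSF project data:

```python
# Illustrative sketch of the Earned Value technique referenced above.
# All dollar figures below are hypothetical, not drawn from NSF projects.

def earned_value_metrics(pv, ev, ac):
    """Return standard earned-value indicators.

    pv: planned value (budgeted cost of work scheduled to date)
    ev: earned value (budgeted cost of work actually performed)
    ac: actual cost (actual cost of work performed)
    """
    return {
        "cost_variance": ev - ac,               # > 0 means under budget
        "schedule_variance": ev - pv,           # > 0 means ahead of schedule
        "cost_performance_index": ev / ac,      # > 1 means cost-efficient
        "schedule_performance_index": ev / pv,  # > 1 means ahead of plan
    }

# A project that planned $10M of work to date, completed $9M of it,
# and spent $9.5M doing so is both behind schedule and over cost:
metrics = earned_value_metrics(pv=10.0, ev=9.0, ac=9.5)
```

Because both indices are anchored to the same earned value, a project that looks on budget but has accomplished little (or on schedule but at great expense) is flagged, which is the distortion the combined FY 2003 goal is designed to avoid.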

Evidence: * FY 2002 Performance and Accountability Report.

LARGE EXTENT 10%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: Facilities are improving efficiency by developing instrumentation that uses state-of-the-art technology to provide greater data-gathering capability, including more efficient use of equipment and improved transmission rates (e.g., the Gemini telescope). Upgrades to facilities provide improved technologies and enable more efficient operations. For example, telescopes carrying out long-term observations are scheduled using Q-scheduling, a queue-based technique that significantly improves telescope utilization.

Evidence: Specific examples of efficiencies: * Instrumentation at National Observatories takes data at rates hundreds of times faster than in the past. * Development of high-speed internet connections to Hawaii and South America for transmission of data to users. * Remote access to facilities enables increased cost efficiencies and easier access to results.

YES 15%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: NSF uses competitive merit review to allocate the vast majority of its basic and applied research funds. NSF-supported construction and upgrade projects are routinely within estimated costs. COVs and ACs assess program performance in light of their knowledge of programs throughout the government.

Evidence: * NSF FY 2002 Performance and Accountability Report * COV reports * AC reports.

YES 15%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: Independent assessments of components of the TOOLS program find that the program is effective. External experts noted that NSF demonstrated significant achievement for the FY 2002 performance indicators associated with the TOOLS strategic outcome. (The weight of this question has been increased to reflect the importance of independent evaluations in assessing effectiveness of basic research programs.)

Evidence: * COV reports and NSF responses. * AC Reports. * FY 2002 Performance Report. * External Reports (e.g. NAS Reports).

YES 25%
4.CA1

Were program goals achieved within budgeted costs and established schedules?

Explanation: NSF achieved 2 of the 3 GPRA goals for Facilities Construction. Goals achieved: (1) Annual Construction and Upgrade Expenditures: Of the 28 construction and upgrade projects supported by NSF in FY 2002, 26 (93%) were within 110% of annual expenditure plans. (2) Construction and Upgrade Total Cost (for projects initiated after 1996): Two projects were completed in FY 2002, one of which had been initiated prior to 1996. Goal not achieved: (1) Meeting Annual Schedule Milestones: Of the 27 construction and upgrade projects NSF supported, 13 (48%) met all annual schedule milestones, compared to the goal of 90%. In FY 2003, NSF will combine cost and schedule performance into a single goal. The revised goals are calculated using the Earned Value technique, a project management tool for measuring progress that recognizes that cost or schedule data alone can lead to distorted perceptions of performance.

Evidence: * FY 2002 Performance and Accountability Report.

LARGE EXTENT 10%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA  %
Section 4 - Program Results/Accountability Score 90%


Last updated: 01/09/2009 (2003 Fall assessment)