This is the accessible text file for GAO report number GAO-09-956T 
entitled 'Small Business Innovation Research: Observations on Agencies' 
Data Collection and Eligibility Determination Efforts' which was 
released on August 7, 2009. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Testimony: 

Before the Committee on Commerce, Science, and Transportation, U.S. 
Senate: 

United States Government Accountability Office: 
GAO: 

For Release on Delivery: 
Expected at 2:30 p.m. EDT:
Thursday, August 6, 2009: 

Small Business Innovation Research: 

Observations on Agencies' Data Collection and Eligibility Determination 
Efforts: 

Statement of Patricia A. Dalton, Managing Director: 
Natural Resources and Environment: 

GAO-09-956T: 

GAO Highlights: 

Highlights of GAO-09-956T, a testimony before the Committee on 
Commerce, Science, and Transportation, U.S. Senate. 

Why GAO Did This Study: 

The Small Business Innovation Development Act of 1982 established the 
Small Business Innovation Research program (SBIR) to stimulate 
technological innovation, use small businesses to meet federal research 
and development (R&D) needs, foster and encourage participation by 
minority and disadvantaged persons in technological innovation, and 
increase private sector commercialization of innovations derived from 
federal R&D. Since the program’s inception, GAO has conducted numerous 
reviews of the SBIR program. This statement summarizes GAO’s past 
findings on the SBIR program’s (1) successes and challenges, (2) data 
collection issues that affect program monitoring and evaluation, and 
(3) how agencies make eligibility determinations for the program. 

GAO is not making any new recommendations in this statement. 

What GAO Found: 

Between July 1985 and June 1999, GAO found that the SBIR program was 
achieving its goals to enhance the role of small businesses in federal 
R&D, stimulate commercialization of research results, and support the 
participation of small businesses owned by women and/or disadvantaged 
persons. More specifically, GAO found that throughout the life of the 
program, awards have been based on technical merit and are generally of 
good quality. In addition, the SBIR program successfully attracts many 
qualified companies, has had a high level of competition, consistently 
has had a high number of first-time participants, and attracts hundreds 
of new companies annually. Further, SBIR has helped serve agencies’ 
missions and R&D needs, although GAO found that agencies differ in the 
emphasis they place on funding research to support their mission versus 
more generalized research. During these reviews, GAO also identified 
areas of weakness and made recommendations that could strengthen the 
program further. Many of these recommendations have been either fully 
or partially addressed by the Congress in various reauthorizations of 
the program or by the agencies themselves. However, in 2005, GAO found 
that the issue of how to assess the performance of the SBIR program 
remained somewhat unresolved after almost two decades, and identified 
data and information gaps that make assessment of the SBIR program a 
challenge. 

Many of the solutions to improve the SBIR program rely, in part, on 
collecting better data and establishing a government-use database, so 
that SBA and participating agencies can share information and enhance 
their efforts to monitor and evaluate the program. However, 
in 2006, GAO reported that SBA was 5 years behind schedule in complying 
with a congressional mandate to develop a government-use database that 
could facilitate agencies’ monitoring and evaluation efforts. Moreover, 
the information that SBA was collecting for the database was incomplete 
and inconsistent, thereby limiting its usefulness. In 2006, SBA told 
GAO that it expected to have the government-use database operational 
early in fiscal year 2007. However, the database did not become 
operational until October 2008 and currently contains 2 years of new 
data, according to an SBA official. The database also does not permit 
information to be entered in an inconsistent format. 

In 2006, GAO also found that SBA, NIH, and DOD focus on a few select 
criteria to determine the eligibility of applicants for SBIR awards. 
GAO reported that both NIH and DOD largely relied on applicants to self-
certify that they met all of the SBIR eligibility criteria as part of 
their SBIR applications, although both made additional efforts to 
ensure the accuracy of the information when they observed discrepancies 
in the applications. When the agencies were unable to verify the 
eligibility of an applicant, they referred the application to SBA for 
an eligibility determination. GAO found that when SBA finds an 
applicant to be ineligible for the SBIR program, it places this 
information on its Web site but does not consistently identify that the 
ineligibility determination was made for the SBIR program. 

View [hyperlink, http://www.gao.gov/products/GAO-09-956T] or key 
components. For more information, contact Patricia Dalton at (202) 512-
3841 or daltonp@gao.gov. 

[End of section] 

Mr. Chairman and Members of the Committee: 

We are pleased to be here today to testify on our past work on the 
Small Business Innovation Research (SBIR) program. As you know, to be 
competitive in the global economy, the United States relies heavily on 
innovation through research and development (R&D). Recognizing the 
potential of small businesses to be a source of significant innovation, 
the Congress passed the Small Business Innovation Development Act of 
1982.[Footnote 1] The act established the SBIR program to stimulate 
technological innovation, use small businesses to meet federal R&D 
needs, foster and encourage participation by minority and disadvantaged 
persons in technological innovation, and increase private sector 
commercialization of innovations derived from federal R&D. The act 
provided for a three-phased program: phase I to determine the 
feasibility and scientific and technical merit of a proposed research 
idea; phase II to further develop the idea; and phase III to 
commercialize the resulting product or process with no further SBIR 
funding. 

Federal agencies that have budgets of more than $100 million for 
research conducted by others, called extramural research, are required 
to use 2.5 percent of these budgets to establish and operate an SBIR 
program. 
Currently, 11 federal agencies participate in the SBIR program. Each 
agency manages its own program, including targeting research areas, 
reviewing proposed projects, and making research awards through grants, 
contracts, or cooperative agreements. The Small Business Administration 
(SBA) plays a central administrative role by, for example, issuing 
policy directives to the participating federal agencies, collecting 
data from participating agencies on awards and recipients, and 
reporting program results annually to the Congress. In 2005, awards 
from three agencies--the Department of Defense (DOD), National 
Institutes of Health (NIH), and National Aeronautics and Space 
Administration (NASA)--accounted for the majority of SBIR funds. From 
the program's inception in fiscal year 1983 through fiscal year 2004, 
federal agencies had awarded over $17 billion for more than 82,000 
projects. 
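
As a rough illustration of the set-aside arithmetic described above, 
the following Python sketch computes the 2.5 percent set-aside for a 
hypothetical agency. The $400 million budget figure is an assumption 
made only for illustration; it does not come from GAO's work. 

# Illustrative sketch of the SBIR set-aside arithmetic (Python).
# Only the $100 million threshold and the 2.5 percent rate come from
# the program description above; the example budget is hypothetical.

SBIR_THRESHOLD = 100_000_000   # extramural research budget threshold, in dollars
SBIR_RATE = 0.025              # 2.5 percent of the extramural budget

def sbir_set_aside(extramural_budget: float) -> float:
    """Return the SBIR set-aside for an extramural budget, or 0.0 if the
    agency falls below the participation threshold."""
    if extramural_budget <= SBIR_THRESHOLD:
        return 0.0
    return extramural_budget * SBIR_RATE

# A hypothetical agency with a $400 million extramural research budget
# would devote $10 million to its SBIR program.
print(sbir_set_aside(400_000_000))  # 10000000.0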

Since it was established in 1982, the SBIR program has been 
reauthorized and modified by the Congress at various times. For 
example, the Small Business Research and Development Enhancement Act of 
1992 directed SBA and participating agencies to, among other things, 
emphasize the goal of increasing commercialization of research results 
and to improve the government's dissemination of program-related data. 
[Footnote 2] As a result, agencies were required to include 
commercialization potential as a criterion for selecting award 
recipients. During this same period, SBA began to develop a publicly 
available database, known as Tech-Net, that contained information on 
all awards made through the SBIR program. The Tech-Net database is 
intended to be, among other things, an electronic gateway of technology 
information and resources for researchers, scientists, and government 
officials about federally funded, leading edge technology research. The 
Small Business Innovation Research Program Reauthorization Act of 2000 
formalized this database by requiring SBA to develop, maintain, and 
make available to the public a searchable, up-to-date, electronic 
database that contained SBIR award information.[Footnote 3] The 2000 
reauthorization act also required SBA to develop and maintain another 
restricted government database that would contain additional 
information on commercialization not contained in the public Tech-Net 
database, thereby allowing better evaluations of the SBIR program on an 
ongoing basis.[Footnote 4] This database was to be established by mid- 
2001 and made available only to government agencies and certain other 
authorized users. SBA has established, through a policy directive, a 
series of data elements for all the agencies to submit for its public 
Tech-Net database.[Footnote 5] The SBIR program is currently being 
considered by the Congress for reauthorization, and both the House and 
Senate have recently passed bills to reauthorize the program. 

In this context, you asked us to summarize the successes and challenges 
that our past work has identified about the SBIR program, summarize the 
concerns we have previously identified on SBA's efforts to establish an 
interagency database that includes information on SBIR applicants and 
awards, and describe the process that agencies use to determine the 
eligibility of SBIR applicants for the program. This statement is based 
largely on our prior reviews of the SBIR program and contacts with SBA 
officials. Our work on the prior reviews was conducted in accordance 
with generally accepted government auditing standards. Those standards 
require that we plan and perform the audits to obtain sufficient, 
appropriate evidence to provide a reasonable basis for our findings and 
conclusions based on our audit objectives. We believe that the evidence 
we obtained for those reviews provided a reasonable basis for our 
findings and conclusions based on our audit objectives. 

Summary: 

Over the life of the SBIR program, we have reviewed and reported on its 
implementation many times. For example, between July 1985 and June 
1999, we found that the SBIR program is achieving its goals to enhance 
the role of small businesses in federal R&D, stimulate 
commercialization of research results, and support the participation of 
small businesses owned by women and/or disadvantaged persons.[Footnote 
6] Participating agencies and companies that we surveyed during our 
reviews generally rated the program highly. We also identified areas of 
weakness and made recommendations that could strengthen the program 
further. Many of our recommendations for program improvements have been 
either fully or partially addressed by the Congress when it 
reauthorized the program or by the agencies themselves. However, in 
2005, we noted one issue that continued to remain somewhat unresolved 
after almost two decades of program implementation--how to assess the 
SBIR program's performance--and we identified data and information gaps 
that make an assessment of the SBIR program a challenge. In 2006, we 
conducted two reviews of the SBIR program.[Footnote 7] The first review 
described how DOD, NIH, and SBA verify the eligibility of SBIR 
applicants, and the second examined SBA's and eight participating 
agencies' efforts to collect data and establish a government-use 
database that would facilitate monitoring and evaluation of the 
program. In summary, we found the following: 

* SBA had not met the congressional mandate to develop and implement, 
by June 2001, a government-use database for monitoring and evaluating 
the SBIR program. SBA officials told us that they had been unable to 
meet the requirement to implement such a database by 2001 because of 
management changes that had occurred at the agency and because of 
budgetary constraints, but expected to have it operational by early in 
fiscal year 2007. However, this database did not become operational 
until October 2008, according to an SBA official. 

* Although federal agencies participating in the SBIR program annually 
submit a wide range of descriptive information to SBA about each award 
they make, they were not consistently providing the full range of 
required data elements. As a result, certain sections of the Tech-Net 
database needed for comprehensive program evaluation were incomplete. 
Agencies cited a variety of reasons for not providing all of the data 
elements, including frequent changes in SBA's data requirements and 
differences in the types of data agencies collect versus the types of 
data that SBA outlined in its policy directive. 

* Some participating agencies were not submitting SBIR award data in 
the standard format established in SBA's policy directive. For example, 
almost a quarter of the data provided by five participating agencies in 
2004 and 2005 did not comply with SBA's formatting guidance. In light 
of the problems we identified with the Tech-Net database and the 
implications for these errors to limit evaluations of the SBIR program, 
we recommended that SBA work with participating agencies to strengthen 
efforts to improve the quality of the data. According to an SBA 
official, as of October 2008, agencies can directly enter SBIR-related 
data into the Tech-Net database over the Internet in a way that does 
not accept incorrectly formatted data. 

* To determine a firm's eligibility for the SBIR program, DOD, NIH, and 
SBA focus primarily on criteria relating to ownership, for-profit 
status, and the number of employees. The agencies primarily rely on the 
applicants' self-certification of eligibility, although in some cases 
they may take additional steps to verify this information. When agency 
officials are unable to ensure the accuracy of an applicant's 
information, they refer the matter to SBA. After SBA makes an 
eligibility determination, it makes information about ineligible firms 
available on its Web site, but it does not always indicate that the 
determination was for SBIR purposes. Once agencies receive SBA's 
determination of eligibility, they may or may not have a process to 
share this information across the agency. 

Successes and Challenges of the SBIR Program: 

Our reviews of the SBIR program between 1985 and 1999 found numerous 
examples of program successes such as the following: 

* Funding high-quality research. Throughout the life of the program, 
awards have been based on technical merit and are generally of good 
quality. 

* Encouraging widespread competition. The SBIR program successfully 
attracts many qualified companies, has had a high level of competition, 
consistently has had a high number of first-time participants, and 
attracts hundreds of new companies annually. 

* Providing effective outreach. SBIR agencies consistently reach out to 
foster participation by women-owned or socially and economically 
disadvantaged small businesses by participating in regional small 
business conferences and workshops targeting these types of small 
businesses. 

* Increasing successful commercialization. At various points in the 
life of the program we have reported that SBIR has succeeded in 
increasing private sector commercialization of innovations. 

* Helping to serve mission needs. SBIR has helped serve agencies' 
missions and R&D needs, although we found that agencies differ in the 
emphasis they place on funding research to support their mission versus 
more generalized research. 

Our reviews of the SBIR program during that time have also identified a 
number of areas of weakness that, over time, have been either fully or 
partially addressed by the Congress in reauthorizing the program or by 
the agencies themselves. For example, 

* Duplicate funding. In 1995,[Footnote 8] we identified duplicate 
funding for similar, or even identical, research projects by more than 
one agency. A few companies received funding for the same proposals 
two, three, and even five times before agencies became aware of the 
duplication. Contributing factors included fraudulent nondisclosure by 
companies applying for awards, the lack of a consistent 
definition for key terms such as "similar research," and the lack of 
interagency sharing of data on awards. To address these concerns, we 
recommended that SBA take three actions: (1) determine if the 
certification form needed to be improved and make any necessary 
revisions, (2) develop definitions and guidelines for what constitutes 
"duplicative" research, and (3) provide interagency access to current 
information regarding SBIR awards. In response to our recommendations, 
SBA strengthened the language agencies use in their application 
packages to clearly warn applicants about the illegality of entering 
into multiple agreements for essentially the same effort. In addition, 
SBA planned to develop Internet capabilities to provide SBIR data 
access for all of the agencies. 

* Inconsistent interpretations of extramural research budgets. In 1998, 
[Footnote 9] we found that while agency officials adhered to SBIR's 
program and statutory funding requirements, they used differing 
interpretations of how to calculate their "extramural research 
budgets." As a result, some agencies were inappropriately including or 
excluding some types of expenses. We recommended that SBA provide 
additional guidance on how participating agencies were to calculate 
their extramural research budgets. The Congress addressed this program 
weakness in 2000, when it required that the agencies report annually to 
SBA on the methods used to calculate their extramural research budgets. 

* Geographical concentration of awards. In 1999,[Footnote 10] in 
response to congressional concerns about the geographical concentration 
of SBIR awards, we reported that companies in a small number of states, 
especially California and Massachusetts, had submitted the most 
proposals and won the majority of awards. The distribution of awards 
generally followed the pattern of distribution of non-SBIR expenditures 
for R&D, venture capital investments, and academic research funds. We 
reported that some agencies had undertaken efforts to broaden the 
geographic distribution of awards. In the 2000 reauthorization of the 
program, the Congress directed the SBA Administrator to establish the 
Federal and State Technology (FAST) Partnership Program to help 
strengthen the technological competitiveness of small businesses, 
especially in those states that receive fewer SBIR grants. The FAST 
Program was not reauthorized when it expired in 2005. In 2006, when we 
looked at the geographical concentration of awards made by DOD and NIH, 
we found that while a firm in every state received at least one SBIR 
award from both agencies, SBIR awards continued to be concentrated in a 
handful of states, and about one-third of awards had been made to firms 
in California and Massachusetts.[Footnote 11] 

* Clarification on commercialization and other SBIR goals. Finally, in 
2000, the Congress directed the SBA Administrator to require companies 
applying for a phase II award to include a commercialization plan with 
their SBIR proposals. This addressed our continuing concern that 
clarification was needed on the relative emphasis that agencies should 
give to a company's commercialization record and SBIR's other goals 
when evaluating proposals. In addition, in 2001, SBA initiated efforts 
to develop standard criteria for measuring commercial and other 
outcomes of the SBIR program and incorporate these criteria into its 
Tech-Net database. In fiscal year 2002, SBA further enhanced the 
reporting system to include commercialization results that would help 
establish an initial baseline rate of commercialization. In addition, 
small business firms participating in the SBIR program are required to 
provide information annually on sales and investments associated with 
their SBIR projects. 

SBIR Tech-Net Database Limitations: 

Many of the solutions cited above to improve and strengthen the SBIR 
program relied to some extent on the collection of data or the 
establishment of a government-use database, so that SBA and 
participating agencies could share information and enhance their 
efforts to monitor and evaluate the program. However, in 2006,[Footnote 
12] we reported that SBA was 5 years behind schedule in complying with 
the congressional mandate to develop a government database that could 
facilitate agencies' monitoring and evaluation of the program. We also 
reported that the information SBA was collecting for the database was 
incomplete and inconsistent, thereby limiting its usefulness for 
program evaluations. Specifically, we identified the following concerns 
with SBA's data-gathering efforts: 

* SBA had not met its obligation to implement a restricted government- 
use database that would allow SBIR program evaluation as directed by 
the 2000 SBIR reauthorization act. As outlined in the legislation, SBA, 
in consultation with federal agencies participating in the SBIR 
program, was to develop a secure database by June 2001 and maintain it 
for program evaluation purposes by the federal government and certain 
other entities. SBA planned to meet this requirement by expanding the 
existing Tech-Net database to include a restricted government-use 
section that would be accessible only to government agencies and other 
authorized users. In constructing the government-use section of the 
database, SBA planned to supplement data already gathered for the 
public-use section of the Tech-Net database with information from SBIR 
recipients and from participating agencies on commercialization 
outcomes for phase II SBIR awards. However, according to SBA officials, 
the agency was unable to meet the statutory requirement, primarily 
because of increased security and other information technology project 
requirements, agency management changes, and budgetary constraints. 
When we reported on this lack of compliance with the database mandate, 
SBA told us that it anticipated having the government-use section of 
the Tech-Net database operational early in fiscal year 2007. However, 
according to an SBA official, the database became operational in 
October 2008, and agencies have begun to provide data on their SBIR 
programs using the Internet. 

* While federal agencies participating in the SBIR program submitted a 
wide range of descriptive award information to SBA annually, these 
agencies did not consistently provide all of the required data 
elements. As outlined in SBA's policy directive, each year, SBIR 
participating agencies are required to collect and maintain information 
from recipients and provide it to SBA so that it can be included in the 
Tech-Net database. Specifically, the policy directive established over 
40 data elements for participating agencies to report for each SBIR 
award they make; a number of these elements are required. These data 
include award-specific information, such as the date and amount of the 
award, an abstract of the project funded by the award, and a unique 
tracking number for each award. Participating agencies are also 
required to provide data about the award recipient, such as gender and 
socio-economic status, and information about the type of firms that 
received the awards, such as the number of employees and geographic 
location. Much of the data that participating agencies collect are 
provided by the SBIR applicants when they apply for an award. Agencies 
provide additional information, such as the grant/contract number and 
the dollar amount of the award, after the award is made. For the most 
part, all of the agencies we reviewed in 2006 provided the majority of 
the data elements outlined in the policy directive. However, some of 
the agencies were not providing the full range of required data 
elements. As a result, SBA did not have complete information on the 
characteristics of all SBIR awards made by the agencies. SBA officials 
told us that agencies did not routinely provide all of the data 
elements outlined in the policy directive because either they did not 
capture the information in their agency databases or they were not 
requesting the information from the SBIR applicants. Officials at the 
participating agencies cited additional reasons for the incomplete data 
they provided to SBA. For example, some officials noted that SBA's Tech-
Net annual reporting requirements often change, and others said that if 
the company or contact information changes and the SBIR recipient fails 
to provide updated information to the agency, the agency cannot provide 
this information to SBA. 

* Participating agencies were providing some data that were 
inconsistent with SBA's formatting guidance, and while some of these 
inconsistencies 
were corrected by SBA's quality assurance processes, others were not. 
In 2006,[Footnote 13] we determined that almost a quarter of the data 
provided by five of the eight agencies we reviewed was incorrectly 
formatted for one or more fields in the Tech-Net database. As a result, 
we concluded that these inconsistent or inaccurate data elements 
compromised the value of the database for program evaluation purposes. 
SBA's quality assurance efforts focus on obtaining complete and 
accurate data for those fields essential to tracking specific awards, 
such as the tracking number and award amount, rather than on those 
fields that contain demographic information about the award recipient. 
We found that SBA electronically checked the data submitted by the 
participating agencies to locate and reformat inconsistencies, but it 
did not take steps to ensure that all agency-provided data were 
accurate and complete. We also determined that inconsistencies or 
inaccuracies could arise in certain data fields because SBA interpreted 
the absence of certain data elements as a negative entry without 
confirming the accuracy of such an interpretation with the agency. As 
we reported in 2006, such inaccuracies and inconsistencies were a 
concern because information in the Tech-Net database would be used to 
populate the government-use section of the database that SBA was 
developing (as discussed above) to support SBIR program evaluations. 
However, at the time of our review, SBA had no plans to correct any of 
the errors or inconsistencies in the database that related to the 
historical data already collected. As a result, we concluded that the 
errors in the existing database would migrate to the government-use 
section of the database and would compromise the usefulness of the 
government-use database for program evaluation and monitoring purposes. 

To address the concerns that we identified with regard to the quality 
of the data that SBA was collecting for the Tech-Net database, we 
recommended in our 2006 report that SBA work with the participating 
agencies to strengthen the completeness, accuracy, and consistency of 
its data collection efforts. According to an SBA official, the database 
is currently operational and agencies have entered data for fiscal 
years 2007 and 2008 over the Internet. Moreover, according to this 
official, the system is set up in such a way that it does not accept 
incorrectly formatted data. 
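
The following Python sketch illustrates the kind of up-front checking 
described above, in which a submission with missing or incorrectly 
formatted fields is rejected rather than stored. The field names and 
formatting rules are assumptions made for illustration only; they are 
not drawn from SBA's policy directive or the actual Tech-Net system. 

import re
from typing import List

# Hypothetical validation rules for a handful of award fields; the real
# Tech-Net data elements and formats are defined in SBA's policy
# directive and are not reproduced here.
RULES = {
    "tracking_number": re.compile(r"^[A-Z0-9-]{5,20}$"),
    "award_amount":    re.compile(r"^\d+(\.\d{2})?$"),      # dollars
    "award_date":      re.compile(r"^\d{4}-\d{2}-\d{2}$"),  # YYYY-MM-DD
}
REQUIRED = set(RULES)

def validate_submission(record: dict) -> List[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for field in sorted(REQUIRED):
        value = record.get(field, "")
        if not value:
            problems.append(f"missing required field: {field}")
        elif not RULES[field].match(str(value)):
            problems.append(f"incorrectly formatted field: {field}")
    return problems

# A record with a malformed date would be rejected rather than stored.
print(validate_submission({"tracking_number": "ABC-123",
                           "award_amount": "750000.00",
                           "award_date": "10/19/2006"}))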

Agencies Focus on Select Awardee Eligibility Criteria: 

In 2006,[Footnote 14] we also found that SBA and some participating 
agencies focused on a few select criteria for determining applicants' 
eligibility for SBIR awards. Specifically, we reviewed DOD's, NIH's, 
and SBA's processes to determine eligibility of applicants for the SBIR 
program and found that they focused largely on three SBIR criteria in 
their eligibility reviews--ownership, size in terms of the number of 
employees, and for-profit status of SBIR applicants. Agency officials 
also told us, however, that they consider information on the full range 
of criteria, such as whether the principal investigator is employed 
primarily by the applying firm and the extent to which work on the 
project will be performed by others. 
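
A minimal Python sketch of the kind of screening these three criteria 
imply appears below. The field names, the majority-ownership test, and 
the 500-employee size cap are assumptions made for illustration; they 
are not taken from the agencies' actual review procedures. 

from dataclasses import dataclass

@dataclass
class Applicant:
    """Hypothetical applicant record used only for illustration."""
    owned_by_qualifying_individuals_pct: float  # percent ownership
    employee_count: int
    is_for_profit: bool

def passes_basic_screen(a: Applicant, max_employees: int = 500) -> bool:
    """Rough sketch of the three criteria emphasized above: ownership,
    size, and for-profit status. Actual eligibility reviews consider the
    full range of SBIR criteria and rely on applicant self-certification."""
    return (a.owned_by_qualifying_individuals_pct > 50.0
            and a.employee_count <= max_employees
            and a.is_for_profit)

print(passes_basic_screen(Applicant(60.0, 120, True)))   # True
print(passes_basic_screen(Applicant(40.0, 120, True)))   # False: ownership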

Moreover, we found that both NIH and DOD largely relied on applicants 
to self-certify that they met all of the SBIR eligibility criteria as 
part of their SBIR applications. For example, at NIH, applicants 
certified that they met the eligibility criteria by completing a 
verification statement when NIH notified them that their application 
had been selected for funding but before NIH made the award. The 
verification statement directs applicants to respond to a series of 
questions relating to for-profit status, ownership, number of 
employees, where the work would be performed, and the primary 
employment of the principal investigator, among others. Similarly, 
DOD's cover sheet for each SBIR application directs applicants to 
certify that they met the program's eligibility criteria. NIH and DOD 
would not fund applications if the questions on their agency's 
verification statement or cover sheet were not answered. Both NIH and 
DOD also warned applicants of the civil and criminal penalties for 
making false, fictitious, or fraudulent statements. In some cases the 
agencies made additional efforts to ensure the accuracy of the 
information applicants provided when they observed certain 
discrepancies in the applications. 

In 2006,[Footnote 15] we reported that when officials at the agencies 
had unresolved concerns about the accuracy of an applicant's 
eligibility information, they referred the matter to SBA to make an 
eligibility determination. We found that when SBA received a letter 
from the agency detailing its concerns, SBA officials contacted the 
applicants, asked them to recertify their eligibility status, and, in 
some cases, requested additional documentation on the criteria of 
concern. Upon 
making a determination of eligibility, SBA then notified the official 
at the inquiring agency, and the applicant, of its decision. 

Although SBA made the information about firms it found ineligible 
publicly available on its Web site so that all participating agencies 
and the public could access the information, we found that it did not 
consistently include information on the Web site identifying whether or 
not the determination was for the SBIR program. An SBA official told us 
the agency planned to include such information on its Web site more 
systematically before the end of fiscal year 2006. Once the agencies 
received information about applicants' eligibility, they also had 
different approaches for retaining and sharing this information. For 
example, while both NIH and DOD noted the determination of 
ineligibility in the applicant's file, NIH also centrally tracked 
ineligible firms and made this information available to all of its 
institutes and centers that make SBIR awards. In contrast, DOD did not 
have a centralized process to share the information across its awarding 
components, although DOD officials told us it was common practice for 
awarding components to share such information electronically. 

In conclusion, Mr. Chairman, while the SBIR program is generally 
recognized as a successful program that has encouraged innovation and 
helped federal agencies achieve their R&D goals, it has continued to 
suffer from some long-standing evaluation and monitoring issues that 
are made more difficult because of a lack of accurate, reliable, and 
comprehensive information on SBIR applicants and awards. The Congress 
recognized the need for a comprehensive database in 2000 when it 
mandated that SBA develop a government-use database. Although SBA did 
not meet its statutorily mandated deadline of June 2001, the database 
has been operational since October 2008; it contains limited new 
information and may also contain inaccurate historical data. 

Mr. Chairman, this concludes my prepared statement. I would be happy to 
respond to any questions that you or other members of the Committee may 
have. 

GAO Contact and Staff Acknowledgments: 

For further information about this statement, please contact me at 
(202) 512-3841 or at daltonp@gao.gov. Contact points for our Offices of 
Congressional Relations and Public Affairs may be found on the last 
page of this statement. Vondalee Hunt, Anu Mittal, and Cheryl Williams 
also made key contributions to this statement. 

[End of section] 

Footnotes: 

[1] Pub. L. No. 97-219 (1982). 

[2] Pub. L. No. 102-564 (1992). 

[3] Pub. L. No. 106-554, App. I, Tit. I (2000). 

[4] Throughout this statement we refer to this database as the 
government-use database. 

[5] 67 Fed. Reg. 60,072 (Sept. 24, 2002). 

[6] GAO, Federal Research: Observations on the Small Business 
Innovation Research Program, [hyperlink, 
http://www.gao.gov/products/GAO-05-861T] (Washington, D.C.: June 28, 
2005). 

[7] GAO, Small Business Innovation Research: Information on Awards Made 
by NIH and DoD in Fiscal Years 2001 through 2004, [hyperlink, 
http://www.gao.gov/products/GAO-06-565] (Washington, D.C.: April 14, 
2006), and GAO, Small Business Innovation Research: Agencies Need to 
Strengthen Efforts to Improve the Completeness, Consistency, and 
Accuracy of Awards Data, [hyperlink, 
http://www.gao.gov/products/GAO-07-38] (Washington, D.C.: October 19, 
2006). 

[8] GAO, Federal Research: Interim Report on the Small Business 
Innovation Research Program, [hyperlink, 
http://www.gao.gov/products/GAO/RCED-95-59] (Washington, D.C.: March 8, 
1995). 

[9] GAO, Federal Research: Observations on the Small Business 
Innovation Research Program, [hyperlink, 
http://www.gao.gov/products/GAO/RCED-98-132] (Washington, D.C.: April 
17, 1998). 

[10] GAO, Federal Research: Evaluation of Small Business Innovation 
Research Can Be Strengthened, [hyperlink, 
http://www.gao.gov/products/GAO/RCED-99-114] (Washington, D.C.: June 4, 
1999). 

[11] [hyperlink, http://www.gao.gov/products/GAO-06-565]. 

[12] [hyperlink, http://www.gao.gov/products/GAO-07-38]. 

[13] [hyperlink, http://www.gao.gov/products/GAO-07-38]. 

[14] [hyperlink, http://www.gao.gov/products/GAO-06-565]. 

[15] [hyperlink, http://www.gao.gov/products/GAO-06-565]. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: