ExpectMore.gov


Detailed Information on the
Defense Small Business Innovation Research/Technology Transfer Assessment

Program Code 10001027
Program Title Defense Small Business Innovation Research/Technology Transfer
Department Name Dept of Defense--Military
Agency/Bureau Name Department of Defense--Military
Program Type(s) Research and Development Program
Competitive Grant Program
Capital Assets and Service Acquisition Program
Assessment Year 2003
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 60%
Strategic Planning 0%
Program Management 43%
Program Results/Accountability 6%
Program Funding Level
(in millions)
FY2007 $1,284
FY2008 $216
FY2009 $2
*Note: funding shown for a program may be less than the actual program amount in one or more years because part of the program's funding was assessed and shown in other PART(s).

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2004

Change the way companies' past performance is assessed to ensure that it more closely matches the intent of the law.

Action taken, but not completed To date, DoD increased the CAI threshold below which a firm's commercialization score in source selection is penalized 50 percent, from 5% to 10% in FY06 and to 15% in FY07, and will evaluate the impact of follow-on threshold increases. DoD also decreased the number of Phase II awards required for a proposing firm to be assigned a CAI from 5 awards to 4 awards; further reduction is not possible. The CAI calculation normalizes data to weight recent commercialization more heavily (a sketch of such a recency-weighted index appears after this table).
2004

Seek to get highly successful awardees to enter the mainstream of Defense contracting.

Action taken, but not completed The Department will continue to host an annual conference to bring together recent Phase II award winners with acquisition program office personnel and prime/integrating contractors to increase program awareness and facilitate the creation of partnerships to transition SBIR/STTR technologies. Additionally, acquisition office endorsement of SBIR/STTR topics remains at over 50%. New authority to use 1% of the SBIR budget to fund a commercialization pilot program enables additional efforts.
2004

Look for ways to budget explicitly for the program's administrative costs.

Action taken, but not completed Currently, some DoD Components budget for support in varying degrees. OSD/OSBP budgets for overall administration of solicitations, web site maintenance, and general program analysis and support. DoD continues to seek authority to use a portion of the set-aside budget to fund administration, and submitted an FY08 legislative change proposal. A study has recently been commissioned from the RAND Corporation to independently benchmark program overhead costs.
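The assessment does not publish the CAI formula. For orientation only, a minimal Python sketch of a recency-weighted commercialization index of the kind described above is shown below; the field names, the half-life weighting, and the percentile computation are all assumptions for illustration, not DoD's actual method.

    from datetime import date

    def commercialization_index(projects, as_of=date(2003, 1, 1), half_life_years=5.0):
        """Recency-weighted ratio of reported commercialization to Phase II funding.

        `projects` is a list of dicts with hypothetical fields:
        award_date, phase2_funding, reported_sales, reported_investment.
        The exponential half-life weighting is an assumption made for this sketch.
        """
        weighted_returns = 0.0
        weighted_funding = 0.0
        for p in projects:
            age_years = (as_of - p["award_date"]).days / 365.25
            weight = 0.5 ** (age_years / half_life_years)   # newer awards count more
            weighted_returns += weight * (p["reported_sales"] + p["reported_investment"])
            weighted_funding += weight * p["phase2_funding"]
        return weighted_returns / weighted_funding if weighted_funding else 0.0

    def percentile_rank(score, all_scores):
        """Percentile of one firm's score among all firms assigned a CAI."""
        if not all_scores:
            return 0.0
        below = sum(1 for s in all_scores if s < score)
        return 100.0 * below / len(all_scores)

In such a scheme, the threshold changes described above would simply raise the percentile below which a firm's source-selection score is penalized.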

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Long-term Output

Measure: Revise the Commercialization Achievement Index (CAI) to eliminate counting of investments as commercialization no later than three years after receiving the first Phase II support. After that, count competitive sales receipts only.


Explanation: The CAI is used to document the success of a portfolio of past SBIR/STTR investments. Companies with five or more funded projects receive a CAI rating.

Year Target Actual
2004 Count all All
Long-term Output

Measure: Budget for program administration as separate entries in budget justification materials.


Explanation: This element allows full program costs to be known by taxpayers and policy makers.

Long-term Output

Measure: Use the Commercialization Achievement Index much more aggressively to weed out non-performing multiple award winners from serious consideration for subsequent awards.


Explanation: Some multiple awardees have received millions of dollars in awards over many years but have produced a negligible value of commercial products.

Year Target Actual
2005 5% 5%
2006 10% 10%
2007 15% 15%
2008 20% TBD
Long-term Output

Measure: Verify data submitted by a portion of companies receiving (and possibly applying for) awards for Phase II funding.


Explanation: As standards for awards increase, companies will feel pressure to inflate claimed commercialization.

Year Target Actual
2006 500 1635
Long-term Output

Measure: Emphasize commercialization so that overall competitively awarded sales to the government (direct or indirect) from resulting products are at least equal to new R&D investment (Phases I-III), measured as a portfolio of prior 3-8 year investments (rolling average).


Explanation: The ratio of sales to investment from the portfolio of funded projects should well exceed 1.0 in the long term; a rolling-average sketch appears after this list of measures. The timetable will be the result of many factors (e.g., potential legislative and administrative changes) that will come out of the ongoing crosscutting (cross-agency) SBIR/STTR review.

Year Target Actual
2008 TBD
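The measure above targets a portfolio-level ratio of at least 1.0. A minimal Python sketch of how such a rolling 3-8 year sales-to-investment ratio could be computed follows; the field names and cohort definition are illustrative assumptions, not the Department's actual methodology.

    def sales_to_investment_ratio(awards, current_year, window=(3, 8)):
        """Rolling portfolio ratio: competitively awarded government sales (direct
        or indirect) from products of prior SBIR/STTR awards, divided by the
        Phase I-III R&D investment in those same awards, for awards 3-8 years old.

        `awards` is a list of dicts with hypothetical fields:
        year, rd_investment, govt_sales.
        """
        lo, hi = window
        cohort = [a for a in awards if lo <= current_year - a["year"] <= hi]
        investment = sum(a["rd_investment"] for a in cohort)
        sales = sum(a["govt_sales"] for a in cohort)
        return sales / investment if investment else float("nan")

A ratio computed this way would be recalculated each year as the 3-8 year window rolls forward, which is what the "rolling average" phrasing in the measure implies.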

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: Program purpose is "that assistance be given to small-business concerns to enable them to undertake and to obtain the benefits of research and development in order to maintain and strengthen the competitive free enterprise system and the national economy." Commercialization of the program's results is a key goal.

Evidence: The purpose is set out in 15 United States Code (USC) 638 (a) and commercialization is made clear through 15 USC 638 (e) (4) (B) (i) and related subsections.

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: "Research and development are major factors in the growth and progress of industry and the national economy. The expense of carrying on research and development programs is beyond the means of many small-business concerns, and such concerns are handicapped in obtaining the benefits of research and development programs conducted at Government expense. These small-business concerns are thereby placed at a competitive disadvantage. This weakens the competitive free enterprise system and prevents the orderly development of the national economy. It is the policy of the Congress that assistance be given to small-business concerns to enable them to undertake and to obtain the benefits of research and development in order to maintain and strengthen the competitive free enterprise system and the national economy. " In addition, the statute leaves the choice of projects up to the funding agency, indicating that the specific R&D problem to be addressed through project funding must match agency program (mission) needs.

Evidence: 15 USC 638 (a) and (g).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: The program is in addition to many other opportunities for small businesses to engage in R&D of potential benefit to agencies of the U.S. government or to the small businesses themselves. Almost all of the Department's early stage R&D programs are open to small businesses, many of them on a substantially equal basis with larger businesses. Venture capital organizations provide further opportunities for support without government assistance. However, this program is geared to lowering hurdles in Federal R&D contracting that small firms in particular may find too daunting, so that they can contribute to Federal mission success.

Evidence: Service and agency Broad Agency Announcements (BAAs) for non-SBIR and -STTR funded R&D.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: As the award process is configured, many firms are considered strong candidates for future funding even though they have not produced any commercialized products under many prior SBIR/STTR contracts.

Evidence: SBIR commercialization database.

NO 0%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: DoD has set low standards for companies to compete successfully for future awards. Weaknesses in controls over multiple-award applicants and beneficiaries leave companies with little incentive to perform.

Evidence: SBIR BAAs and SBIR commercialization database. The Commercialization Achievement Index (CAI) is used to judge the performance potential of proposed projects, but is so weakly applied as to be of little practical value.

NO 0%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation:  

Evidence:  

NA  %
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation:  

Evidence:  

NA  %
Section 1 - Program Purpose & Design Score 60%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: Commercialization overall, and commercialization of products that are bought by the U.S. military without outside pressure being applied, are the main measurement tools. However, there are no strong performance measures (specific standards) against which performance can be measured.

Evidence:  

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: There are no long-term targets, although with the commercialization database it would be possible to construct some.

Evidence:  

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: Because the program has no long-term measures, it has no annual sub-measures.

Evidence:  

NO 0%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: No baselines or targets.

Evidence:  

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Each of the grantees is required to submit proposals based on potential application of results to DoD program needs (the defense mission).

Evidence: SBIR/STTR Solicitation announcements.

NO 0%
2.6

Are independent and quality evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: Independent studies have not addressed the impact/value that the program would have with a different set-aside percentage. This is noteworthy, as a large portion of total program funding is awarded to companies that have received prior awards without being able to point to a strong record of commercialization successes. In addition, outside evaluations have not compared program successes quantitatively against the successes of more conventional programs (with strong anecdotal evidence of promoting small businesses) that support large numbers of small businesses.

Evidence: Independent reports from NRC and GAO have not performed quantitative comparisons against apparently successful non-SBIR/non-STTR programs.

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Per statute, funding is provided as a fixed percentage set-aside from extramural funding of overall R&D programs; an illustrative set-aside computation appears after this answer. When the Department of Defense attempted to fund the program through explicit line items in the budget, Congress eliminated the separately identified funding and directed that the funding continue to be provided as a fixed tax on each individual R&D program. Funding is not required to be spent for the programs from which it is derived, but is pooled for potentially broader application. In theory, this should make for a higher probability of commercialization success (addressed through other questions and potentially able to provide YES answers elsewhere), but it loosens the connection with the funded activity and complicates the job of program managers, who must develop specific defense weapon systems and defend their budget requests based on their specific assigned missions.

Evidence: Explicit funding for the program is not displayed in the RDT&E Programs summary table (R-1) transmitted with the President's Budget, nor is it displayed in the Budget Justification materials transmitted with the Budget. SBIR and STTR funding is embedded in general extramural funding requests.

NO 0%
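For illustration of the set-aside mechanism described in the explanation above, a minimal Python sketch follows. The component names and budgets are notional; the 2.5 percent SBIR rate reflects the statutory set-aside of this period, and the STTR rate shown is illustrative only.

    def set_aside_pool(extramural_rd_budgets, sbir_rate=0.025, sttr_rate=0.003):
        """Compute pooled SBIR and STTR funding as fixed percentages of each
        component's extramural R&D budget (budgets in millions of dollars).
        Rates are set by statute, not by the components; values here are
        illustrative for this sketch."""
        sbir_total = sum(b * sbir_rate for b in extramural_rd_budgets.values())
        sttr_total = sum(b * sttr_rate for b in extramural_rd_budgets.values())
        return sbir_total, sttr_total

    # Notional example (budgets in millions of dollars):
    # sbir, sttr = set_aside_pool({"Army": 1800, "Navy": 2100, "Air Force": 2400, "DARPA": 2000})

Because the pool is a fixed percentage of extramural R&D rather than a budgeted line item, it does not appear separately in the R-1 or budget justification materials cited above.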
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: Beginning in 1995, the Department implemented several changes to the program that were first steps toward improving commercialization and relevance to the Department's mission. These included implementation of the Fast Track program and establishment of a Commercialization Achievement Index (CAI) that could help weed out unproductive awardees, but significant additional adjustments were not made in more recent years (~1997-2002). A new program manager at the Departmental level has not had time to assess needs and initiate further adjustments.

Evidence: Memo from Under Secretary of Defense Paul Kaminski forwarding the Final Report of the Process Action Team on the SBIR program (June 2, 1995). National Research Council report "SBIR: An Assessment of the Department of Defense Fast Track Initiative."

NO 0%
2.CA1

Has the agency/program conducted a recent, meaningful, credible analysis of alternatives that includes trade-offs between cost, schedule, risk, and performance goals and used the results to guide the resulting activity?

Explanation: Generally, DoD components see this program as an entitlement for small-business subsidies. Regardless of how past awardees have performed against cost, schedule, and performance goals, more money will become available next year, for which the same companies can compete.

Evidence: One DoD agency responded to this question with the answer: "Program implementation as required by public law does not allow trade-offs among cost, schedule, risk and performance."

NO 0%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: DoD components have sometimes offered qualitative assessments, but specific quantitative evidence has not been submitted. Those comparisons that were alleged to have been made were very general and non-quantitative. Anecdotal evidence exists to indicate that small firms have done well in some other non-SBIR/STTR competitions. A more thorough examination of the two award processes (SBIR/STTR vs. non-SBIR/STTR) has yet to be made.

Evidence: DARPA non-SBIR/STTR successes include many small businesses that have become forces within their industry sectors, but there is no quantitative analysis comparing outcomes of the SBIR/STTR program against outcomes of DARPA non-SBIR/STTR programs.

NO 0%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: Results are mixed. Proposals are rated and awards are made to the highest scoring proposers. However, a steady level of funding is made available for the program independent of year-to-year successes or failures.

Evidence: 15 USC 638(f).

NO 0%
Section 2 - Strategic Planning Score 0%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Much commercialization data is available, but little of it appears to be used to manage the program and improve overall program performance. An extreme example: one firm that has received funding for 20 Phase II projects holds a 95th-percentile ranking in the Commercialization Achievement Index (due to capitalization funding received from various sources), making it highly competitive for future awards, despite having sold no products whatsoever to any Federal agency as a result of its SBIR-supported projects.

Evidence: DoD SBIR commercialization database.

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: This is a mixed result. Awardees are held accountable for cost and schedule to the extent that results of Phase I investigations feed into the ratings for assessments for Phase II awards. However, firms with poor records of commercialization most often are competitive with other applicants in receiving additional funding for new projects due to the very low standards of commercialization expected in proposal assessments. Because the weakness of the computation and application of the CAI is addressed in several other questions, this PART rating emphasizes the Phase I aspects of the answer instead.

Evidence: SBIR/STTR Manager Desk Reference materials. Weighting is half that of the other elements in this section due to the mixed result; the program receives partial credit for the positive aspects.

YES 5%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: The majority of Phase I funds are obligated within 4 months of receipt. However, comparable Phase II data, which would address a larger portion of total funds spent, are not available.

Evidence: Obligation data provided to OMB.

NO 0%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: Although data are gathered for the CAI that might provide some time-series information, efficiencies are not generally assessed or monitored. The program spends approximately $57 million annually to administer the contracts, but has little information on administrative efficiency improvements.

Evidence:  

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: Individual DoD components do share data with each other, with non-DoD SBIR agencies, and with SBA, but commercialization or origination data from one source generally is not used to bar applicants from being considered elsewhere.

Evidence: Solicitation topics are generated in concert with each contracting office's parent organization.

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: The Department has a poor track record of posting and tracking funds obligation, use, and expenditure.

Evidence: Various DoD IG audits, and GAO audits of funds tracking and expenditures.

NO 0%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The program has taken some steps to address some deficiencies, but the potentially most valuable tool, the commercialization database (which uses the CAI), has been applied so weakly that it weeds out only applicants with extremely low commercialization potential.

Evidence: Commercialization database.

NO 0%
3.CA1

Is the program managed by maintaining clearly defined deliverables, capability/performance characteristics, and appropriate, credible cost and schedule goals?

Explanation: Each contract includes clearly defined deliverables, which are monitored during execution. Phase I projects, in particular, result in deliverables that affect the potential for receipt of a Phase II award. However, information on deliverables is often lost in the large pool of less significant data. For example, one firm that has received funding for 20 Phase II projects holds a 95th-percentile ranking in the Commercialization Achievement Index, making it fully competitive for future awards due to capitalization funding received from various sources, despite having sold no products whatsoever to any Federal agency as a result of its SBIR-supported projects.

Evidence: SBIR Desk Reference Manual for program managers.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: Awards are made following a competitive review process.

Evidence: SBIR Desk Reference Manual.

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: Proposals must include much information of use to review panels and government contract officials. Furthermore, awardees must provide information to update contract overseers.

Evidence: Federal Acquisition Regulations and Defense Federal Acquisition Regulations.

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: Data are kept, but the public has access only to highly aggregated data.

Evidence:  

NO 0%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: This is a competitive award program.

Evidence:  

NA 0%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation:  

Evidence:  

NA  %
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation:  

Evidence:  

NA  %
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation:  

Evidence:  

NA  %
Section 3 - Program Management Score 43%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: Only a small portion of the funded projects reaches fruition, as determined by the marketplace.

Evidence: SBIR commercialization database. The part of the CAI dealing with sales provides evidence of program outcomes.

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: No annual performance goals.

Evidence: See question 2.3 above.

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: Administrative costs are outside the realm of the set-aside funding and are likely spent reasonably efficiently, but they have not been tracked over time. The set-aside funding covers the supported R&D only, and results, though sometimes substantial, are infrequent.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: No quantitative evidence that it compares favorably.

Evidence:  

NO 0%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: Many independent evaluations have been favorable, but they are limited and incomplete, looking at the award process and pointing to occasional anecdotal successes. They have not compared results to other Federal and non-Federal programs, and they do not appear to have examined the statistical results of the commercialization database. One Harvard study, which appeared to be the most complete of the external studies, found limited effectiveness.

Evidence:  

SMALL EXTENT 6%
4.CA1

Were program goals achieved within budgeted costs and established schedules?

Explanation: No goals.

Evidence: See questions 2.3 and 2.4 above.

NO 0%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA  %
Section 4 - Program Results/Accountability Score 6%


Last updated: 09062008.2003SPR