ExpectMore.gov


Detailed Information on the
Substance Abuse Treatment Programs of Regional and National Significance Assessment

Program Code 10000302
Program Title Substance Abuse Treatment Programs of Regional and National Significance
Department Name Department of Health and Human Services
Agency/Bureau Name Substance Abuse and Mental Health Services Administration
Program Type(s) Competitive Grant Program
Assessment Year 2002
Assessment Rating Adequate
Assessment Section Scores
Section Score
Program Purpose & Design 80%
Strategic Planning 86%
Program Management 64%
Program Results/Accountability 33%
Program Funding Level
(in millions)
FY2007 $399
FY2008 $400
FY2009 $337

Ongoing Program Improvement Plans

Year Began: 2007
Improvement Plan: Provide benchmark data to allow grantees to gauge how they perform compared to other grantees in their Program Area.
Status: Action taken, but not completed
Comments: All grantees have now received benchmarking data.

Year Began: 2008
Improvement Plan: Include language in new 2008 RFAs (as appropriate) on incentives and disincentives based on grantee performance.
Status: Action taken, but not completed
Comments: CSAT included incentive language in the 2007 Access to Recovery RFA and in additional RFAs, including Treatment Drug Courts.

Year Began: 2008
Improvement Plan: Begin to better integrate the monthly performance tracking system into Team Leader and Project Officer monitoring of grantees.
Status: Action taken, but not completed

Year Began: 2008
Improvement Plan: Begin to fund a cross-site evaluation to better measure program effectiveness.
Status: Action taken, but not completed

Completed Program Improvement Plans

Year Began: 2003
Improvement Plan: Make more data reports from the automated Services Accountability Improvement System (SAIS) available to grantees and program staff.
Status: Completed
Comments: Action completed. CSAT has developed a number of new reports in SAIS to allow grantees to better monitor their programs.

Year Began: 2006
Improvement Plan: Implement the Project Officer Continuation Board, which uses performance data collected through the automated Services Accountability Improvement System to assist in decision-making regarding reapplication funding for grants.
Status: Completed
Comments: The Project Officer Continuation Board has been convened and will meet annually to review continuation applications.

Year Began: 2004
Improvement Plan: Implement a drug treatment voucher program to expand access to treatment.
Status: Completed
Comments: Action completed. CSAT has implemented a three-year, voucher-based drug treatment program (Access to Recovery grants were awarded in 2004) and has now expanded the program to fund another cohort of grantees with a specific focus on methamphetamine treatment.

Year Began: 2003
Improvement Plan: Further improve the effectiveness of services grants by introducing grant funding incentives and reductions based on performance.
Status: Completed
Comments: Action completed. CSAT included language in one of its RFAs (Treatment Drug Court, 2005) to introduce the incentive policy into grant language.

Year Began: 2003
Improvement Plan: Develop data for performance measures.
Status: Completed
Comments: The program has implemented an automated data collection and reporting system. Data are being collected for all measures and used to manage the program. Targets have been set.

Year Began: 2007
Improvement Plan: Implement a monthly system of notifying grantees and project officers of grantee performance and begin to use this information to better link grantee monitoring, performance, and technical assistance.
Status: Completed

Program Performance Measures

Term: Long-term/Annual
Type: Outcome

Measure: Percentage of individuals receiving drug treatment services who report no past-month substance use six months after admission to treatment.


Explanation:

Year Target Actual
2003 New baseline 61.1%
2004 63% 63%
2005 65% 64.1%
2006 67% 63%
2007 63% 59.8%
2008 63% To be reported 10/08
2009 61% To be reported 10/09
2010 65% To be reported 10/10
2011 65% To be reported 10/11
2012 65% To be reported 10/12
2013 65% To be reported 10/13
Term: Long-term/Annual
Type: Efficiency

Measure: Percentage of grantees that provide drug treatment services within approved cost-per-person guidelines by type of treatment, such as inpatient, outpatient, or methadone.


Explanation:

Year Target Actual
2003 Baseline 79%
2004 80% 80%
2005 80% 81%
2006 80% 81%
2007 80% To be reported 10/08
2008 80% To be reported 10/09
2009 78% To be reported 10/10
2010 78% To be reported 10/11
2011 78% To be reported 10/12
2012 78% To be reported 10/13
2013 78% To be reported 10/14
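
For illustration only: a minimal sketch (in Python) of how this efficiency measure might be computed, i.e., the share of grantees whose cost per person falls within an approved range for their treatment modality. The cost ranges used here are the ones the agency proposed in the Question 4.1 evidence later in this assessment and remain under review; the grantee records and function names are hypothetical and are not part of the program's actual reporting systems.

    # Illustrative sketch only; not the program's actual methodology or data system.
    # Cost ranges are those proposed by the agency (see Question 4.1 evidence) and
    # remain under review. Grantee records below are hypothetical.

    APPROVED_COST_RANGES = {
        "residential": (3000, 10000),
        "outpatient_non_methadone": (1000, 5000),
        "methadone": (1500, 8000),
    }

    def within_guidelines(modality, cost_per_person):
        # True if a grantee's cost per person falls inside the approved range
        low, high = APPROVED_COST_RANGES[modality]
        return low <= cost_per_person <= high

    def percent_within_guidelines(grantees):
        # Percentage of grantees whose cost per person is within the approved range
        in_range = sum(1 for g in grantees if within_guidelines(g["modality"], g["cost"]))
        return 100.0 * in_range / len(grantees)

    # Hypothetical grantee records
    grantees = [
        {"modality": "residential", "cost": 7200},
        {"modality": "outpatient_non_methadone", "cost": 6400},  # above range
        {"modality": "methadone", "cost": 3900},
    ]
    print(round(percent_within_guidelines(grantees)), "% within approved cost ranges")
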
Term: Long-term/Annual
Type: Outcome

Measure: Percentage of drug treatment professionals trained by the program who adopt proven treatment methods.


Explanation: Adopting proven methods ultimately improves drug treatment outcomes.

Year Target Actual
2001 Baseline 40%
2002 70% 86.3%
2003 80% 84%
2004 18.7% 83%
2005 85% 87%
2006 89% 93%
2007 93% 90%
2008 93% To be reported 10/08
2009 90% To be reported 10/09
2010 90% To be reported 10/10
2011 90% To be reported 10/11
2012 90% To be reported 10/12
2013 90% To be reported 10/13

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the drug treatment Programs of Regional and National Significance discretionary program cannot be stated succinctly. The mission of the program is to improve the quality and availability of drug treatment services. The program includes drug treatment service grants on one side, which have a clear purpose and design, and training, communications, and regulatory activities on the other, which are less clear. Conceptually, the two main elements combine to support drug treatment services and to improve the quality of those services. In practice, however, coordination between the two sides is unclear, as is the unifying purpose for this discretionary budget. The agency is refocusing its mission on supporting services and is developing a strategic plan, both of which will add clarity to the program purpose.

Evidence: The FY 2003 budget of $358 million is divided among roughly 17 different grant streams. The agency is working to refocus the program on delivering services, but the purpose is not yet clear. The program is run by the Substance Abuse and Mental Health Services Administration (SAMHSA).

NO 0%
1.2

Does the program address a specific interest, problem or need?

Explanation: The program is designed to address the need for effective drug treatment services, especially in hard-hit communities and for target populations. Service grants help areas with critical or newly emerging problems. Training, communications, and regulatory grants are designed to improve treatment outcomes. Grantee data indicate that those served by the program's drug treatment grants are more likely to be female and more likely to be minorities than national treatment averages (49% vs. 27% and 52% vs. 28%, respectively).

Evidence: The 2001 National Household Survey on Drug Abuse (NHSDA) estimates 16 million Americans used an illicit drug in the past month, 6.1 million persons above age 12 need treatment, 5.0 million need treatment but are not getting it, and 4.6 million people who meet the criteria for needing treatment do not even recognize that they need treatment.

YES 20%
1.3

Is the program designed to have a significant impact in addressing the interest, problem or need?

Explanation: The program is the Federal government's primary mechanism to target key areas and populations with support for drug treatment services. While the program is a relatively small portion of all public drug treatment funding, it is designed to have a significant impact that is reasonably known and can be measured in the context of all other factors. Drug treatment is designed to reduce drug use and its consequences. Outcome data from the program are available and the impact is known. The services grants provide meaningful assistance in individual hard hit communities receiving an award. The program's services grants also require scientifically established practices, which is important to improve drug treatment outcomes. State/local governments also support drug treatment clinics. The reach of the training efforts is limited relative to the number of drug treatment service providers and the extent to which many of those providers are using unproven methods. None of the grants leverage financial resources.

Evidence: Effective drug treatment is designed to have a significant impact on reducing drug use. The program supported an estimated 100,000 drug treatment admissions in 2002. According to agency estimates, drug treatment supported by the program in 2001 constitutes roughly 10% of Federal support and 5% of all public support for drug treatment.

YES 20%
1.4

Is the program designed to make a unique contribution in addressing the interest, problem or need (i.e., not needlessly redundant of any other Federal, state, local or private efforts)?

Explanation: The program makes a unique contribution. Service grants are designed specifically to fill gaps. While state and local governments support drug treatment, neither focuses on regional, emerging problems. While schools and accreditation bodies play a role in improving the quality of treatment services, the program's training, communications, and certification efforts are also unique. The agency also supports a substance abuse block grant, which provides support to all states for alcohol and drug abuse prevention and treatment. The program shares many of the same goals as the block grant but is designed for a different purpose.

Evidence: The Drug Abuse Warning Network and other surveys show pockets across the country with critical problems, or new problems such as ecstasy or methamphetamine use.

YES 20%
1.5

Is the program optimally designed to address the interest, problem or need?

Explanation: The program accomplishes its goals primarily through competitive grants. The program includes competitive drug treatment services grants to non-profit organizations and local and tribal governments to address gaps in treatment capacity, grants to community-based organizations to provide coordinated substance abuse and HIV services, grants to academic institutions to provide training for drug treatment providers, and grants to entities to support networking and technology transfer to accelerate the process of putting new drug treatment knowledge into practice.

Evidence: There is no evidence that block grants, regulations, or other approaches would be more effective or efficient to accomplish program goals.

YES 20%
Section 1 - Program Purpose & Design Score 80%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific, ambitious long-term performance goals that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program has adopted long-term outcome goals through the assessment process. The outcome goals also relate to national outcome goals of the Office of National Drug Control Policy.

Evidence: The program's long-term goals include the effectiveness of drug treatment services as measured by reductions in drug use six months after admission to treatment, changes in the efficiency of grantees as measured by the percentage of providers that do not exceed approved costs per person treated according to the type of treatment provided, and the effectiveness of program training efforts as measured by the percentage of drug treatment providers that report adopting approved treatment methods as a result of receiving training and best practices information from the program.

YES 14%
2.2

Does the program have a limited number of annual performance goals that demonstrate progress toward achieving the long-term goals?

Explanation: The agency has a limited number of valid annual performance goals focused on outcomes that demonstrate progress toward achieving desired long-term outcomes. The program's annual goals also relate to Office of National Drug Control Policy long-term goals.

Evidence: Annual goals include the reductions in past month use, improvements in program efficiency, and changes in treatment methods resulting from program training efforts.

YES 14%
2.3

Do all partners (grantees, sub-grantees, contractors, etc.) support program planning efforts by committing to the annual and/or long-term goals of the program?

Explanation: Individual service grantees provide performance data through a common software system to measure annual goals. Further steps to use data to reward performance could encourage additional buy-in to program goals. Training partners also provide performance information. In a more general sense, the treatment community embraced the program's mission through the development of a National Treatment Plan.

Evidence: Service grantees input performance information into an ACCESS database. Data are compiled to report progress on annual goals. Grantees report on drug use, employment, and other outcomes using a Core Client Outcomes tool.

YES 14%
2.4

Does the program collaborate and coordinate effectively with related programs that share similar goals and objectives?

Explanation: Meaningful collaboration with other Federal agencies that share similar objectives has increased, especially with the Department of Justice, the Office of National Drug Control Policy, and within HHS. Most significantly, SAMHSA has begun to collaborate more fully with the National Institutes of Health to improve the translation of science to services and to refocus SAMHSA on service delivery. To succeed, this effort will require further development of meaningful collaboration, including NIH's full involvement in providing research findings to SAMHSA in a useful form and in incorporating lessons from SAMHSA's drug treatment services grantees into its research agenda. In 2002, SAMHSA is also supporting drug treatment services in criminal justice settings in collaboration with the Department of Justice. Representatives from VA, the Health Resources and Services Administration, and the Bureau of Prisons also participated in deliberations for the program's treatment plan. A 1997 GAO report found a dearth of collaboration, and not all of these areas have been addressed.

Evidence: GAO reported that SAMHSA needs to improve its coordination with agencies engaged in similar or complementary activities. The report suggested for example the need to improve work with Justice, Veterans Affairs, Education, Indian Health Service and the National Institutes of Health.

YES 14%
2.5

Are independent and quality evaluations of sufficient scope conducted on a regular basis or as needed to fill gaps in performance information to support program improvements and evaluate effectiveness?

Explanation: In 1997, the University of Chicago and Research Triangle Institute concluded the National Treatment Improvement Evaluation Study (NTIES). The purpose of the study was to demonstrate the value of the comprehensive treatment model supported by the program. The study considered how funds were used, what the results of comprehensive treatment were, and what lessons were learned about cost and implementation. Data collection for the study ended in 1995, and since that time there have been no comprehensive evaluations of the program. The program has studied the impact of specific treatment approaches through its Methamphetamine Treatment Project. No independent and comprehensive evaluations of the program's training and knowledge dissemination activities have been conducted.

Evidence: The NTIES evaluation was a comprehensive assessment of 157 multi-year awards across 47 states and several territorial areas, made from 1989 to 1992. In addition to the NTIES study, SAMHSA reports directing extensive grantee efforts toward evaluation; however, these reviews are not compiled into an independent and comprehensive assessment.

YES 14%
2.6

Is the program budget aligned with the program goals in such a way that the impact of funding, policy, and legislative changes on performance is readily known?

Explanation: The program cannot estimate the associated cost of each drug treatment service it supports, which is the main output directly associated with the program's outcome goals. Annual budget requests are not clearly derived by estimating what is needed to accomplish the annual performance measures and long-term outcomes. The program's budget structure does not align with program goals, and the impact of funding decisions for a budget line on the overall performance of the program, as a collection of its individual components, is difficult to predict. The program can cost out anticipated outcomes by funding level based on the average national cost of treatment. However, beyond using national averages, the program cannot measure the impact of proposed funds on program performance and outcomes.

Evidence: Assessment based on annual budget submissions to OMB and Congress.

NO 0%
2.7

Has the program taken meaningful steps to address its strategic planning deficiencies?

Explanation: The main deficiencies highlighted in this section are the need for long-term outcome measures, continued evaluations of the program, and improved alignment of budget and goals so that the impact of funding and policy changes on performance is readily known. The agency is also going through a strategic planning process and has adopted draft long-term outcome goals. Having these measures in place will also enable the program to better integrate budget planning and strategic planning and determine the level of financial resources needed to obtain long-term outcomes. The National Treatment Outcome Monitoring System (NTOMS) to be implemented in 2003 will provide new outcome data to fill gaps in performance information.

Evidence: Assessment based on discussion with agency and the program management plan. The agency is awarding a contract for NTOMS this year. The program plans evaluations of the effects of opiate treatment programs when buprenorphine is approved by the Food and Drug Administration. The agency's restructuring plan consolidated budget formulation, planning and Government Performance and Results Act activities within one unit.

YES 14%
Section 2 - Strategic Planning Score 86%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Data are not regularly used by managers in management and budget decisions. Annual performance data are collected, checked for validity and used to some extent by project officers. Explanations are offered when targets are not met, but significant changes have not been made to improve performance. Managers report being unable to use past performance as a factor in grantee competitions.

Evidence: Managers do not regularly use outcome data. For example, when lower than expected program outputs were discovered from grantee reports, no steps were taken to revise the program, shift resources, or improve grantee performance.

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, etc.) held accountable for cost, schedule and performance results?

Explanation: Neither managers nor partners are held directly accountable for program outcomes. Performance data are not used in employee evaluations. Grantees compete for funds initially, but only lose funding for poor performance in extreme cases. The agency is planning a significant change in grant management described below that will enhance partner accountability.

Evidence: Assessment is based on public personnel documents, discussions with the agency and grant announcements and reports. The agency has also taken new steps to identify and target the roughly 10% of program grantees that are not reporting outcomes data. Following contacts first by project officers and then by an agency contractor, the agency reports a significant reduction in non-reporting.

NO 0%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated efficiently and in accordance with planned schedules. The agency is working to release some grants earlier in the fiscal year. There have been very few known cases of funds being expended outside of their intended purpose. Project officers perform site reviews when possible.

Evidence: Assessment based on apportionment requests, annual budget submissions and financial reports, queries in the Single Audit Database, and agency grants management procedures. For reference, project officers visit roughly 25% of grantees annually.

YES 9%
3.4

Does the program have incentives and procedures (e.g., competitive sourcing/cost comparisons, IT improvements) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: In general, there is insufficient evidence that the program has incentives and procedures in place to improve efficiency and cost effectiveness in program execution to meet the standards for this question. The program is working to include an efficiency measure. The agency does rely on an HHS service clearinghouse known as the Program Support Center for many internal services, is providing FAIR Act targets, and appears to be making progress toward outsourcing additional services. Outsourced activities include accounting, graphics, human resources, and property management. The program has also automated the process for entering performance outcome data.

Evidence: FAIR Act report, services directed to HHS' consolidated Program Support Center.

NO 0%
3.5

Does the agency estimate and budget for the full annual costs of operating the program (including all administrative costs and allocated overhead) so that program performance changes are identified with changes in funding levels?

Explanation: The program does not have a financial management system that fully allocates program costs and associates those costs with specific performance measures. The program develops annual budget proposals that include associated FTE and accrual costs. However, the program is unable to cost out resources needed to achieve targets and results. The program does not capture all direct and indirect costs borne by the program agency, including applicable agency overhead, retirement, and other costs budgeted elsewhere, or include informational displays in the budget that present the full cost of outputs. FTE and administrative expenses are not tied to annual program budgets.

Evidence: Assessment based on annual program management budget requests and discussions with agency.

NO 0%
3.6

Does the program use strong financial management practices?

Explanation: The program receives clean opinions on its audits and is free of material internal control weaknesses. The agency's fiscal monitoring of grant awards is conducted through the SAMHSA Grants Information Management System, which tracks awards and obligations, carryover and submission of quarterly reports, application renewals, and final reports. The system is used to flag grantee financial management issues for project officers and Federal managers.

Evidence: The assessment is based on audited statements from the Program Support Center and Office of the Inspector General reports.

YES 9%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The main deficiencies include use of performance data to enhance accountability, the ability to identify changes in performance with changes in funding levels, and additional incentives and procedures to improve efficiency. Most significantly, the agency reports taking additional steps to introduce funding incentives and reductions to improve grantee performance. This reallocation of second and third year awards would provide a powerful incentive to improve accountability and ultimately grantee efficiency and performance for drug treatment service grants. The agency has also begun placing grantees that fail to report performance data to the agency in a risk pool that will require weekly contact with project officers until data submission is complete and is exploring additional sanctions. The agency is extending its performance contracts to increase accountability and reports taking additional steps to hold staff accountable for program performance. The agency is also reorganizing the Center to more effectively use FTE resources at the Federal level.

Evidence: The assessment is based on conversations with the agency, management plan documents, and Federal Register notices. The agency's restructuring plan consolidated budget formulation, planning and Government Performance and Results Act activities within one unit. Steps to improve efficiency include reductions in deputy manager positions and consolidation of smaller offices.

YES 9%
3.CO1

Are grant applications independently reviewed based on clear criteria (rather than earmarked) and are awards made based on results of the peer review process?

Explanation: Applications for this program are peer reviewed based on clear criteria and awards are made based on merit as judged through the peer review process. A central office within the agency organizes and conducts independent review of grant applications for agency programs. There are some one-year, non-competitive earmarks, but the majority of funds are competitively awarded.

Evidence: Assessment based on grant review procedures, Federal Register Notices.

YES 9%
3.CO2

Does the grant competition encourage the participation of new/first-time grantees through a fair and open application process?

Explanation: The grant competition is open to new/first-time grantees. The agency has also hosted sessions for faith- and community-based organizations to encourage them to apply and to provide technical assistance.

Evidence: Assessment based on technical assistance documents and planning sessions for faith- and community-based organizations.

YES 9%
3.CO3

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: Agency staff serve as project officers for grantees and meet with providers at conferences and other settings. Grantees report annually on performance and the agency is taking steps to improve data reporting.

Evidence: Assessment based on grantee reports. See also Question 4 explanation and evidence.

YES 9%
3.CO4

Does the program collect performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: Grantees enter data in a shared database. Annual performance data are summarized in the performance report and made available on the agency web site. Additional steps can be taken to make performance data at the state level publicly available, especially with the expansion of a targeted capacity expansion grant to states.

Evidence: Assessment based on agency GPRA reports and web site (www.samhsa.gov).

YES 9%
Section 3 - Program Management Score 64%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome goal(s)?

Explanation: The agency has adopted new long-term outcome goals. Two of the goals are new, and baseline data are estimates. The first measure will track the effectiveness of drug treatment services by measuring reductions in drug use six months after admission to treatment. The second goal will capture changes in grantee efficiency by measuring the percentage of providers that do not exceed approved costs per person treated according to the type of treatment provided. Approved costs will be determined separately for outpatient, inpatient, and methadone treatment using national averages and data on the demographics of patients treated. This measure will also be used by program managers in reviewing applications and renewals, such as by not funding applicants whose proposed budgets are outside the range of acceptable costs. The third goal tracks the portion of drug treatment providers that report adopting approved treatment methods as a result of receiving training and best practices information from the program. A rating of Large Extent would require progress on more than one measure.

Evidence: The National Treatment Outcomes Monitoring Study (NTOMS) will be used to determine the success rates of drug treatment supported by the program. In addition to providing a national comparison, NTOMS will allow the program to add sampling frames specific to grantees to cover external evaluation and allow grantees to dedicate more funding to services. The efficiency goal of acceptable costs is based on data from the Alcohol and Drug Services Study (ADSS). Cost comparisons will be made by modality, including inpatient, outpatient, and methadone treatment. ADSS costs are per person per episode, while the ranges used by the program are per person over a specified time period. The current baseline for this measure is an estimate of grantee performance. The agency has proposed acceptable cost ranges of $3,000 to $10,000 for residential treatment, $1,000 to $5,000 for outpatient non-methadone treatment, and $1,500 to $8,000 for methadone treatment. These ranges are under review. Targets for the third measure may also need to be adjusted when baseline data are confirmed.

SMALL EXTENT 7%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The agency is not meeting the standards of a Yes for having incentives and procedures to measure and achieve efficiencies. Program targets for increasing the drug treatment service capacity have not been met and have been revised down in subsequent years. Even if the prior year data were flawed, there are no new data available to indicate improvements in program efficiencies and cost effectiveness over the previous year. There are no data on improved efficiencies for training, communications or regulatory/certification efforts.

Evidence: Funding for drug treatment services grew from 2000 to 2001, but the program adjusted its annual targets for the number of people served down from 23,000 to 14,000 based on lower than expected performance in the prior year. The revised figures are attributed to improvements in data collection and verification efforts; however, no new data on improved program efficiencies are available.

SMALL EXTENT 7%
4.3

Does the program demonstrate improved efficiencies and cost effectiveness in achieving program goals each year?

Explanation: The agency is not meeting the standards of a Yes for having incentives and procedures to measure and achieve efficiencies. Program targets for increasing the drug treatment service capacity have not been met and have been revised down in subsequent years. Even if the prior year data were flawed, there are no new data available to indicate improvements in program efficiencies and cost effectiveness over the previous year. There are no data on improved efficiencies for training, communications or regulatory/certification efforts.

Evidence: Funding for drug treatment services grew from 2000 to 2001, but the program adjusted its annual targets for the number of people served down from 23,000 to 14,000 based on lower than expected performance in the prior year. The revised figures are attributed to improvements in data collection and verification efforts; however, no new data on improved program efficiencies are available.

NO 0%
4.4

Does the performance of this program compare favorably to other programs with similar purpose and goals?

Explanation: The program is the only competitive program of its kind that supports drug treatment for the general population outside of the criminal justice system. However, the program may be a more cost-efficient and effective mechanism to focus specifically on drug treatment than the substance abuse block grant, which also supports alcohol treatment and primary prevention services. The program tracks annual performance data on reductions in past-month substance use and other treatment outcomes that indicate performance. Similar data on performance are not available for the block grant. Grantees also seem to perform as well as or better than grantees funded by state and local governments or other sources. There are no data on how well the training, communications, and regulatory/certification efforts compare with other efforts.

Evidence: There are no definitive data on what portion of the Substance Abuse Block Grant supports drug treatment, complicating estimates of the impact of a funding increment on drug treatment services. The agency has previously calculated that supporting a drug treatment slot through the program costs one-third less than through the block grant; however, these calculations are based on estimates rather than actual treatment costs and may be revised. With respect to performance information, efforts are underway to track outcome data for the block grant, but no effectiveness data are available at this time. By comparison, annual outcome data collected by this program indicate reduced past-month drug use among 34% of those treated and reduced negative consequences of use, such as reduced or no involvement with the criminal justice system, among 75% of those treated. Grantee data indicate that those treated by the program are more likely to be female and more likely to be minorities than national treatment averages (49% vs. 27% and 52% vs. 28%, respectively).

LARGE EXTENT 13%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: Data from the 1997 National Treatment Improvement Evaluation Study indicate the program's substance abuse treatment demonstration grants were effective. While the evaluation found the drug treatment grants were effective, the agency has not had a comprehensive evaluation of its training and regulatory/certification efforts, or of the drug treatment Programs of Regional and National Significance activity as a whole.

Evidence: Key findings from the NTIES include clients' use of their primary drug(s) declined from 73% to 38% one year after treatment; selling drugs declined by 78%; arrests for any crime declined 64%; rate of employment increased from 51% to 60% following treatment; and alcohol/drug-related medical visits declined 53% following treatment. Outpatient methadone treatment costs were about $3,900 for an average of 300 days of treatment, outpatient non-methadone treatment costs were about $1,800 for an average of 120 days, and treatment in a correctional setting cost $1,800 for an average of 75 days. With respect to the program's knowledge dissemination efforts, the OIG found in 1998 that only 32% of SAMHSA's own grantees are aware of treatment improvement protocols issued by the agency.

SMALL EXTENT 7%
Section 4 - Program Results/Accountability Score 33%


Last updated: 09062008.2002SPR