Preventing Chronic Disease: Public Health Research, Practice and Policy

Volume 5: No. 2, April 2008

ORIGINAL RESEARCH
Cost of Starting Colorectal Cancer Screening Programs: Results From Five Federally Funded Demonstration Programs



Florence K. L. Tangka, PhD, Sujha Subramanian, PhD, Bela Bapat, MS, Laura C. Seeff, MD, Amy DeGroff, MPH, James Gardner, MSPH, A. Blythe Ryerson, MPH, Marion Nadel, PhD, Janet Royalty, MS

Suggested citation for this article: Tangka FKL, Subramanian S, Bapat B, Seeff LC, DeGroff A, Gardner J, et al. Cost of starting colorectal cancer screening programs: results from five federally funded demonstration programs. Prev Chronic Dis 2008;5(2). http://www.cdc.gov/pcd/issues/2008/apr/07_0202.htm. Accessed [date].

PEER REVIEWED

Abstract

Introduction
In 2005, the Centers for Disease Control and Prevention (CDC) began a 3-year colorectal cancer screening demonstration project and funded five programs to explore the feasibility of a colorectal cancer screening program for the underserved U.S. population. CDC is evaluating the five programs to estimate implementation costs, identify best practices, and determine the most cost-effective approach. The objectives of this study were to calculate start-up costs and to estimate the funding required for widespread implementation of colorectal cancer screening programs.

Methods
An instrument was developed to collect data on resource use and related costs. Costs were estimated for start-up activities, including program management, database development, creation of partnerships, public education and outreach, quality assurance and professional development, and patient support. Monetary value of in-kind contributions to start-up programs was also estimated.

Results
Start-up time ranged from 9 to 11 months for the five programs; costs ranged from $60,602 to $337,715. CDC funding and in-kind contributions were key resources for the program start-up activities. The budget category with the largest expenditure was labor, which on average accounted for 67% of start-up costs. The largest cost categories by activities were management (28%), database development (17%), administrative (17%), and quality assurance (12%). Other significant expenditures included public education and outreach (9%) and patient support (8%).

Conclusion
To our knowledge, no previous reports detail the costs to begin a colorectal cancer screening program for the underserved population. Start-up costs were significant, an important consideration in planning and budgeting. In-kind contributions were also critical in overall program funding. Start-up costs varied by the infrastructure available and the unique design of programs. These findings can inform development of organized colorectal cancer programs.

Introduction

Screening for colorectal cancer (CRC) reduces mortality and improves quality of life through detection and removal of precancerous polyps and earlier detection of cancers, which allows more effective treatment (1). Less than one-half of the eligible population in the United States is up to date with recommended CRC screening tests, and the uninsured are among those least likely to participate in screening programs (2,3). Screening programs for the uninsured should help to increase the proportion of this subpopulation who are screened and improve health outcomes. Costs associated with offering screening tests and performing subsequent diagnostic procedures need to be assessed in program planning. Such economic assessment is an increasingly important tool that allows policy makers to plan for optimal allocation of limited health care resources, identify the most efficient approach to implementing screening programs, and assess annual budget implications.

CRC screening has been shown to be cost-effective in numerous studies using decision analytic models to assess the benefits and the cost of screening (4). The models have produced conflicting results on which of the screening tests for early detection of cancer recommended by the American Cancer Society guidelines (5,6) is most cost-effective. However, the models have been consistent in determining that screening for CRC with any of the recommended tests is more cost-effective than the alternative of no screening (7,8). To date, no evaluation effort has included careful assessment of the cost of offering CRC screening through organized programs in the United States. Overall costs of such programs go well beyond the cost of the individual screening tests provided. They include expenditures to hire staff, establish contracts and partnerships with providers, develop databases and other mechanisms to maintain records and track patient outcomes, recruit patients, provide professional education, and establish medical advisory boards. Programs that provide screening services to underserved populations can incur significant costs in outreach, patient education, and case management.

The Centers for Disease Control and Prevention (CDC) established the Colorectal Cancer Screening Demonstration Program (CRCSDP) in 2005 to explore the feasibility of establishing a CRC screening program for the underserved U.S. population. Data from the five program sites funded through this effort provide a unique opportunity to understand the costs associated with offering screening through organized programs. Each program is described in detail elsewhere in this issue (9,10).

CDC is undertaking a detailed evaluation of CRCSDP to estimate the cost of implementation (start-up and maintenance), describe implementation processes, assess patient outcomes, and determine the relative cost-effectiveness of screening modalities. We report here on the start-up costs of establishing CRC screening programs, which include all expenditures before delivery of the service. This information is essential for the estimation of the funding required to plan and start a CRC screening program. Subsequent reports will include costs incurred during the service delivery phase of the program.

Methods

We developed an instrument to collect data on use of resources and costs related to the start-up of CRCSDP. The start-up period was defined as the time between the date of the funding award (August 31, 2005) and the start of delivery of the screening service in each program. We recognized that the time required to complete start-up activities might differ across programs, resulting in start-up periods of unequal length. Nevertheless, we adopted this definition to ensure that start-up activities and their costs were fully captured at each program site.

Well-established methods for collecting cost data for program evaluation, such as the “ingredient approach,” were considered in designing the questionnaire (11-14) (S. Subramanian, PhD, unpublished data, 2006). Costs were assigned to four budget categories (staff salaries and labor, contract expenditures, purchases, and administrative expenditures; Figure 1), and costs within each category were then allocated to program activities. Activity-based costs were derived by aggregating expenditures for staff salaries and labor, contractual costs, and purchases for each activity. We also collected overhead or indirect costs, including expenditures for items such as telecommunications and rent associated with CRCSDP. The monetary value of in-kind contributions provided to the programs during the start-up period was also estimated.
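To make the activity-based roll-up concrete, the short sketch below (in Python) shows one way expenditures tagged with a budget category and a program activity could be aggregated. It is a simplified illustration rather than the actual CRCSDP instrument: the activity names follow Table 1, but the record layout, field names, and dollar figures are hypothetical.

    from collections import defaultdict

    # Each expenditure record carries a budget category (labor, contract,
    # purchases, administrative) and the program activity it supports.
    # The records below are illustrative values only, not CRCSDP data.
    expenditures = [
        {"activity": "Program management", "category": "labor", "cost": 12500.00},
        {"activity": "Program management", "category": "administrative", "cost": 1800.00},
        {"activity": "Data collection and tracking", "category": "contract", "cost": 7200.00},
        {"activity": "Public education and outreach", "category": "purchases", "cost": 950.00},
    ]

    def aggregate_costs(records):
        """Roll expenditures up to activity totals and budget-category totals."""
        by_activity = defaultdict(float)
        by_category = defaultdict(float)
        for rec in records:
            by_activity[rec["activity"]] += rec["cost"]
            by_category[rec["category"]] += rec["cost"]
        return dict(by_activity), dict(by_category)

    activity_totals, category_totals = aggregate_costs(expenditures)
    total = sum(activity_totals.values())
    for activity, cost in sorted(activity_totals.items(), key=lambda kv: -kv[1]):
        print(f"{activity}: ${cost:,.2f} ({cost / total:.0%} of start-up cost)")

The same records can be summed along either dimension, which mirrors how costs are reported both by budget category (Figure 2) and by activity (Figure 3).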

Figure 1 (flow chart). Approach to collection of cost data in study of start-up costs in five programs in Colorectal Cancer Screening Demonstration Program, 2005–2006.

The cost instrument was designed to ensure collection of accurate data despite variations among the five programs. The programs differed in structure, provider network, selection of the screening test, and size of the service delivery area, and all of these factors affected costs. One of the five programs is implemented statewide, two are restricted to large urban areas (one city each), and two others serve clients throughout one to three counties. Two programs provide colonoscopy as the primary screening test, and the other three provide guaiac-based fecal occult blood tests. The structure for service delivery was more decentralized in some programs than in others. Decentralized programs, for example, contracted with providers to perform recruitment, screening, and patient follow-up activities, whereas centralized programs did not outsource these activities. Provider networks varied by site and included hospitals, specialty care centers, state health departments, and community health centers.

All data were collected in Microsoft Excel (Microsoft Corporation, Redmond, Washington). Program staff entered the data prospectively into the Excel work sheets to ensure accuracy of the information and avoid issues related to recall bias. For example, program staff maintained a log of the activities performed that was updated on a weekly or monthly basis. Programs were also provided with a user guide giving detailed definitions of each activity captured for the start-up period. This guide assisted with data reporting and helped to ensure consistent reporting among all programs so that meaningful comparisons could be made. Evaluators conducted a series of conference calls with each site to provide additional guidance for data collection.
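As a minimal sketch of how such prospective logs could be consolidated for analysis (in Python with pandas), assume a simple workbook layout with one sheet per month and columns for staff member, activity, and hours; the file name and column names here are hypothetical, not the actual CRCSDP worksheets.

    import pandas as pd

    # Hypothetical layout: each monthly log sheet has columns
    # ["staff_member", "activity", "hours"], with activities defined
    # as in the user guide (Table 1).
    logs = pd.read_excel("program1_activity_log.xlsx", sheet_name=None)  # dict of sheets
    monthly = pd.concat(logs.values(), keys=logs.keys(), names=["month", "row"])

    # Hours worked on each activity across the start-up period.
    hours_by_activity = monthly.groupby("activity")["hours"].sum()
    print(hours_by_activity.sort_values(ascending=False))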

We analyzed costs according to CRCSDP program activities during the start-up period. To estimate true labor costs, we used information collected on 1) the number of hours staff worked per month on various activities, 2) the proportion of salary paid through CRCSDP funds, 3) the percentage of time staff members worked, and 4) staff salary. Salary information was requested as either a range or the actual base salary, in addition to the fringe benefit rate. We used the average of the lower and upper bounds of the salary range when necessary. When salary information was not provided, we used the national average compensation for the specific job category from the Bureau of Labor Statistics (www.bls.gov) or the average salary for a similar job category provided by the programs. On the basis of this information, we computed the hourly labor rate and the proportion of in-kind labor cost (labor hours expended but not covered by CDC funds). Labor costs were then aggregated for each activity in each program.
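The labor valuation can be sketched as follows; this is an illustration under stated assumptions rather than the evaluators' actual code. A full-time year is taken as 2,080 hours, and the salary range, fringe rate, hours, and CDC-funded share are made-up examples.

    FULL_TIME_HOURS_PER_YEAR = 2080  # assumption: 40 hours/week x 52 weeks

    def hourly_rate(salary_low, salary_high, fringe_rate):
        """Hourly labor rate from the midpoint of a salary range plus fringe benefits."""
        base = (salary_low + salary_high) / 2
        return base * (1 + fringe_rate) / FULL_TIME_HOURS_PER_YEAR

    def labor_cost(hours_by_month, rate, cdc_funded_share):
        """Split an activity's labor cost into CDC-funded and in-kind portions."""
        total_hours = sum(hours_by_month)
        total_cost = total_hours * rate
        cdc_cost = total_cost * cdc_funded_share
        in_kind_cost = total_cost - cdc_cost  # hours expended but not covered by CDC funds
        return total_cost, cdc_cost, in_kind_cost

    # Example: a coordinator spending part of three months on program management.
    rate = hourly_rate(48000, 56000, fringe_rate=0.28)
    total, cdc, in_kind = labor_cost([30, 42, 35], rate, cdc_funded_share=0.60)
    print(f"rate ${rate:.2f}/h, total ${total:,.0f}, CDC ${cdc:,.0f}, in-kind ${in_kind:,.0f}")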

For contract expenditures, we aggregated the costs of consultants and funding to provider sites by program activity, such as technology support and development of media materials. Costs for materials, equipment, and supplies were also computed for each activity in each program. We aggregated the overhead costs related to the start-up period and confirmed the activity-based cost estimates by comparing them with the CRCSDP funds each program reported expending. When discrepancies were noted, we contacted the program to clarify the data provided and resolve inconsistencies.
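A consistency check of this kind might look like the following sketch; the 2% tolerance and the dollar figures are illustrative assumptions, not values from the study.

    def reconcile(cdc_activity_costs, reported_cdc_expenditure, tolerance=0.02):
        """Compare summed CDC-funded activity costs with a program's reported total.

        Returns whether the relative gap is within the (arbitrary) tolerance,
        along with the estimate and the gap, so discrepancies can be followed up.
        """
        estimated = sum(cdc_activity_costs.values())
        gap = abs(estimated - reported_cdc_expenditure) / reported_cdc_expenditure
        return gap <= tolerance, estimated, gap

    ok, estimated, gap = reconcile(
        {"Program management": 21400, "Data collection and tracking": 13900, "Quality assurance": 8200},
        reported_cdc_expenditure=44600,
    )
    print(f"estimated ${estimated:,}, relative gap {gap:.1%}, within tolerance: {ok}")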

Results

The start-up periods for the five programs ranged from 9 to 11 months. Details on the start-up activities are provided in Table 1. CDC funding and in-kind contributions were key resources for the program start-up activities. Funding sources for costs incurred by each program during the start-up period are shown in Table 2. The mean total cost for programs was $171,139. The lowest cost was $60,602, and the highest cost was $337,715.

On average, CDC funding accounted for 42% of total cost and in-kind contributions for 50%. CDC funds ranged from 13% to 63% of total start-up costs. Two programs also received funding from other sources, including the state Comprehensive Cancer Control program and the state CRC task force. With these two programs excluded, CRCSDP funding from CDC averaged 46% of total start-up costs. For two of the programs, CDC funding for CRCSDP activities was substantially greater than in-kind contributions. In-kind contributions varied among the programs and constituted 28% to 67% of total costs. They included donated labor time (e.g., physicians on the medical advisory board) and supplies such as computers. Labor was a major component of in-kind donations.

Distribution of start-up costs among budget categories is shown in Figure 2. The category with the largest expenditure was labor, which on average accounted for 67% (range, 55%–78%) of the start-up cost. Only one program incurred expenditures related to consultants; these costs were 2% of the total cost among all programs. On average among the five programs, administrative cost was 17% (range, 6%–34%) of the total cost, and the cost for materials and supplies was 14% (range, 5%–27%). Materials and supplies included items such as postage, forms, brochures, and medical supplies.

Figure 2 (bar graph). Distribution of start-up costs, by budget category, in study of five programs in Colorectal Cancer Screening Demonstration Program, 2005–2006.

For all programs combined, the largest cost categories by activity were management (28%; range, 18%–34%); data collection and tracking, which were mainly for database development (17%; range, 8%–35%); administrative costs (17%; range, 6%–34%); and quality assurance (12%; range, 10%–15%) (Figure 3). Other activities with significant expenditures included public education and outreach (9%; range, 6%–13%) and patient support (8%; range, 0%–19%).

Figure 3 (pie chart). Percentage distribution of start-up costs, by activity, averaged across the five programs in the Colorectal Cancer Screening Demonstration Program, 2005–2006. Numbers do not add up to 100% due to rounding.

Cost allocations among activities for each of the five programs are shown in Figure 4. This distribution varied among the programs. Database development was the largest cost category for program 1 (35%), and administration/overhead was the largest category for program 4 (34%). The largest category for the other three programs was program management (25%–34%).

Figure 4 (bar chart). Distribution of start-up costs, by activity, in study of five programs in Colorectal Cancer Screening Demonstration Program, 2005–2006.

Discussion

To our knowledge, no previous evaluation has provided details on the costs to begin a CRC screening program for the underserved population. The important contributions of this paper include 1) the imputation of market value to in-kind contributions (voluntary and donated services and products), which can help other programs explicitly account for these resources in their budgets when they might not be freely available; 2) the categorization of cost components (Table 1), which provides a useful guide to the different sources of costs; and 3) a description of the steps involved in valuing labor. Our analysis shows that start-up costs can be significant and should be considered in planning and budgeting for future CRC screening programs.

One of the largest cost components was overall program management, which involved a wide range of activities. Necessary resources included expenditures to develop fiscal systems, recruit and train staff, establish policies and procedures, and negotiate contracts with providers. Another significant cost component was developing a database system to monitor and track patient services. Labor required to perform these activities was the most significant budget category; materials and supplies accounted for a much smaller proportion of the total cost. Most of these activities represent fixed costs and therefore will not vary in relation to the volume of screens performed.

In this CRC screening demonstration, in-kind contributions were critical in providing the resources required for the start-up program activities. Therefore, total cost of start-up activities should include the monetary value of these contributions, which are generally related to donation of labor hours. This in-kind labor was mainly provided by senior management staff members, who were vital for the overall success of the program, and by physicians and other key individuals who participated in the medical advisory board to ensure that the program was designed to include pertinent clinical care partners and that quality care was provided to program participants. In future planning for resource allocation for a similar program, these critical categories should not be overlooked.

Start-up costs varied substantially across the five programs in this screening demonstration. The infrastructure available before the start of this effort accounts for some of the difference. For example, programs that could easily manipulate existing data-collection tools did not have to incur large expenses to create new data systems. Therefore, we can expect programs that plan to build on a well-established infrastructure to incur smaller start-up costs than programs with limited infrastructure. Other sources of variation could include the type of screening tests offered, the geographic area covered, the setting in which the program was created (e.g., academic medical center vs state health department), and individual program contributions to administrative costs, which were a significant proportion of total costs. Although we provided a user’s guide to minimize inconsistencies in reporting data, programs may have differed in the allocation of expenses to the various activity-based categories. Systematic quantitative assessment of these variations in start-up cost was not possible because of the small number of pilot sites available. Finally, we did not assess the cost incurred in the start-up period in relation to the effectiveness of the services provided by the programs. An evaluation of the cost-effectiveness of the programs is planned, and results based on intermediate outcomes (cost per screen performed and cost per polyp/cancer detected) will be reported in future publications. However, such a study cannot be conducted in the start-up phase of a program when no screening has occurred.

The information provided in this assessment on the magnitude of cost related to specific start-up activities can serve as a guide to estimation of start-up costs for funding agencies and organizations implementing screening programs. The detailed list of start-up activities developed for this study can also assist program staff to develop budget estimates, and the real-world cost values reported in this analysis can serve as a benchmark for evaluation of these estimates. Thus, our work should provide essential information for the successful start of CRC screening programs. Furthermore, details on implementation costs are expected to provide in-depth assessment of the costs associated with recommended screening tests and realistic estimates of the cost of diagnosis and complications.

Acknowledgments

We thank the staff of sites participating in the CRCSDP for the generous contribution of time and cooperation in collection and provision of the cost data. We also recognize and thank the project team for the CRCSDP of CDC for ongoing support in the economic analyses of the project.

Author Information

Corresponding Author: Florence K. L. Tangka, PhD, Division of Cancer Prevention and Control, Centers for Disease Control and Prevention, 4770 Buford Hwy NE, Mailstop K-55, Atlanta, GA 30341-3717. Telephone: 770-488-4639. E-mail: FTangka@cdc.gov.

Author Affiliations: Laura Seeff, Amy DeGroff, James Gardner, A. Blythe Ryerson, Marion Nadel, Janet Royalty, Division of Cancer Prevention and Control, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, Atlanta, Georgia; Sujha Subramanian, Bela Bapat, Research Triangle Institute, Inc, Research Triangle Park, North Carolina.

References

  1. Rex DK, Johnson DA, Lieberman DA, Burt RW, Sonnenberg A. Colorectal cancer prevention 2000: screening recommendations of the American College of Gastroenterology. American College of Gastroenterology. Am J Gastroenterol 2000;95(4):868-77.
  2. Subramanian S, Amonkar MM, Hunt TL. Use of colonoscopy for colorectal cancer screening: evidence from the 2000 National Health Interview Survey. Cancer Epidemiol Biomarkers Prev 2005;14(2):409-16.
  3. Seeff LC, Nadel MR, Klabunde CN, Thompson T, Shapiro JA, Vernon SW, et al. Patterns and predictors of colorectal cancer test use in the adult U.S. population. Cancer 2004;100(10):2093-103.
  4. Pignone M, Saha S, Hoerger T, Mandelblatt J. Cost-effectiveness analyses of colorectal cancer screening: a systematic review for the U.S. Preventive Services Task Force. Ann Intern Med 2002;137(2):96-104.
  5. Smith RA, Mettlin CJ, Davis KJ, Eyre H. American Cancer Society guidelines for the early detection of cancer. CA Cancer J Clin 2000;50(1):34-49.
  6. Pignone M, Rich M, Teutsch SM, Berg AO, Lohr KN. Screening for colorectal cancer in adults at average risk: a summary of the evidence for the U.S. Preventive Services Task Force. Ann Intern Med 2002;137(2):132-41.
  7. Vijan S, Hwang EW, Hofer TP, Hayward RA. Which colon cancer screening test? A comparison of costs, effectiveness, and compliance. Am J Med 2001;111(8):593-601.
  8. Frazier AL, Colditz GA, Fuchs CS, Kuntz KM. Cost-effectiveness of screening for colorectal cancer in the general population. JAMA 2000;284(15):1954-61.
  9. DeGroff A, Holden D, Green SG, Boehm J, Seeff LC, Tangka F. Starting up the Colorectal Cancer Screening Demonstration Project. Prev Chronic Dis 2008;5(2). http://www.cdc.gov/pcd/issues/2008/apr/07_0204.htm.
  10. Seeff LC, DeGroff A, Tangka F, Wanliss E, Major A, Nadel M, et al. Development of a federally funded demonstration program to screen for colorectal cancer. Prev Chronic Dis 2008;5(2). http://www.cdc.gov/pcd/issues/2008/apr/07_0206.htm.
  11. French MT, Dennis ML, McDougal GL, Karuntzos GT, Hubbard RL. Training and employment programs in methadone treatment: client needs and desires. J Subst Abuse Treat 1992;9(4):293-303.
  12. French MT, Dunlap LJ, Zarkin GA, McGeary KA, McLellan AT. A structured instrument for estimating the economic cost of drug abuse treatment. The Drug Abuse Treatment Cost Analysis Program (DATCAP). J Subst Abuse Treat 1997;14(5):445-55.
  13. French MT, Roebuck MC, McLellan AT. Cost estimation when time and resources are limited: the brief DATCAP. J Subst Abuse Treat 2004;27(3):187-93.
  14. Salome HJ, French MT, Miller M, McLellan AT. Estimating the client costs of addiction treatment: first findings from the client drug abuse treatment cost analysis program (Client DATCAP). Drug Alcohol Depend 2003;71(2):195-206.

Tables

Table 1. Activity-Based Categories in Study of Start-Up Costs in Five Programs in Colorectal Cancer Screening Demonstration Program, 2005–2006

Program management
  Defining specific measurable and realistic objectives, including goals of screening
  Recruiting, hiring, and training staff
  Developing fiscal system
  Collaborating with Centers for Disease Control and Prevention
  Establishing and managing related contracts
  Identifying and contracting with local physicians and clinics to deliver screening services
  Developing administrative policies and procedures
  Managing programmatic, administrative, and reporting issues
  Traveling for program meetings
  Establishing necessary administrative billing and reimbursement system

Public education and outreach
  Developing and planning public education and outreach activities
  Conducting outreach and in-reach activities
  Conducting and facilitating related training
  Collaborating with partners

Quality assurance and professional development
  Convening medical advisory board
  Developing quality-control standards and mechanisms
  Developing clinical policies and procedures
  Developing or enhancing training to educate and train health care professionals

Partnership development and maintenance
  Developing and maintaining partnerships (e.g., Comprehensive Cancer Control, medical health care systems, businesses)

Data collection and tracking
  Developing and adapting data-collection and reporting system
  Establishing surveillance system to track clients with abnormal screening results or diagnosis of cancer and follow up with them

Patient support
  Establishing patient support system to provide appropriate screening, diagnostic, and treatment services
  Planning and identifying funding sources to ensure treatment services for people with cancer diagnosis or medical complications

Other activities
  Designing plan for program evaluation, job orientation, and training for collection of data on cost

Administrative/overhead costs
  Indirect costs (e.g., rent, telecommunications, maintenance)
Table 2. Distribution of Start-Up Costs, by Funding Source, Study of Five Programs in Colorectal Cancer Screening Demonstration Program (CRCSDP), 2005–2006

Funding Source               Program 1   Program 2   Program 3   Program 4   Program 5      Mean
Total, $                       145,410      60,602     146,193     337,715     165,775   171,139
In-kind contributions, %a           63          58          37          67          28        50
CDC funding, %                      13          42          63          33          57        42
Other, %b                           24           0           0           0          15         8

All percentages are computed as percentage of the total cost.
a Labor costs and nonlabor costs (e.g., telephone calls, bowel preparation kits, printing of data-collection forms).
b Includes Comprehensive Cancer Control program and colorectal cancer task force funding.
