Evaluation of the Appalachian Regional Commission's Vocational Education and Workforce Training Projects
I. Introduction

Within the context of large-scale policy changes in the goals, resources, and implementation of national vocational education and workforce training programs, the Appalachian Regional Commission (ARC) directs a relatively small grants program aimed at these same issues.  As government performance reporting requirements for nationwide vocational education programs increase, the Commission is focusing on evaluating the implementation and achievements of its projects, as well as the parallel performance reporting systems they employ, in order to improve both the program overall and its individual projects.  This report summarizes findings from an evaluation of vocational education and workforce training projects funded by ARC between 1995 and 2000.

Appalachia: A Region in Transition

Appalachia is an area that is undergoing significant changes in its social and economic well-being, yet it continues to lag behind the rest of the nation in education and income. Decades ago its economy depended on industry, agriculture, and mining; today, human capital and the service sector are growing more critical to economic growth.  And, as in much of the nation, information technology is becoming increasingly important. Furthermore, while some areas within the region have made substantial strides, others have shown only limited progress.  Poverty rates, high school completion rates, employment rates, and job growth rates are but a few of the indicators that illustrate the gaps that exist between the citizens of Appalachia and the overall population of the United States.  With poverty rates continuing to decrease and educational attainment and employment rates continuing to rise, the gap is narrowing. However, there remains much work to do.

Going beyond these simple indicators, it is clear that if the region is to become a vital player in the 21st century, its people must attain the new skills required to succeed in the changing world economy.  Students must not only graduate from high school but also be literate in mathematics, science, and technology.  They must be able to go beyond the attainment of basic skills to solve challenging problems, to use new tools for solving these problems, and to work with others across the region, the nation, and the world. The region must rely on human capital to adjust its labor markets and productivity, and human capital development depends on the strength of its workforce training and vocational education programs.

The Appalachian Regional Commission

ARC was created in 1965 to promote economic and social development in the region.  It is a federal-state partnership designed to help the region help itself by creating self-sustaining economic development and an improved quality of life.  As such, the agency functions as a catalyst, drawing upon the resources of the federal government, the participating states, and local resources, be they individuals, public agencies, or private organizations.  Although considerable progress has been made in the Commission's more than three decades of work, the ARC Strategic Plan: 1997-2002 identifies five key areas of remaining need:

  • Developing a knowledgeable and skilled population;
  • Supporting the region's physical infrastructure;
  • Promoting community and civic leadership;
  • Creating a dynamic economic base; and
  • Fostering healthy people.

The current evaluation addresses two of these areas:  developing a knowledgeable and skilled population and creating a dynamic economic base.  The stated objectives for the first goal in the strategic plan are (1) increasing the percentage of workers receiving basic education and skills training, skills upgrading, and customized training, which will lead to development of a workforce that is competitive in the 21st century world economy, and (2) increasing the percentage of students participating in school readiness, dropout prevention, school-to-work transition, and GED programs, thereby raising the college-going rate and preparing students for the world of work in the 21st century.

Moreover, improved student achievement and workforce readiness bring productivity improvements in the workplace.  These labor market outcomes, along with better business attraction and creation rates in targeted industries, work together to foster a dynamic and improved local economy.

To accomplish these five strategic goals, ARC provides financial and technical support to local, regional, and multistate projects through its Area Development Programs. The process for awarding these grants reflects the underlying partnership between the Commission and participating states, as well as the need to give local communities a voice in determining how ARC funds are to be allocated. Within each state, local development districts (LDDs) provide for grassroots-level participation, so that ARC activities originate from—and ultimately benefit—the communities themselves. 

Each year, the 13 states of Appalachia prepare individual annual strategy statements and spending plans. These documents contain state-level goals (which are aligned with ARC's five strategic goals) and corresponding proposals for each of the specific projects that are being recommended for funding. In some states, these initiatives are developed to reflect state priorities. In others, applicants submit proposals based on needs identified in their local communities.

Once approved by the governor, a state's recommendations for project funding are submitted to ARC. Each proposed project is then reviewed by ARC project coordinators and, in most cases, approved by the federal co-chair.  A limited number of projects originate and are funded each year directly through the Commission and ARC set-asides.  Project coordinators can negotiate changes to the proposed project with state program managers. Until recently, these adjustments primarily reflected changes to timetables and budgets.

Program Changes

Over the past several years, ARC has made some changes to its application, proposal review, and program monitoring processes. First, program staff developed a workbook for state program managers and applicants with the intent of collecting more complete application packages. By providing examples of outputs and outcomes, they hoped to encourage prospective projects to be mindful of these concepts when designing their implementation plans and to identify specific outputs and outcomes in their grant proposals.  Indeed, applicants are now required to specify outputs and outcomes and the degree to which these extend beyond the life of the grant.  Applicants are further encouraged to discuss quantifiable results of the proposed projects.

Second, staff provided a Grant Administration Manual that describes what should be included in a project's quarterly progress reports and final report. The manual includes sample formats and examples of how output and outcome measures can fit into the narratives. Program staff are also taking a greater role in negotiating with states and projects to improve project quality by strengthening the substance of outputs and requiring that outcomes be more specific. Most recently, ARC staff have begun making site visits to a sample of projects 2 years after the end of their grant period. These validation visits are designed to assess whether projects actually achieve their longer-term outcomes.

The evaluation is intended to provide both a look at what has been accomplished to date and specific recommendations for addressing this key area in the future. It is at once an evaluation of the progress achieved with ARC support over the last decade and an evaluation of a work in progress.  Because the findings and recommendations drawn from this evaluation reflect a program that has since changed, we do not attempt to generalize them to the current system.  The next section discusses the purpose of this evaluation in greater detail.

Study Overview

In the late 1990s, ARC began a systematic review of its portfolio of funded projects.  This study of vocational education and workforce training projects—conducted by Westat, a Rockville, Maryland, research firm—follows a similar study conducted in 2000 of ARC's educational projects; it builds upon the methodology and understanding of the ARC context from the previous study.  The study sample comprises 92 projects funded by ARC during the latter half of the 1990s and 2000. 

In an effort to examine how recent program changes have affected projects' objectives and data collection practices, the study was conducted with two cohorts of grant recipients.  Cohort 1 is composed of 67 projects that were funded between 1995 and 1999, before the changes were made, while the 25 Cohort 2 projects were funded in 2000, after the changes were made, and were still active at the time the study was being conducted.

Study Questions

ARC delineated four primary objectives for the evaluation: (1) assess the extent to which projects were able to accomplish their anticipated outcomes; (2) benchmark project activities and accomplishments against current national studies of workforce training and vocational education efforts; (3) assess the utility and validity of specific performance measurements that might enhance ARC's ongoing capacity to monitor and evaluate its workforce training and vocational education projects; and (4) make other policy recommendations that can improve ARC's efforts to monitor and assist its workforce training and vocational education projects.  In an effort to ensure that the evaluation addressed each of these objectives in a comprehensive manner, we identified seven primary, interrelated research questions that guided the study:

  • What are the characteristics of communities and individuals who benefited from the projects?
  • What problems were projects designed to address?
  • What approaches did projects use to ameliorate these problems?
  • What specific outcomes were projects designed to achieve?
  • To what extent have projects accomplished their objectives?
  • What factors influenced projects' ability to implement their approaches and achieve their objectives?
  • What performance reporting systems are projects utilizing, and how could these benefit ARC?

The evaluation employed both qualitative and quantitative techniques that addressed all of the study's outcome and process questions in various depths and to different degrees.  The approach included the following integrated activities.

Qualitative techniques:

A review of the literature regarding workforce training and vocational education and data collection requirements for these types of projects.  In connection with the literature review, we talked informally with recipients and evaluators of other federal vocational education funding.  These conversations contributed to the development of the mail survey and site visits and informed our recommendations to the Commission.

An extensive review of project files to gain a better understanding of the purpose, scope, and goals/objectives of the 92 projects in the study sample. The document review was also used to guide the construction of the questionnaire and the design and site selection of the case studies. Finally, the document review was used to identify the specific objectives and outcomes that projects delineated in their original proposals to ARC. These outcomes were entered into a database developed to generate an addendum to the mail survey that respondents used to indicate whether they had met their own intended outcomes.
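
To make this step concrete, the addendum generation amounts to a mail merge over the outcomes database: for each project, the addendum lists the outcomes stated in that project's proposal, each followed by a response scale. The short sketch below illustrates the idea in Python; the file name, column headers, and response categories are our own illustrative assumptions, not the actual database or instrument used in the study.

    # Illustrative sketch only: generate a per-project survey addendum listing the
    # outcomes stated in each project's original proposal, so respondents can
    # indicate whether each outcome was achieved. File names, column headers, and
    # response categories are hypothetical, not those used in the ARC study.
    import csv
    import os
    from collections import defaultdict

    def load_outcomes(path):
        """Group stated outcomes by project, keyed on (project_id, project_name)."""
        outcomes = defaultdict(list)
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                key = (row["project_id"], row["project_name"])
                outcomes[key].append(row["outcome_text"])
        return outcomes

    def write_addendum(project_id, project_name, stated_outcomes, out_dir="addenda"):
        """Write one plain-text addendum page with a response scale per outcome."""
        os.makedirs(out_dir, exist_ok=True)
        lines = [
            f"Survey Addendum -- Project {project_id}: {project_name}",
            "For each outcome stated in your original proposal, indicate the extent",
            "to which it was achieved.",
            "",
        ]
        for i, outcome in enumerate(stated_outcomes, start=1):
            lines.append(f"{i}. {outcome}")
            lines.append("   [ ] Fully achieved   [ ] Partially achieved   [ ] Not achieved")
            lines.append("")
        with open(os.path.join(out_dir, f"addendum_{project_id}.txt"), "w", encoding="utf-8") as f:
            f.write("\n".join(lines))

    if __name__ == "__main__":
        for (pid, pname), stated in load_outcomes("proposal_outcomes.csv").items():
            write_addendum(pid, pname, stated)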

Site visits to five projects to obtain more detailed information about project-related implementation experiences, accomplishments, and impacts. The case studies allowed us to explore in greater detail the experiences of projects that have implemented potentially promising practices that warrant further study, to verify project outcomes, and to gain an understanding of best practices regarding data tracking and reporting.

Quantitative techniques:

A mail survey to collect broad-based data on the implementation and impact of the 67 projects in the study sample that received ARC funding between 1995 and 1999, before changes in application requirements.  The survey was designed to collect a common set of data regarding these Cohort 1 projects' characteristics, implementation practices, outcomes, and data collection and reporting systems.  It also obtained extensive narrative information on the extent to which projects' original objectives were achieved.

An abbreviated survey of 25 projects that received ARC funding in 2000, after the changes in application requirements. The survey was designed to collect detailed information on these Cohort 2 projects' data collection and performance reporting systems and to assess the impact of ARC's revised application procedures.[1]

Appendix C provides a more detailed overview of these activities, as well as a discussion of the procedures used to select and refine the study sample.  Appendix D provides information on the process used to select the five case study sites.

Issues Regarding Study Methodology

Several caveats regarding the study are worth noting.  First, the sample is small because the program is relatively small, and the evaluation included only projects closed in the last 5 years.  Second, the process used to select the study sample systematically excluded projects that lacked a complete project file at ARC headquarters (in some cases project files were in the closure process or undergoing internal review and were not available for the evaluation).  Several projects were also excluded because, due to staff turnover, they lacked a knowledgeable individual who could respond to the mail survey.  These exclusions, while necessary, increased the likelihood that we would primarily survey projects that successfully implemented their ARC grant, and they potentially limited our opportunity to examine factors that hampered the efforts of ineffectual projects.  In addition, projects that received less than $10,000 from ARC were excluded from the sample.  Findings regarding the success and sustainability of ARC-funded vocational education and workforce training projects are therefore limited to the 67 Cohort 1 projects that responded to the survey.

Third, and similarly, the site visit findings reflect a purposefully selected segment of the study sample.  By conducting the mail survey prior to selecting case study sites, we were able to use preliminary survey findings to select potential case study sites.  The pool of potential sites included those that had achieved some of their intended outcomes, appeared to have in place a well-planned, complete, or innovative data collection system, and had sustained themselves over time.  As such, any conclusions drawn from the site visits may not pertain to the overall study sample.

Fourth, we initially planned to disaggregate all survey findings by the project characteristics discussed in Chapter 3.  However, after reviewing the data, we found very few noteworthy results from these analyses, due in large part to the small sample size, which was exacerbated by the small cell counts that occurred when survey responses were divided according to a variety of project characteristics.  Typical statistical standards require a sample size of at least 100 and cell counts of at least five (preferably ten or more) cases to conduct the more powerful analyses.  Additionally, many of the project characteristics were correlated or even overlapping (e.g., projects serving youth, projects serving adults, and projects serving both youths and adults), making the data inappropriate for more complex regression analyses.  There may also simply be few differences in project implementation and outcomes based on these characteristics.  Nonetheless, we do point out some noteworthy findings and refer the reader to the appropriate table in Appendix A.  Given the small sample size, these observations are largely speculative; we have not conducted tests of statistical significance.
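
To make the cell-count constraint concrete, the sketch below shows one way to flag a cross-tabulation that is too sparse for disaggregated analysis. It is illustrative only: the pandas-based approach, the variable names, and the example data are our own assumptions, not the study's actual analysis code or survey variables.

    # Illustrative sketch only: check whether a cross-tabulation of two survey
    # variables has any cell below the minimum count needed for disaggregation.
    # Column names, example data, and the threshold rule are assumptions.
    import pandas as pd

    def check_cell_counts(df, row_var, col_var, min_cell=5):
        """Cross-tabulate two categorical variables and report whether every
        cell meets the minimum count (here, five cases per cell)."""
        table = pd.crosstab(df[row_var], df[col_var])
        all_cells_ok = bool((table.values >= min_cell).all())
        return table, all_cells_ok

    if __name__ == "__main__":
        # Made-up responses for 67 hypothetical Cohort 1 projects.
        survey = pd.DataFrame({
            "population_served": ["youth", "adults", "both"] * 22 + ["youth"],
            "met_objectives": ["yes"] * 50 + ["partially"] * 12 + ["no"] * 5,
        })
        table, ok = check_cell_counts(survey, "population_served", "met_objectives")
        print(table)
        print("All cells meet the minimum count:", ok)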

Finally, the RFP requested an analysis of the extent to which grantees were complying with other federal and state performance reporting systems.  Survey data suggest that in most cases, if projects are participating in other systems, staff are not aware of it.  Similarly, other federal programs are structured, funded, and managed very differently from ARC's program, making comparisons of performance data inappropriate.  In addition, a lack of comparable outcome data precludes such comparisons.

Structure of the Report

The remainder of the report provides the substantive findings from the evaluation.  These results are organized as follows:

  • Chapter 2—History and Background of Vocational Education and Workforce Training
  • Chapter 3—Projects' Context
  • Chapter 4—Project Activities
  • Chapter 5—Achievement of Objectives
  • Chapter 6—Project Sustainability
  • Chapter 7—Project Objectives and Data Collection Activities
  • Chapter 8—Summary and Recommendations
  • Appendix A—Additional Survey Data
  • Appendix B—Evidence in Support of Projects' Outcomes
  • Appendix C—Technical Approach
  • Appendix D—Case Study Methodology and Reports
  • Appendix E—Project Descriptions
  • Appendix F—Cohort 1 Mail Survey
  • Appendix G—Cohort 2 Mail Survey

Notes

[1] As discussed previously, ARC program staff have revamped their application procedures and technical assistance in order to gather data that better reflect the performance outcomes and measurements that are the focus of the Government Performance and Results Act. Because many of these projects were still in operation at the time of the survey, we were not able to administer the entire survey.  Accordingly, Cohort 2 projects received an abbreviated survey and, therefore, are not included in many of the analyses conducted on Cohort 1 projects.
