Lessons Learned from FIPSE Projects II - September 1993

Austin Peay State University

Project for Area Concentration Achievement Testing (PACAT): Curricular Evaluation

Purpose

In recent years, many institutions have found that national achievement tests are poorly suited to curriculum improvement within academic departments. Typically, national testing instruments are not adapted to the goals and content emphases of particular disciplinary departments and cannot gauge the impact of the curriculum on student learning. On the other hand, the validity of "in-house" tests, internally normed and tailored to individual departmental needs, cannot be judged against external criteria.

The Project for Area Concentration Achievement Testing (PACAT) responded to this need for assessment in the major that is sensitive to departmental characteristics and curricula while still permitting comparison of student performance across peer institutions. Begun in 1983 as a consortium of Tennessee psychology departments, the project aimed to expand and disseminate its assessment model in psychology, political science, and social work to faculty in nine other states and to create a minimum of six additional disciplinary consortia.

Innovative Features

PACAT surveys were used to identify the content area emphases of departments in each discipline. Assessment instruments for graduating seniors were then constructed from items submitted by faculty at participating institutions and designed to conform to these curricular patterns.

By giving faculty ownership of multiple-choice standardized testing and by measuring the relationship between content area performance and departmental curricula, PACAT bridges the gap between parochial instruments and nationally normed exams. PACAT creates, maintains, and updates test items and serves as a coordinating office, scoring the tests and interpreting summary results for departments. Score reports generated by PACAT contain raw and standardized scores for each content area tested, along with the individual department's performance history.
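
The report does not describe how PACAT computes these standardized scores. As an illustration only, the short sketch below assumes a conventional linear scaling of each content-area raw score against consortium norms onto a reporting scale with mean 500 and standard deviation 100; the norm figures and the standardize helper are hypothetical, not PACAT's actual procedure.

    # Illustrative sketch only: maps raw content-area scores onto an assumed
    # reporting scale (mean 500, SD 100). PACAT's actual scaling method is not
    # described in this report.
    def standardize(raw_score, norm_mean, norm_sd, scale_mean=500, scale_sd=100):
        """Convert a raw score to an assumed standardized reporting scale."""
        z = (raw_score - norm_mean) / norm_sd   # distance from norm mean, in SD units
        return scale_mean + scale_sd * z

    # Hypothetical department means by content area, with assumed consortium
    # norms given as (mean, standard deviation).
    norms = {"developmental": (32.0, 6.0), "social": (28.0, 5.5)}
    department_raw = {"developmental": 35.0, "social": 26.0}

    for area, raw in department_raw.items():
        mean, sd = norms[area]
        print(f"{area}: raw {raw:.1f}, standardized {standardize(raw, mean, sd):.0f}")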

These senior tests have been used to meet state-mandated assessment requirements. Since they point out the academic weaknesses and strengths of graduating students, individual departments can alter course content and balance course requirements in line with test results. Further, departments can evaluate the impact of their curriculum on student achievement against other campus departments and against their own performance in previous years.

Evaluation

Several construct validation studies of the Area Concentration Achievement Test (ACAT) in psychology were conducted by independent evaluators and by faculty at adopting institutions; that is, evaluators examined the extent to which inferences and decisions derived from test scores were supported by evidence and rationale. Comparison of groups of introductory and senior students showed seniors' test performance to be superior on all subtests and on overall scores. The ACAT was thus demonstrated to be psychometrically sound and to measure what it was designed to measure--the impact of the psychology curriculum on student majors. Also, reliability statistics were collected at each test administration by using multiple versions of the test and separate groups of participating students.

Evaluation of PACAT shows that the project exceeded its objectives in the following areas: number of curriculum surveys returned; number of follow-up requests; number of ACAT instruments administered annually; number of disciplines and curricular patterns used by ACAT; number of institutions and states added to the consortia; and test items added to item pools used to construct ACAT instruments.

When FIPSE funding began in 1988, 865 ACATs had been administered in 22 departments during a five-year period, mostly in Tennessee. By 1992, 5,267 ACATs had been administered in 54 departments in 19 states. Curriculum surveys were sent to 10,600 academic departments nationally, and over 4,700 were returned in 13 disciplinary areas. Follow-up surveys were requested by 1,394 departments. The Educational Testing Service contributed to PACAT a large pool of items gathered under another FIPSE-funded project.

Beyond the three original ACAT disciplines, new instruments have been introduced in English literature, communications, art, biology and agriculture. Five new curricular versions of the ACAT in psychology and two in biology have been adopted for use. New multi-state consortia are being formed in history, public administration and criminal justice.

Of course, the long-term impact of PACAT on students will depend on the extent to which test results are used to improve curriculum and instruction. It is too early to determine this effect definitively, since the consortia are still forming and it takes several years of data collection before departments are ready to act on the results.

ACAT has already been used to justify program accreditation, perform self-studies, and comply with state-mandated assessments. Colleges and universities such as Jamestown College, Belmont College, The University of Alabama, Ohio State University, Wayne State University, and East Tennessee State University have used ACAT results to support faculty development, to isolate areas of academic weakness among their graduating students, to adjust curriculum content to address those weaknesses, to initiate faculty and student research, and to provide external validation for program evaluations of departments and courses.

Project Continuation

Austin Peay continues to support PACAT at levels somewhat higher than those prior to FIPSE funding, but the project must now obtain independent funding for ACAT test materials and administration. PACAT currently continues its full program and is developing new assessment consortia.

Major Insights and Lessons Learned

Giving individual departments a voice in the construction of tests to evaluate their curricula is a long, slow process: implementing a program takes at least three or four years, and many programs then require several more years of fine-tuning and coordination with administrators and departmental faculty. Unfortunately, many institutions ignore students when planning assessment, even though the planning stage is when students' acceptance and cooperation could be won.

PACAT revealed that many departments at small institutions are unable to initiate assessment programs of their own, especially programs that use external comparisons. By pooling resources and sharing costs through consortia, however, departments can carry out assessment while preserving the diversity of their curricula.

Available Information

Information about the surveys, curricular patterns, and the ACATs in specific disciplines is available by writing or calling PACAT. Departments are welcome to join consortia at any time and, although submitting test items is not a requirement, they are encouraged to do so. Sample test booklets will be provided for examination upon receipt of a written request. Comments, both favorable and unfavorable, along with concrete suggestions for improving the test items, are welcome.

Anthony J. Golden, Director
Project for Area Concentration Achievement Testing (PACAT)
P.O. Box 4568
Austin Peay State University
Clarksville, TN 37044
Telephone: (615) 648-7451; BITNET: ANTHONY@APSU
