OPE: Office of Postsecondary Education
Lessons Learned from FIPSE Projects I - October 1990 - University of California - Los Angeles

A Value-Added Approach to Institutional Excellence

Purpose of Project:

UCLA employed a value-added method to achieve institutional excellence, one that differed significantly from traditional approaches based on the tested abilities of entering students. In this project, effectiveness was judged by an institution's capacity to develop the talents of its students. The focus was on how much students actually learned or improved from entry to exit, rather than on their relative level of competency at the point of entry.

Its primary purpose was to assess student liberal education outcomes as a way of strengthening institutional effectiveness. Under this view, a high-quality institution develops and maximizes the intelligence and personal talents of its students. The student assessment programs were implemented by a consortium of seven colleges and universities (Spelman College, Eckerd College, UCLA, Carnegie Mellon University, Empire State College, Hood College, and Rhode Island College).

Innovative Features:

Beginning in 1984, these seven diverse institutions worked together to create a model value-added assessment program. This was the first time such a program was attempted by a consortium. The deliberate mix of institutions permitted testing the generality of findings in a variety of educational contexts.

Evaluation:

To obtain baseline data on student attitudes and expectations, institutions administered the Cooperative Institutional Research Program's freshman survey in 1984. Baseline data on general education were gathered by administering a common instrument, the ACT-COMP battery. The evaluation design also included post-test assessments: affective information from the Follow-Up Survey of entering students and cognitive data from a readministration of the ACT-COMP. Testing students on the same or similar instruments at repeated intervals provided measures of growth or change over time: the value added.
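The pre-test/post-test logic described above reduces to a simple change score per student. The sketch below is illustrative only; the student IDs and scores are hypothetical, and the function name is invented (the project's actual instruments were the CIRP freshman survey and the ACT-COMP battery).

```python
# Minimal sketch of the value-added calculation: each student's growth is
# the post-test score minus the pre-test score on the same instrument.
# All IDs and scores below are hypothetical.

def value_added(pre_scores, post_scores):
    """Return per-student change (post minus pre) for students tested at both points."""
    return {student: post_scores[student] - pre
            for student, pre in pre_scores.items()
            if student in post_scores}

# Hypothetical ACT-COMP-style totals at entry (1984) and at exit.
entry_scores = {"s001": 168, "s002": 185, "s003": 172}
exit_scores = {"s001": 181, "s002": 190, "s003": 186}

growth = value_added(entry_scores, exit_scores)
print(growth)                                   # per-student change scores
print(sum(growth.values()) / len(growth))       # mean value added for the cohort
```

Note that only students present at both administrations contribute, which is why the longitudinal tracking discussed below matters so much in practice.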

All member institutions are now developing a longitudinal student data base that will become a regular part of the institution's planning and management. Evaluation data will go beyond standardized test results and include information on residence life, study habits, extracurricular activities, and choice of major.

Thus far, the project has been evaluated through comprehensive case studies and site visits to each campus by the Higher Education Research Institute (HERI) staff as well as through regular meetings of the consortium participants.

Complex consortial projects involving a number of institutions, such as this one, require more than several years for full implementation and adequate evaluation. An impact study measuring student outcomes got under way in the fall of 1989. Longitudinal student data from the seven participating consortium institutions will be compared with data from a matched control group of seven non-participating institutions on such outcomes as student retention rates, changes in student satisfaction with their institutions, and changes in students' educational plans and vocational aspirations over time.

Impact or Changes From Grant Activities:

The project has already changed thinking significantly about the role of assessment and its relationship to effective institutional practice. Not only do participants demonstrate an informed awareness of assessment, in general, but they have committed their institutions to long-term value-added assessment, in particular. Evidence ranges from successful efforts of campus leadership to commit needed staff resources, to the creation and funding of full-time assessment positions, to the implementation by several consortium institutions of regular longitudinal assessment programs, to plans to continue and enlarge the consortium.

An unanticipated institutional need for assistance in establishing longitudinal student data bases was answered by project-sponsored value-added workshops and a specially created two-hour videotape that explores relevant technical and organizational data base issues.

What Activities Worked Unexpectedly?

Unexpectedly, the consortium structure turned out to be a major element of the program's success. Its cooperative dynamics helped keep the participants on track because they were able to exchange information, identify common problems, and profit from each other's solutions.

Probably the most critical finding of the project thus far is the need for comprehensive student data bases drawn from admissions, financial aid, student affairs, placement, alumni, and registration records, together with students' academic and entry characteristics. These data provide a context for the students' college years (their courses, programs, and activities) within which to make sense of the assessment data. In fact, the absence of such data bases is the single biggest impediment to the effective use of value-added assessments.
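The kind of longitudinal student record described above can be sketched as a small data structure that merges the named sources with repeated test administrations. This is an illustrative sketch only; the field names are invented for the example and do not represent the project's actual schema.

```python
# Illustrative sketch of a longitudinal student record combining contextual
# data (major, residence, activities) with repeated assessment scores.
# Field and instrument names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    student_id: str
    entry_year: int
    major: str
    residence: str                                    # residence-life context
    activities: list = field(default_factory=list)    # extracurricular activities
    assessments: dict = field(default_factory=dict)   # instrument -> {year: score}

    def add_score(self, instrument, year, score):
        """Record one administration of an instrument."""
        self.assessments.setdefault(instrument, {})[year] = score

    def growth(self, instrument):
        """Change from the earliest to the latest administration of one instrument."""
        scores = self.assessments.get(instrument, {})
        if len(scores) < 2:
            return None        # a single administration cannot yield value added
        years = sorted(scores)
        return scores[years[-1]] - scores[years[0]]

rec = StudentRecord("s001", 1984, "History", "on-campus")
rec.add_score("ACT-COMP", 1984, 168)
rec.add_score("ACT-COMP", 1988, 181)
print(rec.growth("ACT-COMP"))
```

The point of the structure is the one the project makes: without contextual fields alongside the repeated scores, the change score alone is uninterpretable.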

What Activities Didn't Work?

Initially, the project assumed that participants would be able to link their value-added assessments to other critical student data. This turned out to be a naive assumption; there were really no longitudinal student data bases to permit these linkages, so new assessments were conducted largely in a vacuum.

The other problem for the project was finding suitable instruments for assessing general education. The ACT-COMP presented difficulties with test administration and interpretation of scores. Therefore, some consortium members have decided to build on existing cognitive assessments and use admissions or subject placement tests as "pre-tests" repeated at appropriate intervals to create value-added longitudinal data.

Similarly, outcome assessments such as comprehensive examinations, competency tests required for graduation, junior exams, professional certification exams, or graduate school admissions tests may be used as pre-tests at some appropriate earlier time to produce longitudinal data. Some consortium members came to feel that to ensure accurate and adequate pre-testing and post-testing, participation in assessment on campuses should be required.

What Do You Have To Send Others And How Do They Get It?

A major outcomes monograph entitled College Student Outcomes Assessment: A Talent Development Perspective has been published by ASHE-ERIC. Its unique feature is a detailed compendium of available instruments for assessing cognitive outcomes. The HERI staff also prepared a newsletter, the Value-Added Exchange, to keep consortium members and other interested parties informed about program developments.

Also available is a two-hour videotape on how to design and conceptualize a comprehensive student data base, the technical and political considerations in creating it, and how the data can be used for institutional self-study and analysis. Interested readers can obtain a VHS cassette copy of this video for $25 by writing to:

Alexander W. Astin
Higher Education Research Institute
Graduate School of Education
UCLA
Los Angeles, CA 90024
213-825-1925

What Has Happened To The Program Since The Grant Ended?

Current planning involves enlarging the consortium to perhaps 35 institutions that will implement findings from the study. Potential participants are being identified, and the project is seeking funds to support this expansion. Beyond this financial and organizational base, members hope to employ a full-time consortium director and to develop regional workshops on value-added assessment themes. The project has inspired a publication in the ACE/Macmillan Series in Higher Education entitled Assessment for Excellence: Philosophy and Practice of Assessment in Higher Education and has influenced California legislation on talent development by endorsing comprehensive data bases on all campuses in the state.

Project Insights: Key insights from the case study visits include:

  • Faculty resistance is a normal part of the process of instituting an assessment program. When faculty criticism and debate are given proper consideration, projects can use them constructively to translate opposition into advocacy.

  • Using the ACT-COMP long form to assess general education has its problems, especially in its administration and the interpretation and meaning of scores.

  • Considerable advance planning is needed to involve faculty, staff, and students in the assessment process. Some of the consortium members advocate making participation in assessment an institutional requirement.

  • As noted earlier, value-added assessment can build upon existing assessments, particularly admissions and subject matter tests. Longitudinal insights about student development can be gained through repeated administration of these tests over time. Assessment efforts also benefit clearly from being anchored in existing routines, such as using orientation and registration opportunities for pre-test data collection.

  • Assessment projects especially benefit from having key administrators personally involved, and rewarding and encouraging faculty participation.

  • A longitudinal student data base is key to an effective value-added assessment program.



 
Last Modified: 03/14/2007