
Answers in the Tool Box: Academic Intensity, Attendance Patterns, and Bachelor's Degree Attainment — June 1999

V. Conclusion: The Tool Box Story

The story told by this voyage is clear. It advances the observation of Alexander, Riordan, Fennessey, and Pallas (1982) that "academic variables are much more potent predictors of college completion" than social background variables (p. 324), Hearn's analysis of "non-traditional enrollment styles," and Cabrera, Nora, and Castaneda's (1993) remarkable adjustment to the retention-modeling literature, which unburied academic performance as a powerful, direct influence on the momentum toward degrees. It helps us advise and guide students no matter what paths of attendance they follow through higher education. It tells us that if degree completion lags for any student or group of students, the situation is fixable. We learn where to take the tool box, and what tools to use.

One must acknowledge that SES has a continuing influence on life-course events. But the analysis here (and elsewhere) indicates how much education can mitigate those effects, and in both directions (downward mobility is not a chimera). If SES were an overpowering presence, the tool box directions would be futile. Optimism is the preferred stance.

There is an obvious academic line to the story. To admit that some students are not prepared for the academic demands of the particular higher education environment in which they initially find themselves seems to be some form of heresy in the research traditions on this issue. We do students no favors by advancing to center stage complex stories built from marginal variables to explain why they don't finish degrees. Yes, there are psychological and personal reasons for non-completion, but these are extraordinarily difficult to micromanage. The tool box does not offer much help. At the same time, "academic demands" and "environment" are both relative and varied. Some students may struggle in a major they did not understand well enough before crossing its threshold, e.g. engineering (Adelman, 1998), or may over-estimate their own talent and proclivities in the field, e.g. music. If their academic performance lags, they may (1) migrate to another major, either without stopping out or with a short term of non-continuous enrollment, (2) stop out and rethink what they are doing in higher education, perhaps returning in one or two years, (3) turn up as provisional permanent drop-outs at age 30, or (4) change institutions in combination with a changed field and a stop-out. As this analysis has demonstrated, changing institutions has minor effects on completion. Academic preparation, continuous enrollment, and early academic performance, on the other hand, prove to be what counts.

This tripartite message has dimensions that can—and should—be heeded by those who make and execute policy for higher education, by those who advise and guide students in both high school and college, by those involved in research and evaluation of the school/college continuum, and by students themselves. Let's reflect on these dimensions.

Start with Opportunity-to-Learn

Think about the ACRES variable, the foundation of this analysis. It has three components, only one of which is subject to change: the curriculum component. For purposes of constructing the ACRES index, curriculum intensity and quality were set out as a scaled continuum. But every point on that scale can be described as a criterion, a standard of content. While test scores and class rank reflect standards of performance that are usually expressed in relative terms, there is no reason why all students cannot reach the highest levels of the curriculum scale, at which point, of course, the scale itself can be abandoned. This ideal state will not come to pass without ensuring opportunity-to-learn, and, at the present moment, not all secondary schools can (or do, or will) provide that opportunity (Spade, Columba, and Vanfossen, 1997). Many do not offer mathematics beyond Algebra 2; many offer Algebra 2 courses that, in content, are closer to Algebra 1. Many cannot offer the three basic laboratory sciences, or foreign language study beyond the second year, or computer programming, let alone Advanced Placement courses. Students who enter higher education from these schools enter with less momentum toward degrees than others. Poor and working-class students, students from rural areas, and minority students are disproportionately affected by this lack of opportunity-to-learn (Rosenbaum, 1976; Monk and Haller, 1993).

What can the higher education enterprise do to provide equitable opportunity-to-learn? Dual enrollment, a growing practice, is one answer. The higher education partner in this arrangement is usually a community college. Under dual enrollment, high school students who do not have access to trigonometry or physics or third-year Spanish at the high school take those courses at the local community college and receive both high school and college credit for them. Direct provision can also fill curricular gaps in those high school districts willing to accept college faculty who provide the instruction on site. It's nigh impossible to do this in laboratory science if there is no laboratory in the secondary school, but a variety of other subjects are open to regular visiting instructors.

A third approach offers incentives to schools and students themselves instead of relying on institutions of higher education as providers of instruction. This "restoration strategy" recognizes that students from too many high schools (and minority students, in particular) have been slowed down by a variety of structural and environmental factors, and arrive at the college application line in the fall semester of grade 12 with only 10 grades' worth of work in their portfolios. If all our 4-year colleges adopted a rolling admissions cycle that took in 11 percent of the target pool each month from November through July (yes, July!), students would be able (a) to recapture as much as a full year's worth of learning, and (b) to prepare for testing as late as April or May. Since test scores follow learning (not the other way around), disadvantaged students would benefit more from this recapturing of learning time. The so-called "bridge programs" that take place during the summer following high school graduation may be a helpful reinforcement of the restoration strategy, but they would be more effective if they began two summers earlier (after the 10th grade), on a much larger scale, and with follow-up cooperative curriculum-fortifying activities in the school district.
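To make the arithmetic of that proposal concrete, the sketch below is my own illustration, not the report's; the flat 11-percent-per-month figure and the November-through-July window are taken from the sentence above. It simply accumulates the monthly intake and shows that nine such cycles cover essentially the entire target pool.

```python
# Illustrative sketch of the proposed rolling admissions cycle:
# roughly 11 percent of the target pool admitted each month, November-July.
months = ["Nov", "Dec", "Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul"]
monthly_share = 0.11  # assumed flat monthly share of the target pool

cumulative = 0.0
for month in months:
    cumulative += monthly_share
    print(f"{month}: {cumulative:.0%} of the target pool admitted")

# Nine cycles at 11 percent cover about 99 percent of the pool, while leaving
# late-blooming applicants up to a full extra year of coursework and testing.
```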

For those who doubt that the opportunity to fill one's high school portfolio with a curriculum of the intensity and quality that would place one toward the top of the scale we constructed for ACRES can help close the race gap in degree completion, I offer table 40. The data are a stunning endorsement of what curricular intensity and quality can do for African-American and Latino students in particular.

Some will argue that the Latino data in table 40 are too unstable (the standard errors are large, rendering statistical comparisons with the other race/ethnicity groups tenuous) and not representative of the Latino population in higher education. After all, for the High School & Beyond/Sophomores, 54 percent of the Latino students who continued their education after high school began in community colleges, a proportion far larger than that for any other race/ethnicity group (47). That proportion, however, declined from 57 percent for the generation of the NLS-72, and has dropped to 49 percent for the NELS-88 cohort (the high school graduating class of 1992). This trend suggests that when we are able to validate this entire analysis using the NELS-88 college transcripts, we will witness less volatility and stronger statistical relationships.

Of the three core variables in table 40, only the curriculum variable is criterion-referenced. That is, the highest 40 percent does not necessarily describe relative position, particularly when the HIGHMATH criterion of trigonometry or higher is added. To repeat: in a happy paradox, everybody can be in the "highest 40 percent" on the curriculum measure. This is not the case for a test score measure, no matter what combination of tests we use. And it certainly is not the case for class rank, which, by definition, is a relative measure.

Table 40.–Bachelor's degree completion rates for students in the top two quintiles of each component of ACRES, who entered 4-year colleges directly from on-time high school graduation, by race, High School & Beyond/Sophomore cohort, 1982-1993


                                       White     Black     Latino    Asian     All

All                                    75.4%     45.1%     60.8%     86.9%     72.1%
                                       (1.16)    (3.14)    (7.27)    (7.79)    (1.07)

Curriculum: highest 40%,
  with HIGHMATH beyond Algebra 2       85.7      72.6      79.3      89.0      84.8
                                       (1.44)    (4.98)    (7.34)    (3.47)    (1.33)

Test scores: highest 40%
  of combined scale                    80.5      67.1      66.6      94.7      79.9
                                       (1.17)    (3.66)    (8.38)    (1.90)    (1.09)

Class rank/GPA: highest 40%
  of combined variable                 78.9      58.8      57.0      84.9      77.1
                                       (1.26)    (4.56)    (7.44)    (2.95)    (1.19)

NOTE: Standard errors are in parentheses.

The Trouble With the "X-Percent" Solution

And yet it is to high school class rank and GPA that policy makers have turned for cheap and easy "solutions" in admissions to public institutions that exercise any degree of selectivity.

The policies are expressed in "take the top X percent by class rank" formulas, even though, nationally, 19 percent of our secondary schools no longer use class rank, and 53 percent include non-academic courses in the calculation of GPA (College Board, 1998). This approach clearly does not acknowledge the student and his/her goal of completing a degree. As table 41 (below) demonstrates, moving into the top 40 percent on class rank/GPA improves degree completion the least for everybody, and actually would have had a negative impact on Latinos in the High School & Beyond/Sophomore cohort.

If we are genuinely interested in improving the degree completion rates of minority students, which of these indicators would we rather use? The answer, as they say, is a "no-brainer": curriculum is not only the most powerful of the three, it is the only field on which we can exercise change. A test score is a snapshot of performance on a Saturday morning. Secondary school grades, and the relative standing that they produce in "classes" where the student body may be constantly changing, carry as much reliability as a pair of dice (Elliott and Strenta, 1988). But the intensity and quality of curriculum is a cumulative investment of years of effort by schools, teachers, and students, and provides momentum into higher education and beyond. It obviously pays off. The effects of grades and tests diminish in time, but the stuff of learning does not go away.

Table 41.–Comparative improvement in bachelor's degree attainment rates by moving into the top 40 percent on each component measure of the ACRES index, by race, High School & Beyond/Sophomore cohort, 1982-1993


Component:            White      Black      Latino     All

  Curriculum          +10.4%     +27.5%     +18.5%     +12.7%
  Test scores         +5.1       +22.0      +5.8       +7.8
  Class rank/GPA      +3.5       +13.7      -3.8       +5.0
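To make the relationship between tables 40 and 41 explicit, here is a minimal sketch. It is my own illustration, not the report's method, and it assumes that each table 41 entry is the completion rate for the top two quintiles of a component minus the overall rate for the same group in table 40; small discrepancies with the published figures reflect rounding.

```python
# Bachelor's degree completion rates (percent) transcribed from table 40.
baseline = {"White": 75.4, "Black": 45.1, "Latino": 60.8, "All": 72.1}
top_two_quintiles = {
    "Curriculum":     {"White": 85.7, "Black": 72.6, "Latino": 79.3, "All": 84.8},
    "Test scores":    {"White": 80.5, "Black": 67.1, "Latino": 66.6, "All": 79.9},
    "Class rank/GPA": {"White": 78.9, "Black": 58.8, "Latino": 57.0, "All": 77.1},
}

# Table 41-style gain: completion rate within the top 40 percent of a
# component minus the overall completion rate for the same group.
for component, rates in top_two_quintiles.items():
    gains = {group: round(rates[group] - baseline[group], 1) for group in baseline}
    print(component, gains)

# Curriculum yields the largest gain for every group shown; class rank/GPA
# yields the smallest, and for Latino students the difference is negative.
```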

Contemporary policy with respect to admissions at selective or highly selective public institutions, of course, does not focus on the top 40 percent of class rank/GPA (let alone any other measure), but rather on the top 10 percent (Texas) or 4 percent (California). The portrait in tables 40 and 41, to be fair, may not be the portrait resulting from a more restrictive threshold such as 10 percent or 4 percent. And a contemporary population may be very different from that of the HS&B/So. All this, of course, is speculative. The admissions line is not the commencement line. As the old saw has it, only time will tell. The common-sense odds, however, say that unless students in the "top X percent" by class rank/GPA ALSO have the curriculum that comes from opportunity-to-learn, we may not be doing right by them.

Post-Matriculation Tools

The analysis of student progress in an age of multi-institutional attendance clearly advises us to keep the tool box handy. Access (entry) to higher education is not the dependent variable for students. Nor is mere retention to something called "year 2." From the multivariate analysis we learned just how large a role continuous enrollment plays in degree completion. We know that keeping students enrolled, even for one course a term, is critical. How do we do that when students are highly mobile, when they behave like consumers, and when their loyalties to particular institutions are weak?

We possess the technology to answer that question in action. The dean's offices of this world have to know when a student intends to leave the institution, and must do everything in their persuasive power to ensure either that the student is transferring to another institution without a pause in the course of study, or that the student is connected to a course, anywhere, so that a potential break in academic momentum does not lengthen to irretrievable dimensions. In following the 16 percent of beginning 4-year college students in the BPS90 study who left higher education by the end of their first year (1989-1990), Horn (1998) found that 36 percent of this group returned within a year, but that as the temporal gap lengthened, the return rate fell. In replicating Horn's analysis with a longer time frame (1982-1993) and the broader population that was subject to our multivariate analysis, I found that the 26 percent of stop-outs who returned within a year completed degrees (associate's or bachelor's) at a 50 percent rate, but that any further delay in returning to college cut that completion rate in half.

Our other options include finding the student an on-line course from the increasing number of providers of Internet-based instruction, and not fretting if the provider is not a traditional 2-year or 4-year college. If the student does not own a computer and lives in the institution's immediate area, lend the student a computer. Depending on student interests, occupational as well as academic, there are variations on this theme. If we are serious about helping students complete degrees, we can be creative. But in these situations, subsequent vigilant contact with students by advisors, whether by e-mail, by phone, or by whatever means, is necessary.

We found that a high DWI (Drops-Withdrawals-Incompletes) index worked significantly against degree completion. The situation could be ameliorated by institutional policies that both restrict the extent of withdrawals, incompletes, and no-credit repeats and pay closer attention, in advisement, to student credit loads and time commitments. In other words, a restrictive policy alone does not help students unless it is accompanied by advisement actions that enable a student to complete a reasonable course load successfully.
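As a concrete illustration of that advisement point, the sketch below is my own construction. It assumes a simple definition of the DWI index (the share of a student's attempted courses recorded as drops, withdrawals, incompletes, or no-credit repeats; the report's precise formula may differ), and then flags students whose index and credit load together suggest the kind of intervention described above.

```python
# Hypothetical grade codes treated as drops/withdrawals/incompletes/no-credit
# repeats; actual institutional codes will vary.
DWI_GRADES = {"W", "WD", "I", "NC"}

def dwi_index(grades):
    """Share of attempted courses ending as a drop, withdrawal, or incomplete."""
    if not grades:
        return 0.0
    return sum(1 for g in grades if g in DWI_GRADES) / len(grades)

def needs_advisement(grades, credit_load, index_threshold=0.2, heavy_load=15):
    # Flag a student who combines a high DWI index with a heavy credit load;
    # both thresholds here are assumptions for illustration only.
    return dwi_index(grades) >= index_threshold and credit_load >= heavy_load

# Example: two withdrawals and one incomplete in ten attempted courses.
record = ["A", "B", "W", "C", "I", "B", "W", "A", "C", "B"]
print(dwi_index(record))             # 0.3
print(needs_advisement(record, 16))  # True: advise a lighter, completable load
```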

This tool box is placed best in the offices of counselors and advisers, on the desks of those journalists and editorial writers who interpret higher education for the broad public, and, most of all, in the minds of students' friends and family. The tools derive from the principal features of the story-line. That story, brought to us by the wisdom of the U.S. Department of Education in establishing and maintaining its longitudinal studies, is a legacy from one generation to the next.


-###-