National Institute for Literacy
 

[Assessment 790] Post from Forrest Chisman

Marie Cora marie.cora at hotspurpartners.com
Fri Apr 20 14:41:57 EDT 2007


Hi folks,

Just a quick note to say that the email below, which was just posted, is
from Forrest Chisman of CAAL - it is not signed below.

Thanks,
Marie Cora
Assessment Discussion List Moderator


-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of forrest
Sent: Friday, April 20, 2007 1:47 PM
To: assessment at nifl.gov
Subject: [Assessment 786] Re: Using Data

Dan, you have raised some critically important questions. NRS data
indicate that only about 36% of ESL students "complete a level" each
year. This is cause for concern, because the same data show that the
vast majority of ESL students are at the lowest levels of proficiency
and have low levels of education in their native countries. However, the
NRS data are not definitive, for a number of reasons -- such as low rates
of re-testing in many programs, the use of tests that do not measure the
full range of English language skills, and the fact that data are
reported only for a single year (students may persist in programs long
enough to achieve much larger learning gains).

As a first step toward finding out more about the learning gains and
persistence of ESL students, Jodi Crandall and I worked with the faculty
and staff at 5 highly regarded community college programs to use student
record data as a means of determining both learning gains and
persistence rates. At several of the colleges we were able to track the
learning gains and persistence of students for as long as seven years.
At most of the colleges, the measure of learning gains used was
completion of one or more additional levels AS THE COLLEGE DEFINED THE
LEVELS. Both the definition of levels and the standards of completion
took account of test scores (of the sort reported to the NRS), but they
also took account of other measures of student achievement (including
proficiency in all core ESL skills).

Needless to say, our findings were fairly complex and cannot be
adequately set forth here. In summary, however, we found that at most
30% of students persist for 2-3 college terms and complete more than 2-3
levels over a seven-year period. More than 40-50% of students do not
complete a level, or complete only a single level, at any time over a
seven-year period. Although we could not be sure, it appears that
lower-level students were more likely to persist than higher-level
students. About 10-15% of adult education ESL students enrolled in
credit ESL at these colleges, and the number who eventually enrolled in
academic credit courses was in the single digits.

We also found that all the colleges we examined employ strategies that
significantly improve the rate of learning gains and retention. Among
these were high-intensity/managed-enrollment classes (more than 3-6
hours per week), strategies to encourage learning outside the classroom,
appropriate uses of technology for instruction, co-enrollment of adult
education ESL students in vocational programs taught in English,
curricular designs that ensure instruction is relevant to the interests
of students (such as Freirean approaches), enriched
guidance/counseling/support services, setting high expectations, and
VESL programs. Unfortunately, only small numbers of students have access
to most of these strategies at most colleges, because they are far more
expensive on a per-student basis than standard ESL instruction.
Conversely, it appears that large numbers of students would like to make
the commitment to enhanced programs if they were available.

The results of our research were published by the Council for the
Advancement of Adult Literacy (under whose auspices the research was
conducted) in February as the report "Passing the Torch: Strategies
for Innovation in Community College ESL." It is available at the CAAL
website: www.caalusa.org. CAAL will be publishing more of the data we
gathered later this spring.

Among the "take-away" messages we gathered from our work were:

1) The use of longitudinal (multi-year) data and holistic assessments of
learning gains are essential for understanding and improving the
effectiveness of ESL programs. In many programs it is feasible to gather
and use longitudinal data in this way, but few programs do so, due to a
variety of perceived constraints and/or a lack of support for data
analysis by their host institutions.

2) Research can be very helpful in program improvement, but it requires
a substantial commitment on the part of programs to gather relevant data
and tease out its lessons on an ongoing basis. Programs should receive
far more support for this.

3) It is possible to greatly improve ESL program outcomes using a
variety of strategies, but these require a larger investment in
instruction per student -- an investment that we believe is well worth
the cost.

4) Numbers do not speak for themselves. For example, low rates of
learning gains must be read in the context of the goals that both
students and programs set for ESL instruction. It may be that some
portion of students legitimately wish to use ESL programs as an initial
platform to learn SOME English, and that their learning gains after
separating from programs are substantial. Too little is known about
this. Conversely, we found that the more students learn, the more
ambitious their learning goals become. Because numbers do not speak for
themselves, it is all the more important for individual programs and
state agencies to invest in the use of research for program improvement
and to ACTUALLY USE IT for these purposes. Too often over-burdened ESL
faculty and staff consider research an afterthought. They need the time,
encouragement, resources, and training to develop "continuous program
improvement" models for their work.

