National Institute for Literacy
 

[Assessment 758] Re: Using Data

Steve Reder reders at pdx.edu
Thu Apr 19 10:17:37 EDT 2007


The Longitudinal Study of Adult Learning has been following a target
population of ABE learners over a long period of time. It is finding exactly
the pattern that others have been describing: many adults participate in
programs over a series of "episodes" that often span multiple years (and
therefore multiple NRS accounting periods). When we've presented these data,
we've suggested that the NRS will not capture all of the impact that programs
have on learning, in part because of its short-term focus for measuring both
participation and outcomes. I wonder if states could get waivers on a pilot
basis to experiment with longer reporting periods, as David Rosen suggested.



-Steve Reder



_____

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of Dan Wann
Sent: Thursday, April 19, 2007 6:47 AM
To: 'The Assessment Discussion List'
Subject: [Assessment 756] Re: Using Data



I wonder whether there is even enough data to show that adult basic and ESL
students stay with a program in large enough numbers to track over a longer
period. The conventional wisdom of those outside the adult basic skills
network is that basic skills programs have little impact because students do
not stay long enough to make a difference. Do we have any evidence that we
work with the same students for more than one year, and that we work with
enough of those students to make a significant difference?
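
One rough way a program could check this against its own records is to link
yearly enrollment extracts on a persistent student identifier and count how
many learners appear in more than one program year. The sketch below is
illustrative only; the file names and column names (enrollment_pyXXXX.csv,
student_id) are hypothetical, not a reference to any particular data system.

import pandas as pd

# Hypothetical per-program-year enrollment extracts sharing a stable student_id.
years = {2005: "enrollment_py2005.csv", 2006: "enrollment_py2006.csv"}
frames = [
    pd.read_csv(path).assign(program_year=py)[["student_id", "program_year"]]
    for py, path in years.items()
]
enrollment = pd.concat(frames).drop_duplicates()

# Count the distinct program years in which each student appears.
years_per_student = enrollment.groupby("student_id")["program_year"].nunique()
multi_year = int((years_per_student > 1).sum())
total = len(years_per_student)
print(f"{multi_year} of {total} students ({multi_year / total:.0%}) "
      "enrolled in more than one program year")

Even a simple count like this would put some local evidence behind, or
against, that conventional wisdom.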





Dan Wann

Professional Development Consultant

IN Adult Education Professional Development Project



dlwann at comcast.net



_____

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of David J. Rosen
Sent: Thursday, April 19, 2007 2:05 AM
To: The Assessment Discussion List
Subject: [Assessment 752] Re: Using Data



Larry, and others,

Tina and many other program administrators have observed patterns like this,
which suggest that a one-year time frame, a funding year, may not be the best
unit of time in which to measure learner gains, except for learners who are
doing basic skills brush-up or who have very short-term goals like preparing
for a driver's license test. I wonder if there is a possibility that the NRS
might be adjusted, perhaps in a pilot at first, so that a longer period of
learning, say three years, might be used to demonstrate learner gains. Of
course, there would need to be intermediate measures, but accountability --
for programs and states -- might be based on a longer period of time.

It seems to me that the one-year time frame within which to measure learning
gains or goals accomplished comes not from K-12 or higher education, but
rather from Congressional expectations for job skills training. Would you agree?

Also, I wonder whether you or others have examples of programs that track and
report learner outcomes over several years and use the data for program
improvement.

David J. Rosen
djrosen at comcast.net


Tina_Luffman at yc.edu wrote:

Hi Luanne,



I find it interesting that what you are seeing in your data seems to be
consistent with what we see in our GED classes here in Arizona. Often the
last group, which enters in March, is the least likely to stay with the
program through posttesting, while the August group seems to have the highest
posttesting and retention rates.



Tina





Tina Luffman
Coordinator, Developmental Education
Verde Valley Campus
928-634-6544
tina_luffman at yc.edu



-----assessment-bounces at nifl.gov wrote: -----

To: <assessment at nifl.gov>
From: "Luanne Teller" <lteller at massasoit.mass.edu>
Sent by: assessment-bounces at nifl.gov
Date: 04/18/2007 10:56AM
Subject: [Assessment 746] Using Data



Hi all:



I wanted to chime in about our program's use of data since this is the focus
of our discussion. Coincidentally, I am in the process of writing our
proposal for next year, so I am knee-deep in data even as we speak!



The use of data takes many forms in our program. We look at what most
people consider the "hard data" -- the raw numbers with regard to
attendance, learner gains, retention, goal attainment, etc. We believe,
however, that the numbers alone provide an incomplete picture of what is
happening, so we use the numbers as a basis for discussion, not decision
making. After analyzing the numbers, we begin to look at additional sources
of data that we find essential in informing our planning: meetings with
staff, classes, our student advisory board, and focus groups.



Here's an example we're currently working on. We did a two-year analysis of
learner retention and began to document why students did not persist. We
found that retention for students who enrolled after January 1 (our program
runs on a school calendar year from September to June) was significantly
lower than retention for students who began in September. Even more
compelling, we learned that retention for students who began after March 1
was 0%.
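
For anyone who wants to try the same kind of breakdown, here is a minimal
sketch of one way to compute retention by enrollment cohort. It assumes a
hypothetical student file with one row per student, an enrollment date, and a
0/1 flag for whether the student persisted through June; the file and column
names are illustrative only, not our actual system.

import pandas as pd

# Hypothetical student file: one row per student, an enrollment date, and a
# 0/1 flag for whether the student persisted through June.
students = pd.read_csv("students.csv", parse_dates=["enroll_date"])

def cohort(month):
    # The program year runs September-June, so fall enrollees come first.
    if month >= 9:
        return "Sept-Dec"
    elif month < 3:
        return "Jan-Feb"
    return "Mar-Jun"

students["cohort"] = students["enroll_date"].dt.month.map(cohort)
retention = students.groupby("cohort")["persisted"].mean()
print(retention)  # share of each enrollment cohort that persisted to June

Breaking retention out by cohort like this makes it easy to see whether late
enrollees persist at lower rates.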



We met with staff and students and did some research on student retention
issues. After a year-long process, we decided to pilot a "managed
enrollment" approach. In Massachusetts, our grantor (MA DOE) allows us to
"over-enroll" our classes by 20%, so we enroll 20% more students in the
fall. When students leave, we "drop" the over-enrolled students into funded
slots. This allows us to keep the seats filled even with the typical
attrition that occurs.



In January, when we do our mid-point assessments, we move students who are
ready to progress up to the higher level. That typically leaves several
openings in the beginner levels, and we begin a new cohort of students in
February. This year, we implemented new orientation programs, including a
requirement that new students observe a class before enrolling.



While it is still too early to tell whether these new procedures will have a
positive impact, we are hopeful, and we know anecdotally that the transition
seems to be easier for some of these students. We are eager to look at the
data at the end of the year to analyze the effectiveness of this plan.



As we begin to look at our data, we are finding that there seems to be a
unique set of issues for our beginner ESOL students. We suspect that a lack
of the English communication skills needed to advocate for themselves with
employers is affecting their attendance and persistence. This is an issue
that we are beginning to tackle in terms of policy. Do we need to have a
more flexible, lenient policy for beginner students? Is there a way to
support students in addressing these employment issues? How can we empower
students more quickly? Are there other issues for these beginner-level
students that affect their participation? As we enter these discussions, the
numbers will provide a basis for developing strategies, but the students
themselves will be our greatest source of valuable data.



Luanne Teller




-------------------------------
National Institute for Literacy
Assessment mailing list
Assessment at nifl.gov
To unsubscribe or change your subscription settings, please go to
http://www.nifl.gov/mailman/listinfo/assessment










