National Institute for Literacy
 

[Assessment 766] Re: Using Data

Barbara Arguedas barguedas at sfccnm.edu
Thu Apr 19 13:02:59 EDT 2007


We too see this happening. We just posted the official GED test
scores (passing!) for a student who started in June 2003 (65 hours) and
stopped out in July 2003. She came back in March 2004 and was in and out
through March 2005. This March and April (2007!) she took and passed
all of the official GED tests. So this is a success story! BUT, we
get no credit as far as NRS is concerned, because the student is not
enrolled this program year. YES, we support efforts to report results
over multi-year periods.

Thanks.

Barbara Arguedas

Santa Fe Community College Adult Basic Education



-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Mary Beheler
Sent: Thursday, April 19, 2007 9:17 AM
To: The Assessment Discussion List
Subject: [Assessment 760] Re: Using Data



Our oldest few:

Intake 1997: grade level equivalent (GLE) 2, now 11
Intake 1998: GLE K, now 7
Intake 1998: GLE 2, now 7
Intake 1998: GLE 3, now 11
Intake 2000: GLE K, now 4

And all of them have been "stop-out" students, with very uneven
progress, at times even going down on the CASAS scale. They tend not to
improve as quickly as NRS would like, but they do improve.

There is another set of students who need to keep attending just to
maintain the skills they have. They can be really hard on NRS
statistics!



Mary G. Beheler
Tri-State Literacy
455 Ninth Street
Huntington, WV 25701
304 528-5700, ext 156

-----Original Message-----
From: assessment-bounces at nifl.gov
[mailto:assessment-bounces at nifl.gov] On Behalf Of Dan Wann
Sent: Thursday, April 19, 2007 9:47 AM
To: 'The Assessment Discussion List'
Subject: [Assessment 756] Re: Using Data

I wonder whether there is enough data to show that adult basic
education and ESL students stay with a program in large enough numbers
to track over a longer period. The conventional wisdom outside the
adult basic skills network is that basic skills programs have little
impact because students do not stay long enough to make a difference. Do
we have any evidence that we work with the same students for more than
one year, and that we work with enough students over more than one year
to make a significant difference?





Dan Wann

Professional Development Consultant

IN Adult Education Professional Development Project



dlwann at comcast.net




________________________________


From: assessment-bounces at nifl.gov
[mailto:assessment-bounces at nifl.gov] On Behalf Of David J. Rosen
Sent: Thursday, April 19, 2007 2:05 AM
To: The Assessment Discussion List
Subject: [Assessment 752] Re: Using Data



Larry, and others,

Tina and many other program administrators have observed
patterns like this, which suggest that a one-year time frame, a funding
year, may not be the best unit of time in which to measure learner
gains, except for those who are doing basic skills brush-up or who have
very short-term goals, like preparing for a driver's license test. I
wonder whether the NRS might be adjusted, perhaps in a pilot at first,
so that a longer period of learning, say three years, could be used to
demonstrate learner gains. Of course, there would need to be
intermediate measures, but accountability -- for programs and states --
might be based on a longer period of time.

It seems to me that the one-year time frame within which to measure
learning gains or goals accomplished comes not from K-12 or higher
education, but rather from Congressional expectations for job skills
training. Would you agree?

I also wonder whether you or others have examples of programs
that track and report learner outcomes over several years and use that
data for program improvement.

David J. Rosen
djrosen at comcast.net


Tina_Luffman at yc.edu wrote:

Hi Luanne,



I find it interesting that what you are seeing in your data seems to
be consistent with what we see in our GED classes here in Arizona. The
last group to enter, in March, is often the least likely to stay with the
program until posttesting, and the August group seems to have the highest
posttesting and retention rates.



Tina





Tina Luffman
Coordinator, Developmental Education
Verde Valley Campus
928-634-6544
tina_luffman at yc.edu



-----assessment-bounces at nifl.gov wrote: -----

To: <assessment at nifl.gov>
From: "Luanne Teller" <lteller at massasoit.mass.edu>
Sent by: assessment-bounces at nifl.gov
Date: 04/18/2007 10:56AM
Subject: [Assessment 746] Using Data



Hi all:



I wanted to chime in about our program's use of data since this
is the focus of our discussion. Coincidentally, I am in the process of
writing our proposal for next year, so I am knee-deep in data even as we
speak!



The use of data takes many forms in our program. We look at
what most people consider the "hard data" -- the raw numbers with regard
to attendance, learner gains, retention, goal attainment, etc. We
believe, however, that the numbers alone provide an incomplete picture
of what is happening, so we use the numbers as a basis for discussion,
not decision making. After analyzing the numbers, we turn to
additional sources of data that we find essential in informing our
planning: meetings with staff, classes, our student advisory board, and
focus groups.



Here's an example we're currently working on: we did a two-year
analysis of learner retention and began to document why students did
not persist. We found that retention for students who enrolled
after January 1 (our program runs on a school calendar year, from
September to June) was significantly lower than retention for
students who began in September. Even more compelling, we learned that
retention for students who began after March 1 was 0%.
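
(For readers who want to reproduce this kind of cohort breakdown, here is a
minimal sketch, not the program's actual reporting system. It assumes a
hypothetical roster.csv with enroll_date and attending_in_june columns;
"retained" simply means still attending in June.)

import csv
from datetime import date

def cohort(enroll: date) -> str:
    # Bucket students by when they enrolled in the September-June year.
    if enroll.month >= 9:
        return "fall (Sept-Dec)"
    if enroll.month in (1, 2):
        return "after January 1"
    return "after March 1"

totals, retained = {}, {}
with open("roster.csv", newline="") as f:        # hypothetical file name
    for row in csv.DictReader(f):
        c = cohort(date.fromisoformat(row["enroll_date"]))
        totals[c] = totals.get(c, 0) + 1
        if row["attending_in_june"] == "yes":    # hypothetical column
            retained[c] = retained.get(c, 0) + 1

for c, n in sorted(totals.items()):
    print(f"{c}: {retained.get(c, 0) / n:.0%} retained ({n} students)")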



We met with staff and students and did some research on
student retention issues. After a year-long process, we decided to
pilot a "managed enrollment" approach. In Massachusetts, our grantor
(MA DOE) allows us to "over-enroll" our classes by 20%, so we enroll 20%
more students in the fall. When students leave, we "drop" the
over-enrolled students into funded slots. This allows us to keep the
seats filled despite the typical attrition that occurs.
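
(A brief sketch of the over-enrollment arithmetic described above. The 20%
figure comes from the message; the seat counts and attrition number are made
up for illustration.)

funded_slots = 20                               # seats the grant funds (hypothetical)
fall_enrollment = int(funded_slots * 1.20)      # 20% over-enrollment allowed -> 24 students
over_enrolled = fall_enrollment - funded_slots  # 4 students beyond the funded seats

# As funded students stop out, over-enrolled students "drop" into the
# vacated funded slots, so the funded seats stay filled despite attrition.
stop_outs = 3                                   # hypothetical attrition during the year
backfilled = min(stop_outs, over_enrolled)
print(f"Funded seats still filled: {funded_slots - stop_outs + backfilled} of {funded_slots}")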



In January, when we do our mid-point assessments, we move
students who are ready to progress up to the next level. That typically
leaves several openings in the beginner levels, and we begin a new
cohort of students in February. This year, we also implemented new
orientation programs, including a requirement that new students observe
a class before enrolling.



While it is still too early to tell whether these new procedures will
have a positive impact, we are hopeful, and we know anecdotally that the
transition seems to be easier for some of these students. We are eager
to look at the data at the end of the year to analyze the effectiveness
of this plan.



As we begin to look at our data, we are finding that there seems
to be a unique set of issues for our beginner ESOL students. We suspect
that their limited English communication skills make it difficult for
them to advocate for themselves with employers, which affects their
attendance and persistence. This is an issue that we are beginning to
tackle in terms of policy. Do we need a more flexible, lenient policy for
beginner students? Is there a way to support students in addressing
these employment issues? How can we empower students more quickly? Are
there other issues for these beginner-level students that affect their
participation? As we enter these discussions, the numbers will provide
a basis for developing strategies, but the students themselves will be
our greatest source of valuable data.



Luanne Teller



Luanne Teller

-------------------------------
National Institute for Literacy
Assessment mailing list
Assessment at nifl.gov
To unsubscribe or change your subscription settings, please go
to http://www.nifl.gov/mailman/listinfo/assessment








