National Institute for Literacy
 

[Assessment 754] Re: Using Data for Program Improvement

Luanne Teller lteller at massasoit.mass.edu
Thu Apr 19 09:24:33 EDT 2007


Good morning:

How interesting to hear from a range of institutions. I get so focused
on my own programs that it is refreshing to hear from other types of
organizations and structures.



For my students, motivation isn't an issue at all. We have a lengthy
wait list; depending on the level, students could wait as little as 6
months or as long as 2 years to get into our program. Consequently,
they are thrilled to finally be there and eager to participate.



For my population (adult ESOL learners, the large majority in the 25-44
age range), the issue is juggling demands on their time. The majority
work at least one job (many work 2 or more) and have children in school.
Our classes are in the evening, since over 90% of the population we
serve works during the day. Many rush directly from work to class and
might be late due to mandatory overtime or a family need that requires
attention before class. Given the lack of access to adequate preventive
health care that many of our students face, ongoing health problems are
common. Add the occasional trip back to a native country for a death in
the family, or some other family emergency, and frankly I'm amazed that
they are able to maintain such a strong commitment to their studies.



Some of our research and data analysis have uncovered these issues;
still, it remains quite a challenge to respond to them. We
initially adjusted our program plan and schedule to allow for longer
breaks during the holidays, when many students wish to return to their
native countries. We also incorporate school vacations in our planning.
When an individual student starts to have a problem, we meet with him or
her to see how we can help. It doesn't always work, but sometimes we
are able to communicate with employers, and that's been helpful for many
students. Sometimes, we offer students a "leave of absence" to deal
with pressing personal matters, and invite them to return when things
are more settled. All of these strategies have evolved over years of
examining attendance and retention data and holding focus group
discussions.
These strategies have had a positive impact, and students appreciate our
responsiveness to their needs.



The first year that we implemented our "managed enrollment" (vs. open
entry/open exit) model, our retention increased from 74% to 90%. Our
attendance has risen from 68% to over 82%. We all know how
critical it is to keep students long enough for them to reach their
goals...



Which brings me to my final point. We all serve so many masters: the
NRS, our funders (in my case, the DOE), and our parent organizations
(for me, a community college), and we are constantly looking at data to
justify our existence and demonstrate our effectiveness. Let's be
realistic: if we want to retain our funding, we have to show results,
which is as it should be.



For us, however, when we look at our data, it is always with an eye to
how we can better serve our students and respond to their needs. The
difference is subtle but powerful. It's a lot easier to get staff and
students on board with planning and change when they can see a direct
result for our students than when they are handed a bunch of charts and
mandates from "higher-ups".



Luanne Teller



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Fred Lowenbach
Sent: Wednesday, April 18, 2007 5:15 PM
To: The Assessment Discussion List
Subject: [Assessment 749] Re: Using Data for Program Improvement



Hopefully everyone participating in this discussion of adult literacy
is aware that almost everything you are saying applies to the results
for students in school as well. Coming from a public school background,
I could always see the effect that high mobility rates had on overall
student results. Schools with the highest rates almost always struggled
to meet standards on state measures connected to NCLB. This was the
case for overall populations as well as the various subgroups that were
tested. The same applies to student retention, or for that matter
attendance. As a rule, students who attended regularly achieved much
higher grades than students whose attendance was far less consistent.
That pattern carried through to results on standardized testing and,
ultimately, to graduation rates.



The entire education community, whether involved with adult literacy or
the traditional K-12 curriculum, faces the same thing. The key to
increasing literacy and closing achievement gaps starts with getting
and retaining students.



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Condelli, Larry
Sent: Wednesday, April 18, 2007 3:33 PM
To: The Assessment Discussion List
Subject: [Assessment 748] Re: Using Data for Program Improvement



Sandy,



A few years ago I did a study on adult ESL literacy students that
focused primarily on instruction, but we also looked at retention. We
found that the proportion of time an ESL literacy student attended
(measured as hours attended divided by total hours the class was
scheduled) had a positive effect on oral English skills and reading
comprehension, all else being equal (using a complex statistical
model).
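
For readers who want a concrete picture of what "all else being equal"
means here, below is a toy Python sketch of that kind of adjusted
comparison. It is emphatically not the study's actual model, which was
far more complex, and the file and column names are invented for
illustration.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical export of learner records; the column names are
    # invented for illustration only.
    df = pd.read_csv("learner_outcomes.csv")

    # Regress a learning gain on the attendance ratio while controlling
    # for total hours, so the ratio's coefficient reflects its effect
    # "independent of total hours."
    model = smf.ols("reading_gain ~ attendance_ratio + total_hours",
                    data=df).fit()
    print(model.summary())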



The possible reasons for this effect are intriguing and need more
research. Because the measure showed an effect regardless of how many
hours the student actually attended (or how many hours per week a
student attended), my interpretation is that it is a proxy for
motivation (although I have no data or other information to check
this). In other words, the student who continues to attend over time,
despite all of the other competing demands on that time, is one who is
more motivated. This motivation helps learning.



If true, I think it has implications for how we structure instructional
segments.



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Sandy Strunk
Sent: Tuesday, April 17, 2007 5:38 PM
To: The Assessment Discussion List
Subject: [Assessment 736] Re: Using Data for Program Improvement

Larry,

Could you tell us more about the ESL research on percentage of possible
time attended? This is a new idea to me. Does it reflect greater
intensity, as opposed to lesser intensity over a longer duration, or do
you think something else is going on? If your research is correct,
there are certainly implications for how we structure instructional
segments.



Sandy Strunk



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Condelli, Larry
Sent: Tuesday, April 17, 2007 5:31 PM
To: The Assessment Discussion List
Subject: [Assessment 735] Re: Using Data for Program Improvement



Hi Ella,



Disaggregating by class can be very effective for understanding what is
going on.



I wanted to comment on your last remark about tracking consistency of
attendance.



Attendance and persistence are very popular topics these days, and most
data systems allow for tracking of student attendance and persistence
patterns. One thing you might consider is looking at learners who "stop
out": those with sporadic attendance patterns, attending for a while
and coming back later. Another measure is the percent of possible time
that learners attend. You compute this by dividing the hours attended
by the total hours possible (e.g., a learner who attends 8 hours a week
in a class scheduled for 10 hours a week = 80%). Some research I did on
ESL students showed that those who attended a higher proportion of
possible time learned more, independent of total hours. I think this is
so because the measure reflects student motivation to attend.
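
To make the arithmetic explicit, here is a minimal Python sketch (the
function and variable names are mine, not from any particular data
system):

    def percent_of_possible(hours_attended, hours_scheduled):
        """Attendance as a share of the hours the class was scheduled."""
        if hours_scheduled <= 0:
            raise ValueError("scheduled hours must be positive")
        return hours_attended / hours_scheduled

    # A learner attending 8 of a class's 10 scheduled hours per week:
    print(f"{percent_of_possible(8, 10):.0%}")  # prints 80%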



Identifying and studying "stop out" learners might tell you a lot about
why these types of students don't attend more regularly and can inform
you of their needs, which could help in designing classes and programs
for them.
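
As one illustration (my own logic, not something from the research), a
program could flag potential "stop out" learners from weekly attendance
records with a Python sketch like the one below; the gap threshold is
an arbitrary assumption:

    def is_stop_out(weekly_hours, gap_weeks=3):
        """True if a learner attended, was absent for gap_weeks or more
        consecutive weeks, and then attended again."""
        seen_attendance = False
        gap = 0
        for hours in weekly_hours:
            if hours > 0:
                if seen_attendance and gap >= gap_weeks:
                    return True  # came back after a long gap
                seen_attendance = True
                gap = 0
            elif seen_attendance:
                gap += 1
        return False

    print(is_stop_out([4, 5, 0, 0, 0, 0, 6, 5]))  # True: left, then returned
    print(is_stop_out([4, 5, 4, 5, 0, 0]))        # False: never returned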



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of EllaBogard at cs.com
Sent: Tuesday, April 17, 2007 4:47 PM
To: assessment at nifl.gov
Subject: [Assessment 732] Re: Using Data for Program Improvement

Dear Colleagues:

Here at Franklinton Learning Center, we use data every day to help us
track and improve the results coming out of our program. We use
enrollment data to check the reach of our program, average hours
attended to check the depth of student engagement, and the number of
students through the door versus the number completing enrollment to
help us improve retention during the crucial orientation period of
classes.

We have a program called ABLELink here in Ohio that has made it very
easy to track some areas. It has also allowed us to compare statistics
from one year to the next, so we know how we are doing relative to
previous years. By tracking the information collected on attendance,
educational gain, hours of engagement, and accomplishments, we have
been able to improve in all of these areas.

Tracking and constantly checking this data is what has made it possible
to improve. We can easily pull up reports on testing, who has tested,
progress made, who hasn't tested, attendance, etc. We can organize that
information by class, by teacher, by program, or by site, which allows
us to compare the effectiveness of programs and staff and assign
responsibility for improvement where needed.
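
For systems that can export attendance and testing records to a CSV
file, a minimal sketch of this kind of by-class comparison might look
like the Python below. The column names are hypothetical, not
ABLELink's actual export format:

    import pandas as pd

    # Hypothetical CSV export; a real system's format will differ.
    records = pd.read_csv("student_records.csv")

    # Average hours attended and share of students post-tested, by class.
    by_class = records.groupby("class_id").agg(
        avg_hours=("hours_attended", "mean"),
        pct_tested=("post_tested", "mean"),  # post_tested coded 0/1
    )
    print(by_class.sort_values("pct_tested"))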

I would like to be able to track consistency of attendance over time,
not just total hours attended. I think this might give a better picture
of the progress to be expected than total time attended does. I would
also like to understand more about how I can use all of the ABLELink
data collected to improve my program's overall effectiveness.

Respectfully submitted,
Ella Bogard

Ella Bogard, Executive Director
Franklinton Learning Center
1003 West Town Street
Columbus, Ohio 43222-1438

Phone: (614) 221-9151
Fax: (614) 221-9131
