National Institute for Literacy
 

[Assessment 791] Re: Using Data for Program Improvement

Borge, Toni tborge at bhcc.mass.edu
Fri Apr 20 14:55:04 EDT 2007


Yes, when parents move so do their children. The public school system
here has a very low graduation rate, and a big factor is the number of
students who haven't dropped out but have moved out. Not surprisingly,
their performance on meeting standards is not high, no matter how hard
the teachers in the system work to address the needs of the students.

Toni



Toni F. Borge

Adult Education & Transitions Program Director

Bunker Hill Community College

Chelsea Campus

175 Hawthorne Street

Chelsea, MA 02150

Phone: 617-228-2108 * Fax: 617-228-2106

E-mail: tborge at bhcc.mass.edu

"Our lives begin to end the day we become silent about things that
matter." Martin Luther King Jr.

________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Fred Lowenbach
Sent: Wednesday, April 18, 2007 5:15 PM
To: The Assessment Discussion List
Subject: [Assessment 749] Re: Using Data for Program Improvement



Hopefully everyone participating in this discussion regarding adult
literacy is aware that almost everything you are saying applies to the
results for students in school as well. Coming from a public school
background, you could always see the effect that high mobility rates had
on overall student results. Schools with the highest mobility rates almost
always struggled to meet standards on state measures connected to NCLB.
This was the case with overall populations as well as the various
subgroups that were tested. The same applies to student retention, or for
that matter attendance. As a rule, students who attended regularly
achieved much higher grades than students whose attendance was far less
consistent. The same pattern showed up in results on standardized
testing and ultimately in graduation rates.



The entire education community, whether it is involved with adult
literacy, or the traditional K-12 curriculum is faced with the same
thing. The key to increasing literacy and to closing achievement gaps
starts with getting and retaining students.



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Condelli, Larry
Sent: Wednesday, April 18, 2007 3:33 PM
To: The Assessment Discussion List
Subject: [Assessment 748] Re: Using Data for Program Improvement



Sandy,



A few years ago I did a study on adult ESL literacy students that
focused primarily on instruction, but we also looked at retention. We
found that the proportion of time an ESL literacy student attended
(measured as hours attended divided by the total hours the class was
scheduled) had a positive effect on oral English skills and reading
comprehension, all else being equal (using a complex statistical model).



The possible reasons for this effect are intriguing and need more
research. Because this measure showed an effect regardless of how many
hours the student actually attended (or how many hours per week a
student attended), my interpretation is that it is really a measure of
motivation (although I have no data or other information to check
this). In other words, the student who continues to attend over time,
despite all of the other competing demands on time, is one who is more
motivated. This motivation helps learning.



If this is true, I think it does have implications for structuring
instructional segments.



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Sandy Strunk
Sent: Tuesday, April 17, 2007 5:38 PM
To: The Assessment Discussion List
Subject: [Assessment 736] Re: Using Data for Program Improvement

Larry,

Could you tell us more about the ESL research on percentage of possible
time attended? This is a new idea to me. Does it reflect greater
intensity as opposed to lesser intensity for a longer duration - or do
you think something else is going on? If your research is correct, there
are certainly implications for how we structure instructional segments.



Sandy Strunk



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Condelli, Larry
Sent: Tuesday, April 17, 2007 5:31 PM
To: The Assessment Discussion List
Subject: [Assessment 735] Re: Using Data for Program Improvement



Hi Ella,



Disaggregating by class can be very effective for understanding what
is going on.



I wanted to comment on your last remark about tracking consistency of
attendance.



Attendance and persistence are very popular topics these days, and most
data systems allow for tracking of student attendance and persistence
patterns. One thing you might consider is looking at learners who "stop
out" -- who have sporadic attendance patterns, attending for a while and
coming back later. Another measure is the percent of possible time that
learners attend. You compute this by dividing the hours attended by the
total hours possible (e.g., a learner who attends 8 hours a week in a
class scheduled for 10 hours a week = 80%). Some research I did on ESL
students showed that those who attended a higher proportion of possible
time learned more, independent of total hours. I think this is so because
this measure reflects student motivation to attend.
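A minimal sketch of this computation in Python (the function name is an
illustrative assumption, not part of any actual data system):

    # Percent of possible time attended: hours attended divided by the
    # hours the class was scheduled, expressed as a percentage.
    def percent_of_possible_time(hours_attended, hours_scheduled):
        if hours_scheduled <= 0:
            raise ValueError("Scheduled hours must be positive")
        return 100.0 * hours_attended / hours_scheduled

    # Example from the message: 8 hours attended of 10 scheduled = 80.0
    print(percent_of_possible_time(8, 10))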



Identifying and studying "stop out" learners might tell you a lot about
why these students don't attend more regularly and can inform you of
their needs, which could help in designing classes and programs for
them.



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of EllaBogard at cs.com
Sent: Tuesday, April 17, 2007 4:47 PM
To: assessment at nifl.gov
Subject: [Assessment 732] Re: Using Data for Program Improvement

Dear Colleagues:

Here at Franklinton Learning Center, we use data every day to help us
track and improve the results coming out of our program. We use
enrollment data to check the reach of our program, average hours
attended to check the depth of students' engagement, and the number of
students through the door versus the number completing enrollment to
help us improve retention in the crucial orientation period of classes.
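A minimal sketch of these three metrics in Python (the roster layout and
field names are illustrative assumptions, not the program's actual data):

    # Hypothetical roster: one record per student entering the program.
    students = [
        {"hours": 42.0, "completed_enrollment": True},
        {"hours": 3.5,  "completed_enrollment": False},
        {"hours": 27.0, "completed_enrollment": True},
    ]

    enrollment = len(students)                                  # reach
    avg_hours = sum(s["hours"] for s in students) / enrollment  # engagement
    completion_rate = (sum(s["completed_enrollment"] for s in students)
                       / enrollment)                            # retention at orientation

    print(enrollment, round(avg_hours, 1), round(completion_rate, 2))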

We have a program called ABLELink here in Ohio that has made it very
easy to track some of these areas. It has also allowed us to compare
statistics from one year to another, so we know how we are doing in
comparison to previous years. By tracking information collected on
attendance, educational gain, hours of engagement, and accomplishments,
we have been able to improve all of these efforts.

Tracking and constantly checking this data is what has made it possible
to improve. We can easily pull up reports on testing -- who has tested,
who hasn't, progress made, attendance, etc. We can organize that
information by class, by teacher, by program, or by site, which allows
us to compare the effectiveness of programs and staff and assign
responsibility for improvement where needed.

I would like to be able to track consistency of attendance over time,
not just total hours attended. I think this might give a better picture
of the progress to be expected than total time attended does. I would
also like to understand more about how I can use all of the ABLELink
data collected to improve my program's overall effectiveness.
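One possible way to quantify consistency of attendance, sketched in
Python (this particular metric and the weekly-hours layout are
assumptions for illustration; ABLELink's actual exports may differ):

    # Fraction of scheduled weeks in which a learner attended at all.
    def attendance_consistency(weekly_hours):
        if not weekly_hours:
            return 0.0
        weeks_attended = sum(1 for hours in weekly_hours if hours > 0)
        return weeks_attended / len(weekly_hours)

    # Same total hours (20), very different consistency:
    steady   = [2, 2, 2, 2, 2, 2, 2, 2, 2, 2]    # every week -> 1.0
    sporadic = [10, 0, 0, 0, 10, 0, 0, 0, 0, 0]  # two weeks  -> 0.2
    print(attendance_consistency(steady), attendance_consistency(sporadic))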

Respectfully submitted by,
Ella Bogard

Ella Bogard, Executive Director
Franklinton Learning Center
1003 West Town Street
Columbus, Ohio 43222-1438

Phone: (614) 221-9151
Fax: (614) 221-9131
