National Institute for Literacy
 

[Assessment 740] Re: Using Data for Program Improvement

Gopalakrishnan, Ajit Ajit.Gopalakrishnan at ct.gov
Wed Apr 18 09:20:30 EDT 2007


In light of Larry's comments, I would like to share a program quality
standard that we have been using in Connecticut. We call it the
"utilization rate" or "percent of available instruction used": the
percentage of available class hours that each student in the class
actually uses. We aggregate this measure at the class level and the
program level.
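
For anyone who wants to reproduce the measure from their own records,
here is a rough sketch in Python; the function and field names are
illustrative only and are not taken from our data system:

    # Minimal sketch of the "utilization rate" described above.
    # hours_attended and scheduled_hours are illustrative names.

    def utilization_rate(hours_attended, scheduled_hours):
        """Percent of available class hours a student actually used."""
        if scheduled_hours <= 0:
            return 0.0
        return 100.0 * hours_attended / scheduled_hours

    # Aggregating at the class level: average the per-student rates.
    def class_utilization(per_student_hours, scheduled_hours):
        rates = [utilization_rate(h, scheduled_hours)
                 for h in per_student_hours]
        return sum(rates) / len(rates) if rates else 0.0

    # Example: three students in a class scheduled for 120 hours.
    print(class_utilization([96, 60, 110], 120))  # about 73.9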



We have experienced some challenges with this measure, though. We are
able to account for late starters by pro-rating the remaining available
hours based on the late start date, but it gets unwieldy to also
account for students who exit early. The measure works well for classes
offered on a set schedule but can be problematic for learning labs: a
lab might be open for, say, 25 hours a week, but a student is not
expected to be there for all 25 hours, which could produce a low
utilization rate even though the student might be attending, say, 10
hours a week. At the other extreme, some classes/programs may show high
utilization rates but may be offering classes that run for only 40 hours
in a semester. I find that combining this utilization rate with an
absolute average of hours attended gives a better picture of the
participation and persistence of learners within a program.
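
One way to handle the pro-rating for late starters, again as an
illustrative sketch only (it assumes a fixed weekly schedule and simple
date arithmetic, which our actual system does not necessarily use):

    # Sketch: pro-rate available hours from the student's enrollment date.
    from datetime import date

    def prorated_available_hours(class_start, class_end, enroll_date,
                                 hours_per_week):
        """Available hours counted only from the late start date onward."""
        start = max(class_start, enroll_date)
        weeks_remaining = max((class_end - start).days, 0) / 7.0
        return weeks_remaining * hours_per_week

    # A student who enrolls 4 weeks into a 12-week, 10-hour/week class:
    available = prorated_available_hours(date(2007, 1, 8), date(2007, 3, 30),
                                         date(2007, 2, 5), 10)
    print(round(100 * 55 / available, 1))  # 55 hours attended -> about 72.6%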



I too would like to hear Larry's thoughts on Sandy's question. In my
personal experience, after looking at tons of data from a variety of
programs over the past 2-3 years, I would expect "intensity" (more
instructional hours in a week), more than "duration" (more calendar days
between class start and end dates), to result in greater learner
attendance. For example, 20 ESL students are probably more likely to
attend 100 hours each on average during a fiscal year if they are
offered a class that runs 12 hours a week for 12 weeks than if they are
offered a class that runs 4 hours a week for 36 weeks. Both formats
offer the same 144 scheduled hours (12 x 12 and 4 x 36), so any
difference in attendance would reflect persistence across the longer
calendar span rather than the amount of instruction available.



Another element that we are beginning to track more closely is retention
across fiscal years. We know that many students don't achieve their
goals within one fiscal year. Therefore, we are using our data system to
track and report on students who are new in the fiscal year as well as
those who might be returning to that program from a prior fiscal year.
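
As a rough illustration of the cross-year comparison, assuming a
student ID that stays stable across fiscal years (the IDs below are
made up):

    # Sketch: separate returning students from new ones across fiscal years.
    fy2006 = {"S101", "S102", "S103", "S104"}   # IDs served in FY 2006
    fy2007 = {"S102", "S104", "S105", "S106"}   # IDs served in FY 2007

    returning = fy2007 & fy2006      # enrolled in both years
    new_students = fy2007 - fy2006   # first seen this fiscal year

    print(sorted(returning))      # ['S102', 'S104']
    print(sorted(new_students))   # ['S105', 'S106']
    print(100 * len(returning) // len(fy2006))  # 50 (% retained from FY06)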



What about recruitment? Do any programs/states look at the students
served over the past six or seven years and compare that to, say, Census
2000 data?



Ajit



Ajit Gopalakrishnan
Education Consultant
Connecticut Department of Education
25 Industrial Park Road
Middletown, CT 06457
860-807-2125
Fax: 860-807-2062
ajit.gopalakrishnan at ct.gov



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Sandy Strunk
Sent: Tuesday, April 17, 2007 5:38 PM
To: The Assessment Discussion List
Subject: [Assessment 736] Re: Using Data for Program Improvement



Larry,

Could you tell us more about the ESL research on percentage of possible
time attended? This is a new idea to me. Does it reflect greater
intensity as opposed to lesser intensity for a longer duration - or do
you think something else is going on? If your research is correct, there
are certainly implications for how we structure instructional segments.



Sandy Strunk



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Condelli, Larry
Sent: Tuesday, April 17, 2007 5:31 PM
To: The Assessment Discussion List
Subject: [Assessment 735] Re: Using Data for Program Improvement



Hi Ella,



Disaggregating by class can be very effective for understanding what
is going on.



I wanted to comment on your last remark about tracking consistency of
attendance.



Attendance and persistence are very popular topics these days, and most
data systems allow for tracking of student attendance and persistence
patterns. One thing you might consider is looking at learners who "stop
out" -- those with sporadic attendance patterns who attend for a while,
leave, and come back later. Another measure is the percent of possible
time that learners attend. You compute this by dividing the hours
attended by the total hours possible (e.g., a learner who attends 8
hours a week in a class scheduled for 10 hours a week = 80%). Some
research I did on ESL students showed that those who attended a higher
proportion of the possible time learned more, independent of total
hours. I think this is because this measure reflects student motivation
to attend.



Identifying and studying "stop out" learners might tell you a lot about
why these students don't attend more regularly and can inform you of
their needs, which could help in designing classes and programs for
them.
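
If your data system can export attendance dates, one rough way to flag
possible stop-outs might look like this sketch; the 21-day gap
threshold is arbitrary and only for illustration:

    # Sketch: flag "stop out" learners -- a long gap in attendance
    # followed by a return. The 21-day threshold is arbitrary.
    from datetime import date

    def is_stop_out(attendance_dates, gap_days=21):
        days = sorted(attendance_dates)
        return any((later - earlier).days >= gap_days
                   for earlier, later in zip(days, days[1:]))

    print(is_stop_out([date(2007, 1, 9), date(2007, 1, 16),
                       date(2007, 3, 6), date(2007, 3, 13)]))  # True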



________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of EllaBogard at cs.com
Sent: Tuesday, April 17, 2007 4:47 PM
To: assessment at nifl.gov
Subject: [Assessment 732] Re: Using Data for Program Improvement

Dear Colleagues:

Here at Franklinton Learning Center, we use data every day in our
program to help us track and improve the end results coming out of our
program. We use enrollment data to check the reach of our program,
average-hours-attended data to check the depth of engagement of
students, and the number of students through the door versus the number
completing enrollment to help us improve retention during the crucial
orientation period of classes.

We have a program called ABLELink here in Ohio that has made it very
easy to track some areas. It has also allowed us to compare statistics
from one year to another so we know how we are doing in comparison to
previous years. By tracking information collected on attendance,
educational gain, hours of engagement, and accomplishments, we have been
able to improve all of these efforts.

Tracking and constantly checking this data is what has made it possible
to improve. We can easily pull up reports on testing -- who has tested,
progress made, who hasn't tested -- as well as attendance, etc. We can
organize that information by class, by teacher, by program, or by site,
which allows us to compare the effectiveness of programs and staff and
assign responsibility for improvement where needed.

I would like to be able to track consistency of attendance over time,
not just total hours attended. I think this might give a better picture
of the progress to be expected than the total time attended does. I
would also like to understand more about how I can use all of the
ABLELink data collected to improve my program's overall effectiveness.
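
One possibility might be something as simple as the share of weeks with
any attendance at all; here is a rough sketch with made-up weekly
totals (the measure and the numbers are only illustrative, not
ABLELink fields):

    # Sketch: one possible consistency measure -- the fraction of
    # scheduled weeks in which a learner attended at all.
    def attendance_consistency(weekly_hours):
        """Fraction of weeks with any attendance (0.0 to 1.0)."""
        if not weekly_hours:
            return 0.0
        return sum(1.0 for h in weekly_hours if h > 0) / len(weekly_hours)

    # Two learners with the same 30 total hours look very different:
    print(attendance_consistency([3, 3, 3, 3, 3, 3, 3, 3, 3, 3]))    # 1.0
    print(attendance_consistency([10, 0, 0, 10, 0, 0, 10, 0, 0, 0])) # 0.3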

Respectfully submitted by,
Ella Bogard

Ella Bogard, Executive Director
Franklinton Learning Center
1003 West Town Street
Columbus, Ohio 43222-1438

Phone: (614) 221-9151
Fax: (614) 221-9131


