[Assessment 751] Re: Using Data for Program Improvement
Sandy Strunk sandy_strunk at IU13.org
Wed Apr 18 19:01:38 EDT 2007
Fred, I certainly agree that K-12 education has retention issues related to mobility; however, the difference as I see it is twofold. First, most schools run on a 180-day cycle, and children are expected to attend every day that they're healthy and reside in the district. Second, while an individual teacher may structure his/her instructional segments, most students don't have the ability to choose whether or not to attend a given session. I suspect that the attendance issue in K-12 - at least up until 9th or 10th grade - is related to the family's mobility rather than to student motivation.

Most adult education programs in Pennsylvania have an average attendance of 60 to 100 hours per year. Mobility is certainly a factor, but in my experience most adults "stop out" for many reasons other than mobility. As a program director, I have tried various combinations of intensity and duration. One of the ways we've worked on retention is to have each teacher create a scattergram of his/her retention patterns. One axis of the graph is the number of hours attended; the other is the duration of the class. What we found is that different patterns emerge on the scattergram with different teachers. We then work with teachers individually to develop improvement strategies based on their individual patterns. For example, a teacher whose students cluster in the low-intensity/low-duration quadrant would use very different retention strategies than a teacher whose students cluster in the low-intensity/high-duration quadrant, or a teacher whose scattergram is evenly distributed across the four quadrants. Ultimately, the teacher's goal is to see his/her students clustering in the high-intensity, high-duration quadrant. Our experience suggests that working with teachers on their scattergrams and retention strategies has a positive impact on student retention.

If Larry's research can be replicated, it speaks to a couple of very important issues for our field.
Open entry/open exit is one of them. The second is the length of the instructional segment, regardless of intensity. Our program has operated under the assumption that low-intensity classes need to be longer in duration. For example, our night classes tend to run in 14-week segments, whereas our daytime, high-intensity classes tend to run about 7 weeks. This research certainly challenges that assumption.

Sandy

________________________________
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Fred Lowenbach
Sent: Wednesday, April 18, 2007 5:15 PM
To: The Assessment Discussion List
Subject: [Assessment 749] Re: Using Data for Program Improvement

Hopefully everyone participating in this discussion of adult literacy is aware that almost everything you are saying applies to students in school as well. Coming from a public school background, you could always see the effect that high mobility rates had on overall student results. Schools with the highest rates almost always struggled to meet standard on state measures connected to NCLB. This was the case with overall populations as well as the various subgroups that were tested.

The same applies to student retention, or for that matter attendance. As a rule, students who attended regularly achieved much higher grades than students whose attendance was far less consistent. This then followed suit with results on standardized testing and ultimately on graduation rates. The entire education community, whether it is involved with adult literacy or the traditional K-12 curriculum, is faced with the same thing. The key to increasing literacy and to closing achievement gaps starts with getting and retaining students.
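Sandy's scattergram exercise above amounts to placing each student in one of four intensity/duration quadrants and looking at where a teacher's students cluster. A minimal sketch of that classification follows; the cutoff values, field names, and sample data are illustrative assumptions, not figures from the posts.

```python
from collections import Counter

def classify_quadrant(hours_per_week, weeks_enrolled,
                      intensity_cutoff=6, duration_cutoff=10):
    """Assign a student to an intensity/duration quadrant.

    hours_per_week -> intensity axis; weeks_enrolled -> duration axis.
    Cutoffs are hypothetical; a program would choose its own.
    """
    intensity = "high" if hours_per_week >= intensity_cutoff else "low"
    duration = "high" if weeks_enrolled >= duration_cutoff else "low"
    return f"{intensity} intensity / {duration} duration"

# Illustrative roster for one teacher (made-up data).
students = [
    {"name": "A", "hours_per_week": 3, "weeks": 4},
    {"name": "B", "hours_per_week": 8, "weeks": 14},
    {"name": "C", "hours_per_week": 4, "weeks": 13},
]

# Count how the class distributes across the four quadrants.
pattern = Counter(
    classify_quadrant(s["hours_per_week"], s["weeks"]) for s in students
)
for quadrant, count in pattern.items():
    print(quadrant, count)
```

Plotting the two raw variables per student gives the scattergram itself; the quadrant counts are just a compact summary a teacher could compare term over term.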
________________________________
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Condelli, Larry
Sent: Wednesday, April 18, 2007 3:33 PM
To: The Assessment Discussion List
Subject: [Assessment 748] Re: Using Data for Program Improvement

Sandy,

A few years ago I did a study on adult ESL literacy students that focused primarily on instruction, but we also looked at retention. We found that the proportion of time an ESL literacy student attended (measured by hours attended over total hours the class was scheduled) had a positive effect on oral English skills and reading comprehension, all else being equal (using a complex statistical model). The possible reasons for this effect are intriguing and need more research. Because this measure showed an effect regardless of how many hours the student actually attended (or how many hours per week a student attended), my interpretation is that it is a measure of motivation (although I have no data or other information to check this). In other words, the student who continues to attend over time, despite all of the other competing demands on time, is one who is more motivated. This motivation helps learning. If true, I think it has implications for structuring instructional segments.

________________________________
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Sandy Strunk
Sent: Tuesday, April 17, 2007 5:38 PM
To: The Assessment Discussion List
Subject: [Assessment 736] Re: Using Data for Program Improvement

Larry,

Could you tell us more about the ESL research on percentage of possible time attended? This is a new idea to me. Does it reflect greater intensity as opposed to lesser intensity for a longer duration - or do you think something else is going on? If your research is correct, there are certainly implications for how we structure instructional segments.
Sandy Strunk

________________________________
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Condelli, Larry
Sent: Tuesday, April 17, 2007 5:31 PM
To: The Assessment Discussion List
Subject: [Assessment 735] Re: Using Data for Program Improvement

Hi Ella,

Disaggregating by class can be very effective for understanding what is going on. I wanted to comment on your last remark about tracking consistency of attendance. Attendance and persistence are very popular topics these days, and most data systems allow for tracking of student attendance and persistence patterns. One thing you might consider is looking at learners who "stop out" -- who have sporadic attendance patterns, attending for a while and coming back later. Another measure is the percent of possible time that learners attend. You compute this by dividing attended hours by total possible hours (e.g., a learner who attends 8 hours a week of a class scheduled for 10 hours a week = 80%). Some research I did on ESL students showed that those who attended a higher proportion of possible time learned more, independent of total hours. I think this is so because this measure reflects student motivation to attend. Identifying and studying "stop out" learners might tell you a lot about why these types of students don't attend more regularly and can inform you of their needs, which could help in designing classes and programs for them.

________________________________
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of EllaBogard at cs.com
Sent: Tuesday, April 17, 2007 4:47 PM
To: assessment at nifl.gov
Subject: [Assessment 732] Re: Using Data for Program Improvement

Dear Colleagues:

Here at Franklinton Learning Center, we use data every day in our program to help us track and improve the end results coming out of our program.
We use enrollment data to check the reach of our program, average hours attended to check the depth of student engagement, and the number of students through the door versus the number completing enrollment to help us improve retention during the crucial orientation period of classes. We have a program called ABLELink here in Ohio that has made it very easy to track some areas. It has also allowed us to compare statistics from one year to another, so we know how we are doing in comparison to previous years. By tracking information collected on attendance, educational gain, hours of engagement, and accomplishments, we have been able to improve all of these efforts. Tracking and constantly checking this data is what has made it possible to improve.

We can easily pull up reports on testing, who has tested, progress made, who hasn't tested, attendance, etc. We can organize that information by class, by teacher, by program, or by site, which allows us to compare the effectiveness of programs and staff and assign responsibility for improvement where needed. I would like to be able to track consistency of attendance over time, not just total hours attended. I think this might give a better picture of the progress to be expected than total time attended does. I would also like to understand more about how I can use all of the ABLELink data collected to improve my program's overall effectiveness.

Respectfully submitted,

Ella Bogard, Executive Director
Franklinton Learning Center
1003 West Town Street
Columbus, Ohio 43222-1438
Phone: (614) 221-9151
Fax: (614) 221-9131
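Larry's "percent of possible time attended" measure, described earlier in the thread, is attended hours divided by scheduled hours. A minimal sketch, using his worked example of 8 attended hours against a 10-hour weekly schedule (the function name is my own):

```python
def proportion_attended(hours_attended, scheduled_hours):
    """Percent of possible time attended: attended hours / scheduled hours."""
    if scheduled_hours <= 0:
        raise ValueError("scheduled_hours must be positive")
    return hours_attended / scheduled_hours

# Larry's worked example: 8 of 10 scheduled hours per week.
print(f"{proportion_attended(8, 10):.0%}")  # prints "80%"
```

The same ratio works over any window -- a week, a term, or a learner's whole enrollment -- as long as both numerator and denominator cover the same period.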
More information about the Assessment mailing list