National Institute for Literacy
 

[Assessment 744] Re: Using Data for Program Improvement

Monti, Suzi SMonti at ccbcmd.edu
Wed Apr 18 12:29:26 EDT 2007


Karen,

In response to your question about our curriculum: we do use the same materials (core books) across classes at the same level, and while we have/encourage great flexibility to teach in response to student needs, there is commonality provided by a framework of target skills or strategies to cover per level, per semester. This facilitates the learning process if a student transfers within a semester, or even if the student shows up the next semester.

As far as tracking goes, within our program and within a fiscal year we do pick up students where they turn up, assuming the identifiers are consistent. We also plan "sister sites" to encourage multiple enrollment, which increases intensity/contact hours for students who desire it. We have the same issues with multiple test results in those cases, but the information system we use seems to handle that successfully.

You mentioned students accessing more than one program and being able to track that. I am not sure whether our statewide system in Maryland is able to do that (perhaps someone on the list can respond?). Tracking can even become problematic within our own program. Because we lack the usual identifiers such as SSNs, we have issues with multiple names, varied arrangements of given names and surnames, reversed dates of birth, etc., as can be common with ESOL students. We assign a number to each student, but it is sometimes challenging to determine whether we are dealing with the same student or a different one. It can be detective work to sort it out. It would be interesting to know how often this impacts tracking.
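
To give a flavor of that detective work, here is a rough sketch of the kind of matching logic involved (the field names and the day/month-reversal check are illustrations I made up, not our actual system):

    from datetime import date

    def normalize_name(name):
        # Lowercase, drop punctuation, and sort the name tokens so that
        # "Garcia, Maria" and "Maria Garcia" normalize the same way.
        tokens = name.replace(",", " ").replace(".", " ").lower().split()
        return tuple(sorted(tokens))

    def dob_variants(dob):
        # Treat a day/month reversal (03/05 vs. 05/03) as the same birth
        # date, since form conventions differ from country to country.
        variants = {dob}
        if dob.day <= 12:
            variants.add(date(dob.year, dob.day, dob.month))
        return variants

    def likely_same_student(a, b):
        # Flag two records for human review when the name tokens match
        # and the birth dates agree up to a day/month reversal.
        return (normalize_name(a["name"]) == normalize_name(b["name"])
                and bool(dob_variants(a["dob"]) & dob_variants(b["dob"])))

    # The same person, entered two different ways:
    rec1 = {"name": "Garcia, Maria", "dob": date(1975, 3, 5)}
    rec2 = {"name": "Maria Garcia", "dob": date(1975, 5, 3)}
    print(likely_same_student(rec1, rec2))  # True -> worth a second look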

Suzi

________________________________

From: assessment-bounces at nifl.gov on behalf of Karen Mundie
Sent: Wed 4/18/2007 10:40 AM
To: The Assessment Discussion List
Subject: [Assessment 741] Re: Using Data for Program Improvement


Suzi, given that much mobility among your students, I'm curious about your curriculum. Do you have a reasonably set curriculum that is consistent across sites? If that were the case, the movement would have fewer implications for student learning gains.

And do you "move" the student records and hours in class forward from one site to another internally?

The beauty of our online database in Pennsylvania is that students can't be duplicated in the system. Even if a student moves to a different program rather than just a different site, his record is in edata, and the new program and the old one share the student equally. Both programs can put in hours, but only one can put in assessment information. The program in which the student is currently active is usually the "primary" program.
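
Roughly, the rule works like this (a hypothetical sketch of the constraint, not actual edata code):

    class StudentRecord:
        # One system-wide record per student, shared across programs.
        def __init__(self, student_id, primary_program):
            self.student_id = student_id
            self.primary_program = primary_program
            self.hours = {}        # program name -> attended hours
            self.assessments = []  # entered by the primary program only

        def add_hours(self, program, hours):
            # Any program serving the student may log contact hours.
            self.hours[program] = self.hours.get(program, 0) + hours

        def add_assessment(self, program, result):
            # Only the primary (currently active) program may enter
            # assessment information.
            if program != self.primary_program:
                raise PermissionError(program + " is not the primary program")
            self.assessments.append(result)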

There are, frankly, some students who, because they are so motivated to learn English, are active in more than one program at the same time. It doesn't matter which agency does the testing--we usually use the best results for the official record. If our database manager sees that we have better results than the primary agency, it benefits both of us to use that data. We might have caught the student on a better day, or our test might have been more appropriate for that particular student.

Karen Mundie
Associate Director
Greater Pittsburgh Literacy Council
100 Sheridan Square, 4th Floor
Pittsburgh, PA 15206
412 661-7323 (ext 101)
kmundie at gplc.org

GPLC - Celebrating 25 years of literacy, 1982-2007


This e-mail is intended solely for the use of the named addressees and is not meant for general distribution. If you are not the intended recipient, please report the error to the originator and delete the contents.



On Apr 17, 2007, at 6:58 PM, Monti, Suzi wrote:


I would like to add a few comments on retention and ESOL students.

We have recently heard a lot about "stopping out," and I think that can pertain to ESOL learners for many of the same reasons as ASE/GED learners - with the addition of issues such as stages of acculturation and/or home-country responsibilities, which may cause ESOL learners to withdraw for weeks or months and then possibly return.

I would also like to raise the issue of the mobility of the ESOL population. We see migration reports on immigrants and settlement trends, and I often wonder how much of a difference these trends make in retention when comparing ASE/GED retention rates with ESOL.

I think of the "stopover" trend we sometimes see in ESOL here in Baltimore, MD, where non-native speakers will enter and reside here only temporarily before moving on to an intended, more permanent location. This obviously has a great impact on retention. When comparing ESOL programs statewide or nationwide, the "stopover" trend may negatively impact the retention rates of certain programs.

Another thing we see is "shift," or movement around the beltway (as we call it). We have major ESOL class sites at locations along the Baltimore beltway that roughly encircles the city, and we see contraction and expansion at these sites based on the movement of the ESOL population. A site may suddenly have low retention across ALL six or seven ESOL classes offered - even the classes with veteran/experienced teachers with a great track record of retention. In some cases, the same teacher is also teaching at another site, and his/her class at that site is doing well. Both of these things suggest that the attrition is not likely a result of instructional issues.

When we see this contraction of a site with mid-semester attrition, we can sometimes predict that another site will experience a boom in registration the next semester. It depends on whether it is more "stopover" (with learners leaving the area entirely) or just "shift" (learners relocating within the area). If it is the latter, learners who leave one site mid-semester will turn up to register the next semester at another site.

Suzi Monti

ESOL Curriculum Developer and Instructional Specialist
The Community College of Baltimore County
Center for Adult and Family Literacy
7200 Sollers Point Road E102
Baltimore, MD 21222

(410) 285-9476


-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of Condelli, Larry
Sent: Tuesday, April 17, 2007 5:31 PM
To: The Assessment Discussion List
Subject: [Assessment 735] Re: Using Data for Program Improvement




Hi Ella,

Disaggregating by class can be very effective for understanding what is going on.

I wanted to comment on your last remark about tracking consistency of attendance.

Attendance and persistence are very popular topics these days, and most data systems allow for tracking of student attendance and persistence patterns. One thing you might consider is looking at learners who "stop out" -- those with sporadic attendance patterns, who attend for a while and come back later. Another measure is the percent of possible time that learners attend. You compute this by dividing the attended hours by the total possible hours (e.g., a learner who attends 8 hours a week in a class scheduled for 10 hours a week = 80%). Some research I did on ESL students showed that those who attended a higher proportion of the possible time learned more, independent of total hours. I think this is so because this measure reflects student motivation to attend.
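
In rough code, the two measures look like this (the three-week gap I use to flag a "stop out" is just an illustration; there is no standard cutoff):

    def attendance_rate(attended_hours, scheduled_hours):
        # Percent of possible time attended: attended / scheduled.
        return attended_hours / scheduled_hours

    # The example above: 8 hours attended of 10 scheduled per week.
    print("{:.0%}".format(attendance_rate(8, 10)))  # 80%

    def is_stop_out(weekly_hours, gap_weeks=3):
        # Flag a learner who attended, disappeared for at least
        # gap_weeks consecutive weeks, and then attended again.
        gap, started, returned = 0, False, False
        for hours in weekly_hours:
            if hours > 0:
                if started and gap >= gap_weeks:
                    returned = True
                started, gap = True, 0
            elif started:
                gap += 1
        return returned

    print(is_stop_out([10, 8, 0, 0, 0, 9, 10]))  # True: a gap, then a return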

Identifying and studying "stop out" learners might tell you a lot about why these types of students don't attend more regularly, and it can inform you of their needs, which could help in designing classes and programs for them.

________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On Behalf Of EllaBogard at cs.com
Sent: Tuesday, April 17, 2007 4:47 PM
To: assessment at nifl.gov
Subject: [Assessment 732] Re: Using Data for Program Improvement


Dear Colleagues:

Here at Franklinton Learning Center, we use data every day to help us track and improve the end results coming out of our program. We use enrollment data to check the reach of our program, average-hours-attended data to check the depth of student engagement, and the number of students through the door versus the number completing enrollment to help us improve retention in the crucial orientation period of classes.

We have a program called ABLELink here in Ohio that has made it very easy to track some areas. It has also allowed us to compare statistics from one year to another, so we know how we are doing in comparison to previous years. By tracking information collected on attendance, educational gain, hours of engagement, and accomplishments, we have been able to improve in all of these areas.

Tracking and constantly checking this data is what has made improvement possible. We can easily pull up reports on testing: who has tested, progress made, who hasn't tested, attendance, etc. We can organize that information by class, by teacher, by program, or by site, which allows us to compare the effectiveness of programs and staff and assign responsibility for improvement where needed.
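
For example, with an export of attendance records, that kind of disaggregation takes only a line or two in an analysis tool (the file and column names here are made up for illustration, not actual ABLELink fields):

    import pandas as pd

    # Hypothetical export; columns: student_id, class, teacher, site, hours
    df = pd.read_csv("attendance_export.csv")

    # Average attended hours, disaggregated by class...
    print(df.groupby("class")["hours"].mean())

    # ...and enrollment counts plus average hours by site and teacher.
    print(df.groupby(["site", "teacher"])["hours"].agg(["count", "mean"]))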

I would like to be able to track consistency of attendance over time, not just total hours attended. I think this might give a better picture of the progress to be expected than total time attended does. I would also like to understand more about how I can use all of the ABLELink data collected to improve my program's overall effectiveness.

Respectfully submitted,
Ella Bogard

Ella Bogard, Executive Director
Franklinton Learning Center
1003 West Town Street
Columbus, Ohio 43222-1438

Phone: (614) 221-9151
Fax: (614) 221-9131




-------------------------------
National Institute for Literacy
Assessment mailing list
Assessment at nifl.gov
To unsubscribe or change your subscription settings, please go to http://www.nifl.gov/mailman/listinfo/assessment



