National Institute for Literacy
 

[Assessment 776] Re: Using Data

JoAnn (Jodi) Crandall crandall at umbc.edu
Fri Apr 20 03:54:17 EDT 2007


Larry and Steve,

I agree. There are important gains that are missed when one looks only at
data within a single year. Longitudinal data would give us a much better
view of participant progress, both for programs in which a significant
number of adults continue for more than one term and for those in which
participants stop out and return.

Jodi Crandall


> Thanks for those points, Larry. Besides giving us a broader view of more
> complex patterns of participation, multi-year data frames will probably do
> a better job of revealing program impacts on longer-term outcomes such as
> postsecondary education and employment. It's good to hear that there may
> be flexibility within ED for experimentation such as this.
>
> -Steve Reder
>

> _____
>
> From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
> On Behalf Of Condelli, Larry
> Sent: Thursday, April 19, 2007 11:16 AM
> To: The Assessment Discussion List
> Subject: [Assessment 768] Re: Using Data
>

> Dan, Karen, Steve and David,
>
> You all have raised the issue of changing the NRS reporting period from
> one year to multiple years. While this is off the topic of using data, I
> will give a quick response.

>
> First, the mandate is to have an annual reporting system, so some
> information is required each year to report to Congress. Beyond this, the
> topic has come up and been considered multiple times, and there is some
> flexibility within ED to make changes to the reporting period if a
> compelling reason can be demonstrated. Our analyses of several states'
> data (not NRS-reported data but individual student data spanning several
> years), however, including some very large states, show that
> proportionally very few students continue year to year (on the order of
> 5 percent or less in some states), and it does not appear at this time
> that a longer period would make a difference in performance data at the
> national level, as Dan Wann suggested.

>
> NRS is a national system, so in some local programs (such as Karen's) or
> other states there may be large numbers of students who continue year to
> year, and in those instances it might be advisable to look at and report
> multi-year data. To bring us back to our topic of using data, this would
> be a good analysis for a state or local program to pursue: to look at
> returning and continuing students and see how they differ in outcomes and
> other factors from students who stay a short time. We also can rely on
> research, such as Steve Reder's study, to look at long-term relationships,
> which, if compelling, could result in a change to the reporting period in
> the future.
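
For a state or local program that wants to try the analysis Larry
describes, a minimal sketch in Python follows, assuming a flat enrollment
extract with one row per student per program year; the column names
(student_id, program_year, level_gain) are illustrative, not prescribed by
the NRS:

    import pandas as pd

    # One row per student per program year; column names are hypothetical.
    df = pd.read_csv("enrollments.csv")  # student_id, program_year, level_gain

    # Flag students who appear in more than one program year.
    years_per_student = df.groupby("student_id")["program_year"].nunique()
    df["continuing"] = df["student_id"].map(years_per_student > 1)

    # Share of students who continue beyond a single year.
    print(f"{(years_per_student > 1).mean():.1%} of students continue")

    # Compare outcomes for continuing vs. single-year students.
    print(df.groupby("continuing")["level_gain"].mean())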

>
> _____
>
> From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
> On Behalf Of Karen Mundie
> Sent: Thursday, April 19, 2007 11:51 AM
> To: The Assessment Discussion List
> Subject: [Assessment 761] Re: Using Data
>

> Dan, I know that's the perception, but I also know that we roll over
> about half of our students from one year to the next... and some of those
> students had rolled over the previous year as well. We've actually had to
> put a three-year limit on some students (especially in ESL).

>
> I'm having our data person look this up as best we can. Unfortunately,
> our data tend to be divided, as David indicated, into discrete yearly
> "lumps." We can get the information, but it's time-consuming because the
> databases are designed for accountability over a contract year.
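
Karen's "yearly lumps" problem has a cheap workaround when each contract
year can at least be exported separately: intersect the student IDs across
years. A sketch, assuming one per-year CSV extract with a hypothetical
student_id column (the file names are illustrative):

    import pandas as pd

    # Separate accountability extracts, one per contract year (hypothetical files).
    fy2006 = set(pd.read_csv("fy2006_students.csv")["student_id"])
    fy2007 = set(pd.read_csv("fy2007_students.csv")["student_id"])

    # Students present in both contract years rolled over.
    rolled_over = fy2006 & fy2007
    print(f"{len(rolled_over) / len(fy2006):.0%} of FY2006 students returned in FY2007")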

>
> We certainly do have a lot of students who come in with short-term goals
> and leave when these are accomplished. We also have a lot of stop-out
> students, who have to put goals on the back burner while they work out
> other issues. I think, however, that we do keep a significant number of
> students over time. For my own little research project, I think I'm going
> to investigate gains over multiple years.

>
> Karen Mundie
> Associate Director
> Greater Pittsburgh Literacy Council
> 100 Sheridan Square, 4th Floor
> Pittsburgh, PA 15206
> 412 661-7323 (ext 101)
> kmundie at gplc.org
>

> GPLC - Celebrating 25 years of literacy, 1982-2007
>


> On Apr 19, 2007, at 9:47 AM, Dan Wann wrote:
>

> I wonder if there is even enough data to show that adult basic and ESL
> students stay with a program in large enough numbers to track over a
> longer period. The conventional wisdom of those outside the adult basic
> skills network is that basic skills programs have little impact because
> students do not stay long enough to make a difference. Do we have any
> evidence that shows we work with the same students for more than one
> year, and that we work with a high enough number of students for more
> than one year to make a significant difference?

>
> Dan Wann
> Professional Development Consultant
> IN Adult Education Professional Development Project
> dlwann at comcast.net
>

> _____
>
> From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
> On Behalf Of David J. Rosen
> Sent: Thursday, April 19, 2007 2:05 AM
> To: The Assessment Discussion List
> Subject: [Assessment 752] Re: Using Data
>

> Larry, and others,
>
> Tina and many other program administrators have observed patterns like
> this that suggest that a one-year time frame, a funding year, may not be
> the best unit of time in which to measure learner gains, except for those
> who are doing basic skills brush-up or who have very short-term goals
> like preparing for a driver's license test. I wonder if there is a
> possibility that the NRS might be adjusted, perhaps in a pilot at first,
> so that a longer period of learning, say three years, might be used to
> demonstrate learner gains. Of course, there would need to be intermediate
> measures, but accountability -- for programs and states -- might be based
> on a longer period of time.

>
> It seems to me that the one-year time frame within which to measure
> learning gains or goals accomplished comes not from K-12 or higher
> education, but rather from Congressional expectations for job skills
> training. Would you agree?

>
> Also, I wonder if you or others have some examples of programs that track
> and report learner outcomes over several years, and use the data for
> program improvement.

>
> David J. Rosen
> djrosen at comcast.net
>
> Tina_Luffman at yc.edu wrote:
>

> Hi Luanne,
>
> I find it interesting that what you are finding in your data seems to be
> consistent with what we see in our GED classes here in Arizona. Often the
> last group, who enter in March, are the least likely to stay with the
> program until posttesting, and the August group seems to have the highest
> posttesting and retention rates.
>
> Tina

>
> Tina Luffman
> Coordinator, Developmental Education
> Verde Valley Campus
> 928-634-6544
> tina_luffman at yc.edu
>

> -----assessment-bounces at nifl.gov wrote: -----
>
> To: <assessment at nifl.gov>
> From: "Luanne Teller" <lteller at massasoit.mass.edu>
> Sent by: assessment-bounces at nifl.gov
> Date: 04/18/2007 10:56AM
> Subject: [Assessment 746] Using Data
>

> Hi all:
>
> I wanted to chime in about our program's use of data, since this is the
> focus of our discussion. Coincidentally, I am in the process of writing
> our proposal for next year, so I am knee-deep in data even as we speak!

>
> The use of data takes many forms in our program. We look at what most
> people consider the "hard data" -- the raw numbers with regard to
> attendance, learner gains, retention, goal attainment, etc. We believe,
> however, that the numbers alone provide an incomplete picture of what is
> happening, so we use the numbers as a basis for discussion, not decision
> making. After analyzing the numbers, we begin to look at additional
> sources of data that we find essential in informing our planning:
> meetings with staff, classes, our student advisory board, and focus
> groups.

>
> Here's an example we're currently working on: we did a two-year analysis
> of learner retention and began to document why students did not persist.
> We found that retention for students who enrolled after January 1 (our
> program runs on a school calendar year, from September to June) was
> significantly lower than retention for students who began in September.
> Even more compelling, we learned that retention for students who began
> after March 1 was 0%.
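
Luanne's breakdown is straightforward to reproduce once enrollment dates
sit next to a persistence flag. A sketch, assuming a student-level extract
with hypothetical columns enroll_date and persisted (1 if the student
stayed to June, else 0); the cutoffs follow her September-June calendar:

    import pandas as pd

    df = pd.read_csv("students.csv", parse_dates=["enroll_date"])

    # Bucket students by when in the September-June program year they enrolled.
    def cohort(date):
        if date.month >= 9:      # September-December starters
            return "fall"
        if date.month <= 2:      # January-February starters
            return "winter"
        return "spring"          # March or later

    df["cohort"] = df["enroll_date"].apply(cohort)

    # Retention rate (share persisting to June) for each enrollment cohort.
    print(df.groupby("cohort")["persisted"].mean())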

>
> We met with staff and students, and did some research around student
> retention issues. After a year-long process, we decided to pilot a
> "managed enrollment" approach. In Massachusetts, our grantor (MA DOE)
> allows us to "over-enroll" our classes by 20%, so we enroll 20% more
> students in the fall. When students leave, we "drop" the over-enrolled
> students into funded slots. This allows us to keep the seats filled even
> with the typical attrition that occurs.

>
> In January, when we do our mid-point assessments, we move students who
> are ready to progress to the higher level. That typically leaves several
> openings in the beginner levels, and we begin students in February as a
> cohort. This year, we implemented new orientation programs, including a
> requirement that new students observe a class before enrolling.

>
> While it is still too early to tell whether these new procedures will
> have a positive impact, we are hopeful, and we know anecdotally that the
> transition seems to be easier for some of these students. We are eager to
> look at the data at the end of the year to analyze the effectiveness of
> this plan.

>
> As we begin to look at our data, we are finding that there seems to be a
> unique set of issues for our beginner ESOL students. We suspect that the
> lack of the English communication skills needed to advocate for
> themselves with employers is influencing their attendance and
> persistence. This is an issue that we are beginning to tackle in terms of
> policy. Do we need to have a more flexible, lenient policy for beginner
> students? Is there a way to support students in addressing these
> employment issues? How can we empower students more quickly? Are there
> other issues for these beginner-level students that affect their
> participation? As we enter these discussions, the numbers will provide a
> basis for developing strategies, but the students themselves will be our
> greatest source of valuable data.

>
> Luanne Teller
>

> -------------------------------
> National Institute for Literacy
> Assessment mailing list
> Assessment at nifl.gov
> To unsubscribe or change your subscription settings, please go to
> http://www.nifl.gov/mailman/listinfo/assessment
>



--
JoAnn (Jodi) Crandall
Professor, Education Department
Director, Ph.D. Program in Language, Literacy & Culture
Coordinator, Peace Corps Master's International Program in ESOL/Bilingual
Education
University of Maryland, Baltimore County (UMBC)
1000 Hilltop Circle, Baltimore, MD 21250
ph: 410-455-2313/2376 fax: 410-455-8947/1880
email: crandall at umbc.edu
www.umbc.edu/llc/
www.umbc.edu/esol/
www.umbc.edu/esol/peacecorps.html






More information about the Assessment mailing list