National Institute for Literacy
 

[Assessment 787] Re: Using Data for Program Improvement

Karen Mundie kmundie at gplc.org
Fri Apr 20 14:27:51 EDT 2007


In Pennsylvania we also have a mandated assessment training. At
least one person from every program must have attended the training.

However, the truth is that many people who have not attended the
training are still giving assessments.

At my own agency, everyone who gives the TABE, BEST, or BEST Plus
has attended training. Basically, I think we do very well with the
TABE. We are lucky to have an education specialist who is an
assessment trainer and who oversees our testing procedures in
general.

I think that the BEST Plus is a problem for many programs because
expertise in giving the test requires a good deal of practice. We
have limited administration of the BEST Plus to about nine staff
members and try to have the pre- and post-tests given by the same
person, but I'm still concerned that there is too much room for
scoring anomalies and differences in judgment.

I hear from a number of programs that this is a problem.

Karen Mundie
Associate Director
Greater Pittsburgh Literacy Council
100 Sheridan Square, 4th Floor
Pittsburgh, PA 15206
412 661-7323 (ext 101)
kmundie at gplc.org

GPLC - Celebrating 25 years of literacy, 1982-2007


This e-mail is intended solely for the use of the named addressees
and is not meant for general distribution. If you are not the
intended recipient, please report the error to the originator and
delete the contents.


On Apr 20, 2007, at 12:55 PM, Rosemary Matt wrote:


> Karen,
> The challenge of rolling out the staff development for these
> assessment pieces is truly daunting. In New York, every staff
> member who is to administer the TABE must attend the state
> training. Through our data management system, we track all
> attendance and certifications for state-mandated training. We have
> a similar strand running for the BEST Plus as well. Although it is
> a challenge to get everyone trained, the programs that pushed
> their staff through the first round are already seeing the
> difference in their performance data for educational gain. You are
> absolutely correct in feeling that there is strategy involved; a
> better understanding of the nuances of the NRS system can also
> help a program demonstrate, through its data reporting, a more
> accurate picture of the services it provides and the impact they
> have on its students.
> Rosemary

>
> Rosemary I. Matt
> NRS Liaison for NYS
> Literacy Assistance Center
> 12 Meadowbrook Drive
> New Hartford, NY 13413
> 315.798.1026
>
> From: assessment-bounces at nifl.gov on behalf of Karen Mundie
> Sent: Fri 4/20/2007 12:23 PM
> To: The Assessment Discussion List
> Subject: [Assessment 784] Re: Using Data for Program Improvement
>

> I'm fascinated by this discussion because most of us who do a lot
> of standardized testing have, indeed, learned by experience that
> the whole testing thing is not quite as straightforward as one
> might think, that there is strategy involved if a program is to
> put its "best foot forward" in a completely legitimate way--not to
> mention a good deal of staff training to achieve any kind of
> consistency.
>

> Karen Mundie
> Associate Director
> Greater Pittsburgh Literacy Council
> 100 Sheridan Square, 4th Floor
> Pittsburgh, PA 15206
> 412 661-7323 (ext 101)
> kmundie at gplc.org
>
> GPLC - Celebrating 25 years of literacy, 1982-2007
>
> This e-mail is intended solely for the use of the named addressees
> and is not meant for general distribution. If you are not the
> intended recipient, please report the error to the originator and
> delete the contents.
>

> On Apr 20, 2007, at 7:04 AM, Rosemary Matt wrote:

>

>> Katrina,
>> I am not sure what state you are from, but here in New York we
>> have just this past year implemented a new state policy for
>> administration of the TABE, and a series of validity tables as
>> well. (I have attached both for you to take a look at.) Larry may
>> remember that it was our state data that prompted us to change
>> the way our programs were using the TABE. In some cases, based on
>> the score ranges, teachers were actually prohibiting their
>> students from showing enough gain to place them in the next EFL
>> under NRS guidelines by choosing an invalid level of the TABE.
>> Scores on either the high or low end of each range of scores on
>> the TABE are unreliable because of the large standard error of
>> measurement associated with the extreme ends of any bell curve.
>> This means that, as you suspected, the high and low scores on
>> each of the tests are less likely to be a true indication of the
>> student's ability. Retesting with a higher or lower level of the
>> test is recommended in these cases. It was evident to us, based
>> on our state data, that test administrators either did not
>> understand that concept or had differing opinions as to when a
>> score was outside the acceptable range and, consequently, when to
>> retest.
>>

>> We employed the methodology developed by the University of
>> Massachusetts for the Massachusetts Department of Education to
>> establish acceptable ranges for the Reading, Mathematics
>> Computation, and Applied Mathematics sections of the TABE 7 & 8
>> and TABE 9 & 10. The policy, along with the scoring tables, was
>> then integrated into our data management system such that invalid
>> scores cannot even be entered into the data system. If students
>> score outside the valid ranges, they must be retested on an
>> appropriate version of the TABE.
>>
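>> To make the idea concrete, here is a minimal sketch in Python of
>> how such a gate can work. The score windows below are invented
>> for illustration only; the real acceptable ranges are in the
>> attached tables, and the actual data system certainly implements
>> this differently.
>>
>>     # Illustrative only: made-up scale-score validity windows per
>>     # TABE level and section; see TABE_Final_Tables.doc for the
>>     # real NYS ranges.
>>     VALID_RANGES = {
>>         ("M", "Reading"): (400, 550),
>>         ("D", "Reading"): (500, 650),
>>     }
>>
>>     def accept_score(level, section, scale_score):
>>         """Return True if the score falls inside the valid window
>>         and may be entered; otherwise the student must be
>>         retested on an adjacent (contiguous) level of the TABE."""
>>         low, high = VALID_RANGES[(level, section)]
>>         return low <= scale_score <= high
>>
>>     print(accept_score("M", "Reading", 545))  # True: enter it
>>     print(accept_score("M", "Reading", 580))  # False: retest on D
>>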

>> Strategy is still advised when using these scoring tables. For
>> example, based on our validity tables, if a student scores a 7.2
>> GE reading level on an M TABE, they are within the valid range;
>> however, if a level M is administered as the post-test to this
>> same student, the very highest that student may achieve and still
>> fall within the valid range is a 7.7 GE. That score will not be
>> enough to show educational gain. This student must be given a
>> level D test to open up the possibility of achieving a score high
>> enough to evidence gain. As long as the administration of the
>> TABE levels is contiguous, the scores are valid and may be used
>> under NRS guidelines. (So moving from an M to a D is acceptable.)
>>
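>> A rough sketch of that decision logic follows. Only the 7.7 GE
>> ceiling for level M comes from the example above; the other
>> ceilings and the fixed gain target are placeholders, since the
>> real target depends on the student's NRS level boundaries.
>>
>>     # Hypothetical GE ceilings of the valid range per TABE level;
>>     # only the M value (7.7) is taken from the example above.
>>     GE_CEILING = {"E": 3.9, "M": 7.7, "D": 9.9, "A": 12.9}
>>     NEXT_LEVEL = {"E": "M", "M": "D", "D": "A"}
>>
>>     def post_test_level(pre_level, pre_ge, gain_needed=1.0):
>>         """Pick a post-test level that leaves room to show gain.
>>         gain_needed is illustrative, not an NRS rule."""
>>         if pre_ge + gain_needed > GE_CEILING[pre_level]:
>>             # The same-level ceiling would cap the score below
>>             # the target, so move up one (contiguous) level.
>>             return NEXT_LEVEL.get(pre_level, pre_level)
>>         return pre_level
>>
>>     print(post_test_level("M", 7.2))  # "D": 7.7 leaves no room
>>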

>> As you can imagine, Katrina, a comprehensive staff development
>> program was built to accommodate all this information, and we
>> rolled it out to all programs through a train-the-trainer model.
>> I am pleased to say that our state's performance in the area of
>> educational gain has increased significantly as a result of this
>> work. I hope this is useful to you and your testing coordinator.
>> Rosemary

>>
>> Rosemary I. Matt
>> NRS Liaison for NYS
>> Literacy Assistance Center
>> 12 Meadowbrook Drive
>> New Hartford, NY 13413
>> 315.798.1026
>>
>> From: assessment-bounces at nifl.gov on behalf of Katrina Hinson
>> Sent: Thu 4/19/2007 7:14 PM
>> To: assessment at nifl.gov
>> Subject: [Assessment 774] Re: Using Data for Program Improvement
>>

>> I don't know if anyone has raised this question, but one of the
>> things that the testing coordinator at my school and I are
>> concerned about is the fact that we don't get to count gains made
>> if a person goes from the M level TABE to the D level TABE. We
>> use versions 9 and 10. I had a student who scored around 9.0 on
>> the M level Reading, which we question in and of itself in terms
>> of validity. He's at the point where we're supposed to retest
>> him. He'd tested on the D level in math the first time, and he's
>> been regularly attending and regularly working on his goals. He
>> went from the 9.0 on the M level test, which is moderately hard,
>> in reading to a 7.7 on the D level test, which is difficult.
>> Because he DROPPED in terms of grade level, it's not counted as a
>> level completion...even though he actually went from M to
>> D...which one would think would also qualify as a level
>> completion.
>>
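>> To make the arithmetic concrete, here is a rough sketch of why
>> the reporting reads this as a drop. The GE-to-EFL boundaries
>> below are my approximation of the NRS bands, so treat the exact
>> numbers as illustrative:
>>
>>     # Approximate NRS educational functioning levels by GE score.
>>     def efl_from_ge(ge):
>>         if ge < 2.0:  return "Beginning ABE Literacy"
>>         if ge < 4.0:  return "Beginning Basic"
>>         if ge < 6.0:  return "Low Intermediate"
>>         if ge < 9.0:  return "High Intermediate"
>>         if ge < 11.0: return "Low Adult Secondary"
>>         return "High Adult Secondary"
>>
>>     # Gain is judged on the score, not on which level was given:
>>     print(efl_from_ge(9.0))  # pre:  Low Adult Secondary (on M)
>>     print(efl_from_ge(7.7))  # post: High Intermediate (on D)
>>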

>> Has anyone else had this happen, and if so, what are your
>> suggestions?
>>
>> Regards,
>> Katrina Hinson

>> -------------------------------
>> National Institute for Literacy
>> Assessment mailing list
>> Assessment at nifl.gov
>> To unsubscribe or change your subscription settings, please go to
>> http://www.nifl.gov/mailman/listinfo/assessment
>>

>> <TABE_Policy_NYS.doc>
>> <TABE_Final_Tables.doc>


