National Institute for Literacy
 

[Assessment] To fudge or not to fudge

Varshna Narumanchi-Jackson varshna at grandecom.net
Thu Mar 9 07:30:50 EST 2006


Marie: I know the DOL requires states to undergo a data validation process
to ensure the quality of the data they are receiving - does NRS or the US
Dept of Education? I've read several threads on this, and I'm not sure the
problem is with the NRS or with states' expectations. The problem will come when
programs get to choose which students are included in program performance.
The US Dept of Education has been slow to move on the Common Measures
adopted across federal agencies funding employment and training programs.
If you're counting everyone who receives a qualifying service, then no one
is creaming the population to enhance performance, and we're all using the
same definitions for performance and outcomes, regardless of how the states
define their benchmarks. Thanks, Varshna.


on 3/8/06 11:08 AM, Marie Cora at marie.cora at hotspurpartners.com wrote:


> That is the question...
>
> Hello all! Not too long ago, I received an email question regarding
> submitting accurate data to the states and the Feds. It appeared that the
> person was being pressured to make up data (assessment scores) so that the
> outcomes of the program looked better.
>
> I bet this story is not new to you - either because you have heard about it,
> or perhaps because it has happened to you.
>
> So I have some questions now:
>
> If programs fudge their data, won't that come back to haunt us all? Won't
> that skew standards and either force programs to under-perform or keep them
> from reaching performance levels that have been set too steep? Why would
> you want to fudge your data? At some point, the fudge will most likely be
> revealed, don't you think?
>
> We don't have nationwide standards - so if programs within states are
> reporting data in any which way, we won't be able to compare ourselves across
> states, will we?
>
> Since states all have different standards (and some don't have any), states
> can report in ways that make them appear to be out-doing other states,
> when perhaps they are not at all.
>
> I'm probably mushing two different and important things together here: the
> accurate-data part and the standards part ("on what do we base our data?") -
> but this is how it's playing out in my mind. Not only do we sometimes struggle
> with providing accurate data (for a variety of reasons: it's complex, it's
> messy, we feel pressure, sometimes things are unclear, etc.), but we do not
> have institutionalized standards across all states for all to be working in
> parallel fashion.
>
> What are your thoughts on this?
>
> Thanks,
> marie cora
> Assessment Discussion List Moderator

> -------------------------------
> National Institute for Literacy
> Assessment mailing list
> Assessment at nifl.gov
> To unsubscribe or change your subscription settings, please go to
> http://www.nifl.gov/mailman/listinfo/assessment


