National Institute for Literacy
 

[Assessment 235] Re: [Assessment] Tying data to performance and performance to data

Marie Cora marie.cora at hotspurpartners.com
Tue Mar 14 11:33:42 EST 2006


Hi Varshna and all,

So are you saying that you believe the federal system separates
employment and training for purposes of data collection only, and not
to distinguish between opportunities? Could you clarify this for us a
bit?

I did go to the ExpectMore URL you suggested below, and I was intrigued
by the Improvement Plan they propose: it focuses on developing uniform
or standardized mechanisms for collecting data - they clearly believe
that this will have an impact on performance.

There is also this item: 'Measuring how program participation impacts
an individual's earnings.' - What do folks think of this? Do people
feel that we can effectively tie earnings to program participation (and
vice versa)?

Thanks,
marie
Assessment Discussion List Moderator


-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Varshna Narumanchi-Jackson
Sent: Friday, March 10, 2006 8:13 AM
To: The Assessment Discussion List
Subject: Re: [Assessment] To fudge or not to fudge

Katrina:

Just thought I'd address one of the questions below: whether employment
with the military is 'entered employment' for the purposes of WIA. It is
my understanding that federal military wage records are part of the data
set -- the other primary data set being UI wage records -- that states
can utilize when evaluating employment outcomes. The question then
becomes one of sharing data between Title I and Title II agencies for
the purposes of evaluating program outcomes. It is not always clear to
me that, when states have separate Title I and II agencies, this
happens, or that it happens with the kind of coordination expected at
the federal level.

The issue -- and I'm stating my opinion now -- is the attempt to make a
distinction between education and training that (again, my opinion) has
no practical relevance to the adults who seek ABE services and, maybe,
has no relevance in the discussion of measuring the efficacy of
federally-funded programs, especially when education appears to be
considered a training service on the basis of its inclusion in
federally-defined employment and training programs. Maybe that's a
discussion for another list...

I don't know if anyone has read the www.expectmore.gov page on Adult
Education:
http://www.whitehouse.gov/omb/expectmore/summary.10000180.2005.html

It's interesting... Program Results and Accountability rates a ZERO.

Thanks, Varshna

on 3/10/06 6:50 AM, Katrina Hinson at khinson at future-gate.com wrote:


> That's a big question. I had to think about this one some before I could even begin to think about a response. I agree that it's not a new story and that it probably happens more than people realize. My personal opinion is that any time you tie funding primarily to performance, there are bound to be issues regarding data collection, and I'm not sure there is a "neat" solution that will address the problem. Also, I think there are gaps in the data itself. I teach in addition to other duties I have at my institution, and so often when I do outcomes, I don't feel they're a true indicator of a student's performance - a student may have made progress but not enough to move up a level, not enough to meet a goal or performance indicator, and only goals like that are accounted for. Another problem we've encountered with the data itself is the fact that if a student marks "find a job" as a goal and ends up joining the military, the goal isn't met, because of the way the data is cross-referenced with ESC agencies - yet I think most people would agree that joining the military definitely qualifies as getting a job.
>
> I think there are issues with the data collection instruments: while they may have been validated at some point, they perhaps need to be reviewed to see if they are capturing the right data needed to show a program's real performance, or if more needs to be taken into account when determining whether a program is doing well. I don't think raw data alone can ever truly show a program's strengths and weaknesses.
>
> I'm still digesting this topic. It definitely warrants thought.
>
> Regards,
> Katrina Hinson

>

>>>> marie.cora at hotspurpartners.com >>>
> That is the question...
>
> Hello all! Not too long ago, I received an email question regarding submitting accurate data to the states and the Feds. It appeared that the person was being pressured to make up data (assessment scores) so that the outcomes of the program looked better.
>
> I bet this story is not new to you - either because you have heard about it, or perhaps because it has happened to you.
>
> So I have some questions now:
>
> If programs fudge their data, won't that come back to haunt us all? Won't that skew standards and either force programs to under-perform or not allow them to reach their performance levels because they are too steep? Why would you want to fudge your data? At some point, the fudge will most likely be revealed, don't you think?
>
> We don't have nationwide standards - so if programs within states are reporting data in any which way, we won't be able to compare ourselves across states, will we?
>
> Since states have all different standards (and some don't have any), states can report in ways that make them appear to be out-doing other states, when perhaps they are not at all?
>
> I'm probably mushing 2 different and important things together here: the accurate data part, and the standards part ("on what do we base our data") - but this is how it's playing out in my mind. Not only do we sometimes struggle with providing accurate data (for a variety of reasons: it's complex, it's messy, we feel pressure, sometimes things are unclear, etc.), but we do not have institutionalized standards across all states for all to be working in parallel fashion.
>
> What are your thoughts on this?
>
> Thanks,
> marie cora
> Assessment Discussion List Moderator




-------------------------------
National Institute for Literacy
Assessment mailing list
Assessment at nifl.gov
To unsubscribe or change your subscription settings, please go to
http://www.nifl.gov/mailman/listinfo/assessment



