National Institute for Literacy
 

[Assessment 551] Re: peep?

Katrina Hinson KHinson at future-gate.com
Mon Oct 30 19:40:13 EST 2006


I don't know the answer to this and apologize for not getting to it
sooner, but it's been busy. The instructor should be the advocate for
student rights, but ultimately the instructor is bound by the constraints
of his or her program - rightly or wrongly. All we can do is personally
motivate our students to see beyond any score - to give positive
reinforcement all the time so the focus isn't on a test score. Those are
just my thoughts. I have good retention in my class so far, but I try
never to make the tests the focus - whether it's the TABE or the GED. I
put the focus on what my students show me they know and show me they can
do. I make a lot out of their successful moments. I think that goes a
long way.

Regards,
Katrina Hinson


>>> Nancy Hansen <sfallsliteracy at yahoo.com> 10/27/2006 12:31 pm >>>

Had to send a brief reply.

Who is the advocate for learner rights in this issue?

If what Katrina says is true in her concluding paragraph:
<< It's a catch 22 for instructors. We're caught in a loop of having to
meet performance measures that may or may not truly reflect student
ability, yet the work a student does that shows his or her ability is
often ignored. >>

Who is The Loser here? Isn't it the students in these programs?
What does this inaccurate performance measure do to the adult learners'
self-confidence - their self-esteem - the way they feel about being
successful? I'll bet they know they know *more* than the tests show, yet
they are being told differently. Aren't the consequences going to be a
higher dropout rate? And what is yours?

Nancy Hansen
Sioux Falls Area Literacy Council
Donna Chambers <donnaedp at cox.net> wrote:
Katrina,

You are not alone and it is a catch 22. I know that the test/data doesn't
often show the accurate picture. However, in reality, we are forced to
deal with standardized tests as the main measurement tool. Truth be told,
adult literacy learners are often not good traditional testers. What can
we do?
Donna Chambers

----- Original Message -----
From: "Katrina Hinson"
To:
Sent: Tuesday, October 24, 2006 12:36 PM
Subject: [Assessment 536] Re: peep?



> I'm in the same boat as Donna below. Normally I try to keep up with the
> discussions, but I've been silent on the lists I'm on lately simply
> because I wear way too many hats at the moment and it's difficult to
> keep up with the sheer volume of emails generated sometimes. Likewise, I
> have my own 'peep' to add to the discussion:
>
> We use TABE (9 & 10) at the moment for ABE/GED students and CASAS for
> ESL, Family Literacy, and Comp. Ed. One problem occurs when students
> move between programs. Students tested via CASAS are not given a math
> component - EVER - just Reading and Listening, and those in Family
> Literacy are only given Reading. They may or may not ever have a math
> placement score. Additionally, the program I'm in places great emphasis
> on these scores due to the nature of the funding - so much so that
> paperwork now has to go through multiple hands to ensure that it's all
> correctly filled in and that the numbers are all accurate. Another
> problem area is the move from students tested on TABE 8 to TABE 9. We
> saw a dramatic decline in test scores and were left asking whether it
> was accurate. Why the huge drop? Had these students regressed? Had the
> test not been administered properly beforehand? Had they memorized the
> test (which I do agree is a major issue with standardized testing)?
>
> There were no easy answers, and we're still seeing scores all over the
> place sometimes.
>
> Instructors and administrators here are very much tied to the "tests" as
> if that is the only measure. I sometimes feel like a lone voice saying
> "Yes, BUT..." at a lot of meetings, or trying to explain that the data
> isn't always a valid reflection of student ability. I'm always met with
> the same response - the tests need to match student ability to ensure
> funding.
>
> It's a catch 22 for instructors. We're caught in a loop of having to
> meet performance measures that may or may not truly reflect student
> ability, yet the work a student does that shows his or her ability is
> often ignored.
>
> Regards,
> Katrina Hinson

>

>>>> "Donna Chambers" 10/24/06 8:40 AM >>>

> Wow! Like others, I lose track of the discussion because I am just too
> busy to keep up. There is so much work to be done in thinking through
> this issue, but in the meantime, we must keep up with our duties in the
> classroom and running our programs. Now I just heard on the news that
> sitting at a computer for hours at a time can be addictive and may
> require medical and psychiatric treatment. Here is my quick "peep"
> because I don't have the time for another addiction.
>
> I agree with both Nancy and Mary Jane. The testing requirement for
> government funding is not enough and sometimes not appropriate. I have
> spent my career working with competency-based assessment and/or
> authentic assessment, and so that is what I inherently use to see if a
> learner understands something. By testing this way, I am also
> challenging the thinking skills of the learner, which I believe is
> critical in the process for the adult.
>
> Lately I have been doing a bit of informal experimentation. Here is what
> I find more times than not. The learner knows the skill, because it was
> demonstrated to me when the learner was asked to verbally solve the
> problem and explain the solution. Yet the learner got the item wrong on
> the paper-and-pencil, multiple-choice test. The learner demonstrates an
> ability to do a problem and apply the skill in several examples, and
> then gets the same type of problem wrong when asked to do it on a test.
> Pre-test scores do not always correlate with what I believe a learner
> knows, and yet we are asked to place such importance on them. Sometimes
> scores even go down between pre- and post-testing. I do believe this
> begs the question, "What is happening here?" It is definitely worth
> considering more varied assessment methods. We are working with adults
> who may be test anxious, and certainly language plays a huge role in
> being able to answer the question correctly. This is not to say that the
> tests we use are not valid, but getting a question right may reflect
> circumstances more than knowledge or skill, especially for adults. How
> do we know when they know it, will be able to retain it, and can apply
> it again in other circumstances?
>
> This is more than a "peep," but I feel this topic is critical to our
> programs and the whole "accountability/assessment" issue in education
> today.
>
> Donna Chambers

>

> ----- Original Message -----
> From: Mary Jane Jerde
> To: sfallsliteracy at yahoo.com ; The Assessment Discussion List
> Sent: Monday, October 23, 2006 6:49 PM
> Subject: [Assessment 534] Re: peep?

>

>

> Hi,
>
> The testing that is required for government funding tends to fly in the
> face of some serious principles for assessment: never depend on one
> tool; don't allow the students to become familiar with a specific test
> form; use a variety of testing methods; numbers do not give a full,
> realistic assessment. CASAS does do several things fairly well: easy
> training, easy scoring, easy make-up. How many students know Form 54 by
> sheer repetition? It's always a shock when they hit Form 56.
>
> I use BEST Plus with CASAS. It's not perfect either, but I have the
> "luxury" of a trained assessor willing to come on site to give it.
> Between the two, I'm getting a better picture and more options for
> reporting.
>
> Mary Jane Jerde
> Howard Community College
> Columbia, MD

>

> Nancy Hansen wrote:

> Marie -
>
> Without a peep .... I have been lurking ... every once in a while, but
> not regularly on this thread. Part of the reason I haven't replied is
> that the emails that were posted were so loooong. I felt it would
> require research to read thoroughly and respond, so I didn't. And I'm,
> as always, busy with many projects as the only full-time paid staff.
>
> It's not that I am not interested personally, but from my scanning of
> the posts I feel The Movement is not taking into consideration one (of a
> couple) very important factors: some adult learners cannot commit to the
> kind of time that many of you speak about. That means their time for
> study is also very precious, even though elongated. If testing takes
> away that time, it would be resented.
>
> Ours is an adult literacy program driven by volunteer instructors. The
> focus of the materials includes periodic check-ups built into the
> lessons. (Note: not called tests.) However, I sense that by your
> colleagues' standards my program would be deemed ineffective. You know
> .... the learners aren't gaining a grade level every year. Quite
> frankly, the learners don't *care* about that form of measurement.
>
> So I lurk. I feel, No. 2, The System places too *little* importance on
> what it is that the adult learner has brought with them as goals in
> their need to read, write and spell better. It cannot be measured in
> many cases ... except, perhaps, in smiles, self-confidence and an
> improved sense of worth. *That* our learners *do* treasure! How do
> members of the adult education system intend those skill development
> factors to be measured? Learner portfolios are part of *our* system, yet
> unacceptable to the NRS. It used to be that the check-up scores counted.
> But no longer.
>
> Until the answers are clear, this agency director will remain on the
> perimeter of the assessment discussion ... and *consequently* the agency
> will continue without funding that is tied to a grade-level increase
> requirement. The kicker is: the learners *like* what they are receiving,
> and that matters more. At least to me.
>
> Nancy Hansen
> Executive Director
> Sioux Falls Area Literacy Council
> Sioux Falls, SD

>

> Marie Cora wrote:

> Dear colleagues,
>
> Are you out there? Is this a bad time for a discussion? Is the topic not
> of interest?
>
> Aside from Virginia's post last week, I haven't heard from any of the
> 550 subscribers to the List. I'm assuming that the topic (Measuring
> Education Gains in Adult Literacy, 10/17 and 10/18) is of interest. But
> you need to let me know if it is or not. I generally gauge interest
> based on subscriber responses, and so far it seems the topic is not hot
> (which surprises me).
>
> Are there other topics you'd prefer to engage in? What are they? The
> purpose of this List is to provide a forum for discussion that goes well
> beyond your program's walls, so if I'm not hitting on the right stuff, I
> really do need to hear from you. Membership here continues to climb, but
> you are all very silent.
>
> I really want to hear your thoughts, and I want to know how this List
> can serve you well. Please let me know. Feel free to respond to the
> discussion on Measuring Gains, but also feel free to start your own
> discussion topic, or to send your thoughts to the List or to me
> personally regarding other discussions that you would like to see happen
> - and I'll make them happen.
>
> I value and appreciate your membership highly - but a Discussion List is
> only as good as the discussions that occur. If you've never posted and
> that makes you a bit reticent, feel free to send me your post and I can
> do a couple of things, like help you compose your message, or I can post
> your message for you anonymously. The important thing is for your voices
> to be heard.
>
> Thanks!
>
> marie
>
> Marie Cora
> marie.cora at hotspurpartners.com
> NIFL Assessment Discussion List Moderator
> http://www.nifl.gov/mailman/listinfo/assessment
> Coordinator, LINCS Assessment Special Collection
> http://literacy.kent.edu/Midwest/assessment/


-------------------------------
National Institute for Literacy
Assessment mailing list
Assessment at nifl.gov
To unsubscribe or change your subscription settings, please go to
http://www.nifl.gov/mailman/listinfo/assessment





