National Institute for Literacy
 

[Assessment 1077] Re: Standardized Tests

Dan Wann dlwann at comcast.net
Thu Nov 29 14:22:15 EST 2007



From what I understand, grade level equivalency scores are given
because of the need to report by that standard. I have read TABE materials
stating that GE is too broad a term to mean much and that the scale
score is much more meaningful in understanding student performance. I know
that CASAS scale scores are much more meaningful than the GE, but the GE is
made available when funding sources require grade level equivalency
reporting.

Both CASAS and BEST do test students' reading ability in life and work
contexts, and I would disagree with some of what Bruce has written below.
Further, as tests of basic reading skills, these tests are constructed so
that extensive or specialized background knowledge is not necessary. The
test items on both of these tests seem to test reading within a context that
is known by a majority of people who function in society at various levels,
and they are not designed to test at a post-secondary level. If you examine
the CASAS and BEST tests, you see questions based on paycheck stubs, bus
schedules, work memos, employee handbooks, medicine labels, and narratives.

In my opinion, standardized testing is a necessary but not sufficient tool
for assessing student skill level or proficiency. While I do not use the TABE
because of the test's orientation to an academic setting, I would not
say it is a bad test. The question is whether the test does what it says it
will do. I think that CASAS and BEST do what is mentioned below by testing
adult students' ability to read a variety of documents in a variety of
settings. The scale scores given and the indicators of levels are not grade
level dependent and can be explained just as an SAT or CAT score can be
explained independent of grade level. Educators must choose a test that
correlates with the curriculum and instruction if the test is to be of
value, but I would not say that a test is bad because it tests one thing and
I teach another.

Dan Wann
Professional Development Consultant
Adult ESL and Developmental Education Teacher

-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of Bruce C
Sent: Thursday, November 29, 2007 8:19 AM
To: The Assessment Discussion List
Subject: [Assessment 1076] Standardized Tests

The problem with standardized tests is the claim they
make about reading level.

All reading happens in a context. My reading level is
dependent on my background knowledge (including
vocabulary, print conventions, and content knowledge),
interest in the subject, purpose for reading, urgency
of the task at hand, etc. For example, I could read a
complicated "Request for Proposal" for Adult Literacy
funding much more successfully than I could read an
RFP for community health centers. In the case of the
former, I have read similar texts before, I know the
jargon and acronyms, and I will push through it
because I need the money. Not so for the latter. This
is true even if these documents have the same
readability level.

Standardized tests ignore the realities of real
reading. In real life, I read maps, newspapers, my
paycheck, phone books, novels, utility bills, emails,
and text messages. My reading level changes depending
on the context of each task. This is why the National
Assessment of Adult Literacy asked people to perform a
range of reading tasks and did not give them a
standardized test. They wanted to be accurate.

Standardized reading tests give me a fair indication
of how well a student will do on a standardized test.
They may tell me something about the student's general
reading level, but not nearly as much, and not as
accurately, as these tests claim.

Standardized tests act as if I can put a "sixth grade"
stamp on a student's forehead and assume she/he reads
everything at the sixth grade level. That just isn't
how reading works.

Bruce Carmel
Turning Point
Brooklyn NY





> >>> "Venu Thelakkat" <VenuT at lacnyc.org> 11/23/2007 3:22 PM >>>
> I agree with Miriam that test publishers present easy targets for blame.
> While I am not overly compelled to defend CTB McGraw Hill, I am not sure
> about the point being made by John and Bruce. Is it that:
>
> - The TABE is a bad standardized test and there are others that are
> better suited to our purposes? Or is it that
>
> - All standardized assessments are bad and should not be part of any
> system for evaluating program or student performance? Or is the
> argument that
>
> - Standardized assessments should only be part of a more comprehensive
> evaluation system that includes portfolios, performance-based
> assessments, and teacher observations (this, I believe, is part of the
> argument David made in a similar thread a few weeks ago)
>
> Venu Thelakkat
> Director of ASISTS/Data Analysis
> Literacy Assistance Center
> 32 Broadway, 10th floor
> New York, NY 10004
> (212) 803-3370
> venut at lacnyc.org
> www.lacnyc.org

>

> -----Original Message-----
> From: assessment-bounces at nifl.gov
> [mailto:assessment-bounces at nifl.gov]
> On Behalf Of John Gordon
> Sent: Wednesday, November 21, 2007 2:42 PM
> To: The Assessment Discussion List
> Subject: [Assessment 1073] Re: TABE Training
>
> I think Bruce was right on the money about the TABE. It's hard not to be
> cynical about the role of the test publishers. I wonder about the role
> McGraw Hill has played in getting the TABE adopted, why they keep coming
> out with new versions every couple of years, etc.
>
> Much as I like to think that programs are not letting the tests drive
> instruction, I'm doubtful. The pressures to show testing outcomes are
> so intense (high stakes testing for the programs!), I assume many
> programs devote considerable time to preparing students for the TABE and
> other tests. I would point out that McGraw-Hill now publishes a series
> of TABE workbooks. So much for not letting the test drive instruction. I
> have no idea how many programs around the country use them; I'd be
> interested to know....
>
> john gordon
> new york

> -----Original Message-----
> From: assessment-bounces at nifl.gov
> [mailto:assessment-bounces at nifl.gov]
> On Behalf Of Kroeger, Miriam
> Sent: Wednesday, November 21, 2007 11:00 AM
> To: The Assessment Discussion List
> Subject: [Assessment 1072] Re: TABE Training
>
> OVAE requires that Adult Education programs funded through WIA Title II
> use "standardized assessment procedures....The procedures must be a
> standardized test or a standardized performance-based assessment with a
> standardized scoring rubric." (NRS guidelines, pg. 22) OVAE is also in
> the process of "vetting" assessments for approved use in AE programs.
> With the standardization, reliability, and validity requirements,
> programs have "defaulted" to publishers' tests - thus the overwhelming
> use of TABE or CASAS. (Other tests that may have previously been used,
> such as ABLE or AMES, are no longer published or have not been updated.)
>
> We all know that no test is perfect; what we need to know is how to make
> the best use of the tests that are approved and to not let a test drive
> the instruction; rather, it should help to inform instruction. Given
> that, do the majority of individuals using the TABE use the resources
> that are available with it? Someone has suggested that the instructors
> take the tests. They should at least review them at all the levels.
> They also need to be aware of the level at which the test items are
> written. A scale score of 600 on an E level test does not indicate that
> the student is at an ASE II level.
>
> Additionally, the User's Manuals for both 7/8 and 9/10 contain item
> analyses for each question for both the complete and survey versions.
> Using these tools appropriately, the instructor can get an idea of some
> of the skills in which a student may be weak. There are even sample
> lesson

=== message truncated ===




-------------------------------
National Institute for Literacy
Assessment mailing list
Assessment at nifl.gov
To unsubscribe or change your subscription settings, please go to
http://www.nifl.gov/mailman/listinfo/assessment




