National Institute for Literacy
 

[Assessment 742] Re: Data quality and usefulness

Mary Beheler mbeheler at cabell.lib.wv.us
Wed Apr 18 10:50:29 EDT 2007


We have no ESL students. They are all basic literacy students,
primarily at FFL levels 1-4. We use the Life Skills assessment because the
one for Employment Skills is just too irrelevant to our many disabled and
retired students. I have not seen the new series, but the term "work" makes
me suspect it might bring up some of the same issues. And right now both the
time and money budgets are too tight to experiment.

The map question, which does *not* include a note that X means "you are
here," has baffled many students, not just the excellent ones. I hate to
"teach to the test," but now I try to remind our tutors to mark a "You are
here" X on at least one of the maps they use in practice. It is just one of
those things you either know or you don't. ("Everything is intuitive, once
you know how," is one of my favorite quotes about learning yet another
computer application.)

I wanted to see which questions our students missed most, and over the years
the map question with the unmarked X has been prominent, though other map
questions were not. (The one with the left-pointing north arrow is the
second most-missed map question.)
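
One way to keep such a tally, by hand or with a few lines of code, is simply
to count misses per question number across answer sheets. A minimal sketch in
Python, with made-up data in place of our actual records and the real CASAS
item numbers:

from collections import Counter

# Each inner list holds the question numbers one student missed
# (pretend question 7 is the unmarked-X map item).
missed_by_student = [
    [7, 12, 19],
    [7, 3],
    [7, 12, 25],
    [5, 7],
]

# Count how often each question was missed, most-missed first.
miss_counts = Counter(q for misses in missed_by_student for q in misses)
for question, count in miss_counts.most_common():
    print(f"Question {question}: missed {count} times")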

The person who did worse after I began comparing his answers to the correct
master is a very artistic student who once just filled in the bubbles on
the answer sheet in a pretty pattern. (I had to have a serious talk with him
about that.)

Scaled scores are derived from the raw score. If one is low, so is the
other. I used the term "doing so poorly" because I knew he had previously
answered more than half the questions correctly on a parallel assessment.
This time I was seeing very few correct answers, even at the very beginning
of the assessment, where the questions tend to be easier for most students.
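
To illustrate the point (the conversion table below is invented for this
example, not the real CASAS table): the scaled score comes from the raw score
through a table that only goes up, so the two cannot move in opposite
directions.

# Invented raw-to-scaled conversion table, for illustration only;
# the real CASAS tables differ by test form and are not shown here.
RAW_TO_SCALED = {0: 180, 5: 195, 10: 205, 15: 212, 20: 218, 25: 224, 30: 230}

def scaled_score(raw):
    """Return the scaled score for the nearest raw score at or below raw."""
    usable = [r for r in RAW_TO_SCALED if r <= raw]
    return RAW_TO_SCALED[max(usable)] if usable else min(RAW_TO_SCALED.values())

# Because the table never goes down, a low raw score always gives
# a low scaled score, and a high raw score a high one.
print(scaled_score(3))    # 180
print(scaled_score(28))   # 224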

Sometimes he is fully engaged in what he is doing and we get a good
assessment. When he is in a "let's get this over with" mood, anything can
happen. The day he did better with the wrong answer sheet was a day he was
guessing at random. Even so, he marked enough of them correctly for his
scaled score to be in the valid range. However, the ones he hit correctly
were scattered all over the place. As a human being, I could see the
randomness. I just junked what he'd done that day and gave him a different
assessment a couple of months later, when his attitude was more suitable.
I'm glad the random marking didn't happen when we were up against the
end-of-fiscal-year deadline!
Mary G. Beheler
Tri-State Literacy
455 Ninth Street
Huntington, WV 25701
304 528-5700, ext 156

-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of Dan Wann
Sent: Tuesday, April 17, 2007 9:56 PM
To: 'The Assessment Discussion List'
Subject: [Assessment 739] Re: Data quality and usefulness


Mary,



Where did the valid score place the excellent student who was doing
poorly? Since we work from the scale score and not the number correct, I do
not know how to interpret "doing so poorly." When working with teachers
and students, we try to place an emphasis on where the student falls on
the scale or NRS level. In training with teachers, we always stress that
students are not a single level but have a range of strengths and
needs, and we discuss how to present this information to the student.



I agree that a missed question does not tell us why the student was not
able to answer it correctly; however, taking "multiple measures" of student
performance over time, with a variety of formats, tells me whether the
student has the concept and whether the student can transfer what is
known to different contexts.



In the example cited below, in most exercises that use maps the X is also
marked with words such as "you are here." It would seem that a student who
reads the X as "X marks the spot" rather than "X, you are here" may have a
reading/literacy problem. As a teacher, I would watch that student read
directions and try to perform tasks. Does the student ask others to help
explain the task? Does the student ask me, the teacher, for help? Observing
the student's behavior and patterns of work is part of the assessment
process, and from your other posts that seems to be what you do in
your program, with such a variety of learners at vastly different levels.



The last point you make, about application forms when the student does not
intend to seek employment, presents a challenge that we always face. Even if
it is not an application for work, all ESL students are faced with filling
out forms, and as a teacher I want students to learn to "transfer" knowledge,
so I think I might use your quote "learn to read and then read to learn."
That is, I would use the employment application form as a transition to other
forms; filling out forms is the concept I am teaching, along with helping
students understand that there are many forms they will have to fill out in
English and that forms have certain questions and vocabulary in common. That
way there is a relationship between the assessment and what is taught. The
context of the question is not as important as the ability to read and fill
out forms. I explain to students that the employment form itself is not
important for them, but I would try to brainstorm with them about when they
might need to help someone with an employment form.



Since you mention using the CASAS Life Skills test, you might want to look at
the new tests CASAS has developed, such as Life and Work. If your state
has other approved assessments for ESL, then you might look at those to see
whether one is a better match for your curriculum. I have found that the
CASAS Literacy tests and Level A reading tests are very sensitive instruments
for tracking lower-level students' learning gains.



Dan Wann

Professional Development Consultant

Indiana Adult Education Professional Development Project



dlwann at comcast.net








------------------------------------------------------------------------------

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of Mary Beheler
Sent: Tuesday, April 17, 2007 5:58 PM
To: The Assessment Discussion List
Subject: [Assessment 737] Data quality and usefulness



Does GIGO (garbage in, garbage out) apply to the data all of us, big and small, are gathering?



The information gleaned from the CASAS Life Skills (or any other)
assessment tool can be useful for spotting problems and successes, but
because of the *multiple layers of skills involved in answering any one
question*, the specific question(s) missed must be looked at very carefully.
But getting the student feedback that makes looking carefully possible runs
into problems of assessment confidentiality.



Example: An excellent student may understand *everything* else about a
certain CASAS map question, but if he or she has heard the expression "X
marks the spot" and assumes X means the goal, not the beginning, this
question will be nonsense. I don't think the check-off sheet of demonstrated
skills specifies, "Knows that on *this* map, X means 'Start here.'" If the
student's tutor or I can't discuss a missed question with the student, how
will I discover that?



How do we know that the question a student answers or misses actually
assesses the skill the assessment manual tells us it does?



I was scoring an answer sheet and was dismayed at how poorly a student was
doing, when I noticed I was using the math answer key, not the reading one. I
switched to the correct set, and the student made an even worse score! Both
scores were in the "valid" range, too! How much confidence should I place in
that assessment?



Having the questions in a booklet and marking the answers to multiple-choice
questions on a separate sheet may be a skill even somewhat advanced
adult students do not have. Because our student workbooks don't use
multiple-choice questions, we have actually created lists of number-letter
pairs to see if a student can mark the letter in the appropriate column of
each numbered row on a separate answer sheet. That's all. (We did that after
a strong level 2 student marked the answer sheet by page number, not question
number, with answers to two or three questions marked in the same row.)
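
A list like that takes only a few lines to generate. Here is a rough sketch;
the layout and answer choices are invented for illustration, not our actual
practice sheets:

import random

LETTERS = "ABCD"

def practice_list(num_rows, seed=0):
    """Build (row number, letter) pairs for answer-sheet practice."""
    rng = random.Random(seed)
    return [(i, rng.choice(LETTERS)) for i in range(1, num_rows + 1)]

pairs = practice_list(10)

# The list the tutor reads from: "1. C", "2. A", ...
for number, letter in pairs:
    print(f"{number}. {letter}")

print()

# A blank grid the student fills in by marking the matching letter
# in the matching numbered row of a separate sheet.
print("    " + "  ".join(f" {l} " for l in LETTERS))
for number, _ in pairs:
    print(f"{number:>2}  " + "  ".join("( )" for _ in LETTERS))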



Has anyone made an effort to see if lower-level literacy students want
to learn what the CASAS or other accepted NRS assessments want us to teach?
Lots of our students are on SSI. They don't see the point in learning about
employment applications. That often means any question about employment
forms isn't important enough to take seriously, even if the ones about other
forms are.



Colleges have sense enough to make all freshman-year classes pretty
generic and leave the "major" study to later years. Is it useful for
assessment of beginning literacy to get so specific so soon? Whatever
happened to "Learn to read; then read to learn"?



We used to use the quick and unintimidating SORT-R for student assessment.
Even on that simple test, almost all men missed the word "dainty," no matter
what their reading level. Does anyone know if any specific question(s) on the
assessments used for gathering NRS data are answered incorrectly by most
students at any one level? Or if a significant number of students in the
Laubach series miss different questions than students in Challenger or
Voyager, or another series?



Mary G. Beheler
Tri-State Literacy
455 Ninth Street
Huntington, WV 25701
304 528-5700, ext 156

-----Original Message-----
From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of Marie Cora
Sent: Monday, April 16, 2007 1:29 PM
To: Assessment at nifl.gov
Subject: [Assessment 714] Just joining us? Here's what you need to
know...

Hi folks,



A number of subscribers have just joined us and so I would like to give
them the necessary info for joining our discussion. Please post your
questions and share your experiences now!



View the archives at:
http://www.nifl.gov/pipermail/assessment/2007/date.html to get up to date
with the current conversation.



See suggested resources at:
http://www.nifl.gov/lincs/discussions/assessment/07program_impr.html
(Scroll to the bottom!!)

See more resources at:

For the 3 power points from New York State/LAC, click on these links:

http://www.nifl.gov/lincs/discussions/list_docs/ReportCardRubric.ppt

http://www.nifl.gov/lincs/discussions/list_docs/RollingOutReportCard.ppt

http://www.nifl.gov/lincs/discussions/list_docs/DevelopingDisseminatingReportCards.ppt

Here are your prompts; add your voice!

* Do you use data in your program? What type? How? What have been the
results?

* What information (data) would you like to track and why?

* What data would you like to learn how to use?

Thanks!!



Marie Cora







Marie Cora

marie.cora at hotspurpartners.com

NIFL Assessment Discussion List Moderator

http://www.nifl.gov/mailman/listinfo/assessment

Coordinator, LINCS Assessment Special Collection

http://literacy.kent.edu/Midwest/assessment/





