National Institute for Literacy
 

[Assessment 285] Re: Pending questions and discussion

Eastland, Julie jeastland at ETS.ORG
Mon Apr 3 09:54:48 EDT 2006


Hi Everyone,
In response to the questions below:



> What would be the cost:
> * For teachers' time to learn how to conduct the assessments?
> * For the assessment instruments?


The tests are designed to be administered online, directly with the
students. The teachers would only need to be available to get the
students started with the tests, so we believe that the time and cost
for these activities would be minimal. I believe that we would be able
to determine these costs more precisely during the pilot stage.


> Also, the description of a state's involvement includes attending
> development meetings and piloting all the tests - how much time is
> involved in completing these tasks?

The states would need to attend three to four meetings during the
pilot stage (over a two-year period): one focused mainly on item
development and the development of score reports, one on pilot
implementation, one on a standard-setting process to map the tests to
NRS levels, and one at the end of the pilot to discuss the results.
The meetings would be 2-3 days each.



> And what might this be in terms of additional cost per student? In
> other words, if a State Director of Adult Education were asked by a
> legislator how much it would cost to fully implement EFF assessments,
> what would be the answer? What would the additional cost per student
> be, recognizing that it would vary for states with larger or smaller
> numbers of students?

The tests are planned to cost $10 each, and there will be two forms of
each test for pre- and post-testing purposes. I think that the cost per
student would vary by state, depending on the number of students
tested, the tests they take (Read with Understanding, Reading
Components, Using Math to Solve Problems), and the number of times they
are tested.
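
To make that arithmetic concrete, here is a minimal sketch of how a
program might estimate its per-student cost, assuming the $10-per-test
figure above and that each selected test is administered twice (once
with the pre-test form and once with the post-test form). The function
and the example numbers are illustrative only, not an official ETS
pricing model.

    # Hypothetical estimate: the $10 figure comes from the message above;
    # everything else is an assumption for illustration.
    COST_PER_ADMINISTRATION = 10  # dollars per test administration

    def estimated_cost_per_student(tests_selected, administrations_per_test=2):
        """Estimate the assessment cost for one student.
        administrations_per_test=2 covers one pre-test and one post-test form."""
        return COST_PER_ADMINISTRATION * administrations_per_test * len(tests_selected)

    # Example: a student taking two of the three assessments, pre- and post-tested
    print(estimated_cost_per_student(
        ["Read with Understanding", "Using Math to Solve Problems"]))  # -> 40 dollars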

> And my question about the costs is, what is the value added? What will
> learners and instructors receive that will inform future learning, that
> other assessments don't already provide? What will program managers
> receive that will inform program design and funding allocations, beyond
> what current assessments and evaluations already provide? What will
> legislators and other funders learn that will benefit future funding,
> that they are not already learning from adult education advocates?
> And, what will be reportable to the NRS that is not already reported?

We believe the value added will take many forms. These assessments
will:

* be based on current research in the areas of reading and numeracy
skills;
* provide diagnostic information, especially with the reading
components measures, which can be used to profile adult learners,
inform them of their progress, and guide instructional planning;
* allow measurement of gains at the lowest skill levels;
* map to existing standards-based professional development and
curricula;
* use open-ended tasks based on authentic materials selected from adult
roles and contexts;
* offer an integrated system which administers, scores, and analyzes
responses in real time and which can be used in conjunction with
existing state information management systems;
* be linked to the National Adult Literacy Survey, the International
Adult Literacy Survey, and the Adult Literacy and Life Skills Survey,
making it easy for policy makers and program managers to make
connections to social and economic benchmarks and to track changes over
time; and
* predict student success in educational and workforce environments.

Score reports will be designed with input from literacy practitioners
to better reflect the kinds of information that will be useful to test
takers, teachers, and policy makers.


The discussion about this project has been quite informative, and I
hope that you will contact me directly if you are interested in
participating.

Best regards,

Julie K. Eastland
Program Administrator
Center for Global Assessment
Educational Testing Service
Rosedale Rd.
Princeton, NJ 08541
jeastland at ets.org





________________________________

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov]
On Behalf Of Marie Cora
Sent: Thursday, March 30, 2006 11:25 AM
To: assessment at nifl.gov
Subject: [Assessment 280] Pending questions and discussion



Hi everyone,



There is clearly a series of questions here that we need to address.
Some of them Julie and the ETS folks can respond to; others are
probably for us to discuss further, and they need the attention of
policy-makers (local and otherwise).



Julie - I wonder if you might respond to these questions posed by David
Rosen:



> What would be the cost:
> * For teachers' time to learn how to conduct the assessments?
> * For the assessment instruments?



> Also, the description of a state's involvement includes attending
> development meetings and piloting all the tests - how much time is
> involved in completing these tasks?



I'm unsure who might respond to this next series of David's questions -
anyone?




> And what might this be in terms of additional cost per student? In
> other words, if a State Director of Adult Education were asked by a
> legislator how much it would cost to fully implement EFF assessments,
> what would be the answer? What would the additional cost per student
> be, recognizing that it would vary for states with larger or smaller
> numbers of students?




As for value added and present costs - Howard, you asked some questions
related to how this might be better than what we have now. I think
these are questions for Julie, the ETS folks, and Jim Austin and other
pilot participants to respond to, if they can:



Howard Dooley wrote:

> And my question about the costs is, what is the value added? What will
> learners and instructors receive that will inform future learning, that
> other assessments don't already provide? What will program managers
> receive that will inform program design and funding allocations, beyond
> what current assessments and evaluations already provide? What will
> legislators and other funders learn that will benefit future funding,
> that they are not already learning from adult education advocates?
> And, what will be reportable to the NRS that is not already reported?



But Howard, I will respond to you on these two questions from my
personal point of view:



Howard wrote:

> Aren't other assessments already aligned with EFF enough to provide
> the information each stakeholder needs to make their decisions?




Marie: I would say no, there aren't. I also question "aligned enough"
- isn't that exactly what we want to get away from? This Discussion
List had quite a conversation on standards a couple of weeks ago, and
that discussion clearly expressed people's frustrations with the lack of
national standards and aligned curriculum and assessment.




Howard: Isn't the current mix of formal and informal, standardized and
local, performance and objective assessments essentially doing the
necessary tasks (leaving room for continuous improvements, of course)?



Marie: Yes, I would say that it is - but at what *cost*? The intense
juggling that programs and states must do in order to meet all the
demands they face is taxing, to put it mildly. Wouldn't it be better if
we had a system in which the pieces fit together so seamlessly that
there wouldn't be any juggling to do?



I invite everyone to join this discussion further. Let us know your
thoughts, share your answers to any of the above questions, or ask us a
question of your own.



Thanks,



marie cora

Assessment Discussion List Moderator







