
[Assessment 1937] DIBELS NOT NORMED for Adults

Stephanie Moran

stephanie at durangoaec.org
Thu May 28 15:42:33 EDT 2009


Bob's points are one reason why we can't use DIBELS in CO as it stands: our
tests must be standardized and normed (CASAS, TABE, and BEST are
acceptable).



From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of Hughes, Robert
Sent: Thursday, May 28, 2009 10:25 AM
To: The Assessment Discussion List
Subject: [Assessment 1933] Re: DIBELS



A year or so ago, I decided to see if DIBELS could be adapted to adult ed
settings. I contacted the researchers who designed it at the University of
Oregon (https://dibels.uoregon.edu/) to see what they thought. They seemed
to think that the measures would be appropriate, and I agree with John's
assessment below that there could be some uses.



Here's the rub, though. The way that CBMs like DIBELS work is that they
rely on input from large numbers of users to generate norms that teachers can
use to assess individual students. This works well because DIBELS is
gathering scores from a wide range of users from all over the country. The
number and diversity of users provides a natural sampling that yields a
fairly accurate and constantly updated norming process. DIBELS is,
therefore, normed to the K-12 population pretty well.
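
To make that norming idea concrete, here is a minimal sketch, assuming a pool
of hypothetical words-correct-per-minute scores (the numbers and percentile
choices are invented for illustration, not DIBELS data), of how scores pooled
from many users can be turned into percentile norms and used to place an
individual student:

    import numpy as np

    # Hypothetical pooled scores for one grade level and benchmark period,
    # as if reported by many classrooms around the country.
    pooled_scores = np.array([12, 18, 25, 31, 40, 44, 52, 57, 63, 71, 85, 90])

    def percentile_norms(scores, percentiles=(10, 25, 50, 75, 90)):
        """Score cut points at the requested percentile ranks."""
        return {p: float(np.percentile(scores, p)) for p in percentiles}

    def percentile_rank(scores, student_score):
        """Percent of the norming pool scoring at or below the student."""
        return 100.0 * float(np.mean(scores <= student_score))

    print(percentile_norms(pooled_scores))
    print(percentile_rank(pooled_scores, 44))   # one student's relative standing

The larger and more diverse the pool of contributed scores, the more stable
those cut points become; the point above is that no comparable pool exists yet
for adult learners.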



There isn't a category for entering adult learner scores into DIBELS, and
that needs to be added before it can be appropriately normed. And rather
than grade-level norms, someone would have to generate norms that are more
closely aligned to the levels we use in adult ed. My brief discussion with the
DIBELS folks suggests that they aren't averse to doing this -- but that
people haven't approached them with the request. I'm guessing that if
enough people start contacting them, they might respond.



Bob H.



Bob Hughes, Ed.D.
Associate Professor of Adult Education
Seattle University
410 Loyola Hall

901 12th Ave

PO Box 222000, Seattle WA 98122

Ph: 206-296-6168
E-mail: rhughes at seattleu.edu



_____

From: assessment-bounces at nifl.gov on behalf of Sabatini, John
Sent: Thu 5/28/2009 6:42 AM
To: The Assessment Discussion List
Subject: [Assessment 1929] Re: DIBELS

Hi,



I'd also recommend the following references for thinking about how to assess
fluency with adult learners. The first two are from the 4th grade special
studies of oral reading conducted by NAEP. The reason to look at them is to
see how the authors constructed the fluency/prosody/expressiveness subscale
and to understand a bit about the distinctions between rate (words per
minute), accuracy (percentage correct), and words correct per minute. As the
Wayman report points out, 4th grade is a key developmental year for the
strength of the relationship between oral reading and comprehension in
children, and the national sampling is sound. The Wayman article introduces
all the variations of oral reading tasks and what aspects might matter in
choosing one. One can also read nearly anything by Tim Shanahan.
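
For readers who want the arithmetic behind those three terms, here is a small
sketch (the passage length, error count, and time are made-up numbers, not
values from the NAEP studies) of how rate, accuracy, and words correct per
minute are typically computed from a single timed oral reading:

    # Hypothetical result for one timed oral reading.
    words_in_passage = 250     # words the reader attempted
    errors = 12                # words read incorrectly or skipped
    seconds_elapsed = 180      # reading time

    minutes = seconds_elapsed / 60
    words_correct = words_in_passage - errors

    rate_wpm = words_in_passage / minutes          # rate: words per minute
    accuracy = words_correct / words_in_passage    # accuracy: proportion correct
    wcpm = words_correct / minutes                 # words correct per minute

    print(f"rate = {rate_wpm:.1f} wpm")            # 83.3 wpm
    print(f"accuracy = {accuracy:.1%}")            # 95.2%
    print(f"wcpm = {wcpm:.1f}")                    # 79.3

Note how words correct per minute folds rate and accuracy into a single
number, while the first two stay separate.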



DIBELS has been an exemplar of the curriculum-based measurement (CBM) approach.
The goal of that research had been to use fluency-type measures as a proxy
for predicting reading comprehension. Interestingly, the focus has been
less on the subgoal/subskill of improving children's reading fluency itself.
The DIBELS technical reports still provide some useful benchmarks for thinking
about the development of reading rate and fluency, but as the previous post
notes, be cautious about applying any of those rules as-is with adults. The
developers do continue to improve the technical aspects of the measures.



Of course, we continue to recommend you look at the NCES Basic Skills report
that was just published, as we gave a national sample of some 19,000 adults
two passages -- one at about a 2nd-6th grade level and another at a 7th-8th
grade level. While we cannot at present create a normative scale for those
particular passages, as we develop further reports the results can be a
guide to expectations for adult readers. Our research team here is also
conducting research on adult reading fluency, though we don't have
particular assessments to recommend at this time. Hopefully, we'll have more
helpful reports out there for you soon.



I think one of the main purposes of reading fluency assessment with adults
is to monitor the improvement of accuracy, rate, and
fluency/prosody/expressiveness (I think referred to here as chunking for
syntax and grammar) over time with texts of increasing challenge. So the point
is repeating the activity over time and recording rate, accuracy, and ease
to see whether there is improvement. I don't trust readability formulas for
equating texts: don't expect any two texts with the same readability index to
be of equal difficulty in terms of reading rate for any adult. However, adults
and most readers are roughly consistent in their reading rates across a
relatively wide variety of texts, until the texts get so difficult that the
individual is struggling with every word. I actually prefer picking texts that
are easy relative to the adult's word reading ability when monitoring
continuous text reading fluency. There are separate measures one can use for
word recognition and decoding.
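
As one way to picture the record keeping John describes, here is a minimal
sketch (the session dates, passages, and scores are hypothetical, and the
trend check is just one simple choice) of logging repeated readings and
checking whether rate and accuracy are improving:

    from dataclasses import dataclass

    @dataclass
    class Session:
        date: str
        passage: str      # kept easy relative to the reader's word reading ability
        wcpm: float       # words correct per minute for that reading
        accuracy: float   # proportion of words read correctly

    # Hypothetical log of repeated readings over several weeks.
    log = [
        Session("2009-04-06", "passage A", 78.0, 0.93),
        Session("2009-04-20", "passage A", 85.0, 0.95),
        Session("2009-05-04", "passage B", 88.0, 0.96),
        Session("2009-05-18", "passage B", 94.0, 0.97),
    ]

    def improving(sessions, field):
        """True if the value never drops from one session to the next."""
        values = [getattr(s, field) for s in sessions]
        return all(later >= earlier for earlier, later in zip(values, values[1:]))

    print("rate improving:", improving(log, "wcpm"))
    print("accuracy improving:", improving(log, "accuracy"))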



Finally, McShane's report applies this to adults.



John





Daane, M. C., Campbell, J. R., Grigg, W. S., Goodman, M. J., & Oranje, A.
(2005). Fourth-grade students reading aloud: NAEP 2002 special study of oral
reading (No. NCES 2006-469). Washington, DC: U.S. Department of Education,
Institute of Education Sciences, National Center for Education Statistics.

Pinnell, G. S., Pikulski, J. J., Wixson, K. K., Campbell, J. R., Gough, P.
B., & Beatty, A. S. (1995). Listening to children read aloud: Data from
NAEP's Integrated Reading Performance Record (IRPR) at grade 4 (No.
NAEP-23-FR-04; NCES-95-726). Princeton, NJ: Educational Testing Service.

Samuels, S. J. (2006). Toward a model of reading fluency. In S. J. Samuels &
A. E. Farstrup (Eds.), What research has to say about fluency instruction
(pp. 24-46). Newark, DE: International Reading Association.

Wayman, M. M., Wallace, T., Wiley, H. I., Ticha, R., & Espin, C. A. (2007).
Literature synthesis on curriculum-based measurement in reading. The Journal
of Special Education, 41(2), 85-120.

McShane, S. (2005). Applying research in reading instruction for adults:
First steps for teachers. Washington, DC: National Center for Family
Literacy, National Institute for Literacy.





_____

From: assessment-bounces at nifl.gov [mailto:assessment-bounces at nifl.gov] On
Behalf Of SALandrum at aol.com
Sent: Thursday, May 28, 2009 6:53 AM
To: assessment at nifl.gov
Subject: [Assessment 1926] Re: DIBELS

I may be wrong, but I don't think it has been normed for adults. Below is an
excerpt from their webpage.



The DIBELS were developed as criterion-referenced measures, but national norms
have also been developed.

DIBELS are criterion-referenced because each measure has an empirically
established goal (or benchmark) that changes across time to ensure students'
skills are developing in a manner predictive of continued progress. The
goals/benchmarks were developed following a large group of students in a
longitudinal manner to see where students who were "readers" in later grades
were performing on these critical early literacy skills when they were in
Kindergarten and First grade so that we can make predictions about which
students are progressing adequately and which students may need additional
instructional support. This approach is in contrast with normative measures
which simply demonstrate where a student is performing in relation to the
normative sample, regardless of whether that performance is predictive of
future success.

For your convenience, district-level norms or percentiles are generated at
each benchmark data collection period so schools/districts can make
decisions about student performance in relation to the local context of
students who have received, generally, the same type of instructional
experiences. National norms, generated with all the students in the DIBELS
Data System as of 2002, are also posted within the Technical Reports section
of the website in Technical Report #9.

You can see how the benchmark goals are used by going to our Technical
Reports <https://dibels.uoregon.edu/techreports/index.php> page and
downloading the following report:

Good, R. H., Simmons, D. S., Kame'enui, E. J., Kaminski, R. A., & Wallin, J.
(2002). Summary of decision rules for intensive, strategic, and benchmark
instructional recommendations in kindergarten through third grade (Technical
Report No. 11). Eugene, OR: University of Oregon.
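
To illustrate the distinction the webpage is drawing, here is a minimal sketch
(the benchmark goal, category labels, and norming sample are invented for
illustration and are not DIBELS decision rules): a criterion-referenced
interpretation compares a score to a fixed, empirically set goal, while a
norm-referenced interpretation only locates the score within a sample:

    # Criterion-referenced: compare the score to an empirically established goal.
    BENCHMARK_GOAL = 40.0   # hypothetical goal for one measure and period

    def criterion_referenced(score, goal=BENCHMARK_GOAL):
        return "at or above benchmark" if score >= goal else "may need additional support"

    # Norm-referenced: report standing relative to a norming sample.
    norming_sample = [22, 28, 33, 37, 41, 45, 48, 54, 60, 66]   # hypothetical

    def norm_referenced(score, sample=norming_sample):
        rank = 100 * sum(s <= score for s in sample) / len(sample)
        return f"{rank:.0f}th percentile of the sample"

    print(criterion_referenced(38))   # tied to a goal meant to predict later progress
    print(norm_referenced(38))        # relative standing only; says nothing about
                                      # whether that standing predicts later success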



Susan Landrum
Certified Barton Tutor
Central Georgia Technical College
slandrumcgtcedu at gmail.com



In a message dated 5/28/2009 6:42:03 A.M. Eastern Daylight Time,
jmarrapodi at applestar.org writes:

I'm going out on a limb here.

Lots of folks in the K-5 world use DIBELS (https://dibels.uoregon.edu/) for
reading assessment in the primary grades. It is fairly granular. Is there
any history of, or applicability for, use with adult low-literacy learners?
It's fairly intensive to learn to administer, but it does measure a lot of the
subskills we are looking at: alphabetics, fluency, comprehension, and
vocabulary. In the teacher discussions on teachers.net, one of the complaints
was the timing of the tasks for young children, which I can see could create
undue stress. Often elementary materials are problematic for adults, but this
one comes well researched.



I'm just wondering about it, so I thought I'd toss it into the mix this week
to see what you all thought.





Jean Marrapodi, PhD, CPLP

teacher by training, learner by design
jmarrapodi at applestar.org
mobile: 401.440.6165
www.applestar.org
















