
Results from the 2002 National Assessment of Educational Progress (NAEP) writing assessment

Dr. Peggy G. Carr Hello, and welcome to today's StatChat on the NAEP 2002 writing results for the nation and the states. I hope you've had time to look at the results on the website. I'm sure that you have many questions regarding today's release, so let's get right to them...

Debbie from Austin, Texas asked:
Will the writing prompts be made public today? If so, where will they be on the website?
Dr. Peggy G. Carr: Three of the writing prompts used at each grade have been released and are available at the NAEP website, http://nces.ed.gov/nationsreportcard/itmrls/. Included with the sample prompts are scoring guides and sample student responses, with explanations of how they were scored. The remaining prompts used in 2002 are kept secure so that we can use them in future writing assessments.

Marissa from Greenbelt, MD asked:
1. Does NAEP provide individual scores for kids? 2. Was the assessment compulsory? Did all states participate?
Dr. Peggy G. Carr: NAEP does not provide scores for individual students. NAEP is designed so that each student takes only a portion of the test; therefore, it is not possible for NAEP to report individual scores, only group or subgroup performance. NAEP is a voluntary assessment. The quality of the results, however, depends on the participation of sampled students. States volunteer, and for the 2002 writing assessment, 45 states and five jurisdictions participated at grade 4, and 44 states and six jurisdictions participated at grade 8. However, some states that participated did not meet minimum participation standards for reporting: two at grade 4 and three at grade 8.

Ann from Austin, Texas asked:
On what date did students take this test?
Dr. Peggy G. Carr: Students took the test during the last week of January through the first week of March in 2002.

Hui from Iowa City, Iowa asked:
We have results for fourth-grade students from Iowa, but we don't have results for eighth-graders. Was it because the eighth-graders did not participate? Do we do anything to encourage participation?
Dr. Peggy G. Carr: In 2002 Iowa chose not to participate at the 8th grade level. For more information about the reasons for this decision, contact Dianne Chadwick, the Iowa NAEP State Coordinator. She can be reached by email at Dianne.Chadwick@ed.state.ia.us.

John from Pennington, NJ asked:
How many students are tested in each state? How are they chosen?
Dr. Peggy G. Carr: The target sample of students at each grade in each state for each subject is 2500. It is a scientifically stratified random sample that is representative of the population at that grade in that state.
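
For readers curious what a stratified random sample looks like in practice, here is a minimal illustrative sketch in Python using simple proportional allocation. The strata names, sizes, and allocation rule are hypothetical assumptions for illustration; NAEP's actual design is a more complex multi-stage, weighted sample.

    import random

    def stratified_sample(strata, total_n=2500, seed=42):
        """Draw roughly total_n students, allocating slots to each stratum in proportion to its size."""
        rng = random.Random(seed)
        population = sum(len(students) for students in strata.values())
        sample = []
        for name, students in strata.items():
            n = round(total_n * len(students) / population)  # proportional allocation
            sample.extend(rng.sample(students, min(n, len(students))))
        return sample

    # Hypothetical strata: student IDs grouped by a made-up region variable
    strata = {"urban": list(range(0, 60000)),
              "suburban": list(range(60000, 140000)),
              "rural": list(range(140000, 200000))}
    print(len(stratified_sample(strata)))  # about 2,500 sampled students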

Tamara from Philadelphia, PA asked:
As you know, the student population of Philadelphia (or any other big city) is very different from the rest of the state. Does NAEP give results on (inner) cities? Is there a way to make comparisons between cities?
Dr. Peggy G. Carr: Until 2002, NAEP provided results only for the nation and participating states. In 2002, a trial urban district assessment was administered at grades 4 and 8, and the results for both reading and writing will be released on July 22, 2003, at 10 a.m. Five districts participated in this trial assessment: Atlanta, Chicago, Los Angeles, Houston, and New York. (Note that the District of Columbia has traditionally participated as part of the state program.) The trial urban district assessment has been expanded in 2003 to include nine districts; those results, for the reading and math assessments, will be released later this year.

Sandy from Arlington, VA asked:
How clearly do you feel that the reading results and the writing results can be linked? Does the data show that kids who read more write better?
Dr. Peggy G. Carr: None of the students take both the reading and writing assessments, so it is not possible to use student data to answer your question. You could use the NAEP data tool on the web (nces.ed.gov/nationsreportcard/naepdata) to see if the highest scoring states in reading are also the highest in writing. It is possible that states with a strong curricular emphasis on language arts did well on both NAEP reading and writing.

Mary from Northampton, MA asked:
How has NAEP changed with the advent of the No Child Left Behind legislation?
Dr. Peggy G. Carr: With the advent of the No Child Left Behind legislation, NAEP reading and math at grades 4 and 8 will be conducted every two years starting in 2003. States that receive Title I funding are required to participate in these assessments.

Michelle from Alberta, Canada asked:
I have just recently attended a workshop at the University of Calgary where your National Writing Project was featured as having over 175 sites throughout all of the states in the U.S.A. What kind of impact do you think this program has had on the success of students' writing? It was also stated that over 2,000,000 teachers have been inserviced by the National Writing Project. Are there any statistics on how many teachers are still implementing the practices that were taught during these summer institute sessions?
Dr. Peggy G. Carr: While the National Writing Project is not a NAEP program, NAEP does collect information through background questionnaires about teacher practices and student classroom writing activities. This information can tell you what's going on in classrooms. I also suggest you check out the National Writing Project website, at www.writingproject.org.

Peter from Newtown, PA asked:
As internet-based automated essay scoring engines, like Intellimetric from Vantage Learning, continue to outperform human scorers in consistency, reliability, and speed, does NAEP foresee the integration of this type of technology in upcoming pilots or tests? If not, what are the main concerns?
Dr. Peggy G. Carr: As part of a NAEP research project, we've explored online assessments and automatic scoring of math and writing. While delivery and student experiences were successful, automatic writing scoring will require further refinement to be workable for scoring responses from a nationally representative sample of students, where there are wide ranges and varieties of response types.

Judith from Spencer, Iowa asked:
Is there information concerning the number of students at each grade level from each state who were given the writing assessments?
Dr. Peggy G. Carr: The number of students assessed at each grade from each state is provided in the procedural appendix (Appendix A) of the full report card, The Nation's Report Card: Writing 2002. A PDF of the report card is available on the web at nces.ed.gov/nationsreportcard/writing/results2002.

Lisa from Hawaii asked:
Does the NAEP include background questions on the use of computers? Kids are writing more using computers, from papers to emails and instant messaging. Does this help or hurt their writing?
Dr. Peggy G. Carr: NAEP data can't address causal questions like yours, but the common belief among professional writers is that the use of computers makes revision and editing of drafts easier, resulting in better writing. If you look at the NAEP data tool (nces.ed.gov/nationsreportcard/naepdata), you can search for "instructional content and practice" questions from the writing assessment background questionnaires. This will show you the scores of students who use computers in different ways in school.

Mary from Bryn Mawr, PA asked:
Aren't the lackluster high school results of various NAEP tests attributable largely to older students' disinterest in doing an anonymous test for which they receive no grade and no credit?
Dr. Peggy G. Carr: It has been true throughout the thirty-five year history of NAEP that students receive no grade and get no credit for taking the NAEP assessment. To be associated with declining scores, there would have to be a recent decline in motivation, and we have no evidence of this. The National Assessment Governing Board has appointed a commission to study the proper role of NAEP in high school. Motivation is among the factors they are considering in their deliberations.

Barbara from Edinboro, PA asked:
What is the most significant piece of data or insight that can be gained from the 2002 Writing results?
Dr. Peggy G. Carr: Barbara, this is a tough one! There is a wealth of information available in the NAEP data, and each person finds different insights from the results. However, I'll offer my personal top three (in no particular order): at grade 4, there were scale score improvements across all levels of student performance; at grade 8, where there is state data from 1998 and 2002, not one state had a statistically significant decline in average scale scores; at all grades, the top 10% of student writers had significant scale score gains.

Maria A. from Duarte CA asked:
Does the report break down state results by school district? I couldn't find it...
Dr. Peggy G. Carr: In the report released today, no school district results are provided. NAEP has traditionally only reported on performance for the nation and participating states. In 2002, a trial urban district assessment was conducted in reading and writing, at grades 4 and 8. See my answer to Tamara about this trial assessment. Don't forget that the results will be released on July 22nd!

Anna from Upper Montclair, NJ asked:
Who decides which subjects get tested in NAEP each year? What is the next year in which writing will be assessed?
Dr. Peggy G. Carr: The National Assessment Governing Board, as part of its policy oversight of NAEP, decides which subjects get tested in which year. The next NAEP writing assessment is scheduled for 2007.

Joan from New Hampshire asked:
Do you believe writing is integrated widely enough within all subject areas at secondary levels? Might the declining scores for seniors also reflect lesser investment in performing well for the test? Also, can you identify some public schools nationwide that are exemplars for effective teaching and integration of writing and who are willing to share their best practices with others? Thank you.
Dr. Peggy G. Carr: Curricular emphasis and structure are determined at the state and local education agency level. The curricular integration of writing almost certainly varies state by state, and specific information about this is best obtained from such agencies. The motivation and performance of 12th grade students on NAEP have been the subject of much speculation and interest. The National Assessment Governing Board (NAGB) has a commission currently active to provide more information regarding potential issues around motivation and performance on NAEP. NAEP does collect data from teachers and schools in addition to students, but results are provided only for the nation, states, and, as of July 22nd, for five urban districts that participated in the Trial Urban District Assessment. Resources and information relevant to your question might be available from state or local education agencies or organizations such as the National Writing Project. Also, you might consult the Department's website for a list of the nation's "Blue Ribbon" schools.

Scott from Solebury, PA asked:
Thank you for taking this question. Can you give your feelings for why you think the grade 12 results are showing a decline across years? Is it "motivation" (an increase in "senior syndrome" over the years), a true decrease in writing ability, a mix of the two, or something else (an increase in "test fatigue" over the years)?
Dr. Peggy G. Carr: See my response to Mary of Bryn Mawr, PA. I can add that there is rarely a single reason for score increases or decreases.

Amy from Hamburg, PA asked:
I was interested in the participation in NAEP. Is it voluntary? What does it mean when a state did not meet participation guidelines?
Dr. Peggy G. Carr: In 2002, participation in NAEP was voluntary. Beginning in 2003, states that receive Title I funds are required to participate in the math and reading assessments. NCES has established specific participation rate standards that states and jurisdictions were required to meet in order for their results to be reported. These standards are provided in the full report card The Nation's Report Card: Writing 2002, at nces.ed.gov/nationsreportcard/writing/results2002. See pages 176-178.

Jason from New York city, NY asked:
How are the essays scored for the writing assessment? How does NAEP ensure the reliability of the scores given by human raters?
Dr. Peggy G. Carr: NAEP uses scoring guides that describe six levels of writing performance: excellent, skillful, sufficient, uneven, insufficient, unsatisfactory. The guides are designed to measure the overall quality of response, with an emphasis on development, organization, sentence structure and word choice, and mastery of conventions of written English. Trained raters score students' responses to the writing tasks. In order to ensure reliability, the following steps are taken: raters are trained; cross-year consistency in rating is checked; inter-rater consistency is checked; supervisors monitor the scoring very closely; and raters are retrained when returning to scoring after a lunch or other break, to minimize scorer drift.
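
To make the inter-rater consistency idea concrete, here is a minimal illustrative sketch in Python of two simple agreement measures (exact and within-one-level agreement) on a six-level scale. The function name and the sample ratings are hypothetical; this is not NAEP's scoring software, and NAEP's actual reliability analyses are more extensive.

    # Six performance levels, ordered from lowest to highest
    LEVELS = ["unsatisfactory", "insufficient", "uneven",
              "sufficient", "skillful", "excellent"]

    def agreement(rater_a, rater_b):
        """Return (exact, within-one-level) agreement rates for paired ratings."""
        a = [LEVELS.index(r) for r in rater_a]
        b = [LEVELS.index(r) for r in rater_b]
        exact = sum(x == y for x, y in zip(a, b)) / len(a)
        adjacent = sum(abs(x - y) <= 1 for x, y in zip(a, b)) / len(a)
        return exact, adjacent

    # Hypothetical double-scored responses from two trained raters
    r1 = ["sufficient", "uneven", "skillful", "sufficient", "excellent"]
    r2 = ["sufficient", "sufficient", "skillful", "uneven", "excellent"]
    print(agreement(r1, r2))  # (0.6, 1.0): 60% exact, 100% within one level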

Pat from Columbia, MD asked:
Is there any relationship between NAEP writing and reading results? Or student writing and reading skills?
Dr. Peggy G. Carr: NAEP reading and writing assessments are given to different student samples and are summarized on different scales; therefore, this information is not available from NAEP data. Also, see my answer to Sandy, who asked a similar question.

Kimberly from Little Rock, AR asked:
Some educators have criticized the NAEP performance levels as being too demanding, with the bar for excellence set too high. I've looked at the definition for proficiency, but can you tell us if it would be accurate to say that a student performing at the proficient level is actually performing at grade level?
Dr. Peggy G. Carr: All writing prompts and scoring criteria are reviewed by classroom teachers and curriculum experts with grade-level expertise. This means that grade 4 and 8 students respond to prompts designed to match their specific grade level skills, and their responses are scored with criteria that match what students at each grade are able to do. Therefore, a student at grade 8 who is performing at the Proficient level should be demonstrating skills appropriate to the 8th grade, although it is worth noting that expectations of what students can do at each grade-level vary across states.

Scott from Solebury, PA asked:
Could you give a picture (at the school level) of how the data are actually gathered? For example, how are the kids actually chosen within a school? Who administers the tests (is it the classroom teacher)? How long does the test take? Can the kids opt out? What instructions are the kids given (in terms of attempts at "motivation" or discouragement of "cheating")?
Dr. Peggy G. Carr: Starting in 2002, a member of the contractor's staff administers all NAEP assessments. This staff member contacts the school in advance of the assessment day to ensure that all necessary information is correct and that arrangements are clear. On the day of the assessment, the staff member arrives at the school ahead of the assessment time, bringing all NAEP materials. Students are chosen to participate in the assessment in advance, and schools are aware of which students were selected. Students are taken to the assessment location chosen by the school and are given the assessment. Students respond to fifty minutes of subject area items and about fifteen minutes of background and school experience questionnaires. NAEP is always voluntary for students. Students may choose not to participate, to answer all questions, or to omit any questions they choose. Students at the same site are rarely, if ever, taking the same test booklet, so copying is difficult. Students are expected to respond to NAEP as they would to any other assessment.

Bruce from Newton,MA asked:
Why should states pay attention to NAEP results? It seems to me that State NAEP results would be affected by the degree to which State Frameworks and NAEP Frameworks overlap. To what extent are state NAEP results influenced by the alignment of the State Frameworks and NAEP Frameworks? Since states are being compared with each other, does NAEP offer a statistical correction for students' opportunity to learn (perhaps based upon the degree of overlap)?
Dr. Peggy G. Carr: State participation in state NAEP through 2002 has been voluntary. States have chosen to participate for a variety of reasons, including the ability to compare their state performance to the nation and to other states. The NAEP assessment frameworks in all subjects, available on the NAGB website (www.nagb.org), are developed with extensive input from a variety of stakeholders. As a result, the NAEP assessments take a broad perspective on content. NAEP is not aligned to any particular state's curriculum. The skills NAEP assesses are intended to be appropriate for the grade level and subject. NAEP does not make any adjustment to the student samples assessed for opportunity to learn or match of state and NAEP frameworks. While it is true that state populations differ, as do their experiences in state assessments, NAEP data are carefully analyzed so that results across states are reported on a single scale.

Max from Bangor, ME asked:
Why are the sample sizes for 2002 so much larger than in 1998?
Dr. Peggy G. Carr: I assume you are talking about the national samples, which were about 5,000-6,000 in 1998 and 132,000-139,000 in 2002 at fourth and eighth grade. The 1998 national sample was independent of all the state samples; due to changes in our procedures, in 2002 the state samples were counted as part of the national sample. In 1998, more than 100,000 students were assessed in the national and state samples in each grade, but the state samples were not counted as part of the national sample. The main factor in the sample size increase between 1998 and 2002 was NAEP's new capability to count students assessed in the states as part of the national sample.

Jane from Miami, FL asked:
Can results be compared across fourth and eighth grades? Do the scores mean the same thing at the different grades?
Dr. Peggy G. Carr: No. In NAEP writing, cross-grade comparison of results is not permitted. The NAEP writing scale for each grade ranges from 0 to 300, but the scales are not comparable across grades. Expectations and scoring criteria for the responses vary from grade to grade, and the scales for the different grades were developed separately. Consequently, the scores do not mean the same thing at different grades. Furthermore, the target proportions of tasks for the three writing purposes vary from grade to grade. For example, the grade 12 assessment includes more persuasive tasks than the grade 4 assessment.

Judith from Spencer, IA asked:
You stated earlier today that, "However, some states that participated did not meet minimum participation standards for reporting." What are these minimum participation standards for reporting?
Dr. Peggy G. Carr: There are five guidelines in the standards for state sample participation and reporting of results. For example, for public school results to be published, the weighted participation rate for the initial sample of public schools must be at least seventy percent. For the detailed rules, see Appendix A of The Nation's Report Card: Writing 2002 (pp. 176-178).
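
As a concrete illustration of the seventy percent guideline, here is a minimal sketch in Python that computes a weighted school participation rate from hypothetical school weights. The function name, the weights, and the sample data are illustrative assumptions; the actual standards and weighting procedures are those described in Appendix A of the report.

    def weighted_participation_rate(schools):
        """schools: list of (sampling_weight, participated) pairs for the initially sampled schools."""
        total = sum(weight for weight, _ in schools)
        participating = sum(weight for weight, took_part in schools if took_part)
        return participating / total

    # Hypothetical initial public school sample (weight, participated)
    sample = [(120.0, True), (80.0, True), (100.0, False), (95.0, True)]
    rate = weighted_participation_rate(sample)
    print(f"{rate:.1%}")  # 74.7%, which would meet the seventy percent guideline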

Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact me or members of the NAEP staff if you need further assistance. I hope that you found this session to be helpful and the reports to be interesting. Later this summer, we will release the NAEP 2002 Trial Urban District Assessment results for reading and writing.
