
The Nation's Report Card — Results from the 2005 NAEP Science assessment
Dr. Peggy G. Carr:
Good afternoon, and welcome to our Stat Chat on the NAEP 2005 Science report. I hope you've had time to examine the results and that I can answer any questions you may have. There are many findings for all three grades, and I'm interested to hear which ones you want to talk about. So, let's get right to your questions…
Sarah Castile from Florence, AL asked:
The students have always enjoyed the hands-on portion of the science assessments. Are there plans to expand, change or add hands-on experiments in the future?

Dr. Peggy G. Carr 's response:
The science assessment will continue to include hands-on activities as described in the new framework that will be used in the 2009 NAEP Science Assessment. In addition, future assessments will include interactive computer tasks, which will actively engage students in simulated experiments and investigations of natural phenomena.


Elisabeth from Montgomery, Alabama asked:
Will standardization ever come across all areas of testing? It seems national validity is difficult to achieve because each state has its own test and sets its own evaluation methods and standards. For example, if exams could have a core area of questions for each subject, so the government would have one standardized test sample, that would benefit all states.

Dr. Peggy G. Carr 's response:
The United States does not have a national test, just as there is not a national curriculum. Each state is responsible for deciding on the appropriate education for its citizens, including what their students should be tested on. NAEP is useful for comparing performance among states, as the same NAEP assessment is given across the country.


June from Jacksonville, Florida asked:
Are there statistics on the percent of students from each state that go into science related professions? Or do you know where we could possibly look for them?

Dr. Peggy G. Carr 's response:
NCES's longitudinal studies program tracks the population of high school students through college and into work life. See http://nces.ed.gov/surveys/nels88. However, this survey is a national study and does not provide statistics for each state; none of our surveys do that.


Karen from Denver, Colorado asked:
What percentage of proficiency should we expect -- 100 percent? Also, what do we know about those students who are able to perform at a proficient level?

Dr. Peggy G. Carr 's response:
The National Assessment Governing Board (NAGB) established achievement-level standards, with Proficient as the goal for all students, to help the public interpret NAEP results. The description of the Proficient level for each grade explains what Proficient students know and can do in science (see the Science Report Card). Also, it may be useful to look at the "item maps" in the report released today or on the website. These maps show the kinds of assessment questions that students at different achievement levels could answer.


Kristin from Arlington, Virginia asked:
Can you explain the importance of this study for science educators and how it can benefit them in their teaching practices?

Dr. Peggy G. Carr 's response:
The NAEP study is a snapshot of student achievement at one point in time. As a result, statements about best teaching practices cannot be made from NAEP data. Longitudinal studies run by NCES, such as the National Education Longitudinal Study (NELS) and the Education Longitudinal Study (ELS), are better suited for these purposes. However, NAEP does provide released test questions on the NAEP website, along with statistics on how well states and the nation perform on each question. Teachers find these helpful because many of the questions correspond to their state frameworks. Also, NAEP administers background questionnaires to science teachers regarding their classroom practices. These data provide a context for the achievement results. The NAEP online data tool (the NAEP Data Explorer at http://nces.ed.gov/nationsreportcard/nde) allows researchers to explore the relationships between these variables and achievement.


Diane from Arlington, Virginia asked:
Please comment on the impact of the study for science teachers.

Dr. Peggy G. Carr 's response:
Please see my answer to Kristin's question.


Lori from Columbus, OH asked:
Can you please discuss the relationship between NAEP science testing and the NCLB state science testing that is forthcoming?

Dr. Peggy G. Carr 's response:
NCLB state science testing measures performance against state science curriculum standards, which can vary from state to state. The NAEP science assessment measures performance in every state against a national content framework. While the NAEP framework is not a content standard for states, it is broad in coverage and provides a common yardstick against which to measure the performance of every state in the NAEP state assessment.


Matt from New York asked:
How can teachers use the statistical information from NAEP Science 2005 to improve classroom instruction?

Dr. Peggy G. Carr 's response:
Please see my answer to Kristin's question.


Donna from Philadelphia asked:
Why don't we have results for all states?

Dr. Peggy G. Carr 's response:
Participation at the state level was voluntary; only states that signed up to participate received science results. Although schools and students were assessed in all states, the samples in non-participating states are too small to give useful state-level results.
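
To make the sample-size point concrete, here is a minimal Python sketch. The standard deviation and sample sizes are entirely hypothetical, and NAEP's actual variance estimation for its complex sample design is far more sophisticated; the sketch only shows that the uncertainty of an average grows quickly as the sample shrinks.

    # Illustrative only: hypothetical numbers, not actual NAEP statistics.
    import math

    population_sd = 35.0  # assumed spread of scale scores

    for n in (2500, 250, 25):  # hypothetical per-state sample sizes
        se = population_sd / math.sqrt(n)  # standard error of the mean
        print(f"n = {n:>4}: standard error of the mean = {se:.1f} points")

    # With n = 2500 the state average is pinned down to about +/- 1.4 points
    # (two standard errors); with n = 25 the band is roughly +/- 14 points,
    # far too wide for meaningful state-level reporting.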


Catherine from Gainesville, FL asked:
I understand that there are actual lab experiments that are part of the NAEP science assessment. Do all students complete an experiment as part of the test? How long do the experiments take, and what kind of things do students do in them?

Dr. Peggy G. Carr 's response:
Approximately half of all students taking the assessment also complete a hands-on activity. These activities take 20 minutes for grade 4 students and 30 minutes for grade 8 and grade 12 students. They involve a variety of tasks, which might include measurement, classification, collecting and graphing data, and forming conclusions based on observations.


Peggy from Atlanta, GA asked:
How frequently is science assessed in NAEP? When is the next assessment? Is NAEP science always given at grades 4, 8, and 12, or are other grades included?

Dr. Peggy G. Carr 's response:
NAEP assesses science every four years. For the next assessment, data collection will take place in 2009, with results reported in 2010. NAEP science is always given to a national sample at grades 4, 8, and 12. In addition, science is administered on a voluntary basis to obtain state-representative results at grade 8, and at grade 4 when funds permit. NAEP science is given only at these three grades.


Marianne from Silver Spring, MD asked:
I heard someone say at the press conference that the science scores could be linked to students' reading and math skills. How heavy is the reading burden on the NAEP science test?

Dr. Peggy G. Carr 's response:
Darv Winick, the chairperson of the National Assessment Governing Board (NAGB), did make that observation at this morning's press conference. Regarding your question about the reading burden on the NAEP science assessment, this is certainly one issue we look at as we develop test questions. Of course, some amount of reading is required in order to understand and answer test questions, but we make sure the focus of the assessment remains on science knowledge and skills.


Veta from New York, NY asked:
Does the new framework support assessing cross grade items? How do the students perform on such items in Science?

Dr. Peggy G. Carr 's response:
The new framework is not designed to reflect a cross-grade approach. Some questions in previous assessments were asked at more than one grade, but the skills tested at the various grades are different. It therefore is not appropriate to measure performance on a cross-grade scale. You might want to use the NAEP Questions Tool to view performance on individual items (http://nces.ed.gov/nationsreportcard/itmrls/).


Linda from Snoqualmie, WA asked:
When you released the reading and math results last fall, all the scores were in the 200s and 300s. Why are the science results so much lower?

Dr. Peggy G. Carr 's response:
The score numbers are lower because science is reported on a different scale, one that ranges from 0 to 300. The reading and mathematics scales range from 0 to 500. It is not meaningful to compare scores across these different scales.
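
A toy example can make this concrete. The numbers below are made up, and rescaling like this is not a valid way to link NAEP scales; it only shows why raw numbers from a 0-300 scale cannot be compared with numbers from a 0-500 scale.

    # Hypothetical scores. Rescaling is NOT a valid linking method; this
    # only illustrates why raw cross-scale comparisons mislead.
    def fraction_of_scale(score: float, scale_max: float) -> float:
        """Express a score as a fraction of its own scale's range."""
        return score / scale_max

    print(fraction_of_scale(150, 300))  # 0.5 on a 0-300 science scale
    print(fraction_of_scale(250, 500))  # 0.5 on a 0-500 reading scale
    # 150 looks "lower" than 250, yet both sit at the same relative
    # position on their respective scales.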


Kelly from El Paso, TX asked:
This report gives national results for grades 4, 8, and 12, but only state results for grades 4 and 8. Is there a reason state data was not collected or reported for grade 12?

Dr. Peggy G. Carr 's response:
No NAEP funds are available for state-by-state assessment at grade 12 in any subject. However, NAGB, NAEP's governing body, is currently discussing the feasibility of adding a 12th-grade state assessment.


Tara from Harrisonburg, VA asked:
I read in the Virginia summary of results that Virginia students may have exceeded the national average because of our Standards of Learning (SOLs), which are required in the Virginia state system. Will NAEP recommend that other states implement higher standards than the national standards in order to push their students to excel?

Dr. Peggy G. Carr 's response:
Virginia has much to be proud of with these NAEP results. It is one of the top states, with improvements in science scores at both grades 4 and 8 and consistent results across the earth, life, and physical sciences. This assessment is not designed to determine how these results were obtained; the VA Department of Education will probably want to investigate this issue further. NAEP offers a wide variety of student performance, teacher, and school variables that might be useful in formulating and testing hypotheses about student performance. Correlational research of this nature should be followed up with further research, ideally using randomized field trials, to explore causal hypotheses.


Caitlin from Monterey, CA asked:
Are there any measures of student SES status? If so, what did the results show about high SES vs. low SES students on NAEP science?

Dr. Peggy G. Carr 's response:
Yes, NAEP reports on the achievement of students who qualify for the free and reduced-price lunch program; students from families near or below the poverty level are eligible. The results show that the score gap between those eligible and those not eligible is decreasing at both grades 4 and 8. This is because the scores of eligible students are improving.


Melissa from Silver Spring, Maryland asked:
I'm pleased to see the results for the different areas of science (i.e. earth, physical, and life), but I would be interested in learning more about how specific groups of students fared in each subscale as well. Is this information available? I'm especially interested in gender differences.

Dr. Peggy G. Carr 's response:
The results for all student groups are available on the NAEP Data Explorer (http://nces.ed.gov/nationsreportcard/nde). You may look at their overall average scores, or their average scores in the different fields of science. In order to find these results, go to our website and enter the Data Explorer in the "Advanced" mode. Under "Format Table," you'll find a pull-down menu for subscales.


Tamas from Columbus, OH asked:
How do you compare test scores between two different years?

Dr. Peggy G. Carr 's response:
The assessments are designed to be comparable from one administration to the next, as long as they measure the same framework. We release about one-quarter of the questions to the public after each assessment and replace them with questions that measure comparable content and skills.
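
For readers curious about the mechanics, here is a heavily simplified Python sketch of linking two administrations through the questions they share, using mean/sigma linear equating. This is not NAEP's actual procedure, which rests on item response theory scaling; all data and names below are hypothetical.

    # Mean/sigma linear equating through common items: a simplified
    # stand-in for NAEP's IRT-based linking. All data are made up.
    import statistics

    # Each year's sample performance on the questions both forms share:
    common_year1 = [12.0, 15.0, 14.0, 10.0, 16.0, 13.0]
    common_year2 = [14.0, 17.0, 15.0, 12.0, 18.0, 16.0]

    m1, s1 = statistics.mean(common_year1), statistics.stdev(common_year1)
    m2, s2 = statistics.mean(common_year2), statistics.stdev(common_year2)

    def to_year1_scale(score_year2: float) -> float:
        """Map a year-2 score onto the year-1 scale via mean/sigma linking."""
        return m1 + (s1 / s2) * (score_year2 - m2)

    print(to_year1_scale(15.0))  # a year-2 score on year 1's scale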


Timothy from Philadelphia, PA asked:
How is the mix of item types (multiple choice vs. constructed response) on the assessment determined? Is the NAEP assessment meant to be similar to the tests that science teachers administer in their classrooms?

Dr. Peggy G. Carr 's response:
The balance of multiple-choice and constructed-response items on the assessment is guided by the assessment specifications. The formats of the tests that teachers use in their classrooms vary widely and serve different purposes. The NAEP assessment includes a variety of item types but is not intended to reflect or suggest what is done in classrooms.


Robert from Nashua NH asked:
Should we be concerned or encouraged that the "increased proficiency" at the Grade 4 level comes from the bottom percentiles? Also that the drops (though not significantly) at higher grade levels come at the top percentiles?

Dr. Peggy G. Carr 's response:
At grade 4 it is encouraging that gains are evident at the lower percentiles because this is not at the cost of proficiency at higher levels; the higher percentiles are holding steady. For grade 12, since 1996 there have been decreases at all percentile levels, but with no changes since 2000. So the drop is across the board, but minor at all levels.


Eric Zilbert from Sacramento, California asked:
In grade 4, California Hispanic students gained 14 points, which is not statistically significant. However, the proportion of students scoring above Basic was significantly higher. A similar situation exists for Asian/Pacific Islander students: a nonsignificant 13-point gain, but a significant gain in those scoring at or above Proficient. How are we to interpret these results? Are these groups improving or not?

Dr. Peggy G. Carr 's response:
The California Hispanic grade 4 gain is significantly different from the 2000 result. In the NAEP Data Explorer (NDE), be sure to select only one subgroup (e.g., Hispanic) when checking significance. Therefore, Hispanics are improving, both in average scale score and in the percentage scoring at or above Basic. The Asian/Pacific Islander subgroup is much smaller in California relative to Hispanics, so the margin of error around its score is larger, which explains why the 13-point gain was not significant. The significant gain in the percentage at or above Proficient shows that the more able Asian/Pacific Islander students are improving.
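
The role of subgroup size can be shown with a crude sketch. The z-test below ignores NAEP's complex sampling design, and every number is hypothetical; it only demonstrates that an identical point gain can clear the significance bar for a large subgroup yet miss it for a small one.

    # Crude two-sample z-test on mean scale scores. Hypothetical numbers;
    # NAEP's real significance tests account for its complex sample design.
    import math

    def gain_is_significant(gain, sd, n_then, n_now, z_crit=1.96):
        """True if the gain exceeds about two standard errors."""
        se = sd * math.sqrt(1.0 / n_then + 1.0 / n_now)
        return abs(gain) / se > z_crit

    sd = 35.0  # assumed score standard deviation
    print(gain_is_significant(14.0, sd, n_then=900, n_now=900))  # True
    print(gain_is_significant(13.0, sd, n_then=40, n_now=40))    # False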


Michelle from Harrisburg, PA asked:
If student participation in NAEP is voluntary and students don't receive individual scores, how do we know that they gave their best effort on the assessment? Is there any way to take this into account when reporting the results?

Dr. Peggy G. Carr 's response:
Unfortunately, there is no objective way to measure students' effort. We do encourage all students to do their best. In addition, on the background questionnaire, students are asked about the effort they put into the test. The data for this question, and all others from the questionnaires, are available in the NAEP Data Explorer (http://nces.ed.gov/nationsreportcard/nde).


Amanda from Killen, AL asked:
What kind of impact, if any, do you anticipate on the NAEP science assessment from the new requirement that science be added to the "No Child Left Behind" subject areas to be assessed in grades 3-8 starting in 2007?

Dr. Peggy G. Carr 's response:
We hope, of course, that greater attention to science in the nation's schools will be reflected in improved performance on NAEP. At this point, there is no requirement for states to participate in the NAEP science assessment, as there is in reading and mathematics at grades 4 and 8.


Shirley from Plainsboro, NJ asked:
How do you select the cities in TUDA? How will the results for these big cities affect classroom activities, and why does this study separate these large cities from the others?

Dr. Peggy G. Carr 's response:
The Council of the Great City Schools proposed the Trial Urban District Assessment (TUDA) to assist urban districts in monitoring the educational progress of their students. Because the project is a trial, the districts chosen in 2002 were selected to be diverse, possessing characteristics expected to present challenges to the feasibility of the assessment.


Clare from Shirley, MA asked:
How does the number of twelfth-grade students tested in 2005 compare to that tested in 2000 and 1996? Are some states resistant to testing in Science?

Dr. Peggy G. Carr 's response:
About the same number of twelfth-graders, between 11,000 and 15,000, were tested in each of the three assessments. Participation in NAEP science assessments is voluntary. In 2005, participation at grade 12 was adequate for national estimates, but there was no state-level assessment at grade 12. Forty-four states agreed to participate at the state level at grades 4 and 8, and there was adequate participation from the remaining states to get good estimates of national performance at those grades.


Allan from Paris, Kentucky asked:
Some States have improved from 2000 while others did not. In grade 8, some even declined. Why do you think this happened and does NAEP collect and provide information about curriculum and policy differences between states?

Dr. Peggy G. Carr 's response:
As a statistical agency, we only report how students have performed. We cannot speculate on the reasons for our findings. NAEP does not collect information about state curricula and policies. Our assessments are based on frameworks developed by committees of educators, subject-matter experts, and other informed citizens. These are not based on curricula in any particular state. You might take a look at the wide range of information on student characteristics, teacher qualifications, and school policies (http://nces.ed.gov/nationsreportcard/nde).


Marianne from Andover, MA asked:
How are the topics for the hands-on items chosen, and how are the topics written, for each grade level?

Dr. Peggy G. Carr 's response:
The hands-on activities are chosen so that each of the fields of science is represented at each grade. They are written so that students discover new information using their science knowledge and form conclusions based on their observations and the data they collect. Students may also be asked to apply this information to answer questions requiring practical reasoning.


Irene from Bozeman, MT asked:
Does NAEP take into account the distribution of school demographic variables (such as free/reduced-price lunch eligibility, ethnicity, ESL, etc.) across states to make state comparisons meaningful?

Dr. Peggy G. Carr 's response:
It is important to consider variations in demographics when making comparisons across states and jurisdictions. This is one reason we've included state results by gender, race/ethnicity, and eligibility for free/reduced-price lunch on the initial release website (http://nationsreportcard.gov). Using these charts, you can look at how students with similar demographic characteristics performed across all states. Beyond these demographics, NAEP also collects information about instructional practices, teacher qualifications, and school policies, which is also important to consider when making cross-state comparisons. You can access these data in the NAEP Data Explorer on our website (http://nces.ed.gov/nationsreportcard/nde).


Dr. Peggy G. Carr:
Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact NAEP staff or me for more information; you can reach Sherran Osborne at sherran.Osborne@ed.gov. I hope that you found this chat helpful and the report interesting. Please visit the website in the coming weeks for more information on the release of the first NAEP Trial Urban District Assessment (TUDA) report in science.
