
Results from the 2003 National Assessment of Educational Progress Trial Urban District Assessments for Reading and Mathematics

Dr. Peggy G. Carr: Hello, and welcome to today's StatChat on the NAEP 2003 reading and mathematics results for the Trial Urban District Assessment. I hope you've had time to look at the results on the website. I'm sure you have many questions about today's release, so let's get right to them.

Laura from Chicago, IL asked:
When are the next district assessments scheduled to take place, and in which subjects?
Dr. Peggy G. Carr: Yes, there will be more district NAEP assessments. In 2005, there will be an assessment of ten urban districts in reading and math. Exactly which ten districts will participate has not yet been finalized.

Carson from Washington, DC asked:
What districts will be included in the 2004 NAEP assessment?
Dr. Peggy G. Carr: There will be no NAEP state or Trial Urban District Assessments in 2004. The assessments scheduled for that year are Grade 12 Foreign Language and Long-Term Trend at the national level for only ages 9, 13, and 17.

Steve from Falls Church, VA asked:
These are fascinating results. Will there be another Trial Urban District Assessment? If so, when? Will there be more urban districts assessed if another is conducted?
Dr. Peggy G. Carr: Yes, Steve, more district assessments are planned. See my response to Laura.

Louise from San Jose, CA asked:
Charlotte-Mecklenburg did really well at both grades, compared to most of the other districts. Can you describe what factors may have contributed to this difference?
Dr. Peggy G. Carr: I don't know the answer to your question, but I can refer you to people who can help. 1) The Office of Instructional Accountability of the Charlotte-Mecklenburg school district can offer an insider's point of view: www.cms.k12.nc.us/departments/instraccountability or via email: ia@cms.k12.nc.us. 2) The Council of the Great City Schools last year published a study of five school districts, including Charlotte-Mecklenburg. This report is available on their website: www.cgcs.org. 3) You can also explore the possible factors that may have contributed to their success using NAEP's data tool, which includes background information about students, teachers, and schools. Select 'NAEP Data' in the navigation bar at the top of http://nces.ed.gov/nationsreportcard.

Matt from Rockville, MD asked:
Just looking at the list of districts assessed in 2003, I noticed that no cities north of Los Angeles and west of Chicago were selected to participate in TUDA. Is there a particular reason why the northwest quadrant of the country was not included?
Dr. Peggy G. Carr: Because of funding limitations, only ten districts could be assessed in 2003. Urban districts were chosen to represent districts from different geographic areas and different racial/ethnic groups. Future selection of districts will take the northwest quadrant into account.

Vivian Cash from Laurel, Maryland asked:
How does NAEP ensure the comparability of results among the state assessments and between the state and national assessments?
Dr. Peggy G. Carr: NAEP's design ensures comparability across jurisdictions in several ways. For example, NAEP results are comparable across states and jurisdictions because the same assessments are given to all participating students, and our field staff administer them in the same way throughout the country. In addition, since the national results are an aggregate of the state results, there really is no question of comparability between national and state assessments.

Lisa from Hawaii asked:
What is the value of these results to the school districts? How will they use them?
Dr. Peggy G. Carr: School districts can use the NAEP results for a variety of purposes:
1. They can compare their results to those of the other districts in the assessment, to students in large central cities generally, and to the nation.
2. Districts can examine the performance of subgroups of their students (for example, by gender, race, or eligibility for free or reduced-price lunch).
3. A district can measure its progress over time.

Chris from Rural Hall, Ga asked:
Is this the same test that our students here in Georgia take?
Dr. Peggy G. Carr: Chris, no, it is not the test your students take for your Georgia state assessment. The Trial Urban District Assessment (TUDA) is based on a content framework established by the National Assessment Governing Board (www.nagb.org). This assessment is used with samples of students in these ten TUDA districts and in every state. NAEP results serve as an independent monitor of progress, one that is comparable across every jurisdiction.

Cindy from Jacksonville, Fla asked:
There is too much testing in our schools, and the teachers don't have time to teach and the students to learn. I thought I just read about the District of Columbia a month or so ago, but they were included in the report today. Maybe they just test too much, which causes their scores to be lower than other states and districts.
Dr. Peggy G. Carr: Cindy, I am sympathetic to your viewpoint. However, testing does provide a way for students, parents, teachers and education policy makers to measure educational performance and to determine whether performance is improving. In NAEP, the District of Columbia was tested only once in 2003. Their results were reported along with the states last month and again today with the urban districts.

Ione from Arlington asked:
How did these districts get selected? Charlotte seems to be out of line with the rest of the districts. Can other districts participate?
Dr. Peggy G. Carr: Districts that represent a range of urban settings were selected by the Council of the Great City Schools, the National Assessment Governing Board, and the National Center for Education Statistics. Since this is a trial assessment, it is important to have a variety of districts. Charlotte is typical of many larger urban districts with varied student profiles, including sizeable but not overwhelming percentages of students from minority and low-income backgrounds. The project is expected to continue at its current level: nine districts for 2005, plus the District of Columbia.

Jan from Winston-Salem asked:
I am surprised that, compared to Houston, Los Angeles seems to have tested all of their Hispanic students. Is it fair to compare the two districts with one another? What about with Charlotte? Their demographics seem totally different.
Dr. Peggy G. Carr: Hispanic students who are excluded from NAEP are generally students who are still learning the English language, and whose proficiency in English is not yet sufficient to participate in an assessment of reading in English. Not all Hispanic students are so classified. Exclusion rates vary across districts (and across states) because of factors such as the population profile, the proportion of special-needs students, and state or district policies regarding the provision of specific accommodations. California's policy and practice is to test reading in English, even for Limited-English-Proficient (LEP) students. Houston's and Los Angeles's scores accurately reflect the performance of children who were tested in English on the NAEP reading assessment. However, the large proportion of excluded students in Houston in fourth-grade reading may limit the generalizability of the results to the entire population of Houston's fourth-graders, which includes LEP students. For a response regarding Charlotte, please see the response to Ione.

Jeanne from Palo Alto, CA asked:
Can you explain the situation in Los Angeles (and Charlotte) that's described on your website--they "have from one-quarter to one-third of their fourth- and eighth-grade students enrolled in surrounding urban fringe or rural areas." What does this mean when we compare them with districts that are classified as totally "large central cities?" Thanks very much for clarifying this for me.
Dr. Peggy G. Carr: In addition to areas classified as urban by the Bureau of the Census, the Los Angeles and Charlotte-Mecklenburg school districts encompass areas classified as urban fringe and rural. The other districts included in the Trial Urban District Assessment do not. To compare only the portions of districts classified as urban, you can use the NAEP data tool on the NCES website to break out results by type of community.

Jeff from San Diego, CA asked:
Can my district scores be compared to my state scores?
Dr. Peggy G. Carr: Yes, Jeff, district and state scores can be compared. Keep in mind, however, that demographic variations and differences in resources between a district and its state may contribute to differences in overall results. In particular, most urban districts have a higher minority concentration than whole states, and usually a higher proportion of low-income students.

Johannes from Maryland asked:
Charlotte-Mecklenburg is demographically very different. Is it the only consolidated city-county district included? How was Charlotte-Mecklenburg chosen?
Dr. Peggy G. Carr: Charlotte-Mecklenburg is different because NAEP selected districts to be quite variable. Since this is a trial, we wanted to get experience with many different demographic distributions. I believe Los Angeles Unified School District is also a consolidated city-county district and also includes urban areas outside those described as 'large central cities.' As for why Charlotte-Mecklenburg was chosen, please see my answer to the question from Ione of Arlington.

Mary Jo from Sierra Vista, AZ asked:
In the report on Trial Urban District mathematics, in looking at male - female score gaps, there are some reversals of the usual difference--for instance, at grade 8, it seems that girls did better than boys in Atlanta, Boston, and DC. Has anyone suggested possible explanations for that?
Dr. Peggy G. Carr: Good observation! While it is true that, on a variety of assessments, males tend to outscore females, this is not always the case. Many assessments have seen these gaps begin to close, and in NAEP we have seen states in which females outscore males, just as we see here in these three districts. Because score gaps between male and female students tend to be small compared to score gaps between other groups (such as groups based on race/ethnicity, free/reduced-price lunch eligibility, or SD/LEP status), the existence of such reversals in male/female average scores is not unexpected. If you examine the NAEP achievement-level data for these districts, you will see that the pattern of results is similar for males and females.

Karen from Peru, NY asked:
Do non-public schools in large urban districts perform better than public schools in those districts?
Dr. Peggy G. Carr: The Trial Urban District Assessment is designed to provide results only for students in public schools. We have no separately reported results for students in non-public schools within these districts. Students attending non-public schools are sampled in sufficient numbers in national NAEP to be reported. Since the TUDA samples form part of the NAEP national sample, there almost certainly are non-public school students from these districts in the national sample. Unfortunately, the number of non-public school students in any one district is too small to provide stable estimates of average scores or to meet NAEP reporting standards for subgroup sample size.

Claudette from Keene, NH asked:
What about small school districts? How well do they perform? When will NAEP report their results?
Dr. Peggy G. Carr: NAEP cannot report results for small districts. The assessments are conducted using a sample of students, and a certain minimum number of students are required to produce reliable results. Thus, NAEP can only report results for the nation, regions, states, and large districts.
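The reliability issue Dr. Carr describes can be sketched with a simple calculation. The standard error of a sample mean shrinks roughly like one over the square root of the sample size, so a small district's sample yields an unstable average. This is illustrative only (the standard deviation of 35 points is an assumed, NAEP-like value, not an official figure, and NAEP's actual reporting standards are more involved):

```python
import math

def standard_error(population_sd, n):
    """Approximate standard error of a mean from a simple random sample."""
    return population_sd / math.sqrt(n)

SD = 35.0  # assumed, NAEP-like score standard deviation
for n in (25, 100, 2500):
    # A small district might yield only a few dozen sampled students;
    # a large district or state yields thousands, so its mean is far more stable.
    print(n, round(standard_error(SD, n), 1))
```

With 25 students the estimate's standard error is several score points, large enough to swamp most real district-to-district differences, which is why results are reported only for the nation, regions, states, and large districts.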

David from Seattle, WA asked:
How do exclusion rates for districts compare with those for the nation? Please comment, if they are different. Can I see these rates on your website?
Dr. Peggy G. Carr: With the exception of Houston's fourth grade and Cleveland's fourth and eighth grades, the exclusion rates are roughly comparable to those for the nation. Cleveland's rate was high because of a large number of students with disabilities who could not meaningfully participate in the assessment. Houston's rate was high because they use a native-language version of their state reading test for their limited-English-proficient students. NAEP does not offer that accommodation, so their schools decided to exclude those students.


Bruce from Lawrenceville NJ asked:
Are there plans to include more districts in the future? Will we be able to see results from all districts sometime in the future?
Dr. Peggy G. Carr: Please see my response to Steve about future urban district plans. Districts were selected to represent a variety of characteristics, and maintaining these criteria will play a major role in selecting districts for future assessments. Of those districts that meet the criteria, it is unlikely that NAEP could assess and report on all of them due to budget limitations.

Anne from Santa Monica, CA asked:
Are students in the district results excluded from the state results? Can students appear in both samples? If so, how many do?
Dr. Peggy G. Carr: Students in the district results are not excluded from those for the states in which they are located. Similarly, students in each state are included in the results for the nation as a whole. All the students tested in each district appear in both state and national results. While this results in proportionally more students from the selected districts than from the rest of the states, we give less weight to their answers in summarizing state or national results, so that our findings properly represent each jurisdiction.
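The reweighting idea Dr. Carr describes, where oversampled districts count for less so that totals still represent each jurisdiction, can be sketched with a toy weighted average. The scores and weights below are invented for illustration; NAEP's actual estimation uses far more elaborate sampling weights:

```python
def weighted_mean(scores, weights):
    """Population estimate from a sample with unequal selection rates.

    Each weight is the number of students in the population that the
    corresponding sampled student represents.
    """
    total = sum(s * w for s, w in zip(scores, weights))
    return total / sum(weights)

# An oversampled district: many sampled students, each with a small weight.
district_scores  = [210, 215, 220, 225]
district_weights = [50, 50, 50, 50]    # each sampled student represents 50

# The rest of the state: fewer sampled students, each with a larger weight.
rest_scores  = [230, 240]
rest_weights = [400, 400]              # each sampled student represents 400

state_estimate = weighted_mean(district_scores + rest_scores,
                               district_weights + rest_weights)
print(round(state_estimate, 1))
```

Even though two thirds of the sampled students come from the district, the district contributes only 200 of the 1,000 weighted students, so the state estimate is pulled mostly toward the rest-of-state scores.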

Fran from Dover, PA asked:
Does teacher salary in urban districts have any influence on quality of teachers and then on performance of students?
Dr. Peggy G. Carr: NAEP does not collect data on teacher salary. There is a lot of literature about this question, however, which you could find through the Department of Education's ERIC database (www.eric.ed.gov).

Sam from Gillette, WY asked:
These urban district results are fascinating, but I was wondering if there was anything in the works to assess rural communities. Or is that information already available on the existing NAEP database?
Dr. Peggy G. Carr: It is not technically possible to provide reliable results for rural areas such as Gillette, because of the small sample size. However, through the NAEP website data tool, you can explore the performance of rural students in Wyoming and other states, as well as the nation.

Paolo from Newark, NJ asked:
These results are very interesting. It seems like there are quite big differences between districts and one possible explanation is that districts are very different with respect to income level and population density. Is there a way to relate proficiency to such variables?
Dr. Peggy G. Carr: You've hit on an interesting point. The Highlights reports contain data on a number of demographic and income-related characteristics, such as race/ethnicity and eligibility for the free and reduced-price lunch program (a measure of low income). Other characteristics are reported on the NAEP website (nces.ed.gov/nationsreportcard); select "NAEP Data" to use the data tool.

Judith from Kennebunk, ME asked:
Is the information about student performance in the large urban districts transferable to small cities? Can NAEP information help those of us who do not fit in the larger geographical categories?
Dr. Peggy G. Carr: NAEP is currently authorized by legislation only to survey and report on the nation, states, and on a trial basis, urban districts. Results from the Trial Urban District Assessment are not generalizable to other individual school districts, such as those of small cities. The NAEP data tool on the web may be used to compare aggregations of results for different types of communities, nationally and by states.

Lou from Santa Clara, CA asked:
I read Winnick's statement "On the fourth-grade reading assessment, African-American students score above the national average in four of the ten districts. Hispanic students exceed the national average in five of the districts." This is great news! Where can I find these data on your website?
Dr. Peggy G. Carr: Both the published report and a set of presentation slides summarizing the results include the data you requested. They can be found on the web at http://nces.ed.gov/nationsreportcard/reading/results2003/districtresults.asp and http://nces.ed.gov/nationsreportcard/mathematics/results2003/districtresults.asp. These data are also available through the data tool.

Thomas from Syracuse, NY asked:
Do the district samples contribute to the state results as well, or are they independent?
Dr. Peggy G. Carr: Thomas, please see my answer to Anne of Santa Monica about how students can appear in both samples. When we do statistical tests comparing performance between districts and their states, we adjust for the lack of independence among the estimates. You can use our NAEP data tool on the web to make such comparisons yourself.
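The dependence adjustment Dr. Carr mentions can be sketched as follows. Because a district's students are also part of the state sample, the two mean estimates are positively correlated, and the variance of their difference is reduced by a covariance term rather than being a simple sum of variances. The numbers and the covariance value below are invented for illustration; this is a generic dependent-means z-test, not NAEP's published procedure:

```python
import math

def z_for_dependent_means(mean_d, se_d, mean_s, se_s, cov):
    """z statistic for (district mean - state mean) with overlapping samples.

    var(d - s) = var(d) + var(s) - 2*cov(d, s); cov > 0 when the district
    sample is included in the state sample, shrinking the variance of the
    difference relative to the independent case.
    """
    var_diff = se_d**2 + se_s**2 - 2 * cov
    return (mean_d - mean_s) / math.sqrt(var_diff)

# Hypothetical district vs. state comparison (all values invented):
z = z_for_dependent_means(mean_d=231.0, se_d=1.2,
                          mean_s=235.0, se_s=0.9, cov=0.5)
print(round(z, 2))  # compare |z| against 1.96 for a 5% significance test
```

Ignoring the covariance term would overstate the variance of the difference and make real district-state gaps look less significant than they are.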

Gloria from Bosque Farms, NM asked:
Does ethnic diversity in urban districts help or hinder student performance? Is there a relationship between ethnicity/minority status and economic status?
Dr. Peggy G. Carr: Hispanic and Black students are far more likely to come from poor families than White or Asian students (as measured by their eligibility for and participation in the free/reduced-price-lunch program, a program based on income below or near the poverty level). NAEP data show that students from poorer families do less well than students from families that are not poor. So, in part, one reason why Hispanic and Black students score lower on NAEP is their disadvantaged economic status.

Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact me or members of the NAEP staff if you need further assistance. I hope that you found this session to be helpful and the reports to be interesting.
