Education Statistics Quarterly
Vol. 1, Issue 4 | Featured Topic: Civics Achievement
Invited Commentary: Uses and Limitations of the NAEP 1998 Civics Assessment
By: Richard G. Niemi, Don Alonzo Watson Professor of Political Science, University of Rochester
 
This commentary represents the opinions of the author and does not necessarily reflect the views of the National Center for Education Statistics.

The release of a new, major education report is looked upon with considerable anticipation, especially by those of us who worked for years on its conceptualization and operationalization. Reports such as the NAEP 1998 Civics Report Card for the Nation answer numerous questions: sometimes confirming what we thought we knew, sometimes catching us by surprise, usually a bit of each. But for those of us interested in the assessment as a research base, and not only for the overall and group scores it reveals, the release ironically raises as many questions as it answers.


One of the most intriguing questions raised by results of the National Assessment of Educational Progress (NAEP) 1998 Civics Assessment is how students in the 4th, 8th, and 12th grades would have scored on a comparable test 10, 20, or even 50 years ago. It is widely argued that young people in the 1990s are characterized by disinterest, distrust, and disengagement. Though participating heavily in individual acts of community service (as shown in the 1998 civics assessment and elsewhere), students and young adults are uninterested in politics, distrustful of government, and uninvolved in voting and other forms of civic and political life. All of these characteristics might have contributed to low knowledge levels in the new civics assessment. Thus, had there been a civics assessment in, say, the 1950s, the leading hypothesis is that students, on the whole, would have outperformed today's students. Unfortunately, we have only limited evidence on this point.

Lack of knowledge among all age groups has been of concern for a long time, but especially since modern polling techniques have allowed representative, nationwide "tests" of civic knowledge (Hyman and Sheatsley 1947). Still, systematic, over-time evidence about young people is hard to come by. Comparing 1989 survey results to results from the 1940s and 1950s, a recent study found a dramatic increase in the knowledge gap between young people (18- to 29-year-olds) and those 45 to 54 years old; however, a comparison could be made for only five survey questions (Delli Carpini and Keeter 1996, 172). Moreover, changing education levels over this period make such comparisons even more difficult than they would otherwise be. NAEP itself has examined this question for 8th- and 12th-graders using assessments conducted in 1976, 1982, and 1988. Across these three assessments, changes in knowledge levels were small and not entirely consistent, with 13-year-olds performing as well as or better than before in later years but 17-year-olds generally performing less well (Anderson et al. 1990).[1]



As discussed above, if our standard is student knowledge in previous years, we are left with something of a puzzle. Supposing, however, that there is a downward trend in knowledge among the newest generations, a further question is raised: Is it the fault of the schools? Have the quantity and quality of civics training declined sufficiently over this period that we can lay the blame on poorer civic education and, more importantly, conclude that a return to higher levels of civic education would reverse the decline in knowledge?

Once again, there is less evidence than we would like, and what information exists contradicts, in part, conventional wisdom. One might begin by observing that in the new assessment over 70 percent of 8th- and 12th-graders claimed they had studied the U.S. Constitution and Congress during the current year, and nearly as many said they had studied topics such as state and local government. These are high percentages, but student reports almost surely overestimate actual coverage. Since 1982, the National Center for Education Statistics (NCES) has conducted periodic high school transcript studies. In work underway (Niemi and Smith 1999), a coauthor and I compare information about course enrollments (not topical coverage) from the 1994 High School Transcript Study with self-reports from the NAEP 1994 U.S. History Assessment. The latter showed enrollment estimates for grades 9, 10, and 12 that were two-and-a-half to three times the percentages shown in students' transcripts (with estimates for grade 11, in which many students in fact take U.S. history, a near match). In any event, for over-time comparisons, we need to draw on additional data.

The conventional wisdom is that considerably less time is devoted to civic education now than in the past. From the period of educational reform early in this century through the 1950s, students often had a 9th-grade civics course and perhaps a capstone 12th-grade course in civics, American government, or problems of democracy. Beginning in the 1960s, according to the conventional view, this pattern broke down, with more students taking electives in other social studies (especially economics and psychology) or simply taking less social studies altogether.

Such data as we have are not entirely supportive of this picture. For one thing, although information prior to 1982 provides only an approximation of course-taking habits, it appears that civics or government courses, though widespread, were far from universal among high school students during the "traditional" period (through the 1950s). The conventional picture holds true for the 1970s and early 1980s, as such courses reached a smaller proportion of graduating seniors. Yet between 1982 and 1994, there was considerable growth rather than further decline in government courses. One tabulation shows the proportion of seniors who had taken at least one semester of civics or American government in grades 9-12 increasing from 62 percent in 1982 to 78 percent in 1994 (Legum et al. 1998, A-199). The latter figure compares favorably with estimates for the middle of the century. To further complicate matters, however, it is likely that in earlier decades students more often had a full-year course rather than only one semester, but we lack hard evidence to support this point.

In any event, information about topical coverage and course-taking habits suggests two points. First, there is room for additional civics instruction, especially at the 12th-grade level. Only half of 1993-94 seniors had a semester or more of American government in their final year of high school, and only about 70 percent had a full year of any social studies (Niemi and Smith 1999). Second, simply increasing the amount of civics teaching, if the recent upswing in government coursework is any guide, is not likely to increase substantially the knowledge levels of young people. Improving the nature and quality of government courses is likely to be as important as increasing the number of students exposed to such courses.



Another question that is not answered in the NAEP 1998 Civics Report Card is how students performed on the subsections of the assessment. The test was designed to assess a broad range of knowledge, covering several general topics or content areas; at the 12th grade, for example, about 20 percent of the assessment was about the relationship of the United States to other nations and to world affairs (National Assessment Governing Board 1996). It remains to be seen whether students were more knowledgeable about some topics than about others. Judging by the results of the 1988 assessment, considerable variability across subject matter is likely (Niemi and Junn 1998, ch. 2). Similarly, the framework for the 1998 assessment also called for indirectly measuring students' participatory skills and civic dispositions. It will be interesting to observe overall student performance on such dimensions and whether performance varies in the same way as it does on the knowledge component. Variations in performance across subject matter might provide clues as to how the content of government courses could be improved.

A related question is how students performed on multiple-choice versus constructed-response (i.e., open-ended) items.[2] Ultimately, this is a methodological as well as a substantive question. Inasmuch as NAEP is a "low stakes" assessment in which students receive no individual scores, motivation is a problem, especially at the 12th grade. The question raised here is whether motivation is less of a problem for multiple-choice than for open-ended questions. With the former, the right answer is provided (along with several wrong answers). With the latter, students must generate their own answers, without even the usual guidance from the teacher about the kind of answer that is expected.



Even if all of the above questions could be answered, there remains the matter of whether the assessment is meaningfully related to an individual's ability to function as a citizen. One can approach this question in a variety of ways. Some, for example, will no doubt argue about specific items or about the particular mix of questions. Indeed, this author, in writing about the 1988 assessment, critically noted the small number of questions about subjects such as political parties, interest groups, and women and minorities (Niemi and Junn 1998, ch. 2). Others will argue that in civics, unlike in mathematics, attitudes are the essential element, and that NAEP is seriously impaired because it is not permitted to assess students' feelings. Political scientists, of late, have stressed another point, namely, that relatively uninformed individuals can use a variety of heuristics, cues, and shortcuts to guide them in voting and other decisionmaking processes.

While granting some validity to each of these points of view, I would emphasize instead the broad coverage of the new assessment. As noted above, it was designed to test knowledge of a number of content areas, including the nature of civic life and politics generally; the foundations of the American political system, both generally and as embodied in the U.S. Constitution; the role of the United States in the international system; and the rights and responsibilities of citizens. But it was also designed to measure students' intellectual and participatory abilities. And though the assessment could not probe students' attitudes, its questions were designed to measure their knowledge and understanding of the importance of civic dispositions, such as by asking how a democratic society benefits from citizens actively participating in the political process. A look at the sample questions on the NCES Web Site will show that students were expected to do much more than answer narrowly constructed questions about arcane constitutional provisions.



Of course, no test is adequate from every perspective, and the NAEP 1998 Civics Assessment is no exception. As discussed above, it will not answer all of the questions we have about student performance levels, even when fully analyzed. Yet the new assessment provides the means to answer many questions about students' knowledge of politics and government as well as the teacher and classroom context for learning about civics. The release of the NAEP 1998 Civics Report Card only begins the task of answering these questions. It remains for us to make full use of the new data.



Footnotes

[1] Analysis of trends between 1988 and 1998 is also planned, since the 1998 civics assessment included a partial replication of the 1988 assessment. NCES plans to release a trend report covering this replication in the year 2000.

[2] Fifty-three percent of the 4th-grade assessment (judged by assessment time) consisted of multiple-choice items. At the 8th and 12th grades, 61 percent was multiple choice.

References

Anderson, L., Jenkins, L.B., Leming, J., MacDonald, W.B., Mullis, I.V.S., Turner, M.J., and Wooster, J.S. (1990). The Civics Report Card (NCES 90-498). U.S. Department of Education. Washington, DC: U.S. Government Printing Office.

Delli Carpini, M., and Keeter, S. (1996). What Americans Know About Politics and Why It Matters. New Haven: Yale University Press.

Hyman, H., and Sheatsley, P. (1947). Some Reasons Why Information Campaigns Fail. Public Opinion Quarterly, 11: 412-423.

Legum, S., Caldwell, N., Davis, B., Haynes, J., Hill, T.J., Litavecz, S., Rizzo, L., Rust, K., and Vo, N. (1998). The 1994 High School Transcript Study Tabulations: Comparative Data on Credits Earned and Demographics for 1994, 1990, 1987, and 1982 High School Graduates-REVISED (NCES 98-532). U.S. Department of Education. Washington, DC: U.S. Government Printing Office.

National Assessment Governing Board. (1996). Civics Framework for the 1998 National Assessment of Educational Progress. Washington, DC: Author.

Niemi, R., and Junn, J. (1998). Civic Education: What Makes Students Learn. New Haven: Yale University Press.

Niemi, R., and Smith, J. (1999). Enrollments in High School Government Classes: Are We Short-Changing Both Citizenship and Political Science Training? Manuscript in preparation, University of Rochester.



