Education Statistics Quarterly
Vol 6, Issue 4, Topic: Note From NCES
Note From NCES
Val Plisko, Associate Commissioner, Early Childhood, International, and Crosscutting Studies Division

Comparing U.S. Students' Performance Internationally: Results From the 2003 TIMSS and PISA

The National Center for Education Statistics (NCES) participates in several international assessment programs that compare the achievement of students in the United States with that of students in other countries. NCES recently released findings from two of these assessments, both conducted in 2003 and both the focus of this issue of the Quarterly: the Trends in International Mathematics and Science Study (TIMSS), formerly known as the Third International Mathematics and Science Study, and the Program for International Student Assessment (PISA). TIMSS, organized internationally by the International Association for the Evaluation of Educational Achievement (IEA), is an assessment of mathematics and science. Much like the National Assessment of Educational Progress in this country, TIMSS tests students' mastery of the curricula expected to be taught. TIMSS is conducted on a 4-year cycle: it was first administered in 1995, with subsequent assessments in 1999 and 2003. Testing was conducted in the United States at the 4th, 8th, and 12th grades in 1995; at the 8th grade in 1999; and at the 4th and 8th grades in 2003.

The framework that guides the development of TIMSS is based on expert judgment of the content expected to be taught at the 4th and 8th grades in participating countries. The same assessment, differing only in the language in which it is given, is administered in every participating nation, making it possible to compare the knowledge of students in the same grades across countries.

Although the makeup of the groups of nations participating in TIMSS has varied, the United States has participated in each administration at each grade level. In 2003, 46 nations participated at one or both grades. For 15 nations, trend data are available for 4th-graders from 1995 to 2003; for 34 nations, trend data are available for 8th-graders from either 1995 or 1999 to 2003.

PISA, which is organized internationally by the Organization for Economic Cooperation and Development (OECD), is on a 3-year cycle and was conducted in 2000 and 2003. PISA assesses students at age 15, regardless of grade. Three areas are assessed in each administration of PISA: reading literacy, mathematics literacy, and science literacy, with one area being the focus of the assessment. The focus area includes more items in the assessment and receives a more detailed analysis and reporting of results. In 2003, the area of focus was mathematics literacy.

Literacy is defined in PISA as the ability to apply knowledge and skills gained in school or elsewhere to a broad range of situations. To test students' literacy, assessment items are therefore set in situations, or use materials, from everyday life whose solutions require the application of subject-area knowledge. Math items, for example, often use charts or graphs that students need to understand in order to solve the problem that is presented.

Participating countries are required to draw samples that are nationally representative. In 2003, almost 19,000 U.S. students participated in TIMSS and almost 5,500 U.S. students participated in PISA, drawn from public and private schools sampled across the country.


How TIMSS and PISA Differ

What is assessed

Perhaps most significantly, TIMSS and PISA differ in what they test. Whereas TIMSS tests students' mastery of the specific knowledge, skills, and concepts that are typically taught as part of school curricula, PISA tests students' ability to apply what they have learned to real-life situations. Thus, assessment items in PISA are presented in a variety of situations that students might encounter. The different emphases also are reflected in a difference in the format of the assessments: About two-thirds of the items in TIMSS are multiple choice, compared to about one-third of the items in PISA. PISA relies more heavily on constructed-response, or open-ended, items.

Students and countries assessed

Another important distinction between the two assessments is that TIMSS student samples are selected by grade, whereas PISA student samples are selected by age. While TIMSS focuses on assessing curricular learning at consistent grade levels in participating countries, PISA focuses on assessing the "yield" of education systems as students make the transition from school to society at large. PISA assesses students only at age 15; in 2003, 61 percent of the U.S. students participating in PISA were in the 10th grade. Most PISA students in other participating countries are likewise in a grade near the end of compulsory schooling.

Finally, the characteristics of the groups of participating countries differ. The 46 countries that participated in TIMSS in 2003 represent a wide range of development, with only 13 belonging to the OECD, and the international averages that are reported include all participating countries. In contrast, in 2003, participants in PISA included all 30 member countries of the OECD as well as 11 other countries. Thus, the participants in PISA are weighted toward developed countries. Furthermore, the international averages that are reported for PISA include only the OECD countries.

Because of these differences, TIMSS and PISA provide different kinds of information about different sample populations. The studies are complementary, but the results are not directly comparable.

Assessing problem-solving skills

In TIMSS, the assessment frameworks for grades 4 and 8 in 2003 included "problem-solving and inquiry" tasks. These tasks assessed how well students can "draw on and integrate information and processes in mathematics and science to solve problems." The problem-solving and inquiry tasks were embedded in the TIMSS math and science items and did not receive a separate score.

In addition to the literacy areas, each administration of PISA assesses a different cross-curricular competency that is not repeated in subsequent administrations. Unlike the problem-solving and inquiry tasks in TIMSS, the cross-curricular items are distinct from the literacy items and receive a separate score. In 2003, the competency assessed was problem solving. To solve problem-solving items, students apply multistep reasoning to novel situations. Three types of problem-solving items were presented: "system analysis and design, where students had to use information about a complex situation to analyze or design a system that met stated goals; troubleshooting, where students had to understand the reasons behind a malfunctioning device or system; and decisionmaking, where students had to make decisions based on a variety of alternatives and constraints."

*   *   *   *   *   *   *

The results for TIMSS tell us that U.S. 4th- and 8th-graders scored higher than the international averages in math and science in 2003; however, between 1995 and 2003, while U.S. 8th-graders' scores and international standing increased, U.S. 4th-graders' scores remained the same and their standing decreased. The results for PISA show us that U.S. 15-year-olds in 2003 scored lower, on average, in math literacy and problem solving than the OECD averages. Despite the differences between the two assessments, both document that a number of Asian and European countries outperformed the United States in these critical areas in 2003. Joan Ferrini-Mundy, University Distinguished Professor, Michigan State University, discusses these and other findings as well as their implications in the invited commentary in this issue of the Quarterly. Highlighted findings from TIMSS 2003 and PISA 2003 are presented in the excerpts of the recently released reports that precede the commentary.



1990 K Street, NW
Washington, DC 20006, USA
Phone: (202) 502-7300