Supplemental Notes

Note 5: International Assessments

PROGRAM FOR INTERNATIONAL STUDENT ASSESSMENT (PISA)

Indicators 13 and 26 are based on data collected as part of the Program for International Student Assessment (PISA). First conducted in 2000, PISA had its first follow-up in 2003 and has a second follow-up scheduled for 2006. Each PISA assessment focuses on the capabilities of 15-year-olds in reading literacy, mathematics literacy and problem solving, and science literacy. In each assessment year, however, PISA examines one of the three subjects in depth and the other two more briefly: the 2000 assessment focused on reading literacy, the 2003 assessment focused on mathematics literacy and problem solving, and the 2006 assessment will focus on science literacy. PISA is sponsored by the Organization for Economic Cooperation and Development (OECD), an intergovernmental organization of 30 industrialized countries that serves as a forum for member countries to cooperate in research and policy development on social and economic topics of common interest.

In 2003, 41 countries participated in PISA: all 30 of the OECD countries and 11 non-OECD countries. To implement PISA, each participating country selected a nationally representative sample of 15-year-olds, with a minimum of 4,500 students from at least 150 schools. Each student completed a 2-hour paper-and-pencil assessment. The results for one OECD country, the United Kingdom, are not discussed because of low response rates. Because PISA is an OECD initiative, all international averages presented for PISA are the average of the participating OECD countries’ results.
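Concretely, the averaging rule described above can be expressed in a few lines. The Python sketch below is illustrative only: the function name, country names, and scores are hypothetical, and it assumes the international average is a simple (unweighted) average of the participating OECD countries' mean scores.

def oecd_average(country_means, oecd_members):
    """Average the mean scores of participating OECD countries only."""
    oecd_scores = [score for country, score in country_means.items()
                   if country in oecd_members]
    return sum(oecd_scores) / len(oecd_scores)

# Hypothetical example: two OECD members and one non-OECD participant.
# Only the OECD members enter the international average.
means = {"Country A": 503.0, "Country B": 489.0, "Country C": 550.0}
print(oecd_average(means, {"Country A", "Country B"}))  # 496.0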

PISA seeks to represent the overall yield of learning for 15-year-olds. It assumes that by age 15, young people have had a series of learning experiences, both in and out of school, that allow them to perform at particular levels in reading, mathematics, and science literacy. Formal education will have played a major role in student performance, but other factors, such as learning opportunities at home, also play a role. PISA’s results provide an indicator of the overall performance of a country’s educational system, as well as information about other factors that influence performance (such as hours of instructional time, which is used in indicator 26). By assessing students near the end of compulsory schooling in key knowledge and skills, PISA shows how well equipped 15-year-olds are for their future lives as they approach an important transition point for education and work.

Indicator 13 discusses student performance in mathematics literacy and problem solving. These concepts are defined by PISA as follows.

Mathematics literacy is defined as “an individual’s capacity to identify and understand the role that mathematics plays in the world, to make well-founded judgments and to use and engage with mathematics in ways that meet the needs of that individual’s life as a constructive, concerned, and reflective citizen.” Mathematics literacy can be broken down into four domains or subscales: (1) space and shape, which includes recognizing shapes and patterns; (2) change and relationships, which includes data analysis needed to specify relationships or translate between representations; (3) quantity, which focuses on quantitative reasoning and understanding of numerical patterns, counts, and measures; and (4) uncertainty, which includes statistics and probability.

Problem solving is defined as “an individual’s capacity to use cognitive processes to confront and resolve real, cross-disciplinary situations where the solution is not immediately obvious, and where the literacy domains or curricular areas that might be applicable are not within a single domain of mathematics, science, or reading.” Students completed exercises that assessed their capabilities in using reasoning processes not only to draw conclusions, but also to make decisions, to troubleshoot (i.e., to understand why a system or device is malfunctioning), and/or to analyze the procedures and structures of a complex system (such as a simple kind of programming language). Problem-solving items required students to apply various reasoning processes, such as inductive and deductive reasoning, reasoning about cause and effect, and combinatorial reasoning (i.e., systematically comparing all the possible variations that can occur in a well-described situation). Students were also assessed on their skills in working toward a solution and communicating the solution to others through appropriate representations.

A comparative analysis of the National Assessment of Educational Progress (NAEP), Trends in International Mathematics and Science Study (TIMSS), and PISA mathematics assessments, sponsored by NCES, found that PISA used far fewer multiple-choice items and had a much stronger content focus on the “data” area (which often deals with using charts and graphs), which fits with PISA’s emphasis on using materials with a real-world context. For more results from the study, see A Content Comparison of the NAEP, TIMSS, and PISA 2003 Mathematics Assessments (NCES 2005–112).

PROGRESS IN INTERNATIONAL READING LITERACY STUDY (PIRLS)

Indicator 26 is based on data collected in 2001 as part of the Progress in International Reading Literacy Study (PIRLS). The study, conducted by the International Association for the Evaluation of Educational Achievement (IEA), assessed the reading comprehension of children in 35 countries. In each country, students from the upper of the two grades with the most 9-year-olds (4th grade in the United States and most countries) were assessed. Designed to be the first in a planned 5-year cycle of international trend studies in reading literacy by IEA, PIRLS 2001 provides comparative information on the reading literacy of 4th-graders and also examines factors that may be associated with the acquisition of reading literacy in young children, such as hours of instructional time, which is used in indicator 26.

For further information on PIRLS, see http://nces.ed.gov/surveys/pirls.

TRENDS IN INTERNATIONAL MATHEMATICS AND SCIENCE STUDY (TIMSS)

In 1995, the Trends in International Mathematics and Science Study (TIMSS), conducted under the auspices of the International Association for the Evaluation of Educational Achievement (IEA), assessed the science and mathematics achievement of students in 41 countries in grades 3, 4, 7, and 8 and in the final year of secondary school. Information about how mathematics and science learning takes place in each country was also collected: TIMSS asked students, their teachers, and their school principals to complete questionnaires about the curriculum, schools, classrooms, and instruction. The assessment was repeated in 1999 in 38 countries at grade 8 and in 2003 in 25 countries at grade 4 and 45 countries at grade 8, so that changes in achievement can be tracked over time. Moreover, TIMSS is closely linked to the curricula of the participating countries, providing an indication of the degree to which students have learned the mathematics and science concepts they have encountered in school.

Indicators 11 and 12 use data from TIMSS.

1995 TIMSS

In 1995, the assessment components of TIMSS tested students in three populations:

  • Population 1: Students enrolled in the two adjacent grades that contained the largest proportion of 9-year-old students at the time of the assessment—3rd- and 4th-grade students in most countries.

  • Population 2: Students enrolled in the two adjacent grades that contained the largest proportion of 13-year-old students at the time of the assessment—7th- and 8th-grade students in most countries.

  • Population 3: Students enrolled in their final year of secondary education, which ranged from 9th to 14th grade. In many countries, students in more than one grade participated in the study because the length of secondary education varied by type of program (e.g., academic, technical, vocational). No indicators in The Condition of Education 2005 used data from this population.

All countries that participated in the study were required to administer assessments to students in the two grades of Population 2 but could choose whether to participate in the assessments of the other populations. Results for Population 2 were reported for 42 countries.

TIMSS used a two-stage sample design. For Populations 1 and 2, the first stage involved selecting, at a minimum, 150 public and private schools within each country. Countries were allowed to oversample for analyses of particular national interest, and all collected data were appropriately weighted to account for the final sample. Random sampling methods were then used to select from each school one mathematics class for each grade level within a population (generally 3rd and 4th for Population 1 and 7th and 8th for Population 2). All of the students in these mathematics classes (except for excluded students) then participated in the TIMSS testing in science and mathematics. This design was also used in 1999 and 2003.
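As an illustration of the two-stage design just described, the Python sketch below samples schools at stage 1 and one mathematics class per grade at stage 2; every student in a selected class is then assessed. It is a simplified sketch under stated assumptions: it uses simple random sampling with toy data and omits the stratification, oversampling, and weighting used operationally.

import random

def two_stage_sample(schools, grades, min_schools=150, seed=0):
    """schools maps a school id to {grade: [class ids]}; returns the
    class selected for each sampled school and grade."""
    rng = random.Random(seed)
    n_schools = min(min_schools, len(schools))
    sampled = rng.sample(sorted(schools), n_schools)  # stage 1: schools
    return {school: {grade: rng.choice(schools[school][grade])  # stage 2
                     for grade in grades}
            for school in sampled}

# Hypothetical population: 200 schools, each with three classes per grade.
schools = {f"school-{i}": {7: ["7A", "7B", "7C"], 8: ["8A", "8B", "8C"]}
           for i in range(200)}
selection = two_stage_sample(schools, grades=(7, 8))
print(len(selection))  # 150 schools selected at stage 1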

The development of TIMSS was a cooperative effort involving representatives from every participating country (a list of participating countries is available on the TIMSS website, given below). The TIMSS assessment was based on collaboratively developed frameworks for the mathematics and science curriculum topics to be assessed, and the frameworks and the related consensus process involved content experts, education professionals, and measurement specialists from many different countries. The assessment included both multiple-choice and constructed-response questions.

1999 TIMSS

For the 1999 assessment, the international desired population consisted of all students in each country who were enrolled in the upper of the two adjacent grades that contained the greatest proportion of 13-year-olds at the time of testing. This population corresponded to Population 2 in 1995, except that students were sampled from only the upper of the two adjacent grades rather than from both. In the United States and most other countries, this grade was grade 8.
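The target-grade rule is mechanical enough to state in code. The Python sketch below is an illustration under stated assumptions: grades are consecutively numbered, and the enrollment counts are hypothetical.

def upper_target_grade(age13_by_grade):
    """age13_by_grade maps a grade number to its count of 13-year-olds;
    returns the upper grade of the adjacent pair holding the most of them."""
    grades = sorted(age13_by_grade)
    adjacent_pairs = zip(grades, grades[1:])
    lower, upper = max(adjacent_pairs,
                       key=lambda pair: age13_by_grade[pair[0]]
                                        + age13_by_grade[pair[1]])
    return upper

# Hypothetical counts: grades 7 and 8 hold the most 13-year-olds, so the
# sampled grade is 8, as in the United States.
print(upper_target_grade({6: 2000, 7: 40000, 8: 45000, 9: 3000}))  # 8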

All countries that participated in the 1995 TIMSS were invited to participate in the 1999 TIMSS, along with some countries that did not participate in 1995. In total, 38 countries collected data for the 1999 TIMSS: 26 that had participated in the 1995 TIMSS and 12 that were participating for the first time. (A list of participating countries is available on the TIMSS website, given below.)

2003 TIMSS

For the 2003 assessment, the international desired population consisted of all students in each country who were enrolled in the upper of the two adjacent grades containing the greatest proportion of 9-year-olds (Population 1) and, separately, of 13-year-olds (Population 2), with only the upper grade of each pair sampled. In the United States and most other countries, these were grades 4 and 8. In all, 25 countries participated at grade 4, and 45 countries participated at grade 8. (A list of participating countries is available on the TIMSS website, given below.)

Approximately one-third of the 1995 4th-grade assessment items and one-half of the 1999 8th-grade assessment items were used in the 2003 assessment. Development of the 2003 assessment began with an update of the assessment frameworks to reflect changes in the curriculum and instruction of participating countries. “Problem-solving and inquiry” tasks were added to the 2003 assessment to assess how well students could draw on and integrate information and processes in mathematics and science as part of an investigation or in order to solve problems.

For the 2003 assessment, countries were placed into one of four categories based on their response rates, as detailed in the table below. In indicators 11 and 12, countries in category 1 appear in the tables and figures without annotation; countries in category 2 are annotated as “met international guidelines for participation rates only after replacement schools were included”; countries in category 3 are annotated as “country did not meet international sampling or other guidelines”; and countries in category 4 are not included in the indicators. In addition, annotations are included when the exclusion rate for a country exceeds 10 percent. Latvia is designated as “Latvia-LSS (Latvian-speaking schools)” in some analyses because data collection in 1995 and 1999 was limited to schools in which instruction was in Latvian. Finally, Belgium is annotated as Belgium-Flemish because only the Flemish education system in Belgium participated in TIMSS.

[Table: Response Rates for the 2003 TIMSS assessment]
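As a concrete reading of the annotation rules above, the Python sketch below maps a country's response-rate category and exclusion rate to the annotation that would accompany it in indicators 11 and 12. The function and example values are hypothetical; the thresholds defining the four categories come from the response-rate table, not from this code.

CATEGORY_NOTES = {
    1: None,  # appears without annotation
    2: ("met international guidelines for participation rates "
        "only after replacement schools were included"),
    3: "country did not meet international sampling or other guidelines",
}

def annotation(category, exclusion_rate):
    """Return the annotation for a displayed country, or None if none applies.
    Category 4 countries are excluded from the indicators entirely."""
    if category == 4:
        raise ValueError("category 4 countries are not included")
    notes = [CATEGORY_NOTES[category]] if CATEGORY_NOTES[category] else []
    if exclusion_rate > 0.10:  # annotate when exclusions exceed 10 percent
        notes.append("exclusion rate exceeds 10 percent")
    return "; ".join(notes) if notes else None

print(annotation(2, 0.12))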

For further information on TIMSS, see http://nces.ed.gov/timss.


