Supplemental Notes

Note 5: International Assessments

PROGRAM FOR INTERNATIONAL STUDENT ASSESSMENT (PISA)

The Special Analysis and indicator 17 are based on data collected as part of the Program for International Student Assessment (PISA). First conducted in 2000, PISA had its first follow-up in 2003 and has a second follow-up scheduled for 2006. Each PISA assessment focuses on the capabilities of 15-year-olds in reading literacy, mathematics literacy and problem solving, and science literacy. However, in each assessment year, PISA provides a detailed examination of one of the three subjects and a basic examination of the other two. The 2000 assessment focused on reading literacy. The 2003 assessment focused on mathematics literacy and problem solving. The 2006 assessment focuses on science literacy. PISA is sponsored by the Organization for Economic Cooperation and Development (OECD), an intergovernmental organization of 30 industrialized countries that serves as a forum for member countries to cooperate in research and policy development on social and economic topics of common interest.

In 2003, 41 countries participated in PISA, including all 30 of the OECD countries and 11 non-OECD countries. To implement PISA, each participating country selected a nationally representative sample of 15-year-olds. A minimum of 4,500 students from a minimum of 150 schools was required. Each student completed a 2-hour paper-and-pencil assessment. The results of one OECD country, the United Kingdom, are not discussed due to low response rates. Because PISA is an OECD initiative, all international averages presented for PISA are the averages of the participating OECD countries’ results.

PISA seeks to represent the overall yield of learning for 15-year-olds. PISA assumes that by the age of 15, young people have had a series of learning experiences, both in and out of school, that allow them to perform at particular levels in reading, mathematics, and science literacy. Formal education will have played a major role in student performance, but other factors, such as learning opportunities at home, also play a role. PISA’s results provide an indicator of the overall performance of a country’s educational system, but they also provide information about other factors that influence performance (e.g., hours of instructional time). By assessing students near the end of compulsory schooling in key knowledge and skills, PISA provides information about how well prepared students will be for their future lives as they approach an important transition point for education and work. PISA thus aims to show how well equipped 15-year-olds are for their futures based on what they have learned up to that point.

Both the Special Analysis and indicator 17 discuss student performance in mathematics literacy and problem solving. These concepts are defined by PISA as follows.

Mathematics literacy is defined as “an individual’s capacity to identify and understand the role that mathematics plays in the world, to make well-founded judgments and to use and engage with mathematics in ways that meet the needs of that individual’s life as a constructive, concerned, and reflective citizen.” Mathematics literacy can be broken down into four domains or subscales: (1) space and shape, which includes recognizing shapes and patterns; (2) change and relationships, which includes data analysis needed to specify relationships or translate between representations; (3) quantity, which focuses on quantitative reasoning and understanding of numerical patterns, counts, and measures; and (4) uncertainty, which includes statistics and probability.

Problem solving is defined as “an individual’s capacity to use cognitive processes to confront and resolve real, cross-disciplinary situations where the solution is not immediately obvious, and where the literacy domains or curricular areas that might be applicable are not within a single domain of mathematics, science, or reading.” Students completed exercises that assessed their capabilities in using reasoning processes not only to draw conclusions, but also to make decisions, to troubleshoot (i.e., to understand the reasons for the malfunctioning of a system or device), and to analyze the procedures and structures of a complex system (such as a simple kind of programming language). Problem-solving items required students to apply various reasoning processes, such as inductive and deductive reasoning, reasoning about cause and effect, or combinatorial reasoning (i.e., systematically comparing all the possible variations that can occur in a well-described situation). Students were also assessed on their skills in working toward a solution and in communicating that solution to others through appropriate representations.

A comparative analysis, sponsored by NCES, of the mathematics assessments in the National Assessment of Educational Progress (NAEP), the Trends in International Mathematics and Science Study (TIMSS), and PISA found that PISA used far fewer multiple-choice items and placed a much stronger content focus on the “data” area (often dealing with the use of charts and graphs), which fits with PISA’s emphasis on materials with a real-world context. For more results from the study, see Comparing Mathematics Content in the NAEP, TIMSS, and PISA 2003 Assessments (NCES 2006-029).

PROGRESS IN INTERNATIONAL READING LITERACY STUDY (PIRLS)

The Special Analysis uses data collected as part of the Progress in International Reading Literacy Study (PIRLS) 2001. Designed as the first in a planned 5-year cycle of international trend studies in reading literacy conducted by the International Association for the Evaluation of Educational Achievement (IEA), PIRLS 2001 provides comparative information on the reading literacy of 4th-graders and also examines factors that may be associated with the acquisition of reading literacy in young children. The study assessed the reading comprehension of children in 35 countries. In each country, students in the upper of the two grades with the most 9-year-olds (4th grade in the United States and most other countries) were assessed.

For further information on PIRLS, see http://nces.ed.gov/surveys/pirls.

TRENDS IN INTERNATIONAL MATHEMATICS AND SCIENCE STUDY (TIMSS)

The Special Analysis uses data collected as part of the Trends in International Mathematics and Science Study (TIMSS). Under the auspices of the IEA, TIMSS assessed the science and mathematics achievement of students in 41 countries in grades 3, 4, 7, 8, and the final year of secondary school in 1995. Information about how mathematics and science learning takes place in each country was also collected. TIMSS asked students, their teachers, and their school principals to complete questionnaires about the curriculum, schools, classrooms, and instruction. The TIMSS assessment was repeated in 1999 in 45 countries at grade 8, and again in 2003 in 25 countries at grade 4 and 45 countries at grade 8 so that changes in achievement over time could be tracked. Moreover, TIMSS is closely linked to the curricula of the participating countries, providing an indication of the degree to which students have learned the concepts in mathematics and science that they have encountered in school.

2003 TIMSS

For the 2003 assessment, the international desired population in each country consisted of all students enrolled in the upper of the two adjacent grades that contained the greatest proportion of 9-year-olds and of 13-year-olds, respectively, at the time of testing (the upper grades of what earlier TIMSS cycles termed Populations 1 and 2). In the United States and most other countries, these were grades 4 and 8. In all, 25 countries participated at grade 4, and 45 countries participated at grade 8. (A list of participating countries is available on the TIMSS website at http://nces.ed.gov/timss.)

Approximately one-third of the 1995 4th-grade assessment items and one-half of the 1999 8th-grade assessment items were used in the 2003 assessment. Development of the 2003 assessment began with an update of the assessment frameworks to reflect changes in the curriculum and instruction of participating countries. “Problem-solving and inquiry” tasks were added to the 2003 assessment to assess how well students could draw on and integrate information and processes in mathematics and science as part of an investigation or in order to solve problems.

For the 2003 assessment, countries were placed into one of four categories based on their response rates, as detailed in the table below. In the Special Analysis, countries in category 1 appear in the tables and figures without annotation; countries in category 2 are annotated in the tables and figures as “met international guidelines for participation rates only after replacement schools were included”; countries in category 3 are annotated in the tables and figures as “country did not meet international sampling or other guidelines”; and countries in category 4 are not included in the indicators. In addition, annotations are included when the exclusion rate for a country exceeds 10 percent. Latvia is designated as “Latvia-LSS (Latvian-speaking schools)” in some analyses because data collection in 1995 and 1999 was limited to those schools in which instruction was in Latvian. Finally, Belgium is annotated as Belgium-Flemish because only the Flemish education system in Belgium participated in TIMSS.

For further information on TIMSS, see http://nces.ed.gov/timss.

ADULT LITERACY AND LIFESKILLS SURVEY (ALL)

The Special Analysis also uses data collected as part of the Adult Literacy and Lifeskills Survey (ALL). ALL is a large-scale, international comparative assessment designed to identify and measure a range of skills linked to the social and economic characteristics of individuals across (or within) nations. As societies become increasingly information oriented, it is clear that adults will need a broad set of skills in order to participate effectively in the labor market, in political processes, and in their communities. They will need to be literate and numerate; they will need to be capable problem solvers; and, increasingly, they will need to be familiar with information and communications technologies.

ALL is a household survey. Participants completed approximately 45 minutes of background questions and 60 minutes of assessment items in their homes. In the United States, a nationally representative sample of approximately 4,000 adults ages 16–65 was selected. Each participating country provided a sample representative of its adult population as a whole. Data collection for the main study took place between January and June 2003 in the United States.

ALL provides information on the skills and attitudes of adults ages 16–65 in a number of different areas, including the following:

  • Prose and Document Literacy: the knowledge and skills to understand and use information from texts such as editorials, news stories, poems, and fiction; and the knowledge and skills required to locate and use information contained in various formats such as tables, forms, graphs, and diagrams

  • Numeracy: the ability to interpret, apply, and communicate mathematical information

  • Analytical Reasoning/Problem Solving: the ability to solve problems by clarifying the nature of the problem and developing and applying appropriate solution strategies

ALL consists of two components: a background questionnaire designed to collect general participant information; and an assessment of the skills of participants in Prose and Document Literacy, Numeracy, and Analytical Reasoning/Problem Solving. (The United States did not participate in Analytical Reasoning/Problem Solving.)

For further information on ALL, see http://nces.ed.gov/Surveys/ALL/index.asp.

Response Rates for the 2003 TIMSS assessment



1990 K Street, NW
Washington, DC 20006, USA
Phone: (202) 502-7300