National Assessment of Adult Literacy (NAAL)

NAAL Factsheet: Performance Levels

Why is it important to have performance levels?
Performance levels make it possible to group people with similar literacy scores into a relatively small number of categories that matter to the adult education community, much as students with similar test scores are grouped into letter grades (e.g., A or B). A benefit of performance levels is that they enable the National Assessment of Adult Literacy (NAAL) to characterize American adults' relative literacy strengths and weaknesses by describing the nature and difficulty of the literacy tasks that participants at each level can perform with a reasonably high rate of success.

How were the 2003 performance levels determined?
In response to a request from the National Center for Education Statistics (NCES), the National Research Council (NRC) convened a Committee on Performance Levels for Adult Literacy. The committee's charge was, in an open and public way, to evaluate the literacy levels used by NAAL's 1992 predecessor survey and to recommend a set of performance levels that could be used in reporting the 2003 results and also applied to the 1992 results to permit comparisons across years.

New levels. After reviewing information about the 1992 and 2003 assessments as well as feedback from stakeholders, the committee specified a new set of performance levels intended to correspond to four policy-relevant categories of adults, including adults in need of basic adult literacy services. The next step was to determine the score ranges to be included in each level for each of the three NAAL literacy scales: prose, document, and quantitative literacy.

Score ranges. To determine the score ranges for each level, the committee decided to use the "bookmark" method. Initial implementation of the method involved describing the literacy skills of adults in the four policy-relevant levels, and holding two sessions with separate panels of "judges" consisting of adult literacy practitioners, officials with state offices of adult education, and others. One group of judges focused on the 1992 assessment tasks and the other group focused on the 2003 assessment tasks.

Bookmarks. For each literacy area (prose, document, and quantitative), the judges were given, in addition to descriptions of the performance levels, a booklet of assessment tasks arranged from easiest to hardest. The judges' job was to place a "bookmark" at the point in the ordered tasks separating those that adults at each level were "likely" to get right from those they were not. The term "likely" was defined as "67 percent of the time," or two out of three times, and statistical procedures were used to determine the score associated with a 67 percent probability of performing the task correctly. The bookmarks designated by the judges at the two sessions were combined to produce a single bookmark-based cut score for each performance level on each of the three literacy scales.
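The statistical step can be made concrete. NAAL scores are reported on a 0-500 scale estimated with item response theory (IRT); the Python sketch below shows how a two-parameter logistic model could be inverted to find the score at which the probability of answering a task correctly reaches 67 percent. The item parameters here are made up for illustration, since the factsheet does not give the actual model or values.

    import math

    def score_at_probability(difficulty, discrimination, p=0.67):
        # Two-parameter logistic IRT model:
        #   P(correct | theta) = 1 / (1 + exp(-a * (theta - b)))
        # Solving for theta at a target probability p gives:
        #   theta = b + ln(p / (1 - p)) / a
        return difficulty + math.log(p / (1 - p)) / discrimination

    # Hypothetical task: difficulty 275 on the 0-500 NAAL scale, with a
    # discrimination of 0.05 per scale point (both values are made up).
    cut = score_at_probability(difficulty=275, discrimination=0.05)
    print(f"Score with a 67 percent chance of success: {cut:.0f}")  # about 289

A bookmark placed just before this task would thus imply a cut score near 289 on that scale.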

Quasi-contrasting groups approach. To refine the bookmark-based cut scores, which indicated the lowest score to be included in each performance level, the committee used a procedure they termed the "quasi-contrasting groups approach." They compared the bookmark-based cut scores with the 1992 scores associated with various background variables, such as educational attainment. Background variables were selected for their potential usefulness in distinguishing between adjacent performance levels such as Basic and Below Basic (e.g., having some high school education vs. none at all; reporting that one reads well vs. not well; reading a newspaper sometimes vs. never; reading at work sometimes or more often vs. never).

For each pair of adjacent performance levels (Below Basic and Basic; Basic and Intermediate; Intermediate and Proficient), the midpoint between the average scores of the two contrasting groups was calculated and then averaged across the variables that provided contrasts between the groups, as sketched below. The committee developed a set of rules and procedures for deciding when and how to adjust the bookmark cut scores when the cut scores associated with the selected background variables differed from the bookmark-based scores.
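A minimal sketch of that midpoint-and-average calculation, using made-up group means for the Below Basic/Basic boundary (the factsheet does not report the actual scores):

    # Hypothetical average 1992 scores for the two contrasting groups defined
    # by each background variable (lower group, higher group); values made up.
    contrasts = {
        "some high school vs. none":             (205, 240),
        "reads well vs. not well":               (198, 236),
        "sometimes reads a newspaper vs. never": (210, 244),
        "sometimes reads at work vs. never":     (202, 238),
    }

    # Midpoint between the two group means for each variable ...
    midpoints = [(low + high) / 2 for low, high in contrasts.values()]

    # ... averaged across variables to yield one candidate cut score that can
    # be compared against the bookmark-based cut score.
    candidate_cut = sum(midpoints) / len(midpoints)
    print(f"Candidate Below Basic/Basic cut score: {candidate_cut:.0f}")  # about 222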

Who is classified as Nonliterate in English?
The NRC committee recommended that NCES distinguish a fifth group of adults with special importance to literacy policy: those who are nonliterate in English. As originally defined by the committee, this category consisted of adults who performed poorly on a set of easy screening tasks in 2003 and therefore were routed to an alternative assessment for the least-literate adults. Because the 1992 assessment included neither the alternative assessment nor the 2003 screening tasks, adults in this category cannot be identified for 1992.

To represent the nonliterate-in-English adult population more completely, NCES expanded the category to include not only the 3 percent of adults who took the alternative assessment, but also the 2 percent who could not be tested at all because they knew neither English nor Spanish (the other language spoken by interviewers). Thus, as defined by NCES, the category included about 5 percent of adults in 2003.

What refinements did NCES make before using the levels?
The new performance levels were presented to NCES as recommendations. Having accepted the general recommendations, NCES incorporated a few refinements before using the levels to report results. First, NCES changed the label of the top category from Advanced to Proficient because "proficient" better conveys how well adults in the upper category perform. Second, NCES added sample tasks from the 2003 assessment to illustrate the full range of tasks that adults at each level can perform, as well as a brief (one-sentence) summary description of each level, to enhance public understanding. Third, as described in the previous section, NCES included additional adults in the Nonliterate in English category.

For more information about NAAL and its components, visit the NAAL website at http://nces.ed.gov/naal or contact Sheida White, NAAL Project Officer, at the National Center for Education Statistics.