Schools and Staffing Survey (SASS)

1990-91 SASS Methods and Procedures

Questionnaire Design
Sampling Frames
Sample Design
Data Collection
Data Editing
Imputation
Weighting
Response Rates
Reinterviews
Manuals and Technical Reports

Questionnaire Design

The SASS continued to measure teacher shortage and demand; characteristics of elementary and secondary teachers; teacher workplace conditions; characteristics of school principals; and school programs and policies. Schools funded by the Bureau of Indian Affairs (BIA) became a new SASS component.

Sampling Frames

The sampling frame for the public school sample was the 1988-89 Common Core of Data (CCD) school file. The CCD is a universe file that includes all public elementary and secondary schools in the United States. Schools operated by the Department of Defense and schools offering only kindergarten, prekindergarten, or adult education were excluded from the SASS sample.

For private schools, the 1989–90 Private School Universe Survey (PSS) list frame was used to select the private school sample. An area frame supplemented the list frame, and the number of sample areas was increased from 75 to 123.

A separate universe of schools funded by the Bureau of Indian Affairs (BIA) was drawn from the Program Education Directory maintained by the BIA. To avoid duplication with the BIA frame, BIA schools in the CCD school file were treated as public schools.

Sample Design

The main design objective was to provide estimates of school characteristics for the following key analytical domains: the nation; elementary and secondary levels by public and private sectors; public schools by school level and state; and private schools by association group, region, and school level. The sample design for the 1990-91 SASS was modified to obtain estimates for Bureau of Indian Affairs (BIA) schools and American Indian schools (those with 25 percent or more Indian students). In the Teacher Survey, separate domains were included for Asian and Pacific Islander (API) teachers and for American Indian, Aleut, and Eskimo (AIAE) teachers.

The sample design also sought to control overlap between the two rounds of SASS. The proportion of overlap varied by sector and by stratum within sector, based on an evaluation of the tradeoff between improved estimates of change and expected effects on response rates.

Data Collection

The U.S. Census Bureau conducted the data collection, which began in September 1990 with advance letters to local education agencies (LEAs) that had one or more sampled schools. Schools in the sample also received advance letters requesting a list of their teachers. School questionnaires were mailed in December 1990 and January 1991. A second questionnaire was sent to nonrespondents a month later. Census field representatives called nonrespondents after three weeks and attempted to complete the interviews by telephone.

Data Editing

The U.S. Census Bureau performed the data processing. Each questionnaire was coded according to its status: for example, whether it contained a completed interview, the respondent refused to complete it, the school district had merged with another district, or the school had closed. The next step was to make a preliminary determination of each case's interview status, i.e., whether it was an interview, a noninterview, or out of scope (for example, a sampled school that had closed). A computer pre-edit program generated a list of cases with problems, as defined by edit specifications for each survey. After pre-edit corrections were made, each file was subjected to another computer edit consisting of a range check, a consistency edit, and a blanking edit.
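The sketch below illustrates, in Python, what these three edit types can look like in practice; the item names, ranges, and rules are hypothetical and are not the actual SASS edit specifications.

# Illustrative sketch of the three edit types described above; field names,
# valid ranges, and rules are hypothetical, not the actual SASS edit specifications.

def range_check(record, limits):
    """Flag items whose values fall outside their allowed range."""
    return [item for item, (low, high) in limits.items()
            if record.get(item) is not None and not low <= record[item] <= high]

def consistency_edit(record):
    """Flag logically contradictory combinations of items."""
    problems = []
    # Example rule: full-time teachers reported cannot exceed total teachers.
    if record.get("full_time_teachers", 0) > record.get("total_teachers", 0):
        problems.append("full_time_teachers > total_teachers")
    return problems

def blanking_edit(record):
    """Blank entries in items that should have been skipped."""
    # Example rule: if the school offers no grade 12, blank the graduates item.
    if record.get("offers_grade_12") == "no":
        record["num_graduates"] = None
    return record

school = {"total_teachers": 25, "full_time_teachers": 30,
          "offers_grade_12": "no", "num_graduates": 40}
print(range_check(school, {"total_teachers": (1, 5000)}))   # []
print(consistency_edit(school))                             # one flagged rule
print(blanking_edit(school))                                # num_graduates blanked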

After the completion of range, consistency and blanking edits, the records were put through another edit to make a final determination of whether the case was eligible for the survey and, if so, whether there were sufficient data for the case to be classified as an interview. A final interview status recode value was assigned to each case as a result of the edit.

Imputation

SASS used several methods to impute values for questionnaire items that respondents did not answer: (1) deductive imputation (using data from other items on the questionnaire); (2) hot deck imputation (extracting data from the record for a sample case with similar characteristics); and (3) clerical imputation (reviewing the data record, the sample file record, or other sources before deriving an entry consistent with the existing data). All School Survey items that were missing or failed consistency checks were imputed.
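The sketch below illustrates a simple form of hot deck imputation, in which a missing value is copied from a donor record that matches the incomplete case on selected characteristics; the matching variables and data are hypothetical rather than the actual SASS imputation cells.

# A minimal hot-deck sketch: a missing value is copied from a "donor" record
# that matches the incomplete case on selected characteristics.
# The matching variables and data are hypothetical.

def hot_deck_impute(records, item, match_keys):
    donors = {}
    for rec in records:                      # pass 1: remember a complete donor per cell
        if rec.get(item) is not None:
            donors[tuple(rec[k] for k in match_keys)] = rec[item]
    for rec in records:                      # pass 2: fill missing values from the matching cell
        if rec.get(item) is None:
            rec[item] = donors.get(tuple(rec[k] for k in match_keys))
    return records

schools = [
    {"sector": "public", "level": "elementary", "enrollment": 450},
    {"sector": "public", "level": "elementary", "enrollment": None},   # recipient
    {"sector": "private", "level": "secondary", "enrollment": 300},
]
hot_deck_impute(schools, "enrollment", ["sector", "level"])
print(schools[1]["enrollment"])  # 450, copied from the matching donor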


Weighting

Weighting of the sample units was carried out to produce national and state estimates for public schools, LEAs, administrators, and teachers. Private schools, administrators, and teachers were weighted to produce national and affiliation estimates. The weighting procedures used in the School Survey had three purposes: to account for the schools' selection probabilities; to reduce biases that may result from unit nonresponse; and to use available information from external sources to improve the precision of sample estimates.
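A minimal sketch of these three steps is shown below, assuming a basic weight equal to the inverse of the selection probability, a nonresponse adjustment computed within weighting cells, and a ratio adjustment to an external control total; the cell values are hypothetical.

# Sketch of the three weighting steps named above; cell totals are hypothetical.

def final_weight(prob_selection, cell_basic_weight_all, cell_basic_weight_resp,
                 external_total, cell_weighted_estimate):
    basic = 1.0 / prob_selection                              # step 1: selection probability
    nr_adj = cell_basic_weight_all / cell_basic_weight_resp   # step 2: nonresponse adjustment
    ratio_adj = external_total / cell_weighted_estimate       # step 3: ratio adjustment
    return basic * nr_adj * ratio_adj

# A school sampled with probability 1/50, in a cell where respondents carry 80% of
# the basic weight, and where the control total exceeds the weighted estimate by 5%:
print(final_weight(0.02, 1000.0, 800.0, 210.0, 200.0))  # 50 * 1.25 * 1.05 = 65.625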

Response Rates

Weighted response rates are defined as the number of in-scope responding questionnaires divided by the number of in-scope sample cases, using the basic weight (inverse of the probability of selection) of the record.
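The toy example below applies this definition, summing basic weights over in-scope responding cases and dividing by the basic-weight sum over all in-scope cases; the sample cases are hypothetical.

# Weighted response rate as defined above: the basic-weight sum of in-scope
# responding cases divided by the basic-weight sum of all in-scope cases.

def weighted_response_rate(cases):
    in_scope = [c for c in cases if c["in_scope"]]
    responded = sum(1.0 / c["prob_selection"] for c in in_scope if c["responded"])
    total = sum(1.0 / c["prob_selection"] for c in in_scope)
    return responded / total

cases = [
    {"prob_selection": 0.10, "in_scope": True,  "responded": True},
    {"prob_selection": 0.05, "in_scope": True,  "responded": False},
    {"prob_selection": 0.20, "in_scope": False, "responded": False},  # out of scope: excluded
]
print(f"{weighted_response_rate(cases):.1%}")  # 33.3% in this toy example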

1990-91 SASS Sample Sizes and Response Rates

Component                       Sample size   Weighted response rate
Public Schools
   Teacher Demand and Shortage        4,884                    93.5%
   Administrator                      9,054                    96.9%
   School                             8,969                    95.3%
   Teacher                           46,705                    90.3%
   Teacher Listing Form                   —                      95%
Private Schools
   Teacher Demand and Shortage        2,620                    83.9%
   Administrator                      2,757                    90.1%
   Teacher                            6,642                    83.6%
   Teacher Listing Form                   —                      90%
BIA Schools
   School                               101                    97.7%

Reinterviews

SASS conducted a reinterview of nearly 10 percent of the schools and principals in the sample. Schools that returned questionnaires by mail were reinterviewed by mail, and those responding by telephone were reinterviewed by telephone. Reinterview results were analyzed using an index of inconsistency, which measures the share of total variability that is accounted for by response variance; one form of the index was used for items with a dichotomous response and a separate form for items with more than two response categories.
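For illustration, the sketch below computes a commonly used form of the index of inconsistency for a dichotomous item (the gross difference rate divided by the variability expected if the two responses were independent); the paired responses are hypothetical, and the exact formulas used in the SASS reinterview analysis may differ.

# Common form of the index of inconsistency for a dichotomous (0/1) item.
# The paired original/reinterview responses below are hypothetical.

def index_of_inconsistency(original, reinterview):
    n = len(original)
    gdr = sum(o != r for o, r in zip(original, reinterview)) / n  # gross difference rate
    p1 = sum(original) / n        # proportion "yes" in the original interview
    p2 = sum(reinterview) / n     # proportion "yes" in the reinterview
    return gdr / (p1 * (1 - p2) + p2 * (1 - p1))

orig = [1, 1, 0, 0, 1, 0, 1, 0]
reint = [1, 0, 0, 0, 1, 1, 1, 0]
print(round(index_of_inconsistency(orig, reint), 2))  # 0.5 for this toy example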

Manuals and Technical Reports

NCES 93144: User's Manual: 1990-91 SASS Data File - Public Use Codebook and Bureau of Indian Affairs Codebooks
NCES 98312: CD-ROM: The Schools and Staffing Survey (SASS) and Teacher Followup Survey (TFS) CD-ROM: Electronic Codebook and Public-Use Data for Three Cycles of SASS and TFS
NCES 96338: An Exploratory Analysis of Response Rates in the 1990-91 Schools and Staffing Survey (SASS)
NCES 95342-I: Design Effects and Generalized Variance Functions for the 1990-91 Schools and Staffing Survey (SASS), Volume 1 (User's Manual)
NCES 95342-II: Design Effects and Generalized Variance Functions for the 1990-91 Schools and Staffing Survey (SASS), Volume II: Technical Report
NCES 2000308: Quality Profile for SASS Rounds 1-3: 1987-1995, Aspects of the Quality of Data in the Schools and Staffing Surveys (SASS)
NCES 93449: 1990-91 Schools and Staffing Survey: Sample Design and Estimation
NCES 9508: CCD Adjustment to the 1990-91 SASS: A Comparison of Estimates
NCES 9502: QED Estimates of the 1990-91 Schools and Staffing Survey: Deriving and Comparing QED School Estimates with CCD Estimates
NCES 9503: Schools and Staffing Survey: 1990-91 SASS Cross-Questionnaire Analysis
NCES 9403: 1991 Schools and Staffing Survey (SASS) Reinterview Response Variance Report
