Contents: Questionnaire Design; Sampling Frame; Sample Design; Data Collection; Data Editing; Imputation; Weighting; Response Rates; Reinterviews; Manuals and Technical Reports
The SASS continued to measure the five major policy issues covered in previous rounds: teacher shortage and demand; characteristics of elementary and secondary teachers; teacher workplace conditions; characteristics of school principals; and school programs and policies. The 1993-94 SASS added three new components to give a more complete picture of elementary and secondary schools: a student records survey, a library media center survey, and a library media specialist/librarian survey.
The sampling frame for the public school sample was the 1991-92 Common Core of Data (CCD) school file. The CCD is a universe file that includes all public elementary and secondary schools in the United States. Schools operated by the Department of Defense, as well as schools offering only kindergarten, prekindergarten, or adult education, were excluded from the SASS sample.
The list frame used the 1991-92 Private School Survey (PSS) list, updated with association lists. An area frame supplement was based on the canvassing of private schools within specific geographical areas.
A separate universe of schools funded by the Bureau of Indian Affairs (BIA) was drawn from the Program Education Directory maintained by the BIA. To avoid duplication between the two frames, BIA schools that also appeared in the CCD school file were treated as public schools.
The main sample design objective was to provide estimates of acceptable precision for the specified domains of analysis. Estimates could be made for public schools by state and by school level within state, for schools with more than 25 percent Indian enrollment, and for Bureau of Indian Affairs-funded schools. Estimates for private schools could be made by affiliation and by school level.
Another objective was to balance the requirements of the school sample against the requirements of the samples of LEAs and teachers. The sample design called for selecting schools with probability proportionate to the square root of the school's number of teachers and, within each stratum, selecting a fixed number of teachers, subject to constraints on the total number of teachers selected in any one school.
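The probability-proportionate-to-square-root-of-size selection described above can be sketched as follows. The school frame, measure of size, and systematic selection routine are illustrative assumptions, not the actual SASS implementation.

```python
import math
import random

# Hypothetical school frame: (school_id, teacher_count).
schools = [("A", 4), ("B", 16), ("C", 36), ("D", 64), ("E", 100)]

def pps_sqrt_probabilities(frame, sample_size):
    """Selection probabilities proportional to sqrt(teacher count),
    scaled so the expected sample size equals `sample_size` (capped at 1)."""
    measures = [math.sqrt(t) for _, t in frame]
    total = sum(measures)
    return [min(1.0, sample_size * m / total) for m in measures]

def systematic_pps(frame, sample_size, rng=random):
    """One systematic PPS draw using the sqrt-size measure."""
    probs = pps_sqrt_probabilities(frame, sample_size)
    start = rng.random()
    cum, sample = 0.0, []
    for (sid, _), p in zip(frame, probs):
        before = cum
        cum += p
        # a selection point start + k falls in (before, cum] for some integer k
        if math.floor(cum - start) > math.floor(before - start):
            sample.append(sid)
    return sample

probs = pps_sqrt_probabilities(schools, 2)
# sqrt measures are 2, 4, 6, 8, 10 (total 30), so school A's probability is 2*2/30
```

Taking the square root of the size measure dampens the dominance of very large schools while still giving them a higher chance of selection, which helps balance the school sample against the teacher sample drawn within schools.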
The SASS sample design also sought to control sample overlap between SASS and other NCES school surveys and the overlap between the 1993-94 SASS and previous rounds of SASS.
The U.S. Census Bureau performed the data collection and began by sending advance letters to the sampled LEAs and schools in August and September, respectively. School questionnaires were mailed in December and a reminder postcard was sent several weeks later. In the past, SASS used a decentralized telephone follow-up for nonresponding teachers. During the 1993-94 SASS, the teacher survey split nonrespondents into a sample designated for traditional telephone follow-up and a sample for Computer-Assisted Telephone Interviewing (CATI) follow-up.
The U.S. Census Bureau performed the data processing. Each questionnaire was coded according to its status - for example, whether it contained a completed interview, whether a respondent refused to complete it, whether a school district had merged with another district, or whether a school had closed. The next step was a preliminary determination of each case's interview status: interview, noninterview, or out-of-scope (for example, a sampled school that had closed). A computer pre-edit program generated a list of cases with problems, as defined by the edit specifications for each survey. When necessary, the Census Bureau used extensive follow-up to collect data on key items from schools.
After the completion of range, consistency and blanking edits, the records were put through another edit to make a final determination of whether the case was eligible for the survey and, if so, whether there were sufficient data for the case to be classified as an interview. A final interview status recode value was assigned to each case as a result of the edit.
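As a sketch of what the three edit types check, consider a toy school record. The field names, valid ranges, and skip pattern below are hypothetical, not the actual SASS edit specifications.

```python
def run_edits(record):
    """Return a list of edit flags for one questionnaire record."""
    flags = []
    # Range edit: a value must fall within plausible bounds.
    if not (0 <= record.get("enrollment", 0) <= 10000):
        flags.append("range: enrollment out of bounds")
    # Consistency edit: related items must agree with each other.
    if record.get("teachers", 0) > record.get("enrollment", 0):
        flags.append("consistency: more teachers than students")
    # Blanking edit: an item skipped by a screener question must be blank.
    if record.get("has_library") == "no" and record.get("library_books") is not None:
        flags.append("blanking: library items answered but no library reported")
    return flags

clean = {"enrollment": 500, "teachers": 30, "has_library": "yes", "library_books": 8000}
bad = {"enrollment": 500, "teachers": 600, "has_library": "no", "library_books": 100}
```

A record that passes all three edits with no flags can then move on to the final interview-status determination; flagged records are candidates for follow-up or imputation.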
SASS used four methods to impute values for questionnaire items that respondents did not answer. These include the following: (1) using data from other items on the questionnaire, (2) extracting data from a related component of SASS, (3) extracting data from the sample file (PSS or CCD), and (4) extracting data from the record for a sample case with similar characteristics (commonly known as the "hot deck" method for imputing item response).
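Method (4), hot-deck imputation, can be sketched as follows: the most recent responding case in the same imputation class serves as the donor for a missing value. The grouping variables and records below are illustrative; the actual SASS imputation classes were more detailed.

```python
def hot_deck_impute(records, item, class_vars):
    """Fill missing `item` values from the last donor seen in the same
    imputation class, processing records in frame order."""
    donors = {}  # imputation class -> most recent reported value
    for rec in records:
        cls = tuple(rec[v] for v in class_vars)
        if rec.get(item) is not None:
            donors[cls] = rec[item]        # update the "deck" for this class
        elif cls in donors:
            rec[item] = donors[cls]        # impute from the donor
            rec[item + "_imputed"] = True  # flag the imputed value
    return records

records = [
    {"level": "elementary", "size": "small", "salary": 28000},
    {"level": "elementary", "size": "small", "salary": None},  # imputed as 28000
    {"level": "secondary", "size": "large", "salary": 35000},
    {"level": "secondary", "size": "large", "salary": None},   # imputed as 35000
]
hot_deck_impute(records, "salary", ["level", "size"])
```

Flagging imputed values, as in the sketch, lets analysts distinguish reported from imputed data on the released files.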
Weighting of the sample units was carried out to produce national and state estimates for public schools, LEAs, administrators and teachers. Private schools, administrators and teachers were weighted to produce national and affiliation estimates. The weighting procedures used in the SASS have three purposes: to take account of the school's selection probabilities; to reduce biases that may result from unit nonresponse; and to make use of available information from external sources to improve the precision of sample estimates.
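The first two purposes, base weighting and nonresponse adjustment, can be sketched with a cell-level adjustment. The selection probabilities and weighting cell are hypothetical; the actual SASS weighting also includes ratio adjustments to external frame counts.

```python
def adjusted_weights(units):
    """units: dicts with 'prob' (selection probability), 'cell'
    (weighting cell), and 'responded' (bool)."""
    for u in units:
        u["base_weight"] = 1.0 / u["prob"]
    # Nonresponse adjustment factor per cell: base-weighted eligible
    # units divided by base-weighted respondents in that cell.
    totals, resp = {}, {}
    for u in units:
        totals[u["cell"]] = totals.get(u["cell"], 0.0) + u["base_weight"]
        if u["responded"]:
            resp[u["cell"]] = resp.get(u["cell"], 0.0) + u["base_weight"]
    for u in units:
        if u["responded"]:
            u["final_weight"] = u["base_weight"] * totals[u["cell"]] / resp[u["cell"]]
    return units

units = adjusted_weights([
    {"prob": 0.5, "cell": "A", "responded": True},
    {"prob": 0.5, "cell": "A", "responded": False},
    {"prob": 0.25, "cell": "A", "responded": True},
])
# cell A: total base weight 2 + 2 + 4 = 8, responding 2 + 4 = 6, factor 8/6
```

The adjustment preserves the cell's base-weighted total: the respondents' final weights sum to what all eligible units would have represented, which is what reduces the bias from unit nonresponse.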
Weighted response rates are defined as the number of in-scope responding questionnaires divided by the number of in-scope sample cases, using the basic weight (inverse of the probability of selection) of the record. Teachers were sampled in two stages - schools first, then teachers within the sampled schools - so the weighted overall response rate for teachers reflects both the school's response to the Teacher Listing Form and the teacher's response to the questionnaire. For all other components, only one sampling stage was involved; therefore, for these components, the weighted overall response rate and the weighted response rate are the same.
| Component | Sample size | Weighted response rate |
|---|---|---|
| Public Schools | | |
| District | 5,363 | 93.9% |
| Administrator | 9,415 | 96.6% |
| School | 9,532 | 92.3% |
| Teacher | 53,003 | 88.2% |
| Teacher Listing Form | — | 95% |
| Library Media Center | 4,655 | 90.1% |
| Librarian | 4,175 | 92.3% |
| Student Record | 5,576 | 91.2% |
| Private Schools | | |
| Administrator | 3,038 | 87.6% |
| School | 3,074 | 83.2% |
| Teacher | 10,386 | 80.2% |
| Teacher Listing Form | — | 91% |
| Library Media Center | 2,067 | 70.7% |
| Librarian | 1,356 | 76.5% |
| Student Record | 1,371 | 88.4% |
| BIA Schools | | |
| Administrator | 150 | 98.7% |
| School | 153 | 99.3% |
| Teacher | 2,690 | 86.5% |
| Library Media Center | 142 | 89.4% |
| Librarian | 111 | 88.3% |
| Student Record | 142 | 89.4% |
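The weighted response rate defined above - in-scope responding basic weight over in-scope basic weight - can be sketched with toy cases; the probabilities and scope flags are illustrative.

```python
def weighted_response_rate(cases):
    """Basic weight = inverse selection probability; out-of-scope cases
    are excluded from both numerator and denominator."""
    num = sum(1.0 / c["prob"] for c in cases if c["in_scope"] and c["responded"])
    den = sum(1.0 / c["prob"] for c in cases if c["in_scope"])
    return num / den

cases = [
    {"prob": 0.1, "in_scope": True, "responded": True},    # weight 10
    {"prob": 0.2, "in_scope": True, "responded": False},   # weight 5
    {"prob": 0.5, "in_scope": False, "responded": False},  # excluded
]
rate = weighted_response_rate(cases)  # 10 / 15
```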
SASS conducted a reinterview of about 10 percent of schools and principals in the sample. Questionnaires were sent three or four weeks after the original interview; CATI reinterviews took place one or two weeks later. Reinterview results were analyzed using the index of inconsistency, which measures the proportion of total variance that is accounted for by response variance. One form of the index was used for items with dichotomous responses and a separate form for items with more than two response categories.
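For a dichotomous item, the index of inconsistency is conventionally computed as the gross difference rate between interview and reinterview divided by its expected value under independent responses. The sketch below uses illustrative counts, not SASS reinterview data.

```python
def index_of_inconsistency(a, b, c, d):
    """Counts from the 2x2 interview/reinterview table:
    a = yes/yes, b = yes/no, c = no/yes, d = no/no."""
    n = a + b + c + d
    gdr = (b + c) / n     # gross difference rate (disagreement rate)
    p1 = (a + b) / n      # "yes" proportion, original interview
    p2 = (a + c) / n      # "yes" proportion, reinterview
    expected = p1 * (1 - p2) + p2 * (1 - p1)
    return gdr / expected

i = index_of_inconsistency(40, 5, 5, 50)
# gdr = 0.10, p1 = p2 = 0.45, expected = 0.495, so i is roughly 0.20
```

Values of the index near 0 indicate that respondents answer consistently on reinterview; values approaching 1 indicate that disagreement is about as large as would occur if the two responses were unrelated.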
NCES 96142: 1993-94 Schools and Staffing Survey: Data File User's Manual, Volume I: Survey Documentation
NCES 97573: 1993-94 Schools and Staffing Survey: Data File User's Manual, Volume II: Restricted-Use Codebook
NCES 199912: 1993-94 Schools and Staffing Survey: Data File User's Manual, Volume III: Public-Use Codebook
NCES 199913: 1993-94 Schools and Staffing Survey: Data File User's Manual, Volume IV: Bureau of Indian Affairs (BIA) Restricted-Use Codebook
NCES 98312: CD-ROM: The Schools and Staffing Survey (SASS) and Teacher Followup Survey (TFS) CD-ROM: Electronic Codebook and Public-Use Data for Three Cycles of SASS and TFS
NCES 96089: 1993-94 Schools and Staffing Survey: Sample Design and Estimation
NCES 98243: An Analysis of Total Nonresponse in the 1993-94 Schools and Staffing Survey (SASS)
NCES 9605: Cognitive Research on the Teacher Listing Form for the Schools and Staffing Survey
NCES 9723: Further Cognitive Research on the Schools and Staffing Survey (SASS) Teacher Listing Form
NCES 9718: Improving the Mail Return Rates of SASS Surveys: A Review of the Literature
NCES 9615: Nested Structures: District-Level Data in the Schools and Staffing Survey
NCES 2000308: Quality Profile for SASS Rounds 1-3: 1987-1995, Aspects of the Quality of Data in the Schools and Staffing Surveys (SASS)