
Search Results: (1-15 of 200 records)

 Pub Number  Title  Date
NCES 200801 2008 National Postsecondary Student Aid Study (NPSAS:08) Field Test Methodology Report
This report describes the methodology and findings of the NPSAS:08 field test, which took place in the 2006-07 school year. The NPSAS:08 field test was used to plan, implement, and evaluate methodological procedures, instruments, and systems proposed for use in the full-scale study scheduled for the 2007-08 school year.
9/10/2008
NCES 200601 2004/06 Beginning Postsecondary Students Longitudinal Study (BPS:04/06) Field Test Methodology Report
This report describes the methodology and findings for the field test of the 2004/06 Beginning Postsecondary Students Longitudinal Study (BPS:04/06). These students, who started their postsecondary education during the 2002-03 academic year, were first interviewed as part of the 2004 National Postsecondary Student Aid Study (NPSAS:04) field test. BPS:04/06 is the first follow-up of this cohort. The BPS:04/06 field test was used to plan, implement, and evaluate methodological procedures, instruments, and systems proposed for use in the full-scale study scheduled for the 2005-06 school year. The report describes the sampling design and methodologies used in the field test. It also describes data collection outcomes, including response rates, interview burden, and results of incentive and prompting experiments. In addition, the report provides details on the evaluation of data quality, covering reliability of responses, item nonresponse, and question delivery and data entry errors. Recommendations for the full-scale study are provided for the sampling design, locating and tracing procedures, interviewer training, data collection, and instrumentation.
5/17/2006
NCES 200501 2000 NAEP -- 1999 TIMSS Linking Report
The report describes the methodology used in an attempt to link the 2000 NAEP assessments in mathematics and science to the 1999 TIMSS assessments in those subjects. It also explains why the linking effort met with only limited success.
8/8/2005
NCES 200502 2004 National Postsecondary Student Aid Study (NPSAS:04) Field Test Methodology Report
This report describes the methodology and findings of the NPSAS:04 field test, which took place in the 2002-03 school year. The NPSAS:04 field test was used to plan, implement, and evaluate methodological procedures, instruments, and systems proposed for use in the full-scale study scheduled for the 2003-04 school year.
4/22/2005
NCES 200402 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03) Field Test Methodology Report
This report describes the methodology and findings of the B&B:93/03 field test interview, conducted in the spring and early summer of 2002, with 1991-92 bachelor's degree recipients. This is the third and final follow-up interview with this cohort.
11/18/2004
NCES 200401 2004 National Study of Postsecondary Faculty (NSOPF:04) Field Test Methodology Report
This report describes the methodology and findings of the NSOPF:04 field test that took place during the 2002-03 academic year. The NSOPF:04 field test was used to plan, implement, and evaluate methodological procedures, instruments, and systems proposed for use in the full-scale study scheduled for the 2003-04 academic year.
10/13/2004
NCES 200321 U.S. 2001 PIRLS Nonresponse Bias Analysis
The Progress in International Reading Literacy Study (PIRLS) of 2001 is a large international comparative study of the reading literacy of young students. The student population for the U.S. 2001 PIRLS was the set of all fourth-graders in the United States, corresponding to the grade in which the highest proportion of nine-year-olds is enrolled. Because the response rate from the original sample was below 85 percent, NCES investigated the potential magnitude of nonresponse bias at the school level. The methodology and results of this investigation are presented in this report; a brief illustrative sketch of this type of bias calculation follows this entry.
10/21/2003
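As context for the entry above, the following is a minimal sketch, in Python, of the kind of school-level nonresponse bias estimate such analyses commonly report. The function name and all figures are hypothetical and are not taken from the PIRLS report.

    # Approximate bias in a respondent mean for a characteristic known for all
    # sampled schools (e.g., a frame variable such as enrollment):
    # bias ~= (1 - response_rate) * (respondent mean - nonrespondent mean).
    def nonresponse_bias(respondent_mean, nonrespondent_mean, response_rate):
        return (1.0 - response_rate) * (respondent_mean - nonrespondent_mean)

    # Hypothetical figures: responding schools differ from nonresponding schools
    # on mean enrollment, and 78 percent of sampled schools responded.
    print(nonresponse_bias(520.0, 480.0, 0.78))  # ~8.8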
NCES 200320 Imputation Methodology for the National Postsecondary Student Aid Study: 2004
The working paper offers several improvements for the imputation of variable values. Using data from the 1999-2000 National Postsecondary Student Aid Study (NPSAS), the paper describes the procedures and results for resolving discrepancies in variable values obtained from several data sources; imputing missing values on variables with low levels of missing data; imputing values for derived variables of great policy importance; and estimating variances for imputed variable values. Identified improvements will be applied to the NPSAS:2004 study. A brief illustrative sketch of one common imputation approach follows this entry.
9/2/2003
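The following is a minimal sketch, in Python, of one common imputation approach (a simple within-class hot deck). It is illustrative only and does not reproduce the NPSAS procedures; all names and values are hypothetical.

    import random

    # Fill missing values of `key` with a value drawn at random from a donor
    # record in the same imputation class (defined by `class_var`).
    def hot_deck_impute(records, key, class_var):
        donors = {}
        for r in records:
            if r[key] is not None:
                donors.setdefault(r[class_var], []).append(r[key])
        for r in records:
            if r[key] is None and donors.get(r[class_var]):
                r[key] = random.choice(donors[r[class_var]])
        return records

    students = [
        {"sector": "public", "aid": 3500},
        {"sector": "public", "aid": None},   # missing value to be imputed
        {"sector": "private", "aid": 9000},
    ]
    print(hot_deck_impute(students, key="aid", class_var="sector"))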
NCES 200319 NAEP Quality Assurance Checks of the 2002 Reading Assessment Results for Delaware
In March 2003, NCES asked the Human Resources Research Organization (HumRRO) to participate in a special study of 2002 reading assessment results for Delaware. The study was based on seven technical factors aimed at exploring the data for potential problems: the sampling of Delaware students; the case weights of the Delaware data; the design for assigning test booklets; the scoring of Delaware data; scaling and equating for Delaware; coding data in Delaware; and a breach in test security in Delaware.
8/25/2003
NCES 200318 Report for Computation of Balanced Repeated Replicate (BRR) Weights for the Third (NELS 88:1994) and Fourth (NELS 88:2000) Follow-up Surveys
This report describes the procedures used to construct, and the statistical properties of, the two sets of Balanced Repeated Replicate (BRR) weights developed for the last two follow-ups of the National Education Longitudinal Study of 1988. A brief illustrative sketch of how such replicate weights are used follows this entry.
6/12/2003
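The following is a minimal sketch, in Python, of how replicate weights of this kind are typically used for variance estimation. The formula shown is the standard BRR estimator; the numbers are hypothetical and are not drawn from the study.

    # Standard BRR variance estimator: recompute the statistic with each set of
    # replicate weights, then average the squared deviations from the
    # full-sample estimate: var = (1/R) * sum_r (theta_r - theta)^2.
    def brr_variance(full_sample_estimate, replicate_estimates):
        R = len(replicate_estimates)
        return sum((t - full_sample_estimate) ** 2 for t in replicate_estimates) / R

    # Hypothetical full-sample estimate and four replicate estimates.
    print(brr_variance(0.62, [0.60, 0.63, 0.65, 0.59]))  # ~0.000575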
NCES 200314 Feasibility Studies of Two-Stage Testing in Large-Scale Educational Assessment: Implications for NAEP
This report discusses the rationale for enhancing the current NAEP design by adding a capacity for adaptive testing, in which items are tailored to the achievement level of the student. The authors conclude that implementation of adaptive testing procedures, two-stage testing in particular, has the potential to increase the usability and validity of NAEP results. Adaptive testing would permit adequately reliable scores to be reported to individual students and their parents, increasing their personal stake in performing well. Improvements in data quality would also speed data processing and permit delivery of assessment results in a timely manner. A toy sketch of two-stage routing follows this entry.
5/21/2003
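The following is a toy sketch, in Python, of the two-stage idea referenced above: a short routing block determines which second-stage block a student receives. The cutoffs and block labels are invented for illustration and do not reflect any actual NAEP design.

    # Route a student to a second-stage block based on the share of routing
    # items answered correctly. Cutoffs here are arbitrary illustrations.
    def route_second_stage(routing_score, max_routing_score=10):
        fraction_correct = routing_score / max_routing_score
        if fraction_correct < 0.4:
            return "easier second-stage block"
        if fraction_correct < 0.7:
            return "middle second-stage block"
        return "harder second-stage block"

    print(route_second_stage(8))  # harder second-stage block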
NCES 200315 Computer Use and Its Relation to Academic Achievement in Mathematics, Reading, and Writing
In this study, the authors, using evidence obtained from the 1996 NAEP assessment in mathematics and the 1998 NAEP main assessments in reading and writing, examine patterns of computer use and achievement in each of these three academic domains. The authors conclude that the design of the NAEP data collection precludes using such data to draw even tentative conclusions about the relationship between achievement and computer use. They recommend further study, including a multi-site experiment to determine how teachers and students are using computers and the impact of computers on achievement.
5/21/2003
NCES 200316 Implications of Electronic Technology for the NAEP Assessment
This report emphasizes the need for NAEP to integrate the use of technology into its assessment procedures, reviews major options, and suggests priorities to guide the integration. The author identifies three short-term goals for this development: a linear computer-administered assessment in a target subject area such as mathematics or science should be implemented; a computer-administered writing assessment should be developed and implemented; and the introduction and evaluation of technology-based test accommodations for handicapped students and English-language learners should be continued. The author suggests that NAEP consider redesign as an integrated electronic information system that would involve all aspects of the assessment process, including assessment delivery, scoring and interpretation, development of assessment frameworks, specification of populations and samples, collection of data, and preparation and dissemination of results.
5/21/2003
NCES 200317 The Effects of Finite Sampling on State Assessment Sample Requirements
This study addresses statistical techniques that might ameliorate some of the sampling problems currently facing states with small populations participating in State NAEP. The author explores how the application of finite population correction factors to the between-school component of variance could be used to modify the sample sizes required of states that currently qualify for exemptions from State NAEP's minimum sample requirements. He also examines how to preserve the infinite-population assumptions for hypothesis testing related to comparisons between domain means. Results lend support to alternate sample size specifications both in states with few schools and in states with many small schools. The author notes that permitting states to use design options other than the current State NAEP requirement could reduce costs related to test administration, scoring, and data processing. A brief illustrative sketch of the finite population correction follows this entry.
5/21/2003
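The following is a minimal sketch, in Python, of the finite population correction idea discussed above: when the sampled schools are a large share of all schools in a small state, the between-school variance component shrinks by the factor (1 - n/N). All numbers are hypothetical.

    # Apply the finite population correction (1 - n/N) to the between-school
    # variance component; a smaller adjusted component means a smaller school
    # sample can meet the same precision target.
    def fpc_adjusted_between_variance(between_var, sampled_schools, schools_in_state):
        fpc = 1.0 - sampled_schools / schools_in_state
        return between_var * fpc

    # Hypothetical small state: 30 of its 40 schools are sampled.
    print(fpc_adjusted_between_variance(400.0, 30, 40))  # 100.0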
NCES 200311 Reporting the Results of the National Assessment of Educational Progress
This paper explores ways the results of NAEP data collections might be communicated to a variety of audiences, each with differing needs for information, interests in its findings, and sophistication in interpreting its results. The author describes “market-basket” reporting as a feasible alternative to traditional NAEP reporting. These reports would include samples of items and exercises used in an assessment, together with their scoring rubrics, which would give a clearer picture of the kinds of skills assessed by NAEP, as well as an indication of skills not assessed. In the second section of the paper, the author cautions that, in order to uphold strict standards of data quality, NAEP reports must format and display results to make them more accessible while also discouraging readers from drawing overly broad interpretations of the data. A final section describes a detailed program of research on reporting and dissemination of NAEP findings based on three dimensions: the research questions to be asked, the audiences to whom the questions should be addressed, and the strategies through which the questions should be pursued, as well as the intersection of these dimensions. The author suggests that the highest priority be given to research on reporting through public media, followed by making NAEP reporting more understandable and useful to school curriculum and instruction personnel, reporting to the public, and further research with state education personnel.
5/20/2003