Award Abstract #9907447
Statistical Models for Monitoring Educational Progress
NSF Org: SES (Division of Social and Economic Sciences)
Initial Amendment Date: September 7, 1999
Latest Amendment Date: September 7, 1999
Award Number: 9907447
Award Instrument: Fellowship
Program Manager: Cheryl L. Eavey
    SES Division of Social and Economic Sciences
    SBE Directorate for Social, Behavioral & Economic Sciences
Start Date: September 1, 1999
Expires: August 31, 2000 (Estimated)
Awarded Amount to Date: $64,870
Investigator(s): Brian Junker brian@stat.cmu.edu (Principal Investigator)
Sponsor: Carnegie-Mellon University
    5000 Forbes Avenue
    Pittsburgh, PA 15213 412/268-8746
NSF Program(s): METHOD, MEASURE & STATS, STATISTICS
Field Application(s):
Program Reference Code(s): OTHR, 0000
Program Element Code(s): 1333, 1269
ABSTRACT
This award supports the investigator's work at the University of Pittsburgh's Learning Research and Development Center (LRDC). Projects to be initiated include: (1) analyzing school district data archives with an eye toward evaluating educational progress and monitoring the outcomes of educational innovations; (2) exploring social judgment in education, in particular in the development of an institutional portfolio rating system for classrooms and schools, based on the "Principles for Learning" of LRDC's Institute for Learning; and (3) laying technical groundwork for a bank of linked topical tests instantiating a purely standards-referenced testing program. All three are connected to ongoing research programs at LRDC.

These projects are designed to contribute to the development of data collection systems for adequate school accountability systems and for educational policy evaluation. Research conducted through the Institute for Learning and elsewhere suggests that sustained improvement in student achievement is most reliably attained through institutional change. Yet most currently implemented accountability systems focus instead on individual student outcomes, and they are often confounded with high-stakes decisions for individual students.

The first project will explore whether existing school district data archives can be exploited to limit additional individual student testing when student achievement data are called for. The second project will apply methodology developed over the past ten years for student portfolio assessment to the development and rating of institutional portfolios intended to show that local institutions (e.g., classrooms, schools, and districts) are engaged in a process of professional development that ensures long-term gains for students. The banked tests in the third project would each cover a fairly narrow topic, such as integer arithmetic or fractions, and could be used, for example, to assess the distribution of student achievement within a district, school, or classroom relative to specific learning standards.

This research is supported by the Methodology, Measurement, and Statistics Program and the Statistics and Probability Program under the Mid-Career Methodological Opportunities Fellowship Announcement.