ExpectMore.gov


Detailed Information on the
National Assessment of Educational Progress Assessment

Program Code 10000194
Program Title National Assessment of Educational Progress
Department Name Department of Education
Agency/Bureau Name Institute of Education Sciences
Program Type(s) Research and Development Program
Assessment Year 2003
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 100%
Program Management 70%
Program Results/Accountability 100%
Program Funding Level
(in millions)
FY2008 $93
FY2009 $104

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Completing the NAEP evaluation and determining which findings can be used to recommend program improvements.

Action taken, but not completed. The evaluation is being conducted by the Department's Policy and Program Studies Service (PPSS). PPSS put the final report into Departmental review in May 2008, revised it in response to comments, and submitted it for Department editorial review. PPSS expects to receive editorial comments in early spring 2009.
2008

Improving the timeliness of additional NAEP reports, including technical reports.

Action taken, but not completed. NCES will continue to track the time to release of additional NAEP reports, including technical reports, and will identify strategies to improve their timeliness. NCES is on track to report on progress by June 30, 2009.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2003

Improve the timeliness of NCES products and services, which include National Assessment activities.

Completed. NCES set a target of releasing NAEP data in support of NCLB to NAGB within 6 months of the end of data collection. This target was met for the 2005 NAEP. NCES will examine the time to complete other reports, including technical reports.
2005

Reporting data on the progress of improving the timeliness of the release of assessment data.

Completed. The Department reports data on the time to release of NAEP reports in support of No Child Left Behind.
2007

Identify NAEP-specific program performance measures to track future program performance.

Completed. The Department presented OMB with a proposal for additional performance measures and, based on the feedback received, developed a final set of additional measures.
2007

Conducting an analysis of interstate variation in student exclusion rates to help ensure the comparability of NAEP assessment results.

Completed. On November 18, 2008, IES released the report "Measuring the Status and Change of NAEP State Inclusion Rates for Students with Disabilities." The report examines the relationship between various characteristics of students with disabilities and the probability that they would be included in NAEP, as well as changes in inclusion rates from 2005 to 2007. The report is available at http://nces.ed.gov/nationsreportcard/pubs/studies/2009453.asp

Program Performance Measures

Term Type  
Long-term/Annual Efficiency

Measure: Timeliness of national NAEP reading and mathematics assessment data in support of the President's No Child Left Behind initiative. (The time from the end of data collection to the initial public release of results for the reading and mathematics assessments.)


Explanation: NCES has significantly reduced the time to release of the reading and mathematics assessments. When NCES adopted this measure, the time to public release was 15 months. In 2005, NCES released the reports to the National Assessment Governing Board in 6 months. (A sketch of the elapsed-months calculation follows the table below.)

Year Target Actual
2003 6 8
2005 6 6
2007 6 5.25
2009 6
2011 6
2013 6
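
The actuals above are reported in fractional months (for example, 5.25 in 2007). The following is a minimal sketch of one way to convert two dates into elapsed months, assuming an average month length of 365.25/12 days; both the convention and the example dates are illustrative, not NCES's documented method.

    from datetime import date

    AVG_DAYS_PER_MONTH = 365.25 / 12  # ~30.44 days; an assumed convention

    def months_to_release(end_of_collection: date, public_release: date) -> float:
        # Elapsed time in months from the end of data collection to release.
        days = (public_release - end_of_collection).days
        return round(days / AVG_DAYS_PER_MONTH, 2)

    # Illustrative dates only: collection ends January 31, release August 1.
    print(months_to_release(date(2007, 1, 31), date(2007, 8, 1)))  # 5.98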
Annual Efficiency

Measure: The percentage of NAEP reports on state-level 4th grade and 8th grade (and 12th grade if implemented) reading and mathematics assessments ready for release by the National Assessment Governing Board within 6 months of the end of data collection.


Explanation: The "time to release" is measured from the end of data collection to the time NCES releases the reports to the National Assessment Governing Board.

Year Target Actual
2005 Baseline 100
2007 100 100
2009 100
2011 100
2013 100
Long-term/Annual Efficiency

Measure: The percentage of NAEP initial releases (excluding the national and state reading and mathematics assessments, which are reported as separate measures) that either meet the target number of months from the end of data collection to release of the report, or show at least a 2-month improvement over the prior release. The target starts at 18 months in 2006 and declines to 16 months in 2007, 14 months in 2008, and 12 months in 2009 and beyond.


Explanation: The "time to release" is measured from the end of data collection to the time NCES releases the reports to the National Assessment Governing Board. (A sketch of the pass/fail rule follows the table below.)

Year Target Actual
2007 Baseline 80%
2008 80% 100%
2009 80%
2010 85%
2011 90%
2012 95%
2013 95%
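
The following is a minimal sketch of the pass/fail rule this measure describes: a release counts if it meets its year's target, or improves on the prior release of the same report by at least 2 months. The target schedule comes from the measure text; the function and parameter names are hypothetical.

    TARGETS = {2006: 18, 2007: 16, 2008: 14}  # 12 months for 2009 and beyond

    def target_months(year: int) -> int:
        # Target number of months from end of data collection to release.
        return TARGETS.get(year, 12 if year >= 2009 else 18)

    def counts_toward_measure(year, months, prior_months=None):
        # True if the release met the target, or beat the prior release of
        # the same report by at least 2 months.
        if months <= target_months(year):
            return True
        return prior_months is not None and prior_months - months >= 2

    # A 2008 release taking 15 months misses the 14-month target, but still
    # counts if the prior release of the same report took 17 or more months.
    print(counts_toward_measure(2008, 15, prior_months=17))  # True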
Long-term Outcome

Measure: The percentage of respondents who would recommend the Nation's Report Card to others and who would rely on the Nation's Report Card in the future as measured by the American Customer Satisfaction Index (ACSI).


Explanation: NCES survey participants are sampled from a combined list of NCES report requesters, data users, participants in meetings of constituent data providers for postsecondary institutions and for elementary and secondary state-level data, participants in NCES training sessions that are offered to interested users and providers of NCES data, and volunteers responding to an open solicitation among NCES web users. The ACSI is an indicator established in 1994 by the American Society for Quality and the University of Michigan Business School. The ACSI measures customer satisfaction in each of seven sectors: manufacturing-nondurables, manufacturing-durables, retail, services, finance/insurance, transportation/communications/utilities, and public administration/government. In each of the sectors, customer satisfaction is studied for its relationship to quality, profitability, and customer retention. The ACSI measures satisfaction with more than 100 different federal government programs and/or websites, including 28 programs of other information providers. In addition, five other federal statistical agencies (BLS, BEA, NASS, ERS, IRS) have one or more components of their programs evaluated through the ACSI. By participating in the ACSI, NCES will be able to use this common metric to compare customer satisfaction with its products and services against that of other ACSI participants.

Year Target Actual
2008 Set a Baseline
2010
2012
2014
Long-term Output

Measure: Number of web visits to the NAEP website (monthly average).


Explanation: Number of web visits is a program performance measure of dissemination. Specifically, it is a metric of user traffic that is employed by several statistical agencies (BTS, BJS, NCHS) to monitor the level of interest in the information provided on the agency website over time. The collection of these data will allow NCES to monitor changes in web usage associated with major data releases or new user tools, and will provide a basis for comparison with other statistical agencies. This measure is based on the number of unique visits. A unique visit is a series of actions that begins when a visitor views their first page from the server, and ends when the visitor leaves the site or remains idle beyond the idle-time limit. The default idle-time limit is 30 minutes. (A sketch of this visit-counting rule follows the table below.)

Year Target Actual
2008 Set a Baseline 66,464
2009
2010
2011
2012
2013
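
The following is a minimal sketch of the unique-visit definition given above, assuming a log of (visitor, timestamp) page views: a new visit begins at a visitor's first page view, or whenever the gap since that visitor's previous view exceeds the 30-minute idle limit. The data layout and names are invented for illustration.

    from datetime import datetime, timedelta

    IDLE_LIMIT = timedelta(minutes=30)  # default idle-time limit from the text

    def count_unique_visits(page_views):
        # page_views: iterable of (visitor_id, datetime) pairs.
        last_seen = {}  # visitor_id -> time of that visitor's latest page view
        visits = 0
        for visitor, ts in sorted(page_views, key=lambda pv: pv[1]):
            prev = last_seen.get(visitor)
            if prev is None or ts - prev > IDLE_LIMIT:
                visits += 1  # first page view, or idle limit exceeded
            last_seen[visitor] = ts
        return visits

    # A visitor returning after 45 idle minutes counts as a second visit.
    views = [("A", datetime(2008, 1, 1, 9, 0)),
             ("A", datetime(2008, 1, 1, 9, 10)),   # same visit (10-minute gap)
             ("A", datetime(2008, 1, 1, 9, 55))]   # new visit (45-minute gap)
    print(count_unique_visits(views))  # 2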
Long-term/Annual Output

Measure: Number of users of the Assessment Explorer data tool (monthly average).


Explanation:

Year Target Actual
2008 Set a baseline 7,063
2009
2010
2011
2012
2013
Long-term/Annual Output

Measure: Number of downloads of electronic versions of NAEP reports (monthly average).


Explanation:

Year Target Actual
2008 Set a Baseline 11,702
2009
2010
2011
2012
2013
Long-term/Annual Output

Measure: Number of times NAEP data are cited on the websites of 90 education associations and organizations.


Explanation: The 90 websites cover the range of elementary/secondary and postsecondary associations that represent data providers, education practitioners, education information dissemination experts, researchers, and education policy makers. These groups serve as an information source for their constituents, frequently repackaging relevant information to increase accessibility for their members. The list began as a list of potential recipients of education products and was supplemented based on the experience and knowledge of the team members who developed the monitoring program.

Year Target Actual
2008 Set a Baseline 41
2009
2010
2011
2012
2013
Long-term/Annual Efficiency

Measure: After adjustment for inflation, the average cost per completed case for the assessments (in 2006 dollars).


Explanation: Cost per case for a survey is an efficiency measure. The measure is the cost per case for the data collection component of the survey in 2006 dollars, and it refers to the calculated cost for each completed individual response in a survey. The measure is calculated by adding the total costs for a survey for data collection (specifically costs for mailing, interviewers, web data collection, and, where applicable, incentives for participation) and the total costs for processing of the data (including computer input and editing). These total costs are divided by the number of completed cases in the survey to create a "cost per case" for the survey. An interagency committee collaborated on the development of a framework for performance measures for federal statistical agencies and identified cost to produce a product as an efficiency measure. In particular, five other statistical agencies (Census, BLS, NASS, BTS, EIA) are using measures of unit cost to monitor efficiency of performance. NCES anticipates that, in light of increased resistance to voluntary participation in federal surveys and increased costs associated with transportation of field staff and materials, it will be difficult to maintain a level cost per case over time. (A sketch of the cost-per-case arithmetic follows the table below.)

Year Target Actual
2007 Set a Baseline $79.68
2009 $79.68
2011 $79.68
2013 $79.68
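
The following is a minimal sketch of the cost-per-case arithmetic described above: total data-collection and processing costs divided by the number of completed cases, then deflated to 2006 dollars with a price index. The deflator and dollar figures are invented for illustration; the actual NAEP cost categories and index are not specified here.

    def cost_per_case(collection_costs, processing_costs, completed_cases,
                      index_in_survey_year, index_in_2006):
        # Sum data-collection costs (mailing, interviewers, web collection,
        # incentives) and processing costs (computer input, editing), divide
        # by completed cases, then convert to constant 2006 dollars.
        nominal = (sum(collection_costs) + sum(processing_costs)) / completed_cases
        return nominal * (index_in_2006 / index_in_survey_year)

    # Illustrative figures only (not actual NAEP data): $5.2M collection plus
    # $1.1M processing over 75,000 completed cases, deflated with hypothetical
    # index values of 100 in 2006 and 106 in the survey year.
    print(round(cost_per_case([5_200_000], [1_100_000], 75_000, 106.0, 100.0), 2))
    # 79.25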

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The statute clearly states the purpose of the National Assessment of Educational Progress (NAEP): "to provide, in a timely manner, a fair and accurate measurement of student achievement and reporting trends in such achievement in reading, mathematics, and other subject matter."

Evidence: Sec. 303, National Assessment of Educational Progress Authorization Act

YES 20%
1.2

Does the program address a specific interest, problem or need?

Explanation: NAEP provides the only nationally representative and continuing assessment of what American students know and can do.

Evidence: Sec. 303, National Assessment of Educational Progress Authorization Act

YES 20%
1.3

Is the program designed to have a significant impact in addressing the interest, problem or need?

Explanation: See above.

Evidence: See above.

YES 20%
1.4

Is the program designed to make a unique contribution in addressing the interest, problem or need (i.e., not needlessly redundant of any other Federal, state, local or private efforts)?

Explanation: The National Center for Education Statistics (NCES) is organized according to policy area and core activity. The current administrative structure is successful in supporting NCES products and activities; however, the successful administration of the assessment program does not mean that continuous program improvements are not needed. The National Assessment Governing Board (NAGB) serves as the NAEP governing body and formulates policy guidelines for NAEP.

Evidence: Key NAEP reports provide useful information and are produced on schedule.

YES 20%
1.5

Is the program optimally designed to address the interest, problem or need?

Explanation:  

Evidence:  

NA 0%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation: The Office measures public benefit through satisfaction surveys. However, NCES should consider conducting surveys to determine how data are used, as well as evaluations to determine the effectiveness of NAEP data in informing educational decisions.

Evidence: Results of biennial customer surveys.

YES 20%
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation: N/A

Evidence:  

NA 0%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific, ambitious long-term performance goals that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Department of Education's GPRA Plan contains an NCES long-term goal to "Provide timely, useful, and comprehensive data that are relevant to policy and educational improvement." Performance targets are established through 2007.

Evidence: NCES GPRA goals.

YES 12%
2.2

Does the program have a limited number of annual performance goals that demonstrate progress toward achieving the long-term goals?

Explanation: NCES uses a survey to measure customer satisfaction goals related to product comprehensiveness, timeliness, and utility. Although this survey is administered only every two years, the Department of Education has demonstrated that biennial administration provides high-quality data for decision-making while reducing respondent burden and survey costs. A shortcoming of the performance measure, however, is that customer satisfaction data are reported for the Statistics and Assessment programs combined. The Assessment program will also monitor the timeliness of NAEP reports with a separate measure of the time from the end of data collection to the initial public release of results of the reading and mathematics assessments.

Evidence: NCES Customer Satisfaction Survey. NAEP reports.

YES 12%
2.3

Do all partners (grantees, sub-grantees, contractors, etc.) support program planning efforts by committing to the annual and/or long-term goals of the program?

Explanation: NCES conducts meetings with key constituents. Contractors, grantees, and the NCES Advisory Council were involved in the development and/or review of the NCES Information Quality Guidelines and Statistical Standards. In addition, each contractor and subcontractor is contractually committed to adhering to the NCES Information Quality Guidelines and Statistical Standards.

Evidence: Elementary and Secondary and Postsecondary data forums, technical review panels, contractor meetings, and the NCES Advisory Council for Education Statistics. NCES held separate review meetings with a cross-section of NCES contractors and grantees to receive input on the development of the Information Quality Guidelines and Statistical Standards.

YES 12%
2.4

Does the program collaborate and coordinate effectively with related programs that share similar goals and objectives?

Explanation:  

Evidence:  

NA 0%
2.5

Are independent and quality evaluations of sufficient scope conducted on a regular basis or as needed to fill gaps in performance information to support program improvements and evaluate effectiveness?

Explanation: External evaluations of Assessment activities include the work of the Board on Testing and Assessment (BOTA), an arm of the National Academies' National Research Council (NRC). In addition, in 2003 the Department will make an award for an independent review of NAEP.

Evidence: See above (BOTA) and www7.nationalacademies.org/bota/Evaluation_of_NAEP.html and www7.nationalacademies.org/bota/NAEP_Reporting_Practices.html. Reports include Grading the Nation's Report Card: Evaluating NAEP and Transforming the Assessment of Educational Progress (1999) and NAEP Reporting Practices: Investigating District-Level and Market-Basket Reporting (2001).

YES 12%
2.6

Is the program budget aligned with the program goals in such a way that the impact of funding, policy, and legislative changes on performance is readily known?

Explanation: Budget decisions are directly tied to the scope and methodological rigor of assessment activities.

Evidence: Budget calculations associated with NAEP authorization.

YES 12%
2.7

Has the program taken meaningful steps to address its strategic planning deficiencies?

Explanation: NAGB's long-range schedule of assessments provides appropriate opportunities to review and address strategic planning issues.

Evidence: NAGB documents and reports on the NAGB web site.

YES 12%
2.RD1

Is evaluation of the program's continuing relevance to mission, fields of science, and other "customer" needs conducted on a regular basis?

Explanation: See questions 2 and 5. In addition, NAEP is subject to an ongoing validity study by a panel of academic researchers.

Evidence: Customer survey; NAGB

YES 12%
2.RD2

Has the program identified clear priorities?

Explanation: In large part based on statutory guidance, NAGB has identified clear goals for the program.

Evidence: Statute and NAGB data collection and reporting schedules.

YES 12%
Section 2 - Strategic Planning Score 100%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: NCES uses customer satisfaction information to inform bureau products and services. NCES claims that biennial surveys are sufficient to measure satisfaction of customers and structure the creation and delivery of products.

Evidence: Customer satisfaction surveys.

YES 10%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, etc.) held accountable for cost, schedule and performance results?

Explanation: ED's managers are subject to the new EDPAS system, which links employee performance to success in meeting the goals of the Department's Strategic Plan. In general, managers are given individual performance agreements in which they are assigned responsibility for achieving relevant action steps outlined in the Strategic Plan. These action steps and other items included in managers' performance agreements are designed to measure the degree to which a manager contributes to improving program performance. Contractor and grantee performance is monitored on an annual basis through review and approval of annual budget plans, compliance reviews, audits, and site visits. Contractors and grantees that do not meet Federal requirements are required to submit improvement plans and can have awards reduced or discontinued for serious or persistent failures to comply.

Evidence:  

YES 10%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: The Assessment program successfully obligates funds by the end of each fiscal year, but should work to reduce penalty interest charges.

Evidence:  

YES 10%
3.4

Does the program have incentives and procedures (e.g., competitive sourcing/cost comparisons, IT improvements) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: Although NCES has been working on technological improvements that will improve data accuracy and timeliness, the Office does not have formal incentives and procedures for realizing efficiencies and cost effectiveness. Moreover, NCES should work to consolidate its project web architecture in order to promote interoperability and lower costs.

Evidence:  

NO 0%
3.5

Does the agency estimate and budget for the full annual costs of operating the program (including all administrative costs and allocated overhead) so that program performance changes are identified with changes in funding levels?

Explanation: Education's 2004 Budget satisfies the first part of the question by presenting the anticipated S&E expenditures (including retirement costs) for this program, which constitute 8.6 percent of the program's full costs. However, Education has not satisfied the second part of the question because program performance changes are not identified with changes in funding levels.

Evidence:  

NO 0%
3.6

Does the program use strong financial management practices?

Explanation: NCES follows Federal Procurement Regulations that prescribe procedures for monitoring poor performance, such as the issuance of cure notices and stop-work notices, and for executing termination as required. In addition, the conversion to performance-based contracts will further facilitate this monitoring activity.

Evidence:  

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The program is subject to the advice and consent of NAGB. NAGB oversight has led to several changes in the administration of the Assessment program.

Evidence:  

YES 10%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: NAEP is conducted through competitive awards to external firms.

Evidence:  

YES 10%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation: NAGB holds four public meetings a year. The meetings include discussion of procurement policy and future plans for the Assessment program. NCES holds bidders' conferences, places statements of work (SOWs) on the web, and conducts outreach at meetings and conferences.

Evidence:  

YES 10%
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation: NCES is beginning to use performance-based contracts that provide adequate opportunity for termination and amendment. NAGB provides oversight of NAEP activities and selects subject areas to be assessed (consistent with the statute). However, the Assessment program did not demonstrate that there is an effective plan in place for systematically determining when resources should be allocated to higher-priority activities or when specific data elements or reports should be terminated or overhauled. In addition, NCES needs to design a process wherein decisionmakers, including OMB and senior Departmental management, are apprised of significant contractual activity.

Evidence:  

NO 0%
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation: N/A

Evidence: N/A

NA 0%
Section 3 - Program Management Score 70%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome goal(s)?

Explanation: The Department of Education's GPRA Plan contains an NCES long-term goal to "Provide timely, useful, and comprehensive data that are relevant to policy and educational improvement." Measurement of this indicator shows that NCES is making progress in achieving long-term goals. Data for this indicator are available only for the Statistics program and NAEP combined, and therefore do not provide specific information for the NAEP program. However, NCES has added a second performance goal for NAEP: reducing the time from the end of data collection to the initial public release of the reading and mathematics assessment results. Data are not yet available for this indicator.

Evidence: GPRA Performance Plan.

YES 25%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: NCES continues to measure high levels of customer satisfaction.

Evidence:  

YES 25%
4.3

Does the program demonstrate improved efficiencies and cost effectiveness in achieving program goals each year?

Explanation: NCES staff work to improve data collection and reporting strategies, such as through the enhanced use of technology, in order to conduct work in a more cost-effective manner.

Evidence: NCES continues to modify product delivery so that publications and data are available electronically and on the web. Technological improvements have increased the timeliness of NCES products and services.

YES 25%
4.4

Does the performance of this program compare favorably to other programs with similar purpose and goals?

Explanation:  

Evidence:  

NA 0%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: NCES conducts reviews of individual projects to ensure high quality, and customer survey data show that customers are, overall, satisfied with the comprehensiveness, timeliness, and utility of publications, data files, and services. In addition, external evaluations of the Assessment program by BOTA indicate that Assessment activities produce quality products.

Evidence: Customer satisfaction surveys. NAEP validity studies by BOTA.

YES 25%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA 0%
Section 4 - Program Results/Accountability Score 100%


Last updated: 01092009.2003FALL