ExpectMore.gov


Detailed Information on the
Advanced Placement Assessment

Program Code 10003300
Program Title Advanced Placement
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Competitive Grant Program
Assessment Year 2005
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 88%
Program Management 100%
Program Results/Accountability 42%
Program Funding Level
(in millions)
FY2007 $37
FY2008 $44
FY2009 $70

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Work with the College Board to improve data collection and analysis capabilities.

Action taken, but not completed The Department has continued to align its performance indicators with the best data available from the College Board. The Department will report 2008 data in September 2008.
2007

Require grant recipients that use API funds to develop and deliver advanced placement courses based on the content and curricular goals of the College Board's Advanced Placement courses, to participate in the College Board's course audit process, and to obtain authorization to use the "AP" designation for these courses on student transcripts.

Action taken, but not completed The program office has drafted regulations that would implement this action.
2007

Conduct on-site and desk monitoring of 10 AP Test Fee grantees to identify promising practices that improve the efficiency of State administration of the program and increase participation by eligible students. Draft and disseminate a promising practices report.

Action taken, but not completed The Department will carry out the monitoring in September and October of 2008.
2007

Require API grantees to promote the availability of Academic Competitiveness Grants and provide assistance to eligible students in meeting the necessary academic requirements, as well as in applying for assistance.

No action taken The program office has drafted regulations that would implement this action.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Work to present data for the new performance measures to the public in a transparent manner.

Completed The Department now reports annual Challenge Index data for each API grant recipient on its website.
2006

Use the performance data to drive program improvements, as part of Administration strategy to strengthen high school education.

Completed The Department has implemented a regular program of peer-to-peer technical assistance through which grantees that have been achieving promising performance results provide information and technical assistance to other grantees about how to replicate and adapt the strategies and practices they have found to be effective.
2006

Collect data for the new performance measures.

Completed We have collected data for the new performance measures from all recipients of current API grants.

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: The number of Advanced Placement tests taken by low-income students nationally.


Explanation: Beginning in 2005, this measure was changed to focus on public school students only. The previous measure reported on both public and non-public school students; the measure now aligns with the population served by the program. 2004 data are included for historical purposes.

Year Target Actual
2004 170,092 190,350
2005 183,314 223,263
2006 209,411 267,286
2007 230,352 286,028
2008 253,387
2009 278,726
2010 306,599
2011 337,258
2012 370,984
Long-term/Annual Outcome

Measure: The number of Advanced Placement tests taken by minority (Hispanic, Black, Native American) students nationally.


Explanation:

Year Target Actual
2004 baseline 267,608
2005 300,000 315,203
2006 336,000 359,372
2007 376,000 413,847
2008 421,000
2009 472,000
2010 528,000
2011 575,520
2012 621,562
Long-term/Annual Outcome

Measure: The number of Advanced Placement tests passed by low-income students.


Explanation: FY 2005 target is to establish a baseline.

Year Target Actual
2005 baseline 79,800
2006 90,009 95,350
2007 99,000 97,142
2008 103,728
2009 113,194
2010 126,660
2011 137,601
2012 152,845
Long-term Outcome

Measure: The number of Advanced Placement and International Baccalaureate tests taken in high schools served by API grants, divided by the total number of seniors enrolled at each school (the "Challenge Index").


Explanation: FY 2005 target is to establish a baseline. 2006 target is baseline plus 1%.

Year Target Actual
2006 baseline 0.6
2007 0.66
2008 0.67
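The arithmetic behind the Challenge Index can be sketched as follows. The school counts below are hypothetical, invented purely for illustration; they are not program data.

```python
def challenge_index(ap_ib_tests_taken, seniors_enrolled):
    """Challenge Index: total AP and IB tests taken at a school
    divided by the number of enrolled seniors."""
    if seniors_enrolled <= 0:
        raise ValueError("seniors_enrolled must be positive")
    return ap_ib_tests_taken / seniors_enrolled

# Hypothetical school: 240 AP/IB tests taken, 400 enrolled seniors.
print(round(challenge_index(240, 400), 2))  # 0.6
```

An index of 0.6 means three AP/IB tests were taken for every five enrolled seniors; a value above 1.0 would mean more tests were taken than there were seniors.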
Long-term Efficiency

Measure: Cost per passage of an Advanced Placement test by a low-income student (amount provided for AP Test Fees divided by total number of tests passed by low-income students).


Explanation: 2006 target is to establish a baseline.

Year Target Actual
2006 baseline $95.22
2007
2008
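The division behind this efficiency measure can be illustrated with hypothetical figures. The dollar amount and test count below are invented for illustration; they are not actual appropriation or passage data.

```python
def cost_per_passage(test_fee_funds, tests_passed):
    """Efficiency measure: AP Test Fee funds divided by the total
    number of AP tests passed by low-income students."""
    if tests_passed <= 0:
        raise ValueError("tests_passed must be positive")
    return test_fee_funds / tests_passed

# Hypothetical: $9.5 million in test-fee funds, 100,000 tests passed.
print(f"${cost_per_passage(9_500_000, 100_000):.2f}")  # $95.00
```

Because the denominator counts tests passed rather than tests taken, the measure rewards both broader participation and better preparation.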

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program purposes include: (1) supporting State and local efforts to raise academic standards through advanced placement programs; (2) providing access to advanced placement courses for secondary school students at schools that do not offer advanced placement programs, increasing the rate at which secondary school students participate in advanced placement courses, and increasing the number of students who receive advanced placement test scores for which college academic credit is awarded; and (3) increasing the participation of low-income individuals in advanced placement tests through the payment or partial payment of the costs of advanced placement test fees.

Evidence: Title I, Part G of the ESEA, as amended by NCLB, Section 1702.

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: Clifford Adelman's 1999 study "Answers in the Toolbox" concluded that the rigor of the high school curriculum (more than any other factor, including socio-economic status) is strongly associated with a student's chances of succeeding at the postsecondary level. Two major findings are: (1) the intensity of the high school curriculum contributes 41% to statistical measures of bachelor's degree completion; and (2) the intensity of the high school curriculum is a more important (positive) predictor of degree completion for African-American and Latino students than any other pre-college indicator (including socio-economic status). The case for academic preparation and rigor is also made in "Ready or Not: Creating a High School Diploma That Counts," a 2004 report by the American Diploma Project (made up of Achieve, Inc., The Education Trust, and Fordham University). Recent data show that minority students participate in Advanced Placement exams at rates below those of non-minority students. In 2003, according to College Board data, proportionately fewer African-American, American Indian, and Hispanic students than white students took AP exams: white students were three times as likely as African-American students to take AP exams, twice as likely as American Indian students, and one-third more likely than Hispanic students. In addition, participation in advanced placement programs remains highly correlated with family income. In 2003, students eligible for free or reduced-price lunch were only one-third as likely to take AP exams as students from higher-income families.

Evidence: Studies, including: "Answers in the Toolbox" by Clifford Adelman (NCES) and "Ready or Not: Creating a High School Diploma That Counts" by the American Diploma Project. College Board report: Advanced Placement Report to the Nation: 2005. College Board and NCES Common Core of Data: The number of exams taken by white public school students in 2003 divided by the total number of white students in public schools in grades 9-12 was 10.8 percent. The comparable percentages for African-American, American Indian, and Hispanic students were 3.3 percent, 4.1 percent, and 8.2 percent, respectively.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: Although other Federal education programs seek to boost the achievement of disadvantaged students and prepare them for college, the API Program alone supports efforts to boost enrollment of disadvantaged students in AP programs and expand the availability of those programs. While several States also subsidize AP test fees, the great majority do not, and Federal funds, by statute, supplement rather than supplant State efforts. Moreover, some ?? of the appropriation now goes to the "API" portion of the program, which does not duplicate other activities.

Evidence: Six States pay the entire test fee: AR, FL, GA, SC, MN, and WI. Six States pay part of the test fee: NE (2005), TX (2003 & 2005), MS (2003), KS (2003), VA (2003), NM (2003).

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: There is no evidence of significant design flaws in the program. The only issue has been State projections relative to need for AP Test Fee funds; several States have carryover funds remaining from the previous year. The "supplement, not supplant" requirement provides a major incentive for States not to cut back; they would risk losing their funding under the program if they did. Although the Department believes that API is the more important component of the program (because it provides for a much more comprehensive approach to the problem of uneven access to AP and challenging high school curricula), the programs are interdependent. About one-fourth of the money now goes into the AP Test Fee Program with ?? allocated to API. Therefore, the requirement to meet the demand for test fee payments does not constitute a real flaw in the program.

Evidence:  

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The API Program statute includes several program priorities, including support for applications that: (1) demonstrate a pervasive need for access to advanced placement incentive programs; and (2) target schools with a high (i.e., 40% or more by statutory definition) concentration of low-income students. In addition, priority points are awarded for the development of math, science, and English advanced placement and pre-advanced placement programs and courses.

Evidence: Title I, Part G of the ESEA - Section 1705 Absolute and competitive preference priorities developed for the API competitions in 2002, 2003, and 2005.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: Yes. The program has several long-term measures that focus on outcomes and reflect the program's purposes.

Evidence: The measures are: (1) Number of Advanced Placement (AP) tests taken by low-income students nationally. (2) Number of AP tests taken by minority (Hispanic, Black, Native American) students nationally. (3) Number of AP tests passed by low-income students.

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The program has developed three long-term measures with targets through 2010. The target for measure #1 increases by 10 percent annually. The target for measure #2 increases by 12 percent annually. The target for measure #3 increases by 10 percent annually.

Evidence: (1) 2005 - 183,315; 2006 - 220,000; 2007 - 242,000, 2008 - 266,000; 2009 - 293,000; 2010 - 322,000. (2) 2005 - 300,000; 2006 - 336,000; 2007 - 376,000; 2008 - 421,000; 2009 - 472,000; 2010 - 528,000. (3) 2005 - 70,649; 2006 - 77,714; 2007 - 85,486; 2008 - 94,034; 2009 - 103,438; 2010 - 113,782.

YES 12%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: In addition to the long-term measures (which have annual targets), the Department has adopted and will implement one annual measure (the "Challenge Index") and one efficiency measure.

Evidence: The measures are: (1) Number of AP and IB tests taken in high schools served by API grants, divided by the total number of juniors and seniors enrolled at each school ("Challenge Index" for API grantees). (2) Efficiency Measure: Cost per passage of AP test by a low-income student (amount provided for AP Test Fees divided by total number of tests passed by low-income students).

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The Department has baselines and targets for three of the five measures discussed above.

Evidence: See evidence in 2.2 - measure #1

YES 12%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Through annual performance reports, the Department confirms grantee commitment to working towards the program's goals of increasing access of low-income and disadvantaged students to both rigorous pre-advanced placement/advanced placement courses and programs and to the exams associated with these courses and programs. The College Board cooperates with the Department by providing annual data on the number of tests taken in each State, disaggregated by demographic group. The Department is working with International Baccalaureate of North America (IBNA) to provide the same data in FY 2005.

Evidence: The Department will determine how well partners/grantees are meeting the program's goals through the annual performance reports and through exam data provided annually by ETS via the College Board.

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: Although there has not been a formal, scientific evaluation, the Department's Program and Policy Studies Service (PPSS), working with an outside contractor, analyzes data from the College Board and produces an annual report on AP outcomes. This report will be published in the fall of 2005. PPSS plans to produce another report in 2006, analyzing, among other things, participation in AP and IB courses and exams. The College Board published the "Advanced Placement Report to the Nation 2005," which reported disaggregated data on AP participation and performance.

Evidence: "Access, Participation, and Performance in Advanced Placement" - PPSS (2005) www.collegeboard.com/prod_downloads/about/news_info/ap/2005/ap-report-nation.pdf

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The FY 2005 and 2006 budgets requested major increases for this program because of the significant unmet need and because the program aligns with the Administration's priorities. The increases will enable more students to take AP and pre-AP classes (consistent with our performance objectives), but the requests were not aligned with attainment of specific objectives in each year. The Department has satisfied the second part of the question because the Department's budget submissions show the full cost of the program (including S&E).

Evidence:  

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Department has developed and is implementing additional performance measures for the program. In addition, the Department is working with grantees, the College Board, and IBNA to ensure that, starting in FY 2005, the Department will collect and report data on all five annual measures (performance and efficiency) and all three long-term performance measures.

Evidence:  

YES 12%
Section 2 - Strategic Planning Score 88%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Department receives an annual report from the College Board that provides the number of AP exams taken by low-income students. The data are disaggregated and assist the Department in determining the degree of exam participation by minority and other disadvantaged students. ED staff use exam participation data in determining the annual level of funding for the AP Test Fee Program. The IBNA is collecting parallel information on IB participation. Program staff also determine grantee progress based upon information provided in the annual performance reports. If a grantee demonstrates that substantial progress is being made, then a continuation award is approved. If there is a failure to implement program activities in a timely manner and in accordance with the approved goals/objectives, then a continuation award may be denied or reduced. The Department has used API program grantee performance data since FY 2000 to improve program effectiveness. Due to an extremely general initial authorizing statute, API grants made in FY 2000 and 2001 lacked specific, measurable objectives. Under NCLB, a greatly enhanced program statute (i.e., clearly articulated program purposes/objectives and data collection requirements relative to both AP Test Fees and API) enabled the program office to compete and shape API grants funded under an absolute priority in FY 2002 and subsequent competitions.

Evidence: Annual College Board (i.e., ETS) exam data. Annual Performance Reports: program files.

YES 10%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: ED managers are held accountable for tracking program performance. ED managers are subject to EDPAS, which links employee performance to relevant Strategic Plan goals and is designed to measure the degree to which a manager contributes to improving program performance. Grantees are required to develop project evaluation plans that include specific, measurable, time-framed objectives. Grantees are held accountable through project monitoring and annual performance reports that are reviewed by federal managers. This information is used to evaluate grantee progress towards meeting established program goals and objectives. Continuation is not automatic and is contingent on demonstrating satisfactory progress.

Evidence: EDPAS agreements of: Assistant Deputy Secretary for Innovation and Improvement; Director of Improvement Programs; AP Program Manager. Annual Performance Reports.

YES 10%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated within the timeframes set out by Department schedules and used for the purposes intended. However, carryover of Test Fee funds in several States has caused the Department to decrease those States' funding levels until funds from past fiscal years are spent.

Evidence:  

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: IT improvements, including e-Reader and e-Applications, enhance program management efficiency and cost effectiveness by streamlining submission of grant applications and allowing for online and telephone grant review. Although the cost per application reviewed was slightly higher with the online review, this increase can be attributed to the Department's hiring of a contractor to handle all the logistics of the competition (photocopies, mailings, recruitment of reviewers, legal waivers of reviewers, payment of reviewers, coordination of facilitators and reviewers, etc.). This small additional cost allowed Department staff to concentrate more on the management of the AP program by freeing them from the logistical details of the competition. In addition, the program has developed a new efficiency measure.

Evidence: Data on cost/application: 2002 (in DC), 2003 (e-review), 2004 (e-review) forthcoming.

YES 10%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The program shares/exchanges information with the Magnet Schools Assistance Program (MSAP) on a regular basis since IB programs are also supported under the MSAP Program. In FY 2003 and 2005, several MSAP grantees applied for and received API grants. The program frequently shares information with the Office of Indian Education relative to ways in which schools with high numbers of Native American students might benefit from the AP Programs. The Department has seen an increase in the number of applications from grantees that serve Native Americans. The program also has a strong collaboration with the College Board and IBNA, which results in an efficient and thorough data collection and exchange. Additionally, Department staff often speak at AP and IBNA national conferences to share information about the APTF and API programs.

Evidence:  

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of this program.

Evidence:  

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The program has increased capacity through the addition of staff to the Advanced Placement team. The team holds regular team meetings to address management issues. The program has provided technical assistance to grantees during project director meetings held in D.C. and at the site of the national Advanced Placement conference (organized by the College Board) and by posting information on our website. Project officers provide timely and appropriate assistance on the Department's listserv to meet the individual needs of grantees.

Evidence:  

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: The program statute clearly defines the program purposes and priorities, which are reflected in the Federal Register notices when funds for either program are announced. States may receive AP Test Fee funds so long as they meet the statutory requirements. API awards are truly competitive and applicants must thoroughly address the selection criteria within the context of the absolute priority for the competition. Additionally, applicants may receive additional points for addressing the competitive preference priorities.

Evidence:  

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: On-site monitoring, monitoring of carryover funds, and review of the annual performance reports enable the program to effectively manage and advise grantees regarding the implementation of their grants. Once an onsite review has been completed, the grantee receives a report that includes: (1) an update of implementation progress to date; (2) a discussion of how well the grantee has met the approved objectives; (3) implementation issues; and (4) technical assistance on how the grantee should address issues or concerns raised during the review. Ongoing and consistent technical assistance is provided through the AP grantee listserv.

Evidence:  

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: While data are posted on the web in an aggregated format, grantee-level data are not currently available to the public. However, the Department is planning to post a synopsis of API grantee final reports on the web. Additionally, the Department is planning to release the first annual report to Congress, which will include national data on course taking, test-taking, and mastery rate for students in each State. This information will be disseminated via the Department's website in late summer or early fall of 2005.

Evidence: Program Performance Report: www.ed.gov/about/reports/annual/2006plan/edlite-g2eseaadvancedplace.html API grantee final reports "Access, Participation, and Performance in Advanced Placement: Annual Report to Congress 2005" (to be released in late summer or early fall 2005)

YES 10%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: Long-term measures for the program have been established. Data have been collected for measure #1, and targets have been met every year since 2001. Data have been collected for measure #2; a baseline and targets were recently established. Data have been collected for measure #3; a baseline and targets are being established.

Evidence:

LARGE EXTENT 17%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Performance measures for both AP Programs have been developed, and the Department has reported on AP exam participation since the program's inception. Every year, from 2001 through 2004, the Department has significantly exceeded its target for this measure. Baseline data on the other two annual measures will be collected in FY 2005.

Evidence: The number of AP tests taken by low-income students nationally. 2001: Actual - 112,891, Target - 112,200; 2002: Actual - 140,572, Target - 124,180; 2003: Actual - 166,649, Target - 154,629; 2004: Actual - 190,350, Target - 170,092.

LARGE EXTENT 17%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department has begun analyzing efficiency data.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: No data are available for such a comparison. No other federal programs perform a similar function. While several States and school districts pay for low-income students to take AP and IB tests, federal funds may only supplement, not supplant those funds, so no comparison can be made.

Evidence:  

NA  %
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Although there have been no scientific evaluations, the College Board and PPSS both report that AP enrollment, test taking, and mastery are increasing nationwide, including by low-income and minority students.

Evidence:  

SMALL EXTENT 8%
Section 4 - Program Results/Accountability Score 42%


Last updated: 09062008.2005SPR