ExpectMore.gov


Detailed Information on the
Training and Advisory Services Assessment

Program Code 10002122
Program Title Training and Advisory Services
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Competitive Grant Program
Assessment Year 2008
Assessment Rating Adequate
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 75%
Program Management 90%
Program Results/Accountability 25%
Program Funding Level
(in millions)
FY2007 $7
FY2008 $7
FY2009 $7

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2008

Provide technical assistance to improve the rigor and quality of grantees' evaluations, such as workshops during the post-award period to grantees and their evaluators.

No action taken. Starting during the post-award period, the Department will provide workshops and additional technical assistance to grantees and their evaluators to improve the rigor and quality of their evaluations. The cooperative agreements will require grantees to submit revised evaluation plans within the post-award period and also updated annual evaluation plans, for Department review and approval.
2008

Require grantees to submit annual plans for Department review and measure annual performance against what they proposed to accomplish.

No action taken
2007

Use the findings from the customer satisfaction survey to identify areas in need of improvement and incorporate this information into the monitoring plan as appropriate.

No action taken. The Department has used the findings from the survey for program improvement, for example, encouraging Center directors to devote more attention to those Department priorities that clients rated lower in terms of results from EAC services. Program staff are incorporating these changes into the monitoring plan.
2007

Implement the new efficiency measure ("The number of working days it takes the Department to send a monitoring report to grantees after monitoring visits, both virtual and on-site") and collect baseline data.

No action taken. The Department has established a new GPRA efficiency measure. In FY 2008, the program office is piloting site visits with grantees, largely using videoconferencing to gather the information needed to monitor the program with greater efficiency than traditional on-site visits. The Department currently plans to use 45 working days as the target, but will set final targets after implementing the monitoring visits and gathering baseline data.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Revise, as necessary, the follow-up customer satisfaction survey and administer the survey in order to obtain information on the quality, relevance, and usefulness of program services.

Completed
2006

Implement the efficiency measure and continue work to establish and implement at least one additional efficiency measure.

Completed
2006

Develop long-term performance goals to assess program's effectiveness.

Completed

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: The percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, in eliminating, reducing, or preventing harassment, conflict, and school violence.


Explanation: Data are collected annually through a customer satisfaction survey administered by the Library of Congress's Federal Research Division.

Year Target Actual
2006 Baseline 66
2007 67 50
2008 68 July 08
2009 69 July 09
2010 70 July 10
2011 71 July 11
2012 72 July 12
2013 73 July 13
2014 74 July 14
Long-term/Annual Outcome

Measure: The percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, ensuring that students of different race, sex, and national origin have equitable opportunity for high-quality instruction.


Explanation: Data are collected annually through a customer satisfaction survey administered by the Library of Congress's Federal Research Division.

Year Target Actual
2006 Baseline 71
2007 72 82
2008 73 July 08
2009 74 July 09
2010 75 July 10
2011 76 July 11
2012 77 July 12
2013 78 July 13
2014 79 July 14
Long-term/Annual Outcome

Measure: The percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high quality.


Explanation: Data are collected annually through a customer satisfaction survey administered by the Library of Congress's Federal Research Division.

Year Target Actual
2007 Baseline 92
2008 90 July 08
2009 90 July 09
2010 90 July 10
2011 90 July 11
2012 90 July 12
2013 90 July 13
2014 90 July 14
Long-term/Annual Outcome

Measure: The percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high usefulness to their policies and practices.


Explanation: Data are collected annually through a customer satisfaction survey administered by the Library of Congress's Federal Research Division.

Year Target Actual
2006 Baseline 85
2007 86 88
2008 87 July 08
2009 88 July 09
2010 89 July 10
2011 90 July 11
2012 90 July 12
2013 90 July 13
2014 90 July 14
Long-term/Annual Efficiency

Measure: The percentage of Equity Assistance Center grant funds carried over in each year of the project.


Explanation:

Year Target Actual
2006 Baseline .624
2007 10 .138
2008 10 August 08
2009 10 August 09
2010 10 August 10
2011 10 August 11
2012 10 August 12
2013 10 August 13
2014 10 August 14
Long-term/Annual Efficiency

Measure: The number of working days it takes the Department to send a monitoring report to grantees after monitoring visits (both virtual and on-site).


Explanation: The Department intends to create a monitoring plan for site visits in 2008 and to complete a round of pilot visits by the end of 2008. The goal is to issue monitoring reports within 45 days of monitoring site visits.

Year Target Actual
2008 Baseline Dec 08

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program is designed to support the intent of Title IV of the Civil Rights Act by providing technical assistance and training to school boards and other responsible government agencies on issues related to desegregation in order to ensure that all children, regardless of race, gender, or national origin, have equal access to a quality education. Funded projects must provide technical assistance and training, upon request, in the desegregation assistance areas of race, sex, and national origin desegregation. Allowable activities are intended to assist school personnel, community members, parents, and students in coping with the preparation, adoption, and implementation of plans for the desegregation of public schools - which in this context means plans for equity (including desegregation based on race, sex, and national origin) - and in the development of effective methods of coping with special educational problems occasioned by desegregation.

Evidence: Title IV, Civil Rights Act of 1964, 42 U.S.C. 2000c-2000c-2, 2000c-5; 34 CFR Parts 270 and 272; Notice Inviting Applications for New Awards for FY 2005 "This program provides financial assistance to operate regional Desegregation Assistance Centers to enable them to provide technical assistance (including training) at the request of school boards and other responsible governmental agencies in the preparation, adoption, and implementation of plans for the desegregation of public schools, and in the development of effective methods of coping with special educational problems occasioned by desegregation."

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: Data and numerous studies indicate that discrimination and unequal access to a quality education remain problems. The 2007 NAEP reading assessment shows that, at the fourth grade level, the achievement gap between white and black students narrowed compared to the 2005 results but remains large and statistically significant, and the achievement gap between white and Hispanic students stayed the same and is also statistically significant. The gap between students who are eligible for free or reduced-price lunch and those who are not also did not narrow from the 2005 to the 2007 test administration. The EACs provide assistance to States and districts in developing plans to improve instruction for minority students who are performing below the expected achievement levels, using strategies and tools such as data-driven decision-making, Response to Intervention (RTI), and the Center for Applied Linguistics' Sheltered Instruction Observation Protocol (SIOP) model for English Language Learners. On the School Year 2006-2007 client survey, 82% of the respondents reported that, as a result of EAC services, their organizations developed, implemented, or improved their policies and practices in ensuring that students of different race, sex, and national origin have equitable opportunity for high-quality instruction; 73% in ensuring culturally relevant instruction; and 70% in improving academic opportunities for English Language Learners (ELLs). Ensuring that all students have access to a quality education, regardless of where they live or the economic status of their family, also continues to be a challenge. The Century Foundation Task Force on the Common School commissioned research that highlighted that "schools with a core of middle-class families are marked by higher expectations, higher-quality teachers, more-motivated students, more financial resources, and greater parental involvement" (p. 14). 
The Elementary and Secondary Education Act (ESEA) requires States to ensure that poor and minority children are not taught at higher rates than other children by inexperienced, unqualified, or out-of-field teachers. Nationwide, schools in the highest income quartile have 5 percent (elementary schools) and 8 percent (secondary schools) more teachers who meet the highly qualified teacher definition than schools in the lowest income quartile. The EACs support districts and schools in the development and implementation of equity plans, as well as materials and professional development activities designed to develop the skills of all teachers who work with minority student populations. In addition, the EACs provide assistance to States, districts, and schools with issues of disproportionality, i.e., over-representation of minority students in Special Education programs and under-representation of minority students in gifted and talented programs. As the U.S. Department of Education, Office for Civil Rights (OCR) observed in its Annual Report to Congress Fiscal Year 2005 (2006): "Students inappropriately placed in special education programs may not receive the same curriculum content as other students and may face barriers in their later efforts to obtain a regular high school diploma, pursue secondary education, or prepare for employment" (p. 9). School bullying and harassment have gained more national attention than ever before as we understand the damaging effects such practices can have on students. According to Indicators of School Crime and Safety: 2007, "in 2005, about 28 percent of 12- to 18-year-old students reported having been bullied at school during the last 6 months" (p. 34). The EACs address the prevention of violence, school bullying, and harassment through conferences, workshops, training, and consultations that raise awareness and help schools establish and implement appropriate policies and practices.

Evidence:
- FY05 Cooperative Agreements
- Dinkes, R., Cataldi, E.F., and Lin-Kelly, W. (2007). Indicators of School Crime and Safety: 2007 (NCES 2008-021/NCJ 219553). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, and Bureau of Justice Statistics, Office of Justice Programs, U.S. Department of Justice. Washington, DC. http://www.ojp.usdoj.gov/bjs/pub/pdf/iscs07.pdf
- Nansel, T.R., Overpeck, M.D., Haynie, D.L., Ruan, W.J., and Scheidt, P.C. (2003). Relationships Between Bullying and Violence Among U.S. Youth. Archives of Pediatric and Adolescent Medicine, 157(4): 348-353. http://archpedi.ama-assn.org/cgi/content/full/157/4/348
- 2007 National Assessment of Educational Progress (NAEP). http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2007496
- Weicker, Lowell, Chair (2002). Divided We Fail: Coming Together through Public School Choice: The Report of The Century Foundation Task Force on the Common School. NY: The Century Foundation Press. http://www.tcf.org/list.asp?type=PB&pubid=377
- ESEA, Sec. 1111(b)(8)(C); data on distribution of Highly Qualified Teachers, 2005-2006 School Year, Academic Improvement and Teacher Quality Programs Unit, U.S. Department of Education
- Annual Report to Congress Fiscal Year 2005 (2006). Office for Civil Rights, U.S. Department of Education
- Miller, P., and Meditz, S.W. (July 2007). U.S. Department of Education Equity Assistance Centers Program Client Survey for School Year 2006-2007. Washington, DC: Library of Congress, Federal Research Division.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: This program does not significantly overlap with other Federal or non-Federal efforts, including the efforts of State and local governments or the private and non-profit sectors. This is the only Federal education program that is legislatively required to provide technical assistance specifically for desegregation and special educational problems occasioned by desegregation. "'Desegregation assistance' means the provision of technical assistance (including training) in the areas of race, sex, and national origin desegregation of public elementary and secondary schools." This is also the only Federal program that focuses on providing technical assistance and training to ensure that children, regardless of race, gender or national origin, have equal access to a quality education. Other Federal entities that address desegregation, such as the U.S. Department of Justice (DOJ), focus on compliance issues and do not provide technical assistance. In fact, DOJ often asks the Equity Assistance Centers to provide the technical assistance needed to address compliance issues. There are few individuals at the State level (SEAs) designated as Equity Personnel, and they do not provide technical assistance. LEAs usually do not have the capacity to address desegregation-related issues and provide training and technical assistance.

Evidence: See P.L. 88-352, Title IV, Section 403; 42 U.S.C. 2000c-2000c-2, 2000c-5; and 34 CFR Part 270.3.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: Title IV of the Civil Rights Act (the authorizing statute) provides a broad authority for this type of activity and, thus, does not create any statutory impediments that limit its effectiveness. Even though the statute governing the program was authorized in 1964, there still exist hundreds of school districts throughout the country, but particularly in the South, that need assistance implementing court-ordered desegregation plans and working toward the achievement of unitary status. The program is designed as a competitive grant program that provides funds to public agencies (excluding State educational agencies or school boards), nonprofits or private organizations in each of ten different geographic regions. There is no evidence indicating that the structure of the program is flawed or that another approach or mechanism would be more efficient or effective in achieving the intended purpose. The design of the program, 10 Equity Assistance Centers serving different regions of the country, allows technical assistance to be provided nationwide while also being responsive to specific needs in different parts of the country. In terms of efficiency, it would be difficult for other mechanisms, such as contracts instead of grants, to deliver these services at the low cost of the current program, which has been $7 million in recent years.

Evidence: Title IV of the Civil Rights Act of 1964, 42 U.S.C. 2000c-2000c-2, 2000c-5; 34 CFR Parts 270 and 272; statements from EAC Directors and requests from school districts for assistance with implementing desegregation plans. The customer satisfaction ratings for high quality (92% of 199 respondents) and high usefulness (88% of 200 respondents) suggest that the program design is effective. Miller, P., and Meditz, S.W. (July 2007). U.S. Department of Education Equity Assistance Centers Program Client Survey for School Year 2006-2007. Washington, DC: Library of Congress, Federal Research Division.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The design of the program, 10 Equity Assistance Centers serving different regions of the country, allows technical assistance to be provided nationwide while also being responsive to specific needs in different parts of the country. The design of the program targets services to schools, districts, and other agencies that operate schools, reaching education decision-makers, administrators, teachers, and students, all of whom are the intended beneficiaries. The program funds activities to deliver services to schools, districts, SEAs, and organizations that serve them. The program's regulations stipulate that the centers work on request. While this design does not ensure that the centers work with the schools or districts that most need assistance, it does create a situation that is more likely to be successful. Given the small size of these grants, working on request seems to be a model that has a greater likelihood of bringing about change, as opposed to using resources to determine which schools and districts have the greatest need and then working to persuade those schools and districts of the value of the assistance that the EACs could provide.

Evidence: The program provides technical assistance and training to schools, school districts, and State educational agencies on issues related to desegregation and to help ensure that all children, regardless of race, gender, or national origin, have equal access to a quality education. 34 CFR 272.1 "This program provides financial assistance to operate Desegregation Assistance Centers to enable them to provide technical assistance (including training) at the request of school boards and other responsible governmental agencies in the preparation, adoption, and implementation of plans for the desegregation of public schools, and in the development of effective methods of coping with special educational problems occasioned by desegregation."

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: In coordination with other Department technical assistance programs, the program office established long-term performance measures for quality (1 measure), relevance (2 measures), and usefulness (1 measure). The measures for relevance and usefulness were announced in the notice of the FY 2005 competition for Equity Assistance Centers, published March 1, 2005. The measures are also used as the annual measures. Data for the four long-term performance measures are collected annually through a customer satisfaction survey. Assessing the effectiveness of technical assistance programs has proven to be challenging. For a relatively small program such as this one, the results of customer satisfaction surveys seem to serve as a meaningful proxy for determining the effectiveness of the EACs program. Ensuring that the products and services that customers receive from the EACs are of high quality is an important condition for the success of the program. Also important is for customers to believe that the products and services they have received are of high usefulness to their policies and practices. Finally, the measures for relevance focus specifically on intermediate policy and practices outcomes that are necessary for the program to achieve success in reducing harassment and violence and to help ensure equitable educational opportunities for all students.

Evidence: ED's Visual Performance Suite (VPS) system: The program has established the following long-term performance measures for quality, relevance, and usefulness: GPRA measure 1.3 for quality: The percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high quality. GPRA measure 1.1 for relevance: The percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, in eliminating, reducing, or preventing harassment, conflict, and school violence. GPRA measure 1.2 for relevance: The percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, ensuring that students of different race, sex, and national origin have equitable opportunity for high-quality instruction. GPRA measure 1.4 for usefulness: The percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high usefulness to their policies and practices.

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The Department has established ambitious targets and time frames for the long-term measures of quality (1 measure), relevance (2 measures), and usefulness (1 measure). Baseline data were set in 2006 for the outcome measures for relevance and usefulness, and in 2007 for the outcome measure for quality. The Department has set the long-term target for each of these measures at 90% and plans to keep the targets constant when they reach 90%, because of fluctuations that occur when only a few responses change. For example, with a total of 200 respondents, it takes only 6 respondents to make a difference of 3% in the findings. With a total of 100 respondents, 6 respondents would make a difference of 6%. The amount of time the Department estimates it will take to reach this 90% target depends on the baseline data for a given measure. Consequently, while the target for the quality measure is 90% in 2008, the 90% target is not set until further out in the future for the other measures. Comparable data from other technical assistance programs are not available to help inform the setting of EAC GPRA targets. For example, the first report with findings from the national evaluation of the Comprehensive Assistance Centers is not expected to be available until summer 2009. Initial findings may be available by the end of July 2008; however, comparability will be weak because data for the Comprehensive Assistance Centers GPRA measures will be calculated from the judgments of review panels or target audiences instead of the judgments of the customers. Similarly, the GPRA measures for the Special Education Technical Assistance and Dissemination (TA&D) program are calculated as the percentage of products and services deemed by review panels to be of high quality, high relevance, and useful, instead of the percentage of customers who rate the products and services to be of high quality, high relevance, and high usefulness.
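The sensitivity to small response shifts described above can be sketched as a quick calculation (the helper function is hypothetical, purely illustrative, and not part of the assessment):

```python
def point_swing(changed_respondents: int, total_respondents: int) -> float:
    """Percentage-point change in a survey result when a given number of
    respondents change their answers, holding the total response count fixed."""
    return 100.0 * changed_respondents / total_respondents

# Numbers mirror the examples in the text: with 200 respondents, 6 changed
# answers move the result by 3 points; with 100 respondents, by 6 points.
print(point_swing(6, 200))  # 3.0
print(point_swing(6, 100))  # 6.0
```

This is why the Department plans to hold targets constant at 90% once reached: at these response volumes, year-to-year movement of a few points can reflect a handful of respondents rather than a real change in program performance.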

Evidence:
- Targets for GPRA measure 1.3 (the percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high quality): 90% (FY 2008), 90% (FY 2009), 90% (FY 2010), 90% (FY 2011), 90% (FY 2012), 90% (FY 2013).
- Targets for GPRA measure 1.1 (the percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, in eliminating, reducing, or preventing harassment, conflict, and school violence): 68% (FY 2008), 69% (FY 2009), 70% (FY 2010), 71% (FY 2011), 72% (FY 2012), 73% (FY 2013).
- Targets for GPRA measure 1.2 (the percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, ensuring that students of different race, sex, and national origin have equitable opportunity for high-quality instruction): 73% (FY 2008), 74% (FY 2009), 75% (FY 2010), 76% (FY 2011), 77% (FY 2012), 78% (FY 2013).
- Targets for GPRA measure 1.4 (the percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high usefulness to their policies and practices): 87% (FY 2008), 88% (FY 2009), 89% (FY 2010), 90% (FY 2011), 90% (FY 2012), 90% (FY 2013).
- (EAC) Training and Advisory Services FY 2007 (OESE) GPRA performance report (http://www.ed.gov/about/reports/annual/2007report/g2cratraining.doc)
- Comprehensive Assistance Centers (OESE) FY 2007 GPRA performance report (http://www.ed.gov/about/reports/annual/2007report/g2esracomprehensive.doc)
- Special Education Technical Assistance and Dissemination FY 2009 GPRA performance plan (http://www.ed.gov/about/reports/annual/2009plan/g1specedta.doc)

YES 12%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: In coordination with other Department technical assistance programs, the program has established performance measures for quality (1 measure), relevance (2 measures), and usefulness (1 measure). These are both annual performance measures and also long-term measures. The annual measures track regular progress toward achieving the long-term targets. The measures for relevance and usefulness were announced in the notice of the FY 2005 competition for new Equity Assistance Centers, published March 1, 2005. Data for the four long-term/annual performance measures are collected annually through a customer satisfaction survey. In addition, the program has established two annual efficiency measures. The first efficiency measure tracks grantee spending during the year, with the goal of reducing the amount of funds grantees carry over from one year to the next. While the measure cannot ensure quality implementation, it does provide an indication of fiscal responsibility, and poor fiscal management is often an indicator of poor project management. The second efficiency measure, to be implemented in 2008, tracks the number of days it takes the Department to provide written feedback to grantees after a monitoring visit (with a goal of 45 days). Conducting regular monitoring visits and providing feedback to grantees in a timely manner is consistent with the Department's goal of improving program oversight and improving program outcomes.

Evidence: The program has established the following performance measures for quality, relevance, and usefulness that are both annual and long-term measures:
- GPRA measure 1.3 for quality: The percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high quality.
- GPRA measure 1.1 for relevance: The percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, in eliminating, reducing, or preventing harassment, conflict, and school violence.
- GPRA measure 1.2 for relevance: The percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, ensuring that students of different race, sex, and national origin have equitable opportunity for high-quality instruction.
- GPRA measure 1.4 for usefulness: The percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high usefulness to their policies and practices.
In addition, the program has established the following annual efficiency measures:
- GPRA 2.1: The percentage of Equity Center grant funds carried over in each year of the project.
- GPRA 2.2: The number of working days it takes the Department to send a monitoring report to grantees after monitoring visits (both virtual and on-site).

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The program has established baselines and ambitious targets for its annual measures of quality, relevance, and usefulness, and for one of the two annual efficiency measures (carryover of grant funds). Ultimately, the target for the other annual efficiency measure (the number of working days it takes the Department to send a monitoring report after monitoring visits) will be 45 days; the program will establish intermediate targets after completing the pilot site visits in FY 2008 and gathering baseline data in FY 2009. In setting the annual targets, the program office called for continuous improvement from the baseline data. The 2008 targets for quality and usefulness are high: 90% and 87%, respectively. The 2008 target of 68% for relevance is much higher than the 2007 actual performance level (50%), in part because the question on the 2007 client survey was more closely aligned with the GPRA measure than on the 2006 survey. In addition, the program office is working with grantees and emphasizing the importance of their technical assistance concerning harassment, conflict, and school violence, and we expect to see resulting improvements in performance. On the second relevance measure, performance rose from 71% in 2006 to 82% in 2007, in part because the question on the 2007 survey was more closely aligned with the GPRA measure; consequently, the Department is considering raising the targets, which currently start at 73% for FY 2008. The Department plans to review all the targets after collecting performance data for three years and to consider whether any changes in the targets are warranted.

Evidence:
- Targets for GPRA measure 1.3 (the percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high quality): 90% (FY 2008), 90% (FY 2009), 90% (FY 2010), 90% (FY 2011), 90% (FY 2012), 90% (FY 2013).
- Targets for GPRA measure 1.1 (the percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, in eliminating, reducing, or preventing harassment, conflict, and school violence): 68% (FY 2008), 69% (FY 2009), 70% (FY 2010), 71% (FY 2011), 72% (FY 2012), 73% (FY 2013).
- Targets for GPRA measure 1.2 (the percentage of customers of Equity Assistance Centers that develop, implement, or improve their policies or practices, or both, ensuring that students of different race, sex, and national origin have equitable opportunity for high-quality instruction): 73% (FY 2008), 74% (FY 2009), 75% (FY 2010), 76% (FY 2011), 77% (FY 2012), 78% (FY 2013).
- Targets for GPRA measure 1.4 (the percentage of customers who report that the products and services they received from the Equity Assistance Centers are of high usefulness to their policies and practices): 87% (FY 2008), 88% (FY 2009), 89% (FY 2010), 90% (FY 2011), 90% (FY 2012), 90% (FY 2013).
- Targets for GPRA measure 2.1 (the percentage of Equity Center grant funds carried over in each year of the project): 10% (FY 2008), 10% (FY 2009), 10% (FY 2010), 10% (FY 2011), 10% (FY 2012), 10% (FY 2013).

YES 12%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: All grantees, the program's partners, operate under a revised Cooperative Agreement (FY 2005) and provide information in their annual performance reports. The program office worked with the Department's Data Quality Initiative in 2007 and 2008 to improve the response rate on the customer satisfaction survey. The Library of Congress administered the customer satisfaction survey and analyzed the results for the program in FY 2002, 2006, and 2007, and will administer the survey and analyze the data again in FY 2008, ensuring consistent administration of the survey. The survey was not administered in FY 2003-2005.

Evidence: Cooperative Agreements were aligned with the No Child Left Behind legislation and were designed to ensure that grantees focus on the program purpose and performance measures. The notice of the FY 2005 competition for new Equity Assistance Centers, published March 1, 2005, announced that the Department would collect data from grantees on the performance measures for relevance and usefulness. Client Survey for 2006-2007; Client Survey for 2007-2008.

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The Department does not conduct formal evaluations of this program, because its small size ($7 million) does not justify spending the funds needed for a formal evaluation. However, the Department conducts annual customer satisfaction surveys, administered by the Library of Congress. The Department has undertaken steps to improve data quality and the response rate. Data from the annual customer satisfaction surveys are used for program management, program improvement, and GPRA reporting on performance measures. In addition, grantees contract with third-party evaluators to conduct evaluations of the work of the Centers, and the evaluators share findings from their formative evaluations with the Center Directors throughout the year to help them improve their work. The evaluation methods and content of the reports vary, depending on what each grantee proposed for the evaluation in the grant application. Grantees submit evaluation reports to the Department annually. The program office reviews the evaluation reports as part of grant monitoring, comparing the work described in the evaluation reports with the goals, objectives, and activities proposed in the grant applications, and examining information on the nature of technical assistance services and products provided, description and numbers of clients served, and data on quality, relevance, and usefulness. The Department provides feedback to grantees and conducts workshops to help improve the quality of the project evaluations. For example, one workshop discussed the importance of evaluations that address results at the school and district levels. 
If sufficient funding were available and burden were not problematic, a comprehensive understanding of the need for the EAC program, the nature and scope of program services, and the outcomes could be gained by hiring an independent contractor to conduct a national evaluation that included:
- Site visits, interviews, and document reviews for each of the 10 Centers
- Interviews and document reviews at the U.S. Department of Education
- Client surveys and interviews to provide greater depth than the current annual client satisfaction survey
- Panel reviews of EAC products and services to complement the current annual client satisfaction survey
- In-depth case studies to examine the goals and objectives of the program and each Center, the extent to which the program and the Centers have achieved their goals and objectives, and the factors that have influenced the work of the Centers and the results

Evidence: The Department took action to improve the data quality of the annual customer satisfaction survey by aligning the survey questions more directly with GPRA measures and incorporating follow-up procedures, resulting in an increase in the response rate from 48% in 2006 to 76% in 2007. Moreover, the Department has used the findings from the survey for program improvement. For example, the program office has explored the nature of requests from clients for services and has emphasized in meetings with the Center directors the importance of eliminating, reducing, or preventing harassment, conflict, and school violence, based on findings that only 50% of the 2007 survey respondents said that their organization developed, implemented, or improved their policies and practices in eliminating, reducing, or preventing harassment, conflict, and school violence as a result of EAC services (compared to 82% of the 2007 survey respondents who said that their organization developed, implemented, or improved their policies and practices in ensuring that students of different race, sex, and national origin have equitable opportunity for high-quality instruction as a result of EAC services). Miller, P. and Meditz, S.W. (July 2007). U.S. Department of Education Equity Assistance Centers Program Client Survey for School Year 2006-2007. Federal Research Division, Library of Congress, Washington, DC. EAC grantee evaluation reports. Evaluation of the Comprehensive Assistance Centers Statement of Work. Evaluation of the Comprehensive Assistance Centers information collection clearance package (http://edicsweb.ed.gov/browse/browsecoll.cfm?pkg_serial_num=3414).

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: ED has not satisfied the first part of the question because program performance changes are not matched with changes in funding levels. However, ED has satisfied the second part of this question in that the Department's budget submissions show the full cost of the program (including S&E).

Evidence: Congressional budget justifications

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Department established long-term and annual performance measures for the program; collects data annually on these measures; has established baselines and ambitious targets for all the outcome measures and one of the efficiency measures; and will collect baseline data and establish targets for the remaining efficiency measure in FY 2009. The program office worked with ED's Data Quality Initiative to improve the survey response rate and also revised the survey to improve the alignment between the survey questions and the program's GPRA measures.

Evidence: ED's VPS system; Revised customer satisfaction survey for 2008

YES 12%
Section 2 - Strategic Planning Score 75%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The key program partners are the grantees. Through monitoring, the program office systematically examines how each grantee is implementing the program. The program office has policies, standards, and procedures outlined in the Department's Handbook for the Discretionary Grant Process and the program office's Comprehensive Monitoring Procedures Handbook for SSTP Programs. The Department collects annual and current data through the annual performance reports and annual evaluation reports submitted by the Centers. The Department uses these reports to monitor grantee performance and to manage and improve the program. Grantee performance reports address project goals, objectives, progress, outcomes, and budget expenditures; explain any difficulties or lack of progress; describe the nature of technical assistance provided and clients served; give quantitative information on their services and clients; and include data on the program performance measures for quality, relevance, and usefulness. Reports from the evaluations conducted by third-party evaluators vary according to the evaluation plan in the individual grantee's application. In general, they provide more detailed and more in-depth information about the Centers' technical assistance and clients than the performance reports do, and describe the data collection methods and analysis. The program office reviews the annual performance reports, evaluation reports, and other sources of information mentioned in this section of the PART, assessing progress by comparing the work described with the goals, objectives, and activities proposed in the grant applications, and examining information on the nature of technical assistance services and products provided, description and numbers of clients served, outcomes from the EACs' work, and data on quality, relevance, and usefulness.
The Department uses the information to identify and call the individual EACs' attention to related issues and work going on at other EACs and in other Department programs, such as Title I, the Comprehensive Assistance Centers program, and the Special Education Technical Assistance and Dissemination Network program. For example, the Department has found an increase in services and training provided by the EACs from FY 2006 to FY 2007 in areas such as Response to Intervention (RTI) and disproportionality, i.e., over-representation of minority students in Special Education programs and under-representation of minority students in gifted and talented programs. As a result, the Department then supported in-depth presentations and the distribution of tools at the Equity Assistance Centers Directors meeting in February 2008 to help Center Directors improve their services and training on RTI and disproportionality. To supplement information from the grantees, the annual survey examines customers' ratings of the quality and usefulness of Centers' products and services, as well as whether customers report that their organizations developed, implemented, or improved their policies and practices as a result of EAC services.

Evidence:
- Agenda and tools from the Equity Assistance Centers' Directors meeting, February 2008
- Comprehensive Monitoring Procedures Handbook for SSTP Programs (August 10, 2007). Washington, DC: U.S. Department of Education, Office of Elementary and Secondary Education, School Support and Technology Programs (SSTP)
- Handbook for the Discretionary Grant Process (February 24, 2006). Washington, DC: U.S. Department of Education, Office of Management
- Grantee performance reports
- Grantee evaluation reports
- Protocols and agendas for monthly conference calls with grantees
- Miller, P., and Meditz, S.W. (July 2007). U.S. Department of Education Equity Assistance Centers Program Client Survey for School Year 2006-2007. Washington, DC: Library of Congress, Federal Research Division.

YES 10%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Through the Education Department Performance Appraisal System (EDPAS), the EAC program manager is held accountable for the following responsibilities/expected results:
- Effectively managing the discretionary grant program from the initial phases of the competition through the final closeout process
- Providing technical assistance to grantees on policy matters and conducting ongoing monitoring of projects
- Reviewing and developing program-specific documents and evaluation instruments
Evidence of grant management activities includes obligation of awards, administrative actions, analysis of funding and disbursement records, and review of grant performance reports. Grantees are partners and must demonstrate that they have made substantial progress toward meeting the objectives of the project in order to receive continuation funding for the next year (Education Department General Administrative Regulations (EDGAR) § 75.253(a)(2)). To make that determination, the Department compares the information from the annual performance reports to the goals and objectives in each grantee's initial grant application and cooperative agreement.

Evidence: EDPAS includes the following performance standards for the EAC program manager:
- Provide comprehensive technical assistance and concise guidance to EAC grantees on grants management, fiscal accountability, compliance issues, and resolving programmatic problems in a timely manner. Analyze the GAPS grantee drawdown reports on a regular basis to determine that the rate of drawdown corresponds with the implementation timeline of program goals, objectives, and activities.
- Effectively manage the EAC program by efficiently processing administrative requests, program actions, correspondence, and approvals in a timely manner.
- Update and maintain official files routinely to ensure all correspondence, administrative actions, and official records are accurately filed in a timely manner.
- Effectively monitor EAC grantees regularly, in accordance with GPOS, to assess the extent to which the projects are meeting their annual goals and objectives and are in compliance with approved activities.
- Provide quality review of Cooperative Agreements, Work Plans, Strategic Plans, and annual performance and evaluation reports, and give timely oral and written feedback to grantees.
- Provide leadership for the development of effective program evaluation instruments, accurate Strategic Plans, meaningful EAC Goals, measurable GPRA indicators, and program data that satisfactorily address the PART requirements and objectives.
Before awarding continuation funding for the next year, the Department analyzes the information from the annual performance reports to determine whether each grantee has made substantial progress. Based on the annual performance reports and other information gathered throughout the year from the sources described above, the program manager compares the technical assistance provided by the Center and its reported outcomes with the goals, objectives, and activities proposed in the grant application to assess whether the grantee has made substantial progress.
- EDGAR
- Comprehensive Monitoring Procedures Handbook for SSTP Programs (August 10, 2007). Washington, DC: U.S. Department of Education, Office of Elementary and Secondary Education, School Support and Technology Programs (SSTP)
- Handbook for the Discretionary Grant Process (February 24, 2006). Washington, DC: U.S. Department of Education, Office of Management
- Grantee performance reports
- Agendas for monthly conference calls with grantees

YES 10%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: Funds are obligated within the timeframes set out by Department schedules and used for the purposes intended. Funds have been obligated each year on time and within the established grant schedule. Program staff adhere to the established grant schedule timelines and provide oversight on all grant functions, including monitoring grantee activities such as drawdowns, entering administrative actions, and reviewing monthly disbursement reports. Program staff monitor disbursement reports to ensure that grantees are drawing down funds at an acceptable rate. Carryover funds are also carefully monitored. Grantees also have obligated funds in a timely manner, as evidenced by achievement of the long-term target of GPRA efficiency measure 2.1: the percentage of Equity Center grant funds carried over in FY 2007 was less than 10% of allocated funds (0.138%). Grantee budgets are reviewed when new applications are received and annually thereafter. There has been no evidence of erroneous payments.
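The carryover measure described above is a simple ratio of unspent to allocated funds. A minimal sketch of the calculation (the function name and dollar figures are illustrative, not actual EAC grant data):

```python
def carryover_percentage(funds_allocated: float, funds_expended: float) -> float:
    """Percentage of allocated grant funds carried over into the next budget period."""
    if funds_allocated <= 0:
        raise ValueError("funds_allocated must be positive")
    return (funds_allocated - funds_expended) / funds_allocated * 100

# Hypothetical example: a $700,000 allocation with $699,034 expended
pct = carryover_percentage(700_000, 699_034)
print(f"{pct:.3f}%")          # prints "0.138%"
meets_target = pct < 10       # GPRA measure 2.1 target: carryover under 10%
```

A grantee drawing down funds on schedule would show a small percentage, as in the FY 2007 result cited above; a value approaching or exceeding 10% would flag the grantee for closer monitoring.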

Evidence: ED's Grants Administration and Payment System (GAPS) drawdown reports

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The Department has established the two annual efficiency measures, and a baseline and targets for one of them (carryover of grant funds). The Department will establish intermediate targets for the other annual efficiency measure (the number of working days it takes the Department to send a monitoring report after monitoring visits) after completing the pilot site visits in FY 2008 and gathering baseline data in FY 2009; the ultimate target will be 45 days. The efficiency measure that tracks the carryover of grant funds was established because there had been instances of projects carrying over a large percentage of their grant funds from one year to the next. While the percentage of carryover is not by itself an indicator of a project's success, it can indicate how fiscally responsible a grantee is. A grantee that is not fiscally responsible and cannot spend funds according to its plan may also have poor project management practices, which will likely lead to less than ideal outcomes. The efficiency measure that requires the Department to issue a monitoring report within 45 days of a monitoring visit will be implemented in 2008. Improved program oversight is a goal of the Department and cannot be achieved unless the Department provides feedback to grantees in a timely manner. The Department also identified a very cost-effective way to administer the program's customer satisfaction survey: the Library of Congress charges just $18,000 to administer the survey and analyze the results, leaving the overwhelming majority of the program's appropriation for grants. In addition, the Department employs technology to improve efficiency in its grantee oversight. For example, the Department uses its videoconferencing capabilities to conduct post-award conferences, helping grantees avoid costly travel expenses. The Department intends to use videoconferencing to monitor grantees as well.

Evidence: ED's VPS system; Interagency Agreement with the Library of Congress

YES 10%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: Equity Assistance Centers work with a number of programs and Federal entities, as well as in collaboratives with State and local entities. The EACs have worked with the Department's Office for Civil Rights to help schools with desegregation and equity issues. The EACs also have collaborated with the U.S. Department of Justice on desegregation, school violence, and harassment issues. For example, the EACs provide assistance to school districts that are developing and implementing their court-ordered desegregation plans. In addition, the EACs work jointly with the Department's Office of Special Education (OSEP) on committees addressing specific topics, including Response to Intervention (RTI) and disproportionality, i.e., over-representation of minority students in Special Education programs and under-representation of minority students in gifted and talented programs. To build coordination and collaboration, the Centers and the Department's program office also actively participate in an annual Leveraging Resources conference with the Comprehensive Assistance Centers and with the Resource Centers in OSEP's Technical Assistance and Dissemination (TA&D) Network, and collaborate on individual projects within regions. Results from collaboration include sharing of useful information on the work of the Centers and the context within the State, as well as joint planning and technical assistance. For example, to avoid duplication and to maximize resources and results, the Region IV Southeastern EAC, Southeast Comprehensive Assistance Center, and Special Education TA&D Southeast Regional Resource Center developed an integrated plan and jointly provided technical assistance to the Mississippi Superintendent of Education and State Education Agency (SEA) to help the State develop Response to Intervention (RTI) plans and tools in a long-term effort to reduce the dropout rate and raise student achievement. 
The EAC, Comprehensive Assistance Center, and TA&D Resource Center have worked intensively to bring together staff in the areas of curriculum, dropout prevention, assessment, Federal programs, vocational education, special education, and other State offices, and have helped the State develop a comprehensive, multi-tiered approach, including:
- A statewide plan for Response to Intervention (RTI)
- Self-assessment tools with indicators of progress
- A manual, and plans to launch the manual and provide technical assistance to districts beginning summer 2008
- Plans to launch a website with resources for teachers later in the year

Evidence: Annual Leveraging Resources Conferences with the Department's Comprehensive Assistance Centers program and the OSEP TA&D projects (agenda from the February 2008 conference); Materials from a joint conference held by the Department's Office of Civil Rights and the Region VI EAC; Joint RTI plan and status report

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of this program. The program has procedures in place to ensure that payments are made properly for the intended purposes.

Evidence: The program staff monitor expenditures by using the GAPS management system to track awards and payments. Program staff also review financial information in the annual performance report (ED 524B), budget revisions, and budget submissions.

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: Major internal management deficiencies have not been identified for this program.

Evidence: NA

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: Independent peer review panels are used to score and rank all applications. Applications are reviewed and successful applications are selected for all EAC grant awards through a competitive process. The selection criteria for the FY 2005 competition were taken from the Education Department General Administrative Regulations (EDGAR) general selection criteria. The FY 2008 EAC grant competition will use the program-specific selection criteria from 34 CFR Parts 270 and 272. Although 9 of the same 10 organizations received funding in the last two competitions, there was a clear competitive process and a limited number of eligible entities from which to draw. In an attempt to increase the number of applications, program staff engaged in outreach activities in FY 2005, including disseminating grant competition information on higher education electronic listservs and in "ED Review," an online biweekly Department update of activities relevant to the intergovernmental and corporate communities and other stakeholders. While the outreach efforts did not increase the number of applicants for the 2005 competition, the program office intends to make the same efforts to publicize the 2008 competition. The FY 2005 Technical Review Plan describes the guidelines the Department set to create qualified, independent panels of reviewers from diverse backgrounds.

Evidence: FY 2005 Technical Review Plan; 34 CFR Part 270 and 272

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The Department maintains information on grantee activities through monitoring, annual performance and evaluation reports, a monthly calendar of events for all the EACs, conference calls, and annual meetings with EACs. In FY 2008, the program office is developing a site visit protocol, preparing a monitoring report template, and piloting site visits with grantees, largely using videoconferencing to gather the information needed to monitor the program with greater efficiency than traditional on-site visits. The Department conducts conference calls with all EAC Directors, typically once a month, and individual phone calls with each EAC Director as needed; reviews information on the EACs' web sites regularly; provides updates on Department policies, requirements, and other relevant information through the program listserv; and conducts national meetings with all EAC Directors at least twice a year, at which program staff discuss performance and provide technical assistance, and officials from other program offices (for example, Title I, Title III, and Title V) share relevant information on policies and activities. Monthly conference calls and national meetings also serve as opportunities for Center Directors to share information about their work, discuss common issues across regions and work on common solutions, and plan resource sharing and collaborative efforts. A monthly calendar of events for all EACs provides additional information on Centers' activities and facilitates coordination and collaboration across Centers. Before awarding continuation funding for the next year, the Department analyzes the information from the annual performance reports to determine whether each grantee has made substantial progress.
Based on the annual performance reports and other information gathered throughout the year, the program manager compares the technical assistance provided by the Center and its reported outcomes with the goals, objectives, and activities proposed in the grant application to assess whether the grantee has made substantial progress. Grantee performance reports address project goals, objectives, progress, outcomes, and budget expenditures; explain any difficulties or lack of progress; describe the nature of technical assistance provided and clients served; give quantitative information on their services and clients; and include data on the program performance measures for quality, relevance, and usefulness. Reports from the evaluations conducted by third-party evaluators vary according to the evaluation plan in the individual grantee's application. In general, they include data on the performance measures, provide more detailed and more in-depth information about the Centers' technical assistance and clients than the performance reports do, and describe the data collection methods and analysis. The program office reviews the annual performance reports, evaluation reports, and other sources of information mentioned in this section of the PART, assessing progress by comparing the work described with the goals, objectives, and activities proposed in the grant applications, and examining information on the nature of technical assistance services and products provided, description and numbers of clients served, outcomes from the EACs' work, and data on quality, relevance, and usefulness. The Department uses the information to identify and call the individual EACs' attention to related issues and work going on at other EACs and in other Department programs, such as Title I, the Comprehensive Assistance Centers program, and the Special Education Technical Assistance and Dissemination Network program.
The program office also uses monitoring and communication with the EACs to identify grantees' technical assistance needs. The customer satisfaction survey provides information to the Department that informs the program's performance measures and alerts the program office to technical assistance needs.

Evidence: Annual performance reports and evaluation reports; agendas from national meetings with EAC Directors at least twice per year; monthly monitoring phone calls with each EAC Director, conference calls, and other communications as needed; analysis of the customer satisfaction survey results.

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The Department collects performance data annually through a customer satisfaction survey administered and analyzed by the Library of Congress. The findings on performance measures for quality, relevance, and usefulness are presented in the annual GPRA reports, which the Department posts on its public website. The Department also collects and reviews grantee performance data on an annual basis through the annual performance reports and the annual evaluation reports. However, the Department does not make grantee performance reports or grantee evaluation reports available to the public.

Evidence: GPRA performance and accountability reports are posted on the Department's website: http://www.ed.gov/about/reports/annual/2007report/g2cratraining.doc ED's VPS system contains performance measures, targets, and actual data.

NO 0%
Section 3 - Program Management Score 90%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The program has demonstrated some preliminary progress in achieving its long-term performance goals. The program has collected baseline data and established ambitious targets for all the long-term measures (quality, relevance, and usefulness). The program exceeded the targets for the usefulness measure and for one of the two relevance measures (the percentage of customers that develop, implement, or improve their policies or practices, or both, ensuring that students of different race, sex, and national origin have equitable opportunity for high-quality instruction) in the first year after the baseline year. The program collected baseline data for the quality measure and will collect and report performance data in comparison with the target in FY 2008. However, the program did not meet its target for the other relevance measure (percentage of customers that develop, implement, or improve their policies or practices, or both, in eliminating, reducing, or preventing harassment, conflict, and school violence). The relevance questions on the 2007 client survey were revised to align more closely with the GPRA measures. At this time, it is not possible to determine with certainty whether the drop from 66% in 2006 to 50% in 2007 actually represents a decrease in the percentage of customers that developed, implemented, or improved their policies or practices, or both, in eliminating, reducing, or preventing harassment, conflict, and school violence, or whether the change resulted from 2007 revisions in the survey question, which no longer used wording about "safe schools" and "school climate." The Department will continue to monitor performance data closely.

Evidence: ED's VPS system

SMALL EXTENT 8%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program has demonstrated some progress in achieving its annual performance goals. The program has collected baseline data and established ambitious targets for all of the annual measures (quality, relevance, and usefulness), which also are long-term measures. The program has exceeded the targets for the usefulness measure and for one of the two relevance measures (the percentage of customers that develop, implement, or improve their policies or practices, or both, ensuring that students of different race, sex, and national origin have equitable opportunity for high-quality instruction). The program collected baseline data for the quality measure and will collect and report performance data in comparison with the target in FY 2008. The program did not meet its target for the other relevance measure (percentage of customers that develop, implement, or improve their policies or practices, or both, in eliminating, reducing, or preventing harassment, conflict, and school violence). The relevance questions on the 2007 client survey were revised to align more closely with the GPRA measures. At this time, it is not possible to determine with certainty whether the drop from 66% in 2006 to 50% in 2007 actually represents a decrease in the percentage of customers that developed, implemented, or improved their policies or practices, or both, in eliminating, reducing, or preventing harassment, conflict, and school violence, or whether the change resulted from 2007 revisions in the survey question, which no longer used wording about "safe schools" and "school climate." The Department will continue to monitor performance data closely.

Evidence: ED's VPS system

SMALL EXTENT 8%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department has established two efficiency measures for the program, and OMB has approved both. Performance on the first efficiency measure is better than the target (carryover by grantees is lower than 10% of allocations). Ensuring that grantees spend money according to their plans is one indication of fiscal responsibility and efficiency. The second efficiency measure is the number of working days it takes the Department to send a monitoring report to grantees after monitoring visits; FY 2008 is a developmental year for the monitoring visits and subsequent reports, and the Department will collect and report data on this measure in FY 2009.

Evidence: ED's GAPS, ED's VPS system

SMALL EXTENT 8%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: No data are currently available for comparable programs. There may be some value in comparing this program to other technical assistance centers in the Department, such as the Comprehensive Assistance Centers (CACs) or OSEP's TA&D Centers. However, the measures for these programs, while similar, differ in critical ways. All three programs look at the quality, relevance, and usefulness of the products and services provided, but the CACs and the TA&D Centers use an expert panel to make determinations while the EACs program uses a customer satisfaction survey. Also, data for the CACs will not be available until late 2008. It would be very costly to arrange for an expert panel to evaluate the quality, relevance, and usefulness of the EACs' products and services.

Evidence: ED's VPS system, program reports for the EACs, the CACs, and the TA&D Centers

NA 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The program has been proactive in soliciting customer satisfaction information on program performance through an annual survey used for program management and improvement. The Federal Research Division of the Library of Congress conducts the annual, national survey and analyzes the data. The survey provides the data for the four annual/long-term performance measures. The Department took action to improve the data quality of the annual customer satisfaction survey by aligning the survey questions more directly with the GPRA measures and incorporating follow-up procedures, which increased the response rate from 48% in 2006 to 76% in 2007. The survey results have generally found that the program is effective and achieving results. A comprehensive evaluation of the program has not been conducted because the relatively small size of program funding ($7 million) is insufficient to cover such an evaluation without severely reducing funds needed for program services to schools, districts, and other agencies that operate schools. The absence of a program evaluation highlights a broader issue underlying the evaluation of very small competitive grant programs. In this case, customer satisfaction surveys serve as a good proxy for understanding elements of program efficacy. The benefits of a rigorous evaluation should be weighed against cost and the known utility of customer satisfaction survey data, especially in the context of a program with such modest administrative funds. Summary findings from the most recent customer survey (Miller and Meditz, July 2007, pp. 1-3) are extremely positive overall.

Evidence: Miller, P., and Meditz, S.W. (July 2007). Equity Assistance Centers Program Customer Satisfaction Survey for School Year 2006-2007. Washington, DC: Federal Research Division, Library of Congress.

NO 0%
Section 4 - Program Results/Accountability Score 25%


Last updated: 09062008.2008SPR