ExpectMore.gov


Detailed Information on the
National Institute for Literacy Assessment

Program Code 10009085
Program Title National Institute for Literacy
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Competitive Grant Program
Assessment Year 2008
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 60%
Strategic Planning 62%
Program Management 70%
Program Results/Accountability 16%
Program Funding Level
(in millions)
FY2008 $7
FY2009 $6
Note
 

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2008

Revise the interagency agreement to reestablish agency duties and procedures, improve coordination of resources, and ensure effective management oversight of the Institute (NIFL).

No action taken
2008

Determine a schedule for periodic independent evaluations of major program activities to support program improvement and evaluate program effectiveness on an established cycle.

No action taken. The Institute will establish a schedule for the independent evaluation of its major program activities, those activities funded at $300,000 or more. These evaluations may include both expert and practitioner assessment of the activities' value, as appropriate. Program activities funded below $300,000 may be grouped together, as appropriate, and evaluated within the context of the program goal achieved through their implementation.
2008

Ensure program activities (conducted by NIFL and its partners), budget requests, and spending plans are aligned with performance measures.

No action taken
2008

Improve the alignment between grantee performance reporting requirements and NIFL's official program measures.

Action taken, but not completed. NIFL has revised its GPRA measures and will ensure that the reporting forms request the necessary information from grantees.
2008

Make grantee performance information available on NIFL's web site.

No action taken. Once NIFL has revised its reporting forms and collected data for measures one and two (data for measure two are collected six months after data for measure one), it will make the information public.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: The percentage of recipients of information through the National Institute for Literacy (NIFL) technical assistance who report they are likely to implement instructional practices grounded in scientifically based research (or the most rigorous research available).


Explanation: Individuals who participate in trainings sponsored by NIFL will be asked to take a survey following each training.

Year Target Actual
2008 Baseline Dec. 2008
Long-term/Annual Outcome

Measure: The percentage of individuals who receive NIFL technical assistance who can provide examples or other evidence that they implemented instructional practices grounded in scientifically based research within six months of receiving the technical assistance.


Explanation: Participants who complete a survey immediately following the trainings will receive a follow-up survey six months later.

Year Target Actual
2008 Baseline Dec. 2009
Long-term/Annual Outcome

Measure: The number of NIFL products that are determined to be of high quality by an independent peer review panel.


Explanation: NIFL will convene an expert panel to judge the quality of the products NIFL has produced.

Year Target Actual
2008 Baseline Dec. 2009
Long-term/Annual Outcome

Measure: The percentage of Federal program managers and key agency policy staff who consider NIFL effective in coordinating support of reliable and replicable research on literacy and basic skills across federal agencies.


Explanation:

Year Target Actual
2009 Baseline Oct. 2009
Long-term/Annual Efficiency

Measure: The percentage of funding used for product development that supports high-quality products.


Explanation: The amount spent on high-quality products (numerator) divided by the total amount spent on product development (denominator).

Year Target Actual
2009 Baseline Oct. 2009
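The efficiency measure above is a simple share-of-spending ratio. A minimal sketch of the calculation follows; the dollar figures are hypothetical illustrations, not actual NIFL budget data:

```python
def efficiency_pct(high_quality_spend, total_dev_spend):
    """Percentage of product-development funding spent on products
    rated high quality (the numerator/denominator described above)."""
    if total_dev_spend <= 0:
        raise ValueError("total product-development spending must be positive")
    return 100.0 * high_quality_spend / total_dev_spend

# Hypothetical example: $1.8M of a $2.4M product-development budget
# went to products judged high quality by the peer review panel.
print(efficiency_pct(1_800_000, 2_400_000))  # 75.0
```

Targets for this measure would then be expressed as a percentage to be met or exceeded once a baseline year of spending data is in hand.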

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The National Institute for Literacy's (Institute) purpose is, as defined in the Workforce Investment Act (WIA), to "…provide national leadership regarding literacy; coordinate literacy services and policy; and serve as a national resource for adult education and literacy programs by providing the best and most current information available, including the work of the National Institute of Child Health and Human Development (NICHD) in the area of phonemic awareness, systematic phonics, fluency, and reading comprehension, to all recipients of Federal assistance that focuses on reading…and supporting the creation of new ways to offer services of proven effectiveness." The Institute conveys the statutory purposes and intended outcomes for those purposes through its recently defined mission statement: "The National Institute for Literacy is a catalyst for advancing a comprehensive national literacy agenda. The Institute bridges policy, research, and practice to prompt action and deepen public understanding of literacy as a national asset." The additional authority given to the Institute under ESEA to disseminate findings from scientifically based reading research reinforces the Institute's purpose, established through WIA, as a national resource on reading information, and its authorized duty to "collect and disseminate information on methods of advanced literacy that show great promise, including phonemic awareness, systematic phonics, fluency, and reading comprehension…." The Institute serves as a partner, along with other entities, in carrying out identification and dissemination activities that will support the goals of Reading First and Early Reading First. In addition, when funds are available, ESEA authorizes the Institute to lead research efforts to gain information on improving adult literacy skills and parents' abilities to support their children's literacy development.
While the purpose of the Institute is clear under both WIA and ESEA, the WIA mandate is very broad, which permits great flexibility and requires focused planning to avoid too many activities with disparate goals.

Evidence: • Sec. 242 of WIA • Sections 1207, 1224, and 1241 of ESEA (authority for the Reading First, Early Reading First, and Even Start programs) • "Strategic Directions, 2008 and Beyond: Advancing Literacy as a National Asset," March 2008

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The Institute addresses the ongoing need for a coordinated Federal approach to adult literacy and basic skills services. At least nine Federal agencies currently have responsibility for providing services, and each has its own priorities, goals, and approaches. This fragmentation was first highlighted almost 20 years ago in a report called Jump Start, which called national attention to the poor basic skills of the American workforce and urged the Federal Government "…to play a leadership role by energizing other institutions and providing a sense of direction for the field." The Institute also was designed to address other persistent problems noted in the report, such as uneven dissemination efforts and significant gaps in research and knowledge about adult literacy and basic skills. For example, the report declined to criticize the teaching workforce, observing, "…so little effort has been made to translate the results of research and experience into usable form and place it at their disposal." Several of the Institute's WIA-authorized duties permit a direct response to these issues, such as establishing a database of information on literacy to be disseminated to the adult literacy field and carrying out basic and applied research and development "on topics that are not being investigated by other organizations or agencies…." Confirming the report's observation, the Institute's 2001 review of the literature on adult reading was the first step in the development and dissemination of practitioner information on reading and found only 20 studies that met the criteria for scientifically based research. A similar need for usable information on reading instruction existed in the K-12 system when the 1998 Reading Excellence Act (REA) tasked the Institute with conducting a reading research dissemination campaign. At that time, classroom teachers and administrators had minimal access to professional, research-based resources.
With the passage of REA, and subsequently, No Child Left Behind (NCLB), Congress sought to strengthen the role of research in educational practice through the Institute's dissemination campaign and other efforts.

Evidence: "Jump Start: The Federal Role in Adult Literacy," Final Report of The Project on Adult Literacy, Forrest P. Chisman, January 1989. Reading Excellence Act, http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=105_cong_public_laws&docid=f:publ277.105.pdf

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The Institute's authority under both WIA and ESEA permits some services that other federal programs and private organizations also have the authority or goal to provide. The Institute was established in 1991 and subsequently re-authorized in WIA to play a role not filled by other Federal agencies and non-profits. Unlike agencies responsible for providing literacy and basic skills instruction, the Institute was designed to build connections across provider systems and perform policy, research, dissemination, and program improvement functions that stand to benefit all organizations responsible for managing national services. WIA authorizes funding for the Office of Vocational and Adult Education's (OVAE) National Leadership Activities program to develop, improve, and identify the most successful methods and techniques for addressing the education needs of adults or to carry out research on adults with low levels of literacy. WIA also authorizes the Institute to carry out basic and applied research and development on topics that are not being investigated by other entities and to collect and disseminate information on promising methods for advancing literacy. Historically, coordination of activities has not always been smooth. The overlapping language has caused friction between the Institute and the agencies it is charged to work with when the Institute has attempted to exercise leadership and policy development in the areas of literacy, health literacy, workplace literacy, and English as a second language. While the Institute is understood to serve as more of a coordinating body, providing leadership at a very high level, and providing high quality information for the field in one place, the places of overlap in the statute could be revisited to make the purposes of these two programs more clearly distinct from one another.
The spending plan approval process that the Department oversees is one way to ensure that overlapping activities do not take place in implementation. Recently, the Interagency Group (IAG) has been more involved in the process of coordinating spending plans between OVAE and the Institute in an effort to leverage resources where possible. Additionally, the Institute's mandate under ESEA is defined to encompass only the dissemination of findings from scientifically based reading research. As a result, there is some duplication or redundancy between the Institute's early childhood and childhood dissemination activities and those of other Federal departments or agencies, foundations, and non-profits. The Institute of Education Sciences (IES) also has a dissemination mandate, in addition to its research mandate, even though IES has not made adult education a priority. Organizations such as the David and Lucile Packard Foundation, the William and Flora Hewlett Foundation, the Carnegie Corporation, and the National Association for the Education of Young Children (NAEYC) also make resources and information available; however, they are not designed to serve federal grantees or specifically to support federal education priorities. Their materials typically must also be purchased. Non-profits such as ProLiteracy Worldwide and the National Coalition for Literacy serve as advocates for adult learners and the groups that support them, a role that the Institute is legally prohibited from assuming. Foundations, such as Annie E. Casey, Lumina Foundation for Education, and Joyce, may also do resource development, but perhaps not to a large enough extent to erase the need for the Institute's work in this area. Also, very few foundations support adult literacy activities, as noted in a report by the Council for the Advancement of Adult Literacy.

Evidence: • Title II of WIA, Section 243, National Leadership Activities • Proliteracy Worldwide www.proliteracy.org/downloads/PLW%20fact%20sheet.pdf • National Coalition for Literacy www.national-coalition-literacy.org/about.html • Annie E. Casey Foundation www.aecf.org/OurWork.aspx • Lumina Foundation for Education www.luminafoundation.org/about_us/index.html • Joyce Foundation www.joycefdn.org/Programs/Education/ • David and Lucile Packard Foundation www.packard.org/categoryList.aspx?RootCatID=3&CategoryID=63 • William and Flora Hewlett Foundation www.hewlett.org/Programs/Education/ • Carnegie Corporation www.carnegie.org/sub/program/national_program.html • National Association for the Education of Young Children www.naeyc.org/about/mission.asp • http://www.caalusa.org/publications.html#corp

NO 0%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The Institute's governance design has been problematic. While the Institute's decision-making structure has at times been implemented effectively, its competing governing bodies create unclear roles, and the lack of buy-in by all three cabinet agencies has hindered coordination efforts. By design, a Presidentially appointed Advisory Board provides guidance, feedback, and leadership from experts in the areas of literacy and literacy research and makes recommendations to the IAG on the Institute's goals. The statute requires that the IAG administer the Institute under the terms of an Interagency Agreement (IAA) entered into by the Secretary of Education with the Secretary of Labor and Secretary of Health and Human Services. These Secretaries, under law, should help plan the goals of the Institute and have a role in the implementation of programs to achieve those goals. This structure is intended to lay the foundation for cross-agency coordination and collaboration in program planning and implementation. However, historically, the Advisory Board has often functioned more in a governing role than an advisory role, even though the statute explicitly defines its role as advisory. The IAG has executed its responsibilities inconsistently. When the IAG does not function, there is no system of checks and balances and no regular management oversight of the Institute. The roles of the IAG could be clarified and further delineated in the statute or the IAA so that effective oversight is not solely dependent upon the will of the individuals serving on the IAG. The authorized activities are sufficiently broad to enable the Institute to utilize a variety of means to carry out its purposes. Perhaps they are too broad, as the Institute's activities have sometimes been characterized as fragmented, with many activities occurring at once with a relatively small amount of funds.
However, the Institute has recently undertaken a strategic planning process with its Board to create long-term, focused goals for the work of the Institute. Also, the Institute has sought to increase its effectiveness by leveraging its resources through partnerships with other Federal organizations and non-governmental organizations, especially in support of research on adult reading and information dissemination.

Evidence: • Title II WIA, Section 242 • Title I, ESEA, Public Law 107-110, Part B, Subpart 1 - Reading First, Sections 1202 (b) and 1207, Subpart 2 - Early Reading First, Section 1224, and Subpart 3 - William F. Goodling Even Start Family Literacy Programs, Sections 1232 (b) and 1241(b) • Conversations with IAG members

NO 0%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The Institute's program design is targeted to serve the principal decision-makers, service providers, and literacy service recipients. These audiences, named in WIA, include adult literacy and basic skills service providers; national organizations interested in literacy; the U.S. Congress; and Federal, state, and local government agencies responsible for adult literacy policies and services. Additional target audiences include participants in specific Federal programs such as Title I, Title VII, Head Start and the Individuals with Disabilities Education Act (IDEA) as identified in the Institute's other statutory authorities. Given the variety of target audiences, the Institute has tried to prioritize audiences for specific products. For instance, the Resource Collections are primarily targeted to practitioners whereas the target audience for papers on emerging issues will be policymakers and program administrators. Under ESEA, teachers and parents have been targeted as the primary audience to most effectively support the implementation of Reading First. In keeping with its authorization, the Institute has developed a comprehensive portfolio of projects that, taken together, address all segments of the legislatively defined target audience. The projects include a comprehensive dissemination network called the Literacy Information and Communication System (LINCS) that includes regional centers, online resource collections, and discussion lists; several activities that address adult reading, including an online reading diagnostic tool, resources for practitioners, and research reviews; technical assistance on teaching adults with learning disabilities (Bridges to Practice); descriptive research on State and local literacy policies; and a comprehensive program to disseminate findings from scientifically based reading research.
A programmatic framework the Institute developed with guidance from its Advisory Board establishes priorities within the Institute's authorized purposes and activities. The Institute is granted statutory authority to award grants and enter into contracts with various entities, providing the opportunity for some target audience members to compete for Institute grants and contracts. At the same time, the Institute is able to establish roles for specified target audiences in the projects it conducts through grants and contracts. For example, in its most recent competition for LINCS regional centers, the Institute defined the centers' purpose as disseminating highest-quality resources "through partnerships with adult education and related organizations" to help practitioners use evidence-based practices.

Evidence: • Bridges to Practice Training and Certification Program Review, Draft Report, TATC Consulting, January 22, 2007 • Comprehensive Review and Analysis of the Literacy Information and Communication System (LINCS), RMC Research Corporation, September 25, 2005 www.nifl.gov/nifl/executive_summary.doc • The Partnership for Reading: Final Report - Dissemination Activities 2001-2005, Abt Associates, March 2006 • National Institute for Literacy, Overview Information; Literacy Information and Communication (LINCS) Regional Resource Centers; Notice Inviting Applications for New Awards for Fiscal Year (FY) 2006; Notice, Federal Register, Volume 71, Number 147, Tuesday, August 1, 2006, Pages 43628 - 43631 http://frwebgate.access.gpo.gov/cgi-bin/getpage.cgi?position=all&page=43628&dbname=2006_register • National Institute for Literacy, Overview Information; Literacy Information and Communication (LINCS) Resource Collections; Notice Inviting Applications for New Awards for Fiscal Year (FY) 2006; Notice, Federal Register, Volume 71, Number 151, Monday, August 7, 2006, Pages 44716 - 44720 http://frwebgate.access.gpo.gov/cgi-bin/getpage.cgi?dbname=2006_register&position=all&page=44716

YES 20%
Section 1 - Program Purpose & Design Score 60%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Institute currently has three long-term performance measures that were established and refined between FY2004 and FY2006. These measures address the Institute's role as a resource: (a) the percentage of recipients receiving information through National Institute for Literacy technical assistance who report that they are likely to implement instructional practices grounded in scientifically based research (or the most rigorous research available), (b) The percentage of individuals receiving National Institute for Literacy technical assistance who can provide examples or other evidence that they implemented instructional practices grounded in scientifically based research within six months of receiving the technical assistance, and (c) the number of NIFL products that are determined to be of high quality by an independent peer review panel. The first two measures address the usefulness of the training that participants receive through the LINCS. The third measure will address a broader sample of products developed by the Institute. The Institute has proposed an additional measure, intended to reflect its role as a leader and coordinator of literacy research, to be adopted in FY2008. The measure is the percentage of Federal program managers and key agency policy staff who consider NIFL effective in coordinating support of reliable and replicable research on literacy and basic skills across federal agencies.

Evidence: The Department of Education's Visual Performance Suite (VPS) system

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The Institute does not have ambitious annual targets for its performance measures and its proposed efficiency measure. Targets for the performance measures and the efficiency measure will be set once baseline data have been collected (FY 08).

Evidence: The Department of Education's VPS system

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The Institute has three performance measures, an additional proposed performance measure, and one proposed efficiency measure connected to achieving the program's long-term goals of providing national leadership regarding literacy, coordinating services, and serving as a national resource. The addition of the performance measure in FY06 that requires follow-up with recipients of NIFL-sponsored trainings strengthened a focus on outcomes by requiring follow-up to ensure that participants reported implementation of NIFL technical assistance rather than simply expressing an intent to do so. The proposed efficiency measure gauges the efficiency of product development at the Institute by dividing the amount spent on high-quality products (measure 3) by the amount spent on all products reviewed by the independent peer review panel. The Institute's annual measures are the same as its long-term measures and will serve to monitor progress toward the long-term measures regularly. The Institute revised its measures for serving as a national resource several times before FY06 in order to strengthen the focus on outcomes.

Evidence: The Department of Education's VPS system

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The Institute does not yet have established baselines and annual targets for its performance measures and its efficiency measure. The Institute's first measure, the percentage of recipients who report using Institute information to improve their instructional practice, was established in FY04. Following the first year of data collection, the LINCS and Bridges trainings were suspended to make programmatic changes. The Institute anticipates collecting data for this measure, its two additional existing performance measures, its proposed performance measure, and its proposed efficiency measure in 2008. Baselines and targets will be set at that time for all five measures.

Evidence: The Department of Education's VPS system

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The Institute's partners' activities contribute to the attainment of both the annual and long-term performance goals. The contract and grant documents themselves include a description of the Institute's overall goals and purposes as well as the relationship between the specific contract activities and those goals and purposes. In addition, some contracts and grants require the contractor or grantee to establish and/or collect data for the Institute's official performance measures. For example, the LINCS Resource Center grantees, as specified in their grants, are required to manage Institute dissemination activities, including gathering information from participants at the close of training sessions and six months later. The LINCS grantees also assisted in creating the survey instruments used in collecting the data. Further, the Institute awarded contracts to carry out NCLB-authorized dissemination activities, which also include requirements to develop measures and collect data for those measures. Federal partners such as the Institute of Education Sciences (IES), OVAE, NICHD, and the Head Start Bureau in the Department of Health and Human Services (HHS) also have partnered with the Institute on activities that contribute to the Institute's accomplishment of its goals. The Institute and IES have signed two separate Memoranda of Understanding concerning the collection and dissemination of information on Early Reading First as well as on the use of rigorous research methods and evidence-based practices in adult education settings. In addition to partnering with OVAE and NICHD to fund adult reading research studies, the Institute has worked with them to identify themes and priorities for future research in adult literacy. 
The Institute and HHS worked together for five years to fund the National Early Literacy Panel, a group of nationally recognized researchers who conducted a meta-analysis of the literature on young children's acquisition of skills that lead to later success in reading. The effectiveness of partnership activities will be examined through future evaluations. Partners in partnerships formed to collect and disseminate information have not reported data that relate to the program's performance measures. In some instances, such as with the Department of Labor, it is not clear to what extent partner activities have been linked to the goals of NIFL.

Evidence: • Overview Information; Literacy Information and Communication (LINCS) Regional Resource Centers; Notice Inviting Applications for New Awards for Fiscal Year (FY) 2006; Notice, Federal Register, Volume 71, Number 147, Tuesday, August 1, 2006, Pages 43628 - 43631 http://frwebgate.access.gpo.gov/cgi-bin/getpage.cgi?position=all&page=43628&dbname=2006_register • Statement of Work - Technical Assistance on Dissemination of K-3 Literacy Resources - RFP# ED-06-R-0039 (Southwest Educational Development Corporation - Contract # ED-04-CO-0039) • Statement of Work - Dissemination of Pre-K Literacy Resources, EDC • MOU between the Institute and IES, Independent Evaluation of the Early Reading First program by Decision Information Resources, Inc., Contract #ED-01-CO-0027/0002, September 29, 2004 through September 30, 2006, signed April 29, 2005 • MOU between the Institute and IES, National Center for the Study of Adult Learning, Improve the Quality of Adult Literacy Research and Strengthen the Use of Evidence-Based Practices • "Adult Literacy Research Themes," Revised Draft Working Document, January 28, 2008 • MOU with HHS for the National Early Literacy Panel, 2000

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The Institute has conducted reviews of two projects and is in the process of reviewing a third project. Two of the projects constitute approximately 58% of the program's WIA funds and the third project constitutes approximately 88% of the program's ESEA funds. All three reviews were conducted by independent firms that won the awards through the Department's Multiple Award Task Order (MATO) process. One contractor is completing a review of Bridges to Practice (Bridges), a second organization reviewed the Partnership for Reading (ESEA-authorized dissemination activities) as part of a larger contract, and the third contractor conducted and evaluated various dissemination activities. The Institute later modified that third contract to add a task to review LINCS. The review of LINCS was a response to recommendations from the Institute's Advisory Board. The Institute's Director initiated the reviews of the Partnership for Reading and Bridges to Practice. In general, the Institute relies on the contractor to select experts to design and conduct the review, without potentially biased input from the Institute. For the Partnership evaluation, a technical review panel reviewed the initial proposal to ensure that the evaluators were qualified. For the LINCS evaluation, the Institute reviewed the qualifications of the proposed experts. And for the Bridges review, the Institute has taken a more active role in the design of this evaluation. The findings from two of the three reviews are leading to thorough revisions to the LINCS and Bridges projects. The review of Bridges was conducted to determine if the design, purpose, objectives and content of the training is appropriate; to assess best practices for training adults with learning disabilities; to verify that the content is current; and to evaluate the effectiveness of the program in meeting its goals. 
The review included a ten-step process to examine the training sessions for rigor, instructional quality, and fidelity to the Bridges training guidebooks; the certification process; recruitment and marketing; and cost-effectiveness. Data collection came from multiple sources using a variety of techniques. Preliminary findings have led to a restructuring of the Bridges project. A comprehensive program review of LINCS was completed in September 2005. The review assessed quality, effectiveness, value, and fidelity to the program goals and Institute goals. Separate studies were conducted on each of the major LINCS components - the website, the discussion lists, the regional technology centers, the special collections, and the Assessment Strategies and Reading Profiles tool. The reviewers used interviews, surveys, content analysis, usability tests, and expert reviews as appropriate to assess each of the components. The review used a rigorous, multi-step design that emphasized data collection from multiple sources: data were first analyzed from each tool, then across similar tools, and then by research question and topic across all data sources. The evaluators were able to identify variability, range, and patterns in the responses using descriptive statistics, qualitative categorization, and quantitative assessment. Finally, the Partnership for Reading project to disseminate findings from scientifically based reading research and related NCLB-authorized tasks was also reviewed. The winning contractor was a consortium of institutions, each with a specific expertise required by the tasks of the contract. However, this evaluation collected survey data from participant lists that were 1-3 years out of date. This led to a low response rate (14%) and unreliable data.
Due to these problems, this evaluation is not considered to be of sufficient scope and quality.

Evidence: • Bridges to Practice Training and Certification Program Review, Draft Report, TATC Consulting, January 22, 2007 • Comprehensive Review and Analysis of the Literacy Information and Communication System (LINCS), RMC Research Corporation, September 25, 2005 www.nifl.gov/nifl/executive_summary.doc • The Partnership for Reading: Final Report - Dissemination Activities 2001-2005, Abt Associates, March 2006

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The Institute's annual budget requests are not tied to the accomplishment of specific performance goals. The budgets do present the Institute's resource needs in a complete manner. However, adjustments are frequently made during a fiscal year, so the expenditure of funds does not always match the original spending plan.

Evidence: • annual spending plans • annual budget requests • annual Congressional Justifications

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Institute has acted to implement recommendations from the LINCS and Bridges reviews, which identified deficiencies in strategic planning as well as in the execution of those projects. The changes are intended to focus the projects more effectively and enable data collection to begin again this year. Baselines will be established using the results of this year's performance, and ambitious long-term goals will then be adopted. Other deficiencies in the Institute's strategic planning have been addressed through the development and implementation of a program planning effort. This approach has entailed consulting expert groups on adult literacy, English language learners (ELL), workforce basic skills, and youth with learning disabilities. Each group has advised the Institute on relevant topics for commissioned papers. For example, the commissioned papers that are an element of the ELL planning strand address the use of technology for ELLs, language requirements for ELLs in the healthcare sector, and the literacy needs of ELLs with few or no literacy skills in their native languages. Each planning strand also includes environmental scans and gap analyses to identify areas of need within the literacy field and efforts undertaken by other organizations. For example, the scan undertaken for the adult literacy strand identified activities and gaps in activities in research, professional development, curriculum development and materials, technology and distance learning, direct services, program accountability, transitions to postsecondary education and employment, workforce development, and policy. Options for program activities are then identified using the information described above. By following this process, the Institute is able to make decisions informed by current data and expert opinion. The program planning process is also situated within a larger context defined in a new strategic plan.
The Institute is close to completing a strategic planning process with its Advisory Board that will result in a document that establishes the Institute's strategic directions for the next two to four years, including goals, objectives and key activities. The Board's purpose in developing the document was to meet its WIA-authorized responsibility to provide independent advice to the Institute. The Board is now transmitting the report to the IAG for review and feedback. Additional program measures are being established based on the revised goals and objectives to assist the Institute in continuing to improve its program management and performance. Finally, the Institute has taken steps to increase the quantity and breadth of data available to inform its strategic planning process. The Institute awarded a contract in FY06 to develop additional performance measures for activities that currently cannot be measured using the Institute's official GPRA program measures. These activities promote and support rigorous research; translate research into practice; and identify high-performing programs, practices, and policies. An additional measure will address the Institute's leadership function. By collecting and examining the data from these new measures as well as the data from the official program measures, the Institute is able to engage in strategic planning using performance data from the entirety of its program activities.

Evidence: • Statement of Work - Technical Assistance for Developing and Implementing a Strategic Plan, RFP# ED-07-R-0106 • Statement of Work - Technical Assistance for Planning/Facilitating Strategic Planning Session, RFP# ED-07-R-0015 • Statement of Work - Development of Performance Measures, RFP# ED-07-R-0094 • Statement of Work - Technical Assistance for Program Planning (MPR Associates Inc. - Contract #ED-04-CO-0121/002) • Bridges to Practice Training and Certification Program Review, Draft Report, TATC Consulting, January 22, 2007 • Comprehensive Review and Analysis of the Literacy Information and Communication System (LINCS), RMC Research Corporation, September 25, 2005 www.nifl.gov/nifl/executive_summary.doc • The Partnership for Reading: Final Report - Dissemination Activities 2001-2005, Abt Associates, March 2006 • "Strategic Directions, 2008 and Beyond: Advancing Literacy as a National Asset," March 2008.

YES 12%
Section 2 - Strategic Planning Score 62%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Although the Institute regularly collects performance information from grantees and contractors, and has used evaluation findings to improve its programs, it does not have baselines for all its PART measures. The grantees' quarterly reports describe activities, accomplishments and plans for the next quarter. These narrative reports allow the Institute to ensure that grantees are prioritizing activities in accordance with the terms of their grants and completing them according to schedule. Performance data that support the Institute's official PART measures are entered by grantees directly into a database established for that purpose rather than submitted in the quarterly reports. As for contractors, the Institute established requirements in two early literacy contracts to develop and implement measures and data collection plans for the contracts' dissemination activities. The data will help the Institute understand who received its publications, how the publications were used, and whether the recipients found them well-done and useful. Such information will help the Institute refine future dissemination activities. The Institute has used evaluation findings from the LINCS and Bridges programs to improve these programs. As a result of recommendations from the review of LINCS, the Institute made the following four major changes: (1) changed LINCS' organizational responsibilities to focus more on dissemination; (2) ended some local level activities and focused grantees on identifying practitioners' information needs; (3) developed new selection criteria for materials to improve quality and focus on instructional materials; and (4) began a comprehensive overhaul of the LINCS website. 
The Institute used the preliminary findings from the Bridges evaluation to make the following four changes: (1) revise recruitment materials, (2) reduce the length of training but devote more time to participants' acquiring training skills in addition to learning content, (3) develop materials for new trainers using the same content covered in their own training, and (4) change the goal of training from certification to earning a certificate. To generate performance information from a larger number of its activities, especially unique activities with limited performance periods, the Institute awarded a contract in September 2007 for assistance in developing internal program measures to supplement its official PART measures. Rather than treat numerous small projects individually, the Institute will measure their contribution to broader goals.

Evidence: • Sample Quarterly Grant Reports - Region 2 Regional Resource Center, Center for Literacy Studies, University of Tennessee, Grant Number X257T060003, 3rd and 4th quarters of 2007 • Sample Quarterly Grant Report - Basic Skills Resource Collection, Ohio Literacy Resource Center and Pennsylvania State University, October 1 - December 31, 2006 • Bridges to Practice Training and Certification Program Review, Draft Report, TATC Consulting, January 22, 2007 • Comprehensive Review and Analysis of the Literacy Information and Communication System (LINCS), RMC Research Corporation, September 25, 2005 www.nifl.gov/nifl/executive_summary.doc • Statement of Work - Technical Assistance on Dissemination of K-3 Literacy Resources - RFP# ED-06-R-0039 (Southwest Educational Development Corporation - Contract # ED-04-CO-0039) • Statement of Work - Technical Assistance on Dissemination of Early Childhood Literacy Resources - RFP# ED-06-R-0036 (Education Development Center, Inc. - Contract #ED-04-CO-0069/0004) • Sample Discussion Lists Quarterly Report for October - December 2007, Assessment Discussion List, University of Tennessee • Statement of Work - Development of Performance Measures, RFP# ED-07-R-0094 (Urban Institute - Contract #ED-07-PO-1416)

YES 10%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Although Federal program managers and some program partners are held accountable, the Institute Director has never had a performance agreement and the Institute has no means of holding federal partners accountable. The IAG has discussed developing a performance agreement for the Director, but currently none exists. The Institute has adopted the Department's EDPAS format for assessing the performance of managers and employees. The Director has created agreements for the Institute's managers to hold them accountable for cost, schedule, and performance. For example, the Deputy Director is accountable for managing projects to ensure work is completed within the established performance period, taking appropriate steps to ensure the timely reporting of accurate and relevant performance data, and working with project staff to ensure grantees, contractors, and partners understand the Institute's goals and performance measures and collect the required data. The Institute's program officers are held accountable, through their performance agreements, for managing projects in accordance with grant or contract terms and approved project plans. The Executive Officer is held accountable by the Board and the IAG for maintaining documentation of the Institute's performance on official and internal performance measures and for providing guidance to staff on schedule and other requirements pertaining to the collection and entry of performance data into the official database. In cases where the Institute partners with another agency or organization, the Institute drafts a Memorandum of Understanding (MOU) that delineates each agency's roles and responsibilities. However, the Institute has no means of holding other agencies accountable for cost, schedule, or performance results if the terms of the MOU are not met. The Institute holds its grantees and contractors accountable for completing work in accordance with the terms and conditions of their contracts.
Grants are awarded through a competitive process established by the Education Department General Administrative Regulations (EDGAR). All contracts awarded for Institute projects are fixed-price, performance-based contracts. When appropriate, the Institute's contracts also include incentives for contractors to submit high-quality products in accordance with the deliverables schedule. However, less than a quarter of contracts awarded in FY07 were completed within the original performance period.

Evidence: • Sample QASP - K-3 Dissemination - RFP# ED-06-R-0039 (Southwest Educational Development Corporation - Contract #ED-04-CO-0039) • MOU between the Institute and IES, Independent Evaluation of the Early Reading First program by Decision Information Resources, Inc., Contract #ED-01-CO-0027/0002, September 29, 2004 through September 30, 2006, signed April 29, 2005 • MOU between the Institute and IES, National Center for the Study of Adult Learning, Improve the Quality of Adult Literacy Research and Strengthen the Use of Evidence-Based Practices • MOU with NICHD and USED (OSERS, OVAE and OESE) - Scientifically Based Reading Research Related to Adult and Family Literacy (June 20, 2002 to September 30, 2004) • MOU between the Institute and Mocha Moms, Inc. (dated January 4, 2008) • IAA with HHS, Head Start Bureau for The Family Literacy Reading Project - #ED-01-NP-0864 - #IAD-01-1701 (September 12, 2001 to Completion) • Performance Agreement for Deputy Director

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: Although the Institute develops a Spending Plan for most of its appropriated funds, it has had to make frequent changes to the Plan, has some unobligated balances at fiscal year end, and lapsed funds from FY99 to FY02. However, the Institute faces a difficult task in ensuring the timely expenditure of funds because it must get approval for its Spending Plan from its Board, three different cabinet agencies, and OMB. Delays in this approval process have contributed to delays in executing the procurements required to carry out program activities. The schedule of procurements is developed by the Contract Specialist in coordination with the program officers and specifies the timeline for the various procurements to ensure that grant and contract competitions are completed before the end of the fiscal year. Project officers at the Institute monitor the expenditure of funds for intended purposes by using the schedule of deliverables included in the contract documents. They provide their findings to the Contracts Specialist. Project officers review invoices to ensure that costs are appropriate to the tasks and deliverables in the contracts. At the end of every fiscal year, the Institute submits to the Department and OMB a table showing any differences between actual expenditures and what was anticipated in the spending plan. In FY06, the Institute reprogrammed 27% of its Reading First authorized funds and 49% of its WIA authorized funds. The IAG has now instituted a process by which the Institute must request approval for any expenditure that varies more than 10% from the estimate in the approved spending plan. From FY99 to FY02, between one and five percent of the Institute's WIA appropriation was canceled. The Institute leaves unobligated funds at the end of the fiscal year as a precaution to pay any unexpected costs that arise after the end of the fiscal year.
Over the past three fiscal years, between $110,000 and $160,000 has been left unobligated at the end of the fiscal year.

Evidence: • FY06 Budget Differences from Spending Plan • Procurement Schedule FY07 • Budget Authority Canceled Funds report, EOY 2004 through EOY 2007 • Status of Funds reports from FY04, FY05, and FY06

NO 0%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: Nearly 100 percent of grants and contracts awarded by the Institute are awarded through competitive processes. In FY06, the Institute awarded one sole-source contract worth 0.3 percent of the program budget. Contracts are always fixed price rather than time-and-materials or other contract types that can more easily lead to overcharging. The Institute uses Quality Assurance Surveillance Plans (QASPs) for contracts greater than $100,000 when the competition is conducted through the Department's MATO schedule. A QASP establishes criteria for the quality and timeliness of selected deliverables and defines an award pool, which gives contractors an incentive to submit high-quality products on schedule. The Institute has one efficiency measure, adopted in FY07 along with a baseline and targets: the percentage of contracts that are completed during the original contract performance period. The goal is to improve contract administration, complete work on schedule and within budget, and reduce the number of contract extensions required to receive all deliverables.

Evidence: • Performance and Efficiency measures, the Department's VPS system • Sample QASP - K-3 Dissemination - RFP# ED-06-R-0039 (Southwest Educational Development Corporation - Contract #ED-04-CO-0039)

YES 10%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: One of the Institute's main authorized tasks is to collaborate and coordinate effectively across federal agencies. The Institute works to meet this mandate in three ways: (1) sharing information about organizational priorities or activities; (2) partnering with other federal programs to work toward a shared goal or interest; and (3) sharing responsibility for the costs associated with joint activities. The Institute has achieved the largest number of cooperative activities by using Federal staff working groups committed to producing specific products. An example of (1), sharing information, stems from the Institute's Partnership for Reading project, initiated to encourage the multiple Federal offices with responsibility for the production and dissemination of scientifically based reading research to speak with one voice about such research and its implications for instructional practice. A cross-agency group identifies products that are needed to disseminate research findings and practical advice on how to use those findings in instruction. An example of (2), working toward a shared goal, is the Interagency Working Group on Adolescent Literacy, led by the Institute and composed of representatives from OVAE, IES, and NICHD, in which agencies contributed staff time and expertise to produce research-based reports and products that fill a gap in the knowledge or resources supporting practice. The Institute oversaw the development of a guide for content-area teachers on how to strengthen adolescents' reading skills. While the Institute bears the major responsibility for disseminating this report, the other three participating offices also frequently feature and disseminate it in their presentations at meetings and conferences. Another example of working toward a shared goal is the Institute's involvement in the President's Executive Order to strengthen adult literacy.
The Institute completed its multi-agency effort to identify adult literacy research needs in March 2008 and has provided a copy of its report to the Interagency Working Group established under the auspices of the President's Executive Order to strengthen adult literacy; it is not yet clear how the report has been used. The Institute is now turning its attention to early literacy, working with the Good Start, Grow Smart (GSGS) Interagency Committee to develop a research agenda on the acquisition of reading skills by very young children living in homes in which English is not the primary language. Finally, an example of (3), cost-sharing, is the Institute's mandate under ESEA Title I, Part C, Subpart 3 to conduct scientifically based reading research through an entity with expertise in longitudinal studies of the development of literacy skills in children and experience in developing effective interventions for childhood reading difficulties. The Institute, working cooperatively with six offices from the National Institutes of Health, including NICHD, and the U.S. Department of Education, developed a joint research agenda to serve as the basis for a new national research program in adult and family literacy. The National Institute for Literacy contributed $2 million annually for five years; its funds also leveraged additional funds from NICHD and OVAE to create a pool of $18.5 million over five years. Although the Institute has made progress in coordinating and collaborating across related programs, improvements are still needed to ensure that overlapping activities do not take place in implementation. Recently, the Interagency Group (IAG) has been more involved in coordinating spending plans between OVAE and the Institute in an effort to leverage resources where possible.

Evidence: • IAA with NICHD and OVAE on Adolescent Literacy • MOU between the Institute and IES, Independent Evaluation of the Early Reading First program by Decision Information Resources, Inc., Contract #ED-01-CO-0027/0002, September 29, 2004 through September 30, 2006, signed April 29, 2005 • MOU between the Institute and IES, National Center for the Study of Adult Learning, Improve the Quality of Adult Literacy Research and Strengthen the Use of Evidence-Based Practices • MOU with NICHD and USED (OSERS, OVAE and OESE) - Scientifically Based Reading Research Related to Adult and Family Literacy (June 20, 2002 to September 30, 2004) • IAA with HHS, Head Start Bureau for The Family Literacy Reading Project - #ED-01-NP-0864 - #IAD-01-1701 (September 12, 2001 to Completion) • "Research-Based Principles for Adult Basic Education Reading Instruction" www.nifl.gov/partnershipforreading/publications/adult_ed_02.pdf • Assessment Strategies and Reading Profiles www.nifl.gov/readingprofiles/ • "What Content-Area Teachers Should Know About Adolescent Literacy" www.nifl.gov/nifl/publications/adolescent_literacy07.pdf • "Adult Literacy Research Themes" - Revised Draft Working Document, January 28, 2008 • "Literacy Begins at Home" (www.nifl.gov/nifl/publications/Literacy_Home.pdf) • "What Content-Area Teachers Should Know About Adolescent Literacy" National Institute for Literacy, National Institute of Child Health and Human Development (HHS office), and U.S. Department of Education

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: The Institute uses the Department of Education's financial management systems and complies with Departmental financial procedures. The Departmental financial systems provide a system of checks and balances to ensure that funds are available for obligation, that fiscal coding is valid, and that there is a separation of responsibilities between requesters and approvers in order to provide adequate oversight of all obligations and payments. The Institute is also assessed on the same financial management performance indicators as offices within the Department. This information is reported to OMB through the CFO Council's Metric Tracking System Website. The Institute also participates in the regular Departmental reviews and reconciliations of open obligations. Although it has not been individually audited, the Institute has been included in the sampling for Department-wide audits. The Institute has not been made aware of any material weaknesses in this area as a result of these audits. In 2002, the Institute hired an independent contractor to audit the Institute's contract files to ensure records and close-out activities complied with federal laws and regulations. The contractor found xxx, xxx, and xxx. As a result of these findings, the Institute made changes in three areas: (1) implemented procedures for improving contract administration and closeout, including training staff on proper contract documentation; (2) established a system to document the receipt and acceptance of deliverables; and (3) increased management oversight of contract modifications. The Institute also closed out all contracts requiring such action, including deobligating contract funds that remained on the contract after the final invoice was processed. The Institute uses a separation of responsibilities for payment of invoices, with Program Officers reviewing and certifying contract invoices prior to processing by the financial staff and final approval by a senior officer.
To improve processing timeframes, the Institute put in place a system to alert the Program Officers to the dates by which invoices must be reviewed and approved for payment in order to be processed in a timely manner.

Evidence: • Sample Selected Measures, POC Summary Report, FY07, National Institute for Literacy • Sample Selected Measures from the OMB MTS Public Treasury Scorecard as of December 31, 2007 • Statement of Work - Acquisition Support Services (Kalman and Company - Contract #ED-02-CO-0055) • Need the independent review

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The Institute has used private and federal experts to help identify management deficiencies and has acted upon their recommendations to improve internal and fiscal activities, especially contracting (the Institute's main vehicle for completing work). In 2004, the Institute hired an independent contractor, in consultation with the Department's Office of Management, to conduct an internal management review. The review resulted in 16 recommendations, 8 of which have been implemented; these focus on improving travel processes, developing a recruiting strategy, creating a Contracts Specialist position, enhancing communication and coordination processes in order to meet schedules and deadlines, creating a senior management team, identifying and developing a competency model for employees, cross-training employees in multiple functions, and implementing individual development plans. The Institute is working to implement 3 recommendations related to performance management, document control, and knowledge management processes and procedures, and intends to implement the remaining recommendations by 2010. The Institute has addressed fiscal management deficiencies on four fronts. First (as described in 3.6), the Institute contracted with an outside firm in 2002 to review its contract files to ensure records and close-out activities complied with federal laws and regulations. Second, the Institute worked with the Department's Financial Improvement and Post-Audit Operations team from 2002 to 2003 to improve its invoicing procedures and strengthen budget execution and related functions. The immediate results of this work included a revision of billing requirements for contractors so the Institute can better monitor contract costs. Third, in 2008, the Institute developed a database to record all contract deliverables and their deadlines.
The database enables more rigorous contract monitoring by assembling key schedule and performance milestones in one place, making it easier for the Contract Specialist and Institute management to track contract performance. By Month Year, the Institute will implement a new requirement for monthly budget reconciliations so that managers can properly monitor expenditures. Lastly, by Month Year, the Institute will implement a plan to ensure that no more than one percent of funds remain available at the end of the fiscal year.

Evidence: • Statement of Work - Technical Assistance for Program Planning (MPR Associates Inc. - Contract #ED-04-CO-0121/002) • Efficiency Measure, the Department's VPS system • Statement of Work - Technical Assistance for Developing and Implementing a Strategic Plan, RFP# ED-07-R-0106 (Practical Strategy Corp. - Contract #ED-07-PC-1454) (www.nifl.gov/nifl/grants_contracts/sow/ED-07-R-0106_SOW.doc) • Statement of Work - Technical Assistance for Planning/Facilitating Strategic Planning Session, RFP# ED-07-R-0015 (Judith Moak - Contract #ED-07-PO-0282) • Statement of Work - Development of Performance Measures, RFP# ED-07-R-0094 (Urban Institute - Contract #ED-07-PO-1416) • Need results of the reviews as well

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: The Institute's grant process is governed by the rules outlined in EDGAR. Grant competitions are advertised in the Federal Register and on Grants.gov and are open to individual public and private agencies, institutions, and non-profits, or consortia of these organizations. Proposals are reviewed by a panel of independent, qualified reviewers according to predetermined criteria described in the grant announcement. Unsolicited grant proposals have occasionally been submitted to the Institute. In 2006, four unsolicited proposals were received; only one was funded, using the process required by EDGAR as described above. The funded proposal requested funds for a fifth wave of data collection and analysis for a longitudinal study of the extent, antecedents, and consequences of basic skills changes in adulthood. The study, called the Longitudinal Study of Adult Learning (LSAL), is a panel study representative of a local target population for adult literacy services. The Institute has consulted with the Office of General Counsel and has been advised that it may adopt EDGAR criteria for occasional grant competitions to fund innovative ideas. The Institute plans to adopt this strategy and will no longer accept unsolicited proposals.

Evidence: • Education Department General Administrative Regulations (EDGAR) www.ed.gov/policy/fund/reg/edgarReg/edgar.html • National Institute for Literacy, Notice of Consideration of Unsolicited Grant Proposals and Intent to Publish Regulations, Federal Register, Volume 71, Number 120, Page 35956, Thursday, June 22, 2006 http://frwebgate.access.gpo.gov/cgi-bin/getpage.cgi?dbname=2006_register&position=all&page=35956 • National Institute for Literacy, Overview Information; Literacy Information and Communication (LINCS) Regional Resource Centers; Notice Inviting Applications for New Awards for Fiscal Year (FY) 2006; Notice, Federal Register, Volume 71, Number 147, Tuesday, August 1, 2006, Pages 43628 - 43631 http://frwebgate.access.gpo.gov/cgi-bin/getpage.cgi?position=all&page=43628&dbname=2006_register • National Institute for Literacy, Overview Information; Literacy Information and Communication (LINCS) Resource Collections; Notice Inviting Applications for New Awards for Fiscal Year (FY) 2006; Notice, Federal Register, Volume 71, Number 151, Monday, August 7, 2006, Pages 44716 - 44720 http://frwebgate.access.gpo.gov/cgi-bin/getpage.cgi?position=all&page=44716&dbname=2006_register • List of Institute grants awarded in FY05 and FY06

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The Institute has established oversight procedures for its six LINCS grantees - three regional centers and three resource collections - that rely on telephone, written, and in-person monitoring activities. The Institute's LINCS grantees do not operate physical locations that provide services; therefore, site visits are not the linchpin of the Institute's oversight. Instead, the Institute requires quarterly written reports from each grantee describing major activities and accomplishments, plans for work to be completed in the coming quarter, and approaches to addressing problems. The Institute holds monthly telephone conference calls with grantees and operates discussion lists to promote communication with and among the grantees between calls and meetings. The Institute convenes two meetings per year with all grantees. Further, the Program Officer who manages all six grants approves all of the grantees' activities in advance, including their work on meetings with partners, conferences, trainings, and presentations. The Program Officer works closely with the grantees to ensure that they are providing the services required by the grant, spending about 50% of her time on grant oversight activities. The Deputy Director is responsible for managing the one unsolicited grant, the only other grant made by the Institute. Grantees submit annual budgets and reports of spending by line item at the end of each grant year. The final report is a cumulative report of the year's accomplishments as well as actual expenditures for the entire grant period. In-kind contributions and contributions by grant partners are also identified. This is reviewed by the Program Officer, who would also be notified automatically by the Department of Education should a grantee attempt to draw down an inappropriately large portion of its grant funding. The Institute has never received such a notification.
All of this information allows the Program Officer to monitor actual grantee work against the expectations laid out in the grant agreement.

Evidence: • Sample Quarterly Grant Reports - Region 2 Regional Resource Center, Center for Literacy Studies, University of Tennessee, Grant Number X257T060003, 3rd and 4th quarters of 2007 • Sample Annual Grant Report - Year 2 (FY 2006-2007) Continuation Proposal for LINCS Region 1 Resource Center, Grant Award Number X257TO60001, Submitted July 15, 2007 • Sample Discussion List Quarterly Report - Assessment Discussion List, University of Tennessee, October 1 - December 31, 2007, Submitted January 14, 2008 • Sample Quarterly Grant Report - Basic Skills Resource Collection, Ohio Literacy Resource Center and Pennsylvania State University, October 1 - December 31, 2006

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: Although the Institute collects grantee performance data annually, it does not make this information available to the public in a transparent and meaningful way. The Institute collects grantee performance data in two ways. First, it receives quarterly and annual performance reports on required activities for the duration of the performance period. The quarterly reports are publicly available on the Institute's web site; however, these reports do not include performance data. Second, the Institute receives data from training sessions run by the grantees for use in its first performance measure, which assesses the impact of the training.

Evidence: • Sample Annual Grant Report - Year 2 (FY 2006-2007) Continuation Proposal for LINCS Region 1 Resource Center, Grant Award Number X257TO60001, Submitted July 15, 2007 • http://www.nifl.gov/lincs/about/reports/quarterlyreports.html

NO 0%
Section 3 - Program Management Score 70%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The Institute established baselines for performance measures in FY04 and FY05 but adopted new measures in FY06 to make the measures more reflective of actual performance, focusing more on outcomes than on outputs and on the quality of the outcomes achieved. During FY06 and FY07, the training offered through LINCS and Bridges, the data sources for the Institute's current measures, was suspended while the projects were redesigned and grants and contracts re-competed to implement recommendations made during program evaluations. During this period, the LINCS Resource Collections, the data source for the Institute's third measure, were also awaiting the completion of new, more rigorous material selection criteria. The Institute plans to set baseline data for the three existing measures in 2008.

Evidence: The Department's VPS system

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The Institute has established baselines and targets for its efficiency measure and one of its performance measures. It has not yet established baselines or performance goals for its two other current measures and one proposed measure, but anticipates doing so by the end of 2008.

Evidence: The Department's VPS system

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: Because only baseline data exist for the efficiency measure at this point, it is not yet known whether the program has improved its efficiency over the prior year. However, the program can cite two ways it has operated cost-effectively: preparing for selected meetings using Institute staff rather than contractors, and printing publications in very large quantities to achieve economies of scale. The quality of the Institute's work, as well as its capacity to provide national leadership, is enhanced by consultations with nationally recognized experts, so the Institute convenes these project-focused meetings regularly. Over the past 12 months, the Institute held three expert-group meetings with 10-12 participants each and used Institute staff to develop agendas, prepare materials, plan participant travel, disburse honoraria, and handle related activities. Because Institute salaries and overhead are expenses the Institute would pay under any circumstances, planning and managing meetings in-house (staff workload permitting) avoids contractor labor and overhead costs above and beyond those fixed expenses; doing so saved the Institute approximately $42,000 in contractor labor and overhead fees. While logistics contracts for larger meetings (30 or more attendees) benefit from economies of scale, handling meetings of 15 or fewer attendees using only Institute resources has resulted in considerable savings.
The Institute has also demonstrated improved efficiencies through economies of scale in its ESEA-funded publication orders. The Institute's first orders for its publications were often for fewer than 25,000 copies, with per-copy costs of $.92, $1.06, and $1.84. After reviewing these costs, the Institute realized it could significantly lower them by printing or reprinting its publications in very large quantities once a year rather than in smaller quantities several times a year. As a result, per-copy costs were reduced from $.92 to $.13, from $1.06 to $.32, and from $1.84 to $.42. For example, printing 625,000 copies of a $1.06-per-copy document costs $662,500, while the same quantity of a $.32-per-copy document costs $200,000. The use of simpler designs and bindings, fewer photographs, and basic colors also helped reduce printing costs. Finally, the Institute anticipates improved efficiencies in the new design of Bridges training. In the past, each cohort of participants attended two non-sequential weeks of training; the Institute now plans to convene each cohort for a single week in 2008, reducing overall training costs by half, or $55,830.
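The printing economies of scale cited above can be checked directly. This is a minimal sketch using the assessment's own figures (the 625,000-copy run at $1.06 versus $.32 per copy); the function name is illustrative, not from any Institute system.

```python
def print_run_cost(copies, per_copy_cost):
    """Total cost of a print run at a given per-copy price."""
    return copies * per_copy_cost

copies = 625_000
old_cost = print_run_cost(copies, 1.06)  # several smaller runs per year
new_cost = print_run_cost(copies, 0.32)  # one large annual run

print(f"Old cost: ${old_cost:,.0f}")                 # $662,500
print(f"New cost: ${new_cost:,.0f}")                 # $200,000
print(f"Savings:  ${old_cost - new_cost:,.0f}")      # $462,500
```

The same arithmetic confirms the figures in the text: the large annual run costs $200,000 rather than $662,500 for the same quantity, a savings of $462,500 on that one document.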

Evidence: • List of Partnership Publications • Excel Spreadsheet with Estimated Contractor Labor and Overhead Costs for Small Meetings (10 to 15 people) • Excel Spreadsheet with Bridges Training Travel Estimates • ED's VPS system

SMALL EXTENT 8%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: As noted in the answer to Question 1.3, an examination of the Institute's program design shows that its authorized activities under the Workforce Investment Act and ESEA overlap somewhat with other Federal and non-Federal efforts. However, no comparable performance data are available for making comparisons, and such data would be inherently difficult to obtain.

Evidence: ED's VPS system

NA  %
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: As described in section 2.6, the Institute has reviewed two major activities (LINCS and the Partnership for Reading) and is in the process of reviewing a third (Bridges to Practice). The Literacy Information and Communication System (LINCS) analysis, conducted by RMC Research Corporation and completed in September 2005, reviewed the project's five major components: the web site, discussion lists, the regional technology centers, the special collections, and the Assessment Strategies and Reading Profiles tool. The reviewers used interviews, surveys, content analysis, usability tests, and expert reviews, as appropriate, to assess each of the components. The review reached conclusions in the areas of quality, value, and effectiveness in addition to identifying its users. The evaluation found that LINCS reached its intended audience and provided benefits to them. Regarding the quality of LINCS, the evaluation found that the infrastructure was more complicated than necessary and the search tool was problematic. Management and implementation of the project were "generally satisfactory." Positive results from surveys of users suggested that the materials and resources were of high value to users. The evaluation found LINCS services to be more cost-effective than comparable services provided by for-profit firms. Regarding effectiveness, the evaluation found that LINCS increased the adult literacy field's capacity for technology and built a "successful and extensive network of linked web sites, unique in the field of adult education/literacy." The review also provided specific recommendations for improving LINCS, most of which the Institute used to direct changes in the project, reflected in the FY06 grant competition. The Bridges to Practice program review draft report, prepared by TATC Consulting in January 2007, found, "…the training provides a wealth of information to participants on working with low literacy adults with learning disabilities.
The guidebooks are a comprehensive resource for participants. The training staff demonstrates dedication to the B2P program and a real desire to increase awareness of participants to assist them in better serving their population of low literacy adults with learning disabilities. There are specific master trainers with strong skill sets in the areas of training and learning disabilities. Participants identify a strong need for information on working with low literacy adults with learning disabilities and an eagerness to share this information with the organizations they represent. This provides a strong foundation for the B2P program." The report includes recommendations, which will be further informed upon completion of the full review. The Institute has already agreed to implement some of the recommendations, including improving the training materials, improving the skills of the trainers, and restructuring the training sessions. The final report is expected in October 2008. Finally, the Partnership for Reading evaluation, conducted by Abt Associates and submitted in March 2006, reported positive results in response to three evaluation questions. Yet, as discussed in Section 2.6, this evaluation had serious data collection flaws that caused it to fall short in scope and quality. Initiatives of modest scope and limited performance periods, which represent the remainder of the Institute's program activities, do not easily lend themselves to outcome-based evaluations. These initiatives typically result in papers, reports, or other resources that can be evaluated most appropriately via peer review, a process the Institute has standardized in recent years. Activities such as printing and shipping are not appropriate to evaluate for outcomes. The Institute is in the process of creating supplemental measures for these smaller activities.

Evidence: • National Institute for Literacy Bridges to Practice Training and Certification Program Review, Draft Report, TATC Consulting, January 22, 2007 • Comprehensive Review and Analysis of the Literacy Information and Communication System (LINCS), RMC Research Corporation, September 25, 2005 • "Comprehensive Review and Analysis of the Literacy Information and Communication System (LINCS), Final Report, Executive Summary," www.nifl.gov/nifl/executive_summary.doc • The Partnership for Reading: Final Report - Dissemination Activities 2001-2005, Abt Associates, March 2006 • Statement of Work - Development of Performance Measures, RFP# ED-07-R-0094 (Urban Institute - Contract #ED-07-PO-1416)

SMALL EXTENT 8%
Section 4 - Program Results/Accountability Score 16%


Last updated: 01092009.2008FALL