U.S. Department of Education: Promoting Educational Excellence for all Americans

Exhibit 300 FY2009

PART I: SUMMARY INFORMATION AND JUSTIFICATION

In Part I, complete Sections A, B, C, and D for all capital assets (IT and non-IT). Complete Sections E and F for IT capital assets.

Section A: Overview (All Capital Assets)

The following series of questions is to be completed for all investments.

I. A. 1. Date of Submission:
2007-09-10

I. A. 2. Agency:
018

I. A. 3. Bureau:
50

I. A. 4. Name of this Capital Asset:
(short text - 250 characters)
National Assessment of Educational Progress (NAEP)

I. A. 5. Unique Project (Investment) Identifier:
For IT investment only, see section 53. For all other, use agency ID system.
018-50-01-05-01-1020-00

I. A. 6. What kind of investment will this be in FY2009?
Please NOTE: Investments moving to O&M in FY2009, with Planning/Acquisition activities prior to FY2009, should not select O&M. These investments should indicate their current status.
Mixed Life Cycle

I. A. 7. What was the first budget year this investment was submitted to OMB?
FY2003

I. A. 8. Provide a brief summary and justification for this investment, including a brief description of how this investment closes, in part or in whole, an identified agency performance gap:
(long text - 2500 characters)
The National Center for Education Statistics (NCES) is the sponsoring entity for the National Assessment of Educational Progress (NAEP) program, a nationwide assessment effort involving multiple contractors responsible for performing the assessment and executing the vision of the assessment from NCES and the National Assessment Governing Board (NAGB). The execution of this vision includes:
- The creation and coordination of test items and questionnaires to be used in a variety of assessment topics;
- The development and production of test instruments and supporting materials for the assessments;
- The hiring and management of full-time and part-time staff to coordinate and administer the assessments;
- The creation and release of reports detailing the results of the assessments and the methodologies used to administer the assessments; and
- The development and deployment of web-based applications to manage and integrate the contractor activities and to report results to the American public.
NCES and the Alliance contractors have established a set of tools to allow for review of and collaboration on project documentation, including project deliverables. These online tools open the communication among the parties involved in performing NAEP-related assessments and make critical documents accessible. The tools created for coordination include two critical systems for management.
NAEP Network is a set of applications available to NCES, State NAEP coordinators, and NAEP assessment contractors to receive updated information and guidance regarding the current year's assessments and to collaborate with NAEP personnel. NAEP Network allows NCES, the Alliance contractors, and field staff to communicate and collaborate on the status and activities of the current year's assessment.
Additionally, the NAEP Integrated Management System (IMS) provides an online intranet for NCES and Alliance contractors, including a critical document repository, contact and calendar information, and other resources for the NAEP program. The IMS provides a central library of NAEP program documentation, including design plans, project deliverables, and schedules for review by NCES and Alliance contractors. The IMS also provides a secure environment for Alliance and NCES personnel to exchange information regarding the assessment efforts, to track document and plan changes, and to collaborate on document development.

I. A. 9. Did the Agency's Executive/Investment Committee approve this request?
yes

I. A. 9. a. If "yes," what was the date of this approval?
2007-06-14

I. A. 10. Did the Project Manager review this Exhibit?
yes

I. A. 11. Contact information of Project Manager

Name
(short text - 250 characters)

Phone Number
(short text - 250 characters)

E-mail
(short text - 250 characters)

I. A. 11. a. What is the current FAC-P/PM certification level of the project/program manager?

I. A. 12. Has the agency developed and/or promoted cost effective, energy-efficient and environmentally sustainable techniques or practices for this project?
yes

I. A. 12. a. Will this investment include electronic assets (including computers)?
yes

I. A. 12. b. Is this investment for new construction or major retrofit of a Federal building or facility? (answer applicable to non-IT assets only)
no

I. A. 12. b. 1. If "yes," is an ESPC or UESC being used to help fund this investment?

I. A. 12. b. 2. If "yes," will this investment meet sustainable design principles?

I. A. 12. b. 3. If "yes," is it designed to be 30% more energy efficient than relevant code?

I. A. 13. Does this investment directly support one of the PMA initiatives?
yes

I. A. 13. a. If "yes," check all that apply:
Competitive Sourcing

I. A. 13. b. For each initiative selected, briefly and specifically describe how this asset directly supports it. (e.g. If E-Gov is selected, is it an approved shared service provider or the managing partner?)
(medium text - 500 characters)
NCES has instituted several mechanisms for performing oversight and control of the activities of the Alliance contractors in their execution of the NAEP program activities. These mechanisms have been instituted to keep NCES informed of project progress and status, as well as to maintain an understanding of current issues or concerns. The three key areas of oversight are:
- Activity Reporting
- Coordination Meetings
- Online Collaboration and Coordination

I. A. 14. Does this investment support a program assessed using the Program Assessment Rating Tool (PART)? (For more information about the PART, visit www.whitehouse.gov/omb/part.)
no

I. A. 14. a. If "yes," does this investment address a weakness found during the PART review?
no

I. A. 14. b. If "yes," what is the name of the PARTed Program?
(short text - 250 characters)

I. A. 14. c. If "yes," what PART rating did it receive?

I. A. 15. Is this investment for information technology?
yes

I. A. 16. What is the level of the IT Project? (per CIO Council PM Guidance)
Level 1 - Projects with low-to-moderate complexity and risk. Example: Bureau-level project such as a stand-alone information system that has low- to-moderate complexity and risk.
Level 2 - Projects with high complexity and/or risk which are critical to the mission of the organization. Examples: Projects that are part of a portfolio of projects/systems that impact each other and/or impact mission activities. Department-wide projects that impact cross-organizational missions, such as an agency-wide system integration that includes large scale Enterprise Resource Planning (e.g., the DoD Business Mgmt Modernization Program).
Level 3 - Projects that have high complexity, and/or risk, and have government-wide impact. Examples: Government-wide initiative (E-GOV, President's Management Agenda). High interest projects with Congress, GAO, OMB, or the general public. Cross-cutting initiative (Homeland Security).

Level 1

I. A. 17. What project management qualifications does the Project Manager have? (per CIO Council's PM Guidance):
(1) Project manager has been validated as qualified for this investment; (2) Project manager qualification is under review for this investment; (3) Project manager assigned to investment, but does not meet requirements; (4) Project manager assigned but qualification status review has not yet started; (5) No project manager has yet been assigned to this investment
(1) Project manager has been validated as qualified for this investment

I. A. 18. Is this investment identified as "high risk" on the Q4-FY 2007 agency high risk report (per OMB Memorandum M-05-23)?
no

I. A. 19. Is this a financial management system?
no

I. A. 19. a. If "yes," does this investment address a FFMIA compliance area?

I. A. 19. a. 1. If "yes," which compliance area
(short text - 250 characters)

I. A. 19. a. 2. If "no," what does it address?
(medium text - 500 characters)

I. A. 19. b. If "yes," please identify the system name(s) and system acronym(s) as reported in the most recent financial systems inventory update required by Circular A-11 section 52
(long text - 2500 characters)

I. A. 20. What is the percentage breakout for the total FY2009 funding request for the following? (This should total 100%)

I. A. 20. a. Hardware
1

I. A. 20. b. Software
3

I. A. 20. c. Services
96

I. A. 20. d. Other
0

I. A. 21. If this project produces information dissemination products for the public, are these products published to the Internet in conformance with OMB Memorandum 05-04 and included in your agency inventory, schedules and priorities?
yes

I. A. 22. Contact information of individual responsible for privacy related questions:

I. A. 22. a. Name
(short text - 250 characters)

I. A. 22. b. Phone Number
(short text - 250 characters)

I. A. 22. c. Title
(short text - 250 characters)

I. A. 22. d. E-mail
(short text - 250 characters)

I. A. 23. Are the records produced by this investment appropriately scheduled with the National Archives and Records Administration's approval?
yes

I. A. 24. Does this investment directly support one of the GAO High Risk Areas?
Question 24 must be answered by all Investments:
no

Section B: Summary of Spending (All Capital Assets)

I. B. 1. Provide the total estimated life-cycle cost for this investment by completing the following table. All amounts represent budget authority in millions, and are rounded to three decimal places. Federal personnel costs should be included only in the row designated "Government FTE Cost," and should be excluded from the amounts shown for "Planning," "Full Acquisition," and "Operation/Maintenance." The "TOTAL" estimated annual cost of the investment is the sum of costs for "Planning," "Full Acquisition," and "Operation/Maintenance." For Federal buildings and facilities, life-cycle costs should include long term energy, environmental, decommissioning, and/or restoration costs. The costs associated with the entire life-cycle of the investment should be included in this report.
Note: For the cross-agency investments, this table should include all funding (both managing and partner agencies).
Government FTE Costs should not be included as part of the TOTAL represented.

  PY-1 and Spending Prior to 2007 PY 2007 CY 2008 BY 2009 BY+1 2010 BY+2 2011 BY+3 2012 BY+4 2013 and Beyond
Planning 0.000 0.000 0.000 0.500        
Acquisition 0.000 0.000 1.387 1.730        
Subtotal Planning & Acquisition 0.000 0.000 1.387 2.230        
Operations & Maintenance 10.817 3.678 2.107 2.212        
Total 10.817 3.678 3.494 4.442        
Government FTE Costs 0.329 0.112 0.115 0.120        
Number of FTE represented by cost 0 1 1 1        
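Per the table instructions above, the "Subtotal Planning & Acquisition" row is the column-wise sum of Planning and Full Acquisition, and the "TOTAL" row adds Operations & Maintenance to that subtotal, with Government FTE costs reported separately and excluded. A minimal sketch checking the reported figures (budget authority in $M, for the four years with data):

```python
# Figures reported in the Summary of Spending table for
# PY-1/prior, PY 2007, CY 2008, and BY 2009.
planning    = [0.000, 0.000, 0.000, 0.500]
acquisition = [0.000, 0.000, 1.387, 1.730]
o_and_m     = [10.817, 3.678, 2.107, 2.212]

# Subtotal Planning & Acquisition, per column.
subtotal = [round(p + a, 3) for p, a in zip(planning, acquisition)]

# TOTAL = Planning + Full Acquisition + Operation/Maintenance.
# Government FTE costs are excluded from the TOTAL by instruction.
total = [round(s + m, 3) for s, m in zip(subtotal, o_and_m)]

print(subtotal)  # [0.0, 0.0, 1.387, 2.23]
print(total)     # [10.817, 3.678, 3.494, 4.442]
```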

I. B. 2. Will this project require the agency to hire additional FTE's?
no

I. B. 2. a. If "yes," How many and in what year?
(medium text - 500 characters)

I. B. 3. If the summary of spending has changed from the FY2008 President's budget request, briefly explain those changes.
(long text - 2500 characters)
The spending has changed from the 2007 President's Budget Request to reflect recent direction from the National Assessment Governing Board (NAGB) regarding supporting requirements for IT management and for the expansion of the assessment scope and range. The adjustments to the budget reflect increased use of information technology (IT) in the conduct of assessments, as well as increased functionality of management and reporting capabilities in current IT assets. These changes have resulted in increased requirements for IT solutions, including the introduction of online questionnaires, electronic assessment instruments, and consolidated status reporting.

Section C: Acquisition/Contract Strategy (All Capital Assets)

I. C. 1. Complete the table for all (including all non-Federal) contracts and/or task orders currently in place or planned for this investment. Total Value should include all option years for each contract. Contracts and/or task orders completed do not need to be included.
SIS - Share-in-Savings contract; ESPC - Energy savings performance contract; UESC - Utility energy efficiency service contract; EUL - Enhanced use lease contract; N/A - no alternative financing used.
(Character Limitations: Contract or Task Order Number - 250 Characters; Type of Contract/Task Order - 250 Characters; Name of CO - 250 Characters; CO Contact Information - 250 Characters)

  Type of Contract/Task Order Has the contract been awarded? If so, what is the date of the award? If not, what is the planned award date? Start date of Contract/Task Order End date of Contract/Task Order Total Value of Contract/Task Order ($M) Is this an Interagency Acquisition? Is it performance based? Competitively awarded? What, if any, alternative financing option is being used? Is EVM in the contract? Does the contract include the required security & privacy clauses? Name of CO CO Contact Information (phone/email) Contracting officer certification level If N/A, has the agency determined the CO assigned has the competencies and skills necessary to support this acquisition?
                                 
                                 

I. C. 2. If earned value is not required or will not be a contract requirement for any of the contracts or task orders above, explain why:
(long text - 2500 characters)

I. C. 3. Do the contracts ensure Section 508 compliance?

I. C. 3. a. Explain Why:
(medium text - 500 characters)

I. C. 4. Is there an acquisition plan which has been approved in accordance with agency requirements?
yes

I. C. 4. a. If "yes," what is the date?

I. C. 4. b. If "no," will an acquisition plan be developed?

I. C. 4. b. 1. If "no," briefly explain why:
(medium text - 500 characters)

Section D: Performance Information (All Capital Assets)

In order to successfully address this area of the exhibit 300, performance goals must be provided for the agency and be linked to the annual performance plan. The investment must discuss the agency's mission and strategic goals, and performance measures (indicators) must be provided. These goals need to map to the gap in the agency's strategic goals and objectives this investment is designed to fill. They are the internal and external performance benefits this investment is expected to deliver to the agency (e.g., improve efficiency by 60 percent, increase citizen participation by 300 percent a year to achieve an overall citizen participation rate of 75 percent by FY 2xxx, etc.). The goals must be clearly measurable investment outcomes, and if applicable, investment outputs. They do not include the completion date of the module, milestones, or investment, or general goals, such as, significant, better, improved that do not have a quantitative measure.

I. D. 1. Table 1. Performance Information Table

Agencies must use the following table to report performance goals and measures for the major investment and use the Federal Enterprise Architecture (FEA) Performance Reference Model (PRM). Map all Measurement Indicators to the corresponding "Measurement Area" and "Measurement Grouping" identified in the PRM. There should be at least one Measurement Indicator for each of the four different Measurement Areas (for each fiscal year). The PRM is available at www.egov.gov. The table can be extended to include performance measures for years beyond FY 2009.

  Strategic Goal(s) Supported Measurement Area Measurement Grouping Measurement Indicator Baseline Target Actual Results
2006 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Customer Results Customer Services Average visitors to Integrated Management System per day to be measured by number of unique users (unique IP addresses) to the IMS as measured in user logs. 50 80 100
2006 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Mission and Business Results Frequency and Depth Average number of page requests per month on NAEP web site to be measured as an average of page requests to NAEP public web site and to NAEP 2006 release web site. 40,000 50,000 372,500
2006 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Customer Results Customer Satisfaction Percentage of NAEP data users who are satisfied or very satisfied with NAEP products. This information will be provided as part of the NAEP Web Contractor's Award Fee Evaluation. 75 100 100
2006 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Efficiency Integrated Management System (IMS) content items increase with additional use. 15 25 54
2006 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Cycle Time Reduction in time required to review the content to be published on the web. 2 3 4
2006 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Technology Availability Percentage of time NAEP web site is available. To be measured using up-time logs and access times recorded at host. 92 95 97
2007 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Customer Results Customer Services Average visitors to Integrated Management System per day to be measured by number of unique users (unique IP addresses) to the IMS as measured in user logs. 60 85 65
2007 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Mission and Business Results Frequency and Depth Average number of page requests per month on NAEP web site. 45,000 55,000 300,000
2007 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Customer Results Customer Satisfaction Percentage of NAEP data users who are satisfied or very satisfied with NAEP products. This information will be provided as part of the NAEP Web Contractor's Award Fee Evaluation. 75 100 100
2007 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Efficiency Integrated Management System (IMS) content items increase with additional use to be measured in number of published pages and content items contained within the IMS. 10 20 20
2007 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Cycle Time Reduction in time required to review content to be published on the web to be measured against 2006 review cycles and 2007 review cycles using online review tools. 1 2 2
2007 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Technology Availability Percentage of time NAEP web site is available to be measured using up-time logs and access times recorded at host. 94 97 96
2007 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Efficiency Content Management System (CMS) and ADTracker content items increase with additional use, to be measured in number of published pages and content items contained within the CMS/ADTracker. 10 20 20
2008 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Customer Results Customer Services Average visitors to Integrated Management System per day to be measured by number of unique users (unique IP addresses) to the IMS as measured in user logs. 75 90 TBD
2008 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Mission and Business Results Frequency and Depth Average number of page requests per month on NAEP web site as measured by total page requests from the NAEP public web site. 200,000 350,000 TBD
2008 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Customer Results Customer Satisfaction Percentage of NAEP data users who are satisfied or very satisfied with NAEP products. This information will be provided as part of the NAEP Web Contractor's Award Fee Evaluation. 75 100 TBD
2008 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Efficiency Integrated Management System (IMS) content items increase with additional use to be measured in number of published pages and content items contained within the IMS. 10 20 TBD
2008 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Cycle Time Reduction in time required to review content to be published on the web to be measured against 2006 review cycles and 2007 review cycles using online review tools. 1 2 TBD
2008 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Efficiency Content Management System (CMS) and ADTracker content items increase with additional use, to be measured in number of published pages and content items contained within the CMS/ADTracker. 10 20 TBD
2008 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Technology Availability Percentage of time NAEP web site is available to be measured using up-time logs and access times recorded at host as hours available per month. 95 99 TBD
2009 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Customer Results Customer Services Average visitors to Integrated Management System per day to be measured by number of unique users (unique IP addresses) to the IMS as measured in user logs. 80 95 TBD
2009 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Mission and Business Results Frequency and Depth Average number of page requests per month on NAEP web site as measured by total page requests from the NAEP public web site. 250,000 350,000 TBD
2009 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Customer Results Customer Satisfaction Percentage of NAEP data users who are satisfied or very satisfied with NAEP products. This information will be provided as part of the NAEP Web Contractor's Award Fee Evaluation. 80 100 TBD
2009 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Efficiency Reduction in time required to review content to be published on the web to be measured against 2006 review cycles and 2007 review cycles using online review tools. 12 24 TBD
2009 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Cycle Time Reduction in time required to review content to be published on the web to be measured against 2006 review cycles and 2007 review cycles using online review tools. 2 3 TBD
2009 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Processes and Activities Efficiency Content Management System (CMS) and ADTracker content items increase with additional use, to be measured in number of published pages and content items contained within the CMS/ADTracker. 12 24 TBD
2009 Goal 1: Objective 1: Improve Student Achievement in reading/language arts; Goal 1: Objective 2: Improve Student Achievement in Mathematics Technology Availability Percentage of time NAEP web site is available to be measured using up-time logs and access times recorded at host as hours available per month. 96 99 TBD

Section E: Security and Privacy (IT Capital Assets only)

In order to successfully address this area of the business case, each question below must be answered at the system/application level, not at a program or agency level. Systems supporting this investment on the planning and operational systems security tables should match the systems on the privacy table below. Systems on the Operational Security Table must be included on your agency FISMA system inventory and should be easily referenced in the inventory (i.e., should use the same name or identifier).

For existing Mixed-Life Cycle investments where enhancement, development, and/or modernization is planned, include the investment in both the "Systems in Planning" table (Table 3) and the "Operational Systems" table (Table 4). Systems which are already operational, but have enhancement, development, and/or modernization activity, should be included in both Table 3 and Table 4. Table 3 should reflect the planned date for the system changes to be complete and operational, and the planned date for the associated C&A update. Table 4 should reflect the current status of the requirements listed. In this context, information contained within Table 3 should characterize what updates to testing and documentation will occur before implementing the enhancements; and Table 4 should characterize the current state of the materials associated with the existing system.

All systems listed in the two security tables should be identified in the privacy table. The list of systems in the "Name of System" column of the privacy table (Table 8) should match the systems listed in columns titled "Name of System" in the security tables (Tables 3 and 4). For the Privacy table, it is possible that there may not be a one-to-one ratio between the list of systems and the related privacy documents. For example, one PIA could cover multiple systems. If this is the case, a working link to the PIA may be listed in column (d) of the privacy table more than once (for each system covered by the PIA).

I. E. 1. Have the IT security costs for the system(s) been identified and integrated into the overall costs of the investment?

I. E. 1. a. If "yes," provide the "Percentage IT Security" for the budget year:

I. E. 2. Is identifying and assessing security and privacy risks a part of the overall risk management effort for each system supporting or part of this investment?

I. E. 3. Systems in Planning and Undergoing Enhancement(s) – Security Table:
The questions asking whether there is a PIA which covers the system and whether a SORN is required for the system are discrete from the narrative fields. The narrative column provides an opportunity for free text explanation why a working link is not provided. For example, a SORN may be required for the system, but the system is not yet operational. In this circumstance, answer "yes" for column (e) and in the narrative in column (f), explain that because the system is not operational the SORN is not yet required to be published.

  Agency- or Contractor-Operated System Planned Operational Date Planned or Actual C&A Completion Date
       
       
       
       
       

I. E. 4. Operational Systems - Security:

  Agency- or Contractor-Operated System NIST FIPS 199 Risk Impact Level (High, Moderate, Low) Has C&A been Completed, using NIST 800-37? (Y/N) Date C&A Completed What standards were used for the Security Controls tests? Date Security Control Testing Completed Date the Contingency Plan was Tested
               
               
               
               
               

I. E. 5. Have any weaknesses related to any of the systems part of or supporting this investment been identified by the agency or IG?

I. E. 5. a. If "yes," have those weaknesses been incorporated into the agency's plan of action and milestone process?

I. E. 6. Indicate whether an increase in IT security funding is requested to remediate IT security weaknesses?

I. E. 6. a. If "yes," specify the amount, provide a general description of the weakness, and explain how the funding request will remediate the weakness.
(long text - 2500 characters)

I. E. 7. How are contractor security procedures monitored, verified, and validated by the agency for the contractor systems above?
(long text - 2500 characters)

I. E. 8. Planning & Operational Systems - Privacy Table:
Details for Text Options:
Column (d): If yes to (c), provide the link(s) to the publicly posted PIA(s) with which this system is associated. If no to (c), provide an explanation why the PIA has not been publicly posted or why the PIA has not been conducted.

Column (f): If yes to (e), provide the link(s) to where the current and up to date SORN(s) is published in the federal register. If no to (e), provide an explanation why the SORN has not been published or why there isn't a current and up to date SORN.

Note: Links must be provided to specific documents not general privacy websites.

  (b) Is this a new system? (Y/N) (c) Is there a Privacy Impact Assessment (PIA) that covers this system? (Y/N) (d) Internet Link or Explanation (e) Is a System of Records Notice (SORN) required for this system? (Y/N) (f) Internet Link or Explanation
           
           
           
           
           

Section F: Enterprise Architecture (EA) (IT Capital Assets only)

In order to successfully address this area of the business case and capital asset plan you must ensure the investment is included in the agency's EA and Capital Planning and Investment Control (CPIC) process, and is mapped to and supports the FEA. You must also ensure the business case demonstrates the relationship between the investment and the business, performance, data, services, application, and technology layers of the agency's EA.

I. F. 1. Is this investment included in your agency's target enterprise architecture?
yes

I. F. 1. a. If "no," please explain why?
(long text - 2500 characters)

I. F. 2. Is this investment included in the agency's EA Transition Strategy?
yes

I. F. 2. a. If "yes," provide the investment name as identified in the Transition Strategy provided in the agency's most recent annual EA Assessment.
(medium text - 500 characters)
National Assessment of Educational Progress (NAEP)

I. F. 2. b. If "no," please explain why?
(long text - 2500 characters)

I. F. 3. Is this investment identified in a completed (contains a target architecture) and approved segment architecture?
no

I. F. 3. a. If "yes," provide the name of the segment architecture.
(medium text - 500 characters)

I. F. 4. Service Component Reference Model (SRM) Table :
Identify the service components funded by this major IT investment (e.g., knowledge management, content management, customer relationship management, etc.). Provide this information in the format of the following table. For detailed guidance regarding components, please refer to http://www.egov.gov.

a. Use existing SRM Components or identify as "NEW". A "NEW" component is one not already identified as a service component in the FEA SRM.
b. A reused component is one being funded by another investment, but being used by this investment. Rather than answer yes or no, identify the reused service component funded by the other investment and identify the other investment using the Unique Project Identifier (UPI) code from the OMB Ex 300 or Ex 53 submission.
c. 'Internal' reuse is within an agency. For example, one agency within a department is reusing a service component provided by another agency within the same department. 'External' reuse is one agency within a department reusing a service component provided by another agency in another department. A good example of this is an E-Gov initiative service being reused by multiple organizations across the federal government.
d. Please provide the percentage of the BY requested funding amount used for each service component listed in the table. If external, provide the percentage of the BY requested funding amount transferred to another agency to pay for the service. The percentages in this column can, but are not required to, add up to 100%.

  Agency Component Description FEA SRM Service Type FEA SRM Component (a) Service Component Reused - Component Name (b) Service Component Reused - UPI (b) Internal or External Reuse? (c) BY Funding Percentage (d)
NAEP Web Branding Market Branding - Continued work will be made towards establishing a "NAEP" or "Nation's Report Card" brand, to include a common look-and-feel across all NAEP-related products and easier access to NAEP tools. Content Management Content Publishing and Delivery     No Reuse 3
NAEP CRM NAEP Customer Relationship Management - The deployment and maintenance of the Customer Relationship Management (CRM) suite of applications will be in place and supported. Additional support for the public's inquiries and the ability to provide self-service to many of the questions facing the public and facing NAEP State Coordinators will be provided with the addition of a communications tracking system, a knowledgebase, and a participation tracking system. Customer Relationship Management Product Management     No Reuse 6
Enhanced Project Tracking Enhanced Project Tracking - The NAEP project teams have worked to use current intranet systems to exchange project planning and tracking documentation. However, the need remains to provide a single source of project status information. The NAEP contractor will provide a mechanism for providing snapshots of project progress to NCES and other appropriate participants in the NAEP program. The system will provide visibility into the status of the assessments and the tasks awaiting completion. Tracking and Workflow Process Tracking     No Reuse 2
NAEP Content Engineering New interface and content presentation through a redesigned Public Web Site in compliance with new NCES template standards. This redesign will result in streamlined access to content within the system and compliance with both revised NCES style and template guidelines and with Federal goals for metadata format and publication. Knowledge Management Knowledge Distribution and Delivery     No Reuse 8
NAEP Data Analytics Increased analytics delivered regarding NAEP results through the implementation of an enhanced knowledge query database system. This system will give access to the NAEP data to a secure list of personnel and will provide insights into state performance trends in NAEP. Knowledge Discovery Data Mining     No Reuse 4
Task/Budget Tracking Increased program oversight through the implementation of a Program Task/Budget Tracking System to aid in the tracking and forecasting of tasks and expenditures for each NAEP assessment. Financial Management Activity-Based Management     No Reuse 2
Integrated NAEP Item Bank Integrated NAEP Item Bank providing single-view of item development tools provided by NAEP Item Development Contractors Knowledge Management Information Sharing     No Reuse 13
NIES Web Release National Indian Education Study (NIES) Public Release Site Development and Support Content Management Content Publishing and Delivery     No Reuse 12
Operations and Maintenance Maintenance and operational support of deployed applications and software for the NAEP program. Includes software updates, additions, and hosting services. Additional technical and functional support provided. Development and Integration Enterprise Application Integration     No Reuse 45
Security and Access Control Security management and access control services to manage access to the NAEP web assets and applications. Security Management Identification and Authentication     No Reuse 5

I. F. 5. Table 1. Technical Reference Model (TRM) Table:
To demonstrate how this major IT investment aligns with the FEA Technical Reference Model (TRM), please list the Service Areas, Categories, Standards, and Service Specifications supporting this IT investment.

a. Service Components identified in the previous question should be entered in this column. Please enter multiple rows for FEA SRM Components supported by multiple TRM Service Specifications
b. In the Service Specification field, agencies should provide information on the specified technical standard or vendor product mapped to the FEA TRM Service Standard, including model or version numbers, as appropriate.

  FEA TRM Service Area FEA TRM Service Category FEA TRM Service Standard Service Specification (i.e., vendor and product name)
Enterprise Application Integration Service Access and Delivery Access Channels Other Electronic Channels Network Solutions (URL)
Identification and Authentication Service Access and Delivery Service Requirements Legislative / Compliance Accessibility Standards (508); Privacy Act
Information Sharing Service Access and Delivery Service Requirements Hosting Dedicated hosting provided by sub
Product Management Service Access and Delivery Service Transport Supporting Network Services SMTP using Microsoft Internet Information Server; LDAP using Microsoft Windows Server; DNS using Network Solutions
Data Mining Service Access and Delivery Service Transport Service Transport HTTP/HTTPS using Microsoft Internet Information Server; FTP using Microsoft Windows Server
Enterprise Application Integration Service Platform and Infrastructure Support Platforms Platform Dependent Microsoft Windows Server; Microsoft Windows ASP.NET
Information Sharing Service Platform and Infrastructure Delivery Servers Web Servers Microsoft Internet Information Server
Information Sharing Service Platform and Infrastructure Delivery Servers Application Servers Microsoft Internet Information Server
Knowledge Distribution and Delivery Service Platform and Infrastructure Delivery Servers Portal Servers Microsoft SharePoint
Content Publishing and Delivery Service Platform and Infrastructure Delivery Servers Application Servers Microsoft Content Management Server
Activity-Based Management Service Platform and Infrastructure Software Engineering Integrated Development Environment Microsoft Visual Studio.NET provided by sub
Enterprise Application Integration Service Platform and Infrastructure Software Engineering Software Configuration Management Microsoft Visual SourceSafe provided by sub
Case Management Service Platform and Infrastructure Software Engineering Modeling Microsoft Visio
Enterprise Application Integration Service Platform and Infrastructure Database / Storage Database Microsoft SQL Server
Data Mining Service Platform and Infrastructure Hardware / Infrastructure Servers / Computers Dell PowerEdge Server provided by sub
Identification and Authentication Component Framework Security Certificates / Digital Signatures Digital Certificate by Thawte provided by sub
Content Publishing and Delivery Component Framework Presentation / Interface Static Display Microsoft Internet Information Server
Product Management Component Framework Presentation / Interface Dynamic Server-Side Display Microsoft ASP and Microsoft ASP.NET
Content Publishing and Delivery Component Framework Presentation / Interface Content Rendering Microsoft Internet Information Server
Enterprise Application Integration Component Framework Business Logic Platform Dependent Microsoft ASP and Microsoft ASP.NET
Data Mining Component Framework Data Management Database Connectivity Microsoft SQL Server
Knowledge Distribution and Delivery Component Framework Data Management Reporting and Analysis Microsoft SQL Server
Information Sharing Service Interface and Integration Interoperability Data Format / Classification Microsoft Visual Studio for XML creation
Information Sharing Service Interface and Integration Interoperability Data Transformation Microsoft Visual Studio for XML DTDs
Process Tracking Service Platform and Infrastructure Data Management Application Servers Microsoft SharePoint 2003 Portal Server
Process Tracking Service Platform and Infrastructure Data Management Content Rendering Microsoft Content Management Server 2000

I. F. 6. Will the application leverage existing components and/or applications across the Government (i.e., FirstGov, Pay.Gov, etc)?
no

I. F. 6. a. If "yes," please describe.
(long text - 2500 characters)

PART II: PLANNING, ACQUISITION AND PERFORMANCE INFORMATION

Part II should be completed only for investments identified as "Planning," "Full Acquisition," or "Mixed Life-Cycle" investments in response to Question 6 in Part I, Section A above.

Section A: Alternatives Analysis (All Capital Assets)

In selecting the best capital asset, you should identify and consider at least three viable alternatives, in addition to the current baseline, i.e., the status quo. Use OMB Circular A-94 for all investments and the Clinger-Cohen Act of 1996 for IT investments to determine the criteria you should use in your Benefit/Cost Analysis.

II. A. 1. Did you conduct an alternatives analysis for this project?
yes

II. A. 1. a. If "yes," provide the date the analysis was completed?
2006-01-02

II. A. 1. b. If "no," what is the anticipated date this analysis will be completed?

II. A. 1. c. If no analysis is planned, please briefly explain why:
(medium text - 500 characters)

II. A. 2. Use the results of your alternatives analysis to complete the following table:
(Character Limitations: Alternative Analyzed - 250 characters; Description of Alternative - 500 Characters)

  Description of Alternative Risk Lifecycle Cost Estimate Risk Lifecycle Benefits Estimate
Alternative 1 Alternative 1 provides for the use of a performance-based contract for web hosting and development services with oversight from NCES. The performance based contract provides for the contractor to serve as the primary developer of web applications and software solutions to support the NAEP program and its activities, including random sampling, data collection, processing large amounts of information, and publishing printed materials. The NCES commitment in this alternative is to provide manageria    
Alternative 2 Contractors would continue to perform such duties as described in Alternative 4, with the exception of performing or maintaining web operations. New federal employees would be hired to implement all functions of web operations and maintenance. The contractor would be responsible for all software development and software testing, but all hosting services, including hosting management, configuration management, and testing services, would be provided by the new federal employees under this alternative.    
Alternative 3 Keep all business functions in-house by hiring more federal workers. New federal employees would be hired to perform all NAEP activities listed in Alternative 1, thereby eliminating all contractor support.    
Alternative 4 Status quo whereby current contractor remains performing maintenance and support efforts and no additional costs for increased functionality and increased services    

II. A. 3. Which alternative was selected by the Agency's Executive/Investment Committee and why was it chosen?
(long text - 2500 characters)
Alternative 1 was selected because it offers the Government the best opportunity for success at the best possible value. The combination of an externally hosted web environment for secure intranet activities and data processing increases the efficiency of the other NAEP support contractors while reducing the overall managerial workload of DoED personnel. The use of contractor support in a performance-based model has provided and will provide the program with the best value for support and services to the NAEP program.

Alternatives 2 and 3 limit the capabilities and potential for the project and thus affect the legislative time constraints of the project. By having elements of the system securely hosted outside of the DoED infrastructure, the program is able to perform rapid development of intranet applications, perform technology upgrades faster, and respond to changing requirements for information technology support and services more quickly. Additionally, Alternatives 2 and 3 would require significant investments by DoED in infrastructure and manpower to support the current NAEP web operational environment.

Alternative 4 was not chosen because it does not provide for the advancement of new technologies or for increased streamlining efforts. Efforts performed here would require maintenance and support of existing systems and would result in outdated applications during the current lifecycle. The overall cost of the effort would increase under this scenario as the skills required to maintain and support the existing systems become less available and more expensive to the program.

II. A. 4. What specific qualitative benefits will be realized?
(long text - 2500 characters)
Using the performance-based model will create the incentives to meet the established legislative milestones for reporting of NAEP results as well as ensure high customer satisfaction with project deliverables. Through the alternative selected, additional benefits will be derived and are planned for the effort. These benefits include:

II. A. 5. Will the selected alternative replace a legacy system in-part or in-whole?
no

II. A. 5. a. If "yes," are the migration costs associated with the migration to the selected alternative included in this investment, the legacy investment, or in a separate migration investment?

II. A. 5. b. Table 1. If "yes," please provide the following information:

  UPI if available Date of the System Retirement
     

Section B: Risk Management (All Capital Assets)

You should have performed a risk assessment during the early planning and initial concept phase of this investment's life-cycle, developed a risk-adjusted life-cycle cost estimate and a plan to eliminate, mitigate or manage risk, and be actively managing risk throughout the investment's life-cycle.

II. B. 1. Does the investment have a Risk Management Plan?
yes

II. B. 1. a. If "yes," what is the date of the plan?
2007-07-30

II. B. 1. b. Has the Risk Management Plan been significantly changed since last year's submission to OMB?
yes

II. B. 1. c. If "yes," describe any significant changes:
(long text - 2500 characters)
An updated risk management plan was provided for internal review and will be revised for the new procurement. The updated plan reflects only those changes relating to the new procurement and does not introduce new risks to the project.

II. B. 2. If there currently is no plan, will a plan be developed?

II. B. 2. a. If "yes," what is the planned completion date?

II. B. 2. b. If "no," what is the strategy for managing the risks?
(long text - 2500 characters)

II. B. 3. Briefly describe how investment risks are reflected in the life cycle cost estimate and investment schedule:
(long text - 2500 characters)
Investment risks are reflected in the life cycle cost estimate and investment schedule through phased deployments and specific targets for deliverables, which minimize the risks of cost overruns and delivery issues.

Section C: Cost and Schedule Performance (All Capital Assets)

EVM is required only on DME portions of investments. For mixed lifecycle investments, O&M milestones should still be included in the table (Comparison of Initial Baseline and Current Approved Baseline). This table should accurately reflect the milestones in the initial baseline, as well as milestones in the current baseline.

II. C. 1. Does the earned value management system meet the criteria in ANSI/EIA Standard - 748?
yes

II. C. 2. Is the CV or SV greater than 10%?
no

II. C. 2. a. If "yes," was it the CV or SV or both ?

II. C. 2. b. If "yes," explain the causes of the variance:
(long text - 2500 characters)

II. C. 2. c. If "yes," describe the corrective actions:
(long text - 2500 characters)

II. C. 3. Has the investment re-baselined during the past fiscal year?
no

II. C. 3. a. If "yes," when was it approved by the agency head?

II. C. 4. Comparison of Initial Baseline and Current Approved Baseline
Complete the following table to compare actual performance against the current performance baseline and to the initial performance baseline. In the Current Baseline section, for all milestones listed, you should provide both the baseline and actual completion dates (e.g., "03/23/2003"/ "04/28/2004") and the baseline and actual total costs (in $ Millions). In the event that a milestone is not found in both the initial and current baseline, leave the associated cells blank. Note that the 'Description of Milestone' and 'Percent Complete' fields are required. Indicate '0' for any milestone no longer active. (Character Limitations: Description of Milestone - 500 characters)

  Initial Baseline - Planned Completion Date Initial Baseline - Total Cost Current Baseline - Planned Completion Date Current Baseline - Actual Completion Date Current Baseline - Planned Total Cost Current Baseline - Actual Total Cost Current Baseline Variance - Schedule Current Baseline Variance - Cost Percent Complete
36 FY2006 Project Coordination Activities                  
37 FY2006 NAEP Network Redesign                  
38 FY2006 NAEP Public Website Activities                  
39 FY2006 Quality Control/Quality Assurance Activities                  
40 FY2006 Web Development Activities                  
41 FY2006 Documentation and Training Activities                  
42 FY2006 Additional Maintenance                  
43 FY2006 Training Support for NAEP                  
44 FY07 NAEP Maintenance                  
45 FY07 Project Coordination Activities                  
47 FY07 Quality Control/Quality Assurance Activities                  
50 FY07 Application Maintenance                  
51 FY07 Training Support for NAEP                  
52 FY07 NIES Maintenance                  
46 FY07 NAEP Security                  
48 FY07 NAEP DME Activities                  
49 FY07 NAEP Branding/Design                  
53 FY07 NAEP CRM Development                  
54 FY07 NAEP Project Tracking                  
55 FY07 NAEP Public Site Redesign                  
56 FY07 NAEP Knowledge Query System                  
57 FY07 NAEP Program/Budget Tracking                  
58 FY07 NAEP Item Bank                  
59 FY08 NAEP Maintenance                  
60 FY08 Project Coordination Activities                  
61 FY08 NAEP Quality Control/Quality Assurance Activities                  
62 FY08 NAEP Application Maintenance                  
63 FY08 NAEP Training Support                  
64 FY08 NAEP Security                  
65 FY08 NAEP DME Activities                  
66 FY08 NAEP Public Site Redesign                  
67 FY08 NAEP CRM Knowledge                  
68 FY08 NAEP Program/Budget Tracking                  
69 FY08 NAEP Item Bank                  
70 FY08 NAEP NIES Release Site                  
71 FY09 NAEP Maintenance                  
72 FY09 Project Coordination Activities                  
73 FY09 NAEP Quality Control/Quality Assurance Activities                  
74 FY09 NAEP Application Maintenance                  
75 FY09 NAEP Training Support                  
76 FY09 NAEP Security                  
77 FY09 NAEP DME Activities                  
78 FY09 NAEP Public Site Redesign                  
79 FY09 NAEP Science ICT                  
80 FY09 NAEP IMS Upgrades                  
81 FY09 NAEP Online Questionnaires                  
FY10 NAEP Maintenance                  
FY10 NAEP Security                  
FY10 NAEP DME                  
FY11 NAEP Maintenance                  
FY11 NAEP Security                  
FY11 NAEP DME                  
FY12 NAEP Maintenance                  
FY12 NAEP Security                  
FY12 NAEP DME                  

PART III: FOR "OPERATION AND MAINTENANCE" INVESTMENTS ONLY (STEADY-STATE)

Part III should be completed only for investments identified as "Operation and Maintenance" (Steady State) in response to Question 6 in Part I, Section A above.

Section A: Risk Management (All Capital Assets)

You should have performed a risk assessment during the early planning and initial concept phase of this investment's life-cycle, developed a risk-adjusted life-cycle cost estimate and a plan to eliminate, mitigate or manage risk, and be actively managing risk throughout the investment's life-cycle.

III. A. 1. Does the investment have a Risk Management Plan?

III. A. 1. a. If "yes," what is the date of the plan?

III. A. 1. b. Has the Risk Management Plan been significantly changed since last year's submission to OMB?

III. A. 1. c. If "yes," describe any significant changes:
(long text - 2500 characters)

III. A. 2. If there currently is no plan, will a plan be developed?

III. A. 2. a. If "yes," what is the planned completion date?

III. A. 2. b. If "no," what is the strategy for managing the risks?
(long text - 2500 characters)

Section B: Cost and Schedule Performance (All Capital Assets)

III. B. 1. Was operational analysis conducted?

III. B. 1. a. If "yes," provide the date the analysis was completed.

III. B. 1. b. If "yes," what were the results?
(long text - 2500 characters)

III. B. 1. c. If "no," please explain why it was not conducted and if there are any plans to conduct operational analysis in the future:
(long text - 2500 characters)

III. B. 2. Complete the following table to compare actual cost performance against the planned cost performance baseline. Milestones reported may include specific individual scheduled preventative and predictable corrective maintenance activities, or may be the total of planned annual operation and maintenance efforts.

(Character Limitations: Description of Milestone - 250 Characters)

III. B. 2. a. What costs are included in the reported Cost/Schedule Performance information (Government Only/Contractor Only/Both)?

III. B. 2. b. Comparison of Planned and Actual Cost

  Planned Completion Date Planned Total Cost Actual Completion Date Actual Total Cost Variance - Schedule Variance - Cost
             

PART IV: Planning For "Multi-Agency Collaboration" ONLY

Part IV should be completed only for investments identified as an E-Gov initiative, a Line of Business (LOB) initiative, or a Multi-Agency Collaboration effort, i.e., those that selected the "Multi-Agency Collaboration" choice in response to Question 6 in Part I, Section A above. Investments identified as "Multi-Agency Collaboration" will complete only Parts I and IV of the exhibit 300.

Section A: Multi-Agency Collaboration Oversight (All Capital Assets)

Multi-agency Collaborations, such as E-Gov and LOB initiatives, should develop a joint exhibit 300.

IV. A. 1. Stakeholder Table
As a joint exhibit 300, please identify the agency stakeholders. Provide the partner agency and partner agency approval date for this joint exhibit 300.

  Joint exhibit approval date
   

IV. A. 2. Partner Capital Assets within this Investment
Provide the partnering strategies you are implementing with the participating agencies and organizations. Identify all partner agency capital assets supporting the common solution (section 300.7); Managing Partner capital assets should also be included in this joint exhibit 300. These capital assets should be included in the Summary of Spending table of Part I, Section B. All partner agency migration investments (section 53.4) should also be included in this table. Funding contributions/fee-for-service transfers should not be included in this table. (Partner Agency Asset UPIs should also appear on the Partner Agency's exhibit 53)

  Partner Agency Asset Title Partner Agency Exhibit 53 UPI
     

IV. A. 3. Partner Funding Strategies ($millions)
For jointly funded initiative activities, provide in the "Partner Funding Strategies Table": the name(s) of partner agencies; the UPI of the partner agency investments; and the partner agency contributions for CY and BY. Please indicate partner contribution amounts (in-kind contributions should also be included in this amount) and fee-for-service amounts. (Partner Agency Asset UPIs should also appear on the Partner Agency's exhibit 53. For non-IT fee-for-service amounts the Partner exhibit 53 UPI can be left blank) (IT migration investments should not be included in this table)

  Partner Exhibit 53 UPI CY Contribution CY Fee-for-Service BY Contribution BY Fee-for-Service
             

Return to OMB Exhibit 300 page