Exhibit 300 FY2009

 

 

 

 

 


 

 

PART I: SUMMARY INFORMATION AND JUSTIFICATION  

In Part I, complete Sections A, B, C, and D for all capital assets (IT and non-IT). Complete Sections E and F for IT capital assets.

 

 

 

Section A: Overview (All Capital Assets)  

The following series of questions is to be completed for all investments.

 

 

 

I. A. 1. Date of Submission:   

 

 

 

2007-09-10

 

 

I. A. 2. Agency:   

 

 

 

005

 

 

I. A. 3. Bureau:   

 

 

 

96

 

 

I. A. 4. Name of this Capital Asset:   

 

(short text - 250 characters)

 

 

Resource Ordering and Status System

 

 

I. A. 5. Unique Project (Investment) Identifier:   

 

For IT investments only, see section 53. For all others, use the agency ID system.

 

 

005-96-01-11-01-0040-00

 

 

I. A. 6. What kind of investment will this be in FY2009?   

 

Please NOTE: Investments moving to O&M in FY2009, with Planning/Acquisition activities prior to FY2009, should not select O&M. These investments should indicate their current status.

 

 

Operations and Maintenance

 

 

I. A. 7. What was the first budget year this investment was submitted to OMB?   

 

 

 

FY2001 or earlier

 

 

I. A. 8. Provide a brief summary and justification for this investment, including a brief description of how this closes, in part or in whole, an identified agency performance gap:

 

(long text - 2500 characters)

 

 

The Resource Ordering and Status System (ROSS) investment was initiated in response to serious disasters in 1994 that involved loss of life and property. These disasters precipitated a series of investigations by the interagency community. Investigative bodies included the Occupational Safety and Health Administration (OSHA), the Bureau of Land Management (BLM), and the Forest Service (FS); their work resulted in interagency management reviews and prescribed actions. In part, the findings cited shortcomings of fire and other incident dispatch systems, insufficient resource status documentation, and the inability to mobilize appropriate resources in a timely manner. Reviews conducted in the mid to late 1990s pointed out weaknesses in the dispatch system due to the lack of resource status and availability information. In the December 1995 Federal Wildland Fire Policy Memorandum, signed by Secretary of Agriculture Dan Glickman and Secretary of the Interior Bruce Babbitt, the Federal wildland fire management agencies were directed, as a matter of high priority, to “implement the principles, policies and recommendations of the Federal Wildland Fire Management Policy and Program Report.” The memorandum directed the agencies to correct the deficiencies in the dispatch process, and ROSS is a result of that direction. ROSS addresses the issues documented in the reports by implementing an interagency resource status and ordering system throughout the nation. Today, more than 400 dispatch offices use ROSS nationwide, and the system is used by agencies within the National Wildfire Coordinating Group (NWCG). The ROSS project was reviewed and approved by USDA’s e-Board on August 29, 2007, continuing the commitment from prior years. The project is considered to be in the Evaluate Phase of CPIC; a Post Implementation Review (PIR) and Operational Analysis are being completed for ROSS at this time. ROSS Operations and Maintenance (O&M) costs for FY 2006 through FY 2012 are included in the Summary of Spending table below.

 

 

I. A. 9. Did the Agency's Executive/Investment Committee approve this request?   

 

 

 

yes

 

 

I. A. 9. a. If "yes," what was the date of this approval?   

 

 

 

2007-08-29

 

 

I. A. 10. Did the Project Manager review this Exhibit?   

 

 

 

yes

 

 

I. A. 11. Contact information of Project Manager  

 

 

Name   

 

(short text - 250 characters)

 

 

Jon C. Skeels

 

 

Phone Number   

 

(short text - 250 characters)

 

 

303-236-0630

 

 

E-mail   

 

(short text - 250 characters)

 

 

jskeels@fs.fed.us

 

 

I. A. 11. a. What is the current FAC-P/PM certification level of the project/program manager?   

 

 

 

TBD

 

 

I. A. 12. Has the agency developed and/or promoted cost effective, energy-efficient and environmentally sustainable techniques or practices for this project?   

 

 

 

no

 

 

I. A. 12. a. Will this investment include electronic assets (including computers)?   

 

 

 

yes

 

 

I. A. 12. b. Is this investment for new construction or major retrofit of a Federal building or facility? (answer applicable to non-IT assets only)   

 

 

 

no

 

 

I. A. 12. b. 1. If "yes," is an ESPC or UESC being used to help fund this investment?   

 

 

 

 

 

 

I. A. 12. b. 2. If "yes," will this investment meet sustainable design principles?   

 

 

 

 

 

 

I. A. 12. b. 3. If "yes," is it designed to be 30% more energy efficient than relevant code?   

 

 

 

 

 

 

I. A. 13. Does this investment directly support one of the PMA initiatives?   

 

 

 

yes

 

 

I. A. 13. a. If "yes," check all that apply:   

 

 

 

Human Capital

Expanded E-Government

 

 

I. A. 13. b. Briefly and specifically describe for each selected how this asset directly supports the identified initiative(s)? (e.g. If E-Gov is selected, is it an approved shared service provider or the managing partner?)   

 

(medium text - 500 characters)

 

 

ROSS efficiently and easily generates needs, availability, and location information for incident resources. It automates a manual process by providing a common user interface to a single centralized database, thereby reducing dependence on multiple information sources, saving money, and providing reports and intelligence information. ROSS is an interagency system used by federal, state, and inter-tribal agencies.

 

 

I. A. 14. Does this investment support a program assessed using the Program Assessment Rating Tool (PART)? (For more information about the PART, visit www.whitehouse.gov/omb/part.)   

 

 

 

no

 

 

I. A. 14. a. If "yes," does this investment address a weakness found during the PART review?   

 

 

 

 

 

 

I. A. 14. b. If "yes," what is the name of the PARTed Program?   

 

(short text - 250 characters)

 

 

 

 

 

I. A. 14. c. If "yes," what PART rating did it receive?   

 

 

 

 

 

 

I. A. 15. Is this investment for information technology?   

 

 

 

yes

 

 

I. A. 16. What is the level of the IT Project? (per CIO Council PM Guidance)   

 

Level 1 - Projects with low-to-moderate complexity and risk. Example: Bureau-level project such as a stand-alone information system that has low-to-moderate complexity and risk.
Level 2 - Projects with high complexity and/or risk which are critical to the mission of the organization. Examples: Projects that are part of a portfolio of projects/systems that impact each other and/or impact mission activities. Department-wide projects that impact cross-organizational missions, such as an agency-wide system integration that includes large scale Enterprise Resource Planning (e.g., the DoD Business Mgmt Modernization Program).
Level 3 - Projects that have high complexity, and/or risk, and have government-wide impact. Examples: Government-wide initiative (E-GOV, President's Management Agenda). High interest projects with Congress, GAO, OMB, or the general public. Cross-cutting initiative (Homeland Security).

 

 

Level 3

 

 

I. A. 17. What project management qualifications does the Project Manager have? (per CIO Council’s PM Guidance):   

 

(1) Project manager has been validated as qualified for this investment; (2) Project manager qualification is under review for this investment; (3) Project manager assigned to investment, but does not meet requirements; (4) Project manager assigned but qualification status review has not yet started; (5) No Project manager has yet been assigned to this investment

 

 

(1) Project manager has been validated as qualified for this investment

 

 

I. A. 18. Is this investment identified as "high risk" on the Q4-FY 2007 agency high risk report (per OMB Memorandum M-05-23)?   

 

 

 

no

 

 

I. A. 19. Is this a financial management system?   

 

 

 

no

 

 

I. A. 19. a. If "yes," does this investment address a FFMIA compliance area?   

 

 

 

 

 

 

I. A. 19. a. 1. If "yes," which compliance area   

 

(short text - 250 characters)

 

 

 

 

 

I. A. 19. a. 2. If "no," what does it address?   

 

(medium text - 500 characters)

 

 

 

 

 

I. A. 19. b. If "yes," please identify the system name(s) and system acronym(s) as reported in the most recent financial systems inventory update required by Circular A-11 section 52   

 

(long text - 2500 characters)

 

 

 

 

 

I. A. 20. What is the percentage breakout for the total FY2009 funding request for the following? (This should total 100%)  

 

 

I. A. 20. a. Hardware   

 

 

 

6

 

 

I. A. 20. b. Software   

 

 

 

1

 

 

I. A. 20. c. Services   

 

 

 

93

 

 

I. A. 20. d. Other   

 

 

 

0
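Per the question above, these four percentages must total 100%, and the reported values do (6 + 1 + 93 + 0 = 100). A minimal Python sketch of that consistency check, using the figures reported above:

# Reported FY2009 funding breakout (I.A.20.a-d), in percent.
breakout = {"Hardware": 6, "Software": 1, "Services": 93, "Other": 0}

# The exhibit requires the breakout to total 100%.
assert sum(breakout.values()) == 100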

 

 

I. A. 21. If this project produces information dissemination products for the public, are these products published to the Internet in conformance with OMB Memorandum 05-04 and included in your agency inventory, schedules and priorities?   

 

 

 

n/a

 

 

I. A. 22. Contact information of individual responsible for privacy related questions:  

 

 

I. A. 22. a. Name   

 

(short text - 250 characters)

 

 

Nancy DeLong

 

 

I. A. 22. b. Phone Number   

 

(short text - 250 characters)

 

 

208-947-3710

 

 

I. A. 22. c. Title   

 

(short text - 250 characters)

 

 

Deputy Project Manager

 

 

I. A. 22. d. E-mail   

 

(short text - 250 characters)

 

 

ndelong@fs.fed.us

 

 

I. A. 23. Are the records produced by this investment appropriately scheduled with the National Archives and Records Administration's approval?   

 

 

 

yes

 

 

I. A. 24. Does this investment directly support one of the GAO High Risk Areas?   

 

Question 24 must be answered by all Investments:

 

 

no

 

 

Section B: Summary of Spending (All Capital Assets)  

 

 

I. B. 1. Provide the total estimated life-cycle cost for this investment by completing the following table. All amounts represent budget authority in millions, and are rounded to three decimal places. Federal personnel costs should be included only in the row designated "Government FTE Cost," and should be excluded from the amounts shown for "Planning," "Full Acquisition," and "Operation/Maintenance." The "TOTAL" estimated annual cost of the investment is the sum of costs for "Planning," "Full Acquisition," and "Operation/Maintenance." For Federal buildings and facilities, life-cycle costs should include long term energy, environmental, decommissioning, and/or restoration costs. The costs associated with the entire life-cycle of the investment should be included in this report.   

 

Note: For the cross-agency investments, this table should include all funding (both managing and partner agencies).
Government FTE Costs should not be included as part of the TOTAL represented.

 

 

 

(Amounts are budget authority in $ millions. The BY+1 2010, BY+2 2011, BY+3 2012, BY+4 2013 and Beyond, and Total columns are blank in this submission.)

                                     PY-1 (Prior    PY 2007    CY 2008    BY 2009
                                     to 2007)
Planning                                       0          0          0          0
Acquisition                                6.750      3.850      4.400      2.956
Subtotal Planning & Acquisition            6.750      3.850      4.400      2.956
Operations & Maintenance                   5.837      4.642      6.097      8.170
TOTAL                                     12.587      8.492     10.497     11.126
Government FTE Costs                           0          0          0          0
Number of FTE represented by cost              0          0          0          0
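As stated in the instructions above, the TOTAL row is the sum of Planning, Full Acquisition, and Operation/Maintenance, with Government FTE costs excluded. A minimal Python sketch that checks the reported figures against that rule, using the values from the table above (in $ millions):

# Reported Summary of Spending figures, in $ millions of budget authority.
planning    = {"PY-1": 0.000,  "PY 2007": 0.000, "CY 2008": 0.000,  "BY 2009": 0.000}
acquisition = {"PY-1": 6.750,  "PY 2007": 3.850, "CY 2008": 4.400,  "BY 2009": 2.956}
o_and_m     = {"PY-1": 5.837,  "PY 2007": 4.642, "CY 2008": 6.097,  "BY 2009": 8.170}
total       = {"PY-1": 12.587, "PY 2007": 8.492, "CY 2008": 10.497, "BY 2009": 11.126}

for year in total:
    # TOTAL excludes Government FTE costs per the table instructions.
    computed = planning[year] + acquisition[year] + o_and_m[year]
    assert abs(computed - total[year]) < 0.0005, (year, computed)

Each reported TOTAL matches the computed sum (for example, BY 2009: 0 + 2.956 + 8.170 = 11.126).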

 

 

 

 

 

 

 

I. B. 2. Will this project require the agency to hire additional FTEs?

 

 

 

no

 

 

I. B. 2. a. If "yes," how many and in what year?

 

(medium text - 500 characters)

 

 

 

 

 

I. B. 3. If the summary of spending has changed from the FY2008 President's budget request, briefly explain those changes.   

 

(long text - 2500 characters)

 

 

The summary of spending has changed from the FY08 budget request. An increase in Operations and Maintenance (O&M) costs is now anticipated for 2008-2012. The new O&M work includes the periodic upgrading and replacement of system hardware and software. A major component of ROSS is the underlying application code, which was built and generated using the Versata Business Rules Management System, supporting middleware, and an Oracle database. The project charter includes a requirement for the use of COTS software, and Versata is a key COTS product. The application code was initially developed from 2000 to 2002 and requires a significant upgrade to accommodate a technology refresh of the underlying Versata software, the base-level COTS product from which ROSS was developed. These changes will also affect the Oracle database, which will require refactoring, table optimization, tuning, and load testing, as well as system reports and supporting middleware.

 

 

Section C: Acquisition/Contract Strategy (All Capital Assets)  

 

 

I. C. 1. Complete the table for all (including all non-Federal) contracts and/or task orders currently in place or planned for this investment. Total Value should include all option years for each contract. Contracts and/or task orders completed do not need to be included.   

 

SIS - Share in Services contract; ESPC - Energy savings performance contract ; UESC - Utility energy efficiency service contract; EUL - Enhanced use lease contract; N/A - no alternative financing used.
(Character Limitations: Contract or Task Order Number - 250 Characters; Type of Contract/Task Order - 250 Characters; Name of CO - 250 Characters; CO Contact Information - 250 Characters)

 

 

 

 

 

I. C. 2. If earned value is not required or will not be a contract requirement for any of the contracts or task orders above, explain why:   

 

(long text - 2500 characters)

 

 

 

 

 

I. C. 3. Do the contracts ensure Section 508 compliance?   

 

 

 

 

 

 

I. C. 3. a. Explain Why:   

 

(medium text - 500 characters)

 

 

 

 

 

I. C. 4. Is there an acquisition plan which has been approved in accordance with agency requirements?   

 

 

 

 

 

 

I. C. 4. a. If "yes," what is the date?   

 

 

 

 

 

 

I. C. 4. b. If "no," will an acquisition plan be developed?   

 

 

 

 

 

 

I. C. 4. b. 1. If "no," briefly explain why:   

 

(medium text - 500 characters)

 

 

 

 

 

Section D: Performance Information (All Capital Assets)  

In order to successfully address this area of the exhibit 300, performance goals must be provided for the agency and be linked to the annual performance plan. The investment must discuss the agency’s mission and strategic goals, and performance measures (indicators) must be provided. These goals need to map to the gap in the agency's strategic goals and objectives this investment is designed to fill. They are the internal and external performance benefits this investment is expected to deliver to the agency (e.g., improve efficiency by 60 percent, increase citizen participation by 300 percent a year to achieve an overall citizen participation rate of 75 percent by FY 2xxx, etc.). The goals must be clearly measurable investment outcomes and, if applicable, investment outputs. They do not include the completion date of the module, milestones, or investment, or general goals, such as "significant," "better," or "improved," that do not have a quantitative measure.

 

 

 

I. D. 1. Table 1. Performance Information Table   

 


Agencies must use the following table to report performance goals and measures for the major investment and use the Federal Enterprise Architecture (FEA) Performance Reference Model (PRM). Map all Measurement Indicators to the corresponding "Measurement Area" and "Measurement Grouping" identified in the PRM. There should be at least one Measurement Indicator for each of the four different Measurement Areas (for each fiscal year). The PRM is available at www.egov.gov. The table can be extended to include performance measures for years beyond FY 2009.

 

 

 

Fiscal Year

Strategic Goal(s) Supported

Measurement Area

Measurement Grouping

Measurement Indicator

Baseline

Target

Actual Results

2005

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Mission and Business Results

Disaster Preparedness and Planning

Extent to which outcomes related to Disaster Management are achieved

The ROSS dispatch component has been completed in all Geographic Areas except AK and CA. Use of the system for documentation, collection, consolidation, dissemination, and sharing of Resource Mobilization information is evident.

During 2005, the planned improvement is to complete implementation of the ROSS Dispatch module in Alaska.

ROSS has been successfully implemented in Alaska, both at the Federal level (the Geographic Area Coordination Center or GACC) and at the Local Dispatch office level. Implementation of ROSS in California at both the Federal level (GACC) and Local D

2005

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Customer Satisfaction

% of ROSS users satisfied with system support

Extent to which ROSS users are satisfied with system support

Customers participating in ROSS Help Desk surveys answer "agree," "strongly agree," "neutral," or "not applicable" for more than 85% of the questions.

Through the fourth quarter of 2005, all of the questions scored between 89% and 100%, indicating that survey respondents answered "agree," "strongly agree," "neutral," or "not applicable" to the survey questions.

2005

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Response Time

Extent to which ROSS implementation improves dispatch, status documentation, and resource mobilization

Availability of dispatch, status, and resource mobilization information

Information available on a real-time basis

100% of the information is available on a real-time basis.

2005

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Processes and Activities

Savings and Cost Avoidance

Extent to which ROSS implementation eliminates the need for the National Interagency Coordination Center (NICC) to keep a separate database of orders for year end reporting.

Separate database required to support year end reporting; the data accuracy is very limited because the data are based on manual processes

No separate database is required and improved accuracy in year-end reporting.

As of September of 2005, a separate database is no longer required and the accuracy of year end reporting has increased.

2005

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Technology

Reliability

% of unscheduled/ unplanned system down time of ROSS infrastructure

Number of hours of unscheduled/ unplanned downtime during fire season

Production infrastructure available 95% of the time or more

As of August 25, 2005, for the 12-month period prior, ROSS Database Servers, Application Servers, Report & GIS Servers, and Edge Servers (used for routers for the DMS and Web services) were all available between 96.79% and 100% of the time.

2006

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Mission and Business Results

Disaster Preparedness and Planning

Extent to which outcomes related to Disaster Management are achieved

Implementation of the ROSS Dispatch component has been completed in all Geographic Areas except California.

Complete implementation of the ROSS Dispatch module in all Geographic Areas.

California implementation is 100% complete. Both of the two interagency GACCs and approximately 118 Local Dispatch offices have implemented ROSS.

2006

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Customer Satisfaction

% of ROSS users satisfied with system support

More than 85% of ROSS users are satisfied with system support

Customers participating in ROSS Help Desk surveys answer "agree," "strongly agree," "neutral," or "not applicable" for more than 87% of the questions.

Through the third quarter of 2006, all of the questions scored between 93% and 100%, indicating that survey respondents answered "agree," "strongly agree," "neutral," or "not applicable" to the survey questions.

2006

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Response Time

Extent to which ROSS implementation improves dispatch, status documentation, and resource mobilization

100% of information is available on a real-time basis.

Information available on a real-time basis

100% of the information is available on a real-time basis.

2006

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Processes and Activities

Savings and Cost Avoidance

Extent to which ROSS implementation eliminates the need for the National Interagency Coordination Center (NICC) to keep a separate database of orders for year end reporting.

Separate database required to support year end reporting; the data accuracy is very limited because the data are based on manual processes

No separate database is required and improved accuracy in year-end reporting.

Separate database no longer required; accuracy of year end reporting has increased.

2006

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Technology

Reliability

% of unscheduled/ unplanned system down time of ROSS infrastructure

Production infrastructure available 95% of the time or more

Production infrastructure available more than 95% of the time

As of August 14, 2006, for the 12-month period prior, ROSS production Database Servers, Application Servers, and Report & GIS Servers were all available more than 96% of the time.

2007

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Mission and Business Results

Disaster Preparedness and Planning

Extent to which outcomes related to Disaster Management are achieved

Implementation of the ROSS Dispatch component has been completed in all Geographic Areas including California. Use of the system is evident.

Full implementation of ROSS throughout the dispatch community.

California and Alaska implementations are complete. Full state use of ROSS is complete.

2007

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Customer Satisfaction

% of ROSS users satisfied with system support

More than 87% of ROSS users are satisfied with system support

Customers participating in ROSS Help Desk surveys answer "agree," "strongly agree," or "neutral" for more than 89% of the questions.

Through June of 2007, all of the questions scored between 78% and 100%, indicating that survey respondents answered "agree," "strongly agree," "neutral," or "not applicable" to the survey questions. Two of the questions scored below 89%. These

2007

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Response Time

Extent to which ROSS implementation improves dispatch, status documentation, and resource mobilization

Availability of dispatch, status, and resource mobilization information

Information available on a real-time basis

100% of the information is available on a real-time basis

2007

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Processes and Activities

Savings and Cost Avoidance

Extent to which ROSS implementation eliminates the need for the National Interagency Coordination Center (NICC) to keep a separate database of orders for year end reporting.

Separate database required to support year end reporting; the data accuracy is very limited because the data are based on manual processes

No separate database is required and improved accuracy in year-end reporting.

Separate database no longer required; accuracy of year end reporting has increased.

2007

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Technology

Reliability

% of unscheduled/ unplanned system down time of ROSS infrastructure

Production infrastructure available 95% of the time or more

Production infrastructure available more than 95% of the time.

As of July 23, 2007, for the 12-month period prior, ROSS production Database Servers, Application Servers, and Report & GIS Servers were all available more than 98% of the time.

2008

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Mission and Business Results

Disaster Preparedness and Planning

Extent to which outcomes related to Disaster Management are achieved

Implementation of the ROSS Dispatch component has been completed in all Geographic Areas including California. Use of the system is evident.

Full implementation of ROSS throughout the dispatch community.

TBD

2008

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Customer Satisfaction

% of ROSS users satisfied with system support

More than 89% of ROSS users are satisfied with system support

Customers participating in ROSS Help Desk surveys answer "agree," "strongly agree," or "neutral" for more than 90% of the questions.

TBD

2008

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Response Time

Extent to which ROSS implementation improves dispatch, status documentation, and resource mobilization

Availability of dispatch, status, and resource mobilization information

Information available on a real-time basis

TBD

2008

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Processes and Activities

Savings and Cost Avoidance

Extent to which ROSS implementation eliminates the need for the National Interagency Coordination Center (NICC) to keep a separate database of orders for year end reporting.

Separate database required to support year end reporting; the data accuracy is very limited because the data are based on manual processes

No separate database is required and improved accuracy in year-end reporting.

TBD

2008

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Technology

Reliability

% of unscheduled/ unplanned system down time of ROSS infrastructure

Production infrastructure available 95% of the time or more

Production infrastructure available 95% of the time or more

TBD

2009

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Mission and Business Results

Disaster Preparedness and Planning

Extent to which outcomes related to Disaster Management are achieved

Implementation of the ROSS Dispatch component has been completed in all Geographic Areas including California. Use of the system is evident.

Full implementation of ROSS throughout the dispatch community.

TBD

2009

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Customer Satisfaction

% of ROSS users satisfied with system support

More than 90% of ROSS users are satisfied with system support

Customers participating in ROSS Help Desk surveys answer "agree," "strongly agree," or "neutral" for more than 90% of the questions.

TBD

2009

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Response Time

Extent to which ROSS implementation improves dispatch, status documentation, and resource mobilization

Availability of dispatch, status, and resource mobilization information

Information available on a real-time basis

TBD

2009

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Processes and Activities

Savings and Cost Avoidance

Extent to which ROSS implementation eliminates the need for the National Interagency Coordination Center (NICC) to keep a separate database of orders for year end reporting.

Separate database required to support year end reporting; the data accuracy is very limited because the data are based on manual processes

No separate database is required and improved accuracy in year-end reporting.

TBD

2009

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Technology

Reliability

% of unscheduled/ unplanned system down time of ROSS infrastructure

Production infrastructure available 95% of the time or more

Production infrastructure available 95% of the time or more

TBD

2010

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Mission and Business Results

Disaster Preparedness and Planning

Extent to which outcomes related to Disaster Management are achieved

Implementation of the ROSS Dispatch component has been completed in all Geographic Areas including California. Use of the system is evident.

Continued implementation of ROSS throughout the dispatch community.

TBD

2010

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Customer Satisfaction

% of ROSS users satisfied with system support

More than 90% of ROSS users are satisfied with system support

Customers participating in ROSS Help Desk surveys answer "agree," "strongly agree," or "neutral" for more than 90% of the questions.

TBD

2010

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Response Time

Extent to which ROSS implementation improves dispatch, status documentation, and resource mobilization

Availability of dispatch, status, and resource mobilization information

Information available on a real-time basis

TBD

2010

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Processes and Activities

Savings and Cost Avoidance

Extent to which ROSS implementation eliminates the need for the National Interagency Coordination Center (NICC) to keep a separate database of orders for year end reporting.

Separate database required to support year end reporting; the data accuracy is very limited because the data are based on manual processes

No separate database is required and improved accuracy in year-end reporting.

TBD

2010

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Technology

Reliability

% of unscheduled/ unplanned system down time of ROSS infrastructure

Production infrastructure available 95% of the time or more

Infrastructure available 95% of the time or more

TBD

2011

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Mission and Business Results

Disaster Preparedness and Planning

Extent to which outcomes related to Disaster Management are achieved

Implementation of the ROSS Dispatch component has been completed in all Geographic Areas including California. Use of the system is evident.

Continued implementation of ROSS throughout the dispatch community.

TBD

2011

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Customer Satisfaction

% of ROSS users satisfied with system support

More than 90% of ROSS users are satisfied with system support

Customers participating in ROSS Help Desk surveys answer "agree," "strongly agree," or "neutral" for more than 90% of the questions.

TBD

2011

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Customer Results

Response Time

Extent to which ROSS implementation improves dispatch, status documentation, and resource mobilization

Availability of dispatch, status, and resource mobilization information

Information available on a real-time basis

TBD

2011

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Processes and Activities

Savings and Cost Avoidance

Extent to which ROSS implementation eliminates the need for the National Interagency Coordination Center (NICC) to keep a separate database of orders for year end reporting.

Separate database required to support year end reporting; the data accuracy is very limited because the data are based on manual processes

No separate database is required and improved accuracy in year-end reporting.

TBD

2011

USDA SP Goal 6: Protect and enhance the nation’s natural resource base and environment. FS SP Goal 1: Reduce risk from catastrophic wildland fire. Fire Plan Goal 1: Improve Fire Prevention and Suppression. DOI SP End Outcome Goal 1: Improve Protecti

Technology

Reliability

% of unscheduled/ unplanned system down time of ROSS infrastructure

Production infrastructure available 95% of the time or more

Infrastructure available 95% of the time or more

TBD

2003

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS status training to dispatchers.

Implement a system in all wildland dispatch offices that automates the manual processes associated with the collecting and sharing of resource status information.

Train more than 300 students with representatives of more than 300 dispatch offices.

Between 2001 and 2003: 30 sessions held. 400 students trained. 350 offices using the system.

2003

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS ordering training to dispatchers.

Implement a system in all wildland dispatch offices that automates the manual processes associated with the collecting and sharing of resource ordering information.

Train more than 1000 students.

Between 2001 and 2003: 94 sessions held. 1354 students trained. 350 offices using the system.

2004

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS status training to dispatchers.

Between 2001 and 2003: 30 sessions held. 400 students trained. 350 offices using the system.

Train more than 100 students with representatives of more than 300 dispatch offices.

2004: 4 sessions held. 90 students trained. 350 offices using the system.

2004

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS ordering training to dispatchers.

Between 2001 and 2003: 94 sessions held. 1354 students trained. 350 offices using the system.

Train more than 300 students.

2004: 27 sessions held. 380 students trained. 350 offices using the system.

2005

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS status training to dispatchers.

2004: 4 sessions held. 90 students trained. 350 offices using the system.

Train more than 100 students with representatives of more than 300 dispatch offices.

2005: 9 sessions held. 101 students trained. 400 offices using the system.

2005

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS ordering training to dispatchers.

2004: 27 sessions held. 380 students trained. 350 offices using the system.

Train more than 300 students.

2005: 27 sessions held. 519 students trained. 400 offices using the system.

2006

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS status training to dispatchers.

2005: 9 sessions held. 101 students trained. 400 offices using the system.

Train more than 50 students.

2006: 2 sessions held. 71 students trained. 400 offices using the system

2006

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS ordering training to dispatchers.

2005: 27 sessions held. 519 students trained. 400 offices using the system.

Train more than 50 students.

2006: 2 sessions held. 71 students trained. 400 offices using the system

2007

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS status training to dispatchers.

2006: 2 sessions held. 71 students trained. 400 offices using the system

Availability of interactive “On-Demand” training for all users and updated user support information. Note: Classroom training will cease at the end of 2006 due to the completion of ROSS implementation.

Training modules are available to all users through On-Demand. User Guides, Reference Materials, Tips & Tricks, and the Knowledge database at the Help Desk are available on the ROSS Web site.

2007

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS ordering training to dispatchers.

2006: 24 sessions held. 517 students trained. 400 offices using the system

Availability of interactive “On-Demand” training for all users and updated user support information.

Training modules are available to all users through On-Demand. Completion of more On-Demand modules is planned during the summer of 2007. User Guides, Reference Materials, Tips & Tricks, and the Knowledge database at the Help Desk are available on the ROSS Web site.

2008

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS status training to dispatchers.

Training modules are available to all users through On-Demand. Completion of more On-Demand modules is planned during 2008.

Availability of additional interactive “On-Demand” training for all users and updated user support information.

TBD

2008

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS ordering training to dispatchers.

Training modules are available to all users through On-Demand. Completion of more On-Demand modules is planned during 2008.

Availability of additional interactive “On-Demand” training for all users and updated user support information.

TBD

2009

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS status training to dispatchers.

Training modules are available to all users through On-Demand. Completion of more On-Demand modules is planned during 2009.

Availability of additional interactive “On-Demand” training for all users and updated user support information.

TBD

2009

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS ordering training to dispatchers.

Training modules are available to all users through On-Demand. Completion of more On-Demand modules is planned during 2009.

Availability of additional interactive “On-Demand” training for all users and updated user support information.

TBD

2010

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS status training to dispatchers.

Training modules are available to all users through On-Demand. Completion of more On-Demand modules is planned during 2010.

Availability of additional interactive “On-Demand” training for all users and updated user support information.

TBD

2010

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS ordering training to dispatchers.

Training modules are available to all users through On-Demand. Completion of more On-Demand modules is planned during 2010.

Availability of additional interactive “On-Demand” training for all users and updated user support information.

TBD

2011

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS status training to dispatchers.

Training modules are available to all users through On-Demand. Completion of more On-Demand modules is planned during 2011.

Availability of additional interactive “On-Demand” training for all users and updated user support information.

TBD

2011

USDA SP Goal 6. FS SP Goal 1, and DOI SP End Outcome Goal 1.

Customer Results

Customer Satisfaction

Customer Training – Provide ROSS ordering training to dispatchers.

Training modules are available to all users through On-Demand. Completion of more On-Demand modules is planned during 2011.

Availability of additional interactive “On-Demand” training for all users and updated user support information.

TBD

 

 

Section E: Security and Privacy (IT Capital Assets only)  

In order to successfully address this area of the business case, each question below must be answered at the system/application level, not at a program or agency level. Systems supporting this investment on the planning and operational systems security tables should match the systems on the privacy table below. Systems on the Operational Security Table must be included on your agency FISMA system inventory and should be easily referenced in the inventory (i.e., should use the same name or identifier).

For existing Mixed-Life Cycle investments where enhancement, development, and/or modernization is planned, include the investment in both the “Systems in Planning” table (Table 3) and the “Operational Systems” table (Table 4). Systems which are already operational, but have enhancement, development, and/or modernization activity, should be included in both Table 3 and Table 4. Table 3 should reflect the planned date for the system changes to be complete and operational, and the planned date for the associated C&A update. Table 4 should reflect the current status of the requirements listed. In this context, information contained within Table 3 should characterize what updates to testing and documentation will occur before implementing the enhancements; and Table 4 should characterize the current state of the materials associated with the existing system.

All systems listed in the two security tables should be identified in the privacy table. The list of systems in the “Name of System” column of the privacy table (Table 8) should match the systems listed in columns titled “Name of System” in the security tables (Tables 3 and 4). For the Privacy table, it is possible that there may not be a one-to-one ratio between the list of systems and the related privacy documents. For example, one PIA could cover multiple systems. If this is the case, a working link to the PIA may be listed in column (d) of the privacy table more than once (for each system covered by the PIA).

 

 

 

I. E. 1. Have the IT security costs for the system(s) been identified and integrated into the overall costs of the investment?   

 

 

 

 

 

 

I. E. 1. a. If "yes," provide the "Percentage IT Security" for the budget year:   

 

 

 

 

 

 

I. E. 2. Is identifying and assessing security and privacy risks a part of the overall risk management effort for each system supporting or part of this investment?   

 

 

 

 

 

 

I. E. 3. Systems in Planning and Undergoing Enhancement(s) – Security Table:   

 

The questions asking whether there is a PIA which covers the system and whether a SORN is required for the system are discrete from the narrative fields. The narrative column provides an opportunity for free text explanation why a working link is not provided. For example, a SORN may be required for the system, but the system is not yet operational. In this circumstance, answer “yes” for column (e) and in the narrative in column (f), explain that because the system is not operational the SORN is not yet required to be published.

 

 

 

 

 

I. E. 4. Operational Systems - Security:   

 

 

 

 

 

 

I. E. 5. Have any weaknesses related to any of the systems part of or supporting this investment been identified by the agency or IG?   

 

 

 

 

 

 

I. E. 5. a. If "yes," have those weaknesses been incorporated into the agency's plan of action and milestone process?   

 

 

 

 

 

 

I. E. 6. Indicate whether an increase in IT security funding is requested to remediate IT security weaknesses?   

 

 

 

 

 

 

I. E. 6. a. If "yes," specify the amount, provide a general description of the weakness, and explain how the funding request will remediate the weakness.   

 

(long text - 2500 characters)

 

 

 

 

 

I. E. 7. How are contractor security procedures monitored, verified, and validated by the agency for the contractor systems above?   

 

(long text - 2500 characters)

 

 

 

 

 

I. E. 8. Planning & Operational Systems - Privacy Table:   

 

Details for Text Options:
Column (d): If yes to (c), provide the link(s) to the publicly posted PIA(s) with which this system is associated. If no to (c), provide an explanation why the PIA has not been publicly posted or why the PIA has not been conducted.

Column (f): If yes to (e), provide the link(s) to where the current and up to date SORN(s) is published in the federal register. If no to (e), provide an explanation why the SORN has not been published or why there isn’t a current and up to date SORN.

Note: Links must be provided to specific documents not general privacy websites.

 

 

 

 

 

Section F: Enterprise Architecture (EA) (IT Capital Assets only)  

In order to successfully address this area of the business case and capital asset plan, you must ensure the investment is included in the agency's EA and Capital Planning and Investment Control (CPIC) process, and is mapped to and supports the FEA. You must also ensure the business case demonstrates the relationship between the investment and the business, performance, data, services, application, and technology layers of the agency's EA.

 

 

 

I. F. 1. Is this investment included in your agency's target enterprise architecture?   

 

 

 

yes

 

 

I. F. 1. a. If "no," please explain why?   

 

(long text - 2500 characters)

 

 

 

 

 

I. F. 2. Is this investment included in the agency's EA Transition Strategy?   

 

 

 

no

 

 

I. F. 2. a. If "yes," provide the investment name as identified in the Transition Strategy provided in the agency's most recent annual EA Assessment.   

 

(medium text - 500 characters)

 

 

 

 

 

I. F. 2. b. If "no," please explain why?   

 

(long text - 2500 characters)

 

 

There is no transition activity underway at this time.

 

 

I. F. 3. Is this investment identified in a completed (contains a target architecture) and approved segment architecture?   

 

 

 

no

 

 

I. F. 3. a. If "yes," provide the name of the segment architecture.   

 

(medium text - 500 characters)

 

 

 

 

 

I. F. 4. Service Component Reference Model (SRM) Table :   

 

Identify the service components funded by this major IT investment (e.g., knowledge management, content management, customer relationship management, etc.). Provide this information in the format of the following table. For detailed guidance regarding components, please refer to http://www.egov.gov.

a. Use existing SRM Components or identify as “NEW”. A “NEW” component is one not already identified as a service component in the FEA SRM.
b. A reused component is one being funded by another investment, but being used by this investment. Rather than answer yes or no, identify the reused service component funded by the other investment and identify the other investment using the Unique Project Identifier (UPI) code from the OMB Ex 300 or Ex 53 submission.
c. ‘Internal’ reuse is within an agency. For example, one agency within a department is reusing a service component provided by another agency within the same department. ‘External’ reuse is one agency within a department reusing a service component provided by another agency in another department. A good example of this is an E-Gov initiative service being reused by multiple organizations across the federal government.
d. Please provide the percentage of the BY requested funding amount used for each service component listed in the table. If external, provide the percentage of the BY requested funding amount transferred to another agency to pay for the service. The percentages in this column can, but are not required to, add up to 100%.

 

 

 

Agency Component Name | Agency Component Description | FEA SRM Service Type | FEA SRM Component (a) | Service Component Reused - Component Name (b) | Service Component Reused - UPI (b) | Internal or External Reuse? (c) | BY Funding Percentage (d)
ROSS Ad Hoc Service | Ad Hoc reports service | Reporting | Ad Hoc | | | No Reuse | 7
ROSS Notification Service | Service to provide alerts | Customer Preferences | Alerts and Notifications | | | No Reuse | 1
ROSS Business Rule Service | Service to manage business rules | Management of Processes | Business Rule Management | | | No Reuse | 13
ROSS Catalog Service | Service to manage catalog data | Supply Chain Management | Catalog Management | | | No Reuse | 10
ROSS Account Service | Service to manage customer accounts | Customer Relationship Management | Customer / Account Management | | | No Reuse | 3
ROSS Exchange Service | Data exchange service | Data Management | Data Exchange | | | No Reuse | 6
ROSS Data Service | Service to manage data marts | Data Management | Data Mart | | | No Reuse | 3
ROSS Authentication Service | Service to manage user authentication | Security Management | Identification and Authentication | | | No Reuse | 2
ROSS Retrieval Service | Service to retrieve data | Knowledge Management | Information Retrieval | | | No Reuse | 5
ROSS Sharing Service | Service to retrieve data | Knowledge Management | Information Sharing | | | No Reuse | 5
ROSS Meta Data Service | Service to manage meta data | Data Management | Meta Data Management | | | No Reuse | 2
ROSS Help Service | Online help service | Customer Initiated Assistance | Online Help | | | No Reuse | 2
ROSS Tutorial Service | Online tutorial service | Customer Initiated Assistance | Online Tutorials | | | No Reuse | 2
ROSS Ordering Service | Service to manage ordering data | Supply Chain Management | Ordering / Purchasing | | | No Reuse | 8
ROSS Query Service | Query service | Search | Query | | | No Reuse | 4
ROSS Distribution Service | Service to manage software distribution | Systems Management | Software Distribution | | | No Reuse | 2
ROSS Reports Service | Service to manage standard reports | Reporting | Standardized / Canned | | | No Reuse | 8
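
Note: the following is an illustrative sketch only (not part of the exhibit data) showing how the SRM rows above could be represented programmatically and how the BY Funding Percentage column can be checked against note (d); the Python field names and the abbreviated row list are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class SrmRow:
    component_name: str        # e.g., "ROSS Ad Hoc Service"
    fea_srm_component: str     # column (a), e.g., "Ad Hoc"
    reuse: str                 # column (c); all ROSS rows report "No Reuse"
    by_funding_pct: float      # column (d)

# Abbreviated sample of the rows above (not the full table).
rows = [
    SrmRow("ROSS Ad Hoc Service", "Ad Hoc", "No Reuse", 7),
    SrmRow("ROSS Business Rule Service", "Business Rule Management", "No Reuse", 13),
    SrmRow("ROSS Catalog Service", "Catalog Management", "No Reuse", 10),
]

# Note (d): each value must be a valid percentage of the BY request,
# but the column is not required to total 100%.
assert all(0 <= r.by_funding_pct <= 100 for r in rows)
print("BY funding mapped to listed components:", sum(r.by_funding_pct for r in rows), "%")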

 

 

I. F. 5. Table 1. Technical Reference Model (TRM) Table:   

 

To demonstrate how this major IT investment aligns with the FEA Technical Reference Model (TRM), please list the Service Areas, Categories, Standards, and Service Specifications supporting this IT investment.

a. Service Components identified in the previous question should be entered in this column. Please enter multiple rows for FEA SRM Components supported by multiple TRM Service Specifications.
b. In the Service Specification field, agencies should provide information on the specified technical standard or vendor product mapped to the FEA TRM Service Standard, including model or version numbers, as appropriate.

 

 

 

FEA SRM Component (a) | FEA TRM Service Area | FEA TRM Service Category | FEA TRM Service Standard | Service Specification (b) (i.e., vendor and product name)
Ad Hoc | Component Framework | Business Logic | Platform Independent | EJB
Ad Hoc | Component Framework | Presentation / Interface | Static Display | HTML
Ad Hoc | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Ad Hoc | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Ad Hoc | Service Access and Delivery | Delivery Channels | Internet | TBD
Ad Hoc | Service Access and Delivery | Delivery Channels | Intranet | TBD
Ad Hoc | Service Interface and Integration | Integration | Middleware | WebSphere MQ
Ad Hoc | Service Platform and Infrastructure | Delivery Servers | Application Servers | WebSphere
Ad Hoc | Service Platform and Infrastructure | Hardware / Infrastructure | Embedded Technology Devices | RAM, Hard Disk Drive, Microprocessor, Redundant Array of Independent Disks
Ad Hoc | Service Platform and Infrastructure | Hardware / Infrastructure | Servers / Computers | Enterprise Server
Ad Hoc | Service Platform and Infrastructure | Software Engineering | Modeling | CASE
Alerts and Notifications | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Alerts and Notifications | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Alerts and Notifications | Service Access and Delivery | Delivery Channels | Internet | TBD
Alerts and Notifications | Service Access and Delivery | Delivery Channels | Intranet | TBD
Alerts and Notifications | Service Access and Delivery | Service Transport | Supporting Network Services | Internet Message Access Protocol / Post Office Protocol (IMAP / POP3)
Alerts and Notifications | Service Access and Delivery | Service Transport | Supporting Network Services | Simple Mail Transfer Protocol (SMTP)
Business Rule Management | Component Framework | Business Logic | Platform Independent | EJB
Business Rule Management | Component Framework | Security | Certificates / Digital Signatures | Secure Sockets Layer (SSL)
Business Rule Management | Service Platform and Infrastructure | Hardware / Infrastructure | Servers / Computers | Enterprise Server
Business Rule Management | Service Platform and Infrastructure | Software Engineering | Modeling | CASE
Catalog Management | Component Framework | Business Logic | Platform Independent | EJB
Catalog Management | Component Framework | Presentation / Interface | Static Display | HTML
Catalog Management | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Catalog Management | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Catalog Management | Service Interface and Integration | Integration | Middleware | WebSphere MQ
Catalog Management | Service Interface and Integration | Interoperability | Data Format / Classification | XML
Catalog Management | Service Interface and Integration | Interoperability | Data Format / Classification | XML Schema
Catalog Management | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Customer / Account Management | Component Framework | Business Logic | Platform Independent | EJB
Customer / Account Management | Component Framework | Presentation / Interface | Static Display | HTML
Customer / Account Management | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Customer / Account Management | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Customer / Account Management | Service Interface and Integration | Integration | Middleware | WebSphere MQ
Customer / Account Management | Service Interface and Integration | Interoperability | Data Format / Classification | XML
Customer / Account Management | Service Interface and Integration | Interoperability | Data Format / Classification | XML Schema
Customer / Account Management | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Data Exchange | Component Framework | Data Interchange | Data Exchange | Electronic Business using XML (ebXML)
Data Exchange | Component Framework | Data Management | Database Connectivity | Java Database Connectivity (JDBC)
Data Exchange | Component Framework | Data Management | Database Connectivity | ODBC
Data Exchange | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Data Exchange | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Data Exchange | Service Access and Delivery | Service Requirements | Hosting | Internal (within Agency)
Data Exchange | Service Access and Delivery | Service Transport | Supporting Network Services | Internet Message Access Protocol / Post Office Protocol (IMAP / POP3)
Data Exchange | Service Access and Delivery | Service Transport | Supporting Network Services | Simple Mail Transfer Protocol (SMTP)
Data Exchange | Service Access and Delivery | Service Transport | Supporting Network Services | Lightweight Directory Access Protocol (LDAP)
Data Exchange | Service Access and Delivery | Service Transport | Supporting Network Services | Directory Services (X.500)
Data Exchange | Service Access and Delivery | Service Transport | Supporting Network Services | Domain Name System (DNS)
Data Exchange | Service Access and Delivery | Service Transport | Supporting Network Services | Transport Control Protocol (TCP)
Data Exchange | Service Access and Delivery | Service Transport | Supporting Network Services | Internet Protocol (IP)
Data Exchange | Service Interface and Integration | Integration | Middleware | WebSphere MQ
Data Exchange | Service Interface and Integration | Interoperability | Data Format / Classification | XML
Data Exchange | Service Interface and Integration | Interoperability | Data Format / Classification | XML Schema
Data Exchange | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Data Exchange | Service Platform and Infrastructure | Delivery Servers | Application Servers | WebSphere
Data Exchange | Service Platform and Infrastructure | Hardware / Infrastructure | Embedded Technology Devices | RAM, Hard Disk Drive, Microprocessor, Redundant Array of Independent Disks
Data Exchange | Service Platform and Infrastructure | Hardware / Infrastructure | Servers / Computers | Enterprise Server
Data Mart | Component Framework | Data Interchange | Data Exchange | Electronic Business using XML (ebXML)
Data Mart | Component Framework | Data Management | Database Connectivity | Java Database Connectivity (JDBC)
Data Mart | Component Framework | Data Management | Database Connectivity | ODBC
Data Mart | Service Access and Delivery | Service Requirements | Hosting | Internal (within Agency)
Data Mart | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Data Mart | Service Platform and Infrastructure | Hardware / Infrastructure | Embedded Technology Devices | RAM, Hard Disk Drive, Microprocessor, Redundant Array of Independent Disks
Data Mart | Service Platform and Infrastructure | Hardware / Infrastructure | Servers / Computers | Enterprise Server
Identification and Authentication | Component Framework | Security | Certificates / Digital Signatures | Secure Sockets Layer (SSL)
Identification and Authentication | Service Access and Delivery | Service Requirements | Legislative / Compliance | Security
Identification and Authentication | Service Platform and Infrastructure | Hardware / Infrastructure | Servers / Computers | Enterprise Server
Identification and Authentication | Service Platform and Infrastructure | Hardware / Infrastructure | Wide Area Network (WAN) | Asynchronous Transfer Mode (ATM)
Information Retrieval | Component Framework | Business Logic | Platform Independent | EJB
Information Retrieval | Component Framework | Presentation / Interface | Static Display | HTML
Information Retrieval | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Information Retrieval | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Information Retrieval | Service Access and Delivery | Delivery Channels | Internet | TBD
Information Retrieval | Service Access and Delivery | Delivery Channels | Intranet | TBD
Information Retrieval | Service Access and Delivery | Service Transport | Supporting Network Services | Lightweight Directory Access Protocol (LDAP)
Information Retrieval | Service Access and Delivery | Service Transport | Supporting Network Services | Transport Control Protocol (TCP)
Information Retrieval | Service Access and Delivery | Service Transport | Supporting Network Services | Internet Protocol (IP)
Information Retrieval | Service Interface and Integration | Integration | Middleware | WebSphere MQ
Information Retrieval | Service Interface and Integration | Interoperability | Data Format / Classification | XML
Information Retrieval | Service Interface and Integration | Interoperability | Data Format / Classification | XML Schema
Information Retrieval | Service Platform and Infrastructure | Delivery Servers | Application Servers | WebSphere
Information Retrieval | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Information Retrieval | Service Platform and Infrastructure | Hardware / Infrastructure | Servers / Computers | Enterprise Server
Information Sharing | Component Framework | Business Logic | Platform Independent | EJB
Information Sharing | Component Framework | Presentation / Interface | Static Display | HTML
Information Sharing | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Information Sharing | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Information Sharing | Service Access and Delivery | Delivery Channels | Internet | TBD
Information Sharing | Service Access and Delivery | Delivery Channels | Intranet | TBD
Information Sharing | Service Access and Delivery | Service Transport | Supporting Network Services | Lightweight Directory Access Protocol (LDAP)
Information Sharing | Service Platform and Infrastructure | Delivery Servers | Application Servers | WebSphere
Information Sharing | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Meta Data Management | Service Access and Delivery | Service Requirements | Hosting | Internal (within Agency)
Meta Data Management | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Meta Data Management | Service Platform and Infrastructure | Hardware / Infrastructure | Embedded Technology Devices | RAM, Hard Disk Drive, Microprocessor, Redundant Array of Independent Disks
Meta Data Management | Service Platform and Infrastructure | Hardware / Infrastructure | Servers / Computers | Enterprise Server
Meta Data Management | Service Platform and Infrastructure | Hardware / Infrastructure | Servers / Computers | Enterprise Server
Meta Data Management | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Online Help | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Online Help | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Online Help | Service Access and Delivery | Delivery Channels | Internet | TBD
Online Help | Service Access and Delivery | Delivery Channels | Intranet | TBD
Online Help | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Online Help | Service Platform and Infrastructure | Delivery Servers | Application Servers | WebSphere
Online Tutorials | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Online Tutorials | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Online Tutorials | Service Access and Delivery | Delivery Channels | Internet | TBD
Online Tutorials | Service Access and Delivery | Delivery Channels | Intranet | TBD
Online Tutorials | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Online Tutorials | Service Platform and Infrastructure | Delivery Servers | Application Servers | WebSphere
Ordering / Purchasing | Component Framework | Business Logic | Platform Independent | EJB
Ordering / Purchasing | Component Framework | Presentation / Interface | Static Display | HTML
Ordering / Purchasing | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Ordering / Purchasing | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Ordering / Purchasing | Service Interface and Integration | Integration | Middleware | WebSphere MQ
Ordering / Purchasing | Service Interface and Integration | Interoperability | Data Format / Classification | XML
Ordering / Purchasing | Service Interface and Integration | Interoperability | Data Format / Classification | XML Schema
Ordering / Purchasing | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Query | Service Access and Delivery | Delivery Channels | Internet | TBD
Query | Service Access and Delivery | Delivery Channels | Intranet | TBD
Query | Service Platform and Infrastructure | Database / Storage | Database | Oracle
Query | Service Platform and Infrastructure | Delivery Servers | Application Servers | WebSphere
Query | Component Framework | Business Logic | Platform Independent | EJB
Software Distribution | Service Platform and Infrastructure | Software Engineering | Software Configuration Management | Version Management
Standardized / Canned | Component Framework | Business Logic | Platform Independent | EJB
Standardized / Canned | Component Framework | Presentation / Interface | Static Display | HTML
Standardized / Canned | Service Access and Delivery | Access Channels | Web Browser | Internet Explorer
Standardized / Canned | Service Access and Delivery | Access Channels | Web Browser | Netscape Communicator
Standardized / Canned | Service Access and Delivery | Delivery Channels | Internet | TBD
Standardized / Canned | Service Access and Delivery | Delivery Channels | Intranet | TBD
Standardized / Canned | Service Interface and Integration | Integration | Middleware | WebSphere MQ
Standardized / Canned | Service Platform and Infrastructure | Delivery Servers | Application Servers | WebSphere
Standardized / Canned | Service Platform and Infrastructure | Hardware / Infrastructure | Embedded Technology Devices | RAM, Hard Disk Drive, Microprocessor, Redundant Array of Independent Disks
Standardized / Canned | Service Platform and Infrastructure | Hardware / Infrastructure | Servers / Computers | Enterprise Server
Standardized / Canned | Service Platform and Infrastructure | Software Engineering | Modeling | CASE
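
Note: as an illustration only (not part of the exhibit), the component-to-specification mapping described in note (a) can be recovered from the table above by grouping rows on the FEA SRM Component column; the short Python sketch below assumes the rows are available as five-element tuples and lists only a few sample rows.

from collections import defaultdict

# Abbreviated sample of the rows above, as
# (SRM component, service area, service category, service standard, specification).
trm_rows = [
    ("Ad Hoc", "Component Framework", "Business Logic", "Platform Independent", "EJB"),
    ("Ad Hoc", "Service Platform and Infrastructure", "Delivery Servers", "Application Servers", "WebSphere"),
    ("Data Exchange", "Service Interface and Integration", "Integration", "Middleware", "WebSphere MQ"),
    ("Data Exchange", "Service Platform and Infrastructure", "Database / Storage", "Database", "Oracle"),
]

specs_by_component = defaultdict(set)
for component, _area, _category, _standard, spec in trm_rows:
    specs_by_component[component].add(spec)

# List the service specifications supporting each SRM component.
for component in sorted(specs_by_component):
    print(component + ":", ", ".join(sorted(specs_by_component[component])))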

 

 

I. F. 6. Will the application leverage existing components and/or applications across the Government (i.e., FirstGov, Pay.Gov, etc)?   

 

 

 

yes

 

 

I. F. 6. a. If "yes," please describe.   

 

(long text - 2500 characters)

 

 

The ROSS project is integrated with the Disaster Management Initiative; information on ROSS can be obtained through the Web portal, http://www.disasterhelp.gov. In May of 2005, OMB agreed that the “alignment is complete” for ROSS and the Disaster Management Initiative. ROSS has been adopted as the National Automated Resource Management System (ARMS) standard by the Department of Homeland Security/Federal Emergency Management Agency (FEMA). ARMS is a mandated system that must meet the stated requirements of the National Incident Management System (NIMS). ROSS is the only Government Off the Shelf software to be evaluated for ARMS adoption by FEMA. A wide range of commercial products was considered, but ROSS is the only product that meets a significant percentage of the FEMA ARMS requirements.

 

 

PART II: PLANNING, ACQUISITION AND PERFORMANCE INFORMATION  

Part II should be completed only for investments identified as “Planning,” “Full-Acquisition,” or “Mixed Life-Cycle” investments in response to Question 6 in Part I, Section A above.

 

 

 

Section A: Alternatives Analysis (All Capital Assets)  

In selecting the best capital asset, you should identify and consider at least three viable alternatives, in addition to the current baseline, i.e., the status quo. Use OMB Circular A-94 for all investments and the Clinger Cohen Act of 1996 for IT investments to determine the criteria you should use in your Benefit/Cost Analysis.
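
Note: OMB Circular A-94 calls for comparing alternatives on a discounted basis. The sketch below is a rough, hypothetical net-present-value comparison in that spirit; the 7 percent rate and the cost/benefit streams are placeholders, not figures from this exhibit.

# Hypothetical discounted comparison of alternatives; rate and cash flows are placeholders.
def npv(net_benefits, rate=0.07):
    """Present value of a stream of annual net benefits ($M), year 0 first."""
    return sum(value / (1 + rate) ** year for year, value in enumerate(net_benefits))

alternatives = {
    "Status quo (baseline)": [0.0, -1.0, -1.0, -1.0],
    "Alternative A":         [-2.0, 0.5, 1.5, 1.5],
    "Alternative B":         [-3.0, 1.0, 1.5, 2.0],
}

for name, flows in alternatives.items():
    print(f"{name}: NPV = {npv(flows):+.2f} $M")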

 

 

 

II. A. 1. Did you conduct an alternatives analysis for this project?   

 

 

 

 

 

 

II. A. 1. a. If "yes," provide the date the analysis was completed?   

 

 

 

 

 

 

II. A. 1. b. If "no," what is the anticipated date this analysis will be completed?   

 

 

 

 

 

 

II. A. 1. c. If no analysis is planned, please briefly explain why:   

 

(medium text - 500 characters)

 

 

 

 

 

II. A. 2. Use the results of your alternatives analysis to complete the following table:   

 

(Character Limitations: Alternative Analyzed - 250 characters; Description of Alternative - 500 Characters)

 

 

 

 

 

II. A. 3. Which alternative was selected by the Agency's Executive/Investment Committee and why was it chosen?   

 

(long text - 2500 characters)

 

 

 

 

 

II. A. 4. What specific qualitative benefits will be realized?   

 

(long text - 2500 characters)

 

 

 

 

 

II. A. 5. Will the selected alternative replace a legacy system in-part or in-whole?   

 

 

 

 

 

 

II. A. 5. a. If “yes,” are the migration costs associated with the migration to the selected alternative included in this investment, the legacy investment, or in a separate migration investment?   

 

 

 

 

 

 

II. A. 5. b. Table 1. If "yes," please provide the following information:   

 

 

 

 

 

 

Section B: Risk Management (All Capital Assets)  

You should have performed a risk assessment during the early planning and initial concept phase of this investment's life-cycle, developed a risk-adjusted life-cycle cost estimate and a plan to eliminate, mitigate or manage risk, and be actively managing risk throughout the investment's life-cycle.

 

 

 

II. B. 1. Does the investment have a Risk Management Plan?   

 

 

 

 

 

 

II. B. 1. a. If "yes," what is the date of the plan?   

 

 

 

 

 

 

II. B. 1. b. Has the Risk Management Plan been significantly changed since last year's submission to OMB?   

 

 

 

 

 

 

II. B. 1. c. If "yes," describe any significant changes:   

 

(long text - 2500 characters)

 

 

 

 

 

II. B. 2. If there currently is no plan, will a plan be developed?   

 

 

 

 

 

 

II. B. 2. a. If "yes," what is the planned completion date?   

 

 

 

 

 

 

II. B. 2. b. If "no," what is the strategy for managing the risks?   

 

(long text - 2500 characters)

 

 

 

 

 

II. B. 3. Briefly describe how investment risks are reflected in the life cycle cost estimate and investment schedule:   

 

(long text - 2500 characters)

 

 

 

 

 

Section C: Cost and Schedule Performance (All Capital Assets)  

EVM is required only on DME portions of investments. For mixed lifecycle investments, O&M milestones should still be included in the table (Comparison of Initial Baseline and Current Approved Baseline). This table should accurately reflect the milestones in the initial baseline, as well as milestones in the current baseline.

 

 

 

II. C. 1. Does the earned value management system meet the criteria in ANSI/EIA Standard - 748?   

 

 

 

 

 

 

II. C. 2. Is the CV or SV greater than 10%?   

 

 

 

 

 

 

II. C. 2. a. If "yes," was it the CV or SV or both ?   

 

 

 

 

 

 

II. C. 2. b. If "yes," explain the causes of the variance:   

 

(long text - 2500 characters)

 

 

 

 

 

II. C. 2. c. If "yes," describe the corrective actions:   

 

(long text - 2500 characters)

 

 

 

 

 

II. C. 3. Has the investment re-baselined during the past fiscal year?   

 

 

 

 

 

 

II. C. 3. a. If "yes," when was it approved by the agency head?   

 

 

 

 

 

 

II. C. 4. Comparison of Initial Baseline and Current Approved Baseline   

 

Complete the following table to compare actual performance against the current performance baseline and to the initial performance baseline. In the Current Baseline section, for all milestones listed, you should provide both the baseline and actual completion dates (e.g., “03/23/2003”/ “04/28/2004”) and the baseline and actual total costs (in $ Millions). In the event that a milestone is not found in both the initial and current baseline, leave the associated cells blank. Note that the ‘Description of Milestone’ and ‘Percent Complete’ fields are required. Indicate ‘0’ for any milestone no longer active. (Character Limitations: Description of Milestone - 500 characters)
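
Note: the sketch below is a hypothetical illustration of the comparison described above, pairing baseline and actual completion dates and total costs (in $ millions) for a single milestone; the field names and sample values (other than the example dates quoted above) are assumptions, not data from this investment.

from dataclasses import dataclass
from datetime import date

@dataclass
class Milestone:
    description: str          # required field (500-character limit)
    percent_complete: float   # required field; 0 for milestones no longer active
    baseline_date: date
    actual_date: date
    baseline_cost_m: float    # $ millions
    actual_cost_m: float      # $ millions

# Sample milestone; the dates reuse the example format quoted above.
m = Milestone("Example milestone", 100.0, date(2003, 3, 23), date(2004, 4, 28), 1.20, 1.35)
slip_days = (m.actual_date - m.baseline_date).days
cost_variance = m.actual_cost_m - m.baseline_cost_m
print(f"{m.description}: slipped {slip_days} days, cost variance {cost_variance:+.2f} $M")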

 

 

 

 

 

PART III: FOR "OPERATION AND MAINTENANCE" INVESTMENTS ONLY (STEADY-STATE)  

Part III should be completed only for investments identified as "Operation and Maintenance" (Steady State) in response to Question 6 in Part I, Section A above.

 

 

 

Section A: Risk Management (All Capital Assets)  

You should have performed a risk assessment during the early planning and initial concept phase of this investment’s life-cycle, developed a risk-adjusted life-cycle cost estimate and a plan to eliminate, mitigate or manage risk, and be actively managing risk throughout the investment’s life-cycle.

 

 

 

III. A. 1. Does the investment have a Risk Management Plan?   

 

 

 

yes

 

 

III. A. 1. a. If "yes," what is the date of the plan?   

 

 

 

2007-07-23

 

 

III. A. 1. b. Has the Risk Management Plan been significantly changed since last year's submission to OMB?   

 

 

 

no

 

 

III. A. 1. c. If "yes," describe any significant changes:   

 

(long text - 2500 characters)

 

 

 

 

 

III. A. 2. If there currently is no plan, will a plan be developed?   

 

 

 

 

 

 

III. A. 2. a. If "yes," what is the planned completion date?   

 

 

 

 

 

 

III. A. 2. b. If "no," what is the strategy for managing the risks?   

 

(long text - 2500 characters)

 

 

 

 

 

Section B: Cost and Schedule Performance (All Capital Assets)  

 

 

III. B. 1. Was operational analysis conducted?   

 

 

 

 

 

 

III. B. 1. a. If "yes," provide the date the analysis was completed.   

 

 

 

 

 

 

III. B. 1. b. If "yes," what were the results?   

 

(long text - 2500 characters)

 

 

 

 

 

III. B. 1. c. If "no," please explain why it was not conducted and if there are any plans to conduct operational analysis in the future:   

 

(long text - 2500 characters)

 

 

 

 

 

III. B. 2. Complete the following table to compare actual cost performance against the planned cost performance baseline. Milestones reported may include specific individual scheduled preventative and predictable corrective maintenance activities, or may be the total of planned annual operation and maintenance efforts.  

(Character Limitations: Description of Milestone - 250 Characters)

 

 

 

III. B. 2. a. What costs are included in the reported Cost/Schedule Performance information (Government Only/Contractor Only/Both)?   

 

 

 

 

 

 

III. B. 2. b. Comparison of Planned and Actual Cost   

 

 

 

 

 

 

PART IV: Planning For "Multi-Agency Collaboration" ONLY  

Part IV should be completed only for investments identified as an E-Gov initiative, a Line of Business (LOB) initiative, or a Multi-Agency Collaboration effort, i.e., those that selected the “Multi-Agency Collaboration” choice in response to Question 6 in Part I, Section A above. Investments identified as “Multi-Agency Collaboration” will complete only Parts I and IV of the exhibit 300.

 

 

 

Section A: Multi-Agency Collaboration Oversight (All Capital Assets)  

Multi-agency Collaborations, such as E-Gov and LOB initiatives, should develop a joint exhibit 300.

 

 

 

IV. A. 1. Stakeholder Table   

 

As a joint exhibit 300, please identify the agency stakeholders. Provide the partner agency and partner agency approval date for this joint exhibit 300.

 

 

 

 

 

IV. A. 2. Partner Capital Assets within this Investment   

 

Provide the partnering strategies you are implementing with the participating agencies and organizations. Identify all partner agency capital assets supporting the common solution (section 300.7); Managing Partner capital assets should also be included in this joint exhibit 300. These capital assets should be included in the Summary of Spending table of Part I, Section B. All partner agency migration investments (section 53.4) should also be included in this table. Funding contributions/fee-for-service transfers should not be included in this table. (Partner Agency Asset UPIs should also appear on the Partner Agency's exhibit 53)

 

 

 

 

 

IV. A. 3. Partner Funding Strategies ($millions)   

 

For jointly funded initiative activities, provide in the “Partner Funding Strategies Table”: the name(s) of partner agencies; the UPI of the partner agency investments; and the partner agency contributions for CY and BY. Please indicate partner contribution amounts (in-kind contributions should also be included in this amount) and fee-for-service amounts. (Partner Agency Asset UPIs should also appear on the Partner Agency's exhibit 53. For non-IT fee-for-service amounts the Partner exhibit 53 UPI can be left blank) (IT migration investments should not be included in this table)

 

 

 

 

 

IV. A. 4. Did you conduct an alternatives analysis for this project?   

 

 

 

 

 

 

IV. A. 4. a. If "yes," provide the date the analysis was completed?   

 

 

 

 

 

 

IV. A. 4. b. If "no," what is the anticipated date this analysis will be completed?   

 

 

 

 

 

 

IV. A. 4. c. If no analysis is planned, please briefly explain why:   

 

(medium text - 500 characters)

 

 

 

 

 

IV. A. 5. Use the results of your alternatives analysis to complete the following table:   

 

 

 

 

 

 

IV. A. 6. Which alternative was selected by the Initiative Governance process and why was it chosen?   

 

(long text - 2500 characters)

 

 

 

 

 

IV. A. 7. What specific qualitative benefits will be realized?   

 

(long text - 2500 characters)

 

 

 

 

 

IV. A. 8. Table 1. Federal Quantitative Benefits ($millions):   

 

What specific quantitative benefits will be realized (using current dollars)?
Use the results of your alternatives analysis to complete the following table:

 

 

 

 

 

IV. A. 9. Will the selected alternative replace a legacy system in-part or in-whole?   

 

 

 

 

 

 

IV. A. 9. a. If "yes," are the migration costs associated with the migration to the selected alternative included in this investment, the legacy investment, or in a separate migration investment?   

 

 

 

 

 

 

IV. A. 9. b. Table 1. If "yes," please provide the following information:   

 

 

 

 

 

 

Section B: Risk Management (All Capital Assets)  

You should have performed a risk assessment during the early planning and initial concept phase of this investment’s life-cycle, developed a risk-adjusted life-cycle cost estimate and a plan to eliminate, mitigate or manage risk, and be actively managing risk throughout the investment’s life-cycle.

 

 

 

IV. B. 1. Does the investment have a Risk Management Plan?   

 

 

 

 

 

 

IV. B. 1. a. If "yes," what is the date of the plan?   

 

 

 

 

 

 

IV. B. 1. b. Has the Risk Management Plan been significantly changed since last year's submission to OMB?   

 

 

 

 

 

 

IV. B. 1. c. If "yes," describe any significant changes:   

 

(long text - 2500 characters)

 

 

 

 

 

IV. B. 2. If there currently is no plan, will a plan be developed?   

 

 

 

 

 

 

IV. B. 2. a. If "yes," what is the planned completion date?   

 

 

 

 

 

 

IV. B. 2. b. If "no," what is the strategy for managing the risks?   

 

(long text - 2500 characters)

 

 

 

 

 

Section C: Cost and Schedule Performance (All Capital Assets)  

You should also periodically be measuring the performance of operational assets against the baseline established during the planning or full acquisition phase (i.e., operational analysis), and be properly operating and maintaining the asset to maximize its useful life. Operational analysis may identify the need to redesign or modify an asset by identifying previously undetected faults in design, construction, or installation/integration, highlighting whether actual operation and maintenance costs vary significantly from budgeted costs, or documenting that the asset is failing to meet program requirements.

EVM is required only on DME portions of investments. For mixed lifecycle investments, O&M milestones should still be included in the table (Comparison of Initial Baseline and Current Approved Baseline). This table should accurately reflect the milestones in the initial baseline, as well as milestones in the current baseline.

Answer the following questions about the status of this investment. Include information on all appropriate capital assets supporting this investment except for assets in which the performance information is reported in a separate exhibit 300.

 

 

 

IV. C. 1. Are you using EVM to manage this investment?   

 

 

 

 

 

 

IV. C. 1. a. If "yes," does the earned value management system meet the criteria in ANSI/EIA Standard - 748?   

 

 

 

 

 

 

IV. C. 1. b. If "no," explain plans to implement EVM:   

 

(long text - 2500 characters)

 

 

 

 

 

IV. C. 1. c. If "N/A," please provide date operational analysis was conducted and a brief summary of the results?   

 

(long text - 2500 characters)

 

 

 

 

 

IV. C. 2. Is the CV% or SV% greater than ± 10%? (CV% = CV/EV x 100; SV% = SV/PV x 100)   

 

NOT applicable for capital assets with ONLY O&M.
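
Note: the following worked illustration applies the CV% and SV% formulas given above, with CV = EV - AC and SV = EV - PV from standard earned value practice; the dollar values are placeholders, not data from this investment.

def variance_percents(ev, pv, ac):
    """CV% = (EV - AC) / EV x 100; SV% = (EV - PV) / PV x 100."""
    return 100.0 * (ev - ac) / ev, 100.0 * (ev - pv) / pv

ev, pv, ac = 4.5, 5.2, 4.8   # earned value, planned value, actual cost ($M, placeholders)
cv_pct, sv_pct = variance_percents(ev, pv, ac)
print(f"CV% = {cv_pct:.1f}%, SV% = {sv_pct:.1f}%")
print("Variance explanation required (|CV%| or |SV%| > 10):", abs(cv_pct) > 10 or abs(sv_pct) > 10)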

 

 

 

 

 

IV. C. 2. a. If "yes," was it the CV or SV or both ?   

 

 

 

 

 

 

IV. C. 2. b. If "yes," explain the causes of the variance:   

 

(long text - 2500 characters)

 

 

 

 

 

IV. C. 2. c. If "yes," describe the corrective actions:   

 

(long text - 2500 characters)

 

 

 

 

 

IV. C. 3. Has the investment re-baselined during the past fiscal year?   

 

Applicable to ALL capital assets

 

 

 

 

 

IV. C. 3. a. If "yes," when was it approved by the agency head?   

 

Applicable to ALL capital assets

 

 

 

 

 

IV. C. 4. Comparison of Initial Baseline and Current Approved Baseline   

 

Complete the following table to compare actual performance against the current performance baseline and to the initial performance baseline. In the Current Baseline section, for all milestones listed, you should provide both the baseline and actual completion dates (e.g., “03/23/2003”/ “04/28/2004”) and the baseline and actual total costs (in $ Millions). In the event that a milestone is not found in both the initial and current baseline, leave the associated cells blank. Note that the ‘Description of Milestone’ and ‘Percent Complete’ fields are required. Indicate ‘0’ for any milestone no longer active.

 

 

 

 
