Exhibit 300 (BY2009) for Centers for Medicare and Medicaid Services, Q-net

PART ONE
OVERVIEW
- 1. Date of Submission:
- 2008-02-04
- 2. Agency:
- 009
- 3. Bureau:
- 38
- 4. Name of this Capital Asset:
- CMS Q-net
- 5. Unique Project Identifier:
- 009-38-01-02-01-2200-00
- 6. What kind of investment will this be in FY2009?
- Operations and Maintenance
- 7. What was the first budget year this investment was submitted to OMB?
- FY2001 or earlier
- 8. Provide a brief summary and justification for this investment, including a brief description of how this closes in part or in whole an identified agency performance gap.
- QualityNet (QNET), formerly the Standard Data Processing System (SDPS), was developed in response to the ongoing information requirements of the Quality Improvement Organizations (QIOs) and other affiliated partners (e.g., the Clinical Data Abstraction Centers (CDACs)) to fulfill their contractual requirements. QIOs contract with CMS to improve the quality of care for beneficiaries by ensuring that this care meets professionally recognized standards of health care. These partners are awarded contracts based on agency and legislative statements of work that directly reflect goals in the PMA "Expanded E-Government" initiative and the HHS strategic plan of "Improving Health Care Quality, Safety, Cost and Value." The contract statements of work (SOW) cover 3-year periods. They include data center operations, application development, telecommunication fees, and all other IT infrastructure such as servers, racks, and backup. QNET provides the application, data, and physical infrastructure to support the SOW with the efficient collection, analysis, dissemination, and management of data guiding policy and intervention, as well as SOW evaluation. QNET, which became operational in May 1997, interfaces with the CMS Central Office, the QIES systems, the ESRD system, 53 QIOs, and a CDAC. QNET is an information system solution that provides a common platform for users to share applications and databases, promoting efficiency and increasing production for all the contractors performing work on the SOW. The overall goal is to provide an IT infrastructure to support QIO operations, promote information sharing, and support management information, all in direct response to the 3-year SOW. As CMS has moved development to the Web, and as the QIO program mission has changed and legislative requirements have developed, QNET has expanded greatly to meet the needs of its external communities as well.
In FY06, the QNET system began supporting CMS's Pay for Performance program in its initial phase, the Physician Voluntary Reporting phase. QNET includes QualityNet.org, which is the initial web portal for all external customers. Information resides primarily at QNet Complex 3, located at the BCSSI Data Center in Warrenton, VA, on dedicated QNet servers and networks. In addition, the CMS claims warehouse and enrollment database reside at Complex 1, the CMS Data Center in Baltimore, Maryland, on CMS Office of Information Systems servers and networks. QNET is in the CPIC Steady State phase.
- 9. Did the Agency's Executive/Investment Committee approve this request?
- yes
- 9.a. If "yes," what was the date of this approval?
- 2007-06-26
- 10. Did the Project Manager review this Exhibit?
- yes
- 11.a. What is the current FAC-P/PM certification level of the project/program manager?
- TBD
- 12. Has the agency developed and/or promoted cost effective, energy-efficient and environmentally sustainable techniques or practices for this project.
- yes
- 12.a. Will this investment include electronic assets (including computers)?
- yes
- 12.b. Is this investment for new construction or major retrofit of a Federal building or facility? (answer applicable to non-IT assets only)
- no
- 13. Does this investment directly support one of the PMA initiatives?
- yes
- If yes, select the initiatives that apply:
Initiative Name |
---|
Expanded E-Government |
- 13.a. Briefly and specifically describe for each selected how this asset directly supports the identified initiative(s)? (e.g. If E-Gov is selected, is it an approved shared service provider or the managing partner?)
- QNET (formerly SDPS) directly supports the Expanded E-Government initiative, since QNET supports QualityNet Exchange and web e-government activities. QualityNet Exchange is a web interface that allows the public secure registration and the exchange of program-required data over the Internet, lowering data collection costs and allowing greater outreach to provider and beneficiary communities. QualityNet Exchange is the primary application within the QNET investment.
- 14. Does this investment support a program assessed using the Program Assessment Rating Tool (PART)?
- yes
- 14.a. If yes, does this investment address a weakness found during the PART review?
- yes
- 14.b. If yes, what is the name of the PARTed program?
- 2003: CMS - Medicare Program
- 14.c. If yes, what rating did the PART receive?
- Moderately Effective
- 15. Is this investment for information technology?
- yes
- 16. What is the level of the IT Project (per CIO Council's PM Guidance)?
- Level 3
- 17. What project management qualifications does the Project Manager have? (per CIO Council's PM Guidance)
- (1) Project manager has been validated as qualified for this investment
- 18. Is this investment identified as high risk on the Q4 - FY 2007 agency high risk report (per OMB memorandum M-05-23)?
- no
- 19. Is this a financial management system?
- no
- 20. What is the percentage breakout for the total FY2009 funding request for the following? (This should total 100%)
Area | Percentage |
---|---|
Hardware | 3 |
Software | 4 |
Services | 90 |
Other | 3 |
- 21. If this project produces information dissemination products for the public, are these products published to the Internet in conformance with OMB Memorandum 05-04 and included in your agency inventory, schedules and priorities?
- yes
- 22. Contact information of individual responsible for privacy related questions.
Name | Phone Number | Title | Email |
---|---|---|---|
Maribel Franey | 410-786-0757 | Director, Privacy Compliance | Maribel.Franey@cms.hhs.gov |
- 23. Are the records produced by this investment appropriately scheduled with the National Archives and Records Administration's approval?
- yes
- 24. Does this investment directly support one of the GAO High Risk Areas?
- yes
SUMMARY OF SPENDING
- 1. Provide the total estimated life-cycle cost for this investment by completing the following table. All amounts represent budget authority in millions, and are rounded to three decimal places. Federal personnel costs should be included only in the row designated Government FTE Cost, and should be excluded from the amounts shown for Planning, Full Acquisition, and Operation/Maintenance. The total estimated annual cost of the investment is the sum of costs for Planning, Full Acquisition, and Operation/Maintenance. For Federal buildings and facilities, life-cycle costs should include long term energy, environmental, decommissioning, and/or restoration costs. The costs associated with the entire life-cycle of the investment should be included in this report.
All amounts represent Budget Authority
Note: For the cross-agency investments, this table should include all funding (both managing partner and partner agencies).
Government FTE Costs should not be included as part of the TOTAL represented.

Cost Type | PY-1 & Earlier (-2006) | PY 2007 | CY 2008 | BY 2009 |
---|---|---|---|---|
Planning Budgetary Resources | 0.000 | 0.000 | 0.000 | 0.000 |
Acquisition Budgetary Resources | 0.000 | 0.000 | 0.000 | 0.000 |
Maintenance Budgetary Resources | 29.661 | 26.836 | 69.056 | 52.210 |
Government FTE Cost | 1.823 | 3.000 | 3.100 | 3.200 |
# of FTEs | 16 | 18 | 18 | 18 |
- 2. Will this project require the agency to hire additional FTE's?
- yes
- 2.a. If "yes," how many and in what year?
- 8 new FTEs, starting in FY08, to support new requirements and growth of the system. Proper federal oversight of this system will require additional FTE resources; if the agency is not given the additional FTEs, proper management and oversight may lapse. The need is due to a new scope of work and an increase in effort. A zero-base budget drill was performed and showed the need for at least 8 new FTEs to keep all work moving and properly managed.
- 3. If the summary of spending has changed from the FY2008 President's budget request, briefly explain those changes.
- Spending has increased from the FY2008 President's budget request because the Q-net Exhibit 300 now includes the costs needed to support the QIO 9th Scope of Work (SOW).
PERFORMANCE

In order to successfully address this area of the exhibit 300, performance goals must be provided for the agency and be linked to the annual performance plan. The investment must discuss the agency's mission and strategic goals, and performance measures (indicators) must be provided. These goals need to map to the gap in the agency's strategic goals and objectives this investment is designed to fill. They are the internal and external performance benefits this investment is expected to deliver to the agency (e.g., improve efficiency by 60 percent, increase citizen participation by 300 percent a year to achieve an overall citizen participation rate of 75 percent by FY 2xxx, etc.). The goals must be clearly measurable investment outcomes and, if applicable, investment outputs. They do not include the completion date of the module, milestones, or investment, or general goals such as "significant," "better," or "improved" that do not have a quantitative measure.
- Agencies must use the following table to report performance goals and measures for the major investment and use the Federal Enterprise Architecture (FEA) Performance Reference Model (PRM). Map all Measurement Indicators to the corresponding Measurement Area and Measurement Grouping identified in the PRM. There should be at least one Measurement Indicator for each of the four different Measurement Areas (for each fiscal year). The PRM is available at www.egov.gov. The table can be extended to include performance measures for years beyond FY 2009.
Row | Fiscal Year | Strategic Goal Supported | Measurement Area | Measurement Grouping | Measurement Indicator | Baseline | Planned Improvement to the Baseline | Actual Results |
---|---|---|---|---|---|---|---|---|
1 | 2005 | S.O. 1.3 - Improve health care quality, safety, cost and value | Mission and Business Results | Health Care Administration | % of QIO Quality Targets Established | 0% established | 100% | 75% |
2 | 2005 | S.O. 1.3 - Improve health care quality, safety, cost and value | Customer Results | Service Efficiency | % of helpdesk tickets closed within 10 days | 0% | 90% closed | 50% |
3 | 2005 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Availability | % of CART Hospital Measures Established | 0% | 100% | 75% |
4 | 2005 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Data Reliability and Quality | % of HL7 Standards Used | 0% standards used | 0% | 100% |
5 | 2005 | S.O. 1.3 - Improve health care quality, safety, cost and value | Processes and Activities | Errors | % of required patches/bug fixes to code | 0% | 10% | 25% |
6 | 2006 | S.O. 1.3 - Improve health care quality, safety, cost and value | Mission and Business Results | Health Care Administration | % of QIO Quality Targets Established | 75% | 100% | 90% |
7 | 2006 | S.O. 1.3 - Improve health care quality, safety, cost and value | Customer Results | Service Efficiency | % of helpdesk tickets closed within 10 days | 50% | 100% | 75% |
8 | 2006 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Availability | % of CART Hospital Measures Established | 75% | 100% | 100% |
9 | 2006 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Response Time | % of custom applications delivered on time | 60% | 90% | 50% |
10 | 2006 | S.O. 1.3 - Improve health care quality, safety, cost and value | Processes and Activities | Errors | % of required patches/bug fixes to code | 25% | 10% | 20% |
11 | 2007 | S.O. 1.3 - Improve health care quality, safety, cost and value | Mission and Business Results | Health Care Administration | % of QIO Quality Targets Established | 90% | 100% | 98% as of Q3 |
12 | 2007 | S.O. 1.3 - Improve health care quality, safety, cost and value | Customer Results | Service Efficiency | % of helpdesk tickets closed within 10 days | 75% | 90% | 98% as of Q3 |
13 | 2007 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Availability | % of NEW CART Hospital Measures Established for HQA and APU | 0% | 75% | 75% as of Q3 |
14 | 2007 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Response Time | % of custom applications delivered on time | 50% | 75% | 40% as of Q3 |
15 | 2007 | S.O. 1.3 - Improve health care quality, safety, cost and value | Processes and Activities | Errors | % of required patches/bug fixes to code | 20% | 10% | 85% as of Q3 |
16 | 2008 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Response Time | % of custom applications delivered on time | 40% | 75% | TBD |
17 | 2008 | S.O. 1.3 - Improve health care quality, safety, cost and value | Customer Results | Service Efficiency | % of QIO Quality Targets Established | 98% | 100% | TBD |
18 | 2008 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Availability | % of NEW CART Hospital Measures Established for HQA and APU | 75% | 100% | TBD |
19 | 2009 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Response Time | % of custom applications delivered on time | 40% | 85% | TBD |
20 | 2008 | S.O. 1.3 - Improve health care quality, safety, cost and value | Processes and Activities | Errors | % of required patches/bug fixes to code | 85% | 20% | TBD |
21 | 2009 | S.O. 1.3 - Improve health care quality, safety, cost and value | Mission and Business Results | Health Care Administration | % of QIO 9 SOW Quality Targets Established | 0% | 100% | TBD |
22 | 2009 | S.O. 1.3 - Improve health care quality, safety, cost and value | Customer Results | Service Efficiency | % of helpdesk tickets closed within 10 days | 98% | 100% | TBD |
23 | 2009 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Availability | % of NEW 9 SOW CART Hospital Measures Established for HQA and APU | 0% | 75% | TBD |
24 | 2009 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Data Reliability and Quality | % of HL7 Standards established within EHR systems | 0% | 100% | TBD |
25 | 2009 | S.O. 1.3 - Improve health care quality, safety, cost and value | Processes and Activities | Errors | % of required patches/bug fixes to code | 85% | 10% | TBD |
26 | 2010 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Response Time | % of custom applications delivered on time | 40% | 90% | TBD |
27 | 2010 | S.O. 1.3 - Improve health care quality, safety, cost and value | Customer Results | Service Efficiency | % of helpdesk tickets closed within 10 days | 98% | 100% | TBD |
28 | 2010 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Availability | % of NEW 9 SOW CART Hospital Measures Established for HQA and APU | 0% | 100% | TBD |
29 | 2010 | S.O. 1.3 - Improve health care quality, safety, cost and value | Technology | Data Reliability and Quality | % of HL7 Standards established within EHR systems | 0% | 100% | TBD |
30 | 2010 | S.O. 1.3 - Improve health care quality, safety, cost and value | Processes and Activities | Errors | % of required patches/bug fixes to code | 85% | 5% | TBD |
31 | 2010 | S.O. 1.3 - Improve health care quality, safety, cost and value | Mission and Business Results | Health Care Administration | % of 9 SOW measures monitored in the New MIS systems | 0% | 100% | TBD |
32 | 2008 | S.O. 1.3 - Improve health care quality, safety, cost and value | Mission and Business Results | Health Care Administration | % of 9 SOW measures monitored in the New MIS systems | 0% | 75% | TBD |
ENTERPRISE ARCHITECTURE

In order to successfully address this area of the business case and capital asset plan you must ensure the investment is included in the agency's EA and Capital Planning and Investment Control (CPIC) process, and is mapped to and supports the FEA. You must also ensure the business case demonstrates the relationship between the investment and the business, performance, data, services, application, and technology layers of the agency's EA.
- 1. Is this investment included in your agency's target enterprise architecture?
- yes
- 2. Is this investment included in the agency's EA Transition Strategy?
- yes
- 2.a. If yes, provide the investment name as identified in the Transition Strategy provided in the agency's most recent annual EA Assessment.
- CMS Q-net
- 3. Is this investment identified in a completed (contains a target architecture) and approved segment architecture?
- no
- 4. Identify the service components funded by this major IT investment (e.g., knowledge management, content management, customer relationship management, etc.). Provide this information in the format of the following table. For detailed guidance regarding components, please refer to http://www.whitehouse.gov/omb/egov/.
Component: Use existing SRM Components or identify as NEW. A NEW component is one not already identified as a service component in the FEA SRM.
Reused Name and UPI: A reused component is one being funded by another investment, but being used by this investment. Rather than answer yes or no, identify the reused service component funded by the other investment and identify the other investment using the Unique Project Identifier (UPI) code from the OMB Ex 300 or Ex 53 submission.
Internal or External Reuse?: Internal reuse is within an agency. For example, one agency within a department is reusing a service component provided by another agency within the same department. External reuse is one agency within a department reusing a service component provided by another agency in another department. A good example of this is an E-Gov initiative service being reused by multiple organizations across the federal government.
Funding Percentage: Please provide the percentage of the BY requested funding amount used for each service component listed in the table. If external, provide the funding level transferred to another agency to pay for the service.

Row | Agency Component Name | Agency Component Description | Service Type | Component | Reused Component Name | Reused UPI | Internal or External Reuse? | Funding % |
---|---|---|---|---|---|---|---|---|
1 | Identification and Authentication | Defines the set of capabilities that support obtaining information about those parties attempting to log on to a system or application for security purposes and the validation of those users. | Security Management | Identification and Authentication | | | No Reuse | 5 |
2 | Access Control | Support the management of permissions for logging onto a computer, application, service, or network; includes user management and role/privilege management. | Security Management | Access Control | | | No Reuse | 5 |
3 | Standardized / Canned | Defines the set of capabilities that support the use of pre-conceived or pre-written reports. | Reporting | Standardized / Canned | | | No Reuse | 5 |
4 | Ad Hoc | Defines the set of capabilities that support the use of dynamic reports on an as needed basis. | Reporting | Ad Hoc | | | No Reuse | 5 |
5 | Data Warehouse | Defines the set of capabilities that support the archiving and storage of large volumes of data. | Data Management | Data Warehouse | | | No Reuse | 10 |
6 | Meta Data Management | Defines the set of capabilities that support the maintenance and administration of data that describes data. | Data Management | Meta Data Management | | | No Reuse | 5 |
7 | Data Exchange | Defines the set of capabilities that support the interchange of information between multiple systems or applications; includes verification that transmitted data was received unaltered. | Data Management | Data Exchange | | | No Reuse | 20 |
8 | Data Integration | Defines the set of capabilities that support the organization of data from separate data sources into a single source using middleware or application integration and the modification of system data models to capture new information within a single system. | Development and Integration | Data Integration | | | No Reuse | 10 |
9 | Software Development | Defines the set of capabilities that support the creation of both graphical and process application or system software. | Development and Integration | Software Development | | | No Reuse | 30 |
10 | Business Rule Management | Defines the set of capabilities that manage the enterprise processes that support an organization and its policies. | Management of Processes | Business Rule Management | | | No Reuse | 5 |
- 5. To demonstrate how this major IT investment aligns with the FEA Technical Reference Model (TRM), please list the Service Areas, Categories, Standards, and Service Specifications supporting this IT investment.
FEA SRM Component: Service Components identified in the previous question should be entered in this column. Please enter multiple rows for FEA SRM Components supported by multiple TRM Service Specifications.
Service Specification: In the Service Specification field, Agencies should provide information on the specified technical standard or vendor product mapped to the FEA TRM Service Standard, including model or version numbers, as appropriate.

Row | SRM Component | Service Area | Service Category | Service Standard | Service Specification (i.e., vendor and product name) |
---|---|---|---|---|---|
1 | Software Development | Component Framework | Data Management | Database Connectivity | Active Data Objects |
2 | Software Development | Component Framework | Data Interchange | Data Exchange | Altova XMLSpy |
3 | Software Development | Service Platform and Infrastructure | Support Platforms | Platform Dependent | Altova XMLSpy |
4 | Ad Hoc | Component Framework | Presentation / Interface | Dynamic Server-Side Display | Cognos |
5 | Data Warehouse | Component Framework | Presentation / Interface | Dynamic Server-Side Display | Cognos |
6 | Ad Hoc | Component Framework | Data Management | Reporting and Analysis | Cognos |
7 | Data Warehouse | Component Framework | Data Management | Reporting and Analysis | Cognos |
8 | Software Development | Service Platform and Infrastructure | Software Engineering | Integrated Development Environment | Eclipse |
9 | Software Development | Component Framework | Business Logic | Platform Independent | Enterprise Java Beans |
10 | Data Exchange | Service Interface and Integration | Interoperability | Data Format / Classification | Extensible Markup Language (XML) 1.1 |
11 | Software Development | Service Interface and Integration | Interoperability | Data Format / Classification | Extensible Markup Language (XML) 1.1 |
12 | Software Development | Component Framework | Presentation / Interface | Static Display | Hyper Text Markup Language |
13 | Data Exchange | Service Platform and Infrastructure | Hardware / Infrastructure | Network Devices / Standards | Internet Message Access Protocol |
14 | Data Exchange | Service Access and Delivery | Service Transport | Supporting Network Services | Internet Message Access Protocol |
15 | Software Development | Component Framework | Business Logic | Platform Independent | Java |
16 | Software Development | Component Framework | Business Logic | Platform Independent | Java 2 Enterprise Edition |
17 | Software Development | Service Platform and Infrastructure | Support Platforms | Platform Independent | Java 2 Enterprise Edition |
18 | Software Development | Component Framework | Presentation / Interface | Dynamic Server-Side Display | Java Server Pages |
19 | Software Development | Component Framework | Business Logic | Platform Independent | JavaScript |
20 | Software Development | Service Interface and Integration | Interoperability | Data Types / Validation | JAXP |
21 | Software Development | Service Platform and Infrastructure | Software Engineering | Test Management | LoadRunner |
22 | Software Development | Service Platform and Infrastructure | Software Engineering | Test Management | Mercury Interactive Test Director |
23 | Software Development | Service Platform and Infrastructure | Support Platforms | Platform Dependent | Mindreef SOAP Scope |
24 | Software Development | Service Platform and Infrastructure | Software Engineering | Test Management | Mindreef SOAP Scope |
25 | Data Integration | Component Framework | Data Management | Database Connectivity | Object Linking and Embedding, Database |
26 | Software Development | Component Framework | Data Management | Database Connectivity | Object Linking and Embedding, Database |
27 | Standardized / Canned | Component Framework | Data Management | Database Connectivity | ODBC, JDBC, ADO, OLE/DB, DAO |
28 | Data Integration | Component Framework | Data Management | Database Connectivity | Open Database Connectivity |
29 | Software Development | Component Framework | Data Management | Database Connectivity | Open Database Connectivity |
30 | Ad Hoc | Service Platform and Infrastructure | Software Engineering | Modeling | Oracle XML Query Service |
31 | Software Development | Service Platform and Infrastructure | Software Engineering | Integrated Development Environment | PowerBuilder |
32 | Software Development | Service Platform and Infrastructure | Software Engineering | Software Configuration Management | Serena PVCS - Dimensions |
33 | Data Exchange | Component Framework | Data Interchange | Data Exchange | SOAP |
34 | Identification and Authentication | Service Access and Delivery | Service Requirements | Authentication / Single Sign-on | Sun Identity Manager |
35 | Access Control | Service Access and Delivery | Service Requirements | Authentication / Single Sign-on | Sun Identity Manager |
36 | Software Development | Component Framework | Presentation / Interface | Dynamic Server-Side Display | SWING |
37 | Software Development | Service Platform and Infrastructure | Software Engineering | Modeling | Unified Modeling Language |
38 | Business Rule Management | Service Interface and Integration | Interoperability | Data Types / Validation | XML, UML, EMF, JAXP, JDOM |
39 | Meta Data Management | Service Interface and Integration | Interoperability | Data Format / Classification | XML, ORACLE, Power Designer |
- 6. Will the application leverage existing components and/or applications across the Government (i.e., FirstGov, Pay.Gov, etc)?
- no
PART THREE
RISK

You should perform a risk assessment during the early planning and initial concept phase of the investment's life-cycle, develop a risk-adjusted life-cycle cost estimate and a plan to eliminate, mitigate or manage risk, and be actively managing risk throughout the investment's life-cycle. Answer the following questions to describe how you are managing investment risks.
- 1. Does the investment have a Risk Management Plan?
- yes
- 1.a. If yes, what is the date of the plan?
- 2007-07-07
- 1.b. Has the Risk Management Plan been significantly changed since last year's submission to OMB?
- no
COST & SCHEDULE
- 1. Was operational analysis conducted?
- yes
- 1.a. If yes, provide the date the analysis was completed.
- 2007-07-31
- 1.b. What were the results of your operational analysis?
- Some planned milestones actually started a month or two behind schedule, and actual costs decreased compared to planned costs. Other planned milestones' start/end dates were on schedule when compared to actual milestone start/end dates, and costs were also on target. New milestones were added to the investment and were approved by HHS. Two major audits were performed, one by the Senate Finance Committee (Grassley) and one by the IOM; all issues are being addressed in the 9th SOW. The issues were reviewed and approved by OMB for future funding to continue.
- 1.c. If no, please explain why it was not conducted and if there are any plans to conduct operational analysis in the future.
- Not Applicable.