This is the accessible text file for GAO report number GAO-08-259T 
entitled 'Information Technology: Census Bureau Needs to Improve Its 
Risk Management of Decennial Systems' which was released on December 
11, 2007. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Testimony: 

Testimony Before the Subcommittee on Information Policy, Census, and 
National Archives, Committee on Oversight and Government Reform, U.S. 
House of Representatives: 

United States Government Accountability Office: 

GAO: 

For Release on Delivery Expected at 2:00 p.m. EST: 
Tuesday, December 11, 2007: 

Information Technology: 

Census Bureau Needs to Improve Its Risk Management of Decennial 
Systems: 

Statement of David A. Powner: 
Director, Information Technology Management Issues: 

Mathew J. Scirè: 
Director, Strategic Issues: 

GAO-08-259T: 

GAO Highlights: 

Highlights of GAO-08-259T, a testimony before the Subcommittee on 
Information Policy, Census, and National Archives, Committee on 
Oversight and Government Reform, U.S. House of Representatives. 

Why GAO Did This Study: 

For Census 2010, automation and information technology (IT) are 
expected to play a critical role. The Census Bureau plans to spend 
about $3 billion on automation and technology that are to improve the 
accuracy and efficiency of census collection, processing, and 
dissemination. From February 2006 through June 2009, the Bureau is 
holding a “Dress Rehearsal” during which it plans to conduct 
operational testing that includes decennial systems acquisitions. 

In October 2007, GAO reported on its review of four key 2010 Census IT 
acquisitions to (1) determine the status and plans, including schedule 
and cost, and (2) assess whether the Bureau is adequately managing 
associated risks. This testimony summarizes GAO’s report on these key 
acquisitions and describes GAO’s preliminary observations on the 
performance of handheld mobile computing devices used during the Dress 
Rehearsal. 

What GAO Found: 

As of October 2007, three key systems acquisitions for the 2010 Census 
were in process, and a fourth contract had recently been awarded. The 
ongoing acquisitions showed mixed progress in meeting schedule and cost 
estimates. Two of the projects were not on schedule, and the fourth 
contract, originally scheduled for award in 2005, was not awarded until 
September 2007. In addition, one project had incurred cost overruns and 
increases to its projected life-cycle cost. As a result of the schedule 
changes, the full complement of systems and functionality that was 
originally planned will not be available for upcoming Dress Rehearsal 
operational testing. This limitation increases the importance of 
further system testing to ensure that the decennial systems work as 
intended.

The Bureau’s project teams for each of the four IT acquisitions had 
performed many practices associated with establishing sound and capable 
risk management processes, but critical weaknesses remained. Three 
project teams had developed a risk management strategy that identified 
the scope of the risk management effort. However, not all project teams 
had identified risks, established mitigation plans, or reported risks 
to executive-level officials. For example, one project team did not 
adequately identify risks associated with performance issues 
experienced by handheld mobile computing devices, even though Census 
field staff reported slow and inconsistent data transmissions with the 
devices during the spring Dress Rehearsal operations. The magnitude of 
these difficulties is not clear, and the Bureau has not fully specified 
how it plans to measure the performance of the devices. Until the 
project teams implement key risk management activities, they face an 
increased probability that decennial systems will not be delivered on 
schedule and within budget or perform as expected. 

Performance of Risk Management Activities by Key Census Acquisition 
Projects: 

Specific practices: Preparing for risk management: Determine risk 
sources and categories; 
Acquisition project 1: practice not implemented; 
Acquisition project 2: practice fully implemented; 
Acquisition project 3: practice fully implemented; 
Acquisition project 4: practice fully implemented. 

Specific practices: Preparing for risk management: Define risk 
parameters; 
Acquisition project 1: practice fully implemented; 
Acquisition project 2: practice fully implemented; 
Acquisition project 3: practice fully implemented; 
Acquisition project 4: practice fully implemented. 

Specific practices: Preparing for risk management: Establish and 
maintain a risk management strategy; 
Acquisition project 1: practice partially implemented; 
Acquisition project 2: practice fully implemented; 
Acquisition project 3: practice fully implemented; 
Acquisition project 4: practice fully implemented. 

Specific practices: Preparing for risk management: Identify and involve 
the relevant stakeholders; 
Acquisition project 1: practice partially implemented; 
Acquisition project 2: practice partially implemented; 
Acquisition project 3: practice fully implemented; 
Acquisition project 4: practice partially implemented. 

Specific practices: Identify and analyze risks: Identify and document 
the risks; 
Acquisition project 1: practice fully implemented; 
Acquisition project 2: practice partially implemented; 
Acquisition project 3: practice fully implemented; 
Acquisition project 4: practice partially implemented. 

Specific practices: Identify and analyze risks: Evaluate, categorize, 
and prioritize risks; 
Acquisition project 1: practice partially implemented; 
Acquisition project 2: practice fully implemented; 
Acquisition project 3: practice fully implemented; 
Acquisition project 4: practice fully implemented. 

Specific practices: Mitigate risks: Develop risk mitigation plans; 
Acquisition project 1: practice partially implemented; 
Acquisition project 2: practice partially implemented; 
Acquisition project 3: practice fully implemented; 
Acquisition project 4: practice not implemented. 

Specific practices: Mitigate risks: Monitor status and implement risk 
mitigation plans; 
Acquisition project 1: practice partially implemented; 
Acquisition project 2: practice partially implemented; 
Acquisition project 3: practice fully implemented; 
Acquisition project 4: practice partially implemented. 

Specific practices: Executive oversight: Review status with executive-
level management; 
Acquisition project 1: practice not implemented; 
Acquisition project 2: practice not implemented; 
Acquisition project 3: practice fully implemented; 
Acquisition project 4: practice fully implemented. 

Source: GAO analysis of Census project data against industry standards. 

[End of table] 

What GAO Recommends: 

In its report, GAO made recommendations that the Bureau strengthen its 
systems testing and risk management activities, including risk 
identification and oversight. The Bureau agreed to examine additional 
ways to manage risks, but disagreed with the view that a full 
complement of systems would not be tested, stating it planned to do so 
during the Dress Rehearsal or later; however, the test plans have not 
been finalized and it remains unclear whether this testing will be 
done. 

To view the full product, including the scope and methodology, click on 
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-259T]. For more 
information, contact 
David A. Powner at (202) 512-9286 or pownerd@gao.gov.

[End of section] 

Mr. Chairman and Members of the Subcommittee: 

Thank you for the opportunity to participate in today’s hearing on the 
2010 Decennial Census Information Technology (IT) acquisitions that are 
an integral part of the reengineered census. As you know, the decennial 
census is mandated by the U.S. Constitution and provides data that are 
vital to the nation. These data are used to reapportion the seats of 
the U.S. House of Representatives, realign the boundaries of the 
legislative districts of each state, allocate billions of dollars in 
federal financial assistance, and provide a social, demographic, and 
economic profile of the nation’s people to guide policy decisions at 
each level of government. 

Carrying out the census is the responsibility of the Department of 
Commerce’s Census Bureau, which is now preparing for the 2010 Census. 
The Bureau is required to count the population on April 1, 2010, and 
the Secretary of Commerce is required to report to the President on the 
tabulation of total population by state within 9 months of that date. 
[Footnote 1] 

The Bureau plans to rely on automation and technology to improve the 
coverage, accuracy, and efficiency of the 2010 Census, and has awarded 
four key IT contracts to that end. It is also holding what it refers to 
as a Dress Rehearsal, from February 2006 through June 2009, a period 
centering around a mock Census Day on April 1, 2008. [Footnote 2] 
Planned Dress Rehearsal activities include operational testing of the 
2010 Census systems in a census-like environment. The Bureau estimates 
that its IT acquisitions will absorb about $3 billion of the total 
$11.5 billion cost of the entire census. 

As requested, our testimony today will summarize our report on the four 
key IT acquisitions. In the report, we (1) determined the status and 
plans, including schedule and costs, for four key IT acquisitions; and 
(2) assessed whether the Bureau is adequately managing the risks facing 
these key system acquisitions. [Footnote 3] The report contains a 
detailed overview of the scope and methodology we used. As you also 
requested, our testimony today describes GAO’s preliminary observations 
on the performance of handheld mobile computing devices used during 
address canvassing activities in the Dress Rehearsal. [Footnote 4] The 
preliminary observations presented in this testimony are based on field 
work we have conducted at the two Dress Rehearsal sites (Stockton, CA, 
and Fayetteville, NC), as well as a review of Bureau documentation of 
its own observations of the Dress Rehearsal. The work on which this 
testimony is based was performed in accordance with generally accepted 
government auditing standards. 

Results in Brief: 

As of October 2007, three key systems acquisitions for the 2010 Census 
were in process, and a fourth contract had recently been awarded: 

* In one project, the Bureau is modernizing the database that provides 
address lists, maps, and other geographic support services for the 
census. This project is on schedule to complete improvements by the end 
of fiscal year 2008 and is meeting cost estimates. 

* In a second project, the Bureau is acquiring systems, equipment, and 
infrastructure for field staff to use in collecting census data. 
Deliverables provided to date include handheld mobile computing devices 
and installation of key support infrastructure. However, the schedule 
for this acquisition has been revised, resulting in delays in system 
development and testing of interfaces. Also, the life-cycle cost 
estimates for this program have increased, and we projected an $18 
million cost overrun by December 2008. According to the contractor, the 
overrun is due primarily to an increase in the number of system 
requirements. 

* In a third project, the Bureau is acquiring a system for integrating 
census responses from paper forms, telephone interviews, and field 
operations. The software 
development and testing are on schedule to provide (by December 2007) 
an initial system to process the major census forms during the Dress 
Rehearsal activities. However, the system development schedule was 
revised in October 2005, which is delaying some functionality. For 
example, a telephone-assistance system that was originally intended to 
be completed by fiscal year 2008 has been delayed. This acquisition is 
meeting current cost estimates. 

* Finally, the award of a contract to replace the current system used 
to tabulate and disseminate census data was delayed by about a year 
(the contract was ultimately awarded in September 2007). As a result of 
the 1-year delay, the Dress Rehearsal activities will use the current 
tabulation and dissemination system rather than a modernized version. 

The delays mean that the Dress Rehearsal operational testing will take 
place without the full complement of systems and functionality that was 
originally planned. As a result, further system testing will be 
necessary to ensure that the decennial systems work as intended. 
However, as of October 2007, Bureau officials had not finalized their 
plans for testing all the systems, and it is not clear whether these 
plans would include testing to address all interrelated systems and 
functionality, such as end-to-end testing. [Footnote 5] According to 
officials, these plans will not be finalized until February 2008. 
Without sufficient testing of all systems and their functionality, the 
Bureau increases the risk that costs will increase further, that 
decennial systems will not perform as expected, or both. 

As of October 2007, the four project teams managing the acquisitions 
had performed many practices associated with establishing sound and 
capable risk management processes. However, critical weaknesses 
remained. Specifically, three of the four project teams had developed 
risk management strategies identifying the scope of their risk 
management efforts; however, three project teams had weaknesses in 
identifying risks, establishing mitigation plans that identified 
planned actions and milestones, and reporting risk status to executive-
level officials. For example, one project team did not adequately 
identify risks associated with performance issues experienced by 
handheld mobile computing devices. Further, in May and June 2007, both 
we and the Census Bureau observed the use of the handheld mobile 
computing devices in census-like conditions, and these observations 
revealed a number of performance issues with the devices, such as slow 
and inconsistent data processing. The magnitude of these performance 
issues remains unclear. The Field Data Collection Automation (FDCA) 
contract anticipates the Bureau’s need for data on the performance of 
the handheld mobile computing device; however, the Bureau has not fully 
specified the performance data it will use for the devices. As we have 
previously reported, a root cause of weaknesses in completing key risk 
management activities is the lack of policies for managing major 
acquisitions at the Bureau. [Footnote 6] Until the project teams 
implement key risk management activities, they face an increased 
probability that decennial systems will not be delivered on schedule 
and within budget or perform as expected. 

Because the entire complement of systems will not be available for 
Dress Rehearsal activities as originally planned, we recommended that 
the Census Bureau plan for and perform end-to-end testing so that all 
systems are tested in a census-like environment. Further, to help 
ensure that the three key acquisitions for the 2010 Census operate as 
intended, we recommended that the project teams strengthen risk 
management activities, including those associated with risk 
identification, mitigation, and oversight. 

In written comments on a draft of our report, the department agreed to 
examine additional ways to manage risks and prepare a formal action 
plan in response to our final report. However, the department said it 
had a major disagreement with our findings with regard to not 
conducting operational testing on a full complement of the key 
decennial systems, stating it plans to test all critical systems and 
interfaces during the Dress Rehearsal or later. Nonetheless, the 
Bureau’s test plans have not been finalized, and it remains unclear 
whether testing will address all interrelated systems and functionality 
in a census-like environment, as would be provided by end-to-end 
testing. Consistent with our recommendation, following up with 
documented test plans to do end-to-end testing will help ensure that 
decennial systems will work as intended. 

Background: 

Conducting the decennial census is a major undertaking involving many 
interrelated steps, including: 

* identifying and correcting addresses for all known living quarters in 
the United States (known as “address canvassing”); 
* sending questionnaires to housing units; 
* following up with nonrespondents through personal interviews; 
* identifying people with nontraditional living arrangements; 
* managing a vast workforce responsible for follow-up activities; 
* collecting census data by means of questionnaires, calls, and 
personal interviews; 
* tabulating and summarizing census data; and: 
* disseminating census analytical results to the public. 

Role of IT in the Decennial Census: 

The Bureau estimates that it will spend about $3 billion on automation 
and IT for the 2010 Census, including four major systems acquisitions 
that are expected to play a critical role in improving coverage, 
accuracy, and efficiency. Figure 1 shows the key systems and interfaces 
supporting the 2010 Census, and highlights the four major IT systems we 
discuss today. As the figure shows, these four systems are to play 
important roles with regard to different aspects of the process. 

Figure 1: Key 2010 Census Systems and Interfaces: 

[See PDF for image] 

This figure is a schematic illustration of Key 2010 Census Systems and 
Interfaces. Systems and interfaces are depicted in three areas, with 
relationships indicated by connections through either one-way or two-
way arrows. Four systems are shaded to indicate that the system is 
discussed in the report. The following data is depicted: 

Establish where to count: 
* Master Address File/Topologically Integrated Geographic Encoding and 
Referencing [system discussed in this report]; 
* Universe Control and Management; 
* Printing. 

Collect respondent information: 
* Decennial Response Integration System [system discussed in this 
report]; 
* Field Data Collection Automation; 
* Postal; 
* Respondent; 
* National Processing Center; 
* Decennial Applicant Personnel and Payroll System; 
* Census Evaluation and Experimentation (overlaps into Collect 
respondent information section). 

Provide results: 
* Data Access and Dissemination System [system discussed in this 
report]; 
* Response Processing System; 
* Archive System; 
* National Archives and Records Administration; 
* Census Evaluation and Experimentation (overlaps into Provide Results 
section). 

Relationships as indicated by connecting arrows (input and output): 

Master Address File/Topologically Integrated Geographic Encoding and 
Referencing [system discussed in this report]: 
* Output arrow to Archive System; 
* Output arrow to Data Access and Dissemination System [system 
discussed in this report]; 
* Output arrow to Decennial Response Integration System [system 
discussed in this report]; 
* Output and Input arrow with Census Evaluation and Experimentation; 
* Output and Input arrow with Response Processing System; 
* Output and Input arrow with Universe Control and Management; 
* Output and Input arrow with Field Data Collection Automation [system 
discussed in this report]; 
* Input arrow from National Processing Center. 

Universe Control and Management: 
* Output and Input arrow with Master Address File/Topologically 
Integrated Geographic Encoding and Referencing [system discussed in 
this report]; 
* Output and Input arrow with Field Data Collection Automation [system 
discussed in this report]; 
* Output and Input arrow with Response Processing System; 
* Output arrow to Printing; 
* Output and Input arrow to Decennial Response Integration System 
[system discussed in this report]; 
* Output arrow to Census Evaluation and Experimentation; 

Printing: 
* Output arrow to Postal; 
* Input arrow from Decennial Response Integration System [system 
discussed in this report]; 

Decennial Response Integration System [system discussed in this 
report]: 
* Output arrow to Printing; 
* Output and Input arrow with Universe Control and Management; 
* Input arrow from Postal; 
* Input arrow from Master Address File/Topologically Integrated 
Geographic Encoding and Referencing [system discussed in this report]; 
* Output and Input arrow with Field Data Collection Automation [system 
discussed in this report]; 
* Output arrow to Response Processing System; 
* Output arrow to Archive System. 

Field Data Collection Automation [system discussed in this report]: 
* Output and Input arrow with Universe Control and Management; 
* Output and Input arrow with Master Address File/Topologically 
Integrated Geographic Encoding and Referencing [system discussed in 
this report]; 
* Output and Input arrow with Decennial Response Integration System 
[system discussed in this report]; 
* Input arrow from Respondent; 
* Input arrow from Decennial Applicant Personnel and Payroll System; 
* Output arrow to Census Evaluation and Experimentation (overlaps into 
Provide Results section); 
* Output arrow to National Processing Center. 

Decennial Applicant Personnel and Payroll System: 
* Output arrow to Field Data Collection Automation [system discussed in 
this report]. 

Postal: 
* Output arrow to Decennial Response Integration System [system 
discussed in this report]; 
* Input arrow from Printing; 
* Output and Input arrow with Respondent. 

Respondent: 
* Output and Input arrow with Postal; 
* Output arrow to Field Data Collection Automation [system discussed in 
this report]. 

National Processing Center: 
* Output arrow to Master Address File/Topologically Integrated 
Geographic Encoding and Referencing [system discussed in this report]; 
* Output arrow to Census Evaluation and Experimentation; 
* Input arrow from Field Data Collection Automation [system discussed 
in this report]. 

Census Evaluation and Experimentation: 
* Output and Input arrow with Master Address File/Topologically 
Integrated Geographic Encoding and Referencing [system discussed in 
this report]; 
* Input arrow from Universe Control and Management; 
* Input arrow from Field Data Collection Automation [system discussed 
in this report]; 
* Input arrow from National Processing Center; 
* Output and Input arrow with Response Processing System. 

Data Access and Dissemination System [system discussed in this report]: 
* Input arrow from Master Address File/Topologically Integrated 
Geographic Encoding and Referencing [system discussed in this report]; 
* Input arrow from Response Processing System. 

Archive System: 
* Input arrow from Master Address File/Topologically Integrated 
Geographic Encoding and Referencing [system discussed in this report]; 
* Input arrow from Decennial Response Integration System [system 
discussed in this report]; 
* Input arrow from Response Processing System; 
* Output arrow to National Archives and Records Administration. 

Response Processing System: 
* Output and Input arrow with Master Address File/Topologically 
Integrated Geographic Encoding and Referencing [system discussed in 
this report]; 
* Output and Input arrow with Universe Control and Management; 
* Output and Input arrow with Census Evaluation and Experimentation; 
* Input arrow from Decennial Response Integration System [system 
discussed in this report]; 
* Output arrow to Archive System; 
* Output arrow to Data Access and Dissemination System [system 
discussed in this report]. 

National Archives and Records Administration: 
* Input arrow from Archive System. 

Source: U.S. Census Bureau. 

[End of figure] 

To establish where to count (as shown in the top section of fig. 1), 
the Bureau will depend heavily on a database that provides address 
lists, maps, and other geographic support services. The Bureau’s 
address list, known as the Master Address File (MAF), is associated 
with a geographic information system containing street maps known as 
the Topologically Integrated Geographic Encoding and Referencing 
(TIGER®) database. [Footnote 7] 

The MAF/TIGER database is the object of the first major IT 
acquisition—the MAF/TIGER Accuracy Improvement Project (MTAIP). 

To collect respondent information (a process depicted in the middle 
section of fig. 1), the Bureau is pursuing two initiatives. First, the 
Field Data Collection Automation (FDCA) program is expected to provide 
automation support for field data collection operations as well as 
reduce costs and improve data quality and operational efficiency. This 
acquisition includes the systems, equipment, and infrastructure that 
field staff will use to collect census data, such as handheld mobile 
computing devices. [Footnote 8] Second, the Decennial Response 
Integration System (DRIS) is to provide a system for collecting and 
integrating census responses from all sources, including forms, 
telephone interviews, and handheld mobile computing devices in the 
field. DRIS is expected to improve accuracy and timeliness by 
standardizing the response data and providing it to other Bureau 
systems for analysis and processing. 

To provide results (see the bottom section of fig. 1), the Data Access 
and Dissemination System II (DADS II) acquisition is to replace legacy 
systems for tabulating and publicly disseminating data. The DADS II 
program is expected to provide comprehensive support to the Census 2000 
legacy DADS systems. 
Replacement of the legacy systems is expected to: 

* maximize the efficiency, timeliness, and accuracy of tabulation and 
dissemination products and services; 
* minimize the cost of tabulation and dissemination; and: 
* increase user satisfaction with related services. 

Table 1 provides a brief overview of the four acquisitions. 

Table 1: Four Key IT Acquisitions Supporting Census 2010: 

IT acquisition: MAF/TIGER Accuracy Improvement Project (MTAIP); 
Purpose: Modernize the system that provides the address list, maps, and 
other geographic support services for the Census and other Bureau 
surveys. 

IT acquisition: Field Data Collection Automation (FDCA); 
Purpose: Provide automated resources for supporting field data 
collection, including handheld mobile computing devices for collecting 
data, such as address and map data, in the field. 

IT acquisition: Decennial Response Integration System (DRIS); 
Purpose: Provide a solution for data capture and respondent assistance. 

IT acquisition: Data Access and Dissemination System II (DADS II); 
Purpose: Develop a replacement for the DADS legacy tabulation and 
dissemination systems. 

Source: GAO analysis of Census Bureau data. 

[End of table] 

Responsibility for these acquisitions lies with the Bureau’s Decennial 
Management Division and the Geography Division. Each of the four 
acquisitions is managed by an individual project team staffed by Bureau 
personnel. Additional information on the contracts for these four 
systems is provided in appendix I of the report. 

In preparation for the 2010 Census, the Bureau plans to conduct a 
series of tests of its new and existing operations and systems in 
different environments, as well as what it refers to as the Dress 
Rehearsal. During the Dress Rehearsal period, which runs from February 
2006 through June 2009, the Bureau plans to conduct development and 
testing of systems, run a mock Census Day, and prepare for Census 2010, 
which will include opening offices and hiring staff. These Dress 
Rehearsal activities are to provide an operational test of the 
available system functionalities in a census-like environment, as well 
as other operational and procedural activities. 

Decennial IT Acquisitions Were at Various Stages of Development and 
Showed Mixed Progress against Schedule and Cost Baselines: 

As of October 2007, three key decennial systems acquisitions were in 
process and a fourth contract had recently been awarded. The ongoing 
acquisitions showed mixed progress in providing deliverables while 
adhering to planned schedules and cost estimates. Two of the ongoing 
projects (FDCA and DRIS) had experienced schedule delays, and the date 
for awarding the fourth contract was postponed several times. In addition, 
we estimated that one of the ongoing projects (FDCA) will incur about 
$18 million in cost overruns. In response to schedule delays as well as 
other factors, including cost, the Bureau made schedule adjustments and 
planned to delay certain system functionality. As a result, Dress 
Rehearsal operational testing will not address the full complement of 
systems and functionality that was originally planned, and the Bureau 
has not yet finalized its plans for further system tests. Delaying 
functionality increases the importance of operational testing after the 
Dress Rehearsal to ensure that the decennial systems work as intended. 

MTAIP Was Completing Improvements on Schedule and at Estimated Cost: 

MTAIP is a project to improve the accuracy of the MAF/TIGER database, 
which contains information on street locations, housing units, rivers, 
railroads, and other geographic features. We reported that MTAIP was on 
schedule to complete improvements by the end of fiscal year 2008 and 
was meeting cost estimates. 

As of October 2007, the acquisition was in the second and final phase 
of its life cycle. In Phase II, which began in January 2003 and is 
ongoing, the contractor is developing improved maps for all 3,037 
counties in the United States. We reported that the contractor had 
delivered more than 75 percent of these maps, which are due by 
September 2008. Maintenance under the contract is to begin in fiscal 
year 2008, and contract closeout activities are scheduled for fiscal 
year 2009. 

FDCA Had Provided Deliverables but Had Delayed Functionality and Was 
Experiencing Cost Increases: 

FDCA is to provide the systems, equipment, and infrastructure that 
field staff will use to collect census data. At the peak of the 2010 
Census, about 4,000 field operations supervisors, 40,000 crew leaders, 
500,000 enumerators and address listers, and several thousand office 
employees are expected to use or access FDCA. 

As of October 2007, the contractor was in the process of developing and 
testing FDCA software for the Dress Rehearsal Census Day, and had 
delivered 1,388 handheld mobile computing devices to be used in address 
canvassing for the Dress Rehearsal. Also, key FDCA support 
infrastructure had been installed, including the Security Operation 
Center. In future contract phases, the project will continue 
development, deploy systems and hardware, support census operations, 
and perform operational and contract closeout activities. 

However, the Bureau revised FDCA’s original schedule and delayed or 
eliminated some of its key functionality from the Dress Rehearsal, 
including the automated software distribution system. According to the 
Bureau, it revised the schedule because it realized it had 
underestimated the costs for the early stages of the contract, and that 
it could not meet the contractor’s estimated level of first-year 
funding because the fiscal year 2006 budget was already in place. 

According to the Bureau, this initial underestimate led to schedule 
changes and overall cost increases, but FDCA was meeting all planned 
milestones on the revised schedule. For example, all sites for Regional 
Census Centers and Puerto Rico Area Offices had been identified, and 
the Bureau reported that it was on schedule to open all these offices 
in January 2008. 

The project life-cycle costs had increased. At contract award in March 
2006, the total cost of FDCA was estimated not to exceed $596 million. 
By September 2006, the estimate had grown to about $624 million, and in 
May 2007 the life-cycle cost rose by a further $23 million because of 
increasing system requirements, resulting in an estimated life-cycle 
cost of about $647 million. Table 2 shows the life-cycle cost estimates 
for FDCA as of October 2007. 

Table 2: FDCA Life-Cycle Cost Estimates: 

Execution period: Baseline planning period; 
Start date: March 31, 2006; 
End date: June 30, 2006; 
Cost estimates (in millions) September 2006: $11; 
Cost estimates (in millions) May 2007: $11. 

Execution period: Execution Period 1; 
Start date: July 1, 2006; 
End date: December 31, 2008; 
Cost estimates (in millions) September 2006: $200; 
Cost estimates (in millions) May 2007: $225. 

Execution period: Execution Period 2; 
Start date: January 1, 2009; 
End date: September 30, 2011; 
Cost estimates (in millions) September 2006: $319; 
Cost estimates (in millions) May 2007: $318. 

Execution period: Execution Period 3; 
Start date: August 1, 2010; 
End date: End of contract; 
Cost estimates (in millions) September 2006: $10; 
Cost estimates (in millions) May 2007: $10. 

Execution period: Leased equipment; 
Start date: N/A; 
End date: N/A; 
Cost estimates (in millions) September 2006: $12; 
Cost estimates (in millions) May 2007: $12. 

Execution period: Management reserve; 
Start date: N/A; 
End date: N/A; 
Cost estimates (in millions) September 2006: $7; 
Cost estimates (in millions) May 2007: $5. 

Execution period: Award fee; 
Start date: N/A; 
End date: N/A; 
Cost estimates (in millions) September 2006: $65; 
Cost estimates (in millions) May 2007: $65. 

Execution period: Total; 
Cost estimates (in millions) September 2006: $624; 
Cost estimates (in millions) May 2007: $647. 

Source: GAO analysis of Census Bureau data. 

Note: Total may not add due to rounding. 

[End of table] 

In addition, FDCA had already experienced $6 million in cost overruns, 
and both our analysis and the contractor’s analysis expected FDCA to 
experience additional cost overruns. Based on our analysis of cost 
performance reports (from July 2006 to May 2007), we projected that the 
FDCA project will experience further cost overruns by December 2008. 
The FDCA cost overrun was estimated at between $15 million and $19 
million, with the most likely overrun being about $18 million. The 
contractor, in contrast, estimated about a $6 million overrun by 
December 2008. 
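
For illustration, the sketch below shows how such an overrun projection 
can be derived from the kind of earned value data contained in cost 
performance reports. This is not necessarily the method used in our 
analysis; the earned value and actual cost figures are hypothetical, 
chosen so that the projected overrun matches the roughly $18 million 
estimate above, and only the $225 million Execution Period 1 estimate 
comes from table 2. 

def estimate_at_completion(bac, ev, ac): 
    # Project total cost by scaling the budget at completion (BAC) 
    # by the cost performance index, CPI = earned value / actual cost. 
    cpi = ev / ac 
    return bac / cpi 

bac = 225.0  # Execution Period 1 estimate, in millions (table 2) 
ev = 100.0   # hypothetical earned value to date, in millions 
ac = 108.0   # hypothetical actual cost to date, in millions 

eac = estimate_at_completion(bac, ev, ac) 
print(f"Projected cost at completion: ${eac:.0f} million; " 
      f"projected overrun: ${eac - bac:.0f} million") 
# With these hypothetical inputs, CPI is about 0.93, yielding a 
# projected overrun of about $18 million. 

[End of example] 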

According to the contractor, the major cause of projected cost overruns 
was the system requirements definition process. For example, in 
December 2006, the contractor noted a significant increase in the 
requirements for the Dress Rehearsal Paper Based Operations in 
Execution Period 1. According to the cost performance reports, this 
increase has meant that more work must be performed and more staff 
assigned to meet the Dress Rehearsal schedule. 

The Bureau agreed that cost increases occurred in some cases because of 
the addition of new requirements, most of which related to the security 
of IT systems, but added that in other cases, increases resulted from 
the contractor’s conversion of high-level functional requirements into 
more detailed specifications. However, developing detailed requirements 
from high-level functional requirements does not inevitably lead to 
cost increases if the functional requirements are well defined at the 
outset. 

The FDCA schedule changes have increased the likelihood that the 
systems testing at the Dress Rehearsal will not be as comprehensive as 
planned. The inability to perform comprehensive operational testing of 
all interrelated systems increases the risk that further cost overruns 
will occur and that decennial systems will experience performance 
shortfalls. 

After a Schedule Revision, DRIS Was Delivering Reduced Functionality at 
Projected Cost: 

DRIS is to provide a system for collecting and integrating census 
responses, standardizing the response data, and providing it to other 
systems for analysis and processing. The DRIS functionality is critical 
for providing assistance to the public via telephone and for monitoring 
the quality and status of data capture operations. 

Although DRIS was on schedule to meet its December 2007 milestone as of 
October 2007, the Bureau had revised the original DRIS schedule after the 
contract was awarded in October 2005. Under the revised schedule, the 
Bureau delayed or eliminated some functionality that was expected to be 
ready for the Dress Rehearsal mock Census Day. 

According to Bureau officials, they delayed the schedule and eliminated 
functionality for DRIS when they realized they had underestimated the 
fiscal years 2006 through 2008 costs for development. As shown in table 
3, the government’s funding estimates for DRIS Phase I were 
significantly lower than the contractor’s. 

Table 3: DRIS Cost Estimates for Phase I (as of March 2006): 

Fiscal year: 2006; 
Cost estimates (in millions) Contractor: $18.6; 
Cost estimates (in millions) Government: $11.2. 

Fiscal year: 2007; 
Cost estimates (in millions) Contractor: $53.3; 
Cost estimates (in millions) Government: $23.8. 

Fiscal year: 2008; 
Cost estimates (in millions) Contractor: $48.7; 
Cost estimates (in millions) Government: $31.5. 

Fiscal year: Total; 
Cost estimates (in millions) Contractor: $120.6; 
Cost estimates (in millions) Government: $66.5. 

Source: GAO analysis of Census Bureau data. 

[End of table] 

Originally, the DRIS solution was to include paper, telephone, 
Internet, and field data collection processing; selection of data 
capture sites; and preparation and processing of 2010 Census forms. 
However, the Bureau reduced the scope of the solution by eliminating 
the Internet functionality. In addition, the Bureau has stated that it 
will not have a robust telephone questionnaire assistance system in 
place for the Dress Rehearsal. As of October 2007, the Bureau was also 
delaying selecting sites for data capture centers, preparing data 
capture facilities, and recruiting and hiring data capture staff. 

Although Bureau officials told us that the revisions to the schedule 
should not affect meeting milestones for the 2010 Census, the delays 
mean that more systems development and testing will need to be 
accomplished later. Given the immovable deadline of the decennial 
census, the Bureau is at risk of reducing functionality or increasing 
costs to meet its schedule. 

The DRIS project was not experiencing cost overruns, and our analysis 
of cost performance reports from April 2006 to May 2007 projected no 
cost overruns by December 2008. As of May 2007, the DRIS contract value 
had not increased. 

DADS II Contract Had Recently Been Awarded after a Delay: 

The DADS II acquisition is to replace the legacy DADS systems, which 
tabulate and publicly disseminate data from the decennial census and 
other Bureau surveys. [Footnote 9] The DADS II contractor is also 
expected to provide comprehensive support to the Census 2000 legacy 
DADS systems. 

The DADS II contract award date had been delayed multiple times. The 
award date was originally planned for the fourth quarter of 2005, but 
the date changed to August 2006. On March 8, 2006, the Bureau announced 
it would delay the award of the DADS II contract from August to October 
2006 to gain a clearer sense of budget priorities before initiating the 
request for proposal process. The Bureau then delayed the contract 
award again by about another year. In January 2007, the Bureau released 
the DADS II request for proposal, and the contract was finally awarded 
in September 2007. Because of these delays, DADS II will not be 
developed in time for the Dress Rehearsal. Instead, the Bureau will use 
the legacy DADS system for tabulation during the Dress Rehearsal. 
Nonetheless, the Bureau plans to have the DADS II system available for 
the 2010 Census. 

The Bureau Was Making Progress in Risk Management Activities but 
Critical Weaknesses Remained: 

The project teams varied in the extent to which they followed 
disciplined risk management practices. For example, three of the four 
project teams had developed strategies to identify the scope of the 
risk management effort. However, three project teams had weaknesses in 
identifying risks, establishing adequate mitigation plans, and 
reporting risk status to executive-level officials. These weaknesses in 
completing key risk management activities can be attributed in part to 
the absence of Bureau policies for managing major acquisitions, as we 
described in an earlier report. [Footnote 10] Without effective risk 
management practices, the likelihood of project success is decreased. 

According to the Software Engineering Institute (SEI), the purpose of 
risk management is to identify potential problems before they occur. 
When problems are identified, risk-handling activities can be planned 
and invoked as needed across the life of a project in order to mitigate 
adverse impacts on objectives. Effective risk management involves early 
and aggressive risk identification through the collaboration and 
involvement of relevant stakeholders. Based on SEI’s Capability 
Maturity Model® Integration (CMMI®), risk management activities can be 
divided into four key areas: 

* preparing for risk management; 
* identifying and analyzing risks; 
* mitigating risks; and: 
* executive oversight. 

The discipline of risk management is important to help ensure that 
projects are delivered on time, within budget, and with the promised 
functionality. It is especially important for the 2010 Census, given 
the immovable deadline. 

Project Teams Had Usually Established Risk Preparation Activities, but 
Some Improvements in These Activities Were Needed: 

Risk preparation involves establishing and maintaining a strategy for 
identifying, analyzing, and mitigating risks. The risk management 
strategy addresses the specific actions and management approach used to 
perform and control the risk management program. It also includes 
identifying and involving relevant stakeholders in the risk management 
process. Table 4 shows the status of the four project teams’ 
implementation of key risk preparation activities as of October 2007. 
[Footnote 11] 

Table 4: Risk Management Preparation Activities Completed for the Key 
2010 Census Systems: 

Specific practices: Determine risk sources and categories; 
MTAIP: practice not implemented; 
FDCA: practice fully implemented; 
DRIS: practice fully implemented; 
DADS: practice fully implemented. 

Specific practices: Define parameters used to analyze and categorize 
risks and parameters used to control risk management efforts; 
MTAIP: practice fully implemented; 
FDCA: practice fully implemented; 
DRIS: practice fully implemented; 
DADS: practice fully implemented. 

Specific practices: Establish and maintain the strategy to be used for 
risk management; 
MTAIP: practice partially implemented; 
FDCA: practice fully implemented; 
DRIS: practice fully implemented; 
DADS: practice fully implemented. 

Specific practices: Identify and involve the relevant stakeholders of 
the risk management process as planned; 
MTAIP: practice partially implemented; 
FDCA: practice partially implemented; 
DRIS: practice fully implemented; 
DADS: practice partially implemented. 

Source: GAO analysis of project data. 

[End of table] 

As the table shows, three project teams had established most of the 
risk management preparation activities. However, the MTAIP project team 
had implemented the fewest practices. The team did not adequately 
determine risk sources and categories or adequately develop a strategy 
for risk management. As a result, the project’s risk management 
strategy was not comprehensive and did not fully address the scope of 
the risk management effort, including discussing techniques for risk 
mitigation and defining adequate risk sources and categories. In 
addition, three project teams (MTAIP, FDCA, and DADS II) had weaknesses 
regarding stakeholder involvement. The three teams did not provide 
sufficient evidence that the relevant stakeholders were involved in 
risk identification, analysis, and mitigation activities; reviewing the 
risk management strategy and risk mitigation plans; or communicating 
and reporting risk management status. 

These weaknesses can be attributed in part to the absence of Bureau 
policies for managing major acquisitions, as we described in our 
earlier reports. [Footnote 12] Without adequate preparation for risk 
management, including establishing an effective risk management 
strategy and identifying and involving relevant stakeholders, project 
teams cannot properly control the risk management process. 

The Project Teams Had Identified and Analyzed Risks but Not All Key 
Risks Were Identified: 

Risks must be identified and described in an understandable way before 
they can be analyzed and managed properly. This includes identifying 
risks from both internal and external sources and evaluating each risk 
to determine its likelihood and consequences. Table 5 shows the status 
of the four project teams’ implementation of key risk identification 
and evaluation activities at the time of our October 2007 report. 

Table 5: Risk Identification and Evaluation Activities Completed for 
the Key 2010 Census Systems: 

Specific practices: Identify and document the risks; 
MTAIP: practice fully implemented;
FDCA: practice partially implemented; 
DRIS: practice fully implemented; 
DADS: practice partially implemented. 

Specific practices: Evaluate and categorize each identified risk using 
the defined risk categories and parameters, and determine its relative 
priority; 
MTAIP: practice partially implemented; 
FDCA: practice fully implemented; 
DRIS: practice fully implemented; 
DADS: practice fully implemented. 

Source: GAO analysis of project data. 

[End of table] 

As of July 2007, the MTAIP and DRIS project teams were adequately 
identifying and documenting risks, including system interface risks. 
For example, the MTAIP project team identified significant risks 
regarding potential changes in funding and the turnover of contractor 
personnel as the program nears maturity, and the DRIS project team 
identified significant risks regarding new system security regulations, 
changes or increases to Phase II baseline requirements, and new 
interfaces after Dress Rehearsal. 

In contrast, the FDCA project team had not identified or documented any 
significant risks related to the handheld computers that will be used 
in the 2010 Census, despite problems arising during the Dress 
Rehearsal. The computers are designed to automate operations for field 
staff and eliminate the need to print millions of paper questionnaires 
and maps used by temporary field staff to conduct address canvassing 
and nonresponse follow-up. Automating operations may allow the Bureau 
to reduce the cost of operations; thus, it is critical that the risks 
surrounding the use of the handheld computers be closely monitored and 
effectively managed to ensure their success. However, the Bureau has 
not identified or documented risks associated with a variety of 
handheld computer performance problems that we identified through 
field work conducted at your request. Specifically, we found that 
during Dress Rehearsal activities between May 2007 and June 2007, as 
the Bureau tested a prototype of the handheld computers, field staff 
experienced multiple problems. For example, the field staff told us 
that they experienced slow and inconsistent data transmissions from the 
handheld computers to the central data processing center. The field 
staff reported the device was slow to process addresses that were part 
of a large assignment area. Bureau staff reported similar problems 
with the handheld computers in observation reports, help desk calls, 
and debriefing reports. In addition, our own analysis of Bureau 
documentation revealed problems with the handheld computers: 

* Bureau observation reports revealed that the most frequently observed 
problems involved slow processing of addresses, large assignment 
areas, and transmission. 

* The help desk call log revealed that field staff most frequently 
reported issues with transmission, the device freezing, map spotting, 
and assignment areas. 

* Debriefing reports illustrated the impact of the handheld mobile 
computing problems on address canvassing. For example, one participant 
commented that the field staff struggled to find solutions to problems 
and wasted precious time in replacing the devices. 

* A time-and-motion study conducted by the Census Bureau indicated that 
field staff reported significant downtime in two test locations—about 
23 percent in one location and about 27 percent in another location. 
The study, which is a draft that is subject to change, also described 
occurrences of failed transmissions and field staff attempts to resolve 
transmission problems. 

Collectively, the observation reports, help desk calls, debriefing 
reports, and time-and-motion study raised serious questions about the 
performance of the handheld computers during the address canvassing 
operation. According to the Bureau, the contractor has used these 
indicators to identify and address underlying problems during the Dress 
Rehearsal. Still, the magnitude of handheld computer performance 
issues throughout the Dress Rehearsal remains unclear. For example, the 
Bureau received analyses from the contractor on average transmission 
times. However, the contractor has not provided analyses that show the 
full range of transmission times, nor how those times may have changed 
throughout the entire operation. 

In addition, the Bureau has not fully specified how it will measure 
performance of the handheld computers, even though the FDCA contract 
anticipates the Bureau’s need for such data and outlines the type of 
performance data the contractor is to provide. Specifically, sections 
of the contract require the 
handheld computers to have a transmission log with what was 
transmitted, the date, time, user, destination, content/data type, and 
the outcome status. Another section of the Bureau’s FDCA contract 
states that the FDCA contractor shall provide near-real-time reporting 
and monitoring of performance metrics and a “control panel/dash board” 
application to visually report those metrics from any Internet-enabled 
PC. However, the contractor and the Bureau are not using a dashboard 
for Dress Rehearsal activities. Rather, during the Dress Rehearsal, the 
Bureau plans to identify what data and performance measures it would 
need for tracking the performance of the handheld computers in 2010 
operations. 
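
For illustration, the sketch below shows the shape of the transmission 
log entry that the contract language describes. The field names and 
types are our own assumptions; only the categories of data (what was 
transmitted, the date, time, user, destination, content/data type, and 
outcome status) come from the contract. 

from dataclasses import dataclass 
from datetime import datetime 

@dataclass 
class TransmissionLogEntry: 
    transmitted_item: str   # what was transmitted 
    timestamp: datetime     # date and time of the transmission 
    user: str               # field staff member who transmitted 
    destination: str        # receiving system or processing center 
    content_type: str       # content/data type 
    outcome_status: str     # e.g., "succeeded" or "failed" (values assumed) 

Aggregated over an operation, such entries would let the Bureau report 
the full range of transmission times and outcomes rather than averages 
alone. 

[End of example] 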

In order for the Bureau to ensure that the FDCA handheld computers are 
ready for full scale operations, it will have to identify risks on a 
tight time frame. We recommended in a report on the Bureau’s earlier 
version of the handheld computers that the Bureau define specific, 
measurable performance requirements for the handheld computer and other 
census-taking activities that address such important measures as 
productivity, cost savings, reliability, and durability, and that the 
Bureau test the device’s ability to meet those requirements in 2006. 
[Footnote 13] We also recommended in a March 2006 testimony that the 
Bureau validate and approve FDCA baseline requirements. [Footnote 14] 
The Bureau is working within a compressed time frame. By law, the 
decennial census must occur on April 1, 2010, and the results must be 
submitted to the President in December 2010. These dates cannot be 
altered, even if preparations are delayed. Access to real-time 
performance metrics via a “control panel/dash board” would assist 
Bureau management in assessing the handheld computer’s performance and 
maximize the amount of time the Bureau and the contractor would have to 
remedy any problems identified during operations. Further, the Bureau’s 
tight 2010 Decennial Operations Schedule allows little time for fixing 
problems with the device, raising the importance of the Bureau’s access 
to these performance indicators. Such data would help fully inform 
stakeholders of the risks associated with the handheld computer, and 
allow project teams to develop mitigation activities to help avoid, 
reduce, and control the probability of these risks occurring. 

Finally, the FDCA and DADS II project teams did not provide evidence 
that specific system interface risks were being adequately identified 
to ensure that risk-handling activities will be invoked should the 
systems fail during the 2010 Census. For example, although DADS II will 
not be 
available for the Dress Rehearsal, the project team did not identify 
any significant interface risks associated with this system. 

One reason for these weaknesses, as mentioned earlier, is the lack of 
Bureau policies for managing major acquisitions. If risks are not 
adequately identified and analyzed, management may be unable to monitor 
and track risks and take the appropriate mitigation 
actions, increasing the probability that the risks will materialize and 
magnifying the extent of damage incurred in such an event. 

Three of Four Project Teams’ Risk Mitigation Plans and Monitoring 
Activities Were Incomplete: 

Risk mitigation involves developing alternative courses of action, 
workarounds, and fallback positions, with a recommended course of 
action for the most important risks to the project. Mitigation includes 
techniques and methods used to avoid, reduce, and control the 
probability of occurrence of the risk; the extent of damage incurred 
should the risk occur; or both. Table 6 shows the status of the four 
project teams’ implementation of key risk mitigation activities. 

Table 6: Risk Mitigation Activities Completed for Key 2010 Census 
Systems: 

Specific practices: Develop a risk mitigation plan for the most 
important risks to the project, as defined by the risk management 
strategy; 
MTAIP: practice partially implemented; 
FDCA: practice partially implemented; 
DRIS: practice fully implemented; 
DADS: practice not implemented. 

Specific practices: Monitor the status of each risk periodically and 
implement the risk mitigation plan as appropriate; 
MTAIP: practice partially implemented; 
FDCA: practice partially implemented; 
DRIS: practice fully implemented; 
DADS: practice partially implemented. 

Source: GAO analysis of project data. 

[End of table] 

Three project teams (MTAIP, FDCA, and DADS II) had developed mitigation 
plans that were often untimely or included incomplete activities and 
milestones for addressing the risks. Examples include the following: 

* The FDCA project team had developed mitigation plans for the most 
significant risks, but the plans did not always identify milestones for 
implementing mitigation activities. Moreover, the plans did not 
identify any commitment of resources, several did not establish a 
period of performance, and the team did not always update the plans 
with the latest information on the status of the risk. In addition, the 
FDCA project team did not provide evidence of developing mitigation 
plans to handle the other significant risks described in its risk 
mitigation strategy. (These risks included a lack of consistency in 
requirements definition and insufficient FDCA project office staffing 
levels). 

* The mitigation plans for DADS II were incomplete, with no associated 
future milestones and no evidence of continual progress in working 
towards mitigating a risk. In several instances, DADS II mitigation 
plans were listed as “To Be Determined.” 

With regard to the second practice in the table (periodically 
monitoring risk status and implementing mitigation plans), the MTAIP, 
FDCA, and DADS II project teams were not always implementing the 
mitigation plans as appropriate. For example, although the MTAIP 
project team had periodically monitored the status of risks, its 
mitigation plans did not include detailed action items with start dates 
and anticipated completion dates; thus, the plans could not ensure that 
mitigation activities would be implemented appropriately and tracked to 
closure. The FDCA and DADS II project teams did not identify system 
interface risks or prepare adequate mitigation plans to ensure that 
systems will operate as intended. Because they did not develop complete 
mitigation plans, the MTAIP, FDCA, and DADS II project teams cannot 
ensure that, for a given risk, techniques and methods will be invoked 
to avoid, reduce, and control the probability of its occurrence. 
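
For illustration only, the following is a minimal sketch (written in 
Python, with hypothetical record and field names; the Bureau's actual 
risk registers are not described in our report) of how a mitigation 
plan record can carry the elements the practice calls for: dated action 
items, committed resources, and a period of performance, so that each 
risk can be tracked to closure: 

    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class ActionItem:
        description: str
        start: Optional[date] = None   # start date for the step
        due: Optional[date] = None     # anticipated completion date
        closed: bool = False           # tracked to closure

    @dataclass
    class MitigationPlan:
        risk_id: str
        actions: list = field(default_factory=list)
        resources: Optional[str] = None             # committed staff/funds
        period_of_performance: Optional[str] = None

        def is_complete(self) -> bool:
            # A plan is complete only if it commits resources, sets a
            # period of performance, and dates every action item.
            return (
                self.resources is not None
                and self.period_of_performance is not None
                and bool(self.actions)
                and all(a.start and a.due for a in self.actions)
            )

    # Hypothetical example: flag a plan that cannot be tracked to closure.
    plan = MitigationPlan(
        "IF-07", actions=[ActionItem("Define DADS II interface tests")]
    )
    if not plan.is_complete():
        print(f"Risk {plan.risk_id}: plan incomplete; flag for oversight.")

In a register structured this way, a plan missing dates, resources, or 
a period of performance is flagged automatically rather than discovered 
after the fact. 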

Project Teams Were Inconsistent in Reporting Risk Status to Executive-
Level Management: 

Reviews of the project teams’ risk management activities, status, and 
results should be held on a periodic and event-driven basis. The 
reviews should include appropriate levels of management, such as key 
Bureau executives, who can provide visibility into the potential for 
project risk exposure and appropriate corrective actions. Table 7 shows 
the status of the four project teams’ implementation of activities for 
senior-level risk oversight at the time of our prior report. 

Table 7: Executive-Level Risk Oversight Activities Completed for the 
Key 2010 Decennial Systems: 

Specific practices: Review the activities, status, and results of the 
risk management process with executive-level management, and resolve 
issues; 
MTAIP: practice not implemented; 
FDCA: practice not implemented; 
DRIS: practice fully implemented; 
DADS II: practice fully implemented. 

Source: GAO analysis of project data. 

[End of table] 

The project teams were inconsistent in reporting the status of risks to 
executive-level officials. The DRIS and DADS II teams regularly 
reported risks; however, the FDCA and MTAIP teams did not provide 
sufficient evidence that such discussions occurred or what they 
covered. Failure to report a project’s risks to executive-level 
officials reduces the visibility of those risks to the executives who 
should be playing a role in mitigating them. 

Implementation of GAO Recommendations Should Help Improve the Bureau’s 
Risk Management: 

To help ensure that the Bureau’s four key acquisitions for the 2010 
Census operate as intended, we made several recommendations in our 
report. First, to ensure that the Bureau’s decennial systems are fully 
tested, we recommended that the Secretary of Commerce require the 
Director of the Census Bureau to direct the Decennial Management 
Division and Geography Division to plan for and perform end-to-end 
testing so that the full complement of systems is tested in a census-
like environment. 

In written comments on a draft of our final report, the department 
disagreed with our findings that a full complement of systems would not 
be tested, stating it plans to do so during the Dress Rehearsal or 
later. Nonetheless, the Bureau’s test plans have not been finalized, 
and it remains unclear whether testing will address all interrelated 
systems and functionality in a census-like environment, as would be 
provided by end-to-end testing. Consistent with our recommendation, 
following up with documented test plans for end-to-end testing will 
help ensure that decennial systems work as intended. 
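
To make the idea of end-to-end testing concrete, the following is a 
minimal sketch (again in Python, with hypothetical function names; it 
is not the Bureau's or its contractors' test harness) in which a 
simulated census response is passed across each system boundary in 
turn and each handoff is asserted: 

    # Hypothetical end-to-end smoke test: one simulated response moves
    # through stand-ins for the interrelated decennial systems.
    def collect_in_field(case_id: str) -> dict:
        # Stand-in for FDCA handheld capture of one census response.
        return {"case_id": case_id, "status": "collected"}

    def integrate_response(record: dict) -> dict:
        # Stand-in for DRIS data capture and standardization.
        assert record["status"] == "collected", "FDCA-to-DRIS handoff failed"
        return {**record, "status": "integrated"}

    def tabulate_and_disseminate(record: dict) -> dict:
        # Stand-in for DADS II tabulation and dissemination.
        assert record["status"] == "integrated", "DRIS-to-DADS handoff failed"
        return {**record, "status": "published"}

    record = tabulate_and_disseminate(
        integrate_response(collect_in_field("CASE-0001"))
    )
    assert record["status"] == "published"
    print("All interface handoffs verified in sequence.")

Testing each system in isolation would leave the handoffs between 
systems unverified; exercising the full chain together is the gap that 
end-to-end testing is meant to close. 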

Further, we recommended that the Secretary direct the Director of the 
Census Bureau to ensure that project teams strengthen risk management 
activities associated with risk identification, mitigation, and 
oversight. The department agreed to examine additional ways to manage 
risks and is working on an action plan to strengthen risk management 
activities. 

In summary, the IT acquisitions planned for the 2010 Census will 
require continued oversight to ensure that they are completed on 
schedule and at planned cost levels. Although the MTAIP and DRIS 
acquisitions were meeting cost estimates as of October 2007, FDCA was 
not. In 
addition, while the Bureau was making progress developing systems for 
the Dress Rehearsal, it was deferring certain functionality, with the 
result that the Dress Rehearsal operational testing would address less 
than a full complement of systems. Delaying functionality increases the 
importance of later development and testing activities, which will have 
to occur closer to the census date. It also raises the risk of cost 
increases, given the immovable deadline for conducting the 2010 Census. 
Further, the Bureau’s project teams for each of the four acquisitions 
had implemented many practices associated with establishing sound and 
capable risk management processes, but they were not always consistent: 
the teams had not always identified risks, developed complete risk 
mitigation plans, or briefed senior-level officials on risks and 
mitigation plans. At this stage, we are particularly concerned about 
managing the risks associated with the handheld mobile computing 
devices, the numerous systems interfaces, and the remaining systems 
testing. Regarding the handheld mobile computing devices, it is 
critical that their performance be clearly specified and measured and 
that performance deficiencies be effectively addressed. Until the 
project teams and the Decennial Management 
Division implement appropriate risk management activities, they face an 
increased probability that decennial systems will not be delivered on 
schedule and within budget or perform as expected. 

Mr. Chairman and members of the subcommittee, this concludes our 
statement. We would be happy to respond to any questions that you or 
members of the subcommittee may have at this time. 

If you have any questions on matters discussed in this testimony, 
please contact David A. Powner at (202) 512-9286 or Mathew Scirè at 
(202) 512-6806 or by e-mail at pownerd@gao.gov or sciremj@gao.gov. 
Other key contributors to this testimony include Mathew Bader, Thomas 
Beall, Jeffrey DeMarco, Richard Hung, Barbara Lancaster, Andrea Levine, 
Signora May, Cynthia Scott, Niti Tandon, Amos Tevelow, Jonathan 
Ticehurst, and Timothy Wexler. 

[End of section] 

Appendix I: Key 2010 Census Information Technology Acquisitions: 

IT acquisition: MAF/TIGER Accuracy Improvement Project (MTAIP); 
Contractor: Harris Corporation; 
Purpose: Modernize the system that provides the address list, maps, and 
other geographic support services for the Census and other Bureau 
surveys; 
Contract type: Cost plus award fee; 
Contract award: June 2002. 

IT acquisition: Field Data Collection Automation (FDCA); 
Contractor: Harris Corporation; 
Purpose: Provide automated resources for supporting field data 
collection, including the provision of handheld mobile computing 
devices to collect data in the field, including address and map data; 
Contract type: Cost plus award fee with some firm fixed price elements; 
Contract award: March 2006. 

IT acquisition: Decennial Response Integration System (DRIS); 
Contractor: Lockheed Martin Corporation; 
Purpose: Provide a solution for data capture and respondent assistance; 
Contract type: Cost plus award fee with some firm fixed price elements; 
Contract award: October 2005. 

IT acquisition: Data Access and Dissemination System (DADS II); 
Contractor: IBM; 
Purpose: Develop a replacement for the DADS legacy tabulation and 
dissemination systems; 
Contract type: To be determined; 
Contract award: September 2007. 

Source: GAO analysis of Census Bureau data. 

[End of table] 

[End of section] 

Footnotes: 

[1] 13 U.S.C. 141 (a) and (b). 

[2] Since issuance of our report in October 2007, the Bureau has 
tentatively moved the mock Census Day from April 1, 2008, to May 1, 
2008. 

[3] GAO, Information Technology: Census Bureau Needs to Improve Its 
Risk Management of Decennial Systems, GAO-08-79 (Washington, D.C.: Oct. 
5, 2007). 

[4] Address canvassing is a field operation to build a complete and 
accurate address list. In this operation, census field workers go door 
to door verifying and correcting addresses for all households and 
street features contained on decennial maps. 

[5] End-to-end testing is a form of operational testing that is 
performed to verify that a defined set of interrelated systems that 
collectively support an organizational core business function 
interoperate as intended in an operational environment. The 
interrelated systems include not only those owned and managed by the 
organization, but also the external systems with which they interface. 

[6] GAO, Census Bureau: Important Activities for Improving Management 
of Key 2010 Decennial Acquisitions Remain to be Done, GAO-06-444T 
(Washington, D.C.: Mar. 1, 2006). 

[7] TIGER is a registered trademark of the U.S. Census Bureau. 

[8] Handheld mobile computing devices will be used to update the 
Bureau’s address list, to perform follow-up at addresses for which no 
questionnaire was returned, and to perform activities to measure census 
coverage. 

[9] The DADS II contract was originally planned to establish a new Web-
based system that would serve as a single point for public access to 
all census data and integrate many dissemination functions currently 
spread across multiple Bureau organizations. 

[10] GAO-06-444T. 

[11] This analysis primarily addresses project teams’ implementation of 
risk management processes. According to our analysis, the contractors 
for the three contracts awarded (MTAIP, FDCA, and DRIS) had implemented 
adequate risk management processes involving risk preparation, risk 
identification and analysis, and risk mitigation. 

[12] GAO, Information Technology Management: Census Bureau Has 
Implemented Many Key Practices, but Additional Actions Are Needed, GAO-
05-661 (Washington, D.C.: June 16, 2005) and GAO, Census Bureau: 
Important Activities for Improving Management of Key 2010 Decennial 
Acquisitions Remain to be Done, GAO-06-444T (Washington, D.C.: Mar. 1, 
2006). 

[13] GAO, 2010 Census: Basic Design Has Potential, but Remaining 
Challenges Need Prompt Resolution, GAO-05-9 (Washington, D.C.: 
Jan. 12, 2005). 

[14] GAO-06-444T. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "Subscribe to Updates." 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office: 
441 G Street NW, Room LM: 
Washington, D.C. 20548: 

To order by Phone: 
Voice: (202) 512-6000: 
TDD: (202) 512-2537: 
Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Gloria Jarmon, Managing Director, JarmonG@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: