This is the accessible text file for GAO report number GAO-07-736 
entitled '2010 Census: Census Bureau Has Improved the Local Update of 
Census Addresses Program, but Challenges Remain' which was released on 
June 15, 2007. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Addressees: 

United States Government Accountability Office: 

GAO: 

June 2007: 

2010 Census: 

Census Bureau Has Improved the Local Update of Census Addresses 
Program, but Challenges Remain: 

GAO-07-736: 

GAO Highlights: 

Highlights of GAO-07-736, a report to congressional addressees 

Why GAO Did This Study: 

The Department of Commerce’s (Commerce) U.S. Census Bureau (Bureau) 
seeks updated information on the addresses and maps of housing units 
and group quarters from state, local, and tribal governments through 
the Local Update of Census Addresses (LUCA) Program. Prepared under the 
Comptroller General’s authority, this report assesses (1) the status of 
the LUCA Program, (2) the Bureau’s response to prior recommendations by 
GAO and others and new challenges related to the program, and (3) the 
Bureau’s plans for conducting the program in areas affected by 
hurricanes Katrina and Rita. 

GAO reviewed LUCA program documents, met with and surveyed participants 
in the LUCA Dress Rehearsal, and interviewed Bureau officials and local 
officials in the Gulf Coast region. 

What GAO Found: 

The Bureau has conducted its planned LUCA operations in accordance with 
its published timeline. The Bureau has also taken steps to reduce 
workloads and burdens and improve training for localities that 
participate in LUCA—all areas GAO and others had identified as needing 
improvement. For instance, to reduce participant workload and burden, 
the Bureau provided a longer period for reviewing and updating LUCA 
materials; provided options for submitting materials for the LUCA 
Program; combined the collection of LUCA addresses from two separate 
operations into one integrated program; and created the MAF/TIGER 
Partnership Software (MTPS), which is designed to assist LUCA Program 
participants in reviewing and updating address and map data. Also, the 
Bureau has planned improvements to the 2010 LUCA Program training 
(i.e., specialized workshops for informational and then technical 
training) and plans to supplement the workshops with computer-based 
training (CBT). 

Figure: 

[See PDF for Image] 

Source: GAO presentation of U.S. Census Bureau information, photo (GAO 
2006). 

Improvements made to LUCA program (such as MTPS), but challenges remain 
(such as uncertainty about the ability of localities in the Gulf Coast 
region to participate in LUCA). 

[End of figure] 

However, the Bureau faces new challenges. For instance, the Bureau 
tested MTPS with only one local government. Other local officials we 
spoke with had problems converting Bureau-provided address files. In 
addition, the Bureau did not test its CBT software in the LUCA Dress 
Rehearsal. Additional challenges stem from the damage to the Gulf Coast 
region caused by hurricanes Katrina and Rita. Officials in localities 
in hurricane-affected areas questioned their ability to participate in 
the LUCA Program. The continuous changes in housing stock may hinder 
local governments’ ability to accurately update their address lists and 
maps. The condition of the housing stock is likely to present 
additional challenges for the Bureau’s address canvassing operation (in 
which the Bureau verifies addresses) in the form of decreased 
productivity for Bureau staff, workforce shortages, and issues 
associated with identifying vacant and uninhabitable structures. The 
Bureau created a task force to assess the implications of storm-related 
issues that proposed a number of mitigating actions. However, the 
Bureau has no plans for modifying the address canvassing operation or 
subsequent operations in the Gulf Coast region. 

What GAO Recommends: 

GAO recommends that the Secretary of Commerce direct the Bureau to take 
several actions to improve the LUCA Program, including further testing 
MTPS and the CBT software to assess ease of use and establishing a 
schedule and plans for conducting address canvassing and related 
operations in hurricane-affected areas. In commenting on a draft of 
this report, Commerce generally agreed with GAO’s recommendations and 
offered technical comments. 

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-736]. 

To view the full product, including the scope and methodology, click on 
the link above.
For more information, contact Mathew J. Scire at (202) 512-6806 or 
sciremj@gao.gov. 

[End of section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

The Bureau Has Completed Nearly All Planned Activities for the LUCA 
Dress Rehearsal and the First Step of the 2010 LUCA Program: 

Bureau Modified the LUCA Program to Address Issues from the 2000 
Experience but Faces New Challenges: 

Bureau Has Proposed but Not Finalized Steps to Address Issues in 
Hurricane-Affected Areas: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Scope and Methodology: 

Appendix II: Comments from the Department of Commerce: 

Appendix III: Web-Based Survey of LUCA Dress Rehearsal Participants: 

Appendix IV: GAO Contact and Staff Acknowledgments: 

Related GAO Products: 

Figures: 

Figure 1: Map of the Bureau's California Dress Rehearsal Site: 

Figure 2: Map of the Bureau's North Carolina Dress Rehearsal Site: 

Figure 3: Bureau's LUCA Dress Rehearsal Timeline and Status: 

Figure 4: Bureau's 2010 LUCA Timeline: 

Figure 5: LUCA Dress Rehearsal Participants' Views on the Adequacy of 
Time Allowed to Complete the Review: 

Figure 6: Available Options for Participation in 2010 LUCA Program: 

Figure 7: LUCA Dress Rehearsal Participants' Satisfaction with 
Participation Options: 

Figure 8: Extent of LUCA Dress Rehearsal Participants' Problems with 
File Conversion: 

Figure 9: LUCA Dress Rehearsal Participants' Reports of the Usefulness 
of the Training Session: 

Figure 10: City Hall in Pass Christian, Mississippi, Destroyed by 
Hurricane Katrina (Below Left), and City Officials in Slidell, 
Louisiana, Forced to Operate Out of Trailers since the Hurricane (Below 
Right): 

Figure 11: Trailers in Front of Damaged Housing Units in New Orleans, 
Louisiana: 

Abbreviations: 

BAS: Boundary and Annexation Survey: 

CBT: computer-based training: 

GPS: global positioning system: 

LUCA: Local Update of Census Addresses: 

MAF: Master Address File: 

MTPS: MAF/TIGER Partnership Software: 

NRC: National Research Council: 

TIGER: Topographically Integrated Geographic Encoding and Referencing: 

United States Government Accountability Office: 
Washington, DC 20548: 

June 14, 2007: 

The Honorable Thomas R. Carper: 
Chairman: 
Subcommittee on Federal Financial Management, Government Information, 
Federal Services, and International Security: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

The Honorable Henry A. Waxman: 
Chairman: 
The Honorable Tom Davis: 
Ranking Member: 
Committee on Oversight and Government Reform: 
House of Representatives: 

The Honorable Wm. Lacy Clay: 
Chairman: 
The Honorable Michael Turner: 
Ranking Member: 
Subcommittee on Information Policy, Census, and National Archives: 
Committee on Oversight and Government Reform: 
House of Representatives: 

The Honorable Carolyn B. Maloney: 
House of Representatives: 

The decennial census is a constitutionally mandated activity undertaken 
by the U.S. Census Bureau (Bureau). The data that the census collects 
are used to reapportion the seats of the U.S. House of Representatives; 
redraw congressional districts; allocate billions of dollars each year 
in federal financial assistance; and provide a social, demographic, and 
economic profile of the nation's people to guide policy decisions at 
each level of government. Further, businesses use census data to market 
new services and products and to tailor existing ones to demographic 
changes. 

To ensure it delivers quality data, the Bureau employs a number of 
quality assurance programs throughout the course of the census. One 
such program is the Bureau's Local Update of Census Addresses (LUCA) 
Program, which provides a mechanism for state, local, and tribal 
governments to contribute to complete enumeration of their 
jurisdictions by reviewing, commenting on, and providing updated 
information on the addresses and maps that the Bureau will use to 
deliver questionnaires within those communities. 

The Bureau is testing the LUCA Program as part of the 2008 Census Dress 
Rehearsal in San Joaquin County, California, and nine counties in the 
area surrounding Fayetteville, North Carolina. Bureau officials state 
that they selected these sites to provide a comprehensive environment 
for demonstrating and refining planned 2010 Census operations, such as 
the LUCA Program and address canvassing.[Footnote 1] 

Because of the role LUCA plays in building complete and accurate 
address lists and maps, under the Comptroller General's statutory 
authority to initiate engagements, we reviewed the Bureau's LUCA Dress 
Rehearsal and 2010 LUCA Program. As agreed with your offices, we are 
providing this report to you, which contains information that will be 
useful for your oversight responsibilities of the decennial census. Our 
specific objectives were to (1) document the current status of the LUCA 
effort, (2) determine how the Bureau is addressing prior issues and new 
challenges associated with implementing the LUCA Program, and (3) 
examine how the Bureau is addressing the challenges in the areas 
affected by hurricanes Katrina and Rita that may affect the Bureau's 
successful implementation of the 2010 LUCA Program and related 
decennial census operations. 

To address the first objective, we collected source documents from 
Bureau headquarters, the Charlotte Regional Office, and the Seattle 
Regional Office detailing the 2010 LUCA Program and LUCA Dress 
Rehearsal timelines. We also interviewed Bureau officials to determine 
the status of current operations for the 2010 LUCA Program and LUCA 
Dress Rehearsal. Finally, we visited and collected documents from 12 
localities in California and North Carolina (the two LUCA Dress 
Rehearsal sites) to verify Bureau officials' testimonial evidence. 

For the second objective, we reviewed recommendations for improving the 
LUCA Program that were found in reports by GAO, the National Research 
Council (NRC),[Footnote 2] the Department of Commerce's (Commerce) 
Office of the Inspector General, and a contractor hired by the 
Bureau.[Footnote 3] We reviewed source documents and interviewed Bureau 
officials to determine how the Bureau addressed the recommendations and 
new challenges associated with the LUCA Program. We also conducted a 
Web-based survey of 42 LUCA Dress Rehearsal participants in California 
and North Carolina[Footnote 4] to gauge their satisfaction with how the 
Bureau addressed these recommendations and challenges, and performed 
structured telephone interviews with LUCA Dress Rehearsal 
nonparticipants to determine why they did not participate in LUCA. 

In order to address the third objective, we undertook fieldwork in 
areas of the Gulf Coast region affected by hurricanes Katrina and Rita 
(Louisiana, Mississippi, and Texas) by interviewing local officials and 
collecting photographic and documentary evidence to determine the 
challenges that implementing the LUCA Program in these areas presents. 
Additionally, we collected documents and interviewed officials from the 
Bureau's headquarters and Dallas Regional Office to determine Bureau 
plans for addressing these challenges and prior GAO recommendations 
addressing contingency planning for the affected areas. Appendix I 
provides additional details on our scope and methodology. We conducted 
our work from July 2006 through May 2007 in accordance with generally 
accepted government auditing standards. 

Results in Brief: 

The Bureau conducted nearly all of its planned LUCA Dress Rehearsal 
operations in accordance with its published timeline. The Bureau has 
begun address canvassing, in which it will verify information that 
localities provided to the Bureau for the LUCA Program. The Bureau will 
also enable Dress Rehearsal participants to review feedback materials 
regarding their submissions from December 2007 through January 2008. 
The Bureau met the time frames listed in its published LUCA Dress 
Rehearsal timeline, but as we describe below, this timeline did not 
include testing of software to be used in the 2010 LUCA Program. 
Additionally, the Bureau completed the first step of the 2010 LUCA 
Program, sending local jurisdictions advance letters to notify them 
about the LUCA Program in January and February 2007. 

The Bureau has taken steps to reduce participants' workloads and 
burdens and improve training--all areas NRC, GAO, and others had 
identified as needing improvement. Building on the progress it has 
made, the Bureau could take additional steps to address new challenges 
in these areas, as well as challenges related to measuring overall 
program effectiveness. For instance, to reduce participant workload and 
burden, the Bureau provided a longer period for reviewing and updating 
LUCA materials; provided options for how participants may submit 
updated information to the Bureau; combined the collection of addresses 
from two separate operations into one integrated and sequential 
operation; and created the MAF/TIGER[Footnote 5] Partnership Software 
(MTPS), which is designed to assist LUCA participants in reviewing and 
updating address and map data. However, the Bureau did not test MTPS as 
part of the LUCA Dress Rehearsal, and tested MTPS with only one 
locality in preparation for the 2010 LUCA Program. Additionally, many 
participants experienced problems with converting Bureau-provided 
address files to their own software formats. Also, the Bureau has 
planned improvements to the 2010 LUCA Program training (i.e., 
specialized workshops for informational and then technical training), 
and plans to supplement the workshops with computer-based training 
(CBT). However, the Bureau did not test these improvements in the LUCA 
Dress Rehearsal. Finally, although the Bureau has not finalized its 
evaluation plans regarding the 2010 LUCA Program, Bureau officials have 
stated that the Bureau intends to assess the LUCA Program's 
contribution to address counts and will consider a plan to assess the 
program's contribution to the census population count. Such analysis 
would provide a measure of the ultimate impact of the LUCA Program on 
achieving a complete count of the population. Further, the Bureau does 
not currently collect the information needed to measure the percentage 
of eligible local governments that assessed the accuracy of Bureau-
provided addresses and maps but found no changes to submit. Without 
these data, the Bureau may not be able to fully estimate 
the impact of the LUCA Program on the MAF database and the census 
population count. 

In response to hurricanes Katrina and Rita, the Bureau has proposed 
steps to address LUCA-related issues in hurricane-affected areas. 
During the course of commenting on a draft of this report, the Bureau 
finalized plans for implementing these steps. Hurricane Katrina alone 
destroyed or made uninhabitable an estimated 300,000 homes; in New 
Orleans, the hurricanes damaged an estimated 123,000 housing units. The 
2010 LUCA Program still faces challenges caused by the continuous 
changes in the housing stock in areas affected by storm damage or 
population influxes, which may hinder local governments' ability to 
accurately update their address lists and maps. Further, the condition 
of the housing stock is likely to present additional challenges to 
address canvassing and other decennial census operations in the form of 
decreased productivity for Bureau staff, issues associated with 
identifying vacant and uninhabitable structures, and workforce 
shortages. Officials in Bureau headquarters and the Dallas Regional 
Office have proposed and implemented several changes to the 2010 LUCA 
Program in the Gulf Coast region, such as conducting conference calls 
with the states of Louisiana and Mississippi and providing additional 
promotional workshops in areas hardest hit by hurricanes Katrina and 
Rita. Additionally, the Bureau is considering changes to its 2010 
Census address canvassing operation in the Gulf Coast region (an 
operation that begins in April 2009). 

We are recommending that the Secretary of Commerce direct the Bureau to 
(1) assess potential usability issues with the LUCA Program's CBT and 
MTPS by selecting localities to test the software packages or by 
providing alternative means to assess such issues before participants 
begin reviewing and updating materials for the 2010 LUCA Program in 
August 2007, and provide information on how localities can mitigate 
issues identified in such assessments via its public Web site and its 
LUCA technical help desk; (2) provide localities not using MTPS, via 
its public Web site, its LUCA technical help desk, and other 
appropriate means, instructions on converting files from the Bureau's 
format to the appropriate format for software most commonly used by 
participating localities to update address information; (3) assess the 
contribution of the LUCA Program to the final census population counts, 
as recommended by NRC (to permit an evaluation of the 2010 LUCA Program 
in preparation for 2020); (4) establish a process for localities that 
agreed to participate in the LUCA Program but found no changes in their 
reviews to explicitly communicate to the Bureau that they have no 
changes; and (5) develop strategy, plans, and milestones for operations 
in areas in the Gulf Coast that address the challenges field staff are 
likely to encounter in conducting address canvassing and subsequent 
decennial operations in communities affected by the hurricanes. 

The Secretary of Commerce provided written comments on a draft of this 
report (see app. II). Commerce generally agreed with our 
recommendations for the Bureau to (1) assess usability issues with MTPS 
and CBT; (2) provide localities not using MTPS with instructions on 
file conversion; (3) assess the contribution of the LUCA Program to the 
final census population counts; (4) establish a process for localities 
to indicate that they participated in the LUCA Program but found no 
changes; and (5) develop strategy, plans, and milestones for operations 
in the Gulf Coast that address the challenges that field staff are 
likely to face. The Bureau also agreed with the draft report's 
recommendation that the Bureau finalize its plans for conducting the 
LUCA Program in the areas affected by Katrina and Rita, noting that its 
plans were now final. We therefore deleted this recommendation. 
Commerce also provided some technical comments and suggestions where 
additional context might be needed, and we revised the report to 
reflect these comments where appropriate. 

Background: 

A complete and accurate address list is the cornerstone of a successful 
census, because it identifies all living quarters that are to receive a 
census questionnaire and serves as the control mechanism for following 
up with living quarters that do not respond. If the address list is 
inaccurate, people can be missed, counted more than once, or included 
in the wrong locations. MAF is intended to be a complete and current 
list of all addresses and locations where people live or potentially 
live. The Topographically Integrated Geographic Encoding and 
Referencing (TIGER) database is a mapping system that identifies all 
visible geographic features, such as type and location of streets, 
housing units, rivers, and railroads.[Footnote 6] The Bureau's approach 
to building complete and accurate address lists and maps consists of a 
series of operations that sometimes overlap and are conducted over 
several years. These operations include partnerships with the U.S. 
Postal Service and other federal agencies; state, local, and tribal 
governments; local planning organizations; the private sector; and 
nongovernmental entities. One such operation is the Bureau's LUCA 
Program. 

The LUCA Program is mandated by the Census Address List Improvement Act 
of 1994[Footnote 7] that expanded the methods the Bureau uses to 
exchange information with tribal, state, and local governments in order 
to support its overall residential address list development and 
improvement process. The LUCA Program is a decennial census geographic 
partnership program that allows participants to contribute to complete 
enumeration of their jurisdictions by reviewing, commenting on, and 
providing updated information on the list of addresses and maps that 
the Bureau will use to deliver questionnaires within those communities. 
The LUCA Program was first implemented for the 2000 Census;[Footnote 8] 
under the program, the Bureau is authorized (prior to the decennial 
census) to share individual residential addresses with officials of 
tribal, state, and local governments who agreed to protect the 
confidentiality of the information.[Footnote 9] 

According to Bureau officials, one reason that participation in the 
LUCA Program is important is that local government officials may be 
better positioned to identify some housing units that are hard to find 
or are hidden because of their knowledge of or access to data in their 
jurisdictions. For example, local governments may have alternate 
sources of address information (such as utility bills, tax records, 
information from housing or zoning officials, or 911 emergency 
systems), which can help the Bureau build a complete and accurate 
address list. In addition, according to Bureau officials, providing 
local governments with opportunities to actively participate in the 
development of the MAF/TIGER database can have the added benefit for 
the Bureau of building local governments' understanding of and support 
for the census. Local governments have key roles in ensuring a 
successful census--not just in developing the address list, but during 
subsequent operations as well, especially those designed to boost 
public participation in the census. 

Of the 39,051 entities--such as cities and counties--eligible for the 
2000 LUCA Program, 18,333[Footnote 10] (47 percent) agreed to 
participate.[Footnote 11] Subsequently, for 2010, the Bureau has sent 
LUCA advance notification letters to approximately 40,000 entities and 
has set a participation goal of 60 percent. After localities that opted 
to participate in the LUCA Program have submitted their updated maps 
and address lists, the Bureau conducts a field check called address 
canvassing. At that time, the address canvassers--using handheld 
computers equipped with a global positioning system (GPS)--will go door 
to door updating the 2010 Census address list, verifying the information 
localities provided the Bureau during the LUCA Program, adding any 
additional addresses they find, and making other needed corrections to 
the address list and maps. The address canvassing operation will ensure 
that all addresses submitted during the LUCA Program actually exist and 
that they are assigned to the correct census block. 

In preparation for the 2010 Census, both the LUCA Program and the 
subsequent address canvassing operation will be tested as part of the 
Bureau's Dress Rehearsal. The 2008 Census Dress Rehearsal is taking 
place in San Joaquin County, California, and nine counties in the 
Fayetteville, North Carolina area (see figs. 1 and 2). The Bureau 
states that the Dress Rehearsal will help ensure a more accurate and 
cost-effective 2010 Census by demonstrating the methods to be used in 
the nation's decennial headcount, and that the main goal of the Dress 
Rehearsal is to fine-tune the various operations planned for the 
decennial census in 2010 under as close to census-like conditions as 
possible. According to the Bureau, the Dress Rehearsal sites provide a 
comprehensive environment for demonstrating and refining planned 2010 
Census operations and activities, such as the use of GPS-equipped 
handheld computers. 

This report is the latest of several studies we have issued on the 2010 
Census. See Related GAO Products at the end of this report for a list 
of selected products we have issued to date. 

Figure 1: Map of the Bureau's California Dress Rehearsal Site: 

[See PDF for image] 

Source: U.S. Census Bureau. 

[End of figure] 

Figure 2: Map of the Bureau's North Carolina Dress Rehearsal Site: 

[See PDF for image] 

Source: U.S. Census Bureau. 

[End of figure] 

The Bureau Has Completed Nearly All Planned Activities for the LUCA 
Dress Rehearsal and the First Step of the 2010 LUCA Program: 

The Bureau has completed nearly all planned operations for the LUCA 
Dress Rehearsal in accordance with the LUCA Dress Rehearsal timeline 
(see fig. 3).[Footnote 12] The only components that are not yet 
completed are address canvassing (which is scheduled to take place from 
April through June 2007) and the Dress Rehearsal participants' review 
of feedback materials regarding their submissions (which is scheduled 
to take place from December 2007 through January 2008). The Bureau met 
the first date on its timeline when it sent out the LUCA advance 
notification letters and informational materials to the highest elected 
officials in February 2006. The Bureau sent out the official invitation 
to localities, provided participant training, and shipped LUCA 
materials on schedule. Additionally, localities reviewed and updated 
LUCA materials within the June to October 2006 period specified on the 
timeline. Most recently, the Bureau finished its review of 
participants' LUCA submissions and updated the MAF/TIGER geographic 
database in December 2006. Bureau officials state that they expect to 
meet the dates on the timeline for the remaining component--address 
canvassing. 

Figure 3: Bureau's LUCA Dress Rehearsal Timeline and Status: 

[See PDF for image] 

Sources: U.S. Census Bureau and GAO analysis. 

[End of figure] 

While the Bureau met the time frames listed in its published LUCA Dress 
Rehearsal timeline, some activities were not included in that timeline. 
For example, plans to test the 
newly developed MTPS (which is intended to assist participating 
localities in their 2010 LUCA reviews) and test the new computer-based 
LUCA training were not included in the Bureau's LUCA Dress Rehearsal 
schedule--precluding the opportunity to test these software products 
under census-like conditions. 

The 2010 LUCA Program is now under way. In January and February 2007, 
the Bureau sent advance notification letters for the 2010 LUCA Program 
to the highest elected officials in each of the eligible localities. 
Bureau officials expect to meet the remaining dates listed on the 
published timeline (see fig. 4). 

Figure 4: Bureau's 2010 LUCA Timeline: 

[See PDF for image] 

Source: U.S. Census Bureau. 

Note: See the Bureau's Web site, 
http://www.census.gov/geo/www/luca2010/luca.html. 

[End of figure] 

Bureau Modified the LUCA Program to Address Issues from the 2000 
Experience but Faces New Challenges: 

The Bureau has modified the 2010 LUCA Program to address issues 
stemming from the 2000 experience but faces new challenges with the 
program. To reduce the workload and burden on LUCA participants, the 
Bureau provided a longer period for reviewing and updating LUCA 
materials; provided options to submit materials for the LUCA Program; 
combined the collection of addresses from two separate operations into 
one integrated and sequential operation; and created MTPS, which is 
designed to assist LUCA participants in reviewing and updating address 
and map data. However, the Bureau tested MTPS with only one potential 
user for the 2010 LUCA Program, and did not test MTPS with any 
localities during the LUCA Dress Rehearsal. In addition, many 
participants experienced problems with converting Bureau-provided 
address files. Further, the Bureau has planned modified training for 
the 2010 LUCA Program, but the Bureau did not test each of these 
modifications in the LUCA Dress Rehearsal. Finally, although the Bureau 
will likely plan to assess the contribution that the LUCA Program makes 
to address counts, the Bureau does not have a plan to assess the 
contribution that the program makes to population counts. Such analysis 
would provide a measure of the ultimate impact of the LUCA Program on 
achieving a complete count of the population. Also, the Bureau has not 
collected the information needed to fully measure LUCA participation 
rates and is therefore limited in its ability to assess the cost and 
benefits of the LUCA Program to the Bureau. Without this information, 
the Bureau may not be able to fully measure the extent to which local 
review contributed to the MAF database and the census population count. 
The Bureau also cited as an improvement its expansion of direct LUCA 
participation to state governments. The Bureau noted that allowing 
states to participate 
directly can fill the gap when local governments do not participate 
because of a lack of resources or technical challenges. 

Bureau Addressed Issues about Workload and Burden, but Challenges with 
Software and File Conversion Remain: 

Studies by us, NRC, and others highlighted concerns with the burden and 
workload placed on participants in the 2000 LUCA Program. In testimony 
given before the Subcommittee on the Census, House Committee on 
Government Reform in September 1999, we noted that LUCA may have 
stretched the resources of local governments and that the workload was 
greater than most local governments had expected.[Footnote 13] 
According to a report contracted by the Bureau, two reasons cited by 
localities for not participating in the 2000 LUCA Program were the 
volume of work required and the lack of sufficient personnel to conduct 
the LUCA review.[Footnote 14] 

Recognizing that not all localities have the resources to participate 
effectively in the LUCA Program within imposed time constraints, the 
Bureau made several changes to the program. First, the Bureau provided 
a longer review period for LUCA participants. In 2004, NRC reported on 
the 2000 LUCA experience and concluded that the Bureau should clearly 
articulate realistic schedules for the periods when localities can 
review and update LUCA materials.[Footnote 15] Concurrently, the Bureau 
itself recommended that it allow sufficient time for participants to 
complete LUCA updates before the Bureau begins address canvassing 
activities. As a result, the Bureau extended the review period for LUCA 
Program participants from 90 to 120 calendar days. The implementation 
of the review extension was well received by LUCA Dress Rehearsal 
participants; the majority of respondents to our survey of LUCA Dress 
Rehearsal participants indicated that 120 days allowed adequate time to 
complete the LUCA review (see fig. 5). 

Figure 5: LUCA Dress Rehearsal Participants' Views on the Adequacy of 
Time Allowed to Complete the Review: 

[See PDF for image] 

Source: GAO Web-based survey of LUCA Dress Rehearsal participants. 

[End of figure] 

Second, the Bureau provided localities with options for how they may 
participate in the LUCA Program, as recommended in a 2002 contractor 
study of the program.[Footnote 16] Specifically, the Bureau now 
provides three options for how localities can submit address and map 
information to the Bureau: (1) full address list review with count 
review, (2) Title 13 local address list submission, and (3) non-Title 
13 local address list submission (see fig. 6). The three options differ 
in the level of review of Bureau materials by participating localities 
and in requirements to adhere to rules concerning confidentiality of 
information. For options one or two, participants may use MTPS to 
assist in their reviews. 

Figure 6: Available Options for Participation in 2010 LUCA Program: 

[See PDF for image] 

Source: GAO analysis of U.S. Census Bureau materials. 

[End of figure] 

Our survey of LUCA Dress Rehearsal participants found that the majority 
of localities were satisfied with the participation options provided by 
the Bureau (see fig. 7). 

Figure 7: LUCA Dress Rehearsal Participants' Satisfaction with 
Participation Options: 

[See PDF for image] 

Source: GAO Web-based survey of LUCA Dress Rehearsal participants. 

[End of figure] 

Third, the Bureau combined the collection of addresses from two 
separate operations for city-style and non-city-style 
addresses[Footnote 17] into one integrated and sequential operation. In 
a 2004 report, NRC suggested that the Bureau coordinate efforts related 
to the decennial census so that the LUCA Program and other Bureau 
programs would not be unduly redundant and burdensome to 
localities.[Footnote 18] Based on complaints about the multiphased LUCA 
Program from the 2000 experience (where some participants found the two 
separate operations confusing), the Bureau designed the 2010 LUCA 
Program to be a single review operation for all addresses. Bureau 
officials also told us that the combined LUCA operation would be fully 
integrated into the decennial census schedule, along with address 
canvassing. 
As a result of the Bureau's efforts, localities could face a reduced 
burden, and participation in the 2010 LUCA Program could be less 
confusing. Further, the Bureau may be able to more effectively verify 
address information collected from LUCA Program participants during 
address canvassing. 

Finally, the Bureau has created MTPS, which is a geographic information 
system application that will allow LUCA Program participants to update 
the Bureau's address list and maps electronically.[Footnote 19] The 
application will also enable users to import address lists and maps for 
comparison to the Bureau's data and participate in both the LUCA 
Program and the Boundary and Annexation Survey (BAS)[Footnote 20] at 
the same time. The Bureau noted that participants who sign up to 
participate in the LUCA Program by October 31, 2007, will be allowed to 
provide their boundary updates with their LUCA updates and thereby 
avoid having to separately respond to the 2008 BAS. A 2004 study by NRC 
recommended that the Bureau coordinate efforts so that the LUCA 
Program, BAS, and other programs are not unduly redundant and 
burdensome for local and tribal entities.[Footnote 21] Consistent with 
that recommendation, the Bureau created MTPS, which Bureau officials 
said reduces participants' workloads and burdens in the 2010 LUCA 
Program by allowing them to review and update address and map 
information together in one software package. 

Building on the progress it has already made, the Bureau can take 
additional steps to address new challenges in reducing workload and 
burdens for LUCA participants. First, although the Bureau performed 
internal tests of the software, the Bureau did not test MTPS as part of 
the LUCA Dress Rehearsal and tested MTPS with only one locality in 
preparation for the 2010 LUCA Program. Properly executed user-based 
methods for software testing can give the truest estimate of the extent 
to which real users can employ a software application effectively, 
efficiently, and satisfactorily. In addition, multiple users are 
required to tease out remaining problems in a product that is ready for 
distribution.[Footnote 22] The Bureau's statement of work regarding 
MTPS specifically required milestones for testing and review of the 
software by 10 local sites during its development. However, the 
Bureau's contract did not specify how many local sites would test the 
LUCA portion of MTPS. Further, meeting minutes between the Bureau and 
the MTPS contractor revealed that the contractor did not necessarily 
plan to test the LUCA portion of MTPS with local users during its 
development. The Bureau ultimately identified three local sites to test 
the LUCA portion of MTPS, but only performed the test with one. Of the 
other two proposed sites, one explicitly canceled testing, and the 
other did not respond to the Bureau's attempts at communication. 
Additionally, Bureau officials told us that user testing for the LUCA 
Program portions of MTPS was constrained by existing resource 
limitations and timing issues associated with the schedule for 
development of MTPS. Bureau officials also informed us that they will 
provide frequently asked questions regarding MTPS for the LUCA 
technical help desk. 

Second, a majority of LUCA Dress Rehearsal participants experienced 
problems with converting Bureau address files from the Bureau's format 
to their own software formats. If participants in the 2010 LUCA Program 
choose not to use MTPS to update address and map information, they can 
review and update computer-readable files of census address lists in a 
pipe-delimited text file format.[Footnote 23] While the Bureau included 
instructions for converting files in its LUCA Dress Rehearsal 
participation guide, it did not include information on specific 
commonly available types of software that localities are likely to 
use.[Footnote 24] Participants in the LUCA Dress Rehearsal experienced 
problems with converting the files from the Bureau's format to their 
respective applications; our survey of LUCA Dress Rehearsal 
participants revealed that the majority of respondents had, to some 
extent, problems with file conversions to appropriate formats (see fig. 
8). Our fieldwork also revealed issues pertaining to file conversion; 
for example, one local official noted that it took him 2 days to 
determine how to convert the Bureau's pipe-delimited files. To mitigate 
the potential burden on localities that choose not to use MTPS, the 
Bureau will provide technical guidance on file conversion through its 
LUCA technical help desk, but does not plan to provide instructions for 
converting Bureau-provided address files through other means. At 
present, the Bureau does not know how many localities will opt not to 
use MTPS for the 2010 LUCA Program, but those localities may face the 
same challenges faced by participants in the LUCA Dress Rehearsal. 
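
To illustrate the kind of conversion involved, the following sketch 
shows one way a pipe-delimited address file could be converted to the 
comma-separated format that common spreadsheet and mapping software can 
import directly. The file names, column headings, and sample records 
are hypothetical assumptions for illustration only; they are not the 
Bureau's actual file layout or a Bureau-provided procedure. 

import csv

# Illustrative sketch only: convert a pipe-delimited address file to CSV.
# The file names, column headings, and sample records are hypothetical
# assumptions, not the Bureau's actual file layout.
COLUMNS = ["line_id", "house_number", "street_name", "zip_code", "census_block"]

# Create a small sample input file so the sketch runs end to end.
with open("bureau_address_list.txt", "w", encoding="utf-8") as sample:
    sample.write("0001|123|MAIN ST|95202|1001\n")
    sample.write("0002|125|MAIN ST|95202|1001\n")

with open("bureau_address_list.txt", newline="", encoding="utf-8") as infile, \
     open("address_list.csv", "w", newline="", encoding="utf-8") as outfile:
    reader = csv.reader(infile, delimiter="|")
    writer = csv.writer(outfile)
    writer.writerow(COLUMNS)  # add a header row to simplify import elsewhere
    for row in reader:
        writer.writerow(row[:len(COLUMNS)])  # keep only the expected fields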

Figure 8: Extent of LUCA Dress Rehearsal Participants' Problems with 
File Conversion: 

[See PDF for image] 

Source: GAO Web-based survey of LUCA Dress Rehearsal participants. 

Note: Seven of the 31 respondents either had no basis to judge or did 
not respond to this question. 

[End of figure] 

Bureau Plans Improvements to the 2010 LUCA Program Training but Did Not 
Fully Test Improvements in the LUCA Dress Rehearsal: 

Leading up to the 2000 Census, we reported that LUCA training received 
less favorable reviews than the other components of the LUCA 
Program.[Footnote 25] The 2000 LUCA Program had one training session 
that encompassed all aspects of the LUCA Program. For the 2010 LUCA 
Program, the Bureau plans to separate LUCA classroom training into 
informational and technical training sessions and provide user guides 
tailored to the participation option chosen by LUCA Program 
participants. The Bureau provided localities with information on the 
participation options during the LUCA Dress Rehearsal. However, during 
the LUCA Dress Rehearsal, the Bureau conducted training sessions that 
combined promotional and technical components of training because it 
did not have time to conduct the promotional workshop prior to the LUCA 
Dress Rehearsal. Consequently, the Bureau was not able to obtain 
feedback from Dress Rehearsal participants about separating classroom 
training before the 2010 LUCA Program. Nevertheless, respondents to our 
survey generally found the LUCA Dress Rehearsal training 
session useful (see fig. 9). 

Figure 9: LUCA Dress Rehearsal Participants' Reports of the Usefulness 
of the Training Session: 

[See PDF for image] 

Source: GAO Web-based survey of LUCA Dress Rehearsal participants. 

[End of figure] 

The Bureau plans to further improve the 2010 LUCA Program by offering 
CBT modules to program participants. Though participants were not 
provided with CBT in the LUCA Dress Rehearsal, our work has found that 
this method of training is viewed by participants as helpful. 
Specifically, respondents to our survey ranked CBT higher than 
classroom training, in terms of being "extremely" or "very" useful. 
Additionally, local officials told us that CBT was more convenient for 
them because they need not leave their offices or adjust their 
schedules to learn how the LUCA Program works. However, the Bureau's 
plans for testing the LUCA CBT include only one user. Properly executed 
user-based methods of software testing can provide the truest estimate 
of the extent to which real users can employ an application 
effectively.[Footnote 26] The contractor responsible for creating the 
LUCA CBT was to have provided preliminary versions of the CBT to the 
Bureau for testing beginning in May 2007--7 months after the end of the 
LUCA Dress Rehearsal review and 3 months before participants begin 
reviewing and updating address lists and maps for the 2010 LUCA 
Program. This timing did not allow the Bureau to test the CBT under 
census-like conditions, and will leave little time to make any changes 
before the CBT is distributed to LUCA participants. 

Bureau Has Not Collected Information Needed to Fully Assess LUCA Costs, 
Benefits, and Contributions: 

A 2002 study by a Bureau contractor recommended that the Bureau 
evaluate the cost and benefits of its LUCA-related activities. An NRC 
study of the LUCA Program recommended that the Bureau quantify the 
value of the program in both housing and population terms. The study 
indicated that quantifying the value of the LUCA Program is useful to 
show that the cost for the effort is worthwhile and persuade local 
officials that it is worth their time and resources to become involved 
in the LUCA Program[Footnote 27] (for example, by showing how LUCA 
contributes to a more accurate count of their communities' 
populations). 

The Bureau said that it would mark and evaluate contributions (such as 
added, corrected, or deleted addresses) of the LUCA Program to the MAF 
database. The Bureau has not finalized its evaluation plans regarding 
the 2010 LUCA Program; these plans would include decisions about 
whether to conduct assessments of the program's contribution to the 
census population count. The Bureau also stated that measuring whether 
the LUCA Program is cost beneficial "has not been a priority" for the 
agency, given that the program is legally mandated. In addition, Bureau 
officials stated that they will not budget the LUCA Program separately 
until fiscal year 2008. They noted that the LUCA Program budget is 
currently combined with those of other geographic programs in the 
Decennial Management Division budget. 

Our work in the area of managing for results has found that federal 
agencies can use performance information, such as that described above, 
to make various types of management decisions to improve programs and 
results. For example, performance information can be used to identify 
problems in existing programs, identify the causes of problems, develop 
corrective actions, develop strategies, plan and budget, identify 
priorities, and make resource allocation decisions to affect programs 
in the future. Finally, managers can use performance information to 
identify more effective approaches to program implementation and share 
those approaches more widely across the agency.[Footnote 28] 

One aspect of assessing the LUCA Program is determining the extent to 
which localities assess Bureau-provided counts, addresses, and maps. 
However, LUCA Program participation rates are currently difficult to 
measure because the Bureau does not have a method of tracking 
localities that agreed to participate in the program but did not submit 
updates to the Bureau because they found no needed changes to Bureau- 
provided information. Officials from the Bureau measure LUCA Program 
participation by whether localities agree to participate in the 
program, regardless of whether they actually take the time to review 
the materials the Bureau provides them. Inventory forms used by 
localities to inform the Bureau of updated LUCA materials do not 
include an option for localities to indicate whether they reviewed the 
materials and chose not to provide updates or had not identified any 
needed changes. This information would allow the Bureau to distinguish 
between localities that agreed to participate but never reviewed the 
materials and localities that reviewed the materials but found no 
changes to submit. The Bureau would then have an estimate of the number 
of localities that found the Bureau's data to be accurate. Without more 
precise information on localities that do not 
provide information, the Bureau cannot fully track localities that 
actually reviewed materials during participation in the LUCA Program, 
and therefore cannot ascertain the actual participation rates. More 
important, without this information, the Bureau cannot fully measure 
the extent to which local reviews have contributed to accurate address 
lists and population counts. 
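
As a simple illustration of this measurement gap, the hypothetical 
figures in the sketch below show how the participation rate the Bureau 
can compute today (based on agreement to participate alone) could 
differ from a rate that reflects completed reviews; every count is an 
assumed value for illustration only, not Bureau data. 

# Hypothetical illustration of the participation-rate measurement gap;
# every count below is an assumed value, not Bureau data.
eligible = 40_000              # entities invited to participate
registered = 24_000            # entities that agreed to participate
submitted_updates = 15_000     # entities that returned address or map updates
reviewed_no_changes = 5_000    # entities that reviewed materials but found no
                               # changes (the category not tracked today)

# Rate the Bureau can compute today: agreement to participate only.
registration_rate = registered / eligible

# Rate that reflects completed reviews, which is possible only if localities
# can report that they reviewed the materials but had no changes.
review_rate = (submitted_updates + reviewed_no_changes) / eligible

print(f"Registration rate:     {registration_rate:.0%}")  # 60%
print(f"Completed-review rate: {review_rate:.0%}")        # 50%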

Bureau Has Proposed but Not Finalized Steps to Address Issues in 
Hurricane-Affected Areas: 

Hurricane Katrina made landfall in Mississippi and Louisiana on August 
29, 2005, and caused $96 billion in property damage--more than any 
other single natural disaster in the history of the United States. On 
September 24, 2005, Hurricane Rita followed when it made landfall in 
Texas and Louisiana and added to the devastation. Still today, the 
storms' impact is visible throughout the Gulf Coast region. Hurricane 
Katrina alone destroyed or made uninhabitable an estimated 300,000 
homes. In New Orleans, the hurricanes damaged an estimated 123,000 
housing units. The 2010 LUCA Program faces challenges caused by the 
continuous changes in the housing stock in areas affected by storm 
damage or population influxes, which may hinder the ability of local 
governments to accurately update their address lists and maps. Further, 
the condition of the housing stock is likely to present additional 
challenges for address canvassing and other decennial census operations 
in the form of decreased productivity for Bureau staff, issues 
associated with identifying vacant and uninhabitable structures, and 
workforce shortages. Early in 2006, based on our prior recommendations, 
the Bureau chartered a team to assess the impact of the storm damage on 
its address list and maps for the area. This team (working with other 
officials from Bureau headquarters and the Dallas Regional Office) 
proposed several changes to the 2010 LUCA Program and address 
canvassing in the Gulf Coast region. Officials in the Bureau 
headquarters and Dallas Regional Office have implemented several of 
these changes. 

Many officials of local governments we visited in hurricane-affected 
areas said they have identified numerous housing units that have been 
or will be demolished as a result of hurricanes Katrina and Rita and 
subsequent deterioration. Conversely, many local governments estimate 
that there is new development of housing units in their respective 
jurisdictions. The officials we interviewed from localities in the Gulf 
Coast region indicated that such changes in the housing stock of their 
jurisdictions are unlikely to subside before local governments begin 
updating and reviewing materials for the Bureau's 2010 LUCA Program--in 
August 2007.[Footnote 29] Local government officials told us that 
changes in housing unit stock are often caused by difficulties that 
families have in deciding whether to return to hurricane-affected 
areas. Local officials informed us that a family's decision to return 
is affected by various factors, such as the availability of insurance; 
timing of funding from Louisiana's Road Home Program;[Footnote 30] lack 
of availability of contractors; school systems that are closed; and 
lack of amenities, such as grocery stores.[Footnote 31] As a result of 
the still-changing housing unit stock, local governments in hurricane- 
affected areas may be unable to fully capture reliable information 
about their address lists before the beginning of the LUCA Program this 
year or address canvassing in April 2009. Furthermore, the operations 
of local governments themselves have been affected by the hurricanes 
(see 
fig. 10). These local governments are focused on reconstruction, and 
officials we spoke with in two localities questioned their ability to 
participate in the LUCA Program. 

Figure 10: City Hall in Pass Christian, Mississippi, Destroyed by 
Hurricane Katrina (Below Left), and City Officials in Slidell, 
Louisiana, Forced to Operate Out of Trailers since the Hurricane (Below 
Right): 

[See PDF for image] 

Source: GAO (January 2007). 

[End of figure] 

The mixed condition of the housing stock in the Gulf Coast region could 
cause a decrease in productivity rates during address canvassing. 
During our fieldwork, we found that hurricane-affected areas have many 
neighborhoods with abandoned and vacant properties mixed in with 
occupied housing units. Bureau staff conducting address canvassing in 
these areas may have decreased productivity because of the additional 
time necessary to distinguish between abandoned, vacant, and occupied 
housing units. We also observed many areas where lots included a 
permanent structure with undetermined occupancy, as well as a trailer. 
Bureau field staff may be presented with the challenge of determining 
whether a residence or a trailer (see fig. 11), or both, are occupied. 
Another potential issue is that because of continuing changes in the 
condition of the housing stock, housing units that are deemed vacant or 
abandoned during address canvassing may be occupied on Census Day 
(April 1, 2010). 

Figure 11: Trailers in Front of Damaged Housing Units in New Orleans, 
Louisiana: 

[See PDF for image] 

Source: GAO (January 2007). 

[End of figure] 

Workforce shortages may also pose significant problems for the Bureau's 
hiring efforts for address canvassing. The effects of hurricanes 
Katrina and Rita caused a major shift in population away from the 
hurricane-affected areas. This migration displaced many low-wage 
workers. Should this continue, it could affect the availability of such 
workers for address canvassing and other decennial census operations. 

In 2006, we recommended that the Bureau develop plans (prior to the 
start of the 2010 LUCA Program in August 2007) to assess whether new 
procedures, additional resources, or local partnerships may be required 
to update the MAF/TIGER database in the areas affected by hurricanes 
Katrina and Rita.[Footnote 32] The Bureau responded to our 
recommendations by chartering a team to assess the impact of the storm 
damage on the Bureau's address lists and maps for areas along the Gulf 
Coast and develop strategies with the potential to mitigate these 
impacts. The chartered team recommended that the Bureau consult with 
state and regional officials (from the Gulf Coast region) on how to 
make the LUCA Program as successful as possible and hold special LUCA 
workshops for geographic areas identified by the Bureau as needing 
additional assistance. 

In addition to the recommendations made by the Bureau's chartered team, 
officials from Bureau headquarters and the Dallas Regional Office 
proposed steps to address LUCA-related issues in hurricane-affected 
areas. For example, they proposed that the Bureau provide LUCA training 
in several areas of Louisiana and Mississippi during promotional 
workshops for the LUCA Program. Finally, Bureau documentation indicated 
that the Bureau is considering an "Update/Enumerate" operation to 
enumerate addresses in the most severely devastated parishes and 
counties in hurricane-affected areas.[Footnote 33] 

The Bureau has implemented several of the proposed changes, cited 
above, to the 2010 LUCA Program in the Gulf Coast region based on 
recommendations from its chartered team, other Bureau headquarters 
officials, and regional office officials. For example, the Bureau 
conducted conference calls with the states of Louisiana and Mississippi 
(in October and December 2006, respectively) to discuss the LUCA 
Program, and had the Dallas and Atlanta regional offices hold 
additional promotional workshops in hurricane-impacted areas. In 
addition, Bureau officials have stated that the regional offices will 
also encourage participants in these areas to sign up for LUCA as early 
as possible so that if they need more than 120 days for conducting 
their LUCA review, they can request an extension from the Bureau. 

In addition to the changes in the 2010 LUCA Program, the Bureau has 
considered changes to the address canvassing and subsequent operations 
in the Gulf Coast region. For example, Bureau officials stated that 
they recognize issues with identifying uninhabitable structures in 
hurricane-affected zones and that, as a result, they may need to change 
procedures for address canvassing. The Bureau is still brainstorming 
ideas, including the possibility of using an "Update/Enumerate" 
operation in areas along the Gulf Coast. Bureau officials also said 
that they may adjust training for Bureau staff conducting address 
canvassing in hurricane-affected areas to help field staff distinguish 
between abandoned, vacant, and occupied housing units. Without proper 
training, field staff can make errors and will not operate as 
efficiently.[Footnote 34] The Bureau's plans for how it may adjust 
address canvassing operations in the Gulf Coast region can also have 
implications for subsequent operations. For example, instructing field 
staff to be as inclusive as possible in completing address canvassing 
could increase the effort needed to follow up on nonrespondents because 
the Bureau could send questionnaires to housing units that may be 
vacant on Census Day. In terms of the Bureau's workforce in the Gulf 
Coast region, officials from the Bureau's Dallas Regional Office 
recognize the potential difficulty of attracting field staff, and have 
recommended that the Bureau be prepared to pay hourly wage rates for 
future decennial staff that are considerably higher than usual. 
Further, Bureau officials noted that the Bureau's Dallas Regional 
Office, which has jurisdiction over hurricane-affected areas in 
Louisiana, Mississippi, and Texas, will examine local unemployment 
rates to adjust pay rates in the region and use "every single entity" 
available to advertise for workers in the New Orleans area. However, 
Bureau officials stated that there are "no concrete plans" to implement 
changes to address canvassing or subsequent decennial operations in the 
Gulf Coast region. For instance, Bureau documentation revealed that the 
Bureau has not yet decided whether to implement "Update/Enumerate" 
operations in areas along the Gulf Coast. 

Conclusions: 

The Bureau has met the time frames for the LUCA Dress Rehearsal and the 
distribution of advance letters for the 2010 LUCA Program. The Bureau 
has also taken a number of steps to improve the LUCA Program, including 
providing a longer review period for program participants, providing 
localities options for program participation, combining the collection 
of addresses from two separate operations into one integrated and 
sequential operation, creating MTPS for participant use in the program, 
and modifying LUCA training. 

However, there is more the Bureau can do to address information 
technology-based challenges to the LUCA Program prior to the 2010 
Census and beyond. The Bureau performed little user testing of MTPS 
and no user testing of the CBT module for the 2010 LUCA Program, but 
it can still do more to assess the usability of both. For example, 
the Bureau could test the MTPS and LUCA CBT software with localities 
before participants begin reviewing and updating materials for the 
2010 LUCA Program in August 2007. Such tests would help the Bureau 
identify issues with the software, and the Bureau could then provide 
information on how localities can mitigate those issues via its 
public Web site and its LUCA technical help desk. Without these 
tests, localities participating in the 2010 LUCA Program may 
unnecessarily encounter issues with the software that testing would 
otherwise have identified. The Bureau can also provide additional 
information, via its public Web site, its LUCA technical help desk, 
and other means, on converting Bureau address files from the Bureau's 
format to the formats used by the software applications of LUCA 
Program participants, in order to mitigate the file conversion 
difficulties previously identified by LUCA Dress Rehearsal 
participants. Without such guidance, localities that choose not to 
use MTPS may have difficulty with the file conversion process, 
creating additional and unnecessary burdens. 
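
To illustrate the kind of guidance that could reduce this burden, the 
sketch below shows one way a locality might convert a pipe-delimited 
address file (one of the delivery formats described in footnote 23) 
into a comma-separated file that common spreadsheet or database 
software (see footnote 24) can open. This is a minimal sketch in 
Python; the file names and the assumption that fields need only 
whitespace trimming are hypothetical, and any actual conversion would 
follow the record layout and instructions the Bureau supplies. 

    import csv

    # Minimal sketch, assuming a pipe-delimited input file; the file
    # names and record layout here are hypothetical illustrations.
    INPUT_FILE = "bureau_address_file.txt"   # pipe ("|") delimited records
    OUTPUT_FILE = "address_list.csv"         # comma-separated output

    with open(INPUT_FILE, newline="", encoding="utf-8") as src, \
         open(OUTPUT_FILE, "w", newline="", encoding="utf-8") as dst:
        reader = csv.reader(src, delimiter="|")  # parse pipe-delimited rows
        writer = csv.writer(dst)                 # write each row back as CSV
        for row in reader:
            # Trim stray whitespace from each field before writing.
            writer.writerow(field.strip() for field in row)

The same approach would work for a tab-delimited file by setting 
delimiter="\t" in the reader. 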

NRC, in its assessment of the LUCA Program, concluded that 
quantifying the value of the program is worthwhile, citing, for 
example, its use in persuading local officials of the value of 
participating. NRC suggests that an evaluation of the LUCA Program 
consider its contributions not only to address counts but also to 
population counts. We agree that the Bureau can use such information 
to measure the LUCA Program's contribution to the decennial census. 
In addition, the Bureau is limited in its ability to fully assess the 
impact of the program because it does not collect information on why 
localities that agreed to participate do not provide updated 
information. Without these data, the Bureau cannot determine whether 
nonresponding localities never assessed the Bureau's information or 
assessed it but found no changes to submit, and it may therefore be 
hampered in its ability to estimate the impact of the LUCA Program on 
the MAF database and the census population count. 

Bureau efforts to consult with state officials and consider changes in 
decennial census operations, including LUCA, in hurricane-affected 
areas along the Gulf Coast have helped the Bureau better understand 
issues associated with implementing these operations in the Gulf Coast 
region. However, the Bureau can do more to successfully implement 
address canvassing and other decennial census operations in the Gulf 
Coast. For example, Bureau efforts to address issues associated with 
address canvassing, such as adjusting wage rates for future decennial 
staff, may help the Bureau fulfill staffing requirements for the 
address canvassing operation (which is scheduled to take place in 2009) 
and other decennial census operations. Because the changing housing 
stock may 
affect the Bureau's ability to effectively conduct address canvassing 
and other operations in the Gulf Coast region, it is important for the 
Bureau to complete its planning for addressing the challenges that 
field staff would likely face. 

Recommendations for Executive Action: 

In order for the Bureau to address the remaining challenges facing its 
implementation of the 2010 LUCA Program, we recommend that the 
Secretary of Commerce direct the Bureau to take the following five 
actions: 

* Assess potential usability issues with the LUCA Program's CBT and 
MTPS by randomly selecting localities in which to test the software 
packages or by providing alternative means to assess such issues before 
participants begin reviewing and updating materials for the 2010 LUCA 
Program in August 2007, and provide information on how localities can 
mitigate issues identified in such assessments via its public Web site 
and its LUCA technical help desk. 

* Provide localities not using MTPS, via its public Web site, its LUCA 
technical help desk, and other appropriate means, instructions on 
converting files from the Bureau's format to the appropriate format for 
software most commonly used by participating localities to update 
address information. 

* Assess the contribution of the LUCA Program to the final census 
population counts, as recommended by NRC (to permit an evaluation of 
the 2010 LUCA Program in preparation for 2020). 

* Establish a process for localities that agreed to participate in the 
LUCA Program but found no changes in their review to explicitly 
communicate to the Bureau that they have no changes. 

* Develop strategy, plans and milestones for operations in areas in the 
Gulf Coast that address the challenges field staff are likely to 
encounter in conducting address canvassing and subsequent decennial 
operations in communities affected by the hurricanes. 

Agency Comments and Our Evaluation: 

In written comments on a draft of this report, the Bureau generally 
agreed with our recommendations for the Bureau to assess usability 
issues with MTPS and CBT; provide localities not using MTPS with 
instructions on file conversion; assess the contribution of LUCA to the 
final census population counts; establish a process for localities to 
indicate that they participated in LUCA but found no changes; and 
develop strategy, plans, and milestones for operations in the Gulf 
Coast that address the challenges that field staff are likely to face. 
The Bureau also agreed with the draft report's recommendation that the 
Bureau finalize its plans for conducting the LUCA Program in the areas 
affected by the hurricanes, noting that its plans were now final. We 
therefore deleted this recommendation. The Bureau also provided some 
technical comments and suggestions where additional context might be 
needed, and we revised the report to reflect these comments as 
appropriate. The Bureau's comments are reprinted in their entirety in 
appendix II. 

We are sending copies of this report to interested congressional 
committees and members, the Secretary of Commerce, and the Director of 
the U.S. Census Bureau. Copies will be made available to others on 
request. This report will also be available at no charge on GAO's Web 
site at http://www.gao.gov. 

If you or your staff have any questions about this report, please 
contact me at (202) 512-6806 or sciremj@gao.gov. Contact points for our 
Offices of Congressional Relations and Public Affairs may be found on 
the last page of this report. GAO staff who made major contributions to 
this report are listed in appendix IV. 

Signed by: 

Mathew J. Scirè: 
Director, Strategic Issues: 

[End of section] 

Appendix I: Scope and Methodology: 

To assess the current status of the U.S. Census Bureau's (Bureau) Local 
Update of Census Addresses (LUCA) Program, we requested and obtained 
source documents from the Bureau's headquarters in Suitland, Maryland, 
and the Bureau's Web site regarding the updated timelines of the 2010 
LUCA Program and the LUCA Dress Rehearsal. We also visited the Bureau's 
regional office in Charlotte, North Carolina; conducted a phone 
interview with the Bureau's regional office in Seattle, Washington; and 
obtained documents, including the Bureau's timeline for headquarters 
and regional office activities associated with the 2010 Census LUCA 
Program. Additionally, we analyzed the data to determine if the 
Bureau's actual timelines met the planned timelines for the LUCA Dress 
Rehearsal and the 2010 LUCA Program. 

To determine the extent to which activities associated with the 2010 
LUCA Program and the LUCA Dress Rehearsal (held June through October 
2006) met their timelines, we interviewed officials from Bureau 
headquarters in Suitland, Maryland. We also visited and 
obtained documentation from localities associated with the LUCA Dress 
Rehearsal in California and North Carolina. 

To assess how the Bureau is addressing prior issues and new challenges 
associated with implementing the LUCA Program, we performed a review of 
publications created by GAO and other entities (i.e., the National 
Research Council, the Department of Commerce's Office of the Inspector 
General, and Anteon Corporation) regarding the LUCA Program to 
ascertain critiques of the program and recommendations for improving 
the program for the 2010 Census. We also obtained source documents and 
interviewed officials from the Bureau's headquarters in Suitland, 
Maryland, to determine how the Bureau addressed prior issues and new 
challenges related to the LUCA Program and what modifications the 
Bureau has made to the 2010 LUCA Program. To determine how the 2010 
LUCA Program is being implemented, we undertook fieldwork in 12 
localities (in California and North Carolina) that were eligible to 
participate in the LUCA Dress Rehearsal, which was held from June 
through October 2006. The 12 localities were selected because they were 
geographically diverse and varied in population. During our visits to 
the localities, we interviewed and obtained documentation from local 
government officials to determine how the Bureau implemented the LUCA 
Dress Rehearsal and addressed prior issues and new challenges related 
to the LUCA Program. We also conducted interviews and collected 
documentation from the Bureau's regional offices in Charlotte, North 
Carolina, (in person) and Seattle, Washington, (via telephone) to 
determine the Bureau's implementation of the LUCA Dress Rehearsal from 
the perspective of Bureau officials responsible for the LUCA Dress 
Rehearsal sites. 

To obtain further information on the experiences of participants with 
LUCA Dress Rehearsal activities, we administered a World Wide Web 
questionnaire accessible through a secure server to 42 local 
governments participating in the LUCA Dress Rehearsal. We collected 
data on participants' experiences with the review process, the census 
maps and addresses, work materials, and interactions with the Bureau 
and other agencies. 

Because this was not a sample survey, it has no sampling errors. 
However, the practical difficulties of conducting a survey may 
introduce errors, commonly referred to as nonsampling errors. For 
example, difficulties in interpreting a particular question, or sources 
of information available to respondents, can introduce unwanted 
variability into the survey results. We took steps in developing the 
questionnaire, collecting the data, and analyzing them to minimize such 
nonsampling errors. 

For example, the survey was tested with two LUCA Dress Rehearsal 
participants to check that the questions were clear and unambiguous, 
that the information could be obtained by the respondents, and that 
the questionnaire did not place an undue burden on the respondents. 
Once the questionnaire was finalized, each of the 42 local 
governments was notified that the questionnaire was available online 
and was provided with a unique password and user name. Respondents 
entered their answers directly into the electronic questionnaire, 
eliminating the need to key data into a database. When we analyzed 
the data, an independent analyst checked all computer programs. 

We included in our study population those local governments in 
California and North Carolina that participated in the LUCA Dress 
Rehearsal. We defined participants as those local governments that had 
signed up to participate and had not later indicated that they in fact 
did not participate in the LUCA Dress Rehearsal. The Bureau identified 
44 state, county, and municipal governments that met our definition of 
participating in the LUCA Dress Rehearsal. Questionnaires were sent to 
42 local governments[Footnote 35] and were completed by 31 such 
governments, for a response rate of 74 percent. There were a total of 
62 localities eligible to participate in the LUCA Dress Rehearsal. In 
addition to our survey, we also performed structured interviews (in 
person and via telephone) with officials in 7 localities that were 
eligible to participate in the LUCA Dress Rehearsal but did not take 
part in the program. 

To assess how the Bureau is addressing the challenges in areas affected 
by hurricanes Katrina and Rita that may affect the Bureau's successful 
implementation of the 2010 LUCA Program, we undertook fieldwork in 
eight localities situated in portions of the Gulf Coast region 
(Louisiana, Mississippi, and Texas) affected by hurricanes Katrina and 
Rita. We selected these localities because they varied in size and 
location in the Gulf Coast region. During the fieldwork, we obtained 
documentation and interviewed officials from each locality about what 
challenges, if any, the hurricane damage poses to the locality's 
successful participation in the 2010 LUCA Program. 

We obtained source documents and interviewed officials from Bureau 
headquarters in Suitland, Maryland (in person), and the Bureau regional 
office in Dallas, Texas (via telephone), about how the Bureau is 
addressing the challenges described above that are faced by eligible 
participants in the 2010 LUCA Program in the areas affected by 
hurricanes Katrina and Rita. We also obtained information, from the 
sources mentioned above, on the extent to which the Bureau has 
addressed prior GAO recommendations regarding performing decennial 
census operations in hurricane-affected areas. 

We conducted our work from July 2006 through May 2007 in accordance 
with generally accepted government auditing standards. 

[End of section] 

Appendix II: Comments from the Department of Commerce: 

The Secretary Of Commerce:
Washington, D.C. 20230: 

May 31, 2007: 

Mr. Mathew J. Scire: 
Director: 
Strategic Issues: 
U.S. Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Scire: 

The U.S. Department of Commerce appreciates the opportunity to comment 
on the United States Government Accountability Office's draft report 
entitled 2010 Census: Census Bureau Has Improved the Local Update of 
Census Addresses Program, but Challenges Remain (GAO-07-736). On behalf 
of the Department of Commerce, I enclose Census's programmatic comments 
on the draft report. 

Sincerely, 

Signed by: 

Carlos M. Gutierrez: 

Enclosure: 

Draft Report No. - GAO-07-736 - May 2007 United States Government 
Accountability Office Draft Report Entitled 2010 Census: Census Bureau 
Has Improved the Local Update of Census Addresses Program, but 
Challenges Remain U.S. Census Bureau Comments: 

The U.S. Census Bureau generally agrees with the recommendations in 
this report, but has some concerns and comments about various 
statements and conclusions. 

Our specific comments and concerns about the report are as follows: 

Page 9: 

The report includes the following statement: "A complete and accurate 
address list is the cornerstone of a successful census, because it both 
identifies all households that are to receive a census questionnaire 
and serves as the control mechanism for following up with households 
that do not respond." 

Census Bureau Response: The address list identifies living quarters not 
households. A household is considered a group of people living 
together. Our purpose is to identify any site where people live, stay, 
or could live. 

Page 11: 

The report includes the following statement: "Subsequently, for 2010, 
the Bureau has invited approximately 40,000 entities to participate in 
the LUCA Program and has set a participation goal of 60 percent." 

Census Bureau Response: To be precise, the Census Bureau has not yet 
invited any governments to participate in LUCA-this does not happen 
until July 2007. 

Page 16: 

The discussion regarding steps the Census Bureau has taken to address 
problems with the Census 2000 LUCA omits an important one-the expansion 
of direct LUCA participation eligibility to state governments. Allowing 
states to participate directly can fill the gap when local governments 
do not participate due to lack of resources or technical challenges. 
Additionally, we believe including the states as potential participants 
seems to have increased their awareness and promotional activities even 
if they ultimately choose not to participate. 

Another omitted improvement is our allowing participants who sign up to 
participate by October 31, 2007, to provide their boundary updates with 
their LUCA updates and thereby avoid having to separately respond to 
the 2008 Boundary and Annexation Survey. This will lower costs and 
increase efficiency. 

Page 20: 

The report displays the chart: "Available Options for Participation in 
2010 LUCA Program (Figure 6). " 

Census Bureau Response: There is an error in Option 3. Participants are 
allowed to use the MTPS to provide map updates to the Census Bureau. 

Pages 32-33: 

In the section discussing the impact of the hurricanes, the report 
states: "The mixed condition of the housing stock in the Gulf Coast 
region will increase the address canvassing workload." 

Census Bureau Response: For clarity, the Census Bureau notes that this 
will not increase the overall workload because Address Canvassing 
already has to visit every address/structure. What will be affected is 
the productivity rate. 

Regarding the recommendations that begin on page 40: 

GAO Recommendation 1: "Assess potential usability issues with the LUCA 
Program's CBT and MTPS by randomly selecting localities in which to 
test the software packages or by providing alternative means to assess 
such issues before participants begin reviewing and updating materials 
for the 2010 LUCA Program in August 2007, and provide information on 
how localities can mitigate issues identified in such assessments via 
its public Web site and its LUCA technical help desk." 

Census Bureau Response 1: The Census Bureau concurs with this 
recommendation. Regarding the LUCA CBT, the Census Bureau has contacted 
the local area's regional planning commission to request its assistance 
in identifying nearby local government volunteers to test pilot a 
preliminary version prior to finalization. We are using the Help Desk 
contractors-who are new to the LUCA Program and the MTPS-to review the 
software, help improve the user instructions for the MTPS, and develop 
Frequently Asked Questions to Help Desk callers and the public via the 
Census Bureau website. 

GAO Recommendation 2: "Provide localities not using MTPS, via its 
public Web site, its LUCA technical help desk, and other appropriate 
means, instructions on converting files from the Bureau's format to the 
appropriate format for software most commonly used by participating 
localities to update address information." 

Census Bureau Response 2: The Census Bureau concurs with this 
recommendation. We, along with our Help Desk contractors, will develop 
(and continuously improve) supplemental instructions on file formatting 
and shape file use for non-MTPS users. We plan to disseminate them to 
the public via the Census Bureau website and to Help Desk callers. 

GAO Recommendation 3: "Assess the contribution of the LUCA Program to 
the final census population counts, as recommended by NRC (to permit an 
evaluation of the 2010 LUCA Program in preparation for 2020)." 

Census Bureau Response 3: The Census Bureau agrees this would be 
useful, and will try to provide this estimate given the following 
limitations. It would be extremely problematic to assess the LUCA 
contribution to the final census count of persons. Since many 
operations contribute to the creation of the final address list, we 
would not be able to identify what effect other operations would have 
had in the absence of a LUCA Program, thereby hampering any effort to 
estimate the contribution of LUCA to either the housing unit or 
population count. 

GAO Recommendation 4: "Establish a process for localities that agreed 
to participate in the LUCA Program but found no changes in their review 
to explicitly communicate to the Bureau that they have no changes." 

Census Bureau Response 4: The Census Bureau agrees with this 
recommendation and will try to identify and analyze methods for 
gathering this information from entities that sign up to participate in 
LUCA but return no address list changes. 

GAO Recommendation 5: "Prior to August 2007, when localities begin to 
review and update Bureau provided materials, finalize plans for 
conducting the 2010 LUCA Program in the areas affected by hurricanes 
Katrina and Rita, and establish milestones for implementing these 
plans." 

Census Bureau Response 5: The Census Bureau concurs with this 
recommendation. The plans for conducting the 2010 LUCA Program in the 
areas impacted by the Gulf Coast Hurricanes are final. The 
recommendations of the Katrina/Rita working group with respect to the 
2010 LUCA Program have been implemented. We have taken steps in the 
impacted areas to solicit feedback on LUCA implementation, and have 
conducted special promotional workshops. Technical workshops in the 
impacted areas are planned for the fall 2007, as they are for all LUCA 
participants; no special technical training sessions are planned. The 
2010 LUCA implementation in the hurricane-impacted areas is on the same 
schedule as the 2010 LUCA implementation for the rest of the Nation. 

GAO Recommendation 6: "Develop strategy, plans and milestones for 
operations in areas in the gulf coast that address the challenges field 
staff are likely to encounter in conducting address canvassing and 
subsequent decennial operations in communities affected by the 
hurricanes." 

Census Bureau Response 6: The Census Bureau concurs with this 
recommendation. We are planning to discuss strategies for developing 
supplemental procedures for disaster-affected areas by late summer 
2007. It is our intention to have final procedures, English and Spanish 
versions, for 2010 Address Canvassing ready by late spring 2008. 

[End of section] 

Appendix III: Web-Based Survey of LUCA Dress Rehearsal Participants: 

Experiences with LUCA review. 

1. Between the time that the Census Bureau sent its invitation to take 
part in the LUCA dress rehearsal and your decision to participate, did 
the Bureau contact you to explain the importance of LUCA and encourage 
your participation? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Yes. (23). 
2. No (Click here to skip to Question 3.)  (0). 
3. Not sure (8). 

2. In what ways, if any, did the Census Bureau contact you? 

(Check all that apply.)
(Number of participants that selected that answer). 

1. Telephone  (11). 
2. E-mail  (9). 
3. Mail (22). 
4. In-person (7). 
5. Other (0). 
6. Not sure (0). 

3. Did the Census Bureau notify you about LUCA classroom training 
opportunities in your area? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Yes (27). 
2. No (Click here to skip to Question 5.) (1). 
3. Not sure (Click here to skip to Question 5.) (3). 

4. Did you participate in any LUCA classroom training provided by the 
Census Bureau? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Yes (24). 
2. No (3). 
3. Not sure (0). 

5. Did the Census Bureau contact you at any time after you agreed to 
participate in the LUCA dress rehearsal? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Yes (28). 
2. No (Click here to skip to Question 7) (0). 
3. Not sure (Click here to skip to Question 7.) (3). 

6. In what ways, if any, did the Census Bureau contact you after you 
agreed to participate in the LUCA dress rehearsal? 

(Check all that apply.) 
(Number of participants that selected that answer). 

1. Telephone (11). 
2. E-mail (11). 
3. Mail (23). 
4. In-person (4). 
5. Other (0). 
6. Not sure (1). 

7. Have you completed and submitted the LUCA dress rehearsal materials 
to the Census Bureau? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Yes (30). 
2. No (1). 
3. Not sure (0). 

8. Which LUCA dress rehearsal participation option did you choose? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Option 1 - Title 13 Full Address List Review in paper format (8). 
2. Option 1 - Title 13 Full Address List Review in computer readable 
format (14). 
3. Option 2 - Title 13 Local Address List Submission (5). 
4. Option 3 - Non-Title 13 Local Address List Submission (4). 

9. What are the reasons that you chose that participation option? 

10. How satisfied were you with the participation options that were 
offered by the Census Bureau? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Very satisfied (6). 
2. Generally satisfied (16). 
3. Neither satisfied nor dissatisfied (9). 
4. Generally dissatisfied (0). 
5. Very dissatisfied (0). 

11. What other options, if any, would you have preferred to have 
offered to you and why? 

12. How clear was guidance on the schedule for initiating and 
completing the LUCA dress rehearsal review? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Very clear (11). 
2. Clear (15). 
3. Neither clear nor unclear (3). 
4. Unclear (2). 
5. Very unclear (0). 
6. No basis to judge (0). 

Experiences with maps during your LUCA review. 

13. Did you do a full review or a partial review of the maps? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Full review (reviewed 100% of the maps) (Click here to skip to 
Question 15.) (21). 

2. Partial review (targeted or sample checked) (Click here to skip to 
Question 14.) (7). 

3. Neither (we are not reviewing the maps) (Click here to skip to 
Question 17.) (3). 

14. If you did a partial review of the maps, what did you review and 
how did you decide which maps to review? 

15. Did you request either shape files of the maps or paper maps from 
the Census Bureau? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Shape files (11). 
2. Paper (17). 
3. None (1). 

16. Was the workload for reviewing the maps more or less than you 
expected? 

(Check only one answer.) 

1. Much more than we expected (4). 

2. Somewhat more than we expected (11). 

3. Neither more nor less than we expected (13). 

4. Somewhat less than we expected (0). 

5. Much less than we expected (0). 

17. Which of the Census Bureau's Boundary and Annexation Surveys, if 
any, did your jurisdiction participate in over the last 3 years? 

(Check all that apply.) 

1. Participated in 2003 Boundary and Annexation Survey (11). 

2. Participated in 2004 Boundary and Annexation Survey (14). 

3. Participated in 2005 Boundary and Annexation Survey (16). 

4. Did not participate in any Boundary and Annexation Surveys between 
2003 and 2005 (Click here to skip to Question 19.) (12). 

18. Were map changes that your jurisdiction submitted between 2003 and 
2005 as part of a Boundary and Annexation Survey incorporated into the 
LUCA dress rehearsal maps? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. All or almost all submitted changes were reflected in the LUCA dress 
rehearsal maps (9). 

2. Some submitted changes were reflected in the LUCA dress rehearsal 
maps (6). 

3. Few or none of the submitted changes were reflected in the dress 
rehearsal maps (0). 

4. Don't know (4). 

5. Other (please specify in question below) (1). 

To what extent were your jurisdiction's Boundary and Annexation map 
changes incorporated into the LUCA dress rehearsal maps? 

(If you reach the end of the text box and need to type more, please 
continue; the box will scroll forward as needed.) 

Experiences with address lists during your LUCA review. 

19. Did your jurisdiction do a full or partial review of the address 
list and/or address count? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Full review (reviewed 100% of the addresses on the list and/or 
count) (Click here to skip to Question 21.) (21). 

2. Partial review (targeted or sample checked) (Click here to skip to 
Question 20.) (7). 

3. Neither (we are not reviewing the Bureau's address list and/or 
count) (Click here to skip to Question 25.) (3). 

20. If you conducted a partial review of the address list and/or 
address count, what factors contributed to your decision to conduct a 
partial review? 

21. In verifying the Census Bureau's address list, did you use a single 
or multiple sources of address data? 

(Check only one answer.) 

1. A single source of address data for all or almost all addresses in 
paper format (Click here to skip to Question 23) (9). 

2. A single source of address data for all or almost all addresses (6). 

3. Multiple sources of address data in electronic format (Click here to 
skip to Question 23.) (13). 

22. What format did you use most for the multiple sources of address 
data?  

(Check only one answer.) 
(Number of participants that selected that answer). 

1. All or almost all sources in electronic format (4). 

2. Majority of sources are in electronic format (7). 

3. An approximately equal mix of electronic and paper formats (2). 

4. Majority of sources are in paper format (4). 

5. All or almost all sources are in paper format (2). 

23. What sources did you use to obtain the address data for your LUCA 
review? 

24. Was the workload for reviewing the address list and/or count more 
or less than what you expected? 

(Check only one answer.) 
(Number of participants that selected that answer). 

1. Much more than we expected (10). 

2. More than we expected (6). 

3. Neither more nor less than we expected (12). 

4. Less than we expected (0). 

5. Much less than we expected (0). 

Work Materials and Relationships. 

25. How easy or difficult did you find the following to work with? 

(Check one for each row.) 
(Number of participants that selected that answer). 

25a. Address list; 
Very easy: (0; 
Easy: (7); 
Neither easy nor difficult: (12); 
Difficult: (9); 
Very difficult: (2); 
No basis to judge: (0). 

25b. Address count; 
Very easy: (1); 
Easy: (10); 
Neither easy nor difficult: (13); 
Difficult: (4); 
Very difficult: (1); 
No basis to judge: (1). 

25c. Maps; 
Very easy: (1); 
Easy: (5); 
Neither easy nor difficult: (16); 
Difficult: (6); 
Very difficult: (0); 
No basis to judge: (1). 

25d. Instructions on completing LUCA dress rehearsal submission; 
Very easy: (0); 
Easy: (7); 
Neither easy nor difficult: (14); 
Difficult: (7); 
Very difficult: (2); 
No basis to judge: (0). 

[End of table] 

26. To what extent, if any, did you experience problems with the 
following? 

(Check one for each row.) 

26a. Scheduling of LUCA training; 
Very great extent: (0); 
Great extent: (0); 
Moderate extent: (4); 
Small extent: (5); 
No extent: (16); 
No basis to judge: (5). 

26b. Accuracy and completeness of the addresses on the list; 
Very great extent: (1); 
Great extent: (3); 
Moderate extent: (7); 
Small extent: (11); 
No extent: (5); 
No basis to judge: (3). 

26c. Accuracy and completeness of the address count; 
Very great extent: (1); 
Great extent: (1); 
Moderate extent: (7); 
Small extent: (11); 
No extent: (5); 
No basis to judge: (5). 

26d. Accuracy and completeness of the maps; 
Very great extent: (1); 
Great extent: (2); 
Moderate extent: (10); 
Small extent: (6); 
No extent: (7); 
No basis to judge: (2). 

26e. Media on which the Census Bureau provided information; 
Very great extent: (3); 
Great extent: (1); 
Moderate extent: (5); 
Small extent: (9); 
No extent: (10); 
No basis to judge: (1). 

26f. Meeting Census Bureau requirements concerning the format and media 
for returning information; 
Very great extent: (4); 
Great extent: (5); 
Moderate extent: (8); 
Small extent: (5); 
No extent: (5); 
No basis to judge: (0). 

26g. File conversion to appropriate formats; 
Very great extent: (3); 
Great extent: (6); 
Moderate extent: (5); 
Small extent: (3); 
No extent: (7); 
No basis to judge: (6). 

26h. Other (specify in question below); 
Very great extent: (0); 
Great extent: (3); 
Moderate extent: (1); 
Small extent: (0); 
No extent: (4); 
No basis to judge: (6). 

[End of table] 

What other problems, if any, did you experience? 

(If you reach the end of the text box and need to type more, please 
continue; the box will scroll forward as needed.)  

27. How sufficient, if at all, are the following resources in your 
jurisdiction to conduct the LUCA dress rehearsal review? 

(Check one for each row.) 

27a. Human resources available; 
Very sufficient: (2); 
Sufficient: (12); 
Moderately sufficient: (3); 
Somewhat sufficient: (6); 
Not at all sufficient: (7); 
No basis to judge: (1). 

27b. Technical resources available; 
Very sufficient: (2); 
Sufficient: (16); 
Moderately sufficient: (2); 
Somewhat sufficient: (5); 
Not at all sufficient: (4); 
No basis to judge: (2). 

[End of table] 

28. To what extent, if any, did your staff doing the LUCA dress 
rehearsal review have the skills necessary for this type of work? 

(Check only one answer.) 

1. Very great extent (3). 

2. Great extent (12). 

3. Moderate extent (9). 

4. Small extent (4). 

5. Not at all (0). 

6. No basis to judge (3). 

29. How satisfied, if at all, were you with the following Census Bureau 
actions? 

(Check one for each row.) 

29a. Extent of training regarding address list and/or count review; 
Very satisfied: (0); 
Satisfied: (9); 
Neither satisfied nor dissatisfied: (14); 
Dissatisfied: (4); 
Very dissatisfied:(1);  
No basis to judge: (3). 

29b. Extent of training regarding map review; 
Very satisfied: (0); 
Satisfied: (10); 
Neither satisfied nor dissatisfied: (14); 
Dissatisfied: (1); 
Very dissatisfied: (1); 
No basis to judge: (5). 

29c. Extent of training regarding the procedures used for submissions 
to the Bureau; 
Very satisfied: (0); 
Satisfied: (8);  
Neither satisfied nor dissatisfied: (15); 
Dissatisfied: (4); 
Very dissatisfied: (1); 
No basis to judge: (2). 

29d. Extent of Census Bureau assistance; 
Very satisfied: (3); 
Satisfied: (15); 
Neither satisfied nor dissatisfied: (10); 
Dissatisfied: (0); 
Very dissatisfied: (2); 
No basis to judge: (1). 

29e. Timeliness of the Census Bureau's response to your questions; 
Very satisfied: (6); 
Satisfied: (14); 
Neither satisfied nor dissatisfied: (7); 
Dissatisfied: (1); 
Very dissatisfied: (0); 
No basis to judge: (3). 

29f. Adequacy of responses provided by the Census Bureau to any 
questions you asked; 
Very satisfied: (7); 
Satisfied: (11); 
Neither satisfied nor dissatisfied: (8); 
Dissatisfied: (2); 
Very dissatisfied: (0); 
No basis to judge: (3). 

[End of table] 

30. Considering your experience completing the LUCA dress rehearsal, 
how helpful to you would the following types of training activities 
have been before you began your LUCA review? 

(Check one for each row.) 

30a. Classroom training at a regional site; 
Extremely helpful: (3); 
Very helpful: (9); 
Moderately helpful: (12); 
Slightly helpful: (2); 
Not at all helpful: (1); 
No basis to judge: (2). 

30b. Interactive computer-based training provided on CD-ROM or DVD; 
Extremely helpful: (5); 
Very helpful: (11); 
Moderately helpful: (3); 
Slightly helpful: (5); 
Not at all helpful: (2); 
No basis to judge: (4). 

30c. Interactive internet training; 
Extremely helpful: (2); 
Very helpful: (11); 
Moderately helpful: (7); 
Slightly helpful: (3); 
Not at all helpful: (2); 
No basis to judge: (5). 

30d. Self instruction using Census Bureau training guides; 
Extremely helpful: (1); 
Very helpful: (9); 
Moderately helpful: (9); 
Slightly helpful: (8); 
Not at all helpful: (2); 
No basis to judge: (2). 

[End of table] 

31. How helpful to you would the following types of training activities 
have been during your LUCA review? 

(Check one for each row.) 

31a. Classroom training at a regional site; 
Extremely helpful: (4); 
Very helpful: (11); 
Moderately helpful: (5); 
Slightly helpful: (5); 
Not at all helpful: (2); 
No basis to judge: ( 2). 

31b. Interactive computer-based training provided on CD-ROM or DVD; 
Extremely helpful: (5); 
Very helpful: (12); 
Moderately helpful: (3); 
Slightly helpful: (3); 
Not at all helpful: (2); 
No basis to judge: (3). 

31c. Interactive internet training; 
Extremely helpful: (5);
Very helpful: (13); 
Moderately helpful: (4); 
Slightly helpful: (2); 
Not at all helpful: (2); 
No basis to judge: (3). 

31d. Self instruction using Census Bureau training guides; 
Extremely helpful: (2); 
Very helpful: (7); 
Moderately helpful: (8); 
Slightly helpful: (9); 
Not at all helpful: (3); 
No basis to judge: (1). 

32. How helpful would guidance specific to your office's software have 
been in completing your LUCA review?

(Check only one answer.) 

1. Extremely helpful (11). 

2. Very helpful (8). 

3. Moderately helpful (1). 

4. Slightly helpful (3). 

5. Not at all helpful (1). 

6. No basis to judge (7). 

33. Did your state data center assist you in completing the LUCA dress 
rehearsal review? 

(Check only one answer.) 

1. Yes (3). 

2. No (24). 

3. Don't Know (3). 

34. How useful have the following sources of assistance been in doing 
your review and update of the address list and/or count and maps? 

(Check one for each row.) 

34a. LUCA dress rehearsal training session; 
Extremely useful: (0); 
Very useful: (9); 
Moderately useful: (11); 
Slightly useful: (4); 
Not at all useful: (0); 
No basis to judge: (7). 

34b. LUCA dress rehearsal reference manuals; 
Extremely useful: (0); 
Very useful: (12); 
Moderately useful: (11); 
Slightly useful: (7); 
Not at all useful: (0); 
No basis to judge: (1). 

34c. State data center; 
Extremely useful: (0); 
Very useful: (0); 
Moderately useful: (1); 
Slightly useful: (4); 
Not at all useful: (2); 
No basis to judge: (22). 

34d. Other government entities, such as regional partnerships or county 
governments; 
Extremely useful: (2); 
Very useful: (6); 
Moderately useful: (3); 
Slightly useful: (2); 
Not at all useful: (0); 
No basis to judge: (17). 

34e. Census Bureau's regional office; 
Extremely useful: (2); 
Very useful: (4); 
Moderately useful: (12); 
Slightly useful: (6); 
Not at all useful: (1); 
No basis to judge: (6). 

34f. E-mail contact with the Census Bureau; 
Extremely useful: (0); 
Very useful: (5); 
Moderately useful: (6); 
Slightly useful: (6); 
Not at all useful: (1); 
No basis to judge: (12). 

34g. Census Bureau's web site; 
Extremely useful: (0); 
Very useful: (2); 
Moderately useful: (5); 
Slightly useful: (3); 
Not at all useful: (4); 
No basis to judge: (17). 

34h. Other (please specify what type of assistance and who provided it 
in question below); 
Extremely useful: (0); 
Very useful: (3); 
Moderately useful: (0);
Slightly useful: (0); 
Not at all useful: (0); 
No basis to judge: (15). 

[End of table] 

What type of other assistance did you receive and who provided the 
assistance? 

35. If a source of assistance in question 34 was of "little or no use", 
please elaborate on each type of assistance providing examples or 
illustrations where possible. 

36. Which of the following best describes how much of the Census 
Bureau's LUCA materials your locality's review covered? 

(Check only one answer.) 

1. Covered more than originally planned or expected (6). 

2. Covered about what was originally planned or expected (20). 

3. Covered less than originally planned or expected (5). 

37. Was adequate time allowed to complete the review? 

(Check only one answer.) 

1. Yes (22). 

2. No (5). 

3. Don't know (4). 

38. Given your experiences completing the LUCA dress rehearsal, do you 
anticipate doing any of the following for the 2010 LUCA? 

(Check one for each row.) 

38a. Start our LUCA review earlier; 
Yes: (19); 
No: (8); 
Don't know: (4). 

38b. Make completing LUCA a higher priority for staff; 
Yes: (14); 
No: (9); 
Don't know: (8). 

38c. Better prepare local materials prior to receiving LUCA 
documentation; 
Yes: (16); 
No: (8); 
Don't know: (7). 

38d. Solicit technical assistance from Census regional staff earlier in 
the process; 
Yes: (15); 
No: (9); 
Don't know: (6). 

38e. Other (please specify in question below); 
Yes: (1); 
No: (5); 
Don't know: (12). 

[End of table] 

What other activities would you do differently in future LUCA reviews? 

39. Given your experiences with the LUCA dress rehearsal, what actions, 
if any, could the Bureau take to improve the program? 

40. If you have any additional comments regarding any previous 
questions or other comments concerning LUCA, the Census Bureau, or this 
survey, please use the space provided below. 

Background Information. 

41. Did your jurisdiction participate in any of the 2000 Decennial 
Census LUCA programs? 

(Check only one answer.) 

1. Yes (10). 

2. No (9). 

3. Not sure (11). 

42. Have you had previous experience with any LUCA reviews? 

(Check only one answer.) 

1. Yes (12). 

2. No (17). 

3. Not sure (1). 

43. How long have you served in your current position? 

Contact Information. 

44. What is the name of the person we should contact if we have any 
questions? 

Name:  

What is the telephone number of the person we should contact if we have 
any questions? 

Phone number: 

What is the e-mail address of the person we should contact if we have 
any questions? 

E-mail: 

[End of section] 

Appendix IV: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Mathew J. Scirè, (202) 512-6806 or sciremj@gao.gov: 

Acknowledgments: 

In addition to the individual named above, Ernie Hazera, Assistant 
Director; Timothy Wexler; Tom Beall; Michael Carley; Cynthia Cortese; 
Peter DelToro; Tom James; Andrea Levine; Amanda Miller; Matt Reilly; 
Mark Ryan; and Michael Volpe made key contributions to this report. 

[End of section] 

Related GAO Products: 

Gulf Coast Rebuilding: Preliminary Observations on Progress to Date and 
Challenges for the Future. GAO-07-574T. Washington, D.C.: April 12, 
2007. 

2010 Census: Census Bureau Should Refine Recruiting and Hiring Efforts 
and Enhance Training of Temporary Field Staff. GAO-07-361. Washington, 
D.C.: April 27, 2007. 

2010 Census: Redesigned Approach Holds Promise, but Census Bureau Needs 
to Annually Develop and Provide a Comprehensive Project Plan to Monitor 
Costs. GAO-06-1009T. Washington, D.C.: July 27, 2006. 

2010 Census: Census Bureau Needs to Take Prompt Actions to Resolve Long-
standing and Emerging Address and Mapping Challenges. GAO-06-272. 
Washington, D.C.: June 15, 2006. 

2010 Census: Costs and Risks Must be Closely Monitored and Evaluated 
with Mitigation Plans in Place. GAO-06-822T. Washington, D.C.: June 6, 
2006. 

2010 Census: Census Bureau Generally Follows Selected Leading 
Acquisition Planning Practices, but Continued Management Attention Is 
Needed to Help Ensure Success. GAO-06-277. Washington, D.C.: May 18, 
2006. 

2010 Census: Planning and Testing Activities Are Making Progress. GAO- 
06-465T. Washington, D.C.: March 1, 2006. 

2010 Census: Basic Design Has Potential, but Remaining Challenges Need 
Prompt Resolution. GAO-05-9. Washington, D.C.: January 12, 2005. 

2010 Census: Counting Americans Overseas as Part of the Decennial 
Census Would Not Be Cost-Effective. GAO-04-898. Washington, D.C.: 
August 19, 2004. 

2010 Census: Overseas Enumeration Test Raises Need for Clear Policy 
Direction. GAO-04-470. Washington, D.C.: May 21, 2004. 

2010 Census: Cost and Design Issues Need to Be Addressed Soon. GAO-04- 
37. Washington, D.C.: January 15, 2004. 

Decennial Census: Lessons Learned for Locating and Counting Migrant and 
Seasonal Farm Workers. GAO-03-605. Washington, D.C.: July 3, 2003. 

Decennial Census: Methods for Collecting and Reporting Hispanic 
Subgroup Data Need Refinement. GAO-03-228. Washington, D.C.: January 
17, 2003. 

Decennial Census: Methods for Collecting and Reporting Data on the 
Homeless and Others without Conventional Housing Need Refinement. GAO- 
03-227. Washington, D.C.: January 17, 2003. 

2000 Census: Lessons Learned for Planning a More Cost-Effective 2010 
Census. GAO-03-40. Washington, D.C.: October 31, 2002. 

2000 Census: Local Address Review Program Has Had Mixed Results to 
Date. GAO/T-GGD-99-184. Washington, D.C.: September 29, 1999. 

FOOTNOTES 

[1] The address canvassing operation is a field check of all addresses 
done to verify housing unit addresses. The address canvassers add to 
the 2010 Census address list any additional addresses they find and 
make other needed corrections to the 2010 Census address list and maps 
using global-positioning-equipped handheld computers. 

[2] NRC is part of the National Academy of Sciences. 

[3] See GAO, 2010 Census: Census Bureau Needs to Take Prompt Actions to 
Resolve Long-standing and Emerging Address and Mapping Challenges, GAO-
06-272 (Washington, D.C.: June 15, 2006); 2010 Census: Planning and 
Testing Activities Are Making Progress, GAO-06-465T (Washington, D.C.: 
Mar. 1, 2006); and 2010 Census: Costs and Risks Must be Closely 
Monitored and Evaluated with Mitigation Plans in Place, GAO-06-822T 
(Washington, D.C.: June 6, 2006). See also National Research Council, 
Assessment of the 2000 Census LUCA Program (Washington, D.C.: December 
2002), and Department of Commerce, Office of the Inspector General, 
Additional Steps Needed to Improve Local Update of Census Addresses for 
the 2000 Decennial Census (Washington, D.C.: September 1998). 

[4] See app. III for a full list of survey responses. 

[5] The Bureau's address list is known as the Master Address File 
(MAF); its associated geographic information system is called the 
Topologically Integrated Geographic Encoding and Referencing (TIGER) 
database. TIGER is a registered trademark of the U.S. Census Bureau. 

[6] The MAF and TIGER databases are also linked into what is called the 
MAF/TIGER database, through a process where the Bureau assigns every 
housing unit in MAF to a specific location in TIGER. 

[7] Census Address List Improvement Act of 1994, Pub. L. No. 103-430, 
October 31, 1994. 

[8] The 2000 LUCA Program had two separate components: the 1998 city- 
style address operation and the 1999 non-city-style address operation. 

[9] Under 13 U.S.C. § 9(a), local governments that obtain access to 
Title 13 data are required to ensure the confidentiality of such data. 

[10] Of the 39,051 eligible entities, 20,718 chose not to participate, 
5,525 entities signed participation agreements, 2,877 entities returned 
materials but recorded no updates or action, and 9,931 entities 
submitted at least one address action or challenged at least one block. 

[11] National Research Council, Assessment of the 2000 Census LUCA 
Program. 

[12] Bureau headquarters and the Charlotte Regional Office provided us 
with internal timelines for the 2010 LUCA Program and the LUCA dress 
rehearsal operations held in parts of California and North Carolina 
from June through October 2006. Additionally, we obtained a public 
version of the Bureau's timelines for both the LUCA dress rehearsal and 
the 2010 LUCA Program from its Web site (see figs. 3 and 4). 

[13] GAO, 2000 Census: Local Address Review Program Has Had Mixed 
Results to Date, GAO/T-GGD-99-184 (Washington, D.C.: Sept. 29, 1999). 

[14] ITS Services, Inc., Results of the Survey of Selected Governments 
Eligible for the Local Update of Census Addresses (LUCA) Program 
(Fairfax, Va.: 2002), v. 

[15] National Research Council, Reengineering the 2010 Census: Risks 
and Challenges (Washington, D.C.: 2004), 96. 

[16] ITS Services, Inc., Recommended Communication Methods to Support 
Participation in the Ongoing LUCA Program (Fairfax, Va.: 2002), 6. 

[17] City-style addresses represent both the location of the housing 
unit on the ground and the mailing address for the housing unit (i.e., 
101 Main St., Anytown, MD 12345). Non-city-style addresses, such as 
Post Office box and rural route numbers, indicate where mail is 
delivered to an addressee but do not necessarily designate the location 
of the addressee's housing unit on the ground. 

[18] National Research Council, Reengineering the 2010 Census: Risks 
and Challenges, 97. 

[19] MTPS also incorporates the functions of the Boundary and 
Annexation Survey. 

[20] The Bureau conducts the BAS annually to collect information about 
selected defined geographic areas. The BAS is used to update 
information about the legal boundaries and names of all governmental 
units in the United States. 

[21] National Research Council, Reengineering the 2010 Census: Risks 
and Challenges, 97. 

[22] See A. Dillon, "Usability Evaluation," Encyclopedia of Human 
Factors and Ergonomics, ed. W. Karwowski, 1930-1933 (London: Taylor and 
Francis, 2001). Andrew Dillon, PhD, is the dean of the University of 
Texas School of Information; he is also a professor of information, 
psychology, information, and risk and operations management at the 
University of Texas. 

[23] Tab-delimited text is one of the more common data formats, defined 
by text separated by tabs. Pipe-delimited format is essentially the 
same kind of format, but uses the pipe symbol ("|") as its delimiting 
property. 
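
As an illustration (using the made-up city-style address from 
footnote 17 rather than actual Bureau data), the short Python sketch 
below shows the same record in each format and how either can be 
split into its fields: 

    # Illustrative only: made-up values, not Bureau data.
    tab_record = "101 Main St.\tAnytown\tMD\t12345"    # tab-delimited
    pipe_record = "101 Main St.|Anytown|MD|12345"      # pipe-delimited

    print(tab_record.split("\t"))   # ['101 Main St.', 'Anytown', 'MD', '12345']
    print(pipe_record.split("|"))   # same four fields, split on the pipe symbol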

[24] Such software may include Microsoft Excel, Microsoft Access, Lotus 
1-2-3, Quattro Pro, and Oracle. 

[25] GAO/T-GGD-99-184. 

[26] See Dillon. 

[27] National Research Council, Assessment of the 2000 Census LUCA 
Program, 134. 

[28] GAO, Managing for Results: Enhancing Agency Use of Performance 
Information for Management Decision Making, GAO-05-927 (Washington, 
D.C.: Sept. 9, 2005). 

[29] The period for local review and update of addresses and maps for 
the 2010 LUCA Program is August 2007 through March 2008. 

[30] The Road Home Program was implemented by the State of Louisiana to 
provide compensation of up to $150,000 for eligible homeowners affected 
by hurricanes Katrina and Rita. 

[31] GAO, Gulf Coast Rebuilding: Preliminary Observations on Progress 
to Date and Challenges for the Future, GAO-07-574T (Washington, D.C.: 
Apr. 12, 2007). 

[32] GAO-06-272 and GAO-06-822T. 

[33] In an "Update/Enumerate" operation, interviewers enumerate a 
housing unit and update address registers and census maps at the time 
of their visit. 

[34] GAO, 2010 Census: Census Bureau Should Refine Recruiting and 
Hiring Efforts and Enhance Training of Temporary Field Staff, GAO-07-
361 (Washington, D.C.: Apr. 27, 2007). 

[35] The questionnaire was sent to 42 local governments, not 44, 
because one local official was responsible for 3 localities, and we 
sent the questionnaire to that official only once. 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site (www.gao.gov). Each weekday, GAO posts 
newly released reports, testimony, and correspondence on its Web site. 
To have GAO e-mail you a list of newly posted products every afternoon, 
go to www.gao.gov and select "Subscribe to Updates." 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 441 G Street NW, Room LM 
Washington, D.C. 20548: 

To order by Phone: Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202) 
512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm: 

E-mail: fraudnet@gao.gov: 

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Gloria Jarmon, Managing Director, JarmonG@gao.gov (202) 512-4400: 

U.S. Government Accountability Office, 441 G Street NW, Room 7125 
Washington, D.C. 20548: 

Public Affairs: 

Paul Anderson, Managing Director, AndersonP1@gao.gov (202) 512-4800: 

U.S. Government Accountability Office, 441 G Street NW, Room 7149 
Washington, D.C. 20548: