Questionnaire Design Effects on Interview Outcomes

Jeffrey C. Moore and Laureen Moyer

KEY WORDS: automated instrument design, household surveys, nonresponse, interviewer assessment, respondent assessment

ABSTRACT

The U.S. Census Bureau conducts a number of household demographic surveys that gather a standard set of information about all members of sampled households. These surveys typically employ a "person-based" design -- that is, they ask the complete set of interview questions for each household member in turn, recycling through the interview sequence as many times as there are eligible members of the household. Elsewhere (Moore, 1996), we have argued for the potential benefits of a "topic-based" approach, which completes each interview question (or topic) for all household members before proceeding to the next question, and have presented data from a small pretest which tended to support some of those benefits. This paper describes a large-scale questionnaire design experiment, conducted during the 1997 test of the CATI follow-up operation for the Census Bureau's American Community Survey (ACS), which tested a person-based ACS CATI instrument against a topic-based instrument. We compare the performance of the two instruments on the following dimensions: (1) household (unit) nonresponse; (2) interview efficiency; (3) interviewers' evaluations; (4) respondents' assessments; (5) item nonresponse; and (6) within-household response consistency. Results suggest generally superior performance for the topic-based design.
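The structural difference between the two designs amounts to a difference in iteration order. The sketch below is only an illustration of that ordering; the household roster, topic list, and function names are hypothetical and do not represent the ACS instrument or its content.

    # Illustrative sketch of the two question-ordering strategies.
    # "household" and "topics" are placeholder values, not ACS content.

    household = ["Person 1", "Person 2", "Person 3"]      # eligible household members
    topics = ["age", "relationship", "education", "work"]  # interview topics

    def person_based(household, topics):
        """Ask the complete question sequence for each member in turn."""
        for person in household:
            for topic in topics:
                yield person, topic

    def topic_based(household, topics):
        """Complete each question for all members before moving to the next."""
        for topic in topics:
            for person in household:
                yield person, topic

In the person-based order the interviewer cycles through the full questionnaire once per household member; in the topic-based order each question is resolved for the whole household before the next question is introduced.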

CITATION:

Source: U.S. Census Bureau, Statistical Research Division

Created: February 19, 2002
Last revised: February 19, 2002