This is the accessible text file for GAO report number GAO-10-751R 
entitled 'Human Capital: Quality of DOD Status of Forces Surveys Could 
Be Improved by Performing Nonresponse Analysis of the Results' which 
was released on July 12, 2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

GAO-10-751R: 

United States Government Accountability Office: 
Washington, DC 20548: 

July 12, 2010:

The Honorable Carl Levin:
Chairman:
The Honorable John McCain:
Ranking Member:
Committee on Armed Services:
United States Senate:

The Honorable Ike Skelton:
Chairman:
The Honorable Howard P. McKeon:
Ranking Member:
Committee on Armed Services:
House of Representatives:

Subject: Human Capital: Quality of DOD Status of Forces Surveys Could 
Be Improved by Performing Nonresponse Analysis of the Results:

The Defense Manpower Data Center (DMDC) conducts a series of Web-based 
surveys called Status of Forces surveys,[Footnote 1] which help enable 
decision makers within the Department of Defense (DOD) to (1) evaluate 
existing programs and policies, (2) establish baselines before 
implementing new programs and policies, and (3) monitor the progress 
of programs and policies and their effects on the total force. 
[Footnote 2] In recent years, we have discussed the results of these 
surveys in several of our reports.[Footnote 3] While we have 
generally found the survey results to be sufficiently reliable for the 
purposes of our reporting, several of our reports have discussed low 
response rates and the potential for bias in the survey results. 
[Footnote 4] Nonresponse analysis is an established practice in survey 
research that helps determine whether nonresponse bias (i.e., survey 
results that do not accurately reflect the population) might occur due 
to under- or overrepresentation of some respondents' views on survey 
questions.[Footnote 5] When nonresponse analysis is performed, survey 
researchers can use the results to select and adjust the statistical 
weighting techniques they use that help ensure that survey results 
accurately reflect the survey population.[Footnote 6]

Because we have noted, in reports referring to the Status of Forces 
surveys, the potential for bias and because of DMDC's role in 
supporting DOD decision making, we initiated this review under the 
Comptroller General's statutory authority to conduct evaluations on 
his own initiative. Specifically, our objective was to determine the 
extent to which DMDC performs nonresponse analysis of the results of 
its Status of Forces surveys to determine whether reported results of 
respondents' views might be under- or overrepresented.

To address our objective, a team that included GAO social science 
analysts with survey research expertise and GAO's Chief Statistician 
(1) reviewed relevant documentation provided by DMDC regarding the 
survey methods used for the Status of Forces surveys, (2) interviewed 
DMDC survey officials who had knowledge of or were involved in the 
development and administration of the surveys, and (3) reviewed the 
response rates for the Status of Forces surveys conducted since 2003. 
We conducted this performance audit between November 2009 and May 2010 
in accordance with generally accepted government auditing standards. 
Those standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe 
that the evidence obtained provides a reasonable basis for our 
findings and conclusions based on our audit objectives.

DMDC Does Not Regularly Perform Nonresponse Analysis of the Results of 
Its Status of Forces Surveys, and It Lacks Guidance Specifying When 
and How Such Analysis Should Be Performed:

Although DMDC has conducted some research to assess and monitor the 
effects of nonresponse bias in its Status of Forces surveys in the 
past, it lacks guidance specifying when and how additional analysis of 
the results of its Status of Forces surveys should be performed in 
order to determine the extent of differences between survey 
respondents and nonrespondents. Leading survey research professional 
organizations, such as the American Association for Public Opinion 
Research, recognize nonresponse analysis as a sound method for 
assessing whether nonresponse bias might cause under- or 
overrepresentation of respondents' views on survey questions. Further, 
survey research guidelines issued by the Office of Management and 
Budget state that nonresponse analysis should be performed when a 
survey's response rate is below 80 percent, so as to identify the 
possibility 
of bias in a survey's results.[Footnote 7] Although these guidelines 
are not mandated for internal personnel surveys such as the Status of 
Forces surveys, as we have previously reported,[Footnote 8] they 
reflect generally accepted best practices in the field of survey 
research and are relevant for the purposes of assessing whether the 
results of a survey are representative of the population being surveyed.

In addition to our prior work discussing low response rates and the 
potential for bias in the Status of Forces surveys, we have also noted 
the need for caution when interpreting the results of federal surveys 
with low response rates.[Footnote 9] In our review of the various 
Status of Forces surveys conducted since 2003, we found that the 
response rates have been between 28 percent and 40 percent for the 
Status of Forces Active Duty Survey; between 25 percent and 42 percent 
for the Status of Forces Reserve Survey; and between 55 percent and 64 
percent for the Status of Forces Survey of Civilian Employees. While 
response rates alone are not sufficient indicators for determining the 
quality of survey results, we note--and DMDC survey officials 
recognize--that the Status of Forces surveys have had generally low 
response rates as compared with some other federal surveys. By not 
performing nonresponse analysis to identify the possibility for 
nonresponse bias in the results of its various Status of Forces 
surveys, DMDC survey officials may not have the information needed to 
adjust their statistical weighting techniques so as to ensure their 
survey results reflect the population being surveyed.

As mentioned previously, DMDC lacks guidance specifying when and how 
agency staff should assess the results of the Status of Forces surveys 
for nonresponse bias. Further, we found that since DMDC last conducted 
research on nonresponse bias in its Status of Forces surveys--a study 
completed in 2007--it has taken no steps to strengthen its 
understanding of the effects of nonresponse bias, even though its 
study noted that performing nonresponse analysis should be a priority 
for the agency. This is a concern, especially since DMDC's study also 
noted, for some of its survey measures, the existence of systematic 
nonresponse errors that had not been corrected by DMDC's current 
statistical weighting techniques. DMDC survey officials acknowledge 
the need to perform additional research on nonresponse bias. However, 
a senior DMDC survey official also told us that no additional research 
on nonresponse bias is planned at this time because of, among other 
things, a greater current focus on fielding surveys rather than on 
performing methodological evaluations. Without guidance for performing 
additional nonresponse analysis, DMDC's ability to identify and 
address the potential for nonresponse bias within the Status of Forces 
surveys is hindered.

Conclusion:

The Status of Forces surveys provide decision makers within the DOD 
community with valuable information that is used to evaluate and monitor 
the progress of various defense programs and policies. This community 
could derive significant further benefit, however, if DMDC were to 
perform additional nonresponse analysis of its Status of Forces survey 
results. Specifically, performing nonresponse analysis--an established 
practice in survey research--could help DMDC improve the quality of 
the Status of Forces surveys by identifying the potential for 
nonresponse bias within its Status of Forces surveys. Taking steps to 
then address any bias found--such as adjusting the statistical 
weighting techniques used--could help strengthen the quality of the 
survey results over time, thereby enabling decision makers and other 
users of the survey results to better understand the perspectives of 
DOD personnel regarding the department's various programs and policies.

Recommendation for Executive Action:

To better determine the effects of nonresponse bias on the Status of 
Forces survey results, we recommend that the Secretary of Defense 
direct the Director of 
DMDC to develop and implement guidance both for conducting nonresponse 
analysis and for using the results of nonresponse analysis to inform 
DMDC's statistical weighting techniques, as part of the collection and 
analysis of the Status of Forces survey results.

Agency Comments and Our Evaluation:

In its written comments responding to a draft of this report, DMDC 
concurred with our recommendation. DMDC's comments are reprinted in 
enclosure I.

In these comments, DMDC stated that it understands our concerns 
regarding response rates and the lack of recurring nonresponse bias 
studies for its Status of Forces surveys. DMDC also stated that it 
concurs with us on the benefits of developing a systematic program to 
continually monitor the impact of nonresponse bias for its surveys. To 
that end, DMDC stated that it will take several actions to address our 
recommendation. These actions include developing plans to periodically 
assess the effect of nonresponse on its survey results by performing 
formal nonresponse bias studies, testing its approach and developing 
alternative approaches if necessary, and developing a comprehensive 
plan and guidance to continually monitor for nonresponse bias in its 
Status of Forces surveys. We commend DMDC for committing to actions 
that could help it better determine the effects of nonresponse bias in 
its studies, and note that such actions, if taken, would constitute 
steps in the right direction.

We note that, in its cover letter accompanying these comments, DMDC 
stated that it disagreed with our observation that "DMDC does not 
regularly perform nonresponse analysis of the results of its status of 
forces surveys, and it lacks guidance specifying when and how such 
analysis should be performed," noting that, while it does not formally 
perform nonresponse analysis, it continually monitors changes in 
response rates and potential nonresponse bias. While we acknowledge 
that DMDC takes some steps to address nonresponse--for example, 
monitoring response rates for a fixed set of variables and 
incorporating statistical weighting techniques in its survey 
estimates--monitoring response rates without performing more in-depth 
nonresponse analysis may not necessarily identify problems with 
nonresponse bias. In addition, during the course of our review, DMDC 
survey officials told us that they did not have any written policy or 
guidance in place on performing nonresponse analysis.

We are sending copies of this report to the Secretary of Defense, the 
Under Secretary of Defense for Personnel and Readiness, the Director 
of DMDC, and interested congressional committees. In addition, this 
report will be available at no charge on GAO's Web site at [hyperlink, 
http://www.gao.gov].

If you or your staff have any questions about this report, please 
contact Brenda S. Farrell at (202) 512-3604 or farrellb@gao.gov, or 
Ronald S. Fecso at (202) 512-7791 or fecsor@gao.gov. Contact points 
for our Offices of Congressional Relations and Public Affairs may be 
found on the last page of this report. Key contributors to this report 
include Marion A. Gatling, Assistant Director; James D. Ashley; 
Virginia A. Chanley; Wesley A. Johnson; Lonnie J. McAllister; and 
Cheryl A. Weissman. Other contributors include Jill N. Lacey and 
Jennifer L. Weber. 

Signed by: 

Brenda S. Farrell:
Director, Defense Capabilities and Management:

Signed by: 

Ronald S. Fecso:
Chief Statistician: 

[End of section] 

Enclosure I:

Comments from the Defense Manpower Data Center:

Department Of Defense: 
Human Resources Activity: 
Defense Manpower Data Center: 
1600 Wilson Boulevard Suite 400: 
Arlington, VA 22209-2593: 

June 24, 2010: 

Ms. Brenda S. Farrell: 
Director, Defense Capabilities and Management: 
Mr. Ronald S. Fecso, Chief Statistician: 
U.S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Ms. Farrell and Mr. Fecso: 

Enclosed is the Department of Defense response to the GAO report, GAO-
10-751R, "Human Capital: The Defense Manpower Data Center Could 
Improve the Quality of the Status of Forces Surveys by Performing 
Nonresponse Analysis of the Results," dated May 27, 2010 (GAO Code 
351398). 

DMDC thanks the GAO for the opportunity to respond to GAO report, GAO-
10-751R. Although DMDC concurs with the GAO recommendation, we want to 
point out that the department disagrees with the report where it 
states, "DMDC does not regularly perform nonresponse analysis of the 
results of its status of forces surveys, and it lacks guidance 
specifying when and how such analysis should be performed." DMDC
continually monitors response rates across multiple detailed 
demographic and geographic groups, including but not limited to branch 
of service, pay grade, geographic location (U.S. versus overseas), 
deployment status, gender, and race. While DMDC does not formally call 
this program nonresponse analysis, we continually monitor changes in 
potential nonresponse bias through analysis of respondent sample 
composition relative to nonrespondents. 

DMDC statisticians assert that SOFS surveys likely have lower levels of
nonresponse bias than surveys with much higher response rates because 
generally survey organizations know very little about survey 
nonrespondents, and consequently have limited accessible data to 
assist with nonresponse adjustments. For instance, in telephone 
surveys, the survey organization may only know limited geographic data 
based on the telephone exchange for "ring-no answer" cases. For 
household interview surveys, the surveyor may have outdated knowledge 
(usually Census data) of characteristics of the block (e.g., percent 
Hispanic). 

DMDC has an uncommon and advantageous position as a surveyor by 
maintaining extremely detailed, complete, and timely administrative 
data for our entire survey frames. Due to this complete sampling 
frame, DMDC has more extensive information regarding the 
characteristics of survey nonrespondents prior to conducting 
nonresponse analysis studies than most other survey organizations know 
after such studies. For the SOFS program, DMDC uses this thorough 
knowledge of nonrespondents both for statistical imputations for item-
missing data and nonresponse and post-stratification weighting 
adjustments to compensate for unit nonresponse. Both of these 
procedures are specifically designed to reduce nonresponse bias in 
SOFS estimates. 

Beginning with the first test of the SOFS in 2002, DMDC has 
periodically included tests of methodology differences affecting 
response rates and data quality. Such tests have concluded that a 
follow-up paper survey increases response rates by around seven 
percentage points without significantly or meaningfully changing 
estimates from the survey. Other tests have concentrated on contact 
methods that can improve response rates or at least not adversely 
impact response rates while lowering costs. 

For all SOFS surveys, DMDC statisticians consider survey estimates 
representative of their respective populations, allowing the results 
to be effectively used in program evaluation, policy decisions, and 
program planning and execution. While DMDC is confident in its survey 
program, we will investigate the advantages of an external review 
panel established by an organization such as the National Research 
Council. 

Sincerely, 

Signed by: 

Mary Snavely-Dixon: 
Director: 

Enclosure: As stated: 

[End of letter] 

GAO draft report, GAO-10-751R, "Human Capital: The Defense Manpower 
Data Center Could Improve the Quality of the Status of Forces Surveys 
by Performing Nonresponse Analysis of the Results," dated May 27, 2010 
(GAO Code 351398): 

Department Of Defense Comments To The GAO Recommendations: 

Recommendation 1: To better determine the effects of nonresponse bias 
on the Status of Forces (SOFS) survey results, the GAO recommends that 
the Secretary of Defense direct the Director of the Defense Manpower 
Data Center (DMDC) to develop and implement guidance both for conducting 
nonresponse analysis and for using the results of nonresponse analysis 
to inform DMDC's statistical weighting techniques, as part of the 
collection and analysis of the Status of Forces survey results. 

DOD Response: Concur. DMDC understands GAO's concerns regarding 
response rates and lack of recurring nonresponse bias studies in the 
SOFS program, but a low response rate, in and of itself, is not 
indicative of a flawed study, nor does the lack of specific 
nonresponse analysis indicate that the original survey results are not 
statistically valid. Groves (2006) shows that, "...if we examine in a 
meta-analytic way what the survey methodological literature finds for 
the linkage between nonresponse rates and nonresponse biases, we find 
large nonresponse biases for some statistics but no strong empirical 
relationship between response rates and nonresponse bias."[Footnote 1] 

DMDC concurs with the GAO regarding the benefits of developing a 
systematic program to continually monitor the impact of nonresponse on 
survey results in the SOFS program. To address GAO's concerns, DMDC 
will develop plans to periodically assess the effect of nonresponse on 
SOFS survey estimates through formal nonresponse bias studies. In 
support of the Federal Voting Assistance Program (FVAP), DMDC will 
conduct two nonresponse bias studies in the winter of 2010 on post-
election voting surveys on behalf of FVAP. The study methodology 
consists of contacting survey nonrespondents by telephone and asking a 
subset of key survey questions. To assess nonresponse bias, DMDC will 
compare responses from initial survey respondents to survey 
nonrespondents converted to response by the more expensive telephone 
mode. There will also be a comparison group of individuals initially 
contacted by phone. If the telephone nonresponse follow-up method 
proves effective in the voting surveys, judged by response rates to 
the nonresponse follow-up study and substantive, statistically 
significant differences in the estimates of key analysis variables, 
DMDC will further test these methods in the SOFS program starting in 
2011, and completing studies for the active duty, Reserve, and 
civilian SOFS by 2012. If this method proves ineffective, DMDC will 
develop alternate plans to assess SOFS nonresponse bias and test these 
plans in 2011. Based on the results of these studies, DMDC will 
develop a comprehensive plan and guidance to continually monitor 
nonresponse bias in the SOFS program.

Enclosure Footnote: 

[1] Groves, Robert M. (2006). "Nonresponse Rates and Nonresponse Bias 
in Household Surveys." Public Opinion Quarterly, 70(5):646-675. 

[End of Enclosure] 

Footnotes: 

[1] The Status of Forces surveys include a survey of active duty 
military personnel, called the Status of Forces Active Duty Survey; a 
survey of reserve military personnel, called the Status of Forces 
Reserve Survey; and a survey of civilian employees, called the Status 
of Forces Survey of Civilian Employees. These surveys include outcome, 
or "leading indicator," measures for these individuals such as overall 
satisfaction, retention intention, and perceived readiness, as well as 
demographic items needed to classify individuals into various 
subpopulations.

[2] Specifically, DMDC is DOD's repository for departmentwide data and 
is a key support organization that, among other things, generates 
reports for decision makers in the Office of the Secretary of Defense, 
the military services, and the Joint Staff. External organizations 
such as GAO and federally funded research and development centers also 
rely on DMDC for quantitative data and analyses pertaining to a wide 
variety of issues, such as the number of DOD personnel in specified 
occupations or demographic groups, and DOD personnel's attitudes 
toward various DOD programs and policies.

[3] See, for example, GAO, Human Capital: Monitoring of Safeguards and 
Addressing Employee Perceptions Are Key to Implementing a Civilian 
Performance Management System, [hyperlink, 
http://www.gao.gov/products/GAO-10-102] (Washington, D.C.: Oct. 28, 
2009); Military Personnel: Reserve Component Servicemembers on Average 
Earn More Income while Activated, [hyperlink, 
http://www.gao.gov/products/GAO-09-688R] (Washington, D.C.: June 
23, 2009); Human Capital: DOD Needs to Improve Implementation of and 
Address Employee Concerns about Its National Security Personnel 
System, [hyperlink, http://www.gao.gov/products/GAO-08-773] 
(Washington, D.C.: Sept. 10, 2008); Military 
Personnel: Federal Management of Servicemember Employment Rights Can 
Be Further Improved, [hyperlink, 
http://www.gao.gov/products/GAO-06-60] (Washington, D.C.: Oct. 19, 
2005); Military Personnel: DOD's Tools for Curbing the Use and Effects 
of Predatory Lending Not Fully Utilized, [hyperlink, 
http://www.gao.gov/products/GAO-05-349] (Washington, D.C.: 
Apr. 26, 2005); and Military Personnel: More DOD Actions Needed to 
Address Servicemembers' Personal Financial Management Issues, 
[hyperlink, http://www.gao.gov/products/GAO-05-348] (Washington, D.C.: 
Apr. 26, 2005).

[4] See, for example, [hyperlink, 
http://www.gao.gov/products/GAO-08-773], [hyperlink, 
http://www.gao.gov/products/GAO-06-60], and [hyperlink, 
http://www.gao.gov/products/GAO-05-349].

[5] Nonresponse analysis may be performed using a variety of methods-- 
for example, by randomly selecting a sample of survey nonrespondents 
and surveying them to obtain answers to key survey questions. 
Nonresponse analysis may be completed on more than one occasion, 
depending on how frequently a survey is administered.
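
For illustration only, the following sketch (in Python, using 
hypothetical responses on a 1-to-5 satisfaction scale, not data from 
any DMDC survey) shows one way such a comparison might be made: 
answers from a followed-up sample of nonrespondents are compared with 
those of the original respondents, and a large, statistically 
significant gap would suggest potential nonresponse bias.

# Illustrative sketch only; the responses below are hypothetical.
from scipy import stats

original_respondents = [4, 5, 3, 4, 5, 4, 3, 5]   # answered the initial survey
followed_up_sample = [2, 3, 3, 2, 4, 3]           # nonrespondents surveyed later

# Welch's t-test: a large, significant difference between the groups
# would indicate that the initial results may under- or overrepresent
# some respondents' views.
t_stat, p_value = stats.ttest_ind(original_respondents,
                                  followed_up_sample, equal_var=False)
gap = (sum(original_respondents) / len(original_respondents)
       - sum(followed_up_sample) / len(followed_up_sample))
print(f"mean difference = {gap:.2f}, p-value = {p_value:.3f}")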

[6] For example, if the population being surveyed is 50 percent male 
and 50 percent female, the survey results could be weighted to reflect 
this demographic characteristic.
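
As a purely illustrative sketch (the respondent counts below are 
hypothetical), such a post-stratification weight can be computed as 
the ratio of a group's share of the population to its share of the 
respondents:

# Illustrative sketch only; the counts are hypothetical.
# The population is 50 percent male and 50 percent female, but the
# respondents skew 70/30, so weights rebalance the estimates.
population_share = {"male": 0.50, "female": 0.50}
respondent_counts = {"male": 700, "female": 300}
total_respondents = sum(respondent_counts.values())

weights = {
    group: population_share[group] / (count / total_respondents)
    for group, count in respondent_counts.items()
}
print(weights)  # {'male': 0.714..., 'female': 1.666...}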

[7] Office of Management and Budget, Standards and Guidelines for 
Statistical Surveys, September 2006.

[8] GAO, Army Health Care: Progress Made in Staffing and Monitoring 
Units that Provide Outpatient Case Management, but Additional Steps 
Needed, [hyperlink, http://www.gao.gov/products/GAO-09-357] 
(Washington, D.C.: Apr. 20, 2009). 

[9] For examples of our work on federal surveys other than the Status 
of Forces survey, see [hyperlink, 
http://www.gao.gov/products/GAO-09-357]; Aviation Security: Federal Air 
Marshal Service Has Taken Actions to Fulfill Its Core Mission and 
Address Workforce Issues, but Additional Actions Are Needed to Improve 
Workforce Survey, [hyperlink, http://www.gao.gov/products/GAO-09-273] 
(Washington, D.C.: Jan. 14, 2009); and Elections: Absentee Voting 
Assistance to Military and Overseas Citizens Increased for the 2004 
General Election, but Challenges Remain, [hyperlink, 
http://www.gao.gov/products/GAO-06-521] (Washington, D.C.: Apr. 7, 
2006). 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: