Web-Based Surveys


Overview
Web-based surveys, also often referred to as Internet or online surveys, have become a popular and attractive way to conduct survey research. When used appropriately, online surveys are an excellent survey method for a closed population in which every member has a verified email address and Internet access. One example is an internal survey of an organization in which all potential respondents (e.g., employees) have an assigned email address and guaranteed Internet access through the organization. Responsive Management has conducted this type of study for natural resource and fish and wildlife agencies in the past and has obtained results using scientifically valid sampling methodologies. Recent examples of such studies conducted by Responsive Management include a survey of Arkansas Game and Fish Commission employees to assist the agency in developing its first-ever strategic plan and a survey of Arizona Game and Fish Department employees to assess education and training needs.
Although online surveys may seem more economical and easier to administer than traditional survey research methods, they pose several obstacles to obtaining scientifically valid and accurate results when used in studies that lack a closed population and guaranteed Internet access. Therefore, Responsive Management does not recommend online surveys for external studies of the general population, or even of licensed hunters and anglers, because such surveys can produce inaccurate, unreliable, and biased data. There are four main reasons for this: sample validity, non-response bias, stakeholder bias, and unverified respondents.

Sample Validity
For a study to be unbiased, every member of the population under study must have an equal chance of participating. With the exception of studies with a closed population and guaranteed Internet access, such as an organization's internal survey of employees described above, online surveys cannot currently meet this requirement because no representative sampling frame of email addresses exists for contacting all potential respondents in the general population or its subpopulations, such as registered voters, park visitors, hunters, or anglers.
Furthermore, when online surveys are accessible to anyone who visits a website, the researcher has no control over sample selection. These self-selected opinion polls result in a sample of people who had Internet access, knew about the survey, and decided to take it, not a sample of scientifically selected respondents who represent the larger population under study.

Non-Response Bias
Non-response bias in online surveys is compounded by self-selection. People who respond to a request to complete an online survey tend to be more interested in or enthusiastic about the topic and therefore more willing to complete the survey, which biases the results. The very nature of the Internet as an information-seeking tool contributes to this form of bias. With a telephone survey, by contrast, people are contacted who are not necessarily interested in the topic, and if they are reluctant to complete the survey, a trained interviewer can encourage them to do so, leading to results that represent the whole population being studied.
Other contributors to non-response bias in online surveys include spam filters that delete the email request for survey participation and respondents who have multiple email addresses that they may or may not check on a regular basis.

Stakeholder Bias
Unless specific technical steps are taken to prevent it, people who have a vested interest in the survey results can complete an online survey multiple times and urge others to complete the survey in order to influence the results. Even when safeguards against multiple responses are implemented, there are ways to work around them.

Unverified Respondents
Because of the inability to control who has access to online surveys, there is no way to verify who responds to them: who they are, their demographic background, their location, and so on. Incentives for completing the survey further complicate matters because they can encourage multiple responses from a single person.

The Result
As a result of these issues, obtaining representative, unbiased, scientifically valid results from online surveys is not possible at this time, except in the case of closed-population surveys, such as the employee surveys described above. Responsive Management recommends and implements online or electronic surveys only for studies of closed populations, most commonly agency or organization employee studies.
For a more detailed look at the drawbacks of online surveys in the context of human dimensions research, please click here to see Responsive Management's newsletter article on online surveys.
Also see Duda, M.D., & Nobile, J.L., "The Fallacy of Online Surveys: No Data Are Better Than Bad Data," Human Dimensions of Wildlife 15(1): 55-64. Reprints of the article can be ordered here.

