Web Survey Bibliography
Research Questions and Methodology: As response rates for online surveys continue to decline, three issues are of particular interest: Do lottery style incentives increase response to online surveys? While there is research to suggest that monetary incentives can increase response, other research suggests that lottery style incentives do not increase response, at least in mail surveys. Would results be different for online surveys? Prospect theory suggests that the way a proposition is framed can affect the decision people make, so is it more effective to offer respondents multiple chances to win a smaller gift, or to offer them a single chance to win a larger gift? And do lottery style incentives affect – either positively or negatively – the accuracy of the data collected, in terms of how representative the respondents are of the population?
To explore these issues, we conducted an online survey of Stanford University alumni, testing two incentives of the same expected value but framed differently, along with a control group offered no incentive. The survey was about resources and services available to Stanford alumni, and thus was salient to the general alumni population, not just to certain sub-groups. The population was defined as degree holders from Stanford from 1955-2007 who live in the continental USA or Canada, and for whom Stanford has an e-mail address. Three random samples of approximately 2,500 alumni were selected from the population: Sample 1 was offered no incentive to participate in the survey. Sample 2 was told that five randomly selected respondents (i.e., those who completed the survey) would each win a $100 Visa gift card. Sample 3 was told that one randomly selected respondent would win a $500 Visa gift card. The invitations were e-mailed on June 24, 2008, and three reminders were e-mailed to non-respondents at one-week intervals before the survey was taken off the web on July 20, 2008.
Results: The overall response rate was 31.9%. As we have consistently seen in fifteen years of surveying alumni from a wide range of institutions, overall response was significantly greater among alumni with whom the University has the strongest relationship – donors, Stanford Alumni Association members, and undergraduate alumni. Response was also somewhat greater among women than men, and among alumni 50 and older. The incented samples drew a response of 32.5%, while the non-incented sample drew a response of 30.6% – a marginally significant difference. Differences were also seen among certain demographic groups: donors and women responded significantly higher to the incentives than to no incentive, and alumni in their 30s and 40s responded marginally significantly higher to the incentives than to no incentive. Higher response to the incentives among donors and women (but not among non-donors and men) is noteworthy because overall response was also higher among donors and women – so not only did the incentives fail to improve data quality by increasing response among underrepresented groups, they in fact decreased data quality by further increasing the participation of overrepresented groups, magnifying the bias already present in the data. Within the other demographic groups, response was at most a few points higher in the incented samples than in the non-incented sample, but the differences were not significant. Overall response to the two tested incentives was virtually identical – 32.6% for the sample offered five chances to win $100, and 32.4% for the sample offered one chance to win $500. Furthermore, response to the two incentives was the same across almost all demographic groups.
Implications: Even with incentives, if the sponsor of a survey is identified upfront to respondents, response will be higher among those with whom the sponsor has the closest relationship (in this case, donors, Alumni Association members, and undergraduate alumni). Response was a few percentage points higher in the incented samples than in the non-incented sample, both overall and across most demographic groups. Therefore, if it is of paramount importance to obtain every last possible respondent, lottery style incentives in an online survey may be of some benefit. Nevertheless, the difference in response was only marginally significant, so for many projects the incentives may not be worth the extra cost. And while the incentives added to the cost of the survey, they also added bias to the results, so the accuracy of the data collected must also be considered when deciding whether or not to offer incentives. If a lottery style incentive is offered, it may not matter which is used (at least of the two incentives we tested).
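The "marginally significant" 32.5% vs. 30.6% comparison can be checked with a standard two-proportion z-test. A minimal sketch, assuming illustrative round counts taken from the abstract's "approximately 2,500" per sample (so roughly 5,000 invitations in the two incented samples combined; the study's exact figures are not given here):

```python
# Two-proportion z-test for the reported response rates.
# The sample sizes below are assumptions based on the abstract's
# "approximately 2,500 alumni" per sample, not the study's exact counts.
from math import sqrt, erf

n_incented, p_incented = 5000, 0.325   # samples 2 + 3 combined, 32.5% response
n_control, p_control = 2500, 0.306     # sample 1 (no incentive), 30.6% response

# Pooled response rate under the null hypothesis of equal rates
pooled = (p_incented * n_incented + p_control * n_control) / (n_incented + n_control)
se = sqrt(pooled * (1 - pooled) * (1 / n_incented + 1 / n_control))
z = (p_incented - p_control) / se

# Two-tailed p-value from the standard normal CDF, Phi(z) = 0.5*(1 + erf(z/sqrt(2)))
p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))
print(f"z = {z:.2f}, two-tailed p = {p_value:.3f}")
```

With these assumed counts the test gives z around 1.66 and a two-tailed p around 0.10 – short of the conventional 0.05 threshold but not far from it, which is consistent with the abstract's characterization of the difference as marginally significant.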
Web Survey Bibliography - Other (451)
- Cell Phone Mainly Households: Coverage and Reach for Telephone Surveys Using RDD Landline Samples; 2009; Boyle, J., Lewis, F., Tefft, B.
- Cell-Phone-Only Voters in the 2008 Exit Poll and Implications for Future Noncoverage Bias; 2009; Mokrzycki, M., Keeter, S., Kennedy, C.
- Zero Banks: Coverage Error and Bias in RDD Samples Based on Hundred Banks with Listed Numbers; 2009; Boyle, J., Bucuvalas, M., Piekarski, L., Weiss, A.
- National Surveys Via RDD Telephone Interviewing vs. the Internet: Comparing Sample Representativeness...; 2009; Chang, L. C., Krosnick, J. A.
- Internet experiments: methods, guidelines, metadata; 2009; Reips, U. -D.
- Effects of incentives and the Big Five personality dimensions on internet panellists' ratings; 2009; Larson, A. J., Sachau, D. A.
- Best practices in mobile research; 2009; Zahariev, M., Ferneyhough, C., Ryan, C.
- Mobile interviewing; 2009; Lavine, S.
- Web 2.0: Transformational technology or pretty gradients and hype?; 2009; August, S., Ewing, T., Hamelle, A., Ryan, L.
- Innovative online research: The US presidential campaign of Barack Obama case study; 2009; Riley, R.
- A comparison of web-based and telephone surveys for assessing traffic safety concerns, beliefs, and...; 2009; Beck, K. H., Yan, A. F., Qi Wang, M.
- Design of Web Questionnaires: The Effect of Layout in Rating Scales; 2009; Toepoel, V., Das, M., van Soest, A.
- The Impact of Textual Messages of Encouragement on Web Survey Breakoffs: An Experiment; 2009; Sakshaug, J. W., Crawford, S. D.
- The Coverage Bias of Mobile Web Surveys Across European Countries; 2009; Fuchs, M., Busse, B.
- Item non-response rates: a comparison of online and paper questionnaires; 2009; Denscombe, M.
- Young people, the Internet and Political Participation: Findings of a web survey in Italy, Spain and...; 2009; Calenda, D., Meijer, A.
- Using mobile phones for survey research: A comparison with fixed phones; 2009; Vicente, P., Reis, E., Santos, R.
- A Comparison of Different Survey Periods in Online Surveys of Persons with Eating Disorders and Their...; 2009; Wesemann, D., Grunwald, A., Grunwald, M.
- A Comparison of Web-Based and Paper-Based Survey Methods Testing Assumptions of Survey Mode and Response...; 2009; Greenlaw, C., Brown-Welty, S.
- Interactivity in self-administered surveys. Influence on respondents' experience; 2009; Suarez Vazquez, A., Garcia Rodriguez, N., Alvarez, M. B.
- Selecting techniques for use in an internet survey; 2009; Wiley, J. B., Han, V., Albaum, G., Thirkell, P.
- Designing Effective Web Survey Forms; 2009; Mitchell, J.
- Web based macroseismic survey: fast information exchange and elaboration of seismic intensity effects...; 2009; De Rubeis, V., Sbarra P., Sorrentino, D., Tosi, P.
- The Effects of the Initial Mode of Contact on the Response Rate and Data Quality in an Internet-Based...; 2009; Wiseman, F.
- Internet-based surveys and urban design education: A community outreach graduate project in Redding,...; 2009; del Rio, V., Levi, D.
- An experimental mixed mode design on a general population survey; 2009; Eva, G.
- Declining Working Phone Rates Impact Sample Efficiency; 2009; Piekarski, L.
- Using Non-Probability Samples for Confusion Surveys - Mall Intercepts and the Internet; 2009; Ericksen, E. P.
- Using Debit Cards for Incentive Payments: Experiences of a Weekly Survey Study; 2009; Gatny, H. H., Couper, M. P., Axinn, W., Barber, J. S.
- Characteristics of Cell Phone Only, Listed, and Unlisted Telephone Households; 2009; Tarnai, J., Schultz, R., Moore, D.
- Cell Phone-Only Households: A Good Target for Internet Surveys?; 2009; Bates, N.
- Nonsampling Error Research in Practice; 2009; Brick, J. M., Kalton, G.
- Metrics for panel contribution: a non probabilistic platform; 2009; Gittelman, S. H., Trimarchi, E.
- Relation between values and topic of a survey in internet panel research; 2009; Vis, C., Marchand, M.
- The potential of mobile research: Implications for the future and the role of industry standards; 2009; Nelson, L.
- Lottery Style Incentives and Response Rates to Online Surveys; 2009; Pearson, J. E., Krosnick, J. A., Levine, R. E.
- Factors Contributing to Participation in Web‐based Surveys among Italian University Graduates; 2009; Cimini, C., Girottu, C., Gasperoni, G.
- Integration of different data collection techniques using the propensity score; 2009; Camillo, F., Conti, V., Ghiselli, S.
- The mixing of survey modes: application to Laon web and face‐to‐face household travel survey...; 2009; Bayart, C., Bonnel, P.
- Reason analysis: an ambitious alternative for mixed‐mode survey design; 2009; Jeřábek, H.
- Unit non‐response in panel surveys: empirical finding from an experiment; 2009; Haunberger, S.
- Do cash incentives help with RDD studies? Examination of results from a national and a statewide survey...; 2009; Miller, Y., Barger, K., Hearn, D.
- The Potential of a Multi-mode Data Collection Design to Reduce non-response bias. The Case of a Survey...; 2009; Sala, E., Lynn, P.
- Are people sharing their mobile phones? Selection probabilities in cellular telephone surveys; 2009; Fuchs, M., Busse, B.
- Improving Response Rates in Online Business Surveys by Using CATI; 2009; Höglinger, M., Abraham, M., Arpagaus, J.
- Survey cooperation: response to initial and follow-up requests - Recent experiences from the recruitment...; 2009; Bartsch, S., Engel, U., Schnabel, C., Vehre, H.
- Using Mobile Phones to Administer a Working Memory Updating Task in a Survey - Cognitive Performance...; 2009; Schmiedek, F., Riediger, M., Lindenberger, U., Wagner, G. G.
- Accessibility of individuals for mobile phone surveys; 2009; Gabler, S., Häder, S.
- Mixed Modes and Measurement Error: Comparing face-to-face, telephone and web modes; 2009; Hope, S., Nicolaas, G., Jäckle, A., Lynn, P., Nandi, A., Campanelli, P.
- The Presentation of a Web Survey, Nonresponse and Measurement Error among Members of Web Panel; 2009; Tourangeau, R., Groves, R. M., Kennedy, C., Yan, T.