Web Survey Bibliography
Research Questions and Methodology: As response rates for online surveys continue to decline, three issues are of particular interest: Do lottery style incentives increase response to online surveys? While there is research to suggest that monetary incentives can increase response, other research suggests that lottery style incentives do not increase response, at least in mail surveys. Would results be different for online surveys? Prospect theory suggests that the way a proposition is framed can affect the decision people make, so is it more effective to offer respondents multiple chances to win a smaller gift…or to offer them a single chance to win a larger gift? Do lottery style incentives affect – either positively or negatively – the accuracy of the data collected, in terms of how representative the respondents are of the population?
To explore these issues, we conducted an online survey with alumni from Stanford University testing two incentives of the same expected value but framed differently, along with a control group offered no incentive. The survey was about resources and services available to Stanford alumni, and thus was salient to the general alumni population, not just to certain sub-groups. The population was defined as degree holders from Stanford from 1955-2007 who live in the continental USA or Canada, and for whom Stanford has an e-mail address. Three random samples of approximately 2,500 alumni were selected from the population: Sample 1 was offered no incentive to participate in the survey. Sample 2 was told that five randomly selected respondents (i.e., those who completed the survey) would each win a $100 Visa gift card. Sample 3 was told that one randomly selected respondent would win a $500 Visa gift card. The invitations were e-mailed on June 24, 2008, and three reminders were e-mailed to non-respondents at one-week intervals before the survey was taken off the web on July 20, 2008.
Results: The overall response rate was 31.9%. As we have consistently seen in fifteen years of surveying alumni from a wide range of institutions, overall response was significantly greater among alumni with whom the University has the strongest relationship – donors, Stanford Alumni Association members, and undergraduate alumni. Response was also somewhat greater among women than men, and among alumni 50 and older. The incented samples drew a response of 32.5%, while the non-incented sample drew a response of 30.6% – a marginally significant difference. Differences were also seen among certain demographic groups: Donors and women responded significantly more to the incentives than to no incentive, and alumni in their 30s and 40s responded marginally significantly more. Higher response to the incentives among donors and women (but not among non-donors and men) is noteworthy because overall response was also higher among donors and women – so not only did the incentives fail to improve data quality by increasing response among underrepresented groups, they in fact decreased data quality by further increasing the participation of overrepresented groups and magnifying the bias already present in the data.
Within the other demographic groups, response was at most a few points higher in the incented samples than in the non-incented sample, but the differences were not significant. Overall response to the two tested incentives was virtually identical – 32.6% for the sample offered five chances to win $100, and 32.4% for the sample offered one chance to win $500. Furthermore, response to the two incentives was the same across almost all demographic groups.
Implications: Even with incentives, if the sponsor of a survey is identified upfront to respondents, response will be higher among those with whom the sponsor has the closest relationship (in this case, donors, Alumni Association members, and undergraduate alumni). Response was a few percentage points higher in the incented samples than in the non-incented sample – both overall and across most demographic groups. Therefore, if it is of paramount importance to obtain every last possible respondent, lottery style incentives in an online survey may be of some benefit. Nevertheless, the difference in response was only marginally significant, so for many projects the incentives may not be worth the extra cost. And while the incentives added to the cost of the survey, they also added bias to the results, so the accuracy of the data collected must be considered when deciding whether or not to offer incentives. If a lottery style incentive is offered, it may not matter which (at least of the two incentives we tested) is used.
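The "marginally significant" overall difference reported above can be illustrated with a standard pooled two-proportion z-test. The sketch below is an approximation, not the authors' analysis: it assumes roughly 5,000 invitees across the two incented samples combined (32.5% response) and roughly 2,500 in the non-incented control (30.6% response), based on the approximate sample sizes given in the methodology.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z-test, two-sided, using the normal approximation."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Approximate figures from the abstract: incented samples combined vs. control.
z, p = two_proportion_z(0.325, 5000, 0.306, 2500)
```

With these rounded inputs the test yields a p-value a little under 0.10, consistent with calling the difference marginally significant at conventional thresholds; exact respondent counts would be needed to reproduce the study's own result.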
Web Survey Bibliography - Canada (120)
- Flexible insulin dosing improves health-related quality-of-life (HRQoL): a time trade-off survey; 2013; Evans, M. et al.
- 4 ways mobile research challenges insights pros; 2013; Rajan, B.
- Thoughts on using the new online qualitative tools; 2013; Freund, N. M.
- Optimizing quality of response through adaptive survey designs; 2013; Schouten, B., Calinescu, M., Luiten, A.
- Customer loyalty to a commercial website: Descriptive meta-analysis of the empirical literature and...; 2013; Toufaily, E., Ricard, L., Perrien, J.
- Discovering interest groups for marketing in virtual communities: An integrated approach; 2013; Wang, K.-Y., Wu, H.-J., Ting, I.-H.
- Multimode, Global Scale Usage: Understanding respondent scale usage across borders and devices; 2013; Pettit, F. A., Courtright, M.
- Insights into Action Profiling shopping occasions for retailers through mobile and online research; 2013; Churkina, O., Morris, T.
- Recent Experiences with Electronic Questionnaire Testing at Statistics Canada; 2013; Lawrence, D.
- A nationwide web-based freight data collection; 2013; Samimi, A., Mohammadian, A., Kawamura, K.
- Use of a Social Networking Web Site for Recruiting Canadian Youth for Medical Research; 2013; Chu, J. L., Snider, C. E.
- Mixed Mode: Phone and Web Discussion on Efficient Strategies; 2012; Gagnon, M.
- Survey Quality; 2012; Lyberg, L. E.
- Adaptive web sampling in ecology; 2012; Thompson, S. K.
- Respondent-driven sampling; 2012; Schonlau, M., Liebau, E.
- Reaching Under/Never Screened Populations Using an Online Survey; 2012; Filsinger, B., Gesink, D., Mihic, A., Kreiger, N.
- Evidence on the Comparison of Telephone and Internet Surveys for Respondent Recruitment.; 2012; Potoglou, D., Kanaroglou, P. S., Robinson, N.
- Firefly Online Surveys: A fully featured tool for Web surveys and forums; 2012; Deal, K.
- Transferring Telephone-Based National Household Travel Survey to the Internet; 2012; Son, S., Khattak, A., Wang, X., Chen, J.-Y.
- Where is Neutral? Using Negativity Biases to Interpret Thermometer Scores; 2012; Soroka, S., Albaugh, Q.
- Comparability of Internet and telephone data in a survey on the respiratory health of children; 2012; Plante, C., Jacques, L., Chevalier, S., Fournier, M.
- Online and Paper-Based: A Mixed-Method Approach to Conducting a Needs Assessment Survey of Physicians...; 2012; Olatunbosun, T., Wu, C., Grewal, G., Lynn, B.
- Thinking, Planning & Operationalizing Empirical Mixed Methods Research Design; 2012; Ruhi, U.
- Designing and Doing Survey Research; 2012; Andres, L.
- Validity and reproducibility of a web-based, self-administered food frequency questionnaire; 2012; Labonte, M.-E., Cyr, A., Baril-Gravel, L., Royer, M.-M., Lamarche, B.
- Validation of an Informant-Reported Web-Based Data Collection to Assess Dementia Symptoms; 2012; Rockwood, K., Zeng, A., Leibman, C., Mucha, L., Mitnitski, A.
- Assessing the Effects of Technical Variance on the Statistical Outcomes of Web Experiments Measuring...; 2012; Brand, A., Bradley, M. T.
- The cost-effectiveness of cash versus lottery incentives for a web-based, stated-preference community...; 2012; Gajic, A., Cameron, D., Hurley, J.
- What is Probit; 2011
- Survey Data Quality Provisions in Statistics Canada E-Questionnaire Solution: Retrospective and Perspectives...; 2011; Abiza, Y.
- Developing Electronic Questionnaire Guidelines: Issues and Challenges in a Changing Environment; 2011; Cote, A.-M., Kelly, P., Lawrence, D.
- Canadian online panels: Similar or different?; 2011; Chan, P., Ambrose, D.
- How far is too far: Traditional, flash and gamification interfaces, and implications for the future...; 2011; Puleston, J., Malinoff, B.
- The Evolution of Edits in the Canadian Census of Population Online Questionnaires; 2011; Laroche, D.
- Innovations in survey sampling design: Discussion of three contributions presented at the U.S. Census...; 2011; Opsomer, J.
- Alternative survey sample designs: Sampling with multiple overlapping frames; 2011; Lohr, S. L.
- Adaptive network and spatial sampling; 2011; Thompson, S. K.
- Apples and oranges: does a web survey produce similar results to social media tracking?; 2011; Bourque, C., Hobbs, R., Hilaire, D. S.
- Preferences, intentions, and expectation violations: A large-scale experiment with a representative...; 2011; Bellemare, C., Kroeger, S., van Soest, A.
- Method of administration affects adolescent post-immunization survey response rate: phone, paper, internet...; 2011; Pielak, K. L., Buxton, J., McIntyrea, C., Tu, A., Botnick, M. R.
- e-Collection at Statistics Canada; 2011; Faid, M.
- Nonsampling errors in dual frame telephone surveys; 2011; Brick, J. M., Flores Cervantes, I., Lee, S., Norman, G.
- Studying Political Behavior: A Comparison of Internet and Telephone Surveys; 2011; Stephenson, L. B., Crête, J.
- The feasibility of Web-based surveys as a data collection tool: a process evaluation; 2011; Chizawsky, L. L. K., Estabrooks, C. A., Sales, A. E.
- Accounting for the effects of data collection modes in population surveys; 2010; Huang, Y. C., Thompson, M. E., Boudreau, C., Fong, G. T.
- From Buzz to Biz: Social Media Research for Results; 2010; Pettit, F. A.
- The Rise of Survey Programming Automation as an Alternative to Outsourcing; 2010; Ahmed, A. S., Guyer, L.
- Sexy Questions, Dangerous Results? - Exploring the Impact of Rich Media Question Formats in Online Survey...; 2010; Malinoff, B.
- Statistics Canada's Process Automation of On-line Survey Development and CATI Data Integration; 2010; Dubois, M.-A.
- System and method of providing an online survey and summarizing survey response data; 2010; Ryan, C. J.