Web Survey Bibliography
Incentives are commonly used to boost survey response rates, although the evidence remains limited as to whether they actually reduce nonresponse bias. We further speculate that incentives may increase measurement error if they attract respondents who participate only to obtain the incentive and are therefore unwilling to expend much effort. We explore the impact of incentives on data quality by manipulating incentive salience in the email invitations for a Web survey of university staff about their use of information technology for work duties. Two invitation designs are compared. In one condition, we make the incentive salient by starting the email subject line with the incentive and placing a statement about it at the beginning of the invitation, in bold and larger font. In the other condition, the importance of the survey is emphasized instead. Both conditions thus inform staff of the incentive and of the survey's importance, but with different emphases. To assess nonresponse error, we compare response rates between users and non-users of a campus-wide, web-based information management system; usage status is known for all sampled staff and is related to several of the survey variables. To evaluate measurement error, we ask two questions for which we hold administrative records. We also examine indirect indicators of response quality, such as completion times and the length of open-ended responses. We find that making the incentive salient increases the response rate, particularly among non-users of the system, suggesting reductions in nonresponse bias for some survey estimates. Incentive salience also affects the composition of respondents with respect to their motivations for taking the survey, which we find are related to response quality. The findings suggest a potential link between nonresponse and measurement error driven by incentives.
Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 66th Annual Conference, 2011 (26)
- The smart(phone) way to collect survey data; 2013; Stapleton, C.
- Exploring Health-related Experiences and Access to Care: Differences between Online and Telephone Survey...; 2011; Doty, M. M., Peugh, J., Shand-Lubbers, J.
- Using Community Information and Survey Methodology for Bias Reduction to Enhance the Quality of the...; 2011; Harvey, J., Prabhakaran, J., Spera, C., Zhang, Zh.
- Response Quantity, Response Quality, and Costs of Building an Online Panel via Social Contacts; 2011; Toepoel, V.
- The Influence of the Direction of Likert-Type Scales in Web Surveys on Response Behavior in Different...; 2011; Keusch, F.
- An Injured Party?: A Comparison of Political Party Response Formats in Party Identification; 2011; Schwarz, S., Barlas, F. M., Thomas, R. K., Corso, R. A., Szoc, R.
- Asking Sensitive Questions: Do They Affect Participation in Follow-Up Surveys?; 2011; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Designing Questions for Web Surveys: Effects of Check-List, Check-All, and Stand-Alone Response Formats...; 2011; Dykema, J., Schaeffer, N. C., Beach, J., Lein, V., Day, B.
- Differential Sampling Based on Historical Individual-Level Data in Online Panels; 2011; Kelly, R. H.
- Web Survey Live Validations - What Are They Doing?; 2011; Crawford, S. D., McClain, C.
- Comparing Numeric and Text Open-End Responses in Mail and Web Surveys; 2011; Olson, K., Smyth, J.
- Effects of Response Formats when Measuring Attitudes in Consumer Web Surveys Across Markets; 2011; Couper, M. P., Nunge, E.
- Re-Examining the Validity of Different Survey Modes for Measuring Public Opinion in the U.S.: Findings...; 2011; Ansolabehere, S., Fraga, B., Schaffner, B. F.
- How to Survey All 14 000 Swedish Local Political Representatives and Get 10 000 Responses; 2011; Gilljam, M., Granberg, D., Holm, B., Persson, M.
- Measuring User Satisfaction in the Lab: Questionnaire Mode, Physical Location, and Social Presence Concerns...; 2011; Jans, M., Romano, J. C., Ashenfelter, K. T., Krosnick, J. A.
- Interactive Interventions in Web Surveys Can Increase Response Accuracy; 2011; Conrad, F. G.
- Impact on Data Quality of Making Incentives Salient in Web Survey Invitations; 2011; Zhang, Che.
- Effects of Mode and Incentives on Response Rates, Costs, and Response Quality in a Mixed Mode Survey...; 2011; Stevenson, J., Dykema, J., Kniss, C., Black, P., Moberg, P.
- Effects of Differential Incentives on Response Rates in Four Countries for a Web-based Follow Up Survey...; 2011; McSpurren, K.
- Completing Web Surveys on Cell-enabled iPads; 2011; Dayton, J., Driscoll, H.
- The Social Aspect of the Digital Divide; 2011; Johnson, E. P.
- Which Technologies Do Respondents Use in Online Surveys – An International Comparison?; 2011; Kaczmirek, L., Behr, D., Bandilla, W.
- Matrix Questionnaire Design to Reduce Measurement Error; 2011; Peytchev, A., Peytcheva, E.
- Race-of-Virtual-Interviewer Effects; 2011; Conrad, F. G., Schober, M. F., Nielsen, D.
- Which Web Survey Respondents Are Most Likely to Click for Clarification?; 2011; Coiner, T., Schober, M. F., Conrad, F. G.
- Providing Clarifying Instructions in a Web Survey; 2011; Redline, C. D.