Web Survey Bibliography
(a) Relevance & Research Question: The proposed paper builds on findings presented by the authors at GOR 10. High drop-out rates are considered a major shortcoming of web surveys and a serious threat to data quality. However, despite growing scholarly attention, knowledge on survey drop-out remains fragmentary. Previous research has mainly addressed the impact of survey design, question wording, and respondent characteristics on drop-out via ex-post statistical methods. The research presented here is innovative in that respondents are asked directly, in a follow-up survey, about their reasons for dropping out, the interview situation, and their psychological predispositions.
(b) Methods & Data: Based on our previous research on survey drop-out, the principal investigators of the GLES granted funding for a series of short follow-up surveys of drop-outs. These surveys will be conducted after three consecutive online trackings of the GLES, beginning in December 2010. Based on prior experience, a gross sample of about 400 drop-outs per survey can be expected. Given an estimated response rate of 60 percent, a net sample of 210 to 240 interviews per tracking is anticipated, providing a unique database of more than 600 interviews with drop-outs. Since the most essential items are also included in the tracking surveys, the design allows for comparisons between drop-outs and respondents who completed the survey. Due to the exploratory character of the research, the presentation will mainly focus on descriptive statistics as well as multivariate models illustrating our major findings.
(c) Results: First results will be available by mid-January 2011.
(d) Added Value: Follow-up surveys of respondents who dropped out allow for an enhanced understanding of the complex processes underlying the phenomenon, especially with respect to respondents' subjective reasons as well as situational influences and psychological predispositions, which cannot be studied with ex-post statistical procedures. In this regard, our research will add to the knowledge on the reasons for drop-out in web surveys and improve both the theoretical explanations of drop-out and the prospects for reducing it.
Web Survey Bibliography (6476)
- What we can learn from unintentional mobile respondents; 2012; Peterson, G.
- Using paradata to explore item-level response times in surveys; 2012; Couper, M. P., Kreuter, F.
- Using multivariate statistics, 6th Edition; 2012; Tabachnick, B. G., Fidell, L. S.
- Unintentional mobile respondents; 2012; Peterson, G.
- Tracking preference expression (DNT); 2012
- The smartphone psychology manifesto; 2012; Miller, G.
- The rise of the "connected viewer"; 2012; Smith, A., Boyles, J. L.
- The practice of social research; 2012; Babbie, E. R.
- The integration of facebook into class management: an exploratory study; 2012; Chou, P. N.
- The effects of item saliency and question design on measurement error in a self-administered survey; 2012; Stern, M. J., Smyth, J. D., Mendez, J.
- The cross platform report. Q2 2012 - US; 2012
- Speed (necessarily) doesn’t kill: A new way to detect survey satisficing; 2012; Garland, P. et al.
- Smartphone ownership update: September 2012; 2012; Rainie, L.
- Sensitive topics in PC Web and mobile web surveys: Is there a difference?; 2012; Mavletova, A. M., Couper, M. P.
- Selection bias of internet panel surveys: A comparison with a paper-based survey and national governmental...; 2012; Tsuboi, S. et al.
- Screenwise panel: Frequently Asked Questions; 2012
- Research company spotlight - Mobile surveys; 2012
- Redeveloping the research section of Meningitis UK's website — A case study report; 2012; Witt, J. et al.
- Quality in market research. From theory to practice. 2nd Edition; 2012; Harding, D., Jackson, P.
- Participation of mobile users in traditional online studies; 2012; Jue, A.
- Online survey statistics for the mobile future. Updated with Q3 2012 data; 2012
- Ofcom technology tracker Wave 3; 2012
- Ofcom technology tracker Wave 2; 2012
- Not just playing around; 2012; Ewing, T.
- Norme di qualità Assirm (Assirm quality rules); 2012
- NBCU enlists Google, ComScore to track multiscreen Olympics viewing; 2012; Spangler, T.
- MRS Guidelines for online research; 2012
- More dirty little secrets of online panel research; 2012
- Mobile usability; 2012; Nielsen, J., Budiu, R.
- Mobile email opens report 2nd half 2011; 2012
- Metering mobile usage. Insights from global Arbitron mobile trends panel; 2012; Verkasalo, H.
- Media tracker; 2012
- Measuring the quality of governmental websites in a controlled versus an online setting with the ‘...; 2012; Elling, S. et al.
- Measuring modern media consumption; 2012; Arini, N.
- ISO 20252. Market, opinion and social research-Vocabulary and service requirements, 2nd Edition; 2012
- Is "chapterisation" a viable alternative to traditional progress indicators?; 2012; Spicer, R., Dowling, Z.
- Internet use in households and by individuals in 2012. Eurostat Statistics in Focus 50/2012; 2012; Seybert, H.
- Internet access - Households and individuals, 2012 part 2; 2012
- Internet access - Households and individuals, 2012; 2012
- Guide to social science data preparation. Best practice throughout the data life cycle; 2012
- Google et Médiamétrie créent une audience bimédia [Google and Médiamétrie create a bi-media audience]; 2012; Gonzales, P.
- GMI Pinnacle; 2012
- Global market research 2012; 2012
- Flowing with the mainstream. Is mobile market research finally living up to the hype?; 2012; Townsend, L.
- Explaining rising nonresponse rates in cross-sectional surveys; 2012; Brick, J. M., Williams, D.
- Eurobarometer Special surveys: Special Eurobarometer 381; 2012
- Online Surveys 2.0; 2012; Elferink, R.
- The Impact of Academic Sponsorship on Online Survey Dropout Rates; 2012; Allen, P. J., Roberts, L. D.
- Especially for You: Motivating Respondents in an Internet Panel by Offering Tailored Questions; 2012; Oudejans, M.
- Social media as a data collection tool: the impact of Facebook in behavioural research; 2012; Zoppos, E.