A meta-analysis of experiments manipulating progress indicators in Web surveys (Callegaro, M., Villar, A., Yang, Y., 2011)
a) Relevance & Research question: Although the use of a progress bar seems to be standard in many online surveys, there is no consensus in the literature regarding its effect on survey drop-off rates. Researchers hope that a progress bar helps reduce drop-off rates by giving respondents a sense of the survey's length and allowing them to monitor their progress through it.
b) Methods & Data: In this meta-analysis we analyzed 27 randomized experiments that compared the drop-off rates of an experimental group that completed an online survey with a progress bar shown to the drop-off rates of a control group to whom the progress bar was not shown. In all the studies, drop-offs were defined as respondents who did not fully complete the survey. Three types of bars were analyzed: a) linear (constant speed), b) fast-to-slow, and c) slow-to-fast.
c) Results: Random-effects analysis was used to compute odds ratios (OR) for each study. Because the dependent variable was the drop-off rate, an OR greater than 1 indicates that the progress bar group had a higher drop-off rate, while an OR lower than 1 indicates that the progress bar group had a lower drop-off rate. The OR for the 13 studies using a constant progress bar is 1.065 (p=0.304). The OR for the 7 studies using a fast-to-slow progress bar is 0.835 (p=0.131), whereas the OR for the 7 studies presenting the slow-to-fast progress bar is 1.564 (p=0.002). These preliminary results suggest that, contrary to widespread expectations, a constant progress indicator does not help reduce drop-off rates, while there is some indication that the fast-to-slow indicator does. Furthermore, the slow-to-fast bar increases drop-off rates compared with not showing a progress bar at all. We nevertheless advise against using the fast-to-slow progress indicator, because it is ethically questionable and conflicts with the AAPOR and ESOMAR codes.
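The per-study effect size described above can be illustrated with a minimal sketch. The counts below are hypothetical (not data from any of the 27 experiments), and the 95% confidence interval uses the standard large-sample formula for the log odds ratio:

```python
import math

def drop_off_odds_ratio(drop_treat, complete_treat, drop_ctrl, complete_ctrl):
    """Odds ratio of dropping off (progress-bar group vs. control).

    OR > 1: higher drop-off in the progress-bar group;
    OR < 1: lower drop-off in the progress-bar group.
    Returns the OR and its large-sample 95% confidence interval.
    """
    or_ = (drop_treat * complete_ctrl) / (drop_ctrl * complete_treat)
    # Standard error of log(OR): sqrt of summed reciprocal cell counts
    se = math.sqrt(1 / drop_treat + 1 / complete_treat
                   + 1 / drop_ctrl + 1 / complete_ctrl)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical study: 60 of 300 drop off with the bar, 50 of 300 without
or_, ci = drop_off_odds_ratio(60, 240, 50, 250)
print(or_, ci)  # OR = 1.25 for these hypothetical counts
```

A random-effects meta-analysis, as used in the study, would then pool the per-study log odds ratios weighted by the inverse of their within-study variance plus an estimated between-study variance component.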
d) Added value: To our knowledge, this is the first meta-analysis on the topic. An additional literature search will be performed, and we are awaiting data from some authors to add to the study.