Web Survey Bibliography
a) Relevance & Research question: Although the use of a progress bar seems to be standard practice in many online surveys, there is no consensus in the literature on its effect on drop-off rates. Researchers hope that a progress bar reduces drop-off rates by giving respondents a sense of the survey's length and allowing them to monitor their progress through it.
b) Methods & Data: In this meta-analysis we analyzed 27 randomized experiments that compared the drop-off rates of an experimental group, which completed an online survey with a progress bar shown, to those of a control group, to which no progress bar was shown. In all studies, drop-offs were defined as respondents who did not fully complete the survey. Three types of bars were analyzed: a) linear or constant, b) fast first then slow, and c) slow first then fast.
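The three bar types above differ in how the displayed progress maps onto the true completion fraction. A minimal sketch of this mapping, assuming illustrative exponents (the reviewed studies do not specify a common functional form):

```python
def displayed_progress(fraction: float, style: str = "constant") -> float:
    """Progress-bar value shown for a given true completion fraction.

    `fraction` is the true share of the survey completed (0..1).
    The exponents below are illustrative assumptions, not parameters
    taken from any of the 27 experiments in the meta-analysis.
    """
    if not 0.0 <= fraction <= 1.0:
        raise ValueError("fraction must be between 0 and 1")
    if style == "constant":        # linear: shown progress equals true progress
        return fraction
    if style == "fast-to-slow":    # bar races ahead early, crawls near the end
        return fraction ** 0.5
    if style == "slow-to-fast":    # bar lags early, catches up near the end
        return fraction ** 2
    raise ValueError(f"unknown style: {style}")

# Halfway through the survey, the three bars show very different values:
for s in ("constant", "fast-to-slow", "slow-to-fast"):
    print(s, round(displayed_progress(0.5, s), 3))
```

Under these assumed exponents, a respondent at the true midpoint would see roughly 50%, 71%, or 25% completion depending on the bar type, which is the mechanism by which pacing could plausibly affect drop-off.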
c) Results: Random-effects analysis was used to compute odds ratios (OR) for each study. Because the dependent variable was the drop-off rate, an OR greater than 1 indicates that the progress-bar group had a higher drop-off rate, while an OR lower than 1 indicates that the progress-bar group had a lower drop-off rate. The OR for the 13 studies using a constant progress bar is 1.065 (p = 0.304). The OR for the 7 studies using a fast-to-slow progress bar is 0.835 (p = 0.131), whereas the OR for the 7 studies presenting the slow-to-fast progress bar is 1.564 (p = 0.002). These preliminary results suggest that, contrary to widespread expectations, a constant progress indicator does not help reduce drop-off rates, while there is some indication that the fast-to-slow indicator does. Furthermore, the slow-to-fast bar increases drop-off rates compared with not showing a progress bar at all. We nevertheless advise against the fast-to-slow progress indicator, because it is ethically questionable and contravenes the AAPOR and ESOMAR codes.
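The per-study odds ratio underlying these pooled estimates can be computed directly from each experiment's 2x2 table of drop-offs and completions. A minimal sketch, using hypothetical counts (not data from any of the 27 studies):

```python
import math

def odds_ratio(drop_t: int, complete_t: int,
               drop_c: int, complete_c: int) -> float:
    """Odds ratio of dropping off: treatment (bar shown) vs control (no bar)."""
    return (drop_t / complete_t) / (drop_c / complete_c)

def log_or_se(drop_t: int, complete_t: int,
              drop_c: int, complete_c: int) -> float:
    """Standard error of the log odds ratio (Woolf's formula),
    the quantity a random-effects meta-analysis pools across studies."""
    return math.sqrt(1 / drop_t + 1 / complete_t + 1 / drop_c + 1 / complete_c)

# Hypothetical study: 60 of 500 drop off with a bar, 50 of 500 without.
or_ = odds_ratio(drop_t=60, complete_t=440, drop_c=50, complete_c=450)
print(round(or_, 3))  # OR > 1: the progress-bar group dropped off more often
```

Each study contributes its log OR weighted by the inverse of its variance (plus a between-study variance term under the random-effects model); the pooled values reported above are on this OR scale.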
d) Added value: To our knowledge, this is the first meta-analysis on the topic. Additional literature searches will be performed, and we are awaiting data from several authors to add to the study.