Web Survey Bibliography
a) Relevance & Research question: Although the use of a progress bar seems to be standard in many online surveys, there is no consensus in the literature regarding its effect on survey drop-off rates. Researchers hope that a progress bar helps reduce drop-off rates by giving respondents a sense of the survey's length and allowing them to monitor their progress through it.
b) Methods & Data: In this meta-analysis we analyzed 27 randomized experiments that compared the drop-off rates of an experimental group completing an online survey with a progress bar shown to those of a control group for whom the progress bar was not shown. In all studies, drop-offs were defined as respondents who did not fully complete the survey. Three types of bars were analyzed: a) linear (constant), b) fast first then slow, and c) slow first then fast.
c) Results: Random effects analysis was used to compute odds ratios (OR) for each study. Because the dependent variable was drop-off rate, an OR greater than 1 indicates that the progress bar group had a higher drop-off rate, while an OR lower than 1 indicates that the progress bar group had a lower drop-off rate. The OR for the 13 studies using a constant progress bar is 1.065 (p=0.304). The OR for the 7 studies using a fast-to-slow progress bar is 0.835 (p=0.131), whereas the OR for the 7 studies presenting the slow-to-fast progress bar is 1.564 (p=0.002). These preliminary results suggest that, contrary to widespread expectations, a constant progress indicator does not help reduce drop-off rates, while there is some indication that a fast-to-slow indicator does. Furthermore, the slow-to-fast bar increases drop-off rates compared to not showing a progress bar at all. We nevertheless do not recommend using the fast-to-slow progress indicator, because it is ethically questionable and contrary to the AAPOR and ESOMAR codes.
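The odds-ratio interpretation above can be made concrete with a minimal sketch. The counts below are hypothetical illustrations, not data from any of the 27 studies analyzed:

```python
def odds_ratio(dropoffs_treat, completes_treat, dropoffs_ctrl, completes_ctrl):
    """Odds ratio of dropping off for the progress-bar (treatment) group
    relative to the no-bar (control) group.

    OR > 1: higher drop-off odds with the progress bar shown.
    OR < 1: lower drop-off odds with the progress bar shown.
    """
    odds_treat = dropoffs_treat / completes_treat
    odds_ctrl = dropoffs_ctrl / completes_ctrl
    return odds_treat / odds_ctrl

# Hypothetical study: 30 drop-offs vs. 170 completes with a bar,
# 25 drop-offs vs. 175 completes without one.
or_example = odds_ratio(30, 170, 25, 175)  # about 1.24, i.e. slightly higher drop-off odds
```

In a random-effects meta-analysis, such per-study ORs would then be combined on the log scale, weighting each study by the inverse of its within-study plus between-study variance.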
d) Added value: To our knowledge, this is the first meta-analysis on the topic. An additional literature search will be performed, and we are awaiting data from some authors to add to the study.