Web Survey Bibliography
Compared to laboratory studies, participants are much more likely to drop out of Web studies, particularly when sensitive questions are asked. The high hurdle technique is thought to control dropout by concentrating de-motivating factors at the very beginning of a study, or even earlier, so as to reduce subsequent dropout; however, previous research is inconclusive as to its impact and underlying process (Göritz & Stieger, 2007; Reips, 2002). To investigate the technique's dependence on other factors and its impact on dropout behavior, we conducted a Web experiment in which we (1) measured pre-experimental intended seriousness and (2) manipulated the high hurdle and the placement of sensitive items. Twelve conditions resulted from the 2 (seriousness) × 2 (high hurdle) × 3 (placement of sensitive items) between-subjects design.
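The factorial design above can be sketched as a Cartesian product of its three factors. This is a minimal illustration, not the authors' materials; the factor labels (level names for seriousness, hurdle, and placement) are assumptions chosen for readability — note that seriousness was measured rather than manipulated, so it enters the design as a classification factor.

```python
from itertools import product

# Hypothetical level labels for the 2 x 2 x 3 between-subjects design.
# "seriousness" was measured pre-experimentally, not randomly assigned.
seriousness = ["low", "high"]                  # intended seriousness (measured)
hurdle = ["control", "high_hurdle"]            # high-hurdle introduction present or absent
placement = ["beginning", "middle", "end"]     # position of the sensitive items

# Enumerate all experimental cells of the design.
conditions = list(product(seriousness, hurdle, placement))

print(len(conditions))  # 12 cells
for cell in conditions:
    print(cell)
```

Each tuple identifies one of the twelve cells to which (within each seriousness group) participants were assigned.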
Dependent measures were dropout, data quality, and reaction time. After the seriousness check item, 396 students answered a questionnaire about sport and nutrition. The high hurdle was placed in the introduction: participants were told that the questionnaire contained sensitive items that might make them feel uncomfortable. In the control condition, there was no such high hurdle. The sensitive items were placed either at the beginning, in the middle, or at the end of the experiment.
There was significantly more dropout directly after the high hurdle introduction than in the control condition (p < .001), i.e., the high hurdle served its first purpose. There was also an interaction with intended seriousness.
Further analyses showed that satisficers (people who cease to deliver accurate data in favor of rough estimates, likely due to decreasing motivation) had more missing values than non-satisficers, in contradiction to widespread intuition. The assumption that participants take longer to answer sensitive items was confirmed (p = .005). Also, at sensitive items the frequency of chosen "don't-want-to-answer" options exceeded the number of missing values, whereas at non-sensitive items more missing values than "don't-want-to-answer" options were counted. Implications of these results for research and questionnaire design will be discussed in the presentation.
Web Survey Bibliography - Other (471)
- An Analyze of the Zero Price Effect on Online Business Performance - An Research Based on the Mobile...; 2010; Liu, Y., Yuan, P.
- Dealing with Nonresponse in Survey Sampling: an Item Response Modeling Approach; 2010; Matei, A.
- Blogosphere and Democracy in Portugal–Results of a Websurvey; 2010; Carvalho, T., Casanova, J. L.
- Establishing a Qualitative Data Archive in Austria; 2010; Smioski, A.
- Web Based Paper Form Verification Using QueXF; 2010; Zammit, A.
- Does Making The Survey Topic More Salient Lead To An Expert Bias? – The Influence of Announcing...; 2010; Keusch, F., Mayerhofer, W., Weilbuchner, N., Jungreithmaier, S.
- Developing and Evaluating a Student Online Panel.; 2010; Stiglbauer, B., Gamsjäger, M., Gnambs, T., Batinic, B., Altrichter, H.
- Online Access Panels: A detailed look at different Ways of Entering, their Costs and Participation Behavior...; 2010; Führer, R., Keusch, F.
- Theoretical model of context-sensitive mobile methods; 2010; Maxl, E.
- Can a professional questionnaire layout make up for a boring topic? The mediating role of topic interest...; 2010; Keusch, F., Mayerhofer, W., Jungreithmaier, S., Weilbuchner, N., Führer, R., Kling, H.
- Using Propensity Score Weighting to Reduce Bias of a Swiss Market Research Web Panel; 2010; Wiegand, G., Jella, H., Beat, H., Stefan, L.
- A short self-report measure of problems with executive function suitable for administration via the...; 2010; Buchanan, T., Heffernan, T. M., Parrott, A. C., Ling, J., Rodgers, J., Scholey, A. B.
- Combining Link-Tracing Sampling and Cluster Sampling to Estimate Totals and Means of Hidden Human Populations...; 2010; Félix-Medina, M. H., Monjardin, P. E.
- Quality in Unimode and Mixed-Mode designs: A Multitrait-Multimethod approach; 2010; Revilla, M.
- Some Notes on the Probability Space of Statistical Surveys; 2010; Petrakos, G.
- The Ethics of Outsourcing Online Survey Research; 2010; Allen, P. J., Roberts, L. D.
- Comparative effectiveness report: online survey tools; 2010; Gottliebson, D., Layton, N., Wilson, E.
- Web survey design and usability; 2010; Karakoyun, F., Kurt, A. A.
- Web panels: Replacement technology for market research; 2010; Goeritz, A.
- Test-retest reliability of a questionnaire for the Korea youth risk behavior web-based survey; 2010; Bae, J., Joung, H., Kwon, K. N., Kim, Y. T., Kim, J. -Y., Park, S. -W.
- Nine issues for Internet-based survey research in service industries ; 2010; Wang, H.-C., Doong, H.-S.
- Color red in web-based knowledge testing; 2010; Gnambs, T., Appel, M., Batinic, B.
- A comparison of surveys using different modes of data collection; 2010; Revilla, M., Saris, W. E.
- Exploring the feasibility of an online contextualised animation-based questionnaire for educational...; 2010; Chien, Y.-T., Chang, C.-Y.
- Examining the effects of website-induced flow in professional sporting team websites; 2010; O'Cass, A., Carlson, J.
- Research into questionnaire design - A summary of the literature; 2010; Lietz, P.
- Improving sample quality by utilising the multiple contacts approach; 2010; Przewlocka, J.
- Relationship of Metacognitive Monitoring with Interaction in an Asynchronous Online Discussion Forum; 2010; Topcu, A.
- A Research on the Application of Web Survey in Sport; 2010; Chen, Z. X., Pan, D. L., Lu, H. N.
- A Spanish Continuous Volunteer Web Survey: Sample Bias, Weighting and Efficiency; 2010; de Pedraza, P., Tijdens, K., de Bustillo, R., Steinmetz, S.
- College Students' Response Rate to an Incentivized Combination of Postal and Web-Based Health Survey; 2010; Balajti, I., Daragó, L., Ádány, R., Kósa, K.
- Improving the response rate and quality in Web-based surveys through the personalization and frequency...; 2010; Muñoz-Leiva, F., Sánchez-Fernández, J., Montoro-Ríos, F. J., Ibáñez-Zapata, J. A.
- What are participants doing while filling in an online questionnaire: A paradata collection tool and...; 2010; Stieger, S., Reips, U. -D.
- An analysis of the effect of pre-incentives and post-incentives based on draws on response to web surveys...; 2010; Sánchez-Fernández, J., Muñoz-Leiva, F., Montoro-Ríos, F. J., Ibáñez-Zapata, J. A.
- MarketTools TrueSample; 2009
- ISO 26362 Access panels in market, opinion, and social research-Vocabulary and service requirements; 2009
- Getting data for (business) statistics: What's new? What's next?; 2009; Snijkers, G.
- Social Network Services as Data Sources and Platforms for e-Researching Social Networks; 2009; Ackland, R.
- Stochastic properties of the Internet sample; 2009; Getka-Wilczynska, E.
- Web based survey: an emerging tool; 2009; Srivenkataramana, T., Saisree, M.
- The impact of gender in e-mailed survey invitations; 2009; Derham, P.
- The Coverage Bias of Mobile Web Surveys Across European Countries ; 2009; Fuchs, M., Busse, B.
- Young people, the Internet and Political Participation: Findings of a web survey in Italy, Spain and...; 2009; Calenda, D., Meijer, A.
- Interactivity in self-administered surveys. Influence on respondents' experience; 2009; Suarez Vazquez, A., Garcia Rodriguez, N., Alvarez, M. B.
- Selecting techniques for use in an internet survey; 2009; Wiley, J. B., Han, V., Albaum, G., Thirkell, P.
- Designing Effective Web Survey Forms; 2009; Mitchell, J.
- Metrics for panel contribution: a non probabilistic platform; 2009; Gittelman, S. H., Trimarchi, E.
- Mode effects in Switzerland: non‐response and measurement error on the European Social Survey; 2009; Roberts, C.
- Reason analysis: an ambitious alternative for mixed-mode survey design; 2009; Jeřábek, H.
- Response rates in multi actor surveys; 2009; Pasteels, I., Ponnet, K., Mortelmans, D.