Web Survey Bibliography
In the field of online research, and particularly with respect to online access panels, how much information a research institute should give respondents when sending out survey invitations remains a matter of debate.
On the one hand, revealing too many details carries the risk of self-selection among potential participants. Furthermore, if the topic is disclosed, experienced panellists interested only in incentives may adjust their answers to the screening questions in order to qualify for every possible survey. Such behaviour is likely to produce untruthful answers and bias the studies' results.
On the other hand, including too little information carries the risk that potential participants decline to take part because they are unsure, for instance, about the incentives or about how much time completing the questionnaire will take.
Consequently, the following questions arise:
Is it sufficient simply to inform panel members that a questionnaire is waiting for completion? Should the email include information on the length of the survey, the field time, and the incentives paid for completion? Or would announcing the topic of the survey even benefit response and results?
The lecture aims to answer these questions.
It is based on an empirical study comprising about 30 experiments with 1,000 respondents each, in which the invitation emails were systematically varied and the resulting response rates monitored.
The information content of the emails was varied as follows: one group of panellists always received an email containing extensive information about the study; two further groups received either information on the questionnaire's length and the incentive, or information on the topic of the survey; a final group received an invitation containing only a link to the questionnaire, with no further description of the survey at all.
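The four-condition design described above can be sketched in code. This is a minimal illustration, not the authors' actual procedure: the condition names, panellist IDs, and helper functions below are all hypothetical.

```python
import random
from collections import Counter

# Hypothetical labels for the four invitation conditions described in the abstract.
CONDITIONS = ["full_info", "length_and_incentive", "topic_only", "link_only"]

def assign_conditions(panellist_ids, seed=42):
    """Randomly assign each panellist to one of the four invitation conditions."""
    rng = random.Random(seed)
    return {pid: rng.choice(CONDITIONS) for pid in panellist_ids}

def response_rates(assignment, responded):
    """Compute the per-condition response rate, given the set of IDs that completed."""
    invited = Counter(assignment.values())
    completed = Counter(cond for pid, cond in assignment.items() if pid in responded)
    return {cond: completed[cond] / invited[cond] for cond in invited}
```

Comparing the rates returned by `response_rates` across conditions is the kind of analysis the study's monitoring of response rates implies.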
The analysis of these projects aims to identify and present the optimal information content for email invitations to online surveys.