Web Survey Bibliography
Title Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance for data quality
Author Wetzlehuetter, D.
Year 2017
Access date 15.09.2017
Abstract Starting point and focus: The internet cannot be ignored as a quick, practical and economical source of information and a nearly unlimited communication channel: a mass medium (online news), a mainstream medium (social media) and an individual medium (email). The number of web surveys and of methods for conducting them has grown with internet use. For instance, the Arbeitskreis Deutscher Markt- und Sozialforschungsinstitute e.V. recorded a continuous increase in the share of quantitative web surveys among its members, from 1% in 1998 to 16% in 2004, 38% in 2010 and 43% in 2014. However, web-based surveys – as extensive discussions show – are not free of controversy. Questionable data quality is an increasingly common concern, typically regarding the representativeness of the data (coverage error / missing data) and the difficulty of achieving unbiased responses (measurement errors) caused by the device used (mode effects). Errors caused by continuously rising proportions of drop-outs and item nonresponses in online surveys are almost equally relevant; nevertheless, these sources of error are repeatedly neglected to a certain degree.
As the starting point of the paper, it is assumed that drop-out rates and item-nonresponse rates in online surveys differ as context-sensitive response behaviour (whether or not the respondent is at home, and whether or not a smartphone is used). This means that systematic errors linked to the interview situation (in terms of location and device) are conceivable. Accordingly, the presentation aims to illustrate how, and to what extent, the context of the interview situation has to be considered in the cleansing and analysis of data collected online in order to avoid biased results as far as possible.
Methods and Data: To test this assumption, an online survey on the participation of university students is used. An experimental design was applied, on the one hand to provoke drop-outs and on the other to test the consequences of different motivation strategies (prospect of winning a prize, appeals, manipulation of the progress bar) that are easy to insert and therefore often used in online surveys. For this purpose, an unusually long questionnaire (23 online pages, 121 items) incorporating the different motivation strategies was developed. Of the students invited to take part in the survey (n=17,491), 14.2% reacted to the invitation, 1,916 (11%) answered at least one question and just 7.3% (n=1,282) reached the final page.
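As a quick plausibility check, the reported percentages can be reproduced from the invited sample size; a minimal sketch in Python, using only the counts quoted in the abstract:

    # Back-of-the-envelope check of the reported participation figures.
    invited = 17_491      # students invited (reported)
    started = 1_916       # answered at least one question (reported)
    completed = 1_282     # reached the final page (reported)

    for label, count in [("started", started), ("completed", completed)]:
        print(f"{label}: {count} ({count / invited:.1%} of invited)")
    # -> started: 1916 (11.0% of invited)
    # -> completed: 1282 (7.3% of invited)

    # Only the 14.2% reaction rate is reported, so the count is implied:
    print(f"reacted (implied): {round(invited * 0.142)}")  # -> 2484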
Results: Drop-out rates and item-nonresponse rates differ depending on the survey context specified above: not being at home and using a smartphone both increase them. The motivation strategies used work differently: they reduce the risk of nonresponse only among those who were at home and not using a smartphone. However, data cleansing does not affect the sample composition with respect to study-related characteristics. Detailed analyses show that the influence of the defined survey context on substantive findings varies. Based on this, the presentation will emphasize the importance of recording and considering the context information of data collection for data cleansing, analysis and interpretation of results, and will discuss how this can be done.
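The analysis implies that location and device were captured as paradata and used to stratify drop-out and item-nonresponse rates. A minimal sketch of such a context-sensitive tabulation in Python; the field names and toy records are hypothetical illustrations, not the author's data:

    # Stratify drop-out and item-nonresponse rates by survey context.
    # Records: (at_home, smartphone, reached_final_page, items_answered, items_shown)
    from collections import defaultdict

    respondents = [
        (True,  False, True,  121, 121),
        (True,  True,  False,  40,  58),
        (False, True,  False,  12,  30),
        (False, False, True,  110, 121),
    ]

    stats = defaultdict(lambda: {"n": 0, "dropouts": 0, "answered": 0, "shown": 0})
    for at_home, smartphone, finished, answered, shown in respondents:
        s = stats[(at_home, smartphone)]
        s["n"] += 1
        s["dropouts"] += 0 if finished else 1
        s["answered"] += answered
        s["shown"] += shown

    for (at_home, smartphone), s in sorted(stats.items()):
        dropout_rate = s["dropouts"] / s["n"]
        item_nonresponse = 1 - s["answered"] / s["shown"]
        print(f"at_home={at_home!s:5} smartphone={smartphone!s:5} "
              f"drop-out={dropout_rate:.0%} item-nonresponse={item_nonresponse:.0%}")

In practice the device can usually be inferred from the user-agent string recorded by the survey software, while location ("at home or not") typically has to be asked directly, so recording it must be planned before fieldwork.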
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - Other (439)
- Answering Without Reading: IMCs and Strong Satisficing in Online Surveys; 2017; Anduiza, E.; Galais, C.
- Ideal and maximum length for a web survey; 2017; Revilla, M.; Ochoa, C.
- Web Survey Gamification - Increasing Data Quality in Web Surveys by Using Game Design Elements; 2017; Schacht, S.; Keusch, F.; Bergmann, N.; Morana, S.
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Comparability of web and telephone surveys for the measurement of subjective well-being; 2017; Sarracino, F.; Riillo, C. F. A.; Mikucka, M.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico.; 2017; Bosch Jover, O.; Revilla, M.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- Rates, Delays, and Completeness of General Practitioners’ Responses to a Postal Versus Web-Based...; 2017; Sebo, P.; Maisonneuve, H.; Cerutti, B.; Pascal Fournier, J.; Haller, D. M.
- Oversampling as a methodological strategy for the study of self-reported health among lesbian, gay and...; 2017; Anderssen, N.; Malterud, K.
- The effect of the visual orientation of the response scale and the number of web-questionnaire pages on survey results... (in Croatian); 2017; Malikovic, M.; Svegar, D.; Somodzi, S.
- How to Design a Web Survey Using Spring Boot With MYSQL: a Romanian Network Case Study; 2017; Bucea-Manea-Tonis, Ro.; Bucea-Manea-Tonis, Ra.
- Analyzing Survey Characteristics, Participation, and Evaluation Across 186 Surveys in an Online Opt-...; 2017; Revilla, M.
- Comparative analysis of a mobile device and paper as effective survey tools; 2017; Kim, K. J.; Bae, S.; Park, E.
- Enhancing survey participation: Facebook advertisements for recruitment in educational research; 2017; Forgasz, H.; Tan, H.; Leder, G.; McLeod, A.
- Virtual reality meets sensory research; 2017; Depoortere, L.
- PC, phone or tablet? Use, preference and completion rates for web surveys ; 2017; Brosnan, K.; Gruen, B.; Dolnicar, S.
- “Better do not touch” and other superstitions concerning melanoma: the cross-sectional web...; 2016; Gajda, M.; Kamińska-Winciorek, G.; Wydmański, J.; Tukiendorf, A.
- Making use of Internet interactivity to propose a dynamic presentation of web questionnaires; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- A streamlined approach to online linguistic surveys; 2016; Erlewine, M. Y.; Kotek, H.
- You're not getting in here: bouncer questions identify careless participants in online surveys (in German); 2016; Merkle, B.; Kaczmirek, L.; Hellwig, O.
- Smartphones vs PCs: Does the Device Affect the Web Survey Experience and the Measurement Error for...; 2016; Toninelli, D.; Revilla, M.
- Estimation and Adjustment of Self-Selection Bias in Volunteer Panel Web Surveys ; 2016; Niu, Ch.
- Sensitive Questions in Online Surveys: An Experimental Evaluation of Different Implementations of the...; 2016; Hoglinger, M.; Jann, B.; Diekmann, A.
- Design and test of a web-survey for collecting observer’s ratings on dairy goats’ behavioural...; 2016; Vieira, A.; Oliveira, M. D.; Nunes, T.; Stilwell, G.
- Can Student Populations in Developing Countries Be Reached by Online Surveys? The Case of the National...; 2016; Langer, A.; Meuleman, B.; Oshodi, A.-G. T.; Schroyens, M.
- Feature phones no barrier to conducting an effective conjoint study ; 2016; de Rooij, R.; Dossin, R.
- Patient preference: a comparison of electronic patient-completed questionnaires with paper among cancer...; 2016; Martin, P.; Brown, M.C.; Espin‐Garcia, O.; Cuffe, S.; Pringle, D.; Mahler, M.; Villeneuve, J.;...
- Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive...; 2016; Toninelli, D.; Revilla, M.
- Device use in web surveys: The effect of differential incentives; 2016; Mavletova, A. M.; Couper, M. P.
- Device Effects - How different screen sizes affect answers in online surveys; 2016; Fisher, B.; Bernet, F.
- Do Initial Respondents Differ From Callback Respondents? Lessons From a Mobile CATI Survey; 2016; Vicente, P.; Marques, C.
- The use of online social networks as a promotional tool for self-administered internet surveys; 2016; de Rada, V. D.; Arino, L. V. C; Blasco, M. G
- Assessing the Effects and Effectiveness of Attention-check Questions in Web Surveys: Evidence From a...; 2016; Vannette, D.
- Mode Effects on Subjective Well-being Research: Do they Affect Regression Coefficients? ; 2016; Sanchez Tome, R.; Roberts, C.; Staehli, M. E.; Joye, D.
- Evaluating a Modular Design Approach to Collecting Survey Data Using Text Messages ; 2016; West, B. T.; Ghimire, D.; Axinn, W.
- Reaching the Mobile Generation: Reducing Web Survey Non-response through SMS Reminders ; 2016; Kanitkar, K. N.; Marlar, J.
- Safety First: Ensuring the Anonymity and Privacy of Iranian Panellists While Creating Iran...; 2016; Farmanesh, A.; Mohseni, E.
- Non-Observation Bias in an Address-Register-Based CATI/CAPI Mixed Mode Survey; 2016; Lipps, O.
- Web surveys for offline rural communities ; 2016; Gichohi, B. W.
- On-line life history calendar and sensitive topics: A pilot study; 2016; Morselli, D.; Berchtold, A.; Granell, J.-C. S.; Berchtold, And.
- An experiment comparing grids and item-by-item formats in web surveys completed through PCs and smartphones...; 2016; Revilla, M.; Toninelli, D.; Ochoa, C.
- Improving Inpatient Surveys: Web-Based Computer Adaptive Testing Accessed via Mobile Phone QR Codes; 2016; Chien, T. S.; Lin, W.S.
- Pre-Survey Text Messages (SMS) Improve Participation Rate in an Australian Mobile Telephone Survey:...; 2016; Dal Grande, E.; Chittleborough, C. R.; Campostrini, S.; Dollard, M.; Taylor, A. W.
- Short and Sweet? Length and Informative Content of Open-Ended Responses Using SMS as a Research Mode; 2016; Walsh, E.; Brinker, J. K.
- Mixing modes of data collection in Swiss social surveys: Methodological report of the LIVES-FORS mixed...; 2016; Roberts, C.; Joye, D.; Staehli, M. E.
- What is the gain in a probability-based online panel to provide Internet access to sampling units that...; 2016; Revilla, M.; Cornilleau, A.; Cousteaux, A-S.; Legleye, S; de Pedraza, P.