Web Survey Bibliography
Relevance & Research Question: Attrition is a major methodological challenge for panel surveys (Lynn 2009). Yet there is a remarkable shortage of variables that are associated with both the propensity of respondents to stay in the panel and the variables of interest. As a result, propensity score weights designed to correct for this type of nonresponse frequently yield mixed results.
This paper addresses the question of whether paradata can be used to improve the prediction of attrition in panel Web surveys. Their main advantage is that they are collected as a byproduct of the survey process. However, it remains an open question which paradata can be used to model attrition and to what extent these paradata are correlated with the variables of interest (Kreuter and Olson 2013).
Methods & Data: We use logistic regressions to model attrition in a 7-wave panel Web survey and to compute propensity score weights. The models are fitted with sets of socio-demographic, substantive, survey evaluation, and paradata variables. The latter include measures of response times, user agent strings to determine the device used by the respondent, and indicators of the respondents' response behavior. Finally, we use supplemental cross-sectional Web surveys to assess the effectiveness of propensity score weights based on different sets of variables.
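A minimal sketch of this kind of weighting approach is given below, assuming a pandas DataFrame with one row per wave-1 respondent. It is an illustration under stated assumptions, not the authors' code: the DataFrame name, the column names (stayed, response_time, mobile_device, and so on), and the restriction of the weights to stayers are all hypothetical placeholders.

```python
# A minimal sketch of the weighting approach described above, assuming a
# pandas DataFrame `panel` with one row per wave-1 respondent. All column
# names ('stayed', 'response_time', ...) are hypothetical placeholders,
# not the variable names used in the paper.
import pandas as pd
import statsmodels.api as sm

def attrition_weights(panel: pd.DataFrame, predictors: list[str]) -> pd.Series:
    """Fit a logistic model of panel retention and return inverse
    propensity score weights for the respondents who stayed."""
    X = sm.add_constant(panel[predictors])
    y = panel["stayed"]  # 1 = retained through the last wave, 0 = attrited
    model = sm.Logit(y, X).fit(disp=False)
    # Estimated probability of staying in the panel, given the covariates.
    propensity = pd.Series(model.predict(X), index=panel.index)
    weights = 1.0 / propensity  # inverse propensity score weights
    return weights[panel["stayed"] == 1]  # weights apply to stayers only

# Example: a socio-demographic baseline versus a model that adds paradata.
# base = attrition_weights(panel, ["age", "female", "education"])
# full = attrition_weights(panel, ["age", "female", "education",
#                                  "response_time", "mobile_device"])
```

Comparing the fit of the baseline and the paradata-augmented model (and the resulting weighted estimates) corresponds to the comparison of variable sets described above.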
Results: Our results show that including paradata significantly improves the prediction of panel attrition. However, the paradata variables do not supersede the socio-demographic, survey evaluation, and substantive variables; rather, they complement them. Yet the paradata are at best moderately correlated with the variables of interest. As a result, including paradata does not significantly improve the effectiveness of propensity score weights.
Added Value: This paper extends existing knowledge in several ways: It presents a set of paradata variables and provides empirical tests of their capability to explain attrition. We show that these paradata can be used to create auxiliary data in a cost-efficient way. At the same time, we demonstrate that they ultimately do not help to correct for panel attrition. We therefore conclude that further research on paradata, panel attrition, and its correction is needed.
Web survey bibliography - General Online Research Conference (GOR) 2014 (29)
- Using Paradata to Predict and to Correct for Panel Attrition in a Web-based Panel Survey; 2014; Rossmann, J., Gummer, T.
- Targeting the bias – the impact of mass media attention on sample composition and representativeness...; 2014; Steinmetz, S., Oez, F., Tijdens, K. G.
- Offline Households in the German Internet Panel; 2014; Bossert, D., Holthausen, A., Krieger, U.
- Which fieldwork method for what target group? How to improve response rate and data quality; 2014; Wulfert, T., Woppmann, A.
- Exploring selection biases for developing countries - is the web a promising tool for data collection...; 2014; Tijdens, K. G., Steinmetz, S.
- Evaluating mixed-mode redesign strategies against benchmark surveys: the case of the Crime Victimization...; 2014; Klausch, L. T., Hox, J., Schouten, B.
- The quality of ego-centered social network data in web surveys: experiments with a visual elicitation...; 2014; Marcin, B., Matzat, U., Snijders, C.
- Switching the polarity of answer options within the questionnaire and using various numbering schemes...; 2014; Struminskaya, B., Schaurer, I., Bosnjak, M.
- Measuring the very long, fuzzy tail in the occupational distribution in web-surveys; 2014; Tijdens, K. G.
- Social Media and Surveys: Collaboration, Not Competition; 2014; Couper, M. P.
- Improving cheater detection in web-based randomized response using client-side paradata; 2014; Dombrowski, K., Becker, C.
- Interest Bias – An Extreme Form of Self-Selection?; 2014; Cape, P. J., Reichert, K.
- Online Qualitative Research – Personality Matters ; 2014; Tress, F., Doessel, C.
- Increasing data quality in online surveys 4.1; 2014; Hoeckel, H.
- Moving answers with the GyroScale: Using the mobile device’s gyroscope for market research purposes...; 2014; Luetters, H., Kraus, M., Westphal, D.
- Online Surveys as a Management Tool for Monitoring Multicultural Virtual Team Processes; 2014; Scovotti, C.
- How much is shorter CAWI questionnaire VS CATI questionnaire?; 2014; Bartoli, B.
- WEBDATANET: A Network on Web-based Data Collection, Methodological Challenges, Solutions, and Implementation...; 2014; Tijdens, K. G., Steinmetz, S., de Pedraza, P., Serrano, F.
- The Use of Paradata to Predict Future Cooperation in a Panel Study; 2014; Funke, F., Goeritz, A.
- Incentives on demand in a probability-based online panel: redemption and the choice between pay-out...; 2014; Schaurer, I., Struminskaya, B., Kaczmirek, L.
- The Effect of De-Contextualisation - A Comparison of Response Behaviour in Self-Administered Surveys; 2014; Wetzelhuetter, D.
- Responsive designed web surveys; 2014; Dreyer, M., Reich, M., Schwarzkopf, K.
- Extra incentives for extra efforts – impact of incentives for burdensome tasks within an incentivized...; 2014; Schreier, J. H., Biethahn, N., Drewes, F.
- Students First Choice – the influence of mobile mode on results; 2014; Maxl, E.
- Device Effects: How different screen sizes affect answer quality in online questionnaires; 2014; Fischer, B., Bernet, F.
- Moving towards mobile ready web panels; 2014; Wijnant, A., de Bruijne, M.
- Innovation for television research - online surveys via HbbTV. A new technology with fantastic opportunities...; 2014; Herche, J., Adler, M.
- Mixed-devices in a probability based panel survey. Effects on survey measurement error; 2014; Toepoel, V., Lugtig, P. J.
- Online mobile surveys in Italy: coverage and other methodological challenges; 2014; Poggio, T.