Web Survey Bibliography
Attrition is the process of respondents dropping out of a panel study. Errors resulting from attrition decrease statistical power and can bias estimates derived from survey data. As panels are increasingly used in the social sciences as a source of empirical data, a good understanding of the determinants and consequences of attrition is important for all social scientists who work with panel study data. In many panel surveys, the process of attrition is more subtle than being either in or out of the study. Respondents often miss one or more waves but return after that, or start off responding infrequently and participate more often later in the course of the study. Such non-monotone attrition patterns are difficult to incorporate in current models of attrition, yet they are common in long-running panels and in panels that collect data frequently. To separate groups of respondents that each follow a distinct process of attrition, a Latent Class model is used. Using background characteristics for a panel survey of 8,000 respondents who were recruited with a probability-based method into the Web-based LISS panel, I show, for example, that respondents who loyally participate in every wave (stayers) are older and more conscientious than attriters, while infrequent respondents (lurkers) are younger and less educated. These characteristics can be linked to theories on panel participation and reasons for dropout. I conclude by showing how each class contributes to attrition bias in estimates of voting behavior, and discuss ways to use attrition models to improve the panel survey process.
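The latent class idea above can be sketched in code: treat each respondent's wave-participation history as a vector of 0/1 indicators and fit a finite mixture of independent Bernoulli responses with EM. This is a minimal illustration on simulated data with assumed response rates for two hypothetical classes ("stayers" and "lurkers"); it is not the abstract's actual model or the LISS panel data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated participation data: 1 = responded in a wave, 0 = missed it.
# Assumed rates: "stayers" respond ~95% of waves, "lurkers" ~25%.
n_waves = 10
stayers = rng.binomial(1, 0.95, size=(300, n_waves))
lurkers = rng.binomial(1, 0.25, size=(100, n_waves))
Y = np.vstack([stayers, lurkers])

def fit_latent_class(Y, n_classes=2, n_iter=200):
    """EM for a latent class model: within each class, waves are
    independent Bernoulli draws with class-specific response probabilities."""
    n, T = Y.shape
    pi = np.full(n_classes, 1.0 / n_classes)            # class proportions
    theta = rng.uniform(0.2, 0.8, size=(n_classes, T))  # response probabilities
    for _ in range(n_iter):
        # E-step: posterior class membership for each respondent
        log_lik = (Y[:, None, :] * np.log(theta)[None]
                   + (1 - Y[:, None, :]) * np.log(1 - theta)[None]).sum(axis=2)
        log_post = np.log(pi)[None] + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update class proportions and response probabilities
        pi = post.mean(axis=0)
        theta = (post.T @ Y) / post.sum(axis=0)[:, None]
        theta = np.clip(theta, 1e-6, 1 - 1e-6)
    return pi, theta, post

pi, theta, post = fit_latent_class(Y)
print(np.round(np.sort(pi), 2))                  # recovered class proportions
print(np.round(np.sort(theta.mean(axis=1)), 2))  # mean response rate per class
```

In a real attrition analysis, the estimated posterior memberships (`post`) would then be related to background characteristics such as age, education, or conscientiousness, as the abstract describes.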
Conference Homepage (abstract)
Web survey bibliography - Lugtig, P. J. (12)
- Data chunking for mobile web: effects on data quality; 2017; Lugtig, P. J.; Toepoel, V.
- Mobile-only web survey respondents; 2016; Lugtig, P. J.; Toepoel, V.; Amin, A.
- Reducing Underreports of Behaviors in Retrospective Surveys: The Effects of Three Different Strategies...; 2016; Lugtig, P. J.; Glasner, T.; Boeve, A.
- Dropouts in Longitudinal Surveys; 2016; Lugtig, P. J.; De Leeuw, E. D.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- The Effects of Adding a Mobile-Compatible Design to the American Life Panel; 2015; Toepoel, V.; Lugtig, P. J.; Amin, A.
- Panel Attrition - Separating Stayers, Fast Attriters, Gradual Attriters, and Lurkers; 2014; Lugtig, P. J.
- Mixed-devices in a probability based panel survey. Effects on survey measurement error; 2014; Toepoel, V.; Lugtig, P. J.
- Mobile devices a way to recruit hard-to-reach groups? Results from a pilot study comparing desk top...; 2013; Toepoel, V.; Lugtig, P. J.
- Panel Attrition: Separating Stayers, Sleepers and Other Types of Drop-Out in an Internet Panel; 2013; Lugtig, P. J.
- “I think I know what you did last summer” Improving data quality in panel surveys; 2012; Lugtig, P. J.
- Using propensity score matching to separate mode- and selection effects; 2011; Lugtig, P. J.; Lensvelt-Mulders, G. J.