Web Survey Bibliography
Do survey respondents who are recruited with extraordinary effort provide answers of lower quality than respondents who are recruited more easily? This question has worried survey practitioners and analysts alike for at least four decades (Cannell and Fowler 1963; Robins 1963), yet it remains unanswered. Although response rates bear no direct relationship to nonresponse bias (Groves and Peytcheva 2008), the relationship between efforts to increase response rates and other sources of survey error, in particular measurement error, is an open question. The common hypothesis is that respondents who require greater recruitment effort provide answers of lower quality than those who are recruited more readily. A latent cooperation continuum is often posited (e.g., Mason, Lesser, and Traugott 2002; Cannell and Fowler 1963), such that those who are most difficult to recruit into the sample pool have the lowest motivation and are thus the worst reporters. However, it is not clear whether this is generally the case.
This paper reviews the existing literature on the relationship between the level of effort exerted to recruit sample members and data quality, with a primary focus on item nonresponse. Two methods – a quantitative meta-analysis and a systematic qualitative review – are applied to 44 articles on the relationship between levels of recruitment effort and measurement error. These studies use five different measures of level of effort (number of contact attempts/follow-up reminders, refusal conversion, date of interview, a combination of these three, and estimated response propensity) and examine multiple indicators of measurement error (e.g., item nonresponse, response accuracy, signed deviations, scale reliability, acquiescence, non-differentiation).
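To illustrate the kind of pooling a quantitative meta-analysis of such studies involves, the sketch below combines per-study log odds ratios of item nonresponse (high-effort vs. low-effort respondents) with inverse-variance weights and a DerSimonian–Laird random-effects adjustment. All counts are hypothetical, and this pooling method is one common choice, not necessarily the one used in the paper reviewed here.

```python
import math

# Hypothetical per-study counts: (item nonresponse events, n) for
# high-effort vs. low-effort respondents; illustrative numbers only.
studies = [
    ((30, 200), (20, 210)),
    ((45, 350), (40, 360)),
    ((12, 150), (15, 160)),
]

def log_odds_ratio(high, low):
    """Log odds ratio of item nonresponse (high- vs. low-effort) and its variance."""
    a, n1 = high
    c, n2 = low
    b, d = n1 - a, n2 - c
    lor = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d   # standard large-sample variance
    return lor, var

effects = [log_odds_ratio(h, l) for h, l in studies]

# Fixed-effect (inverse-variance) pooled estimate.
w = [1 / v for _, v in effects]
fixed = sum(wi * e for wi, (e, _) in zip(w, effects)) / sum(w)

# DerSimonian-Laird estimate of between-study variance tau^2.
q = sum(wi * (e - fixed) ** 2 for wi, (e, _) in zip(w, effects))
df = len(studies) - 1
c_w = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c_w)

# Random-effects pooled estimate and its standard error.
w_re = [1 / (v + tau2) for _, v in effects]
pooled = sum(wi * e for wi, (e, _) in zip(w_re, effects)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
print(f"pooled log OR = {pooled:.3f} (SE {se:.3f})")
```

A pooled log odds ratio above zero would indicate that, across the hypothetical studies, high-effort respondents show more item nonresponse than low-effort respondents.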
This review asks the following four questions.
1. Do respondents recruited with more effort have higher item nonresponse rates than respondents recruited more easily?
2. Are particular study characteristics associated with higher item nonresponse rates among respondents recruited with more effort relative to respondents recruited with less effort?
3. Are particular types of items associated with higher item nonresponse rates among respondents recruited with more effort relative to respondents recruited with less effort?
4. Is there consistent evidence about higher or lower levels of other types of measurement error, and does it vary by the level-of-effort measure?
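Question 1 can be made concrete with a toy computation: given respondent-level records, split the sample by one level-of-effort measure (number of contact attempts) and compare item nonresponse rates. The records, the two-attempt cutoff, and the item counts below are all hypothetical, chosen only to show the comparison.

```python
# Hypothetical respondent records: (contact attempts before completion,
# items answered, items asked); illustrative values only.
respondents = [
    (1, 48, 50), (1, 50, 50), (2, 47, 50),
    (3, 44, 50), (5, 40, 50), (6, 42, 50),
]

def item_nonresponse_rate(records):
    """Share of asked items left unanswered across all records."""
    missing = sum(asked - answered for _, answered, asked in records)
    total_asked = sum(asked for _, _, asked in records)
    return missing / total_asked

# Hypothetical cutoff: more than two contact attempts counts as "high effort".
low = [r for r in respondents if r[0] <= 2]
high = [r for r in respondents if r[0] > 2]
print(f"low effort:  {item_nonresponse_rate(low):.3f}")
print(f"high effort: {item_nonresponse_rate(high):.3f}")
```

In this toy data the high-effort group shows the higher item nonresponse rate, the pattern the common hypothesis predicts; the review asks whether real studies show it consistently.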
Web Survey Bibliography - 2008
- Voting technology: The not-so-simple act of casting a ballot; 2008; Traugott, M. W. et al.
- Usability testing; 2008; Roe, D. J.
- The semantic differential technique; 2008; Stoutenborough, J. W.
- The psychology of survey response; 2008; Schwarz, N.
- The advisory panel on online public opinion survey quality - Final report June 4, 2008; 2008
- Testing survey questions; 2008; Campanelli, P.
- Telephone survey methodology: Adapting to change; 2008; Tucker, C., Lepkowski, J. M.
- Survey documentation: Towards professional knowledge management in sample surveys; 2008; Mohler, P. et al.
- Some consequences of survey mode changes in longitudinal surveys; 2008; Dillman, D. A.
- Sample factors that influence data quality; 2008; Gailey, R., Teal, D., Haechrel, E.
- Representativity of web surveys – an illusion?; 2008; Bethlehem, J.
- Recruitment and retention for a consumer panel; 2008; Tortora, R. D.
- Probabilistic methods in surveys and official statistics; 2008; Vehovar, V., Zaletel, M., Seljak, R.
- Privacy, confidentiality, and response burden as factors in telephone survey nonresponse; 2008; Singer, E., Presser, S.
- Personal data of 600,000 on lost laptop; 2008; Evans, M.
- Online panel management practices that minimize satisficing behavior; 2008; Weber, S.
- Modeling campaign dynamics on the web in the 2008 National Annenberg Election Study; 2008; Johnston, R.
- Mode effects; 2008; Jans, M.
- Mobile web surveys: A preliminary discussion of methodological implications; 2008; Fuchs, M.
- Missing data; 2008; de Leeuw, E. D., Hox, J.
- Measuring customer satisfaction and loyalty, Third Edition: Survey design, use, and statistical analysis...; 2008; Hayes, B. E.
- The Long Tail, revised and updated edition: Why the future of business is selling less of more; 2008; Anderson, C.
- IVR: Interactive voice technology; 2008; Miller-Steiger, D., Conroy, B.
- Internet surveys and national election studies: A Symposium; 2008; Clarke, H. D., Sanders, D., Stewart, M. C., Whiteley, P.
- Internet surveys; 2008; Lozar Manfreda, K., Vehovar, V.
- History of the browser user agent string; 2008; Andersen, A.
- Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking...; 2008; West, R. F., Toplak, M. E., Stanovich, K. E.
- Global market research 2008; 2008
- Foundation of quality project overview; 2008
- Email survey; 2008; Porter, S. R.
- Effects of using a grid versus a sequential form of the ACS basic demographic data; 2008; Chesnut, J.
- Designing online election surveys: Lessons from the 2004 Australian election; 2008; Gibson, R., McAllister, I.
- College sophomores in the laboratory redux: Influences of a narrow data base on social psychology'...; 2008; Henry, P. J.
- Changing times, changing modes: The future of public opinion polling?; 2008; Terhanian, G.
- An unwanted impact; 2008; Balden, W.
- Access panels and online research, panacea or pitfall? Proceedings of the DABS symposium, Amsterdam,...; 2008; Stoop, I.
- Mixed Modes and Measurement Error: Study design and literature review; 2008; Hope, S., Nicolaas, G.
- 26 questions to help research buyers of online samples; 2008
- Forms that Work - Designing Web Forms for Usability; 2008; Jarrett, C., Gaffney, G.
- Understanding Society. Some preliminary results from the Wave 1 Innovation Panel; 2008; Burton, J., Laurie, H., Uhrig, S. C. N.
- Whose Space? Differences Among Users and Non-Users of Social Network Sites; 2008; Hargittai, E.
- ‘Looking at’, ‘Looking up’ or ‘Keeping up with’ People? Motives...; 2008; Joinson, A. N.
- Design Variations in Adaptive Web Sampling; 2008; Vincent, K. S.
- Characteristics of Gay and Bisexual Men Who Drop Out of a Web Survey of Sexual Behaviour in the UK; 2008; Evans, A. R., Wiggins, D., Bolding, G., Elford, J.
- Objectivity, Reliability, and Validity of Search Engine Count Estimates ; 2008; Janetzko, D.
- The effects of reminders on data quality in online surveys; 2008; Tuten, T. L.
- Internet-basierte Messung sozialer Erwünschtheit: Theoretische Grundlagen und experimentelle Untersuchung...; 2008; Kaufmann, E., Reips, U. -D.
- Sozialforschung im Internet: Methodologie und Praxis der Online-Befragung; 2008; Jackob, N., Schoen, H., Zerback, T. (eds.)
- An online panel as a platform for multi-disciplinary research; 2008; Scherpenzeel, A.
- Comparison of Web-Based versus Paper-and-Pencil Self-Administered Questionnaire: Effects on Health Indicators...; 2008; van de Looij-Jansen, P. M., de Wilde, E. J.