Web Survey Bibliography
Non-response bias can significantly compromise the representativeness of a sample. To combat it, researchers typically set marginal quotas to match known population proportions. However, the problems that arise with quota sampling when marginal quotas do not fill at the same rate are well documented. Differential sampling, whether based on propensity scores or on model-based response rate adjustments, avoids these problems: it oversamples subpopulations with lower response rates so that survey respondents are balanced according to population proportions. Adjusting for these subpopulations becomes even more important online, where individual response rates vary dramatically.
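The allocation logic described above can be sketched in a few lines: if a subpopulation responds at half the rate of another, it must be invited at twice the rate for expected completes to match population shares. The groups and figures below are hypothetical, purely for illustration.

```python
def invitations_needed(target_completes, pop_shares, response_rates):
    """Differential sampling sketch: expected completes per group equal
    invitations * response_rate, so invitations = target * share / rate.
    Groups with lower response rates are oversampled proportionally."""
    return {
        g: round(target_completes * pop_shares[g] / response_rates[g])
        for g in pop_shares
    }

# Hypothetical example: younger panelists respond half as often as older ones.
quotas = invitations_needed(
    1000,
    pop_shares={"18-34": 0.30, "35-54": 0.35, "55+": 0.35},
    response_rates={"18-34": 0.10, "35-54": 0.15, "55+": 0.20},
)
# quotas: {"18-34": 3000, "35-54": 2333, "55+": 1750}
```

Note that each group's invitation count is inversely proportional to its response rate, which is exactly the oversampling of low-responding subpopulations the abstract describes.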
Practitioners who use online panels must develop solutions for coverage limitations, conditioning and respondent attrition, source variability, and non-response bias. We discuss how to correct non-response bias in online samples more effectively, yielding more accurate estimators from data collected online. The most common adjustment models the response rate for each subpopulation and oversamples accordingly. Some real-time sources use aggregate-level demographic estimates to adjust the flow of respondents into their surveys; however, demographics account for only a small share of the variation in response rates. Panel companies can instead use historical response rate data at the individual level to adjust the flow of respondents accurately and correct for non-response. We describe a system that implements individual-level response rate estimation and show its advantages over model-based response rate estimates.
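One minimal way to sketch the individual-level flow control described above (the abstract does not specify the authors' actual algorithm, so the selection rule here is an assumption): within each demographic group, invite panelists until the sum of their historical response propensities reaches the group's target number of completes. Inviting the most responsive panelists first is a further illustrative design choice that minimizes the number of invitations sent.

```python
def select_invitees(panel, group_targets):
    """Hypothetical greedy sketch of individual-level flow control.

    `panel` is a list of (panelist_id, group, historical_response_rate)
    tuples; `group_targets` maps each group to its target expected
    completes. Panelists are invited in descending order of historical
    response rate until each group's expected completes are covered.
    """
    expected = {g: 0.0 for g in group_targets}
    invited = []
    for pid, group, p in sorted(panel, key=lambda row: -row[2]):
        if group in expected and expected[group] < group_targets[group]:
            invited.append(pid)
            expected[group] += p  # expected completes contributed by pid
    return invited
```

Because each panelist contributes their own estimated propensity rather than a group average, the expected number of completes tracks the target more closely than an aggregate demographic model would.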
Conference Homepage (abstract)
Web survey bibliography (4086)
- Testing a single mode vs a mixed mode design; 2011; Laaksonen, S.
- Germans' segregation preferences and immigrant group size: A factorial survey approach; 2011; Schlueter, E., Ullrich, J., Schmidt, P.
- Errors within web-based surveys: a comparison between two different tools for the analysis of tourist...; 2011; Polizzi, G., Oliveri, A. M.
- Benefits of Structured DDI Metadata across the Data Lifecycle: The STARDAT Project at the GESIS Data...; 2011; Linne, M., Brislinger, E., Zenk-Moeltgen, W.
- Microdata Information System MISSY; 2011; Bohr, J.
- The Use of Structured Survey Instrument Metadata throughout the Data Lifecycle; 2011; Hansen, S. E.
- DDI and the Lifecycle of Longitudinal Surveys; 2011; Hoyle, L., Wackerow, J.
- Dissemination of survey (meta)data in the LISS data archive; 2011; Streefkerk, M., Elshout, S.
- Underreporting in Interleafed Questionnaires: Evidence from Two Web Surveys; 2011; Medway, R., Viera Jr., L., Turner, S., Marsh, S. M.
- The use of cognitive interviewing methods to evaluate mode effects in survey questions; 2011; Gray, M., Blake, M., Campanelli, P., Hope, S.
- Does the direction of Likert-type scales influence response behavior in web surveys?; 2011; Keusch, F.
- Cross-country Comparisons: Effects of Scale Type and Response Style Differences; 2011; Thomas, R. K.
- Explaining more variance with visual analogue scales: A Web experiment; 2011; Funke, F.
- A Comparison of Branching Response Formats with Single Response Formats; 2011; Thomas, R. K.
- Different functioning of rating scale formats – results from psychometric and physiological experiments...; 2011; Koller, M., Salzberger, T.
- Cognitive process in answering questions: Are verbal labels in rating scales attended to?; 2011; Menold, N., Kaczmirek, L., Lenzner, T.
- Experiments on the Design of the Left-Right Self-Assessment Scale; 2011; Zuell, C., Scholz, E., Behr, D.
- Is it a good idea to optimise question format for mode of data collection? Results from a mixed modes...; 2011; Nicolaas, G., Campanelli, P., Hope, S., Lynn, P., Nandi, A.
- Separating selection from mode effects when switching from single (CATI) to mixed mode design (CATI /...; 2011; Carstensen, J., Kriwy, P., Krug, G., Lange, C.
- Testing between-mode measurement invariance under controlled selectivity conditions; 2011; Klausch, L. T.
- Using propensity score matching to separate mode- and selection effects; 2011; Lugtig, P. J., Lensvelt-Mulders, G. J.
- A mixed mode pilot on consumer barometer; 2011; Taskinen, P., Simpanen, M.
- Separation of selection bias and mode effect in mixed-mode survey – Application to the face-to...; 2011; Bayart, C., Bonnel, P.
- Optimization of dual frame telephone survey designs; 2011; Slavec, A., Vehovar, V.
- A Comparison of CAPI and PAPI through a Randomized Field Experiment; 2011; De Weerdt, J.
- Flexibility of Web Surveys: Probing 'do-not-know' over the Phone and on the Web; 2011; Hox, J., de Leeuw, E. D.
- Changing research methods in Ukraine: CATI or Mixed-Mode Surveys?; 2011; Paniotto, V., Kharchenko, N.
- The effects of mixed mode designs on simple and complex analyses; 2011; Martin, P., Lynn, P.
- In the Face of Declining Budgets: The Student Experience at Washington State University ; 2011; Allen, T., Dillman, D. A., Garza, B., Millar, M. M.
- Use of Cognitive Shortcuts in Landline and Cell Phone Surveys; 2011; Everett, S. E., Kennedy, C.
- Mode Effect or Question Wording? Measurement Error in Mixed Mode Surveys; 2011; de Leeuw, E. D., Hox, J., Scherpenzeel, A.
- Conducting Respondent Driven Sampling on the Web: An Experimental Approach to Recruiting Challenges; 2011; Kapteyn, A., Schonlau, M.
- Discussion of Melanie Revilla's presentation on "Impact of the mode of data collection on...; 2011; Blom, A. G.
- Comments on the paper - Geetha Garib: Perceived diversity among employees; 2011; Vehovar, V.
- Framing Effects and Expected Social Security Claiming Behavior; 2011; Brown, Je., Kapteyn, A., Mitchell, O. S.
- Building an Online Immigrant Panel: Response and Representativity; 2011; Scherpenzeel, A.
- Effects of Incentives and Prenotification on Response Rates and Costs in a National Web Survey of Physicians...; 2011; Bonham, V., Day, B., Dykema, J., Sellers, S., Stevenson, J.
- Nonsampling errors in dual frame telephone surveys ; 2011; Brick, J. M., Flores Cervantes, I., Lee, S., Norman, G.
- Measurement invariance in training evaluation: Old question, new context; 2011; Gissel, A., Stoughton, J. W., Whelan, T. J., Clark, A. P.
- Towards Usage of Avatar Interviewers in Web Surveys; 2011; Jans, M., Malakhoff, L.
- Measurement Error in Mixed Mode Surveys: Mode or Question Format?; 2011; de Leeuw, E. D., Hox, J.
- Toward a Benefit-Cost Theory of Survey Participation: Evidence, Further Tests, and Implications; 2011; Singer, E.
- Using Community Information and Survey Methodology for Bias Reduction to Enhance the Quality of the...; 2011; Harvey, J., Prabhakaran, J., Spera, C., Zhang, Zh.
- Response Quantity, Response Quality, and Costs of Building an Online Panel via Social Contacts.; 2011; Toepoel, V.
- The Influence Of The Direction Of Likert-Type Scales In Web Surveys On Response Behavior In Different...; 2011; Keusch, F.
- An Injured Party?: A Comparison of Political Party Response Formats in Party Identification.; 2011; Schwarz, S., Barlas, F. M., Thomas, R. K., Corso, R. A., Szoc, R.
- Asking Sensitive Questions: Do They Affect Participation In Follow-Up Surveys?; 2011; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Designing Questions for Web Surveys: Effects of Check-List, Check-All, and Stand-Alone Response Formats...; 2011; Dykema, J., Schaeffer, N. C., Beach, J., Lein, V., Day, B.
- Differential Sampling Based on Historical Individual-Level Data in Online Panels.; 2011; Kelly, R. H.
- Web Survey Live Validations - What Are They Doing?; 2011; Crawford, S. D., McClain, C.