Web Survey Bibliography
Title: Response Rates and Response Bias in Web Panel Surveys
Year: 2015
Access date: 22.08.2016
Abstract
Non-probability samples, such as online panels, are increasingly accepted as “fit for purpose” for low-incidence populations (e.g., pregnant women), difficult-to-reach populations (e.g., health care workers), and other special populations, particularly when time or cost makes probability surveys infeasible. However, there is much less enthusiasm for applying these methods to general populations in social science research. Aside from the issue of statistical generalizability, low response rates within the panel and demographic biases in the achieved samples are often cited as concerns (AAPOR 2010).
Are low response rates and demographic biases endemic to population surveys using web panels, or do they reflect the methods of particular surveys? Many web panel surveys are conducted in such a way that the response rate cannot be calculated; in other cases, the response rate is simply not reported. Further, most web surveys are not designed to optimize response rate, since the sample is nearly unlimited and speed is often critically important to the client. In addition, biases in web surveys are usually identified by comparing the characteristics of the achieved sample to the population, which does not establish whether the error arises from the frame or from the survey procedures.
This paper examines the application of two survey protocols in a general population survey conducted in the same community using a national web panel. Invitations will be sent to two Census-balanced samples of 5,000 drawn from the master panel, with the goal of achieving at least 500 completes in each sample. Under the first protocol, invitations will be followed by a single reminder, the industry standard. Under the second protocol, a more robust schedule of up to four reminders will be fielded over a three-week period. Response rate is calculated as the proportion of invited respondents who complete the interview. Non-response bias is calculated by comparing the characteristics of responders and non-responders based on their panel profiles. Findings are compared across the two samples from the same community in the experiment.
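To make these definitions concrete, here is a minimal sketch using the abstract's design targets (which are goals, not observed results) and a standard nonresponse-bias decomposition that the abstract does not state explicitly:

\[ \mathrm{RR} = \frac{\text{completed interviews}}{\text{invitations sent}}, \qquad \text{e.g. } \frac{500}{5{,}000} = 10\% \text{ if a protocol just meets its target.} \]

For a panel-profile characteristic with mean \( \bar{y}_r \) among the \( r \) responders and \( \bar{y}_m \) among the \( m \) nonresponders of \( n \) invitees, the bias of the respondent mean relative to the full invited sample is

\[ \operatorname{Bias}(\bar{y}_r) = \bar{y}_r - \bar{y}_n = \frac{m}{n}\,(\bar{y}_r - \bar{y}_m), \]

so a higher response rate shrinks the \( m/n \) factor even when responders and nonresponders differ on the characteristic.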
Access/Direct link: FCSM Research Conference Homepage (Abstract) / (Full text)
Year of publication: 2015
Bibliographic type: Conferences, workshops, tutorials, presentations
Web survey bibliography
- Measuring Generalized Trust: An Examination of Question Wording and the Number of Scale Points; 2016; Lundmark, S.; Giljam, M.; Dahlberg, S.
- A Statistical Approach to Provide Individualized Privacy for Surveys; 2016; Esponda, F.; Huerta, K.; Guerrero, V. M.
- Online and Social Media Data As an Imperfect Continuous Panel Survey; 2016; Diaz, F.; Garmon, F.; Hofman, J. K.; Kiciman, E.; Rothschild, D.
- Social Media Analyses for Social Measurement; 2016; Schober, M. F.; Pasek, J.; Guggenheim, L.; Lampe, C.; Conrad, F. G.
- Equivalence of paper-and-pencil and computerized self-report surveys in older adults; 2016; Weigold, A.; Weigold, I. K.; Drakeford, M. K.; Dykema, S. A.; Smith, C. A.
- Quality of Different Scales in an Online Survey in Mexico and Colombia; 2016; Revilla, M.; Ochoa, C.
- A multi-group analysis of online survey respondent data quality: Comparing a regular USA consumer panel...; 2016; Golden, L.; Albaum, G.; Roster, C. A.; Smith, S. M.
- Does the Inclusion of Non-Internet Households in a Web Panel Reduce Coverage Bias?; 2016; Eckman, S.
- Investigating respondent multitasking in web surveys using paradata; 2016; Sendelbah, A.; Vehovar, V.; Slavec, A.; Petrovcic, A.
- The effect of email invitation elements on response rate in a web survey within an online community; 2016; Petrovcic, A.; Petric, G.; Lozar Manfreda, K.
- Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments; 2016; Struminskaya, B.
- Swapping bricks for clicks: Crowdsourcing longitudinal data on Amazon Turk; 2016; Daly, T. M.; Nataraajan, R.
- A reliability analysis of Mechanical Turk data; 2016; Rouse, S. V.
- Quota Controls in Survey Research; 2016; Gittelman, S. H.; Thomas, R. K.; Lavrakas, P. J.; Lange, V.
- Presentation matters: how mode effects in item non-response depend on the presentation of response options...; 2016; Zeglovits, E.; Schwarzer, S.
- Internet-administered Health-related Quality of Life Questionnaires Compared With Pen and Paper in an...; 2016; Nitikman, M.; Mulpuri, K.; Reilly, C. W.
- Computers, Tablets, and Smart Phones: The Truth About Web-based Surveys; 2016; Merle, P.; Gearhart, S.; Craig, C.; Vandyke, M.; Brooks, M. E.; Rahimi, M.
- Scientific Surveys Based on Incomplete Sampling Frames and High Rates of Nonresponse; 2016; Fahimi, M.; Barlas, F. M.; Thomas, R. K.; Buttermore, N. R.
- Doing Surveys Online; 2016; Toepoel, V.
- Exploring Factors in Contributing Student Progress in the Open University; 2016; Arifin, M. H.
- Taming Big Data: Using App Technology to Study Organizational Behavior on Social Media; 2015; Bail, C. A.
- The Use of a Nonprobability Internet Panel to Monitor Sexual and Reproductive Health in the General...; 2015; Legleye, S.; Charrance, G.; Razafindratsima, N.; Bajos, N.; Bohet, A.; Moreau, C.
- Adapting Labour Force Survey questions from interviewer-administered modes for web self-completion in...; 2015; Betts, P.; Cubbon, B.
- ESOMAR/GRBN Online Research Guideline; 2015
- Taking MARS Digital; 2015; Melton, E.; Krahn, J.
- A Comparison of the Effects of Face-to-Face and Online Deliberation on Young Students’ Attitudes...; 2015; Triantafillidou, A.; Yannas, P.; Lappas, G.; Kleftodimos, A.
- A Privacy-Friendly Method to Reward Participants of Online-Surveys; 2015; Herfert, M.; Lange, B.; Selzer, A.; Waldmann, U.
- Doing Online Surveys: Zum Einsatz in der sozialwissenschaftlichen Raumforschung [On their use in social-scientific spatial research]; 2015; Nadler, R.; Petzold, K.; Schoenduwe, R.
- Are Fast Responses More Random? Testing the Effect of Response Time on Scale in an Online Choice Experiment...; 2015; Boerger, T.
- The impact of frequency rating scale formats on the measurement of latent variables in web surveys -...; 2015; Menold, N.; Kemper, C. J.
- Investigating response order effects in web surveys using eye tracking; 2015; Karem Hoehne, J.; Lenzner, T.
- Implementation of the forced answering option within online surveys: Do higher item response rates come...; 2015; Decieux, J. P.; Mergener, A.; Neufang, K.; Sischka, P.
- Internet Panels, Professional Respondents, and Data Quality; 2015; Matthijsse, S.; De Leeuw, E. D.; Hox, J.
- Self-administered Questions and Interviewer–Respondent Familiarity; 2015; Rodriguez, L. A.; Sana, M.; Sisk, B.
- Comparing Food Label Experiments Using Samples from Web Panels versus Mall Intercepts; 2015; Chang, L. C.; Lin, C. T. J.
- Translating Answers to Open-ended Survey Questions in Cross-cultural Research: A Case Study on the Interplay...; 2015; Behr, D.
- The impact of gamifying to increase spontaneous awareness; 2015; Cape, P.
- Using eye-tracking to understand how fourth grade students answer matrix items; 2015; Maitland, A.; Sun, H.; Caporaso, A.; Tourangeau, R.; Bertling, J.; Almonte, D.
- Incentive Types and Amounts in a Web-based Survey of College Students; 2015; Krebs, C.; Planty, M.; Stroop, J.; Berzofsky, M.; Lindquist, C.
- Response Rates and Response Bias in Web Panel Surveys; 2015; Boyle, J.; Berman, L.; Dayton, Ja.; Fakhouri, T.; Iachan, R.; Courtright, M.; Pashupati, K.
- Characteristics of the Population of Internet Panel Members; 2015; Boyle, J.; Freedner, N.; Fakhouri, T.
- Internet and Smartphone Coverage in a National Health Survey: Implications for Alternative Modes; 2015; Couper, M. P.; Kelley, J.; Axinn, W.; Guyer, H.; Wagner, J.; West, B. T.
- An Overview of Mobile CATI Issues in Europe; 2015; Slavec, A.; Toninelli, D.
- Using Mobile Phones for High-Frequency Data Collection; 2015; Azevedo, J. P.; Ballivian, A.; Durbin, W.
- Willingness of Online Access Panel Members to Participate in Smartphone Application-Based Research; 2015; Pinter, R.
- Who Has Access to Mobile Devices in an Online Opt-in Panel? An Analysis of Potential Respondents for...; 2015; Revilla, M.; Toninelli, D.; Ochoa, C.; Loewe, G.
- Who Are the Internet Users, Mobile Internet Users, and Mobile-Mostly Internet Users?: Demographic Differences...; 2015; Antoun, C.
- A Meta-Analysis of Breakoff Rates in Mobile Web Surveys; 2015; Mavletova, A. M.; Couper, M. P.
- The Best of Both Worlds? Combining Passive Data with Survey Data, its Opportunities, Challenges and...; 2015; Duivenvoorde, S.; Dillon, A.
- Optimizing the Decennial Census for Mobile – A Case Study; 2015; Nichols, E. M.; Hawala, E. O.; Horwitz, R.; Bentley, M.