Web Survey Bibliography
Title Assessing the Accuracy of 51 Nonprobability Online Panels and River Samples: A Study of the Advertising Research Foundation 2013 Online Panel Comparison Experiment
Author Callegaro, M.; Chin, K.; Krosnick, J. A.; Villar, A.; Yang, Y.
Year 2016
Access date 09.06.2016
Abstract
Survey research is increasingly conducted using online panels and river samples. With a large number of data suppliers available, data purchasers need to understand the accuracy of the data being provided and whether probability sampling continues to yield more accurate measurements of populations. This paper evaluates the accuracy of estimates from a probability sample and from non-probability survey samples created using different quota sampling strategies and sample sources (panel versus river samples). Data collection was organized by the Advertising Research Foundation (ARF) in 2013. We compare estimates from 45 U.S. online panels of non-probability samples, 6 river samples, and one RDD telephone sample to high-quality benchmarks -- population estimates obtained from large-scale face-to-face surveys of probability samples with extremely high response rates (e.g., ACS, NHIS, and NHANES). The non-probability samples were supplied by 17 major U.S. providers. Online respondents were directed to a third-party website where the same questionnaire was administered. The online samples were created using three quota methods: (A) age and gender within regions; (B) Method A plus race/ethnicity; and (C) Method B plus education. Mean questionnaire completion time was 26 minutes, and the average sample size was 1,118. Comparisons are made using unweighted and weighted data, with different weighting strategies of increasing complexity. Accuracy is evaluated using the absolute average error method, where the percentage of respondents who chose the modal category in the benchmark survey is compared to the corresponding percentage in each sample. The study illustrates the need for methodological rigor when evaluating the performance of survey samples.
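A minimal sketch of the absolute average error method described in the abstract: for each item, the sample's percentage in the benchmark's modal category is compared to the benchmark percentage itself, and the absolute gaps are averaged across items. The item names and figures below are illustrative placeholders, not values from the ARF study.

# Absolute average error: average the absolute differences (in percentage
# points) between each sample estimate and its benchmark counterpart.
# All figures are illustrative placeholders, not ARF study values.

benchmark_modal_pct = {"smokes": 18.0, "has_diabetes": 9.1, "owns_home": 64.2}
sample_pct = {"smokes": 22.5, "has_diabetes": 7.8, "owns_home": 58.9}

def absolute_average_error(benchmark: dict, sample: dict) -> float:
    """Mean absolute difference, in percentage points, across items."""
    gaps = [abs(sample[item] - benchmark[item]) for item in benchmark]
    return sum(gaps) / len(gaps)

print(f"Absolute average error: "
      f"{absolute_average_error(benchmark_modal_pct, sample_pct):.2f} pp")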
Access/Direct link Conference Homepage (abstract)
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (4086)
- Respondent Processing of Rating Scales and the Scale Direction Effect ; 2016; Caporaso, A.
- The Effects of Pictorial vs. Verbal Examples on Survey Responses ; 2016; Sun, H.; Bertling, J.; Almonte, D.
- Evaluating Grid Questions for 4th Graders; 2016; Maitland, A.
- Mixing Modes: Challenges (and Tradeoffs) of Adapting a Mailed Paper Survey to the Web ; 2016; Wilkinson-Flicker, S.; McPhee, C. B.; Medway, R.; Kaiser, A.; Cutts, K.
- An Examination of How Survey Mode Affects Eligibility, Response and Health Condition Reporting Rates...; 2016; Stern, M. J.; Ghandour, R.
- Investigating Measurement Error through Survey Question Placement ; 2016; Wilson, A.; Wine, J.; Janson, N.; Conzelmann, J.; Peytcheva, E.
- Instructions in Self-administered Survey Questions: Do They Improve Data Quality or Just Make the Questionnaire...; 2016; Redline, C. D.; Zukerberg, A.; Owens, C.; Ho, A.
- Usability Testing within Agile Process; 2016; Holland, T.
- Exploring Why Web Surveys Take Longer to Complete on Smartphones than PCs: Findings from a Within-subjects...; 2016; Antoun, C.; Cernat, A.
- Making Mobile Web Surveys Accessible; 2016; Malakhoff, L.
- Association of Eye Tracking with Other Usability Metrics ; 2016; Olmsted, E. L.
- Cognitive Probing Methods in Usability Testing – Pros and Cons; 2016; Nichols, E. M.
- Grids and Online Surveys: Do More Complex Grids Induce Survey Satisficing? Evidence from the Gallup...; 2016; Wang, Me.; McCutcheon, A. L.
- Assessing the Accuracy of 51 Nonprobability Online Panels and River Samples: A Study of the Advertising...; 2016; Callegaro, M.; Chin, K.; Krosnick, J. A.; Villar, A.; Yang, Y.
- Calculating Standard Errors for Nonprobability Samples when Matching to Probability Samples ; 2016; Lee, Ad.; ZuWallack, R. S.
- Communicating Data Use and Privacy: In-person versus Web based methods for message testing ; 2016; Clark Fobia, A.; Hunter Childs, J. E.
- User Experience and Eye-tracking: Results to Optimize Completion of a Web Survey and Website Design ; 2016; Walton, L.; Ricci, K.; Libman Barry, A.; Eiginger, C.; Christian, L. M.
- Estimated-control Calibrated Estimates from Nonprobability Surveys; 2016; Dever, J. A.
- Decomposing Selection Effects in Non-probability Samples ; 2016; Mercer, A. W.; Keeter, S.; Kreuter, F.
- The Effect of Emphasizing the Web Option in a Mixed-mode Establishment Survey ; 2016; O'Brien, J.; Rajapaksa, S.; Schafer, B.; Langetieg, P.
- A Multi-phase Exploration Into Web-based Panel Respondents: Assessing Differences in Recruitment, Respondents...; 2016; Redlawsk, D.; Rogers, K.; Borie-Holtz, D.
- Effect of Clarifying Instructions on Response to Numerical Open-ended Questions in Self-administered...; 2016; Kumar Chaudhary, A.; Israel, G. D.
- Exploring the Feasibility of Using Facebook for Surveying Special Interest Populations ; 2016; Lee, C.; Jang, S.
- National Estimates of Sexual Minority Women Alcohol Use through Web Based Respondent Driven Sampling...; 2016; Farrell Middleton, D.; Iachan, R.; Freedner-Maguire, N.; Trocki, K.; Evans, C.
- Bringing Fair Market Rent Surveys into the 21st Century – Evaluating the Effectiveness of MSG...; 2016; Dayton, J.; Brassell, T.; Cooper, V.; Dion, R.; Williams, R.
- Measuring Survey Behavior of Smartphone Users; 2016; Luks, S.; Phillips, R.
- Practical Considerations for Using Vignettes to Evaluate Survey Items ; 2016; Steiger, D. M.; Williams, Do.; Edwards, W. S.; Cantor, D.; Truman, J.
- Using Web Panels to Quantify the Qualitative: The National Center for Health Statistics Research and...; 2016; Scanlon, P. J.
- Impact of Field Period Length in the Estimates of Sexual Victimization in a Web-based Survey of College...; 2016; Berzofsky, M.; Peterson, K.; Shook-Sa, B. E.; Lindquist, C.; Krebs, C.
- Longitudinal Online Ego-centric Social Network Data Collection with EgoWeb 2.0 ; 2016; Amin, A.; Kennedy, D.
- Influences on Item Response Times in a Multinational Web Survey ; 2016; Phillips, B. T.; Kolenikov, S.; Howard Ecklund, E.; Ackermann, A.; Brulia, A.
- QR Codes for Survey Access: Is It Worth It?; 2016; Allen, L.; Marlar, J.
- An Exploration of the Relationship between Usability Testing and Data Verification ; 2016; Langer Tesfaye, C.; Kurmlavage, V.
- Beyond the Survey: Improving Data Insights and User Experience with Mobile Devices ; 2016; Graham, P.; Lew, G.
- User Experience Considerations for Contextual Product Surveys on Smartphones ; 2016; Sedley, A.; Mueller, H.
- The Differential Effect of Mobile-friendly Surveys on Data Quality; 2016; Horwitz, R.
- Embedding Survey Questions within Non-research Mobile Apps: A Method for Collecting High-quality Data...; 2016; Bapna, V.; Antoun, C.
- Does Changing Monetary Incentive Schemes in Panel Studies Affect Cooperation? A Quasi-experiment on...; 2016; Schaurer, I.; Bosnjak, M.
- Survey Mode and Mail Method: A Practical Experiment in Survey Fielding for a Multi-round Survey ; 2016; Sullivan, B. D.; Duda, N.; Bogen, K.; Clusen, N. A.; Wakar, B.; Zhou, H.
- Web Probing for Question Evaluation: The Effects of Probe Placement ; 2016; Fowler, S.; Willis, G. B.; Moser, R. P.; Townsend, R. L. M.; Maitland, A.; Sun, H.; Berrigan, D.
- Early-bird Incentives: Results From an Experiment to Determine Response Rate and Cost Effects ; 2016; De Santis, J.; Callahan, R.; Marsh, S.; Perez-Johnson, I.
- Using Cash Incentives to Help Recruitment in a Probability Based Web Panel: The Effects on Sign Up Rates...; 2016; Krieger, U.
- Assessing Changes in Coverage Bias of Web Surveys as Internet Access Increases in the United States...; 2016; Sterrett, D.; Malato, D.; Benz, J.; Tompson, T.; English, N.
- Timing is Everything: Discretely Discouraging Mobile Survey Response through the Timing of Email Contacts...; 2016; Richards, A.; Shook-Sa, B. E.; Berzofsky, M.; Smith, A. C.
- Dynamic Instructions in Check-All-That-Apply Questions ; 2016; Kunz, T.; Fuchs, M.
- Patterns of Unit and Item Nonresponse in a Multinational Web Survey ; 2016; Ackermann, A.; Howard Ecklund, E.; Phillips, B. T.; Brulia, A.
- Debunking Myths About the Quality of Industry and Occupation Data Collected Through Self-administered...; 2016; Hurwitz, F. I.; Stein, J.; Skaff, A. L.
- Desktops, Tablets and Phones, Oh My! Device Preference for Web Based Surveys ; 2016; Schy, S.; Ghirardelli, A.; Morrison, H.
- Assessing Potential Bias in Respondent-driven Incident Based Data from a Web Survey of College Students...; 2016; Peterson, K.; Berzofsky, M.; Shook-Sa, B. E.; Krebs, C.; Lindquist, C.
- Making Connections on the Internet: Online Survey Panel Communications ; 2016; Libman Barry, A.; Eiginger, C.; Walton, L.; Ricci, K.