Web Survey Bibliography
Title Assessing the Accuracy of 51 Nonprobability Online Panels and River Samples: A Study of the Advertising Research Foundation 2013 Online Panel Comparison Experiment
Author Yang, Y.; Callegaro, M.; Chin, K.; Krosnick, J. A.; Villar, A.
Year 2016
Access date 09.06.2016
Abstract
Survey research is increasingly conducted using online panels and river samples. With a large number of data suppliers available, data purchasers need to understand the accuracy of the data being provided and whether probability sampling continues to yield more accurate measurements of populations. This paper evaluates the accuracy of estimates from a probability sample and from non-probability survey samples created using different quota sampling strategies and sample sources (panel versus river samples). Data collection was organized by the Advertising Research Foundation (ARF) in 2013. We compare estimates from 45 U.S. online panels of non-probability samples, 6 river samples, and one RDD telephone sample to high-quality benchmarks: population estimates obtained from large-scale face-to-face surveys of probability samples with extremely high response rates (e.g., ACS, NHIS, and NHANES). The non-probability samples were supplied by 17 major U.S. providers. Online respondents were directed to a third-party website where the same questionnaire was administered. The online samples were created using three quota methods: (A) age and gender within regions; (B) Method A plus race/ethnicity; and (C) Method B plus education. Mean questionnaire completion time was 26 minutes, and the average sample size was 1,118. Comparisons are made using unweighted and weighted data, with weighting strategies of increasing complexity. Accuracy is evaluated using the absolute average error method, in which the percentage of respondents who chose the modal category in the benchmark survey is compared to the corresponding percentage in each sample. The study illustrates the need for methodological rigor when evaluating the performance of survey samples.
Access/Direct link Conference Homepage (abstract)
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - Callegaro, M. (41)
- Assessing the Accuracy of 51 Nonprobability Online Panels and River Samples: A Study of the Advertising...; 2016; Yang, Y.; Callegaro, M.; Chin, K.; Krosnick...
- Report of the Inquiry into the 2015 British general election opinion polls; 2016; Sturgis, P., Baker, N., Callegaro, M., Fisher, St., Green, J., Jennings, W., Kuha, J., Lauderdale, B...
- The quality of data collected using online panels: a decade of research ; 2015; Callegaro, M.
- Metrics and Design Tool for Building and Evaluating Probability-Based Online Panels; 2015; DiSogra, C.; Callegaro, M.
- Yes-no answers versus check-all in self-administered modes ; 2015; Callegaro, M.; Henderson, V.; Murakami, M.; Tepman, Z.
- A critical review of studies investigating the quality of data obtained with online panels based on...; 2014; Callegaro, M., Villar, A., Yeager, D. S., Krosnick, J. A.
- Online panel research: History, concepts, applications and a look at the future; 2014; Callegaro, M., Baker, R., Bethlehem, J., Goeritz, A., Krosnick, J. A., Lavrakas, P. J.
- Recent Books and Journals in Public Opinion, Survey Methods, and Survey Statistics; 2014; Callegaro, M.
- Paradata in web surveys; 2013; Callegaro, M.
- From mixed-mode to multiple devices. Web surveys, smartphone surveys and apps: has the respondent gone...; 2013; Callegaro, M.
- Web coverage in the UK and its potential impact on general population web surveys; 2013; Callegaro, M.
- Effects of Progress Indicators on Short Questionnaires; 2012; Sedley, A., Callegaro, M.
- Effects of Pagination on Short Online Surveys; 2012; Sedley, A., Callegaro, M.
- A Systematic Review of Studies Investigating the Quality of Data Obtained with Online Panels; 2012; Callegaro, M., Villar, A., Krosnick, J. A., Yeager, D. S.
- A taxonomy of paradata for web surveys and computer assisted self interviewing (Casi); 2012; Callegaro, M.
- Unpublished internal Google report on break off rates by device type; 2011; Callegaro, M.
- Should we use the progress bar in online surveys? A meta-analysis of experiments manipulating progress...; 2011; Callegaro, M., Yang, Y., Villar, A.
- IVR and web administration in structured interviews utilizing rating scales: exploring the role of motivation...; 2011; Yang, Y., Callegaro, M., Bhola, D. S., Dillman, D. A.
- The Effect of Email Invitation Subject Title and Text on Online Survey Completion Rates in Internet...; 2009; Kruse, Y., Thomas, M., Nukulkij, P., Callegaro, M.
- Differences Between Internet and Non-Internet Households on Survey Items: Do These Differences Disappear...; 2009; Zhang, C., Callegaro, M., Thomas, M.
- Producing Straightlining and Item Non-Differentiation in a Web Survey: How Visual Design Plays a Role...; 2009; Callegaro, M., Shand-Lubbers, J., Dennis, J. M.
- Do we hear different voices?: Investigating the differences between internet and non-internet users...; 2009; Zhang, C., Callegaro, M., Thomas, M., DiSogra, C.
- The Effect of Email Invitation Customization on Survey Completion Rates in an Internet Panel: A Meta...; 2009; Callegaro, M., Kruse, Y., Thomas, M., Nukulkij, P.
- Panel Conditioning and Attrition in the AP-Yahoo! News Election Panel Study; 2009; Kruse, Y., Callegaro, M., Dennis, J. M., DiSogra, C., Subias, S., Lawrence, M., Tompson, T.
- Recruiting Probability-Based Web Panel Members Using an Address-Based Sample Frame: Results from a Pilot...; 2009; DiSogra, C., Callegaro, M., Hendarwan, E.
- Presentation of a Single Item versus a Grid: Effects on the Vitality and Mental Health Scales of the...; 2009; Callegaro, M., Shand-Lubbers, J., Dennis, J. M.
- Computing Response Rates for Probability-Based Web Panels; 2009; DiSogra, C., Callegaro, M.
- Is the digital divide still closing? New evidence points to skewed online results absent non-Internet...; 2008; Callegaro, M., Wells, T.
- Effects of Pre-coding Response Options for Five Point Satisfaction Scale in Web Surveys; 2008; Callegaro, M., Wells, T., Kruse, Y.
- An implementation of a within-household selection procedure for web surveys; 2008; Callegaro, M., Osborn, L., Debell, M., Leuvano, P.
- Response options order effect and category number association: An experiment using items on a five point...; 2008; Tang, G., Callegaro, M.
- More than the digital divide?: Investigating the differences between Internet and non-Internet users; 2008; Zhang, C., Callegaro, M., Thomas, M.
- “R U in the Network?!” Using Text Messaging Interfaces as Screeners for Working Cell Phone...; 2008; Buskirk, T. D., Rao, K., Callegaro, M., Arens, Z., Steiger, D. M.
- Computing Metrics for Online Panels; 2008; Callegaro, M., DiSogra, C.
- Impact of new technologies in data collection methods; 2008; Callegaro, M.
- Key Issues in Research Accuracy: Sources of bias and error in online research; 2008; Dennis, J. M., Callegaro, M.
- The influence of mobile telephones on telephone surveys; 2008; Kuusela, V., Callegaro, M., Vehovar, V.
- The influence of advance letters on response in telephone surveys; 2007; de Leeuw, E. D., Callegaro, M., Hox, J., Korendijk, E., Lensvelt-Mulders, G. J.
- Using Text Messages in U.S. Mobile Phone Surveys ; 2007; Steeh, C. G., Buskirk, T. D., Callegaro, M.
- Response latency as an indicator of optimizing. A study comparing job applicants and job incumbents...; 2004; Callegaro, M., Yang, Y., Bhola, D. S., Dillman, D. A.
- Electronic Voting Machines – A comparison applying the principles of computer-human interaction...; 2003; Callegaro, M., Peytcheva, E.