Web Survey Bibliography
Relevance & Research Question: Time and budget constraints have forced survey researchers to rely on a range of sampling methods to estimate population parameters. To be valid, a sampling procedure must (a) cover nearly 100% of the population, (b) give every member an equal chance of selection, and (c) achieve a reasonable response rate. Convenience samples have long been considered inadequate for serious research, but collecting opinions from probability samples now takes longer than conveniently interviewing millions of people.
RDD telephone sampling now routinely yields response rates below 20%. Moreover, the decline of landline telephony and the corresponding rise of mobile phones have put pressure on the claim of near-total coverage. Although all online interviewing relies on convenience sampling, it is arguable that people answer their phones for RDD surveys only when it is convenient for them, and that group continues to shrink.
Methods & Data: This paper investigates whether volume-based approaches to collecting opinions are reasonable substitutes for probability sampling. To do so, we exposed Gallup's long-used, well-documented United States Presidential approval rating question to a random subset of people who had recently completed a survey for one of SurveyMonkey's seven million survey creators. We presented the question to roughly 10,000 survey takers per day from June 10, 2010 to July 29, 2010. In total, 87,308 people answered the question.
Results: The results track Gallup's daily approval numbers within the reported margin of error on nearly every day of the test, without mimicking Gallup's use of statistical weighting. A closer look reveals that coverage and response rate were respectable relative to those of probability studies. Specifically, respondents came from 8,300 of 19,000 American cities (43%), which approaches the proportion of U.S. households with a landline telephone (60%). Moreover, the respondents yielded an average daily response rate of 46%, more than double that of a typical telephone survey.
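To make the "within the reported margin of error" comparison concrete, the standard 95% margin of error for a simple random sample proportion can be computed from the study's own figures. This is a minimal sketch, not part of the original paper: it assumes simple random sampling, a worst-case proportion of 0.5, and an average daily sample size derived from the 87,308 total answers spread over the 50-day field period (June 10 to July 29, 2010).

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# 87,308 answers over a 50-day field period (June 10 - July 29, 2010)
daily_n = round(87308 / 50)           # about 1,746 respondents per day
daily_moe = margin_of_error(daily_n)  # about +/- 2.3 percentage points
print(f"daily n = {daily_n}, 95% MOE = +/-{daily_moe:.1%}")
```

Under these assumptions, each day's estimate carries a margin of error of roughly plus or minus 2.3 percentage points, comparable to the roughly plus or minus 3 points typical of daily telephone tracking polls.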
Added Value: These findings shed light on the recent interest among market researchers in gathering data from technology and social networking platforms that have access to extremely large and diverse pools of people and can ask them questions at virtually no cost.