Web Survey Bibliography
Relevance & Research Question: Time and budget constraints have forced survey researchers to rely on a range of sampling methods to estimate population parameters. To be valid, a sampling procedure must (a) achieve nearly 100% population coverage, (b) give every member of the population an equal chance of selection, and (c) yield a reasonable response rate. Convenience samples have long been considered inadequate for serious research, yet collecting opinions from probability samples now takes longer than conveniently interviewing millions of people.
RDD telephone sampling now routinely yields response rates below 20%. Moreover, the decline of landline telephony and the corresponding rise of mobile phones have put pressure on the claim of near-total coverage. While all online interviewing relies on convenience sampling, it is arguable that the people who answer their phones for RDD studies do so only when it is convenient for them, and that group continues to shrink.
Methods & Data: This paper investigates whether volume-based approaches to collecting opinions are reasonable substitutes for probability sampling. To do so, we exposed Gallup's long-used, well-documented United States presidential approval rating question to a random subset of people who had recently completed a survey for one of SurveyMonkey's seven million survey creators. We presented the question to roughly 10,000 survey takers per day from June 10, 2010 to July 29, 2010. In total, 87,308 people answered the question.
Results: The results track Gallup's daily approval numbers within the reported margin of error on nearly every day of the test period, without mimicking Gallup's use of statistical weighting. A closer look reveals that coverage and response rate were respectable relative to those of probability studies. Specifically, respondents came from 8,300 of 19,000 American cities (43%), which approaches the proportion of U.S. households with a landline telephone (60%). Moreover, respondents yielded an average daily response rate of 46%, more than double that of a typical telephone survey.
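The arithmetic behind these figures can be made explicit. As a hedged sketch (the abstract reports no formulas, so the simple-random-sample margin-of-error formula and the 95% z-value are assumptions), the following shows why a daily sample of roughly 10,000 respondents supports tight tracking of an approval proportion, and how the 43% city-coverage figure is derived:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p under simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# With roughly 10,000 respondents per day, a 50% approval estimate carries
# a margin of error of about +/- 1 percentage point (worst case, p = 0.5).
daily_moe = margin_of_error(0.5, 10_000)

# Coverage figure from the Results: respondents came from 8,300 of the
# roughly 19,000 American cities, i.e. about 43%.
city_coverage = 8_300 / 19_000
```

Note that this formula presumes random selection; for a convenience sample it describes sampling variability only, not the coverage or self-selection error the abstract discusses.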
Added Value: These findings shed light on market researchers' recent interest in gathering data from technology and social networking sources that have access to extremely large, diverse pools of people and can ask questions of them at virtually no cost.
Web survey bibliography - General Online Research Conference (GOR) 2011 (17)
- Sampling v. Scale: An investigation of the tension between convenience sampling, response rates, probability...; 2011; Garland, P.
- Effectiveness and consequences of various recruitment methods in psychological research: case study; 2011; Poltorak, M.
- A new approach to the analysis of survey drop-out. Results from Follow-up Surveys in the German Longitudinal...; 2011; Rossmann, J., Blumenstiel, J. E., Steinbrecher, M.
- Tracking the decision-making process – Findings from an Online Rolling Cross-Section Panel Study...; 2011; Faas, T.
- Should we use the progress bar in online surveys? A meta-analysis of experiments manipulating progress...; 2011; Callegaro, M., Yang, Y., Villar, A.
- From "Web Questions" to "Propensity Score Weighting": An Evaluation of Topics and...; 2011; Welker, M., Taddicken, M.
- Rich Profiles – Or: What's the problem with self-disclosure data?; 2011; Tress, F.
- Who are leaving our panel: panel attrition and personality traits; 2011; Marchand, M.
- Mobile Research Apps – Adding New Capabilities to Market Research; 2011; Rieber, D.
- The influence of personality traits and motives for joining on participation behavior in online panels...; 2011; Keusch, F.
- Asking sensitive questions in a recruitment interview for an online panel: the income question; 2011; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Speeders in Online Value Research: Cross-checking results of fast and slow respondents in two separate...; 2011; Beckers, T., Siegers, P., Kuntz, A.
- Effects of survey question clarity on data quality; 2011; Lenzner, T.
- Respondent Characteristics as Explanations for Uninformative Survey Response: Sources of Nondifferentiation...; 2011; Van Meurs, L., Klausch, L. T., Schoenbach, K.
- Response Quantity, Response Quality, and Costs of Building an Online Panel via Social Contacts.; 2011; Toepoel, V.
- The Influence Of The Direction Of Likert-Type Scales In Web Surveys On Response Behavior In Different...; 2011; Keusch, F.
- Social desirability and self-reported health risk behaviors in web-based research: three longitudinal...; 2010; Crutzen, R., Goeritz, A.