Web Survey Bibliography
Online surveys continue to transform how survey research is conducted, not only in the capabilities they offer but also in how surveys are designed. Several companies have recently entered the survey research field with a new type of platform, offering researchers a cost-effective, do-it-yourself (DIY) approach to surveying thousands of people online. Respondents to DIY surveys are recruited from an online panel of Internet users or through a variety of online recruitment methods, including banner advertisements, email campaigns, and search campaigns (i.e., search engine generated links). A new recruitment approach for conducting DIY surveys has been gaining traction: a “surveywall” that first intercepts Internet users attempting to access restricted or paid content on a participating website and then solicits them to participate in a very brief survey (1-2 questions). Users are sampled in real time and, in exchange for their survey participation, are given access to the paid content. Proponents of this DIY approach argue that by reducing survey burden and simultaneously providing a more meaningful incentive (i.e., access to content), this approach yields survey results as accurate as those derived from probability-based online panels. In this study, we test the feasibility and performance of an intercept-type DIY survey relative to a probability-based online panel, a traditional opt-in online panel, and online populations recruited through two popular social media platforms, using a common questionnaire. We provide an independent assessment that is useful to those studying, or contemplating the use of, such a system. We compare responses from all platforms to demographic and behavioral benchmarks, using the average percentage point absolute error across all questions in the survey, as done by McDonald, Mohebbi, and Slatkin (2012) and Yeager, Krosnick, et al. (2011) in their comparative research on survey accuracy. We discuss the study's findings and conclude with a call for further research on this topic, along with recommendations for that research.
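For clarity, the accuracy metric named above can be written out as follows. This is an illustrative formalization consistent with the wording of the abstract; the notation (Q, \hat{p}_q, b_q) is introduced here and is not taken from the study itself:

\[
\text{Average absolute error} \;=\; \frac{1}{Q} \sum_{q=1}^{Q} \left| \hat{p}_q - b_q \right|
\]

where Q is the number of benchmark questions, \hat{p}_q is the percentage estimate obtained from a given survey platform for question q, and b_q is the corresponding benchmark percentage (e.g., from official statistics). Lower values indicate estimates that are, on average, closer to the benchmarks.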
Web survey bibliography - Wells, T. (10)
- Evaluating Visual Design Elements for Data Collection and Panelist Engagement; 2015; Christian, L. M., Harm, D., Langer Tesfaye, C., Wells, T.
- Comparing Tablet, Computer, and Smartphone Survey Administrations; 2013; Wells, T., Bailey, J., Link, M. W.
- Surveywalls: A Breakthrough for Survey Customers or DIY Run Amok?; 2013; Wells, T., Dean, E., Rao, K., Murphy, J., Roe, D. J.
- A Direct Comparison of Mobile Versus Online Survey Modes; 2012; Wells, T., Bailey, J., Link, M. W.
- Catch Them When You Can: Speeders and Their Role in Online Data Quality; 2011; Gutierrez, C., Wells, T., Rao, K., Kurzynski, D.
- Representing Seniors in an Online National Probability Panel Survey: Measuring Technology Attitudes...; 2010; Peugh, J., Mansfield, W., Wells, T., Semans, K.
- The Challenge and Importance of Including Spanish-Dominant Latinos in Online Panel Studies Addressing...; 2009; DiSogra, C., Wells, T., Torres, J.
- The Challenge and Importance of Including Spanish-Dominant Latinos in an Online Panel; 2009; Dennis, J. M., Wells, T., Torres, J.
- Is the digital divide still closing? New evidence points to skewed online results absent non-Internet...; 2008; Callegaro, M., Wells, T.
- Effects of Pre-coding Response Options for Five Point Satisfaction Scale in Web Surveys; 2008; Callegaro, M., Wells, T., Kruse, Y.