Web Survey Bibliography
Title Social Media in Public Opinion Research: Report of the AAPOR Task Force on Emerging Technologies in Public Opinion Research
Author Murphy, J., Link, M. W., Childs, J. H., Tesfaye, C., Dean, E., Stern, M. J., Pasek, J., Cohen, J., Callegaro, M., Harwood, P. G.
Source AAPOR
Year 2014
Access date 04.06.2014
Abstract
Public opinion research is entering a new era, one in which traditional survey research may play a less dominant role. The proliferation of new technologies, such as mobile devices and social media platforms, is changing the societal landscape across which public opinion researchers operate. As these technologies expand, so does access to users' thoughts, feelings, and actions, expressed instantaneously, organically, and often publicly across the platforms they use. The ways in which people both access and share information about opinions, attitudes, and behaviors have gone through perhaps a greater transformation in the last decade than at any previous point in history, and this trend appears likely to continue. The ubiquity of social media and the opinions users express on social media provide researchers with new data collection tools and alternative sources of qualitative and quantitative information to augment or, in some cases, provide alternatives to more traditional data collection methods. The reasons to consider social media in public opinion and survey research are no different from those for any alternative method. We are ultimately concerned with answering research questions, and this often requires the collection of data in one form or another. This may involve the analysis of data to obtain qualitative insights or quantitative estimates. The quality of data and the ability to accurately answer research questions are of paramount concern. Other practical considerations include the cost efficiency of the method and the speed at which the data can be collected, analyzed, and disseminated. If the combination of data quality, cost efficiency, and timeliness required by a study can best be achieved through the use of social media, then there is reason to consider these methods for research. An additional reason to consider social media in public opinion and survey research is its explosion in popularity over the last several years.
At a time when many are eschewing landline telephones (Blumberg and Luke, 2013) or actively taking steps to prevent unsolicited contact (e.g., caller ID, restricted-access buildings), many are now communicating and interacting online via social networking sites. It is only natural for researchers to aim to meet potential respondents where they have the best chance of getting their attention and potentially gaining their cooperation. However, this brave new world is not without its share of issues and pitfalls – technological, statistical, methodological, and ethical – and much remains to be investigated. As the leading association of public opinion research professionals, AAPOR is uniquely situated to examine and assess the potential impact of new "emerging technologies" on the broader discipline and industry of opinion research. In September 2012, the AAPOR Council approved the formation of the Emerging Technologies Task Force with the goal of focusing on two critical areas: smartphones as data collection vehicles, and social media as platform and information source. The current report focuses on social media; a companion report covers mobile data collection. This report examines the potential impact of social media on public opinion research – as a vehicle for facilitating some aspect of the survey research process (e.g., questionnaire development, recruitment, locating) and/or for augmenting or replacing traditional survey research methods (e.g., content analysis of existing data). We distinguish between qualitative insights and quantitative indicators from social media and discuss the factors that must be evaluated to determine their fitness for use.
Year of publication 2014
Bibliographic type Reports, seminars
Web survey bibliography - Reports, seminars (231)
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2016; 2016
- Standard Definitions Final Dispositions of Case Codes and Outcome Rates for Surveys; 2016
- FocusVision 2015 Annual MR Technology Report; 2016; Macer, T., Wilson, S.
- Establishing the accuracy of online panels for survey research; 2016; Bruggen, E.; van den Brakel, J.; Krosnick, J. A.
- Mixing modes of data collection in Swiss social surveys: Methodological report of the LIVES-FORS mixed...; 2016; Roberts, C.; Joye, D.; Staehli, M. E.
- Assessment of Innovations in Data Collection Technology for Understanding Society; 2016; Couper, M. P.
- Report of the Inquiry into the 2015 British general election opinion polls; 2016; Sturgis, P., Baker, N., Callegaro, M., Fisher, St., Green, J., Jennings, W., Kuha, J., Lauderdale, B...
- Evaluating a New Proposal for Detecting Data Falsification in Surveys; 2016; Simmons, K.; Mercer, A. W.; Schwarzer, S.; Courtney, K.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- Predictive inference for non-probability samples: a simulation study ; 2016; Buelens, B.; Burger, J.; van den Brakel, J.
- ESOMAR/GRBN Online Research Guideline; 2015
- App vs. Web for Surveys of Smartphone Users: Experimenting with mobile apps for signal-contingent experience...; 2015; McGeeney, K.; Keeter, S.; Igielnik, R.; Smith, A.; Rainie, L.
- On Climbing Stairs Many Steps at a Time: The New Normal in Survey Methodology; 2015; Dillman, D. A.
- Polling Error in the 2015 UK General Election: An Analysis of YouGov’s Pre and Post-Election Polls...; 2015; Wells, A.; Rivers, D.
- GreenBook Research Industry Trends Report; 2015; Murphy, L. (Ed.)
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2015; 2015
- Methodology of the RAND Mid-Term 2014 Election Panel; 2015; Carman, K. G; Pollack, S.
- 28 Questions to Help Buyers of Online Samples; 2015; Cape, P. J.; Phillips, A.; Baker, R.; Cooke, M.; Ribeiro, E.; Terhanian, G.
- Understanding Society Innovation Panel Wave 7: Results from Methodological Experiments; 2015; Blom, A. G.; Burton, J.; Booker, C. L.; Cernat, A.; Fairbrother, M.; Jaeckle, A.; Kaminska, O.; Keusch...
- Tips for Creating Web Surveys for Completion on a Mobile Device; 2015; McGeeney, K.
- U.S. Survey Research: Sampling; 2015
- A Comparison of Different Online Sampling Approaches for Generating National Samples; 2014; Heen, M. S. J., Lieberman, J. D., Miethe, T. D.
- FocusVision 2014 Annual MR Technology Report; 2014; Macer, T., Wilson, S.
- The Changing Landscape of Technology and its Effect on Online Survey Data Collection; 2014; Mitchell, N.
- Query on Data Collection for Social Surveys; 2014; Blanke, K., Luiten, A.
- The role of email addresses and email contact in encouraging web response in a mixed mode design ; 2014; Cernat, A., Lynn, P.
- Mixed-mode surveys of the general population - Results from the European Social Survey mixed-mode experiment...; 2014; Park, A., Humphrey, A.
- Mixed-mode designs in surveys with sensitive questions: effects on participation and response behavior [Mixed-Mode Designs bei Erhebungen mit sensitiven Fragen: Einfluss auf das Teilnahme- und Antwortverhalten]...; 2014; Krug, G., Kriwy, P., Carstensen, J.
- Methods and systems for managing an online opinion survey service; 2014; Mcloughlin, M. H., Seton, N., Blesy, K.
- Mobile Technologies for Conducting, Augmenting and Potentially Replacing Surveys: Report of the AAPOR...; 2014; Link, M. W., Murphy, J., Schober, M. F., Buskirk, T. D., Childs, J. H., Tesfaye, C.
- The use of within-subject experiments for estimating measurement effects in mixed-mode surveys ; 2014; Klausch, L. T., Schouten, B., Hox, J.
- Measuring well-being: An analysis of different response scales; 2014; van Beuningen, J., van der Houwen, K., Moonen, L.
- The impact of contact effort and interviewer performance on mode-specific nonresponse and measurement...; 2014; Schouten, B., Cobben, F., van der Laan, J., Arends, J.
- Community Life Survey: Summary of web experiment findings; 2013
- The Short-term Campaign Panel of the German Longitudinal Election Study 2009. Design, Implementation...; 2013; Steinbrecher, M., Rossmann, J.
- Too Fast, Too Straight, Too Weird: Post Hoc Identification of Meaningless Data in Internet ; 2013; Leiner, D. J.
- Postal recruitment into a longitudinal online panel survey. The effects of different number of reminder...; 2013; Martinsson, J.
- The world in 2013. ICT facts and figures; 2013
- Microsoft Security Intelligence Report, Volume 15; 2013
- A Comparison of Results from a Spanish and English Mail Survey: Effects of Instruction Placement on...; 2013; Wang, K., Sha, M.
- Research Note: Reducing the Threat of Sensitive Questions in Online Surveys?; 2013; Couper, M. P.
- Global market research 2013; 2013
- Exploring the Digital Nation: America’s Emerging Online Experience; 2013
- Advantages of a global multimodal print & digital readership survey; 2013; Cour, N., Saint-Joanis, G.
- Australia: building a 21st century readership survey; 2013; Green, A., White, H.
- The new Swiss national readership survey: fit for the future; 2013; Amschler, H., Hoffmann, J.
- ESS Mixed Mode Experiment Results in Estonia (CAWI and CAPI Mode Sequential Design); 2013; Ainsaar, M., Lilleoja, L., Lumiste, K., Roots, A.
- Using smartphones in survey research: a multifunctional tool. Implementation of a time use app; a feasibility...; 2013; Sonck, N., Fernee, H.
- Adaptive survey designs to minimize survey mode effects. A case study on the Dutch Labour Force Survey...; 2013; Calinescu, M., Schouten, B.
- Optimal Resource Allocation in Adaptive Survey Designs; 2013; Calinescu, M.