Web Survey Bibliography
Web surveys are a commonly used method for collecting and analyzing data about social and psychological phenomena in online communities. Nevertheless, little systematic methodological research has as yet been done on administering and conducting surveys in these online domains. This is somewhat surprising given the methodological advances of other techniques in online community research, such as virtual ethnography, data mining and social network analysis, and given that internet surveys were among the first techniques used in online community research (Jones 1998, 1999).

The quality of web surveys in online communities is commonly affected by two types of survey error (Andrews et al. 2003): noncoverage error and unit nonresponse error. Although the two errors can be interrelated, the noncoverage error mainly concerns the issue of defining a valid unit of analysis (on the theoretical and empirical level). Online communities generally require individuals to formally register in order to participate in community activities (e.g., one must log in with an email address in order to submit a message to a forum thread). However, there are also online communities that allow individuals to participate without being registered (i.e., anonymous users). In such cases, researchers often opt for intercept surveys and/or unrestricted self-selected surveys, for example by posting an invitation message to a forum discussion thread or publishing a banner on the website (Fricker 2010), thereby giving some users of the analyzed online community no chance of being selected into the sample. The unit nonresponse error, on the other hand, generally pertains to list-based web surveys (Lozar Manfreda et al. 2002). In such cases a sampling frame of online community members is available (e.g., a list of all members of an online community), but for various reasons some sample units refuse to answer the survey.
As in general population surveys, unit nonresponse can lead to bias in online community surveys. For instance, there is some empirical evidence that a larger percentage of high-frequency online community participants takes part in surveys compared to low-frequency participants, and that high-frequency participants reply more quickly to survey invitations (Andrews et al. 2003).

The purpose of this paper is to present the design of a study evaluating the effect of survey invitation characteristics and response reluctance on unit nonresponse in list-based surveys of web forum participants. On the one hand, we are interested in response-inducing techniques (Dillman et al. 2007). To date, the literature on list-based web surveys has focused mainly on the personalization of email invitations, their subject lines and their graphical design. The focus of this study is on the content of email invitations. Specifically, we will attempt to gain insight into how response rates are associated with a request for help, authority, and a sense of community. The decision to study a combination of these three elements was theoretically informed by research on online communities showing that exchange of social support, belonging and norms are among the most salient building blocks of their development and sustainability (Preece 2000; Kraut et al. 2012). Considering leverage-salience theory (Groves et al. 2000) and the fact that these elements do not appeal to all online community members to the same extent, putting forward different combinations of the three elements in invitations might result in different response rates. In addition, we explore whether and how the three response-inducing techniques are associated with unit nonresponse bias. To address these questions, an experiment with a full factorial between-subjects design with eight experimental groups was planned.
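The eight experimental groups follow directly from crossing the three invitation elements (each present or absent, 2 × 2 × 2 = 8). A minimal sketch of such an assignment, assuming hypothetical member IDs and a simple balanced randomization rather than the study's actual procedure:

```python
import itertools
import random

# The three response-inducing elements from the study; each is either
# present or absent in the email invitation, giving a 2x2x2 = 8-cell
# full factorial between-subjects design.
FACTORS = ["request_for_help", "authority", "sense_of_community"]

# All eight experimental conditions as (help, authority, community) tuples.
CONDITIONS = list(itertools.product([False, True], repeat=len(FACTORS)))

def assign_conditions(member_ids, seed=42):
    """Randomly assign each sampled forum member to one of the eight
    cells, balancing cell sizes by shuffling and cycling conditions."""
    rng = random.Random(seed)
    ids = list(member_ids)
    rng.shuffle(ids)
    return {mid: dict(zip(FACTORS, CONDITIONS[i % len(CONDITIONS)]))
            for i, mid in enumerate(ids)}

assignment = assign_conditions(range(80))
print(len(CONDITIONS))  # 8 experimental groups
```

With 80 hypothetical members this yields eight cells of ten members each; the condition dictionary for each member can then drive which elements are inserted into that member's invitation text.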
On the other hand, the research design was developed to explore whether there are differences between early and late respondents, and whether sending reminders can reduce the potential bias associated with unit nonresponse error. The advantage of list-based web surveys in web forums is that nonresponse bias can be analyzed with the record-linkage technique (Porter and Whitcomb 2005), since both respondents and nonrespondents can be linked to the database records of registered web forum users (which generally include information such as frequency of participation, number of posted messages, and year of registration). The presentation will end with an open discussion of potential avenues for optimizing the proposed research design.
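The record-linkage idea can be illustrated with a small sketch: because response status is known for every member of the sampling frame, respondents and nonrespondents can be compared on activity variables held in the forum's user database. The field names (`posts`, `registered`) and the toy records are assumptions for illustration, not the actual database schema:

```python
def mean(xs):
    """Arithmetic mean; NaN for an empty group."""
    return sum(xs) / len(xs) if xs else float("nan")

def nonresponse_bias(user_db, respondent_ids, field="posts"):
    """Compare the mean of a user-database field (e.g. number of posted
    messages) between survey respondents and nonrespondents."""
    resp = [u[field] for u in user_db if u["id"] in respondent_ids]
    nonresp = [u[field] for u in user_db if u["id"] not in respondent_ids]
    return mean(resp), mean(nonresp)

# Toy forum database: member id, number of posts, registration year.
user_db = [
    {"id": 1, "posts": 120, "registered": 2010},
    {"id": 2, "posts": 5,   "registered": 2015},
    {"id": 3, "posts": 300, "registered": 2009},
    {"id": 4, "posts": 2,   "registered": 2016},
]

resp_mean, nonresp_mean = nonresponse_bias(user_db, respondent_ids={1, 3})
print(resp_mean, nonresp_mean)  # 210.0 3.5
```

In this toy example the high-frequency posters are the ones who responded, so the respondent mean far exceeds the nonrespondent mean, which is exactly the kind of unit nonresponse bias the reminder and early/late comparisons are meant to detect.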
Workshop Homepage (abstract)
Web survey bibliography (4086)
- Displaying Videos in Web Surveys: Implications for Complete Viewing and Survey Responses; 2017; Mendelson, J.; Lee Gibson, J.; Romano Bergstrom, J. C.
- Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not...; 2017; Toepoel, V.; Emerson, H.
- Mind the Mode: Differences in Paper vs. Web-Based Survey Modes Among Women With Cancer; 2017; Hagan, T. L.; Belcher, S. M.; Donovan, H. S.
- Answering Without Reading: IMCs and Strong Satisficing in Online Surveys; 2017; Anduiza, E.; Galais, C.
- Ideal and maximum length for a web survey; 2017; Revilla, M.; Ochoa, C.
- Social desirability bias in self-reported well-being measures: evidence from an online survey; 2017; Caputo, A.
- Web-Based Survey Methodology; 2017; Wright, K. B.
- Handbook of Research Methods in Health Social Sciences; 2017; Liamputtong, P.
- Lessons from recruitment to an internet based survey for Degenerative Cervical Myelopathy: merits of...; 2017; Davies, B.; Kotter, M. R.
- Web Survey Gamification - Increasing Data Quality in Web Surveys by Using Game Design Elements; 2017; Schacht, S.; Keusch, F.; Bergmann, N.; Morana, S.
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Comparability of web and telephone surveys for the measurement of subjective well-being; 2017; Sarracino, F.; Riillo, C. F. A.; Mikucka, M.
- Achieving Strong Privacy in Online Survey; 2017; Zhou, Yo.; Zhou, Yi.; Chen, S.; Wu, S. S.
- A Meta-Analysis of the Effects of Incentives on Response Rate in Online Survey Studies; 2017; Mohammad Asire, A.
- Telephone versus Online Survey Modes for Election Studies: Comparing Canadian Public Opinion and Vote...; 2017; Breton, C.; Cutler, F.; Lachance, S.; Mierke-Zatwarnicki, A.
- Examining Factors Impacting Online Survey Response Rates in Educational Research: Perceptions of Graduate...; 2017; Saleh, A.; Bista, K.
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico.; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- Rates, Delays, and Completeness of General Practitioners’ Responses to a Postal Versus Web-Based...; 2017; Sebo, P.; Maisonneuve, H.; Cerutti, B.; Pascal Fournier, J.; Haller, D. M.
- Necessary but Insufficient: Why Measurement Invariance Tests Need Online Probing as a Complementary...; 2017; Meitinger, K.
- Nonresponse in Organizational Surveying: Attitudinal Distribution Form and Conditional Response Probabilities...; 2017; Kulas, J. T.; Robinson, D. H.; Kellar, D. Z.; Smith, J. A.
- Theory and Practice in Nonprobability Surveys: Parallels between Causal Inference and Survey Inference...; 2017; Mercer, A. W.; Kreuter, F.; Keeter, S.; Stuart, E. A.
- Is There a Future for Surveys; 2017; Miller, P. V.
- Reducing speeding in web surveys by providing immediate feedback; 2017; Conrad, F.; Tourangeau, R.; Couper, M. P.; Zhang, C.
- Social Desirability and Undesirability Effects on Survey Response latencies; 2017; Andersen, H.; Mayerl, J.
- A Working Example of How to Use Artificial Intelligence To Automate and Transform Surveys Into Customer...; 2017; Neve, S.
- A Case Study on Evaluating the Relevance of Some Rules for Writing Requirements through an Online Survey...; 2017; Warnier, M.; Condamines, A.
- Estimating the Impact of Measurement Differences Introduced by Efforts to Reach a Balanced Response...; 2017; Kappelhof, J. W. S.; De Leeuw, E. D.
- Targeted letters: Effects on sample composition and item non-response; 2017; Bianchi, A.; Biffignandi, S.