
Web Survey Bibliography

Title Research design for studying online communities with web surveys
Year 2012
Access date 01.12.2012
Abstract

Web surveys have been a commonly used method for collecting and analyzing data about social and psychological phenomena in online communities. Nevertheless, little systematic methodological research has so far been done on administering and conducting surveys in these online domains. This is somewhat surprising given the methodological advances of other research techniques in the field of online community research, such as virtual ethnography, data mining and social network analysis, and the fact that internet surveys were among the first techniques used in online community research (Jones 1998, 1999).

The quality of web surveys in online communities is commonly affected by two types of survey error (Andrews et al. 2003): non-coverage and unit nonresponse. Although the two errors can be interrelated, non-coverage error mainly concerns the issue of defining a valid unit of analysis (at both the theoretical and the empirical level). Online communities generally require individuals to register formally in order to participate in community activities (e.g., one must log in with an email account before submitting a message to a forum thread). However, some online communities also allow individuals to participate without being registered (i.e., as anonymous users). In such cases researchers often opt for intercept surveys and/or unrestricted self-selected surveys, for example by posting an invitation message to a forum discussion thread or publishing a banner on the website (Fricker 2010), thereby failing to give some users of the analyzed online community any chance of being selected into the sample. Unit nonresponse error, on the other hand, generally pertains to list-based web surveys (Lozar Manfreda et al. 2002). In such cases a sampling frame of online community members is available (e.g., a list of all members of an online community), but for various reasons some sample units refuse to answer the survey. As in general population surveys, unit nonresponse in online community surveys can lead to bias. For instance, there is some empirical evidence that a larger percentage of high-frequency online community participants take part in surveys than low-frequency participants, and that high-frequency participants reply more quickly to survey invitations than low-frequency participants (Andrews et al. 2003).

The purpose of this paper is to present the research design of a study aimed at evaluating the effect of survey invitation characteristics and response reluctance on unit nonresponse in list-based surveys of web forum participants. On the one hand, we are interested in response-inducing techniques (Dillman et al. 2007). To date, the literature on list-based web surveys has generally focused on the personalization of email invitations, their subject lines and their graphical design. The focus of this study is on the content of email invitations: we will attempt to gain insight into how response rates are associated with a request for help, authority, and a sense of community. The decision to study a combination of these three elements was theoretically informed by research on online communities showing that the exchange of social support, belonging and norms are among the most salient building blocks of their development and sustainability (Preece 2000, Kraut et al. 2012). Considering leverage-salience theory (Groves et al. 2000) and the fact that these elements do not appeal to all online community members to the same extent, putting forward different combinations of the three elements in invitations might therefore result in different response rates. In addition, we would like to explore if and how the three response-inducing techniques are associated with unit nonresponse bias. To address these questions, an experiment with a full factorial between-subjects design, crossing the three invitation elements into eight (2 × 2 × 2) experimental groups, was planned. On the other hand, the research design was developed to explore whether there are differences between early and late respondents, and whether sending reminders can reduce the potential bias associated with unit nonresponse error. The advantage of list-based web surveys in web forums is that nonresponse bias can be analyzed with the record-linkage technique (Porter and Whitcomb 2005), since both respondents and nonrespondents can be linked to database records of registered web forum users (which generally include information about frequency of participation, number of posted messages, year of registration, etc.). The presentation will end with an open discussion of potential avenues for optimizing the proposed research design.
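As an illustration of the experimental setup described in the abstract, the following minimal sketch (Python, with made-up member IDs and element names; not the authors' actual implementation) enumerates the 2 × 2 × 2 combinations of the three invitation elements and randomly assigns sampled forum members to the eight resulting between-subjects groups.

```python
import itertools
import random

# The three binary invitation elements studied (each present or absent).
ELEMENTS = ["request_for_help", "authority", "sense_of_community"]

# Full factorial design: every combination of the three elements -> 2^3 = 8 groups.
CONDITIONS = [dict(zip(ELEMENTS, combo))
              for combo in itertools.product([True, False], repeat=len(ELEMENTS))]

def assign_groups(member_ids, seed=42):
    """Randomly assign each sampled forum member to one of the eight
    experimental groups (between-subjects: one condition per member)."""
    rng = random.Random(seed)
    shuffled = list(member_ids)
    rng.shuffle(shuffled)
    # Deal members round-robin across conditions so group sizes stay balanced.
    return {mid: CONDITIONS[i % len(CONDITIONS)] for i, mid in enumerate(shuffled)}

# Hypothetical usage with made-up member identifiers.
assignment = assign_groups(["user_001", "user_002", "user_003", "user_004"])
for member, condition in assignment.items():
    print(member, condition)
```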
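The record-linkage idea mentioned above can likewise be sketched in code. Assuming, purely for illustration, that the forum's user database exposes fields such as number of posts and registration year, the snippet below links survey response status to those records and compares respondents with nonrespondents on mean posting activity; the field names and values are hypothetical, not taken from the study.

```python
# Hypothetical forum database records keyed by member ID (illustrative values).
forum_db = {
    "user_001": {"posts": 540, "reg_year": 2006},
    "user_002": {"posts": 12,  "reg_year": 2011},
    "user_003": {"posts": 230, "reg_year": 2008},
    "user_004": {"posts": 3,   "reg_year": 2012},
}

# Survey outcome for every sampled member: True = responded, False = did not.
responded = {"user_001": True, "user_002": False,
             "user_003": True, "user_004": False}

def mean_posts(ids):
    """Average number of posted messages for a set of member IDs."""
    return sum(forum_db[i]["posts"] for i in ids) / len(ids)

# Record linkage: split the sampling frame by response status and compare the
# groups on a variable available for both respondents and nonrespondents.
resp_ids = [i for i, r in responded.items() if r]
nonresp_ids = [i for i, r in responded.items() if not r]

print("mean posts, respondents:   ", mean_posts(resp_ids))
print("mean posts, nonrespondents:", mean_posts(nonresp_ids))
# A large gap would suggest nonresponse bias on participation frequency.
```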

Access/Direct link

Workshop Homepage (abstract)

Year of publication 2012
Bibliographic type Conferences, workshops, tutorials, presentations

Web survey bibliography - 6th Internet Survey Methodology Workshop 2012