Web Survey Bibliography
Decreasing survey participation leads to increasing survey costs and lower precision of survey estimates. In the likely case that noncontacts and refusals differ from respondents, it may also increase nonresponse bias. Online panels are promoted as the solution to these problems. They are relatively inexpensive, as large samples can easily be drawn from panels containing hundreds of thousands of willing respondents. With increasing internet penetration, even specific groups such as the elderly will have internet access and can therefore be included in online panels.

Survey methodology has been developed over a period of more than 100 years. The paradigm of probability sampling has been shown to work well in social research, official statistics and market research. It has allowed researchers to produce well-founded and reliable survey results. Often the impression is created that this paradigm also applies to online panel research. Sadly, general survey quality criteria such as sample size and response rate cannot be generalized to panel studies. Under-coverage and self-selection may seriously limit their value for scientific and policy-making purposes.

Some survey researchers claim that the problems mentioned above can be reduced by applying some kind of weighting adjustment procedure, e.g. using weighting variables measured in a reference survey. We argue that this is too optimistic a point of view. The presentation will outline why online panels do not solve the problems caused by decreased participation. It will discuss online panels within a quality framework and present examples of possibly useful applications of online panels in academic and governmental research.
In market research, online panels are considered to be the future. There are signs, however, that despite the rapidly increasing market share of web research, the rising star of online panels is already on the wane.
Web Survey Bibliography - Stoop, I. (11)
- Unit Non-Response Due to Refusal; 2012; Stoop, I.
- Classification of Surveys; 2012; Stoop, I., Harrison, E.
- Smartphone Data Collection Among LISS Panel Members; 2011; Fernee, H., Sonck, N., Stoop, I.
- Nonresponse Bias in Surveys; 2009; Bethlehem, J., Vehovar, V., Stoop, I., Schouten, B., Shlomo, N., Skinner, C., Montaquila, J.
- Access panels and online research, panacea or pitfall? Proceedings of the DABS symposium, Amsterdam,...; 2008; Stoop, I.
- Survey data, context and event data; 2007; Stoop, I.
- Increased fieldwork efforts, enhanced response rates, better estimates?; 2007; Stoop, I., Verhagen, J., van Ingen, E.
- Online Panels - A Paradigm Theft?; 2007; Bethlehem, J., Stoop, I.
- Access Panels and Survey Nonresponse: Making It Better or Worse?; 2007; Stoop, I., Bethlehem, J., de Bie, S.
- The Impact of Events on Attitudes: Real-time Measurement; 2007; Stoop, I.
- Access panels as a solution to the nonresponse problem?; 2004; Stoop, I.