Web Survey Bibliography
Relevance and research question: Cost pressures continue to push survey organizations to switch from expensive single-mode designs, such as face-to-face (F2F), to cheaper mixed-mode (MM) designs. Such a switch, however, carries the risk that the MM and single-mode designs yield different estimates if the redesign affects total survey error. Differences in measurement error and non-observation error, also called measurement effects (MEs) and selection effects (SEs), are particularly relevant sources of design differences in total error. Knowledge about these effects is needed to evaluate the usefulness of an MM redesign. We demonstrate the estimation of MEs and SEs for the case of the Dutch Crime Victimization Survey (CVS).
Methods and Data: We administered a split-ballot design in which four independent samples (n = 2,200 each) were drawn from the Dutch population register and assigned either to one of three sequential mixed-mode surveys (web, mail, or telephone, each followed by F2F) or to a single-mode F2F condition, which served as the benchmark. Additionally, respondents to the web, mail, and telephone modes were approached a second time in F2F after four weeks. This step yielded 'within-subject' estimates of MEs, which were exploited to disentangle MEs and SEs in the split-ballot design.
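The disentangling logic described above can be sketched on synthetic data: the total design difference between an MM estimate and the F2F benchmark is split into an ME, estimated within subjects from the F2F re-interview, and a residual SE. This is a minimal illustration only; all numbers, sample sizes, and variable names below are simulated assumptions, not the CVS data or the authors' actual estimator.

```python
import numpy as np

rng = np.random.default_rng(42)

# Benchmark condition: an independent single-mode F2F sample.
benchmark_f2f = rng.normal(loc=5.0, scale=1.0, size=2200)

# MM respondents answer in web first, then are re-interviewed in F2F.
# A +0.6 shift in the web answers mimics a measurement effect (ME);
# the re-interview mimics what the same people would report in F2F.
n_mm = 2200
web_answers = rng.normal(loc=5.8, scale=1.0, size=n_mm)
reinterview_f2f = web_answers - rng.normal(loc=0.6, scale=0.3, size=n_mm)

# Total design difference: MM (web) estimate vs. F2F benchmark.
total_diff = web_answers.mean() - benchmark_f2f.mean()

# Within-subject ME: same respondents, web answer vs. F2F re-interview.
me = (web_answers - reinterview_f2f).mean()

# The selection effect is the part of the total difference not
# explained by measurement: SE = total difference - ME.
se = total_diff - me

print(f"total={total_diff:.2f}  ME={me:.2f}  SE={se:.2f}")
```

In this simulation the ME recovers the simulated +0.6 measurement shift, and the SE is close to zero, mirroring the pattern the abstract reports for the web-F2F design.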
Results: The largest design differences in estimates were found for the mixed-mode mail-F2F design and, with smaller magnitude, for web-F2F. The telephone-F2F survey showed mostly non-significant differences from the benchmark. When evaluating MEs and SEs, we found that MEs were the predominant cause of the differences between both the mail-F2F and web-F2F designs and the benchmark, whereas SEs were generally very small. MEs and SEs were absent when comparing telephone-F2F against F2F.
Added Value: In the CVS case, the large MEs of the mail-F2F and web-F2F designs would require a further redesign of the questionnaires (e.g. by unimode strategies) to align measurement error in mail and web with F2F. The telephone-F2F design yielded estimates comparable to the benchmark, due to similar measurement error properties, and could be implemented directly. The absence of SEs suggests that the MM designs were successful in mitigating differences in non-observation error between designs. More generally, our method can be used by other researchers to evaluate MM redesigns of other surveys.
Web survey bibliography - General Online Research Conference (GOR) 2014
- Evaluating mixed-mode redesign strategies against benchmark surveys: the case of the Crime Victimization...; 2014; Klausch, L. T., Hox, J., Schouten, B.