Web Survey Bibliography
Relevance & Research Question: Surveys sometimes include sensitive topics, e.g. sexual behavior or tax evasion. Respondents often hesitate to answer such sensitive items, which results in high item non-response rates and a specific type of response error: a tendency to underreport socially undesirable and to overreport desirable behavior. The randomized response technique (RRT) (Warner, 1965) is a well-known survey technique that reduces misreporting by protecting the privacy of the respondents. However, to obtain valid and reliable data, respondents have to understand and follow the technique's instructions. Cheating detection models (e.g. Clark & Desharnais, 1998) try to identify respondents who do not act according to the instructions of the design (and are, hence, cheating). Web surveys offer the opportunity to "observe" the respondents' answering process by means of additional so-called paradata. In this study we present a new approach to detecting cheaters using such client-side paradata, especially item response times.
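To illustrate the baseline technique, here is a minimal sketch of Warner's (1965) original estimator: each respondent answers the sensitive statement with probability p and its negation with probability 1 - p, so the interviewer never learns which question was answered. The example counts (70 "yes" answers, p = 0.75) are illustrative, not the study's data.

```python
def warner_estimate(yes_count: int, n: int, p: float) -> float:
    """Estimate the prevalence pi of the sensitive trait under
    Warner's mirrored-question design.

    Observed 'yes' probability: lam = p*pi + (1 - p)*(1 - pi),
    hence pi = (lam + p - 1) / (2p - 1), which requires p != 0.5.
    """
    if p == 0.5:
        raise ValueError("p = 0.5 makes pi unidentifiable")
    lam = yes_count / n
    return (lam + p - 1) / (2 * p - 1)

# Illustrative call: 159 respondents, 70 'yes' answers, p = 0.75
pi_hat = warner_estimate(70, 159, 0.75)
```

Note that the estimator can fall outside [0, 1] in small samples; in practice it is usually truncated to that range.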
Methods & Data: We conducted a web survey during the university's open house (N=159), using the RRT to estimate the prevalence of cheating on a partner. To assess individual item response times, we implemented two comparable experimental conditions: the classical RRT (including a sensitive question) and a similar RR design (without a sensitive question). Assuming that cheaters answer quickly without paying much attention to the content of the question, we then tested whether individual item response times differ significantly between the two settings.
Results: We found a small proportion of cheaters. As a comparison with the unadjusted estimator shows, the detected proportion of cheaters affects the estimated proportion of people carrying the sensitive characteristic.
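For the adjusted estimator, a minimal sketch of the method-of-moments solution to Clark & Desharnais' (1998) cheating detection model is given below: two subsamples receive different forced-"yes" probabilities, which identifies the cheater proportion alongside the prevalence. The numeric inputs are illustrative, not the study's results.

```python
def cdm_estimates(lam1, lam2, p1, p2):
    """Method-of-moments estimates for Clark & Desharnais' (1998)
    cheating detection model with two forced-'yes' subsamples.

    lam_i: observed 'yes' proportion in subsample i.
    p_i:   probability of answering the sensitive question truthfully
           (with probability 1 - p_i a 'yes' answer is forced); p1 != p2.
    Latent classes: pi (honest carriers), beta (honest non-carriers),
    gamma (cheaters who answer 'no' regardless of instructions), with
    lam_i = pi + (1 - p_i) * beta and pi + beta + gamma = 1.
    """
    beta = (lam1 - lam2) / (p2 - p1)
    pi = lam1 - (1 - p1) * beta
    gamma = 1 - pi - beta
    return pi, gamma

# Illustrative numbers (not the study's data):
pi_adj, gamma_hat = cdm_estimates(lam1=0.35, lam2=0.47, p1=0.8, p2=0.6)
```

With these inputs the model attributes part of the "no" answers to cheaters, so the adjusted prevalence pi differs from what a naive estimator that ignores cheating would report.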
Added Value: Previous research on cheating detection has focused only on the aggregate proportion of cheaters, not on identifying cheaters at the individual level. A cheating detection method operating on the individual level improves the data quality of answers to sensitive questions. Item response times (and other client-side paradata) could thus contribute to improving the estimation process.
Web survey bibliography - General Online Research Conference (GOR) 2014 (29)
- Using Paradata to Predict and to Correct for Panel Attrition in a Web-based Panel Survey; 2014; Rossmann, J., Gummer, T.
- Targeting the bias – the impact of mass media attention on sample composition and representativeness...; 2014; Steinmetz, S., Oez, F., Tijdens, K. G.
- Offline Households in the German Internet Panel; 2014; Bossert, D., Holthausen, A., Krieger, U.
- Which fieldwork method for what target group? How to improve response rate and data quality; 2014; Wulfert, T., Woppmann, A.
- Exploring selection biases for developing countries - is the web a promising tool for data collection...; 2014; Tijdens, K. G., Steinmetz, S.
- Evaluating mixed-mode redesign strategies against benchmark surveys: the case of the Crime Victimization...; 2014; Klausch, L. T., Hox, J., Schouten, B.
- The quality of ego-centered social network data in web surveys: experiments with a visual elicitation...; 2014; Marcin, B., Matzat, U., Snijders, C.
- Switching the polarity of answer options within the questionnaire and using various numbering schemes...; 2014; Struminskaya, B., Schaurer, I., Bosnjak, M.
- Measuring the very long, fuzzy tail in the occupational distribution in web-surveys; 2014; Tijdens, K. G.
- Social Media and Surveys: Collaboration, Not Competition; 2014; Couper, M. P.
- Improving cheater detection in web-based randomized response using client-side paradata; 2014; Dombrowski, K., Becker, C.
- Interest Bias – An Extreme Form of Self-Selection?; 2014; Cape, P. J., Reichert, K.
- Online Qualitative Research – Personality Matters; 2014; Tress, F., Doessel, C.
- Increasing data quality in online surveys 4.1; 2014; Hoeckel, H.
- Moving answers with the GyroScale: Using the mobile device’s gyroscope for market research purposes...; 2014; Luetters, H., Kraus, M., Westphal, D.
- Online Surveys as a Management Tool for Monitoring Multicultural Virtual Team Processes; 2014; Scovotti, C.
- How much is shorter CAWI questionnaire VS CATI questionnaire?; 2014; Bartoli, B.
- WEBDATANET: A Network on Web-based Data Collection, Methodological Challenges, Solutions, and Implementation...; 2014; Tijdens, K. G., Steinmetz, S., de Pedraza, P., Serrano, F.
- The Use of Paradata to Predict Future Cooperation in a Panel Study; 2014; Funke, F., Goeritz, A.
- Incentives on demand in a probability-based online panel: redemption and the choice between pay-out...; 2014; Schaurer, I., Struminskaya, B., Kaczmirek, L.
- The Effect of De-Contextualisation - A Comparison of Response Behaviour in Self-Administered Surveys; 2014; Wetzelhuetter, D.
- Responsive designed web surveys; 2014; Dreyer, M., Reich, M., Schwarzkopf, K.
- Extra incentives for extra efforts – impact of incentives for burdensome tasks within an incentivized...; 2014; Schreier, J. H., Biethahn, N., Drewes, F.
- Students First Choice – the influence of mobile mode on results; 2014; Maxl, E.
- Device Effects: How different screen sizes affect answer quality in online questionnaires; 2014; Fischer, B., Bernet, F.
- Moving towards mobile ready web panels; 2014; Wijnant, A., de Bruijne, M.
- Innovation for television research - online surveys via HbbTV. A new technology with fantastic opportunities...; 2014; Herche, J., Adler, M.
- Mixed-devices in a probability based panel survey. Effects on survey measurement error; 2014; Toepoel, V., Lugtig, P. J.
- Online mobile surveys in Italy: coverage and other methodological challenges; 2014; Poggio, T.