Web Survey Bibliography
It is generally accepted that conducting surveys online is both faster and cheaper than more traditional survey methodologies, advantages that have helped drive the growth of online surveys in recent years. Alongside this growth, respondents are increasingly asked to answer personal and sensitive questions online. It is therefore important to investigate and understand how respondents behave when answering sensitive questions, so that the most effective methodology can be employed.
A salient issue in online survey research is the removal of the interviewer. This is particularly relevant when dealing with sensitive topics, where the absence of an interviewer can reduce response bias. Much research has demonstrated that surveys administered online, without an interviewer present, are characterised by higher levels of self-disclosure (Weisband and Kiesler 1996), an increased willingness to answer sensitive questions (Tourangeau 2004) and reduced socially desirable responding (Frick et al. 2001; Joinson 1999). Furthermore, survey modes that reduce the level of question administration by human interviewers (e.g. computer-assisted self-interviewing) also increase responses to sensitive personal questions and yield more honest, candid answers.
As part of ongoing experimental work at Ipsos MORI, we are investigating the effect of different survey modes on respondents' behaviour when answering sensitive questions.
In the present paper we report a two-part study. Part 1 searches for evidence of survey mode effects on disclosure levels, examining data from participants interviewed in one of three conditions. In condition one, 1,645 members of the Ipsos Online Panel completed an online survey. In condition two, 902 participants were interviewed offline, face-to-face, using Computer Assisted Personal Interviewing (CAPI). Finally, in condition three, 1,028 participants were also interviewed offline, using Computer Assisted Self Interviewing (CASI). Direct comparisons were possible between the two offline samples. Allocation to the online sample, on the other hand, was not randomized, so propensity score adjustment was applied to control for possible confounding in online/offline comparisons. Respondents were asked more than 50 questions on a variety of topics, from politics to media consumption. Within these, respondents were asked five questions deemed sensitive, covering immigration, adultery, drink driving, abortion, and attitudes towards debt.
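The abstract does not specify how the propensity score adjustment was implemented. As a rough illustration only, a minimal inverse-propensity weighting sketch might look like the following, where the propensity scores (which would in practice come from a model of mode selection on respondent covariates) and all respondent data are hypothetical:

```python
# Minimal sketch of inverse-propensity weighting, assuming propensity
# scores P(online | covariates) have already been estimated elsewhere.
# All numbers below are illustrative, not from the study.

def ipw_weights(propensity, online):
    """Weight each offline case by p/(1-p) so the weighted offline
    sample matches the online sample's covariate profile (weighting
    'to the treated'); online cases keep weight 1."""
    return [1.0 if is_online else p / (1.0 - p)
            for p, is_online in zip(propensity, online)]

def weighted_mean(values, weights):
    """Weighted mean, e.g. an adjusted disclosure rate."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical respondents: propensity to be online, actual mode,
# and a 0/1 disclosure indicator for one sensitive item.
propensity = [0.8, 0.7, 0.3, 0.2]
online     = [True, True, False, False]
disclosed  = [1, 1, 0, 1]

w = ipw_weights(propensity, online)
adjusted_disclosure = weighted_mean(disclosed, w)
```

Weighting the offline cases rather than the online ones reflects that here the non-randomized online sample is the comparison of interest; other weighting targets (e.g. weighting both groups to the pooled sample) are equally plausible readings of the abstract.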
Part 2 examined the association between level of sensitivity and level of disclosure, and specifically any differences between the three survey modes. To estimate social sensitivity, an ad hoc panel of five experienced independent social researchers, sampled from a larger pool of experts, was asked to rank the sensitivity of each of the five questions. After the rankings passed reliability tests of inter-rater agreement, the estimated sensitivity was correlated with item disclosure level by mode.
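The abstract names neither the agreement statistic nor the correlation measure. One plausible reading, sketched below with entirely hypothetical rater data, is Kendall's coefficient of concordance W for agreement among the five rankers, followed by a Spearman rank correlation between mean sensitivity rank and disclosure rate:

```python
# Hypothetical sketch: Kendall's W for agreement among m raters
# ranking n items, then Spearman's rho between sensitivity and
# disclosure. Neither statistic is confirmed by the abstract.

def kendalls_w(rankings):
    """Kendall's coefficient of concordance; rankings is a list of m
    lists, each a permutation of ranks 1..n (no ties)."""
    m, n = len(rankings), len(rankings[0])
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    mean_total = sum(totals) / n
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

def spearman_rho(x, y):
    """Spearman rank correlation between two sequences (no ties)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, 1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Five hypothetical raters each rank the five sensitive items.
rater_rankings = [
    [1, 2, 3, 4, 5],
    [1, 3, 2, 4, 5],
    [2, 1, 3, 4, 5],
    [1, 2, 4, 3, 5],
    [1, 2, 3, 5, 4],
]
agreement = kendalls_w(rater_rankings)  # near 1 = strong agreement

# Hypothetical mean sensitivity ranks vs. disclosure rates per item.
sensitivity = [1.2, 2.0, 3.0, 4.0, 4.8]
disclosure  = [0.90, 0.85, 0.70, 0.55, 0.40]
rho = spearman_rho(sensitivity, disclosure)
```

A strongly negative rho under this sketch would correspond to the intuitive pattern that more sensitive items attract less disclosure, computed separately within each of the three modes.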
Finally, implications for the handling of sensitive questions in survey research are discussed.