Web Survey Bibliography
Relevance & Research Question: Many studies have found that the wording of a survey question can influence the answers respondents provide. In particular, it has been shown that vague and ambiguous terms are often interpreted idiosyncratically by respondents and can thus introduce systematic bias into survey data. Beyond ambiguity, the cognitive effort required to understand survey questions may affect data quality in a similar way. Earlier research identified several problematic text features (such as low-frequency words, left-embedded syntactic structures, and low syntactic redundancy) that reduce question clarity and make survey questions difficult to comprehend (e.g., Lenzner, Kaczmirek, & Lenzner, 2010). This paper extends those findings and examines whether the effort required to comprehend survey questions affects data quality.
Methods & Data: An experiment was carried out in which respondents completed two Web surveys (N1 = 825, N2 = 515) at a two-week interval. Approximately half of the respondents answered questionnaires that included unclear, less comprehensible questions; the other half received control questions that were easier to comprehend. The indicators of data quality were drop-out rates, the number of non-substantive ("don't know") responses, the number of neutral (midpoint) responses, and the over-time consistency of responses across the two surveys. In addition, respondents' verbal intelligence and motivation were assessed to examine whether the effects of question clarity were moderated by these two respondent characteristics.
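The abstract does not spell out how these indicators are computed. As a minimal sketch, per-respondent versions of three of them might be derived from two waves of answers as follows; the response scale (with a midpoint of 4), the "don't know" coding (None), and the agreement-based consistency measure are illustrative assumptions, not details from the paper:

```python
# Hypothetical sketch of per-respondent data quality indicators.
# Scale midpoint, "don't know" code, and the consistency measure
# are assumptions for illustration only.

DONT_KNOW = None   # non-substantive response code (assumed)
MIDPOINT = 4       # neutral category on an assumed 7-point scale

def quality_indicators(wave1, wave2):
    """wave1/wave2: one respondent's answers to the same items in both surveys."""
    dont_knows = sum(1 for a in wave1 if a is DONT_KNOW)
    midpoints = sum(1 for a in wave1 if a == MIDPOINT)
    # Over-time consistency here = share of items answered identically in
    # both waves, among items with substantive answers in both. (The paper
    # may use a different measure; the exact operationalization is not stated.)
    pairs = [(a, b) for a, b in zip(wave1, wave2)
             if a is not DONT_KNOW and b is not DONT_KNOW]
    consistency = (sum(a == b for a, b in pairs) / len(pairs)) if pairs else 0.0
    return {"dont_knows": dont_knows,
            "midpoints": midpoints,
            "consistency": consistency}

# Example: one "don't know", one midpoint answer, 3 of 4 comparable
# items answered identically across waves.
print(quality_indicators([1, 4, None, 5, 2], [1, 4, 3, 6, 2]))
# → {'dont_knows': 1, 'midpoints': 1, 'consistency': 0.75}
```

Higher "don't know" and midpoint counts and lower consistency would then indicate lower-quality responding, matching the direction of the indicators described above.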
Results: As expected, respondents receiving unclear questions provided lower-quality responses than respondents answering more comprehensible questions. Moreover, some of these effects were more pronounced among respondents with limited verbal skills and among respondents with low motivation to answer surveys.
Added value: These findings indicate that survey results can be systematically biased if questions are difficult to understand and exceed the processing effort that respondents are willing or able to invest. Making it easy for respondents to retrieve the meaning of a survey question seems to be an important requirement for obtaining high-quality answers.