
Web Survey Bibliography

Title: Evaluating Interactive Feedback in Computer-Assisted Self-Interviewing (CASI)
Year: 2013
Access date: 30.05.2013
Abstract

A long-standing concern with self-interviewing methods is that respondents may lack the motivation to expend effort in completing the survey, which can lead to satisficing and compromised data quality. Recently, researchers have started to explore the use of interactive feedback in computer-assisted self-interviewing (CASI), whereby respondents are prompted if satisficing behaviors are detected (e.g., respondents receive messages saying they are going too fast when their response time is quicker than a certain threshold). In particular, a small number of studies, mostly using online panels, have shown that such interactive feedback can effectively reduce targeted undesirable behaviors in Web surveys without a substantial increase in break-offs. While these findings are promising, it is not clear whether the same success would be observed with other survey populations, who may not be as motivated to complete surveys as panel respondents. Even more importantly, little is known as to whether this type of interactive feedback in self-administered surveys could affect perceived privacy and thus introduce social desirability bias in answers to sensitive questions. We will report findings from a CASI survey of mental health risk and resilience among Soldiers new to the U.S. Army. Response speed prompts were implemented in response to concerns about satisficing behavior. The speed prompts were introduced approximately one quarter of the way through the study. Since the monthly samples are independent and representative, a natural pre/post comparison is possible. Survey data will be compared before and after the implementation to evaluate whether these prompts can effectively influence response time and improve response quality (based on indicators such as item nonresponse, straightlining, and acquiescence). We will also assess whether the use of prompts could backfire, i.e., produce more break-offs and fewer reports of socially undesirable answers, given that the survey is voluntary and contains many sensitive questions (e.g., suicidal ideation).

Access/Direct link: Conference Homepage (abstract)

Year of publication: 2013
Bibliographic type: Conferences, workshops, tutorials, presentations
Full text availability: Further details
Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 68th Annual Conference, 2013 (88)
