Web Survey Bibliography

Title: Does the Exposure to an Instructed Response Item Attention Check Affect Response Behavior?
Year: 2017
Access date: 13.04.2017

Relevance & Research Question: Providing high-quality answers requires respondents to thoroughly process survey questions. Accordingly, identifying inattentive respondents is a challenge for web survey methodologists. Instructed response items (IRIs) are one tool for detecting inattentive respondents. An IRI is included as one item in a grid and instructs respondents to mark a specific response category (e.g., “click strongly agree”). To date, it has not been established whether making respondents aware that they are being checked has positive or negative spill-over effects on their response behavior. Consequently, we investigated how exposure to an IRI attention check affects response behavior and, thus, answer quality.

Methods & Data: We rely on data from a web-based survey fielded in Germany in January 2013 (participation rate = 25.3%, N = 1,034). The sample was drawn from an offline-recruited access panel. We randomly split the sample into three groups: two treatment groups and one control group. Both treatment groups received an IRI in a grid with 7 items (at the beginning vs. at the end of the questionnaire); the control group received the same grid without the IRI. To assess the effect of being exposed to an IRI on data quality, we compared the following eight indicators of questionable response behavior across the three experimental groups: straightlining, speeding, choosing “don’t know”, item nonresponse, inconsistent answers, implausible answers, and respondents’ self-reported motivation and effort.
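As a minimal sketch (not the authors’ actual analysis code), two of the indicators above could be operationalized as follows. All function names, the 1–5 response coding, and the instructed category are hypothetical assumptions for illustration only:

```python
# Hypothetical sketch: flagging straightlining in a 7-item grid and
# failure of an instructed response item (IRI) attention check.
# A 1-5 response coding and the instructed category are assumptions.

def is_straightliner(grid_answers):
    """True if all non-missing answers in the grid are identical."""
    answers = [a for a in grid_answers if a is not None]
    return len(answers) > 1 and len(set(answers)) == 1

def failed_iri(iri_answer, instructed_code=5):
    """True if the respondent missed the instructed category,
    e.g. 'click strongly agree' assumed to be coded 5."""
    return iri_answer != instructed_code

# Example respondent: identical answers across the grid, IRI passed.
respondent = {"grid": [3, 3, 3, 3, 3, 3, 3], "iri": 5}
print(is_straightliner(respondent["grid"]))  # True: straightlined
print(failed_iri(respondent["iri"]))         # False: passed the check
```

Comparing the rates of such flags across the three experimental groups would then show whether IRI exposure shifted response behavior.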

Results: Overall, our study provided no evidence that exposure to an IRI affected response behavior. The only notable exception was straightlining: respondents who received an attention check at the beginning of the questionnaire straightlined less frequently in grid questions than respondents who were not made aware that they were being controlled.

Added Value: Our experimental study provides insights into the implications of using attention checks in surveys, a topic on which research is surprisingly sparse. While our study is encouraging in that using IRIs provoked no negative backlash, it also means that we did not find IRIs to raise respondents’ awareness and thereby enhance overall data quality.

Year of publication: 2017
Bibliographic type: Conferences, workshops, tutorials, presentations
