
Web Survey Bibliography

Title: Influences on Response Latency in a Web Survey
Year: 2015
Access date: 08.07.2015
Abstract

Response latency in web surveys is of considerable interest. The time from the stimulus (here, displaying a question) to the response (here, recording the answer) is used to identify potentially problematic respondents (those with low latency) and items (those with high latency). Such uses are, however, typically narrowly constructed. We analyze a wide variety of factors using a rich dataset to develop a deeper understanding of the drivers of response latency in web surveys. The Rice University Religion and Science in International Context (RASIC) survey of members of biology and physics departments in Italian universities and research institutes measured response latency for each survey item. The RASIC dataset is a rich source of material. Respondent-level measures include extensive biographical data, such as age, academic rank, and language of choice (the survey was offered in Italian and English). Item-level measures include item length, reading difficulty, topic, number of response options, and position in the survey. Paradata include accumulated time spent on the survey, time of day, and device/browser used. The resulting dataset consists of respondent × item observations, with each observation nested within both a respondent (e.g., age, tenure) and an item (e.g., item length, reading grade level). Due to this nesting, a hierarchical cross-classified model is used for analysis. Our findings will shed light on the impact of a broad range of factors associated with response latency, addressing questions such as the effects of time of day, age, means of access, reading grade level, and number of response options. These analyses will provide important context for the perhaps simplistic standard interpretations of response latency: low latency is a desirable trait in an item but an undesirable one in a respondent. Data collection for this paper was funded by the Templeton World Charity Foundation, grant TWCF0033.AB14 (Elaine Howard Ecklund, PI; Kirstin RW Matthews and Steven W. Lewis, co-PIs).
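
For illustration only, the sketch below shows one common way to fit a cross-classified (crossed random effects) model of the kind described in the abstract, where each latency observation is crossed by respondent and item. It uses simulated data and statsmodels' variance-components formulation of crossed random effects; every column name and coefficient is a hypothetical placeholder, not taken from the RASIC data or the authors' analysis.

```python
# Minimal sketch (assumed, not the authors' code) of a cross-classified model for
# response latency: each observation is crossed by respondent and item, so both
# factors get their own random intercept. Column names (latency_s, respondent_id,
# item_id, age, item_length) are hypothetical stand-ins for RASIC variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_resp, n_item = 30, 12

# Long format: one row per respondent x item combination.
resp = np.repeat(np.arange(n_resp), n_item)
item = np.tile(np.arange(n_item), n_resp)

resp_re = rng.normal(0.0, 2.0, n_resp)   # respondent random intercepts
item_re = rng.normal(0.0, 3.0, n_item)   # item random intercepts
age = rng.integers(25, 70, n_resp)       # respondent-level covariate
item_len = rng.integers(5, 40, n_item)   # item-level covariate (e.g., words)

df = pd.DataFrame({
    "respondent_id": resp,
    "item_id": item,
    "age": age[resp],
    "item_length": item_len[item],
    "latency_s": (5.0 + 0.05 * age[resp] + 0.2 * item_len[item]
                  + resp_re[resp] + item_re[item]
                  + rng.normal(0.0, 1.0, n_resp * n_item)),
    "one": 1,  # constant grouping column: puts all rows in a single group
})

# Crossed (cross-classified) random effects expressed as variance components
# under a single all-encompassing group; re_formula="0" drops the redundant
# group-level random intercept.
vc = {"respondent": "0 + C(respondent_id)", "item": "0 + C(item_id)"}
model = sm.MixedLM.from_formula(
    "latency_s ~ age + item_length",
    groups="one",
    vc_formula=vc,
    re_formula="0",
    data=df,
)
print(model.fit().summary())
```

The fitted summary reports fixed effects for the respondent- and item-level covariates plus separate variance components for respondents and items, which is the basic output a cross-classified analysis of latency would rest on.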

Year of publication: 2015
Bibliographic type: Conferences, workshops, tutorials, presentations

Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 70th Annual Conference, 2015