Web Survey Bibliography
Response latency in web surveys is of considerable interest. The time from stimulus (here, displaying a question) to response (here, recording the answer) is used to identify potentially problematic respondents (those with low latency) and items (those with high latency). Such uses are, however, typically narrowly construed. We analyze a wide variety of factors using a rich dataset to develop a deeper understanding of the drivers of response latency in web surveys. The Rice University Religion and Science in International Context (RASIC) survey of members of biology and physics departments in Italian universities and research institutes measured response latency for each survey item. The RASIC dataset is a rich source of material. Respondent-level measures include extensive biographical data such as age, academic rank, and language of choice (the survey was offered in Italian and English). Item-level measures include item length, reading difficulty, topic, number of response options, and position in the survey. Paradata include accumulated time spent on the survey, time of day, and device/browser used. The resulting dataset has respondent × item observations, with each observation nested within respondent (e.g., age, tenure) and item (e.g., item length, reading grade level). Due to this nesting, a hierarchical cross-classified model is used for analysis. Our findings will shed light on the impact of a broad range of factors associated with response latency, addressing questions including the effects of time of day, age, means of access, reading grade level, number of response options, and so on. These analyses will provide important context for the perhaps simplistic interpretations of response latency: low latency being a desirable trait for items but undesirable for a respondent.

Data collection utilized for this paper was funded by the Templeton World Charity Foundation, grant TWCF0033.AB14, Elaine Howard Ecklund, PI, Kirstin RW Matthews and Steven W. Lewis co-PIs.
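The cross-classified structure described above can be illustrated with a minimal simulation sketch: each log-latency observation is the sum of an overall mean, a respondent random effect, an item random effect, and residual noise, and variance components can be recovered by a method-of-moments decomposition for a balanced design. All names and variance values below are hypothetical, chosen for illustration; the abstract does not specify the estimation method or software used.

```python
import numpy as np

rng = np.random.default_rng(42)

n_resp, n_item = 200, 40  # hypothetical sample sizes

# Illustrative variance components on a log-seconds scale (not from RASIC).
resp_eff = rng.normal(0, 0.5, n_resp)            # respondent random effects
item_eff = rng.normal(0, 0.3, n_item)            # item random effects
resid = rng.normal(0, 0.4, (n_resp, n_item))     # residual noise

mu = 2.0  # overall mean log-latency
log_latency = mu + resp_eff[:, None] + item_eff[None, :] + resid

# Method-of-moments decomposition for a balanced crossed design.
grand = log_latency.mean()
row_means = log_latency.mean(axis=1)  # per-respondent means
col_means = log_latency.mean(axis=0)  # per-item means

# Residual variance from the interaction (double-centered) sum of squares.
var_e = ((log_latency - row_means[:, None] - col_means[None, :] + grand) ** 2
         ).sum() / ((n_resp - 1) * (n_item - 1))
# E[var(row_means)] = sigma_resp^2 + sigma_e^2 / n_item, and symmetrically for items.
var_r = row_means.var(ddof=1) - var_e / n_item
var_i = col_means.var(ddof=1) - var_e / n_resp

print(f"respondent variance: {var_r:.3f} (simulated truth 0.25)")
print(f"item variance:       {var_i:.3f} (simulated truth 0.09)")
print(f"residual variance:   {var_e:.3f} (simulated truth 0.16)")
```

In practice an unbalanced design with respondent- and item-level covariates would call for a fitted cross-classified mixed model (e.g., crossed random effects in a mixed-model package) rather than this closed-form decomposition, but the simulation shows the nesting logic the abstract describes.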
Web survey bibliography - Kolenikov, S. (7)
- Influences on Item Response Times in a Multinational Web Survey ; 2016; Phillips, B. T.; Kolenikov, S.; Howard Ecklund, E.; Ackermann, A.; Brulia, A.
- Mode Effects in American Trends Panel: Bayesian Analysis of a Cross-classified Item-person Mixed Model...; 2016; Gill, Je.; Kolenikov, S.; McGeeney, K.
- Evaluating Three Approaches to Statistically Adjust for Mode Effects; 2016; Kolenikov, S.; Kennedy, C.
- Future Training of Survey Methodologists; 2015; Kolenikov, S.; Jans, M.; O'Hare, B. C.; Fricker, S.
- Influences on Response Latency in a Web Survey; 2015; Ackermann, A.; Cheng, H. W.; Howard Ecklund, E.; Kolenikov, S.; Phillips, B. T.
- Nonresponse Analysis and Adjustment in the Follow-Up Study of a National Cohort of Gulf War And Gulf...; 2015; Dursa, E.; Hammer, H.; Kolenikov, S.; Schneiderman, A. I.
- Mode effect analysis and adjustment in a split-sample mixed-mode Web/CATI survey; 2013; Kolenikov, S.; Kennedy, C.