Web Survey Bibliography
Title Influences on Item Response Times in a Multinational Web Survey
Year 2016
Access date 09.06.2016
Abstract
We model time to respond in web surveys of members of biology and physics departments in French, Italian, Turkish, and U.S. universities and
research institutes to understand factors associated with time to respond to survey items in a cross-national, multilingual context. Our findings identify points at which respondent attention diminishes, providing guidance on optimal length of item stems, response options, and survey length for similar populations. The Rice University Religion among Scientists in International Context (RASIC) survey included measures of time to respond. The survey provides a rich source of material. Respondent-level measures include biographical data such as age, academic rank, language of choice (the survey was offered in the native language and in English in non-U.S. locales), and country of origin. Item-level measures include length of item, reading difficulty, topic, number of responses, and position in the survey. Paradata include accumulated time spent on the survey and time of day. We find inflection points beyond which we see satisficing, in the form of diminished respondent attention, for two factors: number of words in item stems and time from the start of the survey. Differences in inflection points by language of survey are analyzed by respondent country of birth to understand variations for nonnative speakers. Variations in time of response are also seen for sequence of item in the instrument (controlling for time), question type, time of day, day of week, and academic rank. No effect is found for reading grade level, number of response options (controlling for words in response options), gender, inclusion of a “don’t know” option, scientific discipline, or restarting the survey. A hierarchical cross-classified model is used for analysis. Implications of these findings for questionnaire design are discussed. RASIC data collection was funded by the Templeton World Charity Foundation, grant TWCF0033.AB14, Elaine Howard Ecklund, PI, Kirstin RW Matthews and Steven W. Lewis, co-PIs.
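The cross-classified structure described above (responses nested simultaneously within respondents and within items, with an inflection point in the stem-length effect) can be sketched as a data-generating simulation. This is a minimal illustrative sketch, not the authors' model: all parameter values, the 30-word knot, and the effect sizes are assumptions chosen only to show the structure.

```python
import numpy as np

# Hypothetical sketch of a cross-classified response-time model:
# log(time) = fixed effects + respondent random effect + item random effect.
# All numeric values below are illustrative assumptions, not RASIC estimates.
rng = np.random.default_rng(0)

n_resp, n_items = 100, 40
stem_words = rng.integers(5, 60, size=n_items)            # item-level predictor
elapsed_min = rng.uniform(0, 30, size=(n_resp, n_items))  # time since survey start

u_resp = rng.normal(0, 0.3, size=n_resp)    # respondent random intercepts
v_item = rng.normal(0, 0.2, size=n_items)   # item random intercepts

# Piecewise ("inflection point") effect of stem length: the slope flattens
# after an assumed knot at 30 words, mimicking satisficing on long stems.
knot, slope_before, slope_after = 30, 0.02, 0.005
stem_effect = np.where(
    stem_words <= knot,
    slope_before * stem_words,
    slope_before * knot + slope_after * (stem_words - knot),
)

log_time = (
    1.5                       # grand mean (log seconds)
    + stem_effect[None, :]    # item-level fixed effect
    - 0.01 * elapsed_min      # attention declines with accumulated time
    + u_resp[:, None]         # crossed random effect: respondent
    + v_item[None, :]         # crossed random effect: item
    + rng.normal(0, 0.25, size=(n_resp, n_items))  # residual noise
)

print(log_time.shape)  # (100, 40): one log response time per respondent-item pair
```

Fitting such a model to real paradata would require software that supports crossed (non-nested) random effects, since respondents and items are not hierarchically nested in each other.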
Access/Direct link Conference Homepage (abstract)
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - Kolenikov, S. (7)
- Influences on Item Response Times in a Multinational Web Survey ; 2016; Phillips, B. T.; Kolenikov, S.; Howard Ecklund, E.; Ackermann, A.; Brulia, A.
- Mode Effects in American Trends Panel: Bayesian Analysis of a Cross-classified Item-person Mixed Model...; 2016; Gill, Je.; Kolenikov, S.; McGeeney, K.
- Evaluating Three Approaches to Statistically Adjust for Mode Effects; 2016; Kolenikov, S.; Kennedy, C.
- Future Training of Survey Methodologists; 2015; Kolenikov, S., Jans, M., O'Hare, B. C., Fricker, S.
- Influences on Response Latency in a Web Survey; 2015; Ackermann, A.; Cheng, H. W.; Howard Ecklund, E.; Kolenikov, S.; Phillips, B. T.
- Nonresponse Analysis and Adjustment in the Follow-Up Study of a National Cohort of Gulf War And Gulf...; 2015; Dursa, E.; Hammer, H.; Kolenikov, S.; Schneiderman, A. I.
- Mode effect analysis and adjustment in a split-sample mixed-mode Web/CATI survey; 2013; Kolenikov, S., Kennedy, C.