Web Survey Bibliography
Title Influences on Item Response Times in a Multinational Web Survey
Year 2016
Access date 09.06.2016
Abstract
We model time to respond in web surveys of members of biology and physics departments in French, Italian, Turkish, and U.S. universities and research institutes to understand factors associated with time to respond to survey items in a cross-national, multilingual context. Our findings identify points at which respondent attention diminishes, providing guidance on the optimal length of item stems, response options, and the survey as a whole for similar populations.

The Rice University Religion among Scientists in International Context (RASIC) survey included measures of time to respond and provides a rich source of material. Respondent-level measures include biographical data such as age, academic rank, language of choice (the survey was offered in the native language and in English in non-U.S. locales), and country of origin. Item-level measures include item length, reading difficulty, topic, number of response options, and position in the survey. Paradata include accumulated time spent on the survey and time of day. We find inflection points beyond which satisficing appears, in the form of diminished respondent attention, for the number of words in item stems and for time elapsed since the start of the survey. Differences in inflection points by survey language are analyzed by respondent country of birth to understand variations for nonnative speakers. Variations in response time by item sequence in the instrument (controlling for time), question type, time of day, day of week, and academic rank are also seen. No effect is found for reading grade level, number of response options (controlling for words in response options), gender, inclusion of a “don’t know” option, scientific discipline, or restarting the survey. A hierarchical cross-classified model is used for analysis. Implications of these findings for questionnaire design are discussed.

RASIC data collection was funded by the Templeton World Charity Foundation, grant TWCF0033.AB14; Elaine Howard Ecklund, PI; Kirstin RW Matthews and Steven W. Lewis, co-PIs.
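To make the analytic approach concrete, the following is a minimal sketch, not the authors' implementation, of a cross-classified mixed model for log item response times in Python with statsmodels. The long-format data frame and its columns (resp_id, item_id, log_rt, stem_words, elapsed_min, lang) are hypothetical, and the 30-word and 20-minute knots used for the piecewise inflection-point terms are illustrative placeholders rather than values reported by the study.

```python
# Minimal sketch of a cross-classified mixed model for log response times.
# Assumes a hypothetical long-format DataFrame `df` with one row per
# respondent-item pair and columns: resp_id, item_id, log_rt, stem_words,
# elapsed_min, lang. Knot locations (30 words, 20 minutes) are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf


def fit_crossed_model(df: pd.DataFrame):
    df = df.copy()
    # statsmodels fits crossed random effects via a single all-inclusive group
    # with respondent and item intercepts declared as variance components.
    df["all_obs"] = 1
    # Broken-stick (piecewise linear) terms allow the slope to change at an
    # assumed inflection point for stem length and elapsed survey time.
    df["stem_excess"] = np.maximum(df["stem_words"] - 30, 0)
    df["time_excess"] = np.maximum(df["elapsed_min"] - 20, 0)
    model = smf.mixedlm(
        "log_rt ~ stem_words + stem_excess + elapsed_min + time_excess + C(lang)",
        data=df,
        groups="all_obs",
        re_formula="0",  # no random intercept for the artificial single group
        vc_formula={"resp": "0 + C(resp_id)", "item": "0 + C(item_id)"},
    )
    return model.fit()
```

A post-knot coefficient (stem_excess or time_excess) that differs from the pre-knot slope indicates a change in the response-time trend beyond the assumed knot, which is the kind of inflection-point evidence the abstract describes.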
Access/Direct link Conference Homepage (abstract)
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (4086)
- Respondent Processing of Rating Scales and the Scale Direction Effect ; 2016; Caporaso, A.
- The Effects of Pictorial vs. Verbal Examples on Survey Responses ; 2016; Sun, H.; Bertling, J.; Almonte, D.
- Evaluating Grid Questions for 4th Graders; 2016; Maitland, A.
- Mixing Modes: Challenges (and Tradeoffs) of Adapting a Mailed Paper Survey to the Web ; 2016; Wilkinson-Flicker, S.; McPhee, C. B.; Medway, R.; Kaiser, A.; Cutts, K.
- An Examination of How Survey Mode Affects Eligibility, Response and Health Condition Reporting Rates...; 2016; Stern, M. J.; Ghandour, R.
- Investigating Measurement Error through Survey Question Placement ; 2016; Wilson, A.; Wine, J.; Janson, N.; Conzelmann, J.; Peytcheva, E.
- Instructions in Self-administered Survey Questions: Do They Improve Data Quality or Just Make the Questionnaire...; 2016; Redline, C. D.; Zukerberg, A.; Owens, C.; Ho, A.
- Usability Testing within Agile Process; 2016; Holland, T.
- Exploring Why Web Surveys Take Longer to Complete on Smartphones than PCs: Findings from a Within-subjects...; 2016; Antoun, C.; Cernat, A.
- Making Mobile Web Surveys Accessible; 2016; Malakhoff, L.
- Association of Eye Tracking with Other Usability Metrics ; 2016; Olmsted, E. L.
- Cognitive Probing Methods in Usability Testing – Pros and Cons; 2016; Nichols, E. M.
- Grids and Online Surveys: Do More Complex Grids Induce Survey Satisficing? Evidence from the Gallup...; 2016; Wang, Me.; McCutcheon, A. L.
- Assessing the Accuracy of 51 Nonprobability Online Panels and River Samples: A Study of the Advertising...; 2016; Yang, Y.; Callegaro, M.; Chin, K.; Villar, A.; Krosnick...
- Calculating Standard Errors for Nonprobability Samples when Matching to Probability Samples ; 2016; Lee, Ad.; ZuWallack, R. S.
- Communicating Data Use and Privacy: In-person versus Web based methods for message testing ; 2016; Clark Fobia, A.; Hunter Childs, J. E.
- User Experience and Eye-tracking: Results to Optimize Completion of a Web Survey and Website Design ; 2016; Walton, L.; Ricci, K.; Libman Barry, A.; Eiginger, C.; Christian, L. M.
- Estimated-control Calibrated Estimates from Nonprobability Surveys; 2016; Dever, J. A.
- Decomposing Selection Effects in Non-probability Samples ; 2016; Mercer, A. W.; Keeter, S.; Kreuter, F.
- The Effect of Emphasizing the Web Option in a Mixed-mode Establishment Survey ; 2016; O'Brien, J.; Rajapaksa, S.; Schafer, B.; Langetieg, P.
- A Multi-phase Exploration Into Web-based Panel Respondents: Assessing Differences in Recruitment, Respondents...; 2016; Redlawsk, D.; Rogers, K.; Borie-Holtz, D.
- Effect of Clarifying Instructions on Response to Numerical Open-ended Questions in Self-administered...; 2016; Kumar Chaudhary, A.; Israel, G. D.
- Exploring the Feasibility of Using Facebook for Surveying Special Interest Populations ; 2016; Lee, C.; Jang, S.
- National Estimates of Sexual Minority Women Alcohol Use through Web Based Respondent Driven Sampling...; 2016; Farrell Middleton, D.; Iachan, R.; Freedner-Maguire, N.; Trocki, K.; Evans, C.
- Bringing Fair Market Rent Surveys into the 21st Century – Evaluating the Effectiveness of MSG...; 2016; Dayton, J.; Brassell, T.; Cooper, V.; Dion, R.; Williams, R.
- Measuring Survey Behavior of Smartphone Users; 2016; Luks, S.; Phillips, R.
- Practical Considerations for Using Vignettes to Evaluate Survey Items ; 2016; Steiger, D. M.; Williams, Do.; Edwards, W. S.; Cantor, D.; Truman, J.
- Using Web Panels to Quantify the Qualitative: The National Center for Health Statistics Research and...; 2016; Scanlon, P. J.
- Impact of Field Period Length in the Estimates of Sexual Victimization in a Web-based Survey of College...; 2016; Berzofsky, M.; Peterson, K.; Shook-Sa, B. E.; Lindquist, C.; Krebs, C.
- Longitudinal Online Ego-centric Social Network Data Collection with EgoWeb 2.0 ; 2016; Amin, A.; Kennedy, D.
- Influences on Item Response Times in a Multinational Web Survey ; 2016; Phillips, B. T.; Kolenikov, S.; Howard Ecklund, E.; Ackermann, A.; Brulia, A.
- QR Codes for Survey Access: Is It Worth It?; 2016; Allen, L.; Marlar, J.
- An Exploration of the Relationship between Usability Testing and Data Verification ; 2016; Langer Tesfaye, C.; Kurmlavage, V.
- Beyond the Survey: Improving Data Insights and User Experience with Mobile Devices ; 2016; Graham, P.; Lew, G.
- User Experience Considerations for Contextual Product Surveys on Smartphones ; 2016; Sedley, A.; Mueller, H.
- The Differential Effect of Mobile-friendly Surveys on Data Quality; 2016; Horwitz, R.
- Embedding Survey Questions within Non-research Mobile Apps: A Method for Collecting High-quality Data...; 2016; Bapna, V.; Antoun, C.
- Does Changing Monetary Incentive Schemes in Panel Studies Affect Cooperation? A Quasi-experiment on...; 2016; Schaurer, I.; Bosnjak, M.
- Survey Mode and Mail Method: A Practical Experiment in Survey Fielding for a Multi-round Survey ; 2016; Sullivan, B. D.; Duda, N.; Bogen, K.; Clusen, N. A.; Wakar, B.; Zhou, H.
- Web Probing for Question Evaluation: The Effects of Probe Placement ; 2016; Fowler, S.; Willis, G. B.; Moser, R. P.; Townsend, R. L. M.; Maitland, A.; Sun, H.; Berrigan, D.
- Early-bird Incentives: Results From an Experiment to Determine Response Rate and Cost Effects ; 2016; De Santis, J.; Callahan, R.; Marsh, S.; Perez-Johnson, I.
- Using Cash Incentives to Help Recruitment in a Probability Based Web Panel: The Effects on Sign Up Rates...; 2016; Krieger, U.
- Assessing Changes in Coverage Bias of Web Surveys as Internet Access Increases in the United States...; 2016; Sterrett, D.; Malato, D.; Benz, J.; Tompson, T.; English, N.
- Timing is Everything: Discretely Discouraging Mobile Survey Response through the Timing of Email Contacts...; 2016; Richards, A.; Shook-Sa, B. E.; Berzofsky, M.; Smith, A. C.
- Dynamic Instructions in Check-All-That-Apply Questions ; 2016; Kunz, T.; Fuchs, M.
- Patterns of Unit and Item Nonresponse in a Multinational Web Survey ; 2016; Ackermann, A.; Howard Ecklund, E.; Phillips, B. T.; Brulia, A.
- Debunking Myths About the Quality of Industry and Occupation Data Collected Through Self-administered...; 2016; Hurwitz, F. I.; Stein, J.; Skaff, A. L.
- Desktops, Tablets and Phones, Oh My! Device Preference for Web Based Surveys ; 2016; Schy, S.; Ghirardelli, A.; Morrison, H.
- Assessing Potential Bias in Respondent-driven Incident Based Data from a Web Survey of College Students...; 2016; Peterson, K.; Berzofsky, M.; Shook-Sa, B. E.; Krebs, C.; Lindquist, C.
- Making Connections on the Internet: Online Survey Panel Communications ; 2016; Libman Barry, A.; Eiginger, C.; Walton, L.; Ricci, K.