Web Survey Bibliography
For self-administered instruments, questions asking if respondents engaged in a series of different behaviors can be formatted using:
Check-all response formats, in which respondents are asked to mark all of the categories that apply, under the assumption that marked categories are equivalent to "yes" and unmarked categories are equivalent to "no";
Check-list response formats, in which the questions are presented in a grid or matrix and "yes" and "no" are offered as explicit choices; or
Stand-alone response formats, in which each question is presented separately with its own set of "yes" and "no" categories.
For web surveys, the small body of research comparing check-all to check-list formats shows that check-lists are associated with respondents endorsing more of the individual items (yielding higher counts of the behaviors) and taking longer to answer the questions (possibly reflecting deeper processing of the items). However, other research on web surveys demonstrates that long questions and grid/matrix formats may encourage breakoffs. Thus, a plausible alternative to either the check-all or check-list format is to present the items as stand-alone questions. To the best of our knowledge, no one has compared all three of these response formats simultaneously in a web survey, and that comparison is the focus of the current experiment. We examined the three response formats for several series of questions about news consumption and social activities in a web survey of college students (RR1 = 18.3%), and we examine how the formats differ with regard to the following outcomes: breakoff rates, endorsement of the individual items, completion times, primacy effects, and criterion validity.
Web Survey Bibliography - Dykema, J. (11)
- Associations Between Interactional Indicators of Problematic Questions and Systems for Coding Question...; 2013; Dykema, J., Schaeffer, N. C., Garbarski, D.
- Effects of E-Mailed Versus Mailed Invitations and Incentives on Response Rates, Data Quality, and Costs...; 2013; Dykema, J., Stevenson, J., Klein, L., Kim, Y., Day, B.
- What are the Odds? Lotteries versus Cash Incentives. Response Rates, Cost and Data Quality for a Web...; 2012; Stevenson, J., Dykema, J., Klein, L., Cyffka, K., Goldrick-Rab, S.
- Effects of Post-Incentives on Response Rates, Costs, and Response Quality in a Web Survey of College...; 2011; Stevenson, J., Dykema, J., Cyffka, K., Klein, L., Goldrick-Rab, S.
- Questions for Surveys: Current Trends and Future Directions; 2011; Schaeffer, N. C.
- Incentives, Research-based Best Practices; 2011; Dykema, J.
- Effects of Incentives and Prenotification on Response Rates and Costs in a National Web Survey of Physicians...; 2011; Bonham, V., Day, B., Dykema, J., Sellers, S., Stevenson, J.
- Designing Questions for Web Surveys: Effects of Check-List, Check-All, and Stand-Alone Response Formats...; 2011; Dykema, J., Schaeffer, N. C., Beach, J., Lein, V., Day, B.
- Effects of Mode and Incentives on Response Rates, Costs, and Response Quality in a Mixed Mode Survey...; 2011; Stevenson, J., Dykema, J., Kniss, C., Black, P., Moberg, P.
- The Use of Advance Contact, Monetary Incentives, and Lotteries to Increase Response Rates in a Web Survey...; 2009; Stevenson, J., Dykema, J., Day, D., Bonham, V., Sellers, S.
- Face-to-Face Surveys; 2008; Dykema, J., Basson, D., Schaeffer, N. C.