Web Survey Bibliography
It is common survey practice to convert a series of yes/no (forced-choice) items in telephone surveys to check-all-that-apply items in web and mail surveys. However, relatively little is known about how these different question formats may influence answers. This paper reports results from two web experiments and a comparison paper experiment in which nine different questions, varying in substantive topic and type (opinion-based and fact/behavior-based), were asked in 16 experimental comparisons of the check-all and forced-choice formats. Our purpose was to determine whether this change in question format influenced the number of response options marked affirmatively within each question and why any differences might occur. Results revealed that in every instance respondents marked significantly more items in the forced-choice format than in the check-all format. Given these results, detailed analyses of response patterns within questions, answering times, and alternative question wordings were undertaken to examine which of three theories (satisficing, depth of processing, and acquiescence) best accounted for the response differences across question formats. These analyses indicated that the forced-choice format appears to invoke deeper processing and to eliminate the satisficing behavior that occurs among some respondents to the check-all format, while acquiescence does not seem to be an issue in the forced-choice format. Thus, the forced-choice question format appears to be a desirable alternative to the check-all format for multiple-answer questions. In addition, the findings reported here give ample reason for concern about the current practice of automatically converting items from the forced-choice format to the check-all format, or vice versa, when switching between telephone and paper or web surveys.
Conference program
Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 60th Annual Conference, 2005 (7)
- Unintended Consequences of Incentive Induced Response Rate Differences; 2005; Pope, D., Crawford, S. D., Johnson, E. O., McCabe, S. E.
- Mode Effects in Customer Satisfaction Measurement; 2005; Steiger, D. M., Keil, L., Gaertner, G.
- Vote Over-Reporting: Testing the Social Desirability Hypothesis in Telephone and Internet Surveys; 2005; Holbrook, A. L., Krosnick, J. A.
- Using the Web to Survey College Students: Institutional Characteristics That Influence Survey Quality...; 2005; Crawford, S. D., McCabe, S. E., Inkelas, K. K.
- Visual Context Effects in Web Surveys; 2005; Couper, M. P., Conrad, F. G., Tourangeau, R.
- Comparing Check-All and Forced-Choice Question Formats in Web Surveys: The Role of Satisficing, Depth...; 2005; Smyth, J. D., Dillman, D. A., Christian, L. M., Stern, M. J.
- Data Collection Mode Effects Controlling for Sample Origins in a Panel Study: Telephone versus Internet...; 2005; Dennis, J. M., Chatt, C., de Rouvray, C., Pulliam, P.