Web Survey Bibliography
The literature on human-computer interaction consistently stresses the importance of reducing the cognitive effort required of users interacting with a computer, in order to improve the experience and enhance usability and comprehension (e.g., Shneiderman, 1998). Applying this perspective to Web surveys, questionnaire designers are advised to strive for layouts that facilitate the response process and reduce the effort required to select an answer. In this paper, we examine whether placing the input fields (i.e., radio buttons or check boxes) to the left or to the right of the answer options in closed-ended questions enhances usability and facilitates responding. First, we discuss two opposing principles of how respondents may process these questions in Web surveys, one suggesting that the answer boxes be placed to the left of the answer categories and the other suggesting that they be placed to the right. Second, we report an eye-tracking experiment (N = 47), which examined whether Web survey responding is better described by one or the other of these two principles, and consequently whether one or the other layout is preferable in terms of usability. Our results indicate that the vast majority of respondents conform to the principle suggesting that the answer boxes be placed to the left of the response options. Moreover, respondents require less cognitive effort (operationalized by fixation times, fixation counts, and number of gaze switches between answer options and answer boxes) to select an answer in this layout.
Web survey bibliography - European Survey Research Association conference 2011, ESRA, Lausanne (35)
- Effects of speeding on satisficing in Mixed-Mode Surveys; 2011; Bathelt, S., Bauknecht, J.
- Quantifying Open-Ended Responses: Results from an Online Advertising Tracking Survey; 2011; Jacobe, A., Brewer, L., Vakalia, F., Turner, S., Marsh, S. M.
- Quality of responses to an open-ended question on a mixed-mode survey; 2011; Gibson, J., Vakalia, F., Turner, S.
- Open-ended questions in the context of temporary work research; 2011; Siponen, K.
- How do Respondents Perceive a Questionnaire? The Contribution of Open-ended Questions; 2011; Markou, E., Garnier, B.
- The Uses of Open-Ended Questions in Quantitative Surveys; 2011; Singer, E., Couper, M. P.
- Agree-Disagree Response Format versus Importance Judgment; 2011; Krebs, D.
- Testing a single mode vs a mixed mode design; 2011; Laaksonen, S.
- Germans' segregation preferences and immigrant group size: A factorial survey approach; 2011; Schlueter, E., Ullrich, J., Schmidt, P.
- Errors within web-based surveys: a comparison between two different tools for the analysis of tourist...; 2011; Polizzi, G., Oliveri, A. M.
- Benefits of Structured DDI Metadata across the Data Lifecycle: The STARDAT Project at the GESIS Data...; 2011; Linne, M., Brislinger, E., Zenk-Moeltgen, W.
- Microdata Information System MISSY; 2011; Bohr, J.
- The Use of Structured Survey Instrument Metadata throughout the Data Lifecycle; 2011; Hansen, S. E.
- DDI and the Lifecycle of Longitudinal Surveys; 2011; Hoyle, L., Wackerow, J.
- Dissemination of survey (meta)data in the LISS data archive; 2011; Streefkerk, M., Elshout, S.
- Underreporting in Interleafed Questionnaires: Evidence from Two Web Surveys; 2011; Medway, R., Viera Jr., L., Turner, S., Marsh, S. M.
- The use of cognitive interviewing methods to evaluate mode effects in survey questions; 2011; Gray, M., Blake, M., Campanelli, P., Hope, S.
- Does the direction of Likert-type scales influence response behavior in web surveys?; 2011; Keusch, F.
- Cross-country Comparisons: Effects of Scale Type and Response Style Differences; 2011; Thomas, R. K.
- Explaining more variance with visual analogue scales: A Web experiment; 2011; Funke, F.
- A Comparison of Branching Response Formats with Single Response Formats; 2011; Thomas, R. K.
- Different functioning of rating scale formats – results from psychometric and physiological experiments...; 2011; Koller, M., Salzberger, T.
- Cognitive process in answering questions: Are verbal labels in rating scales attended to?; 2011; Menold, N., Kaczmirek, L., Lenzner, T.
- Experiments on the Design of the Left-Right Self-Assessment Scale; 2011; Zuell, C., Scholz, E., Behr, D.
- Separating selection from mode effects when switching from single (CATI) to mixed mode design (CATI /...; 2011; Carstensen, J., Kriwy, P., Krug, G., Lange, C.
- Testing between-mode measurement invariance under controlled selectivity conditions; 2011; Klausch, L. T.
- Using propensity score matching to separate mode- and selection effects; 2011; Lugtig, P. J., Lensvelt-Mulders, G. J.
- A mixed mode pilot on consumer barometer; 2011; Taskinen, P., Simpanen, M.
- Separation of selection bias and mode effect in mixed-mode survey – Application to the face-to...; 2011; Bayart, C., Bonnel, P.
- Optimization of dual frame telephone survey designs; 2011; Slavec, A., Vehovar, V.
- A Comparison of CAPI and PAPI through a Randomized Field Experiment; 2011; De Weerdt, J.
- Flexibility of Web Surveys: Probing 'do-not-know' over the Phone and on the Web; 2011; Hox, J., de Leeuw, E. D.
- Changing research methods in Ukraine: CATI or Mixed-Mode Surveys?; 2011; Paniotto, V., Kharchenko, N.
- The effects of mixed mode designs on simple and complex analyses; 2011; Martin, P., Lynn, P.
- Measurement Error in Mixed Mode Surveys: Mode or Question Format?; 2011; de Leeuw, E. D., Hox, J.