Web Survey Bibliography
The apparently simple task of constructing answer formats involves many design decisions with consequences for the reliability and validity of survey questions. According to Parducci's theoretical framework, the range of scale points and their frequency form a frame for respondents' understanding of what is being asked and how the answer should be formulated. Subsequent research shows that answer scales influence respondents' understanding of 1) the meaning of the concept to be measured and 2) their assumptions about the distribution of the associated behaviour or opinion in the population. Consequently, designing answer scales is a complex task. When designing answer scales, one should consider the properties of the dimension to be measured, the appropriate scale characteristics, the mode of measurement, and the characteristics of the respondents. This presentation gives an overview of the literature addressing different questions related to the design of answer scales: Is an open-ended or a closed answer format appropriate? What is the optimal number of scale points? What about response order effects? How should labelling be applied? Are fully labelled scales preferable to numbered or end-labelled scales? Should the answer scale be unipolar or bipolar? Should a scale have a middle point and a "no opinion" option? For some of these aspects, e.g. the number of scale points or order effects, similar findings and design recommendations are available in the literature (e.g. maximal reliability with 5-7 scale points), while other aspects remain controversial. For example, some researchers prefer an open-ended question format when asking about the frequencies of certain behaviours, since the answers are more exact and the context effect of answer scales is absent. Others discuss the psychological problems associated with an open answer format, e.g. that respondents round their answers off and construct their own answer categories.
Based on this literature review, the consequences for the design of answer scales and open research questions are summarized.
Web survey bibliography - Kaczmirek, L. (43)
- Online Survey Software; 2017; Kaczmirek, L.
- Du kommst hier nicht rein: Türsteherfragen identifizieren nachlässige Teilnehmer in Online-Umfragen; 2016; Merkle, B., Kaczmirek, L., Hellwig, O.
- Mode System Effects in an Online Panel Study: Comparing a Probability-based Online Panel with two Face...; 2015; Struminskaya, B., De Leeuw, E. D., Kaczmirek, L.
- Item nonresponse in open-ended questions: Identification and reduction in web surveys; 2015; Kaczmirek, L., Behr, D.
- The Effectiveness of Mailed Invitations for Web Surveys and the Representativeness of Mixed-Mode versus...; 2014; Bandilla, W., Couper, M. P., Kaczmirek, L.
- Assessing representativeness of a probability-based online panel in Germany; 2014; Struminskaya, B., Kaczmirek, L., Schaurer, I., Bandilla, W.
- Incentives on demand in a probability-based online panel: redemption and the choice between pay-out...; 2014; Schaurer, I., Struminskaya, B., Kaczmirek, L.
- Does left still feel right? The optimal position of answer boxes in Web surveys - revisited; 2013; Lenzner, T., Kaczmirek, L., Galesic, M.
- The Effectiveness of Mailed Invitations for Web Surveys; 2013; Bandilla, W., Couper, M. P., Kaczmirek, L.
- GESIS Online Panel Pilot: Results from a Probability-Based Online Access Panel; 2013; Kaczmirek, L., Bandilla, W., Schaurer, I., Struminskaya, B., Weyandt, K.
- Effects of incentive reduction after a series of higher incentive waves in a probability-based online...; 2013; Struminskaya, B., Kaczmirek, L., Schaurer, I., Bandilla, W.
- Sample composition discrepancies in different stages of a probability-based online panel; 2013; Bosnjak, M., Haas, I., Galesic, M., Kaczmirek, L., Bandilla, W., Couper, M. P.
- Testing the Validity of Gender Ideology Items by Implementing Probing Questions; 2013; Behr, D., Braun, M., Kaczmirek, L., Bandilla, W.
- A Framework for the Collection of Universal Client Side Paradata (UCSP); 2012; Kaczmirek, L.
- Challenges of assessing the quality of a prerecruited probability-based panel of internet users in...; 2012; Struminskaya, B., Kaczmirek, L.
- Assessing Cross-National Equivalence of Measures of Xenophobia: Evidence from Probing in Web Surveys; 2012; Behr, D., Braun, M., Kaczmirek, L.
- Little experience with technology as a cause of nonresponse in online surveys; 2012; Struminskaya, B., Schaurer, I., Kaczmirek, L., Bandilla, W.
- Establishing Cross-National Equivalence of Measures of Xenophobia: Evidence from Probing in Web Surveys...; 2011; Braun, M., Behr, D., Kaczmirek, L.
- Online Research @ GESIS; 2011; Kaczmirek, L., Lenzner, T.
- Seeing Through the Eyes of the Respondent: An Eye-tracking Study on Survey Question Comprehension; 2011; Lenzner, T., Kaczmirek, L., Galesic, M.
- Asking sensitive questions in a recruitment interview for an online panel: the income question; 2011; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Cognitive process in answering questions: Are verbal labels in rating scales attended to?; 2011; Menold, N., Kaczmirek, L., Lenzner, T.
- Asking Sensitive Questions: Do They Affect Participation In Follow-Up Surveys?; 2011; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Which Technologies Do Respondents Use in Online Surveys – An International Comparison?; 2011; Kaczmirek, L., Behr, D., Bandilla, W.
- Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies; 2011; Das, M., Ester, P., Kaczmirek, L.
- Optimizing response rates in online surveys; 2010; Kaczmirek, L.
- Developing a Research Framework for Usability in Online Surveys: Human-Survey Interaction; 2010; Kaczmirek, L.
- Security and Data Protection: Collection, Storage, Feedback in Internet Research; 2010; Thiele, O., Kaczmirek, L.
- Is this e-mail relevant? An eyetracking experiment on how potential respondents read e-mail invitations...; 2009; Kaczmirek, L., Faaß, T., Galesic, M.
- Respondent-Oriented Interaction Design Reduces Item Nonresponse in Internet Surveys; 2009; Kaczmirek, L.
- Panel Discussion: Does Mixed Mode Help Us Increase Response Rates?; 2009; Kaczmirek, L.
- A literature review on constructing answer formats; 2009; Menold, N., Kaczmirek, L., Hoffmeyer-Zlotnik, J.
- Coverage- und Nonresponse-Effekte bei Online-Bevölkerungsumfragen; 2009; Bandilla, W., Kaczmirek, L., Blohm, M., Neubarth, W.
- Internet Survey Software Tools; 2008; Kaczmirek, L.
- Increasing item completion rates in matrix questions; 2008; Kaczmirek, L.
- Differences between respondents and nonrespondents in an Internet survey recruited from face-to-face...; 2007; Bandilla, W., Blohm, M., Kaczmirek, L., Neubarth, W.
- Nicht-reaktive Datenerhebung: Teilnahmeverhalten bei Befragungen mit Paradaten evaluieren. [Non reactive...; 2007; Kaczmirek, L., Neubarth, W.
- Response time measurement in the lab and on the Web: A comparison; 2007; Galesic, M., Reips, U.-D., Kaczmirek, L., Czienskowski, U., Liske, N., von Oertzen, T.
- Applications of the Document Object Model (DOM) in Web-Surveys; 2007; Neubarth, W., Kaczmirek, L.
- Sampling Bias: Face to face to Web; 2007; Bandilla, W., Blohm, M., Kaczmirek, L., Neubarth, W.
- A short introduction to usability in online surveys; 2006; Kaczmirek, L.
- Standards in Online Surveys. Sources for Professional Codes of Conduct, Ethical Guidelines and Quality...; 2005; Kaczmirek, L., Schulze, N.
- Web Surveys. A Brief Guide on Usability and Implementation Issues; 2005; Kaczmirek, L.