Web Survey Bibliography
Relevance & Research Question: For several years, the Federal Statistical Office (FSO) has been working on the systematic implementation of questionnaire testing. A pretest laboratory was established in 2007 and complemented by an eye tracker in 2009. Online survey questionnaires are now increasingly evaluated with qualitative testing methods and redesigned to reduce respondent burden and to increase the data quality of official statistics.
Methods & Data: Pretesting online questionnaires is intended to improve their usability, functionality and comprehensibility. At the FSO, a three-step approach is applied: Firstly, we observe eye movements and facial expressions in real time while respondents complete the questionnaire. Secondly, we conduct cognitive interviews afterwards in order to discover why respondents proceeded the way they did. Thirdly, we evaluate the self-completion process using eye tracking data (e.g. 'Areas of Interest') and the sequence of mouse clicks.
Results: Each source of information has its strengths and weaknesses. Generally, eye tracking data are challenging to analyze: it is, for example, difficult to assess whether a longer fixation duration indicates problems or simply a higher interest in a question. Consequently, the interpretation might be misleading without profound background knowledge. Conversely, results derived from cognitive interviews are of minor value if respondents' answers appear to be shaped by effects such as acquiescence, social desirability, or a limited capacity for remembering and verbalizing cognitive processes. By linking our sources of information ("triangulation"), we are able to provide more valid pretesting results and recommendations for improving online questionnaires.
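The 'Areas of Interest' evaluation mentioned above amounts to aggregating fixation durations over screen regions. The following minimal sketch illustrates that idea; the record layout, AOI names, and coordinates are illustrative assumptions, not the FSO's actual export format (real eye trackers log far richer data).

```python
# Illustrative AOI analysis: sum fixation durations per screen region.
# All names, coordinates, and sample data below are hypothetical.
from dataclasses import dataclass

@dataclass
class AOI:
    """A rectangular Area of Interest on the questionnaire page."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def fixation_time_per_aoi(fixations, aois):
    """Total fixation duration (ms) falling inside each AOI.

    `fixations` is a list of (x, y, duration_ms) tuples.
    """
    totals = {a.name: 0.0 for a in aois}
    for x, y, duration_ms in fixations:
        for a in aois:
            if a.contains(x, y):
                totals[a.name] += duration_ms
                break  # assign each fixation to at most one AOI
    return totals

# Hypothetical layout: question text on top, response options below.
aois = [AOI("question_text", 0, 0, 800, 100),
        AOI("response_options", 0, 100, 800, 300)]
fixations = [(120, 40, 310), (300, 60, 250), (200, 180, 420)]
print(fixation_time_per_aoi(fixations, aois))
# {'question_text': 560.0, 'response_options': 420.0}
```

As the abstract notes, such totals alone are ambiguous (problem vs. interest), which is why they are triangulated with the cognitive interviews.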
Added Value: When online questionnaires are tested at the FSO, cognitive interviews are conducted after the eye tracking session. Combining both methods has given us insights into users' behaviour when reading from the screen and into their expectations concerning navigation. The analyses show whether respondents notice links to detailed explanations, skip instructions, or skip entire lists of response options. General advice is provided on wording and design principles for improving online questionnaires. Our findings lead us directly to Steve Krug's (2006) maxim: "Don't make me think!"
Web Survey Bibliography - Germany (413)
- Mobile Befragungen: Was Big Data mit kleinen Geräten zu tun hat; 2012
- Item non-response in open-ended questions: Who does not answer on the meaning of left and right?; 2012; Scholz, E., Zuell, C.
- Innovation der Online-Datenerhebung für wissenschaftliche Forschungen. Das niederländische MESS-Projekt...; 2012; Das, M.
- Comparing Ranking Techniques in Web Surveys; 2012; Blasius, J.
- Design of CAWI Instruments for Social Surveys; 2012; Blanke, K.
- Enhancing Web Surveys With New HTML5 Input Types; 2012; Funke, F.
- GESIS Online Access Panel Pilot Study: Recruitment and Panel Maintenance; 2012; Kaczmirek, L., Bandilla, W., Schaurer, I., Struminskaya, B., Weyandt, K.
- The German Internet Panel: First Results from the Recruitment Phases; 2012; Blom, A. G.
- Assessing the Magnitude of Non-Consent Biases in Linked Survey and Administrative Data; 2012; Sakshaug, J. W., Kreuter, F.
- How Do Lotteries and Study Results Influence Response Behavior in Online Panels?; 2012; Goeritz, A., Luthe, S. C.
- Improving RDD Cell Phone Samples. Evaluation of Different Pre-call Validation Methods; 2012; Kunz, T., Fuchs, M.
- Marktforschung mit dem iPad-Panel von Axel Springer Media Impact; 2012
- Effects of Personalized Versus Generic Implementation of an Intra-Organizational Online Survey on Psychological...; 2012; Mueller, K., Straatmann, T., Hattrup, K., Jochum, M.
- Positioning of Clarification Features in Web Surveys: Evidence from Eye Tracking Data; 2012; Kunz, T., Fuchs, M.
- Using Adaptive Questionnaire Design in Open-ended Questions: A Fieldexperimental Study on the Size of...; 2012; Fuchs, M., Emde, M.
- Exploring New Pathways to Survey Recruitment; 2012; Bilgram, V., Stadler, D., Jawecki, G.
- Does Mode Matter? Initial Evidence from the German Longitudinal Election Study (GLES); 2012; Blumenstiel, J. E., Rossmann, J.
- Best of both worlds – The INSA study 50plus; 2012; Geissler, H., Blome, C.
- The “MediaLiveTracker” – A New Online-Tool for Real-Time-Response-Measurement; 2012; Kercher, J., Bachl, M., Voegele, C., Vohle, F.
- Surveytainment 2.0: Why investing 10 minutes more in constructing your questionnaire is worth considering...; 2012; Muehle, A., Tress, F., Schmidt, S., Winkler, T.
- Market research online community (MROC) versus focus group; 2012; Zuber, M.
- Data quality in MAWI and CAWI; 2012; Mavletova, A. M., Blasius, J.
- Can mobile-web surveys substitute classic web-surveys? Results from an exploratory, comparative method...; 2012; Bohn, A., Doering, N., Maxl, E.
- Scrutinizing Dynamics – Rolling panel waves in theory and practice; 2012; Faas, T., Blumenberg, J. N.
- The German Internet Panel: Design of a Probability-Based Online Survey; 2012; Blom, A. G., Gathmann, C., Holthausen, A., Riepe, C.
- The price we have to pay: Incentive experiments in the recruitment process for a probability-based online...; 2012; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Effects of number of response options in web surveys: The role of verbal labels; 2012; Thorsdottir, F., Fuchs, M., Jonsdottir, J.
- Little experience with technology as a cause of nonresponse in online surveys; 2012; Struminskaya, B., Schaurer, I., Kaczmirek, L., Bandilla, W.
- Comparing Item-Non-Response and Open Questions within different Web Survey Types; 2012; Silber, H., Lischewski, J., Leibold, J.
- Continuous large-scale volunteer web-surveys: The experience of Lohnspiegel and WageIndicator; 2012; Oez, F.
- Is Pretesting Established Among Online Survey Tool Users?; 2012
- An Evaluation of Two Non-Reactive Web Questionnaire Pretesting Methods; 2012; Lenzner, T.
- High potential for mobile Web surveys: Findings from a survey representative for German Internet users...; 2012; Funke, F., Wachenfeld, A.
- Better low-tech than sorry: How technophile questionnaires may affect psychological representativeness...; 2012; Funke, F., Reips, U. -D.
- Can Social Media Research replace traditional research methods?; 2012; Faber, T., Einhorn, M., Hofmann, O., Loeffler, M.
- Bad Boy Matrix Question – Whatcha gonna do when they come for you?; 2012; Tress, F.
- Effects of Static versus Dynamic Formatting Instructions for Open-Ended Numerical Questions in Web Surveys...; 2012; Kunz, T., Fuchs, M.
- FamilyVote – Conducting online surveys with children and families; 2012; Geissler, H., Peeters, H.
- New Insights about market research with an iPad-panel; 2012; Manouchehri, A., Rieber, D., Moughrabi, C.
- Asking Probing Questions in Web Surveys: Which factors have an Impact on the Quality of Responses?; 2012; Behr, D., Kaczmirek, L., Braun, M., Bandilla, W.
- Assessing the Quality of Survey Data; 2012; Blasius, J.
- Exploring Animated Faces Scales in Web Surveys: Drawbacks and Prospects; 2012; Emde, M., Fuchs, M.
- Reminders in Web-Based Data Collection: Increasing Response at the Price of Retention?; 2012; Goeritz, A., Crutzen, R.
- Mobile, webmail, desktops: Where are we viewing email now?; 2011
- Assessing personality traits through response latencies using item response theory; 2011; Ranger, J., Ortner, T. M.
- Web-based rating scales: HTML 5 and other innovations; 2011; Funke, F.
- German Web-based Registry for Eating Disorders; 2011; Gross, G., Birgegård, A., Zipfel, S.
- E-dater, Artificial Actors, and German Households; 2011; Hebing, M.
- Seeing Through the Eyes of the Respondent: An Eye-tracking Study on Survey Question Comprehension; 2011; Lenzner, A., Kaczmirek, L., Galesic, M.
- Eye Tracking in testing questionnaires: What’s the added value?; 2011; Tries, S.