Web Survey Bibliography
Relevance & Research Question: Response latency measurement and eye tracking are two computer-assisted pretesting methods that may be particularly useful for evaluating Web questionnaires. In contrast to other techniques (e.g., expert reviews, qualitative interviews), both methods produce nonreactive, objective measures of behavior that are affected neither by the researcher (and the way she tests the questions) nor by the research context. While previous studies have shown that longer response latencies and longer fixation times are indicative of problematic questions (Lenzner et al., 2010, 2011), little is known about the utility of the two methods (or measures) in practical pretesting contexts (e.g., in testing draft questions). This study examines whether response latencies and fixation times can discriminate between flawed and improved versions of survey questions.
Methods & Data: In a laboratory experiment, respondents’ eye movements and response latencies were recorded while they answered one of two versions of a Web questionnaire. One group (n = 22) received a questionnaire containing poorly worded questions; the other group (n = 22) received the same questionnaire with improved question wordings. Because response latencies and fixation times vary considerably between individuals, we computed a baseline fixation rate (eye tracking) and a baseline reading rate (response latency) for every respondent from seven additional questions asked in the same Web survey. In the analyses, whenever a respondent’s response or fixation time for a question exceeded his or her baseline by more than 15%, the question was deemed problematic. (The analyses were repeated with 10%, 20%, and 25% thresholds, but all conclusions remained unchanged.)
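The baseline-threshold rule described above can be sketched as follows. This is a minimal illustration, not the study's actual analysis code; the function name, units, and example values are hypothetical.

```python
def is_problematic(question_time, baseline_time, threshold=0.15):
    """Flag a question as problematic when a respondent's time on it
    exceeds that respondent's personal baseline by more than `threshold`
    (15% by default, matching the study's main analysis)."""
    return question_time > baseline_time * (1 + threshold)

# Hypothetical respondent: baseline of 4.0 s per question.
# A question taking 4.8 s exceeds 4.0 * 1.15 = 4.6 s, so it is flagged;
# one taking 4.1 s is not.
flagged = is_problematic(4.8, 4.0)
not_flagged = is_problematic(4.1, 4.0)
```

Because the threshold is a parameter, the robustness checks reported above (10%, 20%, and 25%) amount to re-running the same rule with different `threshold` values.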
Results: Fixation rate (eye tracking) was consistently more accurate than reading rate (response latency) in classifying the questions as flawed or improved. The overall accuracy of the fixation rate ranged from 60% to 85%; the accuracy of the reading rate, from 43% to 70%. The eye-tracking measure also produced considerably fewer misses (failures to detect problems) and fewer false alarms.
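The accuracy, miss, and false-alarm figures above follow the standard signal-detection breakdown of a binary classification. A hedged sketch of that bookkeeping, with hypothetical labels rather than the study's data:

```python
def detection_metrics(predicted_flawed, actually_flawed):
    """Tally hits, misses, false alarms, and correct rejections for a set
    of question classifications, and derive overall accuracy.
    Illustrative helper only; not the study's analysis code."""
    hits = misses = false_alarms = correct_rejections = 0
    for pred, actual in zip(predicted_flawed, actually_flawed):
        if actual and pred:
            hits += 1                    # flawed question correctly flagged
        elif actual and not pred:
            misses += 1                  # flawed question not detected
        elif not actual and pred:
            false_alarms += 1            # improved question wrongly flagged
        else:
            correct_rejections += 1      # improved question correctly passed
    n = len(predicted_flawed)
    return {
        "accuracy": (hits + correct_rejections) / n,
        "misses": misses,
        "false_alarms": false_alarms,
    }

# Hypothetical run over four questions: two truly flawed, two improved.
metrics = detection_metrics(
    predicted_flawed=[True, False, True, False],
    actually_flawed=[True, True, False, False],
)
```

Accuracy alone can mask the asymmetry the study highlights: two measures with similar accuracy can differ sharply in how many problems they miss versus how often they raise false alarms.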
Added Value: This study suggests that fixation times and response latencies are potentially useful measures for pretesting (draft) Web questionnaires, although the accuracy with which they identify problematic questions is not yet satisfactory.
Web Survey Bibliography (6390)
- Research Design as an Influencing Factor for Reliability in Online Market Research; 2013; Wengrzik, J., Theuner, G.
- Ethics, privacy and data security in web-based course evaluation; 2013; Salaschek, M., Meese, C., Thielsch, M.
- Seducing the respondent – how to optimise invitations in on-site online research?; 2013; Póltorak, M., Kowalski, J.
- Influence of mobile devices in online surveys; 2013; Maxl, E., Baumgartner, T.
- E-questionnaire in cross-sectional household surveys; 2013; Karaganis, M.
- GESIS Online Panel Pilot: Results from a Probability-Based Online Access Panel; 2013; Kaczmirek, L., Bandilla, W., Schaurer, I., Struminskaya, B., Weyandt, K.
- Online Survey – Research with children on advertising impact; 2013; Funkenweh, V., Busch, J., Amthor, A. L., Boeer, A., Gaedke, J.
- HTML5 and mobile Web surveys: A Web experiment on new input types; 2013; Funke, F.
- Metadata on the demographics of online research: Results from a full-range study of available online...; 2013; Burger, C., Stieger, S.
- How the screen-out influence the dropout of a commercial panel; 2013; Bartoli, B.
- Beyond methodology - some ethical implications of "doing research online"; 2013; Heise, N.
- Innovation in Data Collection: the Responsive Design Approach; 2013; Bianchi, A., Biffignandi, S.
- Break-off and attrition in the GIP amongst technologically experienced and inexperienced participants...; 2013; Blom, A. G., Bossert, D., Clark, V., Funke, F., Gebhard, F., Holthausen, A., Krieger, U., Wachenfeld...
- Nonresponse and Nonresponse Bias in a Probability-Based Internet Panel; 2013; Blom, A. G., Bossert, D., Funke, F., Gebhard, F., Holthausen, A., Krieger, U.
- Rewards - Money for Nothing?; 2013; Cape, P. J., Martin, P.
- Effects of incentive reduction after a series of higher incentive waves in a probability-based online...; 2013; Struminskaya, B., Kaczmirek, L., Schaurer, I., Bandilla, W.
- Timing of Nonparticipation in an Online Panel: The effect of incentive strategies; 2013; Douhou, S., Scherpenzeel, A.
- Mixed-mode including web: Recent developments at Statistics Netherlands; 2013; Luiten, A., Schouten, B.
- Web coverage in the UK and its potential impact on general population web surveys; 2013; Callegaro, M.
- Surveys on Mobile Devices: Opportunities and Challenges; 2013; Couper, M. P.
- Measurement effects in mixed-mode panel surveys; 2013; Lugtig, P. J.
- Life history calendars - a viable method for web-based data collection?; 2013; Glasner, T., van der Vaart, W.
- Measurement issues in web surveys: An overview of opportunities and challenges; 2013; Calderwood, L.
- Experiences from a probability-based Internet panel: Sample, recruitment and participation; 2013; Scherpenzeel, A.
- Participation and engagement in web surveys of the general population: An overview of challenges and...; 2013; Roberts, C.
- Using Web Survey Panels to Estimate Population Characteristics: A Comparison of Alternative Approaches...; 2013; Rivers, D.
- Online Research, Game On!; 2013; Puleston, J.
- The ONS Beyond 2011 Programme & possible implications for social surveys; 2013; Morris, L.
- Issues of Coverage and Sampling in Web Surveys for the General Population: An Overview; 2013; Lynn, P.
- Use of a Social Networking Web Site for Recruiting Canadian Youth for Medical Research; 2013; Chu, J. L., Snider, C. E.
- Comparison of web-based versus paper-and-pencil administration of a humor survey; 2013; Wang, C.-C., Cheng, C.-L., Liu, K.-S., Cheng, Y.-Y.
- The Design of Grids in Web Surveys; 2013; Couper, M. P., Tourangeau, R., Conrad, F. G., Zhang, C.
- The smartphone in survey research: experiments for time use data; 2013; Fernee, H., Scherpenzeel, A.
- Survey Research; 2013; Abbott, M. L., McKinney, J.
- Understanding and Applying Research Design; 2013; Abbott, M. L., McKinney, J.
- Large-Scale Analysis and Testing; 2013; Cao, M., Zhang, Q.
- The Science of Web Surveys; 2013; Tourangeau, R., Conrad, F. G., Couper, M. P.
- How to create online questionnaires: A beginner's guide to survey design for businesses and students...; 2013; Lipscomb, L.
- True experimental data collection on the Internet; 2013; Reips, U. -D., Krantz, J. H.
- Virtual Research Methods; 2013; Hine, C.
- Askito: An open source Web questionnaire tool; 2013; Reips, U. -D., Heilmann, T.
- Informed Consent for Web Paradata Use; 2013; Couper, M. P., Singer, E.
- Measurement invariance and quality of composite scores in a face-to-face and a web survey; 2013; Revilla, M.
- Exploring Response Differences between Face-to-Face and Web Surveys: A Qualitative Comparative Analysis...; 2013; Bennink, M., Moors, G., Gelissen, J.
- 'Ready to complete the survey on Facebook': Web 2.0 as a research tool in business studies; 2013; Gregori, A., Baltar, F.
- Surveying “difficult-to-sample” backpackers through Facebook? Employing a mixed-mode dual...; 2013; Morris Paris, C.
- The Use of Mixed Methods in Organizational Communication Research; 2013; Salem, P. J.
- The Use of E-Questionnaires in Organizational Surveys; 2013; Brender-Ilan, Y., Vinitzky, G.
- Online Instruments, Data Collection, and Electronic Measurements: Organizational Advancements; 2013; Bocarnea, M. C., Reynolds, R. A., Baker, J. D.
- Convenient yet not a convenience sample: Jury pools as experimental subject pools; 2013; Murray, G. R., Rugeley, C. R., Mitchell, D.-G., Mondak, J. J.