Web Survey Bibliography
Relevance & Research Question: Response latency measurement and eye tracking are two computer-assisted pretesting methods that may be particularly useful for evaluating Web questionnaires. In contrast to other techniques (e.g., expert reviews, qualitative interviews), both methods produce nonreactive and objective measures of behavior that are affected neither by the researcher (and the ways in which she tests the questions) nor by the research context. While previous studies have shown that longer response latencies and fixation times are indicative of problematic questions (Lenzner et al., 2010, 2011), little is known about the utility of the two methods (or measures) in practical pretesting contexts (e.g., in testing draft questions). This study examines whether response latencies and fixation times can reliably distinguish flawed from improved survey questions.
Methods & Data: In a laboratory experiment, respondents' eye movements and response latencies were recorded while they answered two versions of a Web questionnaire. One group (n=22) received a questionnaire including poorly worded questions, and the other group (n=22) received the same questionnaire with improved question wordings. Given that response latencies and fixation times are highly individual, we computed the baseline fixation rate (eye tracking) and baseline reading rate (response latency) for every respondent from seven additional questions asked in the same Web survey. In the analyses, whenever the response or fixation time for a question exceeded the respondent's baseline by more than 15%, the question was deemed problematic. (The analyses were repeated with 10%, 20%, and 25% thresholds, but all conclusions remained unchanged.)
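The classification rule described above can be sketched in code. This is an illustrative reconstruction, not the authors' analysis script: the helper names, the per-character normalization, and the example timing values are all assumptions made for the sketch; only the baseline-plus-15% decision rule comes from the abstract.

```python
# Sketch of the study's decision rule: a question is flagged as
# problematic when a respondent's (length-normalized) reading or
# fixation time exceeds their personal baseline by more than a
# threshold (15% in the study; 10%, 20%, and 25% were also tested).

def baseline_rate(times_per_char):
    """Respondent's baseline: mean per-character time over the
    seven additional baseline questions."""
    return sum(times_per_char) / len(times_per_char)

def is_problematic(question_time, question_length, baseline, threshold=0.15):
    """True if the per-character time exceeds the baseline by > threshold."""
    rate = question_time / question_length
    return rate > baseline * (1 + threshold)

# Hypothetical respondent: per-character times from seven baseline questions
baseline = baseline_rate([0.050, 0.055, 0.048, 0.052, 0.049, 0.051, 0.053])

# A 100-character question read in 6.9 s exceeds baseline * 1.15 -> flagged
print(is_problematic(question_time=6.9, question_length=100,
                     baseline=baseline))  # -> True
```

Normalizing by question length is one plausible way to make times comparable across questions of different lengths; the abstract itself does not specify how times were standardized beyond the per-respondent baseline.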
Results: Fixation rate (eye tracking) was consistently more accurate than reading rate (response latency) in classifying the questions as flawed or improved. The overall accuracy of the fixation rate ranged from 60% to 85%, the accuracy of the reading rate from 43% to 70%. Also, the eye tracking measure resulted in considerably fewer misses (failures to detect problems) and fewer false alarms.
Added Value: This study suggests that fixation times and response latencies are potentially useful methods for pretesting (draft) Web questionnaires, although the accuracy with which they identify problematic questions is not yet satisfactory.
Web Survey Bibliography - Usability, HCI (409)
- Tips for Evaluating Online Effectiveness; 2013; Stevenson, S. C.
- Using Web Surveys for Psychology Experiments: A Case Study in New Media Technology for Research; 2013; Peden, B. F., Tiry, A. M.
- The Distinctiveness of Online Research: Descriptive Assemblages, Unobtrusiveness, and Novel Kinds of...; 2013; Lanfrey, D.
- Advancing Research Methods with New Technologies; 2013; Sappleton, N.
- Compared to a small, supervised lab experiment, a large, unsupervised web-based experiment on a previously...; 2013; Ryan, R. S., Wilde, M., Crist, S.
- From mixed-mode to multiple devices. Web surveys, smartphone surveys and apps: has the respondent gone...; 2013; Callegaro, M.
- Moving an established survey online – or not?; 2013; Barber, T., Chilvers, D., Kaul, S.
- Using mobile devices to access the realities of youth: How identification with society influences political...; 2013; Smith, M.
- By the Numbers: Theory of adaptation or survival of the fittest?; 2013; Cavallaro, K.
- Modular Survey Design: A Bite Size Proposal; 2013; Kelly, F., Stevens, S., Johnson, A.
- Cyborgs vs. Monsters: Assembling Modular Surveys to Create Complete Datasets; 2013; Johnson, E. P., Siluk, L., Tarraf, S.
- Do I Have Your Full Attention?; 2013; Cape, P. J.
- Optimizing Surveys for Smartphones: Maximizing Response Rates While Minimizing Bias; 2013; Lattery, K., Park Bartolone, G., Saunders, T.
- Shorter Isn't Always Better; 2013; Burdein, I.
- Solving the Unintentional Mobile Challenge; 2013; Peterson, G., Mechling, J., LaFrance, J., Ham, G.
- Mobile Research Risk: What Happens to Data Quality When Respondents Use a Mobile Device for a Survey...; 2013; Baker-Prewitt, J.
- A standard for test reliability in group research; 2013; Ellis, J. L.
- The comparison of road safety survey answers between web-panel and face-to-face; Dutch results of SARTRE...; 2013; Goldenbeld, C., de Craen, S.
- Addressing Disclosure Concerns and Analysis Demands in a Real-Time Online Analytic System; 2013; Krenzke, T., Gentleman, J. F., Li, J., Moriarity, C.
- Examination of the equivalence of self-report survey-based paper-and-pencil and internet data collection...; 2013; Weigold, A., Weigold, I. K., Russell, E. J.
- Using Online and Paper Surveys - The Effectiveness of Mixed-Mode Methodology for Populations Over 50; 2013; De Bernardo, D. H., Curtis, A.
- Who responds to website visitor satisfaction surveys?; 2013; Andreadis, I.
- Comparison of psychometric properties of internet versions of the Marlowe-Crowne Social Desirability...; 2013; Vesteinsdottir, V., Reips, U. -D., Joinson, A. N., Porsdottir, F.
- Seducing the respondent – how to optimise invitations in on-site online research?; 2013; Póltorak, M., Kowalski, J.
- Influence of mobile devices in online surveys; 2013; Maxl, E., Baumgartner, T.
- The ONS Beyond 2011 Programme & possible implications for social surveys; 2013; Morris, L.
- Survey Research; 2013; Abbott, M. L., McKinney, J.
- The Use of E-Questionnaires in Organizational Surveys; 2013; Brender-Ilan, Y., Vinitzky, G.
- Online Survey Software; 2013; Baker, J. D.
- The effect of short formative diagnostic web quizzes with minimal feedback; 2013; Baelter, O., Enstroem, E., Klingenberg, B.
- Up Means Good: The Impact of Screen Position on Evaluative Ratings in Web Surveys.; 2013; Tourangeau, R., Conrad, F. G., Couper, M. P.
- What we can learn from unintentional mobile respondents; 2012; Peterson, G.
- The integration of facebook into class management: an exploratory study; 2012; Chou, P. N.
- The cross platform report. Q2 -2012 - US; 2012
- Mobile usability; 2012; Nielsen, J., Budiu, R.
- Smartphone Apps and User Engagement: Collecting Data in the Digital Era; 2012; Link, M. W.
- How Often Do You Use the App with a Bird on It? Exploring Differences in Survey Completion Times, Primacy...; 2012; Buskirk, T. D.
- Data quality of questions sensitive to social-desirability bias in web surveys; 2012; Lozar Manfreda, K., Zajc, N., Berzelak, N., Vehovar, V.
- Online Questionnaires: Development of ‘basic requirements’; 2012; Tries, S., Blanke, K.
- Social research in online context: methodological reflections on web surveys from a case study; 2012; Pandolfini, V.
- Improving Survey Website Usability; 2012; Vannette, D.
- How accurate are surveys of objective phenomena?; 2012; Chang, L. C., Krosnick, J. A.
- Pros and cons of Internet based User Satisfaction Surveys; 2012; Consoli, A., Matsulevits, L.
- The re-engineering of the Structural Earnings survey process: Mixed - Mode data collection and new E...; 2012; Cardinaleschi, S., De Santis, S., Rocci, F., Spinelli, V.
- Between demand and reality: Ensuring efficiency and quality in pretesting questionnaires; 2012; Sattelberger, S., Blanke, K.
- How to provide high data quality in online-questionnaires: Setting guidelines in design; 2012; Tries, S., Nebel, S., Blanke, K.
- The Feasibility of Conducting a Web Survey Using Respondent Driven Sampling among Transgenders in the...; 2012; Kappelhof, J.
- Device Diversity: Understanding the complexity of varied devices for taking surveys – Case study...; 2012; Pearson, C., Backlund, K., Veling, L., Tsvelik, M., Jehoel, S.
- Research in the Mobile Mindset: Exploring the unexplored in the mobile research space; 2012; Willems, A., Veris, E., Verhaeghe, A.
- WebSM Study: Survey software features overview; 2012; Vehovar, V., Cehovin, G., Kavcic, L., Lenar, J.