Web Survey Bibliography
Relevance & Research Question: Response latency measurement and eye tracking are two computer-assisted pretesting methods that may be particularly useful for evaluating Web questionnaires. In contrast to other techniques (e.g., expert reviews, qualitative interviews), both methods produce nonreactive, objective measures of behavior that are affected neither by the researcher (and the way in which she tests the questions) nor by the research context. While previous studies have shown that longer response latencies and fixation times are indicative of problematic questions (Lenzner et al., 2010, 2011), little is known about the utility of the two methods (or measures) in practical pretesting contexts (e.g., in testing draft questions). This study examines whether response latencies and fixation times can discriminate between flawed and improved survey questions.
Methods & Data: In a laboratory experiment, respondents’ eye movements and response latencies were recorded while they answered two versions of a Web questionnaire. One group (n=22) received a questionnaire containing poorly worded questions; the other group (n=22) received the same questionnaire with improved question wordings. Because response latencies and fixation times vary considerably between individuals, we computed a baseline fixation rate (eye tracking) and a baseline reading rate (response latency) for every respondent from seven additional questions asked in the same Web survey. In the analyses, whenever the response or fixation time for a question exceeded a respondent’s baseline by more than 15%, the question was deemed problematic. (The analyses were repeated with 10%, 20%, and 25% thresholds; all conclusions remained unchanged.)
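The classification rule described above can be sketched as follows. This is an illustrative sketch, not the authors' code: the function names, the seconds-per-word unit, and the sample values are assumptions for demonstration; only the baseline-plus-15%-threshold logic comes from the abstract.

```python
# Illustrative sketch of the baseline-threshold rule from the study:
# a question is flagged as problematic when a respondent's reading (or
# fixation) time on it exceeds that respondent's personal baseline by
# more than a chosen threshold (15% in the main analysis).

def baseline_rate(times_per_word):
    """Mean time per word over the seven baseline questions (hypothetical unit: s/word)."""
    return sum(times_per_word) / len(times_per_word)

def is_problematic(question_time_per_word, baseline, threshold=0.15):
    """Flag the question if its rate exceeds the baseline by more than `threshold`."""
    return question_time_per_word > baseline * (1 + threshold)

# Hypothetical example: a baseline of 0.40 s/word from seven questions;
# a test question read at 0.50 s/word is 25% above baseline and is flagged.
baseline = baseline_rate([0.38, 0.42, 0.40, 0.41, 0.39, 0.40, 0.40])
print(is_problematic(0.50, baseline))  # True
print(is_problematic(0.44, baseline))  # False (within 15% of baseline)
```

Varying the `threshold` argument (0.10, 0.20, 0.25) reproduces the robustness checks mentioned in the abstract.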
Results: Fixation rate (eye tracking) was consistently more accurate than reading rate (response latency) in classifying the questions as flawed or improved. The overall accuracy of the fixation rate ranged from 60% to 85%, and that of the reading rate from 43% to 70%. The eye tracking measure also produced considerably fewer misses (failures to detect problems) and fewer false alarms.
Added Value: This study suggests that fixation times and response latencies are potentially useful methods for pretesting (draft) Web questionnaires, although the accuracy with which they identify problematic questions is not yet satisfactory.
GOR Homepage (abstract) / (full text)
Web Survey Bibliography - Conferences, workshops, tutorials, presentations (2845)
- How Representative are Google Consumer Surveys?: Results From an Analysis of a Google Consumer Survey...; 2013; Krishnamurty, P., Tanenbaum, E., Stern, M. J.
- One Drink or Two: Does Quantity Depicted in an Image Affect Web Survey Responses?; 2013; Charoenruk, N., Stange, M.
- A Comparison Between Screen/Follow Item Format and Yes/No Item Format on a Multi-Mode Federal Survey; 2013; Hernandez, S. J., Arakelyan, S. N., Welch, V. E.
- Using Multiple Modes in Follow-Up Contacts in Random-Digit Dialing Surveys; 2013; Chowdhury, P. P.
- Tablets and Smartphones and Netbooks, Oh My! Effects of Device Type on Respondent Behavior; 2013; Ross, H., Mendelson, J., Lackey, M.
- Impacts of Unit Nonresponse in a Recontact Study of Youth; 2013; Mendelson, J., Viera Jr., L.
- Multi-Mode Survey Administration: Does Offering Multiple Modes at Once Depress Response Rates?; 2013; Newsome, J., Levin, K., Langetieg, P., Vigil, M., Sebastiani, M.
- Responsive Design for Web Panel Data Collection; 2013; Bianchi, A., Biffignandi, S.
- Utilizing the Web in a Multi-Mode Survey; 2013; Venkataraman, L.
- Changing to a Mixed-Mode Design: The Role of Mode in Respondents’ Decisions About Participation...; 2013; Collins, D., Mitchell, M., Toomes, M.
- Comparing the Effects of Mode Design on Response Rate, Representativeness, and Cost Per Complete in...; 2013; Tully, R.
- Internet Response for the Decennial Census – 2012 National Census Test; 2013; Reiser, C.
- The Effects of Pushing Web in a Mixed-Mode Establishment Data Collection; 2013; Ellis, C.
- Using Web Ex to Conduct Usability Testing of an On-Line Survey Instrument; 2013; Stettler, K.
- Battle of the Scales: Understanding Respondent Scale Usage in the US and Abroad; 2013; Courtright, M., Pashupati, K., Pettit, F. A.
- Modular Survey Design: A Bite Size Proposal; 2013; Kelly, F., Stevens, S., Johnson, A.
- Cyborgs vs. Monsters: Assembling Modular Surveys to Create Complete Datasets; 2013; Johnson, E. P., Siluk, L., Tarraf, S.
- Do I Have Your Full Attention?; 2013; Cape, P. J.
- Does Sample Size Still Matter?; 2013; Bakken, D. G., Bond, M.
- Optimizing Surveys for Smartphones: Maximizing Response Rates While Minimizing Bias; 2013; Lattery, K., Park Bartolone, G., Saunders, T.
- Shorter Isn't Always Better; 2013; Burdein, I.
- Solving the Unintentional Mobile Challenge; 2013; Peterson, G., Mechling, J., LaFrance, J., Ham, G.
- Mobile Research Risk: What Happens to Data Quality When Respondents Use a Mobile Device for a Survey...; 2013; Baker-Prewitt, J.
- Pros and cons of virtual interviewers – vote in the discussion about surveytainment; 2013; Póltorak, M., Kowalski, J.
- The fish model: What factors affect participants while filling in an online questionnaire?; 2013; Mohamed, B., Lorenz, A., Pscheida, D.
- Interview Duration in Web Surveys: Integrating Different Levels of Explanation; 2013; Rossmann, J., Gummer, T.
- The monetary value of good questionnaire design; 2013; Tress, F.
- Technical and methodological meta-information on current practices in online research: A full population...; 2013; Burger, C., Stieger, S.
- Using interactive feedback to enhance response quality in Web surveys. The case of open-ended questions...; 2013; Emde, M., Fuchs, M.
- Reducing Response Order Effects in Check-All-That-Apply Questions by Use of Dynamic Tooltip Instructions...; 2013; Kunz, T., Fuchs, M.
- Slide to ruin data: How slider scales may negatively affect data quality and what to do about it; 2013; Funke, F.
- Measuring wages via a volunteer web survey – a cross-national analysis of item nonresponse; 2013; Steinmetz, S., Annmaria, B.
- Identifying and Mitigating Satisficing in Web Surveys: Some Experimental Evidence; 2013; Blumenstiel, J. E., Rossmann, J.
- Does one really know?: Avoiding noninformative answers in a reliable way.; 2013; de Leeuw, E. D., Boevee, A., Hox, J.
- Online Mixed Mode Surveying using a Responsive Design; 2013; Kissau, K.
- Sensitive Topics in PC and Mobile Web Surveys; 2013; Mavletova, A. M., Couper, M. P.
- Mobile Research Performance: How Mobile Respondents Differ from PC Users Concerning Interview Quality...; 2013; Schmidt, S., Wenzel, O.
- Who responds to website visitor satisfaction surveys?; 2013; Andreadis, I.
- Measuring working conditions in a volunteer web survey; 2013; de Pedraza, P., Villacampa, A.
- Sampling online communities: using triplets as basis for a (semi-) automated hyperlink web crawler.; 2013; Veny, Y.
- Prison break: Releasing offline experiments from methodological constraints by transforming them into...; 2013; Förstel, H., Manthei, K., Mohnen, A., Berger, G.
- Comparison of psychometric properties of internet versions of the Marlowe-Crowne Social Desirability...; 2013; Vesteinsdottir, V., Reips, U.-D., Joinson, A. N., Porsdottir, F.
- Why are you leaving me?? - Personality predictors of answering drop out in an online-study; 2013; Thielsch, M., Nestler, S., Back, M.
- Propensity Score Weighting – Can Personality Adjust for Selectivity?; 2013; Glantz, A., Greszki, R.
- Research Design as an Influencing Factor for Reliability in Online Market Research; 2013; Wengrzik, J., Theuner, G.
- Ethics, privacy and data security in web-based course evaluation; 2013; Salaschek, M., Meese, C., Thielsch, M.
- Seducing the respondent – how to optimise invitations in on-site online research?; 2013; Póltorak, M., Kowalski, J.
- Influence of mobile devices in online surveys; 2013; Maxl, E., Baumgartner, T.
- E-questionnaire in cross-sectional household surveys; 2013; Karaganis, M.
- GESIS Online Panel Pilot: Results from a Probability-Based Online Access Panel; 2013; Kaczmirek, L., Bandilla, W., Schaurer, I., Struminskaya, B., Weyandt, K.