Web Survey Bibliography
Title A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys
Author Geisen, E., Murphy, J.
Year 2017
Access date 10.09.2017
Abstract Due to low costs, improvements in coverage, and technological advances, many surveys are now being conducted in whole or in part via self-administered web questionnaires. Increasingly, respondents are choosing to complete web surveys on touch-screen mobile devices such as tablets and smartphones. Recent estimates show that the proportion of respondents completing a survey on a mobile device can be 30% or more for some surveys (Lugtig, Toepoel, and Amin, 2016; Saunders, 2015). Mobile apps are also being used by survey respondents who are panel members and by interviewers to administer household screening surveys. Because of these technological advances, the ways that respondents and interviewers interact with surveys are changing.
With the pace of change in survey administration, we need to consider whether traditional pretesting methodologies address the types of potential quality concerns these newer modes introduce. For example, modern web surveys support dynamic survey features such as hover-over definitions, calculate total buttons, videos/images, error messages, dynamic look-ups, touch-screen, swiping to navigate, GPS, and other capabilities. Each of these features changes the respondent-survey interaction, which can affect the quality of the data collected in a survey.
The purpose of this paper is to introduce emerging survey pretesting methodologies and compare them with traditional methods in light of modern data collection technologies, to consider where the standard approaches for pretesting can be improved. We begin by discussing the key limitations of traditional pretesting methods such as expert review, cognitive interviewing, and pilot testing for evaluating “modern” surveys. We then provide an overview of emerging pretesting methods including usability testing, eye tracking, and crowdsourcing. We discuss the advantages offered by these methods, particularly in terms of budget and schedule, and provide empirical examples of how these methods can improve data quality. We conclude with a theoretical model for the optimal combination of traditional and newer methods for pretesting modern surveys.
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - European Survey Research Association conference 2017, ESRA, Lisbon (26)
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- A test of sample matching using a pseudo-web sample; 2017; Chatrchi, G.; Gambino, J.
- A Partially Successful Attempt to Integrate a Web-Recruited Cohort into an Address-Based Sample; 2017; Kott, P. S.; Farrelly, M.; Kamyab, K.
- Nonprobability sampling as model construction; 2017; Mercer, A. W.