Web Survey Bibliography
Cross-national data production in social science research has increased tremendously over the last decades (Harkness 2008). The large-scale provision and widespread use of cross-national data sets constitute a huge opportunity for the research community, but they also pose the challenge of developing cross-nationally comparable survey items (Lynn, Japec, and Lyberg 2006). At the same time, substantive researchers are increasingly aware of the need to understand respondents’ cognitive processes when answering a survey question (Smith et al. 2011). Recently, the method of online probing has been developed, which implements probing techniques from cognitive interviewing in web surveys. In the traditional probing approach, interviewers obtain additional information by asking follow-up questions called probes (Beatty and Willis 2007). Online probing, in contrast, administers probing questions as open-ended questions in the web context. It can reveal the cognitive processes of web survey participants and helps to assess whether respondents’ interpretations of an item differ across countries (Braun et al. 2015).
The implementation of probes within web surveys offers respondents a higher level of anonymity than the laboratory situation of cognitive interviewing (Behr and Braun 2015), which potentially reduces social desirability effects in the response process (Bethlehem and Biffignandi 2012). Online probing can easily realize large sample sizes, which increases the generalizability of the results, enables an evaluation of the prevalence of problems or themes, and can explain the response patterns of specific subpopulations (Braun et al. 2015). Since all probes have to be programmed in advance, all respondents receive the same probe, and the procedure is highly standardized (Braun et al. 2015). When applied to cross-national data, online probing is a powerful tool for assessing the comparability of questions. In contrast to traditional quantitative approaches to assessing the equivalence of items (e.g., measurement invariance tests), online probing can explain why respondents in certain countries might misunderstand a specific item or why they adopt different perspectives when providing a response (Behr et al. 2014a).
The overarching goal of this dissertation project is to explore the potential of the method of online probing vis-à-vis other relevant methods that share similar goals (cognitive interviewing and measurement invariance tests) and as an assessment tool for single-item indicators in cross-national surveys. In particular, the dissertation addresses the following research questions: 1) Does online probing arrive at results similar to those of other methods? 2) What are the strengths and weaknesses of online probing in comparison to other methods? 3) How can online probing be combined with other methods in a mixed-methods approach? 4) How useful is online probing for assessing the cross-national comparability of single-item indicators? Since the dissertation’s goal is to compare the methods of online probing, cognitive interviewing, and measurement invariance tests with regard to their potential to detect problematic issues at the item level, the field of national identity was chosen as the substantive application for the method comparisons, owing to the existence of potentially problematic measures in a cross-national context. The dissertation focuses on items from the 2013 International Social Survey Programme (ISSP) module on National Identity.
The first article of this dissertation (“Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?”; published in Field Methods) analyzed whether online probing and cognitive interviewing arrive at similar conclusions with regard to error detection and the themes mentioned by respondents when applied to the same set of items (the ISSP item battery on specific national pride). The study compared data from cognitive interviews conducted with 20 German respondents in April 2013 with a web survey conducted with 532 German respondents in September 2013. The article revealed that both methods have complementary strengths and weaknesses: while probing answers in cognitive interviewing indicate higher response quality, online probing can compensate through a larger sample size. The article also offers researchers guidance on which method is preferable in a given research situation and advocates combining both methods in a mixed-methods approach.
The second article of this dissertation (“Necessary but Insufficient: Why Measurement Invariance Tests Need Online Probing as a Complementary Tool”; forthcoming in Public Opinion Quarterly; winner of the 2016 AAPOR/WAPOR Janet A. Harkness Award and the 2016 QDET2 Monroe Sirken Innovative Paper Award for Young Scholars of Question Evaluation) provides an example of a mixed-methods approach that combines online probing with quantitative measurement invariance tests. Using the concepts of constructive patriotism and nationalism as examples, the study explains how the combination of both methods can not only reveal incomparable items and countries but also explain issues related to cross-national comparability. By analyzing data from the 2013 ISSP and a web survey with 2,685 respondents from five countries, online probing uncovered the reasons for the lack of comparability (varying lexical scope and silent misunderstanding of a key term) that the measurement invariance tests had also detected.
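The measurement invariance tests mentioned above typically compare nested multi-group confirmatory factor analysis models (e.g., a configural model with free loadings against a metric model with equal loadings) via a chi-square difference test. As a minimal sketch of that comparison logic, using purely hypothetical fit statistics (not values from the dissertation) and restricted to an even degrees-of-freedom difference so the chi-square tail probability has a closed form:

```python
import math

def chi2_sf_even_df(x, df):
    """Survival function of a chi-square distribution with even df.

    For integer a = df/2, P(X > x) = Q(a, x/2) = e^{-x/2} * sum_{n<a} (x/2)^n / n!.
    """
    assert df > 0 and df % 2 == 0, "closed form sketched here needs even df"
    t = x / 2.0
    return math.exp(-t) * sum(t ** n / math.factorial(n) for n in range(df // 2))

def chi2_difference_test(chisq_constrained, df_constrained, chisq_free, df_free):
    """Likelihood-ratio (chi-square difference) test between nested models."""
    d_chisq = chisq_constrained - chisq_free
    d_df = df_constrained - df_free
    return d_chisq, d_df, chi2_sf_even_df(d_chisq, d_df)

# Hypothetical fit statistics for a two-group CFA:
# metric model (loadings constrained equal) vs. configural model (loadings free).
d_chisq, d_df, p = chi2_difference_test(55.2, 18, 43.2, 14)
# A small p-value means the equality constraints worsen fit significantly,
# i.e., metric invariance would be rejected.
```

In practice such tests are run with SEM software (e.g., lavaan or semopy); the sketch only shows the nested-model comparison that underlies them.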
Finally, the third article demonstrated the potential of online probing for assessing the cross-national comparability of single-item indicators, using the general national pride item as an example. Online probing provides a unique solution for deciding whether single-item indicators are equivalent, because the traditional approach of measurement invariance tests presupposes multiple-indicator measures and is therefore inapplicable to single-item indicators. The study analyzed 2,685 probe responses from a web survey conducted in five countries. Online probing uncovered several potentially problematic issues and showed that respondents in all countries associate various concepts with the general national pride item.
In sum, the contributions of this dissertation are:
1. The insight that online probing arrives at results similar to those of cognitive interviewing and measurement invariance tests.
2. A clear understanding of the method’s strengths and weaknesses vis-à-vis cognitive interviewing and measurement invariance tests.
3. An explanation of optimal implementations of online probing in a mixed-methods approach.
4. A demonstration of the usefulness of online probing to assess the cross-national comparability of single-item indicators.
5. An assessment of the cross-national comparability of measures of national identity for substantive researchers.