
Web Survey Bibliography

Title: Using Web Panels to Quantify the Qualitative: The National Center for Health Statistics Research and Development Survey
Year: 2016
Access date: 09.06.2016
As with any qualitative method, a major limitation of cognitive interviewing is that its findings cannot be generalized from its purposive sample to the larger, statistical sample of a survey. In 2014 the National Center for Health Statistics launched the Research and Development Survey (RANDS), a web-mode survey using the Gallup Panel, whose purpose was to explore how web panel surveys might supplement official health statistics and the methodological work at NCHS. One of the main research questions RANDS was developed to address was whether web panels are appropriate vehicles for extrapolating qualitative findings from question evaluation studies through the use of "embedded probes." Embedded probes are structured cognitive probe questions administered directly after a survey item under evaluation. Unlike typical cognitive question evaluation methods, these probes are administered as part of a fielded survey rather than in a pre-test or lab environment. In this research, National Health Interview Survey (NHIS) questions were evaluated using in-depth, face-to-face cognitive interviews. Following analysis of these data, the patterns of interpretation that respondents used to answer the NHIS questions were identified and the resulting cognitive schemata were developed. Using these schemata, structured probe questions were designed and administered following their corresponding NHIS questions in RANDS. Here, we present the initial findings of the RANDS analysis, showing that the use of embedded probes on a web panel survey is a relatively cheap and efficient way to extrapolate the results of qualitative question evaluation studies to a wider survey population, and to reveal different patterns of question interpretation across various groups of respondents.
Year of publication: 2016
Bibliographic type: Conferences, workshops, tutorials, presentations
