Web Survey Bibliography
Survey panels provide a useful mechanism for longitudinal measurement of within-person change and can be cost-effective compared with conducting another cross-sectional survey. However, because of the sample losses that predictably occur, the panel sample can deteriorate in representativeness, or the respondents themselves might change their self-reports to survey questions as a direct consequence of their participation in the panel. This latter set of consequences constitutes panel conditioning. Not all panel conditioning leads to more error. Respondents might become better "survey takers," able to report their true attitudes and opinions in answering survey questions (i.e., improved predictive reliability). Other consequences are less helpful, such as actual changes in attitudes, opinions, behavior, and knowledge that can be attributed to panel participation. In this paper, we accomplish two things. First, we present a typology of panel conditioning. Second, we show the results of multivariate regression tests that isolate the impact of prior survey taking on survey responses. We have identified certain political knowledge questions for which there is evidence of limited panel conditioning effects (i.e., more panel experience, more knowledge), while a host of opinion and attitude items did not show such effects. The data source is the 2008 Associated Press-Yahoo! News Poll, conducted by Knowledge Networks with contributions from political scientists at Harvard University and Stanford University. The study involved an eleven-wave web panel election survey of general-population U.S. adults. The baseline data collection occurred prior to the onset of the political primaries (November 2007), and the final data collection took place after the November 2008 general election.
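The regression test described above can be sketched in miniature. This is not the authors' analysis: the data are simulated and all variable names (waves_completed, knowledge score, demographic controls) are hypothetical, chosen only to illustrate regressing a knowledge outcome on prior panel experience while holding demographics constant.

```python
# Hedged sketch of a panel-conditioning regression test (hypothetical data,
# not the AP-Yahoo! News Poll analysis). A positive coefficient on
# waves_completed would mirror the "more panel experience, more knowledge"
# pattern the abstract reports for political knowledge items.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
waves_completed = rng.integers(1, 12, size=n)   # prior survey-taking (1-11 waves)
age = rng.integers(18, 90, size=n)
education = rng.integers(0, 5, size=n)          # coded attainment level

# Simulated knowledge score with a small built-in conditioning effect (0.15)
knowledge = (0.15 * waves_completed + 0.02 * age + 0.50 * education
             + rng.normal(0.0, 1.0, size=n))

# Design matrix: intercept plus predictors; estimate OLS by least squares
X = np.column_stack([np.ones(n), waves_completed, age, education])
beta, *_ = np.linalg.lstsq(X, knowledge, rcond=None)

# beta[1] estimates the effect of panel experience on knowledge,
# net of the demographic controls
print(f"effect of waves completed: {beta[1]:.3f}")
```

In the actual study the outcome items, controls, and model specification would of course differ; the point is only the logic of isolating prior survey-taking as a predictor.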
Web survey bibliography - 2011 (358)
- Research on Internet survey errors and control methods; 2011; Mingyue, F., Xicang, Z.
- Lösungsansätze gegen den Allgemeinarztmangel auf dem Land - Ergebnisse einer Online-Befragung unter Ä... [Approaches to addressing the shortage of rural general practitioners - results of an online survey among...]; 2011; Steinhäuser, J., Annan, N. F., Roos, M., Szecsenyi, J., Joos, S.
- Partnership-Driven Resources to Improve and Enhance Research (PRIMER): A Survey of Community-Engaged...; 2011; Dolor, R. J., Greene, S. M., Thompson, E., Baldwin, L.-M., Neale, A. V.
- Question Comprehensibility and Satisficing Behavior in Web Surveys; 2011; Lenzner, T.
- Examination of a ’Web Mode Effect’. An Experimental Comparison of Web and Paper Based Surveys...; 2011; Shamshiri-Petersen, D., Clement, S. L.
- Conceptualizing Trust in Digital Environments: Health-e Skepticism: Trust in the Age of the Internet; 2011; Harris, A., Wyatt, S., Kelly, S.
- Some Researchers Do, Some Researchers Don’t: Reflections on Interdisciplinarity and Digital Social...; 2011; Pangbourne, K., Philip, L., Pignotti, E., Edwards, P.
- Internet & Learning: A Decade of Transformation in Learning Practices; 2011; Haythornthwaite, C., Andrews, R., Jones, C., de Castell, S., Goodfellow, R., Jewitt, C., Barton, D.
- Social Science Research Methods in Internet Time; 2011; Karpf, D. A.
- Quantifying Open-Ended Responses: Results from an Online Advertising Tracking Survey; 2011; Jacobe, A., Brewer, L., Vakalia, F., Turner, S., Marsh, S. M.
- Quality of responses to an open-ended question on a mixed-mode survey; 2011; Gibson, J., Vakalia, F., Turner, S.
- Open-ended questions in the context of temporary work research; 2011; Siponen, K.
- How do Respondents Perceive a Questionnaire? The Contribution of Open-ended Questions; 2011; Markou, E., Garnier, B.
- The Uses of Open-Ended Questions in Quantitative Surveys; 2011; Singer, E., Couper, M. P.
- Implementation, implementation, implementation: old and new options for putting surveys and experiments...; 2011; MacKerron, G.
- Weaving the Web into Personal Communication Networks: A Mobile Phone Based Study of Smartphone Users; 2011; Kobayashi, T., Boase, J.
- Agree-Disagree Response Format versus Importance Judgment; 2011; Krebs, D.
- Testing a single mode vs a mixed mode design; 2011; Laaksonen, S.
- Germans' segregation preferences and immigrant group size: A factorial survey approach; 2011; Schlueter, E., Ullrich, J., Schmidt, P.
- Errors within web-based surveys: a comparison between two different tools for the analysis of tourist...; 2011; Polizzi, G., Oliveri, A. M.
- Benefits of Structured DDI Metadata across the Data Lifecycle: The STARDAT Project at the GESIS Data...; 2011; Linne, M., Brislinger, E., Zenk-Moeltgen, W.
- Microdata Information System MISSY; 2011; Bohr, J.
- The Use of Structured Survey Instrument Metadata throughout the Data Lifecycle; 2011; Hansen, S. E.
- DDI and the Lifecycle of Longitudinal Surveys; 2011; Hoyle, L., Wackerow, J.
- Dissemination of survey (meta)data in the LISS data archive; 2011; Streefkerk, M., Elshout, S.
- Underreporting in Interleafed Questionnaires: Evidence from Two Web Surveys; 2011; Medway, R., Viera Jr., L., Turner, S., Marsh, S. M.
- The use of cognitive interviewing methods to evaluate mode effects in survey questions; 2011; Gray, M., Blake, M., Campanelli, P., Hope, S.
- Does the direction of Likert-type scales influence response behavior in web surveys?; 2011; Keusch, F.
- Cross-country Comparisons: Effects of Scale Type and Response Style Differences; 2011; Thomas, R. K.
- Explaining more variance with visual analogue scales: A Web experiment; 2011; Funke, F.
- A Comparison of Branching Response Formats with Single Response Formats; 2011; Thomas, R. K.
- Different functioning of rating scale formats – results from psychometric and physiological experiments...; 2011; Koller, M., Salzberger, T.
- Cognitive process in answering questions: Are verbal labels in rating scales attended to?; 2011; Menold, N., Kaczmirek, L., Lenzner, T.
- Experiments on the Design of the Left-Right Self-Assessment Scale; 2011; Zuell, C., Scholz, E., Behr, D.
- Is it a good idea to optimise question format for mode of data collection? Results from a mixed modes...; 2011; Nicolaas, G., Campanelli, P., Hope, S., Lynn, P., Nandi, A.
- Separating selection from mode effects when switching from single (CATI) to mixed mode design (CATI /...; 2011; Carstensen, J., Kriwy, P., Krug, G., Lange, C.
- Testing between-mode measurement invariance under controlled selectivity conditions; 2011; Klausch, L. T.
- Using propensity score matching to separate mode- and selection effects; 2011; Lugtig, P. J., Lensvelt-Mulders, G. J.
- A mixed mode pilot on consumer barometer; 2011; Taskinen, P., Simpanen, M.
- Separation of selection bias and mode effect in mixed-mode survey – Application to the face-to...; 2011; Bayart, C., Bonnel, P.
- Optimization of dual frame telephone survey designs; 2011; Slavec, A., Vehovar, V.
- A Comparison of CAPI and PAPI through a Randomized Field Experiment; 2011; De Weerdt, J.
- Flexibility of Web Surveys: Probing 'do-not-know' over the Phone and on the Web; 2011; Hox, J., de Leeuw, E. D.
- Changing research methods in Ukraine: CATI or Mixed-Mode Surveys?; 2011; Paniotto, V., Kharchenko, N.
- The effects of mixed mode designs on simple and complex analyses; 2011; Martin, P., Lynn, P.
- In the Face of Declining Budgets: The Student Experience at Washington State University; 2011; Allen, T., Dillman, D. A., Garza, B., Millar, M. M.
- Use of Cognitive Shortcuts in Landline and Cell Phone Surveys; 2011; Everett, S. E., Kennedy, C.
- Mode Effect or Question Wording? Measurement Error in Mixed Mode Surveys; 2011; de Leeuw, E. D., Hox, J., Scherpenzeel, A.
- Conducting Respondent Driven Sampling on the Web: An Experimental Approach to Recruiting Challenges; 2011; Kapteyn, A., Schonlau, M.
- Discussion of Melanie Revilla's presentation on "Impact of the mode of data collection on...; 2011; Blom, A. G.