Web Survey Bibliography
Identifying Pertinent Variables for Nonresponse Follow-Up Surveys: Lessons Learned from 4 Cases in Switzerland (Vandenplas, C.; Joye, D.; Staehli, M. E.; Pollien, A.; 2016)

All social surveys suffer from different types of error, of which one of the most studied is non-response bias. Non-response bias is a systematic error that occurs because individuals differ in their accessibility and in their propensity to participate in a survey, according to their own characteristics as well as those of the survey itself. The extent of the problem depends heavily on the correlation between the response mechanism and key survey variables. Non-response bias is nevertheless difficult to measure or to correct for, because relevant data are lacking for the whole target population or sample. In this paper, non-response follow-up surveys are considered as a possible source of information about non-respondents. Non-response follow-ups, however, suffer from two methodological issues: they operate through a response mechanism of their own, which can itself cause non-response bias, and they pose a problem of comparability of measurement, mostly because the survey design differs between the main survey and the non-response follow-up. To detect possible bias, the survey variables included in non-response surveys have to be related to the mechanism of participation, yet not be sensitive to measurement effects arising from the different designs. Based on the accumulated experience of four similar non-response follow-ups, we studied which survey variables fulfill these conditions. We differentiated between socio-demographic variables, which are measurement-invariant but less correlated with non-response, and attitudinal variables, such as trust, social participation, or integration in the public sphere, which are more sensitive to measurement effects but potentially more appropriate for accounting for the non-response mechanism. Our results show that education level, work status, and living alone, as well as political interest, satisfaction with democracy, and trust in institutions, are pertinent variables to include in non-response follow-ups of general social surveys.
Source: https://ojs.ub.uni-konstanz.de/srm/article/view/6138
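The abstract's central claim, that the extent of non-response bias depends on the correlation between the response mechanism and the survey variables, can be illustrated with a short simulation. The sketch below is not taken from the paper; the population size, logistic propensity model, and coefficients are illustrative assumptions. It generates a population in which the propensity to respond rises with the survey variable and compares the respondent mean against the full-population mean, alongside the standard first-order approximation Bias(ȳ_r) ≈ Cov(ρ, y) / ρ̄.

```python
import numpy as np

# Illustrative simulation (not from the paper): non-response bias grows with
# the covariance between response propensity and the survey variable.
rng = np.random.default_rng(0)

N = 100_000
y = rng.normal(loc=50, scale=10, size=N)  # survey variable, e.g. an attitude score

# Response propensity rises with y via a logistic link; coefficients are illustrative.
rho = 1.0 / (1.0 + np.exp(-(-2.0 + 0.04 * (y - 50))))
responded = rng.random(N) < rho

full_mean = y.mean()
resp_mean = y[responded].mean()

# First-order approximation: Bias(ybar_r) ~= Cov(rho, y) / mean(rho)
approx_bias = np.cov(rho, y)[0, 1] / rho.mean()

print(f"full-population mean: {full_mean:.3f}")
print(f"respondent mean     : {resp_mean:.3f}")
print(f"observed bias       : {resp_mean - full_mean:.3f}")
print(f"approximated bias   : {approx_bias:.3f}")
```

If the propensity coefficient on y is set to zero (response uncorrelated with the variable), both the observed and approximated bias collapse to roughly zero, which is the point the abstract makes about the correlation driving the problem.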
Web survey bibliography (4086)
- Sample Representation and Substantive Outcomes Using Web With and Without Incentives Compared to Telephone...; 2016; Lipps, O.; Pekari, N.
- Effects of Data Collection Mode and Response Entry Device on Survey Response Quality; 2016; Ha, L.; Zhang, Che.; Jiang, W.
- The Dynamic Identity Fusion Index: A New Continuous Measure of Identity Fusion for Web-Based Questionnaires...; 2016; Jimenez, J.; Gomez, A.; Buhrmester, M.; Whitehouse, H.; Swann, W. B.
- Recommended Practices for the design of business surveys questionnaires; 2016; Macchia, S.
- Navigation Buttons in Web-Based Surveys: Respondents’ Preferences Revisited in the Laboratory; 2016; Romano Bergstrom, J. C.; Erdman, C.; Lakhe, S.
- Collecting Data from mHealth Users via SMS Surveys: A Case Study in Kenya; 2016; Johnson, D.
- “Money Will Solve the Problem”: Testing the Effectiveness of Conditional Incentives for...; 2016; DeCamp, W.; Manierre, M. J.
- Effects of Incentive Amount and Type on Web Survey Response Rates; 2016; Coopersmith, J.; Vogel, L. K.; Bruursema, T.; Feeney, K.
- Effect of a Post-paid Incentive on Response to a Web-based Survey; 2016; Brown, J. A.; Serrato, C. A.; Hugh, M.; Kanter, M. H.; Spritzer, K. L.; Hays, R. D.
- Web-based versus Paper-based Survey Data: An Estimation of Road Users’ Value of Travel Time Savings...; 2016; Kato, H.; Sakashita, A.; Tsuchiya, Tak.
- Reminder Effect and Data Usability on Web Questionnaire Survey for University Students; 2016; Oishi, T.; Mori, M.; Takata, E.
- Feasibility of using a multilingual web survey in studying the health of ethnic minority youth.; 2016; Kinnunen, J. M.; Malin, M.; Raisamo, S. U.; Lindfors, P. L.; Pere, L. A.; Rimpelae, A. H.
- Respondents of a follow-up web-based survey; 2016; Stoddard, S. A.; Amparo, P.; Popick, H.; Yudd, R.; Sujeer, A.; Baath, M.
- An Examination of Opposing Responses on Duplicated Multi-Mode Survey Responses; 2016; Djangali, A.
- Is One More Reminder Worth It? If So, Pick Up the Phone: Findings from a Web Survey; 2016; Lin-Freeman, L.
- Reducing Underreports of Behaviors in Retrospective Surveys: The Effects of Three Different Strategies...; 2016; Lugtig, P. J.; Glasner, T.; Boeve, A.
- Dropouts in Longitudinal Surveys; 2016; Lugtig, P. J.; De Leeuw, E. D.
- Participant recruitment and data collection through Facebook: the role of personality factors; 2016; Rife, S. C.; Cate, K. L.; Kosinski, M.; Stillwell, D.
- What drives the participation in a monthly research web panel? The experience of ELIPSS, a French random...; 2016; Legleye, S.; Cornilleau, A.; Razakamanana, N.
- When Should I Call You? An Analysis of Differences in Demographics and Responses According to Respondents...; 2016; Vicente, P.; Lopes, I.
- Evaluating a New Proposal for Detecting Data Falsification in Surveys; 2016; Simmons, K.; Mercer, A. W.; Schwarzer, S.; Courtney, K.
- Quantifying Under- and Overreporting in Surveys Through a Dual-Questioning-Technique Design; 2016; de Jong, M.; Fox, J.-P.; Steenkamp, J.-B. E. M.
- The use and positioning of clarification features in web surveys; 2016; Metzler, A.; Kunz, T.; Fuchs, M.
- Take the money and run? Redemption of a gift card incentive in a clinician survey; 2016; Chen, J. S.; Sprague, B. L.; Klabunde, C. N.; Tosteson, A. N. A.; Bitton, A.; Onega, T.; MacLean, C....
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- Mail merge can be used to create personalized questionnaires in complex surveys; 2016; Taljaard, M.; Chaudhry, S. H.; Brehaut, J. C.; Weijer, C.; Grimshaw, J. M.
- Electronic and paper based data collection methods in library and information science research: A comparative...; 2016; Tella, A.
- Stable Relationships, Stable Participation? The Effects of Partnership Dissolution and Changes in Relationship...; 2016; Mueller, B.; Castiglioni, L.
- Identifying Pertinent Variables for Nonresponse Follow-Up Surveys. Lessons Learned from 4 Cases in Switzerland...; 2016; Vandenplas, C.; Joye, D.; Staehli, M. E.; Pollien, A.
- The 2013 Census Test: Piloting Methods to Reduce 2020 Census Costs; 2016; Walejko, G. K.; Miller, P. V.
- A Technical Guide to Effective and Accessible web Surveys; 2016; Baatard, G.
- The Validity of Surveys: Online and Offline; 2016; Wiersma, W.
- Methods can matter: Where Web surveys produce different results than phone interviews; 2016; Keeter, S.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- Sunday shopping – The case of three surveys; 2016; Bethlehem, J.
- Solving the Nonresponse Problem With Sample Matching?; 2016
- Will They Stay or Will They Go? Personality Predictors of Dropout in Online Study; 2016; Nestler, S.; Thielsch, M.; Vasilev, E.; Back, M.
- Do Polls Still Work If People Don't Answer Their Phones?; 2016; Edwards-Levy, A.; Jackson, N. M.
- HUFFPOLLSTER: Why Reaching Latinos Is A Challenge For Pollsters; 2016; Jackson, N. M.; Edwards-Levy, A.; Velencia, J.
- A Framework of Incorporating Thai Social Networking Data in Online Marketing Survey; 2016; Jiamthapthaksin, R.; Aung, T. H.; Ratanasawadwat, N.
- Creation and Usability Testing of a Web-Based Pre-Scanning Radiology Patient Safety and History Questionnaire...; 2016; Robinson, T. J.; DuVall, S.; Wiggins III, R.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G.; Schober, M. F.; Jans, M.; Orlowski, R. A.; Nielsen, D.; Levenstein, R. M.
- Development of a scale to measure skepticism toward electronic word-of-mouth; 2016; Zhang, Xia.; Ko, M.; Carpenter, D.
- Improving social media measurement in surveys: Avoiding acquiescence bias in Facebook research; 2016; Kuru, O.; Pasek, J.
- Psychological research in the internet age: The quality of web-based data; 2016; Ramsey, S. R.; Thompson, K. L.; McKenzie, M.; Rosenbaum, A.
- Internet Abusive Use Questionnaire: Psychometric properties; 2016; Calvo-Frances, F.
- Revisiting “yes/no” versus “check all that apply”: Results from a mixed modes...; 2016; Nicolaas, G.; Campanelli, P.; Hope, S.; Jaeckle, A.; Lynn, P.
- The impact of academic sponsorship on Web survey dropout and item non-response; 2016; Allen, P. J.; Roberts, L. D.
- Moderators of Candidate Name-Order Effects in Elections: An Experiment; 2016; Kim, Nu.; Krosnick, J. A.; Casasanto, D.
- Predictive inference for non-probability samples: a simulation study ; 2016; Buelens, B.; Burger, J.; van den Brakel, J.