Web Survey Bibliography
More than 50% of all survey data in the Netherlands are collected via the Internet. However, these data may not adequately represent the views of the Dutch population. The majority of Dutch people are not willing to join a web panel, and among those who are panel members, a minority (20%) completes the majority (80%) of the questionnaires (NOPVO, 2006). Therefore, the answers obtained from web panels can differ significantly from those of the general population. It is well known that panels contain too many (heavy) Internet users and too few ethnic minorities. So how can we get people into a panel who would normally not join, and (hopefully) make the results more reliable? An unconventional approach is used for building this panel: via social networks. Traditionally, a distinction is made between probability panels and volunteer opt-in panels. Although most survey researchers agree that probability panels are needed for representativeness, the majority of web surveys are based on volunteer opt-in panels because of budget constraints. Volunteer opt-in panels, however, are prone to selection bias. This new way of recruitment may increase representativeness compared to volunteer opt-in panels (recruitment is by invitation only; respondent-driven sampling can be used for difficult-to-reach groups) while keeping costs at a minimum. By asking respondents to join the panel via friends and relatives, respondents who are normally not willing to join a panel might be persuaded to do so. The starting point for building the panel is the administrative records of Breda University of Applied Sciences in the Netherlands (about 7000 students with a national spread). I will investigate response quantity, response quality, and costs, and give suggestions about when to use this type of recruitment. Note that the Internet penetration rate in the Netherlands was about 90% in 2010.
AAPOR Homepage (abstract)
GOR Homepage (abstract) / (presentation)
Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 66th Annual Conference, 2011 (26)
- The smart(phone) way to collect survey data; 2013; Stapleton, C.
- Exploring Health-related Experiences and Access to Care: Differences between Online and Telephone Survey...; 2011; Doty, M. M., Peugh, J., Shand-Lubbers, J.
- Using Community Information and Survey Methodology for Bias Reduction to Enhance the Quality of the...; 2011; Harvey, J., Prabhakaran, J., Spera, C., Zhang, Zh.
- Response Quantity, Response Quality, and Costs of Building an Online Panel via Social Contacts; 2011; Toepoel, V.
- The Influence Of The Direction Of Likert-Type Scales In Web Surveys On Response Behavior In Different...; 2011; Keusch, F.
- An Injured Party?: A Comparison of Political Party Response Formats in Party Identification; 2011; Schwarz, S., Barlas, F. M., Thomas, R. K., Corso, R. A., Szoc, R.
- Asking Sensitive Questions: Do They Affect Participation In Follow-Up Surveys?; 2011; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Designing Questions for Web Surveys: Effects of Check-List, Check-All, and Stand-Alone Response Formats...; 2011; Dykema, J., Schaeffer, N. C., Beach, J., Lein, V., Day, B.
- Differential Sampling Based on Historical Individual-Level Data in Online Panels; 2011; Kelly, R. H.
- Web Survey Live Validations - What Are They Doing?; 2011; Crawford, S. D., McClain, C.
- Comparing Numeric and Text Open-End Responses in Mail and Web Surveys; 2011; Olson, K., Smyth, J.
- Effects of Response Formats when Measuring Attitudes in Consumer Web Surveys Across Markets; 2011; Couper, M. P., Nunge, E.
- Re-Examining the Validity of Different Survey Modes for Measuring Public Opinion in the U.S.: Findings...; 2011; Ansolabehere, S., Fraga, B., Schaffner, B. F.
- How to Survey All 14 000 Swedish Local Political Representatives And Get 10 000 Responses; 2011; Gilljam, M., Granberg, D., Holm, B., Persson, M.
- Measuring User Satisfaction in the Lab: Questionnaire Mode, Physical Location, and Social Presence Concerns...; 2011; Jans, M., Romano, J. C., Ashenfelter, K. T., Krosnick, J. A.
- Interactive interventions in web surveys can increase response accuracy; 2011; Conrad, F. G.
- Impact on Data Quality of Making Incentives Salient in Web Survey Invitations; 2011; Zhang, Che.
- Effects of Mode and Incentives on Response Rates, Costs, and Response Quality in a Mixed Mode Survey...; 2011; Stevenson, J., Dykema, J., Kniss, C., Black, P., Moberg, P.
- Effects of Differential Incentives on Response Rates in Four Countries for a Web-based Follow Up Survey...; 2011; McSpurren, K.
- Completing Web Surveys on Cell-enabled iPads; 2011; Dayton, J., Driscoll, H.
- The Social Aspect of the Digital Divide; 2011; Johnson, E. P.
- Which Technologies Do Respondents Use in Online Surveys – An International Comparison?; 2011; Kaczmirek, L., Behr, D., Bandilla, W.
- Matrix Questionnaire Design to Reduce Measurement Error; 2011; Peytchev, A., Peytcheva, E.
- Race-of-Virtual-Interviewer Effects; 2011; Conrad, F. G., Schober, M. F., Nielsen, D.
- Which Web Survey Respondents Are Most Likely to Click for Clarification?; 2011; Coiner, T., Schober, M. F., Conrad, F. G.
- Providing Clarifying Instructions in a Web Survey; 2011; Redline, C. D.