Web Survey Bibliography
In this study, we investigate marketing researchers' views of two important concepts in survey research: response rate and response bias. To answer the stated research questions, we collected both primary and secondary data. The primary data were collected from active members of the Academy of Marketing Science. Eight versions of an excerpt were prepared from an actual article that had recently been accepted for publication.
The first treatment concerned the description of the population: subjects received one of two versions of the excerpt, one reporting a Canadian sample and the other a so-called North American sample. The second treatment manipulated the initial number of surveys sent out, which in turn changed the reported response rate: the two versions reported 500 vs. 5,000 initial surveys, varying the response rate between 50.2% and 5.1%. The third treatment manipulated the Armstrong and Overton (1977) citation. The first version contained a sentence stating that early and late respondents were compared and no significant differences were found, offered as evidence of no response bias, together with the Armstrong and Overton (1977) citation. The second version of the excerpt instead included a table with the expected demographics of the population of interest. In addition, subjects were assigned to one of two conditions in which they evaluated the excerpt either as an author or as a reviewer. In the invitation email, respondents were asked to toss a coin, or to click a web link that tossed the coin for them, and to select the link corresponding to the outcome. To assess popular response rate enhancement techniques, we divided the sample into a pre-notification group, a reminder group, a pre-notification-and-reminder group, and a control group that received neither treatment. The results revealed that, in our sample, none of these techniques improved the response rate.
The secondary data were collected from major outlets in marketing science (Journal of Marketing-JM, Journal of Marketing Research-JMR, and Journal of the Academy of Marketing Science-JAMS) over the period 2005-2010. The final sample consisted of 68 JM, 23 JMR, and 84 JAMS articles. In addition, we randomly selected 31 rejected articles from the Journal of Business Research (JBR) archives.
The results of the study revealed that survey researchers do not clearly grasp the concepts of response rate and response bias. In addition, the results demonstrated that data quality should be measured by the sample's representativeness of the population and by the researcher's ability to reduce response and non-response biases. Further, techniques used to enhance response rate, such as reminder and pre-notification letters as well as incentives, are not as effective as commonly assumed and are likely to introduce additional response bias into a study. The results also showed that the optimal data collection approach researchers should consider adopting is the combination method.
Book section
Web survey bibliography - Conference proceedings (83)
- Estimation and Adjustment of Self-Selection Bias in Volunteer Panel Web Surveys; 2016; Niu, Ch.
- Shorter Interviews, Longer Surveys: Optimising the survey participant experience whilst accommodating...; 2016; Halder, A.; Bansal, H. S.; Knowles, R.; Eldridge, J.; Murray, Mi.
- Gamifying. Not all fun and games; 2016; Stubington, P.; Crichton, C.
- Are interviews costing £0.08 a waste of money? Reviewing Google Surveys for Wisdom of the Crowd...; 2016; Roughton, G.; MacKay, I.
- Observations from Twelve Years of an Annual Market Research Technology Survey; 2016; Macer, T.; Wilson, S.
- A Comparison of the Effects of Face-to-Face and Online Deliberation on Young Students’ Attitudes...; 2015; Triantafillidou, A.; Yannas, P.; Lappas, G.; Kleftodimos, A.
- A Privacy-Friendly Method to Reward Participants of Online-Surveys; 2015; Herfert, M.; Lange, B.; Selzer, A.; Waldmann, U.
- Designing Bonsai Surveys: The small but perfectly formed survey experience to meet the needs of the...; 2015; Puleston, J.
- Is accuracy only for probability samples? Comparing probability and non-probability samples in a country...; 2013; Martinsson, J., Dahlberg, S., Lundmark, S.
- The effect of language in answering qualitative questions in user experience evaluation web-surveys; 2013; Walsh, T., Nurkka, P., Petrie, H., Olson, J.
- Beyond Satisfaction Questionnaires: “Hacking” the Online Survey; 2013; Evans, A. L.
- Advancing the field of questionnaire translation - identifying problems, discussing methods, pushing...; 2013; Behr, D., Dorer, B., Van Houten, G
- European Values Study - methodological and substantive applications; 2013; Luijkx, R., Jagodzinski, W.
- The Impact of Culture and Economy on Values and Attitudes; 2013; Duelmer, H., Voicu, M.
- Educational attainment in cross-national surveys: instrument design, data collection, harmonisation...; 2013; Schneider, S.
- Mode Effects in Mixed-Mode Surveys: Prevention, Diagnostics, and Adjustment 1; 2013; de Leeuw, E. D., Dillman, D. A., Schouten, B.
- The smart(phone) way to collect survey data; 2013; Stapleton, C.
- Unintentional mobile respondents; 2012; Peterson, G.
- Metering mobile usage. Insights from global Arbitron mobile trends panel; 2012; Verkasalo, H.
- Is "chapterisation" a viable alternative to traditional progress indicators?; 2012; Spicer, R., Dowling, Z.
- Self-administered mobile surveys; 2011; Bosnjak, M.
- Online survey research: Findings, Best practices, and future research; 2011
- Blend, balance, and stabilize respondent sources; 2011; Eggers, M., Drake, E.
- Mode Effect or Question Wording? Measurement Error in Mixed Mode Surveys; 2011; de Leeuw, E. D., Hox, J., Scherpenzeel, A.
- There is an app for that! A review of smartphone apps for marketing research; 2010; Michelson, M.
- The state of online research in the U.S.; 2010; Miller, J.
- A framework for understanding and applying ethical principles in network and security research; 2010; Kenneally, E., Bailey, M., Maughan, D.
- Restructuring and innovations on the survey “capacity of collective tourist accommodation”...; 2010; Santoro, M. T., Staffieri, S.
- An Analyze of the Zero Price Effect on Online Business Performance - An Research Based on the Mobile...; 2010; Liu, Y., Yuan, P.
- Dealing with Nonresponse in Survey Sampling: an Item Response Modeling Approach; 2010; Matei, A.
- Response format effects on measurement of employment; 2009; Thomas, R. K., Dillman, D. A., Smyth, J. D.
- Response Mode and Bias Analysis in the IRS’ Individual Taxpayer Burden Survey; 2009; Brick, J. M., Contos, G., Masken, K., Nord, R.
- Survey Mode Effects in Two Military Surveys; 2009; Yang, M., Falcone, A. E., Milan, L. M.
- Web based macroseismic survey: fast information exchange and elaboration of seismic intensity effects...; 2009; De Rubeis, V., Sbarra P., Sorrentino, D., Tosi, P.
- The representativeness of the LISS panel; 2009; Knoef, M., de Vos, K.
- Sample factors that influence data quality; 2008; Gailey, R., Teal, D., Haechrel, E.
- An online panel as a platform for multi-disciplinary research; 2008; Scherpenzeel, A.
- Visual Design Effects on Respondents' Behaviour in Web-Surveys. A Design Experiment; 2008; Greinoecker, A.
- Effects of Privacy Assurances on the Online Measurement of Psychological Constructs; 2008; Witzki, A., Kramer, J.
- How Web 2.0 Technologies Can Become a Valuable Part of Online Research; 2008; Jaron, R.
- Respondent Authenticity - A biometrical approach to authenticate panelists; 2008; Wachter, B., Bender, C.
- Not Mixed-Mode but Switch-Mode; 2008; Höglinger, M., Abraham, M., Arpagaus, J.
- The Impact of Cognitive and Computer Skills on Data Quality in Computer Assisted Self Administered Questionnaires...; 2008; Brecko, B. N., Vehovar, V.
- Optimal Contact Strategy in a Mail-and-Web Mixed Mode Survey; 2008; Holmberg, A., Lorenc, B., Werner, P.
- 10 Years of Meinungsplatz.de: Success in the Collection of Data for Targeted Audiences, Such as the...; 2008; Weyergraf, O.
- Self-selection in Online Access Panels: No “Little Difference” in the Recruiting Process...; 2008; Wirth, T.
- Mobile Market Research; 2008; Maxl, E.
- Online vs. Offline in Mobile Surveys; 2008; Neubarth, W., Maier, U.
- Gender-of-Interviewer Effects in Video-Enhanced Web Surveys. Results from a Randomized Field-Experiment...; 2008; Fuchs, M.
- The Online Use of Randomized Response Measurements; 2008; Snijders, C., Weesie, J.