Web Survey Bibliography
Many websites use web-based exit surveys to measure the satisfaction of their visitors. The related literature includes user satisfaction studies of library, health-related, and more commercially oriented websites. Response rates to these web surveys are usually very low, which raises questions about the quality of the data they collect. In this paper I try to provide some answers to the question of whether the sample is representative of the total population of website visitors.
Methods & Data:
The findings presented in this paper are based on the analysis of data collected by the Greek voting advice application HelpMeVote and by the corresponding exit survey. The setting of HelpMeVote is ideal for comparing the total set of visitors with the subset of people who have responded to the web survey. The only reason someone visits a voting advice application is to answer a series of questions in order to receive an estimate of his/her proximity to the political parties. Before displaying this output, I ask users to fill in a form with their personal information. Although the form is not mandatory (users can click “continue” and move on to the output without answering its questions), the vast majority respond to these questions. As a result, I know the distribution of the population and can compare it with the distribution of the sample.
Results:
Logistic regression analysis shows that the probability of responding to the website exit survey is higher for male, younger, and more educated visitors, but the most significant predictor of responding to the web survey is the level of satisfaction. As a result, satisfied users are over-represented and unsatisfied users are under-represented in the sample.
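The kind of regression described above can be sketched on synthetic data. All coefficients and predictor codings below are invented for illustration and are not the paper's estimates; the sketch only mirrors the reported pattern, with satisfaction as the strongest driver of responding.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
male = rng.integers(0, 2, n).astype(float)
age = rng.integers(18, 70, n).astype(float)
education = rng.integers(1, 5, n).astype(float)      # 1 = low ... 4 = high (hypothetical scale)
satisfaction = rng.integers(1, 6, n).astype(float)   # 1 = unsatisfied ... 5 = satisfied

# Simulate the reported pattern with made-up coefficients: all four
# predictors matter, but satisfaction has the strongest effect.
logit = -4.0 + 0.3 * male - 0.02 * age + 0.2 * education + 0.8 * satisfaction
responded = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit by Newton-Raphson (IRLS), the standard maximum-likelihood
# procedure for logistic regression.
X = np.column_stack([np.ones(n), male, age, education, satisfaction])
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (responded - p)
    hess = X.T @ (X * (p * (1 - p))[:, None])
    beta += np.linalg.solve(hess, grad)

names = ["intercept", "male", "age", "education", "satisfaction"]
print(dict(zip(names, beta.round(2))))
```

On this synthetic sample the fitted satisfaction coefficient dominates the other predictors, reproducing the over-representation of satisfied users among respondents.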
Added Value:
The findings of this paper demonstrate that visitor satisfaction estimated by a web-based exit survey will be higher than the visitor satisfaction we would measure if non-respondents were included in the calculation.
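The mechanism behind this bias can be shown with a small hypothetical calculation (the shares and response rates below are invented, not the paper's figures): when satisfied visitors respond more often, they make up a larger share of respondents than of all visitors, so the survey estimate of satisfaction is pushed upward.

```python
# Hypothetical figures: 60% of all visitors are satisfied, but
# satisfied visitors respond at 10% and unsatisfied visitors at 2%.
satisfied_share, unsat_share = 0.60, 0.40
p_resp_sat, p_resp_unsat = 0.10, 0.02

# Share of satisfied users among respondents (Bayes' rule).
resp_sat_share = (satisfied_share * p_resp_sat) / (
    satisfied_share * p_resp_sat + unsat_share * p_resp_unsat)

print(f"satisfied among all visitors: {satisfied_share:.0%}")   # 60%
print(f"satisfied among respondents:  {resp_sat_share:.0%}")    # 88%
```

A survey based only on respondents would therefore report roughly 88% satisfaction for a population in which only 60% are satisfied, exactly the direction of bias the paper describes.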
Web survey bibliography - General Online Research Conference (GOR) 2013 (34)
- Respondent Rewards: Money for Nothing?; 2013; Martin, P.
- Pros and cons of virtual interviewers – vote in the discussion about surveytainment; 2013; Póltorak, M., Kowalski, J.
- The fish model: What factors affect participants while filling in an online questionnaire?; 2013; Mohamed, B., Lorenz, A., Pscheida, D.
- Interview Duration in Web Surveys: Integrating Different Levels of Explanation; 2013; Rossmann, J., Gummer, T.
- The monetary value of good questionnaire design; 2013; Tress, F.
- Technical and methodological meta-information on current practices in online research: A full population...; 2013; Burger, C., Stieger, S.
- Using interactive feedback to enhance response quality in Web surveys. The case of open-ended questions...; 2013; Emde, M., Fuchs, M.
- Reducing Response Order Effects in Check-All-That-Apply Questions by Use of Dynamic Tooltip Instructions...; 2013; Kunz, T., Fuchs, M.
- Measuring wages via a volunteer web survey – a cross-national analysis of item nonresponse; 2013; Steinmetz, S., Annmaria, B.
- Does one really know?: Avoiding noninformative answers in a reliable way.; 2013; de Leeuw, E. D., Boevee, A., Hox, J.
- Sensitive Topics in PC and Mobile Web Surveys; 2013; Mavletova, A. M., Couper, M. P.
- Mobile Research Performance: How Mobile Respondents Differ from PC Users Concerning Interview Quality...; 2013; Schmidt, S., Wenzel, O.
- Who responds to website visitor satisfaction surveys?; 2013; Andreadis, I.
- Measuring working conditions in a volunteer web survey; 2013; de Pedraza, P., Villacampa, A.
- Sampling online communities: using triplets as basis for a (semi-) automated hyperlink web crawler.; 2013; Veny, Y.
- Why are you leaving me?? - Personality predictors of answering drop out in an online-study; 2013; Thielsch, M., Nestler, S., Back, M.
- Propensity Score Weighting – Can Personality Adjust for Selectivity?; 2013; Glantz, A., Greszki, R.
- Research Design as an Influencing Factor for Reliability in Online Market Research; 2013; Wengrzik, J., Theuner, G.
- Ethics, privacy and data security in web-based course evaluation; 2013; Salaschek, M., Meese, C., Thielsch, M.
- Seducing the respondent – how to optimise invitations in on-site online research?; 2013; Póltorak, M., Kowalski, J.
- Influence of mobile devices in online surveys; 2013; Maxl, E., Baumgartner, T.
- E-questionnaire in cross-sectional household surveys; 2013; Karaganis, M.
- GESIS Online Panel Pilot: Results from a Probability-Based Online Access Panel; 2013; Kaczmirek, L., Bandilla, W., Schaurer, I., Struminskaya, B., Weyandt, K.
- Online Survey – Research with children on advertising impact; 2013; Funkenweh, V., Busch, J., Amthor, A. L., Boeer, A., Gaedke, J.
- HTML5 and mobile Web surveys: A Web experiment on new input types; 2013; Funke, F.
- Metadata on the demographics of online research: Results from a full-range study of available online...; 2013; Burger, C., Stieger, S.
- How the screen-out influence the dropout of a commercial panel; 2013; Bartoli, B.
- Beyond methodology - some ethical implications of "doing research online"; 2013; Heise, N.
- Innovation in Data Collection: the Responsive Design Approach; 2013; Bianchi, A., Biffignandi, S.
- Break-off and attrition in the GIP amongst technologically experienced and inexperienced participants...; 2013; Blom, A. G., Bossert, D., Clark, V., Funke, F., Gebhard, F., Holthausen, A., Krieger, U., Wachenfeld...
- Nonresponse and Nonresponse Bias in a Probability-Based Internet Panel; 2013; Blom, A. G., Bossert, D., Funke, F., Gebhard, F., Holthausen, A., Krieger, U.
- Rewards - Money for Nothing?; 2013; Cape, P. J., Martin, P.
- Effects of incentive reduction after a series of higher incentive waves in a probability-based online...; 2013; Struminskaya, B., Kaczmirek, L., Schaurer, I., Bandilla, W.
- Timing of Nonparticipation in an Online Panel: The effect of incentive strategies; 2013; Douhou, S., Scherpenzeel, A.