Web Survey Bibliography
Title Using Behavioral Economic Games as Replacement for Grid Questions to Increase Respondent Engagement
Author Buder, F.; Unfried, M.
Year 2016
Access date 29.04.2016
Presentation PDF (1.02MB)
Abstract
Relevance & Research Question:
Latent variables such as trust, brand image, or attitudes toward a product are widely used to explain consumer behavior. Because these constructs are not directly observable, multi-item scales (also referred to as grid questions), i.e., sets of consecutive text statements sharing the same response scale, are a common measurement instrument. However, the use of grid questions is controversial, especially in terms of respondent engagement. This contribution evaluates to what extent experiments with simple behavioral economics games embedded in questionnaires produce similar information about subjects' attitudes, and provides insights into the potential of such games for engaging and incentivizing respondents.
Methods & Data:
The data for this contribution were obtained from a two-part survey study: a traditional questionnaire including grid questions on different characteristics regarding the trustworthiness (e.g., diligence, honesty) of people from other European countries, as well as two simple games on honesty and the willingness to volunteer. After playing the games, respondents were asked to assess the behavior of participants from other countries in order to measure their attitudes towards people from these countries. The respondents' monetary reward depended on the accuracy of this assessment.
First, we checked whether grid questions and games yield similar results. Second, data from the grid questions were analyzed for indications of decreasing respondent engagement over time (decreasing variance between evaluations of different countries, straightlining, etc.).
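The two engagement indicators named above can be operationalized simply. The following sketch is illustrative only and is not the authors' code; the function names and the toy rating data are hypothetical. It computes the share of respondents who gave an identical rating to every item of a grid (a common straightlining indicator) and the variance of a respondent's mean ratings across countries, whose decline over consecutive grids would suggest waning engagement.

```python
import statistics

def straightlining_share(responses):
    """Share of respondents who gave the identical rating to every
    item in a grid question (a common straightlining indicator)."""
    flat = [r for r in responses if len(set(r)) == 1]
    return len(flat) / len(responses)

def between_country_variance(ratings_by_country):
    """Population variance of one respondent's mean ratings across
    countries; shrinking values over consecutive grids can indicate
    decreasing differentiation, i.e. declining engagement."""
    means = [statistics.mean(v) for v in ratings_by_country.values()]
    return statistics.pvariance(means)

# Hypothetical grid data: each inner list is one respondent's ratings.
grids = [[3, 3, 3, 3], [1, 4, 2, 5], [2, 2, 2, 2]]
print(straightlining_share(grids))  # 2 of 3 respondents straightlined
```

In practice such indicators would be compared across the early and late grid blocks of the questionnaire rather than computed once.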
Results:
In terms of outcome, grid questions and games show quite comparable results, e.g., regarding the ranking of the countries. Regarding engagement, initial results for the grid questions indicate that respondent engagement decreases over time.
The biggest advantage of games is incentive-compatible payment. With grid questions alone, it is only possible to reward respondents' participation, not thoughtful responses. In the applied games, respondents' answers determine their individual payoff, motivating them to give honest and thoughtful responses. Evidence shows that social desirability bias can thereby be reduced.
Added Value:
The contribution demonstrates a new way to evaluate latent constructs in questionnaires using behavioral economic games. The advantages lie in higher respondent engagement and better opportunities to incentivize thoughtful responses.
Access/Direct link Conference Homepage (presentation)
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - Germany (639)
- Using Paradata to Predict and Correct for Panel Attrition; 2016
- Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?; 2016; Meitinger, K.; Behr, D.
- Device Effects - How different screen sizes affect answers in online surveys; 2016; Fisher, B.; Bernet, F.
- Effects of motivating question types with graphical support in multi channel design studies; 2016; Luetters, H.; Friedrich-Freksa, M.; Vitt, S.; Goldstein, D. G.
- Analyzing Cognitive Burden of Survey Questions with Paradata: A Web Survey Experiment; 2016; Hoehne, J. K.; Schlosser, S.; Krebs, D.
- Secondary Respondent Consent in the German Family Panel; 2016; Schmiedeberg, C.; Castiglioni, L.; Schroeder, J.
- Does Changing Monetary Incentive Schemes in Panel Studies Affect Cooperation? A Quasi-experiment on...; 2016; Schaurer, I.; Bosnjak, M.
- Using Cash Incentives to Help Recruitment in a Probability Based Web Panel: The Effects on Sign Up Rates...; 2016; Krieger, U.
- The Mobile Web Only Population: Socio-demographic Characteristics and Potential Bias ; 2016; Fuchs, M.; Metzler, A.
- The Impact of Scale Direction, Alignment and Length on Responses to Rating Scale Questions in a Web...; 2016; Keusch, F.; Liu, M.; Yan, T.
- Web Surveys Versus Other Survey Modes: An Updated Meta-analysis Comparing Response Rates ; 2016; Wengrzik, J.; Bosnjak, M.; Lozar Manfreda, K.
- Retrospective Measurement of Students’ Extracurricular Activities with a Self-administered Calendar...; 2016; Furthmueller, P.
- Privacy Concerns in Responses to Sensitive Questions. A Survey Experiment on the Influence of Numeric...; 2016; Bader, F.; Bauer, J.; Kroher, M.; Riordan, P.
- Ballpoint Pens as Incentives with Mail Questionnaires – Results of a Survey Experiment; 2016; Heise, M.
- Does survey mode matter for studying electoral behaviour? Evidence from the 2009 German Longitudinal...; 2016; Bytzek, E.; Bieber, I. E.
- Consequences of the forced answering option within online surveys: Do higher item response rates come...; 2016; Sischka, P.; Decieux, J. P.; Mergener, A.; Neufang, K.
- How the Placement of the Linkage Consent Question Impacts the Consent Rate in an Online Establishment...; 2016; Vicari, B.; Sakshaug, J. W.
- Effects of Issue Salience, Questionnaire Design and Incentives on Web Survey Response Rates; 2016; Hentschel, A.; Mueller, A.
- Motivated Underreporting and Response Propensity: Do persons likely to respond give better answers to...; 2016; Wengrzik, J.
- A study on panel engagement in a mobile survey app; 2016; Scharioth, N.; Tschida, K.
- Mixed-Method Approaches in Enterprise Social Software Evaluation; 2016; Schnurr, J. M.; Behrendt, S.; Buelow, C.
- Fit4You - an online survey for the transition from school to vocational training / study; 2016; Koehler, T.; Haertel, L.; Funke, F.; Neumann, J.; Ossowski, A.; Hellwig, O.; Bartsch, J.; Sander, N.
- Mobile app respondents: a study on panel engagement; 2016; Scharioth, N.; Tschida, K.
- The Adequacy of Outlier Definitions based on Response Time Distributions in Web Surveys: A Paradata...; 2016; Schlosser, S.; Karem Hoehne, J.
- Changing the scoring procedure and the response format to get the most out of multiple-choice tests...; 2016; Papenberg, M.; Diedenhofen, B.; Musch, J.
- Human vs. artificial intelligence: Are software solutions already able to replace human beings?; 2016; Koch, M.
- Impulsiveness, Speed and Reliability in Online Questionnaire; 2016; Harms, C.
- Non-response in evaluation of teaching; 2016; Brinkmoeller, B.; Forthmann, B.; Thielsch, M.
- The Big Show-Stopper: Online Research in the Stranglehold of Data Protection Regulation?; 2016; Mueller - Peters, H.
- Participation in a mixed-mode panel over the course of the field period: An analysis of different response...; 2016; Struminskaya, B.; Gummer, T.
- Propensity score weighting in a web-based panel survey: Comparing the effects on attrition biases in...; 2016; Gummer, T.; Rossmann, J.
- Using Behavioral Economic Games as Replacement for Grid Questions to Increase Respondent Engagement; 2016; Buder, F.; Unfried, M.
- Gaming-Genres and Motivation: Why do we play what we play?; 2016; Stetina, B. U.; Kovacovsky, Z.; Kluss, K.; Klaps, A.; Aden, J.; Bendas, C.; Daude, A.; Lehenbauer, M.
- Forecasting proportional representation elections from non-representative expectation surveys; 2016; Graefe, A.
- Setting Up an Online Panel Representative of the General Population The German Internet Panel; 2016; Blom, A. G.; Gathmann, C.; Krieger, U.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- Stable Relationships, Stable Participation? The Effects of Partnership Dissolution and Changes in Relationship...; 2016; Mueller, B.; Castiglioni, L.
- A Web Experiment Showing Negative Effects of Slider Scales Compared to Visual Analogue Scales and Radio...; 2016; Funke, F.
- Will They Stay or Will They Go? Personality Predictors of Dropout in Online Study; 2016; Nestler, S.; Thielsch, M.; Vasilev, E.; Back, M.
- Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments; 2016; Struminskaya, B.
- A Privacy-Friendly Method to Reward Participants of Online-Surveys; 2015; Herfert, M.; Lange, B.; Selzer, A.; Waldmann, U.
- The impact of frequency rating scale formats on the measurement of latent variables in web surveys -...; 2015; Menold, N.; Kemper, C. J.
- Investigating response order effects in web surveys using eye tracking; 2015; Karem Hoehne, J.; Lenzner, T.
- Implementation of the forced answering option within online surveys: Do higher item response rates come...; 2015; Decieux, J. P.; Mergener, A.; Neufang, K.; Sischka, P.
- Translating Answers to Open-ended Survey Questions in Cross-cultural Research: A Case Study on the Interplay...; 2015; Behr, D.
- The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability...; 2015; Bosnjak, M.; Struminskaya, B.; Weyandt, K.
- Are they willing to use the web? First results of a possible switch from PAPI to CAPI/CAWI in an establishment...; 2015; Ellguth, P.; Kohaut, S.
- Measuring Political Knowledge in Web-Based Surveys: An Experimental Validation of Visual Versus Verbal...; 2015; Munzert, S.; Selb, P.
- Changing from CAPI to CAWI in an ongoing household panel - experiences from the German Socio-Economic...; 2015; Schupp, J.; Sassenroth, D.
- Rating Scales in Web Surveys: A Test of New Drag-and-Drop Rating Procedures; 2015; Kunz, T.