Web Survey Bibliography
Title Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected by the Interview Situation?
Author Niebruegge, S.
Year 2017
Access date 15.09.2017
Abstract RELEVANCE & RESEARCH QUESTION
Screens are everywhere. And so, of course, are interviews. Market research now happens in real life.
The author emphasizes the importance of the interview and its environment for several reasons: It is the core of good research practice. Its costs heavily affect the economic health of research businesses. Because researchers do not see the actual interview environment, they may be unaware of potential impacts on answering behaviour. Panel interviews compete with the many distractions that come with ubiquitous devices. We can assume that the interview environment is under constant change. Last but not least, the beginning shift from the interview to the observation needs to be included in the equation.
METHODS & DATA
A survey with a total N = 1,049 provides a comprehensive and representative picture of present-day interview environments. Respondents were free to choose time, place and device. Consistency of and commitment to the online interview were measured using a fit statistic from a MaxDiff exercise.
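The abstract does not name the specific fit statistic. As a minimal sketch, assuming a simple count-based consistency check (item scores from best/worst counts, then the share of MaxDiff tasks whose best and worst picks agree with the respondent's implied ranking), a per-respondent consistency score could be computed along the following lines; the consistency_score helper and its data layout are hypothetical, not the author's method.

from collections import defaultdict

def consistency_score(tasks):
    """tasks: list of dicts for one respondent, each with keys
    'shown' (list of item ids), 'best' (item id), 'worst' (item id)."""
    # Counting analysis: item score = (#times best - #times worst) / #times shown
    best, worst, shown = defaultdict(int), defaultdict(int), defaultdict(int)
    for t in tasks:
        for item in t["shown"]:
            shown[item] += 1
        best[t["best"]] += 1
        worst[t["worst"]] += 1
    score = {i: (best[i] - worst[i]) / shown[i] for i in shown}

    # A task counts as consistent if the chosen best item has the highest
    # score and the chosen worst item the lowest score among the items shown.
    consistent = 0
    for t in tasks:
        ranked = sorted(t["shown"], key=lambda i: score[i], reverse=True)
        if t["best"] == ranked[0] and t["worst"] == ranked[-1]:
            consistent += 1
    return consistent / len(tasks)

# Example: a respondent whose choices always follow the same implied ranking
tasks = [
    {"shown": ["A", "B", "C", "D"], "best": "A", "worst": "D"},
    {"shown": ["A", "C", "D", "E"], "best": "A", "worst": "E"},
    {"shown": ["B", "C", "D", "E"], "best": "B", "worst": "E"},
]
print(consistency_score(tasks))  # 1.0, i.e. "100 % consistency"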
RESULTS
A large share of panel interviews is done at home. Only 2% of the interviews can be classified as truly mobile (out of home, using a mobile data connection). 88% of the respondents show 100% consistency in their answering behaviour.
The quality of answering behaviour is largely influenced by non-situational parameters such as the general personality trait of honesty and truthfulness, as measured with the HEXACO-60 personality inventory. It is not, or only to a negligible extent, affected by parameters of the actual interview situation. There are, however, a few remarkable exceptions, such as the consumption of alcohol prior to the interview.
ADDED VALUE
For research designs, it is key to keep in mind in which environments panel interviews take place. For designs that expand the scope from lab situations to the real world, the very low share of truly mobile interviews is bad news; on the other hand, results indicate that interview environments are more homogeneous than expected.
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography - European Survey Research Association conference 2017, ESRA, Lisbon (26)
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- A test of sample matching using a pseudo-web sample; 2017; Chatrchi, G.; Gambino, J.
- A Partially Successful Attempt to Integrate a Web-Recruited Cohort into an Address-Based Sample; 2017; Kott, P. S.; Farrelly, M.; Kamyab, K.
- Nonprobability sampling as model construction; 2017; Mercer, A. W.