Web Survey Bibliography
Placing related questions together can alter the associations between answers to them. We conducted two experiments that varied whether related questions were presented together on a single screen in a web survey. The first experiment replicated the finding that inter-item associations and scale reliability were highest when the questions were presented together. However, a structural equation model revealed that these higher associations reflected correlated measurement error and decreased rather than increased construct validity.

We carried out a second experiment to test three possible mechanisms for the heightened correlations despite reduced validity. First, the questions may be perceived as multiple measures of the same construct, inducing more similar interpretations of the items. Second, when no actions are needed to get to the next question, the same material may be retrieved from working memory in answering all the questions. Third, respondents may be minimizing effort by clicking response options in the same columns and paying less attention to the individual questions when they are presented in a grid.

Our second experiment used a factorial design in a web survey with 2,694 respondents. Respondents answered 4 questions on diet and 4 on exercise, where the layout (together in a grid, together on a screen but listed separately, on separate screens), the accompanying instructions (related, independent, no instructions), and the order of the questions (by topic, intermixed) were varied randomly. We also expected these manipulations to interact with the location of the experiment in the questionnaire and randomly assigned its placement. Respondents' Body Mass Index was calculated to estimate and compare the measurement error properties and validity of the diet and exercise constructs.
The findings will allow us to understand the mechanisms generating differences in responses to questions on the same topic and guide survey design decisions that affect measurement error, nonresponse, and cost.
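The second experiment crosses three layout conditions, three instruction conditions, and two question orders, yielding 18 experimental cells. A minimal sketch of that random factorial assignment is below; the condition labels and the `assign` helper are illustrative assumptions, not the study's actual materials.

```python
import itertools
import random

# Hypothetical condition labels for the 3 x 3 x 2 factorial design
# described in the abstract (labels are assumptions, not from the study).
LAYOUTS = ["grid", "same_screen_listed", "separate_screens"]
INSTRUCTIONS = ["related", "independent", "no_instructions"]
ORDERS = ["by_topic", "intermixed"]

# All 18 cells of the factorial design.
CONDITIONS = list(itertools.product(LAYOUTS, INSTRUCTIONS, ORDERS))

def assign(respondent_ids, seed=0):
    """Randomly assign each respondent to one of the 18 factorial cells."""
    rng = random.Random(seed)
    return {rid: rng.choice(CONDITIONS) for rid in respondent_ids}

# Assign the 2,694 respondents reported in the abstract.
assignments = assign(range(2694))
```

Simple random assignment per respondent (rather than blocking) is assumed here for brevity; with 2,694 respondents, each cell would receive roughly 150 respondents in expectation.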