Web Survey Bibliography
There is ongoing debate about whether questions should be presented in a grid or as a single item per screen. Operationally, grids take less time for respondents to complete, so their use should decrease response burden, although recent research shows that respondents seem to prefer a single item per screen. From a measurement point of view, grids pose several issues: higher item non-response, higher item non-differentiation, and sometimes higher measurement error.
In this experiment, we are testing the Vitality (4 items) and Mental Health (5 items) scales of the SF-36v2® Health Survey. The SF-36v2 asks 36 questions to measure functional
health and well-being from the patient's point of view. It is called a generic health survey, because it can be used across age (18 and older), disease, and treatment groups, as opposed to a disease-specific health survey which focuses on a particular condition or disease. Two of the four items of the vitality scale and two out of five items of the mental health scale are reversed in scoring.
A sample of 2,500 KnowledgePanel® respondents was randomly assigned to one of five experimental conditions: Group 1: Standard grid; Group 2: Shaded grid; Group 3: One item per screen with horizontal response options; Group 4: One item per screen with vertical response options; Group 5: One item per screen with vertical shaded response options. Approximately 360 respondents completed the survey per condition, for a completion rate of 73.4%. The survey was optimized for screens with a minimum resolution of 800 by 600 pixels. During the study we collected the browser type for each respondent, which allowed us to exclude cases in which the survey was taken on MSNTV or on an iPhone/PDA, because those devices could not properly display the grid items. The final sample used for the analysis, after exclusions, was 1,419 cases, for an average group size of about 280.
We hypothesized that items presented in a grid would lead to more measurement error, as indicated by a higher rate of "inconsistencies" in self-reports to grid questions than in self-reports to single-item questions. We speculated that presenting each item on its own screen allows the respondent to bring more cognitive focus to each question and therefore answer more consistently. In contrast, when items appear in a grid, it is easier for the respondent to become confused, especially when some of the items are reverse-scored. We computed an index of consistency by correlating the total sum of scores for the reversed items with the total sum of scores for the non-reversed items; if respondents answer consistently, this correlation should be higher. We also calculated Cronbach's alpha to measure consistency of answers in each of the five experimental conditions.
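The two statistics described above can be sketched in a few lines of Python. The item scores below are purely illustrative, not study data, and the scale layout (which items are reversed) is assumed for the example; the study itself presumably used a standard statistical package.

```python
import math

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """Cronbach's alpha for a scale given as item-score columns:
    one inner list per item, one entry per respondent.
    Reversed items are assumed to be recoded already."""
    k = len(items)
    n = len(items[0])
    sum_item_var = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_var / variance(totals))

def pearson(x, y):
    """Pearson correlation, used here as the consistency index:
    sum of reversed-item scores vs. sum of non-reversed-item scores."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical 5-point scores for the 4-item Vitality scale, 5 respondents.
vitality = [
    [4, 3, 5, 2, 4],   # item 1
    [4, 2, 5, 2, 3],   # item 2
    [3, 3, 4, 1, 4],   # item 3 (reversed, already recoded)
    [4, 3, 5, 2, 5],   # item 4 (reversed, already recoded)
]
print(round(cronbach_alpha(vitality), 3))

reversed_sum = [a + b for a, b in zip(vitality[2], vitality[3])]
nonrev_sum = [a + b for a, b in zip(vitality[0], vitality[1])]
print(round(pearson(reversed_sum, nonrev_sum), 3))
```

Under the hypothesis, both numbers would come out lower in the grid conditions than in the single-item conditions.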
The direction of the study findings was consistent with our hypotheses -- a lower alpha for the grid presentation and a higher correlation for the single-item presentation -- although the differences among groups did not reach statistical significance.
Web Survey Bibliography - Callegaro, M. (54)
- Paradata in web surveys; 2013; Callegaro, M.
- From mixed-mode to multiple devices. Web surveys, smartphone surveys and apps: has the respondent gone...; 2013; Callegaro, M.
- Web coverage in the UK and its potential impact on general population web surveys; 2013; Callegaro, M.
- Yes-No versus Checkboxes Response Options in Web Surveys: What Form is Closer to Benchmarks?; 2012; Murakami, M., Callegaro, M., Henderson, V., Tepman, Z., Dong, Q.
- Effects of Progress Indicators on Short Questionnaires; 2012; Sedley, A., Callegaro, M.
- Effects of Pagination on Short Online Surveys; 2012; Sedley, A., Callegaro, M.
- A Systematic Review of Studies Investigating the Quality of Data Obtained with Online Panels; 2012; Callegaro, M., Villar, A., Krosnick, J. A., Yeager, D. S.
- A taxonomy of paradata for web surveys and computer assisted self interviewing (Casi); 2012; Callegaro, M.
- Unpublished internal Google report on break-off rates by device type; 2011; Callegaro, M.
- A meta-analysis of experiments manipulating progress indicators in Web surveys; 2011; Callegaro, M., Villar, A., Yang, Y.
- Should we use the progress bar in online surveys? A meta-analysis of experiments manipulating progress...; 2011; Callegaro, M., Yang, Y., Villar, A.
- Designing Surveys for Mobile Devices: Pocket-sized surveys that yield powerful results; 2011; Callegaro, M., Macer, T.
- How the Order of Response Options in a Running Tally Can Affect Online Survey Estimates.; 2011; Callegaro, M., DiSogra, C., Wells, T.
- IVR and web administration in structured interviews utilizing rating scales: exploring the role of motivation...; 2011; Yang, Y., Callegaro, M., Bhola, D. S., Dillman, D. A.
- Presentation of a single item versus a grid: Effects on the vitality and mental health subscales of...; 2010; Callegaro, M., Shand-Lubbers, J., Dennis, J. M.
- Do You Know Which Device Your Respondent Has Used to Take Your Online Survey?; 2010; Callegaro, M.
- Computing response metrics for online panels; 2009; Callegaro, M., DiSogra, C.
- The Effect of Email Invitation Subject Title and Text on Online Survey Completion Rates in Internet...; 2009; Kruse, Y., Thomas, M., Nukulkij, P., Callegaro, M.
- Differences Between Internet and Non-Internet Households on Survey Items: Do These Differences Disappear...; 2009; Zhang, C., Callegaro, M., Thomas, M.
- Producing Straightlining and Item Non-Differentiation in a Web Survey: How Visual Design Plays a Role...; 2009; Callegaro, M., Shand-Lubbers, J., Dennis, J. M.
- Do we hear different voices?: Investigating the differences between internet and non-internet users...; 2009; Zhang, C., Callegaro, M., Thomas, M., DiSogra, C.
- The Effect of Email Invitation Customization on Survey Completion Rates in an Internet Panel: A Meta...; 2009; Callegaro, M., Kruse, Y., Thomas, M., Nukulkij, P.
- Panel Conditioning and Attrition in the AP-Yahoo! News Election Panel Study; 2009; Kruse, Y., Callegaro, M., Dennis, J. M., DiSogra, C., Subias, S., Lawrence, M., Tompson, T.
- Recruiting Probability-Based Web Panel Members Using an Address-Based Sample Frame: Results from a Pilot...; 2009; DiSogra, C., Callegaro, M., Hendarwan, E.
- Presentation of a Single Item versus a Grid: Effects on the Vitality and Mental Health Scales of the...; 2009; Callegaro, M., Shand-Lubbers, J., Dennis, J. M.
- Computing Response Rates for Probability-Based Web Panels; 2009; DiSogra, C., Callegaro, M.
- Response latency as an indicator of optimizing in online questionnaires; 2009; Callegaro, M., Yang, Y., Bhola, D. S., Dillman, D. A., Chin, T. -J.
- Computing Response Metrics for Online Panels; 2009; Callegaro, M., DiSogra, C.
- Effects of Pre-coding Response Options for Five Point Satisfaction Scale in Web Surveys; 2008; Callegaro, M., Wells, T., Kruse, Y.
- An implementation of a within-household selection procedure for web surveys; 2008; Callegaro, M., Osborn, L., Debell, M., Leuvano, P.
- Do online respondents go the extra mile and take on inconvenient tasks?; 2008; Callegaro, M., Wells, T.
- Response options order effect and category number association: An experiment using items on a five point...; 2008; Tang, G., Callegaro, M.
- More than the digital divide?: Investigating the differences between Internet and non-Internet users; 2008; Zhang, C., Callegaro, M., Thomas, M.
- “R U in the Network?!” Using Text Messaging Interfaces as Screeners for Working Cell Phone...; 2008; Buskirk, T. D., Rao, K., Callegaro, M., Arens, Z., Steiger, D. M.
- Computing Metrics for Online Panels; 2008; Callegaro, M., DiSogra, C.
- Impact of new technologies in data collection methods; 2008; Callegaro, M.
- Seam Effects in Longitudinal Surveys; 2008; Callegaro, M.
- Key Issues in Research Accuracy: Sources of bias and error in online research; 2008; Dennis, J. M., Callegaro, M.
- Web Questionnaires: Tested Approaches from Knowledge Networks for the Online World ; 2008; Callegaro, M.
- The influence of mobile telephones on telephone surveys; 2008; Kuusela, V., Callegaro, M., Vehovar, V.
- The influence of advance letters on response in telephone surveys; 2007; de Leeuw, E. D., Callegaro, M., Hox, J., Korendijk, E., Lensvelt-Mulders, G. J.
- Using Text Messages in U.S. Mobile Phone Surveys ; 2007; Steeh, C. G., Buskirk, T. D., Callegaro, M.
- Fitting disposition codes to mobile phone surveys: experiences from studies in Finland, Slovenia and...; 2007; Callegaro, M., Buskirk, T. D., Vehovar, V., Kuusela, V., Piekarski, L. G., Steeh, C. G.
- Social Aspects of Mobile Phone Usage and Their Impact on Survey Cooperation; 2007; Vehovar, V., Callegaro, M.
- Mobile Phones - Influence on Telephone Surveys; 2006; Kuusela, V., Vehovar, V., Callegaro, M.
- Text 2 U: Contacting wireless subscribers using text messaging and wireless web for mobile phone surveys...; 2004; Buskirk, T. D., Callegaro, M., Steeh, C. G.
- Response latency as an indicator of optimizing. A study comparing job applicants and job incumbents'...; 2004; Callegaro, M., Yang, Y., Bhola, D. S., Dillman, D. A.
- Mobile Phones as a Survey Tool; 2004; Vehovar, V., Callegaro, M., Kuusela, V.
- Calculating outcome rates for mobile phone surveys. A proposal of a modified AAPOR standard and its...; 2004; Callegaro, M., Buskirk, T. D., Piekarski, L., Kuusela, V., Vehovar, V., Steeh, C. G.
- Where can I call you? The "mobile (phone) revolution" and its impact on survey research and coverage...; 2004; Callegaro, M., Poggio, T.