Web Survey Bibliography
There is ongoing debate about whether survey questions should be presented in a grid or as a single item per screen. Operationally, grids take less time for respondents to complete, so their use should decrease response burden, although recent research suggests that respondents prefer a single item per screen. From a measurement point of view, however, grids pose numerous problems: higher item non-response, higher item non-differentiation, and sometimes higher measurement error.
In this experiment, we tested the Vitality (4 items) and Mental Health (5 items) scales of the SF-36v2® Health Survey. The SF-36v2 asks 36 questions to measure functional health and well-being from the patient's point of view. It is called a generic health survey because it can be used across age (18 and older), disease, and treatment groups, as opposed to a disease-specific health survey, which focuses on a particular condition or disease. Two of the four Vitality items and two of the five Mental Health items are reverse-scored.
A sample of 2,500 KnowledgePanel® respondents was randomly assigned to one of five experimental conditions: Group 1: standard grid; Group 2: shaded grid; Group 3: one item per screen with horizontal response options; Group 4: one item per screen with vertical response options; Group 5: one item per screen with vertical shaded response options. Approximately 360 respondents completed the survey per condition, for a completion rate of 73.4%. The survey was optimized for screens with a minimum resolution of 800 by 600 pixels. During the study we collected each respondent's browser type, which allowed us to exclude cases in which the survey was taken on an MSNTV device or on an iPhone/PDA, because those devices could not properly display the grid items. The final sample used for the analysis, after exclusions, was 1,419 cases, for an average group size of about 280.
We hypothesized that items presented in a grid would lead to more measurement error, indicated by a higher rate of "inconsistencies" in the self-reports to the grid questions and a lower rate of inconsistencies in the self-reports to the single-item questions. We reasoned that presenting one item per screen allows the respondent to bring more cognitive focus to each question and therefore to answer more consistently, whereas in a grid it is easier for the respondent to become confused, especially when some of the items are reverse-scored. We computed an index of consistency by correlating the total sum of scores for the reversed items with the total sum of scores for the non-reversed items: if respondents answer consistently, the correlation between the reversed and non-reversed sums should be higher. We also calculated Cronbach's alpha to measure consistency of answers within each of the five experimental conditions.
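The two consistency measures described above can be sketched as follows. This is an illustrative computation on simulated Likert-type data, not the study's dataset; the split of the first two columns into a "reversed" group is purely for demonstration.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the scale total
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Simulated 5-item scale on a 1-5 response format, driven by one latent factor.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
scores = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(200, 5))), 1, 5)

alpha = cronbach_alpha(scores)

# Consistency index: correlation between the summed "reversed" items (already
# recoded to the same direction, as in standard scoring) and the summed
# non-reversed items. Columns 0-1 stand in for the reversed items here.
rev_sum = scores[:, :2].sum(axis=1)
nonrev_sum = scores[:, 2:].sum(axis=1)
consistency_r = np.corrcoef(rev_sum, nonrev_sum)[0, 1]
```

Under this logic, a condition that confuses respondents (e.g., reversed items buried in a grid) would show a lower `alpha` and a lower `consistency_r` than the single-item conditions.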
The direction of the study findings was consistent with our hypotheses -- a lower alpha for the grid presentations and a higher correlation for the single-item presentations -- although the differences among groups did not reach statistical significance.