Web Survey Bibliography
Usability testing and user satisfaction are cornerstones of electronic survey development, yet there are no clear recommendations for how self-reported user satisfaction questions (e.g., Chin, Diehl, & Norman, 1988) should be administered. Human-computer interaction research and theory suggest that the method matters: overly positive ratings are likely when they are collected on the same computer that previously ran the software being evaluated (Nass & Moon, 2000). We use a 2x2 between-subjects randomized experiment to evaluate the effect of two components of social presence on user satisfaction ratings: data collection mode (paper-and-pencil vs. computerized, where the computer used for the satisfaction ratings is sometimes the same computer that ran the usability-tested software) and physical proximity of the respondent to the computer on which the usability test was conducted (i.e., in the same room as the usability test or in a different room). We present data from about 200 respondents across five usability tests of websites and web surveys. Our initial analyses do not support past findings that participants rate satisfaction highest when answering on the computer used for the usability test. Instead, we find the lowest ratings when the satisfaction questionnaire was completed on paper in the same room as the usability test (social presence on the proximity dimension, but not the mode dimension). Individual satisfaction scale items asking about information arrangement, clarity, and navigation within the website or survey showed this difference, as did a simple summative satisfaction scale.
Our findings call into question whether the robust social presence findings of Nass and colleagues, and their theoretical implications (e.g., ethopoeia), apply in all human-computer interaction settings, and they may guide usability testers' decisions about how and where to collect user satisfaction ratings.
Web survey bibliography - The American Association for Public Opinion Research (AAPOR) 66th Annual Conference, 2011 (26)
- The smart(phone) way to collect survey data; 2013; Stapleton, C.
- Exploring Health-related Experiences and Access to Care: Differences between Online and Telephone Survey...; 2011; Doty, M. M., Peugh, J., Shand-Lubbers, J.
- Using Community Information and Survey Methodology for Bias Reduction to Enhance the Quality of the...; 2011; Harvey, J., Prabhakaran, J., Spera, C., Zhang, Zh.
- Response Quantity, Response Quality, and Costs of Building an Online Panel via Social Contacts.; 2011; Toepoel, V.
- The Influence Of The Direction Of Likert-Type Scales In Web Surveys On Response Behavior In Different...; 2011; Keusch, F.
- An Injured Party?: A Comparison of Political Party Response Formats in Party Identification.; 2011; Schwarz, S., Barlas, F. M., Thomas, R. K., Corso, R. A., Szoc, R.
- Asking Sensitive Questions: Do They Affect Participation In Follow-Up Surveys?; 2011; Schaurer, I., Struminskaya, B., Kaczmirek, L., Bandilla, W.
- Designing Questions for Web Surveys: Effects of Check-List, Check-All, and Stand-Alone Response Formats...; 2011; Dykema, J., Schaeffer, N. C., Beach, J., Lein, V., Day, B.
- Differential Sampling Based on Historical Individual-Level Data in Online Panels.; 2011; Kelly, R. H.
- Web Survey Live Validations - What Are They Doing?; 2011; Crawford, S. D., McClain, C.
- Comparing Numeric and Text Open-End Responses in Mail and Web Surveys.; 2011; Olson, K., Smyth, J.
- Effects of Response Formats when Measuring Attitudes in Consumer Web Surveys Across Markets.; 2011; Couper, M. P., Nunge, E.
- Re-Examining the Validity of Different Survey Modes for Measuring Public Opinion in the U.S.: Findings...; 2011; Ansolabehere, S., Fraga, B., Schaffner, B. F.
- How to Survey All 14 000 Swedish Local Political Representatives And Get 10 000 Responses.; 2011; Gilljam, M., Granberg, D., Holm, B., Persson, M.
- Measuring User Satisfaction in the Lab: Questionnaire Mode, Physical Location, and Social Presence Concerns...; 2011; Jans, M., Romano, J. C., Ashenfelter, K. T., Krosnick, J. A.
- Interactive interventions in web surveys can increase response accuracy.; 2011; Conrad, F. G.
- Impact on Data Quality of Making Incentives Salient in Web Survey Invitations.; 2011; Zhang, Che.
- Effects of Mode and Incentives on Response Rates, Costs, and Response Quality in a Mixed Mode Survey...; 2011; Stevenson, J., Dykema, J., Kniss, C., Black, P., Moberg, P.
- Effects of Differential Incentives on Response Rates in Four Countries for a Web-based Follow Up Survey...; 2011; McSpurren, K.
- Completing Web Surveys on Cell-enabled iPads.; 2011; Dayton, J., Driscoll, H.
- The Social Aspect of the Digital Divide; 2011; Johnson, E. P.
- Which Technologies Do Respondents Use in Online Surveys – An International Comparison?; 2011; Kaczmirek, L., Behr, D., Bandilla, W.
- Matrix Questionnaire Design to Reduce Measurement Error; 2011; Peytchev, A., Peytcheva, E.
- Race-of-Virtual-Interviewer Effects; 2011; Conrad, F. G., Schober, M. F., Nielsen, D.
- Which Web Survey Respondents Are Most Likely to Click for Clarification?; 2011; Coiner, T., Schober, M. F., Conrad, F. G.
- Providing Clarifying Instructions in a Web Survey; 2011; Redline, C. D.