Web Survey Bibliography
Title Assessing Potential Bias in Respondent-driven Incident Based Data from a Web Survey of College Students
Year 2016
Access date 06.06.2016
Abstract
Incident-level data collection is a useful approach when measuring events that can occur multiple times within a survey’s reference period. Incident-based data allow survey researchers to analyze not just characteristics of persons but also characteristics of incidents (e.g., to assess the proportion of victimizations reported to authorities). However, asking respondents to complete detailed incident reports for all incidents experienced within the reference period may be too burdensome for persons who experienced several incidents. Therefore, survey practitioners often cap the number of
incident reports required for each respondent. If reported incidents differ from those not covered in the survey instrument, then bias may exist because limiting the number of incident reports could exclude incidents with certain characteristics. The Campus Climate Survey Validation Study (CCSVS), sponsored by the Bureau of Justice Statistics and the Office on Violence Against Women, was a web-based survey administered at nine colleges that collected prevalence and incident-based information on unwanted sexual contact. The CCSVS capped the number of incident reports at three and allowed respondents to determine the order in which incident reports were completed. To assess the potential for bias, we determine whether respondents systematically ordered the reported incidents. Bias could be introduced if the incidents that do not have a completed incident report are fundamentally different (e.g., occur later in the year or are less severe) from those that were reported. We consider incident ordering based on the chronological order and severity of incidents. In addition, we assess whether respondents who were unsure of the month in which one of their incidents occurred reported those incidents in a systematic way. Our analysis found that respondents do appear to order their incidents systematically, both chronologically and by severity. We quantify the potential impact of this biased ordering on key victimization estimates.
Access/Direct link Conference Homepage (abstract)
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (4086)
- Respondent Processing of Rating Scales and the Scale Direction Effect ; 2016; Caporaso, A.
- The Effects of Pictorial vs. Verbal Examples on Survey Responses ; 2016; Sun, H.; Bertling, J.; Almonte, D.
- Evaluating Grid Questions for 4th Graders; 2016; Maitland, A.
- Mixing Modes: Challenges (and Tradeoffs) of Adapting a Mailed Paper Survey to the Web ; 2016; Wilkinson-Flicker, S.; McPhee, C. B.; Medway, R.; Kaiser, A.; Cutts, K.
- An Examination of How Survey Mode Affect Eligibility, Response and Health Condition Reporting Rates...; 2016; Stern, M. J.; Ghandour, R.
- Investigating Measurement Error through Survey Question Placement ; 2016; Wilson, A.; Wine, J.; Janson, N.; Conzelmann, J.; Peytcheva, E.
- Instructions in Self-administered Survey Questions: Do They Improve Data Quality or Just Make the Questionnaire...; 2016; Redline, C. D.; Zukerberg, A.; Owens, C.; Ho, A.
- Usability Testing within Agile Process; 2016; Holland, T.
- Exploring Why Web Surveys Take Longer to Complete on Smartphones than PCs: Findings from a Within-subjects...; 2016; Antoun, C.; Cernat, A.
- Making Mobile Web Surveys Accessible; 2016; Malakhoff, L.
- Association of Eye Tracking with Other Usability Metrics ; 2016; Olmsted, E. L.
- Cognitive Probing Methods in Usability Testing – Pros and Cons; 2016; Nichols, E. M.
- Grids and Online Surveys: Do More Complex Grids Induce Survey Satisficing? Evidence from the Gallup...; 2016; Wang, Me.; McCutcheon, A. L.
- Assessing the Accuracy of 51 Nonprobability Online Panels and River Samples: A Study of the Advertising...; 2016; Yang, Y.; Callegaro, M.; Villar, A.; Chin, K.; Krosnick...
- Calculating Standard Errors for Nonprobability Samples when Matching to Probability Samples ; 2016; Lee, Ad.; ZuWallack, R. S.
- Communicating Data Use and Privacy: In-person versus Web based methods for message testing ; 2016; Clark Fobia, A.; Hunter Childs, J. E.
- User Experience and Eye-tracking: Results to Optimize Completion of a Web Survey and Website Design ; 2016; Walton, L.; Ricci, K.; Libman Barry, A.; Eiginger, C.; Christian, L. M.
- Estimated-control Calibrated Estimates from Nonprobability Surveys; 2016; Dever, J. A.
- Decomposing Selection Effects in Non-probability Samples ; 2016; Mercer, A. W.; Keeter, S.; Kreuter, F.
- The Effect of Emphasizing the Web Option in a Mixed-mode Establishment Survey ; 2016; O'Brien, J.; Rajapaksa, S.; Schafer, B.; Langetieg, P.
- A Multi-phase Exploration Into Web-based Panel Respondents: Assessing Differences in Recruitment, Respondents...; 2016; Redlawsk, D.; Rogers, K.; Borie-Holtz, D.
- Effect of Clarifying Instructions on Response to Numerical Open-ended Questions in Self-administered...; 2016; Kumar Chaudhary, A.; Israel, G. D.
- Exploring the Feasibility of Using Facebook for Surveying Special Interest Populations ; 2016; Lee, C.; Jang, S.
- National Estimates of Sexual Minority Women Alcohol Use through Web Based Respondent Driven Sampling...; 2016; Farrell Middleton, D.; Iachan, R.; Freedner-Maguire, N.; Trocki, K.; Evans, C.
- Bringing Fair Market Rent Surveys into the 21st Century – Evaluating the Effectiveness of MSG...; 2016; Dayton, J.; Brassell, T.; Cooper, V.; Dion, R.; Williams, R.
- Measuring Survey Behavior of Smartphone Users; 2016; Luks, S.; Phillips, R.
- Practical Considerations for Using Vignettes to Evaluate Survey Items ; 2016; Steiger, D. M.; Williams, Do.; Edwards, W. S.; Cantor, D.; Truman, J.
- Using Web Panels to Quantify the Qualitative: The National Center for Health Statistics Research and...; 2016; Scanlon, P. J.
- Impact of Field Period Length in the Estimates of Sexual Victimization in a Web-based Survey of College...; 2016; Berzofsky, M.; Peterson, K.; Shook-Sa, B. E.; Lindquist, C.; Krebs, C.
- Longitudinal Online Ego-centric Social Network Data Collection with EgoWeb 2.0 ; 2016; Amin, A.; Kennedy, D.
- Influences on Item Response Times in a Multinational Web Survey ; 2016; Phillips, B. T.; Kolenikov, S.; Howard Ecklund, E.; Ackermann, A.; Brulia, A.
- QR Codes for Survey Access: Is It Worth It?; 2016; Allen, L.; Marlar, J.
- An Exploration of the Relationship between Usability Testing and Data Verification ; 2016; Langer Tesfaye, C.; Kurmlavage, V.
- Beyond the Survey: Improving Data Insights and User Experience with Mobile Devices ; 2016; Graham, P.; Lew, G.
- User Experience Considerations for Contextual Product Surveys on Smartphones ; 2016; Sedley, A.; Mueller, H.
- The Differential Effect of Mobile-friendly Surveys on Data Quality; 2016; Horwitz, R.
- Embedding Survey Questions within Non-research Mobile Apps: A Method for Collecting High-quality Data...; 2016; Bapna, V.; Antoun, C.
- Does Changing Monetary Incentive Schemes in Panel Studies Affect Cooperation? A Quasi-experiment on...; 2016; Schaurer, I.; Bosnjak, M.
- Survey Mode and Mail Method: A Practical Experiment in Survey Fielding for a Multi-round Survey ; 2016; Sullivan, B. D.; Duda, N.; Bogen, K.; Clusen, N. A.; Wakar, B.; Zhou, H.
- Web Probing for Question Evaluation: The Effects of Probe Placement ; 2016; Fowler, S.; Willis, G. B.; Moser, R. P.; Townsend, R. L. M.; Maitland, A.; Sun, H.; Berrigan, D.
- Early-bird Incentives: Results From an Experiment to Determine Response Rate and Cost Effects ; 2016; De Santis, J.; Callahan, R.; Marsh, S.; Perez-Johnson, I.
- Using Cash Incentives to Help Recruitment in a Probability Based Web Panel: The Effects on Sign Up Rates...; 2016; Krieger, U.
- Assessing Changes in Coverage Bias of Web Surveys as Internet Access Increases in the United States...; 2016; Sterrett, D.; Malato, D.; Benz, J.; Tompson, T.; English, N.
- Timing is Everything: Discretely Discouraging Mobile Survey Response through the Timing of Email Contacts...; 2016; Richards, A.; Shook-Sa, B. E.; Berzofsky, M.; Smith, A. C.
- Dynamic Instructions in Check-All-That-Apply Questions ; 2016; Kunz, T.; Fuchs, M.
- Patterns of Unit and Item Nonresponse in a Multinational Web Survey ; 2016; Ackermann, A.; Howard Ecklund, E.; Phillips, B. T.; Brulia, A.
- Debunking Myths About the Quality of Industry and Occupation Data Collected Through Self-administered...; 2016; Hurwitz, F. I.; Stein, J.; Skaff, A. L.
- Desktops, Tablets and Phones, Oh My! Device Preference for Web Based Surveys ; 2016; Schy, S.; Ghirardelli, A.; Morrison, H.
- Assessing Potential Bias in Respondent-driven Incident Based Data from a Web Survey of College Students...; 2016; Peterson, K.; Berzofsky, M.; Shook-Sa, B. E.; Krebs, C.; Lindquist, C.
- Making Connections on the Internet: Online Survey Panel Communications ; 2016; Libman Barry, A.; Eiginger, C.; Walton, L.; Ricci, K.