Web Survey Bibliography
Title Mixing Modes: Challenges (and Tradeoffs) of Adapting a Mailed Paper Survey to the Web
Year 2016
Access date 09.06.2016
Abstract
Recent trends show that survey respondents are increasingly difficult and expensive to reach. Methodology research consistently demonstrates that tailored and adaptive designs may offer the best solution for collecting high-quality data. One strategy that can increase coverage and representativeness (and potentially reduce cost) uses sequential mixed-mode designs that include a web-based response component. In January 2016, the National Center for Education Statistics will test a sequential mixed-mode, web-push design for the 2016 administration of the National Household Education Survey (NHES). For several cycles, the NHES has used an address-based sample to administer a two-phase, self-administered mailed questionnaire in which sampled households are rostered using a phase-1 screener and a single individual is then sampled from each responding household to complete a longer phase-2 “topical” survey. This presentation will describe the process of adapting the two-phase paper design to incorporate a variable-phase web survey, and some of the key challenges faced while transitioning from a well-tested paper-only administration to a mixed-mode administration. The authors will describe the tradeoffs between maintaining consistency with the paper instrument and optimizing the web survey; the complexity of building a web instrument that in some situations (e.g., single-adult households) must be a single-phase survey with both phases completed by one individual, while other situations require a different respondent to complete each phase; and the intricacies of using phase-1 screener data to customize question wording in both English and Spanish based on known information about the respondent. In addition to discussing these challenges and proposed solutions, the paper will present selected results of usability testing and the resulting design changes to the web instrument. This study contributes to the growing body of research examining the most effective ways to use mixed-mode designs to increase survey response and representativeness while minimizing cost and mode effects in a national household survey.
Access/Direct link Conference Homepage (abstract)
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (4086)
- Respondent Processing of Rating Scales and the Scale Direction Effect ; 2016; Caporaso, A.
- The Effects of Pictorial vs. Verbal Examples on Survey Responses ; 2016; Sun, H.; Bertling, J.; Almonte, D.
- Evaluating Grid Questions for 4th Graders; 2016; Maitland, A.
- Mixing Modes: Challenges (and Tradeoffs) of Adapting a Mailed Paper Survey to the Web ; 2016; Wilkinson-Flicker, S.; McPhee, C. B.; Medway, R.; Kaiser, A.; Cutts, K.
- An Examination of How Survey Mode Affects Eligibility, Response and Health Condition Reporting Rates...; 2016; Stern, M. J.; Ghandour, R.
- Investigating Measurement Error through Survey Question Placement ; 2016; Wilson, A.; Wine, J.; Janson, N.; Conzelmann, J.; Peytcheva, E.
- Instructions in Self-administered Survey Questions: Do They Improve Data Quality or Just Make the Questionnaire...; 2016; Redline, C. D.; Zukerberg, A.; Owens, C.; Ho, A.
- Usability Testing within Agile Process; 2016; Holland, T.
- Exploring Why Web Surveys Take Longer to Complete on Smartphones than PCs: Findings from a Within-subjects...; 2016; Antoun, C.; Cernat, A.
- Making Mobile Web Surveys Accessible; 2016; Malakhoff, L.
- Association of Eye Tracking with Other Usability Metrics ; 2016; Olmsted, E. L.
- Cognitive Probing Methods in Usability Testing – Pros and Cons; 2016; Nichols, E. M.
- Grids and Online Surveys: Do More Complex Grids Induce Survey Satisficing? Evidence from the Gallup...; 2016; Wang, Me.; McCutcheon, A. L.
- Assessing the Accuracy of 51 Nonprobability Online Panels and River Samples: A Study of the Advertising...; 2016; Yang, Y.; Callegaro, M.; Chin, K.; Villar, A.; Krosnick...
- Calculating Standard Errors for Nonprobability Samples when Matching to Probability Samples ; 2016; Lee, Ad.; ZuWallack, R. S.
- Communicating Data Use and Privacy: In-person versus Web based methods for message testing ; 2016; Clark Fobia, A.; Hunter Childs, J. E.
- User Experience and Eye-tracking: Results to Optimize Completion of a Web Survey and Website Design ; 2016; Walton, L.; Ricci, K.; Libman Barry, A.; Eiginger, C.; Christian, L. M.
- Estimated-control Calibrated Estimates from Nonprobability Surveys; 2016; Dever, J. A.
- Decomposing Selection Effects in Non-probability Samples ; 2016; Mercer, A. W.; Keeter, S.; Kreuter, F.
- The Effect of Emphasizing the Web Option in a Mixed-mode Establishment Survey ; 2016; O'Brien, J.; Rajapaksa, S.; Schafer, B.; Langetieg, P.
- A Multi-phase Exploration Into Web-based Panel Respondents: Assessing Differences in Recruitment, Respondents...; 2016; Redlawsk, D.; Rogers, K.; Borie-Holtz, D.
- Effect of Clarifying Instructions on Response to Numerical Open-ended Questions in Self-administered...; 2016; Kumar Chaudhary, A.; Israel, G. D.
- Exploring the Feasibility of Using Facebook for Surveying Special Interest Populations ; 2016; Lee, C.; Jang, S.
- National Estimates of Sexual Minority Women Alcohol Use through Web Based Respondent Driven Sampling...; 2016; Farrell Middleton, D.; Iachan, R.; Freedner-Maguire, N.; Trocki, K.; Evans, C.
- Bringing Fair Market Rent Surveys into the 21st Century – Evaluating the Effectiveness of MSG...; 2016; Dayton, J.; Brassell, T.; Cooper, V.; Dion, R.; Williams, R.
- Measuring Survey Behavior of Smartphone Users; 2016; Luks, S.; Phillips, R.
- Practical Considerations for Using Vignettes to Evaluate Survey Items ; 2016; Steiger, D. M.; Williams, Do.; Edwards, W. S.; Cantor, D.; Truman, J.
- Using Web Panels to Quantify the Qualitative: The National Center for Health Statistics Research and...; 2016; Scanlon, P. J.
- Impact of Field Period Length in the Estimates of Sexual Victimization in a Web-based Survey of College...; 2016; Berzofsky, M.; Peterson, K.; Shook-Sa, B. E.; Lindquist, C.; Krebs, C.
- Longitudinal Online Ego-centric Social Network Data Collection with EgoWeb 2.0 ; 2016; Amin, A.; Kennedy, D.
- Influences on Item Response Times in a Multinational Web Survey ; 2016; Phillips, B. T.; Kolenikov, S.; Howard Ecklund, E.; Ackermann, A.; Brulia, A.
- QR Codes for Survey Access: Is It Worth It?; 2016; Allen, L.; Marlar, J.
- An Exploration of the Relationship between Usability Testing and Data Verification ; 2016; Langer Tesfaye, C.; Kurmlavage, V.
- Beyond the Survey: Improving Data Insights and User Experience with Mobile Devices ; 2016; Graham, P.; Lew, G.
- User Experience Considerations for Contextual Product Surveys on Smartphones ; 2016; Sedley, A.; Mueller, H.
- The Differential Effect of Mobile-friendly Surveys on Data Quality; 2016; Horwitz, R.
- Embedding Survey Questions within Non-research Mobile Apps: A Method for Collecting High-quality Data...; 2016; Bapna, V.; Antoun, C.
- Does Changing Monetary Incentive Schemes in Panel Studies Affect Cooperation? A Quasi-experiment on...; 2016; Schaurer, I.; Bosnjak, M.
- Survey Mode and Mail Method: A Practical Experiment in Survey Fielding for a Multi-round Survey ; 2016; Sullivan, B. D.; Duda, N.; Bogen, K.; Clusen, N. A.; Wakar, B.; Zhou, H.
- Web Probing for Question Evaluation: The Effects of Probe Placement ; 2016; Fowler, S.; Willis, G. B.; Moser, R. P.; Townsend, R. L. M.; Maitland, A.; Sun, H.; Berrigan, D.
- Early-bird Incentives: Results From an Experiment to Determine Response Rate and Cost Effects ; 2016; De Santis, J.; Callahan, R.; Marsh, S.; Perez-Johnson, I.
- Using Cash Incentives to Help Recruitment in a Probability Based Web Panel: The Effects on Sign Up Rates...; 2016; Krieger, U.
- Assessing Changes in Coverage Bias of Web Surveys as Internet Access Increases in the United States...; 2016; Sterrett, D.; Malato, D.; Benz, J.; Tompson, T.; English, N.
- Timing is Everything: Discretely Discouraging Mobile Survey Response through the Timing of Email Contacts...; 2016; Richards, A. C.; Shook-Sa, B. E.; Berzofsky, M.; Smith, A. C.
- Dynamic Instructions in Check-All-That-Apply Questions ; 2016; Kunz, T.; Fuchs, M.
- Patterns of Unit and Item Nonresponse in a Multinational Web Survey ; 2016; Ackermann, A.; Howard Ecklund, E.; Phillips, B. T.; Brulia, A.
- Debunking Myths About the Quality of Industry and Occupation Data Collected Through Self-administered...; 2016; Hurwitz, F. I.; Stein, J.; Skaff, A. L.
- Desktops, Tablets and Phones, Oh My! Device Preference for Web Based Surveys; 2016; Schy, S.; Ghirardelli, A.; Morrison, H.
- Assessing Potential Bias in Respondent-driven Incident Based Data from a Web Survey of College Students...; 2016; Peterson, K.; Berzofsky, M.; Shook-Sa, B. E.; Krebs, C.; Lindquist, C.
- Making Connections on the Internet: Online Survey Panel Communications ; 2016; Libman Barry, A.; Eiginger, C.; Walton, L.; Ricci, K.