Web Survey Bibliography
Relevance & Research Question: Open-ended questions are often used to gather short numeric information in self-administered web questionnaires. Respondents enter numbers, quantities, or frequencies into input fields, usually without any computerized formatting constraints, primarily in order to prevent item nonresponse. The absence of formatting restrictions, however, invites answers that deviate from the desired format, including value ranges, estimations, alphanumeric supplements, or even different measuring units. Such deviations affect data quality negatively and increase the effort required for data cleansing and preparation. Concise and clear formatting instructions are therefore needed to guide respondents toward answers in the desired format. Since instructions are likely to be ignored, the question arises how different modes of verbal instructions and visual cues can be applied to strengthen the impact of formatting instructions and, ultimately, to enhance data quality.
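The deviation categories named above can be made concrete with a small sketch. The categories follow the abstract, but the regular expressions and the `classify_answer` helper are illustrative assumptions, not the study's actual cleaning procedure:

```python
import re

# Hypothetical classifier for raw answers to an open-ended numeric question
# (e.g., "How many hours per week do you study?"). The deviation categories
# mirror those named in the abstract; the patterns are illustrative only.
def classify_answer(raw: str) -> str:
    text = raw.strip()
    if re.fullmatch(r"\d+", text):
        return "correct"                  # plain integer, the desired format
    if re.fullmatch(r"\d+\s*[-–]\s*\d+", text):
        return "value range"              # e.g., "10-15"
    if re.fullmatch(r"(ca\.|~|about|approx\.?)\s*\d+", text, re.IGNORECASE):
        return "estimation"               # e.g., "about 10"
    if re.search(r"\d", text):
        return "alphanumeric supplement"  # e.g., "10 hours", "10h per week"
    return "other nonnumeric"

for answer in ["12", "10-15", "about 10", "10 hours"]:
    print(answer, "->", classify_answer(answer))
```

Every branch after the first represents an answer that a human can usually interpret but that breaks automated processing, which is why such deviations inflate cleansing effort.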
Methods & Data: In a between-subjects field experiment conducted among university freshman students in an opt-in panel (N=670), we tested three visual modes of formatting instructions for open-ended numeric questions: (1) a conventional instruction presented in a static manner, (2) a dynamic instruction in a tooltip that appears when the mouse cursor hovers over the input field, and (3) a symbolic instruction in the form of a pre-defined default value in the input field indicating the desired response format. The effectiveness of each instruction mode was determined by the proportion of formally correct answers.
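The outcome measure, the proportion of formally correct answers per experimental condition, can be sketched as follows. The condition labels echo the three instruction modes; the response data and the criterion in `is_formally_correct` (a plain integer) are assumptions for illustration, since the abstract does not spell out the exact correctness rule:

```python
import re
from collections import defaultdict

# Hypothetical raw data: (condition, answer) pairs. The condition names mirror
# the three instruction modes from the experiment; the answers are invented.
responses = [
    ("static",  "12"), ("static",  "10-15"),
    ("tooltip", "8"),  ("tooltip", "about 10"),
    ("default", "20"), ("default", "5"),
]

def is_formally_correct(answer: str) -> bool:
    # Assumed criterion: a bare integer counts as formally correct.
    return re.fullmatch(r"\d+", answer.strip()) is not None

counts = defaultdict(lambda: [0, 0])  # condition -> [correct, total]
for condition, answer in responses:
    counts[condition][1] += 1
    counts[condition][0] += is_formally_correct(answer)

for condition, (ok, total) in counts.items():
    print(f"{condition}: {ok}/{total} formally correct")
```

Comparing these proportions across conditions (e.g., with a chi-square test on the correct/incorrect counts) is one straightforward way to judge the effectiveness of each instruction mode.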
Results: Findings indicated that dynamic formatting elements, whether tooltips or default values, did not improve response quality compared with conventional static formatting instructions. Even a combination of tooltips and pre-filled symbols did not yield a significant increase in correctly formatted answers over the presentation of a static instruction alone.
Added Value: The results indicated that static formatting instructions should not be hastily replaced before the effects of dynamic elements have been examined sufficiently. However, initial findings suggested that dynamic formatting instructions have the potential to reinforce the positive effect of conventional instructions.