
Web Survey Bibliography

Title: Editing Strategies for Electronic Establishment Survey Data Collection: Research and Experience
Year: 2004
Access date: 14.06.2004
Abstract: Human errors, such as programming mistakes, miscalculations by respondents, keypunch errors, and interviewer misclassifications, are a fact of life for surveys. The goal of data editing is to identify and correct as much error as possible. In traditional mail surveys of establishments, data editing is performed during post-data-collection processing. The growth in electronic data collection via computerized self-administered questionnaires or Web surveys opens the door to incorporating data edits into the data collection instruments themselves, so that suspicious data entered by respondents may be corrected – or explained – at their source. The result should be improved data quality that more accurately reflects respondents’ circumstances. However, incorporating data edits into the data collection instrument brings with it new responsibilities and new issues for survey designers. What types of edits are appropriate for the instrument itself, and which should remain in post-collection processing? How many edits are too many, beyond what the respondent will bear? How should edit messages be formulated so that respondents can use them effectively? The U.S. Census Bureau has more than ten years’ experience with electronic data collection for establishment surveys and has conducted usability research with business respondents. This paper shares insights gained from this experience and research and describes how editing strategies have evolved so that data quality is improved without sacrificing survey response.
Year of publication: 2004
Bibliographic type: Conferences, workshops, tutorials, presentations


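The abstract describes embedding data edits directly in the collection instrument, so that a suspicious entry is flagged at its source and the respondent can correct it or explain it. A minimal illustrative sketch of such a rule-based edit check, not taken from the paper and with all names (`EditRule`, `check_entry`) hypothetical:

```python
# Illustrative sketch of an in-instrument edit check of the kind the
# abstract describes: validation rules flag suspicious values and return
# respondent-facing edit messages. All names here are hypothetical.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class EditRule:
    field: str
    check: Callable[[dict], bool]  # returns True when the entry passes
    message: str                   # respondent-facing edit message

def check_entry(entry: dict, rules: List[EditRule]) -> List[str]:
    """Return the edit message for every rule the entry fails."""
    return [r.message for r in rules if not r.check(entry)]

# Example establishment-survey rules: payroll should not exceed revenue,
# and an employee count cannot be negative.
rules = [
    EditRule(
        field="payroll",
        check=lambda e: e["payroll"] <= e["revenue"],
        message="Reported payroll exceeds total revenue. "
                "Please correct the value or explain the discrepancy.",
    ),
    EditRule(
        field="employees",
        check=lambda e: e["employees"] >= 0,
        message="Number of employees cannot be negative.",
    ),
]

flags = check_entry(
    {"payroll": 500_000, "revenue": 300_000, "employees": 12}, rules
)
```

Here `flags` holds one message, for the payroll rule; whether the respondent must fix the value or may leave an explanation is the kind of design choice (hard versus soft edits) the abstract raises.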