
Web Survey Bibliography

Title Addressing Survey Nonresponse Issues: Implications for ATE Principal Investigators, Evaluators, and Researchers
Year 2013
Access date 05.04.2013
Abstract

Many ATE principal investigators (PIs), evaluators, and researchers use mailed or emailed surveys to help them assess the effectiveness of their work. However, a common problem is a low response rate and the potential it creates for nonresponse bias. This is the bias that occurs when those who do not respond to a survey differ in some systematic way from those who do answer it. If this occurs, serious questions may be raised about the validity of the findings. The purpose of this report is to raise awareness of nonresponse problems and present ways to address them among those who use surveys in their studies of ATE grants, including PIs, evaluators, researchers, and those who may be evaluating the full program. We summarize the research on nonresponse issues, present generally acceptable standards for response rates, offer suggestions on how to increase response rates, describe ways to check for nonresponse bias, and apply these methods to a research study of the Advanced Technological Education (ATE) program sponsored by the National Science Foundation (NSF). We used a mailed survey to gather information about the impact and sustainability of the ATE program. Although we had a high response rate, we decided to check for nonresponse bias. We used three methods to investigate the problem: we compared responders with the total population, compared responders with nonresponders on five background characteristics, and compared early responders with late responders on two scales, an ATE Impact Scale and an ATE Sustainability Scale, using late responders as a surrogate for nonresponders. We found a slightly higher response rate for the larger center grants than for projects. However, we found no differences in the actual survey responses between early responders and late responders, our proxy for nonresponders. This led us to believe we did not have nonresponse bias in our results. We believe that our experience will be useful for those doing research on and evaluation of the ATE program, and we hope the suggestions we offer will help improve the validity of their findings.
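
The wave analysis described in the abstract (comparing early with late responders, with late responders standing in for nonresponders) can be illustrated with a minimal sketch. This is not code from the report; the file name (responses.csv), column names (return_date, impact_scale), the median-date cutoff, and the use of Welch's t-test are assumptions made for illustration only.

# Minimal sketch of a wave-analysis check for nonresponse bias: split
# respondents into early and late waves and compare their scale scores.
# Data layout and cutoff rule are assumed, not taken from the report.
import pandas as pd
from scipy import stats

# One row per respondent, with a scale score and the survey return date.
df = pd.read_csv("responses.csv", parse_dates=["return_date"])

# Split at the median return date: first half = early wave, second half = late wave.
cutoff = df["return_date"].median()
early = df.loc[df["return_date"] <= cutoff, "impact_scale"]
late = df.loc[df["return_date"] > cutoff, "impact_scale"]

# Welch's t-test on the scale scores; a non-significant difference offers some
# (limited) evidence against nonresponse bias on this scale, with the late wave
# serving as a proxy for nonresponders.
t_stat, p_value = stats.ttest_ind(early, late, equal_var=False)
print(f"early mean = {early.mean():.2f}, late mean = {late.mean():.2f}, p = {p_value:.3f}")

In a study like the one described above, the same comparison would be repeated for each scale (impact and sustainability), alongside the comparisons of responders with the total population and with nonresponders on background characteristics.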

Access/Direct link

University of Colorado (abstract) / (full text)

Year of publication 2013
Bibliographic type Generic - other

