Web Survey Bibliography

Title Invites, intros and incentives: lessons from a web survey
Year 2000
Access date 23.05.2005
Full text pdf (60k)
Abstract

The age of Internet research is upon us. Despite the methodological constraints on conducting survey research on the Web, Internet-based methodologies have emerged as a viable mode of interviewing for many types of research. This is especially true for Web site assessments; that is, getting users' and site visitors' reactions to their Web site experience. Companies come to the Internet with plans to extend their businesses, forge new customer relationships, produce sales through online commerce, and serve their client bases. Since this medium is a new one, much experimentation with content offerings, navigation options and e-commerce is occurring on these Web sites. This means that businesses must find out what their site visitors are experiencing and how their sites compare with their competitors'. When faced with these types of decisions (with large sums of money being devoted to developing their Internet presences), companies turn to research for guidance. From a research point of view, there are several methods for intercepting visitors to ask them to participate in a survey. For example, you can post a banner or icon on the site, send e-mails to a sample of past visitors, use a pop-up invitation window, or use a guest registration approach. This paper discusses the pros and cons of each of these Web-intercept methods. Regardless of the method used, there are three main issues related to participation. First, what is the response to the invitation (hit rate)? Second, does the survey introduction work (call to action)? Finally, is the incentive compelling? Much research has been done on the use of incentives and survey/introduction design in more traditional types of surveys. This paper discusses the experience of manipulating the invitations, the introductions and the incentives for two Web assessment surveys, including the effect of changes on hit rates and completion rates.
It should be noted that the findings presented in this paper can best be characterized as lessons learned. In other words, the Web surveys reported here were not designed as experimental studies. Rather, in the midst of fielding the studies we aimed to improve response among site visitors by trying a number of changes to the invitations, the intros and the incentives. In some cases, our response rates suggest that a change was effective; in other cases, no such effect was found. These findings should therefore be viewed as qualitative and directional; nonetheless, we feel that valuable lessons were learned that can be applied to future studies.

Access/Direct link Online Survey Design Guide
Bibliographic type Conferences, workshops, tutorials, presentations