
Web Survey Bibliography

Title: Slider Scales in Online Surveys
Year: 2009
Access date: 14.07.2015
Online surveys today are often perceived as dull, repetitive, and lacking in involvement and interactivity. Some have argued that market research is not, nor is it intended to be, a form of entertainment, but is serious scientific inquiry. Others have looked, more pragmatically perhaps, at what the Internet has to offer and how these techniques might be applied to answer market research problems. The latter offers the greater potential. Market research techniques have evolved over decades to encompass the possibilities within the confines of the methodologies employed. The Internet allows researchers to, once again, “re-think the question set” and incorporate elements such as Adobe Flash, a multimedia platform for adding animation and interactivity. Its almost universal installation on PCs has opened up the opportunity to produce question and answer styles that, if nothing else, look engaging and offer a degree of interactivity beyond merely answering questions. Many companies providing programming and hosting services, as well as research institutes themselves, offer Flash toolkits to replace clumsy or time-consuming traditional questioning methods. Of the tools in the toolkit, sliders as replacements for scales were among the first to be developed. Yet they remain, at least according to anecdotal evidence, the least popular. This unpopularity probably stems from a lack of understanding of precisely how the slider is perceived and used by the respondent, along with very real concerns about loss of comparability with previous data, which may have been collected via a completely different mode.
As an industry, we are somewhat conservative and, in the case of scales, there is a feeling that “if it ain’t broke, don’t fix it.” This paper attempts to answer a number of questions about the “standard” 5-point Likert scale and how a Flash-based alternative might perform in terms of the data collected, the levels of engagement engendered, and satisfaction with the instrument on the part of the subject, that is, the respondent.

Year of publication: 2009
Bibliographic type: Conferences, workshops, tutorials, presentations