Web Survey Bibliography
Title PageFocus: A new JavaScript to detect cheating in online tests
Author Diedenhofen, B.; Musch, J.
Year 2014
Access date 03.08.2016
Abstract
The validity of unproctored tests is threatened by participants who decide to cheat when stakes are high. To address this problem, we developed PageFocus, a JavaScript that detects whether participants abandon a test page to pursue other activities, such as looking up the solution to a test item in a second browser window. As a first validation and application of the script, we conducted a study exploring the determinants of cheating in an unproctored online test.
Methods & Data: 541 members of an online panel took part in an experiment presenting 10 test items that could easily be looked up using a search engine (general knowledge questions) and 10 items whose solution could not easily be looked up on the Internet (a matrix-based logic test). The incentive to cheat was varied experimentally at three levels: in the first group, participants received a monetary reward only if they performed well enough on the test; in the second group, participants were rewarded regardless of their performance; and in the third group, no incentive was offered.
Results: The PageFocus script revealed that participants cheated more when performance-related incentives were offered. As expected, however, this effect was limited to items whose solution could be looked up using a search engine. Cheating participants achieved higher scores.
Added Value: Our study provides a first successful validation of PageFocus, a new JavaScript that determines whether participants in an online test switch to another web page while completing their test items. The script can be used both to detect test takers who look up the solution to items they would otherwise be unable to solve and to detect participants who switch back and forth between a survey and an unrelated application running in the background. PageFocus may be a promising new tool for improving data quality in online research.
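The abstract does not reproduce the script itself. As an illustration only, a page-switch detector in the spirit of PageFocus can be sketched with the browser's blur and focus events; all names below are hypothetical and this is not the authors' published implementation:

```javascript
// Hypothetical sketch of a PageFocus-style detector: count how often,
// and for how long, a participant leaves the test page. The clock is
// injectable so the logic can be exercised outside a browser.
function createFocusTracker(now = () => Date.now()) {
  const state = { departures: 0, awayMs: 0, leftAt: null };
  return {
    // Call when the page loses focus (e.g. the window 'blur' event).
    onBlur() {
      if (state.leftAt === null) {
        state.departures += 1;
        state.leftAt = now();
      }
    },
    // Call when the page regains focus (e.g. the window 'focus' event).
    onFocus() {
      if (state.leftAt !== null) {
        state.awayMs += now() - state.leftAt;
        state.leftAt = null;
      }
    },
    // Paradata to submit alongside the test answers.
    report() {
      return { departures: state.departures, awayMs: state.awayMs };
    },
  };
}

// In a browser, the tracker would be wired to the window's focus events:
if (typeof window !== 'undefined') {
  const tracker = createFocusTracker();
  window.addEventListener('blur', tracker.onBlur);
  window.addEventListener('focus', tracker.onFocus);
}
```

The resulting counts could then be submitted with the answers as paradata, flagging items during which the page lost focus.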
Access/Direct link Conference Homepage (Abstract) / (Full text)
Year of publication 2014
Bibliographic type Conferences, workshops, tutorials, presentations
Full text availability Non-existent
Web survey bibliography - General Online Research Conference (GOR) 2014 (29)
- Using Paradata to Predict and to Correct for Panel Attrition in a Web-based Panel Survey; 2014; Rossmann, J., Gummer, T.
- Targeting the bias – the impact of mass media attention on sample composition and representativeness...; 2014; Steinmetz, S., Oez, F., Tijdens, K. G.
- Offline Households in the German Internet Panel; 2014; Bossert, D., Holthausen, A., Krieger, U.
- Which fieldwork method for what target group? How to improve response rate and data quality; 2014; Wulfert, T., Woppmann, A.
- Exploring selection biases for developing countries - is the web a promising tool for data collection...; 2014; Tijdens, K. G., Steinmetz, S.
- Evaluating mixed-mode redesign strategies against benchmark surveys: the case of the Crime Victimization...; 2014; Klausch, L. T., Hox, J., Schouten, B.
- The quality of ego-centered social network data in web surveys: experiments with a visual elicitation...; 2014; Marcin, B., Matzat, U., Snijders, C.
- Switching the polarity of answer options within the questionnaire and using various numbering schemes...; 2014; Struminskaya, B., Schaurer, I., Bosnjak, M.
- Measuring the very long, fuzzy tail in the occupational distribution in web-surveys; 2014; Tijdens, K. G.
- Social Media and Surveys: Collaboration, Not Competition; 2014; Couper, M. P.
- Improving cheater detection in web-based randomized response using client-side paradata; 2014; Dombrowski, K., Becker, C.
- Interest Bias – An Extreme Form of Self-Selection?; 2014; Cape, P. J., Reichert, K.
- Online Qualitative Research – Personality Matters ; 2014; Tress, F., Doessel, C.
- Increasing data quality in online surveys 4.1; 2014; Hoeckel, H.
- Moving answers with the GyroScale: Using the mobile device’s gyroscope for market research purposes...; 2014; Luetters, H., Kraus, M., Westphal, D.
- Online Surveys as a Management Tool for Monitoring Multicultural Virtual Team Processes; 2014; Scovotti, C.
- How much is shorter CAWI questionnaire VS CATI questionnaire?; 2014; Bartoli, B.
- WEBDATANET: A Network on Web-based Data Collection, Methodological Challenges, Solutions, and Implementation...; 2014; Tijdens, K. G., Steinmetz, S., de Pedraza, P., Serrano, F.
- The Use of Paradata to Predict Future Cooperation in a Panel Study; 2014; Funke, F., Goeritz, A.
- Incentives on demand in a probability-based online panel: redemption and the choice between pay-out...; 2014; Schaurer, I., Struminskaya, B., Kaczmirek, L.
- The Effect of De-Contextualisation - A Comparison of Response Behaviour in Self-Administered Surveys; 2014; Wetzelhuetter, D.
- Responsive designed web surveys; 2014; Dreyer, M., Reich, M., Schwarzkopf, K.
- Extra incentives for extra efforts – impact of incentives for burdensome tasks within an incentivized...; 2014; Schreier, J. H., Biethahn, N., Drewes, F.
- Students First Choice – the influence of mobile mode on results; 2014; Maxl, E.
- Device Effects: How different screen sizes affect answer quality in online questionnaires; 2014; Fischer, B., Bernet, F.
- Moving towards mobile ready web panels; 2014; Wijnant, A., de Bruijne, M.
- Innovation for television research - online surveys via HbbTV. A new technology with fantastic opportunities...; 2014; Herche, J., Adler, M.
- Mixed-devices in a probability based panel survey. Effects on survey measurement error; 2014; Toepoel, V., Lugtig, P. J.
- Online mobile surveys in Italy: coverage and other methodological challenges; 2014; Poggio, T.