Web Survey Bibliography
Title PageFocus: A new JavaScript to detect cheating in online tests
Author Diedenhofen, B.; Musch, J.
Year 2014
Access date 03.08.2016
Abstract
The validity of unproctored tests is threatened by participants who decide to cheat when stakes are high. To address this problem, we developed PageFocus, a JavaScript that determines whether participants abandon a test page to pursue other activities, such as looking up the solution to a test item in a second browser window. As a validation and first application of the script, we conducted a study to explore the determinants of cheating in an unproctored online test.
Methods & Data: 541 members of an online panel participated in an experiment presenting 10 test items that could easily be looked up using a search engine (general knowledge questions) and 10 items for which the solution could not easily be looked up on the Internet (a logic test based on matrices). The incentive to cheat was varied experimentally in three steps: In a first group, participants received a monetary reward only when they performed well enough in the test; in a second group, participants were rewarded regardless of their performance on the test, and in a third group no incentive whatsoever was offered.
Results: The PageFocus script revealed that participants cheated more when performance-related incentives were offered. As expected, however, this effect was limited to those items whose solution could be looked up using a search engine. Cheating participants achieved higher scores.
Added Value: Our study provides a first successful validation of PageFocus, a new JavaScript that determines whether participants in an online test switch to another web page while completing their test items. The script can detect both test takers looking up solutions to items they would otherwise be unable to solve and participants switching back and forth between a survey and an unrelated application running in the background. PageFocus may be a promising new tool for improving data quality in online research.
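The mechanism the abstract describes, registering each time a respondent switches away from the test page and back, can be sketched with browser focus events. The following is an illustrative sketch only, not the authors' published PageFocus script; all function and field names here are invented, and the tracking logic is kept in a plain object so it does not depend on a browser environment.

```javascript
// Minimal sketch (not the authors' script): count how often and how long
// a respondent leaves the test page. In a browser, the window fires "blur"
// when the page loses focus and "focus" when it regains focus.

function createPageFocusTracker() {
  return {
    defocusCount: 0, // number of times the page was abandoned
    timeAwayMs: 0,   // total time spent away from the page
    _leftAt: null,   // timestamp of the most recent defocus, if any
    pageLeft(timestampMs) {     // call from a window "blur" handler
      this.defocusCount += 1;
      this._leftAt = timestampMs;
    },
    pageReturned(timestampMs) { // call from a window "focus" handler
      if (this._leftAt !== null) {
        this.timeAwayMs += timestampMs - this._leftAt;
        this._leftAt = null;
      }
    },
  };
}

// In a browser one would wire the tracker up roughly like this, and submit
// defocusCount and timeAwayMs along with the test responses:
//
// const tracker = createPageFocusTracker();
// window.addEventListener("blur",  () => tracker.pageLeft(Date.now()));
// window.addEventListener("focus", () => tracker.pageReturned(Date.now()));
```

A real implementation would also need to handle edge cases such as the page being closed while defocused and respondents with multiple monitors, which this sketch ignores.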
Access/Direct link Conference Homepage (Abstract) / (Full text)
Year of publication 2014
Bibliographic type Conferences, workshops, tutorials, presentations
Full text availability Non-existent
Web survey bibliography - 2014 (234)
- Detecting Insufficient Effort Responding with an Infrequency Scale: Evaluating Validity and Participant...; 2016; Huang, J. L.; Bowling, N. A.; Liu, Me.; Li, Yu.
- Evaluating Three Approaches to Statistically Adjust for Mode Effects; 2016; Kolenikov, S.; Kennedy, C.
- An Examination of Opposing Responses on Duplicated Multi-Mode Survey Responses; 2016; Djangali, A.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- Usability of the ACS Internet Instrument on Mobile Devices; 2015; Horwitz, R.
- Explorations in Non - Probability Sampling Using the Web; 2015; Brick, J. M.
- On Bias Adjustments for Web Surveys; 2015; Fan, L.; Lou, W.; Landsman, V.
- Are they willing to use the web? First results of a possible switch from PAPI to CAPI/CAWI in an establishment...; 2015; Ellguth, P.; Kohaut, S.
- Web panel surveys – a challenge for official statistics; 2015; Svensson, J.
- Estimation with Non-probability Surveys and the Question of External Validity; 2015; Dever, J. A.; Valliant, R. L.
- Measurement Properties of Web Surveys; 2015; Tourangeau, R.
- Improving Response to Household Surveys Using Mail Contact to Request Responses over the Internet: Results...; 2015; Dillman, D. A.
- The quality of data collected using online panels: a decade of research ; 2015; Callegaro, M.
- Sub-optimal Respondent Behavior and Data Quality in Online Surveys; 2015; Thomas, R. K.
- Methodology of the RAND Mid-Term 2014 Election Panel; 2015; Carman, K. G; Pollack, S.
- Designing Bonsai Surveys: The small but perfectly formed survey experience to meet the needs of the...; 2015; Puleston, J.
- Suggestions for international research using electronic surveys; 2015; e Silva, S. C.; Duarte, P.
- Recruiting Respondents for a Mobile Phone Panel: The Impact of Recruitment Question Wording on Cooperation...; 2015; Busse, B.; Fuchs, M.
- The effect of multiple reminders on response patterns in a Danish health survey; 2015; Christensen, A. I.; Ekholm, O.; Kristensen, P. L.; Larsen, F. B.; Vinding, A. L.; Gluemer, C.; Juel,...
- The quality of responses to grid questions as used in Web questionnaires (compared with paper questionnaires...; 2015; Dominguez, J. A.; de Rada, V. D.
- Identifying predictors of survey mode preference; 2015; Millar, M. M.; Olson, K.; Smyth, J. D.
- The Impact of Mixing Modes on Reliability in Longitudinal Studies; 2014; Cernat, A.
- Growing Beyond the Phone Tree; 2014; Hayzlett, J.
- A Comparison of Different Online Sampling Approaches for Generating National Samples; 2014; Heen, M. S. J., Lieberman, J. D., Miethe, T. D.
- Does Sequence Matter in Multimode Surveys: Results from an Experiment; 2014; Wagner, J., Arrieta, J., Guyer, H., Ofstedal, M. B.
- The Use of Cognitive Interviewing Methods to Evaluate Mode Effects in Survey Questions; 2014; Gray, M., Blake, M., Campanelli, P.
- A Mixed Methods Approach to Network Data Collection; 2014; Rice, E., Holloway, I. W., Barman-Adhikari, A., Fuentes, D., Brown, C. H., Palinkas, L. A.
- Influential Factors on Survey Outcomes: Length of Survey, Device Selection and External Elements; 2014; Ribeiro, E.
- The Effect of Mobile Web Survey Design on Screen Orientation Manipulation; 2014; Young, R.H.; Crawford, S. D.; Couper, M. P.; Nelson, T. F.
- Investigating Response Quality in Mobile and Desktop Surveys: A Comparison of Radio Buttons, Visual...; 2014; Toepoel, V.; Funke, F.
- Do online access panels really need to allow and adapt surveys to mobile devices? ; 2014; Revilla, M.; Toninelli, D.; Ochoa, C.; Loewe, G.
- Why you need to make your surveys mobile friendly NOW; 2014; Lorch, J.; Mitchell, N.
- Assessing the Impact Device Choice Has on Web Survey Data Collection ; 2014; Hupp, A.; Schroeder, H. M.; Piskorowski, A.D.
- Understanding Mobility: Consent and Capture of Geolocation Data in Web Surveys; 2014; Crawford, S. D.; McClain, C.; Young, R.H.; Nelson, T. F.
- Swipe, Snap & Chat: Mobile Survey Data Collection Using Touch Question Types and Mobile OS Features ; 2014; Buskirk, T. D.; Michaud, J.; Saunders, T.
- Statistical Approaches to Analyze Self-Reported Susceptibility to Driver Distraction; 2014; Chen, H-Y. W.; Donmez, B.; Ko, Y-D.
- Using Web Panels for Official Statistics; 2014; Bethlehem, J.
- The problem of non-response in population surveys on the topic of HIV and sexuality: a comparative study...; 2014; Wallander, L.; H.; Mannheimer, L. N.; Oestergren, P. O.; Plantin, L.; Tikkanen, R. H.
- Does the Length of Fielding Period Matter? Examining Response Scores of Early Versus Late Responders; 2014; Dyer Yount, N.; Lewis, T.; Lee, K.; Sigman, R.
- FocusVision 2014 Annual MR Technology Report; 2014; Macer, T., Wilson, S.
- When it comes to mobile respondent experience and data quality, survey design matters; 2014; Mitchell, N.
- The Changing Landscape of Technology and its Effect on Online Survey Data Collection; 2014; Mitchell, N.
- Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition; 2014; Dillman, D. A., Smyth, J. D., Christian, L. M.
- The survey playbook: how to create the perfect survey. (Vol.1); 2014; Champagne, M. V.
- Do your own online surveys. DYI and self serve market research; 2014; Cary, N.
- The Influence of Answer Box Format on Response Behavior on List-Style Open-Ended Questions; 2014; Keusch, F.
- Nonprobability Web Surveys to Measure Sexual Behaviors and Attitudes in the General Population: A Comparison...; 2014; Erens, B.; Burkill, S.; Couper, M. P.; C., Clifton, S., Tanton, C., Phelps, A., Datta, J., Mercer,...
- Luteal-phase support in assisted reproduction treatment: real-life practices reported worldwide by an...; 2014; Vaisbuch, E., de Ziegler, D., Leong, M., Shoham, Z., Weissman, A.
- Facebook, Twitter, & Qr Codes: An Exploratory Trial Examining The Feasibility Of Social Media Mechanisms...; 2014; Gu, L. L.
- Time-dependent variation in the responses to the web-based ISAAC questionnaire; 2014; Yoshida, K., Sasaki, M., Odajima, H., Itazawa, T., Hashimoto, K., Furukawa, M., Adachi, Y.