Web Survey Bibliography
Less than half a decade ago, online research still had to prove that its data quality could keep up with that of traditional methods. Now that this initial debate has cooled and Web 2.0 is emerging, the question naturally arises how online research can profit from new web phenomena and surpass the features of traditional computer-assisted interviews.
1. Can the average online user be surveyed with the new technology? Or do technical obstacles exist which either prevent him or her from filling out the questionnaire altogether or cause errors which distort the measurement?
2. How does the solution based on the new technology compare to the older one? Does it yield more or less information, and do the results correlate well enough to ensure sufficient test reliability?
3. How do online users feel about the new survey technologies? Especially in market research, where large numbers of users are routinely interviewed, participant-friendly surveys are desirable to prevent high dropout rates within the survey and to ensure high participation rates in the long run.
To answer these questions, we carry out a case study in which 300 online panel members are asked to rate various print and web stimulus material. 150 participants fill out a “traditional” HTML-based questionnaire (“web 1.0 group”). The other 150 participants fill out an HTML-based questionnaire which additionally features web 2.0 technologies to present and evaluate the stimuli (“web 2.0 group”). For example, the task of rating a print advertisement is supported by a magnifying glass. The task of rating web material is supported by an interactive “diary” tool (NLR web scan), which allows users to comment on websites while surfing them.
To answer all three questions, we measure the failure rate due to technical problems. Furthermore, we calculate the inter-correlation between both methods as a measure of inter-test reliability and rate the amount and quality of the collected data. Finally, we assess the reaction of panellists to the new technology: does it offer “joy of use” and help “traditional” online research keep up with the changing web landscape?
Based on these results, the acceptance and applicability of the web 2.0 technologies mentioned are evaluated, and a recommendation for commercial as well as scientific use is provided.
General online research (GOR) 2008 (abstract)