Abstract
This study introduces a new randomized field experiment exploring whether offering a decoy charity donation incentive alongside a monetary reward can increase response rates in an online survey about coronavirus fears. The study used a two-stage approach, starting with a preliminary survey to investigate participant attitudes toward different types of donations. Subsequently, an experiment was conducted wherein a less desirable £2 donation (the decoy) was introduced as an alternative to a £2 Amazon voucher (the target) within the choice set. The study sample consisted of 431 university students. They were split into three groups: a control group with a standard £2 Amazon voucher incentive (216 participants), a decoy group with the target shown first (108 participants), and a decoy group with the decoy shown first (107 participants). We found significantly higher survey completion rates in the decoy than in the control condition (82.3% vs. 74.5%). Notably, an order effect was observed: presenting the target before the decoy led to a higher completion rate (89.8%) than presenting the decoy first (74.8%). Importantly, the inclusion of the decoy incentive did not introduce any response bias. This study offers a proof of principle that incorporating a decoy charity donation incentive into the choice set can have a positive impact on survey participation without adversely affecting response behaviour. It demonstrates the potential of such incentives to encourage participants to complete online surveys, even when a small monetary reward is offered.
Citation: Stoffel ST, Chaki B, Vlaev I (2024) Testing a decoy donation incentive to improve online survey participation: Evidence from a field experiment. PLoS ONE 19(2): e0299711. https://doi.org/10.1371/journal.pone.0299711
Editor: Andrzej Kloczkowski, The Ohio State University College of Medicine, UNITED STATES
Received: October 9, 2023; Accepted: February 13, 2024; Published: February 29, 2024
Copyright: © 2024 Stoffel et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All data files and materials are publicly available via the Open Science Framework at https://osf.io/qundv/?view_only=9378ac9e1d91409d8fe554da44e2e020.
Funding: The author(s) received no specific funding for this work.
Competing interests: The authors have declared that no competing interests exist.
Introduction
Health surveys are vital for grasping population health and intervention efficacy [1]. Recent research highlights declining online survey responses, which impact research validity [2]. Enhancing response rates is thus a priority [3]. Various factors affect online survey response rates, including questionnaire length, incentive, structure, persuasion, difficulty, communication, participant interests, and sender identity [3–8]. Recent systematic reviews have shown that monetary incentives can increase involvement in surveys [9,10]. Small monetary incentives can increase participation [10], especially prepaid incentives because of psychological obligation [11,12]. The reciprocity principle explains participation as a result of perceiving incentives as reciprocal gifts [13], while economic exchange theory explains it with anticipated rewards [14].
While non-monetary incentives like charitable donations can boost participation [15], people generally favour monetary rewards over non-monetary ones [16–20]. Despite some studies presuming charity’s suitability [21–23], it hasn’t consistently improved response rates [19,24–26].
A recent study demonstrated that offering a decoy survey with less convenient questions and delayed remuneration increases survey participation [27]. The decoy effect, also known as the asymmetric decoy effect or attraction effect, enhances preferences for an option (the target) by introducing a less attractive alternative (the decoy) [28], representing a well-studied context-dependent preference enhancement [29]. This effect operates by making the target appear more favourable in comparison to the inferior decoy, thus increasing the probability of selecting the more attractive target [28]. While the decoy effect has been extensively explored in various contexts [30,31], research on its role in survey participation remains limited [27]. This study extends that research by examining decoy incentives, specifically their impact on survey completion when a donation incentive is provided. It is hypothesized that including the decoy incentive increases the attractiveness, and therefore the likelihood of selection, of the standard incentive (the target).
Methodology
Study design
The study design followed a previous study by Stoffel and colleagues [27] and contained two stages. In the first stage (preliminary study), participants were recruited for a brief survey to collect email addresses and to assess their attitudes towards various incentives (e.g., Amazon vouchers and donations to different charities). In the second stage (experiment), the participants who provided their email addresses were sent the invitation to fill out the main questionnaire. The data collection for both the preliminary and final survey was conducted in July 2023. Study participants were recruited from a UK-based university via social media platforms (WhatsApp, Facebook, and WeChat) as well as official email groups of several university departments. In line with the previous study, participants were eligible if they were an undergraduate, postgraduate, or Ph.D. student at the University of Warwick. We further excluded three study participants from the experiment because they were considerably older than the rest of the sample.
Preliminary survey
The preliminary survey questionnaire was intended to recruit students from the University of Warwick and to assess their eligibility and preferences for different incentives (see S1 Text in the supplementary file). The invitation stated that study participants were needed for a survey on the fear of the coronavirus. Interested individuals were invited to click on the survey link for registration. The information sheet did not disclose the experimental nature of the study. After providing consent to participate, interested participants were asked to provide an email address to receive the main survey and to respond to some demographic questions. Additionally, they were asked to indicate their most preferred and least favoured incentive as a participation fee. The specific incentives assessed included a £2 Amazon voucher as well as donation of the £2 participation fee to the University of Warwick or to one of four charities: (1) the Donkey Sanctuary, (2) the Horses and Ponies Protection Association, (3) Veterinarian for Animal Welfare Zimbabwe (UK), and (4) the World Association for Transport Animal Welfare and Studies. The four charities were chosen as they allowed individual donations of £2.
Experiment
Eligible participants who provided their email addresses in the preliminary survey were individually randomised into the control and decoy conditions. Additionally, to explore the order effect, respondents in the decoy condition were further randomized into two sub-groups at a 1:1 ratio: 1) decoy shown first, and 2) target shown first. Invitations for the main survey were sent a week after the preliminary survey and no reminders were employed. The invitations asked participants to complete the brief questionnaire within 7 days. Survey completers of the experiment received their incentives the following week.
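The two-level randomisation described above can be sketched as follows. This is an illustrative reconstruction only: the paper does not report the exact allocation procedure, software, or seed, so the function name and seeding are assumptions.

```python
import random

def randomise(participant_ids, seed=2023):
    """Sketch of individual 1:1 randomisation into control vs. decoy,
    with the decoy arm further split 1:1 by presentation order.
    (Illustrative; the actual procedure used in the study is not reported.)"""
    rng = random.Random(seed)  # fixed seed only for reproducibility of the sketch
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    control, decoy = ids[:half], ids[half:]
    mid = len(decoy) // 2
    return {
        "control": control,
        "decoy_target_first": decoy[:mid],
        "decoy_decoy_first": decoy[mid:],
    }

# Usage: with 431 invitees, arms are mutually exclusive and jointly exhaustive
groups = randomise(range(431))
assert sum(len(g) for g in groups.values()) == 431
```

With an odd total, integer division leaves the arms within one participant of each other, which mirrors the slightly uneven group sizes (216/108/107) reported in the study.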
Questionnaire in the control condition
Participants in the control condition received a standard survey invitation email (see S1 Fig). They were informed of a £2 Amazon voucher incentive upon survey completion. The control questionnaire comprised demographic questions on age, gender, ethnicity, and education level, and the eight closed-ended questions of the validated Fear of the Coronavirus Questionnaire (FCQ, [32]). The survey concluded with an attention check question and a debrief question about reasons for participation (see S2 Text in the supplementary file). To measure survey completion, none of the questions were mandatory.
Questionnaire in the decoy condition
Participants in the decoy condition received an email containing two distinct URL links: one for the standard £2 Amazon voucher incentive (target) and another for the decoy incentive. The order of presentation for these links was randomized. Both options were displayed in tabular format within the email to facilitate easy comparison (see S2 and S3 Figs). The survey for the decoy condition was identical to the one in the control condition, with an extra debrief question about the preferred incentive: Amazon voucher, donation, or no preference (see S3 Text in the supplementary file).
Outcomes
The primary outcome was the proportion of individuals selecting the target incentive and completing the survey across the two experimental conditions. Consistent with prior research [26,32], the single participant who opted for the decoy and completed the questionnaire was categorized as not having selected the target. Additionally, we investigated whether the decoy introduction influenced data quality by investigating the attention question and the FCQ scores. Non-response bias was assessed by comparing the socio-demographic characteristics of respondents who completed the questionnaire across the two experimental conditions [33].
Preregistration and ethics approval
Ethics approval for the present study was received from the Humanities and Social Science Research Ethics Committee (HSSREC) and all data files and materials are publicly available via the Open Science Framework at https://osf.io/qundv/. Written informed consent was obtained from all study participants prior to their involvement in the research using tick boxes.
Statistical analyses
The required sample size for the experiment was determined before data collection, assuming completion rates of around 40% in the control and 50% in the decoy condition [27]. With a minimum of 180 participants per condition, the experiment had 80% power at an alpha of 0.05 to detect such a difference [34]. The primary outcome was assessed with a Chi-square test of independence and multivariable logistic regressions adjusting for demographics. FCQ scores in the control and decoy groups were compared using a Student's t-test. Statistical analyses were conducted using Stata/IC version 16.0 (StataCorp LP, College Station, TX).
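A power calculation of this kind can be reconstructed with Cohen's arcsine effect size h for two proportions [34]. The sketch below is illustrative only: the paper does not state which formula or software produced the figure of 180 per condition, and the exact result depends on the approximation chosen (this one yields roughly 194 per group).

```python
import math

def n_per_group(p1, p2, alpha_z=1.959964, power_z=0.841621):
    """Approximate per-group sample size for detecting a difference between
    two proportions via Cohen's h (normal approximation, two-sided 5% alpha,
    80% power by default). Illustrative reconstruction, not the study's
    actual calculation."""
    h = 2 * (math.asin(math.sqrt(p2)) - math.asin(math.sqrt(p1)))
    return ((alpha_z + power_z) / abs(h)) ** 2

# 40% control vs. 50% decoy completion, as assumed in the paper
print(math.ceil(n_per_group(0.40, 0.50)))
```

Under this approximation the requirement comes out slightly above the 180 per condition reported in the paper, which is consistent with a different (e.g., one-sided or software-specific) formula having been used.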
Results
Preliminary survey
Study sample. In total, 480 participants started the preliminary survey. Of these, 11 did not complete it, 35 were ineligible, and 2 chose not to share their email addresses, resulting in a sample size of 432 (refer to Fig 1). A majority were female (78%, 337/432), aged 22–25 (47.9%, 207/432), of mixed ethnicity (30.3%, 131/432), and had some university education without a degree (46.8%, 202/432; see S1 Table).
Attitude towards the incentives. Fig 2 illustrates that 95.4% of participants (412/432) favoured the Amazon voucher as their preferred remuneration. Conversely, the Horses and Ponies Protection Association was the least favoured incentive, disliked by 23.4% (101/432), followed by Veterinarian for Animal Welfare Zimbabwe (UK) (22.4%; 97/432) and the University of Warwick (19.2%; 82/432). For administrative simplicity and compliance with university regulations, the experiment used the University of Warwick charity.
Experiment
Study sample. Of 432 emails sent to participants from the preliminary survey, 1 bounced back, resulting in a sample of 431 (216 in control, 215 in decoy). Sociodemographic variables aligned with the preliminary survey (refer to S2 Table).
Effect on survey participation. As shown in Fig 3, more participants completed the survey with the target incentive in the decoy condition than in the control condition (82.3% vs. 74.5%, χ2(1, N = 431) = 3.86, p = 0.049). The binary logistic regressions in Table 1 show that the effect was borderline in the unadjusted model (OR 1.59, 95% CI: 0.999–2.534, p = 0.050) but significant in the fully adjusted model (aOR 1.98, 95% CI: 1.164–3.356, p = 0.012).
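The reported chi-square statistic can be checked against the 2×2 completion table. The counts below are reconstructed from the reported percentages (82.3% of 215 decoy invitees ≈ 177 completers; 74.5% of 216 controls ≈ 161 completers); the sketch applies the standard Pearson formula without continuity correction.

```python
import math

# Completion counts reconstructed from the reported percentages
completed = {"control": 161, "decoy": 177}
invited = {"control": 216, "decoy": 215}

a = completed["control"]
b = invited["control"] - completed["control"]  # control non-completers
c = completed["decoy"]
d = invited["decoy"] - completed["decoy"]      # decoy non-completers
n = a + b + c + d

# Pearson chi-square for a 2x2 table (no continuity correction)
chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
# Upper-tail probability for df = 1: P(X > x) = erfc(sqrt(x / 2))
p = math.erfc(math.sqrt(chi2 / 2))

print(round(chi2, 2), round(p, 3))  # 3.86 0.049
```

The result reproduces the paper's χ2(1, N = 431) = 3.86, p = 0.049, confirming the reconstructed counts are consistent with the reported percentages.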
Order effect. Fig 3 indicates that survey completion significantly increased when the target incentive was presented before the decoy incentive (89.8% vs. 74.5%, aOR 4.17; 95% CI: 1.962–8.883, p<0.001, refer to Table 2). However, displaying the decoy before the target incentive showed no decoy effect (74.8% vs. 74.5%, aOR 1.10; 95% CI: 0.599–2.049, p = 0.745).
Effect on FCQ scale and non-response bias. The analysis of sociodemographic variables and FCQ responses revealed no differences among those who completed the survey in the two conditions (see S3 Table and S4 Fig). Mean FCQ scores were comparable between the control (23.73) and decoy (23.65) conditions (t(336) = 0.06, p = 0.565; see S3 Table). Comparisons between those who completed the survey and those who did not showed that mainly individuals aged 26–30, women, those of Black, Black British, or Arab ethnicity, and those with a Bachelor's degree completed the survey; however, the decoy incentive did not appear to create this non-response bias, as the same pattern was observed within both experimental conditions (see S4–S6 Tables).
Attention check and debrief questions. All survey completers correctly answered the attention question, and a majority (55.9%, 189/338) stated the incentive as their primary motivation for participation. Additionally, 22.2% (75/338) participated to support research, while 21.9% (74/338) were interested in the survey topic. Of those motivated by incentives, a majority (89.4%, 169/189) preferred the £2 Amazon vouchers, while the rest (10.6%, 20/189) were indifferent to the incentives.
Discussion
This study presents a proof of principle that pairing a decoy charity donation incentive with a monetary reward can increase survey participation. In line with previous research, the preliminary survey showed that most study participants prefer a monetary incentive over charitable donations [16,25]. The experiment revealed a significant decoy effect: including a donation option in the choice set increased the likelihood that individuals chose and completed the survey with the target incentive. Similar to Stoffel and colleagues [27], we found a distinct order effect, in that only presenting the monetary incentive first, followed by the decoy donation, heightened survey completion rates compared to the reversed presentation. This is in line with another recent study that found the decoy effect is influenced by the framing of the decoy alternative [35]. In our experiment, the order effect exhibits a primacy bias, as individuals tend to consider the option listed first rather than last [36]. Studies have shown that simply changing the order of the alternatives can influence decision-making [37].
Furthermore, as in previous research, the inclusion of the decoy incentive introduced no response bias and had no impact on survey responses [27]. Nonetheless, the baseline completion rates in the two experimental conditions were considerably higher in this study than in the previous one. Additionally, the magnitude of the decoy effect observed here was notably smaller. This divergence could stem from the nature of the decoy alternative: we varied the incentive itself, and our decoy was dominated in only one attribute, compared with two attributes in Stoffel et al. [27]. Previous research has shown that the decoy effect depends on how strongly the decoy is dominated by the target [38]. Additionally, our decoy incentive was not ranked least desirable, leaving room for a stronger effect had the least desirable option been employed.
We found that mainly individuals aged 26–30, women, those of Black, Black British, or Arab ethnicity, and those with a Bachelor's degree completed the survey with the target incentive. While studies suggest that the decoy effect emerges at a young age and that testosterone is associated with inconsistent decisions [39,40], our study does not indicate that male participants were more likely to react to the decoy. Our study has limitations that warrant further investigation. First, the analytical sample comprised university students, who may be more sensitive to incentives, limiting the generalizability of the findings to the wider public. Additionally, while donations to one's own institution have been found to be an ineffective incentive [23], it is plausible that different welfare-oriented charities (e.g., medical research) could generate varied decoy effects. Moreover, the study involved a degree of deception, as participants were told the study concerned fear of the coronavirus. Future research could explore the decoy effect without deceptive practices and without a two-stage design, thereby reducing the risk of priming and lowering the baseline completion rates.
Conclusion
This study is the first to explore how offering a decoy charity donation incentive can improve the response rate of a survey that provides a monetary incentive. Unlike prior research comparing charity donations and monetary rewards, our study demonstrates how charity donations can effectively serve as decoys to boost survey engagement. Beyond the decoy effect, our research highlights an order effect: presenting the monetary reward before the decoy incentive significantly increases survey completion rates. Importantly, the decoy incentive does not seem to introduce bias or alter responses. While the decoy effect may be modest, this research underscores the effectiveness of a decoy donation in encouraging survey participation, even with minimal financial incentives. This study furthers understanding of respondent behaviour in incentive-driven surveys and invites more nuanced testing of decoy effects.
Supporting information
S1 Fig. Invitation email for the control condition.
https://doi.org/10.1371/journal.pone.0299711.s001
(DOCX)
S2 Fig. Invitation email for the decoy condition-target shown first.
https://doi.org/10.1371/journal.pone.0299711.s002
(DOCX)
S3 Fig. Invitation email for the decoy condition-decoy shown first.
https://doi.org/10.1371/journal.pone.0299711.s003
(DOCX)
S4 Fig. Distribution of FCQ score across the two experimental conditions (N = 338).
https://doi.org/10.1371/journal.pone.0299711.s004
(DOCX)
S1 Table. Description of the study sample in the preliminary survey (N = 432).
https://doi.org/10.1371/journal.pone.0299711.s005
(DOCX)
S2 Table. Description of the study sample in the experiment (main survey) (N = 431).
https://doi.org/10.1371/journal.pone.0299711.s006
(DOCX)
S3 Table. Characteristics of individuals choosing the survey with the target incentive (N = 338).
https://doi.org/10.1371/journal.pone.0299711.s007
(DOCX)
S4 Table. Overall non-response bias (N = 431).
https://doi.org/10.1371/journal.pone.0299711.s008
(DOCX)
S5 Table. Non-response bias in control condition (N = 216).
https://doi.org/10.1371/journal.pone.0299711.s009
(DOCX)
S6 Table. Non-response bias in decoy condition (N = 215).
https://doi.org/10.1371/journal.pone.0299711.s010
(DOCX)
S2 Text. Survey used in main experiment (control condition).
https://doi.org/10.1371/journal.pone.0299711.s012
(DOCX)
S3 Text. Survey used in main experiment (decoy condition).
https://doi.org/10.1371/journal.pone.0299711.s013
(DOCX)
References
- 1. Brtnikova M., Crane L. A., Allison M. A., Hurley L. P., Beaty B. L., & Kempe A. (2018). A method for achieving high response rates in national surveys of U.S. primary care physicians. PloS one, 13(8), e0202755. pmid:30138406
- 2. Glass D. C., Kelsall H. L., Slegers C., Forbes A. B., Loff B., Zion D., & Fritschi L. (2015). A telephone survey of factors affecting willingness to participate in health research surveys. BMC public health, 15, 1017. pmid:26438148
- 3. Schwarz G. (2013). Response Rate in European Business Tendency Surveys. Austrian Institute of Economic Research, 2(1), 1–54.
- 4. Becker R. (2021). Have you ever seen the rain? The causal impact of the weather situation and the season on survey participation in a multi-wave panel study. Survey Research Methods, 15(1), 27–41.
- 5. Dillman D. A. (2007). Mail and internet surveys. The Tailored design method. New York: Wiley.
- 6. Edwards P. J., Roberts I., Clarke M. J., Diguiseppi C., Wentz R., Kwan I., Cooper R., Felix L. M., & Pratap S. (2009). Methods to increase response to postal and electronic questionnaires. The Cochrane database of systematic reviews, 2009(3), MR000008. pmid:19588449
- 7. Saleh A., & Bista K. (2017). Examining Factors Impacting Online Survey Response Rates in Educational Research: Perceptions of Graduate Students. Journal of MultiDisciplinary Evaluation, 13(29), 63–74.
- 8. Vangeest J. B., Johnson T. P., & Welch V. L. (2007). Methodologies for improving response rates in surveys of physicians: A systematic review. Evaluation & the Health Professions, 30, 303–321. pmid:17986667
- 9. Abdelazeem B., Hamdallah A., Rizk M. A., Abbas K. S., El-Shahat N. A., Manasrah N.,… & Eltobgy M. (2023). Does usage of monetary incentive impact the involvement in surveys? A systematic review and meta-analysis of 46 randomized controlled trials. PLoS One, 18(1), e0279128. pmid:36649255
- 10. Jia P., Furuya-Kanamori L., Qin Z. S., Jia P. Y., & Xu C. (2021). Association between response rates and monetary incentives in sample study: a systematic review and meta-analysis. Postgraduate medical journal, 97(1150), 501–510. pmid:32848082
- 11. Bosnjak M., & Tuten T. L. (2003). Prepaid and promised incentives in web surveys: An experiment. Social Science Computer Review, 21(2), 208–217.
- 12. Jobber D., Saunders J., & Mitchell V. W. (2004). Prepaid monetary incentive effects on mail survey response. Journal of Business Research, 57(1), 21–25.
- 13. Groves R. M., Cialdini R. B., & Couper M. P. (1992). Understanding the decision to participate in a survey. Public Opinion Quarterly, 56, 475–495.
- 14. Ryu E., Couper M. P., & Marans R. W. (2005). Survey incentives: cash vs. in-kind; face-to-face vs. mail; response rate vs. nonresponse error. International Journal of Public Opinion Research, 18, 89–106.
- 15. Church A. H. (1993). Estimating the effect of incentives on mail survey response rates: A meta-analysis. Public opinion quarterly, 57(1), 62–79.
- 16. Furse D. H., & Stewart D. W. (1982). Monetary Incentives versus Promised Contribution to Charity: New Evidence on Mail Survey Response. Journal of Marketing Research, 19(3), 375–380.
- 17. Hubbard R., & Little E.L. (1998). Promised Contributions to charity and mail survey responses: replication with extension. Public Opinion Quarterly, 52, 223–230.
- 18. Skinner S.J., Ferrell O.C. & Pride W.M. (1984) Personal and nonpersonal incentives in mail surveys: immediate versus delayed inducements. Journal of the Academy of Marketing Science, 12, 1, pp. 106–114.
- 19. Warriner K., Goyder J., Gjertsen H., Hohner P., & McSpurren K. (1996). Charities, no; lotteries, no; cash, yes. main effects and interactions in a Canadian incentives experiment. Public Opinion Quarterly, 60, 542–562.
- 20. Deehan A., Templeton M., Taylor C., Drummond C. & Strang J. (1997) The effect of cash and other financial inducements on the response rate of general practitioners in a national postal survey. British Journal of General Practice, 47, pp. 87–90.
- 21. Robertson D. H., & Bellenger D. N. (1978). A new method of increasing mail survey responses: Contributions to charity. Journal of Marketing Research, 15(4), 632–633.
- 22. Gendall P. & Healey B. (2008) Alternatives to prepaid monetary incentives in mail surveys. International journal of Public Opinion Research, 20, 4, pp. 517–527.
- 23. Gendall P., & Healey B. (2010). Effect of a Promised Donation to Charity on Survey Response. International Journal of Market Research, 52(5), 565–577.
- 24. Gattellari M., & Ward J. E. (2001). Will donations to their learned college increase surgeons’ participation in surveys? A randomized trial. Journal of clinical epidemiology, 54(6), 645–649. pmid:11377126
- 25. Göritz A. S., & Neumann B. P. (2016). The longitudinal effects of incentives on response quantity in online panels. Translational Issues in Psychological Science, 2(2), 163–173.
- 26. Conn K. M., Mo C. H., & Sellers L. M. (2019). When Less Is More in Boosting Survey Response Rates. Social Science Quarterly, 100(4), 1445–1458.
- 27. Stoffel S. T., Sun Y., Hirst Y., von Wagner C., & Vlaev I. (2023). Testing the decoy effect to improve online survey participation: Evidence from a field experiment. Journal of Behavioral and Experimental Economics, 102103. https://doi.org/10.1016/j.socec.2023.102103.
- 28. Huber J., Payne J. W., & Puto C. (1982). Adding asymmetrically dominated alternatives: Violations of regularity and the similarity hypothesis. Journal of Consumer Research, 9(1), 90–98.
- 29. Vlaev I., Chater N., Stewart N., & Brown G. D. (2011). Does the brain calculate value? Trends in Cognitive Sciences, 15(11), 546–554.
- 30. Milberg S. J., Silva M., Celedon P., & Sinn F. (2014). Synthesis of attraction effect research: Practical market implications?. European Journal of Marketing, 48(7/8), 1413–1430.
- 31. Yang S., & Lynn M. (2014). More evidence challenging the robustness and usefulness of the attraction effect. Journal of Marketing Research, 51(4), 508–513.
- 32. Mertens G., Gerritsen L., Duijndam S., Salemink E., & Engelhard I. M. (2020). Fear of the coronavirus (COVID-19): Predictors in an online study conducted in March 2020. Journal of Anxiety Disorders, 74, 102258. pmid:32569905
- 33. Rybak A. (2023). Survey mode and nonresponse bias: A meta-analysis based on the data from the international social survey programme waves 1996–2018 and the European social survey rounds 1 to 9. Plos one, 18(3), e0283092. pmid:36928697
- 34. Cohen J. (1988). Statistical power analysis for the behavioural sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
- 35. Di Crosta A., Marin A., Palumbo R., Ceccato I., La Malva P., Gatti M.,… & Di Domenico A. (2023). Changing Decisions: The Interaction between Framing and Decoy Effects. Behavioral Sciences, 13(9), 755. pmid:37754033
- 36. Lohse G. L. (1997). Consumer eye movement patterns on yellow pages advertising. Journal of Advertising, 26(1), 61–73.
- 37. Bansback N., Li L. C., Lynd L., & Bryan S. (2014). Exploiting order effects to improve the quality of decisions. Patient Education and Counseling, 96(2), 197–203. pmid:24961445
- 38. Stoffel S. T., Yang J., Vlaev I., & Wagner C. (2019). Testing the decoy effect to increase interest in colorectal cancer screening. PloS One, 14(3).
- 39. Zhen S., & Yu R. (2016). The development of the asymmetrically dominated decoy effect in young children. Scientific reports, 6(1), 22678. pmid:26935899
- 40. Liao J., Zhang Y., Li Y., Li H., Zilioli S., & Wu Y. (2018). Exogenous testosterone increases decoy effect in healthy males. Frontiers in psychology, 9, 2188. pmid:30483195