Who shares fake news on social media? Evidence from vaccines and infertility claims in sub-Saharan Africa

  • Kerstin Unfried,

    Contributed equally to this work with: Kerstin Unfried, Jan Priebe

    Roles Conceptualization, Formal analysis, Investigation, Methodology, Project administration, Visualization, Writing – original draft

    kerstin.unfried@bnitm.de

    Affiliation Health Economics Research Group, Bernhard Nocht Institute for Tropical Medicine (BNITM), Hamburg, Germany

  • Jan Priebe

    Contributed equally to this work with: Kerstin Unfried, Jan Priebe

    Roles Conceptualization, Project administration, Supervision, Writing – review & editing

    Affiliation BNITM & Hamburg Center for Health Economics (HCHE), Hamburg, Germany

Abstract

The widespread dissemination of misinformation on social media is a serious threat to global health. To a large extent, it is still unclear who actually shares health-related misinformation deliberately and accidentally. We conducted a large-scale online survey among 5,307 Facebook users in six sub-Saharan African countries, in which we collected information on sharing of fake news and truth discernment. We estimate the magnitude and determinants of deliberate and accidental sharing of misinformation related to three vaccines (HPV, polio, and COVID-19). In an OLS framework we relate the actual sharing of fake news to several socioeconomic characteristics (age, gender, employment status, education), social media consumption, personality factors and vaccine-related characteristics while controlling for country and vaccine-specific effects. We first show that actual sharing rates of fake news articles are substantially higher than those reported from developed countries and that most of the sharing occurs accidentally. Second, we reveal that the determinants of deliberate vs. accidental sharing differ. While deliberate sharing is related to being older and risk-loving, accidental sharing is associated with being older, male, and high levels of trust in institutions. Lastly, we demonstrate that the determinants of sharing differ by the adopted measure (intentions vs. actual sharing) which underscores the limitations of commonly used intention-based measures to derive insights about actual fake news sharing behaviour.

1 Introduction

Fake news is a worldwide concern carrying grave political and social consequences [1, 2]. Regarding health, fake news has been linked to worse mental health outcomes, the misallocation of resources, lower compliance with health authorities and regulations, and increases in vaccine hesitancy [3–8]. The substantial influence of misinformation on health beliefs and behavior, and its negative consequences, became strikingly visible during the COVID-19 pandemic. For instance, exposure to misinformation about the protective role of smoking and alcohol consumption against COVID-19 infections has been associated with increased smoking and alcohol use among current consumers. In contrast, people who believed that smoking is a risk factor for a severe COVID-19 infection showed greater intentions to quit smoking [9, 10]. The extent to which people act in line with misinformation or follow public health recommendations depends, among other things, on their level of trust in institutions such as the government and national health authorities [11, 12].

The spread of fake news has often been attributed to the rise of social media platforms which allow for the rapid dissemination of false or misleading information to a broad audience [13–15]. Bearing in mind the continuing rise in social media usage globally and the regular development of innovative and sophisticated techniques to manipulate digital content, a better understanding of the cognitive and behavioral drivers behind the distribution of fake news on social media appears crucial.

In this paper, we conducted online surveys in six sub-Saharan African countries (Ghana, Kenya, Nigeria, South Africa, Tanzania, Uganda) to analyse the individual-level determinants of spreading fake news. More specifically, we examine (deliberate and accidental) actual sharing behavior of vaccine-related fake news—unsubstantiated infertility claims of vaccines for polio, HPV, and COVID-19—on Facebook. As part of the surveys, participants read one fake news article, were asked about their willingness to share it on Facebook, and were ultimately able to share the article. The surveys were conducted in early 2023 and gathered information from 5,307 participants.

Our analysis rests on established conceptual frameworks that distinguish between the accidental and deliberate sharing of fake news. Accidental sharing refers to the distribution of false information due to (i) a person’s inability to discern truth (e.g. lacking knowledge or cognitive skills to distinguish truthful from misleading information) or (ii) a lack of motivation to do so (e.g. inattention or psychological biases such as confirmatory bias or motivated reasoning) [16–18]. In contrast, deliberate sharing refers to the purposeful distribution of misinformation. Existing explanations for deliberate sharing imply that other preferences outweigh any preference for truth telling [19, 20].

Only a few studies have investigated what determines accidental vs. deliberate sharing of fake news [21–24]. The evidence coming from these studies is far from conclusive, suggesting that the drivers of deliberate and accidental sharing of fake news are context- and study-specific. For instance, some studies find that most sharing of fake news can be attributed to accidental sharing [23], while others stress the role of psychological reward mechanisms [25] or the interest in the information as well as claim veracity [26]. Likewise, while some studies find that older people are more likely to share political fake news [27, 28], others find that younger people might do so [21, 23]. There are also contradictory findings regarding the influence of personality traits and demographic variables such as gender and education [21, 24, 25]. Similar context-specific effects are reported from the related literature on individuals’ ability to detect misinformation. While there is consensus on some characteristics such as cognitive ability and literacy, for others the empirical evidence is mixed (e.g. age, gender, and income) [18, 23, 29–32].

A major drawback of the current literature is that almost all studies rely on self-reported measures of sharing behavior (intentions) that are notoriously difficult to interpret with respect to actual behavior due to differences related to self-efficacy, social cognition, risk perceptions, task switching, memory capacity, among others [33]. In fact, we are only aware of one study that distinguishes between accidental and deliberate sharing of fake news and utilises actual sharing information, studying the U.S. political context [25].

In this context our study differs from previous research on the role and determinants of accidental and deliberate sharing of fake news in four aspects. First, the existing literature is limited to developed countries and political fake news topics. In contrast, our study examines the sharing of vaccine-related fake news in six low- and middle-income countries (LMICs) in SSA. Considering that (i) the majority of the world’s population lives in LMICs, (ii) the rise in social media usage is particularly pronounced in LMICs, (iii) social media represents a major source of information in LMICs due to lower levels of trust in national governments, (iv) susceptibility to fall for fake news is particularly high in LMICs [34], and (v) vaccine hesitancy, often originating from the consumption of fake news, is among the top ten global health threats [15], our study investigates a pivotal setting that has been largely disregarded. Second, we complement previous research by integrating a more comprehensive set of variables into our analyses. In addition to standard socio-economic controls and personality traits, we in particular examine the role of institutional trust and risk preferences. Third, we are able to investigate actual sharing behavior with click data. In this regard our study provides direct insights into the implications of measuring sharing intentions vs. actual sharing behavior. Bearing in mind that almost all studies are confined to intention-based measures, we shed light on the extent and determinants of the related measurement bias. Fourth, by collecting data from six countries and on misinformation related to three distinct vaccines, our study allows us to compare results across contexts.

2 Methods

2.1 Data collection

For this study we collected primary data via online surveys in six SSA countries (Ghana, Kenya, Nigeria, South Africa, Tanzania, and Uganda). The surveys were hosted on the UniPark platform and implemented during February and March 2023. In the following we refer to the surveys as African Health Surveys (AHS). All respondents had to be adults (older than 17 years) and were recruited via Facebook ads. Participation was incentivised with a lottery in which participants could win phone credit of 5GB.

The AHS is an online survey that collects information on participants’ health status, health behavior and knowledge, attitudes towards health topics, media usage, and other socio-demographic characteristics. As part of the survey, respondents were exposed to an article about the common rumour that vaccines cause infertility. Participants randomly received one of the three articles shown in S2 Appendix. Participants were asked to read the fake news article and, directly afterwards, about their willingness to share it on their Facebook account. Ultimately, participants had the opportunity to actually share the article. We observed this behavior with click data. At a later stage of the survey, respondents evaluated the accuracy of the article content.

2.2 Ethics statement

This study was approved by the Ethics Committee of the Medical Association of Hamburg, Germany. All participants provided their consent at the beginning of the survey. We implemented several measures to limit negative consequences related to the exposure to fake news. First, the survey contained a thorough debriefing that explained to respondents that they had been exposed to an article containing fake news and provided corrective information. Second, the website on which the articles were hosted included a disclaimer on each page that visibly highlighted that these articles contain misinformation and were used as part of a scientific experiment. Third, the website was shut down immediately after the experiment was completed. Lastly, the sharing action required completing several tedious steps. The surveys were carried out in accordance with relevant guidelines and regulations.

2.3 Study sample

Our study sample includes 5,307 respondents. We dropped observations from respondents who did not finish the survey or did not pass the attention checks. The majority of our respondents are from Kenya (see S2 Table). The average respondent is 29.2 years old (age ranges from 18 to 75 years), male (about 65%), and possesses tertiary education (about 73%). Further descriptive statistics are reported in S1 Table in the supplementary information. In general, the sample appears to be representative of the social media user population. Bearing in mind selection effects related to internet access, device possession (smartphone, tablet, PC), social media usage, and willingness to participate in research projects, the sample is not necessarily representative of the national populations in the respective countries.

2.4 Construction of variables

Detection of misinformation.

Detection of misinformation is measured on the basis of respondents’ perception of the truthfulness of the article. Respondents assessed the accuracy of the presented article on a 5-point Likert scale ranging from 0 “not true at all” to 4 “entirely true”. Based on this information, we classify respondents with a value of 2 or below as being able to detect the misleading information.
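
As a minimal sketch of this variable construction in Stata (the software used for all analyses; see Section 2.5), assuming a hypothetical variable accuracy_rating that stores the 0–4 Likert response:

    * Detection indicator: 1 if the article was rated 2 or below (hypothetical variable names).
    generate detect_misinfo = (accuracy_rating <= 2) if !missing(accuracy_rating)
    label variable detect_misinfo "Detected misinformation (rated article 2 or below)"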

Measures related to the sharing of fake news.

Intentional sharing behavior is measured using the following 1-item survey question “Would you like to share the article on your Facebook account?”. Based on respondents’ answer to this question we constructed a binary indicator that takes the value 1 if the respondent wanted to share the article. Likewise, actual sharing behaviour is coded as a binary indicator that takes the value 1 if a person actually shared the article on Facebook. To measure whether a person shared the article we rely on click data. More specifically, we recorded a respondent’s click on a Facebook share button that linked the survey with the Facebook environment (a respondent’s Facebook account). Furthermore, and based on the variables related to actual sharing behaviour and detection of misinformation, we classify respondents into deliberate and accidental sharing types, whereby ‘deliberate’ refers to respondents that detected the misinformation and shared it and ‘accidental’ to those who believed in the accuracy of the content and shared the fake news article.
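
A minimal Stata sketch of this classification, assuming hypothetical variables shared_click (1 if the respondent clicked the Facebook share button) and detect_misinfo (as sketched above):

    * Deliberate sharers detected the misinformation and still shared it;
    * accidental sharers believed the content and shared it.
    generate share_deliberate = (shared_click == 1) & (detect_misinfo == 1)
    generate share_accidental = (shared_click == 1) & (detect_misinfo == 0)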

Individual characteristics.

In our analysis we consider a broad range of individual characteristics that were collected as part of the surveys. First, socio-demographic variables are: age categories (binary indicator variables for age intervals), gender (0 = male, 1 = female), education categories (binary indicator variables for completed education levels), marital status (binary indicator whether person is married), work status ((self-)employed = 1, 0 otherwise), and binary indicators for wealth categories (poor, average, rich relative to others). These variables were assessed with self-reported measures. Second, we construct two variables that capture different dimensions of economic preferences: Risk is measured using the well-established 1-item scale on a respondent’s willingness to take risk using the following question “Are you a person who is generally willing to take risks?”. The item uses an 11-point Likert response scale ranging from 0 “unwilling to take risks” to 10 “very willing to take risks”. The item is used, among others, in the Global Preference Survey [35] and has been shown to be a good predictor of risk behavior in various settings (e.g. [36–38]). Trust is a composite index (average value) that is based on three questionnaire items related to respondents’ trust in health-related information from the (i) government, (ii) science and research, and (iii) traditional media. Responses to each of the three items are recorded on a 4-point Likert scale ranging from 0 “not at all trustworthy” to 3 “a lot trustworthy”. We adapt general trust measures used in the related literature (e.g. [39, 40]) to our specific context by focusing on health-related information. Moreover, we constructed two personality trait indicators (agreeableness and openness to new experiences) using ‘Ten Item Personality Index (TIPI)’ personality test items. The TIPI is an established short test based on the five-factor model of personality that has been validated in several countries [41]. Cognitive skills are measured in terms of numeracy using three items from an established module [42]. Respondents had to solve three mathematical tasks, and our measure of cognitive skills captures the share of correctly solved tasks. We measure the extent of social media usage in terms of the time respondents spent on social media in the last week and construct the following binary indicators for social media usage: a) less than 1 hour, b) between 1 and 10 hours, c) between 11 and 20 hours, and d) more than 20 hours.
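
To illustrate, the composite trust index (the average of the three trust items) and the numeracy score (the share of correctly solved tasks) could be constructed in Stata roughly as follows; all variable names are hypothetical:

    * Trust index: row mean of the three 0-3 trust items (hypothetical variable names).
    egen trust = rowmean(trust_gov trust_science trust_media)
    * Numeracy: share of correctly solved tasks, with each task coded 1 = correct, 0 = incorrect.
    egen numeracy = rowmean(num_task1 num_task2 num_task3)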

Lastly, we also consider vaccine-related aspects: ‘Vaccination status’ refers to the share of vaccinations the respondent had received against the following three diseases: HPV, COVID-19, and polio. ‘Vaccination knowledge’ is a composite index derived from the responses to the following two statements: “Vaccination against tetanus has to be refreshed regularly to stay effective.” and “The measles vaccines used in my country in the last 10 years used mRNA technology.” ‘Vaccine hesitancy’ is an index that we constructed on the basis of the following three statements: (i) “I believe that governmental regulations in my country ensure quality vaccines and drugs.”, (ii) “I believe that vaccines often cause more harm than good.” and (iii) “I believe that Western countries use pharmaceutical companies to exploit African people for their own purposes.”. The first item enters the index in reverse form. The items are adjusted versions of the vaccine readiness scale [43]. Respondents answered on a 7-point Likert scale ranging from 0 “strongly disagree” to 6 “strongly agree”. Both indices were constructed using principal component analysis. The questionnaire including all survey items is presented in S1 Appendix.
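
As a sketch, the vaccine hesitancy index could be obtained via principal component analysis in Stata along the following lines (the vaccination knowledge index is built analogously); variable names are hypothetical, with vh1_rev denoting the reverse-coded first item:

    * Vaccine hesitancy index: score on the first principal component of the three items.
    pca vh1_rev vh2 vh3
    predict vaccine_hesitancy, score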

2.5 Empirical strategy

Our analyses start with describing detection rates and statistics related to the intentional and actual sharing of misinformation in our sample.

Second, we run OLS regressions to estimate the determinants of deliberate and accidental fake news sharing actions on Facebook. More specifically, we estimate Eq 1 below:

Yicv = α + Xicβ + ηv + μc + ϵicv (1)

where Yicv is a binary indicator that identifies whether respondent i in country c has shared the presented article about vaccine v accidentally or deliberately. Xic is a matrix of individual-level characteristics including socio-demographics, personality traits, economic preferences, and vaccination-related attitudes, knowledge, and past behavior. The matrix also includes binary indicators related to an individual’s assignment in another survey experiment. ηv describes vaccine fixed effects, μc country fixed effects, and ϵicv is the error term. Our main specifications use heteroskedasticity-robust standard errors. Since covariates are not randomly assigned, we do not claim to estimate causal effects, as we cannot completely rule out omitted variable bias. However, given that we are able to control for various factors, we think that our results reliably identify the determinants of accidental and deliberate sharing of fake news. All analyses were conducted in Stata 18.
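
A minimal Stata sketch of estimating Eq (1) for one of the outcomes as a linear probability model with vaccine and country fixed effects and heteroskedasticity-robust standard errors; all variable names are hypothetical and the covariate list is abbreviated relative to the full set described above:

    * Eq. (1): linear probability model with vaccine and country fixed effects,
    * heteroskedasticity-robust standard errors (hypothetical variable names).
    regress share_accidental i.age_cat female i.educ_cat married employed ///
        i.wealth_cat risk trust agreeableness openness numeracy ///
        vacc_status vacc_knowledge vaccine_hesitancy i.vaccine_type i.country, vce(robust)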

3 Results

3.1 Detection of misinformation and sharing intention vs. action

In our sample, 54.7% of respondents correctly detected that the article contained misinformation. S3 Table in the supplementary information shows that in particular women, respondents with better cognitive skills, and persons with lower levels of institutional trust were more likely to detect misinformation.

Examining sharing intentions, we find that about 52.1% of our respondents stated that they would be willing to share the article on their Facebook account (binary indicator). The intention-to-share question was asked directly after respondents had read the article. In Column 3 of S3 Table, we show that, ceteris paribus, in particular respondents who are older, male, richer, and risk-loving, and those with higher levels of institutional trust and lower cognitive skills, were more willing to share the fake news article.

Turning towards our main outcome measure: About 13.8% of respondents shared the article on Facebook. Respondents could share the article on Facebook directly after they had declared their sharing intention. The results suggest that substantially fewer people actually share fake news articles than state their intention to do so. It should be noted, however, that actual sharing rates in our study are substantially higher than those reported by studies in the U.S. that observed actual sharing behavior [2, 25, 27].

Before discussing our main results, we distinguish between accidental and deliberate distributors for both sharing indicators (intention vs. action). Regarding the intentional measure, about 40% of respondents can be classified as deliberate and about 60% as accidental distributors of misinformation. With respect to actual article sharing, we obtain very similar results, though the relative share tends to shift towards even higher rates of accidental distribution (63% vs. 37%). Overall, we can conclude that the majority of fake news distribution in our sample occurred accidentally, while a non-negligible number of respondents are purposefully spreading misinformation via social media.

3.2 Individual-level determinants of sharing patterns

In this section we investigate respondents’ characteristics that are related to the accidental vs. deliberate forwarding of misinformation. Adopting a linear probability model and multivariate regression framework (OLS), we estimate what determines sharing intentions and actions. Our independent variables of interest include socio-demographic factors (age, gender, marital status, education, employment status, wealth, cognitive skills), personality traits (agreeableness and openness), economic preferences (trust and risk taking), social media usage, and vaccine-specific indicators (vaccination status, vaccine knowledge, vaccine attitudes). All regressions include country and vaccine-type (polio, HPV or COVID-19) fixed effects. We use robust standard errors.

Actual sharing of fake news.

Column 1 of Table 1 presents the individual determinants of actual fake news sharing in general, while columns 2 and 3 contain our main results focusing on the determinants of deliberate and accidental sharing.

Table 1. Determinants of deliberate and accidental fake news sharing.

https://doi.org/10.1371/journal.pone.0301818.t001

First, referring to column 1, our results show that the likelihood of sharing misinformation increases considerably with the age of the respondent. Relative to the reference group of persons aged 18–29, respondents aged 30–39 (50+) are about 7.4 (12.3) percentage points more likely to share the fake news article. Moreover, respondents who are male, in employment, more risk-loving, exhibit higher levels of trust in institutions, and have a history of past vaccinations are also more likely to share misinformation.

With respect to the determinants of deliberate sharing (column 2), we find that in particular older persons, respondents who are in employment, and risk-loving individuals are more likely to share fake news deliberately. Next, we turn to the determinants of accidental fake news sharing. Column 3 reports that respondents who share misinformation accidentally are, on average, more likely to be older and male and to have higher levels of trust in institutions. Moreover, we observe that persons with vaccine-hesitant attitudes are less likely to share fake news accidentally (about 0.7 percentage points).

As robustness checks, we estimate probit models instead of linear probability models (S4 Table) and provide results from OLS regressions in which standard errors are clustered at the country-vaccine-type level (S5 Table). To address concerns related to language barriers and misunderstanding, we additionally rerun the analysis on the sub-sample reporting good to very good English language skills, excluding N = 148 observations (S6 Table). While the results change somewhat, the overall findings are consistent across the different model specifications.
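
For illustration, the two alternative specifications could be run in Stata as follows (hypothetical variable names; the full covariate set of Eq (1) would be included in place of the abbreviated list shown here):

    * Probit instead of the linear probability model, robust standard errors.
    probit share_accidental i.age_cat female risk trust i.vaccine_type i.country, vce(robust)

    * OLS with standard errors clustered at the country-vaccine-type level.
    egen cluster_cv = group(country vaccine_type)
    regress share_accidental i.age_cat female risk trust i.vaccine_type i.country, vce(cluster cluster_cv)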

Differences between intention vs. actual sharing measures.

In the following we explore differences in the determinants of intentional vs. actual sharing behaviour. Since in many contexts policy conclusions are derived from intentional measures only—given the absence of actual sharing data—the subsequent analysis highlights to what extent conclusions differ depending on the used measure. Table 2 depicts the respective results with columns 1–3 (4–6) showing results related to the deliberate (accidental) sharing of fake news.

Table 2. Individual characteristics of deliberate and accidental fake news distributors.

https://doi.org/10.1371/journal.pone.0301818.t002

Regarding the deliberate sharing of fake news articles, column 1 displays the determinants of the intention to share, while column 3 shows the determinants of actual sharing behavior (conditional on having stated the intention to do so). Our main focus is on column 2, which shows the determinants of what we refer to as inconsistent behaviour (stating the intention to share the fake news article but ultimately not sharing it). We find that in particular men and more educated persons are more likely to exhibit inconsistent behavior regarding the deliberate sharing of the fake news article.

Turning towards the accidental sharing of fake news, column 5 contains our results with respect to inconsistent behavior. We find that, ceteris paribus, especially younger individuals, men, persons with lower levels of education and cognitive skills, individuals with high levels of trust, and less vaccine-hesitant persons are more likely to exhibit inconsistent behavior with respect to the accidental sharing of the fake news article.

4 Discussion

This study investigates (i) the propensity to share vaccine-related fake news in SSA and (ii) the drivers behind the deliberate and accidental sharing of these articles. In online surveys with social media users in six SSA countries, we show study participants an article about a new vaccine (polio, HPV, or COVID-19) that contains explicit misinformation regarding its side effects (an infertility claim). Subsequently, we measure respondents’ intention to share and actual sharing behaviour on Facebook. We also obtain from each respondent an assessment of the accuracy of the article content.

The first part of our analysis derives three main findings. First, actual sharing rates of fake news articles are substantially higher in the context of health and SSA compared to estimates reported in studies on political topics in rich countries. For instance, Guess et al. [27] report that about 8.5% of respondents in the U.S. shared misinformation on Facebook at least once during the presidential election campaign in 2016, while our study documents a sharing rate of 13.8%. Second, we demonstrate that the majority (around two thirds) of fake news articles are disseminated accidentally, which is in line with studies from richer countries [23]. Third, we document massive differences across the adopted measures of fake news sharing (intention to share vs. actual sharing), with actual sharing rates being much lower than those derived from intention-based measures.

The second part of our analysis identifies the determinants of deliberate and accidental sharing of fake news articles on Facebook. With respect to the actual sharing of the fake news article our results suggest that being older, risk-loving, and in employment increases the likelihood to deliberately share fake news. Furthermore, being older, male, in employment, and exhibiting high levels of trust in institutions is positively associated with sharing fake news accidentally.

Overall, our results indicate that older, male, risk-loving, trusting persons, and those in employment are more likely to distribute vaccine-related misinformation. In particular, our study confirms the importance of age in spreading misinformation. The results show that while older persons are more likely to detect misinformation, they are still more likely to share misinformation both accidentally and deliberately. A further interesting insight concerns the relationship between trust and accidental sharing, hinting towards the use of the information source as a credibility signal in the decision to share the article. Such heuristics help people take decisions rapidly; however, they may trick persons into sharing misinformation, particularly when misinformation is provided by seemingly trustworthy sources.

Assessing the potential measurement error of intentional measures, our study documents biases in the individual determinants of deliberate and accidental sharing. The documented differences between the two sharing measures (intentional and actual sharing behavior) challenge the validity of studies using self-reported behavioral intentions for making claims about actual sharing behaviour.

We believe that this study can help policymakers to more efficiently target relevant groups to limit the spread of misinformation. In particular, our results imply that older social media users with employment were relatively more prone to circulate health-related misinformation in our context. Additionally, we find a strong relationship between trust in institutions and the accidental distribution of misinformation, highlighting that official sources should particularly pay attention to distributing valid and truthful information.

Lastly, we would like to point out some limitations of our study. First, respondents predominantly stem from one country, Kenya, which potentially reduces the external validity of our results for other LMICs. Consequently, empirical evidence from other regions is desirable in order to better understand the generalizability of our findings. Second, we cannot fully rule out that the action of sharing itself has biased respondents’ perception of the accuracy of the article content. Though the survey item on the perceived reliability of the article comes at a much later stage in the online survey, respondents still might remember their behavior and adjust their perception or response accordingly (consistency bias) [22]. If such a bias were to exist in our study context, it would, however, not affect any of the results related to the determinants of the overall sharing action of the fake news article. Third, bearing in mind that covariates are not randomly distributed, the study does not provide causal evidence on the impact of a certain variable on deliberate and accidental fake news sharing actions. Nonetheless, the study adopts a well-established framework to elicit the determinants of the related fake news sharing actions.

5 Conclusion

Limitations notwithstanding, our study shows that the sharing of vaccine misinformation is prevalent in SSA countries, yet to a lesser extent than suggested by standard survey measures of sharing intentions. While the majority of health misinformation is shared accidentally, a non-negligible share of respondents purposefully distributes misinformation. Analysing the individual determinants of fake news distribution, our study reveals that older, male, risk-loving, and trusting persons have a higher likelihood of sharing misinformation in our context. Overall, our results shed light on the detection and sharing of health misinformation in a realistic online setting, providing novel insights on who is susceptible to falling for, and more likely to disseminate, fake news.

Supporting information

S1 Table. Summary statistics.

This file provides the summary statistics of the study sample.

https://doi.org/10.1371/journal.pone.0301818.s001

(PDF)

S2 Table. Observations by country.

This file provides the descriptive distribution of respondents per country.

https://doi.org/10.1371/journal.pone.0301818.s002

(PDF)

S3 Table. Detection of misinformation.

This file provides the regression results of individual characteristics associated with detecting misinformation.

https://doi.org/10.1371/journal.pone.0301818.s003

(PDF)

S4 Table. Alternative regression model: Probit.

This file provides the regression results of the main analysis using a probit instead of an OLS regression estimation.

https://doi.org/10.1371/journal.pone.0301818.s004

(PDF)

S5 Table. Alternative regression model: Clustered standard errors.

This file provides the regression results of the main analysis using alternative specifications of standard errors.

https://doi.org/10.1371/journal.pone.0301818.s005

(PDF)

S6 Table. Robustness check: English language skills.

This file provides the regression results of the main analysis with a sub-sample that reports very good or good English language skills.

https://doi.org/10.1371/journal.pone.0301818.s006

(PDF)

S1 Appendix. Survey questionnaire: This file contains the survey questionnaire.

https://doi.org/10.1371/journal.pone.0301818.s007

(PDF)

S2 Appendix. Fake news articles: This file contains the articles used in the experiment.

https://doi.org/10.1371/journal.pone.0301818.s008

(PDF)

References

  1. Allcott H, Gentzkow M. Social media and fake news in the 2016 election. Journal of Economic Perspectives. 2017;31(2):211–236.
  2. Grinberg N, Joseph K, Friedland L, Swire-Thompson B, Lazer D. Fake news on Twitter during the 2016 US presidential election. Science. 2019;363(6425):374–378. pmid:30679368
  3. Bursztyn L, Rao A, Roth CP, Yanagizawa-Drott DH. Misinformation during a pandemic. National Bureau of Economic Research Working Paper 27417; 2020.
  4. Loomba S, de Figueiredo A, Piatek SJ, de Graaf K, Larson HJ. Measuring the impact of COVID-19 vaccine misinformation on vaccination intent in the UK and USA. Nature Human Behaviour. 2021;5(3):337–348. pmid:33547453
  5. Montagni I, Ouazzani-Touhami K, Mebarki A, Texier N, Schück S, Tzourio C, et al. Acceptance of a Covid-19 vaccine is associated with ability to detect fake news and health literacy. Journal of Public Health. 2021;43(4):695–702. pmid:33693905
  6. do Nascimento BIJ, Pizarro AB, Almeida JM, Azzopardi-Muscat N, Gonçalves MA, Björklund M, et al. Infodemics and health misinformation: a systematic review of reviews. Bulletin of the World Health Organization. 2022;100(9):544–561.
  7. Romer D, Jamieson KH. Conspiracy theories as barriers to controlling the spread of COVID-19 in the US. Social Science & Medicine. 2020;263:113356.
  8. Bowles J, Larreguy H, Liu S. Countering misinformation via WhatsApp: Preliminary evidence from the COVID-19 pandemic in Zimbabwe. PLoS ONE. 2020;15(10):e0240005. pmid:33052967
  9. Luk TT, Zhao S, Weng X, Wong JYH, Wu YS, Ho SY, et al. Exposure to health misinformation about COVID-19 and increased tobacco and alcohol use: a population-based survey in Hong Kong. Tobacco Control. 2020. pmid:32855353
  10. Nurmansyah MI, Suraya I, Fauzi R, Al-Aufa B, et al. Beliefs about the effects of smoking on corona virus disease 2019 and its impact on the intention to quit and smoking frequencies among university students smokers in Jakarta, Indonesia. Asian Journal of Social Health and Behavior. 2023;6(1):7.
  11. Kar B, Kar N, Panda MC. Social trust and COVID-appropriate behavior: Learning from the pandemic. Asian Journal of Social Health and Behavior. 2023;6(3):93–104.
  12. Lee SJ, Lee CJ, Hwang H. The impact of COVID-19 misinformation and trust in institutions on preventive behaviors. Health Education Research. 2023;38(1):95–105. pmid:36564938
  13. Burki T. Vaccine misinformation and social media. Lancet Digital Health. 2019;1(6):30136–0.
  14. Pierri F, Perry BL, DeVerna MR, Yang KC, Flammini A, Menczer F, et al. Online misinformation is linked to early COVID-19 vaccination hesitancy and refusal. Scientific Reports. 2022;12:5966.
  15. Wilson SL, Wiysonge C. Social media and vaccine hesitancy. BMJ Global Health. 2020;5(10). pmid:33097547
  16. Pennycook G, Rand DG. The psychology of fake news. Trends in Cognitive Sciences. 2021;25(5):388–402. pmid:33736957
  17. Rathje S, Roozenbeek J, Van Bavel JJ, van der Linden S. Accuracy and social motivations shape judgements of (mis)information. Nature Human Behaviour. 2023;1–12. pmid:36879042
  18. Van Der Linden S. Misinformation: susceptibility, spread, and interventions to immunize the public. Nature Medicine. 2022;28(3):460–467. pmid:35273402
  19. Lawson MA, Anand S, Kakkar H. Tribalism and tribulations: The social costs of not sharing fake news. Journal of Experimental Psychology: General. 2023. pmid:36892902
  20. Melchior C, Oliveira M. A systematic literature review of the motivations to share fake news on social media platforms and how to fight them. New Media & Society. 2023;14614448231174224.
  21. Buchanan T. Why do people spread false information online? The effects of message and viewer characteristics on self-reported likelihood of sharing social media disinformation. PLoS ONE. 2020;15(10):e0239666. pmid:33027262
  22. Epstein Z, Sirlin N, Arechar A, Pennycook G, Rand D. The social media context interferes with truth discernment. Science Advances. 2023;9(9):eabo6169. pmid:36867704
  23. Peren Arin K, Mazrekaj D, Thum M. Ability of detecting and willingness to share fake news. Scientific Reports. 2023;13(1):7298. pmid:37147456
  24. Shephard MP, Robertson DJ, Huhe N, Anderson A. Everyday non-partisan fake news: Sharing behavior, platform specificity, and detection. Frontiers in Psychology. 2023;14:1118407. pmid:37255519
  25. Ceylan G, Anderson IA, Wood W. Sharing of misinformation is habitual, not just lazy or biased. Proceedings of the National Academy of Sciences. 2023;120(4):e2216614120. pmid:36649414
  26. Saling LL, Mallal D, Scholer F, Skelton R, Spina D. No one is immune to misinformation: An investigation of misinformation sharing by subscribers to a fact-checking newsletter. PLoS ONE. 2021;16(8):e0255702. pmid:34375356
  27. Guess A, Nagler J, Tucker J. Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances. 2019;5(1):eaau4586. pmid:30662946
  28. Guess A, Aslett K, Tucker J, Bonneau R, Nagler J. Cracking open the news feed: Exploring what US Facebook users see and share with large-scale platform data. Journal of Quantitative Description: Digital Media. 2021;1.
  29. Bryanov K, Vziatysheva V. Determinants of individuals’ belief in fake news: A scoping review. PLoS ONE. 2021;16(6):e0253717. pmid:34166478
  30. Enders AM, Uscinski J, Klofstad C, Stoler J. On the relationship between conspiracy theory beliefs, misinformation, and vaccine hesitancy. PLoS ONE. 2022;17(10):e0276082. pmid:36288357
  31. Nan X, Wang Y, Thier K. Why people believe health misinformation and who are at risk? A systematic review of individual differences in susceptibility to health misinformation. Social Science & Medicine. 2022;115398.
  32. Preston S, Anderson A, Robertson DJ, Shephard MP, Huhe N. Correction: Detecting fake news on Facebook: The role of emotional intelligence. PLoS ONE. 2021;16(10):e0258719. pmid:34644360
  33. Sheeran P, Webb TL. The intention–behavior gap. Social and Personality Psychology Compass. 2016;10(9):503–518.
  34. Singh K, Lima G, Cha M, Cha C, Kulshrestha J, Ahn YY, et al. Misinformation, believability, and vaccine acceptance over 40 countries: Takeaways from the initial phase of the COVID-19 infodemic. PLoS ONE. 2022;17(2):e0263381. pmid:35139117
  35. Falk A, Becker A, Dohmen T, Huffman D, Sunde U. The preference survey module: A validated instrument for measuring risk, time, and social preferences. Management Science. 2023;69(4):1935–1950.
  36. Bonin H, Dohmen T, Falk A, Huffman D, Sunde U. Cross-sectional earnings risk and occupational sorting: The role of risk attitudes. Labour Economics. 2007;14(6):926–937.
  37. Lönnqvist JE, Verkasalo M, Walkowitz G, Wichardt PC. Measuring individual risk attitudes in the lab: Task or ask? An empirical comparison. Journal of Economic Behavior & Organization. 2015;119:254–266.
  38. Dohmen T, Falk A, Huffman D, Sunde U, Schupp J, Wagner GG. Individual risk attitudes: Measurement, determinants, and behavioral consequences. Journal of the European Economic Association. 2011;9(3):522–550.
  39. Carrieri V, Guthmuller S, Wübker A. Trust and COVID-19 vaccine hesitancy. Scientific Reports. 2023;13(1):9245. pmid:37286569
  40. Stoop N, Hirvonen K, Maystadt JF. Institutional mistrust and child vaccination coverage in Africa. BMJ Global Health. 2021;6(4):e004595. pmid:33926893
  41. Gosling SD, Rentfrow PJ, Swann WB Jr. A very brief measure of the Big-Five personality domains. Journal of Research in Personality. 2003;37(6):504–528.
  42. Fuster A, Perez-Truglia R, Wiederholt M, Zafar B. Expectations with Endogenous Information Acquisition: An Experimental Investigation. The Review of Economics and Statistics. 2022;104(5):1059–1078.
  43. Geiger M, et al. Measuring the 7Cs of vaccination readiness. European Journal of Psychological Assessment. 2022;38(4):261–269.