Correction
1 Mar 2016: Grimes DR (2016) Correction: On the Viability of Conspiratorial Beliefs. PLOS ONE 11(3): e0151003. https://doi.org/10.1371/journal.pone.0151003
Abstract
Conspiratorial ideation is the tendency of individuals to believe that events and power relations are secretly manipulated by certain clandestine groups and organisations. Many of these ostensibly explanatory conjectures are non-falsifiable, lacking in evidence or demonstrably false, yet public acceptance remains high. Efforts to convince the general public of the validity of medical and scientific findings can be hampered by such narratives, which can create the impression of doubt or disagreement in areas where the science is well established. Conversely, historical examples of exposed conspiracies do exist and it may be difficult for people to differentiate between reasonable and dubious assertions. In this work, we establish a simple mathematical model for conspiracies involving multiple actors over time, which yields the failure probability for any given conspiracy. Parameters for the model are estimated from literature examples of known scandals, and the factors influencing conspiracy success and failure are explored. The model is also used to estimate the likelihood of claims from some commonly-held conspiratorial beliefs; namely that the moon-landings were faked, that climate-change is a hoax, that vaccination is dangerous, and that a cure for cancer is being suppressed by vested interests. Simulations of these claims predict that intrinsic failure would be imminent even with the most generous estimates for the secret-keeping ability of active participants—the results of this model suggest that large conspiracies (≥1000 agents) quickly become untenable and prone to failure. The theory presented here might be useful in counteracting the potentially deleterious consequences of bogus and anti-science narratives, and examining the hypothetical conditions under which sustainable conspiracy might be possible.
Citation: Grimes DR (2016) On the Viability of Conspiratorial Beliefs. PLoS ONE 11(1): e0147905. https://doi.org/10.1371/journal.pone.0147905
Editor: Chris T. Bauch, University of Waterloo, CANADA
Received: September 23, 2015; Accepted: January 6, 2016; Published: January 26, 2016
Copyright: © 2016 David Robert Grimes. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: In this modelling paper, all data used comes from previously published sources and is available in the paper itself.
Funding: The author has no support or funding to report.
Competing interests: The author has declared that no competing interests exist.
Introduction
Conspiratorial beliefs, which attribute events to secret manipulative actions by powerful individuals, are widely held [1] by a broad cross-section of society. Belief in one conspiracy theory is often correlated with belief in others, and some stripe of conspiratorial belief is ubiquitous across diverse social and racial groups [2]. These concepts run the gamut from the political to the supernatural, and a single working definition is not easy to obtain. We shall take our working definition of conspiracy theory to be in line with the characterisation of Sunstein et al [1]: “an effort to explain some event or practice by reference to the machinations of powerful people, who attempt to conceal their role (at least until their aims are accomplished)”. While the modern usage of conspiracy theory is often derogatory (pertaining to an exceptionally paranoid and ill-founded world-view), the definition we will use does not a priori dismiss all such theories as inherently false.
However, even with this disclaimer, there are a disconcerting number of conspiracy theories which enjoy popular support and yet are demonstrably nonsensical. This is particularly true of conspiracies over scientific and medical issues, where conspiratorial ideation can lead to outright opposition to, and rejection of, the scientific method [3]. This can be exceptionally detrimental, not only to believers but to society in general; conspiratorial beliefs over medical interventions such as vaccination, for example, can have potentially lethal consequences [4]. Conspiratorial thinking is endemic in anti-vaccination groups, with those advocating the scientific and medical consensus often regarded as agents of some ominous interest group bent on concealing “the truth”. This becomes a defence mechanism to protect beliefs that are incompatible with the evidence, and perhaps unsurprisingly, proponents of such views display not only conspiratorial traits but a litany of reasoning flaws, a reliance on anecdote over data, and low cognitive complexity in thinking patterns [5].
Similarly, the framing of climate-change as a hoax creates needless uncertainty in public discourse, and increases the risk of damaging inertia instead of corrective action. The dismissal of scientific findings as a hoax also has a political element; a 2011 study found conservative white males in the US were far more likely than other Americans to deny climate change [6]. Similarly, a UK study found that climate-change denialism was more common among politically conservative individuals with traditional values [7]. The public acceptance of climate-change conspiracy transcends the typical wide-ranging domain of conspiratorial belief; a 2013 investigation by Lewandowsky et al [8] found that while subjects who subscribed to conspiracist thought tended to reject all scientific propositions they encountered, those with strong traits of conservatism or pronounced free-market world views only tended towards rejecting scientific findings with regulatory implications at odds with their ideological position.
Challenging dubious anti-science assertions is an important element of constructive social debate, and there is some evidence that challenging such narratives can be successful. Belief in the moon-landing hoax is highly associated with acceptance of other conspiracy theories, but there is some evidence that presenting scientific evidence critical of this narrative produces a significant decrease in support for the theory [9]. Previous investigation has also shown that improved communication of the scientific consensus can overcome some conspiratorial thinking on issues as diverse as the link between HIV and AIDS and the acceptance of climate-change [10].
Of course, it is worthwhile to take a considered Devil’s advocate approach—there are numerous historical examples of exposed conspiracies and scandals, from Watergate to the recent revelations on the sheer scale of spying on the online activity of citizens by their own governments. It would be unfair, then, to simply dismiss all allegations of conspiracy as paranoid when in some instances this is demonstrably not so. There is also merit to charges that vested interests can distort and confuse public perception; in the case of climate-change, for example, conservative demagogues have succeeded in casting a perception of doubt on robust science in public discussion [8, 11–14]. Evidently, an approach which dismisses these very real concerns out of hand and without due consideration is not good enough, and there must be a clear rationale for distinguishing the outlandish from the reasonable.
Something currently lacking is a method for ascertaining the likelihood that a conspiracy is viable, and the factors that influence this. The benefits of such a method would be two-fold: firstly, it would allow one to gauge whether a particular narrative was likely, and at what scale it would have to operate. Secondly, and perhaps more usefully, it would help counteract potentially damaging anti-science beliefs by giving an estimate of the viability of a conspiracy over time. The parameters for this model are taken from literature accounts of exposed conspiracies and scandals, and are used to analyse several commonly held conspiracy theories and to examine the theoretical bounds on the magnitude and time-frame of any posited conspiracy theory.
0.1 Anti-Science conspiracy narratives—A brief overview
Conspiracy theories which posit some nefarious underhanded action by scientists are ubiquitous. In this work, we shall restrict our focus to four prominent beliefs of this genre. These are listed below.
- NASA Moon-landing conspiracy—The successful 1969 Apollo 11 mission first put men on the moon, a seminal achievement in human history. Yet ever since that historic day, there has been a persistent fringe group who strongly believe the moon-landings were faked, mocked up for propaganda purposes. In 2013 it was estimated that 7% of Americans subscribe to this view [15]. Those advocating this conspiracy claim there are inconsistencies in pictures taken on the moon’s surface, despite these claims having been comprehensively debunked [16].
- Climate change conspiracy—Climate-change denial has a deep political dimension [7, 8]. Despite the overwhelming strength of evidence supporting the scientific consensus of anthropogenic global warming [17], there are many who reject this consensus. Of these, many claim that climate-change is a hoax staged by scientists and environmentalists [18–20], ostensibly to yield research income. Such beliefs are utterly negated by the sheer wealth of evidence against such a proposition, but remain popular due to an often-skewed false balance present in partisan media [20, 21], resulting in public confusion and inertia.
- Vaccination conspiracy—Conspiratorial beliefs about vaccination are endemic in the anti-vaccination movement [18, 22]. It is estimated that roughly 20% of Americans hold the long-debunked notion that there is a link between autism and the MMR vaccine [15], a belief which has reduced uptake of important vaccinations [22] in several countries. Anti-vaccination beliefs and scare-mongering are also endemic in the internet age, with vaccine-critical websites asserting dubious information [23, 24]. Ill-founded beliefs over vaccination have been darkly successful in stirring panic and reducing vaccine uptake, which has led to a damaging resurgence of diseases such as measles [4].
- Cancer cure conspiracy—The belief that a cure for cancer is being withheld by vested interests is a long-standing one [25]. It is often used as a universal deus ex machina by those pushing an alternative alleged cure, with assertion of the conspiracy theory functioning as an explanatory device to account for the absence of clinical evidence for such claims [26]. Such claims can be detrimental to patients, some of whom abandon conventional treatment for the lofty but ill-founded promises of alternative medicine [27].
Methods
1.1 Model derivation
We initially assume that, for a given conspiracy, conspirators are in general dedicated to the concealment of their activity. We further assume that a leak of information from any conspirator is sufficient to expose the conspiracy and render it redundant; such leaks might be intentional (in the form of whistle-blowing or defection) or accidental (mistaken release of information). We concern ourselves only with potential intrinsic exposure of the conspiracy and do not consider for now the possibility that external agents may reveal the operation. Thus, it follows that the act of a conspiracy being exposed is a relatively rare and independent event. We can then apply Poisson statistics, and express the probability of at least one leak sufficient to lead to failure of the conspiracy as

L = 1 − e^(−ϕt) (1)

where ϕ is the mean number of failures expected per unit time. This is in turn a function of the number of conspirators with time, N(t), and p, the intrinsic probability of failure per person per year. We may then specify ϕ by

ϕ = −N(t) ln(1 − p) (2)

and, writing ψ = 1 − p for brevity, the probability of conspiracy failure can be re-written as a function of time, given by

L(t) = 1 − ψ^(tN(t)) (3)

There are several possibilities for the parameter N(t), the number of conspirators; the appropriate selection will depend on the type of conspiracy involved. If a conspiracy requires constant upkeep, then the number required to sustain the fiction is approximately constant with time. This pertains to situations where some active input in either covering up an event or maintaining a deception is vital. In such a case, the number involved takes the simple form

N(t) = No (4)

where No is the initial number of conspirators. If instead the conspiracy is a single event after which no new conspirators are required, then over time those involved will die off, reducing the probability of exposure. In this case, a Gompertzian survival function can be employed for N(t). If the average age of the conspirators at the moment of the event is te, then

N(t) = No exp[(α/β) e^(βte) (1 − e^(βt))] (5)

where No is the initial number of involved individuals, and α and β are the constants of the Gompertzian curve. For humans, we can use α = 10−4 and β = 0.085 [28] to approximate human mortality. Finally, if conspirators are rapidly removed due to internal friction or otherwise (an action itself which is arguably a meta-conspiratorial event), there may be circumstances where we can model N(t) as an exponential decay. If members are removed rapidly with only half remaining after a period t2, then the decay constant is λ = ln 2/t2 and the number of conspirators at a given time is

N(t) = No e^(−λt) (6)

It is important to note that Eq 6 pivots on the assumption that rapid removal of conspirators doesn’t change the per-conspirator probability of exposure; this assumption may not hold in practice and is refined in the discussion section.

From Eq 3 it is clear that increasing N(t) will always act to increase L(t), no matter what form is chosen for conspirator density. The failure rate with time is slightly more complicated; for the constant case given in Eq 4, L increases monotonically with time. If instead non-constant forms are used, such as those in Eqs 5 and 6, L is non-linear with time, as illustrated in Fig 1. The time at which L is a maximum in these cases, tm, is given by solving dL/dt = 0, equivalent to maximising tN(t); for the Gompertzian form this yields the identity

α tm e^(β(te + tm)) = 1 (7)

This equation is transcendental and cannot be solved analytically, but can be readily estimated by graphical or numerical techniques. The maximum failure probability is then L(tm), given by Eq 3.
The form of N(t) shapes the dynamics of the problem markedly, as shown in Fig 1.
Fig 1. The blue solid line depicts L over time with a constant level of conspirators being maintained. The red dotted line shows a single event with Gompertzian decay of the conspiring population, assuming an average initial age of 40 years, and the dashed orange line shows an exponential decay with the number of conspirators halved every 10 years. In the first case, the likelihood of conspiracy failure always increases with time. In the Gompertzian case, the chances of failure initially increase towards a maximum (L = 0.38 after 29 years in this example), but the death of conspirators with time acts to decrease the probability of failure thereafter. Finally, if conspirators are removed extrinsically, the curve hits a maximum (L = 0.12 after 14 years) before decaying to lower likelihoods as fewer conspirators remain to betray confidence.
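Numerically, the model is straightforward to evaluate. The following Python sketch (an illustration added here, not part of the original analysis) implements Eq 3 for the three population forms of Eqs 4–6, locating the maximum-failure time by direct numerical scan in place of solving Eq 7. The leak probability p = 5 × 10−6 is an assumed value chosen to roughly reproduce the maxima quoted in the Fig 1 caption above, as the exact figure used for that figure is not stated in the text.

```python
import numpy as np

ALPHA, BETA = 1e-4, 0.085   # Gompertzian mortality constants [28]
P = 5e-6                    # assumed per-conspirator annual leak probability

def n_constant(t, n0):
    """Eq 4: constant upkeep, N(t) = No."""
    return np.full_like(t, float(n0))

def n_gompertz(t, n0, t_e=40.0):
    """Eq 5: single event; conspirators of average initial age t_e die off."""
    return n0 * np.exp((ALPHA / BETA) * np.exp(BETA * t_e) * (1.0 - np.exp(BETA * t)))

def n_exponential(t, n0, t_half=10.0):
    """Eq 6: conspirators removed with half-life t_half (lambda = ln 2 / t_half)."""
    return n0 * np.exp(-(np.log(2.0) / t_half) * t)

def failure_prob(t, n):
    """Eq 3: L(t) = 1 - psi^(t N(t)), with psi = 1 - p."""
    return 1.0 - (1.0 - P) ** (t * n)

t = np.linspace(0.01, 100.0, 20_000)
for label, n_func in [("constant upkeep", n_constant),
                      ("Gompertzian decay", n_gompertz),
                      ("exponential removal", n_exponential)]:
    L = failure_prob(t, n_func(t, 5000))
    i = int(np.argmax(L))   # numerical stand-in for Eq 7 (window edge if L is monotonic)
    print(f"{label:20s} max L = {L[i]:.2f} at t = {t[i]:.1f} years")
```

Under these assumptions the scan returns maxima near L ≈ 0.39 at 29 years for the Gompertzian case and L ≈ 0.12 at 14 years for exponential removal, close to the values in the caption above.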
1.2 Parameter estimation
To use the model, realistic parameter estimates are required. In particular, the parameter p, the probability of an intrinsic leak or failure, is extremely important; if p were zero, absolute conspiracy would be maintained, resolvable only by extrinsic analysis. In practice, this is not the case—historical examples show that even in incredibly secretive organizations there is always some possibility of an accidental or intentional intrinsic leak, whether by whistle-blowing or ineptitude. By definition, details of conspiracies are rarely known, but we may very conservatively estimate parameters using data from exposed examples where sufficient information on duration and number of conspirators is publicly available. The three examples used are:
- The National Security Agency (NSA) PRISM affair—The staggering extent of spying by the NSA and its allies on civilian internet users [29] was exposed by contractor Edward Snowden in 2013. The extent of the eavesdropping was unprecedented, including the tapping of fiber-optic cables, phone calls from allied heads of state and a huge amount of meta-data [30].
- The Tuskegee syphilis experiment—In 1932 the US Public Health Service began an observational study on African-American men who had contracted syphilis in Alabama. The study became unethical in the mid 1940s, when penicillin was shown to effectively cure the ailment and yet was not given to the infected men. Ethical questions about the research were raised in the mid 1960s, and finally exposed by researcher Dr. Peter Buxtun in 1972 [31–33].
- The Federal Bureau of Investigation (FBI) forensic scandal—Dr. Frederic Whitehurst wrote hundreds of letters to his superiors detailing the pseudoscientific nature of many of the FBI’s forensic tests. The dubious nature of these protocols resulted in a large number of innocent men being detained for decades, several of whom were executed or died in prison, before Whitehurst exposed the debacle in 1998. A subsequent report by the FBI and Department of Justice found that at least 26 of the 28 dedicated hair analysts gave misleading testimony, prompting an ongoing massive re-evaluation of unsafe convictions [34, 35].
With data available from these events, we can conservatively estimate values for p. We assume that, after the duration t at which each conspiracy was uncovered, its probability of failure stood at L ≥ 0.5. A lower bound for p is then given by

p ≥ 1 − 2^(−1/(tN)) (8)
There is considerable and unavoidable ambiguity in some of these estimates, especially the number of people with full knowledge of the event. In the PRISM case, the figure of 30,000 comes from total NSA staff. In reality, the proportion of employees who would have had knowledge of this program is likely far smaller, but we take the upper-bound figure to minimise the estimate of p. Given the short time-frame involved, we further assume the number of conspirators stayed approximately constant over the duration before the event was exposed. The situation is even more complicated regarding the Tuskegee experiment. This originally fell under the remit of the venereal diseases division of the United States Public Health Service (USPHS) in the early 1930s, before this department was restructured in later years. Historical employment levels for the USPHS are not readily available, so the estimate of 6700 is taken from data on current officer staff levels of the entire USPHS. This likely over-estimates the number involved substantially, as the experiment historically concerned only the much smaller venereal disease division. The FBI forensics scandal is also difficult to quantify; while 28 agents were specifically involved with the microscopic hair analysis debacle [39], Dr. Whitehurst’s whistle-blowing exposed much wider problems affecting the entire forensics department. Accordingly, we have used the modern estimate of FBI forensic staff, both scientific staff and agents. Taking a larger value for N tends to over-estimate the ability of a mass of conspirators to retain a secret, yet it allows us to set an extreme lower bound for p, the failure odds per unit time per conspirator. This essentially yields a “best-case” scenario for the conspirators.
In addition to this, the life-time of the conspiracy is not always clear. In the NSA case, estimates span only a narrow range, between 5 and 6 years [29]. The Tuskegee experiment is more ambiguous; the original experiment commenced in the 1930s but did not become unethical until the late 1940s, when the decision was made to deny penicillin to the afflicted individuals. Ethical questions were also raised by others before Dr. Peter Buxtun, but we use 1972 as our upper limit, as it was his whistle-blowing that focused attention on the long-running abuses. Finally, the FBI forensics time-frame is rather opaque; the FBI forensics laboratory was established in 1932, and naively we could take the conspiracy life-time as 66 years before exposure in 1998, in which case this would push the estimate of p down by roughly an order of magnitude to p > 2.11 × 10−5. Yet this is unrealistic, as the problems with certain aspects of nascent criminology were unlikely to have been known then. However, between 1992 and 1997 Dr. Whitehurst penned several hundred letters to his superiors about gaping problems with aspects of the analysis, which were roundly ignored. It follows that the FBI were aware from at least 1992 that their forensic methods were untenable, giving a life-time until exposure of only 6 years. In all cases, we take the largest realistic value of t, as this pertains to the best-case scenario for a conspiracy.
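As a worked example of Eq 8, the sketch below recovers lower bounds on p from the figures discussed above. The FBI staff count used here is an assumed placeholder (the text refers to a modern staffing estimate without quoting it), so the bounds are indicative of, rather than identical to, the Table 1 values such as the best-case p = 4.09 × 10−6.

```python
import math

def p_lower_bound(n, t, L=0.5):
    """Eq 8: minimum per-person annual leak probability such that a
    conspiracy of n agents has failure odds >= L after t years."""
    return 1.0 - (1.0 - L) ** (1.0 / (n * t))

# N and t as discussed above; the FBI forensic-staff figure (~500) is an
# assumed placeholder for the modern estimate used in Table 1.
for name, n, t in [("NSA PRISM", 30_000, 6),
                   ("Tuskegee", 6_700, 25),
                   ("FBI forensics", 500, 6)]:
    print(f"{name:14s} p >= {p_lower_bound(n, t):.2e}")
```

With these inputs the bounds come out at roughly 3.9 × 10−6, 4.1 × 10−6 and 2.3 × 10−4 respectively, of the same order as the minimum and maximum estimates quoted in the text.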
1.3 Experimental method
The model established allows estimation of how certain parameters influence the chance of success or failure for any conspiracy. From Table 1, assuming the derived best-case scenario value for the conspirators (p = 4.09 × 10−6), we can apply the model outlined to several popular and enduring conspiracy theories and ascertain their viability with time. As discussed in the previous section, this estimate is intentionally optimistic for conspirators, and corresponds to a case where the average expected number of fatal leaks is as low as roughly 4 per million conspirators per year. In keeping with “best case scenario” estimates for conspiracies, we also neglect the upper figure of p = 2.45 × 10−4, which is roughly 60 times greater than the minimum projected probability of failure per conspirator per year, as outlined in Table 1.
Results
Table 2 lists non-exhaustive estimates of the number of conspirators required for each of the anti-science beliefs outlined. Critically, the estimates for N(t) shown here assume that all scientists involved would have to be aware of an active cover-up, and that a small group of odious actors would be unable to deceive the scientific community over long timescales; the rationale for this assumption is expanded in the discussion section. In most of these cases, constant up-keep would be required to maintain secrecy, so N(t) = No. In the case of the NASA hoax conjecture, it could be argued that the conspiracy was a single-event fiction, and thus the Gompertzian population form in Eq 5 could apply. This is not a very realistic assumption, but it is considered here too. The climate-change conspiracy narrative requires some clarification as well; those sceptical of the scientific consensus on anthropogenic climate change may take either a “hard” position, that climate-change is not occurring, or a “soft” position, that it may be occurring but isn’t anthropogenic. For this investigation, we define climate-change conspiracy as the hard position, for simplicity. Results are shown in Fig 2. From these, we can also determine the maximum time-scales before imminent failure under best-possible conditions for these conspiracies, taken as L > 0.95. These estimates are given in Table 3.
Fig 2. Failure curves for (a) NASA moon-landing hoax—results for both constant population and Gompertzian function are so close as to be non-resolvable visually (b) Climate change hoax—The blue solid line depicts failure probability with time if all scientific bodies endorsing the scientific consensus are involved, the red-dotted line presents the curve if solely active climate researchers were involved (c) Vaccination conspiracy—blue solid line showing failure probability with time for a combination of public health bodies and major drug manufacturers and the red-dotted line depicting case if only public health bodies were conspiring (d) Failure with time for a suppressed cancer cure conspiracy.
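The time-scales in Table 3 follow from inverting Eq 3 for the constant-population case: with N(t) = No, L(t) exceeds 0.95 once t > ln(0.05)/(No ln ψ). A minimal sketch of this inversion, using the best-case p but assumed round-number conspiracy sizes rather than the Table 2 estimates:

```python
import math

P_BEST = 4.09e-6    # best-case per-agent annual failure probability (Table 1)

def years_to_failure(n, threshold=0.95, p=P_BEST):
    """Invert Eq 3 with N(t) = No: first t at which L(t) exceeds threshold."""
    return math.log(1.0 - threshold) / (n * math.log(1.0 - p))

# Illustrative sizes only; Table 2 gives the actual per-conspiracy estimates.
for n in (1_000, 10_000, 100_000, 500_000):
    print(f"N = {n:>7,}: L > 0.95 after {years_to_failure(n):8.1f} years")
```

For conspiracies of hundreds of thousands of agents this gives failure within a few years, consistent with the discussion that follows.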
Discussion
The analysis here predicts that, even with parameter estimates favourable to conspiratorial leanings, the conspiracies analysed tend rapidly towards collapse. Even if there were a concerted effort, the number of people required for the sheer scale of hypothetical scientific deceptions would inexorably undermine these nascent conspiracies. For a conspiracy of even only a few thousand actors, intrinsic failure would arise within decades. For hundreds of thousands, such failure would be assured within less than half a decade. It’s also important to note that this analysis deals solely with intrinsic failure, or the odds of a conspiracy being exposed intentionally or accidentally by actors involved; extrinsic analysis by non-participants would also increase the odds of detection, rendering such Byzantine cover-ups far more likely to fail. Moreover, the numbers of actors in this analysis, as outlined in Table 2, represent an incredibly conservative estimate. A more comprehensive quantification would undoubtedly drive the failure rate up for all considered conspiracy narratives.
This problem appears insurmountable for any large conspiracy; if it requires constant upkeep (N(t) ≈ No) then the odds of failure approach unity with time. If we assign a detection threshold under which a conspiracy should remain (μ = 0.05) over a given time-frame, then Table 4 enumerates the maximum number of conspirators possible. Even for a relatively short time of 5 years, the limit is hit with only 2521 agents. To sustain secrecy for more than 10 years, fewer than 1000 people can be involved, even with the generous estimate of p = 4.09 × 10−6 derived in this work. Even for single events with Gompertzian population decay, the problem of large conspiracy failure is not adequately circumvented; for such an event, the odds of failure exceed 5% at around 650 participants, even with the ideal value of p and an average participant age of 40 years. In this situation, however, failure probability eventually falls as the population involved decreases, meaning that the threshold can be considered a maximum probability of detection in this scenario. This probability also rapidly increases with the number of conspirators involved, rendering large sustained conspiracies unlikely. Under ideal circumstances, it would only be possible to keep a single conspiratorial event below the detection threshold if the number of actors involved was very small (≪ 1000).
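The Table 4 figures follow from the same inversion of Eq 3, solved instead for No at a fixed threshold μ. A short sketch, with the caveat that it reproduces the quoted figures only to within about half a percent (2508 versus the quoted 2521 at 5 years, for instance), presumably due to rounding of p:

```python
import math

P_BEST = 4.09e-6    # best-case per-agent annual failure probability

def max_conspirators(t_years, mu=0.05, p=P_BEST):
    """Largest constant No for which Eq 3 keeps L below mu over t_years."""
    return math.floor(math.log(1.0 - mu) / (t_years * math.log(1.0 - p)))

for t in (5, 10, 15, 25, 50, 100):
    print(f"{t:>3} years: at most {max_conspirators(t):,} conspirators")
```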
As outlined in the section on parameter estimation, the estimates used here were deliberately selected to be maximally conducive to conspirators; the lowest value for p obtained was used for the estimates, while the highest value was roughly two orders of magnitude above this. If that estimate is instead used, the effect is stark, hugely decreasing the time-frame to detection, as depicted in Fig 3. Given the difficulty of obtaining precise numbers and time-frames, there is inherent uncertainty in the estimated parameters in this work, and better data would allow better quantification of p. There is also an open question of whether using exposed conspiracies to estimate parameters might itself introduce bias and produce overly high estimates of p. This may be the case, but given the highly conservative estimates employed for the other parameters, it is more likely that p for most conspiracies will be much higher than our estimate, as even relatively small conspiracies (such as Watergate, for example) have historically been rapidly exposed. It is also important to note that p will likely vary markedly between conspiracies, depending on how deeply agents are invested in a given conspiracy, and the figures here are at best a conservative approximation of typical values. However, even if agents are highly invested in a conspiracy, p also includes the odds of an accidental intrinsic exposure. While conspiracies do undoubtedly happen, their continued secrecy is probably due more to keeping the number of agents low than to an intrinsically small per-agent, per-time leak probability.
The number of conspirators No is also an important uncertainty that needs careful interpretation; the estimates made in this paper (shown in Table 2) are at best order-of-magnitude estimates. These have deliberately been picked to be relatively conservative in many respects; for example, the number involved in a hypothetical vaccine conspiracy is likely a massive underestimate, given the ubiquity of vaccination. The estimates also assume that all agents counted have knowledge of the conspiracy at hand; if this were not the case, then only those with adequate knowledge of the deception would count towards No. This might potentially be the case for some political or social conspiracies, yet for a hypothetical scientific conspiracy it is probably fair to assume that all agents working with the data would have to be aware of any deception. Were this not the case, fraudulent claims or suspect data would be extrinsically exposed by other scientists upon examination of the data, in much the same way that instances of scientific fraud are typically exposed by other members of the scientific community. Thus, even if a small devious cohort of rogue scientists falsified data on climate change or attempted to cover up vaccine information, examination by other scientists would fatally undermine the nascent conspiracy. To circumvent this, the vast majority of scientists in a field would have to mutually conspire, a circumstance the model predicts is exceptionally unlikely to be viable.
The assumption of Poisson statistics used in this work is justified for discrete events, from cars arriving at a traffic light [49] to radiation-induced DNA damage [50], and should hold for the exposure of conspiracy events. The model outlined is simple, yet depending on the population function it can yield interesting behaviour. As depicted in Fig 1, the form of N(t) hugely influences the detection probability of a conspiracy with time. The exponential decay form posited in Eq 6 would in theory yield the lowest probability of detection from a single conspiratorial event, but is likely unrealistic. The reasons for this are twofold: firstly, it implies that conspirators are assassinated or otherwise removed, which would itself be a conspiracy event. But perhaps more relevant is the observation that rapid removal of conspirators would likely create panic and disunity amongst invested parties. In this case, p would likely become a function of the number of conspirators and time. If we assume that the probability of failure increases proportionally to the extinction rate of conspiring parties, so that p(t) = po e^(λt), then the odds of failure increase dramatically. This behaviour is depicted in Fig 4. For these reasons, the rapid extinction model of population is probably not realistic, even for a single-event conspiracy, and can be disregarded. In lieu of any available data, we have neglected the potential variation of probability with time in this work, but defining ψ(t) = 1 − p(t) (for any suitable p(t)), we can modify Eq 3 to account for this, if known, by

L(t) = 1 − ψ(t)^(tN(t)) (9)

One of the major motivations of this work is to help prevent anti-science beliefs from gaining a foothold by quantifying how extraordinarily unlikely it is that a cohesive scientific fraud could take place on such massive scales. This applies not only to the stances examined here, but also to the wild array of popular anti-science beliefs. A sizeable contingent still holds conspiratorial convictions about a range of topics, including genetically modified organisms [51, 52], water fluoridation [18, 53, 54], and AIDS denialism [55, 56], to name but a few prominent examples. It is important to challenge such narratives: not only are they detrimental to our health and well-being, but current research suggests that exposure to conspiratorial beliefs can affect our perception of events to a greater degree than we are aware [57]. Acceptance of such anti-science thinking seems to be correlated with lower resistance to pseudoscientific claims; acceptance of cancer conspiracy claims can drive patients to neglect mainstream medicine in favour of the dubious wares of alternative therapists [27]. There is considerable evidence that alternative health practitioners such as homeopaths are far more likely to encourage rejection of vaccination [23, 58, 59], despite their ostensible medical technique being utterly devoid of evidence and completely contradicted by the basic laws of physics [60]. It is unclear whether this relationship is causal or merely correlative.
Fig 4. Failure curves for a conspiracy of No = 5000 over a 50-year period with exponential removal of conspirators with half-life t2 of 5 years (λ = ln 2/t2), with (a) the assumption of constant p and (b) a proportional change in probability, p(t) = po e^(λt).
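The comparison in this figure can be sketched directly from Eqs 3, 6 and 9. The parameters below (No = 5000 and the best-case po) are assumptions made for illustration, as the caption does not quote the values used:

```python
import numpy as np

N0, P0 = 5000, 4.09e-6      # assumed; Fig 4's exact parameters are not quoted
T_HALF = 5.0
LAM = np.log(2) / T_HALF    # removal rate, lambda = ln 2 / t2

t = np.linspace(0.01, 50.0, 5_000)
n = N0 * np.exp(-LAM * t)                       # Eq 6: exponentially removed agents

L_const = 1.0 - (1.0 - P0) ** (t * n)           # (a) Eq 3 with fixed p
p_t = np.minimum(P0 * np.exp(LAM * t), 1.0)     # p rises as agents are removed
L_var = 1.0 - (1.0 - p_t) ** (t * n)            # (b) Eq 9 with p(t) = po exp(lambda t)

print(f"(a) constant p: peak L = {L_const.max():.3f} "
      f"at t = {t[np.argmax(L_const)]:.1f} years")
print(f"(b) growing p:  L(50 yr) = {L_var[-1]:.3f} and still rising")
```

Because p(t) here grows at exactly the removal rate λ, the exponent tN(t)p(t) grows roughly linearly in t, so in case (b) the failure odds climb steadily (to L ≈ 0.64 by 50 years under these assumptions) rather than peaking near 5% and decaying as in case (a).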
The theory outlined is useful in predicting the broad patterns expected from a conspiracy event, but does not consider the dynamics, motivations and interactions of individual agents. This interplay might be an avenue for future work, perhaps employing agent based models to account for the various internal frictions and pressures affecting the gross failure rate. The approach outlined here might give some insight into the gross behaviour of conspiracies, but agent based modelling focused on individual actors interacting with certain probabilities might better capture the intricacies of conspiracy and whistle-blowing. Such models could also readily be informed by psychological data, ascribing simulated actors a spectrum of traits, with specific interaction rules to see whether the emergent dynamics affect the success or failure of any secretive event.
While challenging anti-science narratives is important, it is also important to note the limitations of this approach. Explaining misconceptions and analyses such as this one might be useful to a reasonable core [9], but this might not be the case if a person is sufficiently convinced of a narrative. Recent work has illustrated that conspiracy theories can spread rapidly online in polarized echo-chambers, which may be deeply invested in a particular narrative and closed off to other sources of information [61]. In a recent Californian study on parents, it was found that countering anti-vaccination misconceptions related to autism was possible with clear explanation, but that for parents resolutely opposed to vaccination, attempts at a rational approach further entrenched them in their ill-founded views [62]. The grim reality is that there appears to be a cohort so ideologically invested in a belief that no reasoning will shift their convictions, which remain impervious to the intrusions of reality. In these cases, it is highly unlikely that a simple mathematical demonstration of the untenability of their belief will change their view-point. However, for the less invested, such an intervention might indeed prove useful.
Acknowledgments
Many thanks to Dr. Frederic Whitehurst for his helpful first-hand insight into the FBI forensics scandal. As the author is a physicist rather than a psychologist, I am indebted to Profs. Stephan Lewandowsky and Ted Goertzel and Ms. Mathilde Hernu for their valuable input on conspiratorial thinking. Thanks also to Drs. Ben Goertzel and David Basanta for their comments and suggestions, and to the reviewers for their helpful comments. I would also like to thank my University of Oxford colleagues for their continued support, in particular Dr. Mike Partridge. This work did not require specific funding, from nebulous clandestine cabals or otherwise.
Author Contributions
Conceived and designed the experiments: DRG. Performed the experiments: DRG. Analyzed the data: DRG. Contributed reagents/materials/analysis tools: DRG. Wrote the paper: DRG.
References
- 1. Sunstein CR, Vermeule A. Conspiracy Theories: Causes and Cures. Journal of Political Philosophy. 2009;17(2):202–227.
- 2. Goertzel T. Belief in conspiracy theories. Political Psychology. 1994;15(4):731–742.
- 3. Lewandowsky S, Gignac GE, Oberauer K. The Role of Conspiracist Ideation and Worldviews in Predicting Rejection of Science. PLoS ONE. 2013 10;8(10):e75637. pmid:24098391
- 4. Poland GA, Jacobson RM. The Age-Old Struggle against the Antivaccinationists. New England Journal of Medicine. 2011;364(2):97–99. pmid:21226573
- 5. Jacobson RM, Targonski PV, Poland GA. A taxonomy of reasoning flaws in the anti-vaccine movement. Vaccine. 2007;25(16):3146–3152. pmid:17292515
- 6. McCright AM, Dunlap RE. Cool dudes: The denial of climate change among conservative white males in the United States. Global Environmental Change. 2011;21(4):1163–1172. Available from: http://www.sciencedirect.com/science/article/pii/S095937801100104X.
- 7. Poortinga W, Spence A, Whitmarsh L, Capstick S, Pidgeon NF. Uncertain climate: An investigation into public scepticism about anthropogenic climate change. Global Environmental Change. 2011;21(3):1015–1024. Symposium on Social Theory and the Environment in the New World (dis)Order. Available from: http://www.sciencedirect.com/science/article/pii/S0959378011000288.
- 8. Lewandowsky S, Oberauer K, Gignac GE. NASA Faked the Moon Landing therefore (Climate) Science Is a Hoax: An Anatomy of the Motivated Rejection of Science. Psychological Science. 2013. Available from: http://pss.sagepub.com/content/early/2013/03/25/0956797612457686.abstract.
- 9. Swami V, Pietschnig J, Tran US, Nader IW, Stieger S, Voracek M. Lunar Lies: The Impact of Informational Framing and Individual Differences in Shaping Conspiracist Beliefs About the Moon Landings. Applied Cognitive Psychology. 2013;27(1):71–80.
- 10. Lewandowsky S, Gignac GE, Vaughan S. The pivotal role of perceived scientific consensus in acceptance of science. Nature Climate Change. 2013;3(4):399–404.
- 11. Freudenburg WR, Gramling R, Davidson DJ. Scientific Certainty Argumentation Methods (SCAMs): Science and the Politics of Doubt. Sociological Inquiry. 2008;78(1):2–38.
- 12. Leiserowitz A. Climate Change Risk Perception and Policy Preferences: The Role of Affect, Imagery, and Values. Climatic Change. 2006;77(1–2):45–72.
- 13. McCright AM, Dunlap RE. Defeating Kyoto: The Conservative Movement’s Impact on U.S. Climate Change Policy. Social Problems. 2003;50(3):348–373.
- 14. Lewandowsky S, Cook J, Oberauer K, Brophy S, Lloyd EA, Marriott M. Recurrent fury: Conspiratorial discourse in the blogosphere triggered by research on the role of conspiracist ideation in climate denial. Journal of Social and Political Psychology. 2015;3(1):142–178.
- 15. Public Policy Polling. Democrats and Republicans differ on conspiracy theory beliefs. 2013. Available from: http://www.publicpolicypolling.com/pdf/2011/PPP_Release_National_ConspiracyTheories_040213.pdf.
- 16. Perlmutter DD, Dahmen NS. (In)visible evidence: pictorially enhanced disbelief in the Apollo moon landings. Visual Communication. 2008;7(2):229–251. Available from: http://vcj.sagepub.com/content/7/2/229.abstract.
- 17. Pachauri RK, Allen M, Barros V, Broome J, Cramer W, Christ R, et al. Climate Change 2014: Synthesis Report. Contribution of Working Groups I, II and III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. 2014.
- 18. Goertzel T. Conspiracy theories in science. EMBO reports. 2010;11(7):493–499. pmid:20539311
- 19. Leiserowitz AA. American Risk Perceptions: Is Climate Change Dangerous? Risk Analysis. 2005;25(6):1433–1442. pmid:16506973
- 20. Antilla L. Climate of scepticism: US newspaper coverage of the science of climate change. Global Environmental Change. 2005;15(4):338–352. Available from: http://www.sciencedirect.com/science/article/pii/S095937800500052X.
- 21. Carvalho A. Ideological cultures and media discourses on scientific knowledge: re-reading news on climate change. Public Understanding of Science. 2007;16(2):223–243. Available from: http://pus.sagepub.com/content/16/2/223.abstract.
- 22. Jolley D, Douglas KM. The Effects of Anti-Vaccine Conspiracy Theories on Vaccination Intentions. PLoS ONE. 2014 02;9(2):e89177. pmid:24586574
- 23. Zimmerman RK, Wolfe RM, Fox DE, Fox JR, Nowalk MP, Troy JA, et al. Vaccine criticism on the world wide web. Journal of Medical Internet Research. 2005;7(2). pmid:15998608
- 24. Kata A. Anti-vaccine activists, Web 2.0, and the postmodern paradigm—An overview of tactics and tropes used online by the anti-vaccination movement. Vaccine. 2012;30(25):3778–3789. Special Issue: The Role of Internet Use in Vaccination Decisions. Available from: http://www.sciencedirect.com/science/article/pii/S0264410X11019086. pmid:22172504
- 25. W R. The grand conspiracy against the cancer cure. JAMA. 1980;243(4):337–339.
- 26. Cassileth BR. Alternative and complementary medicine. Cancer. 1999;86(10):1900–1902. pmid:10570411
- 27. Cassileth BR, Lusk EJ, Strouse TB, Bodenheimer BJ. Contemporary Unorthodox Treatments in Cancer Medicine: A Study of Patients, Treatments, and Practitioners. Annals of Internal Medicine. 1984;101(1):105–112. pmid:6732073
- 28. Levy G, Levin B. The Biostatistics of Aging: From Gompertzian Mortality to an Index of Aging-relatedness. John Wiley & Sons; 2014.
- 29. Director of National Intelligence. Facts on the Collection of Intelligence Pursuant to Section 702 of the Foreign Intelligence Surveillance Act. 2013.
- 30. Landau S. Highlights from Making Sense of Snowden, Part II: What’s Significant in the NSA Revelations. Security Privacy, IEEE. 2014 Jan;12(1):62–64.
- 31. Brawley OW. The study of untreated syphilis in the Negro male. International Journal of Radiation Oncology* Biology* Physics. 1998;40(1):5–8.
- 32. Thomas SB, Quinn SC. The Tuskegee Syphilis Study, 1932 to 1972: implications for HIV education and AIDS risk education programs in the black community. American journal of public health. 1991;81(11):1498–1505. pmid:1951814
- 33. Corbie-Smith G. The continuing legacy of the Tuskegee Syphilis Study: considerations for clinical investigation. The American journal of the medical sciences. 1999;317(1):5–8. pmid:9892266
- 34. Edwards H, Gotsonis C. Strengthening forensic science in the United States: A path forward. Statement before the United States Senate Committee on the Judiciary. 2009.
- 35. Office of the Inspector General. An Assessment of the 1996 Department of Justice Task Force Review of the FBI Laboratory; 2014.
- 36. National Security Agency. 60 Years of Defending Our Nation. 2012.
- 37. USPHS—Career and Benefits. Available from: http://www.usphs.gov/profession/.
- 38. FBI—Laboratory services. Available from: https://www.fbi.gov/about-us/lab.
- 39. FBI Testimony on Microscopic Hair Analysis Contained Errors in at Least 90 Percent of Cases in Ongoing Review. Available from: https://www.fbi.gov/news/pressrel/press-releases/fbi-testimony-on-microscopic-hair-analysis-contained-errors-in-at-least-90-percent-of-cases-in-ongoing-review.
- 40. Nimmen JV, Bruno LC, Rosholt RL. NASA Historical Databook, 1958–1968, Volume I (SP-4012). Washington, D.C.: NASA Resources; 1976.
- 41. American Geophysical Union—Our History. Available from: http://about.agu.org/our-history/.
- 42. NASA Workforce. Available from: http://nasapeople.nasa.gov/workforce/default.htm.
- 43. AAAS Membership. Available from: http://membercentral.aaas.org/membership.
- 44. Royal Society Fellowships. Available from: https://royalsociety.org/about-us/fellowship/.
- 45. European Physical Society membership. Available from: http://www.eps.org/?page=membership_ms.
- 46. Cook J, Nuccitelli D, Green SA, Richardson M, Winkler B, Painting R, et al. Quantifying the consensus on anthropogenic global warming in the scientific literature. Environmental Research Letters. 2013;8(2):024024.
- 47. Centers for Disease Control Factsheet. Available from: http://www.cdc.gov/about/resources/facts.htm.
- 48. World Health Organization—people and offices. Available from: http://www.who.int/about/structure/en/.
- 49. McNeil DR. A Solution to the Fixed-cycle Traffic Light Problem for Compound Poisson Arrivals. Journal of Applied Probability. 1968;5(3):624–635.
- 50. Grimes DR, Partridge M. A mechanistic investigation of the oxygen fixation hypothesis and oxygen enhancement ratio. Biomedical Physics & Engineering Express. 2015;1(4): 045209.
- 51. Shermer M. Conspiracy Central. Scientific American. 2014;311(6):94–94.
- 52. Yang J, Xu K, Rodriguez L. The rejection of science frames in the news coverage of the golden rice experiment in Hunan, China. Health, risk & society. 2014;16(4):339–354.
- 53. Milgrom P, Reisine S. Oral health in the United States: the post-fluoride generation. Annual review of public health. 2000;21(1):403–436. pmid:10884959
- 54. Grimes DR. Commentary on ‘Are fluoride levels in drinking water associated with hypothyroidism prevalence in England? A large observational study of GP practice data and fluoride levels in drinking water’. Journal of Epidemiology and Community Health. 2015.
- 55. Chigwedere P, Essex M. AIDS denialism and public health practice. AIDS and Behavior. 2010;14(2):237–247. pmid:20058063
- 56. Nattrass N. Still crazy after all these years: The challenge of AIDS denialism for science. AIDS and Behavior. 2010;14(2):248–251. pmid:19937271
- 57. Douglas KM, Sutton RM. The Hidden Impact of Conspiracy Theories: Perceived and Actual Influence of Theories Surrounding the Death of Princess Diana. The Journal of Social Psychology. 2008;148(2):210–222. pmid:18512419
- 58. Schmidt K, Ernst E. MMR vaccination advice over the Internet. Vaccine. 2003;21(11–12):1044–1047. pmid:12559777
- 59. Kata A. A postmodern Pandora’s box: Anti-vaccination misinformation on the Internet. Vaccine. 2010;28(7):1709–1716. Available from: http://www.sciencedirect.com/science/article/pii/S0264410X09019264. pmid:20045099
- 60. Grimes DR. Proposed mechanisms for homeopathy are physically impossible. Focus on Alternative and Complementary Therapies. 2012;17(3):149–155.
- 61. Del Vicario M, Bessi A, Zollo F, Petroni F, Scala A, Caldarelli G, Stanley HE, Quattrociocchi W. The spreading of misinformation online. Proceedings of the National Academy of Sciences. 2016;201517441.
- 62. Nyhan B, Reifler J, Richey S, Freed GL. Effective Messages in Vaccine Promotion: A Randomized Trial. Pediatrics. 2014. Available from: http://pediatrics.aappublications.org/content/early/2014/02/25/peds.2013-2365.abstract.