Quantifying the effect of Wakefield et al. (1998) on skepticism about MMR vaccine safety in the U.S.

Abstract

Background

Efforts to trace the rise of childhood vaccine safety concerns in the US often point to Andrew Wakefield and colleagues’ retracted 1998 Lancet study (AW98), which alleged that the MMR vaccine can cause children to develop autism, as a primary cause of US vaccine skepticism. However, the lack of public opinion data on MMR safety collected before and after AW98’s publication obscures whether these anecdotal accounts reflect a potentially causal effect.

Methods

We address this problem using a regression discontinuity framework to study change in monthly MMR injury claims (N = 74,850; from 1990–2019) from the Vaccine Adverse Events Reporting System (VAERS) to proxy concern about vaccine safety. Additionally, we suggest a potential mechanism for the effect of AW98 on vaccine skepticism, via automated sentiment analyses of MMR-related news stories (N = 674; from 1996–2000) in major television and newspaper outlets.

Results

AW98 led to an immediate increase of about 70 MMR injury claims per month, averaging across six estimation strategies (meta-analytic effect = 70.44 [52.19, 88.75], p < 0.01). Preliminary evidence suggests that the volume of negative media attention to MMR increased in the weeks following AW98’s publication, across four estimation strategies (meta-analytic effect = 9.59% [3.66, 15.51], p < 0.01).

Conclusions

Vaccine skepticism increased following the publication of AW98, a shift potentially facilitated by increased negative media coverage of MMR.

Significance

Childhood vaccine skepticism presents an important challenge to widespread vaccine uptake, and undermines support for pro-vaccine health policies. In addition to advancing our understanding of the previously-obscured origins of US vaccine skepticism, our work cautions that high-profile media attention to inaccurate scientific studies can undermine public confidence in vaccines. We conclude by offering several recommendations that researchers and health communicators might consider to detect and address future threats to vaccine confidence.

Introduction

In late February 1998, a research team led by Andrew Wakefield published an article in The Lancet suggesting a link between the Measles, Mumps, and Rubella vaccine (MMR) and the development of autism in children. An investigation into the study (hereafter, AW98), conducted more than ten years after its original publication date, concluded that the article’s data did not support this claim, and documented credible evidence of research malfeasance [1].

Although the piece was eventually retracted in spring 2010 [1], both scholarly [2–4] and journalistic [5] efforts to identify the origins of contemporary vaccine skepticism in the U.S. suggest that the publication of AW98, and early media attention to it, was a pivotal moment (and perhaps the pivotal moment) in the mainstream acceptance of skepticism about MMR vaccine safety. Today, approximately one in three Americans believes that childhood vaccines can cause children to develop autism [6].

Studies attributing the rise in childhood vaccine skepticism to AW98, however, rely primarily on anecdotal evidence [2, 4, 5]. This may be due in part to the scarcity of polling data about MMR safety in the immediate aftermath of AW98, and the virtual absence of such data prior to 1998 [7]. One way that scholars have circumvented this concern when investigating the effect of AW98 on childhood vaccine skepticism is to assess whether MMR refusals (presumably resulting from safety concerns) increased in its aftermath [3]. Using this approach, AW98, and media coverage of it, appears to have had a short-lived and limited influence on vaccine refusal rates.

However, because outright MMR refusal is an extreme form of action, complicated by the fact that parents may be required to seek medical or philosophical exemptions to immunization statutes [8], it may lack the sensitivity necessary to detect broader changes in public opinion toward MMR. In this study, we make use of a comparatively more sensitive indicator of MMR skepticism: parent reports of negative reactions to the MMR vaccine from the Vaccine Adverse Events Reporting System (VAERS; Department of Health & Human Services). These reports enable parents to express skepticism about MMR safety by reporting potential side effects their children may have experienced, meaning that parents need not take the more extreme step of refusing vaccines outright to register concerns about the vaccine’s safety.

If VAERS reports are sufficiently sensitive indicators of MMR skepticism, and if the conventional wisdom about AW98 is correct, we should expect to see a sharp increase in reports following the publication of, and media attention to, the debunked study. While VAERS reports, unlike public opinion surveys, cannot tell us the “base rate” of MMR-skeptical views in the mass public, they can at least offer insight into whether the level of skepticism necessary to attribute various potential side effects to MMR increased following AW98.

Empirically demonstrating whether or not AW98 “moved the needle” on vaccine skepticism is an important task, as it can help scholars better understand how communication surrounding debunked or misleading science might go on to shape behaviors relevant to public health. Additionally, given the link between MMR skepticism and opposition to policies that encourage vaccination [7, 9], this research can help us understand the potential long-run impacts of attention to fraudulent scientific claims on support for public health policies.

Materials & methods

Study design, population, and setting

Data for this study come from two sources. First, we test whether or not concern about MMR safety increased following the publication of AW98 using publicly available reports of adverse reactions to the MMR vaccine from the Department of Health and Human Services (DHHS) Vaccine Adverse Event Reporting System (VAERS) [10]. Doctors and other vaccine providers report potential adverse events to VAERS in consultation with the parents of children administered the MMR vaccine.

Next, we provide supplemental tests of whether media coverage of AW98 presents a plausible mechanism for the effects documented in Fig 1. We obtained data from LexisNexis Academic by searching for news stories referencing “measles” or “MMR” in national television broadcasts (ABC, CBS, PBS, and NBC News) and high-circulation print outlets (USA Today, the Washington Post, the New York Times, and the Associated Press). We measured story sentiment using Linguistic Inquiry and Word Count (LIWC) software [11], which computes the ratio of negatively to positively valenced words in each story, standardized on a scale ranging from 0 (most negative) to 100 (most positive).
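To make the sentiment measure concrete, the sketch below scores a single story from 0 (most negative) to 100 (most positive) using tiny, hypothetical word lists. It is only an illustration of the general idea; LIWC relies on its own validated dictionaries and scoring procedure, which we do not reproduce here.

```python
# Illustration only: LIWC uses proprietary, validated dictionaries and its own scoring;
# these tiny word lists are hypothetical stand-ins.
import re

NEGATIVE_WORDS = {"risk", "danger", "harm", "fear", "injury", "unsafe"}
POSITIVE_WORDS = {"safe", "effective", "protect", "benefit", "healthy"}

def story_sentiment(text: str) -> float:
    """Score one news story from 0 (most negative) to 100 (most positive)."""
    tokens = re.findall(r"[a-z']+", text.lower())
    neg = sum(tok in NEGATIVE_WORDS for tok in tokens)
    pos = sum(tok in POSITIVE_WORDS for tok in tokens)
    if neg + pos == 0:
        return 50.0  # no valenced words: treat the story as neutral
    return 100.0 * pos / (neg + pos)

print(story_sentiment("Parents fear the vaccine poses a risk of harm."))  # 0.0
print(story_sentiment("The vaccine is safe and effective."))              # 100.0
```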

Fig 1. MMR VAERS reports pre/post AW98.

Panel a presents yearly MMR VAERS reports before/after the publication of AW98 (dashed vertical line). Panel b presents RD estimates of the effect (B) and two-tailed significance (p) of AW98 on monthly report counts before/after AW98 (dashed vertical line), across several RD estimation strategies (see: Materials & methods). All RD estimates are calculated as linear effects, which is appropriate given the high degree of consonance between the locally weighted polynomial trend line (dashed lines) and the linear trend line (solid line) fit to the monthly data.

https://doi.org/10.1371/journal.pone.0256395.g001

Outcome measures

Our measure of MMR safety concerns is a count of all monthly MMR event reports filed to VAERS from 1990 (when the program was created) to 2019. Note that while several varieties of MMR are in use today (e.g., MMRV, which includes varicella), we focus on MMR alone, both because it was the subject of AW98 and because it is consistently available throughout the series.
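As a rough illustration of how this monthly outcome series could be constructed, the sketch below aggregates an MMR-only VAERS extract to monthly report counts. The file name and column names (“RECVDATE”, “VAX_TYPE”) are assumptions about an analyst’s extract, not a description of the replication code at the OSF link.

```python
import pandas as pd

# Hypothetical MMR-only VAERS extract; real VAERS data are split across yearly CSV files.
reports = pd.read_csv("vaers_mmr_1990_2019.csv", parse_dates=["RECVDATE"])
mmr = reports[reports["VAX_TYPE"] == "MMR"]  # exclude MMRV and other combinations

monthly_counts = (
    mmr.set_index("RECVDATE")
       .resample("MS")              # calendar months, labeled by month start
       .size()
       .rename("mmr_reports")
)
print(monthly_counts.loc["1997-06":"1998-12"])  # window around AW98 (February 1998)
```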

Additionally, for our supplemental tests, our measure of negative news coverage of the MMR vaccine is the average (mean) negativity score for all MMR-related stories produced each week from March 1996 (two years before AW98’s publication) to March 2000 (two years after). We weighted weekly averages by the total number of stories featured in that week.
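A comparable sketch for the media series, under the assumption of a hypothetical story-level file with a date and a 0–100 sentiment score per story, aggregates to weekly mean negativity and weekly story counts (the latter used below as a weight and covariate).

```python
import pandas as pd

# Hypothetical story-level file: one row per story, with a date and a 0-100 sentiment score.
stories = pd.read_csv("mmr_stories_1996_2000.csv", parse_dates=["date"])
stories["negativity"] = 100 - stories["sentiment"]  # higher = more negative coverage

weekly = stories.set_index("date").resample("W")["negativity"].agg(["mean", "count"])
weekly.columns = ["mean_negativity", "n_stories"]  # n_stories later serves as weight/covariate
print(weekly.head())
```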

Statistical analysis

To test whether AW98 had a causal effect on MMR safety concerns, we first compare the number of MMR reports filed to VAERS pre/post-AW98 using a regression discontinuity (RD) setup. For robustness, we present several versions of the RD results that vary (1) whether the RD is “sharp” (estimated before/after the date AW98 was published) or “fuzzy” (within a month of the paper’s publication); (2) the report aggregation level (monthly, quarterly, yearly, or using an automatic “bin” selection method via a coverage error rate [CER] optimal bandwidth estimator); (3) population-adjusted vs. non-adjusted effect size estimators (to ensure that a potential increase in reports is not the result of population growth over time); and (4) the use of bias-corrected standard error estimates. Results of all models are presented in Fig 1. All data and code necessary to replicate these analyses, as well as the placebo tests described below, can be found at: https://osf.io/n7z2v/.
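For intuition, the sketch below fits the simplest version of this design: a sharp, linear RD on monthly report counts with heteroskedasticity-robust standard errors. It is a stand-in for, not a reproduction of, the full set of estimators described above (fuzzy cutoffs, alternative aggregations, CER-optimal bandwidths, population adjustment, and bias correction), and the file and column names are assumed.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Rebuild the monthly MMR series (as in the earlier sketch); file/column names are assumed.
reports = pd.read_csv("vaers_mmr_1990_2019.csv", parse_dates=["RECVDATE"])
monthly = (reports[reports["VAX_TYPE"] == "MMR"]
           .set_index("RECVDATE").resample("MS").size()
           .rename("mmr_reports").reset_index())

cutoff = pd.Timestamp("1998-02-01")  # month of AW98's publication
monthly["m"] = ((monthly["RECVDATE"].dt.year - cutoff.year) * 12
                + (monthly["RECVDATE"].dt.month - cutoff.month))
monthly["post"] = (monthly["m"] >= 0).astype(int)

# Sharp linear RD: separate linear trends on each side; the `post` coefficient is the jump.
rd = smf.ols("mmr_reports ~ post * m", data=monthly).fit(cov_type="HC2")
print(rd.params["post"], rd.pvalues["post"])
```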

To detect whether media coverage of AW98 might be responsible for increased MMR concern, we again use an RD setup, this time aggregating both the volume and tone of MMR stories from two years before to two years after the publication of AW98 (1996–2000). These tests, visualizations, and time demarcations are constructed identically to those reported in Fig 1 (this time modeling change in sentiment over time), with the exceptions that: (1) we restrict data aggregation to the weekly level (monthly aggregations would leave the model under-powered, while daily aggregations would produce ‘sparse’ days with no coverage); (2) all models include story counts as a covariate (as we are interested in accounting for the volume of negative coverage); and (3) both the locally weighted and linear trend lines adjust for weekly story volume.
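Analogously, a minimal weighted version of the media RD might look like the sketch below, with weekly story volume entering as both a covariate and a regression weight; the input file and its columns are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical weekly file with columns: week, mean_sentiment (0-100), n_stories.
weekly = pd.read_csv("weekly_mmr_sentiment.csv", parse_dates=["week"])
weekly["negativity"] = 100 - weekly["mean_sentiment"]

cutoff = pd.Timestamp("1998-02-28")  # week of AW98's publication
weekly["w"] = (weekly["week"] - cutoff).dt.days // 7
weekly["post"] = (weekly["w"] >= 0).astype(int)

# Story volume enters as a covariate and as a regression weight, mirroring the text above.
rd = smf.wls("negativity ~ post * w + n_stories",
             data=weekly, weights=weekly["n_stories"]).fit(cov_type="HC2")
print(rd.params["post"], rd.pvalues["post"])
```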

In both cases, we summarize the results of these estimation procedures with a formal meta-analysis, using Cohen’s standardized mean difference procedure. Note that we report these quantities at only the monthly level for the adverse events data. This serves as a conservative estimate of the effect of AW98 on vaccine skepticism, as reporting the larger aggregation periods (included as robustness checks) would potentially inflate the average effect size.
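To show how the individual RD estimates combine, the sketch below pools several point estimates by fixed-effect inverse-variance weighting. The paper’s meta-analysis uses Cohen’s standardized mean difference, so this is only an illustration of the pooling logic, and the inputs are placeholder values rather than our actual estimates.

```python
import math

# Placeholder inputs: six hypothetical RD coefficients and standard errors (monthly models).
estimates = [72.0, 65.5, 74.1, 68.9, 70.0, 71.3]
std_errors = [10.2, 9.8, 11.0, 9.5, 10.6, 10.1]

weights = [1 / se ** 2 for se in std_errors]   # fixed-effect inverse-variance weights
pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.2f} [{low:.2f}, {high:.2f}]")
```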

Results

Fig 1 presents an initial test of AW98’s effect on MMR safety concerns. For reference, panel a presents the total yearly count of reported adverse experiences with MMR from 1990 to 2019. Panel b visualizes the results of several regression discontinuity (RD) estimates of both the substantive effect of AW98 on monthly VAERS reports and its statistical significance. If AW98 did indeed influence public perceptions of MMR vaccine safety, we would expect to see a substantively large and statistically significant increase in the period following the paper’s publication, compared to the period before it.

The descriptive results presented in Fig 1A document that, prior to the publication of AW98, parents typically reported under 2,000 adverse MMR events per year. Following AW98, that quantity grew steeply, to over 4,000 yearly reports by 2004. Formal RD tests presented in Fig 1B document a large and statistically significant increase in events immediately following the publication of AW98. Meta-analyzing the six monthly tests presented in the figure suggests an increase of 70.44 [52.19, 88.75] reports per month.

Fig 2. MMR news sentiment pre/post AW98 (weighted by volume).

Figure presents weekly sentiment ratings of vaccine-related stories from major news outlets (gray shaded circles), weighted by the total number of stories published that week (circles are larger in weeks with more coverage). The figure also reports the effect (B) and two-tailed significance (p) of AW98 on weekly sentiment before/after AW98 (dashed vertical line), across several RD estimation strategies (see: Materials & methods). All RD estimates are calculated as linear effects, which is appropriate given the high degree of consonance between the locally weighted polynomial trend line (dashed lines) and the linear trend line (solid line) fit to the weekly data, both of which adjust for total volume. N = 674 stories.

https://doi.org/10.1371/journal.pone.0256395.g002

These results hold across tests that vary the report aggregation period, the standard error estimator, and whether AW98 is treated as a “sharp” or “fuzzy” discontinuity. The results also hold when accounting for population growth over time (which might otherwise result in more reports being filed).

Additionally, to ensure that growth in MMR concern is not confounded by (1) changes in the procedures by which parents report adverse effects to VAERS over time (e.g., the ability to submit reports online as personal home computing and internet access grew), (2) changes in how health care providers diagnose autism, and/or (3) time itself (i.e., the possibility that attitudes toward all vaccines grew more negative over time for reasons unrelated to AW98), we offer a placebo test in the online materials using reports of adverse reactions to HIBV (Haemophilus influenzae type B vaccine), which has been routinely administered to children since the early 1990s but was not identified as a health risk in AW98.

If the effects of AW98 merely coincided with unobserved changes in reporting mechanisms and/or more general changes in public vaccine sentiment, we should observe comparatively more VAERS reports in the post- (vs. pre-) AW98 period for the HIBV placebo as well, which would be indicative of a spurious effect of AW98. However, the analyses presented in the online materials suggest that AW98 had no statistically or substantively discernible effect on HIBV adverse event reports (see: S1 Fig).
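In the same spirit, the placebo analysis amounts to rebuilding the monthly series for HIBV and re-fitting the same RD specification, here with an added quadratic term in the running variable as noted in S1 Fig. The sketch below illustrates this under assumed file and VAERS type-code names; see the online materials for the actual specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical all-vaccine VAERS extract; "HIB" type codes are an assumption about the data.
reports = pd.read_csv("vaers_all_1990_2019.csv", parse_dates=["RECVDATE"])
hibv = reports[reports["VAX_TYPE"].str.startswith("HIB")]

monthly = (hibv.set_index("RECVDATE").resample("MS").size()
               .rename("hibv_reports").reset_index())
cutoff = pd.Timestamp("1998-02-01")
monthly["m"] = ((monthly["RECVDATE"].dt.year - cutoff.year) * 12
                + (monthly["RECVDATE"].dt.month - cutoff.month))
monthly["post"] = (monthly["m"] >= 0).astype(int)

# Same RD form as for MMR, plus a quadratic running-variable term (cf. the S1 Fig correction).
placebo = smf.ols("hibv_reports ~ post * m + I(m ** 2)", data=monthly).fit(cov_type="HC2")
print(placebo.params["post"], placebo.pvalues["post"])  # a null effect is expected here
```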

Fig 2 presents evidence of a potential mechanism for this effect. In the weeks following AW98’s publication, the volume of negative news stories about the MMR vaccine from major news outlets increased substantially. Meta-analyzing the RD analyses presented in Fig 2 suggests that negativity increased by 9.59% [3.66, 15.51] following publication, approaching conventional two-tailed significance in three out of four estimation strategies. This provides preliminary evidence that AW98 influenced how the media talked about MMR, which in turn drew public attention to concerns about vaccine safety.

Conclusions

This research demonstrates that AW98, and media attention to it, may have changed how some Americans viewed MMR safety. Whereas anecdotal accounts suggest that AW98 led to an increase in vaccine skepticism in the US, early studies of pre/post AW98 vaccine compliance rates cast doubt on the study’s impact on public opinion. By constructing more sensitive indicators of vaccine skepticism, we detect a large, robust, and statistically significant uptick in public concern about MMR safety following AW98.

Without public opinion data, of course, we cannot determine precisely how many Americans came to hold negative views toward MMR post-AW98. However, our approach allows us to document change in vaccine sentiment over time attributable to AW98, enabling us to shed new light on a previously muddled area of vaccine history.

Discussion

Many Americans hold skeptical views about the safety of childhood vaccines [7, 9, 12–15]. Vaccine skepticism has important public health consequences, as people who believe that vaccines are unsafe tend to be less likely to intend to vaccinate themselves and their children against vaccine-preventable illnesses [14–16], and more likely to oppose pro-vaccine health policies [7, 17]. Understanding the origins of vaccine skepticism in the U.S. can help researchers better anticipate and mitigate the effects that fraudulent claims might have on public vaccine opinion in the future [18].

Consequently, this research has several important public health implications. First, it suggests that attention to false or misleading vaccine research can impact public confidence in vaccines. Media sources should therefore work in consultation with researchers to stringently vet vaccine-related stories before sharing their results with the public. This will be particularly important in the public health environment that follows the COVID-19 pandemic, in which scholars have the opportunity to produce research on the (potential) long-term side effects of vaccines currently approved for public use.

Additionally, our work underscores the importance of conducting regular public opinion polling about vaccine safety, even before controversies arise. While VAERS is a useful way to detect change in vaccine attitudes in the absence of public opinion data, surveys enable us to more precisely document change in vaccine skepticism over time. Constant surveillance of vaccine opinion can help researchers better understand how changes in the media environment might influence public vaccine confidence, and (potentially) thereby influence related vaccine policy attitudes and health behaviors.

Finally, our work suggests several opportunities for future research. For example, our study cannot disentangle the precise mechanism by which AW98 media coverage might influence public vaccine attitudes. Media attention could, for example, directly influence how Americans feel about vaccination by drawing parents’ attention to the possibility that their children might experience adverse side effects from vaccination. Alternatively, the vaccine media environment could indirectly influence vaccine attitudes by changing health care professionals’ attentiveness to potential side effects (e.g., increased monitoring for autism symptoms [19]), thereby engendering concern among parents and resulting in increased VAERS reports. Efforts to disentangle direct from more complex, indirect causal accounts are a worthwhile endeavor for future research.

Additionally, and perhaps most importantly, our research offers an opportunity to learn more about the effects of COVID-19 vaccine media coverage on attitudes toward vaccination. As more Americans became eligible to vaccinate against the virus, vaccine skeptics and prominent media figures who held skeptical views toward vaccination sometimes made use of VAERS reports to justify their (often misinterpreted) concerns about COVID-19 vaccine safety [20–22]. Consequently, future work should make an effort to determine whether media coverage of these issues in turn influenced reports to VAERS. This information can help scholars not only better understand the effects of media coverage on vaccine attitudes for diseases not studied in our work, but also how the quality of adverse effect data reported to VAERS might be influenced by exogenous media, political, and other social events.

Moreover, although public opinion data about COVID-19 vaccine attitudes and behaviors are comparatively more plentiful than MMR opinion data, few surveys ask uptake questions at the daily level. As we demonstrate in this research, VAERS data, which can be readily aggregated at the daily level, could function as a useful supplement to public opinion data for detecting short-term effects of changes in the vaccine communication environment. For example, researchers could use VAERS data to document the effects of media attention to the federal government’s decision to temporarily pause administration of Johnson & Johnson’s COVID-19 vaccine. Because this pause lasted just ten days [23], VAERS data may offer an opportunity to assess these effects with additional granularity.
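As a hypothetical illustration of this idea, the sketch below builds a daily series of reports for the Johnson & Johnson vaccine around the pause and compares average daily reports before and after its start; the file name, manufacturer label, and window are assumptions, and a real analysis would require the full RD machinery plus attention to reporting lags.

```python
import pandas as pd

# Hypothetical 2021 VAERS extract; column and manufacturer labels are assumptions.
reports = pd.read_csv("vaers_covid_2021.csv", parse_dates=["RECVDATE"])
jj = reports[reports["VAX_MANU"] == "JANSSEN"]

daily = jj.set_index("RECVDATE").resample("D").size().rename("reports")
window = daily.loc["2021-03-15":"2021-05-15"]   # window around the April 2021 pause

# Crude pre/post comparison of average daily reports; a real test would use the RD setup above.
print(window.loc[:"2021-04-12"].mean(), window.loc["2021-04-13":].mean())
```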

Supporting information

S1 Fig. Placebo Test: HIBV adverse effect reports.

Placebo test replicates the analyses and tests presented in Fig 1 in the main text (although note that these analyses also account for exponential decline in VAERS reports following the publication of AW98 in the bias-correction RD models by specifying a quadratic regression estimator). Please refer to the main text for additional methodological details. HIBV refers to the Haemophilus influenzae type B vaccine, which, while typically administered in childhood since the early 1990s (like MMR), was not mentioned as a potential cause of autism in the AW98 paper. Consequently, if the effects documented in Fig 1 were merely the result of unobserved confounds, or of time itself, we would expect to see a corresponding spike in HIBV adverse event reports following the publication of AW98. Consistent with the idea that the effects of AW98 are not confounded, the results document little effect of AW98 on VAERS reports for HIBV. Visually, although adverse event reports rise slightly following the publication of AW98, reports (1) were already increasing prior to its publication, and (2) had been in a period of exponential decline before 1998. Statistically, the results fail to document a significant increase in VAERS reports after correcting for the (pronounced) inverted-parabolic nature of the pre-AW98 data using quadratic regression. Note that because the analyses document no substantive or statistical evidence of a discontinuity in adverse reports following AW98 in the higher-powered (weekly) analyses, we do not re-estimate at the yearly or quarterly level.

https://doi.org/10.1371/journal.pone.0256395.s001

(TIF)

References

  1. Eggertson L. (2010). Lancet retracts 12-year-old article linking autism to MMR vaccines. Canadian Medical Association Journal, 182(4), E199. pmid:20142376
  2. Leask J., Booy R., & McIntyre P. B. (2010). MMR, Wakefield and The Lancet: what can we learn. Med J Aust, 193(1), 5–7. pmid:20618105
  3. Smith M. J., Ellenberg S. S., Bell L. M., & Rubin D. M. (2008). Media coverage of the measles-mumps-rubella vaccine and autism controversy and its relationship to MMR immunization rates in the United States. Pediatrics, 121(4), e836–e843. pmid:18381512
  4. Li N., Stroud N. J., & Jamieson K. H. (2017). Overcoming false causal attribution: Debunking the MMR–autism association. The Oxford Handbook of the Science of Science Communication, 433–444.
  5. Quick J. D., & Larson H. (2018). The Vaccine Autism Myth Started 20 Years Ago. Here’s Why it Still Endures Today. Time Magazine. https://time.com/5175704/andrew-wakefield-vaccine-autism/
  6. Stecula D. A., Kuru O., & Jamieson K. H. (2020). How trust in experts and media use affect acceptance of common anti-vaccination claims. Harvard Kennedy School Misinformation Review, 1(1).
  7. Dube E., Vivion M., & MacDonald N. E. (2015). Vaccine hesitancy, vaccine refusal and the anti-vaccine movement: influence, impact and implications. Expert Review of Vaccines, 14(1), 99–117. pmid:25373435
  8. Olive J. K., Hotez P. J., Damania A., & Nolan M. S. (2018). The state of the antivaccine movement in the United States: A focused examination of nonmedical exemptions in states and counties. PLoS Medicine, 15(6), e1002578. pmid:29894470
  9. Motta M., Callaghan T., & Sylvester S. (2018). Knowing less but presuming more: Dunning-Kruger effects and the endorsement of anti-vaccine policy attitudes. Social Science & Medicine, 211, 274–281. pmid:29966822
  10. Centers for Disease Control & Prevention / Food & Drug Administration. (2017). VAERS Data Use Guide. Department of Health & Human Services. https://vaers.hhs.gov/docs/VAERSDataUseGuide_October2017.pdf
  11. Tausczik Y. R., & Pennebaker J. W. (2010). The psychological meaning of words: LIWC and computerized text analysis methods. Journal of Language and Social Psychology, 29(1), 24–54.
  12. Reinhart R. J. (2020). Fewer in US continue to see vaccines as important. Gallup News. https://news.gallup.com/poll/276929/fewer-continue-vaccines-important.aspx
  13. Hornsey M. J., Harris E. A., & Fielding K. S. (2018). The psychological roots of anti-vaccination attitudes: A 24-nation investigation. Health Psychology, 37(4), 307. pmid:29389158
  14. Oliver J. E., & Wood T. (2014). Medical conspiracy theories and health behaviors in the United States. JAMA Internal Medicine, 174(5), 817–818. pmid:24638266
  15. Nyhan B., Reifler J., Richey S., & Freed G. L. (2014). Effective messages in vaccine promotion: a randomized trial. Pediatrics, 133(4), e835–e842. pmid:24590751
  16. Ophir Y., & Jamieson K. H. (2018). Intentions to use a novel Zika vaccine: the effects of misbeliefs about the MMR vaccine and perceptions about Zika. Journal of Public Health, 40(4), e531–e537. pmid:29554290
  17. Stecula D. A., Kuru O., Albarracin D., & Jamieson K. H. (2020). Policy Views and Negative Beliefs About Vaccines in the United States, 2019. American Journal of Public Health, 110(10), 1561–1563. pmid:32816542
  18. MacFarlane D., Hurlstone M. J., & Ecker U. K. (2020). Protecting consumers from fraudulent health claims: A taxonomy of psychological drivers, interventions, barriers, and treatments. Social Science & Medicine, 112790.
  19. Wing L., & Potter D. (2002). The epidemiology of autistic spectrum disorders: is the prevalence rising? Mental Retardation and Developmental Disabilities Research Reviews, 8(3), 151–161. pmid:12216059
  20. Kim N. Y. (2021). Deaths After Vaccination Don’t Prove that COVID-19 Vaccine is Lethal. PolitiFact. https://www.politifact.com/factchecks/2021/feb/16/facebook-posts/says-death-reports-federal-database-show-fatal-ris/
  21. Brewster J. (2021). The Truth Behind Tucker Carlson’s Claims about COVID-19 Vaccine Deaths and the Government’s VAERS Database. Forbes. https://www.forbes.com/sites/jackbrewster/2021/05/12/the-truth-behind-tucker-carlsons-claims-about-covid-19-vaccine-deaths-and-the-governments-vaers-database/?sh=657d25013a9a
  22. Spencer S. H. (2021). Tucker Carlson Misrepresents Vaccine Safety Reporting Data. FactCheck.org. https://www.factcheck.org/2021/05/scicheck-tucker-carlson-misrepresents-vaccine-safety-reporting-data/
  23. Remmel A. (2021). ‘It’s a minefield’: COVID vaccine safety poses unique communication challenge. Nature, 488–489. pmid:34021288