
Tracking Public Beliefs About Anthropogenic Climate Change

  • Lawrence C. Hamilton ,

    Lawrence.Hamilton@unh.edu

    Affiliations University of New Hampshire, Durham, New Hampshire, United States of America, Carsey School of Public Policy, University of New Hampshire, Durham, New Hampshire, United States of America

  • Joel Hartter,

    Affiliations Carsey School of Public Policy, University of New Hampshire, Durham, New Hampshire, United States of America, Environmental Studies Program, University of Colorado, Boulder, Colorado, United States of America

  • Mary Lemcke-Stampone,

    Affiliation Geography Department, University of New Hampshire, Durham, New Hampshire, United States of America

  • David W. Moore,

    Affiliation Carsey School of Public Policy, University of New Hampshire, Durham, New Hampshire, United States of America

  • Thomas G. Safford

    Affiliations University of New Hampshire, Durham, New Hampshire, United States of America, Carsey School of Public Policy, University of New Hampshire, Durham, New Hampshire, United States of America

Abstract

A simple question about climate change, with one choice designed to match consensus statements by scientists, was asked on 35 US nationwide, single-state or regional surveys from 2010 to 2015. Analysis of these data (over 28,000 interviews) yields robust and exceptionally well replicated findings on public beliefs about anthropogenic climate change, including regional variations, change over time, demographic bases, and the interacting effects of respondent education and political views. We find that more than half of the US public accepts the scientific consensus that climate change is happening now, caused mainly by human activities. A sizable, politically opposite minority (about 30 to 40%) concede the fact of climate change, but believe it has mainly natural causes. Few (about 10 to 15%) say they believe climate is not changing, or express no opinion. The overall proportions appear relatively stable nationwide, but exhibit place-to-place variations. Detailed analysis of 21 consecutive surveys within one fairly representative state (New Hampshire) finds a mild but statistically significant rise in agreement with the scientific consensus over 2010–2015. Effects from daily temperature are detectable but minor. Hurricane Sandy, which brushed New Hampshire but caused no disaster there, shows no lasting impact on that state’s time series—suggesting that non-immediate weather disasters have limited effects. In all datasets political orientation dominates among individual-level predictors of climate beliefs, moderating the otherwise positive effects from education. Acceptance of anthropogenic climate change rises with education among Democrats and Independents, but not so among Republicans. The continuing series of surveys provides a baseline for tracking how future scientific, political, socioeconomic or climate developments impact public acceptance of the scientific consensus.

Introduction

“Human activities are changing Earth’s climate,” reads the opening sentence of the American Geophysical Union’s position statement on climate change [1]. The same point is central to statements by other science organizations, broad interdisciplinary reviews [2], direct surveys of scientists [3][4], and literature reviews [5][6]. No major science organization takes a contrary position that human activities are not changing the Earth’s climate [7].

While the scientific consensus has strengthened, public opinion remains seriously divided, without a clear trend [8][9]. Repeated surveys report annual-scale variations possibly related to developments such as release of the 2007 IPCC report, the 2008 economic crisis, “climategate” attacks on scientists in 2009, or a snowy northeastern US winter in 2011 [10][11]. Decadal-scale surveys provide essential perspective, but must employ questions with wording that has changed over the years, or else was frozen at a time when the discourse was different. Whereas recent scientific statements emphasize the term “climate change,” referencing regional differences and shifts in precipitation, storms or extreme events, the legacy survey questions often ask about “global warming” instead. Non-scientists sometimes misinterpret this term to mean that every place should be constantly warming, which seems easily refuted by pointing out a place that is cooling. Moreover, there has been publicity about a “pause” or slowdown in the rate of global air temperature rise, leading to unscientific claims that global warming had stopped [12]. The term “global warming” by itself apparently can elicit more conservative opposition than the term “climate change” on surveys [13]. A potentially greater problem with wording is that some of the longest-running survey questions do not specify human causation, which today (rather than the mere fact of change) forms the main point of public contention [14]. The reality of climate change has been publicly acknowledged even by political leaders who dismiss human causation as a hoax [15]. These complications in public discourse make it harder to interpret responses to survey questions designed long ago.

To unambiguously track public acceptance of the scientific consensus, in 2010 we started asking a question with three response choices:

Which of the following three statements do you personally believe?

  1. Climate change is happening now, caused mainly by human activities.
  2. Climate change is happening now, but caused mainly by natural forces.
  3. Climate change is NOT happening now.

Respondents can also say they don’t know, or decline to answer. Our question is present-tense and neutrally worded, with no mention of policies or future consequences. One response corresponds to the central point of scientific consensus statements, while others present the main logical alternatives. Although some scientists might argue that “belief” is the wrong term for their conclusions, it makes more sense with regard to acceptance by the general public. Trained telephone or face-to-face interviewers read the response choices in rotating order to avoid possible bias. From 2010 to 2015 over 28,000 people answered this question on 35 random-sample surveys, including the benchmark General Social Survey and a unique statewide time series.

Below we synthesize data from all of these surveys, analyzing them in a common multivariate framework. Logistic regression quantifies the effects of respondent age, gender, education and political orientation. This broad replication establishes a set of robust and consistent results. Regional surveys reflect the scale of place-to-place variation in climate-change beliefs, while the single-state time series shows temporal variation, permitting tests for the influence of daily weather, seasons and trends.

Data

Three US nationwide surveys, 11 surveys in selected, often rural US regions, and a series of 21 surveys in the state of New Hampshire comprise the data for this paper. Individual surveys, which include questions on many topics besides climate, have been introduced in previous papers. Here we undertake the first synthesis that brings all of them together and analyzes responses to the common climate-change question.

General Social Survey (GSS 2012, 1,295 interviews)

The climate beliefs question was asked in face-to-face interviews for a panel subset of this representative US survey (variable clmtchng in GSS terminology) [16]. The National Opinion Research Center (NORC) at the University of Chicago, supported by the National Science Foundation, conducts the GSS and publishes its data as a resource for research. Intensive sampling and diagnostic efforts make GSS a benchmark for representativeness among US surveys. The 2012 response rate is given as 71%. Our analysis applies probability weights (variable wtssall) calculated by NORC.

National Community and Environment in Rural America Survey (NCERA 2011, 2,006 interviews)

Climate belief and knowledge questions were carried on this representative 50-state telephone survey conducted in summer 2011 [14]. NCERA was developed by researchers at the Carsey School of Public Policy, with sampling and interviewing done by the University of New Hampshire (UNH) Survey Center. The response rate was 31%, as calculated by the American Association for Public Opinion Research (AAPOR) definition 4 [17]. Probability weights (named ncerawt in S1 Dataset attached; see S1 File for a complete list of variables) that take account of household size, age-sex-race distributions by region, and metropolitan/nonmetropolitan composition are applied with relatively minor effects.

iMediaEthics Poll on Climate Change (IME 2014, 1,002 interviews)

Princeton Survey Research Associates International conducted this landline and cell phone survey with a nationally representative sample of adults living in the continental United States. Interviews were done in English and Spanish by Princeton Data Source from July 17–20, 2014. Probability weights (variable wt2 in the attached S2 Dataset) correct for known demographic discrepancies. The climate-change question is worded identically across the other surveys described here; the context and wording on the iMediaEthics survey differ slightly, as given in the attached documentation file (S3 File).

New Hampshire Granite State Poll (GSP 2010–2015, 11,548 interviews)

The Granite State Poll conducts telephone interviews with independent random samples of about 500 New Hampshire residents four times each year. Our core climate question has been carried on 21 surveys to date, from April 2010 through May 2015. Sampling and interviews for the GSP are done by the UNH Survey Center, with response rates averaging 25% (AAPOR 2006 definition 4). Probability weights (variable censuswt2) provide adjustments for minor design and sampling bias. The S3 Dataset attached contains the climate-change responses from all of the New Hampshire, CERA/CAFOR and other surveys described in this paper, a total of 28,962 individual interviews.

Community and Environment in Rural America and Communities and Forests in Oregon (CERA 2010–2012 and CAFOR 2011, 2014, 13,111 interviews)

These telephone surveys, done by the UNH Survey Center under direction of Carsey School researchers, employ sampling, interviewing and weighting methods similar to those of NCERA. They target small clusters of counties, many of them nonmetropolitan. The locations are diverse but selected non-randomly for different projects. The CERA and CAFOR surveys used here involve regions in Appalachia, the Columbia River, Gulf Coast Florida, Gulf Coast Louisiana, northern New England, eastern Oregon, the Olympic Peninsula, Puget Sound, and southeast Alaska. Table 1 lists the counties, dates and number of interviews comprising each of these CERA/CAFOR surveys. Citations to many papers describing individual studies are given in [18][19][20]. Response rates for individual surveys (AAPOR 2006 definition 4) range from 18 to 48%, with a mean of 31%. For all analyses here we adopt the original CERA or CAFOR weighting schemes, which take into account household size, county adult population and age-sex or age-sex-race distributions.

Table 1. Community and Environment in Rural America (CERA) and Communities and Forests in Oregon (CAFOR) surveys that carried the climate-beliefs question.

Conducted by Carsey School of Public Policy (formerly Carsey Institute) researchers over 2010 to 2014 [18][19][20]. N denotes the number of interviews.

https://doi.org/10.1371/journal.pone.0138208.t001

The privacy and interests of subjects interviewed for these surveys are protected through protocols approved by Institutional Review Boards at NORC (for GSS) or UNH (for NCERA, GSP, CERA and CAFOR). All data are recorded, analyzed and presented anonymously, as specified for these protocols.

Methods

The Stata 14.0 statistical program is employed for data management, analysis and graphing [21]. Figs 1 and 2 chart response percentages calculated using probability weights as described above. Ninety-five percent confidence intervals appear with each data point in the time plots of Fig 2.
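
To illustrate the weighting step, the following minimal Stata sketch (not the authors' code; the variable names climate, wt and survey are hypothetical placeholders) computes weighted response percentages with 95% confidence intervals of the kind plotted in Figs 1 and 2:

  svyset [pweight=wt]                      // declare the probability weight
  svy: proportion climate                  // weighted response percentages with 95% CIs
  svy: proportion climate, over(survey)    // separate estimates for each survey, as in Fig 2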

Fig 1. (A) Response percentages for climate-change question on 3 national and 21 statewide New Hampshire surveys; (B) percentage choosing the now/human response on 11 CERA/CAFOR surveys.

Respondents who said they do not know, or gave no answer, are categorized as DK/NA in (A).

https://doi.org/10.1371/journal.pone.0138208.g001

Fig 2. (A) Now/human response by date of survey, and (B) broken down by political party, spring 2010 to spring 2015.

Surveys graphed at median interview dates, and shown with 95% confidence intervals.

https://doi.org/10.1371/journal.pone.0138208.g002

To quantify and test multiple predictors of now/human responses to the climate question, Table 2 presents five weighted logistic regression models. Such models are commonly employed with the categorical dependent variables of survey data. If P(yi = 1) is the conditional probability of a now/human response by the ith individual, the odds of such a response are defined as O(yi = 1) = P(yi = 1)/P(yi ≠ 1). Logistic regression models the conditional log odds as a linear function of m predictor variables x1i, x2i, …, xmi:

ln O(yi = 1) = β0 + β1x1i + β2x2i + … + βmxmi

The β coefficients are estimated by maximum likelihood.

Table 2. Individual characteristics (all surveys), and county (CERA/CAFOR) or season, daily temperature anomaly and year (GSP), as predictors of belief that climate change is happening now, caused mainly by human activities.

Odds ratios from weighted logistic regression.

https://doi.org/10.1371/journal.pone.0138208.t002

Exponentiating the estimated β coefficients, e^β, yields odds ratios interpretable as multiplicative effects on O(yi = 1). Odds ratios greater than 1.0 represent “positive” effects, meaning that higher values of an x variable are associated with higher odds that y = 1. Odds ratios below 1.0 represent “negative” effects, meaning that higher x values are associated with lower odds that y = 1.
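
As a small numerical illustration of this interpretation (the coefficient values below are invented, not taken from Table 2), a logit coefficient β converts to an odds ratio e^β, and the odds ratio minus one gives the proportional change in the odds for a one-unit increase in x:

  display exp(0.18)     // 1.197: odds about 20% higher per one-unit increase in x
  display exp(-0.22)    // 0.803: odds about 20% lower per one-unit increase in x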

The x variables or predictors for all models in Table 2 include respondent age (in years), gender (0 male, 1 female), education (–1 high school or less, 0 some college or technical school, 1 college graduate, 2 postgraduate) and political party (–1 Democrat, 0 Independent, 1 Republican). Under this coding, when education×party interaction terms are present the main effects of education represent its effects when party = 0 (Independents). Similarly the main effects of party represent its effects when education = 0 (some college or technical school).
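
A minimal sketch of how one of these models might be estimated in Stata, assuming hypothetical variable names (nowhuman for the 0/1 response, wt for the probability weight, and age, female, educ, party coded as described above); this illustrates the general approach rather than reproducing the authors' code:

  * educ: -1 HS or less, 0 some college, 1 college graduate, 2 postgraduate
  * party: -1 Democrat, 0 Independent, 1 Republican
  generate educXparty = educ*party                         // education x party interaction
  logit nowhuman age female educ party educXparty [pweight=wt], or
  * With this centering, the educ odds ratio applies to Independents (party = 0)
  * and the party odds ratio to respondents with some college (educ = 0).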

The CERA/CAFOR model in the fourth column of Table 2 pools data from 11 regional CERA or CAFOR surveys representing 38 county/occasion combinations (see Table 1). Previous analysis found substantial county-to-county variation [22], so intercept dummy variables (0,1 indicators) for counties are included among the predictors. To represent 38 county/occasions we need one intercept and 37 dummy variables, but for readability these 38 coefficients are not listed in the table. Instead, an adjusted Wald test for all of them together confirms significant (p < .001) place-to-place variation.
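
Continuing the sketch above, the county/occasion intercepts could be entered as factor-variable indicators and tested jointly; under survey (svy) estimation, Stata's testparm performs an adjusted Wald test. The variable county, identifying the 38 county/occasion groups, is again hypothetical:

  svyset [pweight=wt]
  svy: logit nowhuman age female educ party educXparty i.county, or
  testparm i.county        // adjusted Wald test: all county/occasion indicators jointly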

The GSP model in the fifth column of Table 2 pools data from 21 New Hampshire surveys, 2010–2015. Interview-day temperature anomaly, season and year are included among the predictors. A statewide temperature index (mean 0.9°C, range –11.1 to +14.6°C) is defined as the mean of anomalies (departures from 1981–2010 daily normals) across the state’s four continuing US Historical Climatology Network stations (Durham, Keene, Hanover and First Connecticut Lake). Season is represented by three dummy variables with winter as the base category. Including year among the predictors tests for a time trend.
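
The interview-day predictors could be assembled along the following lines (a sketch with hypothetical file and variable names: a station-by-day file of anomalies, and a GSP respondent file with a Stata-format interview date):

  * Statewide daily index: mean anomaly (departure from 1981-2010 normals) across stations.
  use station_daily_anomalies, clear            // hypothetical: date, station, anomaly
  collapse (mean) temp_index=anomaly, by(date)
  tempfile tindex
  save `tindex'

  use gsp_interviews, clear                     // hypothetical GSP respondent file
  merge m:1 date using `tindex', keep(master match) nogenerate
  generate year   = year(date)                  // linear time trend
  generate month  = month(date)
  generate spring = inrange(month, 3, 5)        // season dummies; winter (Dec-Feb)
  generate summer = inrange(month, 6, 8)        //   is the omitted base category
  generate fall   = inrange(month, 9, 11)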

Four significant education×party interaction effects from Table 2 are visualized as adjusted marginal plots [23] in Fig 3. Curves depict the predicted probability of a now/human response as a function of respondent education and political party identification, adjusted for all the other predictors in each model.
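
Fig 3-style plots could be produced with Stata's margins and marginsplot commands, as in this sketch (hypothetical variable names as above; treating educ and party as continuous avoids factor-variable restrictions on their negative codes):

  svy: logit nowhuman age female c.educ c.party c.educ#c.party
  margins, at(educ=(-1(1)2) party=(-1 0 1))    // adjusted predicted probabilities
  marginsplot, xdimension(educ)                // education on the x axis, one curve per party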

Fig 3. Probability of now/human response in GSS, NCERA, CERA/CAFOR and GSP surveys as a function of education, by political identification.

Adjusted marginal plots with 95% confidence intervals calculated from the logistic regression models in Table 2.

https://doi.org/10.1371/journal.pone.0138208.g003

Results

Climate-change beliefs across 35 surveys

Three nationwide US surveys with diverse sampling and interview methods find 52% (in 2011) or 53% (in 2012 and 2014) agreement with the scientific consensus that human activities are now changing Earth’s climate (Fig 1A). A series of 21 statewide New Hampshire surveys over 2010 to 2015 runs a few points higher than the national surveys overall (55%). On these surveys a substantial minority (31 to 39%) concede that climate change is happening, but caused mainly by natural forces. Few (3 to 8%) say they believe climate change is not happening, or decline to express an opinion. In social, cognitive or political terms, the now/human and now/natural respondents prove distinct, whereas now/natural and not now respondents are less distinct [14]. Survey questions that simply ask whether global warming/climate change is happening, without specifying a cause, confuse two opposing viewpoints—in effect, grouping some of the now/natural responses together with now/human.

The CERA and CAFOR surveys target small and often rural clusters of counties, in regions selected for a variety of separate projects. Agreement with the scientific consensus ranges from 36 to 58% across these 11 surveys (Fig 1B). The regions studied include growing amenity-rich or near-urban areas, others dependent on coal or oil production, and still others with declining traditional resources such as forestry. Details of local environment and society help to understand place-to-place variations in climate and other environmental perceptions [18][19][24][25].

Tracked over time

Fig 2A tracks the percentage of now/human responses on nationwide and New Hampshire surveys over time, and their 95% confidence intervals. The different surveys line up surprisingly well, with New Hampshire results a few points higher. Fig 2A gives a visual impression of slight upward drift, to be tested by the year coefficient in Table 2. In the New Hampshire time line we see no sign of a lasting impact from Hurricane Sandy, which brushed the state in late October 2012 (between our October 2012 and January 2013 surveys) but caused no disaster there.

The placid surface of Fig 2A covers a deep partisan divide (Fig 2B). Overall around 80% of New Hampshire Democrats, 55% of Independents, and 31% of Republicans agree with the scientific consensus that climate change is happening now, caused mainly by human activities. This partisan gap is one of the largest for any question asked on our surveys. The gap is somewhat greater in New Hampshire than nationally, partly reflecting a higher proportion of college graduates who, as will be seen, tend to be most polarized on this issue. For GSS the partisan gap is just 27 points, but even that exceeds the gaps on historically polarizing abortion or gun control questions asked on the same survey. Surveys using different questions suggest that partisan gaps in climate beliefs have widened over the past decade [9][26].

Individual-level predictors

Political orientation and education dominate other characteristics in predicting individual responses. Moreover, politics moderates the effects of education. Table 2 quantifies these effects in logistic regression models that predict odds of a now/human response to the climate question. For the common individual-level predictors—age, gender, education, political party and education×party—these five analyses obtain remarkably consistent results.

Age effects are significant for every model except GSS, and all have odds ratios below 1, meaning that older respondents are less likely to agree with the scientific consensus. For example, an odds ratio of 0.985 (IME) indicates that the odds favoring a now/human response to the climate question are multiplied by 0.985, or decrease by 1.5%, with each one-year increase in age (if other predictors stay the same). With a 10-year increase in age, the odds are multiplied by 0.98510 = 0.860, or decrease about 14%.
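
The ten-year figure can be verified with a one-line calculation:

  display 0.985^10      // = .860, so odds about 14% lower with a 10-year age increase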

In the CERA/CAFOR and GSP data women are significantly more likely to agree with the consensus, as shown by odds ratios above 1. The CERA/CAFOR odds ratio, 1.213, tells us that odds favoring a now/human response are about 21% higher for women than for men, other things being equal.

The main effects of education, significant across all of these models, suggest that among Independents (party = 0) the odds of belief in anthropogenic climate change increase by 20 to 25% (are multiplied by 1.202 to 1.249) with each step in education. Significant education×party interactions, however, indicate that the effects of education change with political party. Like the main effects of education, the magnitude of education×party interactions is roughly consistent (odds ratios from 0.748 to 0.850) across different datasets. Adjusted marginal plots in Fig 3 visualize the significant interactions in terms of probability. Among Democrats and Independents, probability of a now/human response rises with education. Among Republicans, however, this probability slightly declines with education. Better-educated Democrats and Republicans thus stand farther apart.

Place-to-place variation

The CERA/CAFOR surveys covered 35 different counties, and re-surveyed three of them on two different occasions (2011 and 2014), for a total of 38 county/occasions. Earlier work found substantial place-to-place variation [22], motivating our inclusion of 37 intercept dummy variables in the regression model. As Table 2 notes these county/occasion indicators help to predict individual-level climate beliefs. An adjusted Wald test finds that the county indicators collectively have significant impacts.

Place-to-place variations can themselves be a focus of research. Studies using other dependent variables have found broad structural effects, as from unemployment or population growth rates, alongside other effects reflecting local circumstances such as the importance of coal mining in rural Kentucky or the experience of warming winters in northern New England [18][19][24][25]. Our focus here has been on individual-characteristic effects that prove stable across many different surveys. This includes the CERA/CAFOR surveys where, after adjusting for the significant place-to-place variation, we find substantially the same individual effects (from age, gender, education, party and education×party) seen in other surveys.

Temporal variation

The New Hampshire GSP interviews were conducted on 217 different days over 2010–2015. Temporal variation across this series of 21 surveys is much less than the spatial variation across the 11 regional CERA/CAFOR surveys, but it does display several patterns. Temperature anomalies on the interview day show a weak though significant effect on climate-change beliefs. Temperature effects prove intermittent within subsets of these data, however, marking them as not robust compared with individual and place effects.

The odds of belief in anthropogenic climate change are about 14% higher (multiplied by 1.145) in summer than winter. None of the seasonal effects are statistically significant, however. On the other hand, a slight upward trend in now/human responses, subjectively visible in Fig 2A, is more formally supported by a significant odds ratio for year in predicting the individual responses (Table 2). With each additional year, odds favoring a now/human response rise about 7% (multiplied by 1.07)—other things being equal.

Discussion and Conclusions

Politics and education effects

The most striking result here is the stability of public beliefs about anthropogenic climate change. That holds across different surveys (Fig 1A) and over a five-year time span (Fig 2A), although not across places (Fig 1B). General stability is anchored by wide, persistent political divisions (Fig 2B). Effects from individual age, gender, education and political party manifest as similar odds ratios on many different surveys (Table 2). Very similar education×party interaction effects occur in most of these surveys as well. In social research, interaction effects in multivariate models frequently prove to be sample-specific, so the degree of replication seen in Fig 3 is extraordinary.

Political identity dominates other background characteristics in predicting individual climate-change beliefs. Politics moderates effects from education, the second-strongest predictor. Agreement with the scientific consensus increases with education among Democrats and Independents (or liberals and moderates), but stays level or declines with education among Republicans (or conservatives). Similar interactions were first tested with different climate variables in 2006 GSS data [27] and subsequently replicated on other regional [24][28] and nationwide [9][29] surveys. Variations on this pattern include objectively-assessed science knowledge [30], numeracy [31] or self-assessed understanding [9][28] in place of education; and measures of ideology [9][27][28] or culture [31] in place of political party. Some other environment-related questions exhibit interactions of the same type [18][19][32].

Common explanations for the pattern invoke greater awareness among educated individuals about the views of politicians and media they follow—the elite cues hypothesis [9][33][34][35]. More educated or information-rich individuals also could be more effective in seeking out and retaining information that accords with their prejudices—as described by biased assimilation [9][36][37], motivated skepticism [38] and related hypotheses [26][39][40]. These explanations all hinge on the active, motivated acceptance/rejection of information, a major complication to the simpler information deficit hypothesis that people express low concern about scientifically-identified problems because they lack information that scientists could provide [41]. With regard to climate change many people assert that they are well informed, although their sense of understanding may come from politics rather than science [32].

Place and temporal effects

Place-to-place variations can be substantial (Fig 1B). Other studies have found both systematic and idiosyncratic explanations for such place effects, reflecting characteristics of the local economy, history, environment and culture [18][19][25].

Temporal variations over the years studied here have been smaller (Fig 2A), with only weak seasonal and daily temperature effects. The latter finding fits the mixed conclusions of previous research, in which some authors report effects from ambient conditions [42][43][44], weather [45][46][47][48][49][50] or climate trends [24][29]. Other studies, however, find minor or nonexistent effects from weather or climate [51][52]. These inconsistent results suggest that weather or climate effects tend to be minor and contingent, in contrast to the strong, ubiquitous effects of political orientation.

The New Hampshire time series was initiated to monitor possible changes in public agreement with the scientific consensus on climate change. The relative lack of change was an early, unexpected discovery. As the series lengthens, however, we see evidence of upward drift. Overall, New Hampshire public acceptance of anthropogenic climate change moved up about five points, from 53% in 2010 to 58% in 2015. This small but statistically significant (Table 2) drift roughly agrees with yearly nationwide results based on other survey questions [4]. That agreement on trends incidentally provides further encouragement for viewing the New Hampshire series as a proxy for national trends. Despite upward movement, both New Hampshire and national public opinion fall far short of the 97% consensus among climate scientists.

Future research

The basic climate-change question offers currency, simplicity and unambiguous interpretation—whether individuals personally agree with the central point of scientific consensus on this globally important issue. As the examples here show, the question adapts readily to diverse survey instruments, opening possibilities for temporal, geographic and social-group comparisons. One planned future application is the 2016 General Social Survey, which offers an impressive range of sociological covariates. We also expect further regional surveys along the lines of CERA and CAFOR, investigating local variations. Finally, the same climate question has proven useful at smaller scales, in the benchmark and evaluation stages of education activities that are in progress but not described here.

The quarterly resolution and increasingly long run of the New Hampshire time series provides a unique platform to detect and characterize future change. To date it has shown only minor fluctuations around a slow upward drift. Seemingly large external events including an election and nearby hurricane had no detectable effects, but the possibility remains that cumulative or more extreme political, economic or climate-related events could have greater impact. With or without dramatic impacts, the series provides a monitoring system for the shape of any changes in public acceptance—whether abrupt or gradual, ephemeral or lasting.

Supporting Information

S1 File. National CERA Survey (NCERA 2011) variable list (pdf format).

https://doi.org/10.1371/journal.pone.0138208.s001

(PDF)

S2 File. NCERA 2011 and CERA/CAFOR 2010–2011 surveys technical report (pdf format).

https://doi.org/10.1371/journal.pone.0138208.s002

(PDF)

S3 File. Methodology Statement and Topline for iMediaEthics Poll on Climate Change 2014, conducted by Princeton Survey Research Associates International (pdf format).

https://doi.org/10.1371/journal.pone.0138208.s003

(PDF)

S4 File. Stata graphing commands and statistical tables (pdf format).

https://doi.org/10.1371/journal.pone.0138208.s004

(PDF)

S1 Dataset. NCERA 2011 national survey (Stata 12 format)

https://doi.org/10.1371/journal.pone.0138208.s005

(DTA)

S2 Dataset. iMediaEthics 2014 national survey (Stata 12 format)

https://doi.org/10.1371/journal.pone.0138208.s006

(DTA)

S3 Dataset. Climate change belief responses from many surveys (Stata 12 format).

https://doi.org/10.1371/journal.pone.0138208.s007

(DTA)

S4 Dataset. Weighted percentages responding that climate change is happening now, mainly caused by human activities, on many surveys (Stata 12 format).

https://doi.org/10.1371/journal.pone.0138208.s008

(DTA)

S5 Dataset. Weighted percentages responding that climate change is happening now, mainly caused by human activities, by political party identification on many surveys (Stata 12 format).

https://doi.org/10.1371/journal.pone.0138208.s009

(DTA)

S6 Dataset. Climate change now/human responses from many surveys, by political party (Stata 12 format).

https://doi.org/10.1371/journal.pone.0138208.s010

(DTA)

Acknowledgments

The UNH Survey Center conducted all telephone interviews for NCERA, CERA/CAFOR and the Granite State Poll. The Carsey School of Public Policy at the University of New Hampshire provided logistical and administrative support.

Author Contributions

Conceived and designed the experiments: LH JH MLS DM TS. Performed the experiments: LH JH DM TS. Analyzed the data: LH MLS. Wrote the paper: LH JH.

References

  1. AGU. Human-induced climate change requires urgent action. American Geophysical Union position statement. 2013. Available: http://www.agu.org/sci_pol/positions/
  2. IPCC. Climate Change 2013—The Physical Science Basis. Summary for Policy Makers. Geneva, Switzerland: Intergovernmental Panel on Climate Change; 2013.
  3. Doran PT, Zimmerman MK. Direct examination of the scientific consensus on climate change. EOS 2009 90(3):22–23.
  4. Funk C, Rainie L. Public and scientists’ views on science and society. Pew Research Center 2015. Available: http://www.pewinternet.org/2015/01/29/public-and-scientists-views-on-science-and-society/ accessed 2/2/2015
  5. Anderegg WRL, Prall JW, Harold J, Schneider SH. Expert credibility in climate change. Proceedings of the National Academy of Sciences 2010 107(27):12107–12109.
  6. Cook J, Nuccitelli D, Green SA, Richardson M, Winkler B, Painting R, et al. Quantifying the consensus on anthropogenic global warming in the scientific literature. Environmental Research Letters 2013 8.
  7. Oreskes N. The scientific consensus on climate change. Science 2004 306(5702):1686. pmid:15576594
  8. Brulle RJ. Institutionalizing delay: Foundation funding and the creation of U.S. climate change counter-movement organizations. Climatic Change 2013
  9. McCright AM, Dunlap RE. The politicization of climate change and polarization in the American public’s views of global warming, 2001–2010. The Sociological Quarterly 2011 52:155–194.
  10. Howe PD, Leiserowitz A. Who remembers a hot summer or a cold winter? The asymmetric effect of beliefs about global warming on perceptions of local climate conditions in the U.S. Global Environmental Change 2013
  11. Leiserowitz AA, Maibach EW, Roser-Renouf C, Smith N, Dawson E. Climategate, public opinion, and the loss of trust. American Behavioral Scientist 2013 57(6):818–837.
  12. Kosaka Y, Xie S-P. Recent global-warming hiatus tied to equatorial Pacific surface cooling. Nature 2013 501:403–407. pmid:23995690
  13. Schuldt JP, Roh S, Schwarz N. Questionnaire design effects in climate change surveys: Implications for the partisan divide. Annals, American Academy of Political and Social Science 2015 658:121–133.
  14. Hamilton LC. Did the Arctic ice recover? Demographics of true and false climate facts. Weather, Climate, and Society 2012 4(4):236–249.
  15. Holthaus E. Senate votes 98–1 that climate change is real but splits on that pesky cause. Slate 1/21/2015. Available: http://www.slate.com/blogs/future_tense/2015/01/21/senate_votes_that_climate_change_is_real_but_doesn_t_agree_on_cause.html accessed 1/29/2015
  16. Smith TW, Marsden P, Hout M, Kim J. General social surveys, 1972–2012. Chicago, IL: National Opinion Research Center; 2013. Data available online at http://www3.norc.org/GSS+Website/Download/ accessed 4/15/2015.
  17. AAPOR. Standard Definitions: Final Disposition of Case Codes and Outcome Rates for Surveys. 4th ed. Lenexa, KS: American Association for Public Opinion Research; 2006.
  18. Hamilton LC, Hartter J, Safford TG, Stevens FR. Rural environmental concern: Effects of position, partisanship and place. Rural Sociology 2014 79(2):257–281.
  19. Hamilton LC, Safford TG. Environmental views from the coast: Public concern about local to global marine issues. Society and Natural Resources 2015 28(1):57–74.
  20. Boag AE, Hartter J, Hamilton LC, Stevens FR, Ducey MJ, Palace MW, et al. Forest views: Shifting attitudes toward the environment in northeast Oregon. Durham, New Hampshire: Carsey School of Public Policy; 2015.
  21. Hamilton LC. Statistics with Stata, version 12. Belmont, CA: Cengage; 2013.
  22. Hamilton LC, Hartter J, Safford TG. Validity of county-level estimates of climate-change beliefs. Nature Climate Change 2015 5(August):704.
  23. Mitchell MN. A Visual Guide to Stata Graphics, 2nd edition. College Station, TX: Stata Press; 2008.
  24. Hamilton LC, Keim BD. Regional variation in perceptions about climate change. International Journal of Climatology 2009 29(15):2348–2352.
  25. Hamilton LC, Colocousis CR, Duncan CM. Place effects on environmental views. Rural Sociology 2010 75(2):326–347.
  26. McCright AM, Dunlap RE, Xiao C. Increasing influence of party identification on perceived scientific agreement and support for government action on climate change in the USA, 2006–2012. Weather, Climate, and Society 2014 http://dx.doi.org/10.1175/WCAS-D-13-00058.1
  27. Hamilton LC. Who cares about polar regions? Results from a survey of U.S. public opinion. Arctic, Antarctic, and Alpine Research 2008 40(4):671–678.
  28. Hamilton LC. Education, politics and opinions about climate change: Evidence for interaction effects. Climatic Change 2011 104:231–242.
  29. Shao W, Keim BD, Garland JC, Hamilton LC. Weather, climate, and the economy: Explaining risk perceptions of global warming, 2001–2010. Weather, Climate, and Society 2014 6(1):119–134.
  30. Hamilton LC, Cutler MJ, Schaefer A. Public knowledge and concern about polar-region warming. Polar Geography 2012 35(2):155–168.
  31. Kahan DM, Jenkins-Smith H, Braman D. Cultural cognition of scientific consensus. Journal of Risk Research 2011 14(2):147–174.
  32. Hamilton LC, Saito K. A four-party view of U.S. environmental concern. Environmental Politics 2015 24(2):212–227.
  33. Brulle RJ, Carmichael J, Jenkins JC. Shifting public opinion on climate change: An empirical assessment of factors influencing concern over climate change in the U.S., 2002–2010. Climatic Change 2012 114:169–188.
  34. Darmofal D. Elite cues and citizen disagreement with expert opinion. Political Research Quarterly 2005 58(3):381–395.
  35. Guber DL. A cooling climate for change? Party polarization and the politics of global warming. American Behavioral Scientist 2012 57(1):93–115.
  36. Borick CP, Rabe BG. A reason to believe: Examining the factors that determine individual views on global warming. Social Science Quarterly 2010 91(3):777–800.
  37. Corner A, Whitmarsh L, Xenias D. Uncertainty, scepticism and attitudes towards climate change: Biased assimilation and attitude polarisation. Climatic Change 2011 114(3–4):463–478.
  38. Taber CS, Lodge M. Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science 2006 50(3):755–769.
  39. Wood BD, Vedlitz A. Issue definition, information processing, and the politics of global warming. American Journal of Political Science 2007 51(3):552–568.
  40. Zhao X. Media use and global warming perceptions: A snapshot of the reinforcing spirals. Communication Research 2009 36(5):698–723.
  41. Burgess J, Harrison CM, Filius P. Environmental communication and the cultural politics of environmental citizenship. Environment and Planning A 1998 30(8):1445–1460.
  42. Joireman J, Truelove HB, Duell B. Effect of outdoor temperature, heat primes and anchoring on belief in global warming. Journal of Environmental Psychology 2010 30:358–367.
  43. Risen JL, Critcher CR. Visceral fit: While in a visceral state, associated states of the world seem more likely. Journal of Personality and Social Psychology 2011 100(5):777–793. pmid:21244180
  44. Lewandowski GW Jr, Ciarocco NJ, Gately EL. The effect of embodied temperature on perceptions of global warming. Current Psychology 2012 31(3):318–324.
  45. Li Y, Johnson EJ, Zaval L. Local warming: Daily temperature change influences belief in global warming. Psychological Science 2011 22(4):454–459. pmid:21372325
  46. Hamilton LC, Stampone MD. Blowin’ in the wind: Short-term weather and beliefs about anthropogenic climate change. Weather, Climate, and Society 2013 5(2):112–119.
  47. Hamilton LC, Lemcke-Stampone M. Arctic warming and your weather: Public belief in the connection. International Journal of Climatology 2014 34:1723–1728.
  48. Egan PJ, Mullin M. Turning personal experience into political attitudes: The effect of local weather on Americans’ perceptions about global warming. Journal of Politics 2012 74(3):796–809.
  49. Goebbert K, Jenkins-Smith HC, Klockow K, Nowlin MC, Silva C. Weather, climate, and worldviews: The sources and consequences of public perceptions of changes in local weather patterns. Weather, Climate, and Society 2012 4:132–144.
  50. Shao W. Are actual weather and perceived weather the same? Understanding perceptions of local weather and their effects on risk perceptions of global warming. Journal of Risk Research 2015.
  51. McCright AM, Dunlap RE, Xiao C. The impacts of temperature anomalies and political orientation on perceived winter warming. Nature Climate Change 2014.
  52. Marquart-Pyatt ST, McCright AM, Dietz T, Dunlap RE. Politics eclipses climate extremes for climate change perceptions. Global Environmental Change 2014 29:246–257.