Cato at Liberty


February 1, 2017 5:11PM

Bias in Climate Science

By Patrick J. Michaels and Paul C. "Chip" Knappenberger


Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

A new paper in the journal Climatic Change examines the issue of publication bias in the climate change literature. This is something we have looked into ourselves. The results of our initial investigation (published in 2008) appeared in the paper “Evidence for ‘Publication bias’ concerning global warming in Science and Nature,” in which we concluded that Nature and Science, considered among the world’s leading scientific journals, showed an overwhelming propensity to publish findings that concluded climate change was “worse than expected.” We noted the implications:

This has considerable implications for the popular perception of global warming science, for the nature of “compendia” of climate change research, such as the reports of the United Nations' Intergovernmental Panel on Climate Change, and for the political process that uses those compendia as the basis for policy…

The consequent synergy between [publication bias], public perception, scientific “consensus” and policy is very disturbing. If the results shown for Science and Nature are in fact a general character of the scientific literature concerning global warming, our policies are based upon a unidirectionally biased stream of new information, rather than one that has a roughly equiprobable distribution of altering existing forecasts or implications of climate change in a positive or negative direction. This bias exists despite the stated belief of the climate research community that it does not.
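The “roughly equiprobable distribution” framing above amounts to a testable hypothesis: if new findings were equally likely to shift projections in either direction, the count of “worse than expected” papers should follow a binomial distribution with p = 0.5. The short Python sketch below shows the kind of two-sided sign test that framing implies; the counts are hypothetical placeholders, not the actual tallies from the 2008 paper.

    # Two-sided binomial sign test: are "worse than expected" findings more common
    # than chance would allow if either direction were equally likely (p = 0.5)?
    # The counts here are hypothetical placeholders, not data from Michaels (2008).
    from scipy import stats

    worse, better = 80, 20  # hypothetical tallies of paper conclusions
    result = stats.binomtest(worse, n=worse + better, p=0.5, alternative="two-sided")
    print(f"probability of a split this lopsided under the null: {result.pvalue:.2e}")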

In their investigation into publication bias, the authors of the new paper, Christian Harlos, Tim C. Edgell, and Johan Hollander, looked more broadly across scientific journals (drawing articles from 31 different journals) but more narrowly within the field of climate change, limiting their sample to articles dealing with marine responses to climate change (120 articles in total, selected by random sampling).

Harlos et al. were primarily interested in whether these articles showed a bias arising from the under-reporting of non-significant results. This type of bias is known as the “file drawer” problem: research findings that are not statistically significant are rarely published (and therefore sit in a “file drawer”), which inflates the apparent number of truly significant results and makes the published record less robust than it appears. The “file drawer” problem has received a lot of attention in recent years (see here for example) and remains an active research area.
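To make the mechanism concrete, the following is a minimal Monte Carlo sketch of the file-drawer effect, with arbitrary illustrative parameters of our own choosing rather than anything drawn from either paper: many small studies estimate the same modest true effect, but only those clearing the p < 0.05 bar are “published,” and the average of the published subset overstates the effect.

    # Illustrative simulation of the "file drawer" problem (arbitrary parameters).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    true_effect, sigma, n_per_study, n_studies = 0.2, 1.0, 30, 2000

    all_estimates, published = [], []
    for _ in range(n_studies):
        sample = rng.normal(true_effect, sigma, n_per_study)
        test = stats.ttest_1samp(sample, 0.0)   # test H0: no effect
        all_estimates.append(sample.mean())
        if test.pvalue < 0.05:                  # only significant results escape the file drawer
            published.append(sample.mean())

    print(f"true effect:                {true_effect:.2f}")
    print(f"mean across all studies:    {np.mean(all_estimates):.2f}")
    print(f"mean of 'published' subset: {np.mean(published):.2f}")  # noticeably inflated

Because the individual studies are under-powered, the “published” subset’s mean lands well above the true effect, which is precisely the non-robust overestimate described above.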

From their examination, however, the Harlos team did not find firm evidence of a strong file-drawer-type bias. Importantly, though, they did find evidence of several other types of bias, including bias in how scientific findings are communicated:

However, our meta-analysis did find multiple lines of evidence of biases within our sample of articles, which were perpetuated in journals of all impact factors and related largely to how science is communicated: The large, statistically significant effects were typically showcased in abstracts and summary paragraphs, whereas the lesser effects, especially those that were not statistically significant, were often buried in the main body of reports. Although the tendency to isolate large, significant results in abstracts has been noted elsewhere (Fanelli 2012), here we provide the first empirical evidence of such a trend across a large sample of literature.

The authors note that, in particular, this bias was worst in the high impact journals (like Science and Nature), and that:

[O]ur results corroborate with others by showing that high impact journals typically report large effects based on small sample sizes (Fraley and Vazire 2014), and high impact journals have shown publication bias in climate change research (Michaels 2008, and further discussed in Radetzki 2010).

Ultimately, and most significantly, they conclude:

…[M]ost audiences, especially non-scientific ones, are more likely to read article abstracts or summary paragraphs only, without perusing technical results. The onus to effectively communicate science does not fall entirely on the reader; rather, it is the responsibility of scientists and editors to remain vigilant, to understand how biases may pervade their work, and to be proactive about communicating science to non-technical audiences in transparent and un-biased ways. Ironically, articles in high impact journals are those most cited by other scientists; therefore, the practice of sensationalizing abstracts may bias scientific consensus too, assuming many scientists may also rely too heavily on abstracts during literature reviews and do not spend sufficient time delving into the lesser effects reported elsewhere in articles.

Despite our sincerest aim of using science as an objective and unbiased tool to record natural history, we are reminded that science is a human construct, often driven by human needs to tell a compelling story, to reinforce the positive, and to compete for limited resources—publication trends and communication bias is a proof of that.

These findings are yet another compelling reason (recall the problem of bias in climate model tuning) why the new Administration should undertake, at the earliest opportunity, a re-examination of our government’s previous assessment reports of climate change (such as those underlying the EPA’s endangerment finding).

References

Harlos, C., T.C. Edgell, and J. Hollander, 2017. No evidence of publication bias in climate change science. Climatic Change, 140, 375–385, doi:10.1007/s10584-016-1880-1.

Michaels, P.J., 2008. Evidence for “Publication bias” concerning global warming in Science and Nature. Energy and Environment, 19, 287–301, doi:10.1260/095830508783900735.

Related Tags
Energy and Environment
