
The science fiction of IPCC climate models

By Kesten C. Green, J. Scott Armstrong and Willie Soon
web posted October 14, 2013

The human race has prospered by relying on forecasts that the seasons will follow their usual course, while knowing they will sometimes be better or worse. Are things different now?

For the fifth time now, the Intergovernmental Panel on Climate Change claims they are. The difference, the IPCC asserts, is increased human emissions of carbon dioxide: a colorless, odorless, non-toxic gas that is a byproduct of growing prosperity. It is also a product of all animal respiration and is essential for most life on Earth, yet it makes up only about 0.04 percent of the atmosphere.

The IPCC assumes that the relatively small human contribution of this gas to the atmosphere will cause global warming, and insists that the warming will be dangerous.

Other scientists contest the IPCC assumptions on the grounds that the climatological effect of increases in atmospheric carbon dioxide is trivial – and that the climate is so complex and insufficiently understood that the net effect of human emissions on global temperatures cannot be forecast.

The computer models that the authors of the IPCC reports rely on are complicated representations of the assumption that human carbon dioxide emissions are now the primary factor driving climate change and will substantially overheat the Earth. The models include many assumptions that mainstream scientists question.

The modelers have correctly stated that they produce scenarios, not forecasts. Scenarios are stories constructed from a collection of assumptions. Well-constructed scenarios can be very convincing, in the same way that a well-crafted novel or film can be.

The IPCC and its supporters promote these scary scenarios as if they were forecasts. However, scenarios are neither forecasts nor the product of a validated forecasting method.

The IPCC modelers were apparently unaware of decades of forecasting research. Our audit of the procedures used to create their apocalyptic scenarios found that they violated 72 of 89 relevant scientific forecasting principles. Would you go ahead with your flight if you overheard two of the ground crew discussing how the pilot had skipped 80 percent of the pre-flight safety checklist?

Thirty-nine forecasting experts from many disciplines from around the world developed the forecasting principles from published experimental research. A further 123 forecasting experts reviewed the work. The principles were published in 2001. They are freely available on the Internet, to help forecasters produce the best forecasts they can, and help forecast users determine the validity of forecasts. These principles are the only published set of evidence-based standards for forecasting.

Global warming alarmists nevertheless claim that "nearly all" climate scientists believe dangerous global warming will occur. This is a strange claim, in view of the fact that more than 30,000 American scientists signed the Oregon Petition, stating that there is no basis for dangerous manmade global warming forecasts and "no convincing evidence" that carbon dioxide is dangerously warming the planet or disrupting its climate.

Most importantly, computer models and scenarios are not evidence – and validation does not consist of adding up votes. Such an approach can only be detrimental to the advancement of scientific knowledge. Validation requires comparing predictions to actual observations, and the IPCC models have failed in that regard.

Given the expensive policies proposed and implemented in the name of preventing dangerous manmade global warming, we are astonished that there is only one published peer-reviewed paper that claims to provide scientific forecasts of long-range global mean temperatures. The paper is our own 2009 article in the International Journal of Forecasting.

Our paper examined the state of knowledge and available empirical (that is, actually measured) data, in order to select appropriate evidence-based procedures for long-range forecasting of global mean temperatures. Given the complexity and uncertainty of the situation, we concluded that the "no-trend" model is the proper method to use. The conclusion is based on a substantial body of research that found complex models do not work well in complex and uncertain situations.

This finding might be puzzling to people who are unfamiliar with the research on forecasting. So we tested the no-trend model using the same data that the IPCC uses, since forecasting principles require that a model be validated by comparing its forecasts to actual observations.

To do this, we produced annual forecasts from one to 100 years ahead, starting from 1851 and stepping forward year-by-year until 1975, the year before the current warming alarm was raised. (This is also the year when Newsweek and other magazines reported that scientists were "almost unanimous" that Earth faced a new period of global cooling.) We conducted the same analysis for the IPCC scenario of temperatures increasing at a rate of 0.03 degrees Celsius (0.05 degrees Fahrenheit) per year in response to increasing human carbon dioxide emissions.
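For readers who want to see the mechanics, the sketch below shows how such a rolling-origin comparison can be set up. It is a minimal illustration, not the code used in our study: the anomalies dictionary, the function name, and the data source are assumptions for the example, and only forecasts with a matching observation are scored.

    # Rough sketch of a rolling-origin test like the one described above -- not
    # the authors' actual code. Assumes `anomalies` is a dict mapping year to
    # the annual global mean temperature anomaly in degrees Celsius (for
    # example, loaded from a HadCRUT-style series); these names and the data
    # source are illustrative assumptions, not part of the original study.

    def rolling_origin_errors(anomalies, first_origin=1851, last_origin=1975,
                              max_horizon=100, trend_per_year=0.03):
        """Collect absolute errors, by horizon, for a no-trend (persistence)
        forecast and for a linear warming scenario of trend_per_year deg C/yr."""
        no_trend = {h: [] for h in range(1, max_horizon + 1)}
        scenario = {h: [] for h in range(1, max_horizon + 1)}
        for origin in range(first_origin, last_origin + 1):
            base = anomalies[origin]            # last observed value at the origin
            for h in range(1, max_horizon + 1):
                target = origin + h
                if target not in anomalies:     # cannot score beyond observed data
                    continue
                actual = anomalies[target]
                no_trend[h].append(abs(base - actual))
                scenario[h].append(abs((base + trend_per_year * h) - actual))
        return no_trend, scenario

From the two collections of errors, the overall comparison is simply the ratio of the mean absolute errors, and the horizon-by-horizon comparison is the same ratio computed within each horizon or band of horizons (for example, 91 to 100 years ahead).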

This procedure yielded 7,550 forecasts for each method. The findings?

Overall, the no-trend forecast errors were one-seventh the size of the errors from the IPCC scenario's projections. The no-trend forecasts were as accurate as or more accurate than the IPCC scenario temperatures for all forecast horizons. Most important, the relative accuracy of the no-trend forecasts increased at longer horizons. For example, the no-trend forecast error was one-twelfth that of the IPCC temperature scenario for forecasts 91 to 100 years ahead.

Our research in progress scrutinizes more forecasting methods, uses more and better data, and extends our validation tests. The findings strengthen the conclusion that there are no scientific forecasts that predict dangerous global warming.

Is it surprising that governments would back an alarm that lacks scientific support? Not really. In our study of situations analogous to the current alarm over global warming scenarios, we identified 26 earlier movements based on scenarios of manmade disaster, including the global cooling alarm of the 1960s and 1970s. None of them was based on scientific forecasts. And yet governments imposed costly policies in response to 23 of them. In no case did the forecast of major harm come true.

There is no support from scientific forecasting for either an upward or a downward trend in temperatures. Lacking such support, the global warming alarm is baseless and should be ignored.

Government programs, subsidies, taxes and regulations proposed as responses to the global warming alarm result in misallocations of valuable resources. They lead to inflated energy prices, declining international competitiveness, disappearing industries and jobs, and threats to health and welfare. 

Humanity can do better with the old, simple, tried-and-true no-trend climate forecasting model. This traditional method is also consistent with scientific forecasting principles. ESR

Dr. Kesten C. Green is with the University of South Australia in Adelaide, is director of the major website on forecasting methods, www.ForecastingPrinciples.com, and has published twelve peer-reviewed articles on forecasting. Professor J. Scott Armstrong teaches at the University of Pennsylvania in Philadelphia; he is a founder of the two major journals on forecasting methods, editor of the Principles of Forecasting handbook, and the world's most highly cited author on forecasting methods. Dr. Willie Soon of Salem, MA, has for the past 20 years published extensively on solar and other factors that cause climate change. Copies of the authors' climate forecasting papers are available at www.PublicPolicyForecasting.com.

 
