2000 Years of Climate Change

New studies: problems with temperature records used by climate models

Most people who disagree with the hypothesis of catastrophic man-made global warming have their doubts about official temperature record collection. But unless you are really following the issue closely, it might be hard for you to find peer-reviewed papers that question the majority view. Let’s take a look at some recent peer-reviewed papers that will help skeptics make their case.

This article is from the Epoch Times. They sometimes hide their articles behind paywalls, but this one is available to all. Just in case, I found an archive of it.

It says:

Temperature records used by climate scientists and governments to build models that then forecast dangerous manmade global warming repercussions have serious problems and even corruption in the data, multiple scientists who have published recent studies on the issue told The Epoch Times.

[…] Problems with temperature data include a lack of geographically and historically representative data, contamination of the records by heat from urban areas, and corruption of the data introduced by a process known as “homogenization.”

I already knew about the problem of measuring stations being located in areas of high ambient heat, like busy streets, industrial areas, and solar panel farms. But I had not heard about homogenization.

I looked up homogenization, and it refers to the process of removing the impact of “non-climatic changes” from the temperature data. Those could be changes caused by new sensor technology, moving the weather station, or other factors unrelated to climate.

The article says:

For instance, if a temperature station was originally placed in an empty field but that field has since been paved over to become a parking lot, the record would appear to show much hotter temperatures. As such, it would make sense to try to correct the data collected.

That was an interesting point. So the warming would actually be caused by increased heat from cars, buildings, reflected sunlight, etc., as the sensor got surrounded by civilization. Its readings would change, but not because of changes in the climate.
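Just to make the idea concrete for myself, here is a toy sketch of what one of these adjustments could look like in Python. If a station's record jumps by a roughly constant amount after a documented change (say, the field was paved over in 1995), you estimate that offset and subtract it from the later readings. To be clear, this is only an illustration of the concept, not NOAA's actual homogenization algorithm, and the station data, the break year, and the size of the jump are all made up.

```python
# Toy illustration of a homogenization adjustment (NOT NOAA's actual algorithm).
# Idea: if a station shows a step change after a known non-climatic event
# (e.g., a field paved over in 1995), estimate the offset and remove it.

import numpy as np

rng = np.random.default_rng(42)

years = np.arange(1980, 2011)                 # hypothetical station record
background = 0.01 * (years - 1980)            # slow, made-up background trend
noise = rng.normal(0.0, 0.2, size=years.size)
raw = 14.0 + background + noise
raw[years >= 1995] += 0.6                     # artificial jump: parking lot built in 1995

def adjust_step_change(temps, years, break_year):
    """Estimate the mean offset after a documented break and subtract it."""
    before = temps[years < break_year]
    after = temps[years >= break_year]
    offset = after.mean() - before.mean()     # crude estimate of the non-climatic jump
    adjusted = temps.copy()
    adjusted[years >= break_year] -= offset
    return adjusted, offset

adjusted, offset = adjust_step_change(raw, years, break_year=1995)
print(f"Estimated non-climatic offset: {offset:.2f} °C")
print(f"Raw trend:      {np.polyfit(years, raw, 1)[0] * 10:.2f} °C/decade")
print(f"Adjusted trend: {np.polyfit(years, adjusted, 1)[0] * 10:.2f} °C/decade")
```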

But there’s a problem with the way that scientists have been adjusting the raw data:

Virtually nobody argues against the need for some homogenization to control for various factors that may contaminate temperature data.

But a closer examination of the process as it now occurs reveals major concerns, Ronan Connolly, an independent scientist at CERES, said.

“While the scientific community has become addicted to blindly using these computer programs to fix the data biases, until recently nobody has bothered to look under the hood to see if the programs work when applied to real temperature data,” he told The Epoch Times.

Since the early 2000s, various governmental and intergovernmental organizations creating global temperature records have relied on computer programs to automatically adjust the data.

Mr. Soon, Mr. Connolly, and a team of scientists around the world spent years looking at the programs to determine how they worked and whether they were reliable.

One of the scientists involved in the analysis, Peter O’Neill, has been tracking and downloading the data daily from the National Oceanic and Atmospheric Administration (NOAA) and its Global Historical Climatology Network since 2011.

He found that each day, NOAA applies different adjustments to the data.

“They use the same homogenization computer program and re-run it roughly every 24 hours,” Mr. Connolly said. “But each day, the homogenization adjustments that they calculate for each temperature record are different.”

This is “very bizarre,” he said.

“If the adjustments for a given weather station have any basis in reality, then we would expect the computer program to calculate the same adjustments every time. What we found is this is not what’s happening,” Mr. Connolly said.

These concerns are what first sparked the international investigation into the issue by Mr. Soon and his colleagues.

Because NOAA doesn’t maintain historical information on its weather stations, the CERES scientists reached out to European scientists who had been compiling the data for the stations that they oversee.

They found that just 17 percent of NOAA’s adjustments were consistently applied. And less than 20 percent of NOAA’s adjustments were clearly associated with a documented change to the station observations.

“When we looked under the hood, we found that there was a hamster running in a wheel instead of an engine,” Mr. Connolly said. “It seems that with these homogenization programs, it is a case where the cure is worse than the disease.”

For me, that is the take-home lesson of the whole article.
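To make the reproducibility complaint concrete: if the adjustments have any basis in reality, then saving the program's output from two different runs and comparing them station by station should show near-identical corrections. Here is a toy check along those lines; the file names, file format, and tolerance are hypothetical, and this is obviously not the script the CERES team used.

```python
# Toy consistency check between two runs of an adjustment program.
# File names, format, and tolerance are hypothetical; this is not the CERES analysis.

import csv

def load_adjustments(path):
    """Read {station_id: adjustment_in_degC} from a two-column CSV."""
    with open(path, newline="") as f:
        return {row["station_id"]: float(row["adjustment"]) for row in csv.DictReader(f)}

def compare_runs(path_a, path_b, tolerance=0.01):
    """Return the fraction of common stations whose adjustments (nearly) match."""
    run_a = load_adjustments(path_a)
    run_b = load_adjustments(path_b)
    common = run_a.keys() & run_b.keys()
    same = sum(1 for sid in common if abs(run_a[sid] - run_b[sid]) <= tolerance)
    return same / len(common) if common else float("nan")

if __name__ == "__main__":
    frac = compare_runs("adjustments_2024-01-01.csv", "adjustments_2024-01-02.csv")
    print(f"Stations with consistent adjustments: {frac:.1%}")
```

If that fraction came back at around 17 percent, as the article reports for NOAA's adjustments, that would be exactly the kind of red flag Mr. Connolly is describing.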

Well, how much of a difference do these problems make for climate change models?

It’s a huge problem:

The flaws are so significant that they make the temperature data—and the models based on it—essentially useless or worse, three independent scientists with the Center for Environmental Research and Earth Sciences (CERES) explained.

The experts said that when data corruption is considered, the alleged “climate crisis” supposedly caused by human activities disappears.

Instead, natural climate variability offers a much better explanation for what is being observed, they said.

I waited a few days to blog this article so that I could check up on the studies. Here is one, and here is another. There are a lot of authors on these studies, and they were published in good journals: Climate and Atmosphere. The authors come from academic institutions in a variety of countries, including Germany, Ireland, Hungary, Austria, Sweden, Italy, and the Netherlands.

If you like to debate climate change like I do, then you might want to bookmark this article. To me, this is another new, peer-reviewed piece of evidence that argues against the catastrophic man-made global warming view. And we have to be guided by evidence.
