Are recent temperature records as certain as we think?
Every year now ends with the same headline: “Hottest year on record,” “Hottest decade ever,” “Earth enters uncharted climate territory.” The repetition has become so familiar that few people stop to ask what these claims actually mean. In August 2021, for example, Sen. Bernie Sanders declared that July was “the hottest month ever in the history of the planet.” Many accepted that statement instantly. But when I looked at regional climate reports from that same month, the picture was far more complicated. What I found was striking enough to document.
In July 2021, Antarctic sea ice extent was above average. South America endured a severe cold snap that damaged coffee crops. Germany had a cool, unsettled July. In France, media outlets ran headlines asking whether summer would ever get going. The South African Weather Service warned of “very cold temperatures” across the country. Minnesota saw a rare frost. And in the typically scorching central and southern U.S. plains, temperatures were cooler than usual, while those in the northeastern U.S. were about average. When a sweeping global headline contradicts so many local realities, it’s reasonable to pause and ask how these records are defined — and how certain they really are.
Part of the answer lies in how the climate story has changed over the past few decades. For years, the public has been told that temperatures were essentially flat for a thousand years and then suddenly spiked in the 20th century, a narrative popularized by the “hockey stick” graph in the late 1990s. But before that graph appeared, mainstream climate science described a very different history.
Look back at the scientific literature before 2000, and a clear pattern emerges. The Medieval Warm Period, roughly 900-1300 AD, was widely recognized as a time when parts of the Northern Hemisphere were as warm as, or warmer than, today. This wasn’t a fringe idea. It came from respected climatologists like Hubert Lamb, whose research at the UK Met Office and in journals such as Nature and Climatic Change helped establish modern paleoclimate research.
Lamb’s reconstructions showed a warm medieval era followed by a sharp descent into the Little Ice Age, a period of global cooling that lasted into the 1800s. His work focused on Europe and the North Atlantic but drew on evidence from across the hemisphere and was intended as a broad reconstruction rather than a local curiosity.
The 1990 IPCC report, the authoritative climate assessment of its time, reflected this understanding. Its millennial temperature graph showed a pronounced Medieval Warm Period, a deep Little Ice Age and a gradual rebound into the modern era. Throughout the 1990s, peer-reviewed studies in Science, The Holocene and Climatic Change reinforced this picture. Researchers documented glacier advances, cooling events and long-term natural variability that shaped the climate long before industrialization. Their work did not deny human influence, including the well-established mid-20th century cooling from sulfate aerosol pollution, but it emphasized that modern warming must be understood against a backdrop of substantial natural variability.
Even the public record reflected this earlier perspective. In January 1989, The New York Times summarized NOAA’s own assessment of global temperatures: based on the available data, NOAA scientists said there had been no clear warming trend over the previous century. Year-to-year fluctuations were evident, but there was no statistically significant long-term rise. That was the official interpretation of the same temperature record we use today.
As late as 1999, this more cautious view remained visible in mainstream scientific publications. In NASA’s “Whither U.S. Climate?,” Dr. James Hansen wrote that “Empirical evidence does not lend much support to the notion that climate is headed precipitately toward more extreme heat and drought.” That statement came from the nation’s most prominent climate scientist at the time, and it reflected the datasets as they existed then, before major rounds of homogenization and algorithmic adjustments were applied in the 2000s.
Those adjustments are central to today’s temperature record. Beginning in the early 2000s, NASA and NOAA updated the U.S. temperature series using homogenization — a statistical process meant to correct for station moves, instrument changes and time-of-observation shifts. The goal was consistency. But the effect was to alter the shape of the historical temperature curve, often cooling the early 20th century and intensifying recent warming.
All major global temperature datasets use homogenization, but NOAA’s fully automated system applies some of the broadest regional adjustments. While mathematically elegant, the approach means that an adjustment at one station can ripple through many others, and that urban heat island effects or local anomalies can seep into the records of nearby rural stations. In other words, the modern temperature record is no longer a simple reflection of raw measurements; it is a heavily processed product shaped by statistical decisions most readers never hear about.
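To make that abstract process concrete, here is a rough sketch, in Python, of the kind of neighbor-comparison logic homogenization relies on. It is not NOAA’s actual pairwise algorithm, and every station, trend and number in it is an invented assumption, but it shows how shifting one segment of a record to line up with its neighbors can substantially change a station’s century-long trend without altering a single raw measurement.

```python
# Toy sketch of neighbor-based homogenization. All stations, trends and
# thresholds below are invented for illustration; this is not NOAA's code.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2021)

# Hypothetical "raw" target station: a slight cooling trend plus noise.
target_raw = 10.0 - 0.003 * (years - 1900) + rng.normal(0, 0.3, years.size)

# Hypothetical neighbor stations: a slight warming trend plus noise.
neighbors = np.stack([
    11.0 + 0.006 * (years - 1900) + rng.normal(0, 0.3, years.size)
    for _ in range(5)
])
neighbor_mean = neighbors.mean(axis=0)

# Difference series: the target minus the average of its neighbors.
diff = target_raw - neighbor_mean

def best_breakpoint(series):
    """Crude search for the single split year that best explains a step change."""
    best_idx, best_sse = None, np.inf
    for i in range(10, series.size - 10):  # require 10 years on each side
        sse = ((series[:i] - series[:i].mean()) ** 2).sum() + \
              ((series[i:] - series[i:].mean()) ** 2).sum()
        if sse < best_sse:
            best_idx, best_sse = i, sse
    return best_idx

bp = best_breakpoint(diff)

# "Adjust" the target by shifting the earlier segment so the two halves of the
# difference series line up. No raw measurement is altered, yet the station's
# long-term trend can shift noticeably.
offset = diff[bp:].mean() - diff[:bp].mean()
target_adjusted = target_raw.copy()
target_adjusted[:bp] += offset

raw_trend = np.polyfit(years, target_raw, 1)[0] * 100       # deg C per century
adj_trend = np.polyfit(years, target_adjusted, 1)[0] * 100
print(f"raw trend: {raw_trend:+.2f} C/century, adjusted: {adj_trend:+.2f} C/century")
```

In this toy setup the adjustment pulls the early part of the record downward, so the fitted trend ends up warmer than the raw series, the same kind of divergence the Indian Lake comparison below turns on.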
Consider Indian Lake, New York. The raw data I presented in my March 22, 2025, Adirondack Daily Enterprise article shows a cooling over the past century. The homogenized version published by Dr. Curt Stager on April 9, 2025, shows warming. Both data sets describe the same weather station. Similar discrepancies have been documented elsewhere, including at Australia’s Rutherglen station. These examples don’t prove wrongdoing, but they do illustrate how profoundly the record can change depending on the adjustment method. And if you’re wondering how two different results from the same underlying data can exist for the same weather station, you’re not alone. It’s also worth remembering that this comparison is only possible because the raw data are still publicly available.
I chose Indian Lake because it’s a rural Adirondack station, far from urban heat island effects, and because NOAA has an almost complete raw temperature record there. Since 1900, observers at the site have maintained one of the most consistent datasets in the region. Many other Adirondack and North Country stations are incomplete or have stopped reporting altogether. Even so, any site, including Indian Lake, can be run through NOAA’s homogenization system. The algorithm can fill in missing values and adjust existing ones using data from other stations, sometimes hundreds of miles away. Depending on station density, homogenization can draw on neighbors 200 to 700 miles away, or even farther, to modify a single location’s record.
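The infilling step can be pictured the same way. The sketch below is again only an illustration built on invented station distances and anomalies, not NOAA’s code, but it shows how a missing value at one station can be constructed from the anomalies of neighbors hundreds of miles away, here weighted by the inverse of their distance.

```python
# Toy sketch of filling a missing value from distant neighbors.
# Distances, anomalies and the baseline are invented for illustration.

# Hypothetical neighbors: (distance to the target in miles,
# temperature anomaly in deg C for the year the target failed to report).
neighbors = [
    (220, +0.8),
    (410, +1.1),
    (650, +0.4),
]

# Inverse-distance weights: closer stations count more, but even a station
# 650 miles away still contributes to the "filled" value.
weights = [1.0 / d for d, _ in neighbors]
filled_anomaly = sum(w * a for w, (_, a) in zip(weights, neighbors)) / sum(weights)

# The target's long-term average is then bumped by the interpolated anomaly.
target_baseline = 4.2   # hypothetical long-term mean for the missing month
filled_value = target_baseline + filled_anomaly

print(f"interpolated anomaly: {filled_anomaly:+.2f} C -> filled value: {filled_value:.2f} C")
```

Whatever the exact weighting scheme, a value filled this way is by construction a statement about the neighbors, not about conditions at the station itself.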
Meanwhile, climate reporting has grown more dramatic. Headlines like “Hottest Day Ever Recorded” or “Earth Is Heating Faster Than Expected” are crafted for emotional impact. They leave little room for nuance, uncertainty or the messy reality of regional variation.
So here is the question I invite readers to consider: If the historical narrative has shifted, the datasets have been repeatedly reprocessed and the headlines have grown more absolute, how confident should we be in the temperature records that dominate today’s climate conversation?
— — —
Jed Dukett lives in Tupper Lake and is a former acid rain scientist.
