USCS Warnings of Coastal Flooding

Be not confused: USCS is not the US Coastal Service, but rather stands for the Union of Super Concerned Scientists, or UCS for short. Using their considerable PR skills and budgets, they have plastered warnings across the media targeting major coastal cities, designed to strike terror into anyone holding real estate in those places. Example headlines include:

Sea level rise could put thousands of homes in this SC county at risk, study says – The State, South Carolina

Taxpayers in the Hamptons among the most exposed to rising seas – Crain’s New York Business

Adapting to Climate Change Will Take More Than Just Seawalls and Levees – Scientific American

The Biggest Threat Facing the City of Miami – Smithsonian Magazine

What Does Maryland’s Gubernatorial Race Mean For Flood Management? – The Real News Network

Study: Thousands of Palm Beach County homes impacted by sea-level rise – WPTV, Florida

Sinking Land and Climate Change Are Worsening Tidal Floods on the Texas Coast – Texas Observer

Sea Level Rise Will Threaten Thousands of California Homes – Scientific American

300,000 coastal homes in US, worth $120 billion, at risk of chronic floods from rising seas – USA Today

That last headline captures the thrust of the UCS study Underwater: Rising Seas, Chronic Floods, and the Implications for US Coastal Real Estate (2018):

Sea levels are rising. Tides are inching higher. High-tide floods are becoming more frequent and reaching farther inland. And hundreds of US coastal communities will soon face chronic, disruptive flooding that directly affects people’s homes, lives, and properties.

Yet property values in most coastal real estate markets do not currently reflect this risk. And most homeowners, communities, and investors are not aware of the financial losses they may soon face.

This analysis looks at what’s at risk for US coastal real estate from sea level rise—and the challenges and choices we face now and in the decades to come.

The report and supporting documents give detailed, dire warnings state by state, and even down to counties and townships. An example of the damage projections is this table estimating 2030 impacts:

State   Homes at Risk   Value at Risk      Property Tax at Risk   Population in At-Risk Homes
AL      3,542           $1,230,676,217     $5,918,124             4,367
CA      13,554          $10,312,366,952    $128,270,417           33,430
CT      2,540           $1,921,428,017     $29,273,072            5,690
DC      –               $0                 $0                     –
DE      2,539           $127,620,700       $2,180,222             3,328
FL      20,999          $7,861,230,791     $101,267,251           32,341
GA      4,028           $1,379,638,946     $13,736,791            7,563
LA      26,336          $2,528,283,022     $20,251,201            63,773
MA      3,303           $2,018,914,670     $17,887,931            6,500
MD      8,381           $1,965,882,200     $16,808,488            13,808
ME      788             $330,580,830       $3,933,806             1,047
MS      918             $100,859,844       $1,392,059             1,932
NC      6,376           $1,449,186,258     $9,531,481             10,234
NH      1,034           $376,087,216       $5,129,494             1,659
NJ      26,651          $10,440,814,375    $162,755,196           35,773
NY      6,175           $3,646,706,494     $74,353,809            16,881
OR      677             $110,461,140       $990,850               1,277
PA      138             $18,199,572        $204,111               310
RI      419             $299,462,350       $3,842,996             793
SC      5,779           $2,882,357,415     $22,921,550            8,715
TX      5,505           $1,172,865,533     $19,453,940            9,802
VA      3,849           $838,437,710       $8,296,637             6,086
WA      3,691           $1,392,047,121     $13,440,420            7,320
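The table invites a quick sanity check. Here is my own sketch aggregating the state rows; the figures are transcribed from the table above, and the totals are my arithmetic, not numbers from the report.

```python
# Sketch: aggregate the UCS 2030 state-level projections tabled above.
# Figures transcribed from the table; totals are my own arithmetic.
homes_and_value = {  # state: (homes at risk, value at risk in $)
    "AL": (3_542, 1_230_676_217),   "CA": (13_554, 10_312_366_952),
    "CT": (2_540, 1_921_428_017),   "DC": (0, 0),
    "DE": (2_539, 127_620_700),     "FL": (20_999, 7_861_230_791),
    "GA": (4_028, 1_379_638_946),   "LA": (26_336, 2_528_283_022),
    "MA": (3_303, 2_018_914_670),   "MD": (8_381, 1_965_882_200),
    "ME": (788, 330_580_830),       "MS": (918, 100_859_844),
    "NC": (6_376, 1_449_186_258),   "NH": (1_034, 376_087_216),
    "NJ": (26_651, 10_440_814_375), "NY": (6_175, 3_646_706_494),
    "OR": (677, 110_461_140),       "PA": (138, 18_199_572),
    "RI": (419, 299_462_350),       "SC": (5_779, 2_882_357_415),
    "TX": (5_505, 1_172_865_533),   "VA": (3_849, 838_437_710),
    "WA": (3_691, 1_392_047_121),
}
total_homes = sum(h for h, _ in homes_and_value.values())
total_value = sum(v for _, v in homes_and_value.values())
print(f"{total_homes:,} homes, ${total_value / 1e9:.1f} billion at risk by 2030")
```

The 2030 totals come to roughly 147,000 homes and $52 billion, about half the 300,000 homes and $120 billion in the USA Today headline, which presumably reflects a later horizon in the study.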

The methodology, of course, is climate models all the way down. They explain:

Three sea level rise scenarios, developed by the National Oceanic and Atmospheric Administration (NOAA) and localized for this analysis, are included:

  • A high scenario that assumes a continued rise in global carbon emissions and an increasing loss of land ice; global average sea level is projected to rise about 2 feet by 2045 and about 6.5 feet by 2100.
  • An intermediate scenario that assumes global carbon emissions rise through the middle of the century then begin to decline, and ice sheets melt at rates in line with historical observations; global average sea level is projected to rise about 1 foot by 2035 and about 4 feet by 2100.
  • A low scenario that assumes nations successfully limit global warming to less than 2 degrees Celsius (the goal set by the Paris Climate Agreement) and ice loss is limited; global average sea level is projected to rise about 1.6 feet by 2100.
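To see what these scenario endpoints imply, one can convert the stated 2100 rises into average rates over the century. This is my own back-of-envelope conversion, not a calculation from the report:

```python
# Convert the NOAA scenario endpoints quoted above into implied average
# rates of global sea level rise over 2000-2100 (my own arithmetic).
MM_PER_FOOT = 304.8

scenarios_ft_by_2100 = {"high": 6.5, "intermediate": 4.0, "low": 1.6}

rates_mm_per_yr = {
    name: feet * MM_PER_FOOT / 100  # spread over the 100 years 2000-2100
    for name, feet in scenarios_ft_by_2100.items()
}
for name, rate in rates_mm_per_yr.items():
    print(f"{name:>12}: {rate:.1f} mm/yr average")
```

The high scenario averages nearly 20 mm/yr over the century, roughly an order of magnitude above the 2-3 mm/yr trends recorded at long-running tide gauges.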

Oh, and they did not forget the disclaimer:

Disclaimer
This research is intended to help individuals and communities appreciate when sea level rise may place existing coastal properties (aggregated by community) at risk of tidal flooding. It captures the current value and tax base contribution of those properties (also aggregated by community) and is not intended to project changes in those values, nor in the value of any specific property.

The projections herein are made to the best of our scientific knowledge and comport with our scientific and peer review standards. They are limited by a range of factors, including but not limited to the quality of property-level data, the resolution of coastal elevation models, the potential installment of defensive measures not captured by those models, and uncertainty around the future pace of sea level rise. More information on caveats and limitations can be found at http://www.ucsusa.org/underwater.

Neither the authors nor the Union of Concerned Scientists are responsible or liable for financial or reputational implications or damages to homeowners, insurers, investors, mortgage holders, municipalities, or any other entities. The content of this analysis should not be relied on to make business, real estate or other real-world decisions without independent consultation with professional experts with relevant experience. The views expressed by individuals in the quoted text of this report do not represent an endorsement of the analysis or its results.

The need for a disclaimer becomes evident when looking into the details. The NOAA reference is Global and Regional Sea Level Rise Scenarios for the United States, NOAA Technical Report NOS CO-OPS 083.

Since the text emphasizes four examples of their scenarios, let’s consider them here. First there is San Francisco, a city currently suing oil companies over sea level rise. From tidesandcurrents comes this tide gauge record.
It’s a solid, long-term record providing a century of measurements from 1900 through 2017. The graph below compares the presently observed trend with climate model projections out to 2100.

Since the record is set at zero in 2000, the difference in 21st-century expectations is stark. Instead of the existing trend reaching around 20 cm, models project a 2.5-meter rise by 2100.

New York City is represented by the Battery tide gauge:
Again, a respectable record with good 20th-century coverage. And the models say:
The red line projects a 2,500 mm rise vs. 284 mm, almost a factor of 10 more. The divergence is evident even in the first 17 years.

Florida comes in for a lot of attention, especially the Keys, so here is Key West:
A similar pattern to the NYC Battery gauge, and here is the projection:
The pattern is established: instead of a rise of about 30 cm, the models project 250 cm.

Finally, probably the worst case, and well known to all already, is Galveston, Texas:
The water has been rising there for a long time, so maybe the models got this one close.
The gap is smaller than for the others since the observed trend is much higher, but the projection is still four times the past. Galveston is at risk, all right, but we didn’t need this analysis to tell us that.
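The four gauge comparisons reduce to one calculation. Taking the approximate observed-trend extrapolations to 2100 as read from the gauge discussions above (my rounded readings, not published figures) against the roughly 2,500 mm model projection:

```python
# Compare extrapolated observed tide-gauge trends (to 2100, relative to the
# 2000 zero point) with the ~2,500 mm model projection discussed above.
# Observed values are my approximate readings from the text, in mm.
model_projection_mm = 2500
observed_extrapolation_mm = {
    "San Francisco": 200,   # ~20 cm
    "NYC Battery": 284,
    "Key West": 300,        # ~30 cm
    "Galveston": 630,       # higher trend, largely from land subsidence
}
for gauge, obs in observed_extrapolation_mm.items():
    ratio = model_projection_mm / obs
    print(f"{gauge:>14}: models project {ratio:.1f}x the observed trend")
```

The divergence factor runs from about 4 at subsiding Galveston to more than 12 at San Francisco, which is the pattern the gauge-by-gauge discussion describes.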

A previous post, Unbelievable Climate Models, goes into why they are running so hot and so extreme, and why they cannot be trusted.


Unbelievable Climate Models

It is not just you thinking the world is not warming the way climate models predicted. The models are flawed, and their estimates of the climate’s future response to rising CO2 are way too hot. Yet these overcooked forecasts are the basis for policymakers to consider all kinds of climate impacts, from sea level rise to food production and outbreaks of acne.

The models’ outputs are contradicted by the instrumental temperature records. So a choice must be made: Shall we rely on measurements of our past climate experience, or embrace the much warmer future envisioned by these models?

Ross McKitrick takes us through this fundamental issue in his Financial Post article All those warming-climate predictions suddenly have a big, new problem. Excerpts below with my bolds, headers and images.

Why ECS is Important

One of the most important numbers in the world goes by the catchy title of Equilibrium Climate Sensitivity, or ECS. It is a measure of how much the climate responds to greenhouse gases. More formally, it is defined as the increase, in degrees Celsius, of average temperatures around the world, after doubling the amount of carbon dioxide in the atmosphere and allowing the atmosphere and the oceans to adjust fully to the change. The reason it’s important is that it is the ultimate justification for governmental policies to fight climate change.

The United Nations Intergovernmental Panel on Climate Change (IPCC) says ECS is likely between 1.5 and 4.5 degrees Celsius, but it can’t be more precise than that. Which is too bad, because an enormous amount of public policy depends on its value. People who study the impacts of global warming have found that if ECS is low — say, less than two — then the impacts of global warming on the economy will be mostly small and, in many places, mildly beneficial. If it is very low, for instance around one, it means greenhouse gas emissions are simply not worth doing anything about. But if ECS is high — say, around four degrees or more — then climate change is probably a big problem. We may not be able to stop it, but we’d better get ready to adapt to it.

So, somebody, somewhere, ought to measure ECS. As it turns out, a lot of people have been trying, and what they have found has enormous policy implications.

The violins span 5–95% ranges; their widths indicate how PDF values vary with ECS. Black lines show medians, red lines span 17–83% ‘likely’ ranges. Published estimates based directly on observed warming are shown in blue. Unpublished estimates of mine based on warming attributable to greenhouse gases inferred by two recent detection and attribution studies are shown in green. CMIP5 models are shown in salmon. The observational ECS estimates have broadly similar medians and ‘likely’ ranges, all of which are far below the corresponding values for the CMIP5 models. Source: Nic Lewis at Climate Audit https://climateaudit.org/2015/04/13/pitfalls-in-climate-sensitivity-estimation-part-2/

Methods Matter

To understand why, we first need to delve into the methodology a bit. There are two ways scientists try to estimate ECS. The first is to use a climate model, double the modeled CO2 concentration from the pre-industrial level, and let it run until temperatures stabilize a few hundred years into the future. This approach, called the model-based method, depends for its accuracy on the validity of the climate model, and since models differ quite a bit from one another, it yields a wide range of possible answers. A well-known statistical distribution derived from modeling studies summarizes the uncertainties in this method. It shows that ECS is probably between two and 4.5 degrees, possibly as low as 1.5 but not lower, and possibly as high as nine degrees. This range of potential warming is very influential on economic analyses of the costs of climate change.

The second method is to use long-term historical data on temperatures, solar activity, carbon-dioxide emissions and atmospheric chemistry to estimate ECS using a simple statistical model derived by applying the law of conservation of energy to the planetary atmosphere. This is called the Energy Balance method. It relies on some extrapolation to satisfy the definition of ECS but has the advantage of taking account of the available data showing how the actual atmosphere has behaved over the past 150 years.
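At its core the Energy Balance method is a single conservation-of-energy relation: ECS ≈ F_2x × ΔT / (ΔF − ΔQ), where F_2x is the forcing from doubled CO2, ΔT the observed warming, ΔF the change in total forcing, and ΔQ the change in planetary heat uptake. A minimal sketch, using illustrative round numbers of my own choosing (in the neighborhood of published values, but not taken from any specific study):

```python
# Energy-budget ECS estimate: a minimal sketch of the method described above.
# All input values are illustrative round numbers, not published ones.
F_2X = 3.7       # W/m^2, canonical forcing from a doubling of CO2
delta_T = 0.8    # deg C, observed warming between base and final periods
delta_F = 2.3    # W/m^2, change in total forcing over the same interval
delta_Q = 0.5    # W/m^2, change in planetary (mostly ocean) heat uptake

ecs = F_2X * delta_T / (delta_F - delta_Q)
print(f"ECS ≈ {ecs:.2f} C")
```

With these inputs the estimate lands near the low end of the observational range, illustrating why studies built on the historical record tend to report sensitivities well below the models.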

The surprising thing is that the Energy Balance estimates are very low compared to model-based estimates. The accompanying chart compares the model-based range to ECS estimates from a dozen Energy Balance studies over the past decade. Clearly these two methods give differing answers, and the question of which one is more accurate is important.

Weak Defenses for Model Discrepancies

Climate modelers have put forward two explanations for the discrepancy. One is called the “emergent constraint” approach. The idea is that models yield a range of ECS values, and while we can’t measure ECS directly, the models also yield estimates of a lot of other things that we can measure (such as the reflectivity of cloud tops), so we could compare those other measures to the data, and when we do, sometimes the models with high ECS values also yield measures of secondary things that fit the data better than models with low ECS values.

This argument has been a bit of a tough sell, since the correlations involved are often weak, and it doesn’t explain why the Energy Balance results are so low.

The second approach is based on so-called “forcing efficacies,” which is the concept that climate forcings, such as greenhouse gases and aerosol pollutants, differ in their effectiveness over time and space, and if these variations are taken into account the Energy Balance sensitivity estimates may come out higher. This, too, has been a controversial suggestion.

Challenges to Oversensitive Models

A recent Energy Balance ECS estimate was just published in the Journal of Climate by Nicholas Lewis and Judith Curry. There are several features that make their study especially valuable. First, they rely on IPCC estimates of greenhouse gases, solar changes and other climate forcings, so they can’t be accused of putting a finger on the scale by their choice of data. Second, they take into account the efficacy issue and discuss it at length. They also take into account recent debates about how surface temperatures should or shouldn’t be measured, and how to deal with areas like the Arctic where data are sparse. Third, they compute their estimates over a variety of start and end dates to check that their ECS estimate is not dependent on the relative warming hiatus of the past two decades.

Their ECS estimate is 1.5 degrees, with a probability range between 1.05 and 2.45 degrees. If the study were a one-time outlier we might be able to ignore it. But it is part of a long list of studies from independent teams (as this interactive graphic shows), using a variety of methods that take account of critical challenges, all of which conclude that climate models exhibit too much sensitivity to greenhouse gases.

Change the Sensitivity, Change the Future

Policy-makers need to pay attention, because this debate directly impacts the carbon-tax discussion.

The Environmental Protection Agency uses social cost of carbon models that rely on the model-based ECS estimates. Last year, two colleagues and I published a study in which we took an earlier Lewis and Curry ECS estimate and plugged it into two of those models. The result was that the estimated economic damages of greenhouse gas emissions fell by between 40 and 80 per cent, and in the case of one model the damages had a 40 per cent probability of being negative for the next few decades — that is, they would be beneficial changes. The new Lewis and Curry ECS estimate is even lower than their old one, so if we re-did the same study we would find even lower social costs of carbon.

Conclusion

If ECS is as low as the Energy Balance literature suggests, it means that the climate models we have been using for decades run too hot and need to be revised. It also means that greenhouse gas emissions do not have as big an impact on the climate as has been claimed, and the case for costly policy measures to reduce carbon-dioxide emissions is much weaker than governments have told us. For a science that was supposedly “settled” back in the early 1990s, we sure have a lot left to learn.

Ross McKitrick is professor of economics at the University of Guelph and senior fellow at the Fraser Institute.

Famine Forecasts Foiled: Climate Increasing Food Production

Gregory Wrightstone has the story at CNS: Famine Forecasts Foiled: Climate’s Projected Food Production to Increase. Excerpts below with my bolds.

The latest dose of “fake news” about global warming comes from two forecasts of famine due to human activity. Both drew on estimates of extremely high temperatures predicted by the same flawed climate models used by the Intergovernmental Panel on Climate Change (IPCC) to predict other climate calamities. The climate models used in the studies are estimated to overpredict warming by 2.5 to 3 times compared with measured temperatures, and both rely on the highest estimates of maximum temperature increase.

The first of the reports warned that future production of vegetables and legumes would decrease by more than 30 percent with an expected rise of 4°C. Even the alarmist IPCC says that the most likely case is a rise of about half that.

The primary reason for the prediction of famine is a sharp decrease in water availability, even though recent reports indicate that previously arid portions of the Earth are experiencing a significant net increase in soil moisture due to a combination of increasing precipitation and CO2 fertilization — both effects of our changing climate.

Buried in the report is an admission that contradicts the hysteria engendered by the headlines. According to the authors, a 250-ppm increase in CO2, without the exaggerated temperature increase, would boost crop production by an average of 22 percent! That’s correct, more food as a result of increasing CO2.

The second report projects decreases in corn (maize) production due to increasing heat waves. This increase in extreme heat was based on the same exaggerated 4°C increase in temperature as the first study.

According to the USDA, corn is the largest component of the global grain trade, and the United States is the world’s largest producer. Corn is thus one of the country’s most important agricultural products, processed as sweet corn, cornmeal, tortillas and, thankfully, bourbon. It also is the primary feedstock to fatten cattle, chickens and hogs.

Fortunately, despite a continuing rise in temperatures, the world and America have set new corn records on an annual basis. The world’s remarkable ability to increase food production year after year is attributable to mechanization, agricultural innovation, CO2 fertilization and warmer weather. World grain production figures show that crop and food production has steadily increased, with only positive effects from our changing climate.

World grain production, consumption (LHS) and stocks (RHS) IGC (International Grain Council) data, Momagri formatting

Historically, crop growth has ballooned in times of high temperatures and declined drastically during cold periods. Over the last 4,000 years we find that previous periods of much warmer temperatures coincided with increasing food and prosperity leading to the rise of great civilizations that were relatively rich and well fed. Prosperous periods were interrupted by times of great despair as the Earth plunged into global cooling. With names like the Greek Dark Ages, the Dark Ages and the Little Ice Age, intervening cool periods featured crop failure, famine and mass depopulation.

Corn production in the U.S. presents a conundrum for environmental activists. On the one hand, they engage in fear mongering with predictions of famine based on questionable climate models. On the other hand, as enemies of fossil fuels, the activists promote ethanol production to replace our oil-based transportation fuels. Every acre of corn diverted to ethanol production is an acre that is no longer feeding the world’s hungry. In 2008, Jean Ziegler, the United Nations’ Special Rapporteur on the Right to Food, claimed that “to divert land from food production to biofuels is a crime against humanity.”

In 2000, the United States imposed the first ethanol mandate, dictating the level of ethanol that must be incorporated into American fuels. At that time, 90 percent of corn production was used for food. Today, only 60 percent of corn produced is used for food, driving up the cost of corn as food. The climate alarmists who claim to care about the world’s hungry could improve their lot overnight by simply canceling the ethanol mandate.

Rising temperatures and increasing carbon dioxide are leading to multiple benefits, and perhaps the most important of these is increasing crop production. Sleep well, users of fossil fuels; you aren’t causing famine.

Gregory Wrightstone is author of the new book, “Inconvenient Facts: The Science That Al Gore Doesn’t Want You To Know.” Wrightstone is a geologist with more than 35 years of experience researching and studying various aspects of the Earth’s processes. He is a member of the American Association for the Advancement of Science and the Geological Society of America.

See also:  Adapting Plants to Feed the World

2018 Update: Fossil Fuels ≠ Global Warming

Previous posts addressed the claim that fossil fuels are driving global warming. This post updates that analysis with the latest (2017) numbers from BP Statistics and compares World Fossil Fuel Consumption (WFFC) with three estimates of Global Mean Temperature (GMT). More on both these variables below.

WFFC

2017 statistics for international consumption of primary energy sources are now available in BP’s 2018 Statistical Review of World Energy.

The reporting categories are:
Oil
Natural Gas
Coal
Nuclear
Hydro
Renewables (other than hydro)

This analysis combines the first three, Oil, Gas, and Coal, for total fossil fuel consumption worldwide. The chart below shows the patterns for WFFC compared to world consumption of Primary Energy from 1965 through 2017.

WFFC2017

The graph shows that Primary Energy consumption has grown continuously for five decades. Over that period oil, gas and coal (sometimes termed “Thermal”) averaged 89% of PE consumed, ranging from 94% in 1965 to 85% in 2017. Mtoe means millions of tonnes of oil equivalent.

Global Mean Temperatures

Everyone acknowledges that GMT is a fiction since temperature is an intrinsic property of objects, and varies dramatically over time and over the surface of the earth. No place on earth determines “average” temperature for the globe. Yet for the purpose of detecting change in temperature, major climate data sets estimate GMT and report anomalies from it.

The UAH record consists of satellite-era global temperature estimates for the lower troposphere, a layer of air from 0 to 4 km above the surface. HadSST estimates sea surface temperatures from oceans covering 71% of the planet. HADCRUT combines HadSST estimates with records from land stations whose elevations range up to 6 km above sea level.

Both GISS LOTI (land and ocean) and HADCRUT4 (land and ocean) use 14.0 Celsius as the climate normal, so I will add that number back into the anomalies. This is done without claiming any validity beyond providing a reasonable sense of the magnitude of the observed fluctuations.

No doubt global sea surface temperatures are typically higher than 14C, more like 17 or 18C, and of course warmer in the tropics and colder at higher latitudes. Likewise, the lapse rate in the atmosphere means that air temperatures both from satellites and elevated land stations will range colder than 14C. Still, that climate normal is a generally accepted indicator of GMT.

Correlations of GMT and WFFC

The next graph compares WFFC to GMT estimates over the five decades from 1965 to 2017 from HADCRUT4, which includes HadSST3.

WFFC&GMT2017

Over the last five decades the increase in fossil fuel consumption is dramatic and monotonic, steadily increasing by 227% from 3.5B to 11.5B oil-equivalent tonnes. Meanwhile the GMT record from HADCRUT4 shows multiple ups and downs with an accumulated rise of 0.9C over 52 years, about 6% of the starting value.

The second graph compares to GMT estimates from UAH6, and HadSST3 for the satellite era from 1979 to 2017, a period of 38 years.

WFFC&UAH&HAD2017

In the satellite era WFFC has increased at a compounded rate of nearly 2% per year, for a total increase of 87% since 1979. At the same time, SST warming amounted to 0.44C, or 3.1% of the starting value.  UAH warming was 0.58C, or 4.2% up from 1979.  The temperature compounded rate of change is 0.1% per year, an order of magnitude less.  Even more obvious is the 1998 El Nino peak and flat GMT since.
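The compounded rates quoted above can be verified directly from the figures in the text (87% WFFC growth over the 38 years 1979-2017; a 0.44C SST rise on a starting value near 14C). My starting-temperature value is an assumption consistent with the climate-normal discussion above.

```python
# Verify the compounded growth rates quoted above, using figures from the text.
years = 38                       # 1979-2017

wffc_growth = 0.87               # 87% total increase in fossil fuel consumption
wffc_cagr = (1 + wffc_growth) ** (1 / years) - 1   # "nearly 2% per year"

sst_start_C = 14.0 + 0.2         # assumed starting GMT: 14 C normal + anomaly
sst_rise_C = 0.44
sst_frac = sst_rise_C / sst_start_C                # ~3.1% of starting value
sst_cagr = (1 + sst_frac) ** (1 / years) - 1       # ~0.1% per year

print(f"WFFC: {wffc_cagr * 100:.1f}%/yr; SST: {sst_cagr * 100:.2f}%/yr")
```

The temperature series compounds roughly twenty times more slowly than consumption, which is the weak alignment the text goes on to argue.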

Summary

The climate alarmist/activist claim is straightforward: burning fossil fuels makes measured temperatures warmer. The Paris Accord further asserts that by reducing human use of fossil fuels, further warming can be prevented. Those claims do not bear up under scrutiny.

It is enough for simple minds to see that two time series are both rising and to think that one must be causing the other. But both scientific and legal methods assert causation only when the two variables are both strongly and consistently aligned. The above shows a weak and inconsistent linkage between WFFC and GMT.

Going further back in history shows even weaker correlation between fossil fuels consumption and global temperature estimates:

wfc-vs-sat

Figure 5.1. Comparative dynamics of the World Fuel Consumption (WFC) and Global Surface Air Temperature Anomaly (ΔT), 1861-2000. The thin dashed line represents annual ΔT, the bold line—its 13-year smoothing, and the line constructed from rectangles—WFC (in millions of tons of nominal fuel) (Klyashtorin and Lyubushin, 2003). Source: Frolov et al. 2009

In legal terms, as long as there is another equally or more likely explanation for the set of facts, the claimed causation is unproven. The more likely explanation is that global temperatures vary due to oceanic and solar cycles. The proof is clearly and thoroughly set forward in the post Quantifying Natural Climate Change.

Background context for today’s post is at Claim: Fossil Fuels Cause Global Warming.

Blinded by Antarctica Reports

Special snow goggles for protection in polar landscapes.

Someone triggered Antarctica for this week’s media alarm blitz.

Antarctic ice loss increases to 200 billion tonnes a year – Climate Action

Antarctica is now melting three times faster than ever before – Euronews

Antarctica is shedding ice at an accelerating rate – Digital Journal

Al Gore Sounds the Alarm on 0.3 inches of Sea Level Rise from Ice Sheets– Daily Caller

Antarctica is losing an insane amount of ice. Nothing about this is good. – Fox News

Looks like it’s time yet again to play Climate Whack-A-Mole. That means stepping back to get some perspective on the reports and the interpretations applied by those invested in alarmism.

Antarctic Basics

The Antarctic Ice Sheet extends almost 14 million square kilometers (5.4 million square miles), roughly the area of the contiguous United States and Mexico combined. The Antarctic Ice Sheet contains 30 million cubic kilometers (7.2 million cubic miles) of ice. (Source: NSIDC: Quick Facts Ice Sheets)

The Antarctic Ice Sheet covers an area larger than the U.S. and Mexico combined. This photo shows Mt. Erebus rising above the ice-covered continent. Credit: Ted Scambos & Rob Bauer, NSIDC

The study of ice sheet mass balance underwent two major advances, one during the early 1990s, and again early in the 2000s. At the beginning of the 1990s, scientists were unsure of the sign (positive or negative) of the mass balance of Greenland or Antarctica, and knew only that it could not be changing rapidly relative to the size of the ice sheet.

Advances in glacier ice flow mapping using repeat satellite images, and later using interferometric synthetic aperture radar (InSAR) methods, facilitated the mass budget approach, although this still requires an estimate of snow input and a cross-section of the glacier as it flows out from the continent and becomes floating ice. Satellite radar altimetry mapping and change detection, developed in the early to mid-1990s, allowed the research community to finally extract reliable quantitative information regarding the overall growth or reduction of the volume of the ice sheets.

By 2002, publications were able to report that both large ice sheets were losing mass (Rignot and Thomas 2002). Then in 2003 the launch of two new satellites, ICESat and GRACE, led to vast improvements in one of the methods for mass balance determination, volume change, and introduced the ability to conduct gravimetric measurements of ice sheet mass over time. The gravimetric method helped to resolve remaining questions about how and where the ice sheets were losing mass. With this third method, and with continued evolution of mass budget and geodetic methods, it was shown that the ice sheets were in fact losing mass at an accelerating rate by the end of the 2000s (Velicogna 2009, Rignot et al. 2011b).

Contradictory Findings

NASA Study: Mass Gains of Antarctic Ice Sheet Greater than Losses

A new 2015 NASA study says that an increase in Antarctic snow accumulation that began 10,000 years ago is currently adding enough ice to the continent to outweigh the increased losses from its thinning glaciers.

The research challenges the conclusions of other studies, including the Intergovernmental Panel on Climate Change’s (IPCC) 2013 report, which says that Antarctica is overall losing land ice.

According to the new analysis of satellite data, the Antarctic ice sheet showed a net gain of 112 billion tons of ice a year from 1992 to 2001. That net gain slowed to 82 billion tons of ice per year between 2003 and 2008.

“We’re essentially in agreement with other studies that show an increase in ice discharge in the Antarctic Peninsula and the Thwaites and Pine Island region of West Antarctica,” said Jay Zwally, a glaciologist with NASA Goddard Space Flight Center in Greenbelt, Maryland, and lead author of the study, which was published on Oct. 30 in the Journal of Glaciology. “Our main disagreement is for East Antarctica and the interior of West Antarctica – there, we see an ice gain that exceeds the losses in the other areas.” Zwally added that his team “measured small height changes over large areas, as well as the large changes observed over smaller areas.”

Scientists calculate how much the ice sheet is growing or shrinking from the changes in surface height that are measured by the satellite altimeters. In locations where the amount of new snowfall accumulating on an ice sheet is not equal to the ice flow downward and outward to the ocean, the surface height changes and the ice-sheet mass grows or shrinks.

Snow covering Antarctic peninsula.

Keeping Things in Perspective

Such reports often include scary graphs like this one, and the reader is usually provided no frame of reference or context to interpret the image. First, the chart shows cumulative loss of mass arising from an average rate of 100 Gt lost per year since 2002. Many years had gains, including 2002, and the cumulative loss went below zero only in 2006. Also, various methods of measuring and analyzing give different results, as indicated in the earlier section.

Most important is understanding the fluxes in proportion to the Antarctic Ice Sheet. Let’s do the math. Above it was stated that Antarctica contains ~30 million cubic kilometers of ice. One km3 of water is 1 billion cubic meters and weighs 1 billion tonnes, or 1 gigatonne, so an equal volume of water would weigh about 30,000,000 gigatonnes. Since ice is slightly less dense than water, that total should be adjusted by 0.92, for an estimate of 27.6 million Gt of ice comprising the Antarctic Ice Sheet.

So in the recent decade, an average year went from 27,600,100 Gt to 27,600,000 Gt, according to one analysis. Other studies range from losing 200 Gt/yr to gaining 100 Gt/yr.

Even if Antarctica lost 200 Gt/yr. for the next 1000 years, it would only approach 1% of the ice sheet.

If, like Al Gore, you are concerned about sea level rise, that calculation starts with the ocean area, estimated at 3.618 x 10^8 km2 (361,800,000 km2). To raise that area by 1 mm requires 3.618×10^2 km3, or 361.8 km3 of water (1 km3 water = 1 Gt). So 200 Gt/yr is about 0.55 mm/yr, roughly 5.5 mm per decade or 5.5 cm per century.
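The arithmetic above is easy to verify; here is a minimal sketch using only the figures quoted in the text:

```python
# Back-of-envelope check of the figures quoted above.
ICE_VOLUME_KM3 = 30_000_000      # ~30 million km³ of Antarctic ice
ICE_DENSITY = 0.92               # relative to water (1 km³ water = 1 Gt)
ice_mass_gt = ICE_VOLUME_KM3 * ICE_DENSITY
print(f"ice sheet mass: {ice_mass_gt / 1e6:.1f} million Gt")    # ~27.6

# Fraction of the sheet lost after 1,000 years at 200 Gt/yr
loss_fraction = 200 * 1000 / ice_mass_gt
print(f"loss over 1,000 yr at 200 Gt/yr: {loss_fraction:.2%}")  # ~0.72%

# Sea-level contribution of 200 Gt/yr
OCEAN_AREA_KM2 = 3.618e8
gt_per_mm = OCEAN_AREA_KM2 * 1e-6   # km³ (= Gt) needed to raise the oceans 1 mm
print(f"Gt per mm of sea level rise: {gt_per_mm:.1f}")          # 361.8
print(f"200 Gt/yr ≈ {200 / gt_per_mm:.2f} mm/yr")               # ~0.55
```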

By all means let’s pay attention to things changing in our world, but let’s also notice the scale of the reality and not make mountains out of molehills.

Let’s also respect the scientists who study glaciers and their subtle movements over time (“glacial pace”).  Below is an amazing video showing the challenges and the beauty of working on a Greenland glacier.

From Ice Alive: Uncovering the secrets of Earth’s Ice

For more on the Joys of Playing Climate Whack-A-Mole 

Cooling Ocean Air Temps

Presently, sea surface temperatures (SST) are the best available indicator of heat content gained or lost from earth’s climate system.  Enthalpy is the thermodynamic term for the total heat content of a system, and humidity differences between air parcels affect enthalpy.  Measuring water temperature directly avoids the distorted impressions air measurements can give.  In addition, the ocean covers 71% of the planet’s surface and thus dominates surface temperature estimates.  Eventually we will likely have reliable means of recording water temperatures at depth.

Recently, Dr. Ole Humlum reported from his research that air temperatures lag changes in SST by 2-3 months.  He also observed that changes in atmospheric CO2 concentrations lag SST by 11-12 months.  This latter point is addressed in a previous post, Who to Blame for Rising CO2?
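A lag like that is typically estimated by sliding one series against the other and finding the offset with maximum correlation. Below is a minimal sketch with synthetic data (not Humlum’s actual series or his exact method):

```python
import numpy as np

# Synthetic illustration: a smooth random-walk "SST" series and an "air
# temperature" series that trails it by a known number of months.
rng = np.random.default_rng(0)
n = 240                                       # 20 years of monthly values
sst = np.cumsum(rng.normal(0, 0.05, n))       # random-walk "SST"
true_lag = 3                                  # air trails SST by 3 months
air = np.roll(sst, true_lag) + rng.normal(0, 0.02, n)
air[:true_lag] = air[true_lag]                # pad the wrapped-around start

def best_lag(leader, follower, max_lag=12):
    """Return the lag (months) maximizing correlation of follower vs. leader."""
    m = len(leader)
    corrs = []
    for lag in range(max_lag + 1):
        corrs.append(np.corrcoef(leader[: m - lag], follower[lag:])[0, 1])
    return int(np.argmax(corrs)), corrs

lag, corrs = best_lag(sst, air)
print(f"estimated lag: {lag} months")  # recovers the 3-month offset
```

With real SST and air-temperature anomaly files, the same `best_lag` function would be applied after removing the seasonal cycle.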

The May update to HadSST3 will appear later this month, but in the meantime we can look at lower troposphere temperatures (TLT) from UAHv6 which are already posted for May. The temperature record is derived from microwave sounding units (MSU) on board satellites like the one pictured above.

The UAH dataset includes temperature results for air above the oceans, and thus should be most comparable to the SSTs. The graph below shows monthly anomalies for ocean temps since January 2015.

UAH May2018


The anomalies have reached the same levels as 2015.  Taking a longer view, we can look at the record since 1995, that year being ENSO-neutral and thus a reasonable starting point for considering the past two decades.  On that basis we can see the plateau in ocean temps persisting. Since last October all oceans have cooled, and the upward bumps of Feb. 2018 are now erased.

UAHv6 TLT Monthly Ocean Anomalies (°C)

Region     Average Since 1995   Ocean 5/2018
Global            0.13               0.09
NH                0.16               0.33
SH                0.11              -0.09
Tropics           0.12               0.02

As of May 2018, global ocean temps are slightly lower than April and below the average since 1995.  NH remains higher, but not by enough to offset much lower temps in SH and the Tropics (between 20N and 20S latitudes).  Global ocean air temps are now the lowest since April 2015, and SH the lowest since May 2013.
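The comparison in the table can be expressed directly; a small sketch using the numbers quoted above:

```python
# Each region's (average since 1995, May 2018) anomaly in °C, from the table above.
table = {
    "Global":  (0.13,  0.09),
    "NH":      (0.16,  0.33),
    "SH":      (0.11, -0.09),
    "Tropics": (0.12,  0.02),
}
# Regions currently running below their own post-1995 average:
below = [region for region, (avg, may18) in table.items() if may18 < avg]
print(below)  # ['Global', 'SH', 'Tropics']
```

This confirms NH as the only region above its long-run average, consistent with the text.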

The details of UAH ocean temps are provided below.  The monthly data make for a noisy picture, but the seasonal fluctuations between January and July are important.


The greater volatility of the Tropics is evident, leading the oceans through three major El Nino events during this period.  Note also the flat period between 7/1999 and 7/2009.  The 2010 El Nino was erased by La Nina in 2011 and 2012.  Then the record shows a fairly steady rise peaking in 2016, with strong support from warmer NH anomalies, before returning to the 22-year average.

Summary

TLTs include air mixing above the oceans and probably some influence from nearby, more volatile land temps.  They started the recent cooling later than SSTs from HadSST3, but are now showing the same pattern.  It seems clear that despite the three El Ninos, their warming has not persisted, and without them the record would probably have cooled since 1995.  Of course, the future has not yet been written.

 

Chicxulub asteroid Apocalypse? Not so fast.

The Daily Mail would have you believe that an “Apocalyptic asteroid that wiped out the dinosaurs 66 million years ago triggered 100,000 years of global warming,” claiming the Chicxulub asteroid triggered a global temperature rise of 5°C (9°F).

This notion has been around for years, but it is dredged up now to promote fears of CO2 and global warming. Or maybe it’s because a new Jurassic Park movie is coming this summer.  Either way, it doesn’t take much looking around to discover experts who have a sober, reasonable view of the situation.

Gerta Keller, Professor of Geosciences at Princeton, has studied this issue since the 1990s and tells all at her website CHICXULUB: THE IMPACT CONTROVERSY. Excerpts below with my bolds.

Introduction to The Impact Controversy

In the 1980s as the impact-kill hypothesis of Alvarez and others gained popular and scientific acclaim and the mass extinction controversy took an increasingly rancorous turn in scientific and personal attacks fewer and fewer dared to voice critique. Two scientists stand out: Dewey McLean (VPI) and Chuck Officer (Dartmouth University). Dewey proposed as early as 1978 that Deccan volcanism was the likely cause for the KTB mass extinction, Officer also proposed a likely volcanic cause. Both were vilified and ostracized by the increasingly vocal group of impact hypothesis supporters. By the middle of the 1980s Vincent Courtillot (Physique de Globe du Paris) also advocated Deccan volcanism, though not as primary cause but rather as supplementary to the meteorite impact. Since 2008 Courtillot has strongly advocated Deccan volcanism as the primary cause for the KTB mass extinction.

(Overview from Tim Clarey, Ph.D., questioning the asteroid) In secular literature and movies, the most popular explanation for the dinosaurs’ extinction is an asteroid impact. The Chicxulub crater in Mexico is often referred to as the “smoking gun” for this idea. But do the data support an asteroid impact at Chicxulub?

The Chicxulub crater isn’t visible on the surface because it is covered by younger, relatively undeformed sediments. It was identified from a nearly circular gravity anomaly along the northwestern edge of the Yucatán Peninsula (Figure 1). There’s disagreement on the crater’s exact size, but its diameter is approximately 110 miles—large enough for a six-mile-wide asteroid or meteorite to have caused it.

Although some of the expected criteria for identifying a meteorite impact are present at the Chicxulub site—such as high-pressure and deformed minerals—not enough of these materials have been found to justify a large impact. And even these minerals can be caused by other circumstances, including rapid crystallization and volcanic activity.

The biggest problem is what is missing. Iridium, a chemical element more abundant in meteorites than on Earth, is a primary marker of an impact event. A few traces were identified in the cores of two drilled wells, but no significant amounts have been found in any of the ejecta material across the Chicxulub site. The presence of an iridium-rich layer is often used to identify the K-Pg (Cretaceous-Paleogene) boundary, yet ironically there is virtually no iridium in the ejecta material at the very site claimed to be the “smoking gun”!

In addition, secular models suggest melt-rich layers resulting from the impact should have exceeded a mile or two in thickness beneath the central portion of the Chicxulub crater. However, the oil wells and cores drilled at the site don’t support this. The thickest melt-rich layers encountered in the wells were between 330 and 990 feet—nowhere near the expected thicknesses of 5,000 to 10,000 feet—and several of the melt-rich layers were much thinner than 300 feet or were nonexistent.

Finally, the latest research even indicates that the tsunami waves claimed to have been generated by the impact across the Gulf of Mexico seem unlikely.

Summary from Keller

The Cretaceous-Tertiary boundary (KTB) mass extinction is primarily known for the demise of the dinosaurs, the Chicxulub impact, and the frequently rancorous thirty-year-old controversy over the cause of this mass extinction. Since 1980 the impact hypothesis has steadily gained support, which culminated in 1990 with the discovery of the Chicxulub crater on Yucatan as the KTB impact site and “smoking gun” that proved this hypothesis. In a perverse twist of fate, this discovery also began the decline of this hypothesis, because for the first time it could be tested directly based on the impact crater and impact ejecta in sediments throughout the Caribbean, Central America and North America.

Two decades of multidisciplinary studies amassed a database with a sum total that overwhelmingly reveals the Chicxulub impact predates the KTB mass extinction. It’s been a wild and frequently acrimonious ride through the landscape of science and personalities. The highlights of this controversy, the discovery of facts inconsistent with the impact hypothesis, the denial of evidence, misconceptions, and misinterpretations are recounted here. (Full paper in Keller, 2011, SEPM 100, 2011).

Chicxulub Likely Happened ~100,000 years Before the KTB Extinction

Figure 42. Planktic foraminiferal biostratigraphy, biozone ages calculated based on time scales where the KTB is placed at 65Ma, 65.5Ma and 66Ma, and the relative age positions of the Chicxulub impact, Deccan volcanism phases 2 and 3 and climate change, including the maximum cooling and maximum warming (greenhouse warming) and the Dan-2 warm event relative to Deccan volcanism.

Most studies surrounding the Chicxulub impact crater have concentrated on the narrow interval of the sandstone complex or so-called impact-tsunami. Keller et al. (2002, 2003) placed that interval in zone CF1 based on planktic foraminiferal biostratigraphy, and specifically the range of the index species Plummerita hantkeninoides, which spans the topmost Maastrichtian. The age of zone CF1 was estimated to span the last 300 ky of the Maastrichtian based on the old time scale of Cande and Kent (1995), which places the KTB at 65 Ma. The newer time scale (Gradstein et al., 2004) places the KTB at 65.5 Ma, which reduces zone CF1 to 160 ky.

By early 2000 our team embarked on an intensive search for impact spherules below the sandstone complex throughout NE Mexico. Numerous outcrops were discovered with impact spherule layers in planktic foraminiferal zone CF1 below the sandstone complex and we suggested that the Chicxulub impact predates the KTB by about 300ky (Fig. 42; Keller et al., 2002, 2003, 2004, 2005, 2007, 2009; Schulte et al., 2003, 2006).

Time scales change with improved dating techniques. Gradstein et al. (2004) proposed placing the KTB at 65.5 Ma (Abramovich et al., 2010). That time scale is now undergoing further revision (Renne et al., 2013), placing the KTB at 66 Ma, which reduces zone CF1 to less than 100 ky. By this time scale, the Chicxulub impact predates the KTB by less than 100 ky, based on impact spherule layers in the lower part of zone CF1. See Fig. 42 for illustration.

Unfortunately, this wide interest rarely resulted in integrated interdisciplinary studies or joint discussions to search for common solutions to conflicting results. Increasingly, in a perverse twist of science, new results came to be judged by how well they supported the impact hypothesis, rather than how well they tested it. An unhealthy US-versus-THEM culture developed where those who dared to question the impact hypothesis, regardless of the solidity of the empirical data, were derided, dismissed as poor scientists, blocked from publication and grant funding, or simply ignored. Under this assault, more and more scientists dropped out, leaving a nearly unopposed ruling majority claiming victory for the impact hypothesis. In this adverse, high-stress environment just a small group of scientists doggedly pursued evidence to test the impact hypothesis.

No debate has been more contentious during the past thirty years, or has more captured the imagination of scientists and public alike, than the hypothesis that an extraterrestrial bolide impact was the sole cause for the KTB mass extinction (Alvarez et al., 1980). How did this hypothesis evolve so quickly into a virtually unassailable “truth” where questioning could be dismissed by phrases such as “everybody knows that an impact caused the mass extinction”, “only old fashioned Darwinian paleontologists can’t accept that the mass extinction was instantaneous”, “paleontologists are just bad scientists, more like stamp collectors”, and “it must be true because how could so many scientists be so wrong for so long.” Such phrases are reminiscent of the beliefs that the Earth is flat, that the world was created 6000 years ago, that Noah’s flood explains all geological features, and the vilification of Alfred Wegener for proposing that continents moved over time.

Update published at National Geographic, February 2018, by Shannon Hall: Volcanoes, Then an Asteroid, Wiped Out the Dinosaurs

What killed the dinosaurs? Few questions in science have been more mysterious—and more contentious. Today, most textbooks and teachers tell us that nonavian dinosaurs, along with three-fourths of all species on Earth, disappeared when a massive asteroid hit the planet near the Yucatán Peninsula some 66 million years ago.

But a new study published in the journal Geology shows that an episode of intense volcanism in present-day India wiped out several species before that impact occurred.

The result adds to arguments that eruptions plus the asteroid caused a one-two punch. The volcanism provided the first strike, weakening the climate so much that a meteor—the more deafening blow—was able to spell disaster for Tyrannosaurus rex and its late Cretaceous kin.

A hotter climate certainly helped send the nonavian dinosaurs to their early grave, says Paul Renne, a geochronologist at the University of California, Berkeley, who was not involved in the study. That’s because the uptick in temperature was immediately followed by a cold snap—a drastic change that likely set the stage for planet-wide disaster.

Imagine that some life managed to adapt to those warmer conditions by moving closer toward the poles, Renne says. “If you follow that with a major cooling event, it’s more difficult to adapt, especially if it’s really rapid,” he says.

In this scenario, volcanism likely sent the world into chaos, driving many extinctions alone and increasing temperatures so drastically that most of Earth’s remaining species couldn’t protect themselves from that second punch when the asteroid hit.

“The dinosaurs were extremely unlucky,” Wignall says.

But it will be hard to convince Sean Gulick, a geophysicist at the University of Texas at Austin, who co-led recent efforts to drill into the heart of the impact crater in Mexico. He points toward several studies that have suggested that ecosystems remained largely intact until the time of the impact.

Additionally, a forthcoming paper might make an even stronger case that the impact drove the extinction alone, notes Jay Melosh, a geophysicist at Purdue University who has worked on early results from the drilling project. It looks as though the divisive debate will continue with nearly as much ferocity as the events that rocked our world 66 million years ago.

Summary:

So if the Chicxulub asteroid didn’t kill the dinosaurs, what did? Paleontologists have advanced all manner of other theories over the years, including the appearance of land bridges that allowed different species to migrate to different continents, bringing with them diseases to which native species hadn’t developed immunity. Keller and Adatte do not see any reason to stray so far from the prevailing model. Some kind of atmospheric haze might indeed have blocked the sun, making the planet too cold for the dinosaurs — it just didn’t have to have come from an asteroid. Rather, they say, the source might have been massive volcanoes, like the ones that blew in the Deccan Traps in what is now India at just the right point in history.

For the dinosaurs that perished 65 million years ago, extinction was extinction and the precise cause was immaterial. But for the bipedal mammals who were allowed to rise once the big lizards were finally gone, it is a matter of enduring fascination.

This science seems as settled as climate change/global warming, and with many of the same shenanigans.

New Zealand Warming Disputed

Mount Cook National Park, New Zealand.

A dust-up over the temperature trend in New Zealand is discussed at the Climate Conversation New Zealand blog: Response to NIWA comment on de Freitas reanalysis of the NZ temperature record, by Barry Brill, Chairman of the New Zealand Climate Science Coalition.  Excerpts with my bolds.

Conclusions

de Freitas finds that New Zealand has experienced an insignificant warming trend of 0.28°C/century during 1909-2008. Using the same data, the Mullan Report calculates that trend at 0.91°C/century. Both studies claim to apply the statistical technique described in RS93, and each alleges that the other has departed from that methodology. This core issue has been described in the graph above but has not been addressed in this note.

A second core issue relates to reliance upon inhomogeneous Auckland and Wellington data despite the extensive contamination of both sites by sheltering and UHI. That matter has not been addressed here either.

Instead, this limited reply deals with the raft of peripheral allegations contained in the NIWA Comment. In particular, it sets out to show that all plausible published records, as well as the scientific literature, support the view that New Zealand’s temperature record has remained remarkably stable over the past century or so.

Some of the Issues Rebutted

Other temperature records:

The de Freitas warming trend of 0.28°C/century is wholly consistent with the synchronous Southern Hemisphere trend reported in the IPCC’s AR5. Both the IPCC and NIWA have long reported that anthropogenic warming trends in ocean-dominated New Zealand would be materially lower than global averages. The S81/Mullan Report trend of 0.91°C/century is clearly anomalous.

Official New Zealand temperature records for eight years in the 1860s, which are both reliable and area-representative, show the absolute mean temperature was then 13.1°C. A 30-year government record for the period ending 1919 shows the mean temperature to be 12.8°C. The current normal (30-year) mean 7SS temperature is 12.9°C. Clearly, New Zealand mean temperatures have remained almost perfectly stable during the past 150 years.
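As a rough check on that stability claim, one can fit a linear trend through the three quoted means. The record midpoint years below are my assumptions (the text gives only the periods: the 1860s, a 30-year span ending 1919, and the current 30-year normal):

```python
import numpy as np

# Means quoted in the text (°C), against assumed record-midpoint years.
years = np.array([1865.0, 1904.0, 1995.0])   # midpoints are assumptions
temps = np.array([13.1, 12.8, 12.9])

# np.polyfit with degree 1 returns (slope, intercept)
slope_per_year, intercept = np.polyfit(years, temps, 1)
print(f"least-squares trend: {slope_per_year * 100:+.2f} °C/century")
```

With any plausible choice of midpoints, the fitted trend stays within a small fraction of a degree per century, consistent with the “remarkably stable” conclusion above.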

Use of RS93 Statistical Method:

The Mullan Report (along with other NIWA articles that are not publicly available) does purport to use RS93 comparison techniques, so this assertion is naturally accepted whenever these ‘grey’ papers are mentioned in the peer-reviewed literature. However, the Mullan Report sits outside the literature and clearly fails to execute its stated intention to apply RS93 methods. The de Freitas paper rectifies those omissions.

NZ Glaciers

In this area, the most recent authority is Mackintosh et al. (2017), entitled “Regional cooling caused recent New Zealand glacier advances in a period of global warming.” After observing that at least 58 Southern Alps glaciers advanced during the period 1983-2008, the abstract notes:

“Here we show that the glacier advance phase resulted predominantly from discrete periods of reduced air temperature, rather than increased precipitation. The lower temperatures were associated with anomalous southerly winds and low sea surface temperature in the Tasman Sea region. These conditions result from variability in the structure of the extratropical atmospheric circulation over the South Pacific.”

This Nature paper, of which James Renwick was an author, notes that the World Glacier Monitoring Service database shows that in 2005 “15 of the 26 advancing glaciers observed worldwide were in New Zealand.”

BEST Data

Using up to 52 auto-adjusted datasets, the Berkeley Earth group derives an absolute New Zealand temperature range of 9.5°C to 11°C over the 160-year period from 1853 to 2013.

The mid-point of this range is very far from the mid-point of the 12.3°C to 13.2°C range recorded in the 7SS (whether raw or adjusted) and is clearly wrong. Nonetheless, for the 100-year period 1909-2008, the BEST adjusted anomalies are said to show a perfect correlation with those of the Mullan Report (to three decimal places). The claimed independence of such an immaculate outcome is entirely implausible.

Anthropogenic Warming

The Mullan Report’s 1960-90 upward spike could not have occurred while the south-west Pacific region was in a cooling phase – which is confirmed by Mackintosh et al. (2017). Further, the final 30-year period of the Mullan Report shows an insignificant trend of only 0.12°C/century, demonstrating that New Zealand has not yet been affected by global warming trends.

Summary

Good to see that de Freitas et al. are again speaking climate truth to entrenched alarmists.  Go Kiwis!

Confronting Suzuki’s Climate Hysteria

Thanks to Friends of Science in Calgary for hosting Award-winning Dutch filmmaker Marijn Poels and Canadian climate change scientist Dr. Madhav Khandekar.  They dismantled the dogma of Global Editors Network and Dr. Suzuki-style climate hysteria in one evening at Friends of Science Society’s 15th Annual Event entitled: “Extreme Climate Uncertainty.”

Full story is Inquiry not Dogma, which includes links and background information.  Excerpts with my bolds.

Poels challenged the audience with evidence that food security is at risk due to ‘green’ energy policies while Dr. Khandekar deconstructed climate alarmism with convincing evidence that extreme weather is mostly media hype.

Left-wing, progressive Poels recounted to the Friends of Science event audience how he had worked in conflict zones and impoverished countries for nine years, making 50 films in that time. When he returned home to Europe for some recovery time in the pastoral countryside, he was surprised, then alarmed, to find that EU climate and energy policies were trading food security for unreliable, expensive ‘energy’ security. Curious to find the root of this strange set of policies, Poels followed the money and policy trail to talk with climate scientists and agricultural experts.

Poels noted that he had a broad-reaching, very supportive media network for his human rights and justice films; this dried up the moment he broached the topic of climate change. His 2017 documentary film exposed how climate change policies are threatening modern civilization. The trailer can be viewed below. My recent post on this subject was Climate Policies Failure, the Movie.

Dr. Madhav Khandekar, former Environment Canada researcher, gave a lively, humorous presentation that debunked the claims of extreme weather being more frequent or caused by human influence on climate or human industrial carbon dioxide emissions. Khandekar explained some of the intricacies of the global effects of the natural, cyclical El Nino Southern Oscillation and its mirror image, La Nina. Overall, Khandekar says the only noticeable trends are toward longer cold snaps, a possible harbinger of long periods of cold and erratic weather as experienced in the Little Ice Age, during a solar minimum.

Khandekar was an instructor at the University of Alberta early on in his career, an institution now embroiled in a vigorous public debate about the propriety of conferring an honorary degree on Dr. David Suzuki at this spring’s convocation.

Friends of Science Society posted an open letter on their blog on May 9, 2018, addressed to the president of the University of Alberta, expressing their views on the matter. After describing the details of Suzuki’s destructive behavior, the letter concludes with the following summary:

Friends of Science Society University of Alberta grad members are not upset that Dr. Suzuki holds controversial views because they value freedom of speech. More so, they value scientific integrity. They are upset that he spouts false and misleading diatribes on scientific topics – contrary to all the careful and accurate scientific methods that they learned as students at the University of Alberta.

And they are very upset that you choose to honor that.

Our members have not only seen job loss for themselves or their employees, they have experienced the tragic consequences of lives lost through suicide as careers, finances, families and business enterprises fall apart.

For no good reason.

Under your leadership, Dr. Turpin, the University of Alberta embarked on a program entitled “For the Public Good.” Now you want to honor a high-profile public figure, someone whose uninformed and misleading activism, has aided the destruction of the economy in Alberta, whose unsupported activist rhetoric has done untold damage to the Canadian economy and whose statements have damaged our international reputation as a reliable and fair place to do business. The outcomes include personal catastrophe for hundreds of thousands of people, many of them University of Alberta alumni. How is that for the public good?

In our opinion, based on the foregoing evidence, Dr. Suzuki’s actions and words are not congruent with the skills learned in the physical sciences, environmental or business management at the University of Alberta. They are not in keeping with the expectations of its graduates or faculty members, nor with its own Code of Ethics, nor with the values you express in your statement meant to validate your decision to honor Dr. Suzuki.

We ask that you rescind the offer of the honorary degree to Dr. David Suzuki.


Climate Canary? N. America Cooling

Hidden amid reports of recent warmest months and years based on global averages, there is a significant departure in North America. Those of us living in Canada and the USA have noticed a distinct cooling, and our impressions are not wrong.

The image above shows how much lower April 2018 temperatures have been. The table below provides the numbers behind the graphs, from NOAA State of the Climate.

Continent        Anomaly (1910-2000)   Trend (1910-2018)   Rank                   Record
                  °C      °F            °C     °F          (out of 109 years)     Year    °C     °F
North America    -0.97   -1.75         0.11   0.19         Warmest 94ᵗʰ           2010    2.65   4.77
South America     1.34    2.41         0.13   0.24         Warmest 1ˢᵗ            2018    1.34   2.41
Europe            2.82    5.08         0.14   0.25         Warmest 1ˢᵗ            2018    2.82   5.08
Africa            1.23    2.21         0.12   0.22         Warmest 5ᵗʰ            2016    1.72   3.10
Asia              1.66    2.99         0.18   0.32         Warmest 9ᵗʰ            2016    2.40   4.32
Oceania           2.47    4.45         0.14   0.25         Warmest 2ⁿᵈ            2005    2.54   4.57
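One sanity check on tables like this: anomalies are temperature differences, so the °C and °F columns should be related by the factor 9/5 alone (the +32 offset applies only to absolute temperatures, not differences). A quick verification of the April anomaly columns above:

```python
# (°C, °F) anomaly pairs from the April 2018 table above.
rows = {
    "North America": (-0.97, -1.75),
    "South America": ( 1.34,  2.41),
    "Europe":        ( 2.82,  5.08),
    "Africa":        ( 1.23,  2.21),
    "Asia":          ( 1.66,  2.99),
    "Oceania":       ( 2.47,  4.45),
}
for name, (c, f) in rows.items():
    # Differences convert with ×9/5 only, no +32 offset.
    assert abs(c * 9 / 5 - f) < 0.01, name
print("all °F anomaly columns match °C × 9/5")
```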

The table shows how different the North American experience was: 94th warmest out of 109 years.  But when we look at the first four months of the year, North America is more in line with the rest of the globe.


As the image shows, cooling was more widespread during the first third of 2018, particularly in North America, Northern Europe and Asia, as well as a swath of cooler mid-ocean latitudes in the Southern Hemisphere.

Continent        Anomaly (1910-2000)   Trend (1910-2018)   Rank                   Record
                  °C      °F            °C     °F          (out of 109 years)     Year    °C     °F
North America     0.44    0.79         0.16   0.29         Warmest 44ᵗʰ           2016    2.71   4.88
South America     0.94    1.69         0.13   0.24         Warmest 6ᵗʰ            2016    1.39   2.50
Europe            1.35    2.43         0.13   0.24         Warmest 13ᵗʰ           2014    2.46   4.43
Africa            1.08    1.94         0.10   0.18         Warmest 3ʳᵈ            2010    1.62   2.92
Asia              1.57    2.83         0.19   0.34         Warmest 8ᵗʰ            2002    2.72   4.90
Oceania           1.58    2.84         0.12   0.22         Warmest 1ˢᵗ            2018    1.58   2.84

The table confirms that Europe and Asia were cooler in early 2018 than in recent years of the decade.

Summary

These data show again that temperature indicators of climate are not global but regional, and even local, in their manifestations.  At the continental level there are significant differences.  North America is an outlier, but who is to say whether it is an aberration that will rejoin the rest, or the trendsetter signaling a widespread cooler future.

See Also:  Is This Cold the New Normal?