Inside the Climate Tutorial

Thanks to an article at Wired, we get a first glimpse into what transpired at the March 21 courtroom tutorial called by Federal District Court Judge Alsup. From a science perspective, it looks at the moment like a missed opportunity. The oil company lawyers sat in silence, allowing Chevron’s lead attorney to speak for them, and he mainly quoted from IPCC documents. The calculation seems to be to take the position that the companies knew no more, and no sooner, than what the IPCC concluded in its series of assessment reports. The plaintiffs let alarmist scientists present on their behalf, and can now claim their opinions were not refuted.

The Wired article is In the Courtroom, Climate Science Needs Substance–and Style. Excerpts below with my bolds.

Outside the usual procedural kabuki of the courtroom, the truth is no one really knew what to expect from this court-ordered “tutorial.” For a culture based in large measure on precedent, putting counsel and experts in a room to hash out climate change for a trial—putting everyone on the record, in federal court, on what is and is not true about climate science—was literally unprecedented.

What Alsup got might not have been a full on PowerPoint-powered preview of the trial. But it did reveal a lot about the styles and conflicts inherent in the people who produce the carbon and the people who study it.

The other petrochemists put forth Theodore Boutrous, an AC-130 gunship of a lawyer who among other things got the US Supreme Court to overturn the California law against same-sex marriage. Here, retained specifically by Chevron, Boutrous argued what seemed to be climate change’s chapter-and-verse. He extolled the virtues of the several IPCC reports, 2013 most recently, and quoted them liberally. Boutrous talked about how the reports’ conclusions have gotten more and more surefooted about “anthropogenic” causes of climate change—it’s people!—and outcomes like sea level rise. “From Chevron’s perspective, there’s no debate about climate science,” Boutrous said. “Chevron accepts what this scientific body—scientists and others—what the IPCC has reached consensus on.”

Still, over the course of the morning, Boutrous nevertheless tried to neg the IPCC in two specific ways. One was a classic: He challenged the models that climate scientists use to attempt to predict the future. These computer models, Boutrous said, are “increasingly complex. That can make the modeling more powerful.” But with great power comes great potential wrongness. “Because it’s an attempt to represent things in the real world, the complexity can bring more risk.” He assured the court that Chevron agreed with the IPCC approach—posting up a slide pulled from an IPCC report that showed the multicolored paths of literally hundreds of models, using different emissions scenarios and essentially describing the best case and worst case (and a bunch of in-between cases). It looked like a blast of fireworks emerging from observed average temperature, headed chaotically up and to the right.

So here comes the crux of the thing—a question not of whether climate change is real, but whether you can ascribe blame for it. Leaning heavily on more IPCC quotes, Boutrous showed slides and statistics saying that climate change is a global problem that doesn’t differentially affect the West Coast of North America and isn’t caused by any one emitter. Or even any one source of emissions. Anthropogenic emissions are driven by things like population size, economic activities, lifestyle, energy use, land use patterns, and technology and climate policy, according to the IPCC. “The IPCC does not say it’s the extraction and production of oil,” Boutrous said. “It’s economic activity that creates the demand for energy and that leads to emissions.”

If that seems a little bit like the “guns don’t kill people; people kill people” of petrochemical capitalism, well, Judge Alsup did start the morning by saying today was a day for science, not politics.

So what knives did the representatives of the state of California bring to this oil-fight? Here’s where style is interesting. California didn’t front lawyers. For the science tutorial, the municipalities fronted scientists—people who’d been first authors on chapters in the IPCC reports from which Boutrous quoted, and one who’d written a more recent US report and a study of sea level rise in California. They knew their stuff and could answer all of Judge Alsup’s questions … but their presentations were more like science conference fodder than well-designed rhetoric.

For example, Myles Allen, leader of the Climate Research Program at the University of Oxford, gave a detailed, densely illustrated talk on the history and science of climate change…but he also ended up in an extended back and forth with Alsup about whether Svante Arrhenius’ 1896 paper hypothesizing that carbon dioxide in Earth’s atmosphere warmed the planet explicitly used the word “logarithmic.” Donald Wuebbles, an atmospheric scientist at the University of Illinois and co-author of the Nobel Prize-winning 2007 IPCC report, mounted a grim litany of all the effects of climate change that scientists can see today, but Alsup caught him up by asking for specific things he disagreed with Boutrous on—a tough game, since Boutrous was just quoting the IPCC.

Then Alsup and Wuebbles took a detour into naming other renewable power sources besides solar and wind. “Nuclear would not put out any CO2, right? We might get some radiation as we drive by, but maybe in retrospect we should have taken a hard look at nuclear?” Alsup interrupted. “No doubt solar is good where you can use it, but do you really think it could be a substitute for supplying the amount of power America used in the last 30 years?”

“I think solar could be a significant factor of our energy future,” Wuebbles said. “I don’t think there’s any one silver bullet.”

In part, one might be tempted to put some blame on Alsup here. You might remember him from such trials as Uber v. Waymo, where he asked for a similar tutorial on self-driving car technology. Or from Oracle v. Google, a trial for which Alsup taught himself a little of the programming language Java so he’d understand the case better. Or from his intercession against the Trump administration’s attempt to cancel the Deferred Action for Childhood Arrivals program, protecting the immigration status of so-called Dreamers. “He’s kind of quirky and not reluctant to do things kind of outside the box,” said Deborah Sivas, Director of the Environmental and Natural Resource Law & Policy Program at Stanford Law School. “And I think he sees this as a precedent-setting case, as do the lawyers.”

It’s possible, then, to infer that Alsup was doing more than just getting up to speed on climate change on Wednesday. The physics and chemistry are quite literally textbook, and throughout the presentations he often seemed to know more than he was letting on. He challenged chart after chart incisively, and often cut in on history. When Allen brought up Roger Revelle’s work showing that oceans couldn’t absorb carbon (at least, not fast enough to stave off climate change), Alsup interrupted.

“Is it true that Revelle initially thought the ocean would absorb all the excess, and that he came to this buffer theory a little later?” Alsup asked.

“You may know more of this history than I do,” Allen said.

But on the other hand, some of what the litigators seemed to not know sent the scientists in the back in literal spasms. When Boutrous couldn’t answer Alsup’s questions about the specific causes of early 20th-century warming (presumably before the big industrial buildup of the 1950s), Allen and Wuebbles, sitting just outside the gallery, clenched fists and looked like they were having to keep from shouting out the answer. Later, Alsup acknowledged that he’d watched An Inconvenient Truth to prepare, and Boutrous said he had, too.

All of which makes it hard to tell whether bringing scientists to this table was the right move. And maybe that has been the problem all along. The interface where utterly flexible law and policy moves against the more rigid statistical uncertainties of scientific observation has always been contested space. The practitioners of both arts seem foreign to each other; the cultural mores differ.

Maybe that’s what this “tutorial” was meant for. As Sivas says, the facts aren’t really in doubt here. Or rather, most of them aren’t, and maybe Alsup will use today as a kind of discovery process, a way to crystalize the difference between uncertainty in science and uncertainty under the law. “That’s what judges do. They decide the credibility of one expert over another,” Sivas says. “That doesn’t mean it’s scientific truth. It means it’s true as a legal claim.”

The skeptical scientific brief was filed by the esteemed scientists Happer, Koonin and Lindzen, but its effect is not yet evident. More details are at Cal Climate Tutorial: The Meat


Cal Climate Tutorial: The Meat

Previous posts provided the context for the climate tutorial requested by Judge Alsup in the lawsuit filed by California cities against big oil companies: Cal Court to Hear Climate Tutorial

An overview of a submission by Professors Happer, Koonin and Lindzen was in Climate Tutorial for Judge Alsup

This post goes into the meat and potatoes of that submission with excerpts from Section II: Answers to specific questions (my bolds)

Question 1: What caused the various ice ages (including the “little ice age” and prolonged cool periods) and what caused the ice to melt? When they melted, by how much did sea level rise?

The discussion of the major ice ages of the past 700 thousand years is distinct from the discussion of the “little ice age.” The former refers to periods of immense growth of massive ice sheets (a mile or two thick), lasting approximately eighty thousand years, followed by warm interglacials lasting on the order of twenty thousand years. By contrast, the “little ice age” was a relatively brief period (about four hundred years) of relatively cool temperatures accompanied by the growth of alpine glaciers over much of the earth.


The last glacial episode ended somewhat irregularly. Ice coverage reached its maximum extent about eighteen thousand years ago. Melting occurred between about twenty thousand years ago and thirteen thousand years ago, and then there was a strong cooling (Younger Dryas) which ended about 11,700 years ago. Between twenty thousand years ago and six thousand years ago, there was a dramatic increase in sea level of about 120 meters followed by more gradual increase over the following several thousand years. Since the end of the “little ice age,” there has been steady increase in sea-level of about 6 inches per century.


As to the cause of the “little ice age,” this is still a matter of uncertainty. There was a long hiatus in solar activity that may have played a role, but on these relatively short time scales one can’t exclude natural internal variability. It must be emphasized that the surface of the earth is never in equilibrium with net incident solar radiation because the oceans are always carrying heat to and from the surface, and the motion systems responsible have time scales ranging from years (for example ENSO) to millennia.

The claim that orbital variability requires a boost from CO2 to drive ice ages comes from the implausible notion that what matters is the orbital variation in global average insolation (which is, in fact, quite small) rather than the large forcing represented by the Milankovitch parameter. This situation is very different from the recent, more modest short-term warming, where natural variability makes the role of CO2 much more difficult to determine.

Question 2: What is the molecular difference by which CO2 absorbs infrared radiation but oxygen and nitrogen do not?

Molecules like CO2, H2O, CO or NO are called greenhouse-gas molecules because they can efficiently absorb or emit infrared radiation, but they are nearly transparent to sunlight. Molecules like O2 and N2 are also nearly transparent to sunlight, but since they do not absorb or emit thermal infrared radiation very well, they are not greenhouse gases. The most important greenhouse gas, by far, is water vapor. Water molecules, H2O, are permanently bent and have large electric dipole moments.

Question 3: What is the mechanism by which infrared radiation trapped by CO2 in the atmosphere is turned into heat and finds its way back to sea level?

Unscattered infrared radiation is very good at transmitting energy because it moves at the speed of light. But the energy per unit volume stored by the thermal radiation in the Earth’s atmosphere is completely negligible compared to the internal energy of the air molecules.

Although CO2 molecules radiate very slowly, there are so many CO2 molecules that they produce lots of radiation, and some of this radiation reaches sea level. The figure following shows downwelling radiation measured at the island of Nauru in the Tropical Western Pacific Ocean, and at colder Point Barrow, Alaska, on the shore of the Arctic Ocean.

So the answer to the last part of the question, “What is the mechanism by which … heat … finds its way back to sea level?” is that the heat is radiated to the ground by molecules at various altitudes, where there is usually a range of different temperatures. The emission altitude is the height from which radiation could reach the surface without much absorption, say 50% absorption. For strongly absorbed frequencies, the radiation reaching the ground comes from low-altitude molecules, only a few meters above ground level for the 667 cm⁻¹ frequency at the center of the CO2 band. More weakly absorbed frequencies are radiated from higher altitudes, where the temperature is usually colder than that of the surface. But occasionally, as the data from Point Barrow show, higher-altitude air can be warmer than the surface.

Closely related to Question 3 is how heat from the absorption of sunlight by the surface gets back to space to avoid a steadily increasing surface temperature. As one might surmise from the figure, at Nauru there is so much absorption from CO2 and by water vapor, H2O, that most daytime heat transfer near the surface is by convection, not by radiation. Especially important is moist convection, where the water vapor in rising moist air releases its latent heat to form clouds. The clouds have a major effect on radiative heat transfer. Cooled, drier, subsiding air completes the convection circuit. Minor changes of convection and cloudiness can have a bigger effect on the surface temperature than large changes in CO2 concentrations.

Question 4: Does CO2 in the atmosphere reflect any sunlight back into space, such that the reflected sunlight never penetrates the atmosphere in the first place?

The short answer to this question is “No”, but it raises some interesting issues that we discuss below.

Molecules can either scatter or absorb radiation. CO2 molecules are good absorbers of thermal infrared radiation, but they scatter almost none. Infrared radiant energy absorbed by a CO2 molecule is converted to internal vibrational and rotational energy. This internal energy is quickly lost in collisions with the N2 and O2 molecules that make up most of the atmosphere. The collision rates, billions per second, are much too fast to allow the CO2 molecules to reradiate the absorbed energy, which takes about a second. CO2 molecules in the atmosphere do emit thermal infrared radiation continuously, but the energy is almost always provided by collisions with N2 and O2 molecules, not by previously absorbed radiation. The molecules “glow in the dark” with thermal infrared radiation.

H2O and CO2 absorption spectra

The figure shows that water vapor is by far the most important absorber. It can absorb both thermal infrared radiation from the Earth and shorter-wave radiation from the Sun. Water vapor and its condensates, clouds of liquid or solid water (ice), dominate radiative heat transfer in the Earth’s atmosphere; CO2 is of secondary importance.

If Question 4 were “Do clouds in the atmosphere reflect any sunlight back into space, such that the reflected sunlight never penetrates the atmosphere in the first place?” the answer would be “Yes”. It is common knowledge that low clouds on a sunny day shade and cool the surface of the Earth by scattering the sunlight back to space before it can be absorbed and converted to heat at the surface.

The figure shows that very little thermal radiation from the surface can reach the top of the atmosphere without absorption, even if there are no clouds. But some of the surface radiation is replaced by molecular radiation emitted by greenhouse molecules or cloud tops at sufficiently high altitudes that there are no longer enough higher-altitude greenhouse molecules or clouds to appreciably attenuate the radiation before it escapes to space. Since the replacement radiation comes from colder, higher altitudes, it is less intense and does not reject as much heat to space as the warmer surface could have without greenhouse-gas absorption.

As implied by the figure, sunlight contains some thermal infrared energy that can be absorbed by CO2. But only about 5% of sunlight has wavelengths longer than 3 micrometers, where the strongest absorption bands of CO2 are located. The Sun is so hot that most of its radiation is at visible and near-visible wavelengths, where CO2 has no absorption bands.

Question 5: Apart from CO2, what happens to the collective heat from tail pipe exhausts, engine radiators, and all other heat from combustion of fossil fuels? How, if at all, does this collective heat contribute to warming of the atmosphere?

After that energy is used for heat, mobility, and electricity, the Second Law of Thermodynamics guarantees that virtually all of it ends up as heat in the climate system, ultimately to be radiated into space along with the earth’s natural IR emissions. [A very small fraction winds up as visible light that escapes directly to space through the transparent atmosphere, but even that ultimately winds up as heat somewhere “out there.”]

How much does this anthropogenic heat affect the climate? There are local effects where energy use is concentrated, for example in cities and near power plants. But globally, the effects are very small. To see that, convert the global annual energy consumption of 13.3 Gtoe (gigatons of oil equivalent) to 5.6 × 10²⁰ joules. Dividing that by the 3.2 × 10⁷ seconds in a year gives a global power consumption of 1.75 × 10¹³ Watts. Spreading that over the earth’s surface area of 5.1 × 10¹⁴ m² results in an anthropogenic heat flux of 0.03 W/m². This is some four orders of magnitude smaller than the natural heat fluxes of the climate system, and some two orders of magnitude smaller than the anthropogenic radiative forcing.
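The arithmetic in that estimate is easy to reproduce. A minimal sketch follows; the 41.868 GJ-per-toe conversion is my assumption of the standard definition, not a figure from the submission:

```python
# Cross-check of the anthropogenic heat-flux estimate quoted above.
TOE_JOULES = 41.868e9                     # assumed: joules per tonne of oil equivalent
annual_energy_j = 13.3e9 * TOE_JOULES     # 13.3 Gtoe per year -> joules (~5.6e20)
seconds_per_year = 3.2e7
power_w = annual_energy_j / seconds_per_year   # global average power, ~1.7e13 W
earth_area_m2 = 5.1e14                    # Earth's surface area
heat_flux = power_w / earth_area_m2       # W per square meter

print(f"{heat_flux:.3f}")                 # ~0.034 W/m^2, i.e. the ~0.03 quoted
```

The result is indeed roughly 0.03 W/m², tiny next to the ~240 W/m² scale of natural solar heating.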

Question 6: In grade school many of us were taught that humans exhale CO2 but plants absorb CO2 and return oxygen to the air (keeping the carbon fiber). Is this still valid? If so why hasn’t plant life turned the higher levels of CO2 back into oxygen? Given the increase in population on earth (four billion), is human respiration a contributing factor to the buildup of CO2?

If all of the CO2 produced by current combustion of fossil fuels remained in the atmosphere, the level would increase by about 4 ppm per year, substantially more than the observed rate of around 2.5 ppm per year, as seen in the figure above. Some of the anthropogenic CO2 emissions are being sequestered on land or in the oceans.
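A rough sketch of where the ~4 ppm figure comes from; the conversion factor of about 2.13 PgC per ppm of CO2 is my assumption, not a number from the submission:

```python
# Sketch: if all anthropogenic CO2 stayed airborne, how fast would ppm rise?
PGC_PER_PPM = 2.13             # assumed: PgC in the atmosphere per ppm of CO2
emissions_pgc_per_year = 8.9   # total anthropogenic emissions (see Question 7)

ppm_rise_if_all_airborne = emissions_pgc_per_year / PGC_PER_PPM
print(round(ppm_rise_if_all_airborne, 1))   # ~4.2 ppm/yr

# Observed rise is ~2.5 ppm/yr, so the implied airborne fraction is:
airborne_fraction = 2.5 / ppm_rise_if_all_airborne
print(round(airborne_fraction, 2))          # ~0.6
```

The gap between the ~4 ppm potential rise and the ~2.5 ppm observed rise is the sequestration on land and in the oceans that the paragraph describes.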


There is evidence that primary photosynthetic productivity has increased somewhat over the past half century, perhaps due to more CO2 in the atmosphere. For example, the summer-winter swings like those in the figure above are increasing in amplitude. Other evidence for modestly increasing primary productivity includes the pronounced “greening” of the Earth that has been observed by satellites. An example is the map above, which shows a general increase in vegetation cover over the past three decades.

The primary productivity estimate mentioned above would also correspond to an increase of the oxygen fraction of the air by 50 ppm, but since the oxygen fraction of the air is very high (209,500 ppm), the relative increase would be small and hard to detect. Also much of the oxygen is used up by respiration.

The average human exhales about 1 kg of CO2 per day, so the 7 billion humans that populate the Earth today exhale about 2.5 × 10⁹ tons of CO2 per year, a little less than 1% of what is needed to support the primary productivity of photosynthesis, and only about 6% of the CO2 “pollution” resulting from the burning of fossil fuels. However, unlike fossil fuel emissions, these human (or, more generally, biological) emissions do not accumulate in the atmosphere, since the carbon in food ultimately comes from the atmosphere in the first place.
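The respiration total checks out with simple arithmetic:

```python
# Checking the quoted human-respiration total.
kg_co2_per_person_per_day = 1.0
population = 7e9
tons_per_year = kg_co2_per_person_per_day * population * 365 / 1000.0  # kg -> metric tons

print(f"{tons_per_year:.2e}")   # ~2.56e+09 tons of CO2 per year, as quoted
```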

Question 7: What are the main sources of CO2 that account for the incremental buildup of CO2 in the atmosphere?

The CO2 in the atmosphere is but one reservoir within the global carbon cycle, whose stocks and flows are illustrated by Figure 6.1 from IPCC AR5 WG1:

There is a nearly-balanced annual exchange of some 200 PgC between the atmosphere and the earth’s surface (~80 Pg land and ~120 Pg ocean); the atmospheric stock of 829 Pg therefore “turns over” in about four years.
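The roughly four-year turnover follows directly from the stock and flow numbers quoted:

```python
# Turnover time of atmospheric CO2 implied by the quoted stocks and flows.
atmospheric_stock_pgc = 829    # PgC held in the atmosphere
annual_exchange_pgc = 200      # PgC exchanged with land and ocean each year

turnover_years = atmospheric_stock_pgc / annual_exchange_pgc
print(round(turnover_years, 1))   # ~4.1 years
```

Note that this is the residence time of individual molecules; as the next paragraph says, a concentration perturbation persists far longer, because the exchange is nearly balanced in both directions.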

Human activities currently add 8.9 PgC each year to these closely coupled reservoirs (7.8 from fossil fuels and cement production, 1.1 from land use changes such as deforestation). About half of that is absorbed into the surface, while the balance (airborne fraction) accumulates in the atmosphere because of its multicentury lifetime there. Other reservoirs such as the intermediate and deep ocean are less closely coupled to the surface-atmosphere system.

Much of the natural emission of CO2 stems from the decay of organic matter on land, a process that depends strongly on temperature and moisture. And much CO2 is absorbed and released from the oceans, which are estimated to contain about 50 times as much CO2 as the atmosphere. In the oceans CO2 is stored mostly as bicarbonate (HCO₃⁻) and carbonate (CO₃²⁻) ions. Without the dissolved CO2, the mildly alkaline ocean with a pH of about 8 would be very alkaline with a pH of about 11.3 (like deadly household ammonia) because of the strong natural alkalinity.

Only once in the geological past, in the Permian period about 300 million years ago, have atmospheric CO2 levels been as low as they are now. Life flourished abundantly during the geological past when CO2 levels were five or ten times higher than those of today.

Question 8: What are the main sources of heat that account for the incremental rise in temperature on earth?

The only important primary heat source for the Earth’s surface is the Sun. But the heat can be stored in the oceans for long periods of time, even centuries. Variable ocean currents can release more or less of this stored heat episodically, leading to episodic rises (and falls) of the Earth’s surface temperature.

Incremental changes of the surface temperature anomaly can be traced back to two causes: (1) changes in the surface heating rate; (2) changes in the resistance of heat flow to space. Quasi periodic El Nino episodes are examples of the former. During an El Nino year, easterly trade winds weaken and very warm deep water, normally blown toward the coasts of Indonesia and Australia, floats to the surface and spreads eastward to replace previously cool surface waters off of South America. The average temperature anomaly can increase by 1 C or more because of the increased release of heat from the ocean. The heat source for the El Nino is solar energy that has accumulated beneath the ocean surface for several years before being released.

On average, the absorption rate of solar radiation by the Earth’s surface and atmosphere is equal to the emission rate of thermal infrared radiation to space. Much of the radiation to space does not come from the surface but from greenhouse gases and clouds in the lower atmosphere, where the temperature is usually colder than the surface temperature, as shown in the figure on the previous page. The thermal radiation originates from an “escape altitude” where there is so little absorption from the overlying atmosphere that most (say half) of the radiation can escape to space with no further absorption or scattering. Adding greenhouse gases can warm the Earth’s surface by increasing the escape altitude. To maintain the same cooling rate to space, the temperature of the entire troposphere, and of the surface, would have to increase to make the effective temperature at the new escape altitude the same as at the original escape altitude. For greenhouse warming to occur, a temperature profile that cools with increasing altitude is required.

Over most of the CO2 absorption band (between about 580 cm⁻¹ and 750 cm⁻¹) the escape altitude is in the nearly isothermal lower stratosphere shown in the first figure. The narrow spike of radiation at about 667 cm⁻¹ in the center of the CO2 band escapes from an altitude of around 40 km (upper stratosphere), where it is considerably warmer than the lower stratosphere due to heating by solar ultraviolet light, which is absorbed by ozone, O3. Only at the edges of the CO2 band (near 580 cm⁻¹ and 750 cm⁻¹) is the escape altitude in the troposphere, where it could have some effect on the surface temperature. Water vapor, H2O, has emission altitudes in the troposphere over most of its absorption bands. This is mainly because water vapor, unlike CO2, is not well mixed but mostly confined to the troposphere.


To summarize this overview, the historical and geological record suggests recent changes in the climate over the past century are within the bounds of natural variability. Human influences on the climate (largely the accumulation of CO2 from fossil fuel combustion) are a physically small (1%) effect on a complex, chaotic, multicomponent and multiscale system. Unfortunately, the data and our understanding are insufficient to usefully quantify the climate’s response to human influences. However, even as human influences have quadrupled since 1950, severe weather phenomena and sea level rise show no significant trends attributable to them. Projections of future climate and weather events rely on models demonstrably unfit for the purpose. As a result, rising levels of CO2 do not obviously pose an immediate, let alone imminent, threat to the earth’s climate.

Full text of submission is here

Climate Tutorial for Judge Alsup

H/T tomomason for noticing this document submitted to Judge Alsup’s requested tutorial

The Honorable William H. Alsup

The covering letter and the submission itself are here.  Below are excerpts of introductory and overview comments.

The Court has invited a tutorial on global warming and climate change, which is set to occur March 21, 2018. The Court also identified specific questions to be addressed at the tutorial. Pursuant to Civil L.R. 7-11, Professors William Happer, Steven E. Koonin, and Richard S. Lindzen respectfully ask the Court to accept their presentation (attached to this motion as Exhibit A) in response to the Court’s questions. The professors would be honored to participate directly in the tutorial if the Court desires.

The Court’s specified questions include topics that have been the subject of the professors’ study and analysis for decades. These men have been thought and policy leaders in the scientific community and in the administrations of two different U.S. Presidents. They have extensive research experience with the specific issues the Court identified. As such, they offer a valuable perspective on these issues. The attached presentation contains three sections: (1) an overview; (2) responses to the Court’s questions; and (3) biographies of the professors.

Overview from the Submission

Our overview of climate science is framed through four statements:

1. The climate is always changing; changes like those of the past half-century are common in the geologic record, driven by powerful natural phenomena

2. Human influences on the climate are a small (1%) perturbation to natural energy flows

3. It is not possible to tell how much of the modest recent warming can be ascribed to human influences

4. There have been no detrimental changes observed in the most salient climate variables and today’s projections of future changes are highly uncertain

We offer supporting evidence for each of these statements drawn almost exclusively from the Climate Science Special Report (CSSR) issued by the US government in November, 2017 or from the Fifth Assessment Report (AR5) issued in 2013-14 by the UN’s Intergovernmental Panel on Climate Change or from the refereed primary literature.

To summarize this overview, the historical and geological record suggests recent changes in the climate over the past century are within the bounds of natural variability. Human influences on the climate (largely the accumulation of CO2 from fossil fuel combustion) are a physically small (1%) effect on a complex, chaotic, multicomponent and multiscale system. Unfortunately, the data and our understanding are insufficient to usefully quantify the climate’s response to human influences. However, even as human influences have quadrupled since 1950, severe weather phenomena and sea level rise show no significant trends attributable to them. Projections of future climate and weather events rely on models demonstrably unfit for the purpose. As a result, rising levels of CO2 do not obviously pose an immediate, let alone imminent, threat to the earth’s climate.

The submission includes detailed responses to each of the judge’s questions and are well worth reading.

Newsflash: NH Snow Exceptionally Huge This Year

Over land the northern hemisphere Globsnow snow-water-equivalent SWE product and over sea the OSI-SAF sea-ice concentration product. Credit: Image courtesy of Finnish Meteorological Institute

This just in from Science Daily, thanks to the Finnish Meteorological Institute: Exceptionally large amount of winter snow in Northern Hemisphere this year, March 14, 2018.

Excerpts below include both factual and speculative content (with my bolds.)

The new Arctic Now product shows with one picture the extent of the area in the Northern Hemisphere currently covered by ice and snow. This kind of information, which shows the accurate state of the Arctic, becomes increasingly important due to climate change.

In the Northern Hemisphere the maximum seasonal snow cover occurs in March. “This year has been a year with an exceptionally large amount of snow, when examining the entire Northern Hemisphere. The variation from one year to another has been somewhat great, and especially in the most recent years the differences between winters have been very great,” says Kari Luojus, Senior Research Scientist at the Finnish Meteorological Institute.

The information has been gleaned from the Arctic Now service of the Finnish Meteorological Institute, which is unique even on a global scale. The greatest difference from other comparable services is that those traditionally report only the extent of the ice or snow cover.

“Here at the Finnish Meteorological Institute we have managed to combine data to form a single image. In this way we can get a better situational picture of the cryosphere — that is, the cold areas of the Northern Hemisphere,” Research Professor Jouni Pulliainen observes. In addition to the coverage, the picture includes the snow-water equivalent, which indicates how much water the snow contains. This is important information for drafting hydrological forecasts on the flood situation and in monitoring the state of the climate and environment in general.

Information on the amount of snow is also sent to the Global Cryosphere Watch service of the World Meteorological Organisation (WMO) where the information is combined with trends and statistics of past years. Lengthy series of observation times show that the total amount of snow in the Northern Hemisphere has declined in the spring period and that the melting of the snow has started earlier in the same period. Examination over a longer period (1980-2017) shows that the total amount of snow in all winter periods has decreased on average.

Also, the ice cover on the Arctic Ocean has grown thinner and the amount and expanse of perennial ice has decreased. Before 2000 the smallest expanse of sea ice varied between 6.2 and 7.9 million square kilometres. In the past ten years the expanse of ice has varied from 5.4 to 3.6 million square kilometres. Extreme weather phenomena — winters in which snowfall is sometimes quite heavy, and others with little snow — will increase in the future. (Speculation for sure.)

Here is the MASIE chart from yesterday, confirming extensive snow this year:

Rise and Fall of the Modern Warming Spike


The first graph appeared in the 1990 IPCC First Assessment Report (FAR), credited to H. H. Lamb, first director of CRU-UEA. The second graph, the famous hockey stick credited to M. Mann, was featured in the 2001 IPCC Third Assessment Report (TAR).

A previous post Rise and Fall of CAGW described the process that began with Hansen’s flashy Senate testimony in 1988, later supported by Santer’s flashy paper in 1996. This post traces a second iteration that ensued following Michael Mann’s production of the infamous Climate Hockey Stick graph in 1998. The image at the top comes from the 2001 IPCC TAR (Third Assessment Report) signifying the immediate embrace of this alarmist tool by consensus climatists.  The message of the graph was to assert a spike in modern warming unprecedented in the last 1000 years.  This claim of a “Modern Warming Spike” required a flat temperature profile throughout the Middle Ages (since 1000 AD).

Background on the process steps (image below) from Ross Pomeroy’s paper is provided, followed by text and references for the rise and fall of the theory intended to erase Medieval Warming comparable to the present day. Sources of material are listed at the end and included here with my bolds.

How Theories Advance and Collapse

Seeing how disarray defines psychology, it makes perfect sense that the field’s leading theories are vulnerable to collapse. Having watched this process play out a number of times, a clear pattern has emerged. Let’s call it the “Six Stages of a Failed Psychological or Sociological Theory.”

Stage 1: The Flashy Finding. An intriguing report is published with subject matter that lends itself to water cooler conversation, say, for example, that sticking a pen in your mouth to force a smile makes things seem funnier. Media outlets provide gushing coverage.

Stage 1 Modern Warming Spike Theory


Figure 2.20: Millennial Northern Hemisphere (NH) temperature reconstruction (blue) and instrumental data (red) from AD 1000 to 1999, adapted from Mann et al. (1999). Smoother version of NH series (black), linear trend from AD 1000 to 1850 (purple-dashed) and two standard error limits (grey shaded) are shown. Source: IPCC Third Assessment Report

Since the IPCC believes that the warming from 1975 to 1998 was mainly man-made, but not the warming in earlier centuries, it would like to be able to demonstrate that recent warming is ‘unprecedented’. But it isn’t. Temperatures in many parts of the world appear to be lower than they were in the Medieval Warm Period (MWP, c. 900-1400), and also in the earlier Roman Warm Period (c. 200 BC – 600 AD). During the MWP the Vikings tilled now-frozen farms in Greenland and were buried there in ground that is now permafrost. Hundreds of peer-reviewed articles show that the MWP was a global phenomenon (Idso & Singer, 2009, 69-94), and was not confined to parts of the northern hemisphere, as the IPCC likes to assert.

Those wanting to “get rid of” the MWP run into the problem that it shows up strongly in the data. Shortly after Deming’s article appeared, a group led by Shaopeng Huang of the University of Michigan completed a major analysis of over 6,000 borehole records from every continent around the world. Their study went back 20,000 years. The portion covering the last millennium is shown in Figure 4.


The similarity to the IPCC’s 1995 graph is obvious. The world experienced a “warm” interval in the medieval era that dwarfs 20th century changes. The present-day climate appears to be simply a recovery from the cold years of the “Little Ice Age.”

Huang and coauthors published their findings in Geophysical Research Letters in 1997. The next year, Nature published the first Mann hockey stick paper, commonly called “MBH98.” Mann et al. followed up in 1999 with a paper in GRL (“MBH99”) extending their results from AD 1400 back to AD 1000. In early 2000 the IPCC released the first draft of the TAR. The hockey stick was the only paleoclimate reconstruction shown in the Summary, and was the only one in the whole report to be singled out for repeated presentation. The borehole data received a brief mention in Chapter 2 but the Huang et al. graph was not shown. A small graph of borehole data taken from another study and based on a smaller sample was shown, but it only showed a post-1500 segment, which, conveniently, trended upwards.


Figure 2.19: Reconstructed global ground temperature estimate from borehole data over the past five centuries, relative to present day. Shaded areas represent ± two standard errors about the mean history (Pollack et al., 1998). Superimposed is a smoothed (five-year running average) of the global surface air temperature instrumental record since 1860 (Jones and Briffa, 1992). Source: IPCC Third Assessment Report WG 1

Stage 2: The Fawning Replications. Other psychologists, usually in the early stages of their careers, leap to replicate the finding. Most of their studies corroborate the effect. Those that don’t are not published, perhaps because the researchers don’t want to step on any toes, or because journal editors would prefer not to publish negative findings.

Stage 2 Modern Warming Spike Theory

As the hockey stick began to appear in the scientific literature, it emerged that 1998 was the warmest year in Phil Jones’s 150-year record of thermometer data. The length of the hockey stick blade just grew. Those in charge of publicizing the work of climate scientists and making the case for man-made climate change were understandably excited. Controversial science swiftly morphed into a propaganda tool.

The World Meteorological Organization put the hockey stick on the cover of its 1999 report on climate change. Then IPCC chiefs decided to give it pride of place in their 2001 report. Moreover, based on the hockey stick, they stated that “it is likely that the 1990s was the warmest decade and 1998 the warmest year during the past thousand years”. That attracted attention — and trouble. The doubts expressed in that paper title about “uncertainties and limitations” were melting away.


1999 WMO statement on the Climate.

An article in the Guardian (here) describes the struggle leading to victory for the Hockey Stick.

Emails exchanged in September 1999 reveal intense disagreement about whether Mann’s hockey stick should go into the IPCC summary for policymakers – the only bit of the report that usually gets read outside the scientific community – or whether other reconstructions using tree ring data alone should get priority. One of the main tree-ring constructions was by Briffa. The emails also expose major tensions between a desire for scrupulous honesty about uncertainties, and the desire for a simple story to tell the policymakers. The IPCC’s core job is to present a “consensus” on the science, but in this critical case there was no easy consensus.

The tensions were summed up in an email sent on 22 September 1999 by Met Office scientist Chris Folland, in which he alerted key researchers that a diagram of temperature change over the past thousand years “is a clear favourite for the policy makers’ summary”.

But there were two competing graphs – Mann’s hockey stick and another, by Jones, Briffa and others. Mann’s graph was clearly the more compelling image of man-made climate change. The other “dilutes the message rather significantly,” said Folland. “We want the truth. Mike [Mann] thinks it lies nearer his result.” Folland noted that “this is probably the most important issue to resolve in chapter 2 at present.”

Mann, Jones and Briffa eventually settled their differences. And the hockey stick was given pride of place in the IPCC report. Folland says: “My recollection is that the final version [of the IPCC summary], which contains the hockey stick, satisfied Keith and everyone else in the end — after the usual vigorous scientific debate.” And after the three came under attack from climate sceptics, all reference to these past spats disappeared from the emails as they faced a common foe.

Stage 3: A Consensus Forms. The finding is now taken for granted, regularly appearing in pop psychology stories and books penned by writers like Malcolm Gladwell or Jonah Lehrer. Millions of people read about it and “armchair” explain it to their friends and family.

Stage 3 Modern Warming Spike Theory

In its 2001 Third Assessment Report, the IPCC used the iconic ‘hockey stick’ graph to try and show that modern warming was indeed ‘unprecedented’. The graph was produced by Michael Mann (now at Penn State University in the US), Ray Bradley and Malcolm Hughes (MBH), and published in Nature and Geophysical Research Letters in 1998 and 1999. At that time, the standard view was that the Medieval Warm Period and subsequent Little Ice Age (c. 1400-1850) were global events. But some climatologists saw the MWP as an embarrassment and spoke of the need to ‘get rid of it’. MBH’s temperature reconstruction did exactly that: it showed 900 years of gradually declining temperatures followed by a dramatic increase in the 20th century. The hockey stick played a central role in mobilizing political and public opinion in favour of drastic action to curb greenhouse gas emissions.

Al Gore with a version of the Hockey Stick graph in the 2006 movie An Inconvenient Truth

“As soon as the IPCC Report came out, the hockey stick version of climate history became canonical. Suddenly it was the “consensus” view, and for the next few years it seemed that anyone publicly questioning the result was in for a ferocious reception.” Ross McKitrick, What is the ‘Hockey Stick’ Debate About?

Stage 4: The Rebuttal. After a few decades, a new generation of researchers look to make a splash by questioning prevailing wisdom. One team produces a more methodologically-sound study that debunks the initial finding. Media outlets blare the “counterintuitive” discovery.

Stage 4 Modern Warming Spike Theory

The hockey stick was based on historical temperature proxies (mainly tree rings), with the 20th-century instrumental temperature record tacked on the end. Incredibly, although the MBH articles were peer reviewed, nobody tried to replicate and verify the work, even though it overturned well-established views on climate history. It was only several years later that Steve McIntyre, a Canadian mathematician and retired mining consultant, began to investigate the matter. Mann did his best to obstruct him; he refused to release his computer code, saying that ‘giving them the algorithm would be giving in to the intimidation tactics that these people are engaged in’.

McIntyre, with the help of economist Ross McKitrick, went on to write several articles in 2003 and 2005, exposing the flaws in the hockey-stick reconstruction. They showed that the shape of the graph was determined mainly by suspect bristlecone/foxtail tree-ring data, and that Mann’s computer algorithm was so biased that it could produce hockey sticks even out of random noise; in short, Mann’s statistical methods ‘mined’ for hockey-stick signals in the proxy data, which were then assigned exaggerated weight in the reconstruction – thereby giving a whole new meaning to the term ‘Man(n)-made warming’!

In 2006 McIntyre & McKitrick’s criticisms were upheld by two expert committees in the US – the National Academy of Sciences (NAS) panel and a congressional panel headed by statistician Edward Wegman. Wegman pointed out that the palaeoclimate field is dominated by ‘a tightly knit group of individuals who passionately believe in their thesis’, and that ‘the work has been sufficiently politicized that they can hardly reassess their own public positions without losing credibility’.

McKitrick wrote in 2005:

Since our work has begun to appear we have enjoyed the satisfaction of knowing we are winning over the expert community, one at a time. Physicist Richard Muller of Berkeley studied our work last year and wrote an article about it:

“[The findings] hit me like a bombshell, and I suspect it is having the same effect on many others. Suddenly the hockey stick, the poster-child of the global warming community, turns out to be an artifact of poor mathematics.”

In an article in the Dutch science magazine Natuurwetenschap & Techniek, Dr. Rob van Dorland of the Dutch National Meteorological Agency commented “It is strange that the climate reconstruction of Mann passed both peer review rounds of the IPCC without anyone ever really having checked it. I think this issue will be on the agenda of the next IPCC meeting in Peking this May.”

In February 2005 the German television channel Das Erste interviewed climatologist Ulrich Cubasch, who revealed that he too had been unable to replicate the hockey stick (emphasis added):

He [Climatologist Ulrich Cubasch] discussed with his coworkers – and many of his professional colleagues – the objections, and sought to work them through… Bit by bit, it became clear also to his colleagues: the two Canadians were right. …Between 1400 and 1600, the temperature shift was considerably higher than, for example, in the previous century. With that, the core conclusion, and that also of the IPCC 2001 Report, was completely undermined.

Recently Steve McIntyre and I received an email from Dr. Hendrik Tennekes, retired director of the Royal Meteorological Institute of the Netherlands. He wrote to convey comments he wished to be communicated publicly: “The IPCC review process is fatally flawed. The behavior of Michael Mann is a disgrace to the profession.”

The original MBH graph compared to a corrected version produced by McIntyre and McKitrick after undoing Mann’s errors.

Stage 5: Proper Replications Pour In. Research groups attempt to replicate the initial research with the skepticism and precise methodology that should’ve been used in the first place. As such, the vast majority fail to find any effect.

Stage 5 Modern Warming Spike Theory

The IPCC dealt with the devastating rebuttal by hiding the hockey stick within a spaghetti graph of various paleo proxies to diffuse the issue, while still claiming unprecedented modern warming.

In the IPCC’s 2007 Fourth Assessment Report, the hockey stick was included in a ‘spaghetti diagram’ alongside six other temperature reconstructions, which showed greater variability in the past but still no pronounced MWP. These ‘independent’ studies are the work of Mann’s colleagues and make use of the same flawed proxies as well as dubious statistical techniques (Montford, 2010, 266-308). The data were carefully cherry-picked to exclude tree-ring series that showed a prominent MWP. Palaeoclimatologist Rosanne D’Arrigo actually told the NAS panel that cherry-picking was necessary if you wanted to make cherry pie (i.e. hockey sticks). And Jan Esper has stated: ‘The ability to pick and choose which samples to use is an advantage unique to dendroclimatology’ – a statement that would make any reputable scientist shudder (Montford, 2010, 236, 288-9).

Sixteen of the articles cited in AR4 failed to meet the IPCC’s own publication deadlines for cited references; all of them were written by IPCC contributing authors in support of the AGW cause. The most notable case is a paper by Eugene Wahl and Caspar Ammann. The authors of chapter 6 desperately needed this paper to counter McIntyre & McKitrick’s criticisms of the hockey stick, as the authors claimed to have validated Mann’s results. The leaked emails show that members of the Team pressured Climatic Change editor Stephen Schneider to ensure that the paper was processed quickly enough to meet IPCC deadlines, though this was not entirely successful. Wahl and Ammann referred to arguments in another unpublished paper they had written, which was not even submitted until well after the first paper had gone forward for IPCC review. Jones advised the authors to be dishonest: ‘try and change the Received date! Don’t give those skeptics something to amuse themselves with’ (1189722851). Both papers finally appeared in September 2007. The authors conceded that the hockey stick failed a key test for statistical significance, but claimed it passed another test and promised to provide details in their Supplementary Information. When this was finally made available a year later, it became clear that torturous statistical manipulations were required to enable the test to be passed (Montford, 2010, 201-19, 338-42, 424-6). The shenanigans involved in the Wahl & Ammann saga are quite breathtaking.

But the credibility of the hockey stick claims was attacked repeatedly:

Stage 6: The Theory Lives On as a Zombie. Despite being debunked, the theory lingers on in published scientific studies, popular books, outdated webpages, and common “wisdom.” Adherents in academia cling on in a state of denial – their egos depend upon it.

Stage 6 Modern Warming Spike Theory

There are still hardcore alarmist blogs that defend the hockey stick graph, but the IPCC itself has dropped it without explicitly disowning it.

About 1000 years ago, large parts of the world experienced a prominent warm phase which in many cases reached a similar temperature level as today or even exceeded present-day warmth. While this Medieval Warm Period (MWP) has been documented in numerous case studies from around the globe, climate models still fail to reproduce this historical warm phase. The problem is openly conceded in the most recent IPCC report from 2013 (AR5, Working Group 1) where in chapter 5.3.5. the IPCC scientists admit (pdf here):

“Continental-scale surface temperature reconstructions show, with high confidence, multi-decadal periods during the Medieval Climate Anomaly (950 to 1250) that were in some regions as warm as in the mid-20th century and in others as warm as in the late 20th century.”  pg.386

“The timing of warm and cold periods is mostly consistent across reconstructions (in some cases this is because they use similar proxy compilations) but the magnitude of the changes is clearly sensitive to the statistical method and to the target domain (land or land and sea; the full hemisphere or only the extra-tropics; Figure 5.7a). Even accounting for these uncertainties, almost all reconstructions agree that each 30-year (50-year) period from 1200 to 1899 was very likely colder in the NH than the 1983–2012 (1963–2012) instrumental temperature. NH reconstructions covering part or all of the first millennium suggest that some earlier 50-year periods might have been as warm as the 1963–2012 mean instrumental temperature, but the higher temperature of the last 30 years appear to be at least likely the warmest 30-year period in all reconstructions (Table 5.4). However, the confidence in this finding is lower prior to 1200, because the evidence is less reliable and there are fewer independent lines of evidence. There are fewer proxy records, thus yielding less independence among the reconstructions while making them more susceptible to errors in individual proxy records. The published uncertainty ranges do not include all sources of error (Section, and some proxy records and uncertainty estimates do not fully represent variations on time scales as short as the 30 years considered in Table 5.4. Considering these caveats, there is medium confidence that the last 30 years were likely the warmest 30-year period of the last 1400 years.” Pg.410

Meanwhile a multitude of studies confirm that medieval warming was widespread and not limited to regions in the Northern Hemisphere, as Mann and others have claimed. See for example the MWP Mapping Project led by Dr. Sebastian Luening and Prof. Dr. Fritz Vahrenholt (authors of ‘The Neglected Sun’).

red: MWP warming
blue: MWP cooling (very rare)
yellow: MWP more arid
green: MWP more humid
grey: no trend or data ambiguous

Most of western North America and Africa were experiencing drought conditions during the MWP (except some areas in Southwest Africa). In contrast, Australia and the Caribbean were more humid. Globally, 99% of all paleoclimatic temperature studies compiled in the map so far show a prominent warming during the MWP. This includes Antarctica and the Arctic.


“Regarding the Hockey Stick of IPCC 2001 evidence now indicates, in my view, that an IPCC Lead Author working with a small cohort of scientists, misrepresented the temperature record of the past 1000 years by (a) promoting his own result as the best estimate, (b) neglecting studies that contradicted his, and (c) amputating another’s result so as to eliminate conflicting data and limit any serious attempt to expose the real uncertainties of these data.” – John Christy, Examining the Process concerning Climate Change Assessments,  Testimony 31 March 2011



Presentation to the National Academy of Sciences Expert Panel, “Surface Temperature Reconstructions for the Past 1,000-2,000 Years,” Steve McIntyre and Ross McKitrick, 2006

Climategate and the Inquiries, Ken Gregory

Climategate and the Corruption of Climate Science, David Pratt

IPCC TAR and the hockey stick, Judith Curry, 2014

Global Warming Bombshell, Richard Muller, 2004

When the IPCC ‘disappeared’ the Medieval Warm Period, Frank Lansner, 2010


Today’s temperatures are cooler than the Medieval Warm Period, which was preceded by an even warmer Roman Warm Period, which followed an even warmer Minoan Warm Period. We are in an interglacial age about 11,500 years old, and the overall trend is cooling.

Figure 37. Holocene global temperature change reconstruction. a. Red curve, global average temperature reconstruction from Marcott et al., 2013, figure 1. The averaging method does not correct for proxy drop out which produces an artificially enhanced terminal spike, while the Monte Carlo smoothing eliminates most variability information. b. Black curve, global average temperature reconstruction from Marcott et al., 2013, using proxy published dates, and differencing average. Temperature anomaly was rescaled to match biological, glaciological, and marine sedimentary evidence, indicating the Holocene Climate Optimum was about 1.2°C warmer than LIA. c. Purple curve, Earth’s axis obliquity is shown to display a similar trend to Holocene temperatures. Source: Marcott et al., 2013.
Source: Judith Curry, Nature Unbound III: Holocene climate variability (Part A)

Updated: Fears and Facts about Reservoirs and GHGs


A previous post explained how methane has been hyped in support of climate alarmism/activism. Now we have an additional campaign to disparage hydropower because of methane emissions from dam reservoirs. File this under “They have no shame.” Excerpts below with my bolds.

On March 5, 2018 a study was published in Environmental Research Letters: Greenhouse gas emissions of hydropower in the Mekong River Basin can exceed those of fossil fuel energy sources.

“The hydropower related emissions started in the Mekong in mid-1960’s when the first large reservoir was built in Thailand, and the emissions increased considerably in early 2000’s when hydropower development became more intensive. Currently the emissions are estimated to be around 15 million tonnes of CO2e per year, which is more than total emissions of all sectors in Lao PDR in year 2013,” says Dr Timo Räsänen who led the study. The GHG emissions are expected to increase when more hydropower is built. However, if construction of new reservoirs is halted, the emissions will decline slowly in time.

Another recent example of the claim is from Asia Times: Global hydropower boom will add to climate change

The study, published in BioScience, looked at the carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) emitted from 267 reservoirs across six continents. In total, the reservoirs studied have a surface area of more than 77,287 square kilometers (29,841 square miles). That’s equivalent to about a quarter of the surface area of all reservoirs in the world, which together cover 305,723 sq km – roughly the combined size of the United Kingdom and Ireland.

“The new study confirms that reservoirs are major emitters of methane, a particularly aggressive greenhouse gas,” said Kate Horner, Executive Director of International Rivers, adding that hydropower dams “can no longer be considered a clean and green source of electricity.”

In fact, methane’s effect is 86 times greater than that of CO2 when considered on this two-decade timescale. Importantly, the study found that methane is responsible for 90% of the global warming impact of reservoir emissions over 20 years.

Alarmists are Wrong about Hydropower

Now CH4 is proclaimed the primary culprit held against hydropower. As usual, there is a kernel of truth buried beneath this obsessive campaign: Flooding of biomass does result in decomposition accompanied by some release of CH4 and CO2. From HydroQuebec:  Greenhouse gas emissions and reservoirs

Impoundment of hydroelectric reservoirs induces decomposition of a small fraction of the flooded biomass (forests, peatlands and other soil types) and an increase in the aquatic wildlife and vegetation in the reservoir.

The result is higher greenhouse gas (GHG) emissions after impoundment, mainly CO2 (carbon dioxide) and a small amount of CH4 (methane).

However, these emissions are temporary and peak two to four years after the reservoir is filled.

During the ensuing decade, CO2 emissions gradually diminish and return to the levels given off by neighboring lakes and rivers.

Hydropower generation, on average, emits 50 times less GHGs than a natural gas generating station and about 70 times less than a coal-fired generating station.

The Facts about Tropical Reservoirs

Activists estimate that methane emissions from dams and reservoirs across the planet, including hydropower, are significantly larger than previously thought, approximately equal to 1 gigaton per year.

Activists also claim that dams in boreal regions like Quebec are not the problem, but tropical reservoirs are a big threat to the climate. Contradicting that is an intensive study of Brazilian dams and reservoirs: Greenhouse Gas Emissions from Reservoirs: Studying the Issue in Brazil

The Itaipu Dam is a hydroelectric dam on the Paraná River located on the border between Brazil and Paraguay. The name “Itaipu” was taken from an isle that existed near the construction site. In the Guarani language, Itaipu means “the sound of a stone”. The American composer Philip Glass has also written a symphonic cantata named Itaipu, in honour of the structure.

Five Conclusions from Studying Brazilian Reservoirs

1) The budget approach is essential for a proper grasp of the processes going on in reservoirs. This approach involves taking into account the ways in which the system exchanged GHGs with the atmosphere before the reservoir was flooded. Older studies measured only the emissions of GHG from the reservoir surface or, more recently, from downstream de-gassing. But without the measurement of the inputs of carbon to the system, no conclusions can be drawn from surface measurements alone.

2) When you consider the total budgets, most reservoirs acted as sinks of carbon in the short run (our measurements covered one year in each reservoir). In other words, they received more carbon than they exported to the atmosphere and to downstream.

3) Smaller reservoirs are more efficient as carbon traps than the larger ones.

4) As for the GHG impact, in order to determine it, we should add the methane (CH4) emissions to the fraction of carbon dioxide (CO2) emissions which comes from the flooded biomass and organic carbon in the flooded (terrestrial) soil. The other CO2 emissions, arising from the respiration of aquatic organisms or from the decomposition of terrestrial detritus that flows into the reservoir (including domestic sewage), are not impacts of the reservoir. From this sum, we should deduct the amount of carbon that is stored in the sediment and which will be kept there for at least the life of the reservoir (usually more than 80 years). This “stored carbon” ranges from as little as 2 percent of the total carbon output to more than 25 percent, depending on the reservoirs.

5) When we assess the GHG impacts following the guidelines just described, all of FURNAS’s reservoirs have lower emissions than the cleanest European oil plant. The worst case – Manso, which was sampled only three years after the impoundment, and therefore in a time in which the contribution from the flooded biomass was still very significant – emitted about half as much carbon dioxide equivalents (CO2 eq) as the average oil plant from the United States (CO2 eq is a metric measure used to compare the emissions from various greenhouse gases based upon their global warming potential, GWP. CO2 eq for a gas is derived by multiplying the tons of the gas by the associated GWP.) We also observed a very good correlation between GHG emissions and the age of the reservoirs. The reservoirs older than 30 years had negligible emissions, and some of them had a net absorption of CO2eq.
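The CO2-equivalent bookkeeping described in point 5 can be sketched in a few lines. A minimal illustration, assuming the commonly cited 100-year GWP values (CH4 ≈ 28, N2O ≈ 265) and made-up reservoir tonnages; none of these numbers come from the FURNAS study itself:

```python
# CO2-equivalent arithmetic as defined in the excerpt:
# CO2 eq for a gas = tonnes of the gas multiplied by its GWP.
# GWP values below are assumed round 100-year figures; tonnages are
# hypothetical, for illustration only.

GWP_100YR = {"CO2": 1, "CH4": 28, "N2O": 265}

def co2_equivalent(emissions_tonnes):
    """Sum emissions of several gases into tonnes of CO2-equivalent."""
    return sum(tonnes * GWP_100YR[gas] for gas, tonnes in emissions_tonnes.items())

# Hypothetical annual reservoir emissions (tonnes of each gas):
example = {"CO2": 100_000, "CH4": 2_000, "N2O": 50}
print(co2_equivalent(example))  # 100000 + 2000*28 + 50*265 = 169250
```

The point of the budget approach is that such a sum is only meaningful after subtracting carbon stored in sediment and emissions the system would have produced anyway.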

Keeping Methane in Perspective

Over the last 30 years, CH4 in the atmosphere increased from 1.6 ppm to 1.8 ppm, compared to CO2, presently at 400 ppm. So all the dam building over three decades, along with all other land use, was part of a minuscule increase of a microscopic gas, roughly 200 times smaller than the trace gas CO2.
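A quick back-of-envelope check of those ratios, using only the figures quoted above (the "200 times" in the text is a round number; the exact ratio comes out slightly higher):

```python
# Assumed values from the text: CH4 rose from 1.6 to 1.8 ppm over
# roughly 30 years; CO2 is presently about 400 ppm.
ch4_then, ch4_now, co2_now = 1.6, 1.8, 400.0

increase = ch4_now - ch4_then   # 0.2 ppm over three decades
ratio = co2_now / ch4_now       # how many times smaller CH4 is than CO2
print(round(increase, 1), round(ratio))  # 0.2 222
```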


Background Facts on Methane and Climate Change

The US Senate is considering an act to repeal with prejudice an Obama anti-methane regulation. The story from activist source Climate Central is
Senate Mulls ‘Kill Switch’ for Obama Methane Rule

The U.S. Senate is expected to vote soon on whether to use the Congressional Review Act to kill an Obama administration climate regulation that cuts methane emissions from oil and gas wells on federal land. The rule was designed to reduce oil and gas wells’ contribution to climate change and to stop energy companies from wasting natural gas.

The Congressional Review Act is rarely invoked. It was used this month to reverse a regulation for the first time in 16 years and it’s a particularly lethal way to kill a regulation as it would take an act of Congress to approve a similar regulation. Federal agencies cannot propose similar regulations on their own.

The Claim Against Methane

Now some Republican senators are hesitant to take this step because of claims like this one in the article:

Methane is 86 times more potent as a greenhouse gas than carbon dioxide over a period of 20 years and is a significant contributor to climate change. It warms the climate much more than other greenhouse gases over a period of decades before eventually losing its potency. Atmospheric carbon dioxide remains a potent greenhouse gas for thousands of years.

Essentially the journalist is saying: as afraid as you are of CO2, you should be 86 times more afraid of methane. Which also means, if CO2 is not a warming problem, your fear of methane is 86 times zero. The thousands-of-years claim is also bogus, but that is beside the point of this post, which is methane.

IPCC Methane Scare

The article helpfully provides a link to Chapter 8 of the IPCC AR5 Working Group 1 report, Anthropogenic and Natural Radiative Forcing.

The document is full of sophistry and creative accounting in order to produce as scary a number as possible. Table 8.7 provides the number for CH4 potency of 86 times that of CO2.  They note they were able to increase the Global Warming Potential (GWP) of CH4 by 20% over the estimate in AR4. The increase comes from adding in more indirect effects and feedbacks, as well as from increased concentration in the atmosphere.

In the details are some qualifying notes like these:

Uncertainties related to the climate–carbon feedback are large, comparable in magnitude to the strength of the feedback for a single gas.

For CH4 GWP we estimate an uncertainty of ±30% and ±40% for 20- and 100-year time horizons, respectively (for 5 to 95% uncertainty range).
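For readers who want to see where a number like 86 comes from, here is a back-of-envelope sketch in Python using AR5's published ingredients: the per-ppb radiative efficiencies, a 12.4-year CH4 perturbation lifetime, the Bern carbon-cycle impulse response for CO2, and AR5's +65% allowance for indirect effects (tropospheric ozone and stratospheric water vapor). This is a simplification of the IPCC calculation, not a reproduction of it:

```python
import math

# AR5 WG1 Ch. 8 values (assumed here from Table 8.A.1 and the 8.SM supplement)
RE_CH4 = 3.63e-4              # radiative efficiency, W m-2 ppb-1
RE_CO2 = 1.37e-5              # radiative efficiency, W m-2 ppb-1
M_CH4, M_CO2 = 16.04, 44.01   # molar masses, to convert per-ppb to per-kg
TAU_CH4 = 12.4                # CH4 perturbation lifetime, years
INDIRECT = 1.65               # +50% tropospheric ozone, +15% stratospheric H2O

# Bern carbon-cycle impulse response for a CO2 pulse
A = (0.2173, 0.2240, 0.2824, 0.2763)
TAU_CO2 = (None, 394.4, 36.54, 4.304)   # first fraction never decays

def agwp_ch4(horizon):
    """Absolute GWP of CH4 per kg: forcing decays exponentially with lifetime."""
    decay = TAU_CH4 * (1.0 - math.exp(-horizon / TAU_CH4))
    return INDIRECT * RE_CH4 * (M_CO2 / M_CH4) * decay

def agwp_co2(horizon):
    """Absolute GWP of CO2 per kg: integral of the Bern impulse-response terms."""
    total = A[0] * horizon    # the effectively permanent fraction
    for a, tau in zip(A[1:], TAU_CO2[1:]):
        total += a * tau * (1.0 - math.exp(-horizon / tau))
    return RE_CO2 * total

for horizon in (20, 100):
    print(f"GWP-{horizon}: {agwp_ch4(horizon) / agwp_co2(horizon):.0f}")
```

This sketch reproduces AR5's roughly 84 (20-year) and 28 (100-year) values; the 86 in Table 8.7 additionally includes the climate-carbon feedbacks whose large uncertainty the quoted notes acknowledge.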

Methane Facts from the Real World
From Sea Friends (here):

Methane is natural gas CH4 which burns cleanly to carbon dioxide and water. Methane is eagerly sought after as fuel for electric power plants because of its ease of transport and because it produces the least carbon dioxide for the most power. Also cars can be powered with compressed natural gas (CNG) for short distances.

In many countries natural gas has been widely distributed as the main home heating fuel. As a consequence, methane has leaked to the atmosphere in large quantities, though leaks are now firmly controlled. Grazing animals also produce methane in their complicated stomachs, and methane escapes from rice paddies and peat bogs like the Siberian permafrost.

It is thought that methane is a very potent greenhouse gas because it absorbs some infrared wavelengths 7 times more effectively than CO2, molecule for molecule, and by weight even 20 times. As we have seen previously, this also means that within a distance of metres, its effect has saturated, and further transmission of heat occurs by convection and conduction rather than by radiation.

Note that when H2O is present in the lower troposphere, there are few photons left for CH4 to absorb:

Even if the IPCC radiative greenhouse theory were true, methane occurs only in minute quantities in air, 1.8 ppm versus 390 ppm for CO2. By weight, CH4 is only 5.24 Gt versus 3140 Gt of CO2 (on this assumption). If it truly were twenty times more potent, it would amount to an equivalent of 105 Gt CO2, or one thirtieth that of CO2. A doubling in methane would thus have no noticeable effect on world temperature.

However, the factor of 20 is entirely misleading because absorption is proportional to the number of molecules (=volume), so the factor of 7 (7.3) is correct and 20 is wrong. With this in mind, the perceived threat from methane becomes even less.

Further still, methane has risen from 1.6 ppm to 1.8 ppm in 30 years (1980-2010); assuming it has not stopped rising, this amounts to a doubling only every 2-3 centuries. In other words, methane can never have any measurable effect on temperature, even if the IPCC radiative greenhouse theory were right.

Because only a small fraction in the rise of methane in air can be attributed to farm animals, it is ludicrous to worry about this aspect or to try to farm with smaller emissions of methane, or to tax it or to trade credits.

The fact that methane in air has been leveling off in the past two decades, even though we do not know why, implies that it plays absolutely no role as a greenhouse gas.
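The arithmetic in the excerpt above is easy to check directly from its own figures (the 20x by-weight potency factor is the excerpt's disputed assumption, not an endorsed value):

```python
# Figures quoted from the Sea Friends excerpt above
ch4_mass_gt = 5.24        # Gt of CH4 in the atmosphere
co2_mass_gt = 3140.0      # Gt of CO2 in the atmosphere
potency_by_weight = 20    # the (disputed) per-kg potency factor

co2_equiv = ch4_mass_gt * potency_by_weight    # ~105 Gt CO2-equivalent
ratio = co2_mass_gt / co2_equiv                # ~30, i.e. "one thirtieth"

# Extrapolated doubling time: a 0.2 ppm rise per 30 years, starting at 1.8 ppm
rise_per_year = (1.8 - 1.6) / 30
doubling_years = 1.8 / rise_per_year           # ~270 years, i.e. 2-3 centuries

print(co2_equiv, ratio, doubling_years)
```

The "105 Gt", "one thirtieth" and "2-3 centuries" figures all check out as stated.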

More information at THE METHANE MISCONCEPTIONS by Dr Wilson Flood (UK) here


Natural Gas (75% methane) burns the cleanest with the least CO2 for the energy produced.

Leakage of methane is already addressed by efficiency improvements for its economic recovery, and will apparently be subject to even more regulations.

The atmosphere is a methane sink where the compound is oxidized through a series of reactions, producing one CO2 and two H2O molecules after a few years.

GWP (Global Warming Potential) is CO2 equivalent heat trapping based on laboratory, not real world effects.

Any IR absorption by methane is limited by H2O absorbing in the same low energy LW bands.

There is no danger this century from natural or man-made methane emissions.


Senators and the public are being bamboozled by opaque scientific bafflegab. The plain truth is much different. The atmosphere is a methane sink in which CH4 is oxidized away within a few years, and whose IR absorption saturates within a few meters. The amount of CH4 in the air is minuscule, even compared to the trace gas CO2, and it is not accelerating. Methane is the obvious choice for signaling virtue on the climate issue, since governmental actions will not make a bit of difference anyway, except perhaps to do some economic harm.

Give a daisy a break (h/t Derek here)



For a more thorough and realistic description of atmospheric warming see:

Fearless Physics from Dr. Salby

Climate Kills Wildflower! (False Alarm)

This is Androsace septentrionalis (Northern rock jasmine). Credit: Anne Marie Panetta

Breathless news out of Colorado: Climate warming causes local extinction of Rocky Mountain wildflower species. Excerpts below with my bolds.

New University of Colorado Boulder-led research has established a causal link between climate warming and the localized extinction of a common Rocky Mountain flowering plant, a result that could serve as a herald of future population declines.

The new study, which was published today in the journal Science Advances, found that warmer, drier conditions in line with future climate predictions decimated experimental populations of Androsace septentrionalis (Northern rock jasmine), a mountain wildflower found at elevations ranging from around 6,000 feet in Colorado’s foothills to over 14,000 feet at the top of Mt. Elbert.

The findings paint a bleak picture for the persistence of native flowering plants in the face of climate change and could serve as a herald for future species losses in mountain ecosystems over the next century.

Always the curious one, I went looking for context to interpret this report.  Thank goodness for the Internet; it didn’t take long to find information left out of the alarming news release.  From the US Wildflower Database (here) we can see the bigger picture.

Androsace Septentrionalis, Rock Jasmine

Androsace septentrionalis is a small-flowered and rather inconspicuous plant, and is the most common member of this genus in the West, out of six in the US. Plants are very variable in size, reflecting the wide range of habitats and elevations – from near sea level to over 11,000 feet. Stalkless leaves grow at the base, in a flat rosette, and often have a few teeth along the margins, and ciliate hairs. Leaf surfaces may be hairless or sparsely short hairy.

Common names: Rock jasmine, pygmyflower
Family: Primrose (Primulaceae)
Scientific name: Androsace septentrionalis
Main flower color: White
Range: The Rocky Mountain states, westwards to the Great Basin, and small areas of neighboring states
Height: Between 1 and 8 inches
Habitat: Grassland, forest, tundra; generally open areas, from sea level to 11,500 feet
Leaves: Basal, oblanceolate, up to 1.2 inches long and 0.4 inches across; entire or coarsely toothed edges
Season: March to September

Look at the range and habitat, note that this species is the most common of the six in its genus in the US, and ask yourself whether this plant is adaptive.

And in Minnesota (here), on the eastern edge of the range, it is rare compared to the Western Rock Jasmine (Androsace occidentalis).

If American lotus (Nelumbo lutea) is noted as Minnesota’s largest native wildflower, Western Rock Jasmine  certainly vies for its smallest. It can have very dense populations but it takes a discerning and determined eye to pick it out of the landscape, and is only of interest to those who celebrate the diversity of nature. It is easily distinguished from its rare cousin, Northern Androsace (Androsace septentrionalis) which is larger in stature and has rather narrower bracts at the base of the flower cluster.

The preferred habitat features sun, dry sandy soil, grassy meadows, open fields and disturbed ground, which, along with the "rock" in its name, suggests that these plants tolerate arid conditions.


Far from going extinct, these flowers abound and, like humans, adapt readily to their surroundings. As has been stated previously, when alarmists project large numbers of extinctions due to future climate change, always ask for the names and the dead bodies. What the headlines claim is refuted by the facts on the ground.


Rainfall Climate Paradox

A recent article displays the intersection of fears and facts that comprises the climate paradox, in this case on the issue of precipitation. Rainfall's natural variation hides climate change signal, by Kate Prestt of the Australian National University, appeared today. Excerpts with my bolds.

New research from The Australian National University (ANU) and ARC Centre of Excellence for Climate System Science suggests natural rainfall variation is so great that it could take a human lifetime for significant climate signals to appear in regional or global rainfall measures.

Even exceptional droughts like those over the Murray Darling Basin (2000-2009) and the 2011 to 2017 Californian drought fit within the natural variations in the long-term precipitation records, according to the statistical method used by the researchers.

This has significant implications for policymakers in the water resources, irrigation and agricultural industries.

“Our findings suggest that for most parts of the world, we won’t be able to recognise long term or permanent changes in annual rainfall driven by climate change until they have already occurred and persisted for some time,” said Professor Michael Roderick from the ANU Research School of Earth Sciences.

“This means those who make decisions around the construction of desalination plants or introduce new policies to conserve water resources will effectively be making these decisions blind.

“Conversely, if they wait and don’t act until the precipitation changes are recognised they will be acting too late. It puts policymakers in an invidious position.”

To get their results the researchers first tested the statistical approach on the 244-year-long observational record of precipitation at the Radcliffe Observatory in Oxford, UK. They compared rainfall changes over 30-year intervals. They found any changes over each interval were indistinguishable from random or natural variation.

They then applied the same process to California, which has a record going back to 1895, and the Murray Darling Basin from 1901-2007. In both cases the long dry periods seem to fit within expected variations.

Finally, they applied the process to reliable global records that extended from 1940-2009. Only 14 per cent of the global landmass showed, with 90 per cent confidence, increases or decreases in precipitation outside natural variation.

Professor Graham Farquhar AO also from the ANU Research School of Biology said natural variation was so large in most regions that even if climate change was affecting rainfall, it was effectively hidden in the noise.

“We know that humans have already had a measurable influence on streamflows and groundwater levels through extraction and making significant changes to the landscape,” Professor Farquhar said.

“But the natural variability of precipitation found in this paper presents policymakers with a large known unknown that has to be factored into their estimates to effectively assess our long-term water resource needs.”  The research has been published in the journal Proceedings of the National Academy of Sciences.


Much like sea level rise, scientists fearing the worst seek and hope to find a nanosignal inside noisy imprecise measurements of a naturally varying phenomenon.
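The statistical point, that differences between consecutive 30-year rainfall averages can be large by chance alone, can be illustrated with a toy Monte Carlo (my construction, not the paper's method; the station values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical station: mean 1000 mm/yr, ~20% year-to-year scatter, NO trend
mean_mm, sd_mm, trials = 1000.0, 200.0, 10_000

# Many 60-year stationary records; compare consecutive 30-year means
series = rng.normal(mean_mm, sd_mm, size=(trials, 60))
diffs = series[:, 30:].mean(axis=1) - series[:, :30].mean(axis=1)

# Spread of purely-chance "changes": sd_mm * sqrt(2/30) ~ 52 mm, ~5% of the mean
print(f"chance 30-yr change: sd = {diffs.std():.0f} mm, max = {abs(diffs).max():.0f} mm")
```

With zero real trend, a shift of several percent between consecutive 30-year means is routine, so a genuine change of that size would be invisible against the noise, which is the paper's point.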

CO2 Not Dangerous

Figure 1 depicts EPA’s endangerment chain of reasoning.

Scientists are putting forward the case against CO2 endangerment in submissions to inform EPA's reconsideration of that erroneous finding from some years ago. As noted previously, the Supreme Court had ruled that EPA has authority to regulate CO2, but left it to the agency to study and decide the endangerment question. H/T to GWPF and WUWT for providing links to the documents submitted to EPA on this topic. This post provides a synopsis with some of the key exhibits (my bolds).

The first supplement (here) addressed the first part of the scientific case, namely that fossil fuel emissions cause warming in earth’s atmosphere. The rebuttal consists of three points:

First, Research Reports failed to find that the steadily rising atmospheric CO2 concentrations have had a statistically significant impact on any of the 14 temperature data sets that were analyzed. The tropospheric and surface temperature data measurements that were analyzed were taken by many different entities using balloons, satellites, buoys and various land based techniques.

Second, new information is submitted regarding the logically invalid use of climate models in the attribution of warming to human greenhouse gas (GHG) emissions.

Third, new information is submitted relevant to the invalidation of the “Tropical Hot Spot” and the resulting implications for the three lines of evidence, a subject that was also discussed in our original Petition.

Now we have a Fifth Supplement (here) which rebuts in detail the “lines of evidence” which claim to prove man-made global warming is causing observable changes in nature.

Claim #1: Heat Waves are increasing at an alarming rate and heat kills

Summary of Rebuttal There has been no detectable long-term increase in heat waves in the United States or elsewhere in the world. Most all-time record highs here in the U.S. happened many years ago, long before mankind was using much fossil fuel. Thirty-eight states set their all-time record highs before 1960 (23 in the 1930s!). Here in the United States, the number of 100F, 95F and 90F days per year has been steadily declining since the 1930s. The Environmental Protection Agency Heat Wave Index confirms the 1930s as the hottest decade.

Claim #2: Global warming is causing more hurricanes and stronger hurricanes

Summary of Rebuttal There has been no detectable long-term trend in the number and intensity of hurricane activity globally. The activity does vary year to year and over multidecadal periods as ocean cycles, including El Nino/La Nina and multidecadal cycles in the Pacific (PDO) and Atlantic (AMO), favor some basins over others. The trend in landfalling storms in the United States has been flat to down since the 1850s. Before the active hurricane season in the United States in 2017, there had been a lull of 4324 days (almost 12 years) in major hurricane landfalls, the longest lull since the 1860s.

Claim #3: Global warming is causing more and stronger tornadoes

Summary of Rebuttal Tornadoes are failing to follow “global warming” predictions. Big tornadoes have seen a decline in frequency since the 1950s. The years 2012, 2013, 2014, 2015 and 2016 all saw below average to near record low tornado counts in the U.S. since records began in 1954. 2017 to date has rebounded only to the long-term mean. This lull followed a very active and deadly strong La Nina of 2010/11, which like the strong La Nina of 1973/74 produced record setting and very deadly outbreaks of tornadoes. Population growth and expansion outside urban areas have exposed more people to the tornadoes that once roamed through open fields.

Claim #4: Global warming is increasing the magnitude and frequency of droughts and floods.

Summary of Rebuttal Our use of fossil fuels to power our civilization is not causing droughts or floods. NOAA found there is no evidence that floods and droughts are increasing because of climate change. The number, extent or severity of these events does increase dramatically for a brief period of years at some locations from time to time, but then conditions return to more normal. This is simply the long-established constant variation of weather resulting from a confluence of natural factors.

Claim #5: Global Warming has increased U.S. Wildfires

Summary of Rebuttal  Wildfires are in the news almost every late summer and fall. The National Interagency Fire Center has recorded the number of fires and acreage affected since 1985. This data show the number of fires trending down slightly, though the acreage burned had increased before leveling off over the last 20 years. The NWS tracks the number of days where conditions are conducive to wildfires when they issue red-flag warnings. It is little changed.

Claim #6: Global warming is causing snow to disappear

Summary of Rebuttal This is one claim that has been repeated for decades even as nature showed very much the opposite trend, with unprecedented snows even in the big coastal cities. Every time they repeated the claim, it seems nature upped the ante. Alarmists have eventually evolved to crediting warming with producing greater snowfall because of increased moisture, but the snow events in recent years have usually occurred in colder winters with high snow-to-water-equivalent ratios in frigid arctic air.

Claim #7: Global warming is resulting in rising sea levels as seen in both tide gauge and satellite technology.

Summary of Rebuttal This claim is demonstrably false. It really hinges on this statement: “Tide gauges and satellites agree with the model projections.” The models project a rapid acceleration of sea level rise over the next 30 to 70 years. However, while the models may project acceleration, the tide gauges clearly do not.  All data from tide gauges in areas where land is not rising or sinking show instead a steady linear and unchanging sea level rate of rise from 4 up to 6 inches/century, with variations due to gravitational factors.

Figure 1. Modelled and observed sea-level changes, 1840-2010. The curve marked “Models” represents the IPCC’s combination of selected tide-gauge records and corrected satellite altimetry data. The curve marked “Observations” represents the observed eustatic sea level changes in the field up to 1960 according to Mörner (1973) and (in this paper) thereafter. After 1965, the two curves start to diverge, presenting two totally different views, separated by the area with the question mark. Which of these views is tenable?

Claim #8: Arctic, Antarctic and Greenland ice loss is accelerating due to global warming

Summary of Rebuttal Satellite and surface temperature records and sea surface temperatures show that both the East Antarctic Ice Sheet and the West Antarctic Ice Sheet are cooling, not warming, and glacial ice is increasing, not melting. Satellite and surface temperature measurements of the southern polar area show no warming over the past 37 years. Growth of the Antarctic ice sheets means sea level rise is not being caused by melting of polar ice and, in fact, is slightly lowering the rate of rise. Satellite Antarctic temperature records show 0.02C/decade cooling since 1979. The Southern Ocean around Antarctica has been getting sharply colder since 2006. Antarctic sea ice is increasing, reaching all-time highs. Surface temperatures at 13 stations show the Antarctic Peninsula has been sharply cooling since 2000.

Claim #9: Rising atmospheric CO2 concentrations are causing ocean acidification, which is catastrophically harming marine life

Summary of Rebuttal As the air’s CO2 content rises in response to ever-increasing anthropogenic CO2 emissions, more and more carbon dioxide is expected to dissolve into the surface waters of the world’s oceans, which dissolution is projected to cause a 0.3 to 0.7 pH unit decline in the planet’s oceanic waters by the year 2300.

The ocean chemistry aspect of the ocean acidification hypothesis is rather straightforward, but it is not as solid as it is often claimed to be. For one thing, the work of a number of respected scientists suggests that the drop in oceanic pH will not be nearly as great as the IPCC and others predict. And, as with all phenomena involving living organisms, the introduction of life into the analysis greatly complicates things. When a number of interrelated biological phenomena are considered, it becomes much more difficult, if not impossible, to draw such sweeping negative conclusions about the reaction of marine organisms to ocean acidification. Quite to the contrary, when life is considered, ocean acidification is often found to be a non-problem, or even a benefit. And in this regard, numerous scientific studies have demonstrated the robustness of multiple marine plant and animal species to ocean acidification—when they are properly performed under realistic experimental conditions.

Graph showing a typical oceanic situation. Over a 60-day period, pH fluctuations are far greater than the claimed global shifts toward 7 (neutral) or below (acidic).

Claim #10: Carbon pollution is a health hazard

Summary of Rebuttal The term “carbon pollution” is a deliberate, ambiguous, disingenuous term, designed to mislead people into thinking carbon dioxide is pollution. It is used by the environmentalists to confuse the environmental impacts of CO2 emissions with the impact of the emissions of unwanted waste products of combustion. The burning of carbon-based fuels (fossil fuels – coal, oil, natural gas – and biofuels and biomass) converts the carbon in the fuels to carbon dioxide (CO2), an odorless, invisible gas that is plant food and essential to life on the planet.

VOC refers to “volatile organic compounds” meaning any compound of carbon produced from burning fuels, excluding carbon monoxide and carbon dioxide.

The linked documents above provide more details on EPA’s “secret science”, as well as posts on this blog addressing many of these topics.





Sea Level Hype

It seems that alarmists get their exercise mainly by jumping to conclusions. Using datasets as trampolines they make great leaps of faith, oftentimes turning reality upside down in the process.

Update Feb. 17 at bottom

The latest example is the mass media excitement and exaggerations concerning sea level rise. Just consider the listing from Google News Feb. 13:

Miami could be underwater in your kid’s lifetime as sea level rise accelerates
USA Today

Yes, sea level rise really is accelerating
Ars Technica

Study: Sea level rise is accelerating and its rate could double in next century
Chicago Tribune

“It’s a big deal”: Melting ice sheets are accelerating sea level rise
CBS News Feb 13, 201

Satellites: Sea level rise to reach 2 feet by 2100
Minnesota Public Radio News (blog)

Satellite observations show sea levels rising, and climate change is accelerating it

The sea is coming for us
The Outline

Etc., etc., etc.

Although the principal author supplied the juicy sound bites so craved by unreflective journalists, the actual paper is quite restrained in its claims. After all, it looks at only 25 years of a very noisy dataset that contains a quasi-60-year oscillation. The paper is:

Climate-change–driven accelerated sea-level rise detected in the altimeter era By R. S. Nerem et al.


Using a 25-y time series of precision satellite altimeter data from TOPEX/Poseidon, Jason-1, Jason-2, and Jason-3, we estimate the climate-change–driven acceleration of global mean sea level over the last 25 y to be 0.084 ± 0.025 mm/y2. Coupled with the average climate-change–driven rate of sea level rise over these same 25 y of 2.9 mm/y, simple extrapolation of the quadratic implies global mean sea level could rise 65 ± 12 cm by 2100 compared with 2005, roughly in agreement with the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5) model projections.
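The 65 cm headline figure is just arithmetic on the two numbers quoted in the abstract, extrapolated over the 95 years from 2005 to 2100:

```python
# Values quoted from the Nerem et al. abstract above
rate = 2.9            # mm/yr, climate-change-driven rate over the altimeter era
accel = 0.084         # mm/yr^2, estimated acceleration
years = 2100 - 2005   # extrapolation horizon

# Simple quadratic extrapolation the abstract describes: s(t) = v*t + (1/2)*a*t^2
rise_cm = (rate * years + 0.5 * accel * years ** 2) / 10.0
print(f"extrapolated rise by 2100: {rise_cm:.0f} cm")
```

Note that more than half of the extrapolated 65 cm comes from the acceleration term, which is precisely the quantity the skeptical commentary below questions.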

Dr. John Ray provides a skeptical commentary, writing from Brisbane, Australia, at his blog (here) with my bolds.

Dedicated Warmist Seth Borenstein sets out a coherent story about warming causing sea-level rise. He regurgitates all the usual Warmist talking points regardless of their truth. He says, for instance, that the Antarctic is melting when it is not.

So we have to go back to the journal article behind Seth’s splurge to see what the scientists are saying.

And what we see there is very different from Seth’s confident pronouncements. We see a very guarded article indeed, which rightly lists many of the difficulties in measuring sea level rise. And they can surmount those difficulties only by a welter of estimates and adjustments. Anywhere in that process there could be errors and biases. And as a result, we see that the journal authors describe their findings as only a “preliminary estimate” of sea level rise.

And it gets worse. When we look further into the journal article we see that the sea level rise is measured in terms of only 84 thousandths of one millimeter. So we are in the comedy of the absurd. Such a figure is just a statistical artifact with no observable physical equivalent.

So the sea level rise Seth talks about with great confidence ends up being an unbelievably small quantity measured with great imprecision! Amazing what you find when you look at the numbers, isn’t it?

Many advances in science start with a leap of imagination.  I seem to remember a chemist who woke up one morning with the first correct diagram of benzene.  And a man I admired said before sleeping he brought to mind things that were puzzling him.  Often in the morning he found answers combing out his hair.  Of course any such notions must then be validated through experimentation and measurement to become scientific knowledge.  A leap of faith is another matter altogether.

Sea Level Measurement Contortions

What’s involved in estimating sea level by means of satellites? Albert Parker is a seasoned researcher and explains to us laymen in this interview, followed by links to his recent publications: Senior Researcher Questions Satellite Measurements of Global Sea-Level, by Ernest Dempsey, with my bolds.

With a lot of rhetoric about the claimed sea-level rise and threat of global warming due to carbon emissions from human activities, the actual science of sea-level measurements and scientific inquiry of the verifiable degree of climate change has been lost in the noise. The following correspondence with Albert Parker, PhD, author of the 2014 paper Problems and reliability of the satellite altimeter based Global Mean Sea Level computation casts light on how reliable the various sea-level measurements are and whether the actual, on-ground science verifies the narrative of carbon-based climate change and alarming sea-level rise.

Ernest: Albert, thanks for taking my call for this Q&A. Would you please tell us about your academic and research background briefly?

Albert Parker: I received my MSc and PhD in Engineering many years ago, before the age of the commercial universities. Since my PhD I have worked for 30 years in companies and universities. I started to work on climate change as an independent scientist, for my personal understanding, after the leaked Climategate emails in 2009, as I was curious to see what was really going on in the raw data.

Ernest: Can you please tell our readers the various methods scientists have used to measure the mean sea level at any point?

Albert Parker: Relative sea levels have been locally measured by tidal gauges for many years. A tidal gauge signal is characterized by oscillations on many different time scales. The tidal gauge signal is monthly averaged. A linear fitting of the monthly average values collected over a sufficiently long time window returns the trend. As the tide gauge instrument can move up and down, these sea levels are relative to the instrument.

The absolute global sea level is a hypothetical measure of the status of the ocean waters. Somebody has produced global mean sea level reconstructions from tide gauges going back to the 1700s or 1800s. These reconstructions are not reliable. Before the end of the 1800s, there was, for example, not a single tide gauge covering the entire southern hemisphere. To compute a proper global mean sea level from tide gauges, you would need many gridded tide gauges along the world coastline, and a measure of their absolute vertical motion, both based on a sufficiently long common time window. No such thing exists yet. As the trends vary significantly from one location to another, it only makes sense to focus on the average acceleration rather than on the global mean sea level trend.
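The tide-gauge workflow Parker describes, monthly averaging followed by a linear fit over a long window, can be sketched with synthetic data (the trend, seasonal amplitude and noise level here are illustrative assumptions, not real station values):

```python
import numpy as np

rng = np.random.default_rng(1)

# 60 years of synthetic monthly-mean relative sea level, in mm
t = np.arange(720) / 12.0                      # time in years
true_trend = 2.0                               # mm/yr, assumed for this sketch
level = (true_trend * t
         + 100.0 * np.sin(2 * np.pi * t)       # annual cycle
         + rng.normal(0.0, 50.0, t.size))      # monthly-mean scatter

# A linear fit over the long window recovers the underlying trend
slope, intercept = np.polyfit(t, level, 1)
print(f"fitted trend: {slope:.2f} mm/yr")
```

Over a long window the seasonal cycle and scatter average out and the fit recovers the assumed 2 mm/yr closely; over a short window they dominate, which is why Parker keeps insisting on records of "sufficient quality and length".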

Ernest: In your 2014 paper, you inform that tide gauge measurements of mean sea level show negligibly small annual rise in mean sea level while satellite measurements give us a notably larger rise in sea level globally. Which of these two would you call more reliable and why?

Albert Parker: The only results to consider are local and global average trends and accelerations from tide gauges of sufficient quality and length. If a global mean sea level from tide gauges can hardly be computed, you may still look at the individual tide gauges of enough length and quality to understand if there is acceleration or not. And so far, there has been very little acceleration in any tide gauge record over the 20th century and what has passed of the 21st century. Therefore, coastal management can be local, with adaptation measures needed where the sea level rises significantly because of extreme subsidence, and certainly not where sea levels are rising slowly or are falling.

Regarding the satellite global mean sea level, this result is more a computation than a true measurement, and it is not reliable. If you try to track by global positioning system (GPS) the position of selected fixed points, such as a few GPS domes on land, and you try to compute the GPS time series to derive a GPS velocity, you may discover that this much simpler computation, also constrained by the geodetic dimensions, still suffers significant uncertainties because of satellite drift and other technicalities. It is therefore impossible to measure with nanometric precision the instantaneous height of all the water volume and then derive a time rate of change. The only thing that you can get from the satellite altimeter measurements is an almost detrended, noisy signal, as was clear in the first results of the project. If subjective corrections are then applied to this signal, for whatever reason, what you get as the satellite altimeter result is not a measurement but a computation, which, lacking validation, has very little value.

Ernest: Tell us about calibration and its role in sea level readings.

Albert Parker: It is not just a problem of calibration. You are trying to measure with a satellite altimeter the instantaneous, absolute height, with accuracy down to the nanometer, of a continuously oscillating mass of water bounded by an irregular, continuously moving surface. With the much more established and reliable GPS system, which serves many more goals than the monitoring of a climate change parameter, it is hard to compute the time rate of change of the position of fixed GPS domes with accuracy better than a couple of millimeters per year. The global mean sea level results of the satellite altimeter are unfortunately never-validated computations, certainly not accurate measurements.

Ernest: The observable change in sea level can be due to increase in amount of water in the oceans or upward tectonic movement of the seafloor, right? Is there any way to tell how much rise resulted from either?

Albert Parker: The situation is a little bit more complicated. If you look at the relative sea level trends across the world, they rise and fall because of changing water conditions and land movements. Along the Pacific coast of the US, for example, in Alaska the sea levels are generally falling because the land is moving up (uplift). Conversely, in California the sea levels are rising mostly because the land is moving down (subsidence). Local factors produce significant differences between the rates of sea level rise (trends).

Changes in tide levels over time evidenced in Fiji.

Getting an accurate measure of the sea level rise from thermal expansion and mass addition out of tide gauges is not easy. What we can see from the individual tide gauges is that the contribution from mass addition and thermal expansion has been about constant since the start of the 20th century. Since the year 1900, the warming of the oceans and the melting of the ice on land has therefore provided an almost constant contribution to the rate of rise of sea levels. At the same time, anthropogenic carbon dioxide emissions have increased exponentially. This alone would be enough to conclude that anthropogenic carbon dioxide emissions have very little to no influence on the rate of rise of sea levels.

Ernest: Then there is the question of periodicity. As far as I get it from your paper, it is more scientific, or at least more reasonable, to look at sea level change over a window of at least 60 years. Why is that?

Albert Parker: Sea levels are well known to oscillate with many periodicities, up to a quasi-60-year cycle that shows up in almost all the world’s tide gauges. If you study a tide gauge record and want to compute a trend by linear fitting, you need data collected over a time window long enough to distinguish a multidecadal natural oscillation from a sea level acceleration produced by intensifying mass addition and thermal expansion. It is unfortunately common to find people who cherry-pick the short-term positive oscillation in selected locations and sell the result as proof that global warming is real.

Obviously, the cherry pickers do not pick cherries in areas of opposite short-term oscillation, where the same approach could prove global cooling equally real. Similarly, they do not consider that in the long-record locations, positive and negative phases of the oscillations have regularly followed each other over time, and “unprecedented” short-term sea level rises were already measured about 60 years ago.
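The window-length point can be illustrated with a toy calculation. The sketch below, on purely synthetic numbers (a 1 mm/yr trend plus an invented 50 mm, 60-year oscillation, not any real tide gauge record), shows how a linear fit over a short window riding the rising phase of the oscillation returns a trend several times the true one, while a fit over the full record does not:

```python
import numpy as np

# Synthetic annual-mean sea level (mm): a steady 1.0 mm/yr trend plus a
# quasi-60-year oscillation of 50 mm amplitude. Purely illustrative numbers.
years = np.arange(120)                      # 120 years = two full cycles
true_trend = 1.0                            # mm/yr
signal = true_trend * years + 50.0 * np.cos(2 * np.pi * years / 60.0)

# Linear fit over the whole 120-year record: the oscillation averages out
# and the fitted slope lands close to the true 1.0 mm/yr trend.
long_slope = np.polyfit(years, signal, 1)[0]

# Linear fit over a cherry-picked 21-year window (years 35-55) centred on
# the rising phase of the oscillation: the slope is inflated several-fold.
win = (years >= 35) & (years <= 55)
short_slope = np.polyfit(years[win], signal[win], 1)[0]

print(f"true trend:   {true_trend:.2f} mm/yr")
print(f"120-yr fit:   {long_slope:.2f} mm/yr")   # close to 1.0
print(f"21-yr fit:    {short_slope:.2f} mm/yr")  # well above the true trend
```

Picking the falling phase instead would just as easily manufacture an apparent sea level decline, which is the cherry-picking symmetry described above.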

Ernest: Since you pointed out the shortcomings in sea level measurements by satellite altimetry and GPS, has the environmental science community responded to your work?

Albert Parker: The shortcomings of satellite altimetry for computing sea levels are very well known. Most of the independent scientists, unfortunately mostly retired, acknowledge that there is something not quite right going on in the satellite altimeter result. Nils-Axel Morner and many others have written wonderful papers questioning the sea level claims. The problem is the dependent scientists, working in a commercialized academy, and even more than them, the general press and the politicians, who have a clear interest in making people believe that global warming is real and that they need more administration and control and more taxes.

Ernest: Speaking of the press, we hear a lot in the media about new research finding links between anthropogenic carbon in the atmosphere and sea level rise. And some have claimed disastrous consequences from this supposedly impending sea level threat. What’s your response when you read those stories?

Albert Parker: In the recent scientific paper, reference [1], which of course will not receive any attention from the alarmists, we discuss how different experimental data sets of tide gauges show relatively small sea level trends, from +0.4 to +2 millimeters per year, and negligibly small sea level accelerations, just a few micrometers per year squared. These results demonstrate that sea levels have not been driven by anthropogenic carbon dioxide emissions over the last 120 years, and it is very unlikely they will start being driven by magic right now. These trends and accelerations translate into forecasts to the year 2100 of 100-200 mm of sea level rise, certainly not the 850 mm of the IPCC, nor the 1,670 or 3,050 mm of works such as references [2] and [3].
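The extrapolation behind such figures is simple kinematics. The sketch below assumes a 2017 baseline and the constant-acceleration form s = v·Δt + ½·a·Δt²; the pairing of trends with accelerations is my illustrative choice, not the exact procedure of [1]:

```python
# Constant-acceleration extrapolation of sea level rise to 2100.
# trend in mm/yr, acceleration in mm/yr^2; baseline year is an assumption.
def rise_by_2100(trend_mm_yr, accel_mm_yr2, baseline=2017):
    dt = 2100 - baseline                         # years of extrapolation
    return trend_mm_yr * dt + 0.5 * accel_mm_yr2 * dt ** 2

low = rise_by_2100(0.4, 0.0)        # slowest quoted trend, no acceleration
high = rise_by_2100(2.0, 0.005)     # fastest trend plus 5 um/yr^2
print(f"2100 rise: {low:.0f} to {high:.0f} mm")
```

Even the upper pairing stays below 200 mm, an order of magnitude under the multi-meter projections of [2] and [3].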

The figures below compare sea level measurements against sea level computations over the time window 1970 to 2017, and evidence-based forecasts to the year 2100 against the model predictions. The difference between the latest models and reality is increasing rather than decreasing. It should be the opposite. Many may certainly claim new links between anthropogenic carbon dioxide in the atmosphere and sea level rise, with disastrous consequences of this supposedly impending sea level threat. That does not mean they are correct.

Fig. 1 – Comparison of sea level rises predicted by the local panels [2] (BOS-NRC) and [3] (H++), predicted by the IPCC AR5 RCP8.5 (IPCC RCP8.5), and measured by the tide gauges (averages of different data sets, California-8, PSMSL-301, Mitrovica-23, Holgate-9, NOAA-199, US-71). Further details in [1].

From these graphs, we already know that up to 2017 the models have been wrong, and more, rather than less, sea level rise by 2100 than the already exaggerated IPCC predictions is increasingly unlikely.

Fig. 2 – Comparison of sea level rises by 2100 predicted by the local panels [2] (BOS-NRC) and [3] (H++), predicted by the IPCC AR5 RCP8.5 (IPCC RCP8.5), and inferred from tide gauge measurements of different data sets (California-8, PSMSL-301, Mitrovica-23, Holgate-9, NOAA-199, US-71). Further details in [1].

[1] Parker, A. & Ollier, C.D., California sea level rise: evidence based forecasts vs. model predictions, Ocean & Coastal Management, in press, corrected proof, available online 19 July 2017. doi: 10.1016/j.ocecoaman.2017.07.008

More Resources:

Sea Level Rise: Just the Facts

Cutting Edge Sea Level Data

Fear Not For Fiji

Footnote:  Climate alarmists may be jumping the shark as well as jumping to conclusions.
“Jumping the shark” is attempting to draw attention to or create publicity for something that is perceived as not warranting the attention, especially something that is believed to be past its peak in quality or relevance. The phrase originated with the TV series “Happy Days” when an episode had Fonzie doing a water ski jump over a shark. The stunt was intended to perk up the ratings, but it marked the show’s low point ahead of its demise.

Update Feb. 17

Prompted by a question from hunter, I found this informative recent letter on this topic (my bolds):

From: Reply from Nils-Axel Mörner on the problems of estimating future sea level changes, as asked by Albert Parker in a letter of January 2, 2018

There are physical frames to consider. Ice melting requires time and heating, strictly bounded by physical laws. At the largest climatic jump in the last 20,000 years – viz. at the Pleistocene/Holocene boundary about 11,000 years BP – ice melted under extreme temperature forcing; still, sea level rose at a rate of only about 10 mm/yr (or just a little more if one considers more extreme eustatic reconstructions). Today, under interglacial climatic conditions with all the glacial ice caps gone, climate forcing can only raise global sea level by a fraction of the 11,000 BP rate, which in comparison with the values of Garner et al. [1] would imply:
well below 0.4 m at 2050 instead of +0.6 m
well below 0.9 m at 2100 instead of +2.6 m
well below 2.9 m at 2300 instead of +17.5 m

Consequently, the values given by Garner et al. [1] violate physical laws and common glaciological knowledge. Therefore, their values must not be set as standard in coastal planning (point 2 above).

The mean sea level rise over the last 125 years is +0.81 ±0.18 mm/yr. At Stockholm in Sweden, the absolute uplift over the last 3,000 years is strictly measured at +4.9 mm/yr. The mean tide-gauge change is -3.8 mm/yr, giving a eustatic component of +1.1 mm/yr for the last 150 years. In Amsterdam, the long-term subsidence is known to be +0.4 mm/yr. The Amsterdam/Ijmuiden stations record a relative rise of +1.5 mm/yr, which gives a eustatic component of +1.1 mm/yr.
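The bookkeeping in that paragraph reduces to one relation: the eustatic (water) component equals the relative tide-gauge trend plus the vertical land motion, with uplift counted positive and subsidence negative. A minimal sketch using only the numbers quoted in the letter:

```python
# Eustatic component = relative sea level trend + vertical land motion,
# all in mm/yr; land motion is positive for uplift, negative for subsidence.
def eustatic(relative_trend_mm_yr, land_motion_mm_yr):
    return relative_trend_mm_yr + land_motion_mm_yr

stockholm = eustatic(-3.8, +4.9)   # falling gauge offset by post-glacial uplift
amsterdam = eustatic(+1.5, -0.4)   # rising gauge corrected for slow subsidence
print(round(stockholm, 1), round(amsterdam, 1))   # both come out at 1.1
```

That two such different settings yield the same +1.1 mm/yr is the letter's point: once land motion is removed, the water signal is small and consistent.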

Global Loading Adjustment has been widely used in order to estimate global sea level changes. Obviously, the globe must adjust its rate of rotation and geoid relief in close agreement with the glacial eustatic rise in sea level after the last Ice Age. The possible internal glacial loading adjustment is much more complicated, and even questionable, however.

Direct coastal analysis of morphology, stratigraphy, biological criteria, coastal dynamics, etc. usually offers by far the best means of recording the ongoing sea level variations in a correct and meaningful way. It calls for hard work in the field and deep knowledge of a number of subjects. We have, very successfully, applied it in the Maldives, in Bangladesh, in Goa in southern India, and now also in the Fiji Islands. In all these sites, direct coastal analyses indicate full eustatic stability over the last 50-70 years, and long-term variations over the last 500 years that are consistent with “rotational eustasy” or “Global Solar Cycle Oscillations” (GSCO).