Climate Kills Wildflower! (False Alarm)

This is Androsace septentrionalis (Northern rock jasmine). Credit: Anne Marie Panetta

Breathless news out of Colorado: “Climate warming causes local extinction of Rocky Mountain wildflower species.” Excerpts below with my bolds.

New University of Colorado Boulder-led research has established a causal link between climate warming and the localized extinction of a common Rocky Mountain flowering plant, a result that could serve as a herald of future population declines.

The new study, which was published today in the journal Science Advances, found that warmer, drier conditions in line with future climate predictions decimated experimental populations of Androsace septentrionalis (Northern rock jasmine), a mountain wildflower found at elevations ranging from around 6,000 feet in Colorado’s foothills to over 14,000 feet at the top of Mt. Elbert.

The findings paint a bleak picture for the persistence of native flowering plants in the face of climate change and could serve as a herald for future species losses in mountain ecosystems over the next century.

Always the curious one, I went looking for context to interpret this report.  Thank goodness for the Internet; it didn’t take long to find information left out of the alarming news release.  From the US Wildflower Database (here) we can see the bigger picture.

Androsace Septentrionalis, Rock Jasmine

Androsace septentrionalis is a small-flowered and rather inconspicuous plant, and is the most common member of this genus in the West, out of six in the US. Plants are very variable in size, reflecting the wide range of habitats and elevations – from near sea level to over 11,000 feet. Stalkless leaves grow at the base, in a flat rosette, and often have a few teeth along the margins, and ciliate hairs. Leaf surfaces may be hairless or sparsely short hairy.

Common names: Rock jasmine, pygmyflower
Family: Primrose (Primulaceae)
Scientific name: Androsace septentrionalis
Main flower color: White
Range: The Rocky Mountain states, westwards to the Great Basin, and small areas of neighboring states
Height: Between 1 and 8 inches
Habitat: Grassland, forest, tundra; generally open areas, from sea level to 11,500 feet
Leaves: Basal, oblanceolate, up to 1.2 inches long and 0.4 inches across; entire or coarsely toothed edges
Season: March to September

Look at the range and habitat, note that this species is the most common of the six in its genus, and ask yourself whether this plant is adaptive.

And in Minnesota (here), on the eastern edge of the range, it is rare compared to the Western Rock Jasmine (Androsace occidentalis).

If American lotus (Nelumbo lutea) is noted as Minnesota’s largest native wildflower, Western Rock Jasmine certainly vies for its smallest. It can have very dense populations, but it takes a discerning and determined eye to pick it out of the landscape, and it is only of interest to those who celebrate the diversity of nature. It is easily distinguished from its rare cousin, Northern Androsace (Androsace septentrionalis), which is larger in stature and has rather narrower bracts at the base of the flower cluster.

The preferred habitat features sun, dry sandy soil, grassy meadows, open fields, and disturbed soil, which along with “rock” in the name suggests that these plants tolerate arid conditions.

Summary

Far from going extinct, these flowers abound and, like humans, adapt readily to their surroundings. As has been stated previously, when alarmists project large numbers of extinctions due to future climate change, always ask for the names and the dead bodies. What the headlines claim is refuted by the facts on the ground.

 

Rainfall Climate Paradox

A recent article displays the intersection of fears and facts comprising the climate paradox, in this case the issue of precipitation. “Rainfall’s natural variation hides climate change signal” appeared today at phys.org, by Kate Prestt of the Australian National University. Excerpts with my bolds.

New research from The Australian National University (ANU) and ARC Centre of Excellence for Climate System Science suggests natural rainfall variation is so great that it could take a human lifetime for significant climate signals to appear in regional or global rainfall measures.

Even exceptional droughts like those over the Murray Darling Basin (2000-2009) and the 2011 to 2017 Californian drought fit within the natural variations in the long-term precipitation records, according to the statistical method used by the researchers.

This has significant implications for policymakers in the water resources, irrigation and agricultural industries.

“Our findings suggest that for most parts of the world, we won’t be able to recognise long term or permanent changes in annual rainfall driven by climate change until they have already occurred and persisted for some time,” said Professor Michael Roderick from the ANU Research School of Earth Sciences.

“This means those who make decisions around the construction of desalination plants or introduce new policies to conserve water resources will effectively be making these decisions blind.

“Conversely, if they wait and don’t act until the precipitation changes are recognised they will be acting too late. It puts policymakers in an invidious position.”

To get their results the researchers first tested the statistical approach on the 244-year-long observational record of precipitation at the Radcliffe Observatory in Oxford, UK. They compared rainfall changes over 30-year-intervals. They found any changes over each interval were indistinguishable from random or natural variation.
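The interval-comparison test described above can be sketched in Python. This is my own illustration with synthetic data and a simple shuffle-based null distribution, not the authors' actual method or the Radcliffe record:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a long annual precipitation record (mm/year);
# the real study used the 244-year Radcliffe Observatory series.
years = 244
rainfall = rng.normal(loc=650, scale=110, size=years)

def interval_change(series, start, width=30):
    """Change between two consecutive 30-year mean rainfalls."""
    first = series[start:start + width].mean()
    second = series[start + width:start + 2 * width].mean()
    return second - first

observed = interval_change(rainfall, start=0)

# Null distribution: shuffle the years and recompute the same statistic,
# i.e. "what changes would pure natural variation produce?"
null = np.array([
    interval_change(rng.permutation(rainfall), start=0)
    for _ in range(5000)
])

# How often does random variation produce a change at least as large
# as the one observed? A large p-value means the observed change is
# indistinguishable from natural variation.
p_value = np.mean(np.abs(null) >= abs(observed))
print(f"observed 30-yr change: {observed:.1f} mm, p = {p_value:.2f}")
```

A high p-value here is the sketch's analogue of the paper's finding that interval-to-interval changes fit within natural variation.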

They then applied the same process to California, which has a record going back to 1895, and the Murray Darling Basin from 1901-2007. In both cases the long dry periods seem to fit within expected variations.

Finally, they applied the process to reliable global records that extended from 1940-2009. Only 14 per cent of the global landmass showed, with 90 per cent confidence, increases or decreases in precipitation outside natural variation.

Professor Graham Farquhar AO also from the ANU Research School of Biology said natural variation was so large in most regions that even if climate change was affecting rainfall, it was effectively hidden in the noise.

“We know that humans have already had a measurable influence on streamflows and groundwater levels through extraction and making significant changes to the landscape,” Professor Farquhar said.

“But the natural variability of precipitation found in this paper presents policymakers with a large known unknown that has to be factored into their estimates to effectively assess our long-term water resource needs.”  The research has been published in the journal Proceedings of the National Academy of Sciences.

Summary

Much like sea level rise, scientists fearing the worst seek and hope to find a nanosignal inside noisy imprecise measurements of a naturally varying phenomenon.

CO2 Not Dangerous


Figure 1 depicts EPA’s endangerment chain of reasoning.

Scientists are putting forward the case against CO2 endangerment by making submissions to inform EPA’s reconsideration of that erroneous finding some years ago. As noted previously, the Supreme Court had ruled that EPA has authority to regulate CO2, but left it to the agency to study and decide the endangerment. H/T to GWPF and WUWT for providing links to the documents submitted to EPA on this topic. This post provides a synopsis with some of the key exhibits (my bolds).

The first supplement (here) addressed the first part of the scientific case, namely that fossil fuel emissions cause warming in earth’s atmosphere. The rebuttal consists of three points:

First, Research Reports failed to find that the steadily rising atmospheric CO2 concentrations have had a statistically significant impact on any of the 14 temperature data sets that were analyzed. The tropospheric and surface temperature data measurements that were analyzed were taken by many different entities using balloons, satellites, buoys and various land based techniques.

Second, new information is submitted regarding the logically invalid use of climate models in the attribution of warming to human greenhouse gas (GHG) emissions.

Third, new information is submitted relevant to the invalidation of the “Tropical Hot Spot” and the resulting implications for the three lines of evidence, a subject that was also discussed in our original Petition.

Now we have a Fifth Supplement (here) which rebuts in detail the “lines of evidence” which claim to prove man-made global warming is causing observable changes in nature.

Claim #1: Heat Waves are increasing at an alarming rate and heat kills

Summary of Rebuttal There has been no detectable long-term increase in heat waves in the United States or elsewhere in the world. Most all-time record highs here in the U.S. happened many years ago, long before mankind was using much fossil fuel. Thirty-eight states set their all-time record highs before 1960 (23 in the 1930s!). Here in the United States, the number of 100F, 95F and 90F days per year has been steadily declining since the 1930s. The Environmental Protection Agency Heat Wave Index confirms the 1930s as the hottest decade.

Claim #2: Global warming is causing more hurricanes and stronger hurricanes

Summary of Rebuttal There has been no detectable long-term trend in the number and intensity of hurricane activity globally. The activity does vary year to year and over multidecadal periods as ocean cycles, including El Nino/La Nina and multidecadal cycles in the Pacific (PDO) and Atlantic (AMO), favor some basins over others. The trend in landfalling storms in the United States has been flat to down since the 1850s. Before the active hurricane season in the United States in 2017, there had been a lull of 4324 days (almost 12 years) in major hurricane landfalls, the longest lull since the 1860s.

Claim #3: Global warming is causing more and stronger tornadoes

Summary of Rebuttal Tornadoes are failing to follow “global warming” predictions. Big tornadoes have seen a decline in frequency since the 1950s. The years 2012, 2013, 2014, 2015 and 2016 all saw below average to near record low tornado counts in the U.S. since records began in 1954. 2017 to date has rebounded only to the long-term mean. This lull followed a very active and deadly strong La Nina of 2010/11, which like the strong La Nina of 1973/74 produced record setting and very deadly outbreaks of tornadoes. Population growth and expansion outside urban areas have exposed more people to the tornadoes that once roamed through open fields.

Claim #4: Global warming is increasing the magnitude and frequency of droughts and floods.

Summary of Rebuttal Our use of fossil fuels to power our civilization is not causing droughts or floods. NOAA found there is no evidence that floods and droughts are increasing because of climate change. The number, extent or severity of these events does increase dramatically for a brief period of years at some locations from time to time, but then conditions return to normal. This is simply the long-established constant variation of weather resulting from a confluence of natural factors.

Claim #5: Global Warming has increased U.S. Wildfires

Summary of Rebuttal Wildfires are in the news almost every late summer and fall. The National Interagency Fire Center has recorded the number of fires and acreage affected since 1985. These data show the number of fires trending down slightly, though the acreage burned had increased before leveling off over the last 20 years. The NWS tracks the number of days where conditions are conducive to wildfires when they issue red-flag warnings. It is little changed.

Claim #6: Global warming is causing snow to disappear

Summary of Rebuttal This is one claim that has been repeated for decades even as nature showed very much the opposite trend, with unprecedented snows even in the big coastal cities. Every time they repeated the claim, it seems nature upped the ante more. Alarmists have eventually evolved to crediting warming with producing greater snowfall because of increased moisture, but the snow events in recent years have usually occurred in colder winters with high snow-water-equivalent ratios in frigid arctic air.

Claim #7: Global warming is resulting in rising sea levels as seen in both tide gauge and satellite technology.

Summary of Rebuttal This claim is demonstrably false. It really hinges on this statement: “Tide gauges and satellites agree with the model projections.” The models project a rapid acceleration of sea level rise over the next 30 to 70 years. However, while the models may project acceleration, the tide gauges clearly do not.  All data from tide gauges in areas where land is not rising or sinking show instead a steady linear and unchanging sea level rate of rise from 4 up to 6 inches/century, with variations due to gravitational factors.

Figure 1. Modelled and observed sea-level changes, 1840-2010. The curve marked “Models” represents the IPCC’s combination of selected tide-gauge records and corrected satellite altimetry data. The curve marked “Observations” represents the observed eustatic sea level changes in the field up to 1960 according to Mörner (1973) and (in this paper) thereafter. After 1965, the two curves start to diverge, presenting two totally different views, separated by the area with the question mark. Which of these views is tenable?

Claim #8: Arctic, Antarctic and Greenland ice loss is accelerating due to global warming

Summary of Rebuttal Satellite and surface temperature records and sea surface temperatures show that both the East Antarctic Ice Sheet and the West Antarctic Ice Sheet are cooling, not warming and glacial ice is increasing, not melting. Satellite and surface temperature measurements of the southern polar area show no warming over the past 37 years. Growth of the Antarctic ice sheets means sea level rise is not being caused by melting of polar ice and, in fact, is slightly lowering the rate of rise. Satellite Antarctic temperature records show 0.02C/decade cooling since 1979. The Southern Ocean around Antarctica has been getting sharply colder since 2006. Antarctic sea ice is increasing, reaching all-time highs. Surface temperatures at 13 stations show the Antarctic Peninsula has been sharply cooling since 2000.

Claim #9: Rising atmospheric CO2 concentrations are causing ocean acidification, which is catastrophically harming marine life

Summary of Rebuttal As the air’s CO2 content rises in response to ever-increasing anthropogenic CO2 emissions, more and more carbon dioxide is expected to dissolve into the surface waters of the world’s oceans, which dissolution is projected to cause a 0.3 to 0.7 pH unit decline in the planet’s oceanic waters by the year 2300.

The ocean chemistry aspect of the ocean acidification hypothesis is rather straightforward, but it is not as solid as it is often claimed to be. For one thing, the work of a number of respected scientists suggests that the drop in oceanic pH will not be nearly as great as the IPCC and others predict. And, as with all phenomena involving living organisms, the introduction of life into the analysis greatly complicates things. When a number of interrelated biological phenomena are considered, it becomes much more difficult, if not impossible, to draw such sweeping negative conclusions about the reaction of marine organisms to ocean acidification. Quite to the contrary, when life is considered, ocean acidification is often found to be a non-problem, or even a benefit. And in this regard, numerous scientific studies have demonstrated the robustness of multiple marine plant and animal species to ocean acidification—when they are properly performed under realistic experimental conditions.

Graph showing a typical oceanic situation. Over a 60 day period, pH fluxes are far greater than claims of global shifts toward 7 (neutral) or lower (acidity).

Claim #10: Carbon pollution is a health hazard

Summary of Rebuttal The term “carbon pollution” is a deliberate, ambiguous, disingenuous term, designed to mislead people into thinking carbon dioxide is pollution. It is used by environmentalists to confuse the environmental impacts of CO2 emissions with the impact of emitting the unwanted waste products of combustion. The burning of carbon-based fuels (fossil fuels – coal, oil, natural gas – and biofuels and biomass) converts the carbon in the fuels to carbon dioxide (CO2), an odorless, invisible gas that is plant food and essential to life on the planet.

VOC refers to “volatile organic compounds” meaning any compound of carbon produced from burning fuels, excluding carbon monoxide and carbon dioxide.

The linked documents above provide more details on EPA’s “secret science”, as well as posts on this blog addressing many of these topics.

Fossil Fuels ≠ Global Warming Updated

Previous posts addressed the claim that fossil fuels are driving global warming. This post updates that analysis with the latest (2016) numbers from BP Statistics and compares World Fossil Fuel Consumption (WFFC) with three estimates of Global Mean Temperature (GMT). More on both these variables below.

WFFC

2016 statistics are now available from BP for international consumption of Primary Energy sources in its Statistical Review of World Energy. 2017 numbers should be available this summer.

The reporting categories are:
Oil
Natural Gas
Coal
Nuclear
Hydro
Renewables (other than hydro)

This analysis combines the first three (Oil, Gas, and Coal) into total fossil fuel consumption worldwide. The chart below shows the patterns for WFFC compared to world consumption of Primary Energy from 1965 through 2016.

WFFC 2016 BP

The graph shows that Primary Energy consumption has grown continuously for 5 decades. Over that period oil, gas and coal (sometimes termed “Thermal”) averaged 90% of PE consumed, ranging from 94% in 1965 to 86% in 2016.  MToe is millions of tons of oil equivalents.

Global Mean Temperatures

Everyone acknowledges that GMT is a fiction since temperature is an intrinsic property of objects, and varies dramatically over time and over the surface of the earth. No place on earth determines “average” temperature for the globe. Yet for the purpose of detecting change in temperature, major climate data sets estimate GMT and report anomalies from it.

The UAH record consists of satellite-era global temperature estimates for the lower troposphere, a layer of air from 0 to 4km above the surface. HadSST estimates sea surface temperatures from oceans covering 71% of the planet. HADCRUT combines HadSST estimates with records from land stations whose elevations range up to 6km above sea level.

Both GISS LOTI (land and ocean) and HADCRUT4 (land and ocean) use 14.0 Celsius as the climate normal, so I will add that number back into the anomalies. I claim no validity for this other than to give a reasonable sense of the magnitude of the observed fluctuations.

No doubt global sea surface temperatures are typically higher than 14C, more like 17 or 18C, and of course warmer in the tropics and colder at higher latitudes. Likewise, the lapse rate in the atmosphere means that air temperatures both from satellites and elevated land stations will range colder than 14C. Still, that climate normal is a generally accepted indicator of GMT.
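The conversion itself is just addition; a toy sketch with made-up anomaly values (not actual HADCRUT4 or GISS numbers):

```python
# Convert temperature anomalies (deg C) to rough absolute magnitudes by
# adding back the 14.0 C climate normal used by GISS LOTI and HADCRUT4.
CLIMATE_NORMAL_C = 14.0

# Illustrative anomaly values only, not real dataset numbers.
anomalies = [-0.1, 0.0, 0.25, 0.6]
absolute = [CLIMATE_NORMAL_C + a for a in anomalies]
print(absolute)  # [13.9, 14.0, 14.25, 14.6]
```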

Correlations of GMT and WFFC

The next graph compares WFFC to GMT estimates over the five decades from 1965 to 2016 from HADCRUT4, which includes HadSST3.

WFFC HadGMT 2016

Over the last five decades the increase in fossil fuel consumption is dramatic and monotonic, steadily increasing by 223% from 3.5B to 11.4B oil-equivalent tons. Meanwhile the GMT record from HADCRUT shows multiple ups and downs with an accumulated rise of 0.9C over 51 years, 7% of the starting value.

The second graph compares to GMT estimates from UAH6, and HadSST3 for the satellite era from 1979 to 2016, a period of 37 years.

WFFC HadSST UAH 2016

In the satellite era WFFC has increased at a compounded rate of nearly 2% per year, for a total increase of 84% since 1979. At the same time, SST warming amounted to 0.55C, or 3.9% of the starting value. UAH warming was 0.72C, or 5.5% up from 1979. The temperature compounded rate of change is 0.1% per year, an order of magnitude less. Even more obvious is the 1998 El Nino peak and flat GMT since.
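Readers can verify the compounded rates with the standard formula (end/start)^(1/years) - 1, using the percentage totals quoted above (the totals themselves come from the BP and HadSST/UAH series discussed in this post):

```python
def cagr(growth_ratio, years):
    """Compound annual growth rate from a total growth ratio over `years`."""
    return growth_ratio ** (1.0 / years) - 1.0

years = 2016 - 1979  # satellite era, 37 years

# WFFC grew 84% in total: ratio 1.84 -> roughly 1.7%/yr ("nearly 2%").
fuel_rate = cagr(1.84, years)

# SST rose 0.55C on a ~14C baseline, i.e. 3.9% in total -> roughly 0.1%/yr.
sst_rate = cagr(1.039, years)

print(f"WFFC: {fuel_rate:.2%}/yr, SST: {sst_rate:.2%}/yr")
# The fuel-consumption rate is an order of magnitude larger than the
# temperature rate, which is the comparison made in the text.
```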

Summary

The climate alarmist/activist claim is straightforward: Burning fossil fuels makes measured temperatures warmer. The Paris Accord further asserts that by reducing human use of fossil fuels, further warming can be prevented. Those claims do not bear up under scrutiny.

It is enough for simple minds to see that two time series are both rising and to think that one must be causing the other. But both scientific and legal methods assert causation only when the two variables are both strongly and consistently aligned. The above shows a weak and inconsistent linkage between WFFC and GMT.

Going further back in history shows even weaker correlation between fossil fuels consumption and global temperature estimates:

wfc-vs-sat

Figure 5.1. Comparative dynamics of the World Fuel Consumption (WFC) and Global Surface Air Temperature Anomaly (ΔT), 1861-2000. The thin dashed line represents annual ΔT, the bold line—its 13-year smoothing, and the line constructed from rectangles—WFC (in millions of tons of nominal fuel) (Klyashtorin and Lyubushin, 2003). Source: Frolov et al. 2009

In legal terms, as long as there is another equally or more likely explanation for the set of facts, the claimed causation is unproven. The more likely explanation is that global temperatures vary due to oceanic and solar cycles. The proof is clearly and thoroughly set forward in the post Quantifying Natural Climate Change.

Background context for today’s post is at Claim: Fossil Fuels Cause Global Warming.

The cosmoclimatology theory

An article at GWPF provides a concise description linking solar activity to earth’s climate. It pulls together several strands of observations and thought presented in recent posts, which are referenced at the end.

The GWPF article (here) is from Deepak Lal and focuses on why India should follow the US out of the Paris accord, but I am more interested in the scientific rationale. The author nicely summarizes an alternative explanation for climate fluctuations to that of IPCC “consensus” scientists. Excerpts below with my bolds.

Propounded by Danish physicist Henrik Svensmark and his associates, the cosmoclimatology theory states that climate is controlled by low cloud cover, which when widespread has a cooling effect by reflecting solar energy back into space and vice versa. These low clouds, in turn, are formed when sub-atomic particles called cosmic rays, emitted by exploding stars, combine with water vapour rising from the oceans.

The constant bombardment of the planet by cosmic rays is modulated by the solar wind, which when it is blowing prevents cosmic rays from reaching the earth and creating low clouds. The solar wind in turn is caused by the varying sunspot activity of the sun.

When, as recently, sunspot activity decreases we get the global ‘cooling’ observed during the recent ‘pause’ in global warming. Furthermore, as noted by the Princeton physicist William Happer (see my column “Clouds of Climate Change”, September 2011), the millennial ‘ice core’ records of the correlation between CO2 and temperature show “that changes in temperature preceded changes in CO2 levels, so that CO2 levels were an effect of temperature changes.

Much of this was probably due to outgassing of CO2 from the warming oceans or the reverse in cooling” (“The truth about greenhouse gasses”). For the oceans are the primary sinks as well as emitters of CO2. Given their vastness relative to the earth’s surface, it takes a long time for the ocean to warm from rises in terrestrial temperatures (and vice versa), hence the lag between temperature and CO2 levels.

The CLOUD experiment is studying whether cosmic rays play a role in cloud formation.Maximilien Brice / CERN

The missing piece in the cosmoclimatology theory was the physical link between cosmic rays and cloud formation. The basic hypothesis, that “ions [cosmic rays] are fundamental for the nucleation of aerosols [tiny liquid or solid particles that provide a nucleus around which droplets can form from water vapour in the air]”, was first confirmed by the CLOUD experiment at CERN, the particle physics laboratory, in 2011. (See Kirkby et al., Nature (2011) 476, 429-433: “Cloud formation may be linked to cosmic rays: Experiment probes connection between climate change and radiation bombarding the atmosphere.”)

But there was still a problem with the hypothesis. It was that, even if as the CLOUD experiment showed ions helped aerosols to form and become stable against evaporation — a process called nucleation — these small aerosols “need to grow nearly a million times in mass in order to have an effect on cloud formation.”

The latest research by Svensmark and his associates (reported in H. Svensmark et al., “Increased ionisation supports growth of aerosols into cloud condensation nuclei,” Nature Communications 2017;8(1)) shows “both theoretically, empirically and experimentally, how interactions between ions and aerosols can accelerate the growth by adding material to the small aerosols and thereby help them survive to become cloud condensation nuclei” (David Whitehouse: “Cosmic Rays Climate Link Found”). This implies, Prof Svensmark argues, that the effect of the sun on climate could be “5-7 times stronger than that estimated due to changes in the radiant output of the sun alone.”

It also explains why, over geological time, there have been much larger variations in climate correlated with changes in cosmic rays. He adds that “it also negates the idea that carbon dioxide has been controlling the climate on these timescales.” Thus, the Medieval Warm Period around 1000 AD and the subsequent Little Ice Age between 1300 AD and 1900 AD fit with changes in solar activity.

It also explains climate change observed over the 20th century. Similarly, coolings and warmings of around 2 degrees Celsius have occurred repeatedly over the last 10,000 years with variations in the Sun’s activity and cosmic ray influx, while over longer time periods there are much larger variations, of up to 10 degrees Celsius, as “the Sun and Earth travel through the Galaxy visiting regions with varying numbers of exploding stars”. Svensmark concludes that “finally we have the last piece of the puzzle explaining how particles from space affect climate on Earth. It gives an understanding of how changes caused by solar activity or by supernova activity can change climate.”

Surely with this confirmation of the cosmoclimatology theory a Nobel Prize in physics for Svensmark and his associates cannot be far off, and with it the end of the hubristic theory of anthropogenic CO2-generated climate change.

Last word to Svensmark, from his December 2017 publication:

The missing link between exploding stars, clouds, and climate on Earth: Breakthrough in understanding of how cosmic rays from supernovae can influence Earth’s cloud cover and thereby climate

Summary: The study reveals how atmospheric ions, produced by the energetic cosmic rays raining down through the atmosphere, help the growth and formation of cloud condensation nuclei — the seeds necessary for forming clouds in the atmosphere.


Cosmic rays interacting with the Earth’s atmosphere produce ions that help turn small aerosols into cloud condensation nuclei — seeds on which liquid water droplets form to make clouds. A proton with an energy of 100 GeV interacts at the top of the atmosphere and produces a cascade of secondary particles that ionize molecules as they travel through the air. One 100 GeV proton hits every square metre at the top of the atmosphere every second.

The hypothesis in a nutshell

  • Cosmic rays, high-energy particles raining down from exploded stars, knock electrons out of air molecules. This produces ions, that is, positive and negative molecules in the atmosphere.
  • The ions help aerosols — clusters of mainly sulphuric acid and water molecules — to form and become stable against evaporation. This process is called nucleation. The small aerosols need to grow nearly a million times in mass in order to have an effect on clouds.
  • The second role of ions is that they accelerate the growth of the small aerosols into cloud condensation nuclei — seeds on which liquid water droplets form to make clouds. The more ions the more aerosols become cloud condensation nuclei. It is this second property of ions which is the new result published in Nature Communications.
  • Low clouds made with liquid water droplets cool the Earth’s surface.
  • Variations in the Sun’s magnetic activity alter the influx of cosmic rays to the Earth.
  • When the Sun is lazy, magnetically speaking, there are more cosmic rays and more low clouds, and the world is cooler.
  • When the Sun is active fewer cosmic rays reach the Earth and, with fewer low clouds, the world warms up.

Figure 2: It is crucial to look closely at the baseline, which in 2009 actually touched zero for months on end. This is not normal for the low point of the cycle. Figure 3 shows how Cycle 24 was feeble compared with recent cycles, and it looks like it will have a duration of ~10 years (2009-2019), which is at the low end of the normal range of 9 to 14 years, with a mean of 11 years. Chart adapted from SIDC, dated 1 January 2018.

Additional Resources:

Nature’s Sunscreen

Magnetic Pole Swapping and Cooling

Autumnal Climate Change

Civil Climate Discourse

The issue of global warming/climate change has been used to polarize populations for political leverage. People like myself who are skeptical of alarmist claims find it difficult to engage with others whose minds are made up with or without a factual basis. In a recent email Alex Epstein gives some good advice on how to talk about energy and climate. At the end I provide links to other material from Alex supporting his principal message regarding human benefits from using fossil fuels. Text below is his email with my bolds.

Two simple-but-powerful tactics

1. Opinion Stories

Unless I have some specific reason for wanting a long conversation, I like to keep my conversations short, with the end goal of getting the other person to consume some high-impact resource.

One way to make this even more effective is to offer to email/mail the person a resource. Then you’ll have their contact info and can follow up in a few weeks.

The last paragraph of your message is really important. You’re telling the story of how you came to your opinion. I call this device “the opinion story.”

Here’s how it works.

Imagine you’re trying to persuade someone to read your favorite book. My favorite book is Atlas Shrugged, by Ayn Rand.

I used to say: “Atlas Shrugged is the best book you’ll ever read. You have to read it.”

That’s an opinion statement. If you haven’t read the book I’ll bet that statement makes you resistant. “Oh really? You’re telling me what the best book I’ll ever read is? You’re telling me what I have to read?”

Opinion statements often breed resistance and reflexive counter-arguments. So now I try to persuade people differently.

I might say: “My favorite book is Atlas Shrugged by Ayn Rand. I read it when I was 18 and the way the characters thought and approached life motivated me to pursue a career I love and give it everything I have.”

How do you react to that statement?

Probably better. You’re probably not resistant. You may well be intrigued. And you can’t disagree with me–because I didn’t tell you what to think, I told you my opinion story. I respected your independence.

While statements breed resistance and counter-argument, stories often breed interest and requests for more.

You can use opinion stories for anything, no matter how controversial.

For example, if someone asks me about my book, The Moral Case for Fossil Fuels, I don’t need to say “I prove that we should be using more fossil fuels, not less.” I can just say “I researched the pros and cons of different forms of energy and was surprised to come to the conclusion that we should be using more fossil fuels, not less.”

I like to have an opinion story for every controversial opinion I hold.

2. Introducing Surprising Facts

Reader Comment: “The problem I always run into is that they really believe Germany is a success.”

I’ve had the same experience, too! On many issues.

Often in conversation the phenomenon of conflicting factual claims on an issue—such as the impact of solar and wind on Germany’s economy—leads to an impasse.

One way to deal with this is to focus on establishing an explicit framework, with human flourishing (not minimum impact) as the goal and full context analysis (not bias and sloppiness) as the process. Most disputes stem from conflicting frameworks, not conflicting facts. And if you offer a compelling framework you’ll be more trustworthy on the facts.

That said, here’s a tactic I discovered a few years ago to make certain factual points much more persuasive in the moment.

I’ll start with how I discovered it.

I was walking through the Irvine Spectrum mall with a good friend when we ran into two young women working to promote Greenpeace.

My friend found one of the women attractive and said he wanted to talk to her. I thought, given my experiences with (paid) Greenpeace activists, that this was unlikely to be an edifying experience, and encouraged him to instead record a conversation between me and one of the women. Unfortunately for posterity, I was unpersuasive and what follows was never recorded.

I decided to talk to the other Greenpeace woman. She quickly started “educating” me on how Germany was successfully running on solar and wind.

Me: “Really? I’m curious where you’re getting that because I research energy for a living–and Germany is actually building a lot of new coal plants right now.”

Greenpeace: “No, that can’t be true.”

Me: “Okay, how about this? I’ll email you a news article about Germany building new coal plants. If I do, will you reconsider your position?” [Note: This is an example of the technique I recommended above.]

Greenpeace: Hesitates.

Me: “Actually, wait, we have smartphones. I’m going to Google Germany and coal. Let’s see what comes up.”

Displaying on my iPhone is a recent news story whose headline is something very close to: “Germany to build 12 new coal plants, government announces.”

Me: “So what do you think?”

Greenpeace: “I don’t know,” followed by—very rare for a Greenpeace activist—having nothing to say.

Had this been a normal person I am confident the live confirmation of the surprising fact would have made a lasting impression.

I think this tactic works best for news stories about surprising facts, rather than for issues of analysis, like what Germany’s GDP is.

Summary

Alex Epstein is among those who use public information sources to compare societies that use carbon fuels extensively with those that do not. The contrast is remarkable: societies with fossil fuels have citizens who are healthier, live longer, have higher standards of living, and enjoy cleaner air and drinking water, to boot. Not only do healthier, more mobile people create social wealth and prosperity, but carbon-based energy is also heavily taxed by every society that uses it. At least some of those added government revenues go into the social welfare of the citizenry. By almost any measure, carbon-based energy makes the difference between developed and underdeveloped populations.

A previous post Social Benefits of Carbon referenced facts and figures from Alex’s book which can be accessed here

Other Resources:
Two Page Overview of The Moral Case for Fossil Fuels — What it is and why it matters 
main points are:
How to think about our energy future
Fossil fuels & human flourishing: the benefits
Fossil fuels & human flourishing: environmental concerns

11 page Introduction to The Moral Case for Fossil Fuels

Maslow’s hierarchy of human needs updated.

How’s Your CCIQ?

 

H/T David Wojick and CFACT

Doctors for Disaster Preparedness are concerned with being ready for real disasters and not being distracted by irrational fears like global warming/climate change. They have provided a useful resource for people to test and deepen their knowledge of an issue distorted for many by loads of misinformation and exaggeration.

From David Wojick:

A new lesson set called the Climate Change IQ (CCIQ) provides a good skeptical critique of ten top alarmist claims. The format is succinct and non-technical. Each alarmist claim is posed as a question, followed by a short skeptical answer, which is highlighted with a single telling graphic.

Then there is a link to a somewhat longer answer, which in turn includes links to a few online sources of more information. Each lesson is also available in a printable PDF version, suitable for classroom use. This compact format is potentially very useful.

CCIQ comes from a long-standing skeptical group called the Doctors for Disaster Preparedness (DDP). Despite the name, DDP gives attention to pointing out scares that are not disasters waiting to happen. Not surprisingly climate alarmism gets a lot of this attention.

The ten topic questions are wide ranging, including the following. Each speaks to a popular pro-alarmist news hook.

Is climate change the most urgent global health threat?

Are government-sponsored climate scientists the only credible sources of information relating to climate-change policy?

Is the increase in atmospheric CO2 making wildfires worse?

Why can’t all States emulate California’s proposed “clean” energy standards?

What would happen if atmospheric CO2 concentration dropped by half, say to less than 200 ppm?

Are human CO2 emissions acidifying the oceans and endangering shell-making animals?

Will Manhattan and Florida soon be under water if humans do not curtail use of “fossil fuels”?

Do 97% of climate scientists agree that catastrophic climate change will result if humans do not curtail use of “fossil fuels”? (This one includes the dynamite John Christy graph showing the rapidly growing divergence of climate model global temperature forecasts with real world observations.)

Is Arctic ice disappearing?

And the number 1 CCIQ question: Would lowering atmospheric CO2 prevent or mitigate hurricanes?

Check it out. Inquiring minds want to know.

 

Critical Climate Intelligence for Jurists (and others)

 

Recently I saw an activist website complaining that jurists were going to seminars led by staff at Antonin Scalia Law School, George Mason University. I wondered what might be on offer different from the alarmist materials of the Union of Concerned Scientists, Natural Resources Defense Council, Greenpeace, World Wildlife Fund, and so on. So I went looking to see what was upsetting to the climate faithful, and found some unexpected resources for climate realists, including those serving on the bench.

The Scalia Law School at George Mason University has a long-standing Mason Judicial Education Program providing continuing education for jurists. The linked website provides this description:

For over four decades, the LEC’s Judicial Education Program has helped train the nation’s judges and justices in basic economics, accounting, statistics, regulatory analysis, and other related disciplines. The Program offers intellectually rigorous, balanced, and timely education programs to the nation’s judges and justices in the belief that the fundamental principles of a free and just society depend on a knowledgeable and well educated judiciary. To date, over 5,000 federal and state judges from all 50 states and the District of Columbia, including three current U.S. Supreme Court Justices, have participated in at least one of the LEC’s judicial education programs. As one JEP participant has put it: the courses have “made us better at our work and improved the administration of justice.”

From time to time there are seminars where jurists discuss cases indicative of newer tendencies in litigation. The school publishes reports of these gatherings as well as studies and articles by legal scholars in its Journal of Law, Economics and Policy. This post relies on excerpts from several essays linked below.

The Basics of Climate Law

Readings in the Journal show that climate legalities are part of environmental law, which is an aspect of the Common Law dimension called Property Rights, in particular a tort called Public nuisance. As described by legal scholar Richard O. Faulk:

Public nuisance consists of a few elements, and they’re not very complicated. First of all, a public right must be involved—a right common to the general public that they have a legal right to enjoy. Second, there must be a substantial interference with that right that causes some sort of damage, or threatens to cause some sort of damage. Two remedies are available in public nuisance litigation. The first is an equitable remedy known as abatement, where a court can, upon finding a public nuisance, order the defendant to stop or to change its activities. The court can also order the defendant to remediate the problems caused by it. Under some circumstances, damages may be awarded. Costs of remediation and other compensatory awards may be available.

Woman on a ducking stool. Historical punishment for ‘common scold’ – woman considered a public nuisance. (Welsh/English heritage)

Let’s look at a couple of examples. I live in Houston. Let’s say that during Hurricane Ike a tree fell from my property and crashed into my neighbor’s house and damaged his roof. Under those circumstances, no public right is involved. Under those circumstances, it’s simply a private dispute between landowners. It may be private nuisance that he has a tree in his living room, but it’s a matter between us as private land owners, and a public nuisance does not arise.

But let’s say that the tree falls the other way and it blocks the street in front of my house. Under those circumstances, the public has a clear right to go down that road, to navigate it, to deal with whatever errands it needs to run. Since the fallen tree invades a right that’s common to everyone, it’s a public nuisance. The remedy is to order me to remove the tree.

Now, that’s a simple illustration. Let’s look at global warming. Let’s say, for example, that several utility companies in the Northeast burn coal in their plants. Those plants, through their smokestacks, release greenhouse gases—all kinds of things like carbon dioxide, methane, various other things as a result of the combustion of the coal. Let’s say that, for the purposes of argument, science has established that those types of emissions cause or contribute to cause global warming, which is a deleterious thing to human beings.

I don’t think anyone would doubt that the air we breathe is a common resource. So, there probably is a public right involved in these circumstances. But there are other issues. One of them is whether the emissions of these particular defendants are, in fact, substantially contributing or causing the climate change.

Generally in tort cases involving public nuisance, there is a term, which we all know from negligence cases and other torts, called proximate causation. In proximate causation, there is a “but for” test: but for the defendant’s activity, would the injury have happened? Can we say that climate change would not have happened if these power plants, these isolated five power plants, were not emitting greenhouse gases? If they completely stopped, would we still have global warming? If you shut them down completely and have them completely dismantled, would we still have global warming? Is it really their emissions that are causing this, or is it the other billions and billions of things on the planet that caused global warming—such as volcanoes? Such as gases being naturally released through earth actions, through off-gassing?

Is it the refinery down in Texas instead? Is it the elephant on the grasses in Africa? Is it my cows on my ranch in Texas who emit methane every day from their digestive systems? How can we characterize the public utilities’ actions as “but for” causes or “substantial contributions?” So far, the courts haven’t even reached these issues on the merits.

As Faulk says, the courts have not yet considered climate cases on their merits due to preemptive issues, such as standing, damages and liability. And there are additional hurdles before courts can rule on climate change.

What about the separation of constitutional powers?

Joseph F. Speelman:

The idea expressed by the Natural Resources Defense Council and other NGOs was that they didn’t like democracy because it didn’t get them what they wanted, and so they were going to use the courts to run the American political process—a fundamentally anti-democratic philosophy that they have consistently maintained—and, might I add, relatively successfully.

The theory of public nuisance the way it’s being utilized now, is simply lawlessness—nothing less than pure lawlessness. This is (stated by) the man that wrote the language on public nuisance, but never intended it to be used the way it is being used. It’s being used to try to change society. Is that the law’s job? Or is the law’s job to try to set standards so that people like me can advise clients on how they can obey the law? It’s so much fun to say, “Do the right thing.” What is the right thing?

What about limits to liability?

About every two or three years, the people that brought you asbestos and tobacco come up with new ideas; predatory ideas. They’re designed to separate money from people that have it and take it somewhere else. So it is about the money. Now, try to explain that to your client. Put yourself in my posture. The only way I can safely advise a client to avoid liability in this environment is to not make, buy, sell, or insure anything—but that would make it really hard for us to do what the administration asks business to do, which is to hire people.

The circuit court deemed the theory of public nuisance, as it’s being utilized, standardless liability—no standards. No ability for Joe Speelman to advise his client on how to do the right thing to avoid liability. There are no standards. I can’t tell you how to do it. And if I can’t do that, then the entire process by which business operates and makes things and sells things in this country, ultimately comes down—we get to what we really have, which is a casino mentality. So, as I said when I started—from those folks that brought you asbestos, lead litigation, and tobacco—we now have climate change.

What about effective remedies?

Jason S. Johnson:

These are all interstate public nuisance cases. There are plaintiffs in some states suing defendants who, for the most part, are in other states. There is some overlap in some of the cases such as Comer and Kivalina, but generally, the interstate character of these public nuisance actions is very, very important.

Now, what’s the problem with externalization across states? One: it’s very likely that this is going to be an inefficient externalization. That is to say you come up with some award from the court, and the basic idea of economics is we use the liability system, to what? To internalize costs. If people bear the costs of their actions, they have an incentive then to take precautions or take various steps to lower the cost to other people of the actions that they take. The inefficiency of the externalization here is very, very likely. Why? Among other things, there are very real benefits from global warming that can be expected to benefit lots of states and lots of cities.

States that think they’re going to be beneficiaries, or think that they’re going to be real net losers from greenhouse gas emission reduction— because they’re states where a lot of electricity comes from burning coal and/or they mine and produce coal in those states—those states are not at the table in these litigations either. There are a lot of benefits and costs that are not included in this dyadic interstate public nuisance litigation. They’re almost sure to generate inefficient results. Another reason why they’re sure to generate inefficient results is because the benefit of any litigation depends upon the remedy affecting behavior, and behavior affecting the harm that people suffer.

It’s simply a fact that by 2020, China is going to be responsible—forget about India and Brazil—for about 45% of the world’s greenhouse gas emissions. So, there’s no remedy in any of these cases that will provide any relief to any of the plaintiffs.

Finally, what’s going to happen? Who knows what’s going to happen at the Supreme Court level. But these are likely to be very ineffective and counterproductive. Remember, for a lot of environmental groups, the reason for bringing these interstate public nuisance cases is they thought they were going to force Congress to act. Well, Congress didn’t act.

Excerpts above come from the Judicial Symposium on Civil Justice Issues: Climate Change Litigation

Climate Law Itself is Changing

A more recent symposium addressed a contextual shift in principles and assumptions, differing from older concepts underlying case law precedents from the past. Briefly put, the environment is no longer seen as static, but is rather dynamic at all time scales. And in parallel, the economic system is now recognized as dynamic and fluid, rather than determinative. Both of these paradigm shifts alter the way jurists and others consider environmental claims and responses to them.

Excerpts below come from Dynamic Ecology and Dynamic Economics, issue 11.2 in the Journal of Law, Economics and Policy

Jonathan H. Adler:

Most of today’s environmental laws and programs are based upon outmoded assumptions about the relative stability of natural systems when free of human interference. Scientists have understood for decades that ecosystems are anything but stable. To the contrary, ecosystems are incredibly dynamic and change over time due to both internal and external forces. An ecosystem is the “paradigmatic complex system,” exhibiting dynamic and discontinuous behavior. To be effective, therefore, environmental management systems must themselves be sufficiently adaptive.

Noted ecologist Daniel Botkin argues that “solving our environmental problems requires a new perspective” of environmental concerns that incorporates contemporary scientific understandings and embraces humanity’s role in environmental management. Recognizing a new perspective is but the first step, however. There is also a need to identify how this perspective can inform environmental policy, not just on the ground but in the very institutional architecture of environmental law and management. Then comes the really hard part, for even if it is possible to conceive of how environmental management should proceed, it may be devilishly difficult to put such ideas into practice. Old habits die hard. Legal and institutional norms die even harder.

Contemporary environmental law embodies archaic assumptions about the natural world. Through the middle of the 20th century, “the predominant theories in ecology either presumed or had as a necessary corollary a very strict concept of a highly structured, ordered, and regulated, steady state ecological system.” Under this view, nature naturally tended toward an equilibrium state—a “balance”—absent human interference. Maintaining and protecting this balance was, in this view, ecologically superior and ultimately better for humanity as well. Contemporary ecological science has “dismissed” these theories and the accompanying notion of a “balance of nature.”

The architecture of contemporary environmental law was erected when the equilibrium paradigm still held sway. As a consequence, the edifice of environmental law sits on an unstable foundation. The equilibrium paradigm justified “a wide range of prohibitions on human activities that alter ‘natural’ land and water systems” and other environmental restrictions on productive activity.

Contemporary ecological science embraces a more dynamic understanding of the natural world and rejects the idea of a “balance of nature” that would exist but for human interference. Two insights about natural systems are essential to the contemporary view. First is the recognition that ecological systems are always in flux. There is no true “natural” state for ecosystems. No “climax” or endpoint toward which ecosystems move or evolve if left undisturbed. Second, in this day and age, there is no part of the globe in which ecosystems exist wholly apart from human influence.

The environmental laws and regulations on the books are “out of date.” As Botkin observes, “whether or not environmental scientists know about geological time and evolutionary biology, their policies ignore them.” Too often environmental policy and protection measures are based upon “nonrational, ideological beliefs instead of rationally derived facts in harmony with modern understanding of the environment.” Yet, many of the most pressing environmental problems today “exhibit the hallmark characteristics of complex adaptive systems.”

Many existing environmental laws impose binary decisions on agencies—either a species is endangered or it is not, a level of pollution may be anticipated to endanger health or it is not, etc. Once such determinations are made, specific regulatory consequences follow automatically. If a species is endangered, it triggers the regulatory requirements of the Endangered Species Act (ESA).  If a pollutant may be reasonably anticipated to threaten health and welfare, certain types of emission controls must be imposed.

Markets are also complex, adaptive, and dynamic systems. Just as it is not always possible to predict the ecological consequences of specific environmental management measures, it is often not possible to predict the market effects of such measures, or—perhaps more importantly—how such interventions will affect the interplay of economic decisions and environmental outcomes. Market actors will often respond to regulatory constraints in unanticipated ways, with unforeseen (and perhaps undesirable) effects.

There are opportunities to improve the adaptive and responsive nature of environmental protection efforts in the United States, but such opportunities are inherently limited so long as environmental protection is dominated by a relatively centralized, top-down administrative structure. Conventional regulatory and administrative systems are not particularly adaptive or responsive to changing environmental conditions, or even to changed understanding of environmental needs. Bureaucratic systems change slowly and are rarely forward looking. This is due, in part, to legal constraints, but also due to the nature of monopolistic bureaucratic systems, and the inherent information limitations that hamper the ability of such systems to acquire and account for relevant information—let alone to encourage the discovery of such information in the first place. Bureaucratic structures are resistant to change, and this is particularly true where such resistance poses few risks. Regulatory agencies do not go out of business when they fail to adapt. To the contrary, a failing agency is more likely to see a budget increase than it is to close its doors. The feedback mechanisms that force private firms to be adaptive and responsive to changing market conditions are largely absent from the administrative state.

So even if agency heads are willing to make the effort, they face a daunting gauntlet of interest group opposition and judicial scrutiny. According to Professor Ruhl, when the Fish and Wildlife Service (FWS) sought to integrate adaptive management into the habitat conservation plan (HCP) permitting process, interest group litigants and courts were quick to challenge the agency’s authority to incorporate greater flexibility into the program.

Due process concerns about adaptive management are greatest where federal agencies are engaged in the regulation of private land or the imposition of restrictions that directly affect private rights, including some rights on federal lands. Adopting adaptive management policies and techniques is far less problematic in the context of managing government lands than where environmental management decisions encroach upon private interests or risk infringing upon private property rights. While there may be political obstacles, including interest group resistance, to reducing the procedural obligations of agencies engaged in resource management decisions, there are less likely to be judicially cognizable property interests of the sort that could implicate Due Process concerns.

Climate Change Seen Through Dynamic Ecology

Daniel Botkin has led the paradigm shift to dynamic ecology, especially in his influential book Discordant Harmonies: A New Ecology for the Twenty-First Century (Oxford University Press, New York, 1990).

Daniel B. Botkin is Professor Emeritus, University of California, Santa Barbara, in the Department of Ecology, Evolution, and Marine Biology.

In 2014 he shared his view of the climate change issue in testimony to the House Subcommittee on Science, Space and Technology. The whole document is enlightening, and includes a point-by-point critique of IPCC statements. His main points are highlighted below, while details and examples are in the full text.

1. I want to state up front that we have been living through a warming trend driven by a variety of influences. However, it is my view that this is not unusual, and contrary to the characterizations by the IPCC and the National Climate Assessment, these environmental changes are neither apocalyptic nor irreversible.

2. My biggest concern is that both the reports present a number of speculative, and sometimes incomplete, conclusions embedded in language that gives them more scientific heft than they deserve. The reports are “scientific-sounding” rather than based on clearly settled facts or admitting their lack. Established facts about the global environment exist less often in science than laymen usually think.

3. HAS IT BEEN WARMING? Yes, we have been living through a warming trend, no doubt about that. The rate of change we are experiencing is also not unprecedented, and the “mystery” of the warming “plateau” simply indicates the inherent complexity of our global biosphere. Change is normal, life on Earth is inherently risky; it always has been. The two reports, however, make it seem that environmental change is apocalyptic and irreversible. It is not.

4. IS CLIMATE CHANGE VERY UNUSUAL? No, climate has always undergone changes.

5. ARE GREENHOUSE GASES INCREASING? Yes, CO2 rapidly.

6. IS THERE GOOD SCIENTIFIC RESEARCH ON CLIMATE CHANGE? Yes, a great deal of it.

7. ARE THERE GOOD SCIENTISTS INVOLVED IN THE IPCC 2014 REPORT? Yes, the lead author of the Terrestrial (land) Ecosystem Report is Richard Betts, a coauthor of one of my scientific papers about forecasting effects of global warming on biodiversity.

8. ARE THERE SCIENTIFICALLY ACCURATE STATEMENTS AT PLACES IN THE REPORT? Yes, there are.

9. What I sought to learn was the overall take-away that the reports leave with a reader. I regret to say that I was left with the impression that the reports overestimate the danger from human-induced climate change and do not contribute to our ability to solve major environmental problems. I am afraid that an “agenda” permeates the reports, an implication that humans and our activity are necessarily bad and ought to be curtailed.

10. ARE THERE MAJOR PROBLEMS WITH THE REPORTS? Yes, in assumptions, use of data, and conclusions.

11. My biggest concern about the reports is that they present a number of speculative, and sometimes incomplete, conclusions embedded in language that gives them more scientific heft than they deserve. The reports, in other words, are “scientific-sounding,” rather than clearly settled and based on indisputable facts. Established facts about the global environment exist less often in science than laymen usually think.

12. The two reports assume and/or argue that the climate warming forecast by the global climate models is happening and will continue to happen and grow worse. Currently these predictions are far off from reality (Figure 1). Models, like all scientific theory, have to be tested against real-world observations. Experts in model validation say that the climate models frequently cited in the IPCC report are little if at all validated. This means that as theory they are fundamentally scientifically unproven.

13. The reports suffer from using the term “climate change” with two meanings: natural and human-induced. These are both given as definitions in the IPCC report and are not distinguished in the text and therefore confuse a reader. (The Climate Change Assessment uses the term throughout including its title, but never defines it.) There are places in the reports where only the second meaning—human induced—makes sense, so that meaning has to be assumed. There are other places where either meaning could be applied.

14. Some of the report conclusions are the opposite of those given in articles cited in defense of those conclusions.

15. Some conclusions contradict and are ignorant of the best statistically valid observations.

16. The report for policy makers on Impacts, Adaptation, and Vulnerability repeats the assertion of previous IPCC reports that a “large fraction of species” face “increased extinction risks” (p. 15). Overwhelming evidence contradicts this assertion. And it has been clearly shown that models used to make these forecasts, such as climate envelope models and species-area curve models, make incorrect assumptions that lead to erroneous conclusions, over-estimating extinction risks. Surprisingly few species became extinct during the past 2.5 million years, a period encompassing several ice ages and warm periods.

17. THE REPORT GIVES THE IMPRESSION THAT LIVING THINGS ARE FRAGILE AND RIGID, unable to deal with change. The opposite is the case. Life is persistent, adaptable, adjustable.

18. STEADY-STATE ASSUMPTION: There is an overall assumption in the IPCC 2014 report and the Climate Change Assessment that all change is negative and undesirable; that it is ecologically and evolutionarily unnatural, bad for populations, species, ecosystems, for all life on planet Earth, including people. This is the opposite of the reality.

19. The summary for policy makers on Impacts, Adaptation, and Vulnerability makes repeated use of the term “irreversible” changes. A species going extinct is irreversible, but little else about the environment is irreversible.

20. The extreme overemphasis on human-induced global warming has taken our attention away from many environmental issues that used to be front and center but have been pretty much ignored in the 21st century.

21. Do the problems with these reports mean that we can or should abandon any concerns about global warming or abandon any research about it? Certainly not, but we need to put this issue within an appropriate priority with other major here-and-now environmental issues that are having immediate effects.

22. The concerns I have mentioned with the IPCC apply as well to the White House’s National Climate Assessment.

Summary

The good news: Some people in the legal community are reflecting analytically about climate claims appearing in litigation, and are speaking out about the failure of facts and logic to support the allegations.

The bad news:  The more I read, the more I fear the judiciary is caught in the past and ill-prepared for the onslaught of cases coming from the anti-fossil-fuel activists.  Jason Johnston, one of the presenters mentioned above, said this on his website:

Legal scholarship has come to accept as true the various pronouncements of the Intergovernmental Panel on Climate Change (IPCC) and other scientists who have been active in the movement for greenhouse gas (ghg) emission reductions to combat global warming. The only criticism that legal scholars have had of the story told by this group of activist scientists – what may be called the climate establishment – is that it is too conservative in not paying enough attention to possible catastrophic harm from potentially very high temperature increases.

Scientists who have been leaders in the process of producing these Assessment Reports (“AR’s”) argue that they provide a “balanced perspective” on the “state of the art” in climate science, with the IPCC acting as a rigorous and “objective assessor” of what is known and unknown in climate science. Legal scholars have accepted this characterization, trusting that the IPCC AR’s are the product of an “exhaustive review process” – involving hundreds of outside reviewers and thousands of comments. Within mainstream environmental law scholarship, the only concern expressed about the IPCC and “consensus” climate change science is that the IPCC’s process has allowed for too much government influence (especially from China and the U.S.), pressure that has caused the IPCC’s future projections to be too cautious – too hesitant to confidently project truly catastrophic climate change.

Thus politicians, environmental law scholars and policymakers have clearly come to have extreme confidence in the opinion of a group of scientists – many of whom play a leading role on the IPCC – who hold that the late twentieth century warming trend in average global surface temperature was caused by the buildup of anthropogenic ghg’s, and that if ghg emissions are not reduced soon, then the 21st century may witness truly catastrophic changes in the earth’s climate. In the legal and the policy literature on global warming, this view – which may be called the opinion of the climate establishment – is taken as a fixed, unalterable truth. It is virtually impossible to find anywhere in the legal or the policy literature on global warming anything like a sustained discussion of the actual state of the scientific literature on ghg emissions and climate change. Instead, legal and policy scholars simply defer to a very general statement of the climate establishment’s opinion (except when it seems too conservative), generally failing even to mention work questioning the establishment climate story, unless to dismiss it with the ad hominem argument that such work is the product of untrustworthy, industry-funded “skeptics” and “deniers.”

This paper constitutes such a cross-examination. As anyone who has served as an expert witness in American litigation can attest, even though an opposing attorney may not have the expert’s scientific training, a well prepared and highly motivated trial attorney who has learned something about the technical literature can ask very tough questions, questions that force the expert to clarify the basis for his or her opinion, to explain her interpretation of the literature, and to account for any apparently conflicting literature that is not discussed in the expert report. My strategy in this paper is to adopt the approach that would be taken by a non-scientist attorney deposing global warming scientists serving as experts for the position that anthropogenic ghg emissions have caused recent global warming and must be halted if serious and seriously harmful future warming is to be prevented – what I have called above the established climate story.

To use legal terms, is the work by the IPCC and establishment story lead scientists a legal brief – intended to persuade – or a legal memo – intended to objectively assess both sides? The second and related objective of this Article is to use the cross examination to identify what seem to be the key, policy-relevant areas of remaining uncertainty in climate science, and to then at least begin to sketch the concrete implications of such remaining uncertainty for the design of legal rules and institutions adopted to respond to perceived climate change risks.

Far from turning up empty, my cross examination has (initially, to my surprise) revealed that on virtually every major issue in climate change science, the IPCC AR’s and other summarizing work by leading climate establishment scientists have adopted various rhetorical strategies that seem to systematically conceal or minimize what appear to be fundamental scientific uncertainties or even disagreements. The bulk of this paper proceeds by cataloguing, and illustrating with concrete climate science examples, the various rhetorical techniques employed by the IPCC and other climate change scientist/advocates in an attempt to bolster their position, and to minimize or ignore conflicting scientific evidence.

There are, to be sure, many chapters in the IPCC Assessment Reports whose authors have chosen to quite fully disclose both what is known as well as what is unknown, and subject to fundamental uncertainty, in their particular field of climate science. Still, the climate establishment story — comprising all of the IPCC Assessment Reports, plus the IPCC’s “Policymaker Summaries,” plus the freelance advocacy efforts of activist climate scientists (exemplified by James Hansen of NASA) – seems overall to comprise an effort to marshal evidence in favor of a predetermined policy preference, rather than to objectively assess both what is known and unknown about climatic variation and its causes.

To his credit, Jason Johnston has done his homework on climate science; you can see his results in the document Global Warming Advocacy Science: A Cross Examination.

How many other jurists have girded themselves for this battleground?

Progressively Scaring the World (Lewin book synopsis)

H/T to Global Warming Policy Foundation for this publication. Announcement is here.

Bernie Lewin has written a thorough history explaining the series of environmental scares that built up to the current obsession with global warming/climate change. The story is enlightening for people like me who were not paying attention when much of this activity was going on (prior to the Copenhagen COP, in my case). It also provides a rich description of happenings behind the scenes.

As Lewin explains, it is a particularly modern idea to scare the public with science, and thereby advance a policy agenda. The power of this approach is evident these days, but his book traces it back to more humble origins and describes the process bringing us to the present state of full-blown climate fear. It is a cautionary tale.

“Those who don’t know history are doomed to repeat it.”
― Edmund Burke (1729-1797)

This fearful belief evolved through a series of expanding scares, as diagrammed below.

This article provides only some highlights, while the book exposes the maneuvers and the players, their interests and tactics. Quotes from Lewin appear in italics, with my titles, summaries and bolds.

In the Beginning: The DDT Scare

The Context

A new ‘environmentalism’ arose through a broadening of specific campaigns against environmental destruction and pollution. It began to target more generally the industries and technologies deemed inherently damaging. Two campaigns in particular facilitated this transition, as they came to face up squarely against the dreams of a fantastic future delivered by unfettered sci-tech progress. Pg.16

One of these challenged the idea that we would all soon be tearing through the sky and crossing vast oceans in just a few hours while riding our new supersonic jets. But even before the ‘Supersonic Transportation Program’ was announced in 1963, another campaign was already gathering unprecedented support. This brought into question the widely promoted idea that a newly invented class of chemicals could safely bring an end to so much disease and destruction—of agriculture, of forests, and of human health—through the elimination of entire populations of insects. Pg.16

When the huge DDT spraying programs began, the Sierra Club’s immediate concern was the impact on nature reserves. But then, as the movement against DDT developed, and as it became increasingly involved, it began to broaden its interest and transform. By the end of the 1960s it and other similar conservation organisations were leading the new environmentalism in a broader campaign against DDT and other technological threats to the environment. Pg.18

The Alarm

This transformation was facilitated by the publication of a single book that served to consolidate the case against the widespread and reckless use of organic pesticides: Silent Spring. The author, Rachel Carson, had published two popular books on ocean ecology and a number of essays on ecological themes before Silent Spring came out in 1962. As with those earlier publications, one of the undoubted contributions of the book was the education of the public in a scientific understanding of nature. Pg.18

We will never know how Carson would have responded to the complete ban on DDT in the USA. She was suffering from cancer while writing Silent Spring and died shortly after publication (leaving the royalties from its sale to the Sierra Club), but the ban was not achieved for another decade. What we do know is that a full ban was never her intention. She supported targeted poisoning programs in place of blanket spraying, and she urged the authorities to look for alternative and ‘integrated control’, along the lines of the ‘Integrated Pest Management’ approach that is common and accepted today. Pg.19

The Exaggeration

Overall, by today’s standards at least, Carson’s policy position was moderate, and so we should be careful not to attribute to her the excesses of her followers. The trouble with Carson was otherwise: it was in her use and abuse of science to invoke in her readers an overwhelming fear. In Silent Spring, scientific claims find dubious grounding in the evidence. Research findings are exaggerated, distorted and then merged with the purely anecdotal and the speculative, to great rhetorical effect. Pg.19

Historically, the most important area of distortion is in linking organic pesticides with human cancers. The scientific case for DDT as a carcinogen has never been strong and it certainly was not strong when Silent Spring was published. Of course, uncertainty remained, but Carson used the authority of science to go beyond uncertainty and present DDT as a dangerous carcinogen. And it was not just DDT; Carson depicts us ‘living in a sea of carcinogens’, mostly of our own making, and for which there is ‘no safe dose’. Pg.19

The Legacy

If we are to understand how the EPA ban came about, it is important to realise that this action succeeded in breaking a policy stalemate that was becoming increasingly hazardous for the increasingly embattled Nixon administration. On one side of this stalemate were the repeated scientific assessments pointing to a moderate position, while on the other side were calls for more and more extreme measures fuelled by more and more outrageous claims. Pg.21

Such sober assessments by scientific panels were futile in the face of the pseudo-scientific catastrophism that was driving the likes of the Audubon Society into a panic over the silencing of the birds. By the early 1970s two things were clear: public anxiety over DDT would not go away, and yet the policy crisis would not be resolved by heeding the recommendations of scientific committees. Instead, resolution came through the EPA, and the special role that it found for itself following the publication of the Sweeney report. Pg.22

Summary

The DDT scare demonstrated an effective method: claim that a chemical pollutant poses a serious public health risk, cancer being the most alarming of all. The media stoked the fear, and politicians acted to quell anxiety despite the weak scientific case. A precedent was also set for a governmental entity (the EPA in this case) to make a judgment overruling expert advice in response to public opinion.

The SST Scare

The Context

The contribution to the demise of the SST of the environmentalists’ campaign is sometimes overstated, but that is of less concern to our story than the perception that this was their victory. While the DDT campaign was struggling to make headway, the SST campaign would be seen as an early symbolic triumph over unfettered technological progressivism. It provided an enormous boost to the new movement and helped to shape it. Back in 1967, the Sierra Club had first come out campaigning against the SST for the sonic shockwaves sweeping the (sparsely populated) wilderness over which it was then set to fly. But as they began to win that argument, tension was developing within the organisation, with some members wishing to take a stronger, more general and ethical stand against new and environmentally damaging technologies such as this. Pg.27

With popular support for environmental causes already blooming across the country, and with the SST program already in jeopardy, scientists finally gained their own position of prominence in the controversy when they introduced some new pollution concerns. . . If that wasn’t enough, environmental concerns were also raised in the most general and cursory terms about the aircraft’s exhaust emissions. These first expressions of pollution concerns would soon be followed by others, from scientists who were brought into the debate to air speculation about various atmospheric catastrophes that would ensue if these supersonic birds were ever allowed to fly. Pg.27

The Alarm

What did make the front page of the New York Times on 2 August 1970 was concern about another climatic effect highlighted in the executive summary of the report. The headline trumpeted ‘Scientists ask SST delay pending study of pollution’ (see Figure 2.1).  The conference had analysed the effect of emissions from a fleet of 500 aircraft flying in the stratosphere, and concerns were raised that the emission of water vapour (and to a lesser extent other emissions) might absorb sunlight sufficiently to have a local or even global effect on climate. . . The climatic change argument remained in the arsenal of the anti-SST campaigners through to the end, but it was soon outgunned by much more dramatic claims about possible damage to the ozone layer. Pg.30

Throughout the 1970s, scientific speculation drove a series of ozone scares, each attracting significant press attention. These would climax in the mid-1980s, when evidence of ozone-depleting effects of spray-can propellants would be discovered in the most unlikely place. This takes us right up to the start of the global warming scare, presenting along the way many continuities and parallels. Indeed, the push for ozone protection up to the 1980s runs somewhat parallel with the global warming movement until the treaty process to mitigate ozone damage suddenly gained traction and became the very model for the process to mitigate global warming. The ozone story therefore warrants a much closer look. Pg.31

For Harold Johnston of the University of California, the real problem with SST exhaust would not be water vapour but oxides of nitrogen. Working all night, the next morning he presented Xerox copies of handwritten work projecting 10–90% depletion. In high traffic areas, there would be no stopping these voracious catalysts: the ozone layer would all but disappear within a couple of years. Even when Johnston later settled for a quotable reduction by half, there could be no quibbling over the dangers to nature and humanity of such massive environmental destruction. Pg.44

A New York Times reporter contacted Johnston to confirm his claims and although the report he delivered was subdued, the story remained alarming. It would take less than a year of full-fleet operations, Dr Johnston said in a telephone interview, for SSTs to deplete half of the stratospheric ozone that shields the earth from the sun’s ultraviolet radiation. Scientists argued in the SST debate last March that even a 1 percent reduction of ozone would increase radiation enough to cause an additional 10,000 cases of skin cancer a year in the United States. The next day, 19 May 1971, a strong negative vote demolished the funding bill. All but a few stalwarts agreed that one more vote in the House and it was all over for Boeing’s SST. After that final vote, on 30 May, the New York Times followed-up on its initial story with a feature on Johnston’s claims. This was written by their leading science writer, Walter Sullivan, an influential science communicator important to our story. Pg.48

The Exaggeration

It is true that in 1971 the link between skin cancer and sun exposure was fairly well established in various ways, including by epidemiological studies that found fair-skinned communities in low latitudes tended to record higher rates. However, the link to ultraviolet light exposure (specifically, the UV-B band) is strongest among those cancers that are most common but are also rarely lethal. The link with the rarer and most dangerous cancers, the malignant melanomas, is not so strong, especially because they often appear on skin that is not usually exposed to the sun. Pg.43

Thus, sceptics of the fuss over the risk of a few percent thinning of the already variable ozone layer would point out that the anti-SST crowd did not seem overly worried about the modern preference for sunshine, which was, on the very same evidence, already presenting a risk many orders of magnitude greater: a small depletion in the ozone layer would be the equivalent of moving a few miles south. To the dismay of their environmentalist opponents, the bolder among these sceptics would recommend the same mitigation measures recommended to the lifestyle migrants—sunscreen, sunglasses and sunhats. Pg.43

But in 1971 there was no way to directly measure stratospheric NOx. No one was even sure whether there was any up there. Nor was there any way to confirm the presence—and, if so, the concentration— of many of the other possibly relevant reactive trace gases. This left scientists only guessing at natural concentrations, and for NOx, Johnston and others had done just that. These ‘best guesses’ were then the basis for modelling of the many possible reactions, the reaction rates, and the relative significance of each in the natural chemistry of the cold thin air miles above. All this speculation would then form the basis of further speculations about how the atmosphere might respond to the impacts of aircraft that had not yet flown; indeed none had even been built. Pg.46

The Legacy

But already the message had got through to where it mattered: to the chair of the Senate Committee on Aeronautical and Space Science, Clinton Anderson. The senator accepted Johnston’s theory on the strength of Sullivan’s account, which he summarised in a letter to NASA before concluding that ‘we either need NOx-free engines or a ban on stratospheric flight’.  And so it turned out that directly after the scrapping of the Boeing prototype, the overriding concern about supersonic exhaust pollution switched from water vapour to NOx. Pg.49

As startling as Johnston’s success appears, it is all the more extraordinary to consider how all the effort directed at solving the NOx problem was never distracted by a rising tide of doubt. The more the NOx effect was investigated, the more complex the chemistry seemed to be and the more doubtful became the original scientific foundations of the scare. In cases of serial uncertainty, the multiplying of best-guess estimates of an effect can shift one way and then the other as the science progresses. But this was never the case with NOx, nor with the SST-ozone scare generally. Pg.50

Summary

The SST Scare moved attention to the atmosphere and the notion of trace gases causing environmental damage, again linked to cancer risk. While ozone was the main issue, climate change was also raised along with interest in carbon dioxide emissions. Public policy was moved to withdraw funding for American SST production and later to ban European SSTs from landing in the US. It also demonstrated that fears could be promoted regarding a remote part of nature poorly known or understood. Models were built projecting fearful outcomes from small changes in atmospheric regions where data was mostly lacking.

The CFC Scare

The Context

Presumptions about the general state of a system’s stability are inevitable in situations of scant evidence, and they tend to determine positions across the sceptic/alarmist divide. Of course, one could suppose a stable system, in which a relatively minor compensatory adjustment might have an alarming impact on civilisation, like the rapid onset of a few metres of rise in sea level. But it is the use of such phrases as ‘disturbing the delicate balance of nature’ or ‘a threat to life on Earth’ that are giveaways to a supposition of instability. Hence Scorer’s incredulity regarding Johnston’s leap towards his catastrophic conclusion: ‘How could it be alleged seriously that the atmosphere would be upset by introducing a small quantity of the most commonly and easily formed compounds of the two elements which comprise 99% of it?’ Pg.68

Meanwhile, ‘Sherry’ Rowland at the University of California was looking around for a new interest. Since 1956 he had been mostly researching the chemistry of radioactive isotopes under funding from the Atomic Energy Commission. Hearing of Lovelock’s work, he was intrigued by the proposal that nearly all the CFCs ever produced might still be out there. Were there no environmental conditions anywhere that would degrade these chemicals? He handed the problem to his post-doctoral research assistant, Mario Molina. Molina eventually concluded that indeed there were no ‘sinks’ for CFCs anywhere in the ocean, soils or lower atmosphere. Thus we should expect that CFCs would drift around the globe, just as Lovelock had proposed, and that they would do so for decades, even centuries. . . or forever? Could mankind have created an organic compound that is so noble that it is almost immortal? Pg.75

The Alarm

The ozone effect that Molina had stumbled upon was different to those previously proposed from rockets and aeroplanes in one important respect: it would be tremendously delayed. Like a hidden cancer, the CFCs would build up quietly and insidiously in the lower atmosphere until their effect on the ozone miles above was eventually detectable, decades later. But when unequivocal evidence finally arrived to support the theory, it would be too late. By then there would be no stopping the destruction of the thin veil protecting us from the Sun’s carcinogenic rays. What Molina had stumbled upon had, in double-dose, one sure element of a good environmental scare. Pg.77

According to Walter Sullivan, they had calculated that spray-can CFCs have already accumulated sufficiently in the upper air to begin depleting the ozone that protects the earth from lethal ultraviolet radiation.  On current emission trends, 30% of the ozone layer would be destroyed as early as 1994. This was no longer a story about saving the sky for our grandchildren. These scientists had found an effect, already in train, with ‘lethal’ consequences for all living things during the lifetime of most of the New York Times’ massive and influential readership. Pg.82

During 1988, the second wave of global environmentalism would reach its peak in the USA, with CFC pollution its first flagship cause. Mid-March saw the US Congress voting unanimously to ratify the Montreal Protocol. It was only the second country to do so, while resistance remained strong in Europe. The following day, NASA announced the results of a huge two-year study of global ozone trends. Pg.107

The new scientific evidence came from a re-analysis of the ozone record. This found that the protective layer over high-population areas in the midlatitudes of the northern hemisphere had been depleted by between 1.7% and 3% from 1969 to 1986. These trends had been calculated after removing the effect of ‘natural geophysical variables’ so as to better approximate the anthropogenic influence. As such, these losses across just 15 years were at much faster rates than expected by the previous modelling of the CFC effect. Pg.107

The statements of the scientists (at least as quoted) made it clear to the press that this panel of experts had interpreted the empirical evidence as showing that a generalised CFC-driven depletion had already begun, and at a much faster rate than expected from the modelling used to inform the Montreal Protocol.  Pg.109

This linking by scientists of the breakup of the southern vortex with low ozone readings in southern Australia during December 1987 morphed into the idea that the ozone hole itself had moved over southern Australia. All sorts of further exaggerations and extrapolations ensued, including the idea of the hole’s continuing year-round presence. An indication of the strength of this mythology is provided by a small survey in 1999 of first-year students in an atmospheric science course at a university in Melbourne. This found that 80% of them believed the ozone hole to be over Australia, 97% believed it to be present during the summer and nearly 80% blamed ozone depletion for Australia’s high rate of skin cancer. Pg.114

After the London ‘Save the Ozone Layer Conference’, the campaign to save the ozone layer was all but won. It is true that a push for funding to assist poor country compliance did gain some momentum at this conference, and it was thought that this might stymie agreement, but promises of aid were soon extracted, and these opened the way for agreement on a complete global phase-out of CFC production. Pg.119

The Exaggeration

Here we had Harvard scientists suggesting that hairspray destruction of the ozone layer had already begun. Verification of the science behind this claim could not have played any part in the breaking of the scare, for there was nothing to show. It turned out that McElroy and Wofsy had not shown their work to anyone, anywhere. Indeed, the calculations they reported to Sullivan were only submitted for publication a few days after the story ran in the New York Times. By that time already, the science did not matter; when McElroy and Wofsy’s calculations finally appeared in print in February 1975, the response to the scare was in full swing, with spray-can boycotts, with ‘ban the can’ campaigns, and with bills to that effect on the table in Congress. Pg.82

It was on track to deliver its findings by April 1976 when it was hit with the shocking discovery of a new chlorine ‘sink’. On receiving this news, it descended into confusion and conflict and this made impossible the timely delivery of its much-anticipated report. The new ‘sink’ was chlorine nitrate. When chlorine reacts to form chlorine nitrate its attack on ozone is neutralised. It was not that chlorine nitrate had previously been ignored, but that it was previously considered very unstable. However, late in 1975 Rowland concluded it was actually quite stable in the mid-stratosphere, and therefore the two most feared ozone eaters—NOx and CFCs—would neutralise each other: not only could natural NOx moderate the CFC effect, but hairsprays and deodorants could serve to neutralise any damage Concorde might cause. Pg.84

Now, at the height of the spray-can scare, there was a shift back to climate. This was reinforced when others began to point to the greenhouse effect of CFCs. An amazing projection, which would appear prominently in the NAS report, was that CFCs alone would increase global mean temperature by 1°C by the end of the century—and that was only at current rates of emissions! In all this, McElroy was critical of Rowland (and others) for attempting to maintain the momentum of the scare by switching to climatic change as soon as doubts about the cancer scare emerged. It looked like the scientists were searching for a new scientific justification of the same policy outcome. Pg.87

The Legacy

The ban on the non-essential uses of spray-can CFCs that came into force in January 1978 marked a peak in the rolling ozone scares of the 1970s. Efforts to sustain the momentum and extend regulation to ‘essential’ spray cans, to refrigeration, and on to a complete ban, all failed. The tail-end of the SST-ozone scare had also petered out after the Franco-British consortium finally won the right to land their Concorde in New York State in 1977. And generally in the late 1970s, the environmental regulation movement was losing traction, with President Carter’s repeated proclamations of an environmental crisis becoming increasingly shrill (more on that below). Eventually, in 1981, Ronald Reagan’s arrival at the White House gave licence and drive to a backlash against environmental regulation that had been building throughout the 1970s. Long before Reagan’s arrival, it was made clear in various forums that further regulatory action on CFCs could only be premised on two things: international cooperation and empirical evidence. Pg.89

To some extent, the demand for better science had always been resisted. From the beginning, advocates conceded that direct and unequivocal evidence of CFC-caused depletion might be impossible to gain before it is too late.  But concerns over whether the science was adequate went deeper. The predictions were based on simple models of a part of our world that was still remote and largely unknown. Pg.91

Summary

The CFC scare brought the focus of dangerous behavior down from the stratosphere to spray cans in the hands of ordinary people, along with their use of the air conditioners so essential to life in the sunny places people prefer.  Speculation about ozone holes over the polar regions was also more down to earth. And for the first time, all of this concern produced an international treaty with extraordinary cooperation against CFCs, with UNEP soaring into prominence and gaining much credit for guiding the policy process.

The CO2 Scare

The Context

In the USA during the late 1970s, scientific interest in the potential catastrophic climatic consequences of carbon dioxide emissions came to surpass other climatic concerns. Most importantly, it came to surpass the competing scientific and popular anxiety over global cooling and its exacerbation by aerosol emissions. However, it was only during the late 1980s that the ‘carbon dioxide question’ broke out into the public discourse and transformed into the campaign to mitigate greenhouse warming. For more than a decade before the emergence of this widespread public concern, scientists were working on the question under generous government funding. Pg.122

The proven trigger for the release of funding was to forewarn of catastrophe, to generate public fear and so motivate administrators and politicians to fund investigations targeting the specific issue. The dilemma for the climatic research leadership was that calls for more research to assess the level of danger would fail unless declarations of danger were already spreading fear. Pg.143

The scare that would eventually triumph over all preceding global environmental scares, and the scare that would come to dominate climatic research funding, began with a coordinated, well-funded program of research into potentially catastrophic effects. It did so before there was any particular concern within the meteorological community about these effects, and before there was any significant public or political anxiety to drive it. It began in the midst of a debate over the relative merits of coal and nuclear energy production. Pg 144

The Alarm

In February 1979, at the first ever World Climate Conference, meteorologists would for the first time raise a chorus of warming concern. These meteorologists were not only Americans. Expert interest in the carbon dioxide threat had arisen during the late 1970s in Western Europe and Russia as well. However, there seemed to be nothing in particular that had triggered this interest. There was no new evidence of particular note. Nor was there any global warming to speak of. Global mean temperatures remained subdued, while in 1978 another severe winter descended over vast regions of North America. The policy environment also remained unsympathetic. Pg.184

At last, during the early 1980s, Nature gave some clear signals that it was coming out on the side of the warmers. In the early 1980s it started to become clear that the four-decade general cooling trend was over. Weather station records in the northern mid-latitudes began again to show an upward trend, which was traceable back to a turnaround during the 1970s. James Hansen was early in announcing this shift, and in doing so he also excited a foreboding of manmade warming. Pg.193

Besides, there was a much grander diluvian story that continued to gain currency: the semi-submerged West Antarctic ice sheet might detach and slide into the sea. This was for some an irresistible image of terrible beauty: displacement on a monumental scale, humanity unintentionally applying the lever of industrial emissions to cast off this inconceivably large body of ice. As if imagining some giant icy Archimedes slowly settling into his overflowing bath, Hansen calculated the consequential displacement to give a sea-level rise of 5 or 6 metres within a century. Pg.195

Moreover, it had the imprimatur of the American Association for the Advancement of Science; the AAAS journal, Science, was esteemed in the USA above all others. Thus we can forgive Sullivan his credulity of this string of claims: that the new discovery of ‘clear evidence’ shows that emissions have ‘already warmed the climate’, that this supports a prediction of warming in the next century of ‘almost unprecedented magnitude’, and that this warming might be sufficient to ‘melt and dislodge the ice cover of West Antarctica’. The cooling scare was barely in the grave, but the warmers had been rehearsing in the wings. Now their most daring member jumped out and stole the show. Pg.196

But Hansen went beyond this graph and beyond the conclusion of his published paper to firstly make a strong claim of causation, and then, secondly, to relate this cause to the heat being experienced that year (indeed, the heat being experienced in the hearing room even as he spoke!). He explained that ‘the Earth is warmer in 1988 than at any time in the history of instrumental measurements’. He had calculated that ‘there is only a 1 percent chance of an accidental warming of this magnitude. . . ’ This could only mean that ‘the greenhouse effect has been detected, and it is changing our climate now’. Hansen’s detection claim was covered by all the main television network news services and it won for him another New York Times front page headline: Global warming has begun, expert tells Senate. Pg.224

The Exaggeration

Where SCOPE 29 looked toward the time required for a doubling of the atmospheric concentration of carbon dioxide, at Villach the policy recommendation would be based on new calculations for the equivalent effect when all emitted greenhouse gases were taken into account. The impact of the new calculations was to greatly accelerate the rate of the predicted warming. According to SCOPE 29, on current rates of emissions, doubling of the carbon dioxide concentration would be expected in 2100. At Villach, the equivalent warming effect of all greenhouse gases was expected as early as 2030. Pg.209

This new doubling date slipped under a psychological threshold: the potential lifetime of the younger scientists in the group. Subsequently, these computations were generally rejected and the agreed date for ‘the equivalent of CO2 doubling’ was pushed out at least 20 years; indeed, never again would there be a doubling estimate so proximate with the time in which it was made. Pg.209

Like so many of the consensus statements from this time on, this one is twisted so that it gives the appearance of saying more than it actually does. In this way, those pushing for dramatic effect and those concerned not to overstate the case can come to agreement. In fact, this passage of the statement brings the case for alarm down to the reliability of the modelling, which is pretty much true of SCOPE 29. Pg.210

In other words, the Impact on Climate Change working group concluded that the models are not yet ready to make predictions (however vaguely) about the impact of greenhouse gas emissions on the global climate.  Pg.210

The Legacy

Today, emissions targets dominate discussions of the policy response to global warming, and total emissions rates are tacitly assumed to be locked to a climatic response of one, two or so many degrees of warming. Today’s discussion sits on top of a solid foundation of dogma established across several decades and supposedly supported by a scientific consensus, namely that there is a direct cause–effect temperature response to emissions. Pg.219

One of the main recommendations for mitigating these dire consequences is a comprehensive global treaty to protect the atmosphere. On the specific issue of global warming, the conference statement calls for the stabilisation of atmospheric concentrations of one greenhouse gas, namely carbon dioxide. It estimates that this would require a reduction of current global emissions by more than 50%. However, it suggests an initial goal for nations to reduce their current rates of carbon dioxide emission by 20% by 2005. This rather arbitrary objective would become the headline story: ‘Targets agreed to save climate’. And it stuck. In the emissions-reduction policy debate that followed, this ‘Toronto target’ became the benchmark. For many years to come—indeed, until the Kyoto Protocol of 1997—it would be a key objective of sustainable development’s newly launched flagship. Pg.221

Summary

The framework for international action is established presuming that CO2 emissions directly cause global warming and that all nations must collectively cut their use of fossil fuels. However, the drive for a world treaty is hampered by a lack of proof and scientists’ mixed commitment to the policy goals.

The IPCC Scare

The Context

Before winter closed in at the end of 1988, North America was brimming with warming enthusiasm. In the USA, global warming was promised attention no matter who won the presidential election. In Canada, after the overwhelming success of the Toronto conference, the government continued to promote the cause, most enthusiastically through its environment minister Tom McMillan. Elsewhere among world leaders, enthusiasm was also building. The German chancellor, Helmut Kohl, had been a long-time campaigner against fossil fuels. Pg.224

In February 1989, the year got off to a flying start with a conference in Delhi organised by India’s Tata Energy Research Institute and the Woods Hole Research Center, which convened to consider global warming from the perspective of developing countries. The report of the conference produced an early apportionment of blame and a call for reparations. It proclaimed that the global warming problem had been caused by the industrially developed countries and therefore its remediation should be financed by them, including by way of aid to underdeveloped countries. This call was made after presenting the problem in the most alarming terms: Global warming is the greatest crisis ever faced collectively by humankind, unlike other earlier crises, it is global in nature, threatens the very survival of civilisation, and promises to throw up only losers over the entire international socio-economic fabric. The reason for such a potential apocalyptic scenario is simple: climate change of geological proportions are occurring over time-spans as short as a single human lifetime. Pg.226

Throughout 1989, the IPCC working groups conducted a busy schedule of meetings and workshops at venues around the northern hemisphere. Meanwhile, the outpouring of political excitement that had been channelled into the process brought world attention to the IPCC. By the time of its second full session in June 1989, its treaty development mandate had become clearer: the final version of the resolution that had passed at the UN General Assembly the previous December—now called ‘Protection of global climate for present and future generations of mankind’—requested that the IPCC make recommendations on strengthening relevant existing international legal instruments and on ‘elements for inclusion in a possible future international convention on climate.’ Pg.242

The Alarm

The general feeling in the research community that the policy process had surged ahead of the science often had a different effect on those scientists engaged with the global warming issue through its expanded funding. For them, the situation was more as President Bush had intimated when promising more funding: the fact that ‘politics and opinion have outpaced the science’ brought the scientists under pressure ‘to bridge the gap’. Pg.253

This is what became known as the ‘first detection’ program. With funding from DoE and elsewhere, the race was soon on to find ways to achieve early detection of the climate catastrophe signal. More than 10 years later, this search was still ongoing as the framework convention to mitigate the catastrophe was being put in place. It was not so much that the ‘conventional wisdom’ was proved wrong; in other words, that policy action did not in fact require empirical confirmation of the emissions effect. It was more that the policy action was operating on the presumption that this confirmation had already been achieved. Pg.254

The IPCC has warned that if CO2 emissions are not cut by 60 percent immediately, the changes in the next 60 years may be so rapid that nature will be unable to adapt and man incapable of controlling them.  The policy action to meet this threat—the UN Framework Convention on Climate Change—went on to play a leading role as the headline outcome of the entire show. The convention drafted through the INC negotiation over the previous two years would not be legally binding, but it would provide for updates, called ‘protocols’, specifying mandatory emissions limits. Towards the end of the Earth Summit, 154 delegations put their names to the text. Pg.266

The Exaggeration

It may surprise readers that even within the ‘carbon dioxide community’ it was not hard to find the view that the modelling of the carbon dioxide warming was failing validation against historical data and, further upon this admission, the suggestion that their predicted warming effect is wrong. In fact, there was much scepticism of the modelling freely expressed in and around the Carbon Dioxide Program in these days before the climate treaty process began. Those who persisted with the search for validation got stuck on the problem of better identifying background natural variability. There did at least seem to be agreement that any recent warming was well within the bounds of natural variability. Pg.261

During the IPCC review process, Wigley was asked to answer the question that he had avoided in SCOPE 29: When is detection likely to be achieved? He responded with an addition to the IPCC chapter that explains that we would have to wait until the half-degree of warming that had occurred already during the 20th century is repeated. Only then are we likely to determine just how much of it is human-induced. If the carbon dioxide driven warming is at the high end of the predictions, then this would be early in the 21st century, but if the warming was slow then we may not know until 2050 (see Figure 15.1). In other words, scientific confirmation that carbon dioxide emissions are causing global warming is not likely for decades. Pg.263

These findings of the IPCC Working Group 1 assessment presented a political problem. This was not so much that the working group was giving the wrong answers; it was that it had got stuck on the wrong questions, questions obsolete to the treaty process. The IPCC first assessment was supposed to confirm the scientific rationale for responding to the threat of climate change, the rationale previously provided by the consensus statement coming out of the 1985 Villach conference. After that, it would provide the science to support the process of implementing a coordinated response. But instead of confirming the Villach findings, it presented a gaping hole in the scientific rationale. Pg.263

Scientist-advocates would continue their activism, but political leaders who pledged their support for climate action had invested all scientific authority for this action in the IPCC assessment. What did the IPCC offer in return? It had dished up dubiously validated model projections and the prospect of empirical confirmation perhaps not for decades to come. Far from legitimising a treaty, the scientific assessment of Working Group 1 provided governments with every reason to hesitate before committing to urgent and drastic action. Pg.263

In 1995, the IPCC was stuck between its science and its politics. The only way it could save itself from the real danger of political oblivion would be if its scientific diagnosis could shift in a positive direction and bring it into alignment with policy action. Without a positive shift in the science, it is hard to see how even the most masterful spin on another assessment could serve to support momentum towards real commitment in a binding protocol. With ozone protection, the Antarctic hole had done the trick and brought on agreement in the Montreal Protocol. But there was nothing like that in sight for the climate scare. Without a shift in the science, the IPCC would only cause further embarrassment and so precipitate its further marginalisation. Pg.278

For the second assessment, the final meeting of the 70-odd Working Group 1 lead authors was scheduled for July 1995 in Asheville, North Carolina. This meeting was set to finalise the drafting of the chapters in response to review comments. It was also (and mostly) to finalise the draft Summary for Policymakers, ready for intergovernmental review. The draft Houghton had prepared for the meeting was not so sceptical on the detection science as the main text of the detection chapter drafted by Santer; indeed it contained a weak detection claim. However, it matched the introduction to the detection chapter, where Santer had included the claim that ‘the best evidence to date suggests . . . a pattern of climate response to human activities is identifiable in observed climate records’.

This detection claim appeared incongruous with the scepticism throughout the main text of the chapter and was in direct contradiction with its Concluding Summary. It represented a change of view that Santer had only arrived at recently due to a breakthrough in his own ‘fingerprinting’ investigations. These findings were so new that they were not yet published or otherwise available, and, indeed, Santer’s first opportunity to present them for broader scientific scrutiny was when Houghton asked him to give a special presentation to the Asheville meeting. Pg.279

However, the results were also challenged at Asheville: Santer’s fingerprint finding and the new detection claim were vigorously opposed by several experts in the field. One of the critics, John Christy, recalls challenging Santer on his data selection.  Santer recalls disputing the quality of the datasets used by Christy.  Debates over the scientific basis of the detection claim dominated the meeting, sometimes continuing long after the formal discussions had finished and on into the evening. Pg.280

In September, a draft summary of the entire IPCC second assessment was leaked to the New York Times, and the new detection claim was revealed on its front page. Pg.281

The UK Independent headlined ‘Global Warming is here, experts agree’ with the subheading: ‘Climate of fear: Old caution dropped as UN panel of scientists concur on danger posed by greenhouse gases.’ The article explains the breakthrough: “The panel’s declaration, after three days of torturous negotiation in Madrid, marks a decisive shift in the global-warming debate. Sceptics have claimed there is no sound evidence that climate has been changed by the billions of tonnes of carbon dioxide and other heat-trapping ‘greenhouse gases’ spewed into the atmosphere each year, mostly from the burning of fossil fuels and forests. But the great majority of governments and climate scientists now think otherwise and are now prepared to say so. ‘The balance of evidence suggests a discernible human influence on global climate’, the IPCC’s summary of its 200-page report says. The last such in-depth IPCC report was published five years ago and was far more cautious.” Pg.283

The Legacy

Stories appearing in the major newspapers over the next few days followed a standard pattern. They told how the new findings had resolved the scientific uncertainty and that the politically motivated scepticism that this uncertainty had supported was now untenable. Not only was the recent success of the attribution finding new to this story; also new was the previous failure. Before this announcement of the detection breakthrough, attention had rarely been drawn to the lack of empirical confirmation of the model predictions, but now this earlier failure was used to give a stark backdrop to the recent success, maximising its impact and giving a scientific green light to policy action. Thus, the standard narrative became: success after the previous failure points the way to policy action. Pg.284

With so many political actors using the authority of the IPCC’s detection finding to justify advancing in that direction, it is hard to disagree with his assessment. Another authority might well have been used to carry the treaty politics forward, but the fact that this particular authority was available, and was used, meant that the IPCC was hauled back into the political picture, where it remains the principal authority on the science to this day. Pg.301

What we can see from all this activity by scientists in the close vicinity of the second and third IPCC assessments is the existence of a significant body of opinion that is difficult to square with the IPCC’s message that the detection of the catastrophe signal provides the scientific basis for policy action. Most of these scientists chose not to engage the IPCC in public controversy and so their views did not impact on the public image of the panel. But even where the scientific basis of the detection claims drew repeated and pointed criticism from those prepared to engage in the public controversy, these objections had very little impact on the IPCC’s public image. Pg.310

Today, after five full assessments and with another on the way, the IPCC remains the pre-eminent authority on the science behind every effort to head off a global climate catastrophe. Pg.310

Summary

Today the IPCC is a testament to the triumph of politics over science, of style and rhetoric over substance and evidence. A “bait and switch” gambit was applied at the right moment to produce the message wanted by the committed. Fooled by the finesse, the media then trumpeted the “idea whose time has come,” and the rest is history, as they say.   And yet, despite IPCC claims to the contrary, the detection question is still not answered for those who demand evidence.

Thank you Bernie Lewin and GWPF for setting the record straight, and for demonstrating how this campaign is sustained by unfounded fears.

A continuing supply of hot air keeps scare balloons inflated.

CO2 Fluxes, Sources and Sinks

A recent post, Obsessed with Human CO2, pointed out how small the amount of CO2 emitted from fossil fuels is compared to natural sources. Human emissions fall within the error ranges around the estimated fluxes from land, oceans and biosphere. This post looks deeper into the issue and our current state of knowledge about attributing CO2 concentrations in the atmosphere.

Note the size of the human emissions next to the red arrow. (Units are in GT)

Alarming Claims by IPCC Followers

From Chapter 6 Working Group 1 AR5 with my bolds.

With a very high level of confidence, the increase in CO2 emissions from fossil fuel burning and those arising from land use change are the dominant cause of the observed increase in atmospheric CO2 concentration. About half of the emissions remained in the atmosphere (240 ± 10 PgC) since 1750. The rest was removed from the atmosphere by sinks and stored in the natural carbon cycle reservoirs. The ocean reservoir stored 155 ± 30 PgC. Vegetation biomass and soils not affected by land use change stored 160 ± 90 PgC. {6.1, 6.3, 6.3.2.3, Table 6.1, Figure 6.8}

Since the beginning of the Industrial Era (1750), the concentration of CO2 in the atmosphere has increased by 40%, from 278 ± 5 ppm to 390.5 ± 0.1 ppm in 2011 (Figure 6.11; updated from Ballantyne et al. (2012)), corresponding to an increase in CO2 of 240 ± 10 PgC in the atmosphere. Atmospheric CO2 grew at a rate of 3.4 ± 0.2 PgC yr–1 in the 1980s, 3.1 ± 0.2 PgC yr–1 in the 1990s and 4.0 ± 0.2 PgC yr–1 in the 2000s (Conway and Tans, 2011) (Table 6.1).

Coupled carbon-cycle climate models indicate that less carbon is taken up by the ocean and land as the climate warms constituting a positive climate feedback. Many different factors contribute to this effect: warmer seawater, for instance, has a lower CO2 solubility, so altered chemical carbon reactions result in less oceanic uptake of excess atmospheric CO2. On land, higher temperatures foster longer seasonal growth periods in temperate and higher latitudes, but also faster respiration of soil carbon.

The removal of human-emitted CO2 from the atmosphere by natural processes will take a few hundred thousand years (high confidence). Depending on the RCP scenario considered, about 15 to 40% of emitted CO2 will remain in the atmosphere longer than 1,000 years. This very long time required by sinks to remove anthropogenic CO2 makes climate change caused by elevated CO2 irreversible on human time scale. {Box 6.1}
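The partition quoted in the AR5 excerpt above can be checked with simple arithmetic. A minimal sketch, using only the numbers given in the excerpt (note that the "about half" remaining airborne works out closer to 43% once the ocean and land sinks are summed):

```python
# Carbon partition since 1750 as quoted in the AR5 excerpt above (PgC).
atmosphere = 240   # remained in the atmosphere
ocean = 155        # stored in the ocean reservoir
land = 160         # stored in vegetation and soils not affected by land use

total_emitted = atmosphere + ocean + land        # emissions accounted for
airborne_fraction = atmosphere / total_emitted   # share left in the air

print(f"total emissions accounted for: {total_emitted} PgC")   # 555 PgC
print(f"airborne fraction: {airborne_fraction:.2f}")           # ~0.43
```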

Alarmist Summary: All of the rise in atmospheric CO2 is caused by humans, is increasing and will last for 1000 years.

Sobering Facts from Scientific Observations

Fact 1. The Carbon Cycle System is estimated with uncertainties greater than human emissions.

Carbon fluxes describe the rate of exchange of carbon between the various carbon sinks / reservoirs.

There are four main carbon sinks – lithosphere (earth crust), hydrosphere (oceans), atmosphere (air), biosphere (organisms).

The rate at which carbon is exchanged between these reservoirs depends on the conversion processes involved:

Photosynthesis – removes carbon dioxide from the atmosphere and fixes it in producers as organic compounds
Respiration – releases carbon dioxide into the atmosphere when organic compounds are digested in living organisms
Decomposition – releases carbon products into the air or sediment when organic matter is recycled after death of an organism
Gaseous dissolution – the exchange of carbon gases between the ocean and atmosphere
Lithification – the compaction of carbon-containing sediments into fossils and rocks within the Earth’s crust (e.g. limestone)
Combustion – releases carbon gases when organic hydrocarbons (coal, oil and gas) are burned as a fuel source

It is not possible to directly measure the size of the carbon sinks or the fluxes between them – instead estimates are made.

Global carbon fluxes are very large and are therefore measured in gigatonnes (1 gigatonne of carbon = 1 billion metric tonnes).

Because carbon fluxes are large and based on measurements from many different sources, estimates have large uncertainties.
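The point about uncertainties can be made concrete. A hedged sketch, using round flux magnitudes of the kind shown in carbon-cycle diagrams (roughly 120 GtC/yr for terrestrial photosynthesis and 90 GtC/yr for ocean–atmosphere exchange; these are illustrative, not measured values) and the ±20% uncertainty figure cited for gross fluxes:

```python
# Compare human emissions against the uncertainty bands on the large
# natural gross fluxes. Flux values are illustrative round numbers;
# the 20% uncertainty follows the "more than +/-20%" figure quoted
# for gross fluxes in the IPCC AR4 carbon-cycle figure caption.
natural_fluxes = {                      # GtC per year, gross exchanges
    "terrestrial photosynthesis (GPP)": 120.0,
    "ocean-atmosphere exchange": 90.0,
}
human_emissions = 9.0                   # GtC per year, approximate

for name, flux in natural_fluxes.items():
    uncertainty = 0.20 * flux
    within_band = human_emissions < uncertainty
    print(f"{name}: {flux:.0f} +/- {uncertainty:.0f} GtC/yr; "
          f"human emissions inside the error band: {within_band}")
```

On these numbers the human flux is smaller than the error band on either natural flux, which is the sense in which the estimates cannot resolve it directly.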

A good summary description of carbon fluxes and reservoirs is at University of New Hampshire (here). This figure from IPCC AR4 shows how estimates have been developed. Explanation below with my bolds.

IPCC AR4WG1 Figure 7.3. The global carbon cycle for the 1990s, showing the main annual fluxes in GtC yr–1: pre-industrial ‘natural’ fluxes in black and ‘anthropogenic’ fluxes in red (modified from Sarmiento and Gruber, 2006, with changes in pool sizes from Sabine et al., 2004). The net terrestrial loss of –39 GtC is inferred from cumulative fossil fuel emissions minus atmospheric increase minus ocean storage. The loss of –140 GtC from the ‘vegetation, soil and detritus’ compartment represents the cumulative emissions from land use change (Houghton, 2003), and requires a terrestrial biosphere sink of 101 GtC (in Sabine et al., given only as ranges of –140 to –80 GtC and 61 to 141 GtC, respectively; other uncertainties given in their Table 1). Net anthropogenic exchanges with the atmosphere are from Column 5 ‘AR4’ in Table 7.1. Gross fluxes generally have uncertainties of more than ±20% but fractional amounts have been retained to achieve overall balance when including estimates in fractions of GtC yr–1 for riverine transport, weathering, deep ocean burial, etc. ‘GPP’ is annual gross (terrestrial) primary production. Atmospheric carbon content and all cumulative fluxes since 1750 are as of end 1994.
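The residual bookkeeping described in the caption can be reproduced in a few lines. The cumulative inputs below (GtC, 1750–1994) are the values commonly quoted from Sabine et al. (2004), which the caption cites; treat them as illustrative inputs to the arithmetic rather than as quoted from the caption itself:

```python
# Residual bookkeeping behind the AR4 figure caption (all values GtC,
# cumulative 1750-1994, as commonly quoted from Sabine et al. 2004).
fossil_emissions = 244.0   # cumulative fossil-fuel + cement emissions
atmospheric_gain = 165.0   # cumulative increase in atmospheric CO2
ocean_storage = 118.0      # cumulative ocean uptake

# Net terrestrial exchange is whatever is left over:
net_terrestrial = fossil_emissions - atmospheric_gain - ocean_storage
print(net_terrestrial)     # -39.0: land was a net source of 39 GtC

# With land-use change releasing 140 GtC, balancing the books requires
# a residual terrestrial sink elsewhere in the biosphere:
land_use_emissions = 140.0
required_sink = land_use_emissions + net_terrestrial
print(required_sink)       # 101.0 GtC, matching the caption
```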

The diagram shows that anthropogenic emissions of CO2 from burning of fossil fuels cannot be the reason for the increase in atmospheric CO2.

Fact 2. Land-based Carbon Pools Behave Diversely, Defying Global Averaging.

It should be clear from the observational data that Earth’s biosphere is exerting a powerful brake on the rate of rise of the air’s CO2 content, such that the large increases in anthropogenic CO2 emissions of the past two decades have not resulted in any increase in the rate of CO2 accumulation in the atmosphere. The IPCC has yet to acknowledge the existence and sign of this negative feedback, choosing to rely on projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) models. Those models “consistently estimate a positive carbon cycle feedback, i.e. reduced natural sinks or increased natural CO2 sources in response to future climate change.” The models further find “in particular, carbon sinks in tropical land ecosystems are vulnerable to climate change” (p. 21 of the Technical Summary, Second Order Draft of AR5, dated October 5, 2012).

Fluxnet Observation Sites around the world.

Soils are the largest carbon reservoir of the terrestrial carbon cycle. Worldwide they contain three or four times more organic carbon (1500 Gt to 1 m depth, 2500 Gt to 2 m) than vegetation (610 Gt) and twice or three times as much carbon as the atmosphere (750 Gt, see Figure 1) [71]. Carbon storage in soils is the balance between the input of dead plant material (leaf, root litter, and decaying wood) and losses from decomposition and mineralization of organic matter (‘heterotrophic respiration’). Under aerobic conditions, most of the carbon entering the soil returns to the atmosphere by autotrophic root respiration and heterotrophic respiration (together called ‘soil respiration’ or ‘soil CO2 efflux’). The mineralization rate is a function of temperature and moisture levels and chemical environment with factors such as pH, Eh, nitrogen level and the cation exchange capacity of the minerals in the soil affecting the mineralization rate of soil organic carbon (SOC) [72, 73, 74, 75, 76, 77, 78]. Under anaerobic conditions, resulting from constantly high water levels, part of the carbon entering the soil is not fully mineralized and accumulates as peat.
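The reservoir sizes quoted in the paragraph above can be put side by side; the "three or four times" and "twice or three times" ratios depend on which soil depth is used:

```python
# Carbon reservoir sizes as quoted above (GtC).
soil = {"to 1 m": 1500, "to 2 m": 2500}
vegetation = 610
atmosphere = 750

for depth, stock in soil.items():
    print(f"soil ({depth}): {stock / vegetation:.1f}x vegetation, "
          f"{stock / atmosphere:.1f}x atmosphere")
```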

Today, eddy covariance measurements of carbon dioxide and water vapor exchange are being made routinely on all continents.  The flux measurement sites are linked across a confederation of regional networks in North, Central and South America, Europe, Asia, Africa, and Australia, in a global network called FLUXNET.  This global network includes more than eight hundred active and historic flux measurement sites, dispersed across most of the world’s climate space and representative biomes (Figure 1, 2). The Fluxnet portal is (here). Excerpts below with my bolds.

The flux network has also been pivotal in refining the functional response of net and gross carbon dioxide exchange with climatic drivers. One notable observation relates to the sensitivity of ecosystem respiration to temperature: this sensitivity is roughly constant across climate and ecological space, with respiration increasing by a factor of 1.4 for each ten-degree increase in temperature. Another emergent property is the plasticity of the timing of the initiation of the growing season, and how it is triggered when soil temperature matches mean annual air temperature.
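The constant temperature sensitivity reported above is a classic Q10 relationship. A minimal sketch (the function name is illustrative):

```python
# Temperature response of ecosystem respiration with the constant
# sensitivity reported above: a factor of 1.4 per 10 C (Q10 = 1.4).
Q10 = 1.4

def respiration_scaling(delta_t_celsius: float) -> float:
    """Multiplicative change in respiration for a temperature change."""
    return Q10 ** (delta_t_celsius / 10.0)

print(respiration_scaling(10.0))           # 1.4 by construction
print(round(respiration_scaling(5.0), 3))  # ~1.183 for a 5 C warming
```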

Lessons learned from FLUXNET

One of the first and overarching things we have learned is the answer to ‘what are the net and gross annual carbon fluxes at sites across the globe?’ A collation of data has enabled the community to produce a probability distribution of net carbon exchange that is occurring across the network. We see that the central tendency of net carbon exchange is −157±285 g C m−2 y−1 (Figure 1), representing a sink of carbon to the terrestrial biosphere from the atmosphere. We are also able to document the range of carbon uptake by terrestrial ecosystems. We find that the most negative tail of the histogram is about −1000 g C m−2 y−1. The most positive tail of the histogram, representing sites acting as carbon sources, can be as large as +1000 g C m−2 y−1. Of course these values do not consider net biome exchange that would release pulses of carbon from fires or anthropogenic combustion of fossil fuels.
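The quoted spread of −157±285 g C m−2 y−1 can be turned into a rough picture of how many sites are sinks versus sources. Treating the across-site distribution as normal is an assumption of this sketch, not a FLUXNET result:

```python
import math

# FLUXNET net ecosystem exchange quoted above: -157 +/- 285 g C m-2 y-1
# (negative = net sink). Assuming a roughly normal across-site spread
# -- an illustrative assumption -- estimate the share of sink sites.
mu, sigma = -157.0, 285.0

def normal_cdf(x: float, mu: float, sigma: float) -> float:
    """P(X < x) for a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

sink_share = normal_cdf(0.0, mu, sigma)   # P(NEE < 0)
print(f"approximate share of sites acting as net sinks: {sink_share:.0%}")
```

On this crude approximation, roughly seven in ten sites come out as net sinks, consistent with the negative central tendency.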

Fact 3. Fluxes are Dynamic and Difficult to Estimate Reliably.

This summary comes from Helge Hellevang and Per Aagaard in Constraints on natural global atmospheric CO2 fluxes from 1860 to 2010 using a simplified explicit forward model (2015). Excerpt with my bolds.

The relative contribution of the emissions and the efficiency of the biosphere and the ocean to mitigate the increase in atmospheric CO2-concentrations remain highly uncertain. This is demonstrated in chapter six of the latest IPCC report, where we can read that the net land-atmosphere carbon flux in the 1980s was estimated at −0.1 ± 0.8 Gt C/a (negative numbers denote net uptake). These numbers were partly based on estimates of net CO2 releases caused by land use changes (+1.4 ± 0.8 Gt C/a), and a residual terrestrial sink estimated at −1.5 ± 1.1 Gt C/a.

There are globally much data supporting increased uptake of carbon by the ocean mixed layer (shallow surface water), but the global gross ocean-atmosphere fluxes, partly influenced by annual and inter-annual processes, such as El Niño/La Niña events, are nevertheless not easy to estimate. Obtaining global values of the carbon fluxes are further complicated by large local and regional variations in carbon releases and uptake by the terrestrial biosphere.

Because of the close coupling between oxygen and carbon fluxes during photosynthesis and respiration, the tracer APO (Atmospheric Potential Oxygen), in combination with atmospheric CO2 data, is used to obtain the net amount of CO2 being taken up by the oceanic sink. The net amount of carbon being taken up by the terrestrial biosphere can then be found from the residual (difference between carbon accumulated in the atmosphere and amount taken up by the global oceans).

APO values are however not straightforward to estimate, and a recent study suggests that the strength of the terrestrial sink may be significantly lower than found earlier. Moreover, current measurements of the atmospheric O2/N2 ratio and CO2 concentrations may suggest that the amount of oxygen is dropping at a faster rate than calculated from the APO tracer values.
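The residual method described above is easily sketched, and it also shows why an error in the APO-based ocean estimate propagates directly into the inferred land sink. All annual values below are illustrative round numbers, not figures from the paper:

```python
# Sketch of the residual method for partitioning the carbon budget.
# All values are illustrative annual fluxes (GtC/yr); the ocean slot
# is where an APO-based estimate would enter.
fossil_emissions = 9.5   # assumed annual anthropogenic emissions
atmospheric_gain = 4.0   # assumed annual growth of atmospheric CO2
ocean_uptake_apo = 2.5   # assumed ocean sink from the APO tracer

# The terrestrial sink is whatever is left over -- so any bias in the
# APO-based ocean number maps one-to-one into the land estimate.
land_sink = fossil_emissions - atmospheric_gain - ocean_uptake_apo
print(land_sink)   # 3.0
```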

Fact 4. The Carbon Cycle is driven by Temperature more than Human Emissions.

Global warming, human-induced carbon emissions, and their uncertainties
FANG JingYun, ZHU JiangLing, WANG ShaoPeng, YUE Chao & SHEN HaiHua. Excerpts with my bolds.

However, the current global carbon balance is disturbed by two factors: one is anthropogenic carbon emissions from fossil fuel combustion and land use change, which are 9–10 Pg C per year [74], i.e. equal to 1/22–1/26 of the natural emissions from terrestrial and oceanic biospheres; the other is that increasing temperature can produce a positive feedback of carbon emissions from greater soil heterotrophic respiration and from oceanic ecosystems [77, 78]. This increased emission will be retained in the atmosphere and contribute to the increase of atmospheric CO2 concentration if it cannot be absorbed by ecosystems. In this sense, in addition to the anthropogenic carbon emissions, the positive feedback of terrestrial and marine ecosystems to global warming may be another important source of the increasing atmospheric CO2 concentration. The estimation of the global carbon budget indicates that the total of natural and anthropogenic emissions is 250 Pg C per year, whereas the total absorption by the natural ecosystems and the atmosphere is estimated at 230 Pg C per year (Table 2). This generates a gap of 20 Pg C between the global emissions and absorptions, which is twice the current total anthropogenic emissions (9–10 Pg C/yr). Therefore, there is great uncertainty in the sources of the increased atmospheric CO2, and we may not reach the conclusion that the elevating atmospheric CO2 concentration is mainly from human activities.
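The budget gap the authors describe is simple arithmetic; a quick check using the totals quoted from their Table 2:

```python
# Carbon budget gap, using the figures quoted from Fang et al. (Pg C/yr).
total_emissions = 250.0    # natural + anthropogenic emissions
total_absorption = 230.0   # uptake by ecosystems plus atmospheric increase
anthropogenic = 9.5        # midpoint of the quoted 9-10 Pg C/yr

gap = total_emissions - total_absorption
print(f"Budget gap: {gap:.0f} Pg C/yr")                    # 20
print(f"Gap / anthropogenic: {gap / anthropogenic:.1f}x")  # ~2.1x
```

The unaccounted 20 Pg C/yr is roughly double the entire human contribution, which is the basis for the authors' caution about attribution.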

Fact 5. CO2 Residence Times are Far Shorter than IPCC Imagines.

Tom Segalstad describes how alarmist dogma evolved in order to explain away contradictory facts. His paper is Carbon cycle modelling and the residence time of natural and anthropogenic atmospheric CO2: on the construction of the “Greenhouse Effect Global Warming” dogma. Excerpts with my bolds.

Both radioactive and stable carbon isotopes show that the real atmospheric CO2 residence time (lifetime) is only about 5 years, and that the amount of fossil-fuel CO2 in the atmosphere is maximum 4%. Any CO2 level rise beyond this can only come from a much larger, but natural, carbon reservoir with much higher 13-C/12-C isotope ratio than that of the fossil fuel pool, namely from the ocean, and/or the lithosphere, and/or the Earth’s interior.

The apparent annual atmospheric CO2 level increase, postulated to be anthropogenic, would constitute only some 0.2% of the total annual amount of CO2 exchanged naturally between the atmosphere and the ocean plus other natural sources and sinks. It is more probable that such a small ripple in the annual natural flow of CO2 would be caused by natural fluctuations of geophysical processes.

13-C/12-C isotope mass balance calculations show that IPCC’s atmospheric CO2 residence time of 50-200 years makes the atmosphere too light (50% of its current CO2 mass) to fit its measured 13-C/12-C isotope ratio. This explains why IPCC’s wrong model creates its artificial 50% “missing sink”. IPCC’s inexplicable “missing sink” of about 3 giga-tonnes carbon annually should have led all governments to reject IPCC’s model.
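Segalstad’s ~5-year residence time follows from the ratio of the atmospheric carbon reservoir to the gross exchange flux. The reservoir and flux values below are assumed round numbers in line with commonly cited magnitudes, not Segalstad’s exact inputs:

```python
# Turnover (residence) time = reservoir size / gross exchange flux.
# Assumed round numbers: ~750 Pg C in the atmosphere, ~150 Pg C/yr
# exchanged with the ocean and terrestrial biosphere combined.
atm_reservoir = 750.0   # Pg C in the atmosphere (assumed)
gross_flux = 150.0      # Pg C/yr gross natural exchange (assumed)

residence_time = atm_reservoir / gross_flux
print(f"Residence time: {residence_time:.0f} years")  # 5
```

The IPCC’s much longer 50-200 year figure refers to the adjustment time of a concentration perturbation, not the turnover time of individual molecules; conflating the two is part of what the excerpt disputes.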

Tom V. Segalstad has conducted university research, publishing, and teaching in geochemistry, mineralogy, petrology, volcanology, structural geology, ore geology, and geophysics at the University of Oslo, Norway, and the Pennsylvania State University, USA. Some images here are from Tom Segalstad’s presentation Carbon isotope mass balance modelling of atmospheric vs. oceanic CO2.

Segalstad was a reviewer for IPCC assessment reports in the early days before observational facts were set aside in favor of the agenda and climate models tuned to suit the narrative. His whimsical comment on the experience:

Footnote:

For more on CO2 interchange between ocean and air, see Fear Not CO2: The Real Chemistry

For more on atmospheric CO2 processes, see Fearless Physics from Dr. Salby

For more on temperature impacting terrestrial CO2 sources, see Not Worried About CO2