Astronomy Is Science. Climatology Is Not.

A nice tongue-in-cheek essay appeared in The Atlantic: The Eclipse Conspiracy: Something Doesn't Add Up.

It is a whimsical spoof of anyone skeptical that the solar eclipse will happen tomorrow. (Excerpts below.)

Meanwhile the scientists tell us we can’t look at it without special glasses because “looking directly at the sun is unsafe.”

That is, of course, unless we wear glasses that are on a list issued by these very same scientists. Meanwhile, corporations like Amazon are profiting from the sale of these eclipse glasses. Is anyone asking how many of these astronomers also, conveniently, belong to Amazon Prime?

Let’s follow the money a little further. Hotels along the “path of totality”—a region drawn up by Obama-era NASA scientists—have been sold out for months. Some of those hotels are owned and operated by large multinational corporations. Where else do these hotels have locations? You guessed it: Washington, D.C.

In fact the entire politico-scientifico-corporate power structure is aligned behind the eclipse. This includes the mainstream media. How many news stories have you read about how the eclipse won’t happen?

That’s a great example of “conspiracy ideation” and a subtle dig at people who don’t trust NASA on climate matters. In fact, many of the real NASA scientists are extremely critical of NASA’s participation in climate activism.  Journalists or Senators who raise NASA as evidence of climate change should be directed to The Right Climate Stuff, where esteemed NASA scientists give plenty of good reasons to doubt NASA on this topic.

Bottom Line: A Real Science Makes Predictions that Come True.

The article, perhaps unwittingly, shows why Astronomy is a real science we can trust while Climatology is faith-based, like Astrology. When the eclipse happens, it confirms Astronomers have knowledge about the behavior of planetary bodies. When numerous predictions of climate catastrophes are unfulfilled, it demonstrates scientists’ lack of knowledge about our climate system. Anyone claiming certainty about the climate is exercising their religious freedom, but not doing science.


Global Warming Fails to Convince

I happened to read an article at Real Clear Science, An Inconvenient Truth About ‘An Inconvenient Truth’ by Eric Merkley and Dominik Stecula (August 18, 2017). The article itself is of middling interest, mainly being a lament that Al Gore became the leading promoter of public awareness about the dangers of global warming. The authors contend that Republicans were predetermined to reject claims from such a high-profile liberal Democrat.

It is neither new nor interesting to hear warmists dismiss skeptics as simplistic right-wingers having a knee-jerk reaction to global warming claims. But reading the comment thread was illuminating, and it undercut the presumptions of the article. Instead of leftist knee-jerkers swearing allegiance to climatism, several scientists posted comments hitting the credibility problem at its core.

Two comments reprinted below deserve a wide audience for expressing what many think but have not expressed so clearly.

@Gabe Kesseru

I spent an entire career in applied sciences and know the difference between true science and lesser areas of study. Climatology is one of the latter. It is mostly a field of historical trend analysis trying desperately to be a field of trend prediction (and doing very poorly at that).

Climatologists have done themselves a disservice by calling themselves scientists, since by doing so we expect them to use the scientific method. The scientific method will always be impossible in climatology, since its most important step is experimentation to test the hypothesis, and experimentation is impossible when we cannot reproduce a laboratory equivalent of the earth’s climate over centuries.

Secondarily, science requires that we gather data to laboratory accuracy levels, which again is impossible with haphazard worldwide thermometer measurements originally meant to measure weather at casual levels of accuracy and repeatability.

@Dan Ashley · Northcentral University

Dan Ashley here. PhD statistics, PhD Business.

I am not a climate, environment, geology, weather, or physics expert. However, I am an expert on statistics. So, I recognize bad statistical analysis when I see it. There are quite a few problems with the use of statistics within the global warming debate. The use of Gaussian statistics is the first error. In his first movie Gore used a linear regression of CO2 and temperature. If he had done the same regression using the number of zoos in the world, or the worldwide use of atomic energy, or sunspots, he would have the same result. A linear regression by itself proves nothing.
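The commenter’s regression point can be illustrated with a short sketch (the series below are invented for illustration; they are not real CO2 or zoo data): any two steadily rising series correlate strongly, so a high R² from a linear regression of temperature on CO2, by itself, establishes nothing about causation.

```python
# Two unrelated but steadily rising series (illustrative numbers only).
co2_like = [315 + 0.5 * t + 0.01 * t * t for t in range(60)]   # a rising curve
zoos_like = [100 + 3 * t for t in range(60)]                    # any other rising trend

def pearson_r(x, y):
    """Plain Pearson correlation, no external libraries."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

r = pearson_r(co2_like, zoos_like)
print(f"R^2 = {r * r:.3f}")  # close to 1.0, yet the series share no causal link
```

The fit is nearly perfect for any pair of monotone trends, which is exactly why a regression alone cannot distinguish CO2 from zoos or sunspots.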

The theory that CO2 is a greenhouse gas has been proven correct in a small greenhouse only. As a matter of fact, plants like higher CO2 and it is frequently pumped into greenhouses because of that. There has never been a definitive experiment regarding CO2, at or near the concentrations in our atmosphere. This theory actually has much less statistical support than the conspiracy theories regarding JFK’s assassination.

Gaussian statistics REQUIRE the events being analyzed to be both independent and random. The temperatures experienced in one part of the world are dependent on temperatures in other locales. The readings are not independent. A better statistical method would be Mandelbrotian (fractal). Mandelbrotian statistics are not merely “fat tailed” statistics.

A more problematic issue with the data is that it has been adjusted. Data adjustments are frequently needed – for example, if a measuring device fails. However, 100% of the data adjustments used are in favor of proving global warming. 100%. Not 100% minus one adjustment. Not nearly 100%. 100% – that is ALL – of the adjustments were in one direction only. Any student who put data like that in a PhD dissertation would never receive a doctoral degree.
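The all-one-direction claim can be framed probabilistically (the adjustment counts below are hypothetical, chosen only to show the scale): if each of n independent adjustments were equally likely to nudge the record either way, the probability that all n land on the same side is (1/2)^(n-1), which collapses toward zero very quickly.

```python
# Chance that n direction-neutral adjustments all fall on the same side.
def prob_all_same_direction(n: int) -> float:
    # The first adjustment sets the side; each of the remaining n-1 must match it.
    return 0.5 ** (n - 1)

for n in (10, 50, 100):  # hypothetical adjustment counts
    print(f"n={n}: {prob_all_same_direction(n):.3e}")
```

Even at a modest n = 10 the probability is under 0.2%, which is the statistical intuition behind the commenter’s objection.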

One study published showed parts of the Earth where warming was occurring faster than other parts of the globe. The study claimed to be of data solely from satellites. The study identified several areas (Gambia for one) which have greater warming than other areas. Unfortunately, in three of those areas there have been no climate satellite observations for years.

The statements that claim “less arctic ice in recorded history” are equally spurious. We started gathering data on that in 1957 with the first satellite fly overs. On this issue “recorded history” is a very short time period.

Some geologist friends told me that a significant amount of Earth’s heat comes from the Earth’s hot core. They further stated that they do not know what percentage of heat that is. They do know it is probably over 20% and probably less than 70%. While either of those extremes seems unlikely to me, remember that I am not a geologist.

As to rising oceans, that should be measured accurately. Measuring it with a stick stuck in the sand is inappropriate. Geologists tell me that the land is shifting and moving. Measuring sea level against the gravitational center of the Earth is the only accurate way. However, we do not know how to do that. As a matter of fact, we don’t know precisely where the gravitational center of the Earth is. (Any physicists around who want to explain the two-body and three-body problem as it relates to the Earth, Moon, and Sun, please do so.)

So, according to climate scientists the world is warming up. They may be correct, they may be incorrect. However, they have been unable to support their thesis via the use of statistics.

I personally see no reason to disassemble the world’s economic systems over an unproven, and somewhat implausible theory.

Summary

The scientific claims made in Gore’s movies do not stand up to scrutiny.  Changing the salesman is not going to make the pitch any more believable.

See also

Reasoning About Climate

Big Al’s Sequel Flawed at its Core

Beware the Arctic Storms of August

The Great Arctic Cyclone August 2012

The next two weeks will determine where this year’s minimum will rank compared to recent years. And much will depend upon storm activity which breaks up ice edges, compacts ice chunks and transports ice out through Fram Strait where it melts in the warmer Norwegian Sea.

We have two recent examples in 2012 and 2016. The Great Arctic Cyclone of 2012 produced the lowest minimum of the decade. The NASA photo of the storm is above.  The image below presents the impact of the 2012 storm upon ice extents from mid-August to the mid-September annual minimum.

In contrast, a more normal, non-stormy year is represented by 2014.  Progression of ice extents for 2014 is shown below.

Then again in 2016 several sizable Arctic storms struck late August.  The image below shows cyclonic winds (center left) over the Arctic Ocean on August 22, 2016.

The storms’ effect on 2016 sea ice appears in the image below.

Summary

Arctic ice extents these three years were not far apart in mid-August, but they ended the melt season quite differently.  The Great Arctic Cyclone made 2012 the lowest of the decade, bottoming out at 3.4M km2.  The 2016 August storms also produced a low annual minimum of 4.2M km2.  In contrast, the absence of major storms in 2014 resulted in a much higher September minimum of 5.13M km2.  All of these compare to the 2007 minimum of 4.05M km2, a year with no major storms reported.
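The minima quoted above can be lined up in a quick sketch (the values in million km² are exactly those cited in this post) to make the ranking explicit:

```python
# September sea-ice minima (M km^2) as cited above.
minima = {2012: 3.4, 2016: 4.2, 2014: 5.13, 2007: 4.05}

# Rank years from lowest to highest minimum extent.
for year, extent in sorted(minima.items(), key=lambda kv: kv[1]):
    print(year, extent)
```

The stormy years 2012 and 2016 bracket 2007, while calm 2014 sits well above all three.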

It is difficult to extract a climate signal out of fluctuating ice extent minimums when they are so dependent on the vagaries of weather events.  It also means that anything can happen in the next few weeks.

August 11 Arctic Ice Report

Arctic sunset will occur in the coming weeks. Nunavut is already experiencing civil twilight, meaning the sun stays less than 6 degrees below the horizon during the night.

The extent of Arctic ice fell to a new wintertime low in March 2017, but springtime ice persisted, and extents since June have hovered around the decadal average.  The chart below shows the last 27 days through yesterday, August 11, 2017.

For this period 2017 was mostly average or higher, continuing into August. This year is now almost 300k km2 greater than 2016 and exceeds 2007 by 600k km2.  SII 2017 is also 600k km2 lower.

As we shall see, this year’s extents are in surplus on the Atlantic side, offset by deficits on the Pacific side.  The image compares day 223 with the same day in 2007.

The Table compares 2017 day 223 ice extents with the decadal average and 2007.

Region                               2017 Day 223   Day 223 Average   2017-Ave.   2007 Day 223   2017-2007
(0) Northern_Hemisphere                   6295153           6338630      -43477        5690646      604507
(1) Beaufort_Sea                           646803            740583      -93779         767724     -120921
(2) Chukchi_Sea                            342601            479258     -136657         261771       80831
(3) East_Siberian_Sea                      411714            663564     -251849         207590      204124
(4) Laptev_Sea                             430221            329873      100349         310764      119458
(5) Kara_Sea                               140411            123400       17011         215854      -75443
(6) Barents_Sea                             60001             28883       31118          15996       44005
(7) Greenland_Sea                          236735            253185      -16450         286393      -49658
(8) Baffin_Bay_Gulf_of_St._Lawrence        207230             71011      136219          83942      123288
(9) Canadian_Archipelago                   527348            439371       87977         361883      165465
(10) Hudson_Bay                             70437             87621      -17184          94262      -23825
(11) Central_Arctic                       3220493           3120642       99851        3083211      137282
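The two difference columns can be recomputed directly from the extent columns; a minimal check using the Northern Hemisphere row as quoted (some regional rows differ by 1 km² from rounding of the averages):

```python
# Recompute the difference columns for the Northern Hemisphere row (km^2).
extent_2017 = 6_295_153   # 2017, day 223
extent_avg  = 6_338_630   # day 223 decadal average
extent_2007 = 5_690_646   # 2007, day 223

diff_ave  = extent_2017 - extent_avg    # the 2017-Ave. column
diff_2007 = extent_2017 - extent_2007   # the 2017-2007 column
print(diff_ave, diff_2007)
```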

Deficits to average are in the BCE region (Beaufort, Chukchi and East Siberian seas), and surpluses appear almost everywhere else.  Ice is particularly strong in Laptev, Baffin, the CAA and the Central Arctic.

The graph below shows that Barents this year continues to be above average but has fallen behind the record year of 2014. After pausing at 70k km2, it dipped to 50k km2, then bounced back to 60k km2 yesterday.

The black line is average for the last 11 years.  2007 in purple appears close to an average year.  2014 had the highest annual extent in Barents Sea, due to higher and later maximums, holding onto ice during the summer, and recovering quickly.  In contrast, 2016 was the lowest annual extent, melting out early and recovering later.  2017 in blue started out way behind, but grew rapidly to reach average, and then persisted longer to exceed even 2014.  It will be important to see when the recovery of ice begins.

For more on why the Barents Sea matters, see Barents Icicles.


Big Al’s Sequel: Flawed at its Core

Fortunately, box-office numbers show that few besides die-hard Gore fans are subjecting themselves to the Inconvenient Sequel. When people go to see cli-sci-fi (Climate Science Fiction) movies like Waterworld or The Day After Tomorrow, they know in advance it will be someone’s imaginary portrayal of an undesirable future. The difference with Al Gore, and also with the writers of the draft US Climate Assessment, is their claim that their imaginings are “the Truth.”

Despite the low box-office numbers, the media will inundate us with flawed messages from the film, so this post is required for protection around the office water cooler or the kitchen table. Text below in italics is excerpted from Alex Epstein’s article in the Financial Post, Al Gore can’t deny that his climate crusade involves great suffering: Gore has to make the case that climate dangers warrant so much human misery.

Good Reasons to reject Al Gore’s alarms.

The running theme throughout An Inconvenient Sequel is that Gore’s first film was even more right than he expected. The movie begins with defenders of fossil fuels mocking or ignoring the dramatic predictions of An Inconvenient Truth. Leaving aside a heroic (and highly disputed) portrayal of Gore rescuing the Paris climate accord, the rest of the movie focuses on vindicating Gore’s two chief predictions: 1) That we could replace fossil fuels with cheap solar- and wind-powered “renewables”; and 2) that continued use of fossil fuels would lead to catastrophic temperature rises, catastrophic sea-level rises, catastrophic flooding, catastrophic drought, catastrophic storms, and catastrophic disease proliferation.

Let’s deal first with Gore’s second supposition.

Alarmists Substitute Models for Observations and Data

Since the last IPCC report (AR5), activists no longer respect what consensus scientists say. Observations and data are set aside, and only alarming projections from models count. As we know, computer simulations of the climate system are flawed, running hotter than even the adjusted global datasets. And since model outputs can only project modelers’ assumptions into the future, the models cannot prove the validity of those assumptions.

Berkeley physicist Richard Muller gives a mainstream scientist view of Gore’s claims.

“The problem is not with the survey, which asked a very general question. The problem is that many writers (and scientists!) look at that number and mis-characterize it. The 97% number is typically interpreted to mean that 97% accept the conclusions presented in An Inconvenient Truth by former Vice President Al Gore. That’s certainly not true; even many scientists who are deeply concerned by the small global warming (such as me) reject over 70% of the claims made by Mr. Gore in that movie (as did a judge in the UK; see footnote below).”

“I like to ask scientists who “believe” in global warming what they think of the data. Do they believe hurricanes are increasing? Almost never do I get the answer “Yes, I looked at that, and they are.” Of course they don’t say that, because if they did I would show them the actual data! Do they say, “I’ve looked at the temperature record, and I agree that the variability is going up”? No. Sometimes they will say, “There was a paper by Jim Hansen that showed the variability was increasing.” To which I reply, “I’ve written to Jim Hansen about that paper, and he agrees with me that it shows no such thing. He even expressed surprise that his paper has been so misinterpreted.”

“A really good question would be: “Have you studied climate change enough that you would put your scientific credentials on the line that most of what is said in An Inconvenient Truth is based on accurate scientific results?” My guess is that a large majority of the climate scientists would answer no to that question, and the true percentage of scientists who support the statement I made in the opening paragraph of this comment, that true percentage would be under 30%. That is an unscientific guestimate, based on my experience in asking many scientists about the claims of Al Gore.”  Full text at Meet Richard Muller, Lukewarmist

Our Actual Climate is Mild and Not Dangerous

Nothing out of the ordinary is happening to our weather and climate, despite lots of claims otherwise. Gore’s sequel is long on anecdotes and fears, but lacks any references to the statistics contradicting him. Recent decades have been remarkably benign and agriculture is booming. IPCC scientists wrote that no evidence yet exists to connect extreme weather with human activities.  Alex Epstein:

Gore and others should be free to make the case that the danger of greenhouse gases is so serious as to warrant that scale of human misery. But they should have to quantify and justify the magnitude of climate danger. And that brings us to the truth about climate.

The overall trend in climate danger is that it is at an all-time low. The Emergency Events Database (EM-DAT) shows 6,114 climate-related deaths in 2016. In other recent years the numbers have maxed out in the tens of thousands. Compare this to the 1930s when, adjusted for population, climate-related deaths hit the 10-million mark several times.

The most significant cause of our radically reduced climate danger is industrial development, which takes a naturally dangerous climate and makes it unnaturally safe. And industrial development is driven by cheap, plentiful, reliable energy — which, today, overwhelmingly means fossil fuels. Climate will always be dangerous so priority number one is to have the energy and development to tame it. Modern irrigation, residential heating and air conditioning have made once uninhabitable places perfectly comfortable.

Controlling Human CO2 Emissions Will Not Change the Weather

The really inconvenient truth is that governments are not able to ensure favorable weather for humans. Nothing yet attempted, from corrupt carbon markets, to biofuels, to renewable electrical power, to carbon taxes has done anything beyond enriching cronies and filling government coffers.

Alex Epstein details Gore’s misdirecting us to renewables as our salvation.

Some of his anecdotes are meant to prove that cheap solar and wind are, as 2006 Gore prophesied, quickly dominating the world’s energy supply and, as 2006 Gore also warned us, that our rapidly warming climate is killing more and more people each year. But he has not given us the whole picture.

Take the rising dominance of solar and wind, which is used to paint supporters of fossil fuels as troglodytes, fools, and shills for Big Oil. The combined share of world energy consumption from renewables is all of two per cent. And it’s an expensive, unreliable, and therefore difficult-to-scale two per cent.

Because solar and wind are “unreliables,” they need to be backed up by reliable sources of power, usually fossil fuels, or sometimes non-carbon sources including nuclear and large-scale hydro power (all of which Gore and other environmentalists refuse to support). This is why every grid that incorporates significant solar and wind has more expensive electricity. Germans, on the hook for Chancellor Angela Merkel’s self-righteous anti-carbon commitments, are already paying three times the rates for electricity that Americans do.

Stories about “100-per-cent renewable” locations like Georgetown, Tex. are not just anecdotal evidence, they are lies. The Texas grid from which Georgetown draws its electricity is comprised of 43.7 per cent natural gas, 28.8 per cent coal, 12 per cent nuclear, and only 15.6 per cent renewable. Using a virtue-signalling gimmick pioneered by Apple, Facebook, and Google, Georgetown pays its state utility to label its grid electricity “renewable” — even though it draws its power from that fossil-fuel heavy Texas grid — while tarring others on the grid as “non-renewable.”

If we look at the overall trends instead of engaging in anecdotal manipulation we see that fossil fuel energy is the fastest-growing energy source in the world — still. Fossil fuels have never been more vital to human flourishing. There are 1,600 coal plants planned for the near future, which could increase international coal capacity 43 per cent. Advances in technology are making fossil fuels cleaner, safer, and more efficient than ever. To reduce their growth let alone to radically restrict their use — which is what Gore advocates — means forcing energy poverty on billions of people.

Conclusion

Gore’s Inconvenient Sequel gives a biased, self-serving, and convenient picture of fossil fuels and climate — convenient for Gore’s legacy, that is, but inconvenient for the billions his energy poverty policies will harm. As citizens, we must start demanding responsible thought leaders who will give us the whole picture that life-and-death energy and climate decisions require.

Note the contrast between Al Gore’s propaganda and Richard Lindzen’s short video:

Footnote:

Errors in “An Inconvenient Truth” Highlighted by UK High Court Judge Michael Burton:

1.) The sea level will rise up to 20 feet because of the melting of either West Antarctica or Greenland in the near future. (This “Armageddon scenario” would only take place over thousands of years, the judge wrote.)

2.) Some low-lying Pacific islands have been so inundated with water that their citizens have all had to evacuate to New Zealand. (“There is no evidence of any such evacuation having yet happened.”)

3.) Global warming will shut down the “ocean conveyor,” by which the Gulf Stream moves across the North Atlantic to Western Europe. (According to the Intergovernmental Panel on Climate Change, “it is very unlikely that the Ocean Conveyor will shut down in the future…”)

4.) There is a direct coincidence between the rise in carbon dioxide in the atmosphere and the rise in temperature over the last 650,000 years. (“Although there is general scientific agreement that there is a connection, the two graphs do not establish what Mr. Gore asserts.”)

5.) The disappearance of the snows on Mount Kilimanjaro is expressly attributable to global warming. (“However, it is common ground that the scientific consensus is that it cannot be established that the recession of snows on Mount Kilimanjaro is mainly attributable to human-induced climate change.”)

6.) The drying up of Lake Chad is a prime example of a catastrophic result of global warming. (“It is generally accepted that the evidence remains insufficient to establish such an attribution” and may be more likely the effect of population increase, overgrazing and regional climate variability.)

7.) Hurricane Katrina and the consequent devastation in New Orleans is because of global warming. (“It is common ground that there is insufficient evidence to show that.”)

8.) Polar bears are drowning because they have to swim long distances to find ice. (“The only scientific study that either side before me can find is one, which indicates that four polar bears have recently been found drowned because of a storm.”)

9.) Coral reefs all over the world are bleaching because of global warming and other factors. (“Separating the impacts of stresses due to climate change from other stresses, such as overfishing and pollution, was difficult.”)


Tropics Lead Ocean Cooling

July Sea Surface Temperatures (SSTs) are now available, and we can see further ocean cooling led by plummeting temperatures in the Tropics and SH, continuing the downward trajectory from the previous 12 months.

HadSST is generally regarded as the best of the global SST data sets, and so the temperature story here comes from that source, the latest version being HadSST3.

The chart below shows the last two years of SST monthly anomalies as reported in HadSST3 including July 2017.

In May, despite a slight rise in the Tropics, declines in both hemispheres and globally caused SST cooling to resume after an upward bump in April.  Now in July a large drop shows in both the Tropics and the SH, which have declined for the last four months.  Meanwhile the NH is peaking in July as usual, but well down from the previous July.  The net of all this is a slightly lower Global anomaly, with likely additional cooling ahead as the Tropics and SH hit new lows for this period.

Note that the higher temps in 2015 and 2016 were due first of all to a sharp rise in Tropical SST, beginning in March 2015, peaking in January 2016, and steadily declining back to its starting level. Secondly, the Northern Hemisphere added two bumps on the shoulders of the Tropical warming, with peaks in August of each year. The global release of heat was not dramatic, because the Southern Hemisphere offset the Northern one. The Global anomaly for July 2017 closely matches April 2015; however, SH and the Tropics are lower now and trending down, compared to an upward trend in 2015.

We have seen lots of claims about the temperature records for 2016 and 2015 proving dangerous man-made warming.  At least one senator stated as much in a confirmation hearing.  Yet HadSST3 data for the last two years show how clearly the oceans govern global average temperatures.

USS Pearl Harbor deploys Global Drifter Buoys in Pacific Ocean

The best context for understanding these two years comes from the world’s sea surface temperatures (SST), for several reasons:

  • The ocean covers 71% of the globe and drives average temperatures;
  • SSTs have a constant water content (unlike air temperatures), so they give a better reading of heat content variations;
  • A major El Nino was the dominant climate feature these years.

Solar energy accumulates massively in the ocean and is variably released during circulation events.


Gamechanger: DC Appeals Court Denies EPA Climate Rules

A major clarification came today from the DC Court of Appeals, ordering EPA (and thus the Executive Branch bureaucracy) to defer to Congress regarding regulation of substances claimed to cause climate change.  While the issue and arguments are somewhat obscure, the clarity of the ruling is welcome.  Basically, the EPA under Obama attempted to use its authority over ozone-depleting substances to regulate HFCs, claiming them as greenhouse gases.  The judges decided that was a stretch too far.

The Court Decision August 8, 2017

The EPA enacted the rule in question in 2015, responding to research showing hydrofluorocarbons, or HFCs, contribute to climate change.

The D.C. Circuit Court of Appeals’ 2-1 decision said EPA does not have the authority to enact a 2015 rule-making ending the use of hydrofluorocarbons commonly found in spray cans, automobile air conditioners and refrigerators. The three-judge panel said that because HFCs are not ozone-depleting substances, the EPA could not use a section of the Clean Air Act targeting those chemicals to ban HFCs.

“Indeed, before 2015, EPA itself maintained that Section 612 did not grant authority to require replacement of non ozone-depleting substances such as HFCs,” the court wrote.

“EPA’s novel reading of Section 612 is inconsistent with the statute as written. Section 612 does not require (or give EPA authority to require) manufacturers to replace non ozone-depleting substances such as HFCs,” said the opinion, written by Judge Brett Kavanaugh.

Contextual Background from the Court Document On Petitions for Review of Final Action by the United States Environmental Protection Agency  Excerpts below (my bolds)

In 1987, the United States signed the Montreal Protocol. The Montreal Protocol is an international agreement that has been ratified by every nation that is a member of the United Nations. The Protocol requires nations to regulate the production and use of certain ozone-depleting substances.

As a result, in the 1990s and 2000s, many businesses stopped using ozone-depleting substances in their products. Many businesses replaced those ozone-depleting substances with HFCs. HFCs became prevalent in many products. HFCs have served as propellants in aerosol spray cans, as refrigerants in air conditioners and refrigerators, and as blowing agents that create bubbles in foams.

In 2013, President Obama announced that EPA would seek to reduce emissions of HFCs because HFCs contribute to climate change.

Consistent with the Climate Action Plan, EPA promulgated a Final Rule in 2015 that moved certain HFCs from the list of safe substitutes to the list of prohibited substitutes. . . In doing so, EPA prohibited the use of certain HFCs in aerosols, motor vehicle air conditioners, commercial refrigerators, and foams – even if manufacturers of those products had long since replaced ozone-depleting substances with HFCs. Id. at 42,872-73.

Therefore, under the 2015 Rule, manufacturers that used those HFCs in their products are no longer allowed to do so. Those manufacturers must replace the HFCs with other substances that are on the revised list of safe substitutes.

In the 2015 Rule, EPA relied on Section 612 of the Clean Air Act as its source of statutory authority. EPA said that Section 612 allows EPA to “change the listing status of a particular substitute” based on “new information.” Id. at 42,876. EPA indicated that it had new information about HFCs: Emerging research demonstrated that HFCs were greenhouse gases that contribute to climate change. See id. at 42,879. EPA therefore concluded that it had statutory authority to move HFCs from the list of safe substitutes to the list of prohibited substitutes. Because HFCs are now prohibited substitutes, EPA claimed that it could also require the replacement of HFCs under Section 612(c) of the Clean Air Act even though HFCs are not ozone-depleting substances.

EPA’s current reading stretches the word “replace”  beyond its ordinary meaning. . .
Under EPA’s current interpretation of the word “replace,” manufacturers would continue to “replace” an ozone-depleting substance with a substitute even 100 years or more from now. EPA would thereby have indefinite authority to regulate a manufacturer’s use of that substitute. That boundless interpretation of EPA’s authority under Section 612(c) borders on the absurd.

In any event, the legislative history strongly supports our conclusion that Section 612(c) does not grant EPA continuing authority to require replacement of non-ozone-depleting substitutes. . . In short, although Congress contemplated giving EPA broad authority under Title VI to regulate the replacement of substances that contribute to climate change, Congress ultimately declined.

However, EPA’s authority to regulate ozone-depleting substances under Section 612 and other statutes does not give EPA authority to order the replacement of substances that are not ozone depleting but that contribute to climate change. Congress has not yet enacted general climate change legislation. Although we understand and respect EPA’s overarching effort to fill that legislative void and regulate HFCs, EPA may act only as authorized by Congress. Here, EPA has tried to jam a square peg (regulating non-ozone depleting substances that may contribute to climate change) into a round hole (the existing statutory landscape).

The Supreme Court cases that have dealt with EPA’s efforts to address climate change have taught us two lessons that are worth repeating here. See, e.g., Utility Air Regulatory Group v. EPA, 134 S. Ct. 2427 (2014). First, EPA’s well intentioned policy objectives with respect to climate change do not on their own authorize the agency to regulate. The agency must have statutory authority for the regulations it wants to issue. Second, Congress’s failure to enact general climate change legislation does not authorize EPA to act. Under the Constitution, congressional inaction does not license an agency to take matters into its own hands, even to solve a pressing policy issue such as climate change.

Footnote:  Looks like some judges found their big boy pants and applied US constitutional separation of powers against runaway executive climate actions.  Would such a decision have come without a skeptical President?

Could this be the first breach in the wall of unproven, unwarranted, federally funded climate activism?

Water rushes over damaged primary spillway at Oroville Dam in Northern California

Decoding Climate News


Definition of “Fake News”: When reporters state their own opinions instead of bearing witness to observed events.

Journalism professor David Blackall provides a professional context for the investigative reporting I’ve been doing on this blog, along with other bloggers interested in science and climate change/global warming. His peer-reviewed paper is Environmental Reporting in a Post Truth World. The excerpts below show his advice is good not only for journalists but for readers.  h/t GWPF, Pierre Gosselin

Overview: The Grand Transnational Narrative

The dominance of a ‘grand transnational narrative’ in environmental discourse (Mittal, 2012) over other human impacts, like deforestation, is problematic and is partly due to the complexities and overspecialization of climate modelling. A strategy for learning, therefore, is to instead focus on the news media: it is easily researched and it tends to act ‘as one driving force’, providing citizens with ‘piecemeal information’, making it impossible to arrive at an informed position about science, society and politics (Marisa Dispensa et al., 2003). After locating problematic news narratives, Google Scholar can then be employed to locate recent scientific papers that examine, verify or refute news media discourse.

The science publication Nature Climate Change this year published a study demonstrating that Earth has warmed substantially less this century than computer-generated climate models predict.

Unfortunately for public knowledge, such findings don’t appear in the news. Sea levels too have not been obeying the ‘grand transnational narrative’ of catastrophic global warming. Sea levels around Australia 2011–2012 were measured with the most significant drops in sea levels since measurements began. . .The 2015–2016 El-Niño, a natural phenomenon, drove sea levels around Indonesia to low levels such that coral reefs were bleaching. The echo chamber of news repeatedly fails to report such phenomena and yet many studies continue to contradict mainstream news discourse.

I will be arguing that a number of narratives need correction, and while I accept that the views I am about to express are not universally held, I believe that the scientific evidence does support them.

The Global Warming/Climate Change Narrative

The primary narrative in need of correction is that global warming alone (Lewis, 2016), which induces climate change (climate disruption), is due to the increase in global surface temperatures caused by atmospheric greenhouse gases. Instead, there are many factors arising from human land use (Pielke et al., 2016), which it could be argued are responsible for climate change, and some of these practices can be mitigated through direct public action.

Global warming is calculated by measuring average surface temperatures over time. While it is easy to argue that temperatures are increasing, it cannot be argued, as some models contend, that the increases are uniform throughout the global surface and atmosphere. Climate science is further problematized by its own scientists, in that computer modelling, as one component of this multi-faceted science, is privileged over other disciplines, like geology.

Scientific uncertainty arises from ‘simulations’ of climate because computer models are failing to match the actual climate. This means that computer models are unreliable in making predictions.

Published in the eminent journal Nature (Ma et al., 2017), ‘Theory of chaotic orbital variations confirmed by Cretaceous geological evidence’ provides excellent stimulus material for student news writing. The paper discusses the severe wobbles in planetary orbits, and these affect climate. The wobbles are reflected in geological records and show that the theoretical climate models are not rigorously confirmed by these radioisotopically calibrated and anchored geological data sets. Yet popular discourse presents Earth as harmonious: temperatures, sea levels and orbital patterns all naturally balanced until global warming affects them, a mythical construct. Instead, the reality is natural variability, the interactions of which are yet to be measured or discovered (Berger, 2013).

In such a (media) climate, it is difficult for the assertion to be made that there might be other sources, than a nontoxic greenhouse gas called carbon dioxide (CO2), that could be responsible for ‘climate disruption’. A healthy scientific process would allow such a proposition. Contrary to warming theory, CO2 levels have increased, but global average temperatures remain steady. The global average temperature increased from 1983 to 1998; then, it flat-lined for nearly 20 years. James Hansen’s Hockey Stick graph, with soaring and catastrophic temperatures, simply did not materialize.

Using global carbon budget estimates, ground, atmospheric and satellite observations, and multiple global vegetation models, Keenan et al. (2016) found that there is also now a pause in the growth rate of atmospheric CO2. They attribute this to increases in terrestrial sinks over the last decade, where forests consume the rising atmospheric CO2 and rapidly grow, the net effect being a slowing in the rate of warming from global respiration.

Contrary to public understanding, higher temperatures in cities are due to a phenomenon known as the ‘urban heat effect’ (Taha, 1997; Yuan & Bauer, 2007). Engines, air conditioners, heaters and heat absorbing surfaces like bitumen radiate heat energy in urban areas, but this is not due to the greenhouse effect. Problematic too are data sets like ocean heat temperatures, sea-ice thickness and glaciers: all of which are varied, some have not been measured or there are insignificant measurement time spans for the data to be reliable.

Contrary to news media reports, some glaciers throughout the world (Norway [Chinn et al., 2005] and New Zealand [Purdie et al., 2008]) are growing, while others shrink (Paul et al., 2007).

Conclusion

This is clearly a contentious topic. There are many agendas at play, with careers at stake. My view represents one side of the debate: it is one I strongly believe in, and is, I contend, supported by the science around deforestation, on the ground, rather than focusing almost entirely on atmosphere. However, as a journalism educator, I also recognize that my view, along with others, must be open to challenge, both within the scientific community and in the court of public opinion.

As a journalism educator, it is my responsibility to provide my students with the research skills they need to question—and test—the arguments put forward by the key players in any debate. Given the complexity of the climate warming debate, and the contested nature of the science that underpins both sides, this will provide challenges well into the future. It is a challenge our students should relish, particularly in an era when they are constantly being bombarded with ‘fake news’ and so-called ‘alternative facts’.

To do so, they need to understand the science. If they don’t, they need to at least understand the key players in the debate and what is motivating them. They need to be prepared to question these people and to look beyond their arguments to the agendas that may be driving them. If they don’t, we must be reconciled to a future in which ‘fake news’ becomes the norm.

Examples of my investigative reports are in Data Vs. Models posts listed at Climate Whack-a-Mole

See also Yellow Climate Journalism

Renewables Hypocrisy

Update August 18, 2017 at the bottom

Charles McConnell explains the emptiness of this recent popular virtue signaling.  His article in WSJ is City Pledges for ‘100% Renewable Energy’ Are 99% Misleading
The power grid is built on fossil fuels, and there’s no way to designate certain electrons as guilt-free. Entire article reprinted below (my bolds and images)

Dozens of cities have made a misleading pledge: that they will move to 100% renewable energy so as to power residents’ lives without emitting a single puff of carbon. At a meeting of the U.S. Conference of Mayors in late June, leaders unanimously adopted a resolution setting a “community-wide target” of 100% clean power by 2035. Mayors from Portland, Ore., to Los Angeles to Miami Beach have signed on to these goals.

States are getting in the game, too. Two years ago Hawaii pledged that its electricity would be entirely renewable by 2045. The California Senate recently passed a bill setting the same goal, while moving up the state’s timeline to get half its electricity from renewables from 2030 to 2025.

Let’s not get carried away. Although activists herald these pledges as major environmental accomplishments, they’re more of a marketing gimmick. Use my home state of Texas as an example. The Electric Reliability Council of Texas oversees 90% of the state’s electricity generation and distribution. Texas generates more wind and solar power than any other state. Yet more than 71% of the council’s total electricity still comes from coal and natural gas.

The trick is that there’s no method to designate electrons on the grid as originating from one source or another. Power generated by fossil fuels and wind turbines travels together over poles and underground wires before reaching cities, homes and businesses. No customer can use power from wind and solar farms exclusively.

So how do cities make this 100% renewable claim while still receiving regular electricity from the grid? They pay to generate extra renewable energy that they then sell on the market. If they underwrite enough, they can claim to have offset whatever carbon-generated electricity they use. The proceeds from the sale go back to the city and are put toward its electric bill.

In essence, these cities are buying a “renewable” label to put on the regular power they’re using. Developers of wind and solar farms win because they can use mayoral commitments to finance their projects, which probably are already subsidized by taxpayers.

But the game would never work without complete confidence in the reliability of the grid, which is dependent on a strategy of “all of the above,” generating power from sources that include coal, natural gas, nuclear, wind and solar.

The mayor of Georgetown, Texas, announced earlier this year that his city had reached its goal of 100% renewable electricity. But in a 2015 article announcing the pledge, he acknowledged what would happen if solar and wind were not able to cover the city’s needs: “The Texas grid operator, the Electric Reliability Council of Texas, will ensure generation is available to meet demand.”

Two years ago the mayor of Denton, Texas, announced a plan to go 70% renewable, while calling a target of 100% unrealistic. “One of the challenges of renewable energy is that it’s so hard to predict,” he said. “You don’t know exactly when the sun is going to shine or when the wind is going to blow. To maintain that reliable power, you must have backup power.”

There is no denying that wind and solar power are important to a balanced energy portfolio. But coal is the bedrock of affordable electricity, and it will remain so, no matter how much wishful thinking by environmental activists. Coal is abundant and reliable. Unlike wind and solar, coal generation can be dialed up and down in response to market conditions and to satisfy demand.

The headline-grabbing 100% renewable pledges intentionally overlook these facts. Fossil fuels are not only the largest and most critical component of the energy portfolio, they are the foundation upon which renewable power must stand. Wind and solar generators ride free into the electric grid on the backs of fossil generators that have installed and paid for the infrastructure on which all Americans depend. The rise of renewable generation is made possible by fossil fuels, not despite them.

We should celebrate the growth of renewables, but not with false and misleading claims. What’s needed is transparency and a shared objective to provide consumers with the most reliable, resilient and affordable energy available.

Mr. McConnell, executive director of the Energy and Environment Initiative at Rice University, was an assistant secretary of energy, 2011-13.

Update August 18, 2017

People need to know that adding renewables to an electrical grid presents both technical and economic challenges.  Experience shows that once intermittent power exceeds roughly 10% of baseload, the reliability of the supply becomes precarious.  South Australia is demonstrating this with a series of blackouts when the grid cannot be balanced.  Germany reached a higher percentage by dumping its excess renewable generation onto neighboring countries until the EU finally woke up and stopped the practice.  Texas got up to 29% by dumping onto neighboring states, and some states, like Georgia, are having problems.

But more dangerous is the way renewables destroy the economics of electrical power.  Seasoned energy analyst Gail Tverberg writes:

In fact, I have come to the rather astounding conclusion that even if wind turbines and solar PV could be built at zero cost, it would not make sense to continue to add them to the electric grid in the absence of very much better and cheaper electricity storage than we have today. There are too many costs outside building the devices themselves. It is these secondary costs that are problematic. Also, the presence of intermittent electricity disrupts competitive prices, leading to electricity prices that are far too low for other electricity providers, including those providing electricity using nuclear or natural gas. The tiny contribution of wind and solar to grid electricity cannot make up for the loss of more traditional electricity sources due to low prices.

These issues are discussed in more detail in the post Climateers Tilting at Windmills

Why the US letter re. Paris Accord

August 5, 2017 Update to Climate Law post

Media are reporting on the State Department letter informing the UN that the US will be withdrawing from the Paris Accord.  Some climatists are encouraged that the three-year waiting period is acknowledged and that the next president could return to the fold.  Others are disappointed that the Trump administration is not more assertive against both the accord and the United Nations Framework Convention on Climate Change (UNFCCC) itself.

Everyone should breathe through the nose and recognize the game and the stakes.  The Paris agreement is not binding and carries no penalties (except for blame and shame).  So following the protocol costs the US nothing, and it does provide some opportunities.  As the world’s leader in actually reducing CO2 emissions, the US wants and needs to be at the table to convince others to follow its example.  There is also US$1 billion that Obama put into the green fund, which could be disbursed in accordance with current US priorities regarding energy and climate.

But the most important reason for this letter is to document that the Paris accord has no legal authority for or within the United States.  Putting the US intent in writing is necessary to deter legal claims seeking to hold the US accountable to Paris terms and conditions.  The post below explains why the Paris accord is so important to legal climate actions around the world.

Climate Activists storm the bastion of Exxon Mobil, here seen without their shareholder disguises.

On the same day POTUS announced US withdrawal from Paris accord, a majority of Exxon Mobil shareholders approved a resolution asking management to assess the value of corporate assets considering a global move toward a low-carbon future. Here is the resolution, filed by the New York State Comptroller:

RESOLVED: Shareholders request that, beginning in 2018, ExxonMobil publish an annual assessment of the long-term portfolio impacts of technological advances and global climate change policies, at reasonable cost and omitting proprietary information. The assessment can be incorporated into existing reporting and should analyze the impacts on ExxonMobil’s oil and gas reserves and resources under a scenario in which reduction in demand results from carbon restrictions and related rules or commitments adopted by governments consistent with the globally agreed upon 2 degree target. This reporting should assess the resilience of the company’s full portfolio of reserves and resources through 2040 and beyond, and address the financial risks associated with such a scenario.

Background:

This century, climatists woke up to the fact that they were losing the battle for public opinion over onerous and costly reductions in fossil fuel usage. They turned to the legal system to achieve their agenda, and the field of Climate Law has become another profession corrupted by climate cash, alongside Climate Medicine.

In addition to numerous court lawsuits, as well as civil disobedience cases, there has been a concerted, well-funded and organized divestment movement against companies supplying fossil fuels to consumers. The intention is at least to tie up Big Oil, and indeed Small Oil as well, in red tape. The real hope is to weaken energy producers by depriving them of investors to the point that reserves are left in the ground, as desired by activists such as 350.org.

In 2016 virtually the same resolution was dismissed by shareholders with only 38% approving. The difference this year was the switch by BlackRock Inc. and Vanguard Group, two of the world’s largest asset managers. As reported by Fox News (here):

Investment products such as exchange-traded funds that track the performance of indexes often come at a lower cost than traditional mutual funds and have gathered assets at a clip in recent years. That growth has given firms like BlackRock and Vanguard increasing sway on shareholder votes. But the firms in turn have come under activist pressure to take stances on issues such as climate disclosure.

When BlackRock sided with Exxon and against a similar proposal at the company’s annual meeting a year ago, it faced backlash from investors and environmental activists. This year BlackRock said the disclosure of climate risks would be among its key engagement priorities with senior executives.

The Exxon Mobil board must now show they are taking this proposal seriously, and activists will be looking for company assets to be “stress tested” in the hope that the shares become more risky. At the very least, management will have to put more time and energy into opining on various scenarios of uncertain content and probabilities relating to the wish dreams of climatists.

Balancing on a cascade of suppositions.

We can look into the climate activist mental frame thanks to documents supporting the current strategy of using the legal system to implement actions against fossil fuel consumption.

For example, there is this recent text explaining the shareholder proposal tabled at the ExxonMobil annual meeting. From Attorney Sanford Lewis:

The Proposal states:

“RESOLVED: Shareholders request that by 2017 ExxonMobil publish an annual assessment of long term portfolio impacts of public climate change policies, at reasonable cost and omitting proprietary information. The assessment can be incorporated into existing reporting and should analyze the impacts on ExxonMobil’s oil and gas reserves and resources under a scenario in which reduction in demand results from carbon restrictions and related rules or commitments adopted by governments consistent with the globally agreed upon 2 degree target. The reporting should assess the resilience of the company’s full portfolio of reserves and resources through 2040 and beyond and address the financial risks associated with such a scenario.”

Now let’s unbundle the chain of suppositions that comprise this proposal.

  • Supposition 1: A 2C global warming target is internationally agreed.
  • Supposition 2: Carbon Restrictions are enacted by governments to comply with the target.
  • Supposition 3: Demand for oil and gas products is reduced due to restrictions.
  • Supposition 4: Oil and gas assets become uneconomic for lack of demand.
  • Supposition 5: Company net worth declines by depressed assets and investors lose value.

1. Suppose an International Agreement to limit global warming to 2C.

From the supporting statement to the above proposal, Sanford Lewis provides these assertions:

Recognizing the severe and pervasive economic and societal risks associated with a warming climate, global governments have agreed that increases in global temperature should be held below 2 degrees Celsius from pre-industrial levels (Cancun Agreement).

Failing to meet the 2 degree goal means, according to scientists, that the world will face massive coastal flooding, increasingly severe weather events, and deepening climate disruption. It will impose billions of dollars in damage on the global economy, and generate an increasing number of climate refugees worldwide.

Climate change and the risks it is generating for companies have become major concerns for investors. These concerns have been magnified by the 21st Session of the Conference of the Parties (COP 21) in Paris, where 195 global governments agreed to restrict greenhouse gas (GHG) emissions to no more than 2 degrees Celsius from pre-industrial levels and submitted plans to begin achieving the necessary GHG emission reductions. In the agreement, signatories also acknowledged the need to strive to keep global warming to 1.5 degrees, recognizing current and projected harms to low lying islands.

Yet a careful reading of UN agreements shows the commitment is exaggerated.

David Campbell (here):

Neither 2°C nor any other specific target has ever been agreed at the UN climate change negotiations.

Article 2 of the Paris Agreement in fact provides only that it ‘aims to strengthen the global response to the threat of climate change … including by the holding the increase to well below 2°C’. This is an expression, not of setting a concrete limit, but merely of an aspiration to set such a limit. It is true that Article 2 is expressed in a deplorably equivocatory and convoluted language which fails to convey this vital point, indeed it obscures it. But nevertheless that is what Article 2 means.

Dieter Helm (here):

Nothing of substance has been achieved in the last quarter of a century despite all the efforts and political capital that has been applied. The Paris Agreement follows on from Kyoto. The pledges – in the unlikely event they are met – will not meet the 2C target, shipping and aviation are excluded, and the key developing countries (China and India) are not committed to capping their emissions for at least another decade and a half (or longer in India’s case).

None of the pledges is, in any event, legally binding. For this reason, the Paris Agreement can be regarded as the point at which the UN negotiating approach turned effectively away from a top down approach, and instead started to rely on a more country driven and hence bottom up one.

Paul Spedding:

The international community is unlikely to agree any time soon on a global mechanism for putting a price on carbon emissions.

2. Suppose Governments enact restrictions that limit use of fossil fuels.

Despite the wishful thinking in the first supposition, the activists proceed on the basis of aspirations and reporting accountability. Sanford Lewis:

Although the reduction goals are not set forth in an enforceable agreement, the parties put mechanisms in place for transparent reporting by countries and a ratcheting mechanism every five years to create accountability for achieving these goals. U.N. Secretary General Ban Ki-moon summarized the Paris Agreement as follows: “The once Unthinkable [global action on climate change] has become the Unstoppable.”

Now we come to an interesting bait and switch. Since Cancun, the IPCC has asserted that global warming can be capped at 2C by keeping CO2 concentration below 450 ppm. From the AR5 Summary for Policymakers (SPM):

Emissions scenarios leading to CO2-equivalent concentrations in 2100 of about 450 ppm or lower are likely to maintain warming below 2°C over the 21st century relative to pre-industrial levels. These scenarios are characterized by 40 to 70% global anthropogenic GHG emissions reductions by 2050 compared to 2010, and emissions levels near zero or below in 2100.

Thus is born the “450 Scenario,” by which governments can focus on reducing emissions without any reference to temperature measurements, which are troublesome and inconvenient.

Sanford Lewis:

Within the international expert community, “2 degree” is generally used as shorthand for a low carbon scenario under which CO2 concentrations in the earth’s atmosphere are stabilized at a level of 450 parts per million (ppm) or lower, representing approximately an 80% reduction in greenhouse gas emissions from current levels, which according to certain computer simulations would be likely to limit warming to 2 degrees Celsius above pre-industrial levels and is considered by some to reduce the likelihood of significant adverse impacts based on analyses of historical climate variability. Company Letter, page 4.

Clever as it is to substitute a 450 ppm target for 2C, the mathematics are daunting. Joe Romm:

We’re at 30 billion tons of carbon dioxide emissions a year — rising 3.3% per year — and we have to average below 18 billion tons a year for the entire century if we’re going to stabilize at 450 ppm. We need to peak around 2015 to 2020 at the latest, then drop at least 60% by 2050 to 15 billion tons (4 billion tons of carbon), and then go to near zero net carbon emissions by 2100.
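Romm’s numbers can be rough-checked with a toy emissions path. This is only a sketch under assumptions of my own, not from the quote: emissions near 30 Gt CO2/yr around 2010, rising 3.3% per year to a 2020 peak, then falling in straight lines through the quoted waypoints (15 Gt/yr in 2050, near zero in 2100).

```python
# Rough check of the quoted carbon-budget arithmetic (illustrative only).
# Assumed path (not from the source): emissions of 30 Gt CO2/yr around 2010,
# rising 3.3%/yr to a 2020 peak, then falling linearly to 15 Gt by 2050
# and to zero by 2100.

def emissions(year):
    """Piecewise emissions path in Gt CO2 per year."""
    if year <= 2020:
        return 30 * 1.033 ** (year - 2010)
    peak = 30 * 1.033 ** 10  # roughly 41.5 Gt at the 2020 peak
    if year <= 2050:
        return peak + (15 - peak) * (year - 2020) / 30
    return 15 * (2100 - year) / 50

# Average over the century 2001-2100:
avg = sum(emissions(y) for y in range(2001, 2101)) / 100
print(round(avg, 1))  # roughly 18 Gt CO2/yr, right at Romm's quoted ceiling
```

On these assumptions the century average lands almost exactly on the 18 Gt/yr ceiling Romm cites, which shows how little room the 450 ppm arithmetic leaves: even an immediate peak and steady decline barely squeezes under the budget.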

And the presumed climate sensitivity to CO2 is hypothetical and unsupported by observations.

3. Suppose that demand for oil and gas products is reduced by the high costs imposed on such fuels.

Sanford Lewis:

ExxonMobil recognized in its 2014 10-K that “a number of countries have adopted, or are considering adoption of, regulatory frameworks to reduce greenhouse gas emissions,” and that such policies, regulations, and actions could make its “products more expensive, lengthen project implementation timelines and reduce demand for hydrocarbons,” but ExxonMobil has not presented any analysis of how its portfolio performs under a 2 degree scenario.

Moreover, the Company’s current use of a carbon proxy price, which it asserts as its means of calculating climate policy impacts, merely amplifies and reflects its optimistic assessments of national and global climate policies. The Company Letter notes that ExxonMobil is setting an internal price as high as $80 per ton; in contrast, the 2014 Report notes a carbon price of $1000 per ton to achieve the 450 ppm (2 degree) scenario, and the Company reportedly stated during the recent Paris climate talks that a 1.5 degree scenario would require a carbon price as high as $2000 per ton within the next hundred years.

Peter Trelenberg, manager of environmental policy and planning at Exxon Mobil reportedly told the Houston Chronicle editorial board: Trimming carbon emissions to the point that average temperatures would rise roughly 1.6 degrees Celsius – enabling the planet to avoid dangerous symptoms of carbon pollution – would bring costs up to $2,000 a ton of CO2. That translates to a $20 a gallon boost to pump prices by the end of this century… .
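The quoted jump from a carbon price to pump prices is easy to verify. A minimal sketch, assuming roughly 8.9 kg of CO2 emitted per gallon of gasoline burned (a commonly cited figure, not stated in the source):

```python
# Back-of-envelope check: does $2,000 per ton of CO2 imply ~$20 per gallon?
# Assumption (not from the source): burning one gallon of gasoline
# emits about 8.9 kg of CO2.

CO2_KG_PER_GALLON = 8.9
price_per_ton = 2000.0                 # dollars per metric ton of CO2
price_per_kg = price_per_ton / 1000.0  # $2 per kg
surcharge = price_per_kg * CO2_KG_PER_GALLON
print(round(surcharge, 2))             # 17.8 -- roughly the quoted $20/gallon
```

So a $2,000/ton carbon price works out to about $17.80 per gallon, consistent with the $20 figure Trelenberg cites.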

Even those who think emissions should be capped somehow see through the wishful thinking in these numbers. Dieter Helm:

The combination of the shale revolution and the ending of the commodity super cycle probably point to a period of low prices for sometime to come. This is unfortunate timing for current decarbonisation policies, many of which are predicated on precisely the opposite happening – high and rising prices, rendering current renewables economic. Low oil prices, cheap coal, and falling gas prices, and their impacts on driving down wholesale electricity prices, are the new baseline against which to consider policy interventions.

With existing technologies, it is a matter of political will, and the ability to bring the main polluters on board, as to whether the envelope will be breached. There are good reasons to doubt that any top down agreement will work sufficiently well to achieve it.

The end of fossil fuels is not about to happen anytime soon, and will not be caused by running out of any of them. There is more than enough to fry the planet several times over, and technological progress in the extraction of fossil fuels has recently been at least as fast as for renewables. We live in an age of fossil fuel abundance.

We also live in a world where fossil fuel prices have fallen, and where the common assumption that prices will bounce back, and that the cycle of fossil fuel prices will not only reassert itself but also continue on a rising trend, may be seriously misguided. It is plausible to at least argue that the oil price may never regain its peaks in 1979 and 2008 again.

A world with stable or falling fossil fuel prices turns the policy assumptions of the last decade or so on their heads. Instead of assuming that rising prices would ease the transition to low carbon alternatives, many of the existing technologies will probably need permanent subsidies. Once the full system costs are incorporated, current generation wind (especially offshore) and current generation solar may be out of the market except in special locations for the foreseeable future. In any event, neither can do much to address the sheer scale of global emissions.

Primary Energy Demand Projection

4. Suppose oil and gas reserves are stranded for lack of demand.

Sanford Lewis:

Achievement of even a 2 degree goal requires net zero global emissions to be attained by 2100. Achieving net zero emissions this century means that the vast majority of fossil fuel reserves cannot be burned. As noted by Mark Carney, the President of the Bank of England, the carbon budget associated with meeting the 2 degree goal will “render the vast majority of reserves ‘stranded’ – oil, gas, and coal that will be literally unburnable without expensive carbon capture technology, which itself alters fossil fuel economics.”

A concern expressed by some of our stakeholders is whether such a “low carbon scenario” could impact ExxonMobil’s reserves and operations – i.e., whether this would result in unburnable proved reserves of oil and natural gas.

Decisions to abandon reserves are not as simple as activists suppose, nor do they have the effects activists desire.

Financial Post (here):

The 450 Scenario is not the IEA’s central scenario. At this point, government policies to limit GHG emissions are not stringent enough to stimulate this level of change. However, for discussion purposes let’s use the IEA’s 450 Scenario to examine the question of stranded assets in crude oil investing. Would some oil reserves be “stranded” under the IEA’s scenario of demand reversal?

A considerable amount of new oil projects must be developed to offset the almost 80 per cent loss in legacy production by 2040. This continued need for new oil projects for the next few decades and beyond means that the majority of the value of oil reserves on the books of public companies must be realized, and will not be “stranded”.

While most of these reserves will be developed, could any portion be stranded in this scenario? The answer is surely “yes.” In any industry a subset of the inventory that is comprised of inferior products will be susceptible to being marginalized when there is declining demand for goods. In a 450 ppm world, inferior products in the oil business will be defined by higher cost and higher carbon intensity.

5. Suppose shareholders fear declining company net worth.

Now we come to the underlying rationale for this initiative.

Paul Spedding:

Commodity markets have repeatedly proved vulnerable to expectations that prices will fall. Given the political pressure to mitigate the impact of climate change, smart investors will be watching closely for indications of policies that will lead to a drop in demand and the possibility that their assets will become financially stranded.

Equity markets are famously irrational, and if energy company shareholders can be spooked into selling off, a death spiral can be instigated. So far, though, investors have proven smarter than they are given credit for.

Bloomberg:

Fossil-fuel divestment has been a popular issue in recent years among college students, who have protested at campuses around the country. Yet even with the movement spreading to more than 1,000 campuses, only a few dozen schools have placed some restrictions on their commitments to the energy sector. Cornell University, Massachusetts Institute of Technology and Harvard University are among the largest endowments to reject demands to divest.

Stanford Board of Trustees even said:

As trustees, we are convinced that the global community must develop effective alternatives to fossil fuels at sufficient scale, so that fossil fuels will not continue to be extracted and used at the present rate. Stanford is deeply engaged in finding alternatives through its research. However, despite the progress being made, at the present moment oil and gas remain integral components of the global economy, essential to the daily lives of billions of people in both developed and emerging economies. Moreover, some oil and gas companies are themselves working to advance alternative energy sources and develop other solutions to climate change. The complexity of this picture does not allow us to conclude that the conditions for divestment outlined in the Statement on Investment Responsibility have been met.

Update: Universities are not alone in finding the alarmist case unconvincing, according to a survey:

Almost half of the world’s top 500 investors are failing to act on climate change — an increase of 6 percent from 236 in 2014, according to a report Monday by the Asset Owners Disclosure Project, which surveys global companies on their climate change risk and management.

The Abu Dhabi Investment Authority, Japan Post Insurance Co Ltd., Kuwait Investment Authority and China’s SAFE Investment Company, are the four biggest funds that scored zero in the survey. The 246 “laggards” identified as not acting hold $14 trillion in assets, the report said.
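The Bloomberg figures above support the “almost half” characterization, as a quick check of the survey numbers shows:

```python
# Sanity check on the survey figures quoted above.
laggards = 246   # funds scored as "not acting" in the survey
surveyed = 500   # world's top 500 investors

share = laggards / surveyed
print(f"Share of surveyed investors failing to act: {share:.1%}")  # 49.2%
```

At 49.2 per cent of the 500 funds surveyed, “almost half” is an accurate summary.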

Summary

Alarmists have failed to achieve their goals through political persuasion and elections. So they are turning to legal and financial tactics. Their wishful thinking appears as an improbable chain of events built upon a Paris agreement without substance.

Last word to David Campbell:

International policy has so far been based on the premise that mitigation is the wisest course, but it is time for those committed to environmental intervention to abandon the idea of mitigation in favour of adaptation to climate change’s effects.

For more on adapting vs. mitigating, see Adapting Works, Mitigating Fails.
