Global Warming Fails to Convince

I happened to read an article at Real Clear Science, An Inconvenient Truth About ‘An Inconvenient Truth’ by Eric Merkley & Dominik Stecula (August 18, 2017). The article itself is of middling interest, being mainly a lament that Al Gore became the leading promoter of public awareness about the dangers of global warming. The authors contend that Republicans were predisposed to reject claims coming from such a high-profile liberal Democrat.

It is neither new nor interesting to hear warmists diss skeptics as simplistic right-wingers having a knee-jerk reaction to global warming claims. But reading the comment thread was illuminating, and it undercut the presumptions of the article. Instead of leftist knee-jerkers swearing allegiance to climatism, the thread featured posts from several scientists hitting the credibility problem at its core.

Two comments reprinted below deserve a wide audience for expressing what many think but have not expressed so clearly.

@Gabe Kesseru

I spent an entire career in applied sciences and know the difference between true science and lesser areas of study. Climatology is one of the latter. It is mostly a field of historical trend analysis trying desperately to be a field of trend prediction (and doing very poorly at that).

Climatologists have done themselves a disservice by calling themselves scientists, since by doing so we expect them to use the scientific method. Use of the scientific method will always be impossible in climatology, since its most important step is experimentation to prove the hypothesis, and we cannot reproduce centuries of the earth’s climate in a laboratory.

Secondly, science requires that we gather data to laboratory accuracy levels, which again is impossible with haphazard worldwide thermometer measurements originally meant to measure weather at casual levels of accuracy and repeatability.

@Dan Ashley · Northcentral University

Dan Ashley here. PhD statistics, PhD Business.

I am not a climate, environment, geology, weather, or physics expert. However, I am an expert on statistics. So, I recognize bad statistical analysis when I see it. There are quite a few problems with the use of statistics within the global warming debate. The use of Gaussian statistics is the first error. In his first movie Gore used a linear regression of CO2 and temperature. If he had done the same regression using the number of zoos in the world, or the worldwide use of atomic energy, or sunspots, he would have gotten the same result. A linear regression by itself proves nothing.
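
The spurious-regression point is easy to demonstrate. Below is a minimal sketch (synthetic data and hypothetical numbers throughout): regress a warming-like series against any other rising series, CO2 or zoo counts alike, and the fit looks equally impressive.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1960, 2020)

# Synthetic warming-like series: slow trend plus noise (illustration only).
temp = 0.015 * (years - years[0]) + rng.normal(0, 0.1, years.size)

# Two other rising series: a rough Mauna-Loa-like CO2 curve and a made-up
# "number of zoos" count. Both are hypothetical stand-ins.
co2 = 315 + 1.5 * (years - years[0])
zoos = 500 + 12 * (years - years[0])

for name, x in [("CO2", co2), ("zoos", zoos)]:
    r = np.corrcoef(x, temp)[0, 1]
    slope = np.polyfit(x, temp, 1)[0]
    print(f"{name}: r = {r:.2f}, regression slope = {slope:.4f}")
# Both fits give r near 1: a shared trend, not evidence of causation.
```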

The theory that CO2 is a greenhouse gas has been proven correct in a small greenhouse only. As a matter of fact, plants like higher CO2 and it is frequently pumped into greenhouses because of that. There has never been a definitive experiment regarding CO2, at or near the concentrations in our atmosphere. This theory actually has much less statistical support than the conspiracy theories regarding JFK’s assassination.

Gaussian statistics REQUIRE the events being analyzed to be both independent and random. The temperatures experienced in one part of the world are dependent on temperatures in other locales. The readings are not independent. A better statistical approach would be Mandelbrotian (fractal). Mandelbrotian statistics are not merely “fat tailed” statistics.
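
The independence problem can also be shown directly. The sketch below uses a synthetic AR(1) series as a stand-in for dependent temperature readings; strong lag-1 autocorrelation shrinks the effective sample size far below the nominal count, which is what makes naive Gaussian significance tests overconfident.

```python
import numpy as np

rng = np.random.default_rng(0)
n, phi = 1000, 0.8   # phi = persistence; independent data would have phi = 0

# Synthetic AR(1) series standing in for dependent temperature readings.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

r1 = np.corrcoef(x[:-1], x[1:])[0, 1]   # lag-1 autocorrelation
n_eff = n * (1 - r1) / (1 + r1)         # standard rough correction
print(f"lag-1 autocorrelation = {r1:.2f}, "
      f"effective sample size = {n_eff:.0f} of {n}")
```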

A more problematic issue with the data is that it has been adjusted. Data adjustments are frequently needed – for example, if a measuring device fails. However, 100% of the data adjustments used are in favor of proving global warming. 100%. Not 100% minus one adjustment. Not nearly 100%. 100% – that is ALL – of the adjustments were in one direction only. Any student who put data like that in a PhD dissertation would never receive a doctoral degree.
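
The one-directional-adjustments argument is essentially a sign test, and the arithmetic is stark; the adjustment counts below are hypothetical.

```python
# Sign test: if adjustments were unbiased, each would warm or cool the record
# with probability 1/2, so all N landing on the warming side has probability
# 0.5**N. The N values here are hypothetical.
for n in (10, 20, 50):
    print(f"N = {n}: P(all adjustments warm the trend) = {0.5 ** n:.2e}")
```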

One study published showed parts of the Earth where warming was occurring faster than other parts of the globe. The study claimed to be based solely on satellite data. The study identified several areas (Gambia for one) which have greater warming than other areas. Unfortunately, in three of those areas there have been no climate satellite observations for years.

The statements that claim “less arctic ice in recorded history” are equally spurious. We started gathering data on that in 1957 with the first satellite flyovers. On this issue “recorded history” is a very short time period.

Some geologist friends told me that a significant amount of Earth’s heat comes from the Earth’s hot core. They further stated that they do not know what percentage of heat that is. They do know it is probably over 20% and probably less than 70%. While either of those extremes seems unlikely to me, remember that I am not a geologist.

As to rising oceans, that should be measured accurately. Measuring it with a stick stuck in the sand is inappropriate. Geologists tell me that the land is shifting and moving. Measuring against the gravitational center of the Earth is the only accurate way. However, we do not know how to do that. As a matter of fact, we don’t know precisely where the gravitational center of the Earth is. (Any physicists around who want to explain the two-body and three-body problem as it relates to the Earth, Moon, and Sun, please do so.)

So, according to climate scientists the world is warming up. They may be correct, they may be incorrect. However, they have been unable to support their thesis via the use of statistics.

I personally see no reason to disassemble the world’s economic systems over an unproven, and somewhat implausible theory.

Summary

The scientific claims made in Gore’s movies do not stand up to scrutiny.  Changing the salesman is not going to make the pitch any more believable.

See also

Reasoning About Climate

Big Al’s Sequel Flawed at its Core

Decoding Climate News


Definition of “Fake News”: When reporters state their own opinions instead of bearing witness to observed events.

Journalism professor David Blackall provides a professional context for the investigative reporting I’ve been doing on this blog, along with other bloggers interested in science and climate change/global warming. His peer-reviewed paper is Environmental Reporting in a Post Truth World. The excerpts below show his advice is good not only for journalists but for readers. h/t GWPF, Pierre Gosselin

Overview: The Grand Transnational Narrative

The dominance of a ‘grand transnational narrative’ in environmental discourse (Mittal, 2012) over other human impacts, like deforestation, is problematic and is partly due to the complexities and overspecialization of climate modelling. A strategy for learning, therefore, is to instead focus on the news media: it is easily researched and it tends to act ‘as one driving force’, providing citizens with ‘piecemeal information’, making it impossible to arrive at an informed position about science, society and politics (Marisa Dispensa et al., 2003). After locating problematic news narratives, Google Scholar can then be employed to locate recent scientific papers that examine, verify or refute news media discourse.

The science publication Nature Climate Change published a study this year demonstrating that Earth has warmed substantially less this century than computer-generated climate models predicted.

Unfortunately for public knowledge, such findings don’t appear in the news. Sea levels too have not been obeying the ‘grand transnational narrative’ of catastrophic global warming. Sea levels around Australia in 2011–2012 showed the most significant drops since measurements began… The 2015–2016 El Niño, a natural phenomenon, drove sea levels around Indonesia so low that coral reefs were bleaching. The echo chamber of news repeatedly fails to report such phenomena, and yet many studies continue to contradict mainstream news discourse.

I will be arguing that a number of narratives need correction, and while I accept that the views I am about to express are not universally held, I believe that the scientific evidence does support them.

The Global Warming/Climate Change Narrative

The primary narrative in need of correction is that global warming alone (Lewis, 2016), which induces climate change (climate disruption), is due to the increase in global surface temperatures caused by atmospheric greenhouse gases. Instead, there are many factors arising from human land use (Pielke et al., 2016), which it could be argued are responsible for climate change, and some of these practices can be mitigated through direct public action.

Global warming is calculated by measuring average surface temperatures over time. While it is easy to argue that temperatures are increasing, it cannot be argued, as some models contend, that the increases are uniform throughout the global surface and atmosphere. Climate science is further problematized by its own scientists, in that computer modelling, as one component of this multi-faceted science, is privileged over other disciplines, like geology.

Scientific uncertainty arises from ‘simulations’ of climate because computer models are failing to match the actual climate. This means that computer models are unreliable in making predictions.

Published in the eminent journal Nature, ‘Theory of chaotic orbital variations confirmed by Cretaceous geological evidence’ (Ma et al., 2017) provides excellent stimulus material for student news writing. The paper discusses the severe wobbles in planetary orbits, which affect climate. The wobbles are reflected in geological records and show that the theoretical climate models are not rigorously confirmed by these radioisotopically calibrated and anchored geological data sets. Yet popular discourse presents Earth as harmonious: temperatures, sea levels and orbital patterns all naturally balanced until global warming affects them, a mythical construct. Instead, the reality is natural variability, the interactions of which are yet to be measured or discovered (Berger, 2013).

In such a (media) climate, it is difficult for the assertion to be made that there might be other sources, than a nontoxic greenhouse gas called carbon dioxide (CO2), that could be responsible for ‘climate disruption’. A healthy scientific process would allow such a proposition. Contrary to warming theory, CO2 levels have increased, but global average temperatures remain steady. The global average temperature increased from 1983 to 1998; then, it flat-lined for nearly 20 years. James Hansen’s Hockey Stick graph, with soaring and catastrophic temperatures, simply did not materialize.

Keenan et al. (2016), using global carbon budget estimates, ground, atmospheric and satellite observations, and multiple global vegetation models, found that there is also now a pause in the growth rate of atmospheric CO2. They attribute this to increases in terrestrial sinks over the last decade, where forests consume the rising atmospheric CO2 and rapidly grow, the net effect being a slowing in the rate of warming from global respiration.

Contrary to public understanding, higher temperatures in cities are due to a phenomenon known as the ‘urban heat effect’ (Taha, 1997; Yuan & Bauer, 2007). Engines, air conditioners, heaters and heat absorbing surfaces like bitumen radiate heat energy in urban areas, but this is not due to the greenhouse effect. Problematic too are data sets like ocean heat temperatures, sea-ice thickness and glaciers: all of which are varied, some have not been measured or there are insignificant measurement time spans for the data to be reliable.

Contrary to news media reports, some glaciers throughout the world (Norway [Chinn et al., 2005] and New Zealand [Purdie et al., 2008]) are growing, while others shrink (Paul et al., 2007).

Conclusion

This is clearly a contentious topic. There are many agendas at play, with careers at stake. My view represents one side of the debate: it is one I strongly believe in, and is, I contend, supported by the science around deforestation, on the ground, rather than focusing almost entirely on atmosphere. However, as a journalism educator, I also recognize that my view, along with others, must be open to challenge, both within the scientific community and in the court of public opinion.

As a journalism educator, it is my responsibility to provide my students with the research skills they need to question—and test—the arguments put forward by the key players in any debate. Given the complexity of the climate warming debate, and the contested nature of the science that underpins both sides, this will provide challenges well into the future. It is a challenge our students should relish, particularly in an era when they are constantly being bombarded with ‘fake news’ and so-called ‘alternative facts’.

To do so, they need to understand the science. If they don’t, they need to at least understand the key players in the debate and what is motivating them. They need to be prepared to question these people and to look beyond their arguments to the agendas that may be driving them. If they don’t, we must be reconciled to a future in which ‘fake news’ becomes the norm.

Examples of my investigative reports are in the Data Vs. Models posts listed at Climate Whack-a-Mole.

See also Yellow Climate Journalism

Why the US letter re. Paris Accord

August 5, 2017 Update to Climate Law post

Media are reporting on the State Department letter informing the UN that the US will be withdrawing from the Paris Accord.  Some climatists are encouraged that the three-year waiting period is acknowledged and that the next president could return to the fold.  Others are disappointed that the Trump administration is not more assertive against both the accord and the United Nations Framework Convention on Climate Change (UNFCCC) itself.

Everyone should breathe through the nose and recognize the game and the stakes. The Paris agreement is not binding and carries no penalties (except blame and shame). So following the protocol costs the US nothing, and it does provide some opportunities. As the world’s leader in actually reducing CO2 emissions, the US wants and needs to be at the table to convince others to follow its example. There is also US$1 billion that Obama put into the green fund, which could be disbursed in accordance with current US priorities regarding energy and climate.

But the most important reason for this letter is to document that the Paris accord does not have legal authority for and within the United States.  Putting the US intent in writing is necessary to deter legal claims to hold the US accountable to Paris terms and conditions.  The post below explains why Paris accord is so important to legal climate actions around the world.

Climate Activists storm the bastion of Exxon Mobil, here seen without their shareholder disguises.

On the same day POTUS announced US withdrawal from Paris accord, a majority of Exxon Mobil shareholders approved a resolution asking management to assess the value of corporate assets considering a global move toward a low-carbon future. Here is the resolution, filed by the New York State Comptroller:

RESOLVED: Shareholders request that, beginning in 2018, ExxonMobil publish an annual assessment of the long-term portfolio impacts of technological advances and global climate change policies, at reasonable cost and omitting proprietary information. The assessment can be incorporated into existing reporting and should analyze the impacts on ExxonMobil’s oil and gas reserves and resources under a scenario in which reduction in demand results from carbon restrictions and related rules or commitments adopted by governments consistent with the globally agreed upon 2 degree target. This reporting should assess the resilience of the company’s full portfolio of reserves and resources through 2040 and beyond, and address the financial risks associated with such a scenario.

Background:

This century, climatists woke up to the fact that they were losing the battle for public opinion over onerous and costly reductions in fossil fuel usage. They turned to the legal system to achieve their agenda, and the field of Climate Law has become another profession corrupted by climate cash, alongside Climate Medicine.

In addition to numerous court lawsuits and civil disobedience cases, there has been a concerted, well-funded and organized divestment movement against companies supplying fossil fuels to consumers. The intention is at least to tie up Big Oil, and indeed Small Oil, in red tape. The real hope is to weaken energy producers by depriving them of investors, to the point that reserves are left in the ground, as desired by activists such as 350.org.

In 2016 virtually the same resolution was dismissed by shareholders with only 38% approving. The difference this year was the switch by BlackRock Inc. and Vanguard Group, two of the world’s largest asset managers. As reported by Fox News (here):

Investment products such as exchange-traded funds that track the performance of indexes often come at a lower cost than traditional mutual funds and have gathered assets at a clip in recent years. That growth has given firms like BlackRock and Vanguard increasing sway on shareholder votes. But the firms in turn have come under activist pressure to take stances on issues such as climate disclosure.

When BlackRock sided with Exxon and against a similar proposal at the company’s annual meeting a year ago, it faced backlash from investors and environmental activists. This year BlackRock said the disclosure of climate risks would be among its key engagement priorities with senior executives.

The Exxon Mobil board must now show they are taking this proposal seriously, and activists will be looking for company assets to be “stress tested” in the hope that the shares come to be seen as riskier. At the very least, management will have to put more time and energy into opining on various scenarios of uncertain content and probability relating to the wish dreams of climatists.

Balancing on a cascade of suppositions.

We can look into the climate activist mental frame thanks to documents supporting the current strategy of using the legal system to implement actions against fossil fuel consumption.

For example, there is this recent text explaining the shareholder proposal tabled at the ExxonMobil annual meeting. From Attorney Sanford Lewis:

The Proposal states:

“RESOLVED: Shareholders request that by 2017 ExxonMobil publish an annual assessment of long term portfolio impacts of public climate change policies, at reasonable cost and omitting proprietary information. The assessment can be incorporated into existing reporting and should analyze the impacts on ExxonMobil’s oil and gas reserves and resources under a scenario in which reduction in demand results from carbon restrictions and related rules or commitments adopted by governments consistent with the globally agreed upon 2 degree target. The reporting should assess the resilience of the company’s full portfolio of reserves and resources through 2040 and beyond and address the financial risks associated with such a scenario.”

Now let’s unbundle the chain of suppositions that comprise this proposal.

  • Supposition 1: A 2C global warming target is internationally agreed.
  • Supposition 2: Carbon restrictions are enacted by governments to comply with the target.
  • Supposition 3: Demand for oil and gas products is reduced due to the restrictions.
  • Supposition 4: Oil and gas assets become uneconomic for lack of demand.
  • Supposition 5: Company net worth declines with the depressed assets and investors lose value.

1. Suppose an International Agreement to limit global warming to 2C.

From the supporting statement to the above proposal, Sanford Lewis provides these assertions:

Recognizing the severe and pervasive economic and societal risks associated with a warming climate, global governments have agreed that increases in global temperature should be held below 2 degrees Celsius from pre-industrial levels (Cancun Agreement).

Failing to meet the 2 degree goal means, according to scientists, that the world will face massive coastal flooding, increasingly severe weather events, and deepening climate disruption. It will impose billions of dollars in damage on the global economy, and generate an increasing number of climate refugees worldwide.

Climate change and the risks it is generating for companies have become major concerns for investors. These concerns have been magnified by the 21st Session of the Conference of the Parties (COP 21) in Paris, where 195 global governments agreed to restrict greenhouse gas (GHG) emissions to no more than 2 degrees Celsius from pre-industrial levels and submitted plans to begin achieving the necessary GHG emission reductions. In the agreement, signatories also acknowledged the need to strive to keep global warming to 1.5 degrees, recognizing current and projected harms to low lying islands.

Yet a careful reading of UN agreements shows commitment is exaggerated:
David Campbell (here):

Neither 2°C nor any other specific target has ever been agreed at the UN climate change negotiations.

Article 2 of the Paris Agreement in fact provides only that it ‘aims to strengthen the global response to the threat of climate change … including by holding the increase to well below 2°C’. This is an expression, not of setting a concrete limit, but merely of an aspiration to set such a limit. It is true that Article 2 is expressed in deplorably equivocatory and convoluted language which fails to convey this vital point, indeed it obscures it. But nevertheless that is what Article 2 means.

Dieter Helm (here):

Nothing of substance has been achieved in the last quarter of a century despite all the efforts and political capital that have been applied. The Paris Agreement follows on from Kyoto. The pledges – in the unlikely event they are met – will not meet the 2C target, shipping and aviation are excluded, and the key developing countries (China and India) are not committed to capping their emissions for at least another decade and a half (or longer in India’s case).

None of the pledges is, in any event, legally binding. For this reason, the Paris Agreement can be regarded as the point at which the UN negotiating approach turned effectively away from a top down approach, and instead started to rely on a more country driven and hence bottom up one.

Paul Spedding:

The international community is unlikely to agree any time soon on a global mechanism for putting a price on carbon emissions.

2. Suppose Governments enact restrictions that limit use of fossil fuels.

Despite the wishful thinking in the first supposition, the activists proceed on the basis of aspirations and reporting accountability. Sanford Lewis:

Although the reduction goals are not set forth in an enforceable agreement, the parties put mechanisms in place for transparent reporting by countries and a ratcheting mechanism every five years to create accountability for achieving these goals. U.N. Secretary General Ban Ki-moon summarized the Paris Agreement as follows: “The once Unthinkable [global action on climate change] has become the Unstoppable.”

Now we come to an interesting bait and switch. Since Cancun, the IPCC has asserted that global warming can be capped at 2C by keeping CO2 concentration below 450 ppm. From the Summary for Policymakers (SPM) of AR5:

Emissions scenarios leading to CO2-equivalent concentrations in 2100 of about 450 ppm or lower are likely to maintain warming below 2°C over the 21st century relative to pre-industrial levels. These scenarios are characterized by 40 to 70% global anthropogenic GHG emissions reductions by 2050 compared to 2010, and emissions levels near zero or below in 2100.

Thus is born the “450 Scenario”, by which governments can focus on reducing emissions without any reference to temperature measurements, which are troublesome and inconvenient.

Sanford Lewis:

Within the international expert community, “2 degree” is generally used as shorthand for a low carbon scenario under which CO2 concentrations in the earth’s atmosphere are stabilized at a level of 450 parts per million (ppm) or lower, representing approximately an 80% reduction in greenhouse gas emissions from current levels, which according to certain computer simulations would be likely to limit warming to 2 degrees Celsius above pre-industrial levels and is considered by some to reduce the likelihood of significant adverse impacts based on analyses of historical climate variability. Company Letter, page 4.

Clever as it is to substitute a 450 ppm target for 2C, the mathematics are daunting. Joe Romm:

We’re at 30 billion tons of carbon dioxide emissions a year — rising 3.3% per year — and we have to average below 18 billion tons a year for the entire century if we’re going to stabilize at 450 ppm. We need to peak around 2015 to 2020 at the latest, then drop at least 60% by 2050 to 15 billion tons (4 billion tons of carbon), and then go to near zero net carbon emissions by 2100.
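
Romm’s figures can be sanity-checked with a back-of-envelope pathway: grow ~3.3% per year to a peak around 2015, cut roughly 60% by 2050, then run to near zero by 2100, and see what the century average works out to. The sketch below is a toy piecewise-linear pathway with illustrative numbers, not a carbon-cycle model.

```python
import numpy as np

years = np.arange(2000, 2101)

def emissions(year):
    """Piecewise-linear CO2 emissions pathway in Gt/yr (illustrative numbers)."""
    peak = 30.0 * 1.033 ** 5                    # ~3.3%/yr growth to a 2015 peak
    if year <= 2015:
        return 30.0 * 1.033 ** (year - 2010)
    if year <= 2050:                            # decline ~60% from peak to 15 Gt
        return peak + (15.0 - peak) * (year - 2015) / 35
    return 15.0 * (2100 - year) / 50            # near zero by 2100

path = np.array([emissions(y) for y in years])
print(f"peak = {path.max():.1f} Gt/yr, century average = {path.mean():.1f} Gt/yr")
# The average lands below the ~18 Gt/yr budget Romm cites only because of
# the steep post-2050 decline; flatter pathways blow through it.
```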

And the presumed climate sensitivity to CO2 is hypothetical and unsupported by observations.

3. Suppose that demand for oil and gas products is reduced by the high costs imposed on such fuels.

Sanford Lewis:

ExxonMobil recognized in its 2014 10-K that “a number of countries have adopted, or are considering adoption of, regulatory frameworks to reduce greenhouse gas emissions,” and that such policies, regulations, and actions could make its “products more expensive, lengthen project implementation timelines and reduce demand for hydrocarbons,” but ExxonMobil has not presented any analysis of how its portfolio performs under a 2 degree scenario.

Moreover, the Company’s current use of a carbon proxy price, which it asserts as its means of calculating climate policy impacts, merely amplifies and reflects its optimistic assessments of national and global climate policies. The Company Letter notes that ExxonMobil is setting an internal price as high as $80 per ton; in contrast, the 2014 Report notes a carbon price of $1,000 per ton to achieve the 450 ppm (2 degree) scenario, and the Company reportedly stated during the recent Paris climate talks that a 1.5 degree scenario would require a carbon price as high as $2,000 per ton within the next hundred years.

Peter Trelenberg, manager of environmental policy and planning at Exxon Mobil, reportedly told the Houston Chronicle editorial board: Trimming carbon emissions to the point that average temperatures would rise roughly 1.6 degrees Celsius – enabling the planet to avoid dangerous symptoms of carbon pollution – would bring costs up to $2,000 a ton of CO2. That translates to a $20 a gallon boost to pump prices by the end of this century…

Even those who think emissions should be capped somehow see through the wishful thinking in these numbers. Dieter Helm:

The combination of the shale revolution and the ending of the commodity super cycle probably points to a period of low prices for some time to come. This is unfortunate timing for current decarbonisation policies, many of which are predicated on precisely the opposite happening – high and rising prices, rendering current renewables economic. Low oil prices, cheap coal, and falling gas prices, and their impacts on driving down wholesale electricity prices, are the new baseline against which to consider policy interventions.

With existing technologies, it is a matter of political will, and the ability to bring the main polluters on board, as to whether the envelope will be breached. There are good reasons to doubt that any top down agreement will work sufficiently well to achieve it.

The end of fossil fuels is not about to happen anytime soon, and will not be caused by running out of any of them. There is more than enough to fry the planet several times over, and technological progress in the extraction of fossil fuels has recently been at least as fast as for renewables. We live in an age of fossil fuel abundance.

We also live in a world where fossil fuel prices have fallen, and where the common assumption that prices will bounce back, and that the cycle of fossil fuel prices will not only reassert itself but also continue on a rising trend, may be seriously misguided. It is plausible to at least argue that the oil price may never regain its 1979 and 2008 peaks.

A world with stable or falling fossil fuel prices turns the policy assumptions of the last decade or so on their heads. Instead of assuming that rising prices would ease the transition to low carbon alternatives, many of the existing technologies will probably need permanent subsidies. Once the full system costs are incorporated, current generation wind (especially offshore) and current generation solar may be out of the market except in special locations for the foreseeable future. In any event, neither can do much to address the sheer scale of global emissions.

Primary Energy Demand Projection

4. Suppose oil and gas reserves are stranded for lack of demand.

Sanford Lewis:

Achievement of even a 2 degree goal requires net zero global emissions to be attained by 2100. Achieving net zero emissions this century means that the vast majority of fossil fuel reserves cannot be burned. As noted by Mark Carney, the Governor of the Bank of England, the carbon budget associated with meeting the 2 degree goal will “render the vast majority of reserves ‘stranded’ – oil, gas, and coal that will be literally unburnable without expensive carbon capture technology, which itself alters fossil fuel economics.”

A concern expressed by some of our stakeholders is whether such a “low carbon scenario” could impact ExxonMobil’s reserves and operations – i.e., whether this would result in unburnable proved reserves of oil and natural gas.

Decisions to abandon reserves are neither as simple as activists assume, nor do they have the effects activists desire.

Financial Post (here):

The 450 Scenario is not the IEA’s central scenario. At this point, government policies to limit GHG emissions are not stringent enough to stimulate this level of change. However, for discussion purposes let’s use the IEA’s 450 Scenario to examine the question of stranded assets in crude oil investing. Would some oil reserves be “stranded” under the IEA’s scenario of demand reversal?

A considerable amount of new oil projects must be developed to offset the almost 80 per cent loss in legacy production by 2040. This continued need for new oil projects for the next few decades and beyond means that the majority of the value of oil reserves on the books of public companies must be realized, and will not be “stranded”.

While most of these reserves will be developed, could any portion be stranded in this scenario? The answer is surely “yes.” In any industry a subset of the inventory that is comprised of inferior products will be susceptible to being marginalized when there is declining demand for goods. In a 450 ppm world, inferior products in the oil business will be defined by higher cost and higher carbon intensity.

5. Suppose shareholders fear declining company net worth.

Now we come to the underlying rationale for this initiative.

Paul Spedding:

Commodity markets have repeatedly proved vulnerable to expectations that prices will fall. Given the political pressure to mitigate the impact of climate change, smart investors will be watching closely for indications of policies that will lead to a drop in demand and the possibility that their assets will become financially stranded.

Equity markets are famously irrational, and if energy company shareholders can be spooked into selling off, a death spiral can be instigated. So far, though, investors are smarter than they are given credit for.

Bloomberg:

Fossil-fuel divestment has been a popular issue in recent years among college students, who have protested at campuses around the country. Yet even with the movement spreading to more than 1,000 campuses, only a few dozen schools have placed some restrictions on their commitments to the energy sector. Cornell University, Massachusetts Institute of Technology and Harvard University are among the largest endowments to reject demands to divest.

The Stanford Board of Trustees even said:

As trustees, we are convinced that the global community must develop effective alternatives to fossil fuels at sufficient scale, so that fossil fuels will not continue to be extracted and used at the present rate. Stanford is deeply engaged in finding alternatives through its research. However, despite the progress being made, at the present moment oil and gas remain integral components of the global economy, essential to the daily lives of billions of people in both developed and emerging economies. Moreover, some oil and gas companies are themselves working to advance alternative energy sources and develop other solutions to climate change. The complexity of this picture does not allow us to conclude that the conditions for divestment outlined in the Statement on Investment Responsibility have been met.

Update: Universities are not alone in finding the alarmist case unconvincing, according to a survey:

Almost half of the world’s top 500 investors are failing to act on climate change — an increase of 6 percent from 236 in 2014, according to a report Monday by the Asset Owners Disclosure Project, which surveys global companies on their climate change risk and management.

The Abu Dhabi Investment Authority, Japan Post Insurance Co Ltd., Kuwait Investment Authority and China’s SAFE Investment Company, are the four biggest funds that scored zero in the survey. The 246 “laggards” identified as not acting hold $14 trillion in assets, the report said.

Summary

Alarmists have failed to achieve their goals through political persuasion and elections. So they are turning to legal and financial tactics. Their wishful thinking appears as an improbable chain of events built upon a Paris agreement without substance.

Last word to David Campbell:

International policy has so far been based on the premise that mitigation is the wisest course, but it is time for those committed to environmental intervention to abandon the idea of mitigation in favour of adaptation to climate change’s effects.

For more on adapting vs. mitigating, see Adapting Works, Mitigating Fails


Control Knobs, Rick Perry and AMS

A great post by Ross McKitrick at The Hill (h/t GWPF): In the fight between Rick Perry and climate scientists — he’s winning. Excerpts below (my bolds).

Policy makers and the public need to understand the extent to which major scientific institutions like the American Meteorological Society have become biased and politicized on the climate issue. Convincing them of this becomes much easier when the organizations themselves supply the evidence.

This happened recently in response to a CNBC interview with Energy Secretary Rick Perry. He was asked “Do you believe CO2 [carbon dioxide] is the primary control knob for the temperature of the Earth and for climate?”

It was an ambiguous question that defies a simple yes or no answer. Perry thought for a moment, then said, “No, most likely the primary control knob is the ocean waters and this environment we live in.” He then went on to acknowledge the climate is changing and CO2 is having a role, but the issue is how much, and being skeptical about some of these things is “quite all right.”

Perry’s response prompted a letter of protest from Keith Seitter, executive director of the American Meteorological Society. The letter admonished him for supposedly contradicting “indisputable findings” that emissions of CO2 and other greenhouse gases are the primary cause of recent global warming, a topic for which Seitter insists there is no room for debate.

It is noteworthy that the meteorological society remained completely silent over the years when senior Democratic administration officials made multiple exaggerated and untrue statements in service of global warming alarmism.  (McKitrick provides several examples in his article)

But the meteorological society leapt to condemn Perry for a cautious response to an awkward question. Perry could not reasonably have agreed with the interviewer since the concept of a “control knob” for the Earth’s temperature wasn’t defined. Doubling CO2 might, according to models, cause a few degrees of warming. Doubling the size of the sun would burn up the planet. Doubling cloud cover might trigger an ice age. So which is the “primary control knob”? The meteorological society letter ignored the odd wording of the question, misrepresented Perry’s response and then summarily declared their position on climate “indisputable.” Perry’s cautious answer, by contrast, was perfectly reasonable in the context of a confusing question in a fast-moving TV interview.

Furthermore, Seitter’s letter invites skepticism. It pronounces confidently on causes of global warming “in recent decades” even though this is where the literature is most disputed and uncertain. Climate models have overestimated warming in recent decades for reasons that are not yet known. Key mechanisms of natural variability are not well understood, and measured climate sensitivity to CO2 appears to be lower than modelers assumed. Climate models tweaked to get recent Arctic sea ice changes right get overall warming even more wrong, adding to the list of puzzles. But to the meteorological society, the fact that these and many other questions are unresolved does not prevent them from insisting on uniformity of opinion.

Summary

The meteorological society letter is all about enforcing orthodoxy, which speaks ill of the leadership’s overall views on open scientific debate.

Ross McKitrick is a professor of economics at the University of Guelph and an Adjunct Scholar at the Cato Institute.

See also:  Nature’s Sunscreen and Climate Biorhythms

Footnote: Arnd’s comment below reminds me of this image. It works even better with Republican Rick Perry testifying to the ocean’s climate dominance.


Man Made Warming from Adjusting Data

Roger Andrews does a thorough job analyzing the effects of adjustments upon Surface Air Temperature (SAT) datasets. His article at Energy Matters is Adjusting Measurements to Match the Models – Part 1: Surface Air Temperatures. Excerpts of text and some images are below.  The whole essay is informative and supports his conclusion:

In previous posts and comments I had said that adjustments had added only about 0.2°C of spurious warming to the global SAT record over the last 100 years or so – not enough to make much difference. But after further review it now appears that they may have added as much as 0.4°C.

For example, these graphs show warming of the GISS dataset:

Figure 2: Comparison of “Old” and “Current” GISS meteorological station surface air temperature series, annual anomalies relative to 1950-1990 means

The current GISS series shows about 0.3°C more global warming than the old version, with about 0.2°C more warming in the Northern Hemisphere and about 0.5°C more in the Southern. The added warming trends are almost exactly linear except for the downturns after 2000, which I suspect (although can’t confirm) are a result of attempts to track the global warming “pause”. How did GISS generate all this extra straight-line warming? It did it by replacing the old unadjusted records with “homogeneity-adjusted” versions.

The homogenization operators used by others have had similar impacts, with Berkeley Earth Surface Temperature (BEST) being a case in point. Figure 3, which compares warming gradients measured at 86 South American stations before and after BEST’s homogeneity adjustments (from Reference 1), visually illustrates what a warming-biased operator does at larger scales. Before homogenization 58 of the 86 stations showed overall warming, 28 showed overall cooling, and the average warming trend for all stations was 0.54°C/century. After homogenization all 86 stations show warming and the average warming trend increases to 1.09°C/century (a sketch of this before/after bookkeeping follows the figure):

Figure 3: Warming vs. cooling at 86 South American stations before and after BEST homogeneity adjustments
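
The before/after comparison reduces to fitting a linear trend to each station’s raw and homogenized series and tallying the signs. Below is a minimal sketch of that bookkeeping, with synthetic series standing in for the 86 station records and a hypothetical uniform warming adjustment:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900, 2011)

def trend_per_century(series):
    return np.polyfit(years, series, 1)[0] * 100.0   # slope in units/century

# Synthetic stand-ins for 86 station records: mixed warming/cooling raw
# trends, then a uniform hypothetical warming adjustment added to each.
raw = [0.005 * rng.normal(1.0, 2.0) * (years - 1900)
       + rng.normal(0, 0.3, years.size) for _ in range(86)]
adjusted = [s + 0.006 * (years - 1900) for s in raw]

for label, group in (("raw", raw), ("adjusted", adjusted)):
    trends = [trend_per_century(s) for s in group]
    n_warm = sum(t > 0 for t in trends)
    print(f"{label}: {n_warm}/86 warming, mean {np.mean(trends):+.2f} °C/century")
```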

The adjusted “current” GISS series match the global and Northern Hemisphere model trend line gradients almost exactly but overstate warming relative to the models in the Southern (although this has only a minor impact on the global mean because the Southern Hemisphere has a lot less land and therefore contributes less to the global mean than does the Northern). But the unadjusted “old” GISS series, which I independently verified with my own from-scratch reconstructions, consistently show much less warming than the models, confirming that the generally good model/observation match is entirely a result of the homogeneity adjustments applied to the raw SAT records.

Summary

In this post I have chosen to combine a large number of individual examples of “data being adjusted to match it to the theory” into one single example that blankets all of the surface air temperature records. The results indicate that warming-biased homogeneity adjustments have resulted in current published series overestimating the amount by which surface air temperatures over land have warmed since 1900 by about 0.4°C (Table 1), and that global surface air temperatures have increased by only about 0.7°C over this period, not by the ~1.1°C shown by the published SAT series.

Land, however, makes up only about 30% of the Earth’s surface. The subject of the next post will be sea surface temperatures in the oceans, which cover the remaining 70%. In it I will document more examples of measurement manipulation malfeasance, but with a twist. Stay tuned.

Footnote:

I have also looked into this issue by analyzing a set of US stations considered to have the highest CRN rating. The impact of adjustments was similarly evident and in the direction of warming the trends. See Temperature Data Review Project: My Submission.


Eemian and Holocene Climates


Hansen is publishing a new paper in support of the children suing the government for not fighting climate change. In it he claims temperatures now are higher than in the Holocene and match the Eemian, so we should expect comparable sea levels.

Paper is here:
http://www.earth-syst-dynam.net/8/577/2017/

Abstract. Global temperature is a fundamental climate metric highly correlated with sea level, which implies that keeping shorelines near their present location requires keeping global temperature within or close to its preindustrial Holocene range. However, global temperature excluding short-term variability now exceeds +1 °C relative to the 1880–1920 mean and annual 2016 global temperature was almost +1.3 °C. We show that global temperature has risen well out of the Holocene range and Earth is now as warm as it was during the prior (Eemian) interglacial period, when sea level reached 6–9 m higher than today. Further, Earth is out of energy balance with present atmospheric composition, implying that more warming is in the pipeline, and we show that the growth rate of greenhouse gas climate forcing has accelerated markedly in the past decade. The rapidity of ice sheet and sea level response to global temperature is difficult to predict, but is dependent on the magnitude of warming. Targets for limiting global warming thus, at minimum, should aim to avoid leaving global temperature at Eemian or higher levels for centuries. Such targets now require negative emissions, i.e., extraction of CO2 from the air. If phasedown of fossil fuel emissions begins soon, improved agricultural and forestry practices, including reforestation and steps to improve soil fertility and increase its carbon content, may provide much of the necessary CO2 extraction. In that case, the magnitude and duration of global temperature excursion above the natural range of the current interglacial (Holocene) could be limited and irreversible climate impacts could be minimized. In contrast, continued high fossil fuel emissions today place a burden on young people to undertake massive technological CO2 extraction if they are to limit climate change and its consequences. Proposed methods of extraction such as bioenergy with carbon capture and storage (BECCS) or air capture of CO2 have minimal estimated costs of USD 89–535 trillion this century and also have large risks and uncertain feasibility. Continued high fossil fuel emissions unarguably sentences young people to either a massive, implausible cleanup or growing deleterious climate impacts or both. (my bolds)

The image at the top shows that the Eemian climate was very different due to orbital mechanics, which were nothing like today’s. And as Rud points out, the rise in sea levels took thousands of years at a rate similar to today’s: 2 mm a year.

In addition, Hansen et al. appear to have erased not only the Medieval Warm Period, but also the Roman and Minoan warm periods before it. Perhaps they are using 2016 temperatures as a trampoline for their claims, even though we are already well down from that El Niño event.

Hansen et al. are going over the top, exaggerating even beyond the IPCC in order to proclaim that Waterworld is at hand.

It seems to me that the kind of rise Hansen is looking for comes after an ice age freezes lots of water, resulting in a very low baseline. In the graph below, you can see the beginning of the Holocene around 14 thousand years ago, with the rise slowing to the present rate around 6,000 years ago.

More information is available here:
https://wattsupwiththat.com/2015/06/01/ice-core-data-shows-the-much-feared-2c-climate-tipping-point-has-already-occurred/

For more on the children’s crusade against global warming see: Climate War Human Shields


Climate Biorhythms

Human Biorhythms

The question of whether monitoring biorhythm cycles can actually make a difference in people’s lives has been studied since the 1960s, when the writings of George S. Thommen popularized the idea.

Several companies began experimenting with the idea, and although the Japanese were the first nation to apply biorhythms on a large scale, the Swiss were the first to recognize the benefits of biorhythms in reducing accidents.

Hans Frueh invented the Bio-Card and Bio-Calculator, and Swiss municipal and national authorities appear to have been applying biorhythms for many years before the Japanese experiments. Swissair, which reportedly had been studying the critical days of its pilots for almost a decade previously, did not allow either a pilot or a co-pilot experiencing a critical day to fly with another experiencing the same kind of instability. Reportedly, Swissair had no accidents on those flights where biorhythm had been applied.

Most biorhythm models use three cycles: a 23-day physical cycle, a 28-day emotional cycle, and a 33-day intellectual cycle. Each of these cycles varies sinusoidally between high and low extremes, with days where a cycle crosses the zero line described as “critical days” of greater risk or uncertainty.

The numbers from +100% (maximum) to -100% (minimum) indicate where on each cycle the rhythms are on a particular day. In general, a rhythm at 0% is crossing the midpoint and is thought to have no real impact on your life, whereas a rhythm at +100% (at the peak of that cycle) would give you an edge in that area, and a rhythm at -100% (at the bottom of that cycle) would make life more difficult in that area. There is no particular meaning to a day on which your rhythms are all high or all low, except the obvious benefits or hindrances that these rare extremes are thought to have on your life.
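
The arithmetic behind those percentages is just three sine waves clocked from the date of birth, with a “critical day” wherever a cycle crosses zero. A minimal sketch of the classic model (the dates in the example are arbitrary):

```python
import math
from datetime import date

CYCLES = {"physical": 23, "emotional": 28, "intellectual": 33}

def biorhythms(birth: date, day: date):
    """Classic model: 100 * sin(2*pi*t/P) for t days lived, period P."""
    t = (day - birth).days
    return {name: 100 * math.sin(2 * math.pi * t / p)
            for name, p in CYCLES.items()}

# Example with arbitrary dates; values near 0 mark the "critical days".
for name, pct in biorhythms(date(1960, 3, 15), date(2017, 8, 18)).items():
    print(f"{name:12s} {pct:+6.1f}%")
```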

Human Biorhythms are not proven

Various attempts have been made to validate this biorhythm model, with inconclusive results. It is fair to say that this particular definition of physical, emotional, and intellectual cycles has not been proven. I do not myself subscribe to it, nor have I ever attempted to follow it. My point is mainly to draw an analogy: what if fluctuations in global temperatures are the combined result of multiple cycles of varying lengths?

What About Climate Biorhythms?

At the longer end, we have astronomical cycles on millennial scales, and at the shorter end, we have seasonal cycles. In between there are a dozen or so oceanic cycles, such as ENSO, AMO, and AMOC, that have multi-decadal phases. Then there are solar cycles, ranging from the basic quasi-11-year sunspot cycle to centennial maxima and minima. AARI scientists have documented a quasi-60-year cycle in Arctic ice extents. ETH Zurich has a solar radiation database showing an atmospheric sunscreen that alternately dims or brightens the incoming sunshine over decades (see Nature’s Sunscreen).

It could be that observed warming and cooling periods occur when several of the more powerful cycles coincide in their phases. For example, we are at the moment anticipating an unusually quiet solar cycle, a negative phase of the Pacific Decadal Oscillation (PDO), a cooler North Atlantic (AMO), and possibly a dimming period. Will that coincidence result in temperatures dropping? Was the Little Ice Age caused, and then ended after 1850, by such a coincidence of climate biorhythms?
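
The analogy can be made concrete with a toy superposition: sum a few sinusoids with periods like those mentioned above, and the composite shows multidecadal warming and cooling runs wherever phases happen to line up. The periods and amplitudes below are illustrative guesses, not fitted values, and this is not a climate model.

```python
import numpy as np

years = np.arange(1850, 2101)

# Toy cycles as (period in years, amplitude in °C); values are illustrative.
cycles = [(11, 0.05), (60, 0.15), (1000, 0.25)]

composite = sum(amp * np.sin(2 * np.pi * years / period)
                for period, amp in cycles)

# Decade-over-decade changes show alternating warming and cooling runs that
# arise purely from phase coincidence; there is no trend in the inputs.
vals = composite[::10]
for y, dv in zip(years[::10][1:], np.diff(vals)):
    print(f"decade ending {y}: {dv:+.2f} °C "
          f"({'warming' if dv > 0 else 'cooling'})")
```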

Summary

Our knowledge of these cycles is confounded because we have not yet untangled them to see their individual periodicities, as a basis for probing their interactions and combined influences. Until that day, we should refrain from picking on one thing, like CO2, as though it were a control knob for the whole climate.

Nature’s Sunscreen

Greenhouse with adjustable sun screens to control warming.

A recent post, Planetary Warming: Back to Basics, discussed a paper by Nikolov and Zeller on the atmospheric thermal effect measured on various planets in our solar system. They mentioned that an important source of temperature variation around the earth’s energy balance state can be traced to global brightening and dimming.

This post explores the fact of fluctuations in the amount of solar energy reflected rather than absorbed by the atmosphere and surface. Brightening refers to more incoming solar energy from clear and clean skies. Dimming refers to less solar energy due to more sunlight reflected in the atmosphere by the presence of clouds and aerosols (air-born particles like dust and smoke).

The energy budget above from ERBE shows how important this issue is. On average, half of sunlight is either absorbed in the atmosphere or reflected before it can be absorbed by the surface land and ocean. Any shift in reflectivity (albedo) greatly impacts the solar energy warming the planet.

The leading research on global brightening/dimming is done at the Institute for Atmospheric and Climate Science of ETH Zurich, led by Martin Wild, senior scientist specializing in the subject.

Special instruments have been recording the solar radiation that reaches the Earth’s surface since 1923. However, it wasn’t until the International Geophysical Year in 1957/58 that a global measurement network began to take shape. The data thus obtained reveal that the energy provided by the sun at the Earth’s surface has undergone considerable variations over the past decades, with associated impacts on climate.

The initial studies were published in the late 1980s and early 1990s for specific regions of the Earth. In 1998 the first global study was conducted for larger areas, such as the continents of Africa, Asia, North America and Europe.

Now ETH has announced The Global Energy Balance Archive (GEBA) version 2017: A database for worldwide measured surface energy fluxes. The title is a link to that paper published in May 2017 explaining the facility and some principal findings. The Archive itself is at  http://www.geba.ethz.ch.

For example, Figure 2 below provides the longest continuous record available in GEBA: surface downward shortwave radiation measured in Stockholm since 1922. Five year moving average in blue, 4th order regression model in red. Units Wm-2. Substantial multidecadal variations become evident, with an increase up to the 1950s (“early brightening”), an overall decline from the 1950s to the 1980s (“dimming”), and a recovery thereafter (“brightening”).

Figure 5. Composite of 56 European GEBA time series of annual surface downward shortwave radiation (thin line) from 1939 to 2013, plotted together with a 21-year Gaussian low-pass filter (thick line). The series are expressed as anomalies (in Wm−2) from the 1971–2000 mean. Dashed lines are used prior to 1961 due to the lower number of records for this initial period. Updated from Sanchez-Lorenzo et al. (2015) including data until December 2013.
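
The smoothing in both figures is straightforward to reproduce: a centred 5-year moving average and a Gaussian low-pass filter applied to an annual anomaly series. The sketch below uses synthetic data in place of the GEBA records, and the filter sigma is my assumption for a roughly 21-year kernel, not GEBA’s exact recipe.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(7)
years = np.arange(1922, 2014)

# Synthetic annual anomalies (W m-2) standing in for a GEBA station record.
anom = (2.0 * np.sin(2 * np.pi * (years - 1922) / 60)
        + rng.normal(0, 1.5, years.size))

ma5 = np.convolve(anom, np.ones(5) / 5, mode="same")   # centred 5-year mean

# Gaussian low-pass comparable to a "21-year" filter; sigma is an assumption
# (kernel width of roughly 21 years), not GEBA's published recipe.
lowpass = gaussian_filter1d(anom, sigma=21 / 6)

print("smoothed head:", np.round(ma5[:3], 2), np.round(lowpass[:3], 2))
```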

Martin Wild explains in a 2016 article Decadal changes in radiative fluxes at land and ocean surfaces and their relevance for global warming. From the Conclusion (SSR refers to solar radiation incident upon the surface)

However, observations indicate not only changes in the downward thermal fluxes, but even more so in their solar counterparts, whose records have a much wider spatial and temporal coverage. These records suggest multidecadal variations in SSR at widespread land-based observation sites. Specifically, declining tendencies in SSR between the 1950s and 1980s have been found at most of the measurement sites (‘dimming’), with a partial recovery at many of the sites thereafter (‘brightening’).

With the additional information from more widely measured meteorological quantities which can serve as proxies for SSR (primarily sunshine duration and DTR), more evidence for a widespread extent of these variations has been provided, as well as additional indications for an overall increasing tendency in SSR in the first part of the 20th century (‘early brightening’).

It is well established that these SSR variations are not caused by variations in the output of the sun itself, but rather by variations in the transparency of the atmosphere for solar radiation. It is still debated, however, to what extent the two major modulators of the atmospheric transparency, i.e., aerosol and clouds, contribute to the SSR variations.

The balance of evidence suggests that on longer (multidecadal) timescales aerosol changes dominate, whereas on shorter (decadal to subdecadal) timescales cloud effects dominate. More evidence is further provided for an increasing influence of aerosols during the course of the 20th century. However, aerosol and clouds may also interact, and these interactions were hypothesized to have the potential to amplify and dampen SSR trends in pristine and polluted areas, respectively.

No direct observational records are available over ocean surfaces. Nevertheless, based on the presented conceptual ideas of SSR trends amplified by aerosol–cloud interactions over the pristine oceans, modeling approaches as well as the available satellite-derived records it appears plausible that also over oceans significant decadal changes in SSR occur.

The coinciding multidecadal variations in SSTs and global aerosol emissions may be seen as a smoking gun, yet it is currently an open debate to what extent these SST variations are forced by aerosol-induced changes in SSR, effectively amplified by aerosol– cloud interactions, or are merely a result of unforced natural variations in the coupled ocean atmosphere system. Resolving this question could state a major step toward a better understanding of multidecadal climate change.

Another paper co-authored by Wild discusses the effects of aerosols and clouds: The solar dimming/brightening effect over the Mediterranean Basin in the period 1979−2012. (NSWR is Net Short Wave Radiation, equal to surface solar radiation less the reflected portion.)

The analysis reveals an overall increasing trend in NSWR (all skies) corresponding to a slight solar brightening over the region (+0.36 Wm−2 per decade), which is not statistically significant at the 95% confidence level (C.L.). An increasing trend (+0.52 Wm−2 per decade) is also shown for NSWR under clean skies (without aerosols), which is statistically significant (P = 0.04).

This indicates that NSWR increases at a higher rate over the Mediterranean due to cloud variations only, because of a declining trend in COD (Cloud Optical Depth). The peaks in NSWR (all skies) in certain years (e.g., 2000) are attributed to a significant decrease in COD (see Figs. 9 and 10), while the two data series (NSWRall and NSWRclean) are highly correlated (r = 0.95).

This indicates that cloud variation is the major regulatory factor for the amount and multi-decadal trends in NSWR over the Mediterranean Basin. (Note: lower cloud optical depth is caused by less opaque clouds and/or a decrease in overall cloudiness.)

On the other hand, the results do not reveal a reversal from dimming to brightening during 1980s, as shown in several studies over Europe (Norris and Wild, 2007;Sanchez-Lorenzoet al., 2015), but a rather steady slight increasing trend in solar radiation, which, however, seems to be stabilized during the last years of the data series, in agreement with Sanchez-Lorenzo et al. (2015). Similarly, Wild (2012) reported that the solar brightening was less distinct at European sites after 2000 compared to the 1990s.

In contrast, the NSWR under clear (cloudless) skies shows a slight but statistically significant decreasing trend (−0.17 W m−2 per decade, P=0.002), indicating an overall decrease in NSWR over the Mediterranean due to water-vapor variability, suggesting a transition to a more humid environment under a warming climate.
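To see how such decadal trends and their significance levels are typically obtained, here is a minimal sketch using ordinary least squares on a synthetic annual series; the numbers are invented for illustration and are not the study's data. It shows why a slope of +0.36 W m−2 per decade can fail a significance test that a similar slope passes: significance depends on the year-to-year scatter, not the slope alone.

```python
# Illustrative sketch (synthetic data, not the paper's): estimating a decadal
# trend in annual-mean NSWR and its p-value by ordinary least squares.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1979, 2013)                    # the 1979-2012 study period
true_slope = 0.036                               # W m-2 per year (= 0.36 per decade)
nswr = 220 + true_slope * (years - years[0]) + rng.normal(0, 2.5, years.size)

res = stats.linregress(years, nswr)
print(f"trend: {res.slope * 10:+.2f} W m-2 per decade, p = {res.pvalue:.3f}")
# Whether p falls below 0.05 depends on the noise level: the same slope can be
# significant for a low-variance clean-sky series yet not for all-sky NSWR.
```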

Other researchers find cloudiness more dominant than aerosols. For example, The cause of solar dimming and brightening at the Earth’s surface during the last half century: Evidence from measurements of sunshine duration by Gerald Stanhill et al.

Analysis of the Angstrom-Prescott relationship between normalized values of global radiation and sunshine duration measured during the last 50 years at five sites with a wide range of climate and aerosol emissions showed few significant differences in atmospheric transmissivity under clear or cloud-covered skies between years when global dimming occurred and years when global brightening was measured; nor, in most cases, were there any significant changes in the parameters or in their relationships to annual rates of fossil fuel combustion in the surrounding 1° cells. It is concluded that at the sites studied changes in cloud cover rather than anthropogenic aerosol emissions played the major role in determining solar dimming and brightening during the last half century, and that there are reasons to suppose these findings may have wider relevance.

Summary

The final words go to Martin Wild from Enlightening Global Dimming and Brightening.

Observed Tendencies in surface solar radiation
Figure 2. Changes in surface solar radiation observed in regions with good station coverage during three periods. (left column) The 1950s–1980s show predominant declines (“dimming”); (middle column) the 1980s–2000 indicate partial recoveries (“brightening”) at many locations, except India; and (right column) recent developments after 2000 show mixed tendencies. Numbers denote typical literature estimates for the specified region and period in W m–2 per decade. Based on various sources as referenced in Wild (2009).

The latest updates on solar radiation changes observed since the new millennium show no globally coherent trends anymore (see above and Fig. 2). While brightening persists to some extent in Europe and the United States, there are indications for a renewed dimming in China associated with the tremendous emission increases there after 2000, as well as unabated dimming in India (Streets et al. 2009; Wild et al. 2009).

We cannot exclude the possibility that we are currently again in a transition phase and may return to a renewed overall dimming for some years to come.

One can’t help but see the similarity between dimming/brightening and patterns of Global Mean Temperature, such as HadCRUT.

Footnote: For more on clouds, precipitation and the ocean, see Here Comes the Rain Again

Updated: Climates Don’t Start Wars, People Do

Update July 14

A new study has looked into the Syrian civil war, which has been the poster child for those claiming climate causes human conflict. H/T to Mike Hulme, who posted Climate Change and the Syrian Civil War Revisited. The study concluded:

“For proponents of the view that anthropogenic climate change will become a ‘threat multiplier’ for instability in the decades ahead, the Syrian civil war has become a recurring reference point, providing apparently compelling evidence that such conflict effects are already with us. According to this view, human-induced climatic change was a contributory factor in the extreme drought experienced within Syria prior to its civil war; this drought in turn led to large-scale migration; and this migration in turn exacerbated the socio-economic stresses that underpinned Syria’s descent into war. This article provides a systematic interrogation of these claims, and finds little merit to them. Amongst other things it shows that there is no clear and reliable evidence that anthropogenic climate change was a factor in Syria’s pre-civil war drought; that this drought did not cause anywhere near the scale of migration that is often alleged; and that there exists no solid evidence that drought migration pressures in Syria contributed to civil war onset. The Syria case, the article finds, does not support ‘threat multiplier’ views of the impacts of climate change; to the contrary, we conclude, policymakers, commentators and scholars alike should exercise far greater caution when drawing such linkages or when securitising climate change.”  (my bold)

Original Post

Once again the media are promoting a link between climate change and human conflicts. It is obvious to anyone in their right mind that wars correlate with environmental destruction. From the rioting in Watts to the wars in Iraq and the current chaos in Syria, there is no doubt that fighting degrades the environment big time.

What is strange here is the notion that changes in temperatures and/or rainfall cause the conflicts in the first place. The researchers that advance this claim are few in number and are hotly disputed by many others in the field, but you would not know that from the one-sided coverage in the mass media.

The Claim

Lately the fuss arises from this study: Climate, conflict, and social stability: what does the evidence say?, Hsiang, S.M. & Burke, M. Climatic Change (2014) 123: 39. doi:10.1007/s10584-013-0868-3

Hsiang and Burke (2014) examine 50 quantitative empirical studies and find a “remarkable convergence in findings” (p. 52) and “strong support for a causal association” (p. 42) between climatological changes and conflict at all scales and across all major regions of the world. A companion paper by Hsiang et al. (2013) that attempts to quantify the average effect from these studies indicates that a 1 standard deviation (σ) increase in temperature or rainfall anomaly is associated with an 11.1% change in the risk of “intergroup conflict”. Assuming that future societies respond similarly to climate variability as past populations, they warn that increased rates of human conflict might represent a “large and critical impact” of climate change.

The Bigger Picture

This assertion is disputed by numerous researchers, some 26 of whom joined in a peer-reviewed comment: One effect to rule them all? A comment on climate and conflict, Buhaug, H., Nordkvelle, J., Bernauer, T. et al. Climatic Change (2014) 127: 391. doi:10.1007/s10584-014-1266-1

In contrast to Hsiang and coauthors, we find no evidence of a convergence of findings on climate variability and civil conflict. Recent studies disagree not only on the magnitude of the impact of climate variability but also on the direction of the effect. The aggregate median effect from these studies suggests that a one-standard deviation increase in temperature or loss of rainfall is associated with a 3.5 % increase in conflict risk, although the 95 % highest density area of the distribution of effects cannot exclude the possibility of large negative or positive effects. With all contemporaneous effects, the aggregate point estimate increases somewhat but remains statistically indistinguishable from zero.

To be clear, this commentary should not be taken to imply that climate has no influence on armed conflict. Rather, we argue – in line with recent scientific reviews (Adger et al. 2014; Bernauer et al. 2012; Gleditsch 2012; Klomp and Bulte 2013; Meierding 2013; Scheffran et al. 2012a,b; Theisen et al. 2013; Zografos et al. 2014) – that research to date has failed to converge on a specific and direct association between climate and violent conflict.
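The dispute turns in part on how per-study effects are aggregated. As a hypothetical illustration (the effect values below are invented, not taken from either paper), a median across heterogeneous study estimates, reported together with its spread, can look very different from a single headline number:

```python
# Hypothetical sketch of the aggregation step at issue: per-study estimates of
# % change in conflict risk per 1-sigma climate anomaly, summarized by median
# and interval. All effect values are invented for illustration.
import numpy as np

effects = np.array([-8.0, -2.5, 0.5, 1.0, 2.0, 3.5, 4.0, 6.0, 9.0, 14.0])

print(f"median effect: {np.median(effects):+.1f}% per 1-sigma anomaly")
lo, hi = np.percentile(effects, [2.5, 97.5])
print(f"95% interval of study effects: [{lo:+.1f}%, {hi:+.1f}%]")
# An interval spanning zero is why Buhaug et al. call the aggregate effect
# "statistically indistinguishable from zero".
```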

The Root of Climate Change Bias

The two sides have continued to publish and the issue is far from settled. Interested observers have examined how serious researchers can disagree so sharply about such findings in climate science.

Modeling and data choices sway conclusions about climate-conflict links, Andrew M. Linke and Frank D. W. Witmer, Institute of Behavioral Science, University of Colorado, Boulder, CO 80309-0483 here

Conclusions about the climate–conflict relationship are also contingent on the assumptions behind the respective statistical analyses. Although this simple fact is generally understood, we stress the disciplinary preferences in modeling decisions.

However, we believe that the Burke et al. finding is not a “benchmark” in the sense that it is the scientific truth or an objective reality because disciplinary-related modeling decisions, data availability and choices, and coding rules are critical in deriving robust conclusions about temperature and conflict.

After adding additional covariates (models 4 and 6), the significant temperature effect in the Burke et al. (1) model disappears, with sociopolitical variables predicting conflict more effectively than the climate variables. Furthermore, this specification provides additional insights into the between- and within-effects that vary for factors such as political exclusion and prior conflict.
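A minimal sketch of the specification sensitivity Linke and Witmer describe, using synthetic data: when a sociopolitical variable drives both temperature exposure and conflict in the simulated data, a climate-only regression shows a "significant" temperature effect that disappears once the covariate is added. Variable names and coefficients are illustrative assumptions, not the authors' actual model:

```python
# Synthetic illustration of covariate sensitivity in climate-conflict models.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
exclusion = rng.normal(size=n)                  # political exclusion (synthetic)
temp = 0.6 * exclusion + rng.normal(size=n)     # temperature correlated with it
conflict = 1.0 * exclusion + 0.0 * temp + rng.normal(size=n)  # temp has no true effect

m1 = sm.OLS(conflict, sm.add_constant(temp)).fit()
m2 = sm.OLS(conflict, sm.add_constant(np.column_stack([temp, exclusion]))).fit()
print("temp coef, climate-only model:", round(m1.params[1], 2))  # spuriously large
print("temp coef, with covariate:    ", round(m2.params[1], 2))  # near zero
```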

Summary

Sociopolitical variables predict conflict more effectively than climate variables. It is well established that poorer countries, such as those in Africa, are more likely to experience chronic human conflicts. It is also obvious that failing states fall into armed conflicts, being unable to govern effectively due to corruption and illegitimacy.

It boggles the mind that activists promote policies to deny cheap, reliable energy for such countries, perpetuating or increasing their poverty and misery, while claiming such actions reduce the chances of conflicts in the future.

Halvard Buhaug concludes (here):

Vocal actors within policy and practice contend that environmental variability and shocks, such as drought and prolonged heat waves, drive civil wars in Africa. Recently, a widely publicized scientific article appears to substantiate this claim. This paper investigates the empirical foundation for the claimed relationship in detail. Using a host of different model specifications and alternative measures of drought, heat, and civil war, the paper concludes that climate variability is a poor predictor of armed conflict. Instead, African civil wars can be explained by generic structural and contextual conditions: prevalent ethno-political exclusion, poor national economy, and the collapse of the Cold War system.

Footnote:  The Joys of Playing Climate Whack-A-Mole

Dealing with alarmist claims is like playing whack-a-mole. Every time you beat down one bogeyman, another one pops up in another field, and later the first one returns, needing to be confronted again. I have been playing Climate Whack-A-Mole for a while, and if you are interested, there are some hammers supplied below.

The alarmist methodology is repetitive; only the subject changes. First, create a computer model purporting to be a physical or statistical representation of the real world. Then play with the parameters until fears are supported by the model outputs. Disregard or discount divergences from empirical observations. This pattern is described in more detail at Chameleon Climate Models.

This post is the latest in a series here which apply reality filters to test climate models. The first was Temperatures According to Climate Models, where both hindcasting and forecasting were seen to be flawed.

Others in the Series are:

Sea Level Rise: Just the Facts

Data vs. Models #1: Arctic Warming

Data vs. Models #2: Droughts and Floods

Data vs. Models #3: Disasters

Data vs. Models #4: Climates Changing

Climate Medicine

Beware getting sucked into any model.

Planetary Warming: Back to Basics


It is often said we must rely on projections from computer simulations of earth’s climate since we have no other earth on which to experiment. That is not actually true, since we have observations of a number of planetary bodies in our solar system that also have atmospheres.

This is brought home by a paper published recently in the journal “Environment Pollution and Climate Change,” written by Ned Nikolov, a Ph.D. in physical science, and Karl Zeller, a retired Ph.D. research meteorologist (the title below links to the paper). H/T to Tallbloke for posting on this (here) along with comments by one of the authors.

New Insights on the Physical Nature of the Atmospheric Greenhouse Effect Deduced from an Empirical Planetary Temperature Model

Nikolov and Zeller have written before on this topic, but this paper takes advantage of data from recent decades of space exploration as well as improved observatories. It is thorough, educational and makes a convincing case that a planet’s surface temperature can be predicted from two variables: distance from the sun and atmospheric mass. This post provides some excerpts and exhibits as a synopsis, hopefully to encourage reading the paper itself.

Abstract

A recent study has revealed that the Earth’s natural atmospheric greenhouse effect is around 90 K or about 2.7 times stronger than assumed for the past 40 years. A thermal enhancement of such a magnitude cannot be explained with the observed amount of outgoing infrared long-wave radiation absorbed by the atmosphere (i.e. ≈ 158 W m-2), thus requiring a re-examination of the underlying Greenhouse theory.

We present here a new investigation into the physical nature of the atmospheric thermal effect using a novel empirical approach toward predicting the Global Mean Annual near-surface equilibrium Temperature (GMAT) of rocky planets with diverse atmospheres. Our method utilizes Dimensional Analysis (DA) applied to a vetted set of observed data from six celestial bodies representing a broad range of physical environments in our Solar System, i.e. Venus, Earth, the Moon, Mars, Titan (a moon of Saturn), and Triton (a moon of Neptune).

Twelve relationships (models) suggested by DA are explored via non-linear regression analyses that involve dimensionless products comprised of solar irradiance, greenhouse-gas partial pressure/density and total atmospheric pressure/density as forcing variables, and two temperature ratios as dependent variables. One non-linear regression model is found to statistically outperform the rest by a wide margin.

Above: Venusian Atmosphere

Our analysis revealed that GMATs of rocky planets with tangible atmospheres and a negligible geothermal surface heating can accurately be predicted over a broad range of conditions using only two forcing variables: top-of-the-atmosphere solar irradiance and total surface atmospheric pressure. The hereto discovered interplanetary pressure-temperature relationship is shown to be statistically robust while describing a smooth physical continuum without climatic tipping points.

This continuum fully explains the recently discovered 90 K thermal effect of Earth’s atmosphere. The new model displays characteristics of an emergent macro-level thermodynamic relationship heretofore unbeknown to science that has important theoretical implications. A key entailment from the model is that the atmospheric ‘greenhouse effect’ currently viewed as a radiative phenomenon is in fact an adiabatic (pressure-induced) thermal enhancement analogous to compression heating and independent of atmospheric composition. (my bold)
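For orientation, the 90 K figure in the abstract can be compared with the conventional 33 K greenhouse effect using the Stefan-Boltzmann law. The ~255 K effective emission temperature below is standard textbook arithmetic; the ~197 K airless-Earth baseline is Nikolov and Zeller's own figure, used here only to reproduce their subtraction:

```python
# Back-of-envelope check of the temperature gaps discussed in the abstract.
SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W m-2 K-4
S = 1361.0              # solar constant, W m-2
ALBEDO = 0.30           # Earth's Bond albedo (approximate)
T_SURFACE = 288.0       # observed global mean surface temperature, K
T_AIRLESS = 197.0       # Nikolov & Zeller's airless-Earth baseline, K

t_e = ((S * (1 - ALBEDO)) / (4 * SIGMA)) ** 0.25
print(f"effective emission temperature: {t_e:.0f} K")                 # ~255 K
print(f"conventional greenhouse effect: {T_SURFACE - t_e:.0f} K")     # ~33 K
print(f"paper's effect vs airless baseline: {T_SURFACE - T_AIRLESS:.0f} K")  # ~91 K
```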

Earth Atmosphere Density and Temperature Profile

Consequently, the global down-welling long-wave flux presently assumed to drive Earth’s surface warming appears to be a product of the air temperature set by solar heating and atmospheric pressure. In other words, the so-called ‘greenhouse back radiation’ is globally a result of the atmospheric thermal effect rather than a cause for it. (my bold)

Our empirical model has also fundamental implications for the role of oceans, water vapour, and planetary albedo in global climate. Since produced by a rigorous attempt to describe planetary temperatures in the context of a cosmic continuum using an objective analysis of vetted observations from across the Solar System, these findings call for a paradigm shift in our understanding of the atmospheric ‘greenhouse effect’ as a fundamental property of climate.

The research effort demonstrates sound scientific practice: data and sources are fully explained, the pattern analysis is replicable, and the conclusions are set forth in a logical manner. Alternative hypotheses were explored and rejected in favor of one that explains the observations to near perfection and also shows applicability to other cases.

Equation (10a) implies that GMATs of rocky planets can be calculated as a product of two quantities: the planet’s average surface temperature in the absence of an atmosphere (Tna, K) and a nondimensional factor (Ea ≥ 1.0) quantifying the relative thermal effect of the atmosphere.
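In code form, the structure of Eq. (10a) is simply the product of those two quantities; using the paper's ~197 K airless baseline for Earth and the observed ~288 K GMAT implies an enhancement factor Ea of about 1.46. This is a sketch of the equation's form only, not a re-derivation:

```python
# Minimal sketch of the two-factor structure of Eq. (10a): GMAT = Tna * Ea.
def gmat(t_na: float, e_a: float) -> float:
    """Global mean annual temperature (K) as airless baseline times enhancement."""
    assert e_a >= 1.0, "the paper constrains Ea >= 1.0"
    return t_na * e_a

# Earth: the paper's Tna (~197 K) and observed GMAT (~288 K) give Ea ~ 1.46.
print(gmat(197.0, 288.0 / 197.0))   # recovers ~288 K
```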

As an example of technical descriptions, consider how the paper describes issues relating to the calculation of Tna.

For bodies with tangible atmospheres (such as Venus, Earth, Mars, Titan and Triton), one must calculate Tna using αe=0.132 and ηe=0.00971, which assumes a Moon-like airless reference surface in accordance with our pre-analysis premise. For bodies with tenuous atmospheres (such as Mercury, the Moon, Calisto and Europa), Tna should be calculated from Eq. (4a) (or Eq. 4b respectively if S>0.15 W m-2 and/or Rg ≈ 0 W m-2) using the body’s observed values of Bond albedo αe and ground heat storage fraction ηe.

In the context of this model, a tangible atmosphere is defined as one that has significantly modified the optical and thermo-physical properties of a planet’s surface compared to an airless environment and/or noticeably impacted the overall planetary albedo by enabling the formation of clouds and haze. A tenuous atmosphere, on the other hand, is one that has not had a measurable influence on the surface albedo and regolith thermo-physical properties and is completely transparent to shortwave radiation.

The need for such delineation of atmospheric masses when calculating Tna arises from the fact that Eq. (10a) accurately describes RATEs of planetary bodies with tangible atmospheres over a wide range of conditions without explicitly accounting for the observed large differences in albedos (i.e., from 0.235 to 0.90) while assuming constant values of αe and ηe for the airless equivalent of these bodies. One possible explanation for this counterintuitive empirical result is that atmospheric pressure alters the planetary albedo and heat storage properties of the surface in a way that transforms these parameters from independent controllers of the global temperature in airless bodies to intrinsic byproducts of the climate system itself in worlds with appreciable atmospheres. In other words, once atmospheric pressure rises above a certain level, the effects of albedo and ground heat storage on GMAT become implicitly accounted for by Eq. (11). (my bold)

Significance

Equation (10b) describes the long-term (30 years) equilibrium GMATs of planetary bodies and does not predict inter-annual global temperature variations caused by intrinsic fluctuations of cloud albedo and/or ocean heat uptake. Thus, the observed 0.82 K rise of Earth’s global temperature since 1880 is not captured by our model, since this warming was likely not the result of an increased atmospheric pressure. Recent analyses of observed dimming and brightening periods worldwide [97-99] suggest that the warming over the past 130 years might have been caused by a decrease in global cloud cover and a subsequent increased absorption of solar radiation by the surface. Similarly, the mega shift of Earth’s climate from a ‘hothouse’ to an ‘icehouse’ evident in the sedimentary archives over the past 51 My cannot be explained by Eq. (10b) unless caused by a large loss of atmospheric mass and a corresponding significant drop in surface air pressure since the early Eocene.

Role of greenhouse gases from the new model perspective

Our analysis revealed a poor relationship between GMAT and the amount of greenhouse gases in planetary atmospheres across a broad range of environments in the Solar System (Figures 1-3 and Table 5). This is a surprising result from the standpoint of the current Greenhouse theory, which assumes that an atmosphere warms the surface of a planet (or moon) via trapping of radiant heat by certain gases controlling the atmospheric infrared optical depth [4,9,10]. The atmospheric opacity to LW radiation depends on air density and gas absorptivity, which in turn are functions of total pressure, temperature, and greenhouse-gas concentrations [9]. Pressure also controls the broadening of infrared absorption lines in individual gases. Therefore, the higher the pressure, the larger the infrared optical depth of an atmosphere, and the stronger the expected greenhouse effect would be. According to the present climate theory, pressure only indirectly affects global surface temperature through the atmospheric infrared opacity and its presumed constraint on the planet’s LW emission to Space [9,107].

The artificial decoupling between radiative and convective heat-transfer processes adopted in climate models leads to mathematically and physically incorrect solutions with regard to surface temperature. The LW radiative transfer in a real climate system is intimately intertwined with turbulent convection/advection as both transport mechanisms occur simultaneously. Since convection (and especially the moist one) is orders of magnitude more efficient in transferring energy than LW radiation [3,4], and because heat preferentially travels along the path of least resistance, a properly coupled radiative-convective algorithm of energy exchange will produce quantitatively and qualitatively different temperature solutions in response to a changing atmospheric composition than the ones obtained by current climate models. Specifically, a correctly coupled convective-radiative system will render the surface temperature insensitive to variations in the atmospheric infrared optical depth, a result indirectly supported by our analysis as well. This topic requires further investigation beyond the scope of the present study. (my bold)

The direct effect of atmospheric pressure on the global surface temperature has received virtually no attention in climate science thus far. However, the results from our empirical data analysis suggest that it deserves a serious consideration in the future.

How did Saturn’s moon Titan secure an atmosphere when no other moons in the solar system did? The answer lies largely in its size and location. Here, Titan as imaged in May 2005 by the Cassini spacecraft from about 900,000 miles away. Photo credit: Courtesy NASA/JPL/Space Science Institute

Physical nature of the atmospheric ‘greenhouse effect’

According to Eq. (10b), the heating mechanism of planetary atmospheres is analogous to a gravity-controlled adiabatic compression acting upon the entire surface. This means that the atmosphere does not function as an insulator reducing the rate of planet’s infrared cooling to space as presently assumed [9,10], but instead adiabatically boosts the kinetic energy of the lower troposphere beyond the level of solar input through gas compression. Hence, the physical nature of the atmospheric ‘greenhouse effect’ is a pressure-induced thermal enhancement independent of atmospheric composition. (my bold)

This mechanism is fundamentally different from the hypothesized ‘trapping’ of LW radiation by atmospheric trace gases first proposed in the 19th century and presently forming the core of the Greenhouse climate theory. However, a radiant-heat trapping by freely convective gases has never been demonstrated experimentally. We should point out that the hereto deduced adiabatic (pressure-controlled) nature of the atmospheric thermal effect rests on an objective analysis of vetted planetary observations from across the Solar System and is backed by proven thermodynamic principles, while the ‘trapping’ of LW radiation by an unconstrained atmosphere surmised by Fourier, Tyndall and Arrhenius in the 1800s was based on a theoretical conjecture. The latter has later been coded into algorithms that describe the surface temperature as a function of atmospheric infrared optical depth (instead of pressure) by artificially decoupling radiative transfer from convective heat exchange. Note also that the Ideal Gas Law (PV=nRT) forming the basis of atmospheric physics is indifferent to the gas chemical composition. (my bold)
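The closing remark about the Ideal Gas Law can be illustrated directly. The snippet below shows only the composition-indifference of PV = nRT (it does not, of course, establish that pressure causes the thermal effect): at a given pressure and molar volume, the temperature is the same for any ideal gas, whatever its infrared properties.

```python
# Illustration of the quoted point that PV = nRT is indifferent to composition:
# at the same P and V, one mole of N2, CO2 or argon has the same temperature.
R = 8.314            # gas constant, J mol-1 K-1
P = 101325.0         # Pa, sea-level pressure
V = 0.0236           # m3 occupied by one mole (illustrative value)
n = 1.0              # moles, of any ideal gas

T = P * V / (n * R)
print(f"T = {T:.0f} K for any ideal gas at this P and V")   # ~288 K
```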

Climate stability

Our semi-empirical model (Equations 4a, 10b and 11) suggests that, as long as the mean annual TOA solar flux and the total atmospheric mass of a planet are stationary, the equilibrium GMAT will remain stable. Inter-annual and decadal variations of global temperature forced by fluctuations of cloud cover, for example, are expected to be small compared to the magnitude of the background atmospheric warming because of strong negative feedbacks limiting the albedo changes. This implies a relatively stable climate for a planet such as Earth absent significant shifts in the total atmospheric mass and the planet’s orbital distance to the Sun. Hence, planetary climates appear to be free of tipping points, i.e., functional states fostering rapid and irreversible changes in the global temperature as a result of hypothesized positive feedbacks thought to operate within the system. In other words, our results suggest that the Earth’s climate is well buffered against sudden changes.

The hypothesis that a freely convective atmosphere could retain (trap) radiant heat due to its opacity has remained undisputed since its introduction in the early 1800s even though it was based on a theoretical conjecture that has never been proven experimentally. It is important to note in this regard that the well-documented enhanced absorption of thermal radiation by certain gases does not imply an ability of such gases to trap heat in an open atmospheric environment. This is because, in gaseous systems, heat is primarily transferred (dissipated) by convection (i.e., through fluid motion) rather than radiative exchange.  (my bold)

If gases of high LW absorptivity/emissivity such as CO2, methane and water vapor were indeed capable of trapping radiant heat, they could be used as insulators. However, practical experience has taught us that thermal radiation losses can only be reduced by using materials of very low IR absorptivity/emissivity and correspondingly high thermal reflectivity such as aluminum foil. These materials are known among engineers at NASA and in the construction industry as radiant barriers [129]. It is also known that high-emissivity materials promote radiative cooling. Yet, all climate models proposed since the 1800s were built on the premise that the atmosphere warms Earth by limiting radiant heat losses of the surface through the action of IR-absorbing gases aloft.
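Rough numbers behind the radiant-barrier comparison: for a gray surface, net radiative loss scales linearly with emissivity, so foil at ε ≈ 0.05 loses far less by radiation than a painted surface at ε ≈ 0.9. The temperatures below are arbitrary illustrative values, and the simple two-surface formula neglects geometry:

```python
# Net radiative loss from a gray surface at T1 to surroundings at T2,
# simple two-surface form: q = eps * sigma * (T1^4 - T2^4).
SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W m-2 K-4

def net_radiative_loss(eps: float, t1: float, t2: float) -> float:
    """Net radiative flux (W m-2) for a gray surface, emissivity eps."""
    return eps * SIGMA * (t1**4 - t2**4)

for eps, name in [(0.90, "painted surface"), (0.05, "aluminum foil")]:
    q = net_radiative_loss(eps, 300.0, 280.0)
    print(f"{name}: {q:.0f} W m-2")   # ~100 vs ~6 W m-2
```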

If a trapping of radiant heat occurred in Earth’s atmosphere, the same mechanism should also be expected to operate in the atmospheres of other planetary bodies. Thus, the Greenhouse concept should be able to mathematically describe the observed variation of average planetary surface temperatures across the Solar System as a continuous function of the atmospheric infrared optical depth and solar insolation. However, to our knowledge, such a continuous description (model) does not exist. 

Summary

The planetary temperature model consisting of Equations (4a), (10b), (11) has several fundamental theoretical implications, i.e.,
• The ‘greenhouse effect’ is not a radiative phenomenon driven by the atmospheric infrared optical depth as presently believed, but a pressure-induced thermal enhancement analogous to adiabatic heating and independent of atmospheric composition;
• The down-welling LW radiation is not a global driver of surface warming as hypothesized for over 100 years but a product of the near-surface air temperature controlled by solar heating and atmospheric pressure;
• The albedo of planetary bodies with tangible atmospheres is not an independent driver of climate but an intrinsic property (a byproduct) of the climate system itself. This does not mean that the cloud albedo cannot be influenced by external forcing such as solar wind or galactic cosmic rays. However, the magnitude of such influences is expected to be small due to the stabilizing effect of negative feedbacks operating within the system. This novel understanding explains the observed remarkable stability of planetary albedos;
• The equilibrium surface temperature of a planet is bound to remain stable (i.e., within ± 1 K) as long as the atmospheric mass and the TOA mean solar irradiance are stationary. Hence, Earth’s climate system is well buffered against sudden changes and has no tipping points;
• The proposed net positive feedback between surface temperature and the atmospheric infrared opacity controlled by water vapor appears to be a model artifact resulting from a mathematical decoupling of the radiative-convective heat transfer rather than a physical reality.

Update July 13, 2017

Michael Lewis pointed to a link on this subject in his comment below. Reading the discussion thread again, I appreciated this point-by-point response from Kristian to Tim Folkerts, so I am adding it to the post. (To be clear, T_e means emission temperature, the same as Tna in the article above, while T_s means surface temperature.)

Kristian says:

August 4, 2016 at 9:24 AM

Tim Folkerts says, August 3, 2016 at 3:33 PM:
“In any case, it seems we both agree that the atmosphere has some warming effect.”
That’s quite obvious. You only need to compare Earth’s T_s with the Moon’s.

“I agree that the mass itself plays a role. Mass creates thermal inertia to even out temperature swings. The mass of the atmosphere (and oceans) also allows convection to carry energy from warmer areas to cooler areas, which further reduces variations. By themselves, these could do no more than bring T_s UP TOWARD T_e.”
True.

“I see no physics that would explain mass itself raising T_s ABOVE T_e.”
Just as the radiative properties of gaseous molecules are also not able – all by themselves – to raise a planet’s T_s above its T_e. No, both mass and radiative properties are needed.

“To get above T_e we need something to change the outgoing thermal radiation, eg GHGs at a high enough altitude to be significantly cooler than the surface.”
Yes, but then we also need an air column above the solar-heated surface that can have such a “high enough altitude” in the first place. We also need that altitude to be cooler on average than the surface. IOW, we need mass. A certain gas density/pressure (molecular interaction). And we need fluid dynamics.

“So the key factor is ALTITUDE here (with some definite dependence of the concentrations of the GHGs as well).”
No. There is no dependence on the CONCENTRATION/CONTENT of IR-active constituents in an atmosphere. An atmosphere definitely needs to be IR active (although it’s evidently not enough) for a planet’s T_s to become higher than its T_e. It also needs to be IR active to be able to adequately rid itself of its absorbed energy from the surface (radiatively AND non-radiatively transferred) and directly from the Sun. But once it’s IR active, there is no dependence on the degree of activity. Because then the atmosphere has become stably convectively operative. And all that matters from then on is atmospheric MASS and SOLAR INPUT (TSI and global albedo).

“You say that atmospheric mass seems to force. Do you think that mass alone without GHGs could force temperatures higher than T_e?”
No. Just like “GHGs” alone could also not force T_s higher than T_e. You need both.
* * *
So I say: There IS a “GHE”. But it’s ultimately massively caused. The radiative properties are simply a tool. A means to an end. And there definitely ISN’T an “anthropogenically enhanced GHE” (AGW). It cannot happen.