Progressively Scaring the World (Lewin book synopsis)

H/T to Global Warming Policy Foundation for this publication. Announcement is here.

Bernie Lewin has written a thorough history explaining the series of environmental scares that built up to the current obsession with global warming/climate change. The story is enlightening to people like me who were not paying attention when much of this activity was going down (prior to the Copenhagen COP, in my case). It also provides a rich description of happenings behind the scenes.

As Lewin explains, it is a particularly modern idea to scare the public with science, and thereby advance a policy agenda. The power of this approach is evident these days, but his book traces it back to more humble origins and describes the process bringing us to the present state of full-blown climate fear. It is a cautionary tale.

“Those who don’t know history are doomed to repeat it.”
― Edmund Burke (1729-1797)

This fearful belief evolved through a series of expanding scares, as diagrammed below.

This article provides only some highlights, while the book exposes the maneuvers and the players, their interests and tactics. Quotes from Lewin appear in italics, with my titles, summaries and bolding.

In the Beginning: The DDT Scare

The Context

A new ‘environmentalism’ arose through a broadening of specific campaigns against environmental destruction and pollution. It began to target more generally the industries and technologies deemed inherently damaging. Two campaigns in particular facilitated this transition, as they came to face-up squarely against the dreams of a fantastic future delivered by unfettered sci-tech progress.

One of these challenged the idea that we would all soon be tearing through the sky and crossing vast oceans in just a few hours while riding our new supersonic jets. But even before the ‘Supersonic Transportation Program’ was announced in 1963, another campaign was already gathering unprecedented support. This brought into question the widely promoted idea that a newly invented class of chemicals could safely bring an end to so much disease and destruction—of agriculture, of forests, and of human health—through the elimination of entire populations of insects. Pg.16

When the huge DDT spraying programs began, the Sierra Club’s immediate concern was the impact on nature reserves. But then, as the movement against DDT developed, and as it became increasingly involved, it began to broaden its interest and transform. By the end of the 1960s it and other similar conservation organisations were leading the new environmentalism in a broader campaign against DDT and other technological threats to the environment. Pg.18

The Alarm

This transformation was facilitated by the publication of a single book that served to consolidate the case against the widespread and reckless use of organic pesticides: Silent Spring. The author, Rachel Carson, had published two popular books on ocean ecology and a number of essays on ecological themes before Silent Spring came out in 1962. As with those earlier publications, one of the undoubted contributions of the book was the education of the public in a scientific understanding of nature. Pg.18

We will never know how Carson would have responded to the complete ban on DDT in the USA. She was suffering from cancer while writing Silent Spring and died shortly after publication (leaving the royalties from its sale to the Sierra Club), but the ban was not achieved for another decade. What we do know is that a full ban was never her intention. She supported targeted poisoning programs in place of blanket spraying, and she urged the authorities to look for alternative and ‘integrated control’, along the lines of the ‘Integrated Pest Management’ approach that is common and accepted today. Pg.19

The Exaggeration

Overall, by today’s standards at least, Carson’s policy position was moderate, and so we should be careful not to attribute to her the excesses of her followers. The trouble with Carson was otherwise: it was in her use and abuse of science to invoke in her readers an overwhelming fear. In Silent Spring, scientific claims find dubious grounding in the evidence. Research findings are exaggerated, distorted and then merged with the purely anecdotal and the speculative, to great rhetorical effect. Pg.19

Historically, the most important area of distortion is in linking organic pesticides with human cancers. The scientific case for DDT as a carcinogen has never been strong and it certainly was not strong when Silent Spring was published. Of course, uncertainty remained, but Carson used the authority of science to go beyond uncertainty and present DDT as a dangerous carcinogen. And it was not just DDT; Carson depicts us ‘living in a sea of carcinogens’, mostly of our own making, and for which there is ‘no safe dose’. Pg.19

The Legacy

If we are to understand how the EPA ban came about, it is important to realise that this action succeeded in breaking a policy stalemate that was becoming increasingly hazardous for the increasingly embattled Nixon administration. On one side of this stalemate were the repeated scientific assessments pointing to a moderate position, while on the other side were calls for more and more extreme measures fuelled by more and more outrageous claims. Pg.21

Such sober assessments by scientific panels were futile in the face of the pseudo-scientific catastrophism that was driving the likes of the Audubon Society into a panic over the silencing of the birds. By the early 1970s two things were clear: public anxiety over DDT would not go away, and yet the policy crisis would not be resolved by heeding the recommendations of scientific committees. Instead, resolution came through the EPA, and the special role that it found for itself following the publication of the Sweeney report. Pg.22

Summary

The DDT scare demonstrated an effective method: claim that a chemical pollutant poses a serious public health risk, cancer being the most alarming of all. The media stoked the fear, and politicians acted to quell anxiety despite the weak scientific case. It also set the precedent for a governmental entity (the EPA in this case) to overrule expert advice in response to public opinion.

The SST Scare

The Context

The contribution to the demise of the SST of the environmentalists’ campaign is sometimes overstated, but that is of less concern to our story than the perception that this was their victory. While the DDT campaign was struggling to make headway, the SST campaign would be seen as an early symbolic triumph over unfettered technological progressivism. It provided an enormous boost to the new movement and helped to shape it. Back in 1967, the Sierra Club had first come out campaigning against the SST for the sonic shockwaves sweeping the (sparsely populated) wilderness over which it was then set to fly. But as they began to win that argument, tension was developing within the organisation, with some members wishing to take a stronger, more general and ethical stand against new and environmentally damaging technologies such as this. Pg.27

With popular support for environmental causes already blooming across the country, and with the SST program already in jeopardy, scientists finally gained their own position of prominence in the controversy when they introduced some new pollution concerns. . . If that wasn’t enough, environmental concerns were also raised in the most general and cursory terms about the aircraft’s exhaust emissions. These first expressions of pollution concerns would soon be followed by others, from scientists who were brought into the debate to air speculation about various atmospheric catastrophes that would ensue if these supersonic birds were ever allowed to fly. Pg.27

The Alarm

What did make the front page of the New York Times on 2 August 1970 was concern about another climatic effect highlighted in the executive summary of the report. The headline trumpeted ‘Scientists ask SST delay pending study of pollution’ (see Figure 2.1).  The conference had analysed the effect of emissions from a fleet of 500 aircraft flying in the stratosphere, and concerns were raised that the emission of water vapour (and to a lesser extent other emissions) might absorb sunlight sufficiently to have a local or even global effect on climate. . . The climatic change argument remained in the arsenal of the anti-SST campaigners through to the end, but it was soon outgunned by much more dramatic claims about possible damage to the ozone layer. Pg.30

Throughout the 1970s, scientific speculation drove a series of ozone scares, each attracting significant press attention. These would climax in the mid-1980s, when evidence of ozone-depleting effects of spray-can propellants would be discovered in the most unlikely place. This takes us right up to the start of the global warming scare, presenting along the way many continuities and parallels. Indeed, the push for ozone protection up to the 1980s runs somewhat parallel with the global warming movement until the treaty process to mitigate ozone damage suddenly gained traction and became the very model for the process to mitigate global warming. The ozone story therefore warrants a much closer look. Pg.31

For Harold Johnston of the University of California, the real problem with SST exhaust would not be water vapour but oxides of nitrogen. Working all night, the next morning he presented Xerox copies of handwritten work projecting 10–90% depletion. In high traffic areas, there would be no stopping these voracious catalysts: the ozone layer would all but disappear within a couple of years. Even when Johnston later settled for a quotable reduction by half, there could be no quibbling over the dangers to nature and humanity of such massive environmental destruction. Pg.44

A New York Times reporter contacted Johnston to confirm his claims and although the report he delivered was subdued, the story remained alarming. It would take less than a year of full-fleet operations, Dr Johnston said in a telephone interview, for SSTs to deplete half of the stratospheric ozone that shields the earth from the sun’s ultraviolet radiation. Scientists argued in the SST debate last March that even a 1 percent reduction of ozone would increase radiation enough to cause an additional 10,000 cases of skin cancer a year in the United States. The next day, 19 May 1971, a strong negative vote demolished the funding bill. All but a few stalwarts agreed that one more vote in the House and it was all over for Boeing’s SST. After that final vote, on 30 May, the New York Times followed up on its initial story with a feature on Johnston’s claims. This was written by their leading science writer, Walter Sullivan, an influential science communicator important to our story. Pg.48

The Exaggeration

It is true that in 1971 the link between skin cancer and sun exposure was fairly well established in various ways, including by epidemiological studies that found fair-skinned communities in low latitudes tended to record higher rates. However, the link to ultraviolet light exposure (specifically, the UV-B band) is strongest among those cancers that are most common but are also rarely lethal. The link with the rarer and most dangerous cancers, the malignant melanomas, is not so strong, especially because they often appear on skin that is not usually exposed to the sun. Pg.43

Thus, sceptics of the fuss over the risk of a few percent thinning of the already variable ozone layer would point out that the anti-SST crowd did not seem overly worried about the modern preference for sunshine, which was, on the very same evidence, already presenting a risk many orders of magnitude greater: a small depletion in the ozone layer would be the equivalent of moving a few miles south. To the dismay of their environmentalist opponents, the bolder among these sceptics would recommend the same mitigation measures recommended to the lifestyle migrants—sunscreen, sunglasses and sunhats. Pg.43

But in 1971 there was no way to directly measure stratospheric NOx. No one was even sure whether there was any up there. Nor was there any way to confirm the presence—and, if so, the concentration— of many of the other possibly relevant reactive trace gases. This left scientists only guessing at natural concentrations, and for NOx, Johnston and others had done just that. These ‘best guesses’ were then the basis for modelling of the many possible reactions, the reaction rates, and the relative significance of each in the natural chemistry of the cold thin air miles above. All this speculation would then form the basis of further speculations about how the atmosphere might respond to the impacts of aircraft that had not yet flown; indeed none had even been built. Pg.46

The Legacy

But already the message had got through to where it mattered: to the chair of the Senate Committee on Aeronautical and Space Science, Clinton Anderson. The senator accepted Johnston’s theory on the strength of Sullivan’s account, which he summarised in a letter to NASA before concluding that ‘we either need NOx-free engines or a ban on stratospheric flight’.  And so it turned out that directly after the scrapping of the Boeing prototype, the overriding concern about supersonic exhaust pollution switched from water vapour to NOx. Pg.49

As startling as Johnston’s success appears, it is all the more extraordinary to consider how all the effort directed at solving the NOx problem was never distracted by a rising tide of doubt. The more the NOx effect was investigated, the more complex the chemistry seemed to be and the more doubtful became the original scientific foundations of the scare. In cases of serial uncertainty, the multiplying of best-guess estimates of an effect can shift one way and then the other as the science progresses. But this was never the case with NOx, nor with the SST-ozone scare generally. Pg.50

Summary

The SST scare moved attention to the atmosphere and to the notion of trace gases causing environmental damage, again linked to cancer risk. While ozone was the main issue, climate change was also raised, along with interest in carbon dioxide emissions. Public policy moved to withdraw funding for American SST production and later to ban European SSTs from landing in the US. The episode also demonstrated that fears could be promoted regarding a remote part of nature that was poorly known or understood: models were built projecting fearful outcomes from small changes in atmospheric regions where data was mostly lacking.

The CFC Scare

The Context

Presumptions about the general state of a system’s stability are inevitable in situations of scant evidence, and they tend to determine positions across the sceptic/alarmist divide. Of course, one could suppose a stable system, in which a relatively minor compensatory adjustment might have an alarming impact on civilisation, like the rapid onset of a few metres of rise in sea level. But it is the use of such phrases as ‘disturbing the delicate balance of nature’ or ‘a threat to life on Earth’ that are giveaways to a supposition of instability. Hence Scorer’s incredulity regarding Johnston’s leap towards his catastrophic conclusion: ‘How could it be alleged seriously that the atmosphere would be upset by introducing a small quantity of the most commonly and easily formed compounds of the two elements which comprise 99% of it?’ Pg.68

Meanwhile, ‘Sherry’ Rowland at the University of California was looking around for a new interest. Since 1956 he had been mostly researching the chemistry of radioactive isotopes under funding from the Atomic Energy Commission. Hearing of Lovelock’s work, he was intrigued by the proposal that nearly all the CFCs ever produced might still be out there. Were there no environmental conditions anywhere that would degrade these chemicals? He handed the problem to his post-doctoral research assistant, Mario Molina. Molina eventually concluded that indeed there were no ‘sinks’ for CFCs anywhere in the ocean, soils or lower atmosphere. Thus we should expect that CFCs would drift around the globe, just as Lovelock had proposed, and that they would do so for decades, even centuries. . . or forever? Could mankind have created an organic compound that is so noble that it is almost immortal? Pg.75

The Alarm

The ozone effect that Molina had stumbled upon was different to those previously proposed from rockets and aeroplanes in one important respect: it would be tremendously delayed. Like a hidden cancer, the CFCs would build up quietly and insidiously in the lower atmosphere until their effect on the ozone miles above was eventually detectable, decades later. But when unequivocal evidence finally arrived to support the theory, it would be too late. By then there would be no stopping the destruction of the thin veil protecting us from the Sun’s carcinogenic rays. What Molina had stumbled upon had, in double-dose, one sure element of a good environmental scare. Pg.77

According to Walter Sullivan, they had calculated that spray-can CFCs have already accumulated sufficiently in the upper air to begin depleting the ozone that protects the earth from lethal ultraviolet radiation.  On current emission trends, 30% of the ozone layer would be destroyed as early as 1994. This was no longer a story about saving the sky for our grandchildren. These scientists had found an effect, already in train, with ‘lethal’ consequences for all living things during the lifetime of most of the New York Times’ massive and influential readership. Pg.82

During 1988, the second wave of global environmentalism would reach its peak in the USA, with CFC pollution its first flagship cause. Mid-March saw the US Congress voting unanimously to ratify the Montreal Protocol. It was only the second country to do so, while resistance remained strong in Europe. The following day, NASA announced the results of a huge two-year study of global ozone trends. Pg.107

The new scientific evidence came from a re-analysis of the ozone record. This found that the protective layer over high-population areas in the midlatitudes of the northern hemisphere had been depleted by between 1.7% and 3% from 1969 to 1986. These trends had been calculated after removing the effect of ‘natural geophysical variables’ so as to better approximate the anthropogenic influence. As such, these losses across just 15 years were at much faster rates than expected by the previous modelling of the CFC effect. Pg.107

The statements of the scientists (at least as quoted) made it clear to the press that this panel of experts had interpreted the empirical evidence as showing that a generalised CFC-driven depletion had already begun, and at a much faster rate than expected from the modelling used to inform the Montreal Protocol.  Pg.109

This linking by scientists of the breakup of the southern vortex with low ozone readings in southern Australia during December 1987 morphed into the idea that the ozone hole itself had moved over southern Australia. All sorts of further exaggerations and extrapolations ensued, including the idea of the hole’s continuing year-round presence. An indication of the strength of this mythology is provided by a small survey in 1999 of first-year students in an atmospheric science course at a university in Melbourne. This found that 80% of them believed the ozone hole to be over Australia, 97% believed it to be present during the summer and nearly 80% blamed ozone depletion for Australia’s high rate of skin cancer. Pg.114

After the London ‘Save the Ozone Layer Conference’, the campaign to save the ozone layer was all but won. It is true that a push for funding to assist poor country compliance did gain some momentum at this conference, and it was thought that this might stymie agreement, but promises of aid were soon extracted, and these opened the way for agreement on a complete global phase-out of CFC production. Pg.119

The Exaggeration

Here we had Harvard scientists suggesting that hairspray destruction of the ozone layer had already begun. Verification of the science behind this claim could not have played any part in the breaking of the scare, for there was nothing to show. It turned out that McElroy and Wofsy had not shown their work to anyone, anywhere. Indeed, the calculations they reported to Sullivan were only submitted for publication a few days after the story ran in the New York Times. By that time already, the science did not matter; when McElroy and Wofsy’s calculations finally appeared in print in February 1975, the response to the scare was in full swing, with spray-can boycotts, with ‘ban the can’ campaigns, and with bills to that effect on the table in Congress. Pg.82

It was on track to deliver its findings by April 1976 when it was hit with the shocking discovery of a new chlorine ‘sink’. On receiving this news, it descended into confusion and conflict and this made impossible the timely delivery of its much-anticipated report. The new ‘sink’ was chlorine nitrate. When chlorine reacts to form chlorine nitrate its attack on ozone is neutralised. It was not that chlorine nitrate had previously been ignored, but that it was previously considered very unstable. However, late in 1975 Rowland concluded it was actually quite stable in the mid-stratosphere, and therefore the two most feared ozone eaters—NOx and CFCs—would neutralise each other: not only could natural NOx moderate the CFC effect, but hairsprays and deodorants could serve to neutralise any damage Concorde might cause. Pg.84

Now, at the height of the spray-can scare, there was a shift back to climate. This was reinforced when others began to point to the greenhouse effect of CFCs. An amazing projection, which would appear prominently in the NAS report, was that CFCs alone would increase global mean temperature by 1°C by the end of the century—and that was only at current rates of emissions! In all this, McElroy was critical of Rowland (and others) for attempting to maintain the momentum of the scare by switching to climatic change as soon as doubts about the cancer scare emerged. It looked like the scientists were searching for a new scientific justification of the same policy outcome. Pg.87

The Legacy

The ban on the non-essential uses of spray-can CFCs that came into force in January 1978 marked a peak in the rolling ozone scares of the 1970s. Efforts to sustain the momentum and extend regulation to ‘essential’ spray cans, to refrigeration, and on to a complete ban, all failed. The tail-end of the SST-ozone scare had also petered out after the Franco-British consortium finally won the right to land their Concorde in New York State in 1977. And generally in the late 1970s, the environmental regulation movement was losing traction, with President Carter’s repeated proclamations of an environmental crisis becoming increasingly shrill (more on that below). Eventually, in 1981, Ronald Reagan’s arrival at the White House gave licence and drive to a backlash against environmental regulation that had been building throughout the 1970s. Long before Reagan’s arrival, it was made clear in various forums that further regulatory action on CFCs could only be premised on two things: international cooperation and empirical evidence. Pg.89

To some extent, the demand for better science had always been resisted. From the beginning, advocates conceded that direct and unequivocal evidence of CFC-caused depletion might be impossible to gain before it is too late.  But concerns over whether the science was adequate went deeper. The predictions were based on simple models of a part of our world that was still remote and largely unknown. Pg.91

Summary

The CFC scare brought the focus on dangerous behavior down from the stratosphere to spray cans in the hands of ordinary people, along with the air conditioners so essential to life in the sunny places people prefer. Speculation about ozone holes over the polar regions was also more down to earth. And for the first time, all of this concern produced an international treaty with extraordinary cooperation against CFCs, with UNEP soaring into prominence and gaining much credit for guiding the policy process.

The CO2 Scare

The Context

In the USA during the late 1970s, scientific interest in the potential catastrophic climatic consequences of carbon dioxide emissions came to surpass other climatic concerns. Most importantly, it came to surpass the competing scientific and popular anxiety over global cooling and its exacerbation by aerosol emissions. However, it was only during the late 1980s that the ‘carbon dioxide question’ broke out into the public discourse and transformed into the campaign to mitigate greenhouse warming. For more than a decade before the emergence of this widespread public concern, scientists were working on the question under generous government funding. Pg.122

The proven trigger for the release of funding was to forewarn of catastrophe, to generate public fear and so motivate administrators and politicians to fund investigations targeting the specific issue. The dilemma for the climatic research leadership was that calls for more research to assess the level of danger would fail unless declarations of danger were already spreading fear. Pg.143

The scare that would eventually triumph over all preceding global environmental scares, and the scare that would come to dominate climatic research funding, began with a coordinated, well-funded program of research into potentially catastrophic effects. It did so before there was any particular concern within the meteorological community about these effects, and before there was any significant public or political anxiety to drive it. It began in the midst of a debate over the relative merits of coal and nuclear energy production. Pg.144

The Alarm

In February 1979, at the first ever World Climate Conference, meteorologists would for the first time raise a chorus of warming concern. These meteorologists were not only Americans. Expert interest in the carbon dioxide threat had arisen during the late 1970s in Western Europe and Russia as well. However, there seemed to be nothing in particular that had triggered this interest. There was no new evidence of particular note. Nor was there any global warming to speak of. Global mean temperatures remained subdued, while in 1978 another severe winter descended over vast regions of North America. The policy environment also remained unsympathetic. Pg.184

At last, during the early 1980s, Nature gave some clear signals that it was coming out on the side of the warmers. In the early 1980s it started to become clear that the four-decade general cooling trend was over. Weather station records in the northern mid-latitudes began again to show an upward trend, which was traceable back to a turnaround during the 1970s. James Hansen was early in announcing this shift, and in doing so he also excited a foreboding of manmade warming. Pg.193

Besides, there was a much grander diluvian story that continued to gain currency: the semi-submerged West Antarctic ice sheet might detach and slide into the sea. This was for some an irresistible image of terrible beauty: displacement on a monumental scale, humanity unintentionally applying the lever of industrial emissions to cast off this inconceivably large body of ice. As if imagining some giant icy Archimedes slowly settling into his overflowing bath, Hansen calculated the consequential displacement to give a sea-level rise of 5 or 6 metres within a century. Pg.195

Moreover, it had the imprimatur of the American Association for the Advancement of Science; the AAAS journal, Science, was esteemed in the USA above all others. Thus we can forgive Sullivan his credulity of this string of claims: that the new discovery of ‘clear evidence’ shows that emissions have ‘already warmed the climate’, that this supports a prediction of warming in the next century of ‘almost unprecedented magnitude’, and that this warming might be sufficient to ‘melt and dislodge the ice cover of West Antarctica’. The cooling scare was barely in the grave, but the warmers had been rehearsing in the wings. Now their most daring member jumped out and stole the show. Pg.196

But Hansen went beyond this graph and beyond the conclusion of his published paper to firstly make a strong claim of causation, and then, secondly, to relate this cause to the heat being experienced that year (indeed, the heat being experienced in the hearing room even as he spoke!). He explained that ‘the Earth is warmer in 1988 than at any time in the history of instrumental measurements’. He had calculated that ‘there is only a 1 percent chance of an accidental warming of this magnitude. . . ’ This could only mean that ‘the greenhouse effect has been detected, and it is changing our climate now’. Hansen’s detection claim was covered by all the main television network news services and it won for him another New York Times front page headline: Global warming has begun, expert tells Senate. Pg.224

The Exaggeration

Where SCOPE 29 looked toward the time required for a doubling of the atmospheric concentration of carbon dioxide, at Villach the policy recommendation would be based on new calculations for the equivalent effect when all emitted greenhouse gases were taken into account. The impact of the new calculations was to greatly accelerate the rate of the predicted warming. According to SCOPE 29, on current rates of emissions, doubling of the carbon dioxide concentration would be expected in 2100. At Villach, the equivalent warming effect of all greenhouse gases was expected as early as 2030. Pg.209

This new doubling date slipped under a psychological threshold: the potential lifetime of the younger scientists in the group. Subsequently, these computations were generally rejected and the agreed date for ‘the equivalent of CO2 doubling’ was pushed out at least 20 years; indeed, never again would there be a doubling estimate so proximate with the time in which it was made. Pg.209

Like so many of the consensus statements from this time on, this one is twisted so that it gives the appearance of saying more than it actually does. In this way, those pushing for dramatic effect and those concerned not to overstate the case can come to agreement. In fact, this passage of the statement brings the case for alarm down to the reliability of the modelling, which is pretty much true of SCOPE 29. Pg.210

In other words, the Impact on Climate Change working group concluded that the models are not yet ready to make predictions (however vaguely) about the impact of greenhouse gas emissions on the global climate.  Pg.210

The Legacy

Today, emissions targets dominate discussions of the policy response to global warming, and total emissions rates are tacitly assumed to be locked to a climatic response of one, two or so many degrees of warming. Today’s discussions sit on top of a solid foundation of dogma established across several decades and supposedly supported by a scientific consensus, namely that there is a direct cause–effect temperature response to emissions. Pg.219

One of the main recommendations for mitigating these dire consequences is a comprehensive global treaty to protect the atmosphere. On the specific issue of global warming, the conference statement calls for the stabilisation of atmospheric concentrations of one greenhouse gas, namely carbon dioxide. It estimates that this would require a reduction of current global emissions by more than 50%. However, it suggests an initial goal for nations to reduce their current rates of carbon dioxide emission by 20% by 2005. This rather arbitrary objective would become the headline story: ‘Targets agreed to save climate’. And it stuck. In the emissions-reduction policy debate that followed, this ‘Toronto target’ became the benchmark. For many years to come—indeed, until the Kyoto Protocol of 1997—it would be a key objective of sustainable development’s newly launched flagship. Pg.221

Summary

The framework for international action is established presuming that CO2 emissions directly cause global warming and that all nations must collectively cut their use of fossil fuels. However, the drive for a world treaty is hampered by a lack of proof and scientists’ mixed commitment to the policy goals.

The IPCC Scare

The Context

Before winter closed in at the end of 1988, North America was brimming with warming enthusiasm. In the USA, global warming was promised attention no matter who won the presidential election. In Canada, after the overwhelming success of the Toronto conference, the government continued to promote the cause, most enthusiastically through its environment minister Tom McMillan. Elsewhere among world leaders, enthusiasm was also building. The German chancellor, Helmut Kohl, had been a long-time campaigner against fossil fuels. Pg.224

In February 1989, the year got off to a flying start with a conference in Delhi organised by India’s Tata Energy Research Institute and the Woods Hole Research Center, which convened to consider global warming from the perspective of developing countries. The report of the conference produced an early apportionment of blame and a call for reparations. It proclaimed that the global warming problem had been caused by the industrially developed countries and therefore its remediation should be financed by them, including by way of aid to underdeveloped countries. This call was made after presenting the problem in the most alarming terms: Global warming is the greatest crisis ever faced collectively by humankind, unlike other earlier crises, it is global in nature, threatens the very survival of civilisation, and promises to throw up only losers over the entire international socio-economic fabric. The reason for such a potential apocalyptic scenario is simple: climate change of geological proportions are occurring over time-spans as short as a single human lifetime. Pg.226

Throughout 1989, the IPCC working groups conducted a busy schedule of meetings and workshops at venues around the northern hemisphere. Meanwhile, the outpouring of political excitement that had been channelled into the process brought world attention to the IPCC. By the time of its second full session in June 1989, its treaty development mandate had become clearer: the final version of the resolution that had passed at the UN General Assembly the previous December—now called ‘Protection of global climate for present and future generations of mankind’—requested that the IPCC make recommendations on strengthening relevant existing international legal instruments and on ‘elements for inclusion in a possible future international convention on climate.’ pg.242

The Alarm

The general feeling in the research community that the policy process had surged ahead of the science often had a different effect on those scientists engaged with the global warming issue through its expanded funding. For them, the situation was more as President Bush had intimated when promising more funding: the fact that ‘politics and opinion have outpaced the science’ brought the scientists under pressure ‘to bridge the gap’. Pg.253

This is what became known as the ‘first detection’ program. With funding from DoE and elsewhere, the race was soon on to find ways to achieve early detection of the climate catastrophe signal. More than 10 years later, this search was still ongoing as the framework convention to mitigate the catastrophe was being put in place. It was not so much that the ‘conventional wisdom’ was proved wrong; in other words, that policy action did not in fact require empirical confirmation of the emissions effect. It was more that the policy action was operating on the presumption that this confirmation had already been achieved. Pg.254

The IPCC has warned that if CO2 emissions are not cut by 60 percent immediately, the changes in the next 60 years may be so rapid that nature will be unable to adapt and man incapable of controlling them.  The policy action to meet this threat—the UN Framework Convention on Climate Change—went on to play a leading role as the headline outcome of the entire show. The convention drafted through the INC negotiation over the previous two years would not be legally binding, but it would provide for updates, called ‘protocols’, specifying mandatory emissions limits. Towards the end of the Earth Summit, 154 delegations put their names to the text. Pg.266

The Exaggeration

It may surprise readers that even within the ‘carbon dioxide community’ it was not hard to find the view that the modelling of the carbon dioxide warming was failing validation against historical data and, further upon this admission, the suggestion that their predicted warming effect is wrong. In fact, there was much scepticism of the modelling freely expressed in and around the Carbon Dioxide Program in these days before the climate treaty process began. Those who persisted with the search for validation got stuck on the problem of better identifying background natural variability. There did at least seem to be agreement that any recent warming was well within the bounds of natural variability. Pg.261

During the IPCC review process, Wigley was asked to answer the question that he had avoided in SCOPE 29: When is detection likely to be achieved? He responded with an addition to the IPCC chapter that explains that we would have to wait until the half-degree of warming that had occurred already during the 20th century is repeated. Only then are we likely to determine just how much of it is human-induced. If the carbon dioxide driven warming is at the high end of the predictions, then this would be early in the 21st century, but if the warming was slow then we may not know until 2050 (see Figure 15.1). In other words, scientific confirmation that carbon dioxide emissions are causing global warming is not likely for decades. Pg.263

These findings of the IPCC Working Group 1 assessment presented a political problem. This was not so much that the working group was giving the wrong answers; it was that it had got stuck on the wrong questions, questions obsolete to the treaty process. The IPCC first assessment was supposed to confirm the scientific rationale for responding to the threat of climate change, the rationale previously provided by the consensus statement coming out of the 1985 Villach conference. After that, it would provide the science to support the process of implementing a coordinated response. But instead of confirming the Villach findings, it presented a gaping hole in the scientific rationale. Pg.263

Scientist-advocates would continue their activism, but political leaders who pledged their support for climate action had invested all scientific authority for this action in the IPCC assessment. What did the IPCC offer in return? It had dished up dubiously validated model projections and the prospect of empirical confirmation perhaps not for decades to come. Far from legitimising a treaty, the scientific assessment of Working Group 1 provided governments with every reason to hesitate before committing to urgent and drastic action. Pg.263

In 1995, the IPCC was stuck between its science and its politics. The only way it could save itself from the real danger of political oblivion would be if its scientific diagnosis could shift in a positive direction and bring it into alignment with policy action. Without a positive shift in the science, it is hard to see how even the most masterful spin on another assessment could serve to support momentum towards real commitment in a binding protocol. With ozone protection, the Antarctic hole had done the trick and brought on agreement in the Montreal Protocol. But there was nothing like that in sight for the climate scare. Without a shift in the science, the IPCC would only cause further embarrassment and so precipitate its further marginalisation. Pg.278

For the second assessment, the final meeting of the 70-odd Working Group 1 lead authors was scheduled for July 1995 in Asheville, North Carolina. This meeting was set to finalise the drafting of the chapters in response to review comments. It was also (and mostly) to finalise the draft Summary for Policymakers, ready for intergovernmental review. The draft Houghton had prepared for the meeting was not so sceptical on the detection science as the main text of the detection chapter drafted by Santer; indeed it contained a weak detection claim. However, it matched the introduction to the detection chapter, where Santer had included the claim that ‘the best evidence to date suggests . . . a pattern of climate response to human activities is identifiable in observed climate records’.

This detection claim appeared incongruous with the scepticism throughout the main text of the chapter and was in direct contradiction with its Concluding Summary. It represented a change of view that Santer had only arrived at recently due to a breakthrough in his own ‘fingerprinting’ investigations. These findings were so new that they were not yet published or otherwise available, and, indeed, Santer’s first opportunity to present them for broader scientific scrutiny was when Houghton asked him to give a special presentation to the Asheville meeting. Pg.279

However, the results were also challenged at Asheville: Santer’s fingerprint finding and the new detection claim were vigorously opposed by several experts in the field. One of the critics, John Christy, recalls challenging Santer on his data selection.  Santer recalls disputing the quality of the datasets used by Christy.  Debates over the scientific basis of the detection claim dominated the meeting, sometimes continuing long after the formal discussions had finished and on into the evening. Pg.280

In September, a draft summary of the entire IPCC second assessment was leaked to the New York Times, which revealed the new detection claim on its front page. Pg.281

The UK Independent headlined ‘Global Warming is here, experts agree’ with the subheading: ‘Climate of fear: Old caution dropped as UN panel of scientists concur on danger posed by greenhouse gases.’ The article explains the breakthrough: “The panel’s declaration, after three days of torturous negotiation in Madrid, marks a decisive shift in the global-warming debate. Sceptics have claimed there is no sound evidence that climate has been changed by the billions of tonnes of carbon dioxide and other heat-trapping ‘greenhouse gases’ spewed into the atmosphere each year, mostly from the burning of fossil fuels and forests. But the great majority of governments and climate scientists now think otherwise and are now prepared to say so. ‘The balance of evidence suggests a discernible human influence on global climate’, the IPCC’s summary of its 200-page report says. The last such in-depth IPCC report was published five years ago and was far more cautious.” Pg.283

The Legacy

Stories appearing in the major newspapers over the next few days followed a standard pattern. They told how the new findings had resolved the scientific uncertainty and that the politically motivated scepticism that this uncertainty had supported was now untenable. Not only was the recent success of the attribution finding new to this story; also new was the previous failure. Before this announcement of the detection breakthrough, attention had rarely been drawn to the lack of empirical confirmation of the model predictions, but now this earlier failure was used to give a stark backdrop to the recent success, maximising its impact and giving a scientific green light to policy action. Thus, the standard narrative became: success after the previous failure points the way to policy action. Pg.284

With so many political actors using the authority of the IPCC’s detection finding to justify advancing in that direction, it is hard to disagree with his assessment. Another authority might well have been used to carry the treaty politics forward, but the fact that this particular authority was available, and was used, meant that the IPCC was hauled back into the political picture, where it remains the principal authority on the science to this day. Pg.301

What we can see from all this activity by scientists in the close vicinity of the second and third IPCC assessments is the existence of a significant body of opinion that is difficult to square with the IPCC’s message that the detection of the catastrophe signal provides the scientific basis for policy action. Most of these scientists chose not to engage the IPCC in public controversy and so their views did not impact on the public image of the panel. But even where the scientific basis of the detection claims drew repeated and pointed criticism from those prepared to engage in the public controversy, these objections had very little impact on the IPCC’s public image. Pg.310

Today, after five full assessments and with another on the way, the IPCC remains the pre-eminent authority on the science behind every effort to head off a global climate catastrophe. Pg.310

Summary

Today the IPCC is a testament to the triumph of politics over science, of style and rhetoric over substance and evidence. A “bait and switch” gambit was applied at the right moment to produce the message wanted by the committed. Fooled by the finesse, the media then trumpeted the “idea whose time has come,” and the rest is history, as they say.   And yet, despite IPCC claims to the contrary, the detection question is still not answered for those who demand evidence.

Thank you Bernie Lewin and GWPF for setting the record straight, and for demonstrating how this campaign is sustained by unfounded fears.

A continuing supply of hot air keeps scare balloons inflated.


CO2 Fluxes, Sources and Sinks

A recent post, Obsessed with Human CO2, pointed out how small the flow of CO2 from fossil fuel emissions is compared to natural sources. Human emissions fall within the error ranges around the estimates of fluxes from land, oceans and biosphere. This post looks deeper into the issue and our current state of knowledge about attributing CO2 concentrations in the atmosphere.

Note the size of the human emissions next to the red arrow. (Units are in GT)

Alarming Claims by IPCC Followers

From Chapter 6 Working Group 1 AR5 with my bolds.

With a very high level of confidence, the increase in CO2 emissions from fossil fuel burning and those arising from land use change are the dominant cause of the observed increase in atmospheric CO2 concentration. About half of the emissions remained in the atmosphere (240 ± 10 PgC) since 1750. The rest was removed from the atmosphere by sinks and stored in the natural carbon cycle reservoirs. The ocean reservoir stored 155 ± 30 PgC. Vegetation biomass and soils not affected by land use change stored 160 ± 90 PgC. {6.1, 6.3, 6.3.2.3, Table 6.1, Figure 6.8}

Since the beginning of the Industrial Era (1750), the concentration of CO2 in the atmosphere has increased by 40%, from 278 ± 5 ppm to 390.5 ± 0.1 ppm in 2011 (Figure 6.11; updated from Ballantyne et al. (2012)), corresponding to an increase in CO2 of 240 ± 10 PgC in the atmosphere. Atmospheric CO2 grew at a rate of 3.4 ± 0.2 PgC yr–1 in the 1980s, 3.1 ± 0.2 PgC yr–1 in the 1990s and 4.0 ± 0.2 PgC yr–1 in the 2000s (Conway and Tans, 2011) (Table 6.1).
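The ppm and PgC figures quoted above can be cross-checked against each other using the standard conversion of roughly 2.13 PgC of atmospheric carbon per 1 ppm of CO2 (a conversion factor assumed here for illustration; it is not stated in the excerpt itself):

```python
# Consistency check on the AR5 numbers quoted above.
# Assumption: ~2.13 PgC of atmospheric carbon per 1 ppm CO2
# (a standard conversion, not given in the excerpt itself).
PGC_PER_PPM = 2.13

rise_ppm = 390.5 - 278.0           # increase from 1750 to 2011, in ppm
rise_pgc = rise_ppm * PGC_PER_PPM  # implied increase in carbon mass

print(round(rise_ppm, 1))  # 112.5 ppm
print(round(rise_pgc))     # 240 PgC, matching the quoted 240 +/- 10 PgC
```

The two sets of figures agree to within the quoted ±10 PgC uncertainty.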

Coupled carbon-cycle climate models indicate that less carbon is taken up by the ocean and land as the climate warms constituting a positive climate feedback. Many different factors contribute to this effect: warmer seawater, for instance, has a lower CO2 solubility, so altered chemical carbon reactions result in less oceanic uptake of excess atmospheric CO2. On land, higher temperatures foster longer seasonal growth periods in temperate and higher latitudes, but also faster respiration of soil carbon.

The removal of human-emitted CO2 from the atmosphere by natural processes will take a few hundred thousand years (high confidence). Depending on the RCP scenario considered, about 15 to 40% of emitted CO2 will remain in the atmosphere longer than 1,000 years. This very long time required by sinks to remove anthropogenic CO2 makes climate change caused by elevated CO2 irreversible on human time scale. {Box 6.1}

Alarmist Summary: All of the rise in atmospheric CO2 is caused by humans, is increasing and will last for 1000 years.

Sobering Facts from Scientific Observations

Fact 1. Carbon Cycle fluxes are estimated with uncertainties greater than total human emissions.

Carbon fluxes describe the rate of exchange of carbon between the various carbon sinks / reservoirs.

There are four main carbon sinks – lithosphere (earth crust), hydrosphere (oceans), atmosphere (air), biosphere (organisms).

The rate at which carbon is exchanged between these reservoirs depends on the conversion processes involved:

Photosynthesis – removes carbon dioxide from the atmosphere and fixes it in producers as organic compounds
Respiration – releases carbon dioxide into the atmosphere when organic compounds are digested in living organisms
Decomposition – releases carbon products into the air or sediment when organic matter is recycled after death of an organism
Gaseous dissolution – the exchange of carbon gases between the ocean and atmosphere
Lithification – the compaction of carbon-containing sediments into fossils and rocks within the Earth’s crust (e.g. limestone)
Combustion – releases carbon gases when organic hydrocarbons (coal, oil and gas) are burned as a fuel source

It is not possible to directly measure the size of the carbon sinks or the fluxes between them – instead estimates are made.

Global carbon fluxes are very large and are therefore measured in gigatonnes (1 gigatonne of carbon = 1 billion metric tonnes).

Because carbon fluxes are large and based on measurements from many different sources, estimates have large uncertainties.

A good summary description of carbon fluxes and reservoirs is at University of New Hampshire (here). This figure from IPCC AR4 shows how estimates have been developed. Explanation below with my bolds.

IPCC AR4WG1 Figure 7.3. The global carbon cycle for the 1990s, showing the main annual fluxes in GtC yr–1: pre-industrial ‘natural’ fluxes in black and ‘anthropogenic’ fluxes in red (modified from Sarmiento and Gruber, 2006, with changes in pool sizes from Sabine et al., 2004). The net terrestrial loss of –39 GtC is inferred from cumulative fossil fuel emissions minus atmospheric increase minus ocean storage. The loss of –140 GtC from the ‘vegetation, soil and detritus’ compartment represents the cumulative emissions from land use change (Houghton, 2003), and requires a terrestrial biosphere sink of 101 GtC (in Sabine et al., given only as ranges of –140 to –80 GtC and 61 to 141 GtC, respectively; other uncertainties given in their Table 1). Net anthropogenic exchanges with the atmosphere are from Column 5 ‘AR4’ in Table 7.1. Gross fluxes generally have uncertainties of more than ±20% but fractional amounts have been retained to achieve overall balance when including estimates in fractions of GtC yr–1 for riverine transport, weathering, deep ocean burial, etc. ‘GPP’ is annual gross (terrestrial) primary production. Atmospheric carbon content and all cumulative fluxes since 1750 are as of end 1994.

The diagram shows that anthropogenic emissions of CO2 from burning of fossil fuels cannot be the reason for the increase in atmospheric CO2.

Fact 2. Land-based Carbon Pools Behave Diversely, Defying Global Averaging.

It should be clear from the observational data that Earth’s biosphere is exerting a powerful brake on the rate of rise of the air’s CO2 content, such that the large increases in anthropogenic CO2 emissions of the past two decades have not resulted in any increase in the rate of CO2 accumulation in the atmosphere. The IPCC has yet to acknowledge the existence and sign of this negative feedback, choosing to rely on projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) models. Those models “consistently estimate a positive carbon cycle feedback, i.e. reduced natural sinks or increased natural CO2 sources in response to future climate change.” The models further find “in particular, carbon sinks in tropical land ecosystems are vulnerable to climate change” (p. 21 of the Technical Summary, Second Order Draft of AR5, dated October 5, 2012).

Fluxnet Observation Sites around the world.

Soils are the largest carbon reservoir of the terrestrial carbon cycle. Worldwide they contain three or four times more organic carbon (1500 Gt to 1 m depth, 2500 Gt to 2 m) than vegetation (610 Gt) and twice or three times as much carbon as the atmosphere (750 Gt, see Figure 1) [71]. Carbon storage in soils is the balance between the input of dead plant material (leaf, root litter, and decaying wood) and losses from decomposition and mineralization of organic matter (‘heterotrophic respiration’). Under aerobic conditions, most of the carbon entering the soil returns to the atmosphere by autotrophic root respiration and heterotrophic respiration (together called ‘soil respiration’ or ‘soil CO2 efflux’). The mineralization rate is a function of temperature and moisture levels and chemical environment with factors such as pH, Eh, nitrogen level and the cation exchange capacity of the minerals in the soil affecting the mineralization rate of soil organic carbon (SOC) [72, 73, 74, 75, 76, 77, 78]. Under anaerobic conditions, resulting from constantly high water levels, part of the carbon entering the soil is not fully mineralized and accumulates as peat.

Today, eddy covariance measurements of carbon dioxide and water vapor exchange are being made routinely on all continents.  The flux measurement sites are linked across a confederation of regional networks in North, Central and South America, Europe, Asia, Africa, and Australia, in a global network, called FLUXNET.  This global network includes more than eight hundred active and historic flux measurement sites, dispersed across most of the world’s climate space and representative biomes (Figures 1 and 2). The Fluxnet portal is here. Excerpts with my bolds.

The flux network has also been pivotal in refining the functional response of net and gross carbon dioxide exchange with climatic drivers. One notable observation relates to the sensitivity of ecosystem respiration to temperature: this sensitivity is constant across climate and ecological space, with respiration increasing by a factor of about 1.4 for a ten degree increase in temperature. Another emergent property is the plasticity of the timing of the initiation of the growing season, and how it is triggered by when soil temperature matches mean annual air temperature.

Lessons learned from FLUXNET

One of the first and overarching things we have learned is “what is the net and gross annual carbon fluxes, at sites across the globe?” A collation of data has enabled the community to produce a probability distribution of net carbon exchange that is occurring across the network. We see that the central tendency of net carbon exchange is: −157±285 g C m−2 y−1 (Figure 1), representing a sink of carbon to the terrestrial biosphere from the atmosphere. We are also able to document the range of carbon uptake by terrestrial ecosystems. We find that the most negative tail of the histogram is about -1000 g C m−2 y−1. The most positive tail of the histogram, representing sites acting as carbon sources can be as large as +1000 g C m−2 y−1. Of course these values do not consider net biome exchange that would release pulses of carbon from fires or anthropogenic combustion of fossil fuels.

Fact 3. Fluxes are Dynamic and Difficult to Estimate Reliably.

This summary comes from Helge Hellevang and Per Aagaard in Constraints on natural global atmospheric CO2 fluxes from 1860 to 2010 using a simplified explicit forward model (2015). Excerpt with my bolds.

The relative contribution of the emissions and the efficiency of the biosphere and the ocean to mitigate the increase in atmospheric CO2-concentrations, remain highly uncertain. This is demonstrated in chapter six of the latest IPCC report, where we can read that the net land-atmosphere carbon flux in the 1980s was estimated to −0.1 ± 0.8 Gt C/a (negative numbers denote net uptake). These numbers were partly based on estimates of net CO2 releases caused by land use changes (+1.4 ± 0.8 Gt C/a), and a residual terrestrial sink estimated to −1.5 ± 1.1 Gt C/a.

There are globally much data supporting increased uptake of carbon by the ocean mixed layer (shallow surface water), but the global gross ocean-atmosphere fluxes, partly influenced by annual and inter-annual processes, such as El Niño/La Niña events, are nevertheless not easy to estimate. Obtaining global values of the carbon fluxes are further complicated by large local and regional variations in carbon releases and uptake by the terrestrial biosphere.

Because of the close coupling between oxygen and carbon fluxes during photosynthesis and respiration, the tracer APO (Atmospheric Potential Oxygen), in combination with atmospheric CO2 data, is used to obtain the net amount of CO2 being taken up by the oceanic sink. The net amount of carbon being taken up by the terrestrial biosphere can then be found from the residual (difference between carbon accumulated in the atmosphere and amount taken up by the global oceans).

APO values are however not straightforward to estimate, and a recent study suggests that the strength of the terrestrial sink may be significantly lower than found earlier. Moreover, current measurements of the atmospheric O2/N2 ratio and CO2 concentrations may suggest that the amount of oxygen is dropping at a faster rate than calculated from the APO tracer values.

Fact 4. The Carbon Cycle is driven by Temperature more than Human Emissions.

Global warming, human-induced carbon emissions, and their uncertainties, by FANG JingYun, ZHU JiangLing, WANG ShaoPeng, YUE Chao & SHEN HaiHua. Excerpts with my bolds.

However, the current global carbon balance is disturbed by two factors: one is anthropogenic carbon emissions from fossil fuel combustion and land use change, which are 9–10 Pg C per year [74], i.e. equal to 1/22–1/26 of the natural emissions from terrestrial and oceanic biospheres; and the other is that increasing temperature can result in a positive feedback of carbon emissions caused from a greater soil heterotrophic respiration and from oceanic ecosystems [77, 78]. This increased emission will be reserved in atmosphere and contribute to the increase of atmospheric CO2 concentration if it cannot be absorbed by ecosystems. In this sense, in addition to the anthropogenic carbon emissions, the positive feedback of terrestrial and marine ecosystems to global warming may be another important source of the increasing atmospheric CO2 concentration. The estimation of global carbon budget indicates that a total of the natural and anthropogenic emissions are 250 Pg C per year, whereas the total of absorption by the natural ecosystems and the atmosphere is estimated as 230 Pg C per year (Table 2). This generates a gap of 20 Pg C between the global emissions and absorptions, which is twice the current total anthropogenic emissions (9–10 Pg C/yr). Therefore, there is a great uncertainty in the sources of the increased atmospheric CO2, and we may not reach to the conclusion that elevating atmospheric CO2 concentration is mainly from human activities.
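The arithmetic behind Fang et al.'s uncertainty argument can be laid out explicitly, using only the figures quoted above:

```python
# Global carbon budget figures as quoted from Fang et al. above.
total_emissions  = 250   # Pg C/yr, natural + anthropogenic
total_absorption = 230   # Pg C/yr, taken up by ecosystems and atmosphere
anthropogenic    = 9.5   # Pg C/yr, midpoint of the quoted 9-10 range

gap = total_emissions - total_absorption
print(gap)                            # 20 Pg C/yr unaccounted for
print(round(gap / anthropogenic, 1))  # ~2.1: the gap is about twice human emissions
```

The unexplained residual in the natural budget is thus roughly double the entire human contribution, which is the basis for the authors' caution about attribution.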

Fact 5. CO2 Residence Times are Far Shorter than IPCC Imagines.

Tom Segalstad describes how alarmist dogma evolved in order to explain away contradictory facts. His paper is Carbon cycle modelling and the residence time of natural and anthropogenic atmospheric CO2: on the construction of the “Greenhouse Effect Global Warming” dogma. Excerpts with my bolds.

Both radioactive and stable carbon isotopes show that the real atmospheric CO2 residence time (lifetime) is only about 5 years, and that the amount of fossil-fuel CO2 in the atmosphere is maximum 4%. Any CO2 level rise beyond this can only come from a much larger, but natural, carbon reservoir with much higher 13-C/12-C isotope ratio than that of the fossil fuel pool, namely from the ocean, and/or the lithosphere, and/or the Earth’s interior.

The apparent annual atmospheric CO2 level increase, postulated to be anthropogenic, would constitute only some 0.2% of the total annual amount of CO2 exchanged naturally between the atmosphere and the ocean plus other natural sources and sinks. It is more probable that such a small ripple in the annual natural flow of CO2 would be caused by natural fluctuations of geophysical processes.

13-C/12-C isotope mass balance calculations show that IPCC’s atmospheric CO2 residence time of 50-200 years makes the atmosphere too light (50% of its current CO2 mass) to fit its measured 13-C/12-C isotope ratio. This proves why IPCC’s wrong model creates its artificial 50% “missing sink”. IPCC’s 50% inexplicable “missing sink” of about 3 giga-tonnes carbon annually should have led all governments to reject IPCC’s model.
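A two-component isotope mixing calculation of the kind Segalstad describes can be sketched as follows. The δ13C values used here are representative textbook figures (fossil carbon ≈ −26‰, ocean/natural carbon ≈ −7‰, measured atmosphere ≈ −7.8‰), assumed for illustration rather than taken from his paper:

```python
# Two-source isotope mass balance: what fraction of atmospheric CO2
# could be fossil-derived, given its measured 13C/12C signature?
# The delta-13C values (per mil) are illustrative assumptions, not from the paper.
delta_fossil  = -26.0  # fossil-fuel carbon
delta_natural = -7.0   # ocean/natural carbon
delta_atm     = -7.8   # measured atmosphere

# Mixing: delta_atm = f*delta_fossil + (1-f)*delta_natural  ->  solve for f
f_fossil = (delta_atm - delta_natural) / (delta_fossil - delta_natural)
print(round(100 * f_fossil, 1))  # ~4.2%, the order of Segalstad's "maximum 4%"
```

With these assumed inputs the mixing equation yields a fossil fraction of a few percent, which is the kind of result underlying the "maximum 4%" figure quoted above.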

Tom V. Segalstad has conducted university research, publishing, and teaching in geochemistry, mineralogy, petrology, volcanology, structural geology, ore geology, and geophysics at the University of Oslo, Norway, and the Pennsylvania State University, USA.  Some images here are from Tom Segalstad’s presentation Carbon isotope mass balance modelling of atmospheric vs. oceanic CO2

Segalstad was a reviewer for IPCC assessment reports in the early days before observational facts were set aside in favor of the agenda and climate models tuned to suit the narrative. His whimsical comment on the experience:

Footnote:

For more on CO2 interchange between ocean and air, see Fear Not CO2: The Real Chemistry

For more on atmospheric CO2 processes, see Fearless Physics from Dr. Salby

For more on temperature impacting terrestrial CO2 sources, see Not Worried About CO2

 

Obsessed with Human CO2

A previous post described how alarmists make their case by radically reducing the climate reality down to a false simplicity, as shown in this diagram:
The full discussion of this all too common reductionism is reprinted later on.  This post focuses on the final step at the triangle bottom where all of the increase in atmospheric CO2 is attributed to us humans.

In the news last week were reports of climate scientists surprised that CO2 rose faster in 2015 and 2016 despite flat human emissions.  That should have been a wake-up call regarding their mistaken paradigm of the carbon cycle.  Fortunately there is an excellent resource to correct any such misconceptions.

Recently I was pointed to a great website and the analytical work by Dr. Edwin Berry.  (H/T NZ Climate Science Coalition) Dr. Ed has written a thorough, yet very readable explanation on the issue of human emissions vs. CO2 fluxes from natural sources and sinks.  He has a paper currently in review, Why human CO2 does not change climate, and I am respecting his request not to repost from it until it is published.  The link does allow you to read his convincing analysis and conclusions, supported by basic principles and math.

Dr. Berry has been working on this for some time, and I will provide excerpts from another post showing his train of thought.  The fork in the road of the climate change debate is his most recent essay aiming for a general audience.

Neither nature’s emissions nor human emissions stay in the atmosphere. They merely flow through the atmosphere. The atmosphere is like a lake where a river flows in and lake water flows out over a dam. The lake’s water level will rise or fall until the outflow over the dam equals the inflow from the river.

If the inflow increases, the level will rise until the outflow equals the inflow and the level becomes constant. Conversely, if inflow decreases, the level will decrease until, once again, outflow equals inflow. The faster the inflow, the higher the level to balance the inflow. Fig. 1 illustrates the simple physics model for both the lake and the atmosphere.

Fig. 1. The model shows that the rate of change of the level equals the difference between inflow and outflow. It applies to both the lake and the atmosphere.

The ratio of natural to human carbon dioxide in the atmosphere is the ratio of their inflows. Nature produces more than 95 percent of the carbon dioxide in our atmosphere and human emissions produce less than 5 percent.

In terms of the often-quoted ppm (parts per million), these percentages mean that human emissions cause an 18-ppm rise, and nature’s emissions a 392-ppm rise, in atmospheric carbon dioxide. Together the inflows account for today’s carbon dioxide level of 410 ppm.

The IPCC reports are clear. While the IPCC correctly assumes nature’s emissions of about 100 ppm per year balance outflow with inflow, the IPCC incorrectly assumes human emissions do not balance. The IPCC assumes 1.5 ppm per year of human emissions gets stuck in the atmosphere and stays there. That 1.5 ppm is coincidentally just enough to support their claim that human emissions have caused all the increase in atmospheric carbon dioxide since 1750.

The Paris Climate Agreement proposed to reduce worldwide human emissions by 28 percent. Twenty-eight percent of 18 ppm is 5 ppm. The Paris Agreement would have reduced atmospheric carbon dioxide by only 5 ppm, which is insignificant. Even 18 ppm is insignificant. The alarmists have no case.
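The ppm bookkeeping above is simple enough to verify directly. A minimal check, using only the figures quoted in the text (410 ppm total, 18 ppm human, a 28 percent Paris reduction):

```python
# Arithmetic check of the figures quoted above, under Dr. Berry's
# assumption that equilibrium levels are proportional to inflows.

total_ppm = 410.0                     # today's CO2 level, as quoted
human_ppm = 18.0                      # rise attributed to human emissions
natural_ppm = total_ppm - human_ppm   # 392 ppm from natural sources

human_share = human_ppm / total_ppm   # ~0.044, i.e. "less than 5 percent"

# Paris Agreement: a 28 percent cut applied to the human contribution only.
paris_cut_ppm = 0.28 * human_ppm      # ~5 ppm
```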

Thank you, Dr. Berry, for taking IPCC data and showing the correct analysis and conclusions to draw. A more technical description of his paradigm is A Model for Atmospheric Carbon Dioxide: Abstract

Note the size of the human emissions next to the red arrow. (Units are in Gt)


Background:  Climate Reductionism


Reductionists are those who take one theory or phenomenon to be reducible to some other theory or phenomenon. For example, a reductionist regarding mathematics might take any given mathematical theory to be reducible to logic or set theory. Or, a reductionist about biological entities like cells might take such entities to be reducible to collections of physico-chemical entities like atoms and molecules.
Definition from The Internet Encyclopedia of Philosophy

Some of you may have seen this recent article: Divided Colorado: A Sister And Brother Disagree On Climate Change

The reporter describes a story familiar to many of us. A single skeptic (the brother) holds out against his sister and the rest of the family, who accept global warming/climate change. And of course, after quoting some of their exchanges, the reporter sides against the brother by taking the word of a climate expert. From the article:

“CO2 absorbs infrared heat in certain wavelengths and those measurements were made first time — published — when Abraham Lincoln was president of the United States,” says Scott Denning, a professor of atmospheric science at Colorado State University. “Since that time, those measurements have been repeated by better and better instruments around the world.”

CO2, or carbon dioxide, has increased over time, scientists say, because of human activity. It’s a greenhouse gas that’s contributing to global warming.

“We know precisely how the molecule wiggles and waggles, and what the quantum interactions between the electrons are that cause every one of these little absorption lines,” he says. “And there’s just no wiggle room around it — CO2 absorbs heat, heat warms things up, so adding CO2 to the atmosphere will warm the climate.”

Denning says that most of the CO2 we see added to the atmosphere comes from humans — mostly through burning coal, oil and gas, which, as he puts it, is “indirectly caused by us.”

When looking at the scientific community, Denning says it’s united, as far as he knows.

A Case Study of Climate Reductionism

Denning’s comments, supported by several presentations at his website, demonstrate how some scientists (all those known to Denning) engage in a classic form of reductionism.

The full complexity of earth’s climate includes many processes, some poorly understood, but known to have effects orders of magnitude greater than the potential of CO2 warming. The case for global warming alarm rests on simplifying away everything but the predetermined notion that humans are warming the planet. It goes like this:

Our Complex Climate

Earth’s climate is probably the most complicated natural phenomenon ever studied. Not only are there many processes, but they also interact and influence each other over various timescales, causing lagged effects and multiple cycling. This diagram illustrates some of the climate elements and interactions between them.

Flows and Feedbacks for Climate Models

The Many Climate Dimensions

Further, measuring changes in the climate goes far beyond temperature as a metric. Global climate indices, like the European dataset, include 12 climate dimensions with 74 tracking measures. The climate dimensions include:

  • Sunshine
  • Pressure
  • Humidity
  • Cloudiness
  • Wind
  • Rain
  • Snow
  • Drought
  • Temperature
  • Heat
  • Cold

In addition, there are compound measures combining temperature and precipitation. While temperature is important, climate is much more than that. With this reduction, all the other dimensions are swept aside, and climate change is simplified down to global warming as seen in temperature measurements.

Climate Thermodynamics: Weather is the Climate System at work.

Another distortion is the notion that weather is bad or good, depending on whether humans find it favorable. In fact, all that we call weather is the ocean and atmosphere acting to resolve differences in temperature, humidity and pressure. It is the natural result of a rotating, irregular planetary surface mostly covered with water and illuminated mostly at its equator.

The sun warms the surface, but the heat escapes very quickly by convection so the build-up of heat near the surface is limited. In an incompressible atmosphere, it would *all* escape, and you’d get no surface warming. But because air is compressible, and because gases warm up when they’re compressed and cool down when allowed to expand, air circulating vertically by convection will warm and cool at a certain rate due to the changing atmospheric pressure.
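The rate at which convecting air cools as it expands is the dry adiabatic lapse rate, g/cp. A quick sketch using standard textbook constants (these values are my additions, not figures from the post):

```python
# Dry adiabatic lapse rate: a rising parcel cools at g / cp
# as it expands into lower pressure (no condensation assumed).

g = 9.81      # gravitational acceleration, m/s^2
cp = 1005.0   # specific heat of dry air at constant pressure, J/(kg K)

lapse_rate = g / cp   # ~0.0098 K per metre, i.e. roughly 9.8 K per km

def parcel_temp_c(surface_temp_c, altitude_m):
    """Temperature of a dry parcel lifted adiabatically from the surface."""
    return surface_temp_c - lapse_rate * altitude_m

t_5km = parcel_temp_c(15.0, 5000.0)   # roughly -34 C at 5 km
```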

Climate science has been obsessed with only a part of the system, namely the atmosphere and radiation, in order to focus attention on the non-condensing IR active gases. The climate is framed as a 3D atmosphere above a 2D surface. That narrow scope leaves out the powerful non-radiative heat transfer mechanisms that dominate the lower troposphere, and the vast reservoir of thermal energy deep in the oceans.

As Dr. Robert E Stevenson writes, it could have been different:

“As an oceanographer, I’d been around the world, once or twice, and I was rather convinced that I knew the factors that influenced the Earth’s climate. The oceans, by virtue of their enormous density and heat-storage capacity, are the dominant influence on our climate. It is the heat budget and the energy that flows into and out of the oceans that basically determines the mean temperature of the global atmosphere. These interactions, plus evaporation, are quite capable of canceling the slight effect of man-produced CO2.”

The troposphere is dominated by powerful heat transfer mechanisms: conduction, convection and evaporation, as well as physical kinetic movements.  All this is ignored in order to focus on radiative heat transfer, a bit player except at the top of the atmosphere.

There’s More than the Atmosphere

Once the world of climate is greatly reduced to radiation at infrared frequencies, yet another set of blinders is applied. The most important source of radiation is of course the sun. Solar radiation in the short-wave (SW) range is what we see and what heats up the earth’s surface, particularly the oceans. In addition, solar radiation includes infrared, some absorbed in the atmosphere and some at the surface. The ocean is also a major source of heat into the atmosphere, since its thermal capacity is roughly 1000 times that of the air. The heat transfer from ocean to air occurs both by evaporation (latent heat) and by direct contact at the sea surface (conduction).
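The "1000 times" figure for ocean versus atmosphere heat capacity can be sanity-checked with standard reference values for the masses and specific heats (my assumed inputs, not numbers from the post):

```python
# Rough check: total heat capacity of the ocean vs the atmosphere.
# Masses and specific heats are standard reference values (assumed here).

OCEAN_MASS_KG = 1.4e21   # total mass of the oceans
ATMOS_MASS_KG = 5.1e18   # total mass of the atmosphere
CP_SEAWATER = 3990.0     # J/(kg K), approximate
CP_AIR = 1005.0          # J/(kg K), dry air at constant pressure

capacity_ratio = (OCEAN_MASS_KG * CP_SEAWATER) / (ATMOS_MASS_KG * CP_AIR)
# comes out on the order of 1000, consistent with the claim above
```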

Yet conventional climate science dismisses the sun as a climate factor, saying that its climate input is unvarying. That ignores significant fluctuations in parts of the light range, for example ultraviolet, and also solar effects such as magnetic fields and cosmic rays. Also disregarded is solar energy varying due to cloud fluctuations. The ocean is likewise dismissed as a source of climate change, despite obvious ocean warming and cooling cycles ranging from weeks to centuries. The problem is that such oscillations are not well understood or predictable, and so cannot be easily modeled.

With the sun and the earth’s surface and ocean dismissed, the only consideration left is the atmosphere.

The Gorilla Greenhouse Gas

Thus climate has been reduced down to heat radiation passing through an atmosphere comprised of gases. One of the biggest reductions then comes from focusing on CO2 rather than H2O. Of all the gases that are IR-active, water is the most prevalent and covers more of the spectrum.

The diagram below gives you the sense of proportion.

The Role of CO2

We come now to the role of CO2 in “trapping heat” and making the world warmer. The theory is that CO2 acts like a blanket by absorbing and re-radiating heat that would otherwise escape into space. By delaying the cooling while solar energy comes in constantly, CO2 is presumed to cause a buildup of heat resulting in warmer temperatures.

How the Atmosphere Processes Heat

There are three ways that heat (infrared or IR radiation) passes from the surface to space.

1) A small amount of the radiation leaves directly, because all gases in our air are transparent to IR of 10-14 microns (sometimes called the “atmospheric window”). This pathway moves at the speed of light, so no delay of cooling occurs.

2) Some radiation is absorbed and re-emitted by IR-active gases up to the tropopause. Calculations of the mean free path for CO2 show that energy passes from surface to tropopause in less than 5 milliseconds. This is almost the speed of light, so the delay is negligible. H2O is so variable across the globe that its total effects are not measurable. In arid places, like deserts, we see that CO2 by itself does not prevent the loss of the day’s heat after sundown.

3) The bulk gases of the atmosphere, O2 and N2, are warmed by conduction and convection from the surface. They also gain energy by collisions with IR-active gases, some of that IR coming from the surface and some absorbed directly from the sun. Latent heat from water is also added to the bulk gases. O2 and N2 are slow to shed this heat, and indeed must pass it back to IR-active gases at the top of the troposphere for radiation into space.

In a parcel of air, each molecule of CO2 is surrounded by 2500 other molecules, mostly O2 and N2. In the lower atmosphere, the air is dense, and CO2 molecules energized by IR lose that energy to surrounding gases, slightly warming the entire parcel. Higher in the atmosphere, the air is thinner, and CO2 molecules can emit IR into space. Surrounding gases resupply CO2 with the energy it lost, which leads to further heat loss into space.

This third pathway has a significant delay of cooling, and is the reason for our mild surface temperatures, averaging about 15C. Yes, earth’s atmosphere produces a buildup of heat at the surface. The bulk gases, O2 and N2, trap heat near the surface, while IR-active gases, mainly H2O and CO2, provide the radiative cooling at the top of the atmosphere. Near the top of the atmosphere you will find the -18C temperature.
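The -18C figure is the planet's effective radiating temperature, which follows from the Stefan-Boltzmann law. A standard back-of-envelope version (the solar constant and albedo are textbook values I am assuming, not numbers from the post):

```python
# Effective radiating temperature of Earth via the Stefan-Boltzmann law.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S0 = 1361.0        # solar constant, W/m^2 (assumed textbook value)
ALBEDO = 0.30      # fraction of sunlight reflected (assumed)

absorbed = S0 * (1 - ALBEDO) / 4.0     # averaged over the whole sphere
t_eff_k = (absorbed / SIGMA) ** 0.25   # ~255 K
t_eff_c = t_eff_k - 273.15             # ~ -18 C, the figure quoted above

surface_lift = 15.0 - t_eff_c          # ~33 K warmer at the surface
```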

Sources of CO2

Note the size of the human emissions next to the red arrow.

A final reduction comes down to how much of the CO2 in the atmosphere is there because of us. Alarmists/activists say any increase in CO2 is 100% man-made, and would be even greater were it not for natural CO2 sinks, namely the ocean and biosphere. The claim overlooks the fact that those sinks are also sources of CO2, and the flux from land and sea is an order of magnitude higher than estimates of human emissions. In fact, our few gigatons of carbon are lost within the error range of estimates of natural emissions. Insects alone produce far more CO2 than all human activity, including our domestic animals.

Why Climate Reductionism is Dangerous

Reducing the climate in this fashion reaches its logical conclusion in the activist notion of the “450 Scenario.” Since Cancun, the IPCC has asserted that global warming can be capped at 2C by keeping CO2 concentration below 450 ppm. From the AR5 Summary for Policymakers (SPM):

Emissions scenarios leading to CO2-equivalent concentrations in 2100 of about 450 ppm or lower are likely to maintain warming below 2°C over the 21st century relative to pre-industrial levels. These scenarios are characterized by 40 to 70% global anthropogenic GHG emissions reductions by 2050 compared to 2010, and emissions levels near zero or below in 2100.

Thus is born the “450 Scenario” by which governments can be focused upon reducing human emissions without any reference to temperature measurements, which are troublesome and inconvenient. Almost everything in the climate world has been erased, and “Fighting Climate Change” is now code to mean accounting for fossil fuel emissions.

Conclusion

All propagandists begin with a kernel of truth, in this case the fact that everything acting in the world has an effect on everything else. Edward Lorenz brought this insight to bear on the climate system in a groundbreaking paper he presented in 1972, entitled “Predictability: Does the Flap of a Butterfly’s Wings in Brazil Set Off a Tornado in Texas?” Everything does matter and has an effect. Obviously humans impact the climate in places where we build cities and dams, clear forests and operate farms. And obviously we add some CO2 when we burn fossil fuels.

But it is wrong to ignore the dominant climate realities in order to exaggerate a small peripheral factor for the sake of an agenda. It is wrong to claim that IR-active gases somehow “trap” heat in the air when they immediately emit any energy absorbed, if it is not already lost in collisions with other molecules. No, it is the bulk gases, N2 and O2, making up the mass of the atmosphere, together with the ocean, that delay the cooling and give us the mild and remarkably stable temperatures we enjoy. And CO2 does its job by radiating the heat into space.

Since we do little to cause it, we can’t fix it by changing what we do. The climate will not stop changing because we put a price on carbon. And the sun will rise despite the rooster going on strike to protest global warming.

Footnote: For a deeper understanding of the atmospheric physics relating to CO2 and climate, I have done a guide and synopsis of Murry Salby’s latest textbook on the subject:  Fearless Physics from Dr. Salby

Climate Scientist Sues Over Hurt Feelings

Article By Alex Berezow — November 2, 2017 at American Council on Science and Health (ACSH) entitled:
Climate Scientist Mark Jacobson Sues Journal For $10M Over Hurt Feelings. Excerpts below, with my bolds.

ACSH has been around since 1978. We have never seen anything like this.

Climate scientist Mark Z. Jacobson of Stanford University has sued the National Academy of Sciences, which publishes the prestigious journal PNAS, for publishing an article that disagreed with him. The lawsuit claims that Dr. Jacobson was libeled and slandered. He is suing to get the journal to retract the article. For his hurt feelings and bruised ego, he also wants a big bag of money — $10 million, to be precise.

Let’s set aside the scientific arguments in this debate, which revolve around the feasibility of 100% renewable energy. Smart people can disagree about whether that is a technologically and economically achievable goal. The way smart (and mature) people handle their disagreements is in the pages of a peer-reviewed scientific journal. But, apparently, that’s no longer how things operate in our litigious society.

Dr. Jacobson published a paper in PNAS that other scientists found faulty. So, they published a rebuttal, which concluded that Dr. Jacobson’s analysis “involves errors, inappropriate methods, and implausible assumptions.” While this is considered rather harsh language for the scientific literature, critiquing the work of others occurs as a matter of routine. Indeed, questioning another scientist’s conclusions is a healthy and integral part of the pursuit of knowledge.

The ACSH article goes into the details and statements suggesting that Jacobson’s hurt feelings are driving his actions. But I want to put this dispute in a larger context. For this is a powerful example of the misuse of scientific models that goes on flagrantly in climate science, but also in other fields. The only difference here is Jacobson’s extreme measure of going to the courts to defend his model. For background, consider the notion of Chameleon Models, a term invented by Paul Pfleiderer (also of Stanford), and see how it applies to this conflict.


Paul Pfleiderer has done a public service in calling attention to
The Misuse of Theoretical Models in Finance and Economics (here)
h/t to William Briggs for noticing and linking

He coins the term “Chameleon” for the abuse of models, and explains in the abstract of his article:

In this essay I discuss how theoretical models in finance and economics are used in ways that make them “chameleons” and how chameleons devalue the intellectual currency and muddy policy debates. A model becomes a chameleon when it is built on assumptions with dubious connections to the real world but nevertheless has conclusions that are uncritically (or not critically enough) applied to understanding our economy. I discuss how chameleons are created and nurtured by the mistaken notion that one should not judge a model by its assumptions, by the unfounded argument that models should have equal standing until definitive empirical tests are conducted, and by misplaced appeals to “as-if” arguments, mathematical elegance, subtlety, references to assumptions that are “standard in the literature,” and the need for tractability.

Chameleon Climate Models

Pfleiderer is writing about his specialty, financial models, and even more particularly banking systems, and gives several examples of how dysfunctional the situation is. As we shall see below, climate models are an order of magnitude more complicated, and are abused in the same way, only more flagrantly.

As the analogy suggests, a chameleon model changes color when it is moved to a different context. When politicians and activists refer to climate models, they present the model outputs as “Predictions.” The media are rife with examples, but here is one from Climate Concern UK:

Some predicted Future Effects of Climate Change

  • Increased average temperatures: the IPCC (International Panel for Climate Change) predict a global rise of between 1.1ºC and 6.4ºC by 2100 depending on some scientific uncertainties and the extent to which the world decreases or increases greenhouse gas emissions.
  • 50% less rainfall in the tropics. Severe water shortages within 25 years – potentially affecting 5 billion people. Widespread crop failures.
  • 50% more river volume by 2100 in northern countries.
  • Desertification and burning down of vast areas of agricultural land and forests.
  • Continuing spread of malaria and other diseases, including from a much increased insect population in UK. Respiratory illnesses due to poor air quality with higher temperatures.
  • Extinction of large numbers of animal and plant species.
  • Sea level rise: due to both warmer water (greater volume) and melting ice. The IPCC predicts between 28cm and 43cm by 2100, with consequent high storm wave heights, threatening to displace up to 200 million people. At worst, if emissions this century were to set in place future melting of both the Greenland and West Antarctic ice caps, sea level would eventually rise approx 12m.

Now, that alarming list of predictions claims to forecast the future of the actual world as we know it.

Now for the switcheroo. When climate models are referenced by scientists or agencies likely to be held legally accountable for making claims, the model output is transformed into “Projections.” The difference is more than semantics:
Prediction: What will actually happen in the future.
Projection: What will possibly happen in the future.

In other words, the climate model has gone from the bookshelf world (possibilities) to the world of actualities and of policy decision-making. The step of applying reality filters to the climate models (verification) is skipped in order to score political and public-relations points.

The ultimate proof of this is the existence of legal disclaimers exempting the modelers from accountability. One example is from ClimateData.US

Disclaimer NASA NEX-DCP30 Terms of Use

The maps are based on NASA’s NEX-DCP30 dataset that are provided to assist the science community in conducting studies of climate change impacts at local to regional scales, and to enhance public understanding of possible future climate patterns and climate impacts at the scale of individual neighborhoods and communities. The maps presented here are visual representations only and are not to be used for decision-making. The NEX-DCP30 dataset upon which these maps are derived is intended for use in scientific research only, and use of this dataset or visualizations for other purposes, such as commercial applications, and engineering or design studies is not recommended without consultation with a qualified expert. (my bold)

Conclusion:

Whereas some theoretical models can be immensely useful in developing intuitions, in essence a theoretical model is nothing more than an argument that a set of conclusions follows from a given set of assumptions. Being logically correct may earn a place for a theoretical model on the bookshelf, but when a theoretical model is taken off the shelf and applied to the real world, it is important to question whether the model’s assumptions are in accord with what we know about the world. Is the story behind the model one that captures what is important or is it a fiction that has little connection to what we see in practice? Have important factors been omitted? Are economic agents assumed to be doing things that we have serious doubts they are able to do? These questions and others like them allow us to filter out models that are ill suited to give us genuine insights. To be taken seriously models should pass through the real world filter.

Chameleons are models that are offered up as saying something significant about the real world even though they do not pass through the filter. When the assumptions of a chameleon are challenged, various defenses are made (e.g., one shouldn’t judge a model by its assumptions, any model has equal standing with all other models until the proper empirical tests have been run, etc.). In many cases the chameleon will change colors as necessary, taking on the colors of a bookshelf model when challenged, but reverting back to the colors of a model that claims to apply the real world when not challenged.

A model becomes a chameleon when it is built on assumptions with dubious connections to the real world but nevertheless has conclusions that are uncritically (or not critically enough) applied to understanding our economy. Chameleons are not just mischievous they can be harmful − especially when used to inform policy and other decision making − and they devalue the intellectual currency.

Thank you Dr. Pfleiderer for showing us how the sleight-of-hand occurs in economic considerations. The same abuse prevails in the world of climate science.


Paul Pfleiderer, Stanford University Faculty
C.O.G. Miller Distinguished Professor of Finance
Senior Associate Dean for Academic Affairs
Professor of Law (by courtesy), School of Law

Postscript:  Now we have a scientist whose model has been reality tested and found wanting by others. His response is filing a lawsuit to make the criticism go away, and to levy a penalty so heavy that no model would ever again be challenged. Onward into the post-modern abyss.

Footnote:

There is a series of posts here which apply reality filters to test climate models. The first was Temperatures According to Climate Models, where both hindcasting and forecasting were seen to be flawed.

Others in the Series are:

Sea Level Rise: Just the Facts

Data vs. Models #1: Arctic Warming

Data vs. Models #2: Droughts and Floods

Data vs. Models #3: Disasters

Data vs. Models #4: Climates Changing

Climate Medicine Bonn Update

Climate Quackery

With Bonn COP23 set to start next week, the media is awash with claims that climate change is an international public health crisis.  For example, in just one day from Google news:

Climate change isn’t just hurting the planet – it’s a public health emergency–The Guardian

Climate change’s impact on human health is already here — and is ‘potentially irreversible,’ report says –USA TODAY

Climate Change Is Bad for Your Health–New York Times

From heat stress to malnutr­ition, climate change is already making us sick–The Verge

As Richard Lindzen predicted, everyone wants on the climate bandwagon, because that is where the money is. Medical scientists are pushing for their share of the pie, as evidenced by the Met Office gathering on Assessing the Global Impacts of Climate and Extreme Weather on Health and Well-Being (following the Paris COP). Not coincidentally, the 2nd Global Conference on Health and Climate was held July 7-8, 2016 in Paris. Now we have the American Public Health Association declaring:

2017 is the Year of Climate Change and Health

“We’re committed to making sure the nation knows about the effects of climate change on health. If anyone doesn’t think this is a severe problem, they are fooling themselves.” — APHA Executive Director Georges Benjamin, in The Washington Post

The new field of Climate Medicine is evidenced by a slew of new organizations and studies.  In addition to numerous agencies set up within WHO and the UN, and governmental entities (such as the Met Office), there are many NGOs, such as:

Health Care Without Harm
Health and Environment Alliance
Health and Climate Foundation
Climate and Health Council
United States National Association of County and City Health Officials
Care International
Global Gender and Climate Alliance / Women’s Environment and Development Organization
International Federation of Medical Students’ Associations
Climate Change and Human Health Programme, Columbia U.
Center for Health and the Global Environment, Harvard
National Center for Epidemiology and Population Health, ANU Canberra
Centre for Sustainability and the Global Environment, U of Wisconsin
Environmental Change Institute, Oxford
London School of Hygiene & Tropical Medicine, London, UK
International Human Dimensions Programme on Global Environmental Change, US National Academies of Science
US Climate and Health Alliance
Etc, etc., etc.

Of course, they are encouraged and abetted by the IPCC.

From the Fifth Assessment Report:

Until mid-century, projected climate change will impact human health mainly by exacerbating health problems that already exist (very high confidence). Throughout the 21st century, climate change is expected to lead to increases in ill-health in many regions and especially in developing countries with low income, as compared to a baseline without climate change (high confidence). By 2100 for RCP8.5, the combination of high temperature and humidity in some areas for parts of the year is expected to compromise common human activities, including growing food and working outdoors (high confidence). {2.3.2}

In urban areas climate change is projected to increase risks for people, assets, economies and ecosystems, including risks from heat stress, storms and extreme precipitation, inland and coastal flooding, landslides, air pollution, drought, water scarcity, sea level rise and storm surges (very high confidence). These risks are amplified for those lacking essential infrastructure and services or living in exposed areas. {2.3.2}

Feared Climate Health Impacts Are Unsupported by Scientific Research

NIPCC has a compendium of peer-reviewed studies on this issue and provides these findings (here)

Key Findings: Human Health
• Warmer temperatures lead to a decrease in temperature-related mortality, including deaths associated with cardiovascular disease, respiratory disease, and strokes. The evidence of this benefit comes from research conducted in every major country of the world.

• In the United States the average person who died because of cold temperature exposure lost in excess of 10 years of potential life, whereas the average person who died because of hot temperature exposure likely lost no more than a few days or weeks of life.

• In the U.S., some 4,600 deaths are delayed each year as people move from cold northeastern states to warm southwestern states. Between 3 and 7% of the gains in longevity experienced over the past three decades was due simply to people moving to warmer states.

• Cold-related deaths are far more numerous than heat-related deaths in the United States, Europe, and almost all countries outside the tropics. Coronary and cerebral thrombosis account for about half of all cold-related mortality.

• Global warming is reducing the incidence of cardiovascular diseases related to low temperatures and wintry weather by a much greater degree than it increases the incidence of cardiovascular diseases associated with high temperatures and summer heat waves.

• A large body of scientific examination and research contradicts the claim that malaria will expand across the globe and intensify as a result of CO2-induced warming.

• Concerns over large increases in vector-borne diseases such as dengue as a result of rising temperatures are unfounded and unsupported by the scientific literature, as climatic indices are poor predictors for dengue disease.

• While temperature and climate largely determine the geographical distribution of ticks, they are not among the significant factors determining the incidence of tick-borne diseases.

• The ongoing rise in the air’s CO2 content is not only raising the productivity of Earth’s common food plants but also significantly increasing the quantity and potency of the many health-promoting substances found in their tissues, which are the ultimate sources of sustenance for essentially all animals and humans.

• Atmospheric CO2 enrichment positively impacts the production of numerous health-promoting substances found in medicinal or “health food” plants, and this phenomenon may have contributed to the increase in human life span that has occurred over the past century or so.

• There is little reason to expect any significant CO2-induced increases in human-health-harming substances produced by plants as atmospheric CO2 levels continue to rise.

Source: Chapter 7. “Human Health,” Climate Change Reconsidered II: Biological Impacts (Chicago, IL: The Heartland Institute, 2014).
Full text of Chapter 7 and references on Human health begins pg. 955 of the full report here

Summary

Advances in medical science and public health have benefited billions of people with longer and higher-quality lives. Yet this crucial social asset has joined the list of fields corrupted by the dash for climate cash. Increasingly, medical talent and resources are diverted into inventing bogeymen and studying imaginary public health crises.

Economists Francesco Bosello, Roberto Roson and Richard Tol conducted an exhaustive study called Economy-wide estimates of the implications of climate change: Human health.

After reviewing all the research and crunching the numbers, they concluded that achieving one degree of global warming by 2050 will, on balance, save more than 800,000 lives annually.

Not only is the warming not happening, but we would be healthier if it did.

Oh, Dr. Frankenmann, what have you wrought?

Footnote:  More proof against Climate Medicine

From: Gasparrini et al: Mortality risk attributable to high and low ambient temperature: a multicountry observational study. The Lancet, May 2015

Cold weather kills 20 times as many people as hot weather, according to an international study analyzing over 74 million deaths in 384 locations across 13 countries. The findings, published in The Lancet, also reveal that deaths due to moderately hot or cold weather substantially exceed those resulting from extreme heat waves or cold spells.

“It’s often assumed that extreme weather causes the majority of deaths, with most previous research focusing on the effects of extreme heat waves,” says lead author Dr Antonio Gasparrini from the London School of Hygiene & Tropical Medicine in the UK. “Our findings, from an analysis of the largest dataset of temperature-related deaths ever collected, show that the majority of these deaths actually happen on moderately hot and cold days, with most deaths caused by moderately cold temperatures.”

Now in 2017, Lancet sets the facts aside in order to prostrate itself before the global warming altar:

Christiana Figueres, chair of the Lancet Countdown’s high-level advisory board and former executive secretary of the UN Framework Convention on Climate Change, said, “The report lays bare the impact that climate change is having on our health today. It also shows that tackling climate change directly, unequivocally and immediately improves global health. It’s as simple as that.’’

Bonn COP23 Briefing for Realists

STEPHANE KIEHL FOR “LE MONDE”

French Mathematicians spoke out prior to COP21 in Paris, and their words provide a rational briefing for COP23 beginning in Bonn next month.  In a nutshell:

Fighting Global Warming is Absurd, Costly and Pointless.

  • Absurd because of no reliable evidence that anything unusual is happening in our climate.
  • Costly because trillions of dollars are wasted on immature, inefficient technologies that serve only to make cheap, reliable energy expensive and intermittent.
  • Pointless because we do not control the weather anyway.

The prestigious Société de Calcul Mathématique (Society for Mathematical Calculation) issued a detailed 195-page White Paper that presents a blistering point-by-point critique of the key dogmas of global warming. The synopsis is blunt and extremely well documented.  Here are extracts from the opening statements of the first three chapters of the SCM White Paper with my bolds and images.

Sisyphus at work.

Chapter 1: The crusade is absurd
There is not a single fact, figure or observation that leads us to conclude that the world’s climate is in any way ‘disturbed.’ It is variable, as it has always been, but rather less so now than during certain periods or geological eras. Modern methods are far from being able to accurately measure the planet’s global temperature even today, so measurements made 50 or 100 years ago are even less reliable. Concentrations of CO2 vary, as they always have done; the figures that are being released are biased and dishonest. Rising sea levels are a normal phenomenon linked to upthrust buoyancy; they are nothing to do with so-called global warming. As for extreme weather events — they are no more frequent now than they have been in the past. We ourselves have processed the raw data on hurricanes….

Chapter 2: The crusade is costly
Direct aid for industries that are completely unviable (such as photovoltaics and wind turbines) but presented as ‘virtuous’ runs into billions of euros, according to recent reports published by the Cour des Comptes (French Audit Office) in 2013. But the highest cost lies in the principle of ‘energy saving,’ which is presented as especially virtuous. Since no civilization can develop when it is saving energy, ours has stopped developing: France now has more than three million people unemployed — it is the price we have to pay for our virtue….

Chapter 3: The crusade is pointless
Human beings cannot, in any event, change the climate. If we in France were to stop all industrial activity (let’s not talk about our intellectual activity, which ceased long ago), if we were to eradicate all trace of animal life, the composition of the atmosphere would not alter in any measurable, perceptible way. To explain this, let us make a comparison with the rotation of the planet: it is slowing down. To address that, we might be tempted to ask the entire population of China to run in an easterly direction. But, no matter how big China and its population are, this would have no measurable impact on the Earth’s rotation.

Full text in pdf format is available in English at link below:

The battle against global warming: an absurd, costly and pointless crusade
White Paper drawn up by the Société de Calcul Mathématique SA
(Mathematical Modelling Company, Corp.)

A second report was published in 2016, entitled Global Warming and Employment, which analyzes in depth the economic destruction from ill-advised climate change policies.

The two principal themes are that jobs are disappearing and that the destructive forces are embedded in our societies.

Jobs are Disappearing discusses issues such as:

  • The State is incapable of devising and implementing an industrial policy.
  • The fundamental absurdity of the concept of sustainable development.
  • Biofuels: an especially absurd policy leading to ridiculous taxes and job losses.
  • EU policy to reduce greenhouse gas emissions by 40% drives jobs elsewhere while being pointless: the planet has never asked for it, is completely unaware of it, and will never notice it!
  • The War against the Car and Road Maintenance undercuts economic mobility while destroying transportation sector jobs.
  • Solar and wind energy are weak, diffuse, and inconsistent, inadequate to power modern civilization.
  • Food production activities are attacked as being “bad for the planet.”
  • So-called Green jobs are entirely financed by subsidies.

The Brutalizing Whip discusses the damages to public finances and to social wealth and well-being, including these topics:

  • Taxes have never been so high.
  • The Government is borrowing more and more.
  • Dilapidated infrastructure.
  • Relocations and losses instead of job creation.
  • The wastefulness associated with the new forms of energy.
  • Return to the economy of an underdeveloped country.

What is our predicament?
Four Horsemen are bringing down our societies:

  • The Ministry of Ecology (climate and environment);
  • Journalists;
  • Scientists;
  • Corporation Environmentalist Departments.

Steps required to recover from this demise:

  • Go back to the basic rules of research.
  • Go back to the basic rules of law.
  • Do not trust international organizations.
  • Leave the planet alone.
  • Beware of any premature optimism.

Conclusion

Climate lemmings

The real question is this: how have policymakers managed to make such absurd decisions, to blinker themselves to such a degree, when so many means of scientific investigation are available? The answer is simple: as soon as something is seen as being green, as being good for the planet, all discussion comes to an end and any scientific analysis becomes pointless or counterproductive. The policymakers will not listen to anyone or anything; they take all sorts of hasty, contradictory, damaging and absurd decisions. When will they finally be held to account?

Footnote:

The above cartoon image of climate talks includes water rising over politicians’ feet.  But actual observations made in Fiji (presiding over these talks in Bonn) show sea levels are stable (link below).

Fear Not For Fiji

Snowing and Freezing in the Arctic

The image from IMS shows snow and ice on day 296 (yesterday) for the years 2007 to 2017, focusing on Eurasia but also showing Canada and Alaska. You can see that low Arctic ice years, like 2007, 2012 and 2016, have a smaller snow extent on both sides of the Arctic. Conversely, higher Arctic ice years like 2013, 2014 and 2015 show snow spreading into northern Europe as well as Alaska. The pattern appears as gaining snow and ice from 2008 to 2010, losing in 2011 and 2012, then regaining from 2013 to 2015, before retreating in 2016. So far 2017 is looking more like 2013 to 2015.

From the post Natural Climate Factors: Snow

Previously I posted an explanation by Dr. Judah Cohen regarding a correlation between autumn Siberian snow cover and the following winter conditions, not only in the Arctic but extending across the Northern Hemisphere. More recently, in looking into Climate Model Upgraded: INMCM5, I noticed some of the scientists were also involved in confirming the importance of snow cover for climate forecasting. Since the poles function as the primary vents for global cooling, what happens in the Arctic in no way stays in the Arctic. This post explores data suggesting changes in snow cover drive some climate changes.

The Snow Cover Climate Factor

The diagram represents how Dr. Judah Cohen pictures the Northern Hemisphere wintertime climate system. He leads research on Arctic and NH weather patterns at AER.


Dr. Cohen explains the mechanism in this diagram.

Conceptual model for how fall snow cover modifies winter circulation in both the stratosphere and the troposphere. The case for low snow cover is on the left; the case for extensive snow cover on the right.

1. Snow cover increases rapidly in the fall across Siberia. When snow cover is above normal, diabatic cooling helps to
2. Strengthen the Siberian high and lead to below-normal temperatures.
3. Snow-forced diabatic cooling in proximity to the high topography of Asia increases the upward flux of energy in the troposphere, which is absorbed in the stratosphere.
4. Strong convergence of WAF (Wave Activity Flux) indicates higher geopotential heights.
5. A weakened polar vortex and warmer temperatures propagate down from the stratosphere into the troposphere, all the way to the surface.
6. The dynamic pathway culminates with a strong negative phase of the Arctic Oscillation at the surface.

From Eurasian Snow Cover Variability and Links with Stratosphere-Troposphere
Coupling and Their Potential Use in Seasonal to Decadal Climate Predictions by Judah Cohen.

Variations in Siberian snow cover October (day 304) 2004 to 2016. Eurasian snow charts from IMS.

Observations of the Snow Climate Factor

For several decades the IMS snow cover images have been digitized to produce a numerical database for NH snow cover, including area extents for Eurasia. The NOAA climate data record of Northern Hemisphere snow cover extent, Version 1, is archived and distributed by NCDC’s satellite Climate Data Record Program. The CDR is forward processed operationally every month, along with figures and tables made available at Rutgers University Global Snow Lab.

This first graph shows the snow extents of interest in Dr. Cohen’s paradigm. The Autumn snow area in Siberia is represented by the annual Eurasian averages of the months of October and November (ON). The following NH Winter is shown as the average snow area for December, January and February (DJF). Thus the year designates the December of that year plus the first two months of the next year.

Notes: The NH snow cover minimum was in 1981, trending upward since. Siberian autumn snow cover was lowest in 1989, increasing since then. Autumn Eurasian snow cover is about one-third of the winter NH snow area. Note also that fluctuations are sizable and correlated.

The second graph presents annual anomalies for the two series, each calculated as the deviation from the mean of its entire time series. Strikingly, the Eurasian autumn flux is on the same scale as the total NH flux, and closely aligned. While NH snow cover declined for a few years prior to 2016, Eurasian snow is trending strongly upward. If Dr. Cohen is correct, NH snowfall will follow. The linear trend is slightly positive, suggesting that fears of children never seeing snowfall have been exaggerated. The Eurasian trend line (not shown) is almost the same.
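The anomaly calculation described above is simple enough to sketch in a few lines of code. The snow-extent figures below are hypothetical placeholders for illustration only, not values from the Rutgers Snow Lab record:

```python
# Sketch of the anomaly calculation: each value's deviation from the
# mean of its entire time series.

def anomalies(series):
    """Return each value's deviation from the series mean, rounded to 2 dp."""
    mean = sum(series) / len(series)
    return [round(x - mean, 2) for x in series]

# Hypothetical Eurasian Oct-Nov (ON) snow extents, million km^2
eurasia_on = [8.1, 9.0, 8.5, 9.4, 8.8]
# Hypothetical NH Dec-Feb (DJF) snow extents, million km^2
nh_djf = [44.2, 45.9, 44.8, 46.5, 45.1]

print(anomalies(eurasia_on))
print(anomalies(nh_djf))
```

Plotting the two anomaly series on a common axis is what makes the correlation visible, since the raw extents differ by a factor of about three.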

What About Winter 2017-2018?

These data confirm that Dr. Cohen and colleagues are onto something. Here are excerpts from his October 2 outlook for the upcoming season at AER (my bolds).

The main block/high pressure feature influencing Eurasian weather is currently centered over the Barents-Kara Seas and is predicted to first weaken and then strengthen over the next two weeks.

Blocking in the Barents-Kara Seas favors troughing/negative geopotential height anomalies and cool temperatures downstream over Eurasia but especially Central and East Asia. The forecast for the next two weeks across Central Asia is for continuation of overall below normal temperatures and new snowfall.

Currently the largest negative anomalies in sea ice extent are in the Chukchi and Beaufort Seas but that will change over the next month or so during the critical months of November-February. In my opinion low Arctic sea ice favors a more severe winter but not necessarily hemisphere-wide and depends on the regions of the strongest anomalies. Strong negative departures in the Barents-Kara Seas favors cold temperatures in Asia while strong negative departures near Greenland and/or the Beaufort Sea favor cold temperatures in eastern North America.

Siberian snow cover is advancing quickly relative to climatology and is on a pace similar to last year at this time. My, along with my colleagues and others, research has shown that extensive Siberian snow cover in the fall favors a trough across East Asia with a ridge to the west near the Urals. The atmospheric circulation pattern favors more active poleward heat flux, a weaker PV and cold temperatures across the NH. It is very early in the snow season but recent falls have been snowy across Siberia and therefore I do expect another upcoming snowy fall across Siberia.

Summary

In summary the three main predictors that I follow in the fall months most closely, the presence or absence of high latitude blocking, Arctic sea ice extent and Siberian snow cover extent all point towards a more severe winter across the continents of the NH.

Uh oh.  Now where did I put away my long johns?

Fear Not For Fiji

Fiji Map from Turtle Airways Seaplanes. Fiji International Airport is at Nadi.

Published this month is an update on sea levels at Fiji, and thankfully the threat level can be dialed way down.  (H/T Tallbloke)  The Research Article:  Our Oceans-Our Future: New Evidence-based Sea Level Records from the Fiji Islands for the Last 500 years Indicating Rotational Eustasy and Absence of a Present Rise in Sea Level by Nils-Axel Mörner, Paleogeophysics & Geodynamics, Stockholm, Sweden. Excerpts with my bolds.

Abstract:

Previously, no study in the Fiji Islands had been devoted to the sea level changes of the last 500 years. No serious prediction can be made unless we have a good understanding of the sea level changes today and in the past centuries. Therefore, this study fills a gap, and provides real observational facts to assess the question of present sea level changes.

There is a total absence of data supporting the notion of a present sea level rise; on the contrary all available facts indicate present sea level stability. On the centennial timescale, there was a +70 cm high level in the 16th and 17th centuries, a -50 cm low in the 18th century and a stability (with some oscillations) in the 19th, 20th and early 21st centuries. This is almost identical to the sea level change documented in the Maldives, Bangladesh and Goa (India).

This seems to indicate a mutual driving force. However, the recorded sea level changes are anti-correlated with the major changes in climate during the last 600 years. Therefore, glacial eustasy cannot be the driving force. The explanation seems to be rotational eustasy, with speeding-up phases during Grand Solar Minima forcing ocean water masses to the equatorial region, and slowing-down phases during Grand Solar Maxima forcing ocean water masses from the equator towards the poles.

Background

The Intergovernmental Panel on Climate Change [1] has claimed that sea level is rising and that an additional acceleration is soon to be expected as a function of global warming. This proposition only works if the present warming were a function of increased CO2 content in the atmosphere (a hypothesis termed AGW, for Anthropogenic Global Warming). On a longer-term basis it seems quite clear, however, that the dominant factor in global changes of temperature is change in solar variability [2-3]. Regardless of what actually is driving climate change and sea level changes, the proposition of a rapidly rising sea level grew into a mantra in media and politics. This initiated a flood of papers based on models and statistics rather than on actual field observations.

The Fiji government will chair the next international climate conference, COP23, in Bonn in November 2017 [4]. This paper presents a detailed analysis of available field observations on sea level changes in the Fiji Islands over the last 500 years.

Figure 1.

Sea level changes as documented in the Yasawa Islands, Fiji, composed of 3 main segments: a high level (1), a low level (2) and a more or less constant level (3), which might be subdivided in an early high level, a main level just above the present level and a lowering to the present level generating microatoll growth in the last 60 years (based on data from [13]). (Subdivisions shown in Figure 3 below)

Figure 2.

The long-term changes during the last 500 years – i.e. a high, a low and a present level – are recorded in the Maldives [16], in Bangladesh [17-18] and in Goa, India [15,18], as illustrated in Figure 3. A present long-term stability is also recorded in Qatar [19].

Figure 3.

The general agreement between the observed sea level changes in Fiji during the last 500 years and those recorded in the three Indian Ocean sites (the Maldives, Goa and Bangladesh) is striking, which is a very strong (even conclusive) argument that the recorded sea level changes are of regional eustatic origin [20].

All four records show a high in the 17th century (which was a period of Little Ice Age conditions), a low in the 18th century (which was a period nearly as warm as today) and a high in the early 19th century (which was the last period of Little Ice Age conditions). This means that the Figure 3 sea level changes are almost directly opposite to the general changes in global climate. Consequently, the eustatic changes recorded cannot refer to glacial eustasy, but must be understood in terms of rotational eustasy.

Figure 4

This calls for some explanation. The idea that oceanic water masses may be dislocated horizontally by rotational-dynamical forces was launched in 1984 [21] and more extensively presented in 1988 [22]. Later, it was proposed that changes in the Solar Wind strongly affect the Earth’s rate of rotation [23] (with a deeper analysis in [24]), leading to a beat in the Gulf Stream with alternations between a dominant northeastward flow during rotational slowing-down periods of Grand Solar Maxima, and a dominant east-southeastward flow during rotational speeding-up periods of Grand Solar Minima [25].

The sea level changes in the Indian Ocean, were therefore proposed [26,15] to be driven by rotational eustasy; i.e. the interchanges of water masses between high-latitudes and the equatorial region as a function of the speeding-up during Grand Solar Minima with Little Ice Age conditions and slowing down during Grand Solar Maxima with generally warm climatic conditions.

In the post-Little Ice Ages period from 1850 up to 1930-1940 there was a global glacial eustatic rise in the order of 11 cm [28]. For the rest of the last 500 years, rotational eustasy seems to have been the dominant factor as documented in Figure 3 and illustrated in Figure 4.

CONCLUSIONS

(1)– sea level is not at all in a rising mode in the Fiji area
(2) – on the contrary it has remained stable in the last 50-70 years
(3) – rotational eustasy has dominated the sea level changes in Fiji
(4) – the same changes are recorded in the Indian Ocean

Previously, the changes in sea level during the last 500 years were not covered by adequate research in the Fiji Islands. The present paper provides a detailed analysis documenting a +70 cm high level in the 16th and 17th centuries, a -50 cm low in the 18th century and a period of virtual stability in the 19th to early 21st centuries, the last of which may be subdivided into an early 19th century +30 cm high, a long period of stability and a 10-20 cm fall in sea level in the last 60 years forcing corals to grow into microatolls under strictly stable sea level conditions. This means there are no traces of a present rise in sea level; on the contrary, full stability.

The long-term trend is almost identical to the trends documented in the Indian Ocean in the Maldives, Goa and Bangladesh. This implies a eustatic origin of the changes recorded; not of glacial eustatic origin, however, but of rotational eustatic origin. The rotational eustatic changes in sea level are driven by the alternations of speeding-up during Grand Solar Minima (the Maunder and Dalton Solar Minima) forcing water towards the equator, and slowing-down during Grand Solar Maxima (in the 18th century, around 1930-1940 and at about 1970-2000).


Earth Climate Layers

Thanks to No Tricks Zone for posting work (here) by Dr. Dai Davies of Canberra. In his writing I found a fine summary paradigm leading to the image above. This post presents a scientifically rigorous view of our planetary climate system, starting with an airless rocky surface and then conceptually adding the dynamic elements in layers. The text below, with my bolds and images, comes from Energy and Atmosphere by Dr. Dai Davies. Website: http://brindabella.id.au/climarc/.

The Earth’s atmosphere in stages

This is a hypothetical scenario that allows us to build up a picture, step by step, of how having an atmosphere can influence a planet.

Baseline: Airless, Rocky Planet
As a starting point we consider how the Earth’s temperature might vary through the daily cycle if it was an airless, rocky planet much like the moon. During the day, the sun heats up a surface layer of the rock which cools through infrared radiation. The temperature follows the sun’s irradiation almost directly, rising and plunging over a range of hundreds of degrees.

Add: Radiatively Inert Atmosphere
If we add a radiatively inert atmosphere, its only means of gaining and losing heat would be thermal conduction through direct contact with the Earth’s surface. The heat capacity of a square meter column of the Earth’s atmosphere is equivalent to that of about 12 tonnes of granite, so far greater than a thin layer of rock heated by the sun. While the surface would still go through a temperature cycle, the atmosphere would achieve an equilibrium where the mean lower atmosphere matched the mean surface temperature – give-or-take geography and atmospheric circulation. It would act as a buffer that would stabilise surface temperatures – cooling the surface during the day and warming it at night. This is discussed further in note (a) with some simple calculations.
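Davies’ figure of about 12 tonnes of granite can be roughly checked with standard textbook constants. This back-of-envelope sketch is my addition, not part of his article:

```python
# Rough check: heat capacity of a 1 m^2 column of atmosphere expressed
# as an equivalent mass of granite. Textbook values throughout.

P0 = 101_325.0      # mean surface pressure, Pa
g = 9.81            # gravitational acceleration, m/s^2
cp_air = 1005.0     # specific heat of air at constant pressure, J/(kg K)
cp_granite = 840.0  # specific heat of granite, J/(kg K)

column_mass = P0 / g                          # ~10,330 kg of air per m^2
column_heat_capacity = column_mass * cp_air   # J/K per m^2 column
granite_equiv_tonnes = column_heat_capacity / cp_granite / 1000.0

print(round(granite_equiv_tonnes, 1))  # roughly 12 tonnes, as claimed
```

The result lands close to 12 tonnes with these values, so the comparison in the text is consistent with standard constants.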

All molecules are radiatively active if the energy is high enough. A realistic atmosphere, such as a nitrogen and oxygen mix, absorbs some energy from the light and UV components of incoming solar radiation, but still can’t lose heat through infrared radiation.

Various wavelengths of solar EM radiation penetrate Earth’s atmosphere to various depths. Fortunately for us, all of the high energy X-rays and most UV is filtered out long before it reaches the ground. Much of the infrared radiation is also absorbed by our atmosphere far above our heads. Most radio waves do make it to the ground, along with a narrow “window” of IR, UV, and visible light frequencies. Credit: Image courtesy STCI/JHU/NASA.

Add: Water Vapour, Ignoring Condensation
We now add water vapour to the atmosphere at typical Earth levels of up to 4%, but ignore the effects of condensation. Water molecules are kicked into excited states by collisions with nitrogen or oxygen molecules which lose some kinetic energy in the collision. Most of this energy will return to kinetic in subsequent collisions. Otherwise, the energy is radiated in a random direction as an infrared photon, which creates a radiation flux that travels much faster and further than molecular movement. Their mean free path (mfp) is typically 50 metres in the surface atmosphere, increasing with altitude as the density of the air decreases and collisions are less frequent. A small part of this energy escapes to space, a smaller part is absorbed by the Earth’s surface, leading to a net transfer to space.

This radiative flux greatly increases the thermal coupling between the surface and near-surface atmosphere, adding to the transfer via direct thermal conduction and reducing the daily temperature cycle of the surface, tying it closer to the temperature of the lower atmosphere. Due to the highly nonlinear nature of radiant emission, this will have a net heating effect on the surface as described in note (a).

The increase in mfp with altitude means there is a small upward bias in photon transmission through the atmosphere’s photon sea created by molecular collisions. This net upward transfer of energy largely substitutes the direct infrared radiation from surface to space, adding a slight delay in the order of milliseconds. Heat is not ‘trapped’, as is commonly claimed, just slowed a little. It’s a rapid conduit, not a reservoir.

Add: Liquid Water Covering 70% of Surface
For the next stage in the transition towards our current atmosphere we add our present distribution of liquid water over 70% of the rocky surface. This changes things dramatically. First, rather than just heating a thin surface layer of rock that can radiate heat rapidly, the sun’s rays penetrate deep into the oceans, heating water that retains its heat until physical mixing brings it to the surface. In the upper ‘mixing layer’ this happens in days to months. Some is mixed deeper and can travel for centuries in deep ocean currents before surfacing.

At the surface of the oceans and wet land we now have evaporative cooling which extracts heat of vaporisation and cools the surface just as sweat cools our skin. Water vapour is lighter than air and reduces the air density. The lighter air rises, creating convection. As it rises it eventually cools to the point where liquid water condenses out to form clouds and dumps the heat of vaporisation into the upper atmosphere. The main impact of clouds is to reduce incoming solar radiation by reflecting it back out to space.

Most of the heating is in equatorial regions. The rising air creates the major Hadley circulation cells that carry heat polewards in the upper troposphere. The radiating upper air cools and becomes more dense as it travels, eventually sinking back to surface level and returning to equatorial regions.

Water isn’t the only radiative gas in our atmosphere, but it dominates. The next in significance is carbon dioxide. Its main impact is in the upper atmosphere, where most of the water vapour has condensed out. This impact is cooling. Its influence in the lower atmosphere is discussed later.

Finally, we add Life.

Early on, it added the oxygen to our atmosphere. Now, its plants have changed the surface albedo – the amount of the sun’s energy reflected back to space. Through transpiration they also add to evaporation in increasing the input of water vapour to the atmosphere. Some plants and algae produce aerosols that seed clouds – terrestrial plants increasing their chances of rain – marine biota reducing the incidence of destructive UV.

Summary

There is much more to be learned from this thorough, well written article, but I will conclude with Davies’ summation:

The most fundamental of the many fatal mathematical flaws in the IPCC related modelling of atmospheric energy dynamics is to start with the impact of CO2 and assume water vapour as a dependent ‘forcing’ (note e). This has the tail trying to wag the dog. The impact of CO2 should be treated as a perturbation of the water cycle. When this is done, its effect is negligible.

Extensive analysis of radiosonde data over time, and an associated theoretical analysis, by Miskolczi (6) has shown that the water cycle adapts to maintain saturation – maximum impact – in the combined effects of water vapour and any other radiative gasses.

The sudden increase in evaporative cooling of warm water creating an upper bound for wet surface temperatures, along with the freezing point of water limiting ocean temperatures at the poles, anchor the overall surface temperature of the Earth. The Earth’s orbit, variations in solar activity, and long term transport of heat in ocean currents, provide cyclic variations. The lapse rate just determines the height of the tropopause. The net effect of CO2 is to help cool the upper troposphere where water vapour levels are low.

The current small peak in temperatures is partly the result of heat returning from past millennial cycles – the historians’ climate optima of the Medieval, Roman and earlier warm periods. As then, solar activity is now at low levels.

Davies provides a concise synopsis of several posts touching on key elements of earth’s climate.

My own discussion of climate layers is in Climate Reductionism

The effect of an inert atmosphere is shown empirically in Planetary Warming: Back to Basics

The reference above to Dr. Miskolczi is elaborated in The Curious Case of Dr. Miskolczi

The role of oceans in storing and distributing heat is described in Climate Water Wheel

The passage of energy through the atmosphere is explained at On Climate Theories

“The Earth, a rocky sphere at a distance from the Sun of ~149.6 million kilometers, where the Solar irradiance comes in at 1361.7 W/m2, with a mean global albedo, mostly from clouds, of 0.3 and with an atmosphere surrounding it containing a gaseous mass held in place by the planet’s gravity, producing a surface pressure of ~1013 mb, with an ocean of H2O covering 71% of its surface and with a rotation time around its own axis of ~24h, boasts an average global surface temperature of +15°C (288K).

Why this specific temperature? Because, with an atmosphere weighing down upon us with the particular pressure that ours exerts, this is the temperature level the surface has to reach and stay at for the global convectional engine to be able to pull enough heat away fast enough from it to be able to balance the particular averaged out energy input from the Sun that we experience.

It’s that simple.”  E. M. Smith
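The numbers Smith quotes (1361.7 W/m2 irradiance, 0.3 albedo, 288K surface) invite the standard effective-temperature calculation, Te = (S(1 - albedo)/4*sigma)**0.25. This textbook sketch is added for illustration and is not part of the quoted passage:

```python
# Effective radiating temperature of Earth from the values in the quote.

S = 1361.7        # solar irradiance at Earth, W/m^2
albedo = 0.3      # mean global albedo, mostly from clouds
sigma = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

absorbed = S * (1 - albedo) / 4   # mean absorbed flux, ~238 W/m^2
Te = (absorbed / sigma) ** 0.25   # effective temperature, ~255 K

surface = 288.0   # observed mean surface temperature, K
print(round(Te), round(surface - Te))  # ~255 K effective, ~33 K difference
```

The roughly 33 K gap between the effective radiating temperature and the 288K surface is the quantity the atmospheric layers discussed above must account for, however one apportions it among convection, pressure and radiative effects.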

Update: October 16 Snow and Ice

Yesterday at AER, Dr. Judah Cohen provided his biweekly Arctic Oscillation and Polar Vortex Analysis and Forecasts report, with an outlook for the coming winter in the Northern Hemisphere. Excerpts with my bolds.

  • As is often the case, the current positive AO is associated with a relatively mild weather pattern across the NH continents including Europe and much of North America.
  • However over the next two weeks with the predicted overall negative trend in the AO a concomitant cooling trend is predicted across the NH continents including the British Isles and Western Europe but especially the Eastern United States (US).
  • Across East Asia troughing will allow a series of fronts to swing through the region keeping temperatures variable but overall close to seasonable.
  • Looking ahead to this upcoming winter, in my opinion both below normal Arctic sea ice and above normal Siberian snow cover so far this month favor more severe winter weather especially mid and late winter across the NH mid-latitudes. Though it is still early and there is much uncertainty in predictions of winter weather.

The flow across the NH is currently mostly zonal especially across North America and this is resulting in an overall mild weather pattern including Europe and the US. The exception to the zonal flow is a block over the Laptev Sea resulting in troughing/negative geopotential height anomalies over both Western and Eastern Asia and colder temperatures.

Expanding Eurasian snow cover and Arctic ice extent October 1 to 16, 2017. Watch the ice growing toward the Siberian snow. Also at the top note ice growing toward Canadian snow cover.

Siberian snow cover has advanced at a relatively rapid pace so far this fall, which has been the recent trend. However, snow cover extent this October is so far lagging the pace of last October. My research, along with that of my colleagues and others, has shown that extensive Siberian snow cover in the fall favors a trough across East Asia with a ridge to the west near the Urals. This atmospheric circulation pattern favors more active poleward heat flux, a weaker PV and cold temperatures across the NH.

Strong negative departures in the Barents-Kara Seas favor cold temperatures in Asia, while strong negative departures near Greenland and/or the Beaufort Sea favor cold temperatures in eastern North America. However, sea ice is currently more extensive in the Barents-Kara-Laptev Seas than last year at this time, and even more than two years ago. I believe that low sea ice in the Barents-Kara Seas the past two winters helped anchor blocking in the region that favored cold temperatures in Eurasia relative to North America. That same forcing may not be as strong for the upcoming winter.

I would conclude that the three factors I consider favorable for severe winter weather (increased atmospheric blocking in the fall, more extensive Siberian snow cover and low Arctic sea ice) have become the norm more than the exception over the past decade. I do believe that the lack of variability in these three factors likely reduces their utility in winter predictions.

From Post Natural Climate Factors: Snow 

Previously I posted an explanation by Dr. Judah Cohen regarding a correlation between autumn Siberian snow cover and the following winter conditions, not only in the Arctic but extending across the Northern Hemisphere. More recently, in looking into Climate Model Upgraded: INMCM5, I noticed some of the scientists were also involved in confirming the importance of snow cover for climate forecasting. Since the poles function as the primary vents for global cooling, what happens in the Arctic in no way stays in the Arctic. This post explores data suggesting changes in snow cover drive some climate changes.

The Snow Cover Climate Factor

The diagram represents how Dr. Judah Cohen pictures the Northern Hemisphere wintertime climate system. He leads research on Arctic and NH weather patterns at AER.

[Figure: Cohen schematic of the snow-AO pathway]

Dr. Cohen explains the mechanism in this diagram.

Conceptual model for how fall snow cover modifies winter circulation in both the stratosphere and the troposphere–The case for low snow cover on left; the case for extensive snow cover on right.

1. Snow cover increases rapidly in the fall across Siberia; when snow cover is above normal, the resulting diabatic cooling helps to:
2. Strengthen the Siberian high, leading to below normal temperatures.
3. Snow-forced diabatic cooling in proximity to the high topography of Asia increases the upward flux of energy in the troposphere, which is absorbed in the stratosphere.
4. Strong convergence of WAF (Wave Activity Flux) indicates higher geopotential heights.
5. The polar vortex weakens and warming propagates down from the stratosphere into the troposphere, all the way to the surface.
6. The dynamic pathway culminates with a strong negative phase of the Arctic Oscillation at the surface.

From Eurasian Snow Cover Variability and Links with Stratosphere-Troposphere
Coupling and Their Potential Use in Seasonal to Decadal Climate Predictions by Judah Cohen.

Variations in Siberian snow cover at the end of October (day 304), 2004 to 2016. Eurasian snow charts from IMS.

Observations of the Snow Climate Factor

The animation above shows from remote sensing that Eurasian snow cover fluctuates significantly from year to year, taking the end of October as a key indicator. Snowfall in 2016 was especially early and extensive; 2017 is similar but slightly less so at this point.

For several decades the IMS snow cover images have been digitized to produce a numerical database for NH snow cover, including area extents for Eurasia. The NOAA climate data record of Northern Hemisphere snow cover extent, Version 1, is archived and distributed by NCDC’s satellite Climate Data Record Program. The CDR is forward processed operationally every month, along with figures and tables made available at Rutgers University Global Snow Lab.

This first graph shows the snow extents of interest in Dr. Cohen’s paradigm. The Autumn snow area in Siberia is represented by the annual Eurasian averages of the months of October and November (ON). The following NH Winter is shown as the average snow area for December, January and February (DJF). Thus the year designates the December of that year plus the first two months of the next year.
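The ON/DJF averaging convention above can be sketched in a few lines of Python. The monthly extent values and the dictionary layout here are made up for illustration; they are not the actual Rutgers Snow Lab file format.

```python
# Sketch of the seasonal averaging convention described above.
# Autumn = Oct-Nov (ON) of a year; Winter labeled "year" = Dec of that
# year plus Jan and Feb of the following year (DJF).

def autumn_avg(monthly, year):
    """Eurasian autumn extent: mean of October and November of `year`."""
    return (monthly[(year, 10)] + monthly[(year, 11)]) / 2.0

def winter_avg(monthly, year):
    """NH winter extent labeled `year`: Dec of `year`, Jan and Feb of `year`+1."""
    months = [(year, 12), (year + 1, 1), (year + 1, 2)]
    return sum(monthly[m] for m in months) / 3.0

# Hypothetical monthly snow extents (million km^2), keyed by (year, month):
monthly = {(2016, 10): 12.0, (2016, 11): 22.0,
           (2016, 12): 43.0, (2017, 1): 47.0, (2017, 2): 46.0}

print(autumn_avg(monthly, 2016))  # -> 17.0
print(winter_avg(monthly, 2016))  # mean of the three DJF months
```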

Notes: NH snow cover minimum was 1981, trending upward since.  Siberian autumn snow cover was lowest in 1989, increasing since then.  Autumn Eurasian snow cover is about 1/3 of Winter NH snow area. Note also that fluctuations are sizable and correlated.

The second graph presents annual anomalies for the two series, each calculated as the deviation from the mean of its entire time series. Strikingly, the Eurasian Autumn flux is on the same scale as total NH flux, and closely aligned. While NH snow cover declined a few years prior to 2016, Eurasian snow is trending upward strongly.  If Dr. Cohen is correct, NH snowfall will follow. The linear trend is slightly positive, suggesting that fears of children never seeing snowfall have been exaggerated. The Eurasian trend line (not shown) is almost the same.
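The anomaly calculation used for the second graph (each value's deviation from the mean of its own series) and a least-squares trend slope can be sketched as follows. The extent values below are hypothetical, for illustration only.

```python
# Sketch: anomalies as deviations from a series' own mean, plus an
# ordinary least-squares slope per time step. Data are made up.

def anomalies(series):
    """Deviation of each value from the mean of the whole series."""
    mean = sum(series) / len(series)
    return [x - mean for x in series]

def trend_slope(series):
    """OLS slope of the series against time steps 0, 1, 2, ..."""
    n = len(series)
    xbar = (n - 1) / 2.0
    ybar = sum(series) / n
    num = sum((x - xbar) * (y - ybar) for x, y in enumerate(series))
    den = sum((x - xbar) ** 2 for x in range(n))
    return num / den

extents = [44.0, 45.5, 43.8, 46.2, 47.0]  # hypothetical snow extents
print(anomalies(extents))   # deviations summing to zero
print(trend_slope(extents))  # positive slope -> upward trend
```

By construction the anomalies sum to zero, which is why two series of very different absolute size (Eurasian autumn vs. total NH winter) can be compared on the same anomaly scale.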

What About Winter 2017-2018?

These data confirm that Dr. Cohen and colleagues are onto something. Here are excerpts from his October 2 outlook at AER for the upcoming season (my bolds).

The main block/high pressure feature influencing Eurasian weather is currently centered over the Barents-Kara Seas and is predicted to first weaken and then strengthen over the next two weeks.

Blocking in the Barents-Kara Seas favors troughing/negative geopotential height anomalies and cool temperatures downstream over Eurasia but especially Central and East Asia. The forecast for the next two weeks across Central Asia is for continuation of overall below normal temperatures and new snowfall.

Currently the largest negative anomalies in sea ice extent are in the Chukchi and Beaufort Seas, but that will change over the next month or so during the critical months of November-February. In my opinion low Arctic sea ice favors a more severe winter, but not necessarily hemisphere-wide; it depends on the regions of the strongest anomalies. Strong negative departures in the Barents-Kara Seas favor cold temperatures in Asia, while strong negative departures near Greenland and/or the Beaufort Sea favor cold temperatures in eastern North America.

Siberian snow cover is advancing quickly relative to climatology and is on a pace similar to last year at this time. My research, along with that of my colleagues and others, has shown that extensive Siberian snow cover in the fall favors a trough across East Asia with a ridge to the west near the Urals. This atmospheric circulation pattern favors more active poleward heat flux, a weaker PV and cold temperatures across the NH. It is very early in the snow season, but recent falls have been snowy across Siberia and therefore I do expect another snowy fall across Siberia this year.

Summary

In summary, the three main predictors that I follow most closely in the fall months (the presence or absence of high latitude blocking, Arctic sea ice extent and Siberian snow cover extent) all point towards a more severe winter across the continents of the NH.

Uh oh.  Now where did I put away my long johns?