More Methane Madness

The US Senate is considering a measure to repeal, with prejudice, an Obama anti-methane regulation. The story from the activist source Climate Central is
Senate Mulls ‘Kill Switch’ for Obama Methane Rule

The U.S. Senate is expected to vote soon on whether to use the Congressional Review Act to kill an Obama administration climate regulation that cuts methane emissions from oil and gas wells on federal land. The rule was designed to reduce oil and gas wells’ contribution to climate change and to stop energy companies from wasting natural gas.

The Congressional Review Act is rarely invoked. It was used this month to reverse a regulation for the first time in 16 years and it’s a particularly lethal way to kill a regulation as it would take an act of Congress to approve a similar regulation. Federal agencies cannot propose similar regulations on their own.

The Claim Against Methane

Now some Republican senators are hesitant to take this step because of claims like this one in the article:

Methane is 86 times more potent as a greenhouse gas than carbon dioxide over a period of 20 years and is a significant contributor to climate change. It warms the climate much more than other greenhouse gases over a period of decades before eventually losing its potency. Atmospheric carbon dioxide remains a potent greenhouse gas for thousands of years.

Essentially the journalist is saying: as afraid as you are of CO2, you should be 86 times more afraid of methane. Which also means that if CO2 is not a warming problem, your fear of methane is 86 times zero. The thousands-of-years claim is also bogus, but that is beside the point of this post, which is methane.

IPCC Methane Scare

The article helpfully provides a link to Chapter 8 of the IPCC AR5 Working Group 1 report, Anthropogenic and Natural Radiative Forcing.

The document is full of sophistry and creative accounting in order to produce as scary a number as possible. Table 8.7 provides the figure for CH4 potency: 86 times that of CO2. The authors note that they were able to increase the Global Warming Potential (GWP) of CH4 by 20% over the estimate in AR4. The increase comes from adding in more indirect effects and feedbacks, as well as from increased concentration in the atmosphere.

In the details are some qualifying notes like these:

Uncertainties related to the climate–carbon feedback are large, comparable in magnitude to the strength of the feedback for a single gas.

For CH4 GWP we estimate an uncertainty of ±30% and ±40% for 20- and 100-year time horizons, respectively (for 5 to 95% uncertainty range).
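Taking that quoted uncertainty at face value, the 20-year GWP of 86 spans roughly 60 to 112. A minimal sketch of the arithmetic (the central value is from Table 8.7 as cited; the band is just central × (1 ± fractional uncertainty)):

```python
# Spread of the AR5 methane GWP given the quoted 5-95% uncertainty band.
# Central value 86 (20-yr horizon) is from the text above.
def gwp_range(central, fractional_uncertainty):
    """Return (low, high) bounds for a GWP central estimate."""
    return (central * (1 - fractional_uncertainty),
            central * (1 + fractional_uncertainty))

print(gwp_range(86, 0.30))  # 20-year horizon: roughly (60, 112)
```

So even on the IPCC's own accounting, the headline "86" carries a spread of more than fifty points.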

Methane Facts from the Real World
From Sea Friends (here):

Methane is natural gas, CH4, which burns cleanly to carbon dioxide and water. Methane is eagerly sought after as fuel for electric power plants because of its ease of transport and because it produces the least carbon dioxide for the most power. Cars can also be powered with compressed natural gas (CNG) for short distances.

In many countries CNG has been widely distributed as the main home heating fuel. As a consequence, methane has leaked to the atmosphere in large quantities, now firmly controlled. Grazing animals also produce methane in their complicated stomachs and methane escapes from rice paddies and peat bogs like the Siberian permafrost.

It is thought that methane is a very potent greenhouse gas because it absorbs some infrared wavelengths 7 times more effectively than CO2, molecule for molecule, and even 20 times more by weight. As we have seen previously, this also means that within a distance of metres its effect has saturated, and further transmission of heat occurs by convection and conduction rather than by radiation.

Note that when H2O is present in the lower troposphere, there are few photons left for CH4 to absorb.
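The saturation argument in the excerpt rests on Beer–Lambert attenuation: at a strongly absorbed wavelength, transmitted intensity decays exponentially with path length. A sketch with an illustrative absorption coefficient (a placeholder, not a measured CH4 line strength):

```python
import math

# Beer-Lambert attenuation: I/I0 = exp(-k * x) for absorption
# coefficient k (1/m) and path length x (m). The value k = 1.0/m used
# below is an illustrative placeholder, not a measured CH4 value.
def transmitted_fraction(k_per_m, path_m):
    return math.exp(-k_per_m * path_m)

# With k = 1/m, over 99% of photons at line center are absorbed within 5 m;
# adding more absorber then changes very little at that wavelength.
print(transmitted_fraction(1.0, 5.0) < 0.01)  # True
```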

Even if the IPCC radiative greenhouse theory were true, methane occurs only in minute quantities in air: 1.8 ppm versus 390 ppm for CO2. By weight, CH4 amounts to only 5.24 Gt versus 3140 Gt of CO2 (on this assumption). If it truly were twenty times more potent, it would amount to an equivalent of 105 Gt CO2, or one thirtieth that of CO2. A doubling in methane would thus have no noticeable effect on world temperature.
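The back-of-envelope arithmetic in the passage above checks out using only the excerpt's own figures:

```python
# The excerpt's figures: 5.24 Gt CH4 in the air, 3140 Gt CO2,
# and a (disputed) weight-based potency factor of 20.
ch4_gt = 5.24
co2_gt = 3140.0
potency_by_weight = 20.0

co2_equivalent = ch4_gt * potency_by_weight  # Gt CO2-equivalent
ratio = co2_gt / co2_equivalent              # CO2 exceeds CH4-equivalent by this factor

print(round(co2_equivalent, 1))  # 104.8, i.e. the ~105 Gt stated
print(round(ratio))              # 30, i.e. one thirtieth
```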

However, the factor of 20 is entirely misleading, because absorption is proportional to the number of molecules (that is, to volume), so the factor of 7 (7.3) is correct and 20 is wrong. With this in mind, the perceived threat from methane becomes even smaller.

Further still, methane has risen from 1.6 ppm to 1.8 ppm in 30 years (1980-2010). Assuming it has not stopped rising, this amounts to a doubling in 2-3 centuries. In other words, methane can never have any measurable effect on temperature, even if the IPCC radiative greenhouse theory were right.
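The doubling-time claim is a simple linear extrapolation of the quoted rise, again using only the excerpt's numbers:

```python
# 1.6 -> 1.8 ppm over the 30 years 1980-2010, extrapolated linearly.
rate_ppm_per_year = (1.8 - 1.6) / 30.0     # ~0.0067 ppm/yr
years_to_double = 1.8 / rate_ppm_per_year  # ppm needed to double / rate

print(round(years_to_double))  # 270, i.e. between 2 and 3 centuries
```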

Because only a small fraction in the rise of methane in air can be attributed to farm animals, it is ludicrous to worry about this aspect or to try to farm with smaller emissions of methane, or to tax it or to trade credits.

The fact that methane in air has been leveling off in the past two decades, even though we do not know why, implies that it plays absolutely no role as a greenhouse gas.

More information at THE METHANE MISCONCEPTIONS by Dr Wilson Flood (UK) here


Natural Gas (75% methane) burns the cleanest with the least CO2 for the energy produced.

Leakage of methane is already addressed by efficiency improvements for its economic recovery, and will apparently be subject to even more regulations.

The atmosphere is a methane sink where the compound is oxidized through a series of reactions, producing one CO2 and two H2O molecules after a few years.

GWP (Global Warming Potential) is CO2 equivalent heat trapping based on laboratory, not real world effects.

Any IR absorption by methane is limited by H2O absorbing in the same low energy LW bands.

There is no danger this century from natural or man-made methane emissions.
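The oxidation mentioned in the list above reduces to a simple net stoichiometry (the actual atmospheric pathway runs chiefly through OH radicals and several intermediates, a mechanism the source does not spell out):

```latex
\mathrm{CH_4 + 2\,O_2 \longrightarrow CO_2 + 2\,H_2O}
```

Each CH4 molecule thus yields one CO2 and two H2O, matching the tally given above.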


Senators and the public are being bamboozled by opaque scientific bafflegab. The plain truth is much different. The atmosphere is a methane sink in which CH4 is oxidized, and its IR absorption saturates within the first few meters. The amount of CH4 available in the air is minuscule, even compared to the trace gas CO2, and it is not accelerating. Methane is the obvious choice for signaling virtue on the climate issue, since governmental actions will not make a bit of difference anyway, except perhaps to do some economic harm.

Give a daisy a break (h/t Derek here)

Daisy methane


For a more thorough and realistic description of atmospheric warming see:

Fearless Physics from Dr. Salby

Ocean Oxygen Misdirection

The climate scare machine is again promoting the fear of suffocating oceans. For example, an article this week by Chris Mooney in the Washington Post: It’s Official, the Oceans Are Losing Oxygen.

A large research synthesis, published in one of the world’s most influential scientific journals, has detected a decline in the amount of dissolved oxygen in oceans around the world — a long-predicted result of climate change that could have severe consequences for marine organisms if it continues.

The paper, published Wednesday in the journal Nature by oceanographer Sunke Schmidtko and two colleagues from the GEOMAR Helmholtz Centre for Ocean Research in Kiel, Germany, found a decline of more than 2 percent in ocean oxygen content worldwide between 1960 and 2010.

Climate change models predict the oceans will lose oxygen because of several factors. Most obvious is simply that warmer water holds less dissolved gases, including oxygen. “It’s the same reason we keep our sparkling drinks pretty cold,” Schmidtko said.

But another factor is the growing stratification of ocean waters. Oxygen enters the ocean at its surface, from the atmosphere and from the photosynthetic activity of marine microorganisms. But as that upper layer warms up, the oxygen-rich waters are less likely to mix down into cooler layers of the ocean because the warm waters are less dense and do not sink as readily.
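Schmidtko’s sparkling-drink analogy is ordinary gas solubility: dissolved O2 falls as temperature rises. A hedged sketch using a van ’t Hoff-style temperature dependence (the constant below is a rough illustrative value, not a fitted oceanographic coefficient):

```python
import math

# Van 't Hoff-style solubility: concentration scales as
# exp(dH/R * (1/T - 1/Tref)). dh_over_r ~ 1700 K is an assumed
# order-of-magnitude value for O2 in water, for illustration only.
def o2_solubility(temp_c, c_ref=10.0, t_ref_c=10.0, dh_over_r=1700.0):
    """Approximate dissolved O2 (mg/L) relative to a 10 degC reference."""
    t_k = temp_c + 273.15
    t_ref_k = t_ref_c + 273.15
    return c_ref * math.exp(dh_over_r * (1.0 / t_k - 1.0 / t_ref_k))

print(o2_solubility(5.0) > o2_solubility(25.0))  # True: warm water holds less O2
```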

And of course, other journalists pile on with ever more catchy headlines.

The World’s Oceans Are Losing Oxygen Due to Climate Change

How Climate Change Is Suffocating The Oceans

Overview of Oceanic Oxygen

Once again climate alarmists/activists have seized upon an actual environmental issue, but misdirect the public toward their CO2 obsession, and away from practical efforts to address a real concern. Some excerpts from scientific studies serve to put things in perspective.

How the Ocean Breathes

Variability in oxygen and nutrients in South Pacific Antarctic Intermediate Water by J. L. Russell and A. G. Dickson

The Southern Ocean acts as the lungs of the ocean, drawing in oxygen and exchanging carbon dioxide. A quantitative understanding of the processes regulating the ventilation of the Southern Ocean today is vital to assessments of the geochemical significance of potential circulation reorganizations in the Southern Hemisphere, both during glacial-interglacial transitions and into the future.

Traditionally, the change in the concentration of oxygen along an isopycnal due to remineralization of organic material, known as the apparent oxygen utilization (AOU), has been used by physical oceanographers as a proxy for the time elapsed since the water mass was last exposed to the atmosphere. The concept of AOU requires that newly subducted water be saturated with respect to oxygen and is calculated from the difference between the measured oxygen concentration and the saturated concentration at the sample temperature.

This study has shown that the ratio of oxygen to nutrients can vary with time. Since Antarctic Intermediate Water provides a necessary component to the Pacific equatorial biological regime, this relatively high-nutrient, high-oxygen input to the Equatorial Undercurrent in the Western Pacific plays an important role in driving high rates of primary productivity on the equator, while limiting the extent of denitrifying bacteria in the eastern portion of the basin. 
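The AOU proxy described in the excerpt is a simple difference; the hard part in practice is the saturation concentration, which depends on temperature and salinity. A minimal sketch, with the saturation value supplied by the caller rather than computed, and hypothetical sample numbers:

```python
# Apparent Oxygen Utilization: saturation minus measured concentration.
# Positive AOU indicates net respiration since the water mass was last
# exposed to the atmosphere (per the definition quoted above).
def apparent_oxygen_utilization(o2_measured, o2_saturation):
    """Both arguments in the same units, e.g. micromol/kg."""
    return o2_saturation - o2_measured

# Hypothetical thermocline sample: saturated at 260, measured at 180.
print(apparent_oxygen_utilization(180.0, 260.0))  # 80.0
```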

Uncertain Measures of O2 Variability and Linkage to Climate Change

A conceptual model for the temporal spectrum of oceanic oxygen variability by Taka Ito and Curtis Deutsch

Changes in dissolved O2 observed across the world oceans in recent decades have been interpreted as a response of marine biogeochemistry to climate change. Little is known however about the spectrum of oceanic O2 variability. Using an idealized model, we illustrate how fluctuations in ocean circulation and biological respiration lead to low-frequency variability of thermocline oxygen.

Because the ventilation of the thermocline naturally integrates the effects of anomalous respiration and advection over decadal timescales, short-lived O2 perturbations are strongly damped, producing a red spectrum, even in a randomly varying oceanic environment. This background red spectrum of O2 suggests a new interpretation of the ubiquitous strength of decadal oxygen variability and provides a null hypothesis for the detection of climate change influence on oceanic oxygen. We find a statistically significant spectral peak at a 15–20 year timescale in the subpolar North Pacific, but the mechanisms connecting to climate variability remain uncertain.

The spectral power of oxygen variability increases from inter-annual to decadal frequencies, which can be explained using a simple conceptual model of an ocean thermocline exposed to random climate fluctuations. The theory predicts that the bias toward low-frequency variability is expected to level off as the forcing timescales become comparable to that of ocean ventilation. On time scales exceeding that of thermocline renewal, O2 variance may actually decrease due to the coupling between physical O2 supply and biological respiration [Deutsch et al., 2006], since the latter is typically limited by the physical nutrient supply.
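The “red spectrum” Ito and Deutsch describe is the signature of a system that integrates short-lived forcing, classically modeled as an AR(1) process. A sketch using the analytic AR(1) power spectrum (the damping parameter a = 0.9 is illustrative, not a value from the paper):

```python
import math

# Power spectral density of an AR(1) process x_t = a*x_{t-1} + e_t:
# S(f) = sigma^2 / (1 - 2a*cos(2*pi*f) + a^2). Strong damping of
# short-lived perturbations concentrates variance at low frequencies.
def ar1_spectrum(freq, a=0.9, sigma2=1.0):
    """freq in cycles per time step, 0 < freq <= 0.5."""
    return sigma2 / (1.0 - 2.0 * a * math.cos(2.0 * math.pi * freq) + a * a)

print(ar1_spectrum(0.01) > ar1_spectrum(0.4))  # True: a "red" spectrum
```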

Climate Model Projections are Confounded by Natural Variability

Natural variability and anthropogenic trends in oceanic oxygen in a coupled carbon cycle–climate model ensemble by T. L. Frolicher et al.

Internal and externally forced variability in oceanic oxygen (O2) are investigated on different spatiotemporal scales using a six-member ensemble from the National Center for Atmospheric Research CSM1.4-carbon coupled climate model. The oceanic O2 inventory is projected to decrease significantly in global warming simulations of the 20th and 21st centuries.

The anthropogenically forced O2 decrease is partly compensated by volcanic eruptions, which cause considerable interannual to decadal variability. Volcanic perturbations in oceanic oxygen concentrations gradually penetrate the ocean’s top 500 m and persist for several years. While well identified on global scales, the detection and attribution of local O2 changes to volcanic forcing is difficult because of unforced variability.

Internal climate modes can substantially contribute to surface and subsurface O2 variability. Variability in the North Atlantic and North Pacific are associated with changes in the North Atlantic Oscillation and Pacific Decadal Oscillation indexes. Simulated decadal variability compares well with observed O2 changes in the North Atlantic, suggesting that the model captures key mechanisms of late 20th century O2 variability, but the model appears to underestimate variability in the North Pacific.

Our results suggest that large interannual to decadal variations and limited data availability make the detection of human-induced O2 changes currently challenging.

The concentration of dissolved oxygen in the thermocline and the deep ocean is a particularly sensitive indicator of change in ocean transport and biology [Joos et al., 2003]. Less than a percent of the combined atmosphere and ocean O2 inventory is found in the ocean. The O2 concentration in the ocean interior reflects the balance between O2 supply from the surface through physical transport and O2 consumption by respiration of organic material.

Our modeling study suggests that over recent decades internal natural variability tends to mask simulated century-scale trends in dissolved oxygen from anthropogenic forcing in the North Atlantic and Pacific. Observed changes in oxygen are similar or even smaller in magnitude than the spread of the ensemble simulation. The observed decreasing trend in dissolved oxygen in the Indian Ocean thermocline and the boundary region between the subtropical and subpolar gyres in the North Pacific has reversed in recent years [McDonagh et al., 2005; Mecking et al., 2008], implicitly supporting this conclusion.

The presence of large-scale propagating O2 anomalies, linked with major climate modes, complicates the detection of long-term trends in oceanic O2 associated with anthropogenic climate change. In particular, we find a statistically significant link between O2 and the dominant climate modes (NAO and PDO) in the North Atlantic and North Pacific surface and subsurface waters, which are causing more than 50% of the total internal variability of O2 in these regions.

To date, the ability to detect and interpret observed changes is still limited by lack of data. Additional biogeochemical data from time series and profiling floats, such as the Argo array, are needed to improve the detection of ocean oxygen and carbon system changes and our understanding of climate change.

The Real Issue is Ocean Dead Zones, Both Natural and Man-made

Since 1994, Diaz and the World Resources Institute (report here) in Washington, D.C., have identified and mapped 479 dead zones around the world. That’s more than nine times as many as scientists knew about 50 years ago.

What triggers the loss of oxygen in ocean water is the explosive growth of sea life fueled by the release of too many nutrients. As they grow, these crowds can simply use up too much of the available oxygen.

Many nutrients entering the water — such as nitrogen and phosphorus — come from meeting the daily needs of some seven billion people around the world, Diaz says. Crop fertilizers, manure, sewage and exhaust spewed by cars and power plants all end up in waterways that flow into the ocean. Each can contribute to the creation of dead zones.

Ordinarily, when bacteria steal oxygen from one patch of water, more will arrive as waves and ocean currents bring new water in. Waves also can grab oxygen from the atmosphere.

Dead zones develop when this ocean mixing stops.

Rivers running into the sea dump freshwater into the salty ocean. The sun heats up the freshwater on the sea surface. This water is lighter than cold saltier water, so it floats atop it. When there are not enough storms (including hurricanes) and strong ocean currents to churn the water, the cold water can get trapped below the fresh water for long periods.
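The stratification described above comes down to density: warm and/or fresh water is lighter than cold, salty water. A sketch with a linearized equation of state (the expansion and contraction coefficients are assumed, rough textbook-scale values, not a full equation of state):

```python
# Linearized seawater density:
# rho = rho0 * (1 - alpha*(T - T0) + beta*(S - S0)),
# with alpha (thermal expansion) and beta (haline contraction) as
# assumed order-of-magnitude coefficients for illustration.
def seawater_density(temp_c, salinity_psu, rho0=1027.0, t0=10.0, s0=35.0,
                     alpha=2e-4, beta=8e-4):
    return rho0 * (1.0 - alpha * (temp_c - t0) + beta * (salinity_psu - s0))

surface = seawater_density(25.0, 20.0)  # warm, river-freshened surface layer
deep = seawater_density(8.0, 35.0)      # cold, salty water below
print(surface < deep)  # True: the light layer floats and mixing is suppressed
```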

Dead zones are seasonal events. They typically last for weeks or months. Then they’ll disappear as the weather changes and ocean mixing resumes.

Solutions are Available and do not Involve CO2 Emissions

Helping dead zones recover

The Black Sea is bordered by Europe and Asia. Dead zones used to develop here that covered an area as large as Switzerland. Fertilizers running off of vast agricultural fields and animal feedlots in the former Soviet Union were a primary cause. Then, in 1989, parts of the Soviet Union began revolting. Two years later, this massive nation broke apart into 15 separate countries.

The political instability hurt farm activity. In short order, use of nitrogen and phosphorus fertilizers by area farmers declined. Almost at once, the size of the Black Sea’s dead zone shrank dramatically. Now if a dead zone forms there it’s small, Rabalais says. Some years there is none.

Chesapeake Bay, the United States’ largest estuary, has its own dead zone. And the area affected has expanded over the past 50 years due to pollution. But since the 1980s, farmers, landowners and government agencies have worked to reduce the nutrients flowing into the bay.

Farmers now plant cover crops, such as oats or barley, that use up fertilizer that once washed away into rivers. Growers have also established land buffers to absorb nutrient runoff and to keep animal waste out of streams. People have even started to use laundry detergents made without phosphorus.

In 2011, scientists reported that these efforts had achieved some success in shrinking the size of the bay’s late-summer dead zones.

The World Resources Institute lists 55 dead zones as improving. “The bottom line is if we take a look at what is causing a dead zone and fix it, then the dead zone goes away,” says Diaz. “It’s not something that has to be permanent.”


Alarmists/activists are again confusing the public with their simplistic solution for a complex situation. And actual remedies are available, just not the agenda preferred by climatists.

Waste Management Saves the Ocean


Ocean Climate Ripples

Dr. Arnd Bernaerts is again active with edifying articles on how humans impact upon the oceans and thereby the climate. His recent post is Global Cooling 1940 – 1975 explained for climate change experts

Like others, I first approached Dr. Bernaerts’ theory relating naval warfare to climate change with a properly skeptical observation: the ocean is so vast, covering 71% of our planet’s surface and reaching depths of up to 11,000 meters, and stores so much solar energy, that it counteracts all forcings, including human ones.

As an oceanographer, Bernaerts is well aware of that generalization, having named his website Oceans Govern Climate. But his understanding is much more particular and more clear to me in these recent presentations. His information is encyclopedic and his grasp of the details can be intimidating, but I think I get his main point.

When there is intense naval warfare concentrated in a small, shallow basin like the North Sea, the disturbance of the water structure and circulation is profound. The atmosphere responds, resulting in significant regional climate effects. Nearby basins and continents are impacted and eventually it ripples out across the globe.

The North Atlantic example is explained by Bernaerts in Cooling of North Sea – 1939 (2_16). Some excerpts follow.

Follow the Water

Water has the highest heat capacity of all solids and liquids except liquid ammonia. If the water within a water body remained stationary and did not move (which in fact it does, abundantly and often forcefully, for a number of reasons), the uppermost surface layer would almost completely stop the transfer of heat from the water body to the atmosphere.

However, temperature and salt are the biggest internal dynamic factors, and they keep the water permanently in motion. How much heat the ocean can transfer to the atmosphere depends on how warm the surface water is relative to the air. Of no lesser importance is the question of how quickly, and in what quantities, cooled-down surface water is replaced by warmer water from the sub-surface. Wind, cyclones and hurricanes are atmospheric factors that quickly expose new water masses at the sea surface. Another ‘effective’ way to replace surface water is to stir the water body itself. Naval activities do just this.

War in the North Sea

From the day the Second World War started, naval activities moved and turned the water in the North Sea, at the surface and at depths of 5, 10, 20, 30 metres or more, on a scale possibly dozens of times greater than any comparable external activity over a similar period before. Presumably only World War One could be named in comparison.

The combatants arrived on the scene when the volume of heat received from the sun had reached its annual peak. Impacts on temperatures and icing are listed in the last section, ‘Events’ (see below). The following circumstantial evidence supports, with a high degree of certainty, the conclusion that the North Sea contributed to the arctic war winter of 1939/40.

Climate Change in Response

The sea surface temperature record at Helgoland Station and subsequent air temperature developments provide a strong indication that the evaporation rate was high. This is confirmed by the following observed impacts:

More wind: As the rate of evaporation over the North Sea has not been measured and recorded, there seems little chance of proving that more vapour moved upwards during autumn 1939 than usual. It can be proved, however, that the direction of the inflowing wind changed from the usually prevailing SW winds to winds from the N to E, predominantly from the East. At Kew Observatory (London) the general wind direction recorded was north-easterly in only three of 155 winter years: 1814, 1841 and 1940[6]. This continental wind could have significantly contributed to the following phenomenon of 1939: ‘The Western Front rain’.

More rain: One of the most immediate indicators of evaporation is the excessive rain in an area stretching from Southern England to Saxony, Silesia and Switzerland. Southern Baltic Sea together with Poland and Northern Germany were clearly separated from the generally wet weather conditions only three to four hundred kilometres further south. A demonstration of the dominant weather situation occurred in late October, when a rain section (supplied from Libya) south of the line Middle Germany, Hungary and Romania was completely separated from the rain section at Hamburg – Southern Baltic[7].

More cooling: Further, cooling observed from December 1939 onward can be linked to war activities in two ways. The most immediate effect, as has been explained (above), is the direct result from any excessive evaporation process. The second (at least for the establishment of global conditions in the first war winter) is the deprivation of the Northern atmosphere of its usual amount of water masses, circulating the globe as humidity.

Rippling Effects in Northern Europe and Beyond

Next to the Atlantic Gulf Current, the North Sea (the Baltic Sea is discussed in the next chapter) plays a key role in determining winter weather conditions in Northern Europe. The reason is simple. As long as these seas are warm, they help sustain the supremacy of maritime weather conditions. If their heat capacity turns negative, their character turns ‘continental’, giving high-pressure air masses an easy opportunity to reign, i.e. to come with cold and dry air. Once that happens, warm Atlantic air is severely hampered or even prevented from moving eastwards freely.

The less moist air is circulating the globe south of the Arctic, the more easily cold polar air can travel south. A good piece of evidence is the record lack of rain in the USA from October – December 1939 followed by a colder than average January 1940, a long period of low water temperatures in the North Sea from October-March (see above) and the ‘sudden’ fall of air temperatures to record low in Northern Europe.

The graph above suggests that naval warfare is linked to rapid cooling. The climate system responds with negative feedbacks to restore equilibrium. Following WWI, limited to the North Atlantic, the system overshot and the momentum continued upward into the 1930s. Following WWII, with both Pacific and Atlantic theaters, the climate feedbacks show several peaks trying to offset the cooling, but the downward trend persisted until about 1975.


The Oceans Govern Climate. Man influences the ocean governor by means of an expanding fleet of motorized propeller-driven ships. Naval warfare in the two World Wars provide the most dramatic examples of the climate effects.

Neither I nor Dr. Bernaerts claim that shipping and naval activity are the only factors driving climate fluctuations. But it is disturbing that so much attention and money is spent on a bit player CO2, when a much more plausible human influence on climate is ignored and not investigated.

Scafetta vs. IPCC: Dueling Climate Theories

In one corner, Darth Vader, the Prince of CO2, filling the air with the overwhelming sound of his poison breath. Opposing him, Luke Skywalker, a single skeptic armed only with facts and logic.

OK, that’s over the top, but it’s what came to mind while reading a new paper by Nicola Scafetta in which he goes up against the IPCC empire. And Star Wars came to mind since Scafetta’s theory involves astronomical cycles. The title below links to the text, which is well worth reading.  Some excerpts follow. H/T GWPF

CMIP5 General Circulation Models versus a Semi-Empirical Model Based on Natural Oscillations

Scafetta comes out swinging: From the Abstract

Since 1850 the global surface temperature has warmed by about 0.9 °C. The CMIP5 computer climate models adopted by the IPCC have projected that the global surface temperature could rise by 2-5 °C from 2000 to 2100 for anthropogenic reasons. These projections are currently used to justify expensive mitigation policies to reduce the emission of anthropogenic greenhouse gases such as CO2.

However, recent scientific research has pointed out that the IPCC climate models fail to properly reconstruct the natural variability of the climate. Indeed, advanced techniques of analysis have revealed that the natural variability of the climate is made of several oscillations spanning from the decadal to the millennial scales (e.g. with periods of about 9.1, 10.4, 20, 60, 115, 1000 years and others). These oscillations likely have an astronomical origin.

In this short review I briefly summarize some of the main reasons why the AGWT should be questioned. In addition, I show that an alternative interpretation of climate change is possible, based on the evidence that a significant part of it is due to specific natural oscillations. A model based on this interpretation agrees better with the comprehensive climatic picture deduced from the data.

The Missing Hot-Spot

It has been observed that for recent decades climate models predict a hot-spot, that is, a significant warming of a band of the upper troposphere about 10 km above the tropics and the equator. The presence of this hot-spot is important because it would indicate that the water-vapor feedback to radiative forcing is correctly reproduced by the models.

However, this predicted hot-spot has never been found in the tropospheric temperature records [20,21]. This suggests either that the temperature records obtained with satellites and balloons have been poorly handled, or that the models severely fail to properly simulate the water-vapor feedback. In the latter case the flaw would be fatal, because the water-vapor feedback is the most important of the climate feedbacks.

Without a strong feedback response from water vapor, the models would predict only a moderate climate sensitivity to radiative forcing of about 1.2 °C for CO2 doubling instead of about 3 °C. Figure 8 compares the observed temperature trend in the troposphere with the climate model predictions, from Ref. [21]. The difference between the two record sets is evident.
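The 1.2 °C versus 3 °C contrast maps onto the standard feedback relation, equilibrium warming = no-feedback warming / (1 − f). A sketch using the round numbers from the passage (the feedback factor f = 0.6 is the value implied, not stated, by the text):

```python
# Standard feedback amplification: dT = dT0 / (1 - f), with
# dT0 = 1.2 degC (no-feedback CO2 doubling, per the text) and f the
# combined feedback factor; f = 0.6 is implied by a 3 degC response.
def equilibrium_warming(no_feedback_dt=1.2, feedback_factor=0.0):
    return no_feedback_dt / (1.0 - feedback_factor)

print(equilibrium_warming())                               # 1.2 without feedbacks
print(round(equilibrium_warming(feedback_factor=0.6), 1))  # 3.0 with f = 0.6
```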


Figure 8. Comparison between observed temperature trend in the troposphere (green-blue) versus the climate model predictions (red). From Ref. [21].

Observations Favor Scafetta’s Model Over GCM Models

I have proposed that the global surface temperature record could be reconstructed from the decadal to the millennial scale using a minimum of 6 harmonics at 9.1, 10.4, 20, 60, 115 and 983 years, plus an anthropogenic and volcanic contribution that can be evaluated from the CMIP5 GCM outputs reduced by half because, as discussed above, the real climate sensitivity to radiative forcing appears to be about half of what is assumed by the current climate models. The figure highlights the better performance of the solar–astronomical semi-empirical model versus the CMIP5 models. This is particularly evident since 2000, as shown in the inserts.


Figure 12. [A] The four CMIP5 ensemble average projections versus the HadCRUT4 GST record (black). [B] The solar–astronomical semi-empirical model. From Ref. [4]. Left axis shows temperature anomalies in degrees Celsius.
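The harmonic construction Scafetta describes can be sketched as a sum of cosines at his six cited periods plus a secular term. The amplitudes, phases and trend below are illustrative placeholders, not his fitted values:

```python
import math

# Semi-empirical harmonic model sketch:
# anomaly(t) = trend*(t - ref_year) + sum of cosines at the six cited
# periods (years). All amplitudes and phases are placeholders.
PERIODS = [9.1, 10.4, 20.0, 60.0, 115.0, 983.0]

def harmonic_model(year, amplitudes=None, phases=None,
                   trend=0.005, ref_year=1850.0):
    if amplitudes is None:
        amplitudes = [0.05, 0.05, 0.1, 0.11, 0.06, 0.35]
    if phases is None:
        phases = [0.0] * len(PERIODS)
    t = year - ref_year
    anomaly = trend * t
    for period, amp, phase in zip(PERIODS, amplitudes, phases):
        anomaly += amp * math.cos(2.0 * math.pi * t / period + phase)
    return anomaly

print(round(harmonic_model(2000.0), 3))
```

With all amplitudes zeroed the model collapses to the bare trend, which is how one separates the oscillatory and secular parts in such a fit.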

Forecast Validation

In 2011 I prepared a global surface temperature forecast based on a simplified climate model with four natural oscillations (9.1, 10.4, 20 and 60 years) plus an estimate of a realistic anthropogenic contribution [25]: for example, see Refs. [33,34,35] referring to the 60-year cycle. Figure 13 compares my 2011 forecast (red curve) against the global surface temperature record I used in 2011 (HadCRUT3, blue curve) and a modern global temperature record updated to June 2016 (RSS MSU record, black line).

The RSS MSU record, a global temperature estimate based on satellite measurements, was linearly rescaled to fit the original HadCRUT3 global surface temperature record for optimal comparison. Other global temperature reconstructions perform similarly. Note that HadCRUT3 was discontinued in 2014. Figure 13 also shows in green a schematic representation of the IPCC GCM predictions since 2000 [25].


Figure 13. Comparison of the forecast (red-yellow curve) made in Scafetta (2011) [25] against (1) the temperature record used in 2011 (HadCRUT3, blue curve), (2) the IPCC climate model projections since 2000 (green area), and (3) a recent global temperature record (RSS MSU record, black line, linearly rescaled to match the HadCRUT3 from 1979 to 2014). Left axis shows temperature anomalies in degrees Celsius. The temperature record has followed Scafetta’s forecast better than the IPCC’s. In 2015-2016 a strong El Niño natural warming in the Pacific Ocean caused the observed temperature peak.


The considerations emerging from these findings lead to the conclusion that the IPCC climate models severely overestimate the anthropogenic climatic warming, by about a factor of two. I have finally proposed a semi-empirical climate model calibrated to reconstruct the natural climatic variability since Medieval times. I have shown that this model projects very moderate warming until 2040 and warming of less than 2 °C from 2000 to 2100 using the same anthropogenic emission scenarios used by the CMIP5 models: see Figure 12.

This result suggests that climatic adaptation policies, which are less expensive than mitigation policies, could be sufficient to address most of the consequences of climatic change during the 21st century. Similarly, fossil fuels, which have contributed significantly to the development of our societies, can still be used to meet our energy needs until equally efficient alternative energy sources are identified and developed.

Scafetta Briefly Explains the Harmonic Oscillation Theory

“The theory is very simple in words. The solar system is characterized by a set of specific gravitational oscillations due to the fact that the planets are moving around the sun. Everything in the solar system tends to synchronize to these frequencies beginning with the sun itself. The oscillating sun then causes equivalent cycles in the climate system. Also the moon acts on the climate system with its own harmonics. In conclusion we have a climate system that is mostly made of a set of complex cycles that mirror astronomical cycles. Consequently it is possible to use these harmonics to both approximately hindcast and forecast the harmonic component of the climate, at least on a global scale. This theory is supported by strong empirical evidences using the available solar and climatic data.”
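As a hedged illustration of the approach (not Scafetta's actual code), fitting such a harmonic model is a linear least-squares problem once the periods are fixed: each cycle contributes a sine and a cosine column to a design matrix, and the fitted coefficients give amplitude and phase. Synthetic data stands in for the real temperature record here.

```python
import numpy as np

# Candidate periods in years, as in Scafetta (2011)
PERIODS = [9.1, 10.4, 20.0, 60.0]

def harmonic_design(t, periods=PERIODS):
    """Design matrix: a constant column plus a sine/cosine pair per period."""
    cols = [np.ones_like(t)]
    for p in periods:
        w = 2 * np.pi / p
        cols.extend([np.sin(w * t), np.cos(w * t)])
    return np.column_stack(cols)

# Synthetic annual "anomalies": a 60-year cycle plus noise
t = np.arange(1850.0, 2017.0)
rng = np.random.default_rng(1)
y = 0.3 * np.sin(2 * np.pi * t / 60.0) + rng.normal(0, 0.1, t.size)

# Calibrate on the historical span, then extrapolate as the forecast
coef, *_ = np.linalg.lstsq(harmonic_design(t), y, rcond=None)
forecast = harmonic_design(np.arange(2017.0, 2041.0)) @ coef
```

The key design choice is that extrapolation of fixed-period harmonics is well defined, which is what makes hindcast and forecast symmetric in this framework; whether the climate actually contains such stable harmonics is the contested question.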

Footnote: Scafetta is not alone. Dr. Norman Page has a new paper going into detail about forecasting climate by means of solar-astronomical patterns.

The coming cooling: Usefully accurate climate forecasting for policy makers

Meet Richard Muller, Lukewarmist

Richard Muller, head of the Berkeley Earth project, gives a fair and balanced response to a question regarding the “97% consensus.” Are any of the US Senators listening? Full text below from Forbes, 97%: An Inconvenient Truth About The Oft-Cited Polling Of Climate Scientists, including a reference to Will Happer, potentially Trump’s science advisor.

Read it and see that he sounds a lot like Richard Lindzen.

What are some widely cited studies in the news that are false?

Answer by Richard Muller, Professor of Physics at UC Berkeley, on Quora:

That 97% of all climate scientists accept that climate change is real, large, and a threat to the future of humanity. That 97% basically concur with the vast majority of claims made by Vice President Al Gore in his Nobel Peace Prize winning film, An Inconvenient Truth.

The question asked in typical surveys is neither of those. It is this: “Do you believe that humans are affecting climate?” My answer would be yes. Humans are responsible for about a 1 degree Celsius rise in the average temperature in the last 100 years. So I would be included as one of the 97% who believe.

Yet the observed changes that are scientifically established, in my vast survey of the science, are confined to temperature rise and the resulting small (4-inch) rise in sea level. (The huge “sea level rise” seen in Florida is actually subsidence of the land mass, and is not related to global warming.) There is no significant change in the rate of storms, or of violent storms, including hurricanes and tornadoes. The temperature variability is not increasing. There is no scientifically significant increase in floods or droughts. Even the widely reported warming of Alaska (“the canary in the mine”) doesn’t match the pattern of carbon dioxide increase–it may have an explanation in terms of changes in the northern Pacific and Atlantic currents. Moreover, the standard climate models have done a very poor job of predicting the temperature rise in Antarctica, so we must be cautious about the danger of confirmation bias.

My friend Will Happer believes that humans do affect the climate, particularly in cities where concrete and energy use cause what is called the “urban heat island effect.” So he would be included in the 97% who believe that humans affect climate, even though he is usually included among the more intense skeptics of the IPCC. He also feels that humans cause a small amount of global warming (he isn’t convinced it is as large as 1 degree), but he does not think it is heading towards a disaster; he has concluded that the increase in carbon dioxide is good for food production, and has helped mitigate global hunger. Yet he would be included in the 97%.

The problem is not with the survey, which asked a very general question. The problem is that many writers (and scientists!) look at that number and mischaracterize it. The 97% number is typically interpreted to mean that 97% accept the conclusions presented in An Inconvenient Truth by former Vice President Al Gore. That’s certainly not true; even many scientists who are deeply concerned by the small global warming (such as me) reject over 70% of the claims made by Mr. Gore in that movie (as did a judge in the UK; see the following link: Gore climate film’s nine ‘errors‘).

The pollsters aren’t to blame. Well, some of them are; they too can do a good poll and then misrepresent what it means. The real problem is that many people who fear global warming (include me) feel that it is necessary to exaggerate the meaning of the polls in order to get action from the public (don’t include me).

There is another way to misrepresent the results of the polls. Yes, 97% of those polled believe that there is human caused climate change. How did they reach that decision? Was it based on a careful reading of the IPCC report? Was it based on their knowledge of the potential systematic uncertainties inherent in the data? Or was it based on their fear that opponents to action are anti-science, so we scientists have to get together and support each other? There is a real danger in people with Ph.D.s joining a consensus that they haven’t vetted professionally.

I like to ask scientists who “believe” in global warming what they think of the data. Do they believe hurricanes are increasing? Almost never do I get the answer “Yes, I looked at that, and they are.” Of course they don’t say that, because if they did I would show them the actual data! Do they say, “I’ve looked at the temperature record, and I agree that the variability is going up”? No. Sometimes they will say, “There was a paper by Jim Hansen that showed the variability was increasing.” To which I reply, “I’ve written to Jim Hansen about that paper, and he agrees with me that it shows no such thing. He even expressed surprise that his paper has been so misinterpreted.”

A really good question would be: “Have you studied climate change enough that you would put your scientific credentials on the line that most of what is said in An Inconvenient Truth is based on accurate scientific results?” My guess is that a large majority of the climate scientists would answer no to that question, and the true percentage of scientists who support the statement I made in the opening paragraph of this comment would be under 30%. That is an unscientific guestimate, based on my experience in asking many scientists about the claims of Al Gore.

This question originally appeared on Quora, the place to gain and share knowledge, empowering people to learn from others and better understand the world.

Compare Muller’s statement with a short video by Lindzen.


Precipitation Misunderstandings


A previous post on Temperature Misunderstandings addressed mistaken notions about the meaning of temperature measurements and records. This post looks at rainfall, the other primary determinant of climates. For this topic California provides the means for everyone to see how misconceptions arise, and how to see precipitation statistics in context.

Lessons learned from the end of California’s “permanent drought”

A report by Larry Kummer documents how extensively California’s recent shortage of water was proclaimed as a “permanent drought”. And it goes on to document how El Nino conditions have ended the water shortage.

Status of the California drought

“During the past week, a series of storms bringing widespread rain and snow showers impacted the states along the Pacific Coast and northern Rockies. In California, the cumulative effect of several months of abundant precipitation has significantly improved drought conditions across the state.”
— US Drought monitor – California, February 9.

Precipitation over California in the water year so far (October 1 to January 31) is 178% of average for this date. The snowpack is 179% of average, as of Feb 8. Our reservoirs are at 125% of average capacity. See the bottom line summary as of February 7, from the US Drought monitor for California.

The improvement has been tremendous. The area with exceptional drought conditions has gone year over year from 38% of California to 0%, extreme drought from 23% to 1%, and severe drought from 20% to 10%; meanwhile, dry and moderate drought went from 18% to 48%, and no drought from <1% to 41%. See the map below. And the rain continues to fall.
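As a quick consistency check (my own arithmetic, not from the Drought Monitor), the quoted category shares for each year should cover the whole state:

```python
# US Drought Monitor shares of California area (%), year over year,
# as quoted above; the "<1%" no-drought figure is rounded up to 1.
year_ago = {"exceptional": 38, "extreme": 23, "severe": 20,
            "dry and moderate": 18, "none": 1}
current = {"exceptional": 0, "extreme": 1, "severe": 10,
           "dry and moderate": 48, "none": 41}

for shares in (year_ago, current):
    assert sum(shares.values()) == 100  # categories cover the state
```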

In addition there is the saga of Oroville dam threatened by its reservoir overfilling.

Confusing Weather and Climate

As with temperature, rainy weather is not climate; neither is fair, sunny weather. Precipitation is variable within any particular climate, with the seasons and on decadal and multi-decadal bases. For context on precipitation patterns around the world see Here Comes the Rain Again.

It is a mistake to call a temporary lack of rain a drought, or worse a permanent drought, and equally a mistake to call a return of rainfall the end of a drought. California’s history as a desert environment does not change just because politicians and the public have short memories.

H/T to Eric Simpson for reminding us of that history:

There is also this perceptive comment by tomholsinger

I wouldn’t be so quick about the drought ending. Droughts are ALWAYS multi-season events. I was very impressed by the references below, which made the point that the 20th Century average of ~200 million acre feet of precipitation in California (rain and snow combined) is way more than the average of ~140 million acre feet over the last 2000 years.

Drying of the West, National Geographic

The West without Water: What Past Floods, Droughts, and Other Climatic Clues Tell Us about Tomorrow, Ingram, B. Lynn, and Malamud-Roam, Frances, 2013, University of California Press

Tom goes on to quote himself from a Modesto Bee op-ed almost two years ago.

Global warming has nothing to do with this – history is bad enough. A long-standing pre-industrial regional climate fluctuation seems underway, returning us from the wettest century in the past 1000 years to at least the historic average of much less (~70%) rain and snow. Many paleoclimatologists believe we are entering a still worse mega-drought.

An extreme drought by historic standards means a drop to 35-40% of the 20th Century average for 10-20 years. California has experienced two centuries-long such extreme mega-droughts in the past 2000 years.

Our average 20th Century precipitation (rain and snow combined) produced about 200 million acre feet of water annually over the whole state. 118 million acre feet went to nature in 2000, and 82 million was allocated by humans – the first 39 million for federal mandates, 9 million was used by people and industry, and the last 34 million for irrigation. A drop to the historic average of ~140 million acre feet over the past 2000 years means extinction for California agriculture – it would bear almost all the burden of the decrease even if the federal water is released. An extreme drought means a drop to about 75 million acre feet, and we might be starting 1-2 centuries of that.

This is happening to the entire Southwest. About 20 million acre feet of the Southwest’s precipitation annually entered the Colorado River in the 20th Century, of which ~12 million is currently withdrawn by Americans. Colorado River flow too has averaged much less over the past 2000 years (12-14 million acre feet annually), and it drops to 7-8 million in droughts which sometimes last centuries.

A drop to only the historic average precipitation over the past 2000 years means catastrophe for the Southwest. 2/3 of the very wet 20th Century average is normal for the entire area. We can expect ALL of California’s allotment of Colorado River to be diverted to urban areas in Arizona and Nevada in the decades of drought the region seems to be entering.
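Tom's acre-feet bookkeeping can be checked directly. The sketch below (my arithmetic on the figures he quotes, not his) shows why a drop to the 2000-year average would fall almost entirely on irrigation:

```python
# California 20th Century average precipitation yield, million acre-feet (MAF)
TOTAL = 200

# Allocation in 2000, per the op-ed quoted above (MAF)
nature = 118
human = {"federal mandates": 39, "people and industry": 9, "irrigation": 34}
assert nature + sum(human.values()) == TOTAL   # 118 + 82 = 200

# Drop to the 2000-year historic average of ~140 MAF: if nature's share and
# non-farm human uses are held fixed, the shortfall lands on irrigation.
historic_avg = 140
shortfall = TOTAL - historic_avg               # 60 MAF lost
irrigation_left = human["irrigation"] - shortfall
# irrigation_left is negative: even zeroing out irrigation (34 MAF) cannot
# absorb the 60 MAF shortfall without releasing federally mandated water.
```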


As with temperatures, changes in precipitation are misinterpreted when taken out of historical context. This is usually done to hype a sociopolitical agenda by distracting people from the baseline realities to which we can only adapt, not prevent.

The rainfall measures above show that California enjoyed an unusually wet century and it would have been prudent to take advantage of it by storing water resources. As the fable tells us, grasshoppers live for today, ants prepare for tomorrow.

AMO: Atlantic Climate Pulse

I was inspired by David Dilley’s weather forecasting based upon Atlantic water pulsing into the Arctic Ocean (see post: Global Weather Oscillations). So I went looking for that signal in the AMO dataset, our best long-term measure of sea surface temperature variations in the North Atlantic.


For this purpose, I downloaded the AMO Index from Kaplan SST v.2, the unaltered, non-detrended dataset. By definition, the data are monthly average SSTs interpolated to a 5×5 grid over the North Atlantic basin, roughly 0 to 70N.

For an overview the graph below presents a comparison between Annual, March and September averages from 1856 to 2016 inclusive.


We see about a 4°C difference between the cold month of March and warm September. The overall trend is slightly positive at 0.27°C per century, about 10% higher in September and 10% lower in March. It is also clear that the monthly patterns closely resemble the annual pattern, so it is reasonable to look more closely at the annual variability.
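The seasonal spread and trend quoted above are easy to reproduce once the monthly series is in hand. The sketch below uses synthetic monthly SSTs with the same gross features (a cold March, a warm September, a small upward trend) rather than the actual Kaplan file:

```python
import numpy as np

years = np.arange(1856, 2017)                 # 1856-2016 inclusive
months = np.arange(1, 13)
rng = np.random.default_rng(2)

# Synthetic monthly SSTs: seasonal cycle peaking near September (month 9),
# bottoming near March, plus a 0.27 C/century trend and some noise.
frac = (months[None, :] - 0.5) / 12.0
sst = (20.0
       + 2.0 * np.cos(2 * np.pi * (frac - 0.708))
       + 0.0027 * (years[:, None] - years[0])
       + rng.normal(0, 0.1, (years.size, months.size)))

annual = sst.mean(axis=1)
march, september = sst[:, 2], sst[:, 8]       # columns are Jan..Dec

# Linear trend of the annual mean, in degrees C per century
trend = np.polyfit(years, annual, 1)[0] * 100
```

On the real Kaplan series the same three lines (annual mean, month columns, `polyfit` slope) produce the figures discussed above; the numbers here are synthetic by construction.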

The details of the Annual fluctuations in AMO reveal the pulse pattern suggested by Dilley.


We note firstly the classic pattern of temperature cycles seen in all datasets featuring quality-controlled, unadjusted data: a low in 1913, a high in 1944, a low in 1975, and a high in 1998. Also evident are the matching El Niño years 1998, 2009 and 2016, indicating that what happens in the Pacific does not stay in the Pacific.

Most interesting is the periodic peaking of the AMO in the 8 to 10 year time frame. The arrows indicate the peaks, which as Dilley describes produce a greater influx of warm Atlantic water under the Arctic ice. And as we know from historical records and naval ice charts, Arctic ice extents were indeed low in the 1930s, high in the 1970s, low in the 1990s, and on a plateau presently.


I am intrigued by, but do not yet subscribe to, the lunisolar explanation for these pulses; still, the AMO index does provide an impressive indication of the North Atlantic’s role as a climate pacemaker. Oceans make up 71% of the planet’s surface, so SSTs directly drive global mean temperatures (GMT). But beyond the math, Atlantic pulses set up oscillations in the Arctic that impact the world.

In the background is a large scale actor, the Atlantic Meridional Overturning Circulation (AMOC) which is the Atlantic part of the global “conveyor belt” moving warm water from the equatorial oceans to the poles and back again.  For more on this deep circulation pattern see Climate Pacemaker: The AMOC

Global Weather Oscillations

H/T to No Tricks Zone for posting (here) on the remarkable forecasting record of Global Weather Oscillations Inc., founded by David Dilley. The ability to predict storm activity demonstrates an understanding of earth’s climate system dynamics. The theory and supporting evidence are available to all in a free ebook, Natural Climate Pulse.

The heart of the matter seems to be Mr. Dilley’s extraction of decadal variations in weather activity from the very long-term Milankovitch Cycles. From the ebook, pp. 16 ff.

Earth’s Natural Rhythm and Global Warming -Cooling Cycles

After researching various elements of the Milankovitch Cycles, Mr. Dilley found that specific sub-cycles which are called the “Lunisolar Precession” are a major factor in determining and maintaining the earth’s natural climate rhythm. It is the Lunisolar Precession that controls almost all of earth’s climate cycles, and it is well known throughout the climatological science community, that specific “Milankovitch Cycles” are the primary mechanism that controls glacial and interglacial periods on earth. If it were not for the gravitational tidal field of the moon, and the electromagnetic and gravitational tidal field of the sun, earth would spin out of control (ref: 23). It is these two bodies that keep earth’s orbit and tilt within certain limits, and provide earth’s climate cycles.

Mr. Dilley researched the Lunisolar Precession cycles for over 20 years, and correlated specific cycles to recurring cycles of climate. GWO incorporated his findings into climate – weather forecast models which provide a unique approach and extremely accurate long range cycle predictions for historical major earthquakes, regional hurricane landfalls many years in advance, historical floods, droughts, natural carbon dioxide cycles, global warming and global cooling cycles.


Figure 16 shows the approximate 9-year Lunisolar gravitational cycle. It is this cycle that is a major contributor to earth’s climate cycles. (Created by Global Weather Oscillations Inc.)

During the 1998 Global Warming Peak, the warm pulse occurred from 1990-93 and again 2004-07, and warmed the Arctic waters below the ice caps up to 1 degree Celsius above normal. The Arctic Boundary Current from the Atlantic provides the largest input of water, heat, and salt into the Arctic Ocean; the total quantity of heat is substantial, enough to melt the Arctic sea ice cover several times over.
Courtesy: “Fate of Early 2000s Arctic Warm Water Pulse,” Igor V. Polyakov, Vladimir A. Alexeev, et al., Bulletin of the American Meteorological Society, Vol. 92, No. 5, May 2011.


Figure 17 shows the North Atlantic warm water pulse (Ref: 41) that enters the Arctic Ocean coinciding with the 9-year Lunisolar Pulse shown as the red dots in Figure 16. (Created by Global Weather Oscillations Inc.)

Thus it is likely the approximately 9-year Lunisolar gravitational tidal pulse that sets up a rhythm, or heartbeat, for earth. During the recurring 230-year global warming cycles a very strong gravitational pulse acts like a plunger in the North Atlantic, causing a warm water pulse to surge into the Arctic Ocean. It takes the warm water 13 years to circulate around the Arctic Ocean (Ref: 43), gradually cooling during that period as it mixes with cooler water. It is this pulse that melts the Arctic ice from the bottom up and eventually causes open waters to appear as melting continues during the lifespan of the pulse.


Figure 18 shows United States temperatures (red line) from 1880 on the left to the year 2008. Notice an approximate 9-year rhythm for temperatures in the United States: peaks every 8 to 10 years, very similar to the 9-year Lunisolar cycle. (Created by Global Weather Oscillations Inc.)

The strongest pulses are separated by 72 years during the 230-year global warming episode. For instance, a very warm water pulse caused 10 years of warm global temperatures in the 1930s, and a second very warm pulse 72 years later caused 10 years of warm global temperatures from 1998 to 2008. This approximate 9-year pulse also corresponds closely with temperature pulses around the world. If we extend the Lunisolar Precession 9-year pulse out to an approximate 230-year pulse (full moon cycle only shown here), we get a clear picture of the relationship of the Lunisolar pulse to global warming cycles, which occur approximately every 230 years.


Any theory stands or falls on the success of its predictions about the subject system’s behavior. Dilley is earning respect for his understanding of earth’s climate system. We should also note that his analysis anticipates a cooling period in the next decades, something not foreseen by any climate model builder.

Climate Poppycock

pop·py·cock /ˈpäpēˌkäk/ informal noun: nonsense.
Synonyms: nonsense, rubbish, claptrap, balderdash, blather, moonshine, garbage.

Origin: mid 19th century: from Dutch dialect pappekak, from pap ‘soft’ + kak ‘dung.’

This is obviously the linguistically correct term for most of the articles on climate published in the mainstream media. And it serves to describe perfectly the output from alarmist activists.

Exhibit A is provided by Ken Ward, leader of the “valve turners” and defendant facing felony charges in Washington state.

This week he succeeded in convincing a juror to refuse to convict him, because in his defense he “put up a map of Skagit County, about a third of which will be under water in 2050.”

I call “Poppycock.” A study from the U. of Washington came up with a range of 1″ to 18″ of sea level rise by 2050 for coastal Washington state. Not only will that not flood the place; the range tells you they are shooting in the dark.

For a deeper look into this phenomenon, see Post-Truth Climatism

Fact: Future Climate Will Be Flatter, not Hotter

Another powerful post by Clive Best on how earth’s surface temperatures change by means of changing meridional heat transfers: Meridional Warming.

The key point for me was seeing how the best geological knowledge proves beyond the shadow of a doubt how the earth’s climate profile shifts over time, as presented in the diagram above. It comes from esteemed paleoclimatologist Christopher Scotese. His complete evidence and analysis can be reviewed in his article Some Thoughts on Global Climate Change: The Transition from Icehouse to Hothouse (here).

In that essay Scotese shows where we are presently in this cycle between icehouse and hothouse.

As of 2015 earth shows a GMT of 14.4°C, compared to a pre-industrial GMT of 13.8°C. According to the best geological evidence from millions of years of earth’s history, that puts us in the category “Severe Icehouse.” So, thankfully, we are warming up, albeit very slowly.

Moreover, and this is Clive Best’s point, progress toward a warmer world means flattening the profile at the higher latitudes, especially the Arctic. Equatorial locations remain at 23°C throughout the millennia, while the latitudinal gradient decreases in a warmer world.

The previous post explained what is wrong with averaging temperature anomalies.  See Temperature Misunderstandings


We have many, many centuries to go before the earth can warm up to the “Greenhouse” profile, let alone get to “Hothouse.”  Regional and local climates at higher latitudes will see slightly warming temperatures and smaller differences from equatorial climates.  These are facts based on solid geological evidence, not opinions or estimates from computer models.

It is still a very cold world, but we are moving in the right direction.  Stay the course.

Meanwhile, keep firing away Clive.