Global Warming Abates in Autumn

Hot, Hot, Hot.  You will have noticed that the term “climate change” is now synonymous with “summer”.  Since the northern hemisphere is where most of the world’s land, people and media are located, two typical summer months (June was not so hot) have been depicted as the fires of hell awaiting any and all who benefit from fossil fuels. If you were wondering what the media would do, apart from obsessing over the many small storms this year, you are getting the answer.

Fortunately, Autumn is on the way and already bringing cooler evenings in Montreal where I live. Once again open windows provide fresh air for sleeping, while mornings are showing condensation, and sometimes frost. This year’s period of “climate change” is winding down.  Unless, of course, we get some hurricanes in the next two months.  Below is a repost of seasonal changes in temperature and climate for those who may have been misled by the media reports of a forever hotter future.

[Note:  The text below refers to human migratory behavior now prohibited because, well Coronavirus.]

[Image: geese in V formation]

Autumnal Climate Change

Seeing a lot more of this lately, along with hearing the geese  honking. And in the next month or so, we expect that trees around here will lose their leaves. It definitely is climate change of the seasonal variety.

Interestingly, the science on this is settled: It is all due to reduction of solar energy because of the shorter length of day (LOD). The trees drop their leaves and go dormant because of less sunlight, not because of lower temperatures. The latter is an effect, not the cause.

Of course, the farther north you go, the more remarkable the seasonal climate change. St. Petersburg, Russia has their balmy “White Nights” in June when twilight is as dark as it gets, followed by the cold, dark winter and a chance to see the Northern Lights.

And as we have been monitoring, the Arctic ice has been melting from sunlight in recent months, but is already building again in the twilight, to reach its maximum in March under the cover of darkness.

We can also expect in January and February for another migration of millions of Canadians (nicknamed “snowbirds”) to fly south in search of a summer-like climate to renew their memories and hopes. As was said to me by one man in Saskatchewan (part of the Canadian wheat breadbasket region): “Around here we have Triple-A farmers: April to August, and then Arizona.” Here’s what he was talking about: Quartzsite Arizona annually hosts 1.5M visitors, mostly between November and March.

Of course, this is just North America. Similar migrations occur in Europe, and in the Southern Hemisphere the climate is changing in the opposite direction: it is springtime there now. Since it is so obviously the sun causing this seasonal change, the question arises: Does the sunlight vary on longer than annual timescales?

The Solar-Climate Debate

And therein lies a great, enduring controversy between those (like the IPCC) who dismiss the sun as a driver of multi-decadal climate change, and those who see a connection between solar cycles and Earth’s climate history. One side can be accused of ignoring the sun because of a prior commitment to CO2 as the climate “control knob”.

The other side is repeatedly denounced as “cyclomaniacs” in search of curve-fitting patterns to prove one or another thesis. It is also argued that a claim of 60-year cycles cannot be validated with only 150 years or so of reliable data. That point has weight, but it is usually made by those on the CO2 bandwagon, despite temperature and CO2 trends correlating for only 2 decades during the last century.

One scientist in this field is Nicola Scafetta, who presents the basic concept this way:

“The theory is very simple in words. The solar system is characterized by a set of specific gravitational oscillations due to the fact that the planets are moving around the sun. Everything in the solar system tends to synchronize to these frequencies beginning with the sun itself. The oscillating sun then causes equivalent cycles in the climate system. Also the moon acts on the climate system with its own harmonics. In conclusion we have a climate system that is mostly made of a set of complex cycles that mirror astronomical cycles. Consequently it is possible to use these harmonics to both approximately hindcast and forecast the harmonic component of the climate, at least on a global scale. This theory is supported by strong empirical evidences using the available solar and climatic data.”

He goes on to say:

“The global surface temperature record appears to be made of natural specific oscillations with a likely solar/astronomical origin plus a noncyclical anthropogenic contribution during the last decades. Indeed, because the boundary condition of the climate system is regulated also by astronomical harmonic forcings, the astronomical frequencies need to be part of the climate signal in the same way the tidal oscillations are regulated by soli-lunar harmonics.”

He has concluded that “at least 60% of the warming of the Earth observed since 1970 appears to be induced by natural cycles which are present in the solar system.” For the near future he predicts a stabilization of global temperature and cooling until 2030-2040.
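
To make the idea concrete, here is a minimal sketch of the kind of harmonic curve-fitting Scafetta describes: a linear trend plus a few fixed-period cosines fitted to a temperature anomaly series by least squares, then extrapolated. The periods and the synthetic data below are placeholder assumptions for illustration, not his published model.

```python
import numpy as np

# Illustrative sketch only: fit a linear trend plus a few fixed-period
# harmonics to a temperature anomaly series by least squares, then
# extrapolate the harmonic component. Periods and data are placeholders.

def harmonic_design(t, periods, t_ref):
    """Columns: constant, linear trend, and a cos/sin pair for each period."""
    cols = [np.ones_like(t), t - t_ref]
    for p in periods:
        w = 2.0 * np.pi / p
        cols.append(np.cos(w * t))
        cols.append(np.sin(w * t))
    return np.column_stack(cols)

# Synthetic stand-in for an annual global anomaly record (deg C)
years = np.arange(1880, 2021, dtype=float)
rng = np.random.default_rng(0)
anomaly = (0.007 * (years - 1950)
           + 0.10 * np.sin(2 * np.pi * years / 60.0)
           + 0.05 * rng.standard_normal(years.size))

periods = [9.1, 10.4, 20.0, 60.0]   # assumed cycle lengths in years
t_ref = years.mean()
X = harmonic_design(years, periods, t_ref)
coef, *_ = np.linalg.lstsq(X, anomaly, rcond=None)

fit = X @ coef                                    # hindcast of the harmonic component
future = np.arange(2021, 2041, dtype=float)
projection = harmonic_design(future, periods, t_ref) @ coef  # naive extrapolation
print(f"RMS residual: {np.sqrt(np.mean((anomaly - fit) ** 2)):.3f} C")
```

The point is only to show the mechanics of hindcasting and forecasting a harmonic component; whether such cycles are physically real is exactly what the debate here is about.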

For more see Scafetta vs. IPCC: Dueling Climate Theories

A Deeper, but Accessible Presentation of Solar-Climate Theory

I have found this presentation by Ian Wilson to be persuasive while honestly considering all of the complexities involved.

The author raises the question: What if there is a third factor that not only drives the variations in solar activity that we see on the Sun but also drives the changes that we see in climate here on the Earth?

The linked article is quite readable by a general audience, and comes to a similar conclusion as Scafetta above: There is a connection, but it is not simple cause and effect. And yes, length of day (LOD) is a factor beyond the annual cycle.

Click to access IanwilsonForum2008.pdf

It is fair to say that we are still at the theorizing stage of understanding a solar connection to earth’s climate. And at this stage, investigators look for correlations in the data and propose theories (explanations) for what mechanisms are at work. Interestingly, despite the lack of interest from the IPCC, solar and climate variability is a very active research field these days.

For example, Svensmark now has a cosmoclimatology theory supported by empirical studies, described in more detail at the red link.

A summary of recent studies is provided at NoTricksZone: Since 2014, 400 Scientific Papers Affirm A Strong Sun-Climate Link

Ian Wilson has much more to say at his blog: http://astroclimateconnection.blogspot.com.au/

Once again, it appears that the world is more complicated than a simple cause and effect model suggests.

Fluctuations in observed global temperatures can be explained by a combination of oceanic and solar cycles.  See the engineering analysis from first principles, Quantifying Natural Climate Change.

For everything there is a season, a time for every purpose under heaven.

What has been will be again, what has been done will be done again;
there is nothing new under the sun.
(Ecclesiastes 3:1 and 1:9)

Footnote:

[Cartoon: jimbob child activist]

Red Flag: Ontario’s Green Energy Debacle

Babatunde Williams writes at Spiked, Ontario’s green-energy catastrophe.  Excerpts in italics with my bolds.

A transition to renewables sent energy prices soaring, pushed thousands into poverty and fueled a populist backlash.

In February 2009, Ontario passed its Green Energy Act (GEA). It was signed a week after Obama’s Economic Recovery and Reinvestment Act in the US, following several months of slow and arduous negotiations. It also had grand plans to start a ‘green’ recovery following the financial crash – although on a more modest scale.

This was the plan: increased integration of wind and solar energy into Ontario’s electricity grid would shut down coal plants and create 50,000 green jobs in the first three years alone.

Additionally, First Nations communities would manage their own electricity supply and distribution – what observers would later call the ‘decolonisation’ of energy – empowering Canada’s indigenous communities who had been disenfranchised by historical trauma. Lawmakers promised that clean and sustainable energy provided by renewables would also reduce costs for poorer citizens. This won an endorsement from Ontario’s Low Income Energy Network – a group which campaigns for universal access to affordable energy.

But on 1 January, 2019, Ontario repealed the GEA, one month before its 10th anniversary. The 50,000 guaranteed jobs never materialised. The ‘decolonisation’ of energy didn’t work out, either. A third of indigenous Ontarians now live in energy poverty. Ontarians watched in dismay as their electricity bills more than doubled during the life of the GEA. Their electricity costs are now among the highest in North America.

To understand how the GEA went irreparably wrong, we must look at Ontario’s contracts with its green-energy suppliers. Today, Ontario’s contracts guarantee to electricity suppliers that they ‘will be paid for each kWh of electricity generated from the renewable energy project’, regardless of whether this electricity is consumed. As preposterous as this may seem, it’s actually an improvement on many of the original contracts the Ontario government locked itself into.

Earlier contracts guaranteed payments that benchmarked close to 100 per cent of the supplier’s capacity, rather than the electricity generated. So if a participating producer supplied only 33 per cent of its capacity in a given year, the state would still pay it as if it had produced 100 per cent.
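
As a rough illustration of the difference between the two contract structures, here is a sketch with made-up numbers; the capacity, capacity factor and price below are assumptions for the example, not Ontario’s actual contract terms.

```python
# Hypothetical illustration of the two contract structures described above.
# All figures are invented for the example; they are not Ontario's rates.

capacity_mw = 100.0          # nameplate capacity of a wind farm
hours_per_year = 8760
capacity_factor = 0.33       # produced only a third of what it could
price_per_mwh = 135.0        # assumed guaranteed contract price, $/MWh

generated_mwh = capacity_mw * hours_per_year * capacity_factor

# Later contracts: pay for every MWh actually generated (even if unused)
pay_per_generation = generated_mwh * price_per_mwh

# Earlier contracts: pay as if ~100% of capacity had been delivered
pay_per_capacity = capacity_mw * hours_per_year * price_per_mwh

print(f"Paid on generation: ${pay_per_generation:,.0f}")
print(f"Paid on capacity:   ${pay_per_capacity:,.0f}")
print(f"Overpayment factor: {pay_per_capacity / pay_per_generation:.1f}x")
```

With a 33 per cent capacity factor, the capacity-benchmarked contract pays roughly three times what a generation-based contract would.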

This was especially alarming in context, as 97 per cent of the applicants to the GEA programme were using wind or solar energy. These are both intermittent forms of energy. In an hour, day or month with little wind or sun, wind and solar farms can’t supply the grid with electricity, and other sources are needed for back-up. As a result, wind and solar electricity providers can only supplement the grid but cannot replace consistently reliable power plants like gas or nuclear.

Many governments, including other Canadian provinces, have used subsidies of all hues to incentivise renewables. But Ontario put this strategy on steroids. For example, the Council for Clean and Reliable Energy found that ‘in 2015, Ontario’s wind farms operated at less than one-third capacity more than half (58 per cent) the time’. Regardless, Ontarians paid multiple contracts as if wind farms had operated at full capacity all year round. To add insult to injury, Ontario’s GEA contracts guaranteed exorbitant prices for renewable energy – often at up to 40 times the cost of conventional power for 20 years.

By 2015, Ontario’s auditor general, Bonnie Lysyk, concluded that citizens had paid ‘a total of $37 billion’ above the market rate for energy. They were even ‘expected to pay another $133 billion from 2015 to 2032’, again, ‘on top of market valuations’. (One steelmaker has taken the Ontarian government to court for these exorbitant energy costs.)

Today, this problem persists.  Furthermore, electricity demand from ratepayers declined between 2011 and 2015, and has continued to fall. Ontarians were forced to pay higher prices for new electricity capacity, even as their consumption was going down.

Ontario’s auditor general in 2015 stated that: ‘The implied cost of using non-hydro renewables to reduce carbon emissions in the electricity sector was quite high: approximately $257 million [£150 million] for each megatonne of emissions reduced.’ Per tonne of carbon reduced, the Ontario scheme has cost 48 per cent more than Sweden’s carbon tax – the most expensive carbon tax in the world.

Clearly, bad policy has led to exorbitant waste. This wasn’t the result of corruption or conspiracy – it was sheer incompetence. It’s a meandering story of confusion and gross policy blunders that will fuel energy poverty in Ontario for at least another decade.

As democracies across the West respond to the coronavirus crisis with hastily prepared financial packages for a ‘green recovery’, they should consider the cautionary tale of Ontario.

The GEA’s stubborn defenders refuse to recognise that poor policy, even with the best intentions, discredits future efforts at cutting emissions. ‘Green New Deals’ for the post-pandemic recovery in the US and Europe should learn from the GEA. Clean energy at any cost will be rightfully short-lived and repealed, and its supporters will be unceremoniously booted out of power.

See Also:  Electrical Madness in Green Ontario

 

On Non-Infectious Covid Positives

Daniel Payne writes at Just the News, Growing research indicates many COVID-19 cases might not be infectious at all. Excerpts in italics with my bolds.

Elevated ‘cycle thresholds’ may be detecting virus long after it is past the point of infection.

A growing body of research suggests that a significant number of confirmed COVID-19 infections in the U.S. — perhaps as many as 9 out of every 10 — may not be infectious at all, with much of the country’s testing equipment possibly picking up mere fragments of the virus rather than full-blown infections.

Yet a burgeoning line of scientific inquiry suggests that many confirmed infections of COVID-19 may actually be just residual traces of the virus itself, a contention that — if true — may suggest both that current high levels of positive viruses are clinically insignificant and that the mitigation measures used to suppress them may be excessive.

Background from previous post: New Better and Faster Covid Test

Kevin Pham reports on a breakthrough in coronavirus testing. Excerpts in italics with my bolds.

Another new test for COVID-19 was recently authorized — and this one could be a game-changer.

The Abbott Diagnostics BinaxNOW antigen test is a new point-of-care test that reportedly costs only $5 to administer, delivers results in as little as 15 minutes, and requires no laboratory equipment to perform. That means it can be used in clinics far from commercial labs or without relying on a nearby hospital lab.

That last factor is key. There are other quick COVID-19 tests on the market, but they have all required lab equipment that can be expensive to maintain and operate, and costs can be prohibitive in places that need tests most.

This kind of test is reminiscent of rapid flu tests that are ubiquitous in clinics. They’ll give providers tremendous flexibility in testing for the disease not just in clinics but, with trained and licensed medical professionals, in schools, workplaces, camps, or any number of other places.

So what’s new about this test? Most of the current tests detect viral RNA, the genetic material of SARS-CoV-2. This is a very accurate way of detecting the virus, but it requires lab equipment to break apart the virus and amplify the amount of genetic material to high enough levels for detection.

The BinaxNOW test detects antigens — proteins unique to the virus that are usually detectable whenever there is an active infection.

Abbott says it intends to produce 50 million tests per month starting in October. That’s far more than the number tested in July, when we were breaking new testing records on a daily basis with approximately 23 million tests recorded.

There’s a more important reason to be encouraged by this test becoming available.  The viral load is not amplified by the test, so a positive is actually a person needing isolation and treatment.  As explained in a previous post below, the PCR tests used up to now clutter up the record by showing as positive people with viral loads too low to be sick or to infect others.

Background from Previous Post The Truth About CV Tests

The people’s instincts are right, though they have been kept in the dark about this “pandemic” that isn’t.  Responsible citizens are starting to act out their outrage at being victimized by a medical-industrial complex (to update Eisenhower’s warning decades ago).  The truth is, governments are not justified in taking away inalienable rights to life, liberty and the pursuit of happiness.  There are several layers of disinformation involved in scaring the public.  This post digs into the CV tests, and why the results don’t mean what the media and officials claim.

For months now, I have been updating the progress in Canada of the CV outbreak.  A previous post later on goes into the details of extracting data on tests, persons testing positive (termed “cases” without regard for illness symptoms) and deaths after testing positive.  Currently, the contagion looks like this.

The graph shows that deaths are less than 5 a day, compared to a daily death rate of 906 in Canada from all causes.  Also significant is the positivity ratio:  the % of persons testing positive out of all persons tested each day.  That % has been fairly steady for months now:  1% positive means 99% of people are not infected. And this is despite more than doubling the rate of testing.

But what does testing positive actually mean?  Herein lies more truth that has been hidden from the public for the sake of an agenda to control free movement and activity.  Background context comes from  Could Rapid Coronavirus Testing Help Life Return To Normal?, an interview at On Point with Dr. Michael Mina.  Excerpts in italics with my bolds. H/T Kip Hansen

A sign displays a new rapid coronavirus test on the new Abbott ID Now machine at a ProHEALTH center in Brooklyn on August 27, 2020 in New York City. (Spencer Platt/Getty Images)

Dr. Michael Mina:

COVID tests can actually be put onto a piece of paper, very much like a pregnancy test. In fact, it’s almost exactly like a pregnancy test. But instead of looking for the hormones that tell if somebody is pregnant, it looks for the virus proteins that are part of the SARS-CoV-2 virus. And it would be very simple: You’d either swab the front of your nose or you’d take some saliva from under your tongue, for example, and put it onto one of these paper strips, essentially. And if you see a line, it means you’re positive. And if you see no line, it means you are negative, at least for having a high viral load that could be transmissible to other people.

An antigen is one of the proteins in the virus. Unlike the PCR test, which is what most people who have received a test to date have generally received, these tests do not look for the genome of the virus, its RNA; you could think of RNA the same way that humans have DNA. This virus has RNA. But instead of looking for RNA like the PCR test, these antigen tests look for pieces of the protein. It would be like if I wanted a test to tell me, you know, that somebody was a particular individual; it would actually look for features like their eyes or their nose. And in this case, it is looking for different parts of the virus, in general the spike protein or the nucleocapsid, which are two parts of the virus.

The reason that these antigen tests are going to be a little bit less sensitive to detect the virus molecules is because there’s no step that we call an amplification step. One of the things that makes the PCR test that looks for the virus RNA so powerful is that it can take just one molecule, which the sensor on the machine might not be able to detect readily, but then it amplifies that molecule millions and millions of times so that the sensor can see it. These antigen tests, because they’re so simple and so easy to use and just happen on a piece of paper, they don’t have that amplification step right now. And so they require a larger amount of virus in order to be able to detect it. And that’s why I like to think of these types of tests as having their primary advantage in detecting people with enough virus that they might be transmitting or transmissible to other people.

The PCR test provides a simple yes/no answer to the question of whether a patient is infected.
Source: Covid Confusion On PCR Testing: Maybe Most Of Those Positives Are Negatives.

Similar PCR tests for other viruses nearly always offer some measure of the amount of virus. But yes/no isn’t good enough, Mina added; it’s the amount of virus that should dictate the infected patient’s next steps. “It’s really irresponsible, I think, to [ignore this],” Dr. Mina said of how contagious an infected patient may be.

“We’ve been using one type of data for everything,” Mina said, “for [diagnosing patients], for public health, and for policy decision-making.”

The PCR test amplifies genetic matter from the virus in cycles; the fewer cycles required, the greater the amount of virus, or viral load, in the sample. The greater the viral load, the more likely the patient is to be contagious.

The number of amplification cycles needed to find the virus, called the cycle threshold, is never included in the results sent to doctors and coronavirus patients, although if it were, it could give them an idea of how infectious the patients are.

One solution would be to adjust the cycle threshold used now to decide that a patient is infected. Most tests set the limit at 40, a few at 37. This means that you are positive for the coronavirus if the test process required up to 40 cycles, or 37, to detect the virus.

Any test with a cycle threshold above 35 is too sensitive, Juliet Morrison, a virologist at the University of California, Riverside told the New York Times. “I’m shocked that people would think that 40 could represent a positive,” she said.

A more reasonable cutoff would be 30 to 35, she added. Dr. Mina said he would set the figure at 30, or even less.

Another solution, researchers agree, is even more widespread use of Rapid Diagnostic Tests (RDTs), which are much less sensitive and therefore more likely to identify only patients with high levels of virus who are a transmission risk.

Comment:  In other words, when they analyzed the tests that also reported cycle threshold (CT), they found that 85 to 90 percent were above 30. According to Dr. Mina a CT of 37 is 100 times too sensitive (7 cycles too much, 2^7 = 128) and a CT of 40 is 1,000 times too sensitive (10 cycles too much, 2^10 = 1024). Based on their sample of tests that also reported CT, as few as 10 percent of people with positive PCR tests actually have an active COVID-19 infection. Which is a lot less than reported.
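
Here is a small sketch of the cycle-threshold arithmetic in the comment above. The 30-cycle benchmark and the 85 to 90 percent share are taken from the discussion; the rest is illustrative.

```python
# Sketch of the cycle-threshold arithmetic discussed above.
# PCR roughly doubles the target each cycle, so a cutoff of Ct_cutoff cycles
# is about 2**(Ct_cutoff - 30) times more sensitive than a ~30-cycle benchmark.

def fold_oversensitivity(ct_cutoff: int, ct_benchmark: int = 30) -> int:
    """How many times more sensitive a cutoff is than the ~30-cycle benchmark."""
    return 2 ** (ct_cutoff - ct_benchmark)

print(fold_oversensitivity(37))   # 128  -> "100 times too sensitive"
print(fold_oversensitivity(40))   # 1024 -> "1,000 times too sensitive"

# If 85-90% of reported positives needed more than 30 cycles (the figure
# cited above), only 10-15% would reflect a viral load high enough to
# indicate an active, transmissible infection.
share_above_30 = 0.875            # assumed midpoint of the 85-90% range
likely_infectious = 1.0 - share_above_30
print(f"Likely infectious share of positives: {likely_infectious:.0%}")
```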

Here is a graph showing how this applies to Canada.

It is evident that increased testing has resulted in more positives, while the positivity rate is unchanged. Doubling the tests has doubled the positives, up from 300 a day to nearly 600 a day presently.  Note these are PCR results. The discussion above suggests that the number of persons with an active infectious viral load is likely 10% of those reported positive: IOW up from 30 a day to 60 a day.  And in the graph below, the total of actual cases in Canada is likely on the order of 13,000 over the last 7 months, an average of 62 a day.

WuFlu Exposes a Fundamental Flaw in US Health System

Dr. Mina goes on to explain what went wrong in US response to WuFlu:

In the U.S., we have a major focus on clinical medicine, and we have undervalued and underfunded the whole concept of public health for a very long time. We saw an example of this, for example, when we tried to get the state laboratories across the country to be able to perform the PCR tests back in February and March: we very quickly realized that our public health infrastructure in this country just wasn’t up to the task. We had very few labs that were really able to do enough testing to just meet the clinical demands. And so such a reduced focus on public health for so long has led to an ecosystem where our regulatory agencies, this being primarily the FDA, have a mandate to approve clinical medical diagnostic tests. But there’s actually no regulatory pathway that is available or exists — and in many ways, we don’t even have a language for it — for a test whose primary purpose is one of public health and not personal medical health.

That’s really caused a problem. And a lot of times, it’s interesting if you think about the United States, every single test that we get, with the exception maybe of a pregnancy test, has to go through a physician. And so that’s a symptom of a country that has focused, and a society really, that has focused so heavily on the medical industrial complex. And I’m part of that as a physician. But I also am part of the public health complex as an epidemiologist. And I see that sometimes these are at odds with each other, medicine and public health. And this is an example where because all of our regulatory infrastructure is so focused on medical devices… If you’re a public health person, you can actually have a huge amount of leeway in how your tests are working and still be able to get epidemics under control. And so there’s a real tension here between the regulations that would be required for these types of tests versus a medical diagnostic test.

Footnote:  I don’t think the Chinese leaders were focusing on the systemic weakness Dr. Mina mentions.  But you do have to bow to the inscrutable cleverness of the Chinese Communists releasing WuFlu as a means to sow internal turmoil within democratic capitalist societies.  On one side are profit-seeking Big Pharma, aided and abetted by Big Media using fear to attract audiences for advertising revenues.  The panicked public demands protection which clueless government provides by shutting down the service and manufacturing industries, as well as throwing money around and taking on enormous debt.  The world just became China’s oyster.

Background from Previous Post: Covid Burnout in Canada August 28

The map shows that in Canada 9108 deaths have been attributed to Covid19, meaning people who died having tested positive for SARS CV2 virus.  This number accumulated over a period of 210 days starting January 31. The daily death rate reached a peak of 177 on May 6, 2020, and is down to 6 as of yesterday.  More details on this below, but first the summary picture. (Note: 2019 is the latest demographic report)

Canada     | Pop        | Ann Deaths | Daily Deaths | Risk per Person
2019       | 37,589,262 | 330,786    | 906          | 0.8800%
Covid 2020 | 37,589,262 | 9,108      | 43           | 0.0242%

Over the epidemic months, the average Covid daily death rate amounted to 5% of the All Causes death rate. During this time a Canadian had an average risk of 1 in 5000 of dying with SARS CV2, versus a 1 in 114 chance of dying regardless of that infection. As shown further below, the risk varied greatly with age: much lower for younger, healthier people.
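
For transparency, here is the arithmetic behind the summary table above, using the population and death counts quoted in this post (the 210-day epidemic period is the January 31 to August 28 window mentioned earlier).

```python
# Reproducing the arithmetic behind the table above, using the figures
# reported in this post (2019 demographics and Covid counts to Aug 28).

population = 37_589_262
annual_deaths_2019 = 330_786
covid_deaths = 9_108
covid_days = 210                     # Jan 31 to Aug 28

daily_all_causes = annual_deaths_2019 / 365        # ~906 per day
daily_covid = covid_deaths / covid_days            # ~43 per day

risk_all_causes = annual_deaths_2019 / population  # ~0.88% per year
risk_covid = covid_deaths / population             # ~0.024% over the period

print(f"Daily deaths, all causes: {daily_all_causes:.0f}")
print(f"Daily deaths, Covid:      {daily_covid:.0f}")
print(f"Covid share of daily deaths: {daily_covid / daily_all_causes:.0%}")
print(f"Risk per person, all causes: {risk_all_causes:.4%}")
print(f"Risk per person, Covid:      {risk_covid:.4%}")
```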

Background Updated from Previous Post

In reporting on Covid19 pandemic, governments have provided information intended to frighten the public into compliance with orders constraining freedom of movement and activity. For example, the above map of the Canadian experience is all cumulative, and the curve will continue upward as long as cases can be found and deaths attributed.  As shown below, we can work around this myopia by calculating the daily differentials, and then averaging newly reported cases and deaths by seven days to smooth out lumps in the data processing by institutions.

A second major deficiency is lack of reporting of recoveries, including people infected and not requiring hospitalization or, in many cases, without professional diagnosis or treatment. The only recoveries presently to be found are limited statistics on patients released from hospital. The only way to get at the scale of recoveries is to subtract deaths from cases, considering survivors to be in recovery or cured. Comparing such numbers involves the delay between infection, symptoms and death. Herein lies another issue of terminology: a positive test for the SARS CV2 virus is reported as a case of the disease COVID19. In fact, an unknown number of people have been infected without symptoms, and many with very mild discomfort.

August 7 in the UK it was reported (here) that around 10% of coronavirus deaths recorded in England – almost 4,200 – could be wiped from official records due to an error in counting.  Last month, Health Secretary Matt Hancock ordered a review into the way the daily death count was calculated in England citing a possible ‘statistical flaw’.  Academics found that Public Health England’s statistics included everyone who had died after testing positive – even if the death occurred naturally or in a freak accident, and after the person had recovered from the virus.  Numbers will now be reconfigured, counting deaths if a person died within 28 days of testing positive much like Scotland and Northern Ireland…

Professor Heneghan, director of the Centre for Evidence-Based Medicine at Oxford University, who first noticed the error, told the Sun:

‘It is a sensible decision. There is no point attributing deaths to Covid-19 28 days after infection…

For this discussion let’s assume that anyone reported as dying from COVID19 tested positive for the virus at some point prior. From the reasoning above, let us assume that 28 days after testing positive for the virus, survivors can be considered recoveries.

Recoveries are calculated as cases minus deaths with a lag of 28 days. Daily cases and deaths are averages of the seven days ending on the stated date. Recoveries are # of cases from 28 days earlier minus # of daily deaths on the stated date. Since both testing and reports of Covid deaths were sketchy in the beginning, this graph begins with daily deaths as of April 24, 2020 compared to cases reported on March 27, 2020.
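
Below is a minimal sketch of that recovery approximation, assuming a hypothetical daily CSV with columns for date, new tests, new cases and new deaths; the file name and column names are placeholders, not a specific data source.

```python
import pandas as pd

# Sketch of the recovery approximation described above: 7-day averages of
# cases and deaths, with recoveries estimated as cases reported 28 days
# earlier minus deaths on the stated date. File and columns are hypothetical.

df = pd.read_csv("canada_daily.csv", parse_dates=["date"])
df = df.sort_values("date").set_index("date")

df["cases_7d"] = df["new_cases"].rolling(7).mean()
df["deaths_7d"] = df["new_deaths"].rolling(7).mean()
df["tests_7d"] = df["new_tests"].rolling(7).mean()

df["positivity"] = df["cases_7d"] / df["tests_7d"]           # ~1% in the post
df["recoveries"] = df["cases_7d"].shift(28) - df["deaths_7d"]  # 28-day lag

print(df[["cases_7d", "deaths_7d", "positivity", "recoveries"]].tail())
```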

The line shows the Positivity metric for Canada starting at nearly 8% for new cases April 24, 2020. That is, for the 7-day period ending April 24, there was a daily average of 21,772 tests and 1,715 new cases reported. Since then the rate of new cases has dropped, holding steady at ~1% since mid-June. Yesterday, the daily average number of tests was 45,897 with 427 new cases. So despite more than doubling the testing, the positivity rate is not climbing.  Another view of the data is shown below.

The scale of testing has increased and now averages over 45,000 a day, while positive tests (cases) are hovering at 1% positivity.  The shape of the recovery curve resembles the case curve lagged by 28 days, since death rates are a small portion of cases.  The recovery rate has grown from 83% to 99%, holding steady over the last 2 weeks, so that recoveries exceed new positives. This approximation surely understates the number of those infected with SARS CV2 who are healthy afterwards, since antibody studies show infection rates multiples higher than confirmed positive tests (8 times higher in Canada).  In absolute terms, cases are now down to 427 a day and deaths 6 a day, while estimates of recoveries are 437 a day.

The key numbers: 

99% of those tested are not infected with SARS CV2. 

99% of those who are infected recover without dying.

Summary of Canada Covid Epidemic

It took a lot of work, but I was able to produce something akin to the Dutch advice to their citizens.

The media and governmental reports focus on total accumulated numbers which are big enough to scare people to do as they are told.  In the absence of contextual comparisons, citizens have difficulty answering the main (perhaps only) question on their minds:  What are my chances of catching Covid19 and dying from it?

A previous post reported that the Netherlands parliament was provided with the type of guidance everyone wants to see.

For Canadians, the most similar analysis is this one from the Daily Epidemiology Update:

The table presents only those cases with a full clinical documentation, which included some 2194 deaths compared to the 5842 total reported.  The numbers show that under 60 years old, few adults and almost no children have anything to fear.

Update May 20, 2020

It is really quite difficult to find cases and deaths broken down by age groups.  For Canadian national statistics, I resorted to a report from Ontario to get the age distributions, since that province provides 69% of the cases outside of Quebec and 87% of the deaths.  Applying those proportions across Canada results in the following table for the country as a whole:

Age   | Risk of Test + | Risk of Death | Population per 1 CV death
<20   | 0.05%          | None          | NA
20-39 | 0.20%          | 0.000%        | 431,817
40-59 | 0.25%          | 0.002%        | 42,273
60-79 | 0.20%          | 0.020%        | 4,984
80+   | 0.76%          | 0.251%        | 398

In the worst case, if you are a Canadian aged more than 80 years, you have a 1 in 400 chance of dying from Covid19.  If you are 60 to 80 years old, your odds are 1 in 5000.  Younger than that, the risk is only slightly higher than winning (or in this case, losing) the lottery.
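
To see how the “population per 1 CV death” column and the “1 in N” phrasing follow from the death-risk percentages, here is a small sketch using the rounded rates from the table above; small differences from the table’s figures (e.g. for 40-59) are due to rounding of the displayed percentages.

```python
# How "population per 1 CV death" and the "1 in N" phrasing follow from
# the age-group death risks in the Canada table above (rounded rates).

death_risk_by_age = {
    "20-39": 0.00000,            # effectively zero at this precision
    "40-59": 0.00002,            # table shows 42,273 from unrounded rates
    "60-79": 0.00020,
    "80+":   0.00251,
}

for age, risk in death_risk_by_age.items():
    if risk > 0:
        one_in_n = round(1 / risk)
        print(f"{age}: about 1 death per {one_in_n:,} people")
    else:
        print(f"{age}: no deaths at this precision")
```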

As noted above, Quebec provides the bulk of cases and deaths in Canada, and also reports age distribution more precisely. The numbers in the table below show risks for Quebecers.

Age       | Risk of Test + | Risk of Death | Population per 1 CV death
0-9 yrs   | 0.13%          | 0             | NA
10-19 yrs | 0.21%          | 0             | NA
20-29 yrs | 0.50%          | 0.000%        | 289,647
30-39 yrs | 0.51%          | 0.001%        | 152,009
40-49 yrs | 0.63%          | 0.001%        | 73,342
50-59 yrs | 0.53%          | 0.005%        | 21,087
60-69 yrs | 0.37%          | 0.021%        | 4,778
70-79 yrs | 0.52%          | 0.094%        | 1,069
80-89 yrs | 1.78%          | 0.469%        | 213
90+ yrs   | 5.19%          | 1.608%        | 62

While some of the risk factors are higher in the viral hotspot of Quebec, it is still the case that under 80 years of age, your chances of dying from Covid 19 are better than 1 in 1000, and much better the younger you are.

Conn AG Adds to Climate Lawsuit Dominos

[Image: Climate Dominos]

William Allison reports at Energy In Depth, Echoes of New York’s Failure: Connecticut Files Climate Lawsuit.  Excerpts in italics with my bolds.

Four Years In The Making, But The Same Failed Arguments

Back in 2016, Tong’s predecessor, former Attorney General George Jepsen, enlisted Connecticut to take part in former New York Attorney General Eric Schneiderman’s “AGs United for Clean Power” – a coalition of state attorneys general that aimed to investigate major energy companies over climate change. Not only did Jepsen participate in the March 2016 press conference announcing the coalition, he discussed how its formation allowed for easier collaboration between attorneys general.

While the coalition ultimately fell apart, following the withdrawal of several attorneys general and scrutiny over the political motivation behind its formation, the group’s demise (and even New York’s unsuccessful lawsuit) hasn’t stopped Tong. In fact, he’s using many of the same arguments that Schneiderman ineffectively deployed nearly five years ago.

In Monday’s press conference announcing the Connecticut lawsuit, AG Tong said:

“We tried to think long and hard about what our best and most impactful contribution would be. And what we settled on was a single defendant with a very simple claim: Exxon knew, and they lied.” (emphasis added)

Apparently Tong did not get the memo that “Exxon Knew” – the theory pushed by activists and lawyers that the company knew about climate change and hid that knowledge from the public – has been completely debunked. It was this theory that Schneiderman initially built his case against the company around, but he was forced to abandon it because the facts were not on his side. Indeed, after his successor was told to “put up or shut up” on the accusations, the lawsuit was revised to remove these claims and instead focus on alleged accounting fraud. The case resulted in a resounding defeat for the New York attorney general, with State Supreme Court Justice Barry Ostrager calling the lawsuit “hyperbolic” and “without merit.”

The only other two climate lawsuits that have been decided on their merits were filed by San Francisco and Oakland and then New York City, both of which failed.

Connecticut Has Benefitted From The National Campaign

The Connecticut lawsuit isn’t a standalone effort, but part of a larger national campaign supported and funded by activist and wealthy donors to pursue climate litigation against energy companies.

Tong is still pursuing the “Exxon Knew” angle that’s been developed by this campaign despite its previous losses, and thinks his lawsuit is the strongest in the nation because the Connecticut Unfair Trade Practices Act doesn’t have a statute of limitations, allowing him to recall ExxonMobil documents from decades ago – even though the company already turned over 3 million documents as part of the New York Attorney General’s failed investigation.

Connecticut was also mentioned in the Pay Up Climate Polluters report, “Climate Costs 2040,” which seems to be a target list of cities to carry out potential litigation, as recent plaintiffs Hoboken, N.J. and Charleston, S.C. were also featured. Pay Up Climate Polluters is a campaign promoting climate litigation, sponsored by the Center for Climate Integrity, which in turn is a project of the Institute for Governance and Sustainable Development (IGSD), which, ironically enough, is paying for the outside counsel in Hoboken’s lawsuit.

IGSD also receives money from a network of Rockefeller groups, which with the help of wealthy donors and activists, have manufactured the entire climate litigation campaign. Tong even filed an amicus brief in support of the climate lawsuit filed by San Francisco and Oakland.

During his press conference, Tong even thanked 350.org and The Sunrise Movement, two other Rockefeller-supported groups that support and actively promote climate litigation.

Conclusion

The lawsuit filed by the Connecticut Attorney General is just the latest case to emerge in the broader, national campaign being pushed by wealthy donors and activist groups. But while a new lawsuit generates new headlines, it does nothing to change the fact that it’s based on another rehashing of the debunked “Exxon Knew” theory that failed in New York and will do nothing to address climate change.

Background from Previous Post:  Climate Lawsuit Dominos

Posted to Energy March 05, 2020. Curt Levey writes at InsideSources, Climate Change Lawsuits Collapsing Like Dominoes.  Excerpts in italics with my bolds.

Climate change activists went to court in California recently trying to halt a long losing streak in their quest to punish energy companies for aiding and abetting the world’s consumption of fossil fuels.

A handful of California cities — big consumers of fossil fuels themselves — asked the U.S. Court of Appeals for the Ninth Circuit to reverse the predictable dismissal of their public nuisance lawsuit seeking to pin the entire blame for global warming on five energy producers: BP, Chevron, ConocoPhillips, ExxonMobil and Royal Dutch Shell.

The cities hope to soak the companies for billions of dollars of damages, which they claim they’ll use to build sea walls, better sewer systems and the like in anticipation of rising seas and extreme weather that might result from climate change.

But no plaintiff has ever succeeded in bringing a public nuisance lawsuit based on climate change.

To the contrary, these lawsuits are beginning to collapse like dominoes as courts remind the plaintiffs that it is the legislative and executive branches — not the judicial branch — that have the authority and expertise to determine climate policy.

Climate change activists should have gotten the message in 2011 when the Supreme Court ruled against eight states and other plaintiffs who brought nuisance claims for the greenhouse gas emissions produced by electric power plants.

The Court ruled unanimously in American Electric Power v. Connecticut that the federal Clean Air Act, under which such emissions are subject to EPA regulation, preempts such lawsuits.

The Justices emphasized that “Congress designated an expert agency, here, EPA … [that] is surely better equipped to do the job than individual district judges issuing ad hoc, case-by-case injunctions” and better able to weigh “the environmental benefit potentially achievable [against] our Nation’s energy needs and the possibility of economic disruption.”

The Court noted that this was true of “questions of national or international policy” in general, reminding us why the larger trend of misusing public nuisance lawsuits is a problem.

The California cities, led by Oakland and San Francisco, tried to get around this Supreme Court precedent by focusing on the international nature of the emissions at issue.

But that approach backfired in 2018 when federal district judge William Alsup concluded that a worldwide problem “deserves a solution on a more vast scale than can be supplied by a district judge or jury in a public nuisance case.” Alsup, a liberal Clinton appointee, noted that “Without [fossil] fuels, virtually all of our monumental progress would have been impossible.”

In July 2018, a federal judge in Manhattan tossed out a nearly identical lawsuit by New York City on the same grounds. The city is appealing.

Meanwhile, climate lawfare is also being waged against energy companies by Rhode Island and a number of municipal governments, including Baltimore. Like the other failed cases, these governments seek billions of dollars.

Adding to the string of defeats was the Ninth Circuit’s rejection last month of the so-called “children’s” climate suit, which took a somewhat different approach by pitting a bunch of child plaintiffs against the federal government.

The children alleged “psychological harms, others impairment to recreational interests, others exacerbated medical conditions, and others damage to property” and sought an injunction forcing the executive branch to phase out fossil fuel emissions.

Judge Andrew Hurwitz, an Obama appointee, wrote for the majority that “such relief is beyond our constitutional power.” The case for redress, he said, “must be presented to the political branches of government.”

Yet another creative, if disingenuous, litigation strategy was attempted by New York State’s attorney general, who sued ExxonMobil for allegedly deceiving investors about the impact of future climate change regulations on profits by keeping two sets of books.

That lawsuit went down in flames in December when a New York court ruled that the state failed to prove any “material misstatements” to investors.

All these lawsuits fail because they are grounded in politics, virtue signaling and — in most cases — the hope of collecting billions from energy producers, rather than in sound legal theories or a genuine strategy for fighting climate change.

But in the unlikely event these plaintiffs prevail, would they use their billion dollar windfalls to help society cope with global warming?

It’s unlikely if past history is any indication.

State and local governments that have won large damage awards in successful non-climate-related public nuisance lawsuits — tobacco litigation is the most famous example — have notoriously blown most of the money on spending binges unrelated to the original lawsuit or on backfilling irresponsible budget deficits.

The question of what would happen to the award money will likely remain academic. Even sympathetic judges have repeatedly refused to be roped by weak public nuisance or other contorted legal theories into addressing a national or international policy issue — climate change — that is clearly better left to elected officials.

Like anything built on an unsound foundation, these climate lawsuits will continue to collapse.

Curt Levey is a constitutional law attorney and president of the Committee for Justice, a nonprofit organization dedicated to preserving the rule of law.

Update March 10

Honolulu joins the domino lineup with its own MeToo lawsuit: Honolulu Sues Petroleum Companies For Climate Change Damages to City

Honolulu city officials, lashing out at the fossil fuel industry in a climate change lawsuit filed Monday, accused oil producers of concealing the dangers that greenhouse gas emissions from petroleum products would create, while reaping billions in profits.

The lawsuit, against eight oil companies, says climate change already is having damaging effects on the city’s coastline, and lays out a litany of catastrophic public nuisances—including sea level rise, heat waves, flooding and drought caused by the burning of fossil fuels—that are costing the city billions, and putting its residents and property at risk.

“We are seeing in real time coastal erosion and the consequences,” Josh Stanbro, chief resilience officer and executive director for the City and County of Honolulu Office of Climate Change, Sustainability and Resiliency, told InsideClimate News. “It’s an existential threat for what the future looks like for islanders.”  [ I wonder if Stanbro’s salary matches the length of his job title, or if it is contingent on winning the case.]

Why Wu Flu Virus Looks Man-made

A virologist who fled China after studying the early outbreak of COVID-19 has published a new report claiming the coronavirus likely came from a lab.  This adds to the analysis done by Dr. Luc Montagnier earlier this year, and summarized in a previous post reprinted later on.  Dr. Yan was interviewed on Fox News, and YouTube has now blocked the video.

If you are wondering why Big Tech is censoring information unflattering to China, see Lee Smith’s Tablet article America’s China Class Launches a New War Against Trump  The corporate, tech, and media elites will not allow the president to come between them and Chinese money

Doctor Li-Meng Yan, a scientist who studied some of the available data on COVID-19 has published her claims on Zenodo, an open access digital platform. She wrote that she believed COVID-19 could have been “conveniently created” within a lab setting over a period of just six months, and “SARS-CoV-2 shows biological characteristics that are inconsistent with a naturally occurring, zoonotic virus”.

The paper by Yan, Li-Meng; Kang, Shu; Guan, Jie; Hu, Shanchang  is Unusual Features of the SARS-CoV-2 Genome Suggesting Sophisticated Laboratory Modification Rather Than Natural Evolution and Delineation of Its Probable Synthetic Route.  Excerpts in italics with my bolds.

Overview

The natural origin theory, although widely accepted, lacks substantial support. The alternative theory that the virus may have come from a research laboratory is, however, strictly censored on peer-reviewed scientific journals. Nonetheless, SARS-CoV-2 shows biological characteristics that are inconsistent with a naturally occurring, zoonotic virus. In this report, we describe the genomic, structural, medical, and literature evidence, which, when considered together, strongly contradicts the natural origin theory.

The evidence shows that SARS-CoV-2 should be a laboratory product created by using bat coronaviruses ZC45 and/or ZXC21 as a template and/or backbone.

Contents

Consistent with this notion, genomic, structural, and literature evidence also suggest a non-natural origin of SARS-CoV-2. In addition, abundant literature indicates that gain-of-function research has long advanced to the stage where viral genomes can be precisely engineered and manipulated to enable the creation of novel coronaviruses possessing unique properties. In this report, we present such evidence and the associated analyses.

Part 1 of the report describes the genomic and structural features of SARS-CoV-2, the presence of which could be consistent with the theory that the virus is a product of laboratory modification beyond what could be afforded by simple serial viral passage. Part 2 of the report describes a highly probable pathway for the laboratory creation of SARS-CoV-2, key steps of which are supported by evidence present in the viral genome. Importantly, part 2 should be viewed as a demonstration of how SARS-CoV-2 could be conveniently created in a laboratory in a short period of time using available materials and well-documented techniques. This report is produced by a team of experienced scientists using our combined expertise in virology, molecular biology, structural biology, computational biology, vaccine development, and medicine.

We present three lines of evidence to support our contention that laboratory manipulation is part of the history of SARS-CoV-2:

i. The genomic sequence of SARS-CoV-2 is suspiciously similar to that of a bat coronavirus discovered by military laboratories in the Third Military Medical University (Chongqing, China) and the Research Institute for Medicine of Nanjing Command (Nanjing, China).

ii. The receptor-binding motif (RBM) within the Spike protein of SARS-CoV-2, which determines the host specificity of the virus, resembles that of SARS-CoV from the 2003 epidemic in a suspicious manner. Genomic evidence suggests that the RBM has been genetically manipulated.

iii. SARS-CoV-2 contains a unique furin-cleavage site in its Spike protein, which is known to greatly enhance viral infectivity and cell tropism. Yet, this cleavage site is completely absent in this particular class of coronaviruses found in nature. In addition, rare codons associated with this additional sequence suggest the strong possibility that this furin-cleavage site is not the product of natural evolution and could have been inserted into the SARS-CoV-2 genome artificially by techniques other than simple serial passage or multi-strain recombination events inside co-infected tissue cultures or animals.

Background from Previous post June 30, 2020:  Pandemic Update: Virus Weaker, HCQ Stronger

In past weeks there have been anecdotal reports from frontline doctors that patients who would have been flattened fighting off SARS CV2 in April are now sitting up and recovering in a few days. We have also the statistical evidence in the US and Sweden, as two examples, that case numbers are rising while Covid deaths continue declining. One explanation is that the new cases are younger people who have been released from lockdown (in US) with stronger immune systems. But it may also be that the virus itself is losing potency.

In the past I have noticed theories about the origin of the virus, and what makes it “novel.” But when the scientist who identified HIV weighs in, I pay particular attention. The Coronavirus Is Man Made According to Luc Montagnier the Man Who Discovered HIV. Excerpts in italics with my bolds.

Contrary to the narrative that is being pushed by the mainstream that the COVID 19 virus was the result of a natural mutation and that it was transmitted to humans from bats via pangolins, Dr Luc Montagnier the man who discovered the HIV virus back in 1983 disagrees and is saying that the virus was man made.

Professor Luc Montagnier, 2008 Nobel Prize winner for Medicine, claims that SARS-CoV-2 is a manipulated virus that was accidentally released from a laboratory in Wuhan, China. Chinese researchers are said to have used coronaviruses in their work to develop an AIDS vaccine. HIV RNA fragments are believed to have been found in the SARS-CoV-2 genome.

“With my colleague, bio-mathematician Jean-Claude Perez, we carefully analyzed the description of the genome of this RNA virus,” explains Luc Montagnier, interviewed by Dr Jean-François Lemoine for the daily podcast at Pourquoi Docteur, adding that others have already explored this avenue: Indian researchers have already tried to publish the results of the analyses that showed that this coronavirus genome contained sequences of another virus, … the HIV virus (AIDS virus), but they were forced to withdraw their findings as the pressure from the mainstream was too great.

To insert an HIV sequence into this genome requires molecular tools

In a challenging question Dr Jean-François Lemoine inferred that the coronavirus under investigation may have come from a patient who is otherwise infected with HIV. “No,” says Luc Montagnier, “in order to insert an HIV sequence into this genome, molecular tools are needed, and that can only be done in a laboratory.”

According to the 2008 Nobel Prize for Medicine, a plausible explanation would be an accident in the Wuhan laboratory. He also added that the purpose of this work was the search for an AIDS vaccine.

In any case, this thesis, defended by Professor Luc Montagnier, has a positive turn.

According to him, the altered elements of this virus are eliminated as it spreads: “Nature does not accept any molecular tinkering, it will eliminate these unnatural changes and even if nothing is done, things will get better, but unfortunately after many deaths.”

This is enough to feed some heated debates! So much so that Professor Montagnier’s statements could also place him in the category of “conspiracy theorists”: “Conspirators are the opposite camp, hiding the truth,” he replies, without wanting to accuse anyone, but hoping that the Chinese will admit to what he believes happened in their laboratory.

To entice a confession from the Chinese he used the example of Iran which after taking full responsibility for accidentally hitting a Ukrainian plane was able to earn the respect of the global community. Hopefully the Chinese will do the right thing he adds. “In any case, the truth always comes out, it is up to the Chinese government to take responsibility.”

Implications: Leaving aside the geopolitics, this theory also explains why the virus weakens when mutations lose the unnatural pieces added in the lab. Since this is an RNA (not DNA) sequence, mutations are slower, but inevitable. If correct, this theory works against fears of a second wave of infections. It also gives an unintended benefit from past lockdowns and shutdowns, slowing the rate of infections while the virus degrades itself.

Arctic Ice Bottoms at 3.7 Wadhams

The animation above shows Arctic ice extents from Sept. 1 to 16, 2020.  On the left are the Russian shelf seas already ice-free, and the Central Arctic retreating as well. Bottom left is Beaufort Sea losing ice. In the last week CAA in the center starts refreezing, and just above it Baffin Bay starts to add ice back.  At the top right Greenland Sea starts to refreeze.

Prof. Peter Wadhams made multiple predictions of an ice-free Arctic (extent as low as 1M km2), most recently to happen in 2015.  Thus was born the metric: 1 Wadham = 1M km2 Arctic ice extent. The details are provided on the 2020 minimum below.  Though there could be a dip lower in the next few days, the record shows a daily minimum of 3.7M km2 on September 11 (MASIE) and September 13 (SII).  While BCE (Beaufort, Chukchi, East Siberian seas) may lose more ice, gains have appeared on the Canadian side: CAA, Baffin Bay and Greenland Sea. So 3.7 Wadhams may well hold up as the daily low this year.  Note that day 260, September 16, 2020, is the date for the lowest annual extent averaged over the last 13 years.

The discussion later on refers to the September monthly average extent serving as the usual climate metric.  That stands presently at 3.9M km2 for MASIE and 3.8M km2 for SII, with both expected to rise slightly by month end as ice extent typically recovers.

The melting season this year showed ice extents briefly near the 13-year average on day 241, then dropping rapidly to go below all other years except 2012.  That year was exceptional due to the 2012 Great Arctic August Cyclone that pushed drift ice around producing a new record minimum.  The anomaly this year was the high pressure ridge persisting over Siberia producing an extremely hot summer there.  This resulted in early melting of the Russian shelf seas along with bordering parts of the Central Arctic.

 

As discussed below, the daily minimum on average occurs on day 260, but a given year may be earlier or later.  The 2020 extent began to flatten from day 248 onward in SII (orange) while MASIE showed stabilizing from day 252 with an upward bump in recent days.  Both lines are drawing near 2019 and 2007 while departing from 2012. The table below shows the distribution of ice in the various regions of the Arctic Ocean.

Region (extent in km2)              | 2020 Day 260 | Day 260 Average | 2020-Ave. | 2012 Day 260 | 2020-2012
(0) Northern_Hemisphere             | 3,770,950    | 4,483,942       | -712,991  | 3,398,785    | 372,165
(1) Beaufort_Sea                    | 503,701      | 471,897         | 31,804    | 214,206      | 289,495
(2) Chukchi_Sea                     | 49,625       | 143,329         | -93,704   | 52,708       | -3,084
(3) East_Siberian_Sea               | 97,749       | 278,150         | -180,400  | 47,293       | 50,456
(4) Laptev_Sea                      | 0            | 124,811         | -124,811  | 21,509       | -21,509
(5) Kara_Sea                        | 12,670       | 19,162          | -6,492    | 0            | 12,670
(6) Barents_Sea                     | 0            | 20,787          | -20,787   | 0            | 0
(7) Greenland_Sea                   | 258,624      | 191,964         | 66,660    | 253,368      | 5,256
(8) Baffin_Bay_Gulf_of_St._Lawrence | 20,839       | 31,394          | -10,555   | 12,695       | 8,144
(9) Canadian_Archipelago            | 328,324      | 269,950         | 58,374    | 154,875      | 173,449
(10) Hudson_Bay                     | 104          | 6,195           | -6,092    | 3,863        | -3,759
(11) Central_Arctic                 | 2,498,209    | 2,925,271       | -427,062  | 2,637,199    | -138,990

The extent numbers show that this year’s melt is dominated by the surprisingly hot Siberian summer, leading to major deficits in all the Eurasian shelf seas – East Siberian, Laptev, Kara.  As well, the bordering parts of the Central Arctic show a sizeable deficit to average. The main surpluses to average and to 2012 are Beaufort, Greenland Sea and CAA. Overall, 2020 is 713k km2 below the 13-year average, a deficit of 16%.
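
A quick check of that headline deficit from the table’s Northern Hemisphere row (values in km2 as reported above):

```python
# Verifying the overall deficit quoted above from the regional table.

nh_2020 = 3_770_950      # NH extent on day 260, 2020 (km^2)
nh_avg  = 4_483_942      # NH extent on day 260, 13-year average (km^2)

deficit = nh_avg - nh_2020            # ~713k km^2 (the table's -712,991
                                      #  comes from unrounded inputs)
print(f"{deficit:,} km^2, {deficit / nh_avg:.0%} below the 13-year average")
```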

Background from Previous Post: Outlook for Arctic Ice Minimum

The annual competition between ice and water in the Arctic Ocean is approaching the maximum for water, which typically occurs in mid-September.  After that, diminishing energy from the slowly setting sun allows oceanic cooling, causing ice to regenerate. Those interested in the dynamics of Arctic sea ice can read numerous posts here.  Note that for climate purposes the annual minimum is measured by the September monthly average ice extent, since the daily extents vary and will go briefly lower on or about day 260.

The Bigger Picture 

We are close to the annual Arctic ice extent minimum, which typically occurs on or about day 260 (mid September). Some take any year’s slightly lower minimum as proof that Arctic ice is dying, but the image above shows the Arctic heart is beating clear and strong.

Over this decade the Arctic ice minimum has not declined; since 2007 it has fluctuated around a plateau. By mid-September, all the peripheral seas have turned to water, and the residual ice shows up in a few places. The table below indicates where we can expect to find ice this September. Numbers are in units of M km2 (millions of square kilometers).

Day 260 ice extents by region (M km2):

Arctic Regions | 2007 | 2010 | 2012 | 2014 | 2015 | 2016 | 2017 | 2018 | 2019 | 13-year Average
Central Arctic Sea | 2.67 | 3.16 | 2.64 | 2.98 | 2.93 | 2.92 | 3.07 | 2.91 | 2.97 | 2.93
BCE | 0.50 | 1.08 | 0.31 | 1.38 | 0.89 | 0.52 | 0.84 | 1.16 | 0.46 | 0.89
LKB | 0.29 | 0.24 | 0.02 | 0.19 | 0.05 | 0.28 | 0.26 | 0.02 | 0.11 | 0.16
Greenland & CAA | 0.56 | 0.41 | 0.41 | 0.55 | 0.46 | 0.45 | 0.52 | 0.41 | 0.36 | 0.46
B&H Bays | 0.03 | 0.03 | 0.02 | 0.02 | 0.10 | 0.03 | 0.07 | 0.05 | 0.01 | 0.04
NH Total | 4.05 | 4.91 | 3.40 | 5.13 | 4.44 | 4.20 | 4.76 | 4.56 | 3.91 | 4.48

The table includes three early years of note along with the last six years, compared to the 13-year average for five contiguous Arctic regions. BCE (Beaufort, Chukchi and East Siberian) on the Asian side is quite variable, and is the largest source of ice other than the Central Arctic itself.  Greenland Sea and CAA (Canadian Arctic Archipelago) together hold almost 0.5M km2 of ice at the annual minimum, fairly consistently.  LKB are the European seas of Laptev, Kara and Barents: a smaller source of ice, but a difference maker some years, as Laptev was in 2016.  Baffin and Hudson Bays are inconsequential as of day 260.

For context, note that the average maximum has been 15M km2, so on average the extent shrinks to 30% of the March high before growing back the following winter.  In this context, it is foolhardy to project any summer minimum forward to proclaim the end of Arctic ice.
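A quick check of that ratio, assuming the figures intended are the 13-year average minimum of 4.48M km2 from the table above and the 15M km2 average maximum: 4.48 / 15 ≈ 0.30, i.e. roughly 30% of the March extent survives to the mid-September minimum.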

Resources:  Climate Compilation II Arctic Sea Ice

Trump Did Listen to Pandemic Experts. They just failed him.

President Trump, accompanied by, from left, Anthony S. Fauci, Vice President Pence and Robert Redfield, reacts to a question during a news conference on the coronavirus in the press briefing room at the White House in Washington on Feb. 29. (Andrew Harnik/AP)

Marc Thiessen writes at Washington Post Trump did listen to experts on the pandemic. They just failed him. Excerpts in italics with my bolds.

A narrative has taken hold since the release of Bob Woodward’s latest book that President Trump was told in late January that the coronavirus was spreading across America at pandemic rates but ignored the dire warnings of government experts.

That narrative is wrong and unfair.

The truth is that during the crucial early weeks of the pandemic, the government’s public health leaders assured Trump that the virus was not spreading in communities in the United States. They gave him bad intelligence because of two catastrophic failures: First, they relied on the flu surveillance system that failed to detect the rapid spread of covid-19; and second, they bungled the development of a diagnostic test for covid-19 that would have shown they were wrong, barred commercial labs from developing tests, and limited tests to people who had traveled to foreign hot spots or had contact with someone with a confirmed case.

As a result, according to former Food and Drug Administration chief Scott Gottlieb, they were “situationally blind” to the spread of the virus.

In an interview Sunday on CBS News’s “Face the Nation,” Gottlieb said officials at the Department of Health and Human Services “over-relied on a surveillance system that was built for flu and not for coronavirus without recognizing that it wasn’t going to be as sensitive at detecting coronavirus spread as it was for flu because the two viruses spread very differently.” Officials were looking for a spike in patients presenting with flu-like respiratory symptoms at hospitals. But there was a lag of a week or more in reporting data, and because many of those infected with the novel coronavirus didn’t develop symptoms, or did not present with respiratory illness, they were not picked up by this monitoring. As a result, officials concluded “therefore, coronavirus must not be spreading.”

They also failed to detect the spread, Gottlieb said, because for six weeks, they “had no diagnostic tests in the field to screen people.” That is because the FDA and HHS refused to allow private and academic labs to get into the testing game with covid-19 tests of their own. The FDA issued only a single emergency authorization to the Centers for Disease Control and Prevention — and then scientists at the CDC contaminated the only approved test kits with sloppy lab practices, rendering them ineffective. The results were disastrous.

How badly did the system fail? Researchers at the University of Notre Dame found that only 1,514 cases and 39 deaths had been officially reported by early March, when in truth more than 100,000 people were already infected. Because of this failure, Gottlieb said that as covid-19 was spreading, CDC officials were “telling the coronavirus task force … that there was no spread of coronavirus in the United States,” adding “They were adamant.”

It is often noted that on Feb. 25, Nancy Messonnier, director of the CDC’s National Center for Immunization and Respiratory Diseases, pointed to the spread of the virus abroad and said, “It’s not a question of if this will happen but when this will happen and how many people in this country will have severe illnesses” — and Trump reportedly nearly fired her. But Messonnier also said in that same interview, “To date, our containment strategies have been largely successful. As a result, we have very few cases in the United States and no spread in the community.” She added that the administration’s “proactive approach of containment and mitigation will delay the emergence of community spread in the United States while simultaneously reducing its ultimate impact” when it arrives. She had no idea it already had.

On Feb. 20, Gottlieb co-authored a Wall Street Journal op-ed raising concerns that infections were more widespread than CDC numbers showed. The next day, on Feb. 21, Anthony S. Fauci said in a CNBC interview that he was confident this was not the case. “Certainly, it’s a possibility,” Fauci said, “but it is extraordinarily unlikely.” He explained that if there were infected people in the United States who were not identified, isolated and traced, “you would have almost an exponential spread of an infection of which we are all looking out for. We have not seen that, so it is extremely unlikely that it is happening.” Fauci said the “pattern of what we’re seeing argues against infections that we’re missing.”

It was not until Feb. 26 that the first possible case of suspected community spread was reported. Even then, senior health officials played down the danger. On Feb. 29, CDC director Robert Redfield said at a White House press briefing: “The American public needs to go on with their normal lives. Okay? We’re continuing to aggressively investigate these new community links. … But at this stage, again, the risk is low.” It was not until early March that the experts realized just how disastrously wrong they had been.

So, when Trump told the American people on Feb. 25 that “the coronavirus … is very well under control in our country. We have very few people with it,” he was not lying or playing down more dire information he was being told privately. He was repeating exactly what experts such as Fauci, Redfield and Messonnier were telling him.

Trump did make serious errors of his own during this early period. On deputy national security adviser Matthew Pottinger’s advice, he barred travel by non-U.S. citizens from China on Jan. 31. But he did not also shut down travel from much of Europe, as Pottinger recommended, until March 11 — almost six weeks later — because of objections from his economic advisers. The outbreak in New York, the worst of the pandemic, was seeded by travelers from Italy.

But the main reason we were not able to contain the virus is that for six critical weeks, the health experts told the president covid-19 was not spreading in U.S. communities when it was, in fact, spreading like wildfire. They were wrong. The experts failed the president — and the country.

Footnote:  For what President Trump has done to fight the Chinese virus, see:

Trump DPA Initiatives Against China Virus

 

Ocean Cooling Pauses August 2020

The best context for understanding decadal temperature changes comes from the world’s sea surface temperatures (SST), for several reasons:

  • The ocean covers 71% of the globe and drives average temperatures;
  • SSTs have a constant water content (unlike air temperatures), so give a better reading of heat content variations;
  • A major El Nino was the dominant climate feature in recent years.

HadSST is generally regarded as the best of the global SST data sets, and so the temperature story here comes from that source, the latest version being HadSST3.  More on what distinguishes HadSST3 from other SST products at the end.

The Current Context

The cool 2020 Spring was not just your local experience; it resulted from Earth’s oceans cooling off after last summer’s warming in the Northern Hemisphere.  The chart below shows SST monthly anomalies as reported in HadSST3, starting in 2015 through August 2020. After three straight months of cooling led by the Tropics and SH, August anomalies are up slightly.


A global cooling pattern is seen clearly in the Tropics since its peak in 2016, joined by the NH and SH cycling downward since 2016.  In 2019 all regions had converged to nearly the same value in April.

Then the NH rose exceptionally, by almost 0.5C over the four summer months, in August exceeding all previous NH summer peaks since 2015.  In the four succeeding months, that warm NH pulse reversed sharply.  Now NH temps are again warming toward a 2020 summer peak, matching 2019.  Previously that warming had been offset by sharp cooling in the Tropics and SH, which instead warmed slightly last month. Thus the Global anomaly decreased steadily after March, then rose, and presently matches last summer.

Note that the higher temps in 2015 and 2016 were due first of all to a sharp rise in Tropical SST, beginning in March 2015, peaking in January 2016, and steadily declining back below its starting level. Secondly, the Northern Hemisphere added three bumps on the shoulders of the Tropical warming, with peaks in August of each year.  A fourth NH bump was lower and peaked in September 2018.  As noted above, a fifth peak in August 2019 and a sixth in August 2020 exceeded the four previous upward bumps in NH.

And as before, note that the global release of heat was not dramatic, due to the Southern Hemisphere offsetting the Northern one.  The major difference between now and 2015-2016 is the absence of Tropical warming driving the SSTs, along with SH anomalies reaching nearly the lowest in this period.

A longer view of SSTs

The graph below is noisy, but the density is needed to see the seasonal patterns in the oceanic fluctuations.  Previous posts focused on the rise and fall of the last El Nino starting in 2015.  This post adds a longer view, encompassing the significant 1998 El Nino and the period since.  The color schemes are retained for Global, Tropics, NH and SH anomalies.  Despite the longer time frame, I have kept the monthly data (rather than yearly averages) because of interesting shifts between January and July.

1995 is a reasonable (ENSO neutral) starting point prior to the first El Nino.  The sharp Tropical rise peaking in 1998 is dominant in the record, starting Jan. ’97 to pull up SSTs uniformly before returning to the same level Jan. ’99.  For the next 2 years, the Tropics stayed down, and the world’s oceans held steady around 0.2C above 1961 to 1990 average.

Then comes a steady rise over two years to a lesser peak Jan. 2003, but again uniformly pulling all oceans up around 0.4C.  Something changes at this point, with more hemispheric divergence than before. Over the 4 years until Jan 2007, the Tropics go through ups and downs, NH a series of ups and SH mostly downs.  As a result the Global average fluctuates around that same 0.4C, which also turns out to be the average for the entire record since 1995.

2007 stands out with a sharp drop in temperatures, so that Jan. ’08 matches the low of Jan. ’99, but starting from a lower high. The oceans all decline as well, until temps build to a peak in 2010.

Now again a different pattern appears.  The Tropics cool sharply to Jan. ’11, then rise steadily for four years to Jan. ’15, at which point the most recent major El Nino takes off.  But this time, in contrast to ’97-’99, the Northern Hemisphere produces peaks every summer, pulling up the Global average.  In fact, these NH peaks appear every July starting in 2003, growing stronger to produce three massive highs in 2014, ’15 and ’16.  NH July 2017 was only slightly lower, and a fifth NH peak, still lower, came in Sept. 2018.

The highest summer NH peak came in 2019, only this time the Tropics and SH are offsetting rather than adding to the warming. Since 2014 the SH has played a moderating role, offsetting the NH warming pulses. Now August 2020 is matching last summer’s unusually high NH SSTs. (Note: these are high anomalies on top of the highest absolute temps in the NH.)

What to make of all this? The patterns suggest that in addition to El Ninos in the Pacific driving the Tropic SSTs, something else is going on in the NH.  The obvious culprit is the North Atlantic, since I have seen this sort of pulsing before.  After reading some papers by David Dilley, I confirmed his observation of Atlantic pulses into the Arctic every 8 to 10 years.

But the peaks coming nearly every summer in HadSST require a different picture.  Let’s look at August, the hottest month in the North Atlantic, using the Kaplan dataset.
The AMO Index is from Kaplan SST v2, the unaltered and non-detrended dataset. By definition, the data are monthly average SSTs interpolated to a 5×5 grid over the North Atlantic, basically 0 to 70N. The graph shows August warming began after 1992, rising up to 1998, with a series of matching years since, including 2020.  Because the N. Atlantic has partnered with the Pacific ENSO recently, let’s take a closer look at some AMO years in the last two decades.
This graph shows monthly AMO temps for some important years. The peak years were 1998, 2010 and 2016, with the latter emphasized as the most recent. The other years show lesser warming, with 2007 emphasized as the coolest in the last 20 years. Note the red 2018 line is at the bottom of all these tracks. The black line shows that 2020 began slightly warm, then set records for three months, then dropped below 2016 and 2017, and now is matching 2016.
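For those who want to reproduce an August series like this, here is a minimal Python sketch. It assumes the Kaplan SST v2 gridded file has already been downloaded locally; the filename, the variable name "sst", the coordinate names "lat"/"lon" with 0-360 longitudes, and the cosine-latitude weighting are all my assumptions, not details given in the post.

    import numpy as np
    import xarray as xr

    # Sketch: August North Atlantic (0-70N) average SST per year,
    # from a locally downloaded Kaplan SST v2 file (hypothetical name).
    ds = xr.open_dataset("kaplan_sst_v2.nc")
    sst = ds["sst"]                                   # assumed variable name

    # North Atlantic box; coordinate names and 0-360 longitudes are assumed.
    natl = sst.sel(lat=slice(0, 70), lon=slice(280, 360))

    # Area-weight by cos(latitude) before averaging over the basin.
    weights = np.cos(np.deg2rad(natl["lat"]))
    monthly = natl.weighted(weights).mean(dim=("lat", "lon"))

    # Keep only the August value of each year.
    august = monthly.where(monthly["time"].dt.month == 8, drop=True)
    print(august.to_series().tail())

Comparing such August values year by year is essentially what the first AMO chart above does.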

Summary

The oceans are driving the warming this century.  SSTs took a step up with the 1998 El Nino and have stayed there with help from the North Atlantic, and more recently the Pacific northern “Blob.”  The ocean surfaces are releasing a lot of energy, warming the air, but eventually this will have a cooling effect.  The decline after 1937 was rapid by comparison, so one wonders: How long can the oceans keep this up? If the pattern of recent years continues, NH SST anomalies may rise slightly in coming months, but once again ENSO, which has weakened, will probably determine the outcome.

Footnote: Why Rely on HadSST3

HadSST3 is distinguished from other SST products because HadCRU (Hadley Climatic Research Unit) does not engage in SST interpolation, i.e. infilling estimated anomalies into grid cells lacking sufficient sampling in a given month. From reading the documentation and from queries to Met Office, this is their procedure.

HadSST3 imports data from gridcells containing ocean, excluding land cells. From past records, they have calculated daily and monthly average readings for each grid cell for the period 1961 to 1990. Those temperatures form the baseline from which anomalies are calculated.

In a given month, each gridcell with sufficient sampling is averaged for the month and then the baseline value for that cell and that month is subtracted, resulting in the monthly anomaly for that cell. All cells with monthly anomalies are averaged to produce global, hemispheric and tropical anomalies for the month, based on the cells in those locations. For example, Tropics averages include ocean grid cells lying between latitudes 20N and 20S.

Gridcells lacking sufficient sampling that month are left out of the averaging, and the uncertainty from such missing data is estimated. IMO that is more reasonable than inventing data to infill. And it seems that the Global Drifter Array displayed in the top image is providing more uniform coverage of the oceans than in the past.
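To make the procedure concrete, here is a minimal Python sketch of the steps just described. It is illustrative only (not Met Office code), and the cosine-latitude area weighting is my assumption, since the post just says the sampled cells are averaged.

    import numpy as np

    # Illustrative sketch of the procedure described above: subtract each
    # ocean gridcell's 1961-1990 baseline for the calendar month, leave
    # unsampled cells out entirely, then average what remains.
    def regional_anomaly(obs, baseline, sampled, lats):
        """obs, baseline: (lat, lon) arrays of this month's cell means and the
        1961-1990 climatology for the same calendar month (degC);
        sampled: boolean (lat, lon) array, True where sampling was sufficient;
        lats: 1-D array of gridcell-center latitudes (degrees)."""
        anom = np.where(sampled, obs - baseline, np.nan)      # no infilling
        # Cosine-latitude area weighting is my assumption (see lead-in).
        w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(obs)
        w = np.where(np.isnan(anom), 0.0, w)                  # unsampled cells excluded
        return np.nansum(anom * w) / w.sum()

    # Tiny synthetic example: 3 latitude bands x 4 cells, one cell unsampled.
    lats = np.array([-10.0, 0.0, 10.0])
    obs = np.full((3, 4), 27.0)
    baseline = np.full((3, 4), 26.5)
    sampled = np.ones((3, 4), dtype=bool)
    sampled[0, 0] = False
    print(regional_anomaly(obs, baseline, sampled, lats))     # ~0.5

The key design point is the masking step: cells without sufficient sampling contribute nothing to the average, rather than being infilled with estimates.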


USS Pearl Harbor deploys Global Drifter Buoys in Pacific Ocean

Oil Demand: No End in Sight

Your next car? NURPHOTO VIA GETTY IMAGES

Michael Lynch writes at Forbes  Peak Oil Demand! Again?  Excerpts in italics with my bolds.

Amid stubbornly low prices and lackluster demand we’re now seeing, on cue, a new round of predictions that oil demand has already or is about to peak (including even scenarios published by BP). These cannot be dismissed out of hand — as the peak oil supply arguments could, inasmuch as they were either based on bad math or represented assumptions that the industry couldn’t continue overcoming its age-old problems like depletion. (See my book The Peak Oil Scare if you want the full treatment.)

Now, the news is highlighting various predictions that the pandemic will accelerate the point at which global oil demand peaks, which is certainly much more sexy than business as usual. When groups like Greenpeace or the Sierra Club predict or advocate for peak oil demand, it doesn’t make much news: dog bites man. But, as the newspeople say, when man bites dog it is news. Thus, when oil company execs seem to believe peak oil demand is near, you get headlines like, “BP Says the Era of Oil Demand-Growth is Over,” The Guardian newspaper proclaiming that “Even the Oil Giants Can Now Foresee the End of the Oil Age,” and Reuters in July: “End game for oil? OPEC prepares for an age of dwindling demand.”

Anyone who is familiar with the oil industry knows that a peak in oil production has been predicted many times throughout the decades, never to come true (or deter future predictions of same). But few realize that the end of the industry has been repeatedly predicted as well, including both the demise of an old-fashioned business model, but also replacement of petroleum by newer, better technologies or fuels.

Until the 1970s, few saw an end to the oil business. The automobile boom created a seemingly insatiable demand for oil, one which has only slowed when prices rose and/or economic growth stalled, neither of which has ever proved permanent.

Yet there have been three particular apocalyptic threads put forward for the oil industry: the industry would spiral into decline, demand would peak, and/or a new fuel or technology would displace petroleum.

The oil industry’s business model was challenged as far back as 1977, when Mobil CEO Rawleigh Warner tried to diversify out of the oil business for fear that those who didn’t would “go the way of the buggy-whip makers.” Similarly, Mike Bowlin, ARCO’s CEO, declared in 1999, “We’ve embarked on the beginning of the last days of oil.” Enron’s Jeff Skilling (whatever happened to him?) said he “had little use for anything that smacked of a traditional energy company — calling companies like Exxon Mobil ‘dinosaurs.’”

Vanishing demand has been another common motif for prognosticators, especially when high prices caused demand to slump. Exxon CEO Rex Tillerson (whatever happened to him?) thought in 2009, when gasoline prices were $4/gallon, that gasoline demand had peaked in 2007. (The figure below shows how that worked out.)

Figure: Gasoline demand peaks, then recovers. U.S. gasoline demand (tb/d); chart by the author from EIA data.

Sheikh Yamani, the former Saudi Oil Minister, warned in 2000 that in thirty years there would be “no buyers” for oil, because fuel cell technology would be commercial by the end of that decade. (From 2000, oil demand increased by 20 mb/d before the pandemic.) The fabled Economist magazine agreed with Yamani in 2003, “Finally, advances in technology are beginning to offer a way for economies, especially those of the developed world, to diversify their supplies of energy and reduce their demand for petroleum…Hydrogen fuel cells and other ways of storing and distributing energy are no longer a distant dream but a foreseeable reality.”

They might have been echoing William Ford, CEO of Ford Motor Company, who said in 2000, “Fuel cells could be the predominant automotive power source in 25 years.” Twenty years later, they are insignificant.

Amory Lovins, who has probably received more awards than Tom Hanks, has long argued that extremely efficient (and expensive) cars would reduce gasoline demand substantially, including in his (and co-authors’) Winning the Oil Endgame, which argued that a combination of efficiency and cellulosic ethanol could replace our imports from the Persian Gulf (then about 2.5 mb/d). (They’ve been replaced, but by shale oil, and demand was unchanged since their prediction.)

He was hardly alone, with Richard Lugar and James Woolsey in a 1999 Foreign Affairs article calling cellulosic ethanol “The New Petroleum.” Perhaps they relied on a 1996 Atlantic article by Charles Curtis and Joseph Romm (“Mideast Oil Forever”), which argued that cellulosic ethanol should see its cost fall to about $1/gallon (adjusted for inflation). (In 2017, the National Renewable Energy Laboratory put the cost at $5/gallon.)

One of my persistent themes has been that too much writing is not based on rigorous analysis but on superficial ideas, a few anecdotes and footnotes, supposedly supporting Herculean changes. (See Tom Nichols, The Death of Expertise.) Peak oil demand is the flavor of the month, and people are rushing to publish predictions, prescriptions, guidelines, and fantastical views of a fantastical future. But petroleum remains by far the fuel of choice in transportation, and the pandemic seems unlikely to change that. Sexy should be left for HBO and not energy analysis.

There are many reasons the demand for fossil fuels is strong and growing.  

Footnote:  Shareholder activism against Big Oil is based on a cascade of unlikely suppositions including declining demand and stranded assets.  See: Behind the Alarmist Scene

 

 

Attenborough’s Pandemic Porn

Ross Clark writes at The Spectator What David Attenborough’s ‘Extinction: The Facts’ didn’t tell you.  Excerpts in italics with my bolds.

It was only a matter of time before Covid-19 got swept up into the wider narrative of humans facing impending doom thanks to our abuse of the planet. But one might have expected better of Sir David Attenborough. His latest BBC documentary, Extinction: The Facts, broadcast on Sunday night might as well have been produced by Extinction Rebellion, so determined was it to present an hysterical picture of apocalypse caused by consumerism and capitalism. Just to ram home the point, one contributor, naturalist Robert Watson, spoke of ‘many in the private sector making a huge profit at the expense of the natural world’, seemingly oblivious to the far greater rape of the environment committed by the former Soviet Union and other socialist countries.

But it was the section on Covid-19 which really made the jaw drop. ‘Scientists have even linked the destructive relationship with nature to the emergence of Covid-19,’ we were told. ‘If we carry on like this we will see more epidemics.’ It went on: ‘We’ve seen an increasing rate of pandemic emergencies. We’ve had swine flu, SARS, ebola. We’ve found that we’re behind every single pandemic. One of the most obvious ways we’re making it more likely that a virus would jump [from animals to humans] is that we’re having lots of contacts with animals – wildlife trade is at unprecedented levels.’

It then tried to present two examples of food production – intensive cattle ranching and wildlife markets in China – as part of the same problem.

It is perfectly true that Chinese ‘wet markets’, where many different species are sold and killed alongside each other, have been implicated in SARS and Covid-19, the former involving civets and the latter most likely bats. Breeding poultry and pigs in close proximity has also been suggested as a breeding ground for flu viruses which can then jump to humans.

But these are hardly examples of the mass, intensive agriculture which feeds an increasing proportion of the global population. On the contrary, it is the exact opposite.

It is all those old-fashioned farmyards depicted in children’s books which mixed species and brought humans into close contact with animals. Modern livestock farming, by contrast, involves huge monocultures, bred in environments where infectious disease is very tightly-controlled. An outbreak, say, of swine flu is not going to be tolerated for long in a pig farm in a developed country – though it might well be allowed to spread in a developing country where large numbers of people keep pigs in their back yards. The only way in which most of us come into contact with a farm animal now is when a slab of it is presented to us on a plate.

The idea that we face a terrifying future of infectious disease flies in the face of reality. In developed countries infectious disease has gone from being the main cause of death – especially in children – to being a rarity. Globally, the chances of dying from an infectious disease have plummeted in recent decades. According to the Institute for Health Metrics and Evaluation, the proportion of global deaths caused by communicable disease, maternal and neonatal conditions fell from 46 per cent in 1990 to 28 per cent in 2017.

Covid-19 will in no way reverse this: so far, it has caused fewer than two per cent of the 56 million deaths which would have been expected this year anyway.

Pandemics, of course, have always been a regular feature of human life. But are novel diseases becoming more commonplace? Well, yes in the sense that we have become better at identifying them – the first virus, after all, was not discovered until 1900, and we have become ever better at isolating and identifying them.

Little over a century ago, we would have had no idea what Covid-19 was – it might possibly have acquired a name, maybe ‘coughing disease’, but we would have had no real idea whether it was novel or not. A study by Brown university in 2014, published in the Journal of the Royal Society, found that there has been a rise in the number of outbreaks of novel infectious diseases since 1980, but also that there has been a decline in the numbers of people being affected by them. We have become much better at identifying diseases, and much better at controlling them. Covid might have inspired an unprecedented global response, but in historical terms it is a pretty gentle pandemic – even now it has a lower death toll than Hong Kong flu, which hardly affected our lives at all.

It is shocking that the BBC can have allowed such one-sided green propaganda onto our screens without putting issues of human development and the natural world into proper context. But then David Attenborough has become a Greta of the Third Age – no-one dares question what he does because he is a ‘national treasure’. Someone at the BBC needs to pluck up the courage.

 

 

 
