Understanding Climate Risk

Science, policy and decision-making

On simplicity


This is a long screed in response to a reading list posted by Massimo Pigliucci (he bears no responsibility for what follows), which nominated an essay on simplicity in science by Elliott Sober, published on Aeon.

Why is simplicity better?

As it happens, I am in the midst of an argument with the climate science community over simplicity as applied to statistical inference. A couple of days ago I bought Probability, Confirmation and Simplicity: Readings in the Philosophy of Inductive Logic (Foster and Martin, eds., 1966), which contains six essays on simplicity. It is not as simple as it’s cracked up to be – exactly the ammunition I require.

Accordingly, I disagree with Sober. He invokes the Akaike Information Criterion (AIC), which rewards simplicity but, as he notes, assumes the candidate models refer to the same underlying reality. Yet we repeatedly see AIC applied across different underlying realities by people who don’t read the small print. They are being simplistic (#OccamsRazor). And by mixing probabilities with theory, Sober makes a fundamental mistake. I can apply probabilities to an experiment or a test, but not to a theory. At best I can severely test a hypothesis (in Mayo’s sense) against probative criteria such that the alternatives are as unlikely as the hypothesis is likely; only then do I have a chance of confirming the theory.
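To make that small print concrete, here is a minimal sketch – synthetic data, nothing from Sober beyond the standard least-squares form of AIC – of how the criterion trades goodness of fit against parameter count. The comparison is only licensed when both candidates describe the same underlying data-generating reality.

```python
import numpy as np

def aic(rss, n, k):
    """AIC for a least-squares fit with Gaussian errors:
    n * ln(RSS / n) + 2k, where k counts free parameters."""
    return n * np.log(rss / n) + 2 * k

# Synthetic series: a weak linear trend plus noise (illustrative only).
rng = np.random.default_rng(0)
n = 60
t = np.arange(n)
y = 0.02 * t + rng.normal(0.0, 0.1, n)

# Candidate 1: constant mean (k = 2: mean and error variance).
rss_const = np.sum((y - y.mean()) ** 2)

# Candidate 2: linear trend (k = 3: slope, intercept, error variance).
coef = np.polyfit(t, y, 1)
rss_trend = np.sum((y - np.polyval(coef, t)) ** 2)

print(aic(rss_const, n, 2), aic(rss_trend, n, 3))
```

The lower AIC wins – but the criterion is silent when the candidates answer different questions about reality, which is exactly the misuse complained about above.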

In climate science, simplicity is represented by trend-like change. Under increasing greenhouse gases, radiative forcing grows as the logarithm of concentration, and warming follows forcing plus feedbacks. In the Earth system, this is taken to produce monotonic warming, linear in forcing. The trouble is that most of this heat is absorbed by the ocean, while it is the atmosphere that needs to respond. The atmosphere–ocean relationship is a dissipative system driven by thermodynamics, and decidedly nonlinear. So, if I assume the atmosphere warms according to the linear radiative-forcing concept, I have a simple model that is predictive over half-century timescales. If I assume warming follows the dissipative pathway, it proceeds via enhanced climate variability as a series of step-like regime changes. Both pathways reach much the same destination, but their modes of getting there are very different, and one contains more inherent risk than the other.

So, I can represent both pathways statistically. They yield similar sums of squared residuals (trend-like change fails tests for heteroscedasticity, though almost no one checks for this), but because the step-like pathway carries more adjustable parameters, it is penalised (actually, even that comparison doesn’t hold, because the detection methods are completely different). Yet they represent different realities – Sober does mention this caveat, but few have heeded it before, so why would they now?
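As a sketch of that statistical comparison (synthetic data only; the detection method used in our own work is entirely different, as noted above): fit a linear trend and a one-break step model to the same series and compare them under the AIC-style penalty for the extra parameter.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 80
t = np.arange(n)
# Synthetic step-like warming: one regime shift of 0.5 at t = 40.
y = np.where(t < 40, 0.0, 0.5) + rng.normal(0.0, 0.15, n)

# Pathway 1: linear trend (slope, intercept, error variance: k = 3).
coef = np.polyfit(t, y, 1)
rss_trend = np.sum((y - np.polyval(coef, t)) ** 2)

# Pathway 2: single step (two levels, breakpoint, error variance: k = 4).
def step_rss(b):
    return (np.sum((y[:b] - y[:b].mean()) ** 2)
            + np.sum((y[b:] - y[b:].mean()) ** 2))

best = min(range(5, n - 5), key=step_rss)  # breakpoint minimising RSS
rss_step = step_rss(best)

def aic(rss, n, k):
    return n * np.log(rss / n) + 2 * k

print(best, rss_trend, rss_step)
print(aic(rss_trend, n, 3), aic(rss_step, n, 4))
```

On genuinely step-like data the step model recovers the break and can survive the parameter penalty; on genuinely trend-like data the penalty tips the other way. Either way, the two fits alone cannot settle which reality generated the series.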

Where simplicity works in this example is that the natural greenhouse effect (on average about 155 W m⁻²) is distributed through climate variability. The net anthropogenic greenhouse effect is about 0.7 W m⁻², and roughly 1% of that is assumed to be stored in the atmosphere (about 0.007 W m⁻²), producing a trend. So here we have created a very complex physical situation, where most of the energy flux is controlled by climate variability and perturbations of more than 1 W m⁻² can be returned to the mean within months, yet somehow a tiny amount of heat remains in the atmosphere in preference to an ocean with 24 times the thermal conductivity and some 3,200 times the heat capacity.

Alternatively, we could accept that the simplest thermodynamic solution is for all heat to follow the same pathway: for climate change to behave like enhanced climate variability, and for warming to proceed as a series of regime changes producing a long-term, complex trend.

Theoretically and thermodynamically simple; statistically more complex. The problem with the simplicity argument is that it has to be applied very finely, and confusing methodological simplicity with theoretical parsimony is all too easy. In economics, climatology and a number of other disciplines, simplicity is being misapplied to methods rather than theory, and this is a problem, because it means we end up applying simple solutions to complex, real-world problems.

Written by Roger Jones

April 26, 2019 at 11:17 pm

Announcing a Special Issue on Managing Nonlinear Climate Risk


I am guest editor for a Special Issue on “The Implications of Nonlinear, Complex System Behaviour for Managing Changing Climate Risk” that will appear in the MDPI AG journal Atmosphere. Researchers, policymakers and practitioners are invited to submit a paper for consideration to this special issue.

With the IPCC Sixth Assessment Report in its early stages, there is a very limited literature on managing the risk of nonlinear climate change on decadal timescales, yet nonlinear change poses a much greater risk than gradual change. If climate change on decision-making timescales proves to be fundamentally nonlinear, as we maintained in a paper published earlier this year, there will be a substantial gap in the assessment. This special issue invites submissions on all aspects of the implications of nonlinear climate change for risk management, from theory through to practice.

More details can be found at:
http://www.mdpi.com/journal/atmosphere/special_issues/nonlinear_climate_risk

 

Written by Roger Jones

December 19, 2017 at 6:15 pm

Trolling. It’s more important now than ever.


When contrarian commentator Bret Stephens was hired by the New York Times as a columnist, there was an immediate outcry from climate scientists and the pro-climate policy community. Some cancelled their subscriptions.

Stephens was on record describing climate change as an ‘imaginary enemy’. The timing was odd: the NYT had just hired a high-profile climate team and was selling itself with the slogan “Truth. It’s more important now than ever.”

[Image: ad from The New York Times’ marketing campaign. Credit: The New York Times via AdAge; link via Think Progress.]

The hire was defended by James Bennet, the editorial page editor.

Written by Roger Jones

April 29, 2017 at 8:53 pm

Published step change paper


Reconciling the signal and noise of atmospheric warming on decadal timescales

Roger N. Jones and James H. Ricketts
Victoria Institute of Strategic Economic Studies, Victoria University, Melbourne, Victoria 8001, Australia
Received: 13 Aug 2016 – Discussion started: 23 Aug 2016
Revised: 20 Feb 2017 – Accepted: 21 Feb 2017 – Published: 16 Mar 2017

Abstract
Interactions between externally forced and internally generated climate variations on decadal timescales are a major determinant of changing climate risk. Severe testing is applied to observed global and regional surface and satellite temperatures and modelled surface temperatures to determine whether these variations are independent, as in the traditional signal-to-noise model, or whether they interact, resulting in step-like warming. The multistep bivariate test is used to detect step changes in temperature data. The resulting data are then subjected to six tests designed to distinguish between the two statistical hypotheses, Hstep and Htrend. Test 1: since the mid-20th century, most observed warming has taken place in four events: in 1979/80 and 1997/98 at the global scale, 1988/89 in the Northern Hemisphere and 1968–70 in the Southern Hemisphere. Temperature is more step-like than trend-like on a regional basis. Satellite temperature is more step-like than surface temperature. Warming from internal trends is less than 40 % of the total for four of five global records tested (1880–2013/14). Test 2: correlations between step-change frequency in observations and models (1880–2005) are 0.32 (CMIP3) and 0.34 (CMIP5). For the period 1950–2005, grouping selected events (1963/64, 1968–70, 1976/77, 1979/80, 1987/88 and 1996–98), the correlation increases to 0.78. Test 3: steps and shifts (steps minus internal trends) from a 107-member climate model ensemble (2006–2095) explain total warming and equilibrium climate sensitivity better than internal trends. Test 4: in three regions tested, the change between stationary and non-stationary temperatures is step-like and attributable to external forcing. Test 5: step-like changes are also present in tide gauge observations, rainfall, ocean heat content and related variables.
Test 6: across a selection of tests, a simple stepladder model better represents the internal structures of warming than a simple trend, providing strong evidence that the climate system is exhibiting complex system behaviour on decadal timescales. This model indicates that in situ warming of the atmosphere does not occur; instead, a store-and-release mechanism from the ocean to the atmosphere is proposed. It is physically plausible and theoretically sound. The presence of step-like – rather than gradual – warming is important information for characterising and managing future climate risk.

Earth Syst. Dynam., 8, 177-210, 2017
http://www.earth-syst-dynam.net/8/177/2017/
doi:10.5194/esd-8-177-2017

Download the full paper
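For readers who want the flavour of step detection without the full machinery: the sketch below scans a series for its largest mean shift using a two-sample t statistic at every candidate breakpoint. It is a deliberately simplified stand-in – the paper’s multistep bivariate test is considerably more involved – and the series is synthetic.

```python
import numpy as np

def largest_shift(y, min_seg=5):
    """Return the breakpoint with the largest two-sample t statistic
    between the segments before and after it (Welch-style)."""
    y = np.asarray(y, dtype=float)
    best_b, best_t = None, 0.0
    for b in range(min_seg, len(y) - min_seg):
        a, c = y[:b], y[b:]
        se = np.sqrt(a.var(ddof=1) / len(a) + c.var(ddof=1) / len(c))
        t = abs(c.mean() - a.mean()) / se
        if t > best_t:
            best_b, best_t = b, t
    return best_b, best_t

rng = np.random.default_rng(2)
series = np.concatenate([rng.normal(0.0, 0.1, 30),   # pre-shift regime
                         rng.normal(0.3, 0.1, 30)])  # post-shift regime
b, t = largest_shift(series)
print(b, t)   # breakpoint near index 30, large t statistic
```

A real analysis must also guard against trends masquerading as steps and vice versa, which is what the six tests in the paper are designed to probe.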

Discussion paper for open review


After promising that our flagship paper on reconciling the signal and noise of global warming on decadal timescales would be subject to open review, it finally is. The paper has been submitted and accepted for open review at Earth System Dynamics.

Reconciling the signal and noise of atmospheric warming on decadal timescales

Roger N. Jones and James H. Ricketts
Victoria Institute of Strategic Economic Studies, Victoria University, Melbourne, Victoria 8001, Australia

Received: 13 Aug 2016 – Accepted: 22 Aug 2016 – Published: 23 Aug 2016

Abstract

Interactions between externally-forced and internally-generated climate variations on decadal timescales are a major determinant of changing climate risk. Severe testing is applied to observed global and regional surface and satellite temperatures and modelled surface temperatures to determine whether these variations are independent, as in the traditional signal-to-noise model, or whether they interact, resulting in steplike warming. The multi-step bivariate test is used to detect step changes in temperature data. The resulting data are then subject to six tests designed to show strong differences between the two statistical hypotheses, Hstep and Htrend: (1) Since the mid-20th century, most of the observed warming has taken place in four events: in 1979/80 and 1997/98 at the global scale, 1988/89 in the northern hemisphere and 1968/70 in the southern hemisphere. Temperature is more steplike than trend-like on a regional basis. Satellite temperature is more steplike than surface temperature. Warming from internal trends is less than 40 % of the total for four of five global records tested (1880–2013/14). (2) Correlations between step-change frequency in models and observations (1880–2005) are 0.32 (CMIP3) and 0.34 (CMIP5). For the period 1950–2005, grouping selected events (1963/64, 1968–70, 1976/77, 1979/80, 1987/88 and 1996–98), correlation increases to 0.78. (3) Steps and shifts (steps minus internal trends) from a 107-member climate model ensemble 2006–2095 explain total warming and equilibrium climate sensitivity better than internal trends. (4) In three regions tested, the change between stationary and non-stationary temperatures is steplike and attributable to external forcing. (5) Steplike changes are also present in tide gauge observations, rainfall, ocean heat content, forest fire danger index and related variables.
(6) Across a selection of tests, a simple stepladder model better represents the internal structures of warming than a simple trend – strong evidence that the climate system is exhibiting complex system behaviour on decadal timescales. This model indicates that in situ warming of the atmosphere does not occur; instead, a store-and-release mechanism from the ocean to the atmosphere is proposed. It is physically plausible and theoretically sound. The presence of steplike – rather than gradual – warming is important information for characterising and managing future climate risk.

Comments welcome: here or there. Deadline October 4.

Step change hypothesis and working paper


Imagine you knew nothing about climate change and the greenhouse effect but were interested, and you knew a bit about general science. Would you accept the following story?

“Earth’s climate is a large, complex system, affected by forces that produce both linear and nonlinear responses. Shortwave radiation – mostly visible light, with some UV – comes in from the sun and heats up the planet, which then gives off infrared radiation. Some of the sunlight gets reflected straight back out by clouds, snow, ice and other bright surfaces. The land can heat up quite a lot, but it cools back down again and doesn’t store much. If a forest is cleared and replaced by buildings, things warm up a bit, but the effect is only local.”

“But the ocean – that’s another story. It absorbs a lot of radiation, so it is taking up heat all the time. Huge streams of energy enter and leave the ocean store each year. Some of it is ‘dry’ or sensible heat, which is ordinary warmth. Some is ‘wet’ heat, or evaporated moisture. Energy is taken up when moisture evaporates and is released again when the moisture cools, condenses and gets rained out. In this way, the oceans deliver a lot of heat to the land every year, largely through rainfall and a bit of snow.”
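The scale of that ‘wet heat’ stream can be checked on the back of an envelope. The numbers below are standard physical constants and a round global-mean rainfall figure, not values from the post: a metre of annual rainfall corresponds to a latent-heat flux of roughly 80 watts per square metre.

```python
# Latent-heat flux implied by 1 m of rainfall per year.
L_v = 2.5e6              # latent heat of vaporisation of water, J/kg
rho_water = 1000.0       # density of water, kg/m^3
rainfall = 1.0           # annual rainfall, m/yr (round global mean)
seconds_per_year = 3.156e7

energy_per_m2 = L_v * rho_water * rainfall    # J per m^2 per year
flux = energy_per_m2 / seconds_per_year       # average W per m^2
print(round(flux))   # ~79 W per m^2
```

That single flux dwarfs the anthropogenic perturbation, which is the point of the story: the ‘huge streams’ are real.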


What is public good research?


When asked recently at a meeting with CSIRO scientists what he thought public good research was, CSIRO chief executive Larry Marshall said:

“Anything that’s good for the public”

He then went on to say:

“Government policy, frankly, determines public good. That’s their decision. When they fund renewable energy, environmental science, education, health care, that’s a fundamental policy choice. It’s completely separate to us. National objectives, national challenges, is that not, a realistic measure of public good?”
