
Climate Change: Are We Blinded By Abstract Statistics?


The argument can be made that, in popular media, there is too much focus on abstract statistics when referring to climate change.

For example, we are repeatedly told that global temperatures are projected to increase by 1.5°C by 2050 and 2–4°C by 2100. Meanwhile, the Paris Agreement’s framework aims to limit the increase in global average temperatures to well below 2°C above pre-industrial levels.


But what does this actually mean?

Whilst I assume most are in agreement that we want to do all we can to minimise these average temperature increases, there is a danger that these regularly touted figures become disconnected from our day-to-day experiences.

Specifically, the argument can be made that these figures:

  1. They sound small.

  2. They refer to dates in the distant future.

  3. They are unanchored from our day-to-day lives and the real world.

Implications of this ‘Disconnect’?

This potential disconnect and obscuration alters our decision making. Excessively focussing on the future, anticipating what lies ahead, can blind us to what’s happening today. In turn, we risk being oblivious to the here-and-now implications of climate change.

Whether that be choosing a holiday destination or an insurer’s underwriting decisions, we want to highlight that these potentially ‘unrelatable’ average figures have a rather more serious day-to-day impact and lead us to make irrational decisions. Why?


WHEN YOU PUSH OUT THE AVERAGE, YOU PUSH OUT THE EXTREMES.


We ran an analysis on the data we ingest in our WEATHER ANALYTIX™ platform, notably the data from the European Space Agency’s Copernicus Hub.

A few standout examples:

Between 1950 and 2023:

  • London now experiences 10.4x more days in which temperatures exceed the level that triggers an ‘excessive heat warning’ (a level deemed dangerous by authorities).

  • Phoenix, Arizona: 4x more searing days.

  • Paris: 8.1x more searing days. (A sketch of how such multipliers can be computed follows below.)
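
For concreteness, here is a minimal sketch of how such a multiplier can be derived from a daily temperature record. The threshold value, the DataFrame layout, and the comparison periods are illustrative assumptions, not the actual WEATHER ANALYTIX™ pipeline.

```python
import pandas as pd

HEAT_WARNING_C = 32.0  # hypothetical 'excessive heat warning' threshold (degC)

def exceedance_multiplier(daily: pd.DataFrame,
                          early: tuple[int, int],
                          late: tuple[int, int]) -> float:
    """Ratio of threshold-exceeding days per year, late period vs early.

    `daily` is assumed to have a DatetimeIndex and a 'tmax' column of
    daily maximum temperatures in degrees Celsius.
    """
    def hot_days_per_year(lo: int, hi: int) -> float:
        years = daily[(daily.index.year >= lo) & (daily.index.year <= hi)]
        return (years["tmax"] > HEAT_WARNING_C).sum() / (hi - lo + 1)

    return hot_days_per_year(*late) / hot_days_per_year(*early)

# e.g. exceedance_multiplier(london_daily, early=(1950, 1979), late=(1994, 2023))
```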


HOW DOES WEATHER ANALYTIX™ MODEL THIS?


It’s critical that we account for this trend in our extreme event analysis, pushing up the previously unobserved extremes, or ‘tail’, of the distribution.


High-Level Summary / Layman’s Terms (for non-scientists)


We use a variety of modelling techniques, running thousands of simulations, to create a curve that represents the true distribution of the data.


This curve helps us understand how extreme values occur and gives us a range of possible outcomes. By using simulations and statistical techniques, we make the curve smoother and more accurate, which improves the probability estimates used for insurance purposes.


This ultimately means that the increasing frequency and severity of temperature events are accounted for in the values the model provides.

The Science — Detailed (Dr A. Dow, CTO, BirdsEyeView):

We employ a series of models to generate a cumulative distribution function (CDF) that better estimates the true distribution of the underlying: a non-parametric kernel density regression, and a parametric maximum likelihood fit for extreme values, defined as values exceeding the empirical 95th percentile.
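
To make this concrete, here is a minimal sketch of one way such a hybrid CDF can be constructed. The post names only ‘a parametric maximum likelihood fit for extreme values’; the Generalized Pareto Distribution used below is the standard peaks-over-threshold choice and is our assumption, not a confirmed detail of the production model.

```python
import numpy as np
from scipy import stats

def hybrid_cdf(data: np.ndarray):
    """CDF combining a KDE body with a GPD tail above the empirical 95th pct."""
    u = np.quantile(data, 0.95)              # tail threshold: empirical 95th pct
    body_kde = stats.gaussian_kde(data)      # non-parametric fit for the body
    exceedances = data[data > u] - u         # peaks over the threshold
    shape, _, scale = stats.genpareto.fit(exceedances, floc=0.0)  # MLE tail fit

    def cdf(x):
        x = np.atleast_1d(x).astype(float)
        out = np.empty_like(x)
        below = x <= u
        # Body: KDE mass up to x, rescaled so the two pieces meet at 0.95.
        kde_at_u = body_kde.integrate_box_1d(-np.inf, u)
        out[below] = [0.95 * body_kde.integrate_box_1d(-np.inf, xi) / kde_at_u
                      for xi in x[below]]
        # Tail: P(X <= x) = 0.95 + 0.05 * F_GPD(x - u)
        out[~below] = 0.95 + 0.05 * stats.genpareto.cdf(
            x[~below] - u, shape, loc=0.0, scale=scale)
        return out

    return cdf
```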

Using Monte Carlo bootstrap techniques, we not only simulate the distribution, but also get a feel for possible alternative scenarios, giving rise to thousands of points that ultimately allow for a more extreme tail as well as an uncertainty band around the distribution, as illustrated in the example below. Additionally, the CDF is much smoother than the empirical CDF given by the limited set of historical data, leading to better probability readings for the underwriter.
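
Under the same assumptions, the bootstrap step can be sketched as follows: resample the historical record with replacement, refit the hybrid CDF on each resample, and read off pointwise percentile bands. `hybrid_cdf` is the illustrative function from the previous sketch.

```python
import numpy as np

def bootstrap_cdf_band(data: np.ndarray, grid: np.ndarray,
                       n_boot: int = 1000, seed: int = 0):
    """Pointwise (lower, median, upper) CDF curves evaluated over `grid`."""
    rng = np.random.default_rng(seed)
    curves = np.empty((n_boot, grid.size))
    for i in range(n_boot):
        resample = rng.choice(data, size=data.size, replace=True)
        curves[i] = hybrid_cdf(resample)(grid)   # refit on each resample
    lower, median, upper = np.percentile(curves, [2.5, 50.0, 97.5], axis=0)
    return lower, median, upper
```

The median curve gives the smoother CDF described above, while the 2.5th/97.5th percentile curves form the uncertainty band around it.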


Et voilà — the extreme events are accounted for and modelled in the here and now.
