News – World Weather Attribution
https://www.worldweatherattribution.org
Exploring the contribution of climate change to extreme weather events

Climate change fuelled extreme weather in 2023; expect more records in 2024
https://www.worldweatherattribution.org/climate-change-fuelled-extreme-weather-in-2023-expect-more-records-in-2024/
Fri, 22 Dec 2023

After six consecutive months of record-breaking heat, 2023 is poised to become the hottest year on record. Fuelled by the heat, this year has seen unprecedented extreme weather events hitting every part of the world.

In the Horn of Africa, a three-year drought that caused food insecurity for millions was followed by massive floods that have killed more than 300 people. Both events show clear climate change fingerprints.

In July, deadly heatwaves brought extreme temperatures to large regions of Europe, North America and China for weeks on end – an event that would have been extremely rare or even impossible without human-caused warming.

Months later, more than 3,400 people lost their lives in Libya when three dams collapsed after prodigious downpours in September. While conflict-related insecurity and poor dam maintenance created the conditions for the disaster to occur, we found that climate change increased the intensity of the heavy rainfall by up to 50%. 

This year’s wildfire season in Canada has been the most extreme ever recorded. The fires have burnt more than 18 million hectares, shattering the previous record by more than 10 million hectares. Our study found that climate change made the hot, dry and windy conditions that drove the wildfires in Québec at least two times more likely.

These are just a few examples of the devastating events we have witnessed this year. There were many more. In 2023, World Weather Attribution reviewed more than 120 weather events with human impacts severe enough to warrant an attribution study.

We studied 14 of these events in detail: five heatwaves, five heavy rainfall events, three droughts and one wildfire. Of these, six were in Africa, four were in Europe, two were in North America, two were in South America, three were in Asia and one was in Australasia (some events covered more than one region). 

To advance global understanding of changing weather extremes, we aim for a balance of countries and continents studied. But we often face challenges. In June, when we investigated the heavy rainfall that led to devastating flooding and the deaths of nearly 600 people in Rwanda and the Democratic Republic of Congo, the result was ‘inconclusive.’

A lack of real-world weather data and the inability of climate models to accurately simulate the event meant we couldn’t quantify the influence of climate change on the rainfall. Our study highlighted the need for significant investment in weather monitoring stations and climate science to understand changing weather extremes in central Africa.

Extreme heat is set to continue in 2024. The combination of human-caused climate change and El Niño, a naturally occurring climate phenomenon, could see 2024 surpass this year’s record to become the hottest year yet measured.

At just 1.2°C of warming, millions of vulnerable people have experienced devastating extreme weather. With every fraction of a degree of warming from the burning of fossil fuels, heatwaves, fires, heavy rainfall and drought will become more intense and more likely. 

To minimise loss and damage, the world needs to be better prepared. Existing human vulnerabilities are what turn extreme weather events into humanitarian disasters.

Whether it is insufficient early warning before heavy rainfall, a lack of cooling green spaces in a city during a heatwave, or settlements built in flood-prone areas, in 2023, our analyses highlighted again and again that many impacts could have been prevented or minimised with better planning and more funding. This finding underscores the urgency of accelerating both adaptation and emissions reduction. 

Our vulnerability and exposure analyses also identify the people who are least able to protect themselves from the impacts of extreme weather. Overwhelmingly, these people are the poorest and most marginalised in societies – people experiencing homelessness or living in informal housing, people with disabilities or underlying health conditions, and outdoor workers. It is clear climate change is increasing inequality. 

Alongside moving away from fossil fuels and reducing emissions to net zero, reducing the vulnerability and exposure of these populations is critical to make our world a safer place. 

Twelve months – WWA without Geert Jan
https://www.worldweatherattribution.org/twelve-months-wwa-without-geert-jan/
Wed, 12 Oct 2022

The words in our obituary are still true, and the loss is as great. It feels greater, though, as we have been doing the science Geert Jan developed with us without him: without his dedication, without his passion, without his knowledge and without his friendship. But we have all the foundations he helped to build, and throughout this long, lonely year we published a lot he had still worked on. His last first-author article has been published, and several – but not yet all – of his many other manuscripts have been finalised and published. Eventually, we will finalise them all. The world in October 2022 very much needs his work.

World Weather Attribution exists because we felt very strongly that, in the face of the increasingly manifest effects of climate change, the scientific community could not stay silent when extreme events happened and the world asked questions about the role of climate change. On the climate hazard analysis side, “we” was particularly Geert Jan and Fredi, preceded by a range of collaborations in response to questions from the Red Cross Red Crescent, where Geert Jan had already been supporting Maarten and others to answer the very practical queries coming from around the world, and especially from highly vulnerable contexts – at a time when most scientists were very reluctant to do so. We felt science could not continue to deliver only general statements on extremes and keep saying that one cannot say anything about a single event. The voice of scientists would need to be based on evidence, and providing and interpreting that evidence is what World Weather Attribution does and why we exist. Getting the science right was Geert Jan’s maxim and is the essence of what we try to do. He would have been proud of what we have achieved in the year without him, but he would also have seen just how much he is missed.

It was a difficult year for the whole world.

The first study we had to publish without Geert Jan was on a devastating drought in Madagascar, which had not been made more intense by climate change; instead, extremely high levels of vulnerability led to a still-ongoing food security crisis in the area.

The second study we published without Geert Jan, and the first we had to do entirely without his insights, was on the same region of the world, which was battered by several tropical cyclones affecting mainly Madagascar, Malawi and Mozambique. Here climate change did play a role, but impacts were particularly high because the events compounded one another and diminished the resilience of a highly vulnerable part of the world.

In the third study we again found a combination of high vulnerability and climate change leading to devastating consequences of flooding in KwaZulu-Natal and the Eastern Cape, including the city of Durban.

A very different event was the subject of our fourth study, which focussed on the record-breaking, prolonged extreme heatwave in India and Pakistan. In this event, climate change not only played a role but was a major driver.

The fifth study we did without Geert Jan was again on heavy rainfall, but in Brazil, a part of the world we had not studied for many years. The findings showed a now familiar pattern: climate change exacerbating heavy rains, combined with high levels of vulnerability, leading to devastating impacts in the region.

The sixth study was again a heatwave, this time in the UK, a country that until recently did not think of itself as even at risk of extreme heat. Last week the official death toll of that summer in the UK was published, showing excess mortality during the heatwave we analysed in July and indicating that about 2227 people died from heat-related conditions or incidents. Our analysis showed that this heat would have been extremely unlikely to have happened without human-caused climate change.

The seventh study in the year since Geert Jan passed focused on South Asia again, this time on the extreme monsoon rainfall that devastated Pakistan just months after the record-breaking heat we had assessed in the fourth study listed above. Again we found very clear fingerprints of human-induced climate change when all lines of evidence were taken together, despite the statistical analysis being rather difficult. And again, vulnerable communities suffered exceptionally.

The eighth and last rapid analysis we did in the 12 months since October 12th a year ago was quite different, looking at the droughts, or more precisely the lack of water in the soils, that impacted large parts of the northern hemisphere, focussing particularly on Western Central Europe. Here we found, in contrast to what we saw in Madagascar, that climate change did indeed play a major role.

These studies represent only a small part of all the extreme events the world has experienced in these last 12 months. If Geert Jan were still alive, we would have been able to do more. But probably not that many more, and we think he would have approved of our choices, putting the limited people-power we have on extremes that matter in terms of human impacts, particularly in parts of the world where research gaps are still very large and our understanding of the impacts of climate change is still patchy. In every study we learned something new, and every study had different challenges, challenges we would have overcome better with him. When he had just died we were numb and focussed on keeping World Weather Attribution alive. We have achieved that: the Climate Explorer can still be used for the analyses, and our team has grown, partly collectively filling the huge gaps he left – his talent for immediately zooming in on the right exciting questions about a recent event, for quickly understanding data and seeing patterns across timescales, and his critical questioning of our various lines of evidence.

While we spent a lot of time and effort on rapid studies, we did not only do rapid studies; we also published peer-reviewed analyses, all of which have Geert Jan as an author or build on his work.

A study that shows the less obvious impacts of human-induced climate change, and that has now been accepted for publication after peer review, is the attribution of frost during the growing period in France in early April 2021. Here two effects apply: because of human-induced climate change, cold nights and frost in March and April become less frequent and less cold. But at the same time, the growing season starts earlier, because days and nights are warmer earlier in the year. The second effect is stronger, and thus young leaves are at higher risk of frost during the growing season.

Every heatwave that occurs today would have been less hot and less frequent without climate change. We would not need an attribution study to know this, but to know how much more frequent and intense an individual heatwave has become, we do need one. And often the answers are not as straightforward as they could be. In his last first-author paper, Geert Jan pointed out that the attribution of heatwaves is hard, but also that we could do better. This is a strong request to the scientific community to do more research on heatwaves, because heat waves are probably climate change’s deadliest manifestation, as we have seen in recent months and as the IFRC and UN OCHA also flagged this week.

Another type of extreme event that is often deadly is the tropical cyclone. Here the climate change signal is much less clear, at least in some parts of the world. A difficult case was a series of tropical cyclones that led to large-scale flooding in Vietnam in 2020, where we did not find a climate change signal. Of course, that does not mean that climate change is irrelevant for future tropical cyclones in that part of the world.

This is an example of an attribution study that makes communicating the findings in clear, useful messages challenging – a challenge Geert Jan was passionate about and very good at overcoming. Some of the lessons we learned with him were published as part of the winter 2021/22 collection of extreme event attribution studies in the Bulletin of the American Meteorological Society, highlighting again how important it is to be transparent in every step of a rapid attribution study.

We also published a review paper that summarised what we have learned from a decade of attributing extreme weather events. Geert Jan did not write this paper himself, but without him it would not exist: sixteen of the references were his papers, and probably all of them were inspired by his work. We also used this paper to write a guide for journalists on how to talk about extreme events and the role of climate change even when no attribution study exists yet.

We missed Geert Jan’s guidance a lot this year, and will continue to do so. Seeing how far we have come as WWA, and as a global community, in understanding extreme weather events in a changing climate shows how much his guidance and legacy are everywhere. We are proud to continue his legacy.

Geert Jan van Oldenborgh, 1961–2021
https://www.worldweatherattribution.org/geert-jan-van-oldenborgh-1961-2021/
Thu, 14 Oct 2021
Geert Jan van Oldenborgh, photo by Werry Crone

Geert Jan trained as a physicist and started working on climate in 1996, when he joined KNMI as a postdoc. In the 2000s, he created the “Climate Explorer”, a platform to analyse climate data. He did so single-handedly and with little funding; it remains one of the most useful tools for accessing and analysing climate data available to the world. His desire to share everything he made, and for science, data and tools to be open, advanced climate science and meant that results were more easily accessible for the general public.

Geert Jan realised we needed to answer not just the simple questions, or those with the most immediate scientific rewards, but the ones that mattered. He was deeply motivated to make his science valuable to society and especially to the most vulnerable, as reflected in some of the earliest analyses of changing extremes in Africa and in his pioneering work on rapid attribution of extreme events.

As the joint founder and leader of WWA, Geert Jan was central to its work to investigate and explain how climate change is influencing weather today. His work received growing scientific, political and social attention: this year alone he was recognised with a royal honour in the Netherlands, the European Meteorological Society’s Technology Achievement Award and Time Magazine’s listing of the world’s most influential people. Attribution science was identified as one of the “breakthrough technologies” of 2020 by the MIT Technology Review, and was one of the major advances reported in the IPCC’s Sixth Assessment Report.

Beyond his scientific achievements, Geert Jan was a passionate, generous and inspirational colleague, friend and role model to his collaborators at WWA and to many others. His honesty, kindness and morality shone through his life and work, leaving fellow scientists, students and friends mourning his loss but grateful for having known and worked with him. His legacy will be immense.

Reflections from some of us, Geert Jan’s WWA colleagues:

“He was one of the really great ones, but he didn’t have a big ego, he never quite believed just how good he was. He was not recognised enough. Geert Jan taught me so much, but the most important thing was to have fun in your work. And he was so much fun to work with.”

“Such a deep understanding of data – from observations and models – he would immediately have a story for each dataset, switching from broad insights about the global climate to specific events, and from quality of large reanalysis data sets to data gaps or instrument changes in a particular weather station.”

“Something that I’ve personally witnessed many times, from my first interaction with him as a master student at an EGU poster, to recent intense back-and-forth about the Australian bushfire paper: his moral compass was always pointing in the right direction and his ego was so down-to-Earth that you couldn’t tell the difference between his interacting with famous professors or new students in terms of the respect and attention he’d give the discussion. We thus not only lose an exceptional scientist, but also a role model for scientific and ethical integrity.”

“Everyone who had the privilege to learn from you came out the other end as a better scientist or student. You weren’t afraid to call a spade a spade, contributing massively to advancing the sciences in the best possible way, at a scale only few ever achieve. On a human level, you’ve been a beacon of hope even when the stupid cancer consumed much of your energy. Which didn’t keep you from being wildly productive. Your positivity kept you going. The same positivity that was so contagious to colleagues and students who were lucky enough to work with you.

“But what I personally most admired – apart from your general kindness – was how you dealt with your cancer treatment journey. You’ve been open about it from the start and pushed people to get over the awkwardness associated with disease and death in our society. You’ve been no less than a role model for a societal transition in that regard. I for one will follow your lead for the remainder of my life, and despite the numbness and devastation that we all feel right now, your unique legacy is going to live on forever. Lastly, there’s never a right time to go, but there is a sense of relief that you, Geert Jan, were put on the list of the 100 most influential people in 2021 while you were still with us. You have deserved no less than that! You will be sorely missed!”

Pathways and Pitfalls in extreme event attribution
https://www.worldweatherattribution.org/pathways-and-pitfalls-in-extreme-event-attribution/
Thu, 13 May 2021

Although extreme event attribution is a relatively new branch of science, developed over the last 15 years, many groups now compute these connections. Our approach in the World Weather Attribution (WWA) collaboration is different in two ways: we attempt to have our results ready as soon as possible after the event, and we try to respond as much as possible to questions posed by the outside world. Both choices aim to make the attribution studies more useful. Commissions investigating a disaster typically operate within a few months and appreciate it if the event being attributed is as close as possible to the aspects that caused the disaster; media interest that reaches a large audience wanes within weeks to months after large events.

We started doing event attributions at our first physical meeting, during a heat wave in Paris in the summer of 2015, and have learned a lot of lessons from the more than two dozen studies we have performed since then. These are summarised in a scientific paper published in April 2021, Pathways and pitfalls in extreme event attribution; this article is a summary of that paper. An even more detailed description of the methodology that resulted from these lessons has been published as A protocol for probabilistic extreme event attribution analyses. The main lesson was that the actual attribution step, on which most attention was focused in 2015, is only one step out of eight. We needed ways to deal with:

  1. the trigger: which studies to perform,
  2. the event definition: which aspects of the extreme event were most relevant,
  3. observational trend analysis: how rare was it and how has that changed,
  4. climate model evaluation: which models can represent the extreme,
  5. climate model analysis: what part of the change is due to climate change,
  6. hazard synthesis: combine the observational and model information,
  7. analysis of trends in vulnerability and exposure, and
  8. communication of the results.

We discuss each step in turn.

1. Analysis trigger

The Earth is large and extreme weather occurs somewhere almost every day. Which of these events merit an attribution study? In WWA we try to prioritise events that have a large impact on society, or that provoked a strong discussion in society, so that the answers will be useful to a large audience. These are often events for which the Red Cross issues international appeals. Sometimes smaller events closer to home, or even meteorological records that did not affect many people, also generate enough interest to justify the effort of obtaining a scientifically valid answer to the attribution question. We explicitly do not include the expected influence of climate change on the event in the trigger criteria: the result that an event was not affected by climate change, or even became less likely, is just as useful as one that the probability increased.

2. Event definition

Defining the event turned out to be both much harder and more important than we thought when we started attribution science. As an example, the first published extreme event attribution study analysed the extremely hot summer of 2003 in Europe (Stott et al., 2004). It took as its event definition a Europe-wide seasonally averaged temperature, whereas the impacts were tens of thousands of deaths in cities during the hottest 10-day period. A large-scale event definition like a continental and seasonal average has the advantage that climate models can represent it better, and the signal-to-noise ratio is usually better than for a local, short-timescale definition. However, it is not the event that caused the damage, and in WWA we try to relate the attribution question to the impacts, so we usually choose a definition of the event that corresponds as closely as possible to them.

However, we stick to meteorological or hydrological quantities, like temperature, wet bulb temperature, rainfall and river discharge, and do not consider actual impacts like the number of deaths or economic damage. The reason is that the translation from extreme weather into impacts is usually a complex and uncertain function of the weather. It also changes over time: the introduction of heat plans in Europe after the disastrous heat waves of 2003 and 2006 has reduced the number of deaths per degree of heat by a factor of two or three. Similarly, more houses in natural areas have increased the risk of wildfire damage, and more houses on the coast are now exposed to storm damage. We do not have the expertise to also take these changes into account and therefore restrict ourselves to indicators of heat, fire weather risk and wind in these examples.

A final consideration of the choice of event definition is that the quantity chosen has to have long and reliable observations and be represented by climate models.

In practice we use the following definitions. For heat waves, the local highest daily maximum temperature of the year is a standard measure that captures the health risk to outdoor labourers in, e.g., India. For Europe and North America, the maximum of the 3-day mean temperature appears to be more relevant, as the most vulnerable population is indoors. If humidity plays a role, the wet bulb temperature can be used instead. For cold waves we just use temperature, as wind chill is hard to measure. Local daily maximum precipitation is usually relevant for flash floods; for larger floods we either average over a river basin or use hydrological models to compute river discharge. Drought has many definitions, from lack of rain to water shortage, and great care has to be taken to choose the most relevant one. For wind, the highest 10-minute or hourly wind speed is chosen, as these are closest to the impacts.
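To make these definitions concrete, here is a minimal sketch in Python of computing two of the event indices named above: the annual maximum of the 3-day mean temperature and the annual maximum of daily precipitation. This is not WWA code, and the synthetic daily series are placeholders for real station data.

```python
# Minimal sketch of two event definitions; the synthetic series stand
# in for real daily station observations.
import numpy as np
import pandas as pd

dates = pd.date_range("1950-01-01", "2020-12-31", freq="D")
rng = np.random.default_rng(0)
t = pd.Series(15 + 10 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
              + rng.normal(0, 3, dates.size), index=dates)    # daily mean T (deg C)
pr = pd.Series(rng.gamma(0.5, 4.0, dates.size), index=dates)  # daily precip (mm)

# Heatwave index: highest 3-day mean temperature of each year
tx3day = t.rolling(3).mean().groupby(dates.year).max()
# Flash-flood index: wettest day of each year
rx1day = pr.groupby(dates.year).max()
print(tx3day.tail(3), rx1day.tail(3), sep="\n")
```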

It should be noted that, in practice, finding out what really happened is not easy, as estimates of the same variable from different observing systems can differ substantially. An example is given in Fig. 1, which shows the very different estimates of the highest 3-day averaged precipitation around Houston during Hurricane Harvey from different observing systems. We used the GHCN-D station data.

Figure 1: Observed maximum three-day averaged rainfall over coastal Texas, January–September 2017 (mm/day). a) GHCN-D v2 rain gauges, b) CPC 25 km analysis, c) NOAA calibrated radar (maximum in August 25–30), d) NASA GPM/IMERG satellite analysis. From Van Oldenborgh et al. (2017).

3. Observational trend analysis

In WWA, we consider an analysis of the observations an essential part of an extreme event attribution. The observational dataset should go back at least to the 1950s but preferably to the 19th century and be as homogeneous as possible. To choose the most representative observation we usually collaborate with local experts, who know which time series are most reliable and least influenced by other factors than climate change, e.g., station changes, irrigation or urban heat.

The observational analysis gives two pieces of information: how rare the event is in the current climate and how much this has changed over the period with observations.

The probability in the current climate is very important for informing policy makers whether this is the kind of event one should be able to handle or not. As an example, the floods that paralysed Jakarta in January 2014 turned out to be caused by a precipitation event with a return time of only 4 to 13 years, pointing to a very high vulnerability to flooding (which is well known). Conversely, the floods in Chennai in December 2015 were caused by rainfall with an estimated return period of 600 to 2500 years, which implies that the event was too rare to be worth defending against.

Climate change is by now so strong that many observed time series of extreme events show clear trends. An efficient way to quantify the changes is to fit the data to an extreme value distribution, which theoretically describes either block maxima, like the hottest day of the year, or exceedances over a threshold, like the 20% driest years. We describe the effects of climate change by either shifting or scaling the distribution with the 4-yr smoothed global mean surface temperature (GMST). This quantity is proportional to the anthropogenic forcing, and estimates are available in real time. The smoothing serves to remove influences of El Niño and other weather variability that are unrelated to the long-term trends. The assumptions in these fits – constant variability for temperature, constant variability over the mean for precipitation and other positive-definite quantities – can be checked in the observations themselves to some extent, and more fully in the climate model output.
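As an illustration of the shift fit just described, here is a minimal sketch of a maximum-likelihood fit of a GEV whose location parameter shifts linearly with the smoothed GMST. This is not the Climate Explorer code, and the synthetic annual maxima and GMST series are placeholders for real data; a scaling fit for precipitation would instead multiply the location and scale parameters by a common GMST-dependent factor.

```python
# Minimal sketch: GEV with a location parameter that shifts with
# smoothed GMST, fitted by maximum likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

def nll(params, x, gmst):
    mu0, alpha, sigma, xi = params        # alpha: shift per deg C of GMST
    if sigma <= 0:
        return np.inf
    mu = mu0 + alpha * gmst
    # scipy's shape c is minus the climatological shape parameter xi
    return -genextreme.logpdf(x, c=-xi, loc=mu, scale=sigma).sum()

rng = np.random.default_rng(1)
years = np.arange(1901, 2019)
gmst = 0.009 * (years - years[0])                     # toy smoothed GMST (deg C)
x = 30 + 2.0 * gmst + rng.gumbel(0, 1.2, years.size)  # toy annual maxima

fit = minimize(nll, x0=[30.0, 1.0, 1.0, -0.1], args=(x, gmst),
               method="Nelder-Mead")
mu0, alpha, sigma, xi = fit.x
print(f"fitted shift: {alpha:.2f} deg C per deg C of smoothed GMST")
```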

In practice we find very clear trends in heat waves, although these are also influenced strongly by non-climatic factors such as land use changes, irrigation and air pollution. Cold waves also show significant trends by now, although due to the greater variability of winter weather the signal-to-noise ratio is not as good as for heat waves. Daily or sub-daily precipitation extremes also often show clear trends; longer-duration extremes are more mixed. Drought trends are very difficult to see in observations, because droughts are long-term phenomena, so there are not many independent samples. Drought is also usually only a problem when the anomaly is large relative to the mean, which usually implies that it is also large relative to the trend, so the signal-to-noise ratio is poor. Of all our drought studies, only one showed a borderline significant trend in precipitation.

Fig. 2 shows the trend in a heat extreme (the highest daily mean temperature at De Bilt, the Netherlands), which shows a clear trend by eye already, and 3-day precipitation extremes along the US Gulf Coast, for which the GEV fit shows a significant trend under the assumption that all intensities increase by the same percentage.

Figure 2: a,c) Highest daily mean temperature of the year at De Bilt, the Netherlands (homogenised) for the period 1901–2018, fitted to a GEV that shifts with the 4-yr smoothed GMST: a) as a function of GMST and c) in the climates of 1900 and 2018. b,d) The same for the highest 3-day averaged precipitation along the US Gulf Coast for 13 stations with at least 80 years of data and at least 2º apart, fitted to a GEV that scales with the 4-yr smoothed GMST. From climexp.knmi.nl; (b,d) also from Van Oldenborgh et al. (2017).

4. Climate model evaluation

Observations alone cannot show what caused the trend. In order to attribute the observed trend to global warming (or not), we have to use climate models. These are similar to the weather models we use to forecast the weather over the next days to weeks, but instead of predicting the specific weather of the next few days, they predict its statistics: how often extreme events occur in the simulated weather. However, we can only use the climate model output if these extremes are realistically simulated. In practice we use the following three criteria to select an ensemble of climate models.

  • Can the model represent the extreme in principle?
  • Is the statistical distribution of extreme events in the climate model compatible with the observed one, allowing for a bias correction?
  • Is the meteorology leading to these extremes in the model similar to the observations?

The first criterion usually concerns the resolution and included mechanisms. A relatively coarse resolution model with a 200 km grid may well be able to represent a heat wave, but for somewhat realistic tropical cyclones we need a resolution better than 50 km. If we study heat in an area where irrigation is important we would like the model to include that cooling influence on extreme temperatures.

For the second criterion we fit the tail of the distribution to the same extreme value function as the observations and check whether the variability and the shape of the tail are compatible. To verify that the agreement is not there for the wrong reasons, we try to check the meteorology behind the extremes. This includes in any case the seasonal cycle and spatial patterns, but where relevant it may also concern El Niño teleconnections or the source of precipitation. We even found that many climate models have an unphysical limit on high precipitation amounts (Fig. 3); these models cannot be used for attributing (or projecting) these kinds of events.
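A minimal sketch of this second criterion, under the assumption that a location bias is acceptable (it can be corrected) while the fitted scale and shape of the tail must agree with the observations within their uncertainty; the toy data are placeholders for real annual maxima.

```python
# Minimal sketch: compare GEV scale and shape between observed and
# modelled annual maxima, allowing a location (bias) offset.
import numpy as np
from scipy.stats import genextreme

def scale_shape(x):
    c, loc, scale = genextreme.fit(x)
    return scale, -c                      # (sigma, xi) in the usual convention

def bootstrap_ci(x, n=500, seed=0):
    rng = np.random.default_rng(seed)
    fits = np.array([scale_shape(rng.choice(x, size=x.size))
                     for _ in range(n)])
    return np.quantile(fits, [0.025, 0.975], axis=0)

rng = np.random.default_rng(2)
obs = rng.gumbel(30.0, 1.2, 70)           # 70 years of observed maxima
model = rng.gumbel(28.5, 1.1, 500)        # model maxima; location bias allowed

lo, hi = bootstrap_ci(obs)
print("obs 95% range for (scale, shape):", lo, hi)
print("model (scale, shape):", scale_shape(model))
```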

Figure 3: Return time plots of extreme rainfall in Chennai, India, in two CMIP5 climate models with ten ensemble members (CSIRO-Mk3.6.0 and CNRM-CM5) and an attribution model (HadGEM3-A N216) showing an unphysical cut-off in precipitation extremes. The horizontal line represents the city-wide average in Chennai on 1 December 2015 (van Oldenborgh et al., 2016)

As climate models are imperfect representations of reality, we require at least two, and preferably more, models to be good enough for the attribution analysis. The spread of these models gives an indication of the model uncertainty.

5. Climate model analysis

The next step is the original attribution step. For each model we compute how much more likely or intense the extreme event has become due to anthropogenic emissions of greenhouse gases and aerosols. This can be done in one of two ways. The original proposal was to simulate the world twice: once for the current climate and once for a counterfactual climate that is the same as the current one but without anthropogenic modifications of the climate. For each climate we perform many simulations of the weather and count or fit the number of extremes. The difference between the two gives how much more (or less) likely the extremes have become.

The alternative is to take simulations of the historical climate, usually extended with a few years of the climate under one of the climate scenarios up to the current year (these are very close together up to 2040 or so). These transient simulations can then be analysed exactly the same way as the observations. This assumes that the influence of natural forcings—variations in solar radiation and volcanic eruptions—is small compared to the anthropogenic ones, which is usually the case.

As climate models usually have biases, we define the event by its return period and not by its amplitude. So if the observational analysis gives a return period of 100 yr, we also count or fit 100-yr events in the models. This turns out to give a better correspondence to the observations than specifying the amplitude and explicitly performing a bias correction when the extreme value distribution has an upper bound, as usually occurs for heat extremes, or a very thick tail, which we find for precipitation extremes.

For each model the attribution step gives a change in probability for the extreme to occur due to anthropogenic climate change, or equivalently the change in intensity for a given intensity.
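Putting this together, here is a minimal sketch of the attribution step for the two-ensemble method, using a simplified counting approach (real analyses usually fit extreme value distributions rather than count, especially for rare events); the toy ensembles are placeholders.

```python
# Minimal sketch: define the event by its return period, find the
# corresponding threshold in the factual ensemble, then compare
# exceedance probabilities between factual and counterfactual climates.
import numpy as np

def probability_ratio(factual, counterfactual, return_period=100.0):
    # 1-in-T-year event in the model's own (possibly biased) climate
    threshold = np.quantile(factual, 1.0 - 1.0 / return_period)
    p1 = np.mean(factual >= threshold)         # ~1/T by construction
    p0 = np.mean(counterfactual >= threshold)  # without human influence
    return np.inf if p0 == 0 else p1 / p0

rng = np.random.default_rng(3)
factual = rng.gumbel(31.0, 1.2, 20000)         # toy annual maxima (deg C)
counterfactual = rng.gumbel(30.0, 1.2, 20000)  # same world, pre-warming
print(f"probability ratio: {probability_ratio(factual, counterfactual):.1f}")
```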

6. Hazard synthesis

The next step is to combine the information from the observations and multiple models into a statement of how the probability and intensity of the physical extreme event have changed. We use the term ‘hazard’ for this, as the total ‘risk’ also includes how much exposure there is to the extreme and how vulnerable the exposed people or systems are, which is addressed in the next step.

How to best combine this information is still an area of active research. Our current method is to combine all observational estimates under the assumption that they are highly correlated, as they are based on the same observations of the same sequence of weather. The model estimates are combined under the assumption that they are not correlated, as the weather in each model is different. However, the model spread can be larger than expected on the basis of weather variability alone, in which case we add a model uncertainty term to the average. Finally we combine observations and models. If they agree this can be a weighted average, but if they disagree we either have to increase the uncertainty or even give up on an attribution altogether.
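One plausible minimal sketch of such a combination, for the model part only: an inverse-variance weighted mean of the log probability ratios, with the uncertainty inflated when the model spread is larger than the individual error bars can explain. This is a simplification for illustration, not the full published WWA synthesis method, which also folds in the (correlated) observational estimates.

```python
# Minimal sketch: combine model estimates of log(PR) with precision
# weighting; inflate the variance if the models disagree more than
# their stated uncertainties allow.
import numpy as np

def combine(log_pr, sigma):
    w = 1.0 / sigma**2
    mean = np.sum(w * log_pr) / np.sum(w)
    var = 1.0 / np.sum(w)
    # reduced chi-squared: is the spread beyond the error bars?
    chi2 = np.sum(w * (log_pr - mean) ** 2) / (log_pr.size - 1)
    if chi2 > 1.0:
        var *= chi2                        # add a model-uncertainty term
    return mean, np.sqrt(var)

log_pr = np.log(np.array([1.8, 2.6, 3.2]))  # toy PRs from three models
sigma = np.array([0.30, 0.25, 0.40])        # 1-sigma errors in log space
m, s = combine(log_pr, sigma)
print(f"combined PR: {np.exp(m):.1f} "
      f"(1-sigma range {np.exp(m - s):.1f} to {np.exp(m + s):.1f})")
```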

Figure 4: Synthesis plots of a) the probability ratio (PR) for changes in wind intensity over the region of storm Friederike on 18 January 2018 (Vautard et al., 2019), b) the PR for the highest 3-day averaged daily mean temperature of the year at De Bilt, the Netherlands (Vautard et al, 2020) and c) the PR for extreme 3-day averaged precipitation in April–June averaged over the Seine basin (Philip et al., 2018a). Observations are shown in blue, models in red and the average in purple. An additional model uncertainty term is added as black outline boxes.

An example of the latter is our study of the winter storms that hit Europe in January 2018 (Fig. 4a). The climate models compute a small increase in probability due to the increased temperature, but the observations show a large decrease. We think the latter is caused by the increased roughness of the land surface due to more buildings, trees and even wind turbines. This is not included in the climate models, so they cannot be expected to give a reliable projection of the intensity of future storms.

A less severe discrepancy is apparent in heatwaves in northwestern Europe, Fig. 4b. The models simulate a much lower trend than the observations. This means we can only give lower bounds on the changes in probability and intensity due to human induced climate change. The same holds for southeastern Australia.

In other studies observations and models agree well and we can give an accurate attribution statement based on the combination of all information, e.g. for the rainfall in the Seine basin in 2016 (Fig. 4c).

7. Vulnerability and exposure

A disaster happens due to a combination of three factors:

  1. the hydrometeorological hazard: process or phenomenon of atmospheric, hydrological or oceanographic nature that may cause loss of life, injury or other health impacts, property damage, loss of livelihoods and services, social and economic disruption, or environmental damage,
  2. the exposure: people, property, systems, or other elements present in hazard zones that are thereby subject to potential losses, and
  3. the vulnerability: the characteristics and circumstances of a community, system or asset that make it susceptible to the damaging effects of a hazard.

We consider it essential to also discuss the vulnerability and exposure in an attribution study. Not only do these combine with the changes in the physical extremes that we have computed in the previous step to determine the impact of the extreme weather, but they may have significant trends themselves.

As an example: we found that the drought in São Paulo, Brazil in 2014–2015 was not made worse by climate change. Instead, the analysis showed that the roughly 20% growth of the city’s population in 20 years, and the even faster increase in per capita water usage, had not been matched by commensurate updates to the storage and supply systems. In this case, the trends in vulnerability and exposure were the main driver of the significant water shortages in the city.

Even though this section often has to be qualitative due to a lack of standardised data and literature describing trends in these factors, we think it is vitally important to put the extreme weather into the perspective of how it impacts society or other systems. Changing the exposure or vulnerability is also often the way to protect against similar impacts in the near future, as stopping climate change is a long-term project and we expect generally stronger impacts over the next decades. For instance, the number of casualties of heat waves in Europe has decreased by a factor of two or three since heat plans were developed in response to the 2003 and 2006 heat waves. Similar approaches are now being taken throughout the world. The inclusion of vulnerability and exposure is thus vital to making the analysis relevant for climate adaptation planning.

8. Communication

Finally, the results of the attribution study have to be communicated effectively to a range of audiences. We found that the key audiences interested in our attribution results are best stratified according to their level of expertise: scientists, policy-makers and emergency management agencies, media outlets, and the general public.

For the scientific community, a focus on full reproducibility is key. We always publish a scientific report that documents the attribution study in sufficient detail for another scientist to be able to reproduce or replicate the study’s results. If there are novel elements to the analysis, the paper should also undergo peer review. We have found that this documentation is also essential for ensuring consistency within the team on numbers and conclusions.

We have found it very useful to summarise the main findings and graphs of the attribution study into a two-page scientific summary aimed at science-literate audiences who prefer a snapshot of the results. Such audiences include communication experts associated with the study, science journalists and other scientists seeking a brief summary.

For policy-makers, humanitarian aid workers and other non-scientific professional audiences, we found that the most effective way to communicate attribution findings in written form is through briefing notes that summarise the most salient points of the physical science analysis and the specific vulnerability and exposure context. This audience often requires the information to be available relatively quickly after the event.

Finally, if there is demand from the media, a press release with the main points and quotes from the study leads and local experts is prepared. In addition to the physical science findings, these press releases typically provide a very brief, objective description of the non-physical-science factors that contributed to the event. In developing this press piece, study authors need to be as unbiased as possible, for instance not emphasising lower bounds as conservative results, because in practice this may lead to interpretations that underestimate the influence of climate change. A press release is also an effective way to reach the other target audiences.

Conclusions

Using the procedure outlined above, based on lessons learned during five years of attribution studies, we found that we could often derive a consistent message from imperfect observations and model simulations. We used this to inform key audiences with a solid scientific result, in many cases quite quickly after the event, when interest is often highest.

However, we also found many cases where the quality of the available observations or models was just not good enough to be able to make a statement on the influence of climate change on the event under study. This also points to scientific questions on the reliability of projections for these events and the need for model improvements.

Over these years, we found that when we can give answers, these are useful for informing risk reduction for future extremes after an event, and in the case of strong results also to raise awareness about the rising risks in a changing climate and thus the relevance of reducing greenhouse gas emissions. Most importantly, the results are relevant simply because the question is often asked — and if it is not answered scientifically, it will be answered unscientifically.

]]>
A limited role for unforced internal variability in 20th century warming
https://www.worldweatherattribution.org/a-limited-role-for-unforced-internal-variability-in-20th-century-warming/
Mon, 20 May 2019

While the scientific community overwhelmingly agrees that human activities are responsible for the observed increase in temperatures over the last half-century, the relative influences of natural drivers of climate change, like volcanic eruptions, ocean cycles and the sun, on warmer and cooler phases superimposed on the long-term warming trend are still an area of active research. The new study, led by Oxford University’s Karsten Haustein and colleagues from around the world, concludes that so-called internal variability due to slow-acting ocean cycles is not necessary to explain the changes in the historical temperature record.

Rather, the team concludes that human factors like greenhouse gas emissions and particulate pollution, volcanic eruptions, and changes in solar activity (collectively known as external forcings), along with year-to-year ups and downs related to the El Niño–Southern Oscillation phenomenon, are sufficient to explain virtually all of the long-term change in the temperature record. In the course of this work the team also found that the near-term sensitivity of the Earth’s climate to influences like greenhouse gas emissions is consistent with previous estimates from the Intergovernmental Panel on Climate Change, and provided evidence for unresolved biases both in parts of the ocean temperature record and in the procedure used to compare climate models to the instrumental record.

The team also provides compelling evidence that the warming during the so-called Early Warming period between 1915 and 1945 might in fact be caused entirely by external factors. Until now, half of the observed warming during that time has been attributed to internal ocean variability, which is a key reason why the estimate of the human-induced warming fraction has been so uncertain in the past. Together with a revised index that describes North Atlantic Ocean variability more realistically, this study helps to resolve some of the most puzzling questions in climate science.

In detail

A new study published online in the Journal of Climate provides strong evidence that virtually all of the observed changes in global mean temperature over the last 170 years are caused by external drivers, leaving little room for an (unforced) internal ocean contribution. The team of researchers found that alleged ocean cycles on timescales of 60–70 years are unlikely to be a factor in the observed evolution of the global mean surface temperature (GMST) since 1850. Instead, external factors such as periods of strong volcanic activity and the release of anthropogenic aerosol particles (air pollution) have caused the temperature (including the ocean surface temperature) to fluctuate. While not an entirely novel suggestion (e.g. Mann et al., 2014), the new study provides strong evidence that virtually all of the observed variability is externally forced.

Most prominently, the so-called Early Warming period between 1915 and 1945 can now be explained by external influences. Until now, only half of the observed warming during this period was attributed to external factors, with the remaining half attributed to unforced natural variability (e.g. Hegerl et al., 2018). Since this very period fits neatly into the traditional concept of a quasi-oscillatory 60–70-year ocean cycle (with colder periods during 1880–1910 and 1950–1980, and warmer periods in between), the new findings robustly challenge this prevailing view. This is particularly true as the study’s results reveal that the conclusions hold when compared with paleo-climate data for the pre-instrumental period 1500–1850.

Figure caption: Global (upper panel) and Northern Hemisphere (lower panel) response model result (green bold line) plotted against three observational datasets for the 1875–2017 period: Cowtan/Way land temperature data combined with HadISST2 sea surface temperature data over ocean (grey), standard Cowtan/Way, which uses HadSST3 sea surface temperature data over ocean (yellow), and Berkeley Earth global temperature data (black). The anomalies are expressed relative to 1850–1879. El Niño–Southern Oscillation index variability is added onto the response model time series.

Importantly, the team does not dispute short-term variability associated with the El Niño–Southern Oscillation (ENSO), a well-understood climate mode known to cause strong interannual changes in GMST. In fact, the team demonstrates that incorporating ENSO variability into their model allows it to explain 93% of the observed temperature variability. The simple model used in the study takes fast and slow climate feedback processes from natural and anthropogenic radiative forcing perturbations into account, allowing the researchers to estimate the resulting temperature response with high accuracy. By virtue of a more precise description of the anthropogenic aerosol feedback processes, as well as the removal of known biases in the instrumentally observed sea surface temperature (SST) record, such as a warm bias during World War II, mismatches between model and observational data disappear.
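To give a feel for the kind of model described, here is a minimal sketch of a generic two-timescale impulse-response model: GMST computed as the convolution of radiative forcing with a fast and a slow exponential response. The structure is standard in the literature, but the parameter values below are illustrative placeholders, not those fitted in the study, and the ENSO regression term the paper adds on top is omitted.

```python
# Minimal sketch of a two-timescale impulse-response model: GMST as the
# convolution of radiative forcing with fast and slow exponentials.
import numpy as np

def gmst_response(forcing, dt=1.0, q=(0.4, 0.4), tau=(4.0, 250.0)):
    # q: sensitivities (K per W m^-2), tau: response timescales (years);
    # illustrative values only
    t = np.arange(forcing.size) * dt
    T = np.zeros_like(forcing)
    for qi, taui in zip(q, tau):
        kernel = (qi / taui) * np.exp(-t / taui) * dt
        T += np.convolve(forcing, kernel)[: forcing.size]
    return T

forcing = np.linspace(0.0, 2.7, 170)   # toy forcing ramp 1850-2019 (W m^-2)
print(f"modelled 2019 anomaly: {gmst_response(forcing)[-1]:.2f} K")
```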

Asked whether the findings will have notable repercussions for our views of climate change, Karsten Haustein, the lead author of the new study, said that “the current estimate of the human contribution to warming (~100% human-induced) is all but confirmed, yet the confidence in this estimate is considerably enhanced.” He added: “we now have to worry less about the large uncertainty associated with unforced natural variability, as there never was a substantial contribution on sub-centennial timescales to start with.” The same is valid for the estimate of climate sensitivity: it remains unchanged, but it is now underpinned by more robust evidence. As far as the alleged ocean cycles in general, and multidecadal Atlantic Ocean variability (also known as the AMV index) in particular, are concerned, the results published by the team suggest that the ocean responds to external changes rather than the other way around. The North Atlantic may amplify this signal; it cannot, however, drive hemispheric temperatures. To describe North Atlantic Ocean variability more realistically, a revised AMV index – called the North Atlantic Variability Index – has been introduced.

To conclude: while the climate system continues to be influenced by interannual and presumably multi-annual internal variability, the idea that the oceans have been magically driving the climate in a colder or warmer direction for multiple decades in the past, and will continue to do so in the future, is unlikely to be correct. Most complex global climate models strongly support the hypothesis that the oceans have only a limited ability to alter global temperatures on multidecadal timescales. Hence this study provides a very useful constraint on the simulated internal variability in climate models.

References

Mann, M.E., B.A. Steinman, and S.K. Miller (2014). On forced temperature changes, internal variability, and the AMO. Geophysical Research Letters, 41: 3211-3219. doi: 10.1002/2014GL059233

Hegerl, G.C., S. Brönnimann, A. Schurer, and T. Cowan (2018). The early 20th century warming: Anomalies, causes, and consequences. WIREs Climate Change, 9: e522. doi: 10.1002/wcc.522

Devastating rains in Kenya, 2018
https://www.worldweatherattribution.org/devastating-rains-in-kenya/
Mon, 25 Jun 2018

The World Weather Attribution team have begun analysing the unusually intense rainfall in Kenya to determine whether climate change played a role.

The new study centres on an area of the country where rainfall has been unusually heavy and for which good meteorological data are available, but it may also offer insights into conditions in the wider region.

A year ago we released a detailed study of the Kenyan drought, which found no detectable influence of climate change on rainfall, although it could not exclude small changes in the risk of poor rains linked to climate change.

The current study will be the first of its kind into intense rainfall in East Africa.

Trends in weather extremes
https://www.worldweatherattribution.org/trends-in-weather-extremes-february-2018/
Wed, 28 Feb 2018

Studying trends in weather extremes is an essential step in the extreme event attribution procedure we use at World Weather Attribution. Over the years, we have collected a fair number of results in these analyses and other articles. In this article, I take a step back and consider global long-term meteorological station data for hot, cold and wet extremes, and share some thoughts on tropical cyclones and droughts. I make no claims of completeness; there is a lot of literature on this that I do not know.

The obvious first-order hypothesis is that warm extremes are getting warmer and cold extremes less cold. Severe precipitation tends to increase due to the higher moisture content of warmer air. Sea level rise simply heightens storm surges. Other extremes do not have such obvious first-order trends.

The figures are based on GHCN-D station data from NOAA/NCEI and can easily be reproduced in the KNMI Climate Explorer. The stations have at least 50 years of data and a minimum separation of 0.5º. The daily average temperature was chosen, defined as the average of the maximum and minimum temperatures. This quantity is less sensitive to changes in observing practices and surroundings than either minimum or maximum temperatures alone; e.g., decreased ventilation due to growing trees affects minimum and maximum temperatures with opposite signs. There are no obvious urban/rural contrasts in the maps, so they mainly reflect large-scale trends.

Note that we compute trends as a function of the smoothed global average temperature, rather than simply time, since our first-order hypotheses are related to warming.
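A minimal sketch of this trend definition, with toy data standing in for a real GHCN-D series and the observed smoothed GMST: regressing the station extreme on smoothed global mean temperature makes the slope directly the amplification factor mapped in the figures below.

```python
# Minimal sketch: trend of a heat extreme expressed as a multiple of
# global mean warming, i.e. a regression on smoothed GMST, not on time.
import numpy as np

years = np.arange(1950, 2021)
rng = np.random.default_rng(4)
gmst = 0.012 * (years - 1950) + rng.normal(0, 0.05, years.size)
gmst_smooth = np.convolve(gmst, np.ones(4) / 4, mode="same")   # ~4-yr mean
txx = 32 + 2.5 * gmst_smooth + rng.normal(0, 1.0, years.size)  # toy extreme

slope, _ = np.polyfit(gmst_smooth, txx, 1)
print(f"hottest day of the year warms {slope:.1f}x the global mean rate")
```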

Heat extremes

In this analysis a heat extreme is simply defined as the highest daily average temperature of the year. Our trend analysis shows that almost everywhere these heat extremes are now warmer than a century ago, following the obvious first-order connection with the global average temperature. We encountered two exceptions in our work: in the eastern U.S., heat extremes are now roughly as warm as they were during the Dust Bowl of the 1930s, when the severe drought heightened the temperature of hot days. In India, where there has generally been no trend since the 1970s, we find that increasing air pollution and irrigation counteracted the warming trend due to greenhouse gases over that period (van Oldenborgh et al., 2018). In the rest of the world the increases are typically large, with the temperature of the hottest day of the year rising much faster than the global mean temperature in most regions (values above one in Figure 1).

Figure 1. Trend in the temperature of the warmest day of the year as a multiple of the global mean temperature rise. Source: NOAA/NCEI/GHCN-D stations with at least 50 years of data via the KNMI Climate Explorer.

The map of Figure 1 also shows the sparsity of stations with long daily temperature records that are publicly available in the tropics. This is mainly a result of the data being sold commercially. However, in this region, temperature extremes are also often not acknowledged, although there is evidence that they pose a major health hazard (see Gerland et al, 2015 for Africa).

Cold extremes

For cold extremes, the daily average temperature of the coldest day of the year is considered. Figure 2 shows that these cold events warm even faster than the heat extremes, by up to five times the global mean temperature rise (see e.g. van Oldenborgh et al., 2015 and the WWA cold wave analyses). The strongest increases over land are in Siberia and Canada. Winter temperatures are very low there due to radiative cooling over snow under a clear sky, with strong vertical gradients in the lowest metres of the atmosphere. These stable boundary layers are sensitive to perturbations, probably also to the extra downward longwave radiation due to greenhouse warming. This may explain the strength of the observed trends. Further south, the cold air coming from the north is simply less cold, also due to the well-understood Arctic amplification over the Arctic Ocean. Note that current climate models do not have the resolution to simulate this properly and underestimate the trends.

Figure 2. Trend in the temperature of the coldest day of the year as a multiple of the global mean temperature rise. Source: NOAA/NCEI/GHCN-D stations with at least 50 years of data via the KNMI Climate Explorer.

Precipitation extremes

As a measure of extreme precipitation, we take the highest daily precipitation of the year. This is a measure relevant for local flooding. For flood events over larger basins, the time over which to compute the total amount should be longer, while flash floods may be caused by shorter events, making hourly totals more relevant. The daily total is a useful measure in the middle of the relevant temporal range; similar maps can be made for longer time scales via the Climate Explorer. We computed trends after transforming precipitation amounts by taking their logarithm, a transformation that makes the variable more like a normal distribution, so that the trend is mathematically better defined. It also ensures that the precipitation remains positive. For the trends shown here, trends in the logarithm are almost the same as relative trends in the precipitation itself.
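The reason trends in the logarithm can be read as relative trends is elementary; writing T_g for the smoothed global mean temperature,

```latex
\frac{\mathrm{d}\,\ln P}{\mathrm{d}T_g}
  = \frac{1}{P}\,\frac{\mathrm{d}P}{\mathrm{d}T_g},
\qquad\text{so}\qquad
\Delta \ln P \approx \frac{\Delta P}{P} .
```

A fitted slope of 0.07 in ln P per degree of global warming therefore corresponds to a relative increase of about 7% per degree, directly comparable to the Clausius–Clapeyron rate discussed below.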

Figure 3. Trend in the logarithm of the wettest day of the year as a multiple of the global mean temperature rise. Isolated stations with strong trends can be due to random weather (one big event in a short series) or to factors that do not reflect reality, such as coding errors and changes in observing practices. Source: NOAA/NCEI/GHCN-D stations with at least 50 years of data via the KNMI Climate Explorer.

Note that for precipitation, too, we compute trends as a function of global average temperature rather than as a function of time. Figure 3 shows that the wettest day of the year has increased at more stations than it has decreased, as already found by Westra et al (2013). The average increase is similar to the increase in the amount of water the atmosphere can hold at higher temperatures (the Clausius-Clapeyron relation), about 7% per degree Celsius. However, there is a large spread around this average. A large part of the spread is random weather: even in series longer than 50 years, the variability is large compared to the trends. In some areas there are systematic deviations from Clausius-Clapeyron due to other effects of climate change; some examples from my own work follow. The trend in Colorado is lower than Clausius-Clapeyron, probably due to higher air pressure during the season with most extremes (Eden et al, 2016). Drying trends also suppress extreme precipitation, such as in summer in the Mediterranean region, although in autumn extremes increase strongly at one mountain range there (Vautard et al, 2015). Some winter extremes in northwestern Europe increase more strongly than Clausius-Clapeyron due to an increase in zonal circulation types (e.g., van Haren et al, 2013; Schaller et al, 2016), but others do not (e.g., Otto et al, 2018).
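
As a rough consistency check (a back-of-envelope evaluation, not part of the original analysis), the Clausius-Clapeyron rate follows from the saturation vapour pressure relation evaluated at a typical surface temperature of about 288 K:

$$
\frac{1}{e_s}\frac{\mathrm{d}e_s}{\mathrm{d}T} \;=\; \frac{L_v}{R_v T^2}
\;\approx\; \frac{2.5\times 10^{6}\ \mathrm{J\,kg^{-1}}}{461\ \mathrm{J\,kg^{-1}\,K^{-1}} \times (288\ \mathrm{K})^2}
\;\approx\; 0.065\ \mathrm{K^{-1}},
$$

i.e. about 6–7% more water vapour per degree, consistent with the roughly 7% per ºC quoted above.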

To conclude, on average daily precipitation extremes increase in intensity, but local trends are often different from the global average. We’ll be busy studying these regional trends for some time.

Tropical storms

It is hard to determine trends in the number and intensity of tropical cyclones (called hurricanes in the North Atlantic). The observing system has improved so much over the last 150 years that more storms are now detected, and the most intense parts of a storm are more likely to be measured (Vecchi and Knutson, 2011). There is also strong decadal variability in cyclone activity in many regions. That said, most of the damage from tropical cyclones is caused by water: extreme rain and storm surges. Both observations and modelling show large increases in the extreme precipitation associated with hurricanes. For the U.S. Gulf Coast, we found an increase of about 15% in extreme precipitation, including from non-cyclone events, over the last century (van der Wiel et al, 2017; van Oldenborgh et al, 2017). Storm surges are trivially higher due to sea level rise. This means that even if the theoretically expected increase in the most intense tropical cyclones is not yet detectable, their physical impacts have already increased substantially.

Drought

Trends in drought strongly depend on the definition of drought. There are three common ones: meteorological drought, which is simply an absence of rain; agricultural drought, which is a deficit of soil moisture and thus includes evaporation (and sometimes irrigation); and hydrological drought, which also includes the transport of water. Trends in meteorological droughts are often hard to determine: drought is only a problem where the variability is large relative to the mean, but that also implies that natural variability is large compared to the trend (e.g., Philip et al, 2018; Uhe et al, 2017). Hydrological droughts, such as the one in California, can be caused by an increase in temperature rather than a decrease in precipitation: higher temperatures reduce the spring snowpack and thus cause a shortage of stored water in the dry summer (e.g., Mote et al, 2016). A fourth category, socio-economic drought, that is, a shortage of water for common use by society, is often caused by increased water use rather than decreased availability (e.g., Otto et al, 2015). It is therefore very hard to make general statements about drought.

Conclusions

Observations of weather extremes show the expected long-term trends in line with the increase of the global average temperature: hotter heat extremes almost everywhere, less frigid cold extremes almost everywhere, generally more intense precipitation (with variations from region to region), and more damage from hurricanes through heavier precipitation and higher storm surges. Other extremes are not so simply related to climate change, and we are undertaking background research to make rapid attribution of those extremes possible.

Thanks to Claudia Tebaldi for improvements to the text. Previous versions were published as KNMI klimaatbericht and on the Climate Lab Book blog.

Assigning historical responsibilities for extreme weather events https://www.worldweatherattribution.org/assigning-historical-responsibilities-for-extreme-weather-events/ Thu, 02 Nov 2017 16:24:04 +0000 http://wwa-test.ouce.ox.ac.uk/?p=751 The research combines the new science of extreme event attribution with assessments of historic emissions from individual countries and regions of the world. Otto, together with scientists from CICERO in Oslo, Norway, demonstrates that it is possible to assign individual countries responsibility for some types of extreme weather events. Quantification of these country-specific contributions rests on data and science, but also depends on value-based judgements.

“We found that it is scientifically possible to quantify historical responsibility of individual countries/regions for specific extreme events,” said Dr Friederike Otto, deputy director of the ECI and lead author of the study. “The fact that it is possible to provide such quantification will greatly advance the possibility of an informed discussion,” Otto added. “The aim of the study was to explore what science could contribute to the debate of climate justice.”


The team applied two different statistical methodologies to assign contributions of individual countries’ emissions to an extreme weather event, using the example of the Argentinian heatwave of 2013–14. While a previously published attribution study found that anthropogenic climate change overall made the event approximately five times more likely, the new analysis showed that, when accounting for all historic emissions from 1850 onwards, large emitters like the U.S. and the EU made the event approximately 28% and 37% more likely, respectively. The differences between the two methodologies were small compared to the overall responsibility assigned to each region.
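
For illustration (a hypothetical sketch, not the paper's actual computation), the bookkeeping behind such percentages is a simple probability ratio between a simulated world with all historical emissions and one with a single emitter's contribution removed:

```python
# Hypothetical sketch: percent increase in event likelihood attributable to
# one emitter. The probabilities would come from large model ensembles with
# and without that emitter's historical forcing; the numbers below are made up.
def likelihood_increase(p_all: float, p_without: float) -> float:
    """(p_all / p_without - 1) * 100: percent change in event probability."""
    return (p_all / p_without - 1.0) * 100.0

# Made-up example values reproducing a 28% increase, as reported for the U.S.:
print(likelihood_increase(p_all=0.0128, p_without=0.0100))   # 28.0
```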

“Overall, we find that choices about how to do the calculations that are not only scientific but also moral and political determine the quantitative results,” said Dr. Jan Fuglestvedt, research director at CICERO.

The manuscript is published in Nature Climate Change.

2015 – a record breaking hot year https://www.worldweatherattribution.org/record-hot-year-2015/ Tue, 24 Nov 2015 15:01:12 +0000 http://wwa-test.ouce.ox.ac.uk/?p=844 Based on the analysis described in the Methodology section below, we estimate the 2015 global temperature anomaly to be 1.05ºC (1.89ºF) above the 1850–1900 average that the IPCC takes to be “pre-industrial.” The year 2015 is therefore likely to be remembered as the first year in which two symbolic thresholds were crossed: the 1ºC (1.8ºF) temperature anomaly threshold and the 400 parts per million (ppm) CO2 threshold.

Of that 1.05ºC temperature departure from pre-industrial, roughly 1.0ºC is due to anthropogenic forcing, about 0.05ºC (0.09ºF) to 0.1ºC (0.18ºF) is due to El Niño, and about 0.02ºC (0.04ºF) is due to higher solar activity. The remainder is well within the range of variations due to random weather, especially winter weather in Siberia and Canada. Volcanoes contribute very little at this time.
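
As a quick arithmetic check (using the midpoint of the El Niño range quoted above), the budget roughly closes:

$$
\underbrace{1.0}_{\text{anthropogenic}} \;+\; \underbrace{0.075}_{\text{El Niño}} \;+\; \underbrace{0.02}_{\text{solar}} \;\approx\; 1.10\ ^{\circ}\mathrm{C},
$$

about 0.05ºC above the observed 1.05ºC anomaly, a difference well within the stated range of random weather variability.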

Methodology

This analysis uses well-established techniques from the peer-reviewed literature (Foster and Rahmstorf, 2011; Suckling et al, 2015).
First, the team extended the NCEI global mean surface temperature time series through the end of the year by assuming that November and December temperatures are similar to September and August. (October temperatures were unusually high, in part due to random weather fluctuations that are unlikely to persist into the following months.) This extrapolation gives an expected value for the global mean surface temperature of about 0.87ºC above the 1951–1980 average, which translates to 1.05ºC above the 1850–1900 average that the IPCC takes to be “pre-industrial.”
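
A minimal sketch of that extrapolation and baseline shift (illustrative only; `monthly` is an assumed pandas Series of monthly anomalies relative to 1951–1980 ending in October 2015, and the 0.18ºC offset is implied by the 0.87ºC and 1.05ºC figures above):

```python
# Illustrative sketch: extend 2015 through December and shift baselines.
import pandas as pd

BASELINE_SHIFT = 0.18  # ºC from the 1951-1980 to the 1850-1900 baseline,
                       # implied by the 0.87 and 1.05 ºC figures in the text

def annual_2015_estimate(monthly: pd.Series) -> float:
    jan_to_oct = monthly.loc["2015-01":"2015-10"]
    nov = float(monthly.loc["2015-09"].iloc[0])   # assume November ~ September
    dec = float(monthly.loc["2015-08"].iloc[0])   # assume December ~ August
    annual = (jan_to_oct.sum() + nov + dec) / 12.0   # anomaly vs 1951-1980
    return annual + BASELINE_SHIFT                   # anomaly vs 1850-1900
```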

Figure 1. Observational record: the gray line shows the NCEI global surface temperature anomaly. The red line shows the CO2 equivalent from IIASA scaled to fit the observations. The blue line adds the fitted naturally forced temperature anomaly, with volcanic forcing from GISS/NASA and solar forcing from Krivova et al (2010).

This temperature time series was fitted to the logarithm of the equivalent CO2 concentration from IIASA (which includes CO2, other greenhouse gases, and aerosols), plus volcanic aerosols and solar radiation. As can be seen in Figure 1, this gives a very good fit, with 2014 almost on the fitted trend and 2015 clearly above it. The fitted forced trend amounts to just about 1.0ºC due to anthropogenic forcing, plus 0.02ºC due to the solar cycle, which was above average in 2015. The remainder, about 0.05ºC, is mainly due to the developing El Niño, which tends to heat the globe with a delay of around five months. Different ways of taking this delay into account give different estimates of the contribution of the current very strong El Niño, from 0.05 to 0.1ºC; assuming a shorter delay for this event gives higher values. Because of this delayed response, the influence of El Niño on the global mean temperature will be greater in 2016 than in 2015, just as it was greater in 1998, the year after the peak El Niño ocean temperatures in December.
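
A hedged sketch of such a fit (the real analysis used the NCEI temperatures, the IIASA CO2-equivalent series, and the GISS/NASA volcanic and Krivova et al solar forcings; here all inputs are simply assumed to be aligned annual numpy arrays):

```python
# Illustrative sketch: least-squares fit of the global mean temperature
# anomaly to ln(equivalent CO2) plus volcanic and solar forcing series.
import numpy as np

def fit_forced_response(temp, co2e, volcanic, solar):
    """Fit T ~ a*ln(CO2e) + b*volcanic + c*solar + d and return the
    coefficients together with the fitted (forced) temperature series."""
    X = np.column_stack([np.log(co2e), volcanic, solar, np.ones_like(temp)])
    coeffs, *_ = np.linalg.lstsq(X, temp, rcond=None)
    return coeffs, X @ coeffs
```

The residual of such a fit is what the text interprets as El Niño plus random weather.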
In addition, the observational data were compared with the simulated global mean temperature rise from 1850–1900 in the CMIP5 models (Figure 2). The CMIP5 ensemble was normalised against the observations over a long period without volcanic eruptions, 1911–1960. This choice is driven by the observation that the global mean cooling due to volcanoes is much larger in the CMIP5 simulations than in the observations, so taking a reference period with large eruptions, such as 1986–2005, gives an offset between the two curves. The recent observations are back in the middle of the model plume after a few years on the low side. This comparison is influenced by many factors: the reference periods, the speed at which air quality has improved worldwide, and the effects of the relatively weak solar cycle and of volcanic eruptions. It is therefore impossible to deduce the effect of CO2 alone from the warming trend up to now.

Figure 2. CMIP5 comparison: the observational data compared with the simulated global mean temperature rise from 1850–1900 in the full CMIP5 ensemble (historical forcings up to 2005, RCP4.5 from 2006 onward, red) and in the ensemble with only natural forcings (historicalNat, blue).
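
A minimal sketch of the re-baselining choice discussed above (variable names and array layout are assumptions):

```python
# Illustrative sketch: put observations and every CMIP5 member on a common
# anomaly baseline over the volcano-quiet 1911-1960 period before comparing.
import numpy as np

def rebaseline(series: np.ndarray, years: np.ndarray, ref=(1911, 1960)) -> np.ndarray:
    """Subtract the mean over the reference years from an annual series."""
    in_ref = (years >= ref[0]) & (years <= ref[1])
    return series - series[in_ref].mean()

# obs and a (members x years) ensemble would then be compared as:
# obs_anom = rebaseline(obs, years)
# model_anoms = np.array([rebaseline(m, years) for m in ensemble])
```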

References

Foster, G. and Rahmstorf, S. (2011) Global temperature evolution 1979–2010. Environmental Research Letters, 6: 044022, doi: 10.1088/1748-9326/6/4/044022

Krivova, N.A., Vieira, L.E.A. and Solanki, S.K. (2010) Reconstruction of solar spectral irradiance since the Maunder minimum. Journal of Geophysical Research: Space Physics, 115(A12) CiteID A12112. doi: 10.1029/2010JA015431

Suckling, E., Hawkins, E., van Oldenborgh, G.J. and Eden, J. (2015) An empirical model for probabilistic decadal prediction: A global analysis. Climate Dynamics, 48(9–10): 3115–3138. doi: 10.1007/s00382-016-3255-8
