Saturday, 7 October 2017

A short history of homogenisation of climate station data

The WMO Task Team on Homogenisation (TT-HOM) is working on guidance for scientists and weather services who want to homogenise their data. I thought the draft chapter on the history of homogenisation would double as a nice blog post. It is a pretty long history, starting well before people were worrying about climate change. Comments and other important historical references are very much appreciated.

Problems due to inhomogeneities have long been recognised and homogenisation has a long history. In September 1873, at the “International Meteorologen-Congress” in Vienna, Carl Jelinek requested information on national multi-annual data series (k.k. Hof- und Staatsdruckerei, 1873). Decades later, in 1905, G. Hellmann (k.k. Zentralanstalt für Meteorologie und Geodynamik, 1906) still regretted the absence of homogeneous climatological time series due to changes in the surroundings of stations and new instruments, and pleaded for stations with a long record, “Säkularstationen”, to be kept as homogeneous as possible.

Although this “Conference of Directors” of the national weather services recommended maintaining a sufficient number of stations under unchanged conditions, these basic inhomogeneity problems still exist today.

Detection and adjustments

Homogenisation has a long tradition. In early times, for example, documented change points were removed with the help of parallel measurements. Differing observing times at the astronomical observatory of the k.k. University in Vienna (Austria) were adjusted using multi-annual 24-hour measurements of the astronomical observatory of the k.k. University in Prague (today Czech Republic). Measurements of Milano (Italy) between 1763 and 1834 were adjusted to 24-hour means using measurements of Padova (Kreil, 1854a, 1854b).

However, for the majority of breaks we do not know the break magnitude; furthermore, it is most likely that series contain undocumented inhomogeneities as well. Thus there was a need for statistical break detection methods. In the early 20th century Conrad (1925) applied and evaluated the Heidke criterion (Heidke, 1923) using ratios of two precipitation series. As a consequence he recommended the use of additional criteria to test the homogeneity of series, dealing with the succession and alternation of algebraic signs: the Helmert criterion (Helmert, 1907) and the “painstaking” Abbe criterion (Conrad and Schreier, 1927). The use of Helmert’s criterion for pairs of stations and of Abbe’s criterion was still described as an appropriate tool in the 1940s (Conrad, 1944). Some years later the double-mass principle was popularised for break detection (Kohler, 1949).
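The double-mass principle can be illustrated with a short sketch: the cumulative sums of a candidate and a neighbouring reference series are plotted against each other; a homogeneous pair follows a straight line, while a change of slope points to a break. The function below is my own minimal illustration, not code from any of the cited works.

```python
import numpy as np

def double_mass(candidate, reference):
    """Cumulative sums for a double-mass plot.

    Plotting these against each other, a homogeneous pair of records
    follows a straight line; a change in slope marks a possible break.
    """
    return np.cumsum(candidate), np.cumsum(reference)
```

For a homogeneous pair the ratio of the two cumulative sums is roughly constant; the point where the slope changes is the candidate break position.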

The German Climate Reference Station on the mountain Hohenpeißenberg in Bavaria, founded in 1781.

Reference series

Julius Hann (1880, p. 57) studied the variability of absolute precipitation amounts and of the ratios between stations. He used these ratios for quality control. This inspired Brückner (1890) to check precipitation data for inhomogeneities by comparison with neighbouring stations; he did not use any statistics.

In their book “Methods in Climatology”, Conrad and Pollak (1950) formalised this relative homogenization approach, which is now the dominant method to detect and remove the effects of artificial changes. The building of reference series, by averaging the data from many stations in a relatively small geographical area, was recommended by the WMO Working Group on Climatic Fluctuations (WMO, 1966).
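The relative homogenization approach can be sketched in a few lines: build a reference from neighbouring stations and examine the candidate relative to it, using a difference series for temperature and a ratio series for precipitation. The function names and the equal-weight default below are my own illustrative choices.

```python
import numpy as np

def reference_series(neighbours, weights=None):
    """Average neighbouring station series into a reference series.

    neighbours: 2-D array, one row per neighbouring station.
    weights: optional per-station weights (e.g. based on correlation).
    """
    neighbours = np.asarray(neighbours, dtype=float)
    if weights is None:
        weights = np.ones(neighbours.shape[0])
    weights = np.asarray(weights, dtype=float)
    return weights @ neighbours / weights.sum()

def difference_series(candidate, reference):
    """Candidate minus reference; used for temperature.

    For precipitation one would examine the ratio candidate / reference.
    """
    return np.asarray(candidate, dtype=float) - reference
```

In the difference (or ratio) series the common regional climate signal largely cancels, so remaining jumps point to station-specific, non-climatic changes.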

The papers by Alexandersson (1986) and Alexandersson and Moberg (1997) made the Standard Normal Homogeneity Test (SNHT) popular. The broad adoption of SNHT was also due to the clear guidance these papers gave on how to use the test, together with reference series, to homogenise station data.
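At its core, the single-breakpoint SNHT standardizes the difference (or ratio) series relative to the reference and compares the means before and after every candidate break point; the break is placed where the test statistic peaks, and significance is judged against tabulated critical values. A minimal sketch of the statistic (omitting those critical values; the function name is my own):

```python
import numpy as np

def snht_statistic(series):
    """Single-breakpoint SNHT statistic.

    Returns (T0, k): the maximum test statistic and the position of
    the most likely break. T0 must be compared with tabulated
    critical values to decide whether the break is significant.
    """
    z = (np.asarray(series, float) - np.mean(series)) / np.std(series, ddof=1)
    n = len(z)
    t = np.empty(n - 1)
    for k in range(1, n):
        z1 = z[:k].mean()   # mean of standardized values before the break
        z2 = z[k:].mean()   # mean of standardized values after the break
        t[k - 1] = k * z1 ** 2 + (n - k) * z2 ** 2
    k_best = int(np.argmax(t)) + 1
    return float(t.max()), k_best
```

A series with a clear shift yields a large statistic exactly at the shift, e.g. a step from 0 to 1 halfway through a 100-value series gives the break position 50.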

Modern developments

SNHT is a single-breakpoint method, but climate series typically contain more than one break. Thus a major step forward was the development of methods specifically designed to detect and correct multiple change-points and to work with inhomogeneous references (Szentimrey, 1999; Mestre, 1999; Caussinus and Mestre, 2004). These kinds of methods were shown to be more accurate by the benchmarking study of the EU COST Action HOME (Venema et al., 2012).

The paper by Caussinus and Mestre (2004) also provided the first description of a method that corrects all series of a network jointly. This joint correction method improved the accuracy of all but one of the HOME benchmark contributions that were not yet using this approach (Domonkos et al., 2013).

The ongoing work to create appropriate datasets for climate variability and change studies promoted the continual development of better methods for change point detection and correction. To accompany this process, the Hungarian Meteorological Service started a series of “Seminars for Homogenization” in 1996 (HMS, 1996; WMO, 1999; OMSZ, 2001; WMO, 2004; WMO, 2006; WMO, 2010).

Related reading

Homogenization of monthly and annual data from surface stations
A short description of the causes of inhomogeneities in climate data (non-climatic variability) and how to remove it using the relative homogenization approach.
Statistical homogenisation for dummies
A primer on statistical homogenisation with many pictures.
Just the facts, homogenization adjustments reduce global warming
Many people only know that climatologists increase the land surface temperature trend, but do not know that they also reduce the ocean surface trend and that the net effect is a reduction of global warming. This does not fit too well with the conspiracy theories of the mitigation sceptics.
Five statistically interesting problems in homogenization
Series written for statisticians and climatologists looking for interesting problems.
Why raw temperatures show too little global warming
The raw land surface temperature probably shows too little warming. This post explains the reasons why: thermometer screen changes, relocations and irrigation.
New article: Benchmarking homogenization algorithms for monthly data
Raw climate records contain changes due to non-climatic factors, such as relocations of stations or changes in instrumentation. This post introduces an article that tested how well such non-climatic factors can be removed.


Alexandersson, H., 1986: A homogeneity test applied to precipitation data. J. Climatol., 6, pp. 661-675.
Alexandersson, H. and A. Moberg, 1997: Homogenization of Swedish temperature data. Part 1: Homogeneity test for linear trends. Int. J. Climatol., 17, pp. 25-34.
Brückner, E., 1890: Klimaschwankungen seit 1700 nebst Bemerkungen über Klimaschwankungen der Diluvialzeit. E.D. Hölzel, Wien and Olnütz.
Caussinus, H. and O. Mestre, 2004: Detection and correction of artificial shifts in climate series. Appl. Statist., 53, Part 3, pp. 405-425.
Conrad, V. and C. Pollak, 1950: Methods in Climatology. Harvard University Press, Cambridge, MA, 459 p.
Conrad V., O. Schreier, 1927: Die Anwendung des Abbe’schen Kriteriums auf physikalische Beobachtungsreihen. Gerland’s Beiträge zur Geophysik, XVII, 372.
Conrad, V., 1925: Homogenitätsbestimmung meteorologischer Beobachtungsreihen. Meteorologische Zeitschrift, 482–485.
Conrad V., 1944: Methods in Climatology. Harvard University Press, 228 p.
Domonkos, P., V. Venema, O. Mestre, 2013: Efficiencies of homogenisation methods: our present knowledge and its limitation. Proceedings of the Seventh seminar for homogenization and quality control in climatological databases, Budapest, Hungary, 24 – 28 October 2011, WMO report, Climate data and monitoring, WCDMP-No. 78, pp. 11-24.
Hann, J., 1880: Untersuchungen über die Regenverhältnisse von Österreich-Ungarn. II. Veränderlichkeit der Monats- und Jahresmengen. S.-B. Akad. Wiss. Wien.
Heidke P., 1923: Quantitative Begriffsbestimmung homogener Temperatur- und Niederschlagsreihen. Meteorologische Zeitschrift, 114-115.
Helmert F.R., 1907: Die Ausgleichrechnung nach der Methode der kleinsten Quadrate. 2. Auflage, Teubner Verlag.
Peterson T.C., D.R. Easterling, T.R. Karl, P. Groisman, N. Nicholls, N. Plummer, S. Torok, I. Auer, R. Boehm, D. Gullett, L. Vincent, R. Heino, H. Tuomenvirta, O. Mestre, T. Szentimrey, J. Salinger, E.J. Forland, I. Hanssen-Bauer, H. Alexandersson, P. Jones, D. Parker, 1998: Homogeneity adjustments of in situ atmospheric climate data: A review. Int. J. Climatol., 18, 1493-1517.
Hungarian Meteorological Service (HMS), 1996: Proceedings of the First Seminar for Homogenization of Surface Climatological Data, Budapest, Hungary, 6-12 October 1996, 44 p.
Kohler M.A., 1949: Double-mass analysis for testing the consistency of records and for making adjustments. Bull. Amer. Meteorol. Soc., 30: 188 – 189.
k.k. Hof- und Staatsdruckerei, 1873: Bericht über die Verhandlungen des internationalen Meteorologen-Congresses zu Wien, 2.-10. September 1873, Protokolle und Beilagen.
k.k. Zentralanstalt für Meteorologie und Geodynamik, 1906: Bericht über die internationale meteorologische Direktorenkonferenz in Innsbruck, September 1905. Anhang zum Jahrbuch 1905. k.k. Hof- und Staatsdruckerei.
Kreil K., 1854a: Mehrjährige Beobachtungen in Wien vom Jahre 1775 bis 1850. Jahrbücher der k.k. Central-Anstalt für Meteorologie und Erdmagnetismus. I. Band – Jg 1848 und 1849, 35-74.
Kreil K., 1854b: Mehrjährige Beobachtungen in Mailand vom Jahre 1763 bis 1850. Jahrbücher der k.k. Central-Anstalt für Meteorologie und Erdmagnetismus. I. Band – Jg 1848 und 1849, 75-114.
Mestre O., 1999: Step-by-step procedures for choosing a model with change-points. In Proceedings of the second seminar for homogenisation of surface climatological data, Budapest, Hungary, WCDMP-No.41, WMO-TD No.962, 15-26.
OMSZ, 2001: Third Seminar for Homogenization and Quality Control in climatological Databases, Budapest.
Szentimrey, T., 1999: Multiple Analysis of Series for Homogenization (MASH). Proceedings of the second seminar for homogenization of surface climatological data, Budapest, Hungary; WMO, WCDMP-No. 41, 27-46.
Venema, V., O. Mestre, E. Aguilar, I. Auer, J.A. Guijarro, P. Domonkos, G. Vertacnik, T. Szentimrey, P. Stepanek, P. Zahradnicek, J. Viarre, G. Müller-Westermeier, M. Lakatos, C.N. Williams, M.J. Menne, R. Lindau, D. Rasol, E. Rustemeier, K. Kolokythas, T. Marinova, L. Andresen, F. Acquaotta, S. Fratianni, S. Cheval, M. Klancar, M. Brunetti, Ch. Gruber, M. Prohom Duran, T. Likso, P. Esteban, Th. Brandsma, 2012: Benchmarking homogenization algorithms for monthly data. Climate of the Past, 8, pp. 89-115, doi: 10.5194/cp-8-89-2012. See also the introductory blog post and a post on the weaknesses of the study.
WMO, 1966: Climatic Change, Report of a working group of the Commission for Climatology. Technical Note 79, WMO – No. 195. TP.100, 79 p.
WMO 1999: Proceedings of the Second Seminar for Homogenization of Surface Climatological Data, Budapest, Hungary, 9 – 13 November 1998, 214 p.
WMO, 2004: Fourth Seminar for Homogenization and Quality Control in Climatological Databases, Budapest, Hungary, 6-10 October 2003, WCDMP-No 56, WMO-TD No. 1236, 243 p.
WMO, 2006: Proceedings of the Fifth Seminar for Homogenization and Quality Control in Climatological Databases, Budapest, Hungary, 29 May – 2 June 2006. Climate Data and Monitoring WCDMP- No 71, WMO/TD- No. 1493.
WMO, 2010: Proceedings of the Meeting of COST-ES0601 (HOME) Action, Management Committee and Working groups and Sixth Seminar for Homogenization and Quality Control in Climatological Databases, Budapest, Hungary, 26 – 30 May 2008, WMO reports on Climate Data and Monitoring, WCDMP-No. 76.

Sunday, 1 October 2017

The Earth sciences no longer need the publishers for publishing

Manuscript servers are buzzing around our ears, as the Dutch say.

In physics it is common to put manuscripts on the ArXiv server (pronounced: archive server). A large part of these manuscripts are later sent to a scientific journal for peer review, following the traditional scientific quality control system and assessment of the importance of studies.

This speeds up the dissemination of scientific studies and can promote informal peer review before the formal peer review. The copyright of the manuscripts has not yet been transferred to a publisher, so this also makes the research available to all without pay-walls. Expecting the manuscripts to be published on paper in a journal later, ArXiv is called a pre-print server. In these modern times I prefer the term manuscript server.

The manuscript gets a time stamp, so a pre-print server can be used to claim precedence, although the date of journal publication is traditionally used for this and there are no rules about which date is most important. Pre-print servers can also give the manuscript a Digital Object Identifier (DOI) that can be used to cite it. A problem could be that some journals see a pre-print as prior publication, but I am not aware of any such journals in the atmospheric sciences; if you know of one, please leave a comment below.

ArXiv has a section for atmospheric physics, where I also uploaded some manuscripts as a young cloud researcher. However, because most meteorologists did not participate, it could not perform the same function as it does in physics; I never got any feedback based on these manuscripts. When ArXiv made uploading manuscripts harder to get rid of submissions by retired engineers, I stopped and just put the manuscripts on my homepage.

Three manuscript archives

Maybe the culture will now change and more scientists will participate, with three new initiatives for manuscript servers in the Earth sciences. All three follow a different concept.

This August a digital archive started for paleontology (paleorXiv, twitter). If I see it correctly, they already have 33 manuscripts (only a part of them climate related). This archive builds on the open source preprint server of the Open Science Framework (OSF) of the non-profit Center for Open Science. The OSF is a platform for the entire scientific workflow, from idea, to coding and collaboration, to publishing. Other groups are also welcome to build a pre-print archive using their servers and software.

[UPDATE. It was just announced that a new ArXiv will start in November: MarXiv, not for Marxists, but for the marine-conservation and marine-climate sciences.]

Two initiatives have just started for all of the Earth sciences: one grassroots initiative (EarthArXiv) and one by AGU/Wiley (ESSOAr).

EarthArXiv will also be based on the open source solution of the Open Science Framework. It is not up yet, but I presume it will look a lot like paleorXiv. It seems to catch on, with about 600 Twitter followers and about 100 volunteers in just a few days. They are working on a logo (requirements, competition). Most logos show the globe; I would include the study of other planets in the Earth sciences.

The American Geophysical Union (AGU) has announced plans for an Earth and Space Science Open Archive (ESSOAr), which should be up and running early next year. They plan to be able to show a demo at the AGU's fall meeting in December.

The topic would thus be somewhat different due to the inclusion of space science, and they will also permanently archive posters presented at conferences. That sounds really useful; now every conference designs its own solution and the posters and presentations are often lost after some time when the homepage goes down. EarthArXiv unfortunately seems to be against hosting posters. ESSOAr would also make it easy to transfer the manuscripts to (AGU?) journals.

A range of other academic societies are on the "advisory board" of ESSOAr, including EGU. ESSOAr will be based on proprietary software of the scientific publisher Wiley. Proprietary software is a problem for something that should function for as close to an eternity as possible. Not only Wiley, but also the AGU itself is a major scientific publisher. They are not Elsevier, but this quickly leads to conflicts of interest. It would be better to have an independent initiative.

There need not be any conflict between the two "duelling" (according to Nature) servers. The manuscripts are open access and I presume they will have an API that makes it possible to mirror manuscripts of one server on the other. The editors could then remove the ones they do not see as fitting their standards (or not waste their time on them). Beyond esoteric (WUWT & Co.) nonsense, I would prefer not to have many standards; that is the idea of a manuscript server.

Paul Voosen of Nature magazine wonders whether "researchers working in more sensitive areas of the geosciences, such as climate science, will embrace posting their work prior to peer review." I see no problem there. There is nothing climate scientists can do to pacify the American culture war; we should thus do our job as well as possible, and my impression is that climatology is easily in the better half of the Open Science movement.

I love to complain about it, but my impression is that sharing data is more common in the atmospheric sciences than on average. This could well be because it is more important here: data is needed from all over the world. The World Meteorological Organization was one of the first global organizations set up to coordinate this. The European Geosciences Union (EGU) has had open review journals for more than 15 years; the initial publication in a "discussion" journal is similar to putting your manuscript on a pre-print server. Many of the contributions to the upcoming FORCE2017 conference on Research Communication and e-Scholarship that mention a topic are about climate science.

The road to Open Access

A manuscript server is one step on the way to an Open Access publishing future. This would make articles more accessible to researchers and the public, who paid for them.

Open Access would break the monopoly given to scientific publishers by copyright laws. An author looking for a journal to publish his work can compare price and service, but a reader typically needs to read one specific article and then has to deal with a publisher with monopoly power. This has led to monopolistic profits and commercial publishers that have lost touch with their customers, the scientific community. That Elsevier has a profit margin of "only" 36 percent thus seems to be mismanagement; it should be close to 100 percent.

ArXiv shows that publishing a manuscript costs less than a dollar per article. Software to support the peer review can be rented for 10 dollars per article (see also Open Journal Systems). Writing the article and reviewing it is done for free by the scientific community. Most editors are also scientists working for free; sometimes the editor-in-chief gets some secretarial support or some money for a student assistant. Typesetting by journals is highly annoying, as they often add errors doing so. Typesetting is easily done by a scientist, especially using LaTeX, but also with a Word template. That scientists pay thousands of dollars per article is not related to the incurred costs, but due to monopoly brand power.

Publishers that serve the community, articles that everyone can read, and less funding wasted on publishing are a desirable goal, but it is hard to get there because the barriers to entry are large. Scientists want to publish in journals with a good reputation and, if the journal is not Open Access, with a broad circulation. This makes starting a new journal hard: even if a new journal does a much better job at a much lower price, it starts with no reputation, and without a reputation it will not get manuscripts to prove its worth.

To make it easier to get from the current situation to an Open Access future, I propose the concept of Grassroots Scientific Publishing. Starting a new journal should be as easy as starting a blog: make an account, give the journal a name and select a layout. Finished, start reviewing.

To overcome the problem that initially no one will submit manuscripts, a grassroots journal can start by reviewing already published articles. This is not wasted time, because we can do a much better job communicating the strengths and weaknesses as well as the importance of an article than we do now, where the only information we have on the importance is the journal in which it is published. We can categorise and rank articles. We can have all articles of one field in the same journal, no longer scattered around many different journals.

Even without replacing traditional journals, such a grassroots journal would provide a valuable service to its scientific community.

To explain the idea and get feedback on how to make it better, I have started a new grassroots publishing blog.
Once this kind of journal is established and has shown that it provides superior quality assurance and information, there is no longer any need for pay-wall journals and we can simply review the articles on manuscript servers.

Related reading

Paul Voosen in Nature: Dueling preprint servers coming for the geosciences

AGU: ESSOAr Frequently Asked Questions

The Guardian, long read: Is the staggeringly profitable business of scientific publishing bad for science?

If you are on twitter, do show support and join EarthArXiv

Three cheers for gatekeeping

Peer review helps fringe ideas gain credibility

Grassroots scientific publishing

* Photo Clare Night 2 by Paolo Antonio Gonella is used under a Creative Commons Attribution 2.0 Generic (CC BY 2.0) license.

Friday, 22 September 2017

Standard two letter codes for meteorological variables

For the file names of the Parallel Observations Science Team (ISTI-POST) we needed short codes for the meteorological variables in the file.

Two-letter codes are used quite often in the scientific literature, which suggests there exists a standard, but I was unable to find a WMO standard. Thus we suggest following the two-letter conventions below.

dd  wind direction  [in degrees; calm = -1]
ff  wind speed  [in m/s]
tm  mean temperature  [in °C]
tn  minimum temperature  [in °C]
tx  maximum temperature  [in °C]
tr  read temperature (at a specific time)  [in °C]
tw  wet-bulb temperature  [in °C]
ts  surface temperature  [in °C]
sn  sunshine duration  [in h]
sd  solar radiation flux down  [in W m-2]
su  solar radiation flux up  [in W m-2]
hd  infra-red (heat) radiation flux down  [in W m-2]
hu  infra-red (heat) radiation flux up  [in W m-2]
ts  total snow  [in mm]
ns  new snow  [in mm]
rh  relative humidity  [in %]
pp  pressure  [in mbar]
rr  precipitation  [in mm per day]
nn  cloud cover  [in %]

If you know of an existing system, please say so. Many of these codes are quite common in the literature, some less so. If you have suggestions for other codes for these variables, or would like to propose abbreviations for further variables, please contact us or write a comment below.
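As an illustration, such two-letter codes could be kept in a small lookup table when reading data files; the dictionary below covers only a subset of the codes above, and the helper function is my own invention, not part of any POST specification.

```python
# A lookup table for a subset of the proposed two-letter codes.
VARIABLE_CODES = {
    "dd": ("wind direction", "degrees; calm = -1"),
    "ff": ("wind speed", "m/s"),
    "tm": ("mean temperature", "°C"),
    "tn": ("minimum temperature", "°C"),
    "tx": ("maximum temperature", "°C"),
    "rh": ("relative humidity", "%"),
    "pp": ("pressure", "mbar"),
    "rr": ("precipitation", "mm per day"),
    "nn": ("cloud cover", "%"),
}

def describe(code):
    """Return a readable description for a two-letter variable code."""
    name, unit = VARIABLE_CODES[code]
    return f"{name} [in {unit}]"
```

Note that a lookup table like this also makes conflicts visible: the same code cannot be used for two variables, as "ts" is in the list above.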

Monday, 18 September 2017

Angela Merkel wins German election

After my spectacular success as UK election pollster, let me try my luck with a prediction for the elections here in Germany next week Sunday: Angela Merkel will win the election and stay Chancellor. I admit that it would have been more fun to make a claim that goes against the punditry, but that is harder to do for Germany than for the UK or the USA; the quality of German (public) media is quite high. The pundits also do not have a hard time this election; the only question is who Merkel is going to govern with, and that depends on details we will only know on election night.

[UPDATE on the eve of the election. Something I did not think of, because I have not heard anyone talk about it, is that Merkel may step down if her party loses more than 5% and the coalition more than 10%. She took quite some time considering whether she would run again. My impression is that that was not just theatre; it is a tough job. Losing the election may well be the right excuse to hand power to the next generation.]

Germany is a representative parliamentary democracy. The voters select their representatives in parliament, like in the UK, and parliament elects the prime minister (Bundeskanzler). The prime minister is the most powerful politician, although officially ranked third after the president (Bundespräsident) and the president of the parliament. The advantage of this system is that when you notice your leader is an incompetent, ignorant fool with no interest in working, you can get rid of them. To avoid a power vacuum, a new prime minister has to be elected to remove the old one; just voting against the old one is not enough.

Advertisement & song for a major supermarket chain. The title literally translated is: "Super horny", but more accurate is: "terrific". On the other hand, we have less gory violence on TV than the USA.

Germans get two votes: one for their local representative, just like the districts in the UK or USA, and a second vote for a party. This way you have politicians that represent their district, which some people seem to see as important; I have never understood why. The second vote determines the proportions in parliament. Parties make lists of candidates and these are added to the directly elected candidates to get a proportional result. This way all voters count, parties have to campaign everywhere and gerrymandering does not help. Win, win, win.

The only deviation from a representative result is that there is an election threshold of 5%. If a party gets less than 5%, its votes are unfortunately lost, except for directly elected candidates. In the last federal election 16% of the votes were lost that way. The election threshold should reduce the number of parties, but it also conveniently limits competition for the existing parties.

Political parties

The latest polls are shown below.

Election polls over the last 4 years. For comparison the results of the 2013 election were: CDU/CSU: 41.5%, SPD: 25.7%, Greens: 8.4%, FDP: 4.8%, Die Linke 8.6%, Pirate party: 2.2%, AfD: 4.7%.

It is expected that six parties will cross the election threshold. The largest party will be the Christian Democrats or Conservatives of Chancellor Merkel. They are actually two parties who caucus together in parliament: the Christian Social Union (CSU), running in Bavaria, and the Christian Democratic Union (CDU) in the rest of Germany.

The second largest party will be the Social Democrats, similar to Labour in the UK. The upward jump of almost 10% in spring this year almost made the party as large as the Christian Democrats. This was when their new party leader Martin Schulz was elected and he suggested to again treat unemployed people as humans and get rid of the policy package called Hartz IV.

This peak went away when Schulz explained that he actually only wanted to make a few small Clintonite tweaks. This Hartz IV package was made by Germany's Tony Blair, the neo-liberal Gerhard Schröder, who is now living on Vladimir Putin's paycheck. The party strategists must have seen the movement in the polls, but threatening the middle class that they can fall really deep into poverty if they do not conform was apparently more important to them than being Social Democrats.

The Doctors: Man & Woman. die ärzte - M&F

The four small parties are about the same size this time. It is the policy of the Conservatives to be a sufficiently nationalistic big-tent party to keep purely racist parties below the 5%, but this time the anti-Muslim party Alternative for Germany (Alternative für Deutschland) will likely make it into parliament. It started as a Euro-currency-sceptical party whose leaders opened the party to racists to pass the 5% threshold and then got kicked out by them.

The latest polls show a few percent less for the two main parties and the Alternative for Germany at or above 10%. The easily excited punditry is immediately talking about 15 or 20%. People tend to worry whether respondents answer polling questions honestly when it comes to racist parties. The evidence shows that there is no bias, but that the noise error can be larger, especially for new parties. Racist parties typically are new parties, as they do not last long, being a coalition of unreasonable people, often with a violent criminal past.

The other small right-wing party is the pro-business party FDP. They are officially classical liberals, but unfortunately in practice often crony capitalists. They got kicked out of parliament in the last election because their coalition government with the Conservatives was so disastrous. Their new leader Christian Lindner resurrected the party by stressing the pro-human parts of their liberal heritage. All these terms should be interpreted from a German perspective: not even this classical liberal party would deny people health care, and Barack Obama could be a good replacement for Lindner.

On the left we have a party called "The Left", Die Linke. They are mostly the Social Democratic party the SPD once was. Their main campaign promise is to get rid of the Hartz IV package. However, they were born out of the communist party of East Germany, which has left its traces. Due to old ties, and maybe kompromat, they are very pro-Russia. They are against NATO and German military actions, but were not particularly worried about the Russian occupation of Crimea. Because of their communist past, and officially because of their foreign policies, most other parties are not willing to govern with them. It could be that this taboo will be broken this election or the next; about time, almost three decades after the fall of communism.

Election billboard of the German Green party: Environment is not everything, but without the environment everything is nothing.

The German Greens are traditionally seen as part of the left, being born out of the hippy movement, but for a Green party they are very conventional; the old geezers have become much like the parents they once revolted against. Half of the party would like to be in the middle and the party is flirting with the idea of a coalition government with the Conservatives. In one of the most conservative German states, Baden-Württemberg, the Green politician Winfried Kretschmann leads a coalition government with the Christian Democrats. I mostly mention this to emphasize that politics in Europe is a bit different from politics in corrupt Washington.


I am not expecting any large changes in the last week and German polls are normally quite good.

Now that the preliminary final result is in, we can see that the last polls were reasonably good. The uncertainty is given as 2 to 3% and was met. Still, the difference for the Conservatives is rather large, and the larger percentage for the racist party is sad.
Party                      Last polls  Result  Difference
Conservatives (CDU/CSU)         35.8%   33.0%       -2.8%
Social Democrats (SPD)          21.8%   20.5%       -1.3%
Greens (B'90/Grüne)              7.8%    8.9%       +1.1%
Classical liberals (FDP)         9.6%   10.7%       +1.1%
The Left (Linke)                 9.5%    9.2%       -0.3%
Racists (AfD)                   11.0%   12.6%       +1.6%
Others                                   4.6%

Theoretically Schulz could discover his inner Jeremy Corbyn and still announce he will get rid of Hartz IV, but even that would likely not change the coalition options much. The results will be very similar to those of 2013, but the two big parties will likely lose a few percent, and the AfD and the FDP will likely pass the threshold this time. Because of this, about 10% fewer votes will be wasted and the other parties will get fewer seats for the same percentage of votes. Thus all current parties will likely lose seats.

Currently the Bundeskanzler is Angela Merkel and she is likely the next one as well. There is no limit to how often one can become Bundeskanzler; Helmut Kohl did it four times. Merkel has already made coalition governments with the Social Democrats (SPD), the FDP, and currently again the SPD. Each time her coalition partner suffered clear losses.

To govern normally a coalition of parties is needed. The best part of the election night in 2013 was to look at the face of Angela Merkel when the exit polls suggested she might have a majority without any coalition partner. She clearly did not look forward to having to implement her platform without being able to blame the coalition partner for softening it.

The right parties (CDU and FDP) will not want to make a coalition with the racists (AfD). The left parties (SPD and Greens) will likely not be willing to make a coalition with the former communists (Die Linke) and this coalition is also likely not big enough. Also a coalition of Social Democrats, Greens and classical liberals is likely too small.

So whatever coalition is possible, it will include the Christian Democrats of Merkel. If it is possible to make a coalition with the classical liberals she will do so. This is likely only possible if the AfD stays below the election threshold. Due to this threshold it would perversely be best for people on the left if the racists get into parliament.

If a coalition with the liberals is not possible, Merkel will most likely try to build a coalition with the Greens: a new combination federally, but one that has been tested in the German states in recent years as preparation and works.

Maybe I do have one complaint about the German punditry, they keep on talking about a coalition of Conservatives, classical liberals and greens (CDU/CSU, FDP, Greens). I understand that the small parties like such speculation to keep themselves in the news and having more options improves their negotiation position, but I do not see this coalition as a realistic option, although not fully impossible. The members of the Greens will have to vote on the coalition agreement and I see it as highly unlikely that they would approve such a right-wing government.

Whether any of these options work will depend on the last few percent of votes and we will thus have to wait for election night. The most likely result, always possible but not a popular option, is a continuation of the current ruling coalition of Christian Democrats and Social Democrats. Both parties will probably lose votes, will not be keen to continue the coalition, and would likely lose again in four years.

An election billboard of the racists of Alternative for Germany above a sign saying "liars have tall ladders". A bit unfair: racist parties are not particularly popular after what they did to Germany and the world, so they have to hang their posters up high lest they get vandalized. Parties are allowed to advertise on the streets for free, to make money in politics less important. They also get free time on public television.

Climate Change

A main environmental group (BUND) has made a comparison of the party platforms on climate change. No German party denies climate change, except for the racist party. It makes sense that a party that is willing to shoot refugees to kill at the border is also willing to destroy their existence and kill them at home. In their party platform they go full Trump, deny man-made climate change and call for higher CO2 concentrations. They are also a Trumpian party in the sense that they get a little help from foreign racists and Moscow in their quest against free and open societies. As is typical for this kind of party, the candidates are mostly incompetent and many have criminal records.

The two big parties have deep ties with big industry and the last four years have seen a reduction in ambitions to fight climate change. As a consequence the CO2 emission goals for 2020 will be hard to reach for the next government.

The classical liberal FDP rejects solutions to climate change beyond the European Emissions Trading System, which makes sense from their perspective; however, the system does not work and Germany alone cannot fix it. Thus this position easily leads to doing nothing in practice.

The Greens are naturally best on climate change. After ending nuclear power, they now want to end coal power by 2030 (Kohleausstieg). Angela Merkel indicated her willingness to form a coalition by writing an end to lignite coal power (Braunkohleausstieg) into the Conservative platform.


As an example of how the electoral system works, let's consider Bonn, where I live (although I am not allowed to vote, because the German parliament may have to vote on whether to go to war with The Netherlands; as an EU citizen I can vote locally). If you, as reader of this blog, care mostly about the environment, your best option for your direct (first) vote is the social democrat Ulrich Kelber. He is a strong pro-environment politician within the SPD, but the coal-friendly NRW SPD did not put him on the party list, so he has to get a direct mandate. For this reason Campact is campaigning for Kelber.

In the last election Kelber won the direct votes, while the Christian Democrats got more of the party (second) votes. The Christian Democrat from Bonn Claudia Lücking-Michel still got elected via the list, something that is again likely as she has place 27 on the party list of North Rhine-Westphalia.

Your second vote would then be for The Greens. The Green candidate Katja Dörner is third on the party list and will thus likely be elected via the list, although she has no chance to get a direct mandate in Bonn. Thus if Kelber gets the direct mandate, Bonn would likely be represented by the same three members of parliament as now. Because of the party lists, many districts are represented by more than one member of parliament, but three is quite a lot.

This electoral system also distributes the power. The local/district party members determine their direct candidates. The state party members determine the party lists. The federal party only determines the leading candidate.

Related reading

Carbon Brief: German election 2017: Where the parties stand on energy and climate change.

If you are still undecided who to vote for, the Wahl-O-Mat can help you. (In German)

Where the donor money goes: Parteispenden - Wer zahlt? Wie viel? An wen? (In German)

Sonntagfrage Aktuell: Graph with the latest polls. (In German, but not many words)

The Guardian: Angela Merkel races ahead in polls with six weeks to go.

* Top photo Girls'Day-Auftaktveranstaltung am 26.04.2017 in Anwesenheit von Bundeskanzlerin Angela Merkel im Bundeskanzleramt, Berlin by Initiative D21 used under a Creative Commons Attribution-NoDerivs 2.0 Generic (CC BY-ND 2.0) license.

Photo Bundestagswahl 2017 #btw2017 Die Grünen by Markus Spiske used under a Creative Commons Attribution 2.0 Generic (CC BY 2.0) license.

Wednesday, 13 September 2017

My EMS2017 highlights

When I did my PhD, our professor wanted everyone to write short reports about conferences they had attended. It was a quick way for him to see what was happening, but it is also helpful to remember what you learned and often interesting to read yourself again some time later. Here is my short report on last week's Annual Meeting of the European Meteorological Society (EMS), the European Conference for Applied Meteorology and Climatology 2017, 4–8 September 2017, Dublin, Ireland.

This post is by its nature a bit of loose sand, but there were some common themes: more accurate temperature measurements by estimating the radiation errors, eternal problems estimating various trends, collaborations between WEIRD and developing countries, and global stilling.

Radiation errors

Air temperature sounds so easy, but is hard to measure. What we actually measure is the temperature of the sensor, and because air is a good insulator, the temperature of the air and the sensor can easily be different: for example, due to self-heating of electric resistance sensors or heat flows from the sensor holder. The most important heat flow, however, is from radiation: the sun shining on the sensor, or the sensor losing heat via infra-red radiation exchange with the cold atmosphere.

In the [[metrology]] (not meteorology) session there was a talk and several posters on the beautiful work by the Korea Research Institute of Standards and Science to reduce the influence of radiation on temperature measurements. They used two thermometers, one dark and one light coloured, to estimate how large the radiation errors are and to be able to correct for them. This set-up was tested outside and in their amazing calibration laboratory.

These were sensors to measure the vertical temperature profile, going up to 15 km high. Thus they needed to study the sensors over a huge range of temperatures (-80°C to 25°C); it is terribly cold at the tropopause. The dual sensor was also exposed to a large range of solar irradiances, from 0 to 1500 Watts per square meter; the sun is much stronger up there. The pressure ranged from 10 hPa to the 1000 hPa we typically have at the surface. The low pressure makes the air an even better insulator. The radiosondes drift with the wind, reducing ventilation, so the wind only needed to be tested from 0 to 10 meters per second.
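The dark/light pair works because two sensors with different radiative sensitivities give two equations for the two unknowns, air temperature and radiation error. A minimal sketch of this idea (not the actual KRISS algorithm), with made-up sensitivity coefficients:

```python
# A minimal sketch of the dual-thermometer idea (not the actual KRISS
# algorithm): two sensors with different, known radiative sensitivities
# both read the air temperature plus an error proportional to the
# irradiance S, giving two equations for two unknowns. The sensitivity
# values below are made up for illustration.

C_DARK, C_LIGHT = 0.004, 0.001  # assumed sensitivities in K per (W/m^2)

def air_temperature(t_dark, t_light):
    """Solve T_dark = T_air + C_DARK*S and T_light = T_air + C_LIGHT*S."""
    irradiance = (t_dark - t_light) / (C_DARK - C_LIGHT)
    t_air = t_dark - C_DARK * irradiance
    return t_air, irradiance

# Example: true air temperature 15.0 C under 800 W/m^2 of sunshine.
t_dark = 15.0 + C_DARK * 800    # dark sensor reads about 18.2 C
t_light = 15.0 + C_LIGHT * 800  # light sensor reads about 15.8 C
t_air, s = air_temperature(t_dark, t_light)
print(t_air, s)  # recovers about 15.0 C and 800 W/m^2
```

In reality the sensitivities depend on pressure, ventilation and sun angle, which is exactly why the laboratory tests over that huge range of conditions are needed.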

I have seen this set-up to study radiation errors for automatic weather stations, it would be great to also use it for operational stations to reduce radiation errors.

The national metrology institute of the UK is working on a thermometer that does not have a radiation error because it directly measures the temperature of the air. Michael de Podesta does so by measuring the speed of sound very accurately. The irony is that it is hard to see how well this new acoustic thermometer works outside the lab, because the comparison thermometer has radiation errors.
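The underlying physics: in an ideal gas the speed of sound depends on the temperature but not on the pressure, so an accurate speed measurement is an accurate temperature measurement. A rough sketch for dry air (a real acoustic thermometer must also account for humidity and gas composition):

```python
import math

# Sketch of the acoustic-thermometry principle: in an ideal gas the
# speed of sound c = sqrt(gamma * R * T / M) depends on the absolute
# temperature but not on the pressure, so measuring c very accurately
# measures T. Constants are for dry air; a real instrument must also
# account for humidity.

GAMMA = 1.4        # heat capacity ratio of dry air
R = 8.314462       # universal gas constant, J/(mol K)
M = 0.0289647      # molar mass of dry air, kg/mol

def speed_of_sound(t_celsius):
    return math.sqrt(GAMMA * R * (t_celsius + 273.15) / M)

def temperature_from_sound(c):
    """Invert c = sqrt(gamma*R*T/M) to get the temperature in Celsius."""
    return c * c * M / (GAMMA * R) - 273.15

print(speed_of_sound(20.0))           # about 343 m/s at 20 C
print(temperature_from_sound(343.2))  # about 20 C
```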

Michael de Podesta's live experiments with the most accurate thermometer in human history:

To lighten up this post: I was asked to chair the metrology session because the organiser of the session (convener) gave a talk himself. The talks are supposed to be 12 minutes with 3 minutes for questions and changing to the next speaker. Because multiple sessions are running at the same time and people may switch it is important to stick to the time. Also people need some time between the time blocks to recharge.

One speaker crossed the 12 minutes and had his back towards me so that I could not signal his time was up. Thus I walked across the screen to the other side in front of him. This gave some praise on Twitter.

If you speak a foreign language (and are nervous) it can be hard to deviate from the prepared talk.

Satellite climate data

There were several talks on trying to make a stable dataset from satellite measurements to make them useful for climate change studies. Especially early satellites were not intended for quantitative use, but only to look at the moving cloud systems. And also later the satellites were mostly designed for meteorological uses, rather than climate studies.

Interesting was Ralf Quast looking at how the spectral response of the satellites deteriorated while in space. The sensitivity for visible light did not decline similarly for all colours, but deteriorated faster for blues than for reds. This was studied by looking at several calibration targets expected to be stable: the Sahara desert, the dark oceans, and the bright top of tropical convective clouds. The estimates for post-launch measurements were similar to pre-launch calibrations in the lab.

Gerrit Hall explained that there are 17 sources of uncertainty for visible satellite measurements, ranging from the noise when looking at the Earth and when looking at space (for partial calibration) to several calibration constants and comparisons to [[SI]] standards (the measurement units everyone but the USA uses).

The noise levels also change over time, typically going up over the life time, but sometimes also going down for a period. The constant noise level in the design specification often used for computations of uncertainties is just a first estimate. When looking at space the channels (measuring different frequencies of light) should be uncorrelated, but they are not always.

Global Surface Reference Network

Peter Thorne gave a talk about a future global surface climate reference network. I wrote about this network for climate change studies before.

A manuscript describing the main technical features of such a network is almost finished. The Global Climate Observing System of WMO is now setting up a group to study how we can make this vision a reality to make sure that future climatologists can study climate change with a much higher accuracy. The first meeting will be in November in Maynooth.

Global stilling

The 10-meter wind speed seems to be declining in much of the mid-latitudes, which is called "global stilling". It is especially prevalent in middle Europe (as the locals say, in my youth this was called east Europe). The last decade there seems to be an uptick again; see graph to the right from The State of the Climate 2016.

Cesar Azorin-Molina presented the work of his EU project STILLING in a longer talk in the Climate monitoring session giving an overview of global stilling research. Stilling is also expected to be one of the reasons for the reduction in pan evaporation.

The stilling could be due to forest growth and urbanization, both make the surface rougher to the wind, but could also be due to changes in the large scale circulation. Looking at vertical wind profiles one can get an idea about the roughness of the surface and thus study whether that is the reason, but there is not much such data available over longer periods.

If you have such data, know of such data, please contact Cesar. Also for normal wind data, which is hard to get, especially observations from developing countries. The next talk was about a European wind database and its quality control, this will hopefully improve the data situation in Europe.

This was part of the climate monitoring session, which has a focus on data quality. Fittingly, Cesar also studied the influence of the ageing of the cup anemometers that measure the wind speed. Their ball bearings tend to wear out, producing lower observed wind speeds. By making parallel measurements with new equipment and instruments that were a few years old he quantified this problem, which is quite big.

Because these anemometers are normally regularly calibrated and replaced, I would not expect this to produce problems for the long-term trend. Only if the wear is larger now than it was in the past would it create a trend bias. But it does create quite a lot of noise in the difference time series between one station and a neighbour, thus making relative homogenisation harder.

Marine humidity observations

My ISTI colleague Kate Willett was the recipient of the WCRP/GCOS International Data Prize 2016. She leads the ISTI benchmarking group and is especially knowledgeable when it comes to humidity observations. The prize was a nice occasion to invite her to talk about the upcoming HadISD marine humidity dataset. It looks to become a beautiful dataset with carefully computed uncertainties.

There is a decline in the 2-meter relative humidity over land since about 2000 and it is thus interesting to see how this changes over the ocean. Preliminary results suggest that also over the ocean the relative humidity is declining. Both quality control of individual values and bias corrections are important.

Developing countries

There was a workshop on the exchange of information about European initiatives in developing countries. Saskia Willemse of Meteo Swiss organised it after her experiences from a sabbatical in Bhutan. As in the rest of science, a large problem is that funding is often only available for projects and equipment, while it takes a long time to lift an organisation to a higher level and people need to learn how to use the equipment in practice. It is also a problem that equipment is often not interoperable.

More collaboration could benefit both sides. Developing countries need information to adapt to climate change and improve weather predictions. To study the climate system, science needs high quality observations from all over the world. For me it is, for example, hard to find out how measurements are made now and especially in the past. We have no parallel measurements in Africa and few in Asia. The Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) has much too few observations in developing countries. We will probably run into the same problem again with a global station reference network.

At the next EMS (in Budapest) there will be a session on this topic to get a discussion going how we can better collaborate. The organisers will reach out to groups already doing this kind of work in WMO, UNEP and the World Bank. One idea was to build a blog to get an overview of what is already happening.

I hope that it will be possible to have sustainable funding for weather services in poor countries, for capacity building and for making observations in return for opening up their data stores. That would be something the UN climate negotiations could do via the [[Green Climate Fund]]. Compared to the costs of reducing greenhouse gases and adapting our infrastructure the costs of weather services are small and we need to know what will happen for efficient planning.

Somewhat related to this is the upcoming Data Management Workshop (DMW) in Peru modelled after the European EUMETNET DMWs, but hopefully with more people from South and Central America. The Peru workshop is organised by Stefanie Gubler of the Swiss Climandes project and will be held from 28th of May to the 1st of June 2018. More information follows later.

Wet bulb temperature

For the heat stress of workers, the wet bulb temperature is important. This is the temperature of a well-ventilated thermometer covered in a wet piece of cloth. If there is some wind, the wet bulb temperature gives an indication of the thermal comfort of a sweating person.
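To get a feel for the quantity, the wet bulb temperature can be estimated from air temperature and relative humidity with the empirical fit of Stull (2011), which is valid near standard sea-level pressure; this is a common approximation, not how forecast models compute it:

```python
import math

# A rough wet-bulb temperature from air temperature (deg C) and relative
# humidity (%) using the empirical fit of Stull (2011), valid near
# standard sea-level pressure. This is a common approximation, not the
# full psychrometric calculation used operationally.

def wet_bulb_stull(t, rh):
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh) - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

print(round(wet_bulb_stull(20.0, 50.0), 1))  # 13.7, Stull's own check value
```

At 100% relative humidity the wet bulb temperature equals the air temperature; the drier the air, the further it drops below it.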

The fun fact I discovered is that the weather forecasts for the wet bulb temperature are more accurate than for the temperature and the relative humidity individually. There is even some skill up to 3 weeks in advance. Skill here only means that the weather prediction is better than using the climatic value. Any skill can have economic value, but sufficiently useful forecasts for the public would be much shorter-term.


The prize for the best Q&A goes to the talk on plague in the middle ages and its relationship with the weather in the previous period (somewhat cool previous summer, somewhat warm previous winter and a warm summer: good rat weather).

Question: why did you only study the plague in the Middle Ages?
Answer: I am a mediaevalist.

Other observational findings

Ian Simpson studied different ways to compute the climate normals (the averages over 30 years). The main difference between temperature datasets were in China due to a difference in how China itself computes the daily mean temperature (from synoptic fixed hour measurements at 0, 6, 12, 18 hours universal time) and how most climatological datasets do it (from the minimum and maximum temperature). Apart from that the main differences were seen when data was incomplete because datasets use different methods to handle this.
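A small synthetic example of why the definition of the daily mean matters: for an asymmetric diurnal cycle, (Tmin + Tmax)/2 can differ from the true daily mean by around a degree. The cycle below is made up purely for illustration:

```python
import math

# Synthetic illustration of the two daily-mean definitions mentioned
# above: the mean of the four synoptic hours (0, 6, 12, 18 UTC) versus
# (Tmin + Tmax) / 2. The diurnal cycle is made up (asymmetric on
# purpose); real differences depend on location and season.

def temperature(hour):
    """Idealised, slightly asymmetric diurnal cycle in deg C."""
    x = 2 * math.pi * (hour - 9.0) / 24.0
    return 15.0 + 5.0 * math.sin(x) + 1.5 * math.sin(2 * x + 1.0)

hourly = [temperature(h) for h in range(24)]
true_mean = sum(hourly) / 24
synoptic_mean = sum(temperature(h) for h in (0, 6, 12, 18)) / 4
minmax_mean = (min(hourly) + max(hourly)) / 2

# For this smooth cycle the four synoptic hours happen to average out
# exactly, while the min-max mean is biased by about 1 K.
print(round(synoptic_mean, 2), round(minmax_mean, 2), round(true_mean, 2))
# prints: 15.0 14.04 15.0
```

Such a systematic offset is harmless for trends as long as the definition stays the same, but it matters when combining or comparing datasets that use different definitions.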

There was another example where the automatic mode (joint detection) of HOMER produced bad homogenisation results. The manual mode of HOMER is very similar to PRODIGE, which is a good method recommended by HOME, but the joint detection part is new and has not been studied well yet. I would advise against using it by itself.

Lisa Hannak of the German weather service looked at inhomogeneities in parallel data: manual observations made next to automatic measurements. Because they are so highly correlated it is possible to see very small inhomogeneities and quite frequent ones. An interesting new field. Not directly related to EMS, but there will be a workshop on parallel data in November as part of the Spanish IMPACTRON project.

The European daily climate dataset ECA&D, which is often used to study changes in extreme weather, will soon have a homogenised version. Some breaks in earlier periods were not corrected because there were no good reference stations in those periods. I would suggest at least correcting the mean in such cases; that is better than doing nothing, and a large inhomogeneity in a dataset people expect to be homogenised is a problem.

One of the things that seems to help us free meteorological/climate data is that there is a trend towards open government. This means that as much as possible the data the government has gathered is made available to the public via an [[API]]. Finland is just working on such an initiative and also freed the data of the weather service. There are many people and especially consultants using such data. We can piggy back on this trend.

One can also estimate humidity with GPS satellites. Such data naturally also need to be homogenised. Roeland Van Malderen works on a benchmark to study how well this homogenisation would work.

The Austrian weather service ZAMG is working on an update for the HISTALP dataset with temperature and precipitation for the Greater Alpine Region. The new version will use HOMER. Two regions are ready.

It was great to see that Mexico is working on the homogenisation of some of their data. Unfortunately the network is very sparse after the 90s, which makes homogenisation difficult and the uncertainty in the trends large.

Sunday, 3 September 2017

We need to talk about a geo-intervention

Photo NASA: Filament Eruption Creates 'Canyon of Fire' on the Sun

There was a time to work on (the technologies for) reducing emissions of greenhouse gases (mitigation); now we also need to work on adaptation, and now is the time to start having a serious conversation about a geo-intervention. The journalistic hook is an interesting new commentary in the scientific journal Nature. It makes the, at least for me, surprising case that a geo-intervention to reduce the insolation would also reduce greenhouse gas concentrations and ocean acidification.

Geo-intervention is the more accurate term for what is commonly called geo-engineering. We cannot engineer the climate, but we may be able to make the climate crisis less harmful. Also our emissions of greenhouse gases are a geo-intervention and by now we cannot even say anymore that it is an unintended intervention. We know what we are doing and are doing it anyway.

The best known geo-intervention is called Solar Radiation Management, that is, a reduction of the amount of sunlight that is absorbed at the Earth’s surface. This is possible by making the Earth brighter, especially the dark oceans: it may be possible to make clouds brighter, or we could install mirrors in space. The most considered Solar Radiation Management method is creating large amounts of particles high up in the stratosphere. We know this works: large tropical volcanoes cool the Earth by emitting sulphur dioxide, creating small particles in the stratosphere.

Advantages of Solar Radiation Management would be that it is relatively cheap. A medium sized economy like The Netherlands would have to spend a few percent of its gross domestic product to keep the global mean temperature stable. That sounds like a much better deal than having your culture disappear in the waves and more countries will likely be willing to chip in and join such a coalition of the chilling.

The Nature comment makes the interesting case that reducing the warming will also reduce the concentration of carbon dioxide in the atmosphere. It would do so because vegetation would take up more carbon dioxide if it is less stressed by the heat. The Arctic would warm less, which would reduce emissions from thawing permafrost. And humans tend to use less energy when the planet is colder (for example, less air conditioning). Compared to other methods of removing carbon dioxide from the atmosphere, solar radiation management is cheap.

Disadvantages are that this would hurt the ozone layer in the stratosphere. Volcanoes use sulphur dioxide for their Solar Radiation Management; this would make acid rain worse. However, there is ongoing research on alternative particles, and hopefully acid rain will decline anyway due to the energy transition away from burning fossil fuels, which emits sulphur dioxide in the troposphere where we live. In the dry stratosphere the particles are not removed as fast as in the troposphere (for example by precipitation). So to cool the planet we would need much smaller sulphur dioxide emissions in the stratosphere than in the troposphere.

Solar Radiation Management can also not stop all climatic changes. Global warming due to greenhouse gases will mostly warm the Arctic (polar amplification), while solar radiation management would mostly cool the tropics where the sun is the strongest. Thus the temperature difference between equator and pole would become smaller and the circulation and water cycle would still change (although probably less).

The Nature comment argues that we should also talk about using Solar Radiation Management to partially offset global warming. Most studies look at bringing the temperature down to pre-industrial levels, but we could also make smaller reductions. For example, we could stabilise the temperature in the tropics. Then the rest of the planet would still warm, but the impacts in poor and thus vulnerable countries would then be reduced.

Geo-interventions are typically accompanied by academic debates about the global governance of such a system. We have seen how well that works for the original problem, geo-engineering by carbon dioxide. In the case of carbon dioxide the response has been limited by the people willing to take the largest risks with other people's lives and property. (Hopefully economic forces will now block them.) The only possible climate treaty was one without any obligations beyond reporting back. Still the incompetent president of the historically largest polluter said fuck you to the entire world for no other reason than the pleasure of saying Fuck You!

Similarly the coolest coalition of the chilling will set the temperature. If the hotheads do not like that, I am sure the reasonable cool people will be willing to talk, if the talks are also about carbon dioxide emissions.

We would have to keep on managing the insolation for millennia or until someone finds a cheap way to remove carbon dioxide from the air. The largest danger is thus that humanity gets into trouble over these millennia and would no longer be able to keep the program up, the temperature would jump up quickly and make the trouble even worse. Looking back at our history since Christ was born and especially the last century, it seems likely that we will be in trouble once in a while over such a long period.

This danger could also be an advantage, just as the mutual assured destruction (MAD) with nuclear arms brought us a period of relative peace, the automatic triggering of Mad Max would force humanity to behave somewhat sensibly and make people who love war less influential.

My impression is that the main objection from scientists against geo-interventions is their worry about creating such an automatically triggered doomsday machine. Those people seem to think of a scenario without mitigation, where we would have to do more and more Solar Radiation Management. While carbon dioxide accumulates in the atmosphere over millennia, the stratospheric particles (after a volcanic eruption) are removed after a few years. So we would need to keep adding them to the stratosphere and, if we do not reduce greenhouse gas emissions, add increasingly many of them.

I do not think humanity will forgo mitigation, but we will likely be too slow. My expectation is that we will stabilise the temperature, but after quite a lot of warming. Renewable energy is getting very cheap and still rapidly declining in price. Also batteries are declining in price. Thus I would see the energy transition as unstoppable for electrical power and private transport. That would break the political power of the fossil fuel industry and then make the rest of the transition (heating and industrial processes) a lot easier.

About 20 percent of historical warming is due to methane, which is mostly due to animal husbandry (read: burping cows) and rice paddies. The residence time of methane is about a decade. It thus accumulates much less than carbon dioxide and would be something we could fight with a modest and, importantly, stable amount of Solar Radiation Management. Hopeful was a recent study that feeding cows seaweed reduces their methane emissions almost to zero. (There are many other ethical and environmental problems with industrial agriculture, but the global warming part would then be solved.)

Thus I expect us to stabilise the climate, but at a level that will be harmful. If we stabilise at 3°C of warming, would it not be better to reduce the warming to 2°C, 1.5°C, or our current 1°C? The internationally agreed 1.5 and 2°C levels are not "safe" levels: below them there are clear damages (and above them the world will not suddenly end).

There are scientific justifications for the 2°C level, for example looking at some tipping points in the climate system, but this level is not set by science. In the end it is a political compromise between the risks of climate change and political difficulty of changing the energy system. As a Dutch person, my compromise is to go back to the old temperature. Also if the warming would stop now immediately, sea level rise would continue for millennia.

The question is not whether it would be nicer not to have climate change, but whether a geo-intervention can improve the situation. Some worry that a geo-intervention would reduce the pressure to reduce greenhouse gas emissions. In a rational world that may partially be the case, although a geo-intervention would not stop the market forces moving us to renewable energy. In the real, not rational, world it may well do the opposite.

Most people like to see the world become a better place, some for all, some for a large group they identify with, some for their community, some for their family. That may make these people blind for the possibility that some do not mind if their own situation becomes worse, as long as it becomes even worse for others and relatively they “win”. Let’s call these people supremacists or fascists.

The term supremacy, or Trump’s slogan “America first”, is already a hint that these people want to be on top; it does not say that the top is a nicer place, it is a relative measure. When the Second World War still went "well" for the Nazis, they were on top, but the suffering was enormous, less so, but also for the Germans. The Nazis did not care; for them war and violence are a natural state. They call normal people “good people”, seeing themselves as bad people, as people who enjoy bringing about the suffering of humans they perceive as less valuable. America’s white supremacists dream of a race war, which would also bring a lot of suffering on the people they claim to love.

When these people hear Greenpeace argue that vulnerable people will suffer most from climate change, it would make sense that they like this and want more of this. It is not possible to convince these people that climate change is real, they already accept it is real, they only claim they do not. The best way to get more climate change is to claim that you do not accept the science of climate change. That way you can also convince some conservatives who do not like to see others suffering, but are naturally sceptical of any claim that powerful corporations can do something wrong and trust their politicians who work for the fossil fuel companies (campaign contributions, cosy jobs afterwards).

I think the fascists are stupid to listen to Greenpeace. Yes, more people will die in poor countries, that means they will have more kids and multiply faster. Living in a harsh environment makes you flexible and strong. Our power and pampered life style is based on a fragile just-in-time economy, where everything is optimised and thus every change produces damages. Where a mid-sized bank going bust can produce a decade long recession. If civilisation goes down, the poor will have the more useful skills.

The mitigation sceptical movement seems to be against all types of geo-intervention, except for emitting greenhouse gases. Reading between the lines when they complain that scientists spread too much fear, one almost gets the impression they fear climate change more than most. That could be because they are fighting to make the worst case scenario happen and expect to be “successful”.

The fascists among them would hate it if geo-interventions made their life’s work mostly futile. If they were no longer fighting for a bad world, that would again make the transition a lot easier.

Hopefully they will move on to lie about other stuff, preferably claims that are easily checkable, like the size of inaugurations. (It is a sad state of affairs that I thought it was worthwhile to add a link.)

Let me end this post with an And-Then-There's-Physics-style last paragraph. I know a little about the quality of climate data, but I wrote this post just as a participant in the climate debate. As far as I can judge, nearly all scientists worry about geo-interventions and many do not even like doing research on it. So even if pro-intervention people are very present in the media, I am an outlier. Feel free to point out my thinking mistakes and alternative solutions.

Related reading

Do dissenters like climate change?

Nature commentary: solar geoengineering reduces atmospheric carbon burden. (Open Access with this link.)

Gernot Wagner, co-author of the book Climate Shock: It's time to take solar geoengineering seriously, even though it seems outlandish.

Raymond T. Pierrehumbert in the Bulletin of the Atomic Scientists: The trouble with geoengineers “hacking the planet”.

Reto Knutti, Joeri Rogelj, Jan Sedláček & Erich M. Fischer, 2016: A scientific critique of the two-degree climate change target. (pay-walled)

Frieler, K., Mengel, M., and Levermann, A.: Delaying future sea-level rise by storing water in Antarctica, Earth Syst. Dynam., 7, 203-210, doi: 10.5194/esd-7-203-2016, 2016.

* Top photo by NASA, Filament Eruption Creates 'Canyon of Fire' on the Sun used with a Creative Commons Attribution 2.0 Generic (CC BY 2.0) license.

Photo of thawing permafrost by NPS Climate Change Response used with a Creative Commons Attribution 2.0 Generic (CC BY 2.0) license.

Photo Collecting salt under desert sun by Armando G Alonso used with a Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license.

Photo of a strong African lady Use No Hooks by Michał Huniewicz used with a Creative Commons Attribution 2.0 Generic (CC BY 2.0) license.

Monday, 21 August 2017

Germany weather service opens up its data

Some good news from Germany. The government has decided to make the data of the German weather service (DWD) freely available for all. This comes after a parliamentary hearing in April on a bill to make the data freely available. All but one expert at the hearing were positive about this change.

Data was mostly already free for research, but really free data still helps science a lot. If data is only free for research, you have to sign a contract. For a global study that means 200 contracts, in the best case where all countries offer this at all, in the local language, with hard-to-find contact persons, with different conditions each time, and often only for a part of the data. If the data is really free, you can download it automatically, create regional and global collections, enrich them with additional information, add value with data processing (homogenisation, quality control, extremes, etc.) and publish them for everyone to use. It would also make the data streams more transparent.
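As a sketch of the kind of automation free data enables: merging per-station files into one collection. The file names, column layout and values below are invented for illustration, not the DWD's actual schema; a real script would download the files from the DWD open data portal instead of using inline strings.

```python
import csv
import io

# Hypothetical per-station files in a semicolon-separated, DWD-like layout.
station_files = {
    "station_00001.txt": "STATION_ID;DATE;TEMPERATURE\n"
                         "00001;20170821;18.3\n"
                         "00001;20170822;19.1\n",
    "station_00002.txt": "STATION_ID;DATE;TEMPERATURE\n"
                         "00002;20170821;21.0\n",
}

def build_collection(files):
    """Merge per-station files into one list of records, the first
    step towards a regional or global collection."""
    records = []
    for _name, text in files.items():
        for row in csv.DictReader(io.StringIO(text), delimiter=";"):
            # Quality control, homogenisation etc. would hook in here.
            row["TEMPERATURE"] = float(row["TEMPERATURE"])
            records.append(row)
    return records

collection = build_collection(station_files)
print(len(collection), "records from", len(station_files), "stations")
```

With open data, a loop like this scales from two inline strings to every station a weather service publishes, with no contracts to sign.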

This move was aided by the resolution of the World Meteorological Organisation (WMO) calling on its members, the weather services, to free their data:
Strengthen their commitment to the free and unrestricted exchange of [Global Framework for Climate Services] GFCS relevant data and products;

Increase the volume of GFCS relevant data and products accessible to meet the needs for implementation of the GFCS and the requirements of the GFCS partners;
Unfortunately, there is still no legally binding requirement to share the data. The weather services cannot force their governments to do so, but the resolution makes it clear that governments refusing to open the data are hurting their people.

There is also a downside: the German weather service, [[Deutscher Wetterdienst]] (DWD), currently earns about 3.5 million Euro selling data. To put that in perspective, that is about 1 percent of its 305 million Euro budget. (The DWD earns about 20% of its budget itself and thus costs only about 3 Euro per citizen per year.)

Because of these earnings, many weather services are reluctant to open up their data. Especially in poorer countries these earnings can be a considerable part of the budget. On the other hand, the benefits to society of open data are sure to be much higher, because more people and companies will actually use the data and because better data products can be produced. When it comes to climate data, I hope that the international climate negotiations can free the data in return for funding for the observational networks of poorer countries.

The main problem in Germany is, or optimistically was, the commercial weather services. They fear competition, both from the DWD itself and because free data lowers the barrier to entry for other companies to start offering better services. These companies have been so successful that for a long time it was even forbidden for the DWD to publish its weather predictions on its homepage, predictions the DWD still had to make because it is its job to warn of dangerous weather. That was an enormous destruction of value created with taxpayer money, just to create an artificial market for (often worse quality) weather predictions.

There is a similar problem where commercial media companies have succeeded in limiting the time that public broadcasting organisations can make their information available for watching/listening/download. This destruction of public capital is still ongoing.

It is good that, for weather and climate data, common sense has won in Germany. Only a small number of countries have made their data fully open, but I have the impression that there is a trend. It would be great if someone tracked this, if only to create more pressure to open the data holdings.

Related reading

Link to the DWD open data portal.

German parliament press office: Experts endorse free provision of weather service data. In German: Experten befürworten entgeltfreies Angebot der Wetterdienst-Wetterdaten.

DWD press release: Amendment to the Deutscher Wetterdienst Act in force since 25 July 2017. Tasks and responsibilities of Deutscher Wetterdienst updated to take account of today's environment.

Free our climate data - from Geneva to Paris.

Congress of the World Meteorological Organization, free our climate data.

Thursday, 3 August 2017

Ottmar Edenhofer in 2010 on international climate politics and redistribution of wealth

If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him.
Cardinal Richelieu (or his circle)

[[Ottmar Georg Edenhofer]] (born in 1961 in Germany) currently holds the professorship of the Economics of Climate Change at the Technical University of Berlin. He is deputy director and chief economist of the Potsdam Institute for Climate Impact Research (PIK). From 2008 to 2015 he served as one of the co-chairs of the Intergovernmental Panel on Climate Change (IPCC) Working Group III "Mitigation of Climate Change".

Some hold the view that climate scientists are conspiring against humanity to bring down capitalism. (Not so sure whether a natural science is the best place to start the biggest, longest, global conspiracy, whether an abstract, slow and distributed environmental problem is the best way to motivate people, whether an economic sector whose business model is political corruption is the easiest one to topple, nor whether using another energy source would change capitalism.)

As evidence they occasionally cherry-pick from an interview with Ottmar Edenhofer, then the co-chairman of Working Group III on solving climate change (mitigation) of the Intergovernmental Panel on Climate Change. He was describing the political reality and explaining why "the owners of coal and oil are not enthusiastic" about fighting climate change when he said: "We redistribute de facto the world’s wealth by climate politics."

The quote comes from an interview in the Swiss newspaper Neue Zürcher Zeitung, so I can use my comparative advantage of knowing a little German. Fortunately, while people in Switzerland speak Swiss German, Schwyzerdütsch, they write standard German, Hochdeutsch, which is hard enough for a poor Dutch natural scientist.

This is the key part of the NZZ interview with Ottmar Edenhofer in 2010:
Grundsätzlich ist es ein grosser Fehler, Klimapolitik abgetrennt von den grossen Themen der Globalisierung zu diskutieren. Der Klimagipfel in Cancún Ende des Monats ist keine Klimakonferenz, sondern eine der grössten Wirtschaftskonferenzen seit dem Zweiten Weltkrieg. Warum? Weil wir noch 11 000 Gigatonnen Kohlenstoff in den Kohlereserven unter unseren Füssen haben – und wir dürfen nur noch 400 Gigatonnen in der Atmosphäre ablagern, wenn wir das 2-Grad-Ziel halten wollen. 11 000 zu 400 – da führt kein Weg daran vorbei, dass ein Grossteil der fossilen Reserven im Boden bleiben muss.

De facto ist das eine Enteignung der Länder mit den Bodenschätzen. Das führt zu einer ganz anderen Entwicklung als der, die bisher mit Entwicklungspolitik angestossen wurde.

Zunächst mal haben wir Industrieländer die Atmosphäre der Weltgemeinschaft quasi enteignet. Aber man muss klar sagen: Wir verteilen durch die Klimapolitik de facto das Weltvermögen um. Dass die Besitzer von Kohle und Öl davon nicht begeistert sind, liegt auf der Hand. Man muss sich von der Illusion freimachen, dass internationale Klimapolitik Umweltpolitik ist. Das hat mit Umweltpolitik, mit Problemen wie Waldsterben oder Ozonloch, fast nichts mehr zu tun.
I would translate that as:
Fundamentally, it is a big mistake to discuss climate politics separately from the big issues of globalization. The climate summit in Cancún at the end of the month is not a climate conference, but one of the largest economic conferences since the Second World War. Why? Because we have 11,000 gigatons of carbon in the coal reserves under our feet – and we can only add 400 gigatons more to the atmosphere if we want to stay within the 2 °C target. 11,000 to 400 – we have to face the fact that a large part of the fossil reserves must remain in the ground.

De facto, this is the expropriation of the countries with these natural resources. This leads to an entirely different development than the one that has been initiated with development policy.

First of all, we as industrialized countries have quasi expropriated the atmosphere of the world community. But one must say clearly: We de facto redistribute the world’s wealth through climate politics. That the owners of coal and oil are not enthusiastic about this is obvious. One has to free oneself from the illusion that international climate politics is environmental politics. This has almost nothing to do any more with environmental politics, with problems such as forest dieback or the ozone hole.
That is a wordy way of saying that climate policies have large economic implications and that these impact different countries differently. For most governments, economics is more important than the environment. That means that the world leaders sit at the table, not the environment ministers. Ironically, in the sentence most often quoted by the mitigation sceptics, Ottmar Edenhofer is expressing understanding for the owners of coal and oil.
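The ratio Edenhofer cites can be checked with back-of-envelope arithmetic. This is only a sketch using the figures as quoted in the interview; the exact numbers depend on assumptions not spelled out there.

```python
# Figures as quoted in the 2010 NZZ interview (gigatons of carbon).
coal_reserves_gt = 11_000  # carbon in the coal reserves under our feet
remaining_budget_gt = 400  # carbon that may still go into the atmosphere (2 °C target)

burnable_share = remaining_budget_gt / coal_reserves_gt
print(f"Share of reserves that could still be burned: {burnable_share:.1%}")
print(f"Share that must stay in the ground: {1 - burnable_share:.1%}")
```

So under these numbers only a few percent of the coal reserves could still be burned, which is why the economic stakes of the negotiations are so large.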

They were used to violating the property rights of others without paying for it, the large-scale equivalent of dumping your trash in your neighbour's garden. Then it is annoying when the neighbour finds out what you are doing, wants you to stop and asks you to clean up the mess.

Doing nothing is also redistributing wealth: "we as industrialized countries have quasi expropriated the atmosphere of the world community." This is a kind of redistribution social Darwinists may find natural, but it goes against the property rights our capitalist system is based on. That is socialism for the owners of coal and oil.

I guess it is natural for people who are willing to pretend that climate science is wrong in order to defend their political views to assume that people who accept climate science do so for political reasons. That is [[psychological projection]] and Karl Rove strategy #3: accuse your opponent of your own weakness. My impression is the opposite: most people prefer to be grounded in reality, not just in my science bubble.

* Photo of Prof. Dr. Ottmar Edenhofer Chefökonom des Potsdam-Instituts für Klimafolgenforschung by Stephan Roehl licensed under the Creative Commons Attribution-Share Alike 2.0 Generic license.