The possible effect of cartography on the computation of the mean Earth ground temperature

Franco Pavese, independent scientist; former Research Director in Metrology at the CNR, Italy

In a previous communication [1], a data analysis of the metrological type was performed on the effect of the true uncertainty of the official data collected from the meteorological stations around the world and used to compute the mean Earth ground temperature (taken at +2 m above ground, SAMT), in order to discuss the reliability of predictions of future temperature changes, which have been pushed even beyond 2100 by the dedicated international organisations, namely the UN-supported IPCC. It was basically shown there that, with a certified uncertainty of the order of ±0.5 °C for each meteorological station, it is metrologically impossible to obtain published uncertainties as low as ±0.05 °C, or even ±0.12 °C, associated with a total average increase of +1.1 °C of the SAMT. The reason, as found by the author, is that the original uncertainty, being only the first component of the “uncertainty budget” (a mandatory requirement for any metrologically valid analysis), cannot be reduced by any subsequent statistical treatment, even one performed over millions of available data. What those organisations do is perform an additional, extremely large and complex number of treatments on those very large sets of initial data: a “screening out”, or correction, of apparently inconsistent data, followed by a filling-in of the portions of the Earth surface where no proxy data are available, adding data to the grid according to physical/mathematical models assumed to be applicable. Figure 1a,b shows a recent collation of the distribution of such meteorological stations on a chart [2]: in (a) one can appreciate that apparently large portions of the Earth surface are equipped with a very dense population of these stations.
However, in (b) an apparently fully filled portion is enlarged to show the actual local station density, where, with exceptions, the distance between stations is far larger than the 1 km² assumed to be a suitable step of the grid. One can also appreciate that on the oceans “ground” stations are obviously rare; there, temperature is basically measured with radiation thermometers on satellites, known to have an even lower temperature measurement accuracy due to the correction for the vertical atmospheric composition. What can easily be obtained in these conditions, and is done with suitable techniques by those organisations, is a general “smoothing” of the temperature distribution (particularly important for the correction of the “self”-heating effect in urban areas), i.e., a “consistency” of the dataset much higher than its precision, a fact that is probably the current reason for a gross mistake in reporting the published results. Starting from these considerations, I have now additionally observed that the Earth map of the (e.g., annual) SAMT shows an extremely large variety of situations around the world, not at all uniform, with even some limited regions showing no increase at all. In Figure 2 [3], one such map shows that the increase is basically concentrated on land, i.e. in the northern hemisphere, and there at medium-high latitudes, with a few limited exceptions. These facts suggest, first of all, that the SAMT has quite a limited meaning with respect to an analysis of the possible reasons for the local increases. In addition, about 2/3 of the Earth surface is covered by water, where for obvious reasons the increase is lower, due to the huge heat capacity and mobility of the mass of water. When one analyses those maps, one necessarily has to realise that they are flat representations of a spheroid, i.e.
that they unavoidably deform the real surface. That is the domain of “cartography” studies, usually dominated by the Mercator representation. Among the more than a dozen different projection methods, the most important difference for our purpose is whether or not they preserve surface-area ratios, because in climate studies one wants to sum up the effects of different portions of surface to get an average value of the relevant parameter, e.g. temperature. The Mercator projection, however, does not preserve areas: Mercator designed his map to be faithful for navigation between points on the surface, as that was at the time by far the most important use of maps. Here, on the contrary, we need to sum up pieces of surface, equivalent to each other, that have shown a certain change of SAMT; consequently a mean temperature value must come from summing up equivalent and consistent surface values, irrespective of their position on the Earth’s surface. That is not the case for the Mercator map: the most striking evidence is that the depicted areas inflate, with respect to the real ones, going toward the poles, whose real surface becomes smaller and smaller while on the map it remains large. I already met this issue when I used NASA maps of snow/ice coverage to compute its change over the last 20 years [4], where most of the snow/ice concentrates at high latitudes and the provided map was a Mercator-type one, the worst for the purpose. In that case, since the precision of the computation was not so critical, I approximated the whole surface above 60° by replacing the rectangle with a triangle pointing to the pole, so halving that portion of surface. In the case of the computation of the SAMT, that is not possible in such a simple way.
On the other hand, I have so far found it impossible to find any information about the type of map, or other algorithm, used for the computation of the SAMT; however, for the Earth temperature maps it is easily found in the vast literature that only non-area-preserving types were used, like Robinson’s [5]. Until recently there was a single map type that implemented our need, Peters’ one, which was made area-preserving: it is reported in Figure 3 [6], where its important differences with respect to the (probably Mercator-type) map of Figure 2 above are evident. The polar areas nearly reduce to a thick line, and regions where the temperature increase was more marked, like Europe, are also quite reduced in extension, while, e.g., Africa shows its real extension as one of the largest continents of the Earth, and the fact that water is dominant in the southern hemisphere also becomes much more evident. The reduction of the high-latitude areas is readily computed: in a Mercator-type map the regions below 60° in latitude cover 66.7% (2/3) of the total; in Peters’ map they cover 86.7%, 20% more (i.e., in this case, the extreme temperature changes at high latitudes account for 20% less in the SAMT value). Peters’ map was not well regarded until recent times (except that UNESCO uses it), because Mercator’s better covers the most-developed areas of the Earth: that is not of interest here, where what matters instead is that, should one be able to re-map Figure 2 into Figure 3, the areas of maximum temperature increase would decrease considerably in extension, and so therefore would the SAMT if computed using Figure 2 or any non-area-preserving map. The map transformation is not trivial and beyond my personal competence, but one can consult [5] for guidance in performing such a transformation; information should also, hopefully, become available from the IPCC about the type of map they used.
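The two percentages quoted above can be checked from spherical geometry: on an equal-area (Peters-type) map the band within ±60° latitude covers sin(60°) ≈ 86.6% of the surface, essentially the 86.7% quoted, while on a Mercator map the share of the same band depends on where the map is truncated, since a full Mercator map would be infinitely tall. A minimal Python sketch follows; the ±74° cutoff is my own assumption, chosen because it reproduces the 2/3 figure, not a value stated in the text:

```python
import math

def equal_area_fraction(lat_deg):
    """Fraction of a sphere's surface lying between latitudes -lat and +lat.

    On a sphere, the area of the zone between the equator and latitude phi
    is proportional to sin(phi), so the band within +/-phi covers sin(phi)
    of the total surface -- the share an area-preserving map shows.
    """
    return math.sin(math.radians(lat_deg))

def mercator_map_fraction(lat_deg, cutoff_deg):
    """Fraction of a Mercator map, truncated at +/-cutoff, occupied by the
    band within +/-lat.

    Mercator's vertical coordinate is y = ln(tan(pi/4 + phi/2)), so map
    area grows with y rather than with sin(phi), inflating high latitudes.
    """
    y = lambda d: math.log(math.tan(math.pi / 4 + math.radians(d) / 2))
    return y(lat_deg) / y(cutoff_deg)

print(f"Equal-area share of the ±60° band: {equal_area_fraction(60):.1%}")
# Assumed cutoff of ±74° (my assumption, not from the text):
print(f"Mercator share of the ±60° band:   {mercator_map_fraction(60, 74):.1%}")
```

With these formulas the equal-area share comes out as 86.6% and the Mercator share, under the assumed cutoff, as roughly two thirds, matching the ratios discussed above.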
Very recently a project was started, and recently completed, to prepare a new area-preserving map (the Equal Earth projection), different from Peters’ but eventually looking basically very similar: this map is now downloadable for free [8]. The new map is slightly different: the region within 60° of latitude now accounts for 83.3%, instead of Peters’ 86.7% and Mercator’s 66.7% (i.e., marginally 16.6% more than Mercator instead of 20%). In conclusion, in addition to the doubts already expressed about whether the SAMT can be a reliable parameter for characterising the assumed climate change, due to a true, much larger, data uncertainty, important when trying to assess a prediction, another doubt now arises: that the representation used to show, and probably to compute, the SAMT maximises the resulting increase, in accordance with the present worldwide policy, but in a possibly incorrect way. Let us use, as an example, the same basic method developed in [4] to compute instead the mean temperature of the map in Figure 2, a non-area-preserving one, probably of the Robinson type. According to the colour scale supplied with the map, one can count the number of pixels found in the whole map for each 0.25 °C temperature step. Several precautions must be taken to do this correctly. First of all, a tolerance must be set in the search, i.e., the range of colour values that will be matched: for that purpose, for a full map represented by 50,000 total pixels, tolerances between 8 and 46 were checked, to compare the differences in the results obtained, covering temperature changes from < –2 °C to > +2 °C. The results are reported in Table 1: the elaboration consisted in computing the percentage, over the full surface, of the pixels found for each ΔT step of 0.25 °C, using it to weight the ΔT value, and then summing up the resulting values for all 18 steps, the sum representing the weighted SAMT.
As expected, not all tolerances provided the same mean value, due to several features embedded in the map file as saved from the literature. For “too small” tolerances, the search is affected by the non-homogeneity of each colour step: this can be checked either on the colour-scale sample (of about 400 pixels) or on any map area supposed to be of a single colour; in both cases they were found inhomogeneous by several colour indexes, depending on the selected pixel. This fact affects the “best” minimum tolerance. On the other hand, by increasing the tolerance “too much”, the search ends up also matching adjacent colour steps, providing a too high (and incorrect) selection of pixels: in fact, for tolerances > 32, two consecutive colour levels were found selected on the colour scale instead of one, meaning overlap. Thus, in Table 1, only the results for tolerances in the range 12-16 were considered consistent, yielding an average value of ΔT = (+0.55 ± 0.12) °C (for the period 2000-2018, with respect to the reference period 1940-2000). Compared with the value from the authors’ 1910-2020 fit, (+0.61 ± 0.10) °C (Figure 4) [9], this is consistent, but it was obtained on a non-area-preserving map. Incidentally, for the period 1950-2020, the official IPCC increase per decade of the anomaly would be 20% higher (0.17 °C/decade instead of 0.14 °C/decade). In order to get the corresponding value for a Peters-type constant-surface map, one has, to a first approximation, to halve the contribution of the areas at latitudes > 60°, or to increase by 20% the contribution of the area within latitudes ±60°; the two ways are not equivalent, due to the non-homogeneous ΔT distribution shown by the map.
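The weighting step just described can be sketched in a few lines of Python. The pixel counts below are purely hypothetical placeholders standing in for the actual entries of Table 1, and only a few of the 18 ΔT steps are shown:

```python
# Sketch of the pixel-weighting step: each 0.25 °C colour step of the map's
# scale gets the number of map pixels matching its colour within the chosen
# tolerance. The counts here are hypothetical, NOT the values of Table 1.
pixel_counts = {
    -0.25: 4000,   # ΔT step (°C) -> matched pixels (illustrative only)
     0.00: 9000,
     0.25: 14000,
     0.50: 11000,
     0.75: 7000,
     1.00: 5000,
}

total = sum(pixel_counts.values())          # full map, here 50,000 pixels
# Each step's ΔT is weighted by the fraction of the map surface it covers;
# the sum over all steps is the weighted SAMT change.
weighted_samt = sum(dt * n / total for dt, n in pixel_counts.items())
print(f"Weighted mean ΔT = {weighted_samt:+.2f} °C")
```

On a genuinely area-preserving map this pixel fraction is also a surface fraction, which is precisely why the choice of projection enters the result.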
In that map the issue is simplified by the fact that those two areas coincide with the areas for ΔT > +2 °C or < –2 °C, and that the area for –2 °C is irrelevant in practice: by performing that correction, the ΔT value becomes (+0.47 ± 0.09) °C, inconsistent with the published one, being lower by –0.14 °C (about –30% of the latter), but more correct. In conclusion, the choice of the correct type of map can lead to a corrected lowering of the assessed values of the SAMT, and of its increases with time.
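The correction described above, halving the weight of the areas at latitudes > 60° (which on this map coincide with |ΔT| > 2 °C), can also be sketched. The band fractions and ΔT values below are illustrative assumptions, not the actual map data; only the direction of the effect is meant to match the text:

```python
# Sketch of the area correction, with hypothetical numbers. On a
# non-area-preserving map the >60° latitude bands are drawn too large, so
# their pixels are over-counted; halving their weight approximates the
# equal-area (Peters-type) contribution, to a first approximation.
# (ΔT in °C, fraction of map pixels, is_high_latitude) -- illustrative only.
bands = [
    ( 0.40, 0.80, False),  # within ±60°: moderate warming
    ( 2.00, 0.15, True),   # >60° N: strong warming, over-represented on map
    (-0.50, 0.05, True),   # >60° S: slight cooling, minor in practice
]

def weighted_dt(bands, high_lat_weight=1.0):
    """Area-weighted mean ΔT; high-latitude weights scaled by high_lat_weight."""
    weights = [f * (high_lat_weight if hi else 1.0) for _, f, hi in bands]
    return sum(dt * w for (dt, _, _), w in zip(bands, weights)) / sum(weights)

uncorrected = weighted_dt(bands)                      # as read from the map
corrected = weighted_dt(bands, high_lat_weight=0.5)   # halve >60° contribution
print(f"uncorrected: {uncorrected:+.2f} °C, corrected: {corrected:+.2f} °C")
```

With these (made-up) inputs the corrected mean comes out lower than the uncorrected one, the same direction as the shift from (+0.55 ± 0.12) °C to (+0.47 ± 0.09) °C reported above.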

References

  1. F. Pavese, Uncertainty in the case of lack of information: extrapolating data in time, with examples on climate forecast models, Voci in Transito, 26 November 2022.
  2. https://databasin.org/maps/new/#datasets=de8f7f71e3334eba863ff6003484364f
  3. C. P. Morice, J. J. Kennedy, N. A. Rayner, J. P. Winn, E. Hogan, R. E. Killick, R. J. H. Dunn, T. J. Osborn, P. D. Jones, I. R. Simpson, An updated assessment of near-surface temperature change from 1850: the HadCRUT5 data set, Journal of Geophysical Research: Atmospheres, 126, e2019JD032361, 2021.
  4. F. Pavese, Graphic method for retrieval of quantitative data from computer-mapped qualitative information, with a NASA video as an example, Earth Science Informatics, 13, 655-662, 2020.
  5. D. Kelley, Using map projections, 2022-08-18 (https://orcid.org/0000-0001-7808-5911), https://cran.r-project.org/web/packages/oce/vignettes/D_map_projections.html
  6. Gall–Peters projection, Wikipedia (Italian edition); F. Fontana, Le proiezioni cartografiche e la deformazione delle aree: uso di integrali di superficie [Map projections and area deformation: use of surface integrals], http://federicof89.altervista.org/cartografia/proiezCartogr.html
  7. G. Bacaro, Proiezioni Cartografiche [Map projections], Lecture 4, Sistemi Informativi Geografici (GIS), degree course in Scienze e Tecnologie per l’Ambiente e la Natura, 2nd year, 2nd semester. Last updated 19 March 2018.
  8. https://www.esri.com/about/newsroom/arcuser/equal-earth/
  9. P. Frank, Uncertainty in the global average surface temperature index: a representative lower limit, Energy & Environment, Vol. 21, No. 8, 2010.
