The Intergovernmental Panel on Climate Change (IPCC): The four factors that explain the most powerful language


In its first report, in 1990, the IPCC's language was already fairly strong, except that it acknowledged a remaining margin of uncertainty: "The unequivocal detection of the enhanced greenhouse effect is not likely for a decade or more." In plain terms: human responsibility had not yet been proven, but we were getting close. And in 1995, in the second IPCC report, it came a little closer: "The balance of evidence suggests a discernible human influence on global climate."

For American climate scientist Ben Santer, who has contributed to each of the six reports since 1990, two main factors have gradually driven this evolution in language:

  • Climate observations spanning a longer period and of better quality. From the ground or from orbit, from the surface of the oceans or from the upper and lower atmosphere, a variety of technologies provide enormous amounts of data with a level of detail once unimaginable: the amount of water in clouds, plankton blooms in the oceans, and so on.
  • Better computer models of the various climate systems. There were only half a dozen of them in 1990, and they were limited by the computing power of the time. Since then, their power and accuracy have grown, incorporating parameters ranging from how much carbon the oceans absorb and release to atmospheric chemistry, including the effect of melting ice caps on ocean currents. Each new element can be related to the others to try to measure their respective roles.

Yet despite their limitations at the time, climate models did not miss the target by much: a 2019 analysis compared observed global temperatures with the projections of 17 models, some dating back to the 1970s. Their results were, for the most part, "highly reliable in predicting global warming several years after they were published."

Their weakness was their inability to predict parameters other than temperature, as well as, in some cases, a very large margin of uncertainty. It is this margin of uncertainty that technological progress has since narrowed.

Santer identifies two other, secondary factors, related to the first two.

  • Since 1990, scientists have developed more rigorous protocols for evaluating and testing the limits of these models, and have linked everything to platforms that allow better data sharing on an international scale.
  • Surprisingly, some would say, the reactions of climate skeptics have also helped raise the level of certainty. In 1995, for example, he writes, the choice of the phrase "discernible human influence on climate" drew criticism, with some claiming that satellite data showed no warming in the lower atmosphere. That claim would turn out to be false, but it still had to be disproven, through more in-depth analyses of existing data and a strengthening of satellite monitoring capabilities. Work that "advances the science, eventually leading to better satellite observations, observations that show warming of the lower atmosphere."
