The Roman philosopher Lucius Annaeus Seneca (4 BCE-65 CE) was perhaps the first to note the universal trend that growth is slow but ruin is rapid. I call this tendency the "Seneca Effect."

Friday, April 21, 2023

A new dichotomy: nation states vs. the "one planet" movement







A new dichotomy is emerging in the current debate: the contrast between the view of the world as composed of nation-states in relentless competition with one another and the "one planet" movement, which emphasizes human solidarity. These two trends are on a collision course. Up to now, the "one planet" approach seemed to be the only chance to set up an effective strategy against the degradation of the planetary ecosystem. But now, it seems that we'll have to adapt to the new vision that's emerging: for better or for worse, the world is back in the hands of nation-states, with all their limits and their idiosyncrasies, including their large-scale homicidal tendencies. Can we find survival strategies with at least a fighting chance of succeeding? Daniele Conversi, a researcher at the University of the Basque Country, has been among the first to pose the question in his recent book "Cambiamenti Climatici" (UB).




By Daniele Conversi (University of the Basque Country)




The climate crisis is one of the nine "planetary boundaries" identified in the Earth sciences since 2009. The critical threshold for climate change (350 ppm) has only recently been passed. Other boundaries, such as biodiversity loss, have been overrun, and we are approaching other critical thresholds as well. The global environmental crisis signals the likely entrance into the most turbulent period in human history, requiring unprecedented creativity, strength, and adaptive skills to act quickly and radically in order to curb the global crisis. But what are the main obstacles ahead of us?

Historians of science, investigative journalists, and some social and political scientists have studied in detail the way the fossil fuel lobbies hampered governmental action via disinformation, misinformation, and the "denial industry." However, these studies do not generally consider in detail the institutional scenario in which lobbyists act, namely the nation-state.

My research explores a different set of variables originating in the current division of the world into nation-states powered by their own ideology, nationalism. In an age in which boundaries cannot halt climate change, nationalism fully engages in erecting ever higher boundaries.

Thus, we need to ask: if nationalism is the core ideological framework around which contemporary political relations are articulated, is it possible to involve it in the fight against climate change? I explore this question via a few case studies arising within both stateless nations and nation-states. Riding the wave of nationalism, however, only makes sense if non-national solutions are considered at the same time, as condensed in the concept of "survival cosmopolitanism": effective results can only be achieved by considering the plurality of possible solutions and avoiding fideistic responses such as 'techno-fixes' centred on a magic-irrational faith in technological innovation as the ultimate Holy Grail, which can easily be appropriated by nationalists. Salvation may come not so much from technology as from the abandonment of an economic system ruthlessly based on environmental destruction and the expansion of mass consumption.

Monday, August 23, 2021

Climate Change: What is the Worst that can Happen?

A Brontotherium, a creature similar to modern rhinos that lived up to some 35 million years ago in a world that was about 10 degrees centigrade hotter than ours. In this scene, we see a grassy plain, but Earth was mostly forested at that time. We may be moving toward similar conditions, although it is not obvious that humans could fare as well as Brontotheria did (image from BBC).

 

As was predictable, the IPCC 6th assessment report sank like a stone to the bottom of the memesphere just a few days after it was presented. Put simply, nobody is interested in sacrificing anything to reverse the warming trend and, most likely, nothing will be done. Renewable energy offers hope to mitigate the pressure on climate, but it may well be too late. We may have passed the point of no return and be in free fall toward an unknown world. 

A disclaimer: I am not saying that nothing can be done anymore. I think we should keep doing what we can, as long as we can. But, at this stage, we can ask the question: "what is the worst thing that can happen?" Models can't help us much in answering it. Complex systems -- and Earth's climate is one -- tend to be stable, but when they pass tipping points, they change rapidly and unpredictably. So, the best we can do is to imagine scenarios based on what we know, using the past as a guide.

Let's assume that humans keep burning fossil fuels for a few more decades, maybe slowing down a little, but still bent on burning everything burnable, deforesting what is deforestable, and exterminating what is exterminable. As a result, the atmosphere keeps warming, and so does the ocean. Then, at some point -- bang! -- the concentrations of greenhouse gases shoot up, the system goes kinetic and undergoes a rapid transition to a much hotter world.

The new state could be similar to what the Earth was some 50 million years ago, during the Eocene. At that time, the concentration of CO2 in the atmosphere was of the order of one thousand parts per million (today it is ca. 400) and the average surface temperature was about 10-12 degrees C higher than the current one. Note that this is an average: the high latitudes, North and South, warmed much more than the low ones, so nowhere would life experience temperatures so high that animals would boil alive. So, it was hot, but life thrived and Earth was a luxuriant, forested planet. In principle, humans could live in an Eocene-like climate. The problem is that getting there could be a rough ride, to say the least.

Nobody can say how fast we could get to a new Eocene, but tipping points are fast, so we don't need millions of years. We are thinking, more likely, of thousands of years and significant changes could occur in centuries or even in decades. So, let's try an exercise in looking at the worst-case hypothesis: assuming a warming of 5-10 degrees occurring over a time span of the order of 100-1000 years, what would we expect? It depends not just on temperatures, but on the interplay of several other factors, including mineral depletion, economic and social collapse, and the like. Let me propose a series of scenarios arranged from not so bad to very bad. Remember, these are possibilities, not predictions.


1. Extreme weather events: hurricanes and the like. These events are spectacular and often described as the main manifestation of climate change. Nevertheless, it is not obvious that a warmer world will show more violent atmospheric phenomena. A hurricane is a thermal engine: it transfers heat from a hot area to a cold area, and it is more efficient, and hence more powerful, the larger the temperature difference. From what we know, in a warmer world these differences should be smaller than they are now, at least horizontally, although vertically it is another matter. Overall, the power of hurricanes would not necessarily increase. We may have a lot more rain because a hot atmosphere can contain more water, and this is an already detectable trend. Extreme weather events would be mainly local and hardly an existential threat to human civilization. 
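To put a rough number on the heat-engine argument (a back-of-the-envelope illustration, not a hurricane model; the temperatures are just plausible round figures), the maximum fraction of the heat flow that any thermal engine can convert into work is the Carnot limit, which depends only on the temperatures of the heat source and the heat sink:

\[ \eta_{\max} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}} \approx 1 - \frac{200\ \mathrm{K}}{300\ \mathrm{K}} \approx 0.33 \]

taking, for instance, a warm source around 300 K and a cold sink around 200 K. The closer the two temperatures are to each other, the less work the engine can extract -- which is the point made above about temperature differences.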

2. Fires. Higher temperatures mean higher chances of fire, but temperature is not the only parameter that comes into play. The trends over the past decades indicate a weak increase in the number of fires in the temperate zone and, of course, fires wreak havoc on those who didn't think too much before building a wooden house in a forest of eucalyptus trees. Nevertheless, as far as we know, fires were less common in the Eocene than they are now, which is what we would expect for a world of tropical forests. Fires should not be a major threat in the future, although we may see a temporary rise in their frequency and intensity during the transition period.

3. Heat Waves. There is no doubt that heat waves kill, and that they are becoming more and more frequent. An Eocene-like climate would mean that the people living in what is today the temperate zone would experience summers as a continuous series of extreme heat waves. Paris, for instance, would have a climate similar to the current one in Dubai. It would not be pleasant, but it is also true that people can stay alive in Dubai in summer using air conditioning and taking other precautions. As long as we maintain a good supply of electricity and water, heat waves don't represent a major threat. Without electricity and abundant water, instead, disaster looms. Heat waves could force a large fraction of the population in the equatorial and temperate zones to move northward, relocate to higher ground, or simply die where they are. The toll of future heat waves is impossible to estimate, but it could mean the death of millions or tens of millions of people, or even more. It may not destroy civilization, but humans would have to move away from the tropical regions of the planet.

4. Sea level rise. Here, we face a potential threat that goes from the easily manageable to the existential, depending on how fast the ice sheets melt. The current 3.6 mm/year rate means 3-4 meters of rise in a thousand years. Over such a time span, it would be reasonably possible to adapt the harbor structures and to move them inland as the sea level rises. But if the rate increases, as it is expected to, things get tough. Having to rebuild the whole maritime commercial infrastructure in a few decades would be impossible, to say nothing of the possibility of catastrophic events involving large masses of ice crashing into the sea. If we lose the harbors, we lose the maritime commercial system. Without it, billions of people would starve to death. In the long run, the ice sheets of Greenland and Antarctica will have to melt completely, causing the sea level to rise by about 70 meters, but nobody can say how long that would take. Sea level rise has the potential to substantially disrupt human civilization, even to cause its total collapse, but not to cause the extinction of humankind.
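The arithmetic behind the "thousand years" figure is just the current rate extrapolated linearly (which is exactly the assumption that an accelerating melt would break):

\[ 3.6\ \mathrm{mm/yr} \times 1000\ \mathrm{yr} = 3600\ \mathrm{mm} \approx 3.6\ \mathrm{m} \]

that is, roughly 3-4 meters per millennium at today's pace, and proportionally more per century if the rate keeps increasing.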

5. Agricultural collapse. In principle, climate change may have disruptive effects on agriculture. Nevertheless, so far warming has not affected agricultural productivity too much. Assuming no major changes in the weather patterns, agriculture can continue producing at the current rates as long as it is supplied with 1) fertilizers, 2) pesticides, 3) mechanization, and 4) irrigation. Take out any one of these four factors and the grain fields turn into a desert (genetically modified organisms (GMOs) may not need pesticides, but they have other problems). Keeping up this supply needs a lot of energy, and that may be a big problem in the future. Photovoltaic-powered artificial food production could come to the rescue, but it is still an experimental technology and it may arrive too late. Then, of course, technology can do little against the disruption of the weather patterns. Imagine that the Indian yearly monsoon were to disappear: most likely, it would be impossible to replace the monsoon rain with artificial irrigation, and the result would be hundreds of millions of people starving to death. The lack of food is one of the main genocidal killers in history, directly or indirectly as a result of the epidemics that take advantage of weakened populations. As recently as a century and a half ago, famine directly killed about 30% of the population of Ireland, and the toll would have been larger had some of them not been able to emigrate. If we extrapolate these numbers to the world today, where there is no possibility of migrating anywhere (despite Elon Musk's efforts to take people to Mars), we are talking about billions of deaths. Famines are among the greatest threats to humankind in the near future, although climate change would be only a co-factor in generating them. Famines may wreak enough damage to cause an economic, social, and cultural collapse. 

6. Ecosystem collapse. The history of Earth has seen several cases of ecosystemic collapse involving mass extinctions: the main ones are referred to as "the big five." The largest one took place at the end of the Permian, about 250 million years ago. In that case, the ecosystem recovered from the catastrophe, but it came close to losing all the vertebrates. Most large extinctions are correlated with volcanic emissions of the type called "large igneous provinces," which generate large amounts of greenhouse gases. The result is a warming sufficiently strong to disrupt the ecosystem. The current human-caused emission rate is larger than anything ever experienced by the ecosystem before, but it is unlikely to reach levels that could cause a Permian-like disaster. While volcanoes don't care about the biosphere, humans would be wiped out long before they could pump enough CO2 into the atmosphere to cause the death of the biosphere. Nevertheless, a substantial ecosystemic collapse could be caused by factors such as the elimination of keystone species (say, bees), erosion, heavy metal pollution, the arrest of the thermohaline oceanic currents, and others. The problem is that we have no idea of the time scale involved. Some people propose a "near-term human extinction" (NTE) taking place in a few decades at most. It is not possible to prove that they are wrong, although most of the people studying the issue tend to think that the time involved should be much longer. The collapse of the ecosystem is a real threat: if it has happened in the past, it could happen again in the future. It may not be definitive, and the ecosystem would probably recover as it has done in the past. But, if it happens, it may well be the end of humans as a species (and of many other species). 

7. The unexpected. Many things could cause an abrupt and unexpected change in the state of the system. The stopping of the thermohaline currents is a threat that could wreak havoc on the biosphere, but we don't know exactly what would happen, despite spectacular movies such as "The Day After Tomorrow." Then, concentrations of CO2 of the order of 1,000 ppm could turn out to be poisonous for a biosphere that evolved for much lower concentrations. That would lead to a rapid ecosystem collapse. Then, heavy metal pollution could reduce human fertility so much that humans would go extinct in a couple of generations (we are especially sensitive to pollution because we are top predators). In this case, the human perturbation of the climate would quickly disappear, although its past effects would still be felt for a long time. Or we may think of a large-scale nuclear war. It would cause a temporary "nuclear winter" generated by the injection of light-reflecting dust into the atmosphere. The cooling would disrupt agriculture and kill off a large fraction of the human population. After a few years, though, warming would return with a vengeance. How about developing an artificial intelligence so smart that it decides that humans are a nuisance and exterminates them? Maybe it would keep some specimens in a zoo. Or a silicon-based life form might find that the whole biosphere is a nuisance and proceed to sterilize the planet. In that case, we might be transferred as virtual creatures into a virtual universe created by the AI itself. And that may be exactly what we are! These extreme scenarios are unlikely, but who knows?

 


So, this is the view from where we stand: the peak of the Seneca Cliff, the curve that describes the rapid phase transitions of complex systems on the basis of the principle that "growth is sluggish, but ruin is rapid." We see a green valley in the distance, but the road down the cliff is so steep and rough that it is hard to say whether we will survive the descent. 
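For readers who prefer to see the shape of the curve rather than just read about it, here is a minimal toy sketch in Python (an illustration only: the equations, the quadratic pollution-damage term, and all the parameter values are my own choices for this sketch, not taken from any published model). Three coupled stocks -- a non-renewable resource, an economy that grows by consuming it, and persistent pollution that accumulates and erodes the economy -- are integrated with a simple Euler step.

# Toy "Seneca curve": an economy grows by consuming a non-renewable resource
# while the pollution it generates accumulates and drags it down.
# Illustrative equations and parameters only.
def seneca_toy(a=1.0, b=0.1, c=1.0, g=1.5,
               R=1.0, C=0.01, P=0.0, dt=0.001, t_max=20.0):
    """Euler-integrate resource R, capital C, pollution P; return a list of (t, C)."""
    traj, t = [], 0.0
    while t < t_max:
        traj.append((t, C))
        production = a * R * C
        dR = -production                          # resource depletion
        dC = production - b * C - c * P * P * C   # growth minus depreciation and pollution damage
        dP = g * C                                # persistent pollution, no decay
        R, C, P = R + dR * dt, C + dC * dt, P + dP * dt
        t += dt
    return traj

if __name__ == "__main__":
    traj = seneca_toy()
    peak_t, peak_C = max(traj, key=lambda point: point[1])
    threshold = 0.1 * peak_C
    rise_start = next(t for t, C in traj if C >= threshold)
    fall_end = next(t for t, C in traj if t > peak_t and C <= threshold)
    print(f"rise, 10% of peak -> peak: {peak_t - rise_start:.1f} time units")
    print(f"fall, peak -> 10% of peak: {fall_end - peak_t:.1f} time units")

With these made-up numbers, the script reports a descent from the peak that takes noticeably less time than the climb to it did: sluggish growth, rapid ruin, in miniature.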

The most worrisome thing is not so much the steep descent in itself, but the fact that most humans not only can't understand it, they can't even perceive it. Even after the descent has started (and it may well have started already), humans are likely to misunderstand the situation, attribute the change to evil agents (the Greens, the Communists, the Trumpists, or whatever), and react in ways that will worsen the situation -- at best with extensive greenwashing, at worst with large-scale extermination programs.

So, we may well disappear as a species in a not-so-remote future. But we may also survive the disaster and re-emerge on the other side of the climate transition. For those who make it, the new Eocene might be a good world to live in, warm and luxuriant, with plenty of life. Maybe some of our descendants will use stone-tipped lances to hunt a future equivalent of the ancient Eocene's brontotheria. And, who knows, they might be wiser than we have been. 

Whether humans survive or not, the planetary ecosystem -- Gaia -- will recover nicely from the human perturbation, even though it may take a few million years for it to regain the exquisite complexity it had before humans nearly destroyed it. But Gaia is not in a hurry. The Goddess is benevolent and merciful (although sometimes ruthless), and she will live on for several hundred million years, long after even the existence of humans has been forgotten.

 

Saturday, August 7, 2021

Consensus Building: an art that we are losing. The Case of Climate Science

In 1956, Arthur C. Clarke wrote "The Forgotten Enemy," a science fiction story that dealt with the return of the ice age (image source). Surely it was not Clarke's best story, but it may have been the first written on that subject by a well-known author. Several other sci-fi authors examined the same theme, but that does not mean that, at that time, there was a scientific consensus on global cooling. It just means that a consensus on global warming was obtained only later, in the 1980s. But which mechanisms were used to obtain this consensus? And why is it that, nowadays, it seems to be impossible to attain consensus on anything? This post is a discussion on this subject that uses climate science as an example.

 

You may remember how, in 2017, during the Trump presidency, the idea briefly floated in the media of staging a debate on climate change in the form of a "red team vs. blue team" encounter between orthodox climate scientists and their opponents. Climate scientists were horrified at the idea. They were especially appalled at the military implications of the "red vs. blue" framing, which hinted at how the debate would have been organized. On the government side, it was quickly realized that in a fair scientific debate their side had no chance. So, the debate never took place, and it is good that it didn't. Maybe those who proposed it were well-intentioned (or maybe not), but in any case it would have degenerated into a fight and just created confusion.

Yet, the story of that debate that was never held hints at a point that most people understand: the need for consensus. Nothing in our world can be done without some form of consensus, and the question of climate change is a good example. Climate scientists tend to claim that such a consensus exists, and they sometimes quantify it as 97% or even 100%. Their opponents claim the opposite.

In a sense, they are both right. A consensus on climate change exists among scientists, but this is not true for the general public. The polls say that a majority of people know something about climate change and agree that something should be done about it, but that is not the same as an in-depth, informed consensus. Besides, this majority rapidly disappears as soon as it is time to do something that touches someone's wallet. The result is that, for more than 30 years, thousands of the best scientists in the world have been warning humankind of a dire threat approaching, and nothing serious has been done. Only proclamations, greenwashing, and "solutions" that worsen the problem (the "hydrogen-based economy" is a good example).

So, consensus building is a fundamental matter. You can call it a science or see it as another way to define what others call "propaganda." Some reject the very idea as a form of "mind control," while others practice it through various methods of rule-based negotiation. It is a fascinating subject that goes to the heart of our existence as human beings in a complex society. 

Here, instead of tackling the issue from a general viewpoint, I'll discuss a specific example: that of "global cooling" vs. "global warming," and how a consensus was reached that warming is the real threat. It is a dispute often said to be proof that no such thing as consensus exists in climate science. 

You have surely heard the story of how, just a few decades ago, "global cooling" was the generally accepted scientific view of the future, and how those silly scientists then changed their minds, switching to warming instead. Conversely, you may also have heard that this is a myth and that there never was such a thing as a consensus that Earth was cooling.

As is always the case, the reality is more complex than politics wants it to be. Global cooling as an early scientific consensus is one of the many legends generated by the discussion about climate change and, like most legends, it is basically false. But it has at least some links with reality. It is an interesting story that tells us a lot about how consensus is obtained in science. But we need to start from the beginning.

The idea that Earth's climate was not stable emerged in the mid-19th century with the discovery of the past ice ages. At that point, an obvious question was whether ice ages could return in the future. The matter remained at the level of scattered speculations until the mid 20th century, when the concept of "new ice age" appeared in the "memesphere" (the ensemble of human public memes). We can see this evolution using Google "Ngrams," a database that measures the frequency of strings of words in a large corpus of published books (Thanks, Google!!).

 

You see that the possibility of a "new ice age" entered the public consciousness as early as the 1920s, then grew and reached a peak in the early 1970s. Other strings such as "Earth cooling" and the like give similar results. Note also that the "English Fiction" database generates a large peak for the concept of a "new ice age" at about the same time, in the 1970s. Later on, cooling was completely replaced by the concept of global warming. You can see in the figure below how the crossover arrived in the late 1980s.
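If you want to reproduce this kind of curve yourself rather than using the online viewer, the raw Google Books Ngram exports can be processed with a few lines of code. The sketch below is only an illustration: the file names are placeholders, and the column layout described in the comments is my assumption about the published dataset format, so check it against the files you actually download.

import csv
from collections import defaultdict

# Assumed file layout (verify against your download):
#   ngram file:  ngram <TAB> year <TAB> match_count <TAB> volume_count
#   totals file: tab-separated records, each "year,match_count,page_count,volume_count"

def phrase_counts(ngram_file, phrase):
    """Sum the yearly match counts of one phrase from an Ngram export file."""
    counts = defaultdict(int)
    with open(ngram_file, encoding="utf-8") as f:
        for row in csv.reader(f, delimiter="\t"):
            if row and row[0] == phrase:
                counts[int(row[1])] += int(row[2])
    return counts

def total_counts(totals_file):
    """Read the per-year total word counts used for normalization."""
    totals = {}
    with open(totals_file, encoding="utf-8") as f:
        for record in f.read().split("\t"):
            parts = record.split(",")
            if len(parts) >= 2 and parts[0].strip().isdigit():
                totals[int(parts[0])] = int(parts[1])
    return totals

if __name__ == "__main__":
    counts = phrase_counts("eng-3gram-export.tsv", "new ice age")  # placeholder file name
    totals = total_counts("eng-totalcounts.txt")                   # placeholder file name
    for year in sorted(counts):
        if year in totals:
            print(year, counts[year] / totals[year])  # relative frequency, like the Ngram Viewer

Plotting the printed frequencies against the year gives the same kind of curve shown here, apart from the smoothing that the online viewer applies.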

 


Even after it started to decline, the idea of a "new ice age" remained popular, and journalists loved presenting it to the public as an imminent threat. For instance, Newsweek printed an article titled "The Cooling World" in 1975, and the concept also provided good material for the catastrophic genre in fiction. As late as 2004, it was the basis of the movie "The Day After Tomorrow."

Does that mean that scientists ever believed that the Earth was cooling? Of course not. There was no consensus on the matter. The status of climate science until the late 1970s simply didn't allow certainties about Earth's future climate.

As an example, in 1972, the well-known report to the Club of Rome, "The Limits to Growth," noted the growing concentration of CO2 in the atmosphere, but it did not state that it would cause warming -- evidently the issue was not yet clear even for scientists engaged in global ecosystem studies. Eight years later, in 1980, the authors of "The Global 2000 Report to the President of the U.S.," commissioned by President Carter, already had a much better understanding of the climate effects of greenhouse gases. Nevertheless, they did not rule out global cooling, and they discussed it as a plausible scenario.

The Global 2000 Report is especially interesting because it provides some data on the opinion of climate scientists as it was in 1975. Twenty-eight experts were interviewed and asked to forecast the average world temperature for the year 2000. The result was no warming, or a minimal one of about 0.1 C. In the real world, though, temperatures rose by more than 0.4 C by 2000. Clearly, in 1980, there was no such thing as a scientific consensus on global warming. On this point, see also the paper by Peterson (2008), which analyzes the scientific literature of the 1970s. A majority of papers were found to favor global warming, but there was also a significant minority arguing for no temperature change or for global cooling.

Now we are getting to the truly interesting point of this discussion. The consensus that Earth was warming did not exist before the 1980s, but then it became the norm. How was it obtained?

There are two interpretations floating in the memesphere today. One is that scientists agreed on a global conspiracy to terrorize the public about global warming in order to obtain personal advantages. The other is that scientists are cold-blooded data analyzers and that they did as John Maynard Keynes reportedly said: "When I have new data, I change my mind." 

Both are legends. The one about the scientific conspiracy is obviously ridiculous, but the second is just as silly. Scientists are human beings and data are not a gospel of truth. Data are always incomplete, affected by uncertainties, and need to be selected. Try to develop Newton's law of universal gravitation without ignoring all the data about falling feathers, paper sheets, and birds, and you'll see what I mean. 

In practice, science is a fine-tuned consensus-building machine. It has evolved exactly for the purpose of smoothly absorbing new data in a gradual process that does not lead (normally) to the kind of partisan division that's typical of politics. 

Science uses a procedure derived from an ancient method that, in Medieval times, was called disputatio and that has its roots in the art of rhetoric of classical times. The idea is to debate issues by having champions of the different theses square off against each other and try to convince an informed audience using the best arguments they can muster. The Medieval disputatio could be very sophisticated: as an example, I discussed the "Controversy of Valladolid" (1550-51) on the status of the American Indians. Theological disputationes normally failed to harmonize truly incompatible positions, say, convincing Jews to become Christians (it was tried more than once, but you may imagine the results). But sometimes they did lead to good compromises, and they kept the confrontation at the verbal level (at least for a while).

In modern science, the rules have changed a little, but the idea remains the same: experts try to convince their opponents using the best arguments they can muster. It is supposed to be a discussion, not a fight. Good manners are to be maintained and the fundamental feature is being able to speak a mutually understandable language. And not just that: the discussants need to agree on some basic tenets of the frame of the discussion.  During the Middle Ages, theologians debated in Latin and agreed that the discussion was to be based on the Christian scriptures. Today, scientists debate in English and agree that the discussion is to be based on the scientific method.

In the early times of science, one-to-one debates were used (maybe you remember the famous 1860 debate on Darwin's ideas between Thomas Huxley and Bishop Wilberforce). But, nowadays, that is rare. The debate takes place at scientific conferences and seminars where several scientists participate, gaining or losing "prestige points" depending on how good they are at presenting their views. Occasionally, a presenter, especially a young scientist, may be "grilled" by the audience in a small re-enactment of the coming-of-age ceremonies of Native Americans. But, most important of all, informal discussions take place all over the conference. These meetings are not supposed to be vacations; they serve the face-to-face exchange of ideas. As I said, scientists are human beings, and they need to meet face to face to understand each other. A lot of science is done in cafeterias and over a glass of beer. Possibly, most scientific discoveries start in this kind of informal setting. No one, as far as I know, was ever struck by a ray of light from heaven while watching a PowerPoint presentation.

It would be hard to maintain that scientists are more adept at changing their views than Medieval theologians were, and older scientists do tend to stick to old ideas. Sometimes you hear that science advances one funeral at a time; it is not wrong, but surely an exaggeration: scientific views do change even without having to wait for the old guard to die. The debate at a conference can decisively tilt toward one side on the basis of the brilliance of a scientist, the availability of good data, and the overall competence demonstrated. 

I can testify that, at least once, I saw someone in the audience rise up after a presentation and say, "Sir, I was of a different opinion until I heard your talk, but now you have convinced me. I was wrong and you are right." (And I can tell you that this person was more than 70 years old; good scientists may age gracefully, like wine.) In many cases, the conversion is not so sudden and spectacular, but it does happen. Then, of course, money can do miracles in affecting scientific views but, as long as we stick to climate science, there is not a lot of money involved, and corruption among scientists is not as widespread as it is in other fields, such as medical research.

So, we can imagine that in the 1980s the consensus machine worked as it was supposed to, and it led the general opinion of climate scientists to switch from cooling to warming. That was a good thing, but the story didn't end there. There remained the task of convincing people outside the narrow field of climate science, and that was not easy. 

From the 1990s onward, the disputatio was dedicated to convincing non-climate scientists, that is, both scientists working in different fields and intelligent laypersons. There was a serious problem with that: climate science is not a matter for amateurs, and it is a field where the Dunning-Kruger effect (people overestimating their competence) may be rampant. Climate scientists found themselves dealing with various kinds of opponents: typically, elderly scientists who refused to accept new ideas or, sometimes, geologists who saw climate science as invading their turf and resented it. Occasionally, opponents could score points in the debate by focusing on narrow points that they themselves had not completely understood (for instance, the "tropospheric hot spot" was a fashionable trick). But when the debate involved someone who knew climate science well enough, the opponents' destiny was to be easily steamrolled.

These debates went on for at least a decade. You may know the 2009 book by Randy Olson, "Don't be Such a Scientist" that describes this period. Olson surely understood the basic point of debating: you must respect your opponent if you aim at convincing him or her, and the audience, too. It seemed to be working, slowly. Progress was being made and the climate problem was becoming more and more known.

And then, something went wrong. Badly wrong. Scientists suddenly found themselves cast into another kind of debate, one for which they had no training and little understanding. You can see in Google Ngrams how the idea that climate change was a hoax lifted off in the 2000s and became a feature of the memesphere. Note how rapidly it rose: it reached a climax in 2009, with the Climategate scandal, but it didn't decline afterward.



It was a completely new way of discussing: no longer a disputatio. No more rules, no more reciprocal respect, no more common language. Only slogans and insults. A climate scientist described this kind of debate as being like a "bare-knuckle bar fight." From then onward, the climate issue became politicized and sharply polarized. No progress was made, and none is being made right now.

Why did this happen? In large part, it was because of a professional PR campaign aimed at disparaging climate scientists. We don't know who designed it and paid for it but, surely, there existed (and still exist) industrial lobbies which were bound to lose a lot if decisive action to stop climate change was implemented. Those who had conceived the campaign had an easy time against a group of people who were as naive in terms of communication as they were experts in terms of climate science. 

The Climategate story is a good example of the mistakes scientists made. If you read the whole corpus of the thousands of emails released in 2009, nowhere will you find that the scientists were falsifying data, engaging in conspiracies, or trying to obtain personal gains. But they managed to give the impression of being a sectarian clique that refused to accept criticism from their opponents. In scientific terms, they did nothing wrong, but in terms of image, it was a disaster. Another mistake scientists made was to try to steamroll their adversaries by claiming a 97% scientific consensus on human-caused climate change. Even assuming that it is true (it may well be), it backfired, giving once more the impression that climate scientists are self-referential and do not take into account the objections of other people. 

Let me give you another example of a scientific debate that derailed and became a political one. I already mentioned the 1972 study "The Limits to Growth." It was a scientific study, but the debate that ensued was outside the rules of scientific debate. A feeding frenzy among sharks would be a better description of how the world's economists got together to shred the LTG study to pieces. The "debate" rapidly spilled over to the mainstream press, and the result was a general demonization of the study, accused of having made "wrong predictions" and, in some cases, of planning the extermination of humankind. (I discuss this story in my 2011 book "The Limits to Growth Revisited.") The interesting (and depressing) thing you can learn from this old debate is that no progress was made in half a century. Approaching the 50th anniversary of the publication, you can find the same criticisms republished afresh on websites: "wrong predictions" and all the rest. 

So, we are stuck. Is there any hope of reversing the situation? Hardly. The loss of the capability of obtaining a consensus seems to be a feature of our times: debates require a minimum of reciprocal respect to be effective, but that has been lost in the cacophony of the Web. The only form of debate that remains is the vestigial one that sees presidential candidates stiffly exchanging platitudes with each other every four years. But a real debate? No way; it is gone like the disputes among theologians in the Middle Ages.

The discussion on climate, just as on all important issues, has moved to the Web, in large part to social media. And the effect on consensus-building has been devastating. It is one thing to face a human being across a table with two glasses of beer on it; it is quite another to see a chunk of text fall out of the blue as a comment on your post. This is a recipe for a quarrel, and it works like that every time. 

Also, it doesn't help that international scientific meetings and conferences have all but disappeared in a situation that discourages meetings in person. Online meetings turned out to be hours of boredom in which nobody listens to anybody and everyone is happy when it is over. Even if you can still manage to be at an in-person meeting, it doesn't help that your colleague appears to you in the form of a masked bag of dangerous viruses, to be kept at a distance all the time, if possible behind a plexiglass barrier. Not the best way to establish a human relationship.

This is a fundamental problem: if you can't build a consensus through debate, the only other possibility is to use the political method. It means attaining a majority by means of a vote (and note that in science, as in theology, voting is not considered an acceptable consensus-building technique). After the vote, the winning side can force its position on the minority using a combination of propaganda, intimidation, and, sometimes, physical force. An extreme consensus-building technique is the extermination of the opponents. It has been done so often in history that it is hard to think it will not be done again on a large scale in the future, perhaps not even a remote one. But, apart from the moral implications, forced consensus is expensive, inefficient, and often leads to the establishment of dogmas. Then it becomes impossible to adapt to new data when they arrive. 

So, where are we going? Things keep changing all the time; maybe we'll find new ways to attain consensus even online, which implies, at a minimum, not insulting and attacking your opponent right from the beginning. As for a common language, after switching from Latin to English, we might now switch to "Googlish," a new world language that might be structured to avoid clashes of absolutes -- perhaps it would simply be devoid of expletives, or perhaps it would have some specific features that help build consensus. For sure, we need a reform of science that gets rid of the corruption rampant in many fields: money is a kind of consensus, but not the one we want.

Or, maybe, we might develop new rituals. Rituals have always been a powerful way to attain consensus, just think of the Christian mass (the Christian church has not yet realized that it has received a deadly blow from the anti-virus rules). Could rituals be transferred online? Or would we need to meet in person in the forest as the "book people" imagined by Ray Bradbury in his 1953 novel "Fahrenheit 451"?

We cannot say. We can only ride the wave of change that, nowadays, seems to have become a true tsunami. Will we float or sink? Who can say? The shore seems to be still far away.


h/t Carlo Cuppini and "moresoma"


Thursday, July 29, 2021

We are not in the Holocene Anymore: A World Without Permanent Ice.

The post below is reproduced from my blog "The Proud Holobionts," but I think the subject is compatible with the vision of the "Seneca Effect" blog. Indeed, everything is related on this planet, and the concept of "holobiont" can be seen as strictly connected to the concept of the "Seneca Cliff." Complex systems, both virtual and real, are networks that can almost always be seen as holobionts in their structure. A collapse, then, is when the network undergoes a chain of link breakings, a process known in engineering as the "Griffith fracture mechanism" (you see that everything is correlated!)
 
This post is also part of the material that Chuck Pezeshki and I are assembling for a new book that will be titled (provisionally) "Holobiont: the new Science of Collaboration," where we plan to explore how new concepts in biology and network science can combine to give us the key to managing highly complex systems: human societies, large and small. And the overarching concept that links all this is one: empathy.

 

 

When the Ice Will be Gone: The Greatest Change Seen on Earth in 30 Million Years.

From: "The Proud Holobionts," July 27, 2021

 

An image from the 2006 movie "The Meltdown," the second of the "Ice Age" series. These movies attempted to present a picture of Earth during the Pleistocene. Of course, they were not supposed to be paleontology lessons, but they did show the megafauna of the time (mammoths, sabertooth tigers, and others) and the persistent ice, as you see in the figure. The plot of "The Meltdown" was based on a real event: the breakdown of the ice dam that kept Lake Agassiz confined inside the great Laurentide glaciers, on the North American continent. When the dam broke, some 15,000 years ago, the lake flowed into the sea in a giant flood that changed Earth's climate for more than a thousand years. So, the concept of ice ages as related to climate change is penetrating the human memesphere. It is strange that this is happening just when human activity is pushing the ecosystem back toward a pre-glacial period. If that happens, it will be the greatest change seen on Earth in 30 million years. And we won't be in the Holocene anymore.

 

We all know that there is permanent ice at Earth's poles: it forms glaciers and it covers huge areas of the sea. But is it there by chance, or is it functional in some way to Earth's ecosphere? 

Perhaps the first to ask this question was James Lovelock, the proposer (together with Lynn Margulis) of the concept of "Gaia" -- the name for the great holobiont that regulates the planetary ecosystem. Lovelock has always been a creative person and in his book "Gaia: A New Look at Life on Earth" (1979) he reversed the conventional view of ice as a negative entity. Instead, he proposed that the permanent ice at the poles was part of the planetary homeostasis, actually optimizing the functioning of the ecosphere. 

Lovelock was perhaps influenced by the idea that the efficiency of a thermal engine is directly proportional to the temperature difference that the circulating fluid encounters. It may make sense: permanent ice creates large temperature differences between the poles and the equator and, as a consequence, winds and ocean currents are stronger, and the "pumps" that bring nutrients everywhere sustain more life. Unfortunately, this idea is probably wrong, but Lovelock has the merit of having opened the lid on a set of deep questions about the role of permanent ice in the ecosystem. What do we know about this matter?

It took some time for our ancestors to realize that permanent ice existed in large amounts in the high-latitude regions. The first to see the ice sheet of Greenland was probably Eric the Red, the Norwegian adventurer, when he traveled there around the year 1000. But he had no way of knowing the true extent of the inland ice, and he didn't report on it.

The first report I could find on Greenland's ice sheet is the 1820 "History of Greenland," a translation of an earlier (1757) German report by David Crantz, where you can find descriptions of the ice-covered inland mountains. By the early 20th century, maps clearly showed Greenland as fully ice-covered. As for Antarctica, by the end of the 19th century it was known to be fully covered with a thick ice sheet as well. 

Earlier on, in the mid-19th century, Louis Agassiz had proposed a truly revolutionary idea: that of the ice age. According to Agassiz, in ancient times, much of Northern Europe and North America had been covered with thick ice sheets. Gradually, it became clear that there had not been just one ice age, but several, coming and going in cycles. In 1930, Milutin Milankovich proposed that these cycles were linked to periodic variations in the insolation of the Northern Hemisphere, in turn caused by cycles in Earth's motion. For nearly a million years, Earth has been a sort of giant pendulum in terms of the extent of its ice sheets. 

The 2006 movie "An Inconvenient Truth" was the first time these discoveries were presented to the general public. Here we see Al Gore showing the temperature data of the past half million years.

An even more radical idea about ice ages appeared in 1992, when Joseph Kirschvink proposed the concept of "Snowball Earth." The idea is that Earth was fully covered by ice at some moment around 700-600 million years ago, in the period appropriately called the "Cryogenian."

This super-ice age is still controversial: it will never be possible to prove that every square kilometer of the planet was under ice, and there is some evidence that this was not the case. But, surely, we are dealing with a cooling phase much more severe than anything seen in relatively recent geological times.

As more ice ages were discovered, it also became clear that Earth had been ice-free for most of its long existence. Our times, with permanent ice at the poles, are rather exceptional. Let's take a look at the temperatures of the past 65 million years (the "Cenozoic") in this remarkable image.

At the beginning of the Cenozoic, Earth was still reeling after the great disaster of the end of the Mesozoic, the one that led to the disappearance of the dinosaurs (by the way, almost certainly not caused by an asteroidal impact). But, from 50 million years ago onward, the trend has been constant: cooling. 

The Earth is now some 12 degrees centigrade colder than it was during the "warmhouse" of the Eocene. It was still ice-free up to about 35 million years ago but, gradually, permanent ice started accumulating, first in the Southern Hemisphere, then in the Northern one. During the Cenozoic, Earth has never been as cold as it is now.

The reasons for the gradual cooling are still being debated, but the simplest explanation is that it is due to the decline of CO2 concentrations in the atmosphere. That, in turn, may be caused by a slowdown of the outgassing of carbon from Earth's interior. Maybe Earth is just becoming a little older and colder, and so less active in terms of volcanoes and similar phenomena. There are other explanations, including the collision of India with Central Asia and the rise of the Himalayas, which caused a drawdown of CO2 through the weathering of silicates. But it is a hugely complicated story, so let's not go into the details.

Let's go back to our times. Probably you heard how, just a few decades ago, those silly scientists were predicting that we would go back to an ice age. That's an exaggeration -- there never was such a claim in the scientific literature. But it is true that the idea of a new ice age was floating in the memesphere, and for good reasons: if the Earth had seen ice ages in the past, why not a new one? Look at these data:

These are temperatures and CO2 concentrations from the Vostok ice cores, in Antarctica (you may have seen these data in Al Gore's movie). They describe the glacial cycles of the past 400,000 years. Without going into the details of what causes the cycles (solar irradiation cycles trigger them, but do not cause them), you may note how low we went in both temperatures and CO2 concentrations at the coldest moments of the past ice ages. The latest ice age was especially cold and associated with very low CO2 concentrations. 

Was Earth poised to slide down to another "snowball" condition? It cannot be excluded. What we know for sure is that during the past million years, the Earth teetered close to the snowball catastrophe every 100,000 years or so. What saved it from sliding all the way into an icy death?

There are several factors that may have stopped the ice from expanding all the way to the equator. For one thing, the sun's irradiation is today about 7% larger than it was at the time of the last snowball episode, during the Cryogenian. But that may not be enough of an explanation. Another factor is that the cold and the low CO2 concentrations may have led to a weakening -- or even to a stop -- of the biological pump in the oceans and of the biotic pump on land. Both these pumps cycle water and nutrients, keeping the biosphere alive and well. Their near disappearance may have caused a general loss of activity of the biosphere and, hence, the loss of one of the mechanisms that remove CO2 from the atmosphere. So, CO2 concentrations increased as a result of the continuing geological emissions, unaffected by changes in the biosphere. Note how, in the figure, the CO2 concentration and temperatures are perfectly superimposable during the warming phases: the reaction of the temperature to the CO2 increase was instantaneous on a geological time scale. Another factor may have been the desertification of the land, which led to an increase in atmospheric dust that landed on top of the glaciers. That lowered the albedo (the reflected fraction of light) of the system and led to a new warming phase. A very complicated story that is still being unraveled.

But how close was the biosphere to total disaster? We will never know. What we know is that, 20 thousand years ago, the atmosphere contained just 180 parts per million (ppm) of CO2 (today, we are at 410 ppm). That was close to the survival limit of green plants and there is evidence of extensive desertification during these periods. Life was hard for the biosphere during the recent ice ages, although not so bad as in the Cryogenian. Lovelock's idea that permanent ice at the poles is good for life just doesn't seem to be right.

Of course, the idea that we could go back to a new ice age was legitimate in the 1950s, but not anymore, now that we understand the role of human activities in the climate. Some people maintain that it was a good thing that humans started burning fossil hydrocarbons, since that "saved us from a new ice age." Maybe, but this is a classic case of too much of a good thing. We are pumping so much CO2 into the atmosphere that our problem is now the opposite: we are not facing an "icehouse Earth" but a "warmhouse," or even a "hothouse," Earth. 

A "hothouse Earth" would be a true disaster since it was the main cause of the mass extinctions that took place in the remote past of our planet. Mainly, the hothouse episodes were the result of outbursts of CO2 generated by the enormous volcanic eruptions called "large igneous provinces." In principle, human emissions can't even remotely match these events. According to some calculations, we would need to keep burning fossil fuels for 500 years at the current rates to create a hothouse like the one that killed the dinosaurs (but, there is always that detail that non linear systems always surprise you . . .)

Still, considering feedback effects such as the release of the methane buried in the permafrost, it is perfectly possible that human emissions could bring CO2 concentrations in the atmosphere to levels of the order of 600-800 ppm, or even more, comparable to those of the Eocene, when temperatures were 12 degrees higher than they are now. We may reach the condition sometimes called "warmhouse Earth."

From the human viewpoint, it would be a disaster. If the change were to occur in a relatively short time, say, of the order of a few centuries, human civilization is probably toast. We are not equipped to cope with this kind of change. Just think of what happened some 14,500 years ago, when the great Laurentide ice sheet in North America fragmented and collapsed (the 2006 movie "The Meltdown" was inspired by exactly this event). Earth's climate went through a series of cold and warm spells that it is hard to think we could survive. 

 



Human survival concerns are legitimate, but probably irrelevant in the greater scheme of things. If we were to go back to the Eocene, the ecosystem would take a big hit during the transition, but it would survive and then adapt to the new conditions. In terms of life, the Eocene has been described as "luxuriant." With plenty of CO2 in the atmosphere, forests were thriving and, probably, the biotic pump provided abundant water everywhere inland, even though the temperatures were relatively uniform across latitudes. A possible mental model for that period is the modern tropical forests of Central Africa or Indonesia. We don't have data that would allow us to compare Earth's productivity today with that of the Eocene, but we can't exclude that the Eocene was more productive in terms of life. Humans might well adapt to this new world, although their survival during the transition is by no means guaranteed. 

Again, it seems that Lovelock was wrong when he said that ice ages optimize the functioning of the biosphere. But maybe there is more to this idea. In at least one respect, ice ages do seem to have a good effect on life. Take a look at this image, which summarizes the main ice ages of Earth's long history.



The interesting point is that ice ages seem to occur just before major transitions in the evolutionary history of Earth. We don't know much about the Huronian ice age, but it occurred just at the boundary between the Archean and the Proterozoic, at the time of the appearance of the eukaryotes. Then, the Cryogenian preceded the Ediacaran period and the appearance of the multicellular life that would later colonize the land. Finally, even the evolution of Homo sapiens may be related to the most recent cycle of ice ages. With the cooling of the planet and the reduction of the extent of forested areas, our ancestors were forced to leave the comfortable forests where they had lived up to then and take up a more dangerous lifestyle in the savannas. And you know what that led to!

So, maybe there is something good about ice ages and, after all, James Lovelock's intuition may have hinted at an important insight into how evolution works. Then, there remains the question of how exactly ice ages drive evolution. Maybe they have an active role, or maybe they are simply a parallel effect of the real driver of evolution, quite possibly the increasing concentration of atmospheric oxygen that has accompanied the biosphere over the past 2.7 billion years. Oxygen is the magic pill that boosts the metabolism of aerobic creatures -- it is what makes creatures like us possible. 

In any case, it is likely that ice ages will soon be a thing of the past on planet Earth. The effect of the human perturbation may be moderate and, when humans stop burning fossil hydrocarbons (they will have to, one day or another), the system may reabsorb the excess CO2 and gradually return to the ice age cycles of the past. That may occur over times of the order of at least several thousand years, possibly several tens of thousands. But the climate is a nonlinear system, and it may react by reinforcing the perturbation -- the results are unknowable. 

What we know for sure is that the life cycle of Earth's ecosystem (Gaia) is limited. We still have about 600 million years before the sun's increasing brightness takes Earth to a different condition: that of a "wet greenhouse" that will bring the oceans to a boil and extinguish all life on the planet. And so it will be what it will have to be. Gaia is long-lived, but not eternal.