The Roman philosopher Lucius Annaeus Seneca (4 BCE-65 CE) was perhaps the first to note the universal trend that growth is slow but ruin is rapid. I call this tendency the "Seneca Effect."

Saturday, August 7, 2021

Consensus Building: an art that we are losing. The Case of Climate Science

In 1956, Arthur C. Clarke wrote "The Forgotten Enemy," a science fiction story that dealt with the return of the ice age. Surely it was not Clarke's best story, but it may have been the first written on that subject by a well-known author. Several other sci-fi authors examined the same theme, but that does not mean there was, at the time, a scientific consensus on global cooling. It just means that a consensus on global warming was reached only later, in the 1980s. But which mechanisms were used to obtain that consensus? And why does it seem impossible, nowadays, to attain consensus on anything? This post discusses the subject using climate science as an example.

 

You may remember how, in 2017, during the Trump presidency, the idea briefly floated in the media of staging a debate on climate change in the form of a "red team vs. blue team" encounter between orthodox climate scientists and their opponents. Climate scientists were horrified at the idea. They were especially appalled at the military overtones of the "red vs. blue" framing, which hinted at how the debate would have been organized. On the government side, meanwhile, it was quickly realized that their side had no chance in a fair scientific debate. So, the debate never took place, and it is good that it didn't. Maybe those who proposed it were well intentioned (or maybe not), but in any case it would have degenerated into a fight and just created confusion.

Yet, the story of that debate that was never held hints at a point that most people understand: the need for consensus. Nothing in our world can be done without some form of consensus, and the question of climate change is a good example. Climate scientists tend to claim that such a consensus exists, and they sometimes quantify it at 97% or even 100%. Their opponents claim the opposite.

In a sense, they are both right. A consensus on climate change exists among scientists, but this is not true for the general public. The polls say that a majority of people know something about climate change and agree that something should be done about it, but that is not the same as an in-depth, informed consensus. Besides, this majority rapidly disappears as soon as it is time to do something that touches someone's wallet. The result is that, for more than 30 years, thousands of the best scientists in the world have been warning humankind of a dire threat approaching, and nothing serious has been done. Only proclamations, greenwashing, and "solutions" that worsen the problem (the "hydrogen-based economy" is a good example).

So, consensus building is a fundamental matter. You can call it a science, or see it as another way to define what others call "propaganda." Some reject the very idea as a form of "mind control"; others practice it through various methods of rule-based negotiation. It is a fascinating subject that goes to the heart of our existence as human beings in a complex society.

Here, instead of tackling the issue from a general viewpoint, I'll discuss a specific example: that of "global cooling" vs. "global warming," and how a consensus was obtained that warming is the real threat. It is a dispute often said to be proof that no such thing as consensus exists in climate science.

You have surely heard the story of how, just a few decades ago, "global cooling" was the generally accepted scientific view of the future, and how those silly scientists then changed their minds, switching to warming instead. Conversely, you may also have heard that this is a myth and that there never was a consensus that Earth was cooling.

As is always the case, reality is more complex than politics would like it to be. Global cooling as an early scientific consensus is one of the many legends generated by the discussion about climate change and, like most legends, it is basically false. But it has at least some links with reality. It is an interesting story that tells us a lot about how consensus is obtained in science. But we need to start from the beginning.

The idea that Earth's climate was not stable emerged in the mid-19th century with the discovery of the past ice ages. At that point, an obvious question was whether ice ages could return in the future. The matter remained at the level of scattered speculations until the mid-20th century, when the concept of a "new ice age" appeared in the "memesphere" (the ensemble of human public memes). We can track this evolution using Google "Ngrams," a database that measures the frequency of word strings in a large corpus of published books (Thanks, Google!).
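
For the curious, here is a minimal sketch of how one might pull the same curves programmatically. Google publishes no official Ngram API; the JSON endpoint below is the unofficial one that backs the web viewer, so the URL, parameters, and response format are assumptions that may break without notice:

```python
# Minimal sketch: fetch yearly relative frequencies of a phrase from the
# Google Books Ngram Viewer. The endpoint and its parameters are unofficial
# and undocumented; this may stop working without notice.
import requests

def ngram_series(phrase, year_start=1900, year_end=2019):
    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={
            "content": phrase,
            "year_start": year_start,
            "year_end": year_end,
            "corpus": "en-2019",   # assumed corpus identifier
            "smoothing": 3,
        },
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    # Observed format: a list of dicts, each with a "timeseries" list
    # holding one relative-frequency value per year.
    return data[0]["timeseries"] if data else []

series = ngram_series("new ice age")
if series:
    peak_year = 1900 + series.index(max(series))
    print(f'"new ice age" peaks around {peak_year}')
```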

 

You see that the possibility of a "new ice age" had already entered the public consciousness in the 1920s; it then grew and reached a peak in the early 1970s. Other strings, such as "Earth cooling" and the like, give similar results. Note also that the "English Fiction" database generates a large peak for the concept of a "new ice age" at about the same time, in the 1970s. Later on, cooling was completely replaced by the concept of global warming. You can see in the figure below how the crossover arrived in the late 1980s.

 


Even after it started to decline, the idea of a "new ice age" remained popular, and journalists loved presenting it to the public as an imminent threat: Newsweek, for instance, printed an article titled "The Cooling World" in 1975. The concept also provided good material for the catastrophic genre in fiction. As late as 2004, it was the basis of the movie "The Day After Tomorrow."

Does that mean that scientists actually believed that the Earth was cooling? Of course not: there was no consensus on the matter. The status of climate science up to the late 1970s simply didn't allow certainties about Earth's future climate.

As an example, in 1972 the well-known report to the Club of Rome, "The Limits to Growth," noted the growing concentration of CO2 in the atmosphere, but it did not state that it would cause warming -- evidently, the issue was not yet clear even for scientists engaged in global ecosystem studies. Eight years later, in 1980, the authors of "The Global 2000 Report to the President of the U.S.," commissioned by President Carter, already had a much better understanding of the climate effects of greenhouse gases. Nevertheless, they did not rule out global cooling, and they discussed it as a plausible scenario.

The Global 2000 Report is especially interesting because it provides some data on the opinion of climate scientists as it was in 1975. 28 experts were interviewed and asked to forecast the average world temperature for the year 2000. The result was no warming, or a minimal one of about 0.1 C. In the real world, though, temperatures had risen by more than 0.4 C by 2000. Clearly, in 1980 there was no such thing as a scientific consensus on global warming. On this point, see also the paper by Peterson (2008), which analyzes the scientific literature of the 1970s: a majority of papers was found to favor global warming, but a significant minority argued for no temperature change, or for global cooling.

Now we are getting to the truly interesting point of this discussion. The consensus that Earth was warming did not exist before the 1980s, but then it became the norm. How was it obtained?

There are two interpretations floating in the memesphere today. One is that scientists agreed on a global conspiracy to terrorize the public about global warming in order to obtain personal advantages. The other is that scientists are cold-blooded data analyzers who did as John Maynard Keynes said: "When I have new data, I change my mind."

Both are legends. The one about the scientific conspiracy is obviously ridiculous, but the second is just as silly. Scientists are human beings, and data are not a gospel of truth. Data are always incomplete, affected by uncertainties, and in need of selection. Try to develop Newton's law of universal gravitation without ignoring all the data about falling feathers, paper sheets, and birds, and you'll see what I mean.

In practice, science is a fine-tuned consensus-building machine. It has evolved exactly for the purpose of smoothly absorbing new data in a gradual process that does not lead (normally) to the kind of partisan division that's typical of politics. 

Science uses a procedure derived from an ancient method that, in Medieval times, was called the disputatio, and that has its roots in the art of rhetoric of classical times. The idea is to debate issues by having champions of the different theses square off against each other, trying to convince an informed audience with the best arguments they can muster. The Medieval disputatio could be very sophisticated; as an example, I discussed the "Controversy of Valladolid" (1550-51) on the status of the American Indians. Theological disputationes normally failed to harmonize truly incompatible positions, say, convincing Jews to become Christians (it was tried more than once, but you may imagine the results). But sometimes they did lead to good compromises, and they kept the confrontation at the verbal level (at least for a while).

In modern science, the rules have changed a little, but the idea remains the same: experts try to convince their opponents using the best arguments they can muster. It is supposed to be a discussion, not a fight. Good manners are to be maintained, and the fundamental requirement is a mutually understandable language. And not just that: the discussants need to agree on some basic tenets framing the discussion. During the Middle Ages, theologians debated in Latin and agreed that the discussion was to be based on the Christian scriptures. Today, scientists debate in English and agree that the discussion is to be based on the scientific method.

In the early times of science, one-to-one debates were used (maybe you remember the famous 1860 debate on Darwin's ideas between Thomas Huxley and Bishop Samuel Wilberforce). But nowadays that is rare. The debate takes place at scientific conferences and seminars where several scientists participate, gaining or losing "prestige points" depending on how good they are at presenting their views. Occasionally, a presenter, especially a young scientist, may be "grilled" by the audience in a small re-enactment of the coming-of-age ceremonies of Native Americans. But, most important of all, informal discussions take place all over the conference. These meetings are not supposed to be vacations; they serve the face-to-face exchange of ideas. As I said, scientists are human beings, and they need to see each other face to face to understand each other. A lot of science is done in cafeterias and over a glass of beer. Possibly, most scientific discoveries start in this kind of informal setting. No one, as far as I know, was ever struck by a ray of light from heaven while watching a PowerPoint presentation.

It would be hard to maintain that scientists are more adept at changing their views than Medieval theologians were, and older scientists do tend to stick to old ideas. Sometimes you hear that science advances one funeral at a time; it is not wrong, but it is surely an exaggeration: scientific views do change even without waiting for the old guard to die. The debate at a conference can decisively tilt toward one side on the basis of the brilliance of a scientist, the availability of good data, and the overall competence demonstrated.

I can testify that, at least once, I saw someone in the audience rise up after a presentation and say, "Sir, I was of a different opinion until I heard your talk, but now you have convinced me. I was wrong and you are right." (And I can tell you that this person was more than 70 years old; good scientists may age gracefully, like wine.) In many cases, the conversion is not so sudden and so spectacular, but it does happen. Then, of course, money can do miracles in affecting scientific views; but, as long as we stick to climate science, there is not a lot of money involved, and corruption among scientists is not as widespread as it is in other fields, such as medical research.

So, we can imagine that in the 1980s the consensus machine worked as it was supposed to, and it led the general opinion of climate scientists to switch from cooling to warming. That was a good thing, but the story didn't end there. It remained to convince people outside the narrow field of climate science, and that was not obvious.

From the 1990s onward, the disputatio was dedicated to convincing non-climate scientists, that is, both scientists working in different fields and intelligent laypersons. There was a serious problem with that: climate science is not a matter for amateurs; it is a field where the Dunning-Kruger effect (people overestimating their competence) may be rampant. Climate scientists found themselves dealing with various kinds of opponents: typically, elderly scientists who refused to accept new ideas or, sometimes, geologists who saw climate science as invading their turf and resented it. Occasionally, opponents could score points in the debate by focusing on narrow issues that they themselves had not completely understood (for instance, the "tropospheric hot spot" was a fashionable trick). But when the debate involved someone who knew climate science well enough, the opponents were easily steamrolled.

These debates went on for at least a decade. You may know the 2009 book by Randy Olson, "Don't Be Such a Scientist," which describes this period. Olson surely understood the basic point of debating: you must respect your opponent if you aim at convincing him or her, and the audience, too. It seemed to be working, slowly. Progress was being made, and the climate problem was becoming better and better known.

And then, something went wrong. Badly wrong. Scientists suddenly found themselves cast into another kind of debate, one for which they had no training and little understanding. You can see in Google Ngrams how the idea that climate change was a hoax lifted off in the 2000s and became a feature of the memesphere. Note how rapidly it rose: it reached a climax in 2009, with the Climategate scandal, but it didn't decline afterward.



It was a completely new way of debating: no longer a disputatio. No more rules, no more reciprocal respect, no more a common language. Only slogans and insults. A climate scientist described this kind of debate as being like a "bare-knuckle bar fight." From then onward, the climate issue became politicized and sharply polarized. No progress was made, and none is being made right now.

Why did this happen? In large part, it was because of a professional PR campaign aimed at disparaging climate scientists. We don't know who designed and paid for it but, surely, there existed (and still exist) industrial lobbies that were bound to lose a lot if decisive action to stop climate change was implemented. Those who conceived the campaign had an easy time against a group of people who were as naive in terms of communication as they were expert in terms of climate science.

The Climategate story is a good example of the mistakes scientists made. If you read the whole corpus of the thousands of emails released in 2009, nowhere will you find that the scientists were falsifying data, engaging in conspiracies, or seeking personal gain. But they managed to give the impression of being a sectarian clique that refused to accept criticism from its opponents. In scientific terms, they did nothing wrong but, in terms of image, it was a disaster. Another mistake scientists made was to try to steamroll their adversaries by claiming a 97% scientific consensus on human-caused climate change. Even assuming that the number is true (it may well be), the claim backfired, giving once more the impression that climate scientists are self-referential and do not take into account the objections of others.

Let me give you another example of a scientific debate that derailed and became a political one. I already mentioned the 1972 study "The Limits to Growth." It was a scientific study, but the debate that ensued took place outside the rules of scientific debate. A feeding frenzy among sharks would be a better description of how the world's economists got together to shred the LTG study to pieces. The "debate" rapidly spilled over into the mainstream press, and the result was a general demonization of the study, accused of having made "wrong predictions" and, in some cases, of planning the extermination of humankind. (I discuss this story in my 2011 book "The Limits to Growth Revisited.") The interesting (and depressing) thing you can learn from this old debate is that no progress has been made in half a century. Approaching the 50th anniversary of the publication, you can find the same criticisms republished afresh on websites: "wrong predictions," and all the rest.

So, we are stuck. Is there a hope of reversing the situation? Hardly. The loss of the capability of reaching consensus seems to be a feature of our times: debates require a minimum of reciprocal respect to be effective, but that has been lost in the cacophony of the Web. The only form of debate that remains is the vestigial one that sees presidential candidates stiffly exchanging platitudes with each other every four years. But a real debate? No way; it is gone, like the disputes among theologians in the Middle Ages.

The discussion on climate, just as on all important issues, has moved to the Web, in large part to social media. And the effect on consensus building has been devastating. It is one thing to face a human being across a table with two glasses of beer on it; it is quite another to see a chunk of text fall out of the blue as a comment on your post. That is a recipe for a quarrel, and it works like that every time.

Also, it doesn't help that international scientific meetings and conferences have all but disappeared in a situation that discourages meeting in person. Online meetings turned out to be hours of boredom in which nobody listens to anybody and everyone is happy when it is over. Even if you can still manage to attend an in-person meeting, it doesn't help that your colleague appears to you in the form of a masked bag of dangerous viruses, to be kept at a distance at all times, if possible behind a plexiglass barrier. Not the best way to establish a human relationship.

This is a fundamental problem: if you can't build consensus by debate, the only other possibility is to use the political method, that is, attaining a majority by means of a vote (and note that in science, as in theology, voting is not considered an acceptable consensus-building technique). After the vote, the winning side can force its position on the minority using a combination of propaganda, intimidation, and, sometimes, physical force. An extreme consensus-building technique is the extermination of the opponents. It has been done so often in history that it is hard to think it will not be done again on a large scale in the future, perhaps not even a remote one. But, apart from the moral implications, forced consensus is expensive, inefficient, and often leads to the establishment of dogmas. Then it becomes impossible to adapt to new data when they arrive.

So, where are we going? Things keep changing all the time; maybe we'll find new ways to attain consensus even online, which implies, at a minimum, not insulting and attacking your opponent right from the beginning. As for a common language, having once switched from Latin to English, we might now switch to "Googlish," a new world language that might perhaps be structured to avoid clashes of absolutes: perhaps it might just be devoid of expletives, or perhaps it might have specific features that help build consensus. For sure, we need a reform of science that gets rid of the corruption rampant in many fields: money is a kind of consensus, but not the one we want.

Or, maybe, we might develop new rituals. Rituals have always been a powerful way to attain consensus; just think of the Christian mass (the Christian church has not yet realized that it has received a deadly blow from the anti-virus rules). Could rituals be transferred online? Or would we need to meet in person in the forest, like the "book people" imagined by Ray Bradbury in his 1953 novel "Fahrenheit 451"?

We cannot say. We can only ride the wave of change that, nowadays, seems to have become a true tsunami. Will we float or sink? Who can say? The shore seems to be still far away.


h/t Carlo Cuppini and "moresoma"


Thursday, July 29, 2021

We are not in the Holocene Anymore: A World Without Permanent Ice.

The post below is reproduced from my blog "The Proud Holobionts," but I think the subject is compatible with the vision of the "Seneca Effect" blog. Indeed, everything is related on this planet, and the concept of "holobiont" can be seen as strictly connected to that of the "Seneca Cliff." Complex systems, both virtual and real, are networks that can almost always be seen as holobionts in their structure. A collapse, then, occurs when the network undergoes a chain of link breakages, in a process known in engineering as the "Griffith fracture mechanism" (you see that everything is correlated!). A toy illustration of the idea is sketched below.
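
As a toy illustration of this idea (my own sketch, not a model from the book), one can simulate a network in which every broken link sheds its load onto the surviving adjacent links, possibly overloading them in turn, loosely analogous to a Griffith crack concentrating stress at its tip. All the parameters below are arbitrary:

```python
# Toy cascade model: break one link, redistribute its load onto adjacent
# links, and let any link pushed past its strength break in turn.
import random
import networkx as nx

def norm(edge):
    """Store undirected edges with sorted endpoints for consistent lookups."""
    return tuple(sorted(edge))

def cascade(g, margin=1.3):
    """Return the number of links broken by one initial random failure.
    `margin` (arbitrary) is each link's breaking strength."""
    load = {norm(e): 1.0 for e in g.edges}   # unit load on every link
    queue = [random.choice(list(load))]      # a single initial failure
    broken = 0
    while queue:
        e = queue.pop()
        if e not in load:
            continue                         # already broken
        shed = load.pop(e)
        g.remove_edge(*e)
        broken += 1
        # surviving links that share an endpoint with the broken one
        nbrs = {norm(x) for u in e for x in g.edges(u)} & load.keys()
        for x in nbrs:
            load[x] += shed / len(nbrs)
            if load[x] > margin:
                queue.append(x)              # overloaded: breaks next
    return broken

random.seed(42)
g = nx.erdos_renyi_graph(200, 0.03, seed=42)
total = g.number_of_edges()
print(f"{cascade(g)} of {total} links broke in the cascade")
```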
 
This post is also part of the material that Chuck Pezeshki and I are assembling for a new book, provisionally titled "Holobiont: the new Science of Collaboration," where we plan to explore how new concepts in biology and network science can combine to give us the key to managing highly complex systems: human societies, large and small. And the overarching concept that links all this is one: empathy.

 

 

When the Ice Will be Gone: The Greatest Change Seen on Earth in 30 Million Years.

From: "The Proud Holobionts," July 27, 2021

 

An image from the 2006 movie "The Meltdown," the second of the "Ice Age" series. These movies attempted to present a picture of Earth during the Pleistocene. Of course, they were not supposed to be paleontology lessons, but they did show the megafauna of the time (mammoths, sabertooth tigers, and others) and the persistent ice, as you see in the figure. The plot of "The Meltdown" was based on a real event: the breakdown of the ice dam that kept Lake Agassiz contained within the great Laurentide glaciers of the North American continent. When the dam broke, some 15,000 years ago, the lake flowed into the sea in a giant flood that changed Earth's climate for more than a thousand years. So, the concept of ice ages as related to climate change is penetrating the human memesphere. It is strange that this is happening just as human activity is pushing the ecosystem back toward a pre-glacial condition. If that happens, it will be the greatest change seen on Earth in 30 million years. And we won't be in the Holocene anymore.

 

We all know that there is permanent ice at Earth's poles: it forms glaciers and it covers huge areas of the sea. But is it there by chance, or is it functional in some way to Earth's ecosphere? 

Perhaps the first to ask this question was James Lovelock, the proposer (together with Lynn Margulis) of the concept of "Gaia" -- the name for the great holobiont that regulates the planetary ecosystem. Lovelock has always been a creative person and in his book "Gaia: A New Look at Life on Earth" (1979) he reversed the conventional view of ice as a negative entity. Instead, he proposed that the permanent ice at the poles was part of the planetary homeostasis, actually optimizing the functioning of the ecosphere. 

Lovelock was perhaps influenced by the idea that the efficiency of a thermal engine is directly proportional to the temperature difference that the circulating fluid encounters. It may make sense: permanent ice creates a large temperature difference between the poles and the equator and, as a consequence, winds and ocean currents are stronger, and the "pumps" that bring nutrients everywhere sustain more life. Unfortunately, this idea is probably wrong, but Lovelock has the merit of having opened the lid on a set of deep questions about the role of permanent ice in the ecosystem. What do we know about this matter?
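
For reference, the textbook result Lovelock may have had in mind is the Carnot limit on the efficiency of a heat engine:

$$\eta = \frac{T_{\mathrm{hot}} - T_{\mathrm{cold}}}{T_{\mathrm{hot}}} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}}$$

As a back-of-envelope illustration (the temperatures are rough round numbers of my choosing, not measured values): with icy poles at $T_{\mathrm{cold}} \approx 250\,\mathrm{K}$ and tropics at $T_{\mathrm{hot}} \approx 300\,\mathrm{K}$, the limit is $\eta \approx 50/300 \approx 17\%$; shrink the pole-to-equator gap to 20 K in an ice-free world and it drops to about 7%. The wider the gap, the more work is in principle available to drive winds and ocean currents.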

It took some time for our ancestors to realize that permanent ice existed in large amounts in the high-latitude regions. The first to see the ice sheet of Greenland was probably Eric the Red, the Norwegian adventurer, when he traveled there around the year 1000. But he had no way to know the true extent of the inland ice, and he didn't report on it.

The first report I could find on Greenland's ice sheet is the 1820 "History of Greenland," a translation of an earlier report (1757) in German by David Crantz, where you can find descriptions of the ice-covered inland mountains. By the early 20th century, maps clearly showed Greenland as fully ice-covered. As for Antarctica, by the end of the 19th century it was known to be fully covered with a thick ice sheet as well.

Earlier on, in the mid-19th century, Louis Agassiz had proposed a truly revolutionary idea: that of the ice age. According to Agassiz, in ancient times much of Northern Europe and North America was covered with thick ice sheets. Gradually, it became clear that there had not been just one ice age, but several, coming and going in cycles. In 1930, Milutin Milankovitch proposed that these cycles were linked to periodic variations in the insolation of the Northern Hemisphere, in turn caused by cycles in Earth's motion. For nearly a million years, Earth has been a sort of giant pendulum in terms of the extent of its ice sheets.

The 2006 movie "An Inconvenient Truth" was the first time these discoveries were presented to the general public. Here we see Al Gore showing the temperature data of the past half-million years.

An even more radical idea about ice ages appeared in 1992, when Joseph Kirschvink proposed the concept of "Snowball Earth": the idea that Earth was fully covered by ice at some moment around 700-600 million years ago, during the period appropriately called the "Cryogenian."

This super-ice age is still controversial: it will never be possible to prove that every square kilometer of the planet was under ice, and there is some evidence that this was not the case. But, surely, we are dealing with a cooling phase much more severe than anything seen in relatively recent geological times.

As more ice ages were discovered, it also became clear that Earth had been ice-free for most of its long existence. Our times, with permanent ice at the poles, are rather exceptional. Let's take a look at the temperatures of the past 65 million years (the "Cenozoic").

At the beginning of the Cenozoic, Earth was still reeling after the great disaster of the end of the Mesozoic, the one that led to the disappearance of the dinosaurs (and that, by the way, was almost certainly not caused by an asteroidal impact). But from 50 million years ago onward, the trend has been constant: cooling.

The Earth is now some 12 degrees centigrade colder than it was during the "warmhouse" of the Eocene. It was still ice-free up to about 35 million years ago but, gradually, permanent ice started accumulating, first in the Southern Hemisphere, then in the Northern one. Never during the Cenozoic was Earth as cold as it is now.

The reasons for the gradual cooling are still being debated, but the simplest explanation is that it is due to the decline of CO2 concentrations in the atmosphere. That, in turn, may be caused by a slowdown of the outgassing of carbon from Earth's interior. Maybe Earth is just becoming a little older and colder, and so less active in terms of volcanoes and similar phenomena. There are other explanations, including the collision of India with Central Asia and the rise of the Himalayas, which caused a drawdown of CO2 through the enhanced weathering of silicates. But it is a hugely complicated story, so let's not go into the details.

Let's go back to our times. You probably heard how, just a few decades ago, those silly scientists were predicting a return to an ice age. That's an exaggeration: there never was such a claim in the scientific literature. But it is true that the idea of a new ice age was floating in the memesphere, and for good reasons: if Earth had seen ice ages in the past, why not a new one? Look at these data:

These are temperatures and CO2 concentrations from the Vostok ice cores, in Antarctica (you may have seen these data in Al Gore's movie). They describe the glacial cycles of the past 400,000 years. Without going into the details of what causes the cycles (solar irradiation cycles trigger them, but do not cause them), you may note how low both temperatures and CO2 concentrations went at the coldest moments of the past ice ages. The most recent ice age was especially cold and associated with very low CO2 concentrations.

Was Earth poised to slide down into another "snowball" condition? It cannot be excluded. What we know for sure is that, during the past million years, Earth teetered close to the snowball catastrophe every 100,000 years or so. What saved it from sliding all the way into an icy death?

There are several factors that may have stopped the ice from expanding all the way to the equator. For one thing, the sun's irradiance is today about 7% higher than it was at the time of the last snowball episode, during the Cryogenian; but that may not be enough as an explanation. Another factor is that the cold and the low CO2 concentrations may have led to a weakening, or even a stop, of the biological pump in the oceans and of the biotic pump on land. Both these pumps cycle water and nutrients, keeping the biosphere alive and well. Their near disappearance may have caused a general loss of activity of the biosphere and, hence, the loss of one of the mechanisms that remove CO2 from the atmosphere. So, CO2 concentrations increased as a result of the continuing geological emissions, unaffected by changes in the biosphere. Note how, in the figure, the CO2 concentrations and temperatures are perfectly superimposable during the warming phases: the reaction of the temperature to the CO2 increase was instantaneous on a geological time scale. Yet another factor may have been the desertification of the land, which increased the amount of atmospheric dust landing on top of the glaciers. That lowered the albedo (the reflected fraction of light) of the system and led to a new warming phase. A very complicated story that is still being unraveled.

But how close was the biosphere to total disaster? We will never know. What we know is that, 20 thousand years ago, the atmosphere contained just 180 parts per million (ppm) of CO2 (today, we are at 410 ppm). That was close to the survival limit of green plants, and there is evidence of extensive desertification during those periods. Life was hard for the biosphere during the recent ice ages, although not as bad as in the Cryogenian. Lovelock's idea that permanent ice at the poles is good for life just doesn't seem to be right.

Of course, the idea that we could go back to a new ice age was legitimate in the 1950s; not anymore, now that we understand the role of human activities in climate. Some people maintain that it was a good thing that humans started burning fossil hydrocarbons, since that "saved us from a new ice age." Maybe, but this is a classic case of too much of a good thing. We are pumping so much CO2 into the atmosphere that our problem is now the opposite: we are not facing an "icehouse Earth" but a "warmhouse," or even a "hothouse," Earth.

A "hothouse Earth" would be a true disaster since it was the main cause of the mass extinctions that took place in the remote past of our planet. Mainly, the hothouse episodes were the result of outbursts of CO2 generated by the enormous volcanic eruptions called "large igneous provinces." In principle, human emissions can't even remotely match these events. According to some calculations, we would need to keep burning fossil fuels for 500 years at the current rates to create a hothouse like the one that killed the dinosaurs (but, there is always that detail that non linear systems always surprise you . . .)

Still, considering feedback effects such as the release of the methane buried in the permafrost, it is perfectly possible that human emissions could bring CO2 concentrations in the atmosphere to levels of the order of 600-800 ppm, or even more, comparable to those of the Eocene, when temperatures were 12 degrees higher than they are now. We may reach the condition sometimes called "warmhouse Earth."

From the human viewpoint, it would be a disaster. If the change were to occur in a relatively short time, say, of the order of a few centuries, human civilization is probably toast: we are not equipped to cope with this kind of change. Just think of what happened some 14,500 years ago, when the great Laurentide ice sheet in North America fragmented and collapsed (the 2006 movie "The Meltdown" was inspired exactly by this event). Earth's climate went through a series of cold and warm spells that it is hard to think we could survive.

 



Human survival concerns are legitimate, but probably irrelevant in the greater scheme of things. If we went back to the Eocene, the ecosystem would take a big hit during the transition, but it would survive and then adapt to the new conditions. In terms of life, the Eocene has been described as "luxuriant." With plenty of CO2 in the atmosphere, forests were thriving and, probably, the biotic pump provided abundant water everywhere inland, even though the temperatures were relatively uniform across latitudes. A possible mental model for that period is the modern tropical forests of Central Africa or Indonesia. We don't have data that would allow us to compare Earth's productivity today with that of the Eocene, but we can't exclude that the Eocene was more productive in terms of life. Humans might well adapt to this new world, although their survival during the transition is by no means guaranteed.

Again, it seems that Lovelock was wrong when he said that ice ages optimize the functioning of the biosphere. But maybe there is more to this idea: in at least one respect, ice ages seem to have a good effect on life. Take a look at this image that summarizes the main ice ages of Earth's long history.



The interesting point is that ice ages seem to occur just before major transitions in the evolutionary history of Earth. We don't know much about the Huronian ice age, but it occurred just at the boundary of the Archean and the Proterozoic, at the time of the appearance of the eukaryotes. Then, the Cryogenian preceded the Ediacaran period and the appearance of the multicellular life that colonized the land. Finally, even the evolution of Homo sapiens may be related to the most recent ice age cycle. With the cooling of the planet and the reduction of the extent of forested areas, our ancestors were forced to leave the comfortable forests where they had lived up to then and take up a more dangerous lifestyle in the savannas. And you know what that led to!

So, maybe there is something good in ice ages and, after all, James Lovelock's intuition may have hinted at an important insight into how evolution works. There remains the question of how exactly ice ages drive evolution. Maybe they have an active role, or maybe they are simply a parallel effect of the real driver of evolution, quite possibly the increasing concentration of atmospheric oxygen that has accompanied the biosphere over the past 2.7 billion years. Oxygen is the magic pill that boosts the metabolism of aerobic creatures; it is what makes creatures like us possible.

In any case, it is likely that ice ages will soon be a thing of the past on planet Earth. The effect of the human perturbation may turn out to be moderate and, when humans stop burning fossil hydrocarbons (they will have to, one day or another), the system may reabsorb the excess CO2 and gradually return to the ice age cycles of the past. That may occur over times of the order of at least several thousand years, possibly several tens of thousands. But climate is a non-linear system, and it may react by reinforcing the perturbation; the results are unknowable.

What we know for sure is that the lifespan of Earth's ecosystem (Gaia) is limited. We still have about 600 million years before the sun's increasing brightness takes Earth to a different condition: that of a "wet greenhouse," which will bring the oceans to a boil and extinguish all life on the planet. And so it will be what it will have to be. Gaia is long-lived, but not eternal.