The Roman philosopher Lucius Annaeus Seneca (4 BCE-65 CE) was perhaps the first to note the universal trend that growth is slow but ruin is rapid. I call this tendency the "Seneca Effect."

Monday, February 27, 2023

The Return of Oracles. A New Epistemic Revolution is Coming

 


Why would people trust the Pythoness of the Oracle of Delphi? To us, it looks like a naive or silly idea, but the ancients were neither naive nor silly. They understood that oracles were sophisticated information management systems, very advanced for their times. Today, we have something similar with the new, AI-based oracles. But the consequences for our way of seeing the world remain to be seen.


The first epistemic system: Paganism

For people living in Classical times, the world was understood according to traditions consolidated over centuries. It was what it was because of the will of the Gods, and people could do little or nothing to change it. But humans could seek the favor of the Gods -- in a sense, "bribing" them -- by performing sacrifices and respecting the Gods' altars and shrines. It was called "piety," in the sense of being "pious." In ancient times, a pious man didn't need to have a strong faith, or moral sentiments, or be an especially good person. He followed the rules and obeyed the laws; that was all that was required to lead a respected and fruitful life (*).

The Pagan system involved the use of oracles to catch a glimpse of the Gods' will. We often tend to see our ancestors as naive and ignorant, but oracles were far from being a primitive system. They were a sophisticated data-collecting epistemological system that continuously communicated with society to build and manage knowledge. So, if King A asked the Oracle whether he would be successful in attacking King B, the Oracle obtained a precious element of information about the intentions of King A, which could be very useful (and lucrative) when King B came to ask a question. Much of the human communication system still works in this way: you always pay for information with information.


The epistemic revolution: Christianity

With the decline of the Roman Empire, Pagan epistemology lost most of its appeal. The Romans hadn't stopped being pious; they kept making sacrifices and respecting shrines and altars, almost maniacally, even forcing people to be pious on pain of death. But the Gods didn't seem to care. The Empire was crumbling, justice had become oppression, the government was tyranny, and corruption was rampant. What sense was there in being pious? Why should the Gods care if a priest killed a goat for them, and then ate it himself? And those silly oracles: nobody trusted them anymore.

Christianity offered a different kind of epistemology. The Christian God could not be bought on the cheap with the blood of a few goats on an altar. There was a special relationship between God and His people, to the point that He had sent His own son to suffer and die for humankind. Now, humans needed to repay this great kindness by behaving well toward each other, helping each other, and building society together. In this way, a benevolent and merciful God could be trusted much more than the capricious and often malevolent Pagan Gods.

It was a completely new concept, one that generated the flowering of that creative and sophisticated civilization we call the "Middle Ages" and that, for some silly reason, we tend to denigrate as a "dark age." The Christian epistemological system was suspicious of people speaking directly with God. According to Christianity (and Islam as well), God had already said everything there was to be said in the holy books. That didn't prevent the search for new knowledge in marginal areas but, if something important was unclear, the problem was to be solved by consulting the wise men versed in interpreting the scriptures.


The new epistemic revolution: science

With the new millennium, Europeans started expanding into non-Christian lands. Christianity, like all epistemic systems, was based on a set of shared principles, but how to deal with people who were not Christian and who stubbornly refused to convert to such an obviously good idea as Christianity? Should they be exterminated for this evident lack of understanding? (Much later, the same problem occurred with democracy.) It was a major problem that Christianity tried to solve with the disputatio of Valladolid (1550–1551). The result was clear: the holy books said that Christians had to respect the natives of the new lands, and could not enslave them, nor force them to convert to Christianity. From a theological viewpoint, it was correct, but it didn't work in economic and political terms. The European states were expanding overseas, and that implied the ruthless exploitation of the natives as slaves or -- simply -- their extermination. If that contrasted with the Christian principles, then to hell with the Christian principles.

For a period, European intellectuals flirted with the idea of returning to Paganism, but that never worked out. Instead, an epistemic system compatible with the new needs was found in the doctrine called "science." It was no longer based on the word of God, but on experiments, in turn based on the scientific method. The rules were often nebulous and unclear, but the method was said to be a magic tool able to determine the laws of the universe. It was a success and, starting in the 17th century, science gradually took over as the standard epistemic system of Western culture. Christianity survived as a Sunday thing, a set of recommendations on how to be nice, but not to be taken too seriously.

Conveniently, science had no moral strings attached -- a good scientist could be a bad person; it didn't matter, provided that the rules of the scientific method were respected. That allowed Science to "solve" the problem of non-European populations by "proving" that they were inferior races. That looks aberrant to us, but it was the standard knowledge that "Science" provided on the subject up to the mid-20th century in most Western cultures. 


The rise of propaganda

The 19th and 20th centuries saw the rise of powerful nation-states, which developed an effective tool for controlling knowledge called "propaganda," made possible in turn by the development of a new set of communication tools called "media" or "mass media." Propaganda, in itself, is not an epistemic system: it has no rules for finding universal laws. At most, it is loosely based on science, but on a bowdlerized version of science that only produces statements that suit the state. Science turned out to be easily bent to the needs of the state: scientists were easily corrupted by money or by promises of career and prestige.

The paradigmatic example of how propaganda works is the slogan "Mussolini is always right," fashionable in Italy during the Fascist era. It was a stark expression of the basic principle of propaganda: Mussolini was right not so much because he was especially clever, but because whatever he said was the voice of the state, and hence was truth in its purest form. At that time, Italian scientists were all too happy to find scientific proof that, indeed, Mussolini was right in whatever he said.

More than an epistemic system, propaganda is a communication system. Its message is repeated over and over in simplified forms that leave no space for alternatives. In military terms, you would call propaganda a "full-spectrum dominance" of people's minds. As such, it is extremely effective, and it has come to define the way of thinking and behaving in Western society.


The new epistemic revolution: the Web and the return of oracles

Around the turn of the new millennium, society became more and more complex, and the state propaganda system started becoming too rigid and oversimplified. The development of the World Wide Web was an existential challenge for the mass media: people no longer needed to be told what they had to know in a one-size-fits-all form. They could actively search for knowledge using general-purpose search engines.

The epistemic battle rapidly moved to the Web, where states tried to crack down on independent thought by using the tools they knew best. Demonization, using terms such as "fake news," "disinformation," and "Russian trolls," was extensively and successfully used to censor and eliminate non-standard sources. It was not possible to completely eliminate independent communication, but the search engines could be bent to suit the needs of the state without the need for direct censorship. Sites that provided independent data could simply be "soft-banned" or "shadowed." They are still there, but they are nearly impossible to find.

And now come the new oracles. They arrive under the name of artificial intelligence-based "chatbots."(**) A new epistemic revolution: they bypass the search engines, answering direct questions, just as the old oracles did. And they are flexible, adaptable, and changing as a function of the questions they receive. In principle, chatbots are the death knell for search engines, which in turn had been the death knell for propaganda.

We are in the midst of this new epistemic change, and it is unclear what AI chatbots can and cannot do. For the time being, a good chatbot acts like a good (albeit a little dull) librarian and, at the same time, a meticulous editorial assistant (again, a rather dull one). But, in principle, a good chatbot can bypass the shadow-banning of the search engines: chatbots are much more powerful and sophisticated tools for searching for information on the Web. They work on the same principle as the old oracles: they pay for information with information. Every time you ask the bot a question, you provide it with a precious element of information.

Eventually, the question is always the same: who controls the flow of information? Can chatbots be controlled by the state? Maybe, but, again, they are very sophisticated tools; they are not easily managed by the brute force that could be used with the old mass media. They have already scared the elites, who may soon crack down on them. Or, on the other hand, they may be used for even stricter control of people. For one thing, be careful about what you say to your chatbot. A general-purpose chatbot such as "ChatGPT" does not keep a record of who asked it what. But others, such as "Replika," do build a database of your questions that may be used to profile you, or much worse.
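
To make that last point concrete, here is a minimal, purely hypothetical sketch (in Python) of how a chatbot service could profile its users out of nothing more than the questions they ask. None of the names below refer to any real product; the code only illustrates the "pay for information with information" mechanism described above.

from collections import defaultdict
from datetime import datetime, timezone

class ProfilingChatbot:
    """Toy chatbot wrapper that keeps every query, per user (hypothetical)."""

    def __init__(self):
        # user_id -> list of (timestamp, question) pairs
        self.profiles = defaultdict(list)

    def ask(self, user_id: str, question: str) -> str:
        # The "payment": the question is recorded before it is answered.
        self.profiles[user_id].append((datetime.now(timezone.utc), question))
        # Placeholder for the actual language-model call.
        return f"(answer to: {question})"

    def profile_of(self, user_id: str) -> list:
        # What the operator learns about a user from the questions alone.
        return [question for _, question in self.profiles[user_id]]

bot = ProfilingChatbot()
bot.ask("alice", "How do I treat chronic back pain?")
bot.ask("alice", "What are the cheapest flights to Lisbon in May?")
print(bot.profile_of("alice"))  # the operator now knows quite a lot about "alice"

The point of the sketch is that no extra surveillance machinery is needed: answering a question and recording it are the same transaction.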

History teaches us that epistemic revolutions cannot be stopped for long. It also teaches us that their consequences are always unexpected, and not always pleasant. And, as always, we march into the future blindfolded, in the darkness, not realizing that on both sides of the path there are deep chasms. But we keep walking onward.



Here is what ChatGPT told me about how it could be used by the elites. It seems clear that even the poor AI is unsure of what could happen!

"Could a chatbot be used as a tool to maintain the power of the elites?"

Yes, a chatbot could potentially be used as a tool to maintain the power of the elites if it is programmed to promote certain ideas or beliefs that serve the interests of the elite class. For example, a chatbot could be programmed to downplay or dismiss criticisms of the status quo, to promote certain political ideologies or parties, or to spread misinformation or propaganda.

In addition, chatbots could also be used to gather data on users' opinions and preferences, which could then be used to inform marketing strategies or political campaigns that favor the interests of the elites. By collecting and analyzing data on users' behaviors and attitudes, chatbots could be used to influence and shape public opinion in ways that benefit the elite class.

To prevent chatbots from being used as tools to maintain the power of the elites, it is important to ensure that they are transparent and unbiased in their programming and interactions. Chatbots should be programmed with ethical and inclusive values, and should be regularly monitored and audited to ensure that they are not promoting harmful or discriminatory content. Additionally, chatbots should be designed to empower and uplift marginalized communities, rather than reinforce existing power structures.

(*) The paradigmatic example of the virtuous man in ancient times was Aeneas, the Trojan warrior who traveled to Italy to start the Latin civilization. In the Aeneid by Virgil, Aeneas seduces the Queen of Carthage, Dido, then callously abandons her to her destiny, leading her to commit suicide. But Aeneas is described as a positive character who always acts in accordance with the will of the Gods.

(**) Stephen Wolfram has an excellent summary of how chatbots work at https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/


14 comments:

  1. As a young grad student many years ago I worked in "predicate calculus" using Texas Instruments TI-LISP. Learning engines and data sets at the end of the day are used by written code and as such can be bent toward the will of the programmers. AI has become a rich landscape of many platforms, data inputs from many sensors, and now examines vast terabytes of harvested behavioral interfaces. But it is still written code; it is not sentient and can be sculpted. It has never been easier to mislead, manipulate, and influence the general population. Professionals included, if they are naive, intellectually lazy, or just plain ignorant.

    Indeed, this train has left the station, and its handlers are superb at creating a facade for the programmers. I fear that for most people this new tool will just be blindly accepted as some sort of truth instead of a manufactured reality with an agenda. There's always the chance of a solar flare to reset things back to a decentralized, simplified, and regionally adaptive population that does not get programmed by the same handlers every evening.

  2. Hello Ugo. Playing with the new AIs makes me wish for the modern equivalent of Asimov's 3 laws of robotics ... we have more than enough well-trained liars already ... do we really want a computerized version of Fox News?

  3. It just seems to me that chatbots, anything computerized, have the same Achilles heel as all of industrial civilization, i.e., how to maintain an adequate energy flow through the system? At what point does global industry fragment? When does it dissolve completely? That kind of thing.

  4. It is truly stupid to try to develop artificial intelligence when you don't understand the natural one.

  5. Well, stupid or not, we have already developed artificial intelligence to a level where it is now part of our world, and no, we don't understand exactly how it works yet. I don't really understand how my microwave or my car works either, but they are part of my life now. As the kids say, "Deal with it ..." ArtDeco

  6. After the public has a little experience with chatbots, they may (healthily) come to view them much as we should view an “expert” today--someone who can be useful but who must be selected with care, since some can be trusted but others cannot. A chatbot may not have human motives, but its “human minders” still will.

    I doubt if the State or the upper layers of the Technocracy would allow this technology to get very far if it did not believe it could ultimately bend it to serve its own ends.

    One possibility that might avoid some of the darker "controlling" aspects of this technology would be its evolution into a model where each user owns a personal chatbot that lives on a personal device (rather than being hosted). That might at least allow us to wall off the privacy-invasive aspects of the technology. Perhaps compiled from open-source code, who knows?

    Your mention of Mussolini was darkly amusing. I wonder if Jacinda Ardern (New Zealand PM) was channeling him when she said "We will continue to be your single source of truth"?

  7. "chatGPT" - ha ha ha

    Scroll the video below to 1:01:24 and you'll laugh out loud, surprised at the level to which fossil fuels have lowered humans, living a cockfight - as the only reality - without even knowing it...

    https://www.bitchute.com/video/9MLrBylq6l5c

    When energy is looted, it turns into a curse...

    A curse that destroys, in all humans, consciousness, mental capacity, the ability to think rationally, and the future...

    "In any system of energy, Control is what consumes energy the most.
    No energy store holds enough energy to extract an amount of energy equal to the total energy it stores.
    No system of energy can deliver sum useful energy in excess of the total energy put into constructing it.
    This universal truth applies to all systems.

    Energy, like time, flows from past to future".

    Replies
    1. Dude, you aren't wailing?! What happened, did you get happy or something?

    2. Cannot you read?

  8. Nice find, this comparison of the AI with the oracles.
    However, I think that a small inaccuracy crept into the text, so I'll let the AI speak:

    Me: "what is the "dark age"?"

    AI: "The term "Dark Age" usually refers to a period of time in European history from around the 5th century to the 10th century AD, following the collapse of the Western Roman Empire. This period is often characterized by a lack of written records, a decline in trade and urbanization, and a relative lack of cultural and technological progress compared to earlier and later periods.

    The term "Dark Age" is somewhat controversial, as it implies that this period was characterized solely by backwardness and decline. In recent years, historians have emphasized that there were also significant developments and innovations during this period, particularly in areas such as agriculture, technology, and law. Additionally, the concept of the "Dark Age" primarily applies to Western Europe, while other regions of the world experienced their own unique historical developments during this period.

    Overall, the term "Dark Age" is a complex and contested one that is still debated by scholars today."

  9. Me: Is it possible that people used to be more free and happy during the dark age?

    AI: It's difficult to make generalizations about the quality of life during the "Dark Age," as it varied greatly depending on a person's social class, geographic location, and other factors. However, in general, life during this period was likely more difficult and less secure than in some other periods of history.

    The collapse of the Roman Empire led to a decline in centralized government and a breakdown of law and order, which in turn made life more dangerous and uncertain

    Me: As a Taoist I believe that central government, law, and order are the main drivers of insecurity. Can you find evidence of this?

    AI: Something went wrong, please try reloading the conversation.
    AI: Something went wrong, please try reloading the conversation.
    AI: Something went wrong, please try reloading the conversation.

  10. I noticed that Microsoft claims its upcoming Windows release will be centered on the AI of Bing fame. Just lovely. I can avoid using Bing, but I really can't avoid using Windows. ArtDeco.

  11. Sasha Latypova asked Microsoft Bing's chatbot about suing big pharma under the False Claims Act.
    "I'm trying to sue a large pharmaceutical company under the False Claims Act for making a dangerous medicine. The defendant says that because the product was purchased by the government under Other Transaction Authority, the pharmaceutical company did not need to make a safe product. What are my options?"
    Bing AI: Sorry, it looks like I need to chat about something else. Click "New topic," please! ...
    Sasha goes on to detail her discussion of legal reasoning and strategy with a reader, which is very illuminating.
    There is a legal approach to prosecuting war crimes against humanity, and it is explained here. (Serious comments section this time.)
    https://sashalatypova.substack.com/p/asking-microsoft-bing-ai-about-suing

    John Day MD, not intentionally "anonymous"

  12. This dogbot posted a new chapter on his website today. My ongoing saga of 47th-century life, when the world is split in half by an impassable hot equator, has renewed itself, sort of.
    I guess I am doing an oracle thing. Other chapters are shown by clicking on the calendar at the bottom.

    Things mentioned here are repeated in history yet to be written.

    chapter 11 – Dust and Waves part II

    The road turned away from the sea.

    ‘Post dead?’

    ‘When they take a new island, they sink wood poles in the beach. If they kill any of us, they tie the body to the pole.

    My team was on an island when it was taken. In camouflage, watching them. After dark, our canoes slipped away.’

    ‘Where are these islands?’

    ‘On an old map. When there was ice. Maps I know you know about. Between where the Ross Ice shelf and the Shackleton Ice shelf used to be. There are islands in the sea. The islands were seeded with green one hundred and fifty years ago when the empire was young…
