
Javier E

Economics of Good and Evil: The Quest for Economic Meaning from Gilgamesh to Wall Stree...

  • Instead of self-confident and self-centered answers, the author humbly asks fundamental questions: What is economics? What is its meaning? Where does this new religion, as it is sometimes called, come from? What are its possibilities and its limitations and borders, if there are any? Why are we so dependent on permanent growing of growth and growth of growing of growth? Where did the idea of progress come from, and where is it leading us? Why are so many economic debates accompanied by obsession and fanaticism?
  • The majority of our political parties act with a narrow materialistic focus when, in their programs, they present the economy and finance first; only then, somewhere at the end, do we find culture as something pasted on or as a libation for a couple of madmen.
  • Most of them—consciously or unconsciously—accept and spread the Marxist thesis of the economic base and the spiritual superstructure.
  • He tries to break free of narrow specialization and cross the boundaries between scientific disciplines. Expeditions beyond economics’ borders and its connection to history, philosophy, psychology, and ancient myths are not only refreshing, but necessary for understanding the world of the twenty-first century.
  • Reality is spun from stories, not from material. Zdeněk Neubauer
  • Before it was emancipated as a field, economics lived happily within subsets of philosophy—ethics, for example—miles away from today’s concept of economics as a mathematical-allocative science that views “soft sciences” with a scorn born from positivistic arrogance. But our thousand-year “education” is built on a deeper, broader, and oftentimes more solid base. It is worth knowing about.
  • Outside of our history, we have nothing more.
  • The study of the history of a certain field is not, as is commonly held, a useless display of its blind alleys or a collection of the field’s trials and errors (until we got it right), but history is the fullest possible scope of study of a menu that the given field can offer.
  • History of thought helps us to get rid of the intellectual brainwashing of the age, to see through the intellectual fashion of the day, and to take a couple of steps back.
  • “The separation between the history of a science, its philosophy, and the science itself dissolves into thin air, and so does the separation between science and non-science; differences between the scientific and unscientific are vanishing.”
  • We seek to chart the development of the economic ethos. We ask questions that come before any economic thinking can begin—both philosophically and, to a degree, historically. The area here lies at the very borders of economics—and often beyond. We may refer to this as protoeconomics (to borrow a term from protosociology) or, perhaps more fittingly, metaeconomics (to borrow a term from metaphysics).
  • Even Adam Smith believed in stories. As he puts it in The Theory of Moral Sentiments, “the desire of being believed, or the desire of persuading, of leading and directing other people, seems to be one of the strongest of all our natural desires.”
  • “The human mind is built to think in terms of narratives … in turn, much of human motivation comes from living through a story of our lives, a story that we tell to ourselves and that creates a framework of our motivation. Life could be just ‘one damn thing after another’ if it weren’t for such stories. The same is true for confidence in a nation, a company, or an institution. Great leaders are foremost creators of stories.”
  • Contrary to what our textbooks say, economics is predominantly a normative field. Economics not only describes the world but is frequently about how the world should be (it should be efficient; we have an ideal of perfect competition, an ideal of high GDP growth with low inflation, the effort to achieve high competitiveness…). To this end, we create models, modern parables,
  • I will try to show that mathematics, models, equations, and statistics are just the tip of the iceberg of economics; that the biggest part of the iceberg of economic knowledge consists of everything else; and that disputes in economics are rather a battle of stories and various metanarratives than anything else.
  • That is the reason for this book: to look for economic thought in ancient myths and, vice versa, to look for myths in today’s economics.
  • It is a paradox that a field that primarily studies values wants to be value-free. One more paradox is this: a field that believes in the invisible hand of the market wants to be without mysteries.
  • Almost all of the key concepts by which economics operates, both consciously and unconsciously, have a long history, and their roots extend predominantly outside the range of economics, and often completely beyond that of science.
  • The History of Animal Spirits: Dreams Never Sleep
  • In this sense, “the study of economics is too narrow and too fragmentary to lead to valid insight, unless complemented and completed by a study of metaeconomics.”17
  • The more important elements of a culture or field of inquiry such as economics are found in fundamental assumptions that adherents of all the various systems within the epoch unconsciously presuppose. Such assumptions appear so obvious that people do not know what they are assuming, because no other way of putting things has ever occurred to them, as the philosopher Alfred Whitehead notes in Adventures of Ideas.
  • I argue that economic questions were with mankind long before Adam Smith. I argue that the search for values in economics did not start with Adam Smith but culminated with him.
  • We should go beyond economics and study what beliefs are “behind the scenes,” ideas that have often become the dominant yet unspoken assumptions in our theories. Economics is surprisingly full of tautologies that economists are predominantly unaware of.
  • I argue that economics should seek, discover, and talk about its own values, although we have been taught that economics is a value-free science. I argue that none of this is true and that there is more religion, myth, and archetype in economics than there is mathematics.
  • In a way, this is a study of the evolution of both homo economicus and, more importantly, the history of the animal spirits within him. This book tries to study the evolution of the rational as well as the emotional and irrational side of human beings.
  • I argue that his most influential contribution to economics was ethical. His other thoughts had been clearly expressed long before him, whether on specialization, or on the principle of the invisible hand of the market. I try to show that the principle of the invisible hand of the market is much more ancient and developed long before Adam Smith. Traces of it appear even in the Epic of Gilgamesh, Hebrew thought, and in Christianity, and it is expressly stated by Aristophanes and Thomas Aquinas.
  • This is not a book on the thorough history of economic thought. The author aims instead to supplement certain chapters on the history of economic thought with a broader perspective and analysis of the influences that often escape the notice of economists and the wider public.
  • Progress (Naturalness and Civilization)
  • The Economy of Good and Evil
  • From his beginnings, man has been marked as a naturally unnatural creature, who for unique reasons surrounds himself with external possessions. Insatiability, both material and spiritual, is a basic human metacharacteristic, which appears as early as the oldest myths and stories.
  • The Hebrews, with linear time, and later the Christians gave us the ideal (or amplified the Hebrew ideal) we now embrace. Then the classical economists secularized progress. How did we come to today’s progression of progress, and growth for growth’s sake?
  • The Need for Greed: The History of Consumption and Labor
  • Metamathematics: From where did economics get the concept of numbers as the very foundation of the world?
  • Is mathematics at the core of economics, or is it just the icing on the cake, the tip of the iceberg of our field’s inquiry?
  • The idea that we can manage to utilize our natural egoism, and that this evil is good for something, is an ancient philosophical and mythical concept. We will also look into the development of the ethos of homo economicus, the birth of “economic man.”
  • All of economics is, in the end, economics of good and evil. It is the telling of stories by people of people to people. Even the most sophisticated mathematical model is, de facto, a story, a parable, our effort to (rationally) grasp the world around us.
  • Masters of the Truth
  • Originally, truth was a domain of poems and stories, but today we perceive truth as something much more scientific, mathematical. Where does one go (to shop) for the truth? And who “has the truth” in our epoch?
  • Our animal spirits (something of a counterpart to rationality) are influenced by the archetype of the hero and our concept of what is good.
  • The entire history of ethics has been ruled by an effort to create a formula for the ethical rules of behavior. In the final chapter we will show the tautology of Max Utility, and we will discuss the concept of Max Good.
  • The History of the Invisible Hand of the Market and Homo Economicus
  • We understand “economics” to mean a broader field than just the production, distribution, and consumption of goods and services. We consider economics to be the study of human relations that are sometimes expressible in numbers, a study that deals with tradables, but one that also deals with nontradables (friendship, freedom, efficiency, growth).
  • When we mention economics in this book, we mean the mainstream perception of it, perhaps as best represented by Paul Samuelson.
  • By the term homo economicus, we mean the primary concept of economic anthropology. It comes from the concept of a rational individual, who, led by narrowly egotistical motives, sets out to maximize his benefit.
  • The Epic of Gilgamesh bears witness to the opposite—despite the fact that the first written clay fragments (such as notes and bookkeeping) of our ancestors may have been about business and war, the first written story is mainly about great friendship and adventure.
  • There is no mention of either money or war; for example, not once does anyone in the whole epic sell or purchase anything.5 No nation conquers another, and we do not encounter even a mention of the threat of violence.
  • It is a story of nature and civilization, of heroism, defiance, and the battle against the gods, and evil; an epic about wisdom, immortality, and also futility.
  • Gilgamesh becomes a hero not only due to his strength, but also due to discoveries and deeds whose importance was in large part economic—the direct gaining of construction materials in the case of felling the cedar forest, stopping Enkidu from devastating Uruk’s economy, and discovering new desert routes during his expeditions.
  • Even today, we often consider the domain of humanity (human relations, love, friendship, beauty, art, etc.) to be unproductive;
  • Even today we live in Gilgamesh’s vision that human relations—and therefore humanity itself—are a disturbance to work and efficiency; that people would perform better if they did not “waste” their time and energy on nonproductive things.
  • But it is in friendship where—often incidentally, as a side product, an externality—ideas and deeds are frequently performed or created that together can altogether change the face of society.19 Friendship can go against an ingrained system in places where an individual does not have the courage to do so himself or herself.
  • As Joseph Stiglitz says, One of the great “tricks” (some say “insights”) of neoclassical economics is to treat labour like any other factor of production. Output is written as a function of inputs—steel, machines, and labour. The mathematics treats labour like any other commodity, lulling one into thinking of labour like an ordinary commodity, such as steel or plastic.
  • Even the earliest cultures were aware of the value of cooperation on the working level—today we call this collegiality, fellowship, or, if you want to use a desecrated term, comradeship. These “lesser relationships” are useful and necessary for society and for companies because work can be done much faster and more effectively if people get along with each other on a human level
  • But true friendship, which becomes one of the central themes of the Epic of Gilgamesh, comes from completely different material than teamwork. Friendship, as C. S. Lewis accurately describes it, is completely uneconomical, unbiological, unnecessary for civilization, and an unneeded relationship
  • Here we have a beautiful example of the power of friendship, one that knows how to transform (or break down) a system and change a person. Enkidu, sent to Gilgamesh as a punishment from the gods, in the end becomes his faithful friend, and together they set out against the gods. Gilgamesh would never have gathered the courage to do something like that on his own—nor would Enkidu.
  • Due to their friendship, Gilgamesh and Enkidu then intend to stand up to the gods themselves and turn a holy tree into mere (construction) material they can handle almost freely, thereby making it a part of the city-construct, part of the building material of civilization, thus “enslaving” that which originally was part of wild nature. This is a beautiful proto-example of the shifting of the borders between the sacred and profane (secular)—and to a certain extent also an early illustration of the idea that nature is there to provide cities and people with raw material and production resources.
  • This started with the Babylonians—rural nature becomes just a supplier of raw materials, resources (and humans the source of human resources). Nature is not the garden in which humans were created and placed, which they should care for and in which they should reside, but becomes a mere reservoir for natural (re)sources.
  • But labour is unlike any other commodity. The work environment is of no concern for steel; we do not care about steel’s well-being.16
  • Both heroes change—each from opposite poles—into humans. In this context, a psychological dimension to the story may be useful: “Enkidu (…) is Gilgamesh’s alter ego, the dark, animal side of his soul, the complement to his restless heart. When Gilgamesh found Enkidu, he changed from a hated tyrant into the protector of his city. (…)
  • To be human seems to be somewhere in between, or both of these two.
  • At this moment of rebirth from an animal to a human state, the world’s oldest preserved epic implicitly hints at something highly important. Here we see what early cultures considered the beginning of civilization. Here is depicted the difference between people and animals or, better, savages. Here the epic quietly describes birth, the awakening of a conscious, civilized human. We are witnesses to the emancipation of humanity from animals.
  • The entire history of culture is dominated by an effort to become as independent as possible from the whims of nature.39 The more developed a civilization is, the more an individual is protected from nature and natural influences and knows how to create around him a constant or controllable environment to his liking.
  • The price we pay for independence from the whims of nature is dependence on our societies and civilizations. The more sophisticated a given society is as a whole, the less its members are able to survive on their own as individuals, without society.
  • The epic captures one of the greatest leaps in the development of the division of labor. Uruk itself is one of the oldest cities of all, and in the epic it reflects a historic step forward in specialization—in the direction of a new social city arrangement. Because of the city wall, people in the city can devote themselves to things other than worrying about their own safety, and they can continue to specialize more deeply.
  • Human life in the city gains a new dimension and suddenly it seems more natural to take up issues going beyond the life span of an individual. “The city wall symbolizes as well as founds the permanence of the city as an institution which will remain forever and give its inhabitants the certainty of unlimited safety, allowing them to start investing with an outlook reaching far beyond the borders of individual life.
  • The wall around the city of Uruk is, among other things, a symbol of an internal distancing from nature, a symbol of revolts against submission to laws that do not come under the control of man and that man can at most discover and use to his benefit.
  • “The chief thing which the common-sense individual wants is not satisfactions for the wants he had, but more, and better wants.”47
  • If a consumer buys something, theoretically it should rid him of one of his needs—and the aggregate of things he needs should decrease by one item. In reality, though, the aggregate of “I want to have” expands together with the growing aggregate of “I have.”
  • It can be said that Enkidu was therefore happy in his natural state, because all of his needs were satiated. With people, on the other hand, it appears that the more a person has, the more developed and richer he is, the greater the number of his needs (including the unsatiated ones).
  • In the Old Testament, this relationship is perceived completely differently. Man (humanity) is created in nature, in a garden. Man was supposed to care for the Garden of Eden and live in harmony with nature and the animals. Soon after creation, man walks naked and is not ashamed, de facto the same as the animals. What is characteristic is that man dresses (the natural state of creation itself is not enough for him), and he (literally and figuratively) covers52 himself—in shame after the fall.53
  • Nature is where one goes to hunt, collect crops, or gather the harvest. It is perceived as the saturator of our needs and nothing more. One goes back to the city to sleep and be “human.” On the contrary, evil resides in nature. Humbaba lives in the cedar forest, which also happens to be the reason to completely eradicate it.
  • Symbolically, then, we can view the entire issue from the standpoint of the epic in the following way: Our nature is insufficient, bad, evil, and good (humane) occurs only after emancipation from nature (from naturalness), through culturing and education. Humanity is considered as being in civilization.
  • The city was frequently (at least in older Jewish writings) a symbol of sin, degeneration, and decadence—nonhumanity. The Hebrews were originally a nomadic nation, one that avoided cities. It is no accident that the first important city57 mentioned in the Bible is proud Babylon,58 which God later turns to dust.
  • It is enough, for example, to read the Book of Revelation to see how the vision of paradise developed from the deep Old Testament period, when paradise was a garden. John describes his vision of heaven as a city—paradise is in New Jerusalem, a city where the dimensions of the walls(!) are described in detail, as are the golden streets and gates of pearl.
  • The Hebrews later also chose a king (despite the unanimous opposition of God’s prophets) and settled in cities, where they eventually founded the Lord’s Tabernacle and built a temple for Him. The city of Jerusalem later gained an illustrious position in all of religion.
  • This time, Christianity (as well as the influence of the Greeks) does not consider human naturalness to be an unambiguous good, and it does not have such an idyllic relationship to nature as the Old Testament prophets.
  • If a tendency toward good is not naturally endowed in people, it must be imputed from above through violence or at least the threat of violence.
  • If we were to look at human naturalness as a good, then collective social actions need a much weaker ruling hand. If people themselves have a natural tendency (propensity) toward good, this role does not have to be supplied by the state, ruler, or, if you wish, Leviathan.
  • How does this affect economics?
  • Let us return for the last time to the humanization of the wild Enkidu, a process we can perceive, with a bit of imagination, as the first seed of the principle of the market’s invisible hand, and therefore the parallels with one of the central schematics of economic thinking.
  • Sometimes it is better to “harness the devil to the plow” than to fight with him. Instead of summoning up enormous energy in the fight against evil, it is better to use its own energy to reach a goal we desire; setting up a mill on the turbulent river instead of futile efforts to remove the current. This is also how Saint Prokop approached it in one of the oldest Czech legends.
  • Enkidu caused damage, and it was impossible to fight against him. But with the help of a trap, a trick, this evil was transformed into something that greatly benefited civilization.
  • By culturing and “domesticating” Enkidu, humanity tamed the uncontrollable wild and chaotic evil
  • Enkidu devastated the doings (the external, outside-the-walls) of the city. But he was later harnessed and fights at the side of civilization against nature, naturalness, the natural state of things.
  • A similar motif appears a thousand years after the reversal, which is well known even to noneconomists as the central idea of economics: the invisible hand of the market.
  • A similar story (reforming something animally wild and uncultivated into a civilizational achievement) is used by Thomas Aquinas in his teachings. Several centuries later, this idea is fully emancipated in the hands of Bernard Mandeville and his Fable of the Bees: or, Private Vices, Publick Benefits. The economic and political aspects of this idea are—often incorrectly—ascribed to Adam Smith.
  • Here the individual no longer tries to maximize his goods or profits; what is important is writing his name into human memory in the form of heroic acts or deeds.
  • immortality, one connected with letters and the cult of the word: A name and especially a written name survives the body.”77
  • After this disappointment, he comes to the edge of the sea, where the innkeeper Siduri lives. As tonic for his sorrow, she offers him the garden of bliss, a sort of hedonistic fortress of carpe diem, where a person comes to terms with his mortality and at least in the course of the end of his life maximizes earthly pleasures, or earthly utility.
  • In the second stage, after finding his friend Enkidu, Gilgamesh abandons the wall and sets out beyond the city to maximize heroism. “In his (…) search of immortal life, Gilgamesh
  • The hero refuses hedonism in the sense of maximizing terrestrial pleasure and throws himself into things that will exceed his life. In the blink of an eye, the epic turns on its head the entire utility maximization role that mainstream economics has tirelessly tried to sew on people as a part of their nature.81
  • It is simpler to observe the main features of our civilization at a time when the picture was more readable—at a time when our civilization was just being born and was still “half-naked.” In other words, we have tried to dig down to the bedrock of our written civilization;
  • Today we remember Gilgamesh for his story of heroic friendship with Enkidu, not for his wall, which no longer reaches monumental heights.
  • In the eleventh and final tablet, Gilgamesh again loses what he sought. Like Sisyphus, he misses his goal just before the climax
  • Is there something from it that is valid today? Have we found in Gilgamesh certain archetypes that are in us to this day?
  • The very existence of questions similar to today’s economic ones can be considered as the first observation. The first written considerations of the people of that time were not so different from those today. In other words: The epic is understandable for us, and we can identify with it.
  • We have also been witnesses to the very beginnings of man’s culturing—a great drama based on a liberation and then a distancing from the natural state.
  • Let us take this as a memento in the direction of our restlessness, our inherited dissatisfaction and the volatility connected to it. Considering that they have lasted five thousand years and to this day we find ourselves in harmony with a certain feeling of futility, perhaps these characteristics are inherent in man.
  • Gilgamesh had a wall built that divided the city from wild nature and created a space for the first human culture. Nevertheless, “not even far-reaching works of civilization could satisfy human desire.”
  • Friendship shows us new, unsuspected adventures, gives us the opportunity to leave the wall and to become neither its builder nor its part—to not be another brick in the wall.
  • With the phenomenon of the creation of the city, we have seen how specialization and the accumulation of wealth were born, how holy nature was transformed into a secular supplier of resources, and also how humans’ individualistic ego was emancipated.
  • To change the system, to break down that which is standing and go on an expedition against the gods (to awaken, from naïveté to awakening), requires friendship.
  • For small acts (hunting together, work in a factory), small love is enough: Camaraderie. For great acts, however, great love is necessary, real love: Friendship. Friendship that eludes the economic understanding of quid pro quo. Friendship gives. One friend gives (fully) for the other. That is friendship for life and death,
  • The thought that humanity comes at the expense of efficiency is just as old as humanity itself—as we have shown, subjects without emotion are the ideal of many tyrants.
  • The epic later crushes this idea through the friendship of Gilgamesh and Enkidu. Friendship—the biologically least essential love, which at first sight appears to be unnecessary
  • The less a civilized, city person is dependent on nature, the more he or she is dependent on the rest of society. Like Enkidu, we have exchanged nature for society; harmony with (incalculable) nature for harmony with (incalculable) man.
  • Is human nature good or evil? To this day these questions are key for economic policy: If we believe that man is evil by nature, that a person is dog-eat-dog (animal), then the hard hand of a ruler is called for. If we believe that people, in and of themselves, in their nature, gravitate toward good, then it is possible to loosen up the reins and live in a society that is more laissez-faire.
  • For a concept of historical progress, for the undeification of heroes, rulers, and nature, mankind had to wait for the Hebrews.
  • Because nature is not undeified, it is beyond consideration to explore it, let alone intervene in it (unless a person was a two-thirds god like Gilgamesh).
  • They practiced money lending, traded in many assets (…) and especially were engaged in the trading of shares on capital markets, worked in currency exchange and frequently figured as mediators in financial transactions (…), they functioned as bankers and participated in emissions of all possible forms.
  • As regards modern capitalism (as opposed to the ancient and medieval periods) … there are activities in it which are, in certain forms, inherently (and completely necessarily) present—both from an economic and legal standpoint.7
  • As early as the “dark” ages, the Jews commonly used economic tools that were in many ways ahead of their time and that later became key elements of the modern economy:
  • Gilgamesh’s story ends where it began. There is a consistency in this with Greek myths and fables: At the end of the story, no progress occurs, no essential historic change; the story is set in indefinite time, something of a temporal limbo.
  • Jews believe in historical progress, and that progress is in this world.
  • For a nation originally based on nomadism, where did this Jewish business ethos come from? And can the Hebrews truly be considered as the architects of the values that set the direction of our civilization’s economic thought?
  • Hebrew religiosity is therefore strongly connected with this world, not with any abstract world, and those who take pleasure in worldly possessions are not a priori doing anything wrong.
  • PROGRESS: A SECULARIZED RELIGION
    One of the things the writers of the Old Testament gave to mankind is the idea and notion of progress. The Old Testament stories have their development; they change the history of the Jewish nation and tie in to each other. The Jewish understanding of time is linear—it has a beginning and an end.
  • The observance of God’s Commandments in Judaism leads not to some ethereal other world, but to an abundance of material goods (Genesis 49:25–26, Leviticus 26:3–13, Deuteronomy 28:1–13) (…) There are no accusing fingers pointed at
  • There are no echoes of asceticism, nor of the cleansing and spiritual effect of poverty. It is fitting, therefore, that the founders of Judaism, the Patriarchs Abraham, Isaac, and Jacob, were all wealthy men.12
  • This came about due to a linear understanding of history. If history has a beginning as well as an end, and they are not the same point, then exploration suddenly makes sense in areas where the fruits are borne only in the next generation.
  • What’s more, economic progress has almost become an assumption of modern functional societies. We expect growth. We take it for granted. Today, if nothing “new” happens, if GDP does not grow (we say it stagnates) for several quarters, we consider it an anomaly.
  • However, the idea of progress itself underwent major changes, and today we perceive it very differently. As opposed to the original spiritual conceptions, today we perceive progress almost exclusively in an economic or scientific-technological sense.
  • Because care for the soul has today been replaced by care for external things,
  • This is why we must constantly grow, because we (deep down and often implicitly) believe that we are headed toward an (economic) paradise on Earth.
  • Only since the period of scientific-technological revolution (and at a time when economics was born as an independent field) is material progress automatically assumed.
  • Jewish thought is the most grounded, most realistic school of thought of all those that have influenced our culture.17 An abstract world of ideas was unknown to the Jews. To this day it is still forbidden to even depict God, people, and animals in symbols, paintings, statues, and drawings.
  • Economists have become key figures of great importance in our time (Kacířské eseje o filosofii dějin [Heretical Essays in the Philosophy of History]). They are expected to perform interpretations of reality, give prophetic services (macroeconomic forecasts), reshape reality (mitigate the impacts of the crisis, speed up growth), and, in the long run, provide leadership on the way to the Promised Land—paradise on Earth.
  • REALISM AND ANTIASCETICISM
    Aside from ideas of progress, the Hebrews brought another very fundamental contribution to our culture: the desacralization of heroes, nature, and rulers.
  • Voltaire writes: “The certain fact is, that in his public laws he [Moses] never so much as once made mention of a life to come, limiting all punishments and all rewards to the present life.”21
  • As opposed to Christianity, the concept of an extraterrestrial paradise or heaven was not developed much in Hebrew thought.19 The paradise of the Israelites—Eden—was originally placed on Earth at a given place in Mesopotamia20 and at a given time,
  • The Hebrews consider the world to be real—not just a shadow reflection of a better world somewhere in the cloud of ideas, something the usual interpretation of history ascribes to Plato. The soul does not struggle against the body and is not its prisoner, as Augustine would write later.
  • The land, the world, the body, and material reality are for Jews the paramount setting for divine history, the pinnacle of creation. This idea is the conditio sine qua non of the development of economics, something of an utterly earthly making,
  • The mythology of the hero-king was strongly developed in that period, which Claire Lalouette summarizes into these basic characteristics: Beauty (a perfect face, one “pleasant to look upon”; the Egyptian word nefer denotes not only aesthetic but also moral qualities),
  • THE HERO AND HIS UNDEIFICATION: THE DREAM NEVER SLEEPS The concept of the hero is more important than it might appear. It may be the remote origin of Keynes’s animal spirits, or the desire to follow a kind of internal archetype that a given individual accepts as his own and that society values.
  • This internal animator of ours, our internal mover, this dream, never sleeps and it influences our behavior—including economic behavior—more than we want to realize.
  • manliness and strength,28 knowledge and intelligence,29 wisdom and understanding, vigilance and performance, fame and renown (fame which overcomes enemies because “a thousand men would not be able to stand firmly in his presence”);30 the hero is a good shepherd (who takes care of his subordinates), is a copper-clad rampart, the shield of the land, and the defender of heroes.
  • Each of us probably has a sort of “hero within”—a kind of internal role-model, template, an example that we (knowingly or not) follow. It is very important what kind of archetype it is, because its role is dominantly irrational and changes depending on time and the given civilization.
  • The oldest was the so-called Trickster—a fraudster; then the culture bearer—Rabbit; the musclebound hero called Redhorn; and finally the most developed form of hero: the Twins.
  • the Egyptian ruler, just as the Sumerian, was partly a god, or the son of a god.31
  • Jacob defrauds his father Isaac and steals his brother Esau’s blessing of the firstborn. Moses murders an Egyptian. King David seduces the wife of his military commander and then has him killed. In his old age, King Solomon turns to pagan idols, and so on.
  • Anthropology knows several archetypes of heroes. The Polish-born American anthropologist Paul Radin examined the myths of North American Indians and, for example, in his most influential book, The Trickster, he describes their four basic archetypes of heroes.
  • The Torah’s heroes (if that term can be used at all) frequently make mistakes and their mistakes are carefully recorded in the Bible—maybe precisely so that none of them could be deified.32
  • We do not have to go far for examples. Noah gets so drunk he becomes a disgrace; Lot lets his own daughters seduce him in a similar state of drunkenness. Abraham lies and (repeatedly) tries to sell his wife as a concubine.
  • the Hebrew heroes correspond most to the Tricksters, the Culture Bearers, and the Twins. The divine muscleman, that dominant symbol we think of when we say hero, is absent here.
  • To a certain extent it can be said that the Hebrews—and later Christianity—added another archetype, the archetype of the heroic Sufferer.35 Job
  • Undeification, however, does not mean a call to pillage or desecration; man was put here to take care of nature (see the story of the Garden of Eden or the symbolism of the naming of the animals). This protection and care of nature is also related to the idea of progress
  • For the heroes who moved our civilization to where it is today, the heroic archetypes of the cunning trickster, culture bearer, and sufferer are rather more appropriate.
  • the Old Testament strongly emphasizes the undeification of nature.37 Nature is God’s creation, which speaks of divinity but is not the domain of moody gods
  • This is very important for democratic capitalism, because the Jewish heroic archetype lays the groundwork much better for the development of the later phenomenon of the hero, which better suits life as we know it today. “The heroes laid down their arms and set about trading to become wealthy.”
  • in an Old Testament context, the pharaoh was a mere man (whom one could disagree with, and who could be resisted!).
  • RULERS ARE MERE MEN In a similar historical context, the Old Testament teachings carried out a similar desacralization of rulers, the so-called bearers of economic policy.
  • Ultimately the entire idea of a political ruler stood against the Lord’s will, which is explicitly presented in the Torah. The Lord unequivocally preferred the judge as the highest form of rule.
  • The needs of future generations will have to be considered; after all humankind are the guardians of God’s world. Waste of natural resources, whether privately owned or nationally owned is forbidden.”39
  • Politics lost its character of divine infallibility, and political issues were subject to questioning. Economic policy could become a subject of examination.
  • God first creates with the word and then on individual days He divides light from darkness, water from dry land, day from night, and so forth—and He gives order to things.45 The world is created orderly—it is wisely, reasonably put together. The way of the world is put together so as to be at least partially46 decipherable by any other wise and reasonable being who honors rational rules.
  • which for the methodology of science and economics is very important because disorder and chaos are difficult to examine scientifically.43 Faith in some kind of rational and logical order in a system (society, the economy) is a silent assumption of any (economic) examination.
  • THE PRAISE OF ORDER AND WISDOM: MAN AS A PERFECTER OF CREATION The created world has an order of sorts, an order recognizable by us as people,
  • From the very beginning, when God distances Himself from the entire idea, there is an anticipation that there is nothing holy, let alone divine, in politics. Rulers make mistakes, and it is possible to subject them to tough criticism—which frequently occurs indiscriminately through the prophets in the Old Testament.
  • Hebrew culture laid the foundations for the scientific examination of the world.
  • Examining the world is therefore an absolutely legitimate activity, and one that is even requested by God—it is a kind of participation in the Creator’s work.51 Man is called on to understand himself and his surroundings and to use his knowledge for good.
  • I was there when he set the heavens in place, when he marked out the horizon on the face of the deep (…) Then I was the craftsman at his side.47
  • There are more urgings to gain wisdom in the Old Testament. “Wisdom calls aloud in the street (…): ‘How long will you simple ones love your simple ways?’”49 Or several chapters later: “Wisdom is supreme; therefore get wisdom. Though it cost all you have, get understanding.”50
  • examination is not forbidden. The fact that order can be grasped by human reason is another unspoken assumption that serves as a cornerstone of any scientific examination.
  • then, my sons, listen to me; blessed are those who keep my ways (…) Blessed is the man who listens to me, watching daily at my doors, waiting at my doorway. For whoever finds me finds life and receives favor from the Lord.
  • the rational examination of nature has its roots, surprisingly, in religion.
  • The Lord brought me forth as the first of his works, before his deeds of old. I was appointed from eternity, from the beginning, before the world began. When there were no oceans, I was given birth, when there were no springs abounding with water, before the mountains were settled in place,
  • The Book of Proverbs emphasizes specifically several times that it was wisdom that was present at the creation of the world. Wisdom personified calls out:
  • The last act, final stroke of the brush of creation, naming of the animals—this act is given to a human, it is not done by God, as one would expect. Man was given the task of completing the act of creation that the Lord began:
  • MAN AS A FINISHER OF CREATION The creation of the world, as it is explained in Jewish teachings, is described in the Book of Genesis. Here God (i) creates, (ii) separates, and (iii) names [my emphasis]:
  • Naming is a symbolic expression. In Jewish culture (and also in our culture to this day), the right to name meant sovereign rights and belonged, for example, to explorers (new places), inventors (new principles), or parents (children)—that is, to those who were there at the genesis, at the origin. This right was handed over by God to mankind.
  • The Naming itself (the capital N is appropriate) traditionally belongs to the crowning act of the Creator and represents a kind of grand finale of creation, the last move of the brush to complete the picture—a signature of the master.
  • Without naming, reality does not exist; it is created together with language. Wittgenstein puts this pointedly in his Tractatus: the limits of our language are the limits of our world.53
  • He invented (fictitiously and completely abstractly!) a framework that was generally accepted and soon “made into” reality. Marx invented similarly; he created the notion of class exploitation. Through his idea, the perception of history and reality was changed for a large part of the world for nearly an entire century.
  • Reality is not a given; it is not passive. Perceiving reality and “facts” requires man’s active participation. It is man who must take the last step, an act of creation.
  • How does this relate to economics? Reality itself, our “objective” world, is cocreated; man himself participates in a creation that is constantly being re-created.
  • Our scientific models put the finishing touches on reality, because (1) they interpret, (2) they give phenomena a name, (3) they enable us to classify the world and phenomena according to logical forms, and (4) through these models we de facto perceive reality.
  • When man finds a new linguistic framework or analytical model, or stops using the old one, he molds or remolds reality. Models are only in our heads; they are not “in objective reality.” In this sense, Newton invented (not merely discovered!) gravity.
  • A real-ization act on our part represents the creation of a construct, the imputation of sense and order (which is beautifully expressed by the biblical act of naming, or categorization, sorting, ordering).
  • Keynes enters into the history of economic thought from the same intellectual cadence; his greatest contribution to economics was precisely the resurrection of the imperceptible—for example in the form of animal spirits or uncertainty. The economist Piero Mini even ascribes Keynes’s doubting and rebellious approach to his almost Talmudic education.63
  • God connects man with the task of guarding and protecting the Garden of Eden, and thus man actually cocreates the cultural landscape. The Czech philosopher Zdeněk Neubauer also describes this: “Such is reality, and it is so deep that it willingly crystallizes into worlds. Therefore I profess that reality is a creation and not a place of occurrence for objectively given phenomena.”61
  • in this viewpoint it is possible to see how Jewish thought is mystical—it admits the role of the incomprehensible. Therefore, through its groundedness, Jewish thought indulges mystery and defends itself against a mechanistic-causal explanation of the world: “The Jewish way of thinking, according to Veblen, emphasizes the spiritual, the miraculous, the intangible.
  • The Jews believed the exact opposite. The world is created by a good God, and evil appears in it as a result of immoral human acts. Evil, therefore, is induced by man.66 History unwinds according to the morality of human acts.
  • What’s more, history seems to be based on morals; morals seem to be the key determining factors of history. For the Hebrews, history proceeds according to how morally its actors behave.
  • The Sumerians believed in dualism—good and evil deities exist, and the earth of people becomes their passive battlefield.
  • GOOD AND EVIL IN US: A MORAL EXPLANATION OF WELL-BEING We have seen that in the Epic of Gilgamesh, good and evil are not yet addressed systematically on a moral level.
  • This was not about moral-human evil, but rather a kind of natural evil. It is as if good and evil were not touched by morality at all. Evil simply occurred. Period.
  • the epic, good and evil are not envisaged morally—they are not the result of an (a)moral act. Evil was not associated with free moral action or individual will.
  • Hebrew thought, on the other hand, deals intensively with moral good and evil. A moral dimension touches the core of its stories.65
  • Some economists see the cause in a discrepancy between savings and investment, and others are convinced of the monetary essence of the cycle.
  • The entire history of the Jewish nation is interpreted and perceived in terms of morality. Morality has become, so to speak, a mover and shaker of Hebrew history.
  • Still others blame sunspots. The Hebrews came up with the idea that morals were behind good and bad years, behind the economic cycle. But we would be getting ahead of ourselves. Pharaoh’s Dream: Joseph and the First Business Cycle
  • It is the Pharaoh’s well-known dream of seven fat and seven lean cows, which he told to Joseph, the son of Jacob. Joseph interpreted the dream as a macroeconomic prediction of sorts: Seven years of abundance were to be followed by seven years of poverty, famine, and misery.
  • Self-Contradicting Prophecy Here, let us make several observations on this: Through taxation74 at the level of one-fifth of the crop75 in good years, storing the surplus and then opening the granaries in bad years, the prophecy was de facto prevented (prosperous years were moderated and hunger averted—through a predecessor of fiscal stabilization).
  • The Old Testament prophecies therefore were not a deterministic look into the future, but warnings and strategic variations of the possible, which demanded some kind of reaction. If the reaction was adequate, what was prophesied would frequently not occur at all.
  • This principle stands directly against the self-fulfilling prophecy,80 the well-known concept of social science. Certain prophecies become self-fulfilling when expressed (and believed) while others become self-contradicting prophecies when pronounced (and believed).
  • If the threat is anticipated, it is possible to totally or at least partially avoid it. Neither Joseph nor the pharaoh had the power to avoid bounty or crop failure (in this the dream interpretation was true and the appearance of the future mystical), but they avoided the impacts and implications of the prophecy (in this the interpretation of the dream was “false”)—famine did not ultimately occur in Egypt, and this was due to the application of reasonable and very intuitive economic policy.
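The stabilization logic of the story can be sketched numerically. Only the one-fifth tax rate comes from the text; the harvest sizes below are hypothetical, chosen purely for illustration:

```python
# A toy sketch of Joseph's granary policy. Only the one-fifth tax rate
# is from the text; the harvest figures are invented for illustration.
FAT, LEAN, TAX = 100.0, 40.0, 0.20

granary = 0.0
consumption = []
for year in range(14):
    if year < 7:                         # seven fat years: store one-fifth
        granary += TAX * FAT
        consumption.append(FAT * (1 - TAX))
    else:                                # seven lean years: open the granaries
        release = granary / (14 - year)  # spread the remaining stores evenly
        granary -= release
        consumption.append(LEAN + release)

print(consumption[0], consumption[7])  # 80.0 60.0
```

Consumption swings between 80 and 60 instead of 100 and 40: the famine foretold in the dream is, in this sense, contradicted by the very reaction the prophecy provoked.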
  • Let us further note that the first “macroeconomic forecast” appears in a dream.
  • Back to the Torah: Later in this story we will notice that no reason is offered as to why the cycle occurs (that will come later). Fat years simply come, and then lean years after them.
  • Moral Explanation of a Business Cycle That is fundamentally different from later Hebrew interpretations, when the Jewish nation tries to offer reasons why the nation fared well or poorly. And those reasons are moral.
  • If you pay attention to these laws and are careful to follow them, then the Lord your God will keep his covenant of love with you, as he swore to your forefathers. He will love you and bless you and increase your numbers.
  • Only in recent times have some currents of economics again become aware of the importance of morals and trust in the form of measuring the quality of institutions, the level of justice, business ethics, corruption, and so forth, and examining their influence on the economy,
  • From today’s perspective, we can state that the moral dimension entirely disappeared from economic thought for a long time, especially due to the implementation of Mandeville’s concept of private vices that contrarily support the public welfare
  • Without being timid, we can say this is the first documented attempt to explain the economic cycle. The economic cycle, the explanation of which is to this day a mystery to economists, is explained morally in the Old Testament.
  • But how do we consolidate these two conflicting interpretations of the economic cycle: Can ethics be responsible for it or not? Can we influence reality around us through our acts?
  • it is not within the scope of this book to answer that question; justice has been done to the question if it manages to sketch out the main contours of possible searches for answers.
  • THE ECONOMICS OF GOOD AND EVIL: DOES GOOD PAY OFF? This is probably the most difficult moral problem we could ask.
  • Kant, the most important modern thinker in the area of ethics, answers on the contrary that if we carry out a “moral” act on the basis of economic calculus (therefore we carry out a hedonistic consideration; see below) in the expectation of later recompense, its morality is lost. Recompense, according to the strict Kant, annuls ethics.
  • Inquiring about the economics of good and evil, however, is not that easy. Where would Kant’s “moral dimension of ethics” go if ethics paid? If we do good for profit, the question of ethics becomes a mere question of rationality.
  • Job’s friends try to show that he must have sinned in some way and, in doing so, deserved God’s punishment. They are absolutely unable to imagine a situation in which Job, as a righteous man, would suffer without (moral) cause. Nevertheless, Job insists that he deserves no punishment because he has committed no offense: “God has wronged me and drawn his net around me.”94
  • But Job remains righteous, even though it does not pay to do so: Though he slay me, yet will I hope in him.95 And till I die, I will not deny my integrity I will maintain my righteousness and never let go of it; my conscience will not reproach me as long as I live.96
  • He remains righteous, even if his only reward is death. What economic advantage could he have from that?
  • morals cannot be considered in the economic dimension of productivity and calculus. The role of the Hebrews was to do good, whether it paid off or not. If good (outgoing) is rewarded by incoming goodness, it is a bonus,99 not a reason to do outgoing good. Good and reward do not correlate to each other.
  • This reasoning takes on a dimension of its own in the Old Testament. Good (incoming) has already happened to us. We must do good (outgoing) out of gratitude for the good (incoming) shown to us in the past.
  • So why do good? After all, suffering is the fate of many biblical figures. The answer can only be: For good itself. Good has the power to be its own reward. In this sense, goodness gets its reward, which may or may not take on a material dimension.
  • the Hebrews offered an interesting compromise between the teachings of the Stoics and Epicureans. We will go into it in detail later, so only briefly
  • The Hebrew approach works within constraints. It calls for bounded optimization (optimization within limits). A kind of symbiosis existed between the legitimate search for one’s own utility (or enjoyment of life) and maintaining rules, which are not negotiable and which are not subject to optimization.
  • In other words, clear (exogenously given) rules exist that must be observed and cannot be contravened. But within these borders it is absolutely possible, and even recommended, to increase utility.
  • the mining of enjoyment must not come at the expense of exogenously given rules. “Judaism comes therefore to train or educate the unbounded desire … for wealth, so that market activities and patterns of consumption operate within a God-given morality.”102
  • The Epicureans acted with the goal of maximizing utility without regard for rules (rules developed endogenously, from within the system, computed from that which increased utility—this was one of the main trumps of the Epicurean school; they did not need exogenously given norms, and argued that they could “calculate” ethics (what to do) for every given situation from the situation itself).
  • The Stoics could not seek their enjoyment—or, by another name, utility. They could not in any way look back on it, and in no way could they count on it. They could only live according to rules (the greatest weakness of this school was defending where the exogenously given rules came from and whether they are universal) and take an indifferent stance toward the results of their actions.
  • To Love the Law The Jews not only had to observe the law (perhaps the word covenant would be more appropriate), but they were to love it because it was good.
  • Their relationship to the law was not supposed to be one of duty,105 but one of gratitude, love. Hebrews were to do good (outgoing), because goodness (incoming) has already been done to them.
  • This is in stark contrast with today’s legal system, where, naturally, no mention of love or gratefulness exists. But God expects a full internalization of the commandments and their fulfillment with love, not so much out of duty. By no means was this on the basis of the cost-benefit analyses so widespread in economics today, which determine when it pays to break the law and when not to (calculated on the basis of the probability of being caught and the amount of punishment vis-à-vis the possible gain).
  • And now, O Israel, what does the Lord your God ask of you but to fear the Lord your God, to walk in all his ways, to love him, to serve the Lord your God with all your heart and with all your soul, and to observe the Lord’s commands and decrees that I am giving you today for your own good? To the Lord your God belong the heavens, even the highest heavens, the earth and everything in it. Yet the Lord set his affection on your forefathers and loved them….
  • the principle of doing good (outgoing) on the basis of a priori demonstrated good (incoming) was also taken over by the New Testament. Atonement itself is based on an a priori principle; all our acts are preceded by good.
  • The Hebrews, originally a nomadic tribe, preferred to be unrestrained and grew up in constant freedom of motion.
  • Human laws, if they are in conflict with the responsibilities given by God, are subordinate to personal responsibility, and a Jew cannot simply join the majority, even if it is legally allowed. Ethics, the concept of good, is therefore always superior to all local laws, rules, and customs:
  • THE SHACKLES OF THE CITY Owing to the Hebrew’s liberation from Egyptian slavery, freedom and responsibility become the key values of Jewish thought.
  • Laws given by God are binding for Jews, and God is the absolute source of all values,
  • The Hebrew ideal is represented by the paradise of the Garden of Eden, not a city.116 The despised city civilization or the tendency to see in it a sinful and shackling way of life appears in glimpses and allusions in many places in the Old Testament.
  • The nomadic Jewish ethos is frequently derived from Abraham, who left the Chaldean city of Ur on the basis of a command:
  • In addition, they were aware of a thin two-way line between owner and owned. We own material assets, but—to a certain extent—they own us and tie us down. Once we become used to a certain material standard, it begins to own us.
  • This way of life had understandably immense economic impacts. First, such a society lived in much more connected relationships, where there was no doubt that everyone mutually depended on each other. Second, their frequent wanderings meant the inability to own more than they could carry; the gathering up of material assets did not have great weight—precisely because the physical weight (mass) of things was tied to one place.
  • One of Moses’s greatest deeds was that he managed to explain to his nation once and for all that it is better to remain hungry and liberated than to be a slave with food “at no cost.”
  • SOCIAL WELFARE: NOT TO ACT IN THE MANNER OF SODOM
  • A system of social regulations is developed in the Old Testament, one we hardly find in any other nation of the time. In Hebrew teachings, aside from individual utility, indications of the concept of maximizing utility societywide appear for the first time, embodied in the Talmudic principle of Kofin al midat S´dom, which can be translated as “one is compelled not to act in the manner of Sodom” and to take care of the weaker members of society.
  • In a jubilee year, debts were to be forgiven,125 and Israelites who fell into slavery due to their indebtedness were to be set free.126
  • Such provisions can be seen as the antimonopoly and social measures of the time. The economic system even then had a clear tendency to converge toward asset concentration, and therefore power as well. It would appear that these provisions were supposed to prevent this process
  • Land at the time could be “sold,” though it was not a sale but a lease. The price (rent) of real estate depended on how long remained until a forgiveness year. It was about the awareness that we may work the land, but in the last instance we are merely “aliens and strangers,” who have the land only rented to us for a fixed time. All land and riches came from the Lord.
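The pricing rule described above can be written down directly. A linear rent, proportional to the number of harvests remaining before the jubilee, is a simplification consistent with the passage; the yield figures are hypothetical:

```python
def field_price(annual_yield, years_to_jubilee):
    # The buyer pays for the remaining harvests, not for the land itself,
    # which reverts to the original family in the jubilee year.
    return annual_yield * years_to_jubilee

print(field_price(10, 49))  # 490: a field bought right after a jubilee
print(field_price(10, 5))   # 50: the same field five years before one
```

The closer the forgiveness year, the cheaper the “sale,” which is exactly what reveals it to be a lease of harvests rather than a permanent transfer of ownership.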
  • These provisions express a conviction that freedom and inheritance should not be permanently taken away from any Israelite. Last but not least, this system reminds us that no ownership lasts forever and that the fields we plow are not ours but the Lord’s.
  • Glean Another social provision was the right to glean, which in Old Testament times ensured at least basic sustenance for the poorest. Anyone who owned a field had the responsibility not to harvest it to the last grain but to leave the remains in the field for the poor.
  • Tithes and Early Social Net Every Israelite also had the responsibility of levying a tithe from their entire crop. They had to be aware from whom all ownership comes and, by doing so, express their thanks.
  • “Since the community has an obligation to provide food, shelter, and basic economic goods for the needy, it has a moral right and duty to tax its members for this purpose. In line with this duty, it may have to regulate markets, prices and competition, to protect the interests of its weakest members.”135
  • In Judaism, charity is not perceived as a sign of goodness; it is more of a responsibility. Such a society then has the right to regulate its economy in such a way that the responsibility of charity is carried out to its satisfaction.
  • With a number of responsibilities, however, comes the difficulty of getting them into practice. Their fulfillment, then, in cases when it can be done, takes place gradually “in layers.” Charitable activities are classified in the Talmud according to several target groups with various priorities, classified according to, it could be said, rules of subsidiarity.
  • Do not mistreat an alien or oppress him, for you were aliens in Egypt.140 As one can see, aside from widows and orphans, the Old Testament also includes immigrants in its area of social protection.141 The Israelites had to have the same rules apply for them as for themselves—they could not discriminate on the basis of their origin.
  • ABSTRACT MONEY, FORBIDDEN INTEREST, AND OUR DEBT AGE If it appears to us that today’s era is based on money and debt, and our time will be written into history as the “Debt age,” then it will certainly be interesting to follow how this development occurred.
  • Money is a social abstractum. It is a social agreement, an unwritten contract.
  • The first money came in the form of clay tablets from Mesopotamia, on which debts were written. These debts were transferable, so the debts became currency. In the end, “It is no coincidence that in English the root of ‘credit’ is ‘credo,’ the Latin for ‘I believe.’”
  • To a certain extent it could be said that credit, or trust, was the first currency. It can materialize, it can be embodied in coins, but what is certain is that “money is not metal,” even the rarest metal, “it is trust inscribed,”
  • Inseparably, with the original credit (money) goes interest. For the Hebrews, the problem of interest was a social issue: “If you lend money to one of my people among you who is needy, do not be like a moneylender; charge him no interest.”
  • there were also clearly set rules setting how far one could go in setting guarantees and the nonpayment of debts. No one should become indebted to the extent that they could lose the source of their livelihood:
  • In the end, the term “bank” comes from the Italian banchi, the benches that Jewish lenders sat on.157
  • Money is playing not only its classical roles (as a means of exchange, a holder of value, etc.) but also a much greater, stronger role: It can stimulate, drive (or slow down) the whole economy. Money plays a national economic role.
  • In the course of history, however, the role of loans changed, and the rich borrowed especially for investment purposes.
  • Today the position and significance of money and debt has gone so far and reached such a dominant position in society that operating with debts (fiscal policy) or interest or money supply (monetary policy) means that these can, to a certain extent, direct (or at least strongly influence) the whole economy and society.
  • In such a case a ban on interest did not have great ethical significance. Thomas Aquinas, a medieval scholar (1225–1274), thought along similar lines; in his time, the strict ban on lending at usurious interest was loosened, possibly due to him.
  • As a form of energy, money can travel in three dimensions, vertically (those who have capital lend to those who do not) and horizontally (speed and freedom in horizontal or geographic motion has become the by-product—or driving force?—of globalization). But money (as opposed to people) can also travel through time.
  • Money is something like energy that can travel through time. And it is a very useful energy, but at the same time a very dangerous one as well.
  • Aristotle condemned interest162 not only from a moral standpoint, but also for metaphysical reasons. Thomas Aquinas shared the same fear of interest and he too argued that time does not belong to us, and that is why we must not require interest.
  • MONEY AS ENERGY: TIME TRAVEL AND GROSS DEBT PRODUCT (GDP)
  • Due to this characteristic, we can energy-strip the future to the benefit of the present. Debt can transfer energy from the future to the present.163 On the other hand, saving can accumulate energy from the past and send it to the present.
  • labor was not considered degrading in the Old Testament. On the contrary, the subjugation of nature is even a mission from God that originally belonged to man’s very first blessings.
  • LABOR AND REST: THE SABBATH ECONOMY
  • The Jews as well as Aristotle behaved very guardedly toward loans. The issue of interest/usury became one of the first economic debates. Without having an inkling of the future role of economic policy (fiscal and monetary), the ancient Hebrews may have unwittingly felt that they were discovering in interest a very powerful weapon, one that can be a good servant, but (literally) an enslaving master as well.
  • It’s something like a dam. When we build one, we are preventing periods of drought and flooding in the valley; we are limiting nature’s whims and, to a large extent, avoiding its incalculable cycles. Using dams, we can regulate the flow of water to nearly a constant. With it we tame the river (and we can also gain energy from it).
  • But if we do not regulate the water wisely, it may happen that we would overfill the dam and it would break. For the cities lying in the valley, their end would be worse than if a dam were never there.
  • If man lived in harmony with nature before, now, after the fall, he must fight; nature stands against him and he against it and the animals. From the Garden we have moved unto a (battle)field.
  • Only after man’s fall does labor turn into a curse. It could even be said that this is actually the only curse, the curse of the unpleasantness of labor, that the Lord places on Adam.
  • Both Plato and Aristotle consider labor to be necessary for survival, but that only the lower classes should devote themselves to it so that the elites would not have to be bothered with it and so that they could devote themselves to “purely spiritual matters—art, philosophy, and politics.”
  • Work is not only a source of pleasure but also of social standing; it is considered an honor. “Do you see a man skilled in his work? He will serve before kings.” None of the surrounding cultures appreciated work as much. The idea of the dignity of labor is unique in the Hebrew tradition.
  • Hebrew thinking is characterized by a strict separation of the sacred from the profane. In life, there are simply areas that are holy, and in which it is not allowed to economize, rationalize, or maximize efficiency.
  • good example is the commandment on the Sabbath. No one at all could work on this day, not even the ones who were subordinate to an observant Jew:
  • the message of the commandment on Saturday communicated that people were not primarily created for labor.
  • Paradoxically, it is precisely this commandment out of all ten that is probably the most violated today.
  • Aristotle even considers labor to be “a corrupted waste of time which only burdens people’s path to true honour.”
  • the days when we must not toil are connected (at least lexically) with the word meaning emptiness: the English term “vacation” (or emptying), as with the French les vacances, or the German die Freizeit, meaning open time, free time, but also…
  • Translated into economic language: The meaning of utility is not to increase it permanently but to rest among existing gains. Why do we learn how to constantly increase gains but not how to…
  • This dimension has disappeared from today’s economics. Economic effort has no goal at which it would be possible to rest. Today we only know growth for growth’s sake, and if our company or country prospers, that does not…
  • For six-sevenths of time, be dissatisfied and reshape the world into your own image, man; but for one-seventh you will rest and not change the creation. On the seventh day, enjoy creation and enjoy the work of your hands.
  • the purpose of creation was not just creating but that it had an end, a goal. The process was just a process, not a purpose. The whole of Being was created so…
  • Saturday was not established to increase efficiency. It was a real ontological break that followed the example of the Lord’s seventh day of creation. Just as the Lord did not rest due to tiredness or to regenerate strength, but because He was done. He was done with His work, so that He could enjoy it, so that He could cherish His creation.
  • If we believe in rest at all today, it is for different reasons. It is the rest of the exhausted machine, the rest of the weak, and the rest of those who can’t handle the tempo. It’s no wonder that the word “rest…
  • Related to this, we have studied the first mention of a business cycle with the pharaoh’s dream as well as seen a first attempt (that we may call…
  • We have tried to show that the quest for a heaven on Earth (similar to the Jewish one) has, in its desacralized form, actually also been the same quest for many of the…
  • We have also seen that the Hebrews tried to explain the business cycle with morality and ethics. For the Hebrews,…
  • ancient Greek economic ethos, we will examine two extreme approaches to laws and rules. While the Stoics considered laws to be absolutely valid, and utility had infinitesimal meaning in their philosophy, the Epicureans, at least in the usual historical explanation, placed utility and pleasure in first place—rules were to be made based on the principle of utility.
  • CONCLUSION: BETWEEN UTILITY AND PRINCIPLE The influence of Jewish thought on the development of market democracy cannot be overestimated. The key heritage for us was the lack of an ascetic perception of the world, respect for law, and private…
  • We have tried to show how the Torah desacralized three important areas in our lives: the earthly ruler, nature,…
  • What is the relationship between the good and evil that we do (outgoing) and the utility or disutility that we (expect to) get as a reward (incoming)? We have seen…
  • The Hebrews never despised material wealth; on the contrary, the Jewish faith puts great responsibility on property management. Also the idea of progress and the linear perception of time gives our (economic)…
  • the Hebrews managed to find something of a happy compromise between both of these principles.
  • will not be able to completely understand the development of the modern notion of economics without understanding the disputes between the Epicureans and the Stoics;
  • poets actually went even further, and with their speech they shaped and established reality and truth. Honor, adventure, great deeds, and the acclaim connected with them played an important role in the establishment of the true, the real.
  • those who are famous will be remembered by people. They become more real, part of the story, and they start to be “realized,” “made real” in the lives of other people. That which is stored in memory is real; that which is forgotten is as if it never existed.
  • Today’s scientific truth is founded on the notion of exact and objective facts, but poetic truth stands on an interior (emotional) consonance with the story or poem. “It is not addressed first to the brain … [myth] talks directly to the feeling system.”
  • “epic and tragic poets were widely assumed to be the central ethical thinkers and teachers of Greece; nobody thought of their work as less serious, less aimed at truth, than the speculative prose treatises of historians and philosophers.” Truth and reality were hidden in speech, stories, and narration.
  • Ancient philosophy, just as science would later, tries to find constancy, constants, quantities, inalterabilities. Science seeks (creates?) order and neglects everything else as much as it can. In their own experiences, everyone knows that life is not like that,
  • Just as scientists do today, artists drew images of the world that were representative, and therefore symbolic, picturelike, and simplifying (but thus also misleading), just like scientific models, which often do not strive to be “realistic.”
  • general? In the end, poetry could be more sensitive to the truth than the philosophical method or, later, the scientific method. “Tragic poems, in virtue of their subject matter and their social function, are likely to confront and explore problems about human beings and luck that a philosophical text might be able to omit or avoid.”
Javier E

MacIntyre | Internet Encyclopedia of Philosophy - 0 views

  • For MacIntyre, “rationality” comprises all the intellectual resources, both formal and substantive, that we use to judge truth and falsity in propositions, and to determine choice-worthiness in courses of action
  • Rationality in this sense is not universal; it differs from community to community and from person to person, and may both develop and regress over the course of a person’s life or a community’s history.
  • So rationality itself, whether theoretical or practical, is a concept with a history: indeed, since there are also a diversity of traditions of enquiry, with histories, there are, so it will turn out, rationalities rather than rationality, just as it will also turn out that there are justices rather than justice
  • ...164 more annotations...
  • Rationality is the collection of theories, beliefs, principles, and facts that the human subject uses to judge the world, and a person’s rationality is, to a large extent, the product of that person’s education and moral formation.
  • To the extent that a person accepts what is handed down from the moral and intellectual traditions of her or his community in learning to judge truth and falsity, good and evil, that person’s rationality is “tradition-constituted.” Tradition-constituted rationality provides the schemata by which we interpret, understand, and judge the world we live in
  • The apparent problem of relativism in MacIntyre’s theory of rationality is much like the problem of relativism in the philosophy of science. Scientific claims develop within larger theoretical frameworks, so that the apparent truth of a scientific claim depends on one’s judgment of the larger framework. The resolution of the problem of relativism therefore appears to hang on the possibility of judging frameworks or rationalities, or judging between frameworks or rationalities from a position that does not presuppose the truth of the framework or rationality, but no such theoretical standpoint is humanly possible.
  • MacIntyre finds that the world itself provides the criterion for the testing of rationalities, and he finds that there is no criterion except the world itself that can stand as the measure of the truth of any philosophical theory.
  • MacIntyre’s philosophy is indebted to the philosophy of science, which recognizes the historicism of scientific enquiry even as it seeks a truthful understanding of the world. MacIntyre’s philosophy does not offer a priori certainty about any theory or principle; it examines the ways in which reflection upon experience supports, challenges, or falsifies theories that have appeared to be the best theories so far to the people who have accepted them so far. MacIntyre’s ideal enquirers remain Hamlets, not Emmas.
  • history shows us that individuals, communities, and even whole nations may commit themselves militantly over long periods of their histories to doctrines that their ideological adversaries find irrational. This qualified relativism of appearances has troublesome implications for anyone who believes that philosophical enquiry can easily provide certain knowledge of the world
  • According to MacIntyre, theories govern the ways that we interpret the world and no theory is ever more than “the best standards so far” (3RV, p. 65). Our theories always remain open to improvement, and when our theories change, the appearances of our world—the apparent truths of claims judged within those theoretical frameworks—change with them.
  • From the subjective standpoint of the human enquirer, MacIntyre finds that theories, concepts, and facts all have histories, and they are all liable to change—for better or for worse.
  • MacIntyre holds that the rationality of individuals is not only tradition-constituted, it is also tradition constitutive, as individuals make their own contributions to their own rationality, and to the rationalities of their communities. Rationality is not fixed, within either the history of a community or the life of a person
  • The modern account of first principles justifies an approach to philosophy that rejects tradition. The modern liberal individualist approach is anti-traditional. It denies that our understanding is tradition-constituted and it denies that different cultures may differ in their standards of rationality and justice:
  • Modernity does not see tradition as the key that unlocks moral and political understanding, but as a superfluous accumulation of opinions that tend to prejudice moral and political reasoning.
  • Although modernity rejects tradition as a method of moral and political enquiry, MacIntyre finds that it nevertheless bears all the characteristics of a moral and political tradition.
  • If historical narratives are only projections of the interests of historians, then it is difficult to see how this historical narrative can claim to be truthful
  • For these post-modern theorists, “if the Enlightenment conceptions of truth and rationality cannot be sustained,” either relativism or perspectivism “is the only possible alternative” (p. 353). MacIntyre rejects both challenges by developing his theory of tradition-constituted and tradition-constitutive rationality on pp. 354-369
  • How, then, is one to settle challenges between two traditions? It depends on whether the adherents of either take the challenges of the other tradition seriously. It depends on whether the adherents of either tradition, on seeing a failure in their own tradition are willing to consider an answer offered by their rival (p. 355)
  • how a person with no traditional affiliation is to deal with the conflicting claims of rival traditions: “The initial answer is: that will depend upon who you are and how you understand yourself. This is not the kind of answer which we have been educated to expect in philosophy”
  • MacIntyre focuses the critique of modernity on the question of rational justification. Modern epistemology stands or falls on the possibility of Cartesian epistemological first principles. MacIntyre’s history exposes that notion of first principle as a fiction, and at the same time demonstrates that rational enquiry advances (or declines) only through tradition
  • MacIntyre cites Foucault’s 1966 book, Les Mots et les choses (The Order of Things, 1970) as an example of the self-subverting character of Genealogical enquiry
  • Foucault’s book reduces history to a procession of “incommensurable ordered schemes of classification and representation” none of which has any greater claim to truth than any other, yet this book “is itself organized as a scheme of classification and representation.”
  • From MacIntyre’s perspective, there is no question of deciding whether or not to work within a tradition; everyone who struggles with practical, moral, and political questions simply does. “There is no standing ground, no place for enquiry . . . apart from that which is provided by some particular tradition or other”
  • Three Rival Versions of Moral Enquiry (1990). The central idea of the Gifford Lectures is that philosophers make progress by addressing the shortcomings of traditional narratives about the world, shortcomings that become visible either through the failure of traditional narratives to make sense of experience, or through the introduction of contradictory narratives that prove impossible to dismiss
  • MacIntyre compares three traditions exemplified by three literary works published near the end of Adam Gifford’s life (1820–1887)
  • The Ninth Edition of the Encyclopaedia Britannica (1875–1889) represents the modern tradition of trying to understand the world objectively without the influence of tradition.
  • The Genealogy of Morals (1887), by Friedrich Nietzsche embodies the post-modern tradition of interpreting all traditions as arbitrary impositions of power.
  • The encyclical letter Aeterni Patris (1879) of Pope Leo XIII exemplifies the approach of acknowledging one’s predecessors within one’s own tradition of enquiry and working to advance or improve that tradition in the pursuit of objective truth. 
  • Of the three versions of moral enquiry treated in 3RV, only tradition, exemplified in 3RV by the Aristotelian, Thomistic tradition, understands itself as a tradition that looks backward to predecessors in order to understand present questions and move forward
  • Encyclopaedia obscures the role of tradition by presenting the most current conclusions and convictions of a tradition as if they had no history, and as if they represented the final discovery of unalterable truth
  • Encyclopaedists focus on the present and ignore the past.
  • Genealogists, on the other hand, focus on the past in order to undermine the claims of the present.
  • In short, Genealogy denies the teleology of human enquiry by denying (1) that historical enquiry has been fruitful, (2) that the enquiring person has a real identity, and (3) that enquiry has a real goal. MacIntyre finds this mode of enquiry incoherent.
  • Genealogy is self-deceiving insofar as it ignores the traditional and teleological character of its enquiry.
  • Genealogical moral enquiry must make similar exceptions to its treatments of the unity of the enquiring subject and the teleology of moral enquiry; thus “it seems to be the case that the intelligibility of genealogy requires beliefs and allegiances of a kind precluded by the genealogical stance” (3RV, p. 54-55)
  • MacIntyre uses Thomism because it applies the traditional mode of enquiry in a self-conscious manner. Thomistic students learn the work of philosophical enquiry as apprentices in a craft (3RV, p. 61), and maintain the principles of the tradition in their work to extend the understanding of the tradition, even as they remain open to the criticism of those principles.
  • 3RV uses Thomism as its example of tradition, but this use should not suggest that MacIntyre identifies “tradition” with Thomism or Thomism-as-a-name-for-the-Western-tradition. As noted above, WJWR distinguished four traditions of enquiry within the Western European world alone
  • MacIntyre’s emphasis on the temporality of rationality in traditional enquiry makes tradition incompatible with the epistemological projects of modern philosophy
  • Tradition is not merely conservative; it remains open to improvement,
  • Tradition differs from both encyclopaedia and genealogy in the way it understands the place of its theories in the history of human enquiry. The adherent of a tradition must understand that “the rationality of a craft is justified by its history so far,” thus it “is inseparable from the tradition through which it was achieved”
  • MacIntyre uses Thomas Aquinas to illustrate the revolutionary potential of traditional enquiry. Thomas was educated in Augustinian theology and Aristotelian philosophy, and through this education he began to see not only the contradictions between the two traditions, but also the strengths and weaknesses that each tradition revealed in the other. His education also helped him to discover a host of questions and problems that had to be answered and solved. Many of Thomas Aquinas’ responses to these concerns took the form of disputed questions. “Yet to each question the answer produced by Aquinas as a conclusion is no more than and, given Aquinas’s method, cannot but be no more than, the best answer reached so far. And hence derives the essential incompleteness”
  • argue that the virtues are essential to the practice of independent practical reason. The book is relentlessly practical; its arguments appeal only to experience and to purposes, and to the logic of practical reasoning.
  • Like other intelligent animals, human beings enter life vulnerable, weak, untrained, and unknowing, and face the likelihood of infirmity in sickness and in old age. Like other social animals, humans flourish in groups. We learn to regulate our passions, and to act effectively alone and in concert with others through an education provided within a community. MacIntyre’s position allows him to look to the animal world to find analogies to the role of social relationships in the moral formation of human beings
  • The task for the human child is to make “the transition from the infantile exercise of animal intelligence to the exercise of independent practical reasoning” (DRA, p. 87). For a child to make this transition is “to redirect and transform her or his desires, and subsequently to direct them consistently towards the goods of different stages of her or his life” (DRA, p. 87). The development of independent practical reason in the human agent requires the moral virtues in at least three ways.
  • DRA presents moral knowledge as a “knowing how,” rather than as a “knowing that.” Knowledge of moral rules is not sufficient for a moral life; prudence is required to enable the agent to apply the rules well.
  • “Knowing how to act virtuously always involves more than rule-following” (DRA, p. 93). The prudent person can judge what must be done in the absence of a rule and can also judge when general norms cannot be applied to particular cases.
  • Flourishing as an independent practical reasoner requires the virtues in a second way, simply because sometimes we need our friends to tell us who we really are. Independent practical reasoning also requires self-knowledge, but self-knowledge is impossible without the input of others whose judgment provides a reliable touchstone to test our beliefs about ourselves. Self-knowledge therefore requires the virtues that enable an agent to sustain formative relationships and to accept the criticism of trusted friends
  • Human flourishing requires the virtues in a third way, by making it possible to participate in social and political action. They enable us to “protect ourselves and others against neglect, defective sympathies, stupidity, acquisitiveness, and malice” (DRA, p. 98) by enabling us to form and sustain social relationships through which we may care for one another in our infirmities, and pursue common goods with and for the other members of our societies.
  • MacIntyre argues that it is impossible to find an external standpoint, because rational enquiry is an essentially social work (DRA, p. 156-7). Because it is social, shared rational enquiry requires moral commitment to, and practice of, the virtues to prevent the more complacent members of communities from closing off critical reflection upon “shared politically effective beliefs and concepts”
  • MacIntyre finds himself compelled to answer what may be called the question of moral provincialism: If one is to seek the truth about morality and justice, it seems necessary to “find a standpoint that is sufficiently external to the evaluative attitudes and practices that are to be put to the question.” If it is impossible for the agent to take such an external standpoint, if the agent’s commitments preclude radical criticism of the virtues of the community, does that leave the agent “a prisoner of shared prejudices” (DRA, p. 154)?
  • The book moves from MacIntyre’s assessment of human needs for the virtues to the political implications of that assessment. Social and political institutions that form and enable independent practical reasoning must “satisfy three conditions.” (1) They must enable their members to participate in shared deliberations about the communities’ actions. (2) They must establish norms of justice “consistent with exercise of” the virtue of justice. (3) They must enable the strong “to stand proxy” as advocates for the needs of the weak and the disabled.
  • The social and political institutions that MacIntyre recommends cannot be identified with the modern nation state or the modern nuclear family
  • The political structures necessary for human flourishing are essentially local
  • Yet local communities support human flourishing only when they actively support “the virtues of just generosity and shared deliberation”
  • MacIntyre rejects individualism and insists that we view human beings as members of communities who bear specific debts and responsibilities because of our social identities. The responsibilities one may inherit as a member of a community include debts to one’s forbearers that one can only repay to people in the present and future
  • The constructive argument of the second half of the book begins with traditional accounts of the excellences or virtues of practical reasoning and practical rationality rather than virtues of moral reasoning or morality. These traditional accounts define virtue as arête, as excellence
  • Practices are supported by institutions like chess clubs, hospitals, universities, industrial corporations, sports leagues, and political organizations.
  • Practices exist in tension with these institutions, since the institutions tend to be oriented to goods external to practices. Universities, hospitals, and scholarly societies may value prestige, profitability, or relations with political interest groups above excellence in the practices they are said to support.
  • Personal desires and institutional pressures to pursue external goods may threaten to derail practitioners’ pursuits of the goods internal to practices. MacIntyre defines virtue initially as the quality of character that enables an agent to overcome these temptations:
  • “A virtue is an acquired human quality the possession and exercise of which tends to enable us to achieve those goods which are internal to practices
  • Excellence as a human agent cannot be reduced to excellence in a particular practice (See AV, pp. 204–
  • The virtues therefore are to be understood as those dispositions which will not only sustain practices and enable us to achieve the goods internal to practices, but which will also sustain us in the relevant kind of quest for the good, by enabling us to overcome the harms, dangers, temptations, and distractions which we encounter, and which will furnish us with increasing self-knowledge and increasing knowledge of the good (AV, p. 219).
  • The excellent human agent has the moral qualities to seek what is good and best both in practices and in life as a whole.
  • The virtues find their point and purpose not only in sustaining those relationships necessary if the variety of goods internal to practices are to be achieved and not only in sustaining the form of an individual life in which that individual may seek out his or her good as the good of his or her whole life, but also in sustaining those traditions which provide both practices and individual lives with their necessary historical context (AV, p. 223)
  • Since “goods, and with them the only grounds for the authority of laws and virtues, can only be discovered by entering into those relationships which constitute communities whose central bond is a shared vision of and understanding of goods” (AV, p. 258), any hope for the transformation and renewal of society depends on the development and maintenance of such communities.
  • MacIntyre’s Aristotelian approach to ethics as a study of human action distinguishes him from post-Kantian moral philosophers who approach ethics as a means of determining the demands of objective, impersonal, universal morality
  • This modern approach may be described as moral epistemology. Modern moral philosophy pretends to free the individual to determine for her- or himself what she or he must do in a given situation, irrespective of her or his own desires; it pretends to give knowledge of universal moral laws
  • Aristotelian metaphysicians, particularly Thomists who define virtue in terms of the perfection of nature, rejected MacIntyre’s contention that an adequate Aristotelian account of virtue as excellence in practical reasoning and human action need not appeal to Aristotelian metaphysic
  • one group of critics rejects MacIntyre’s Aristotelianism because they hold that any Aristotelian account of the virtues must first account for the truth about virtue in terms of Aristotle’s philosophy of nature, which MacIntyre had dismissed in AV as “metaphysical biology”
  • Many of those who rejected MacIntyre’s turn to Aristotle define “virtue” primarily along moral lines, as obedience to law or adherence to some kind of natural norm. For these critics, “virtuous” appears synonymous with “morally correct;” their resistance to MacIntyre’s appeal to virtue stems from their difficulties either with what they take to be the shortcomings of MacIntyre’s account of moral correctness or with the notion of moral correctness altogether
  • MacIntyre continues to argue from the experience of practical reasoning to the demands of moral education.
  • Descartes and his successors, by contrast, along with certain “notable Thomists of the last hundred years” (p. 175), have proposed that philosophy begins from knowledge of some “set of necessarily true first principles which any truly rational person is able to evaluate as true” (p. 175). Thus for the moderns, philosophy is a technical rather than moral endeavor
  • MacIntyre distinguishes two related challenges to his position, the “relativist challenge” and the “perspectivist challenge.” These two challenges both acknowledge that the goals of the Enlightenment cannot be met and that, “the only available standards of rationality are those made available by and within traditions” (p. 252); they conclude that nothing can be known to be true or false
  • MacIntyre follows the progress of the Western tradition through “three distinct traditions:” from Homer and Aristotle to Thomas Aquinas, from Augustine to Thomas Aquinas and from Augustine through Calvin to Hume
  • Chapter 17 examines the modern liberal denial of tradition, and the ironic transformation of liberalism into the fourth tradition to be treated in the book.
  • MacIntyre credits John Stuart Mill and Thomas Aquinas as “two philosophers of the kind who by their writing send us beyond philosophy into immediate encounter with the ends of life
  • First, both were engaged by questions about the ends of life as questioning human beings and not just as philosophers. . . .
  • Secondly, both Mill and Aquinas understood their speaking and writing as contributing to an ongoing philosophical conversation. . . .
  • Thirdly, it matters that both the end of the conversation and the good of those who participate in it is truth and that the nature of truth, of good, of rational justification, and of meaning therefore have to be central topics of that conversation (Tasks, pp. 130-1).
  • Without these three characteristics, philosophy is first reduced to “the exercise of a set of analytic and argumentative skills. . . . Secondly, philosophy may thereby become a diversion from asking questions about the ends of life with any seriousness”
  • Neither Rosenzweig nor Lukács made philosophical progress because both failed to relate “their questions about the ends of life to the ends of their philosophical writing”
  • First, any adequate philosophical history or biography must determine whether the authors studied remain engaged with the questions that philosophy studies, or set the questions aside in favor of the answers. Second, any adequate philosophical history or biography must determine whether the authors studied insulated themselves from contact with conflicting worldviews or remained open to learning from every available philosophical approach. Third, any adequate philosophical history or biography must place the authors studied into a broader context that shows what traditions they come from and “whose projects” they are “carrying forward
  • MacIntyre’s recognition of the connection between an author’s pursuit of the ends of life and the same author’s work as a philosophical writer prompts him to finish the essay by demanding three things of philosophical historians and biographers
  • Philosophy is not just a study; it is a practice. Excellence in this practice demands that an author bring her or his struggles with the questions of the ends of philosophy into dialogue with historic and contemporary texts and authors in the hope of making progress in answering those questions
  • MacIntyre defends Thomistic realism as rational enquiry directed to the discovery of truth.
  • The three Thomistic essays in this book challenge those caricatures by presenting Thomism in a way that people outside of contemporary Thomistic scholarship may find surprisingly flexible and open
  • To be a moral agent, (1) one must understand one’s individual identity as transcending all the roles that one fills; (2) one must see oneself as a practically rational individual who can judge and reject unjust social standards; and (3) one must understand oneself as “as accountable to others in respect of the human virtues and not just in respect of [one’s] role-performances
  • J is guilty because he complacently accepted social structures that he should have questioned, structures that undermined his moral agency. This essay shows that MacIntyre’s ethics of human agency is not just a descriptive narrative about the manner of moral education; it is a standard laden account of the demands of moral agency.
  • MacIntyre considers “the case of J” (J, for jemand, the German word for “someone”), a train controller who learned, as a standard for his social role, to take no interest in what his trains carried, even during war time when they carried “munitions and . . . Jews on their way to extermination camps”
  • J had learned to do his work for the railroad according to one set of standards and to live other parts of his life according to other standards, so that this compliant participant in “the final solution” could contend, “You cannot charge me with moral failure” (E&P, p. 187).
  • The epistemological theories of Modern moral philosophy were supposed to provide rational justification for rules, policies, and practical determinations according to abstract universal standards, but MacIntyre has dismissed those theorie
  • Modern metaethics is supposed to enable its practitioners to step away from the conflicting demands of contending moral traditions and to judge those conflicts from a neutral position, but MacIntyre has rejected this project as well
  • In his ethical writings, MacIntyre seeks only to understand how to liberate the human agent from blindness and stupidity, to prepare the human agent to recognize what is good and best to do in the concrete circumstances of that agent’s own life, and to strengthen the agent to follow through on that judgment.
  • In his political writings, MacIntyre investigates the role of communities in the formation of effective rational agents, and the impact of political institutions on the lives of communities. This kind of ethics and politics is appropriately named the ethics of human agency.
  • The purpose of the modern moral philosophy of authors like Kant and Mill was to determine, rationally and universally, what kinds of behavior ought to be performed—not in terms of the agent’s desires or goals, but in terms of universal, rational duties. Those theories purported to let agents know what they ought to do by providing knowledge of duties and obligations, thus they could be described as theories of moral epistemology.
  • Contemporary virtue ethics purports to let agents know what qualities human beings ought to have, and the reasons that we ought to have them, not in terms of our fitness for human agency, but in the same universal, disinterested, non-teleological terms that it inherits from Kant and Mill.
  • For MacIntyre, moral knowledge remains a “knowing how” rather than a “knowing that;” MacIntyre seeks to identify those moral and intellectual excellences that make human beings more effective in our pursuit of the human good.
  • MacIntyre’s purpose in his ethics of human agency is to consider what it means to seek one’s good, what it takes to pursue one’s good, and what kind of a person one must become if one wants to pursue that good effectively as a human agent.
  • As a philosophy of human agency, MacIntyre’s work belongs to the traditions of Aristotle and Thomas Aquinas.
  • in keeping with the insight of Marx’s third thesis on Feuerbach, it maintained the common condition of theorists and people as peers in the pursuit of the good life.
  • He holds that the human good plays a role in our practical reasoning whether we recognize it or not, so that some people may do well without understanding why (E&P, p. 25). He also reads Aristotle as teaching that knowledge of the good can make us better agents
  • AV defines virtue in terms of the practical requirements for excellence in human agency, in an agent’s participation in practices (AV, ch. 14), in an agent’s whole life, and in an agent’s involvement in the life of her or his community
  • MacIntyre’s Aristotelian concept of “human action” opposes the notion of “human behavior” that prevailed among mid-twentieth-century determinist social scientists. Human actions, as MacIntyre understands them, are acts freely chosen by human agents in order to accomplish goals that those agents pursue
  • Human behavior, according to mid-twentieth-century determinist social scientists, is the outward activity of a subject, which is said to be caused entirely by environmental influences beyond the control of the subject.
  • Rejecting crude determinism in social science, and approaches to government and public policy rooted in determinism, MacIntyre sees the renewal of human agency and the liberation of the human agent as central goals for ethics and politics.
  • MacIntyre’s Aristotelian account of “human action” examines the habits that an agent must develop in order to judge and act most effectively in the pursuit of truly choice-worthy ends
  • MacIntyre seeks to understand what it takes for the human person to become the kind of agent who has the practical wisdom to recognize what is good and best to do and the moral freedom to act on her or his best judgment.
  • MacIntyre rejected the determinism of modern social science early in his career (“Determinism,” 1957), yet he recognizes that the ability to judge well and act freely is not simply given; excellence in judgment and action must be developed, and it is the task of moral philosophy to discover how these excellences or virtues of the human agent are established, maintained, and strengthened
  • MacIntyre’s Aristotelian philosophy investigates the conditions that support free and deliberate human action in order to propose a path to the liberation of the human agent through participation in the life of a political community that seeks its common goods through the shared deliberation and action of its members
  • As a classics major at Queen Mary College in the University of London (1945-1949), MacIntyre read the Greek texts of Plato and Aristotle, but his studies were not limited to the grammars of ancient languages. He also examined the ethical theories of Immanuel Kant and John Stuart Mill. He attended the lectures of analytic philosopher A. J. Ayer and of philosopher of science Karl Popper. He read Ludwig Wittgenstein’s Tractatus Logico-Philosophicus, Jean-Paul Sartre’s L'existentialisme est un humanisme, and Marx’s Eighteenth Brumaire of Napoleon Bonaparte (What happened, pp. 17-18). MacIntyre met the sociologist Franz Steiner, who helped direct him toward approaching moralities substantively
  • Alasdair MacIntyre’s philosophy builds on an unusual foundation. His early life was shaped by two conflicting systems of values. One was “a Gaelic oral culture of farmers and fishermen, poets and storytellers.” The other was modernity, “The modern world was a culture of theories rather than stories” (MacIntyre Reader, p. 255). MacIntyre embraced both value systems
  • From Marxism, MacIntyre learned to see liberalism as a destructive ideology that undermines communities in the name of individual liberty and consequently undermines the moral formation of human agents
  • For MacIntyre, Marx’s way of seeing through the empty justifications of arbitrary choices to consider the real goals and consequences of political actions in economic and social terms would remain the principal insight of Marxism
  • After his retirement from teaching, MacIntyre has continued his work of promoting a renewal of human agency through an examination of the virtues demanded by practices, integrated human lives, and responsible engagement with community life. He is currently affiliated with the Centre for Contemporary Aristotelian Studies in Ethics and Politics (CASEP) at London Metropolitan University.
  • The second half of AV proposes a conception of practice and practical reasoning and the notion of excellence as a human agent as an alternative to modern moral philosophy
  • AV rejects the view of “modern liberal individualism” in which autonomous individuals use abstract moral principles to determine what they ought to do. The critique of modern normative ethics in the first half of AV rejects modern moral reasoning for its failure to justify its premises, and criticizes the frequent use of the rhetoric of objective morality and scientific necessity to manipulate people to accept arbitrary decisions
  • MacIntyre uses “modern liberal individualism” to name a much broader category that includes both liberals and conservatives in contemporary American political parlance, as well as some Marxists and anarchists (See ASIA, pp. 280-284). Conservatism, liberalism, Marxism, and anarchism all present the autonomous individual as the unit of civil society
  • The sources of modern liberal individualism—Hobbes, Locke, and Rousseau—assert that human life is solitary by nature and social by habituation and convention. MacIntyre’s Aristotelian tradition holds, on the contrary, that human life is social by nature.
  • MacIntyre identifies moral excellence with effective human agency, and seeks a political environment that will help to liberate human agents to recognize and seek their own goods, as components of the common goods of their communities, more effectively. For MacIntyre therefore, ethics and politics are bound together.
  • For MacIntyre ethics is not an application of principles to facts, but a study of moral action. Moral action, free human action, involves decisions to do things in pursuit of goals, and it involves the understanding of the implications of one’s actions for the whole variety of goals that human agents seek
  • In this sense, “To act morally is to know how to act” (SMJ, p. 56). “Morality is not a ‘knowing that’ but a ‘knowing how’”
  • If human action is a ‘knowing how,’ then ethics must also consider how one learns ‘how.’ Like other forms of ‘knowing how,’ MacIntyre finds that one learns how to act morally within a community whose language and shared standards shape our judgment
  • MacIntyre had concluded that ethics is not an abstract exercise in the assessment of facts; it is a study of free human action and of the conditions that enable rational human agency.
  • MacIntyre gives Marx credit for concluding in the third of the Theses on Feuerbach, that the only way to change society is to change ourselves, and that “The coincidence of the changing of human activity or self-changing can only be comprehended and rationally understood as revolutionary practice”
  • MacIntyre distinguishes “religion which is an opiate for the people from religion which is not” (MI, p. 83). He condemns forms of religion that justify social inequities and encourage passivity. He argues that authentic Christian teaching criticizes social structures and encourages action
  • Where “moral philosophy textbooks” discuss the kinds of maxims that should guide “promise-keeping, truth-telling, and the like,” moral maxims do not guide real agents in real life at all. “They do not guide us because we do not need to be guided. We know what to do” (ASIA, p. 106). Sometimes we do this without any maxims at all, or even against all the maxims we know. MacIntyre illustrates his point with Huckleberry Finn’s decision to help Jim, Miss Watson’s escaped slave, to make his way to freedom
  • MacIntyre develops the ideas that morality emerges from history, and that morality organizes the common life of a community
  • The book concludes that the concepts of morality are neither timeless nor ahistorical, and that understanding the historical development of ethical concepts can liberate us “from any false absolutist claims” (SHE, p. 269). Yet this conclusion need not imply that morality is essentially arbitrary or that one could achieve freedom by liberating oneself from the morality of one’s society.
  • From this “Aristotelian point of view,” “modern morality” begins to go awry when moral norms are separated from the pursuit of human goods and moral behavior is treated as an end in itself. This separation characterizes Christian divine command ethics since the fourteenth century and has remained essential to secularized modern morality since the eighteenth century
  • From MacIntyre’s “Aristotelian point of view,” the autonomy granted to the human agent by modern moral philosophy breaks down natural human communities and isolates the individual from the kinds of formative relationships that are necessary to shape the agent into an independent practical reasoner.
  • the 1977 essay “Epistemological Crises, Dramatic Narrative, and the Philosophy of Science” (Hereafter EC). This essay, MacIntyre reports, “marks a major turning-point in my thought in the 1970s” (The Tasks of Philosophy, p. vii) EC may be described fairly as MacIntyre’s discourse on method
  • First, Philosophy makes progress through the resolution of problems. These problems arise when the theories, histories, doctrines and other narratives that help us to organize our experience of the world fail us, leaving us in “epistemological crises.” Epistemological crises are the aftermath of events that undermine the ways that we interpret our world
  • it presents three general points on the method for philosophy.
  • To live in an epistemological crisis is to be aware that one does not know what one thought one knew about some particular subject and to be anxious to recover certainty about that subject.
  • To resolve an epistemological crisis it is not enough to impose some new way of interpreting our experience, we also need to understand why we were wrong before: “When an epistemological crisis is resolved, it is by the construction of a new narrative which enables the agent to understand both how he or she could intelligibly have held his or her original beliefs and how he or she could have been so drastically misled by them
  • MacIntyre notes, “Philosophers have customarily been Emmas and not Hamlets” (p. 6); that is, philosophers have treated their conclusions as accomplished truths, rather than as “more adequate narratives” (p. 7) that remain open to further improvement.
  • To illustrate his position on the open-endedness of enquiry, MacIntyre compares the title characters of Shakespeare’s Hamlet and Jane Austen’s Emma. When Emma finds that she is deeply misled in her beliefs about the other characters in her story, Mr. Knightly helps her to learn the truth and the story comes to a happy ending (p. 6). Hamlet, by contrast, finds no pat answers to his questions; rival interpretations remain throughout the play, so that directors who would stage the play have to impose their own interpretations on the script
  • Another approach to education is the method of Descartes, who begins by rejecting everything that is not clearly and distinctly true as unreliable and false in order to rebuild his understanding of the world on a foundation of undeniable truth.
  • Descartes presents himself as willfully rejecting everything he had believed, and ignores his obvious debts to the Scholastic tradition, even as he argues his case in French and Latin. For MacIntyre, seeking epistemological certainty through universal doubt as a precondition for enquiry is a mistake: “it is an invitation not to philosophy but to mental breakdown, or rather to philosophy as a means of mental breakdown.”
  • MacIntyre contrasts Descartes’ descent into mythical isolation with Galileo, who was able to make progress in astronomy and physics by struggling with the apparently insoluble questions of late medieval astronomy and physics, and radically reinterpreting the issues that constituted those questions
  • To make progress in philosophy one must sort through the narratives that inform one’s understanding, struggle with the questions that those narratives raise, and on occasion, reject, replace, or reinterpret portions of those narratives and propose those changes to the rest of one’s community for assessment. Human enquiry is always situated within the history and life of a community.
  • The third point of EC is that we can learn about progress in philosophy from the philosophy of science
  • Kuhn’s “paradigm shifts,” however, are unlike MacIntyre’s resolutions of epistemological crises in two ways.
  • First they are not rational responses to specific problems. Kuhn compares paradigm shifts to religious conversions (pp. 150, 151, 158), stressing that they are not guided by rational norms and he claims that the “mopping up” phase of a paradigm shift is a matter of convention in the training of new scientists and attrition among the holdouts of the previous paradigm
  • Second, the new paradigm is treated as a closed system of belief that regulates a new period of “normal science”; Kuhn’s revolutionary scientists are Emmas, not Hamlets
  • MacIntyre proposes elements of Imre Lakatos’ philosophy of science as correctives to Kuhn’s. While Lakatos has his own shortcomings, his general account of the methodologies of scientific research programs recognizes the role of reason in the transitions between theories and between research programs (Lakatos’ analog to Kuhn’s paradigms or disciplinary matrices). Lakatos presents science as an open-ended enquiry, in which every theory may eventually be replaced by more adequate theories. For Lakatos, unlike Kuhn, rational scientific progress occurs when a new theory can account both for the apparent promise and for the actual failure of the theory it replaces.
  • The third conclusion of MacIntyre’s essay is that decisions to support some theories over others may be justified rationally to the extent that those theories allow us to understand our experience and our history, including the history of the failures of inadequate theories
  • For Aristotle, moral philosophy is a study of practical reasoning, and the excellences or virtues that Aristotle recommends in the Nicomachean Ethics are the intellectual and moral excellences that make a moral agent effective as an independent practical reasoner.
  • MacIntyre also finds that the contending parties have little interest in the rational justification of the principles they use. The language of moral philosophy has become a kind of moral rhetoric to be used to manipulate others in defense of the arbitrary choices of its users
  • examining the current condition of secular moral and political discourse. MacIntyre finds contending parties defending their decisions by appealing to abstract moral principles, but he finds their appeals eclectic, inconsistent, and incoherent.
  • The secular moral philosophers of the eighteenth and nineteenth centuries shared strong and extensive agreements about the content of morality (AV, p. 51) and believed that their moral philosophy could justify the demands of their morality rationally, free from religious authority.
  • MacIntyre traces the lineage of the culture of emotivism to the secularized Protestant cultures of northern Europe
  • Modern moral philosophy had thus set for itself an incoherent goal. It was to vindicate both the moral autonomy of the individual and the objectivity, necessity, and categorical character of the rules of morality
  • MacIntyre turns to an apparent alternative, the pragmatic expertise of professional managers. Managers are expected to appeal to the facts to make their decisions on the objective basis of effectiveness, and their authority to do this is based on their knowledge of the social sciences
  • An examination of the social sciences reveals, however, that many of the facts to which managers appeal depend on sociological theories that lack scientific status. Thus, the predictions and demands of bureaucratic managers are no less liable to ideological manipulation than the determinations of modern moral philosophers.
  • Modern moral philosophy separates moral reasoning about duties and obligations from practical reasoning about ends and practical deliberation about the means to one’s ends, and in doing so it separates morality from practice.
  • Many Europeans also lost the practical justifications for their moral norms as they approached modernity; for these Europeans, claiming that certain practices are “immoral,” and invoking Kant’s categorical imperative or Mill’s principle of utility to explain why those practices are immoral, seems no more adequate than the Polynesian appeal to taboo.
  • MacIntyre sifts these definitions and then gives his own definition of virtue, as excellence in human agency, in terms of practices, whole human lives, and traditions in chapters 14 and 15 of AV.
  • In the most often quoted sentence of AV, MacIntyre defines a practice as (1) a complex social activity that (2) enables participants to gain goods internal to the practice. (3) Participants achieve excellence in practices by gaining the internal goods. When participants achieve excellence, (4) the social understandings of excellence in the practice, of the goods of the practice, and of the possibility of achieving excellence in the practice “are systematically extended”
  • Practices, like chess, medicine, architecture, mechanical engineering, football, or politics, offer their practitioners a variety of goods both internal and external to these practices. The goods internal to practices include forms of understanding or physical abilities that can be acquired only by pursuing excellence in the associated practice
  • Goods external to practices include wealth, fame, prestige, and power; there are many ways to gain these external goods. They can be earned or purchased, either honestly or through deception; thus the pursuit of these external goods may conflict with the pursuit of the goods internal to practices.
  • An intelligent child is given the opportunity to win candy by learning to play chess. As long as the child plays chess only to win candy, he has every reason to cheat if by doing so he can win more candy. If the child begins to desire and pursue the goods internal to chess, however, cheating becomes irrational, because it is impossible to gain the goods internal to chess or any other practice except through an honest pursuit of excellence. Goods external to practices may nevertheless remain tempting to the practitioner.
  • Since MacIntyre finds social identity necessary for the individual, MacIntyre’s definition of the excellence or virtue of the human agent needs a social dimension:
  • These responsibilities also include debts incurred by the unjust actions of one’s predecessors.
  • The enslavement and oppression of black Americans, the subjugation of Ireland, and the genocide of the Jews in Europe remained quite relevant to the responsibilities of citizens of the United States, England, and Germany in 1981, as they still do today.
  • Thus an American who said “I never owned any slaves,” “the Englishman who says ‘I never did any wrong to Ireland,’” or “the young German who believes that being born after 1945 means that what Nazis did to Jews has no moral relevance to his relationship to his Jewish contemporaries” all exhibit a kind of intellectual and moral failure.
  • “I am born with a past, and to cut myself off from that past in the individualist mode, is to deform my present relationships” (p. 221).  For MacIntyre, there is no moral identity for the abstract individual; “The self has to find its moral identity in and through its membership in communities” (p. 221).
Javier E

Ta-Nehisi Coates's 'Letter to My Son' - The Atlantic - 0 views

  • The question is not whether Lincoln truly meant “government of the people” but what our country has, throughout its history, taken the political term “people” to actually mean. In 1863 it did not mean your mother or your grandmother, and it did not mean you and me.
  • When the journalist asked me about my body, it was like she was asking me to awaken her from the most gorgeous dream. I have seen that dream all my life. It is perfect houses with nice lawns. It is Memorial Day cookouts, block associations, and driveways. The Dream is tree houses and the Cub Scouts. And for so long I have wanted to escape into the Dream, to fold my country over my head like a blanket. But this has never been an option, because the Dream rests on our backs, the bedding made from our bodies.
  • The destroyers will rarely be held accountable. Mostly they will receive pensions.
  • ...41 more annotations...
  • you know now, if you did not before, that the police departments of your country have been endowed with the authority to destroy your body. It does not matter if the destruction is the result of an unfortunate overreaction. It does not matter if it originates in a misunderstanding. It does not matter if the destruction springs from a foolish policy
  • I remember being amazed that death could so easily rise up from the nothing of a boyish afternoon, billow up like fog. I knew that West Baltimore, where I lived; that the north side of Philadelphia, where my cousins lived; that the South Side of Chicago, where friends of my father lived, comprised a world apart. Somewhere out there beyond the firmament, past the asteroid belt, there were other worlds where children did not regularly fear for their bodies
  • It is hard to face this. But all our phrasing—race relations, racial chasm, racial justice, racial profiling, white privilege, even white supremacy—serves to obscure that racism is a visceral experience, that it dislodges brains, blocks airways, rips muscle, extracts organs, cracks bones, breaks teeth
  • You must never look away from this. You must always remember that the sociology, the history, the economics, the graphs, the charts, the regressions all land, with great violence, upon the body.
  • And should one live in such a body? What should be our aim beyond meager survival of constant, generational, ongoing battery and assault? I have asked this question all my life.
  • The question is unanswerable, which is not to say futile. The greatest reward of this constant interrogation, of confrontation with the brutality of my country, is that it has freed me from ghosts and myths.
  • I was afraid long before you, and in this I was unoriginal. When I was your age the only people I knew were black, and all of them were powerfully, adamantly, dangerously afraid. It was always right in front of me. The fear was there in the extravagant boys of my West Baltimore neighborhood
  • The fear lived on in their practiced bop, their slouching denim, their big T-shirts, the calculated angle of their baseball caps, a catalog of behaviors and garments enlisted to inspire the belief that these boys were in firm possession of everything they desired.
  • To be black in the Baltimore of my youth was to be naked before the elements of the world, before all the guns, fists, knives, crack, rape, and disease. The law did not protect us. And now, in your time, the law has become an excuse for stopping and frisking you, which is to say, for furthering the assault on your body
  • But a society that protects some people through a safety net of schools, government-backed home loans, and ancestral wealth but can only protect you with the club of criminal justice has either failed at enforcing its good intentions or has succeeded at something much darker.
  • There will surely always be people with straight hair and blue eyes, as there have been for all history. But some of these straight-haired people with blue eyes have been “black,” and this points to the great difference between their world and ours. We did not choose our fences. They were imposed on us by Virginia planters obsessed with enslaving as many Americans as possible. Now I saw that we had made something down here, in slavery, in Jim Crow, in ghettoes. At The Mecca I saw how we had taken their one-drop rule and flipped it. They made us into a race. We made ourselves into a people.
  • I came to understand that my country was a galaxy, and this galaxy stretched from the pandemonium of West Baltimore to the happy hunting grounds of Mr. Belvedere. I obsessed over the distance between that other sector of space and my own. I knew that my portion of the American galaxy, where bodies were enslaved by a tenacious gravity, was black and that the other, liberated portion was not. I knew that some inscrutable energy preserved the breach. I felt, but did not yet understand, the relation between that other world and me. And I felt in this a cosmic injustice, a profound cruelty, which infused an abiding, irrepressible desire to unshackle my body and achieve the velocity of escape.
  • Before I could escape, I had to survive, and this could only mean a clash with the streets, by which I mean not just physical blocks, nor simply the people packed into them, but the array of lethal puzzles and strange perils which seem to rise up from the asphalt itself. The streets transform every ordinary day into a series of trick questions, and every incorrect answer risks a beat-down, a shooting, or a pregnancy. No one survives unscathed
  • When I was your age, fully one-third of my brain was concerned with who I was walking to school with, our precise number, the manner of our walk, the number of times I smiled, who or what I smiled at, who offered a pound and who did not—all of which is to say that I practiced the culture of the streets, a culture concerned chiefly with securing the body.
  • Why were only our heroes nonviolent? Back then all I could do was measure these freedom-lovers by what I knew. Which is to say, I measured them against children pulling out in the 7-Eleven parking lot, against parents wielding extension cords, and the threatening intonations of armed black gangs saying, “Yeah, nigger, what’s up now?” I judged them against the country I knew, which had acquired the land through murder and tamed it under slavery, against the country whose armies fanned out across the world to extend their dominion. The world, the real one, was civilization secured and ruled by savage means. How could the schools valorize men and women whose values society actively scorned? How could they send us out into the streets of Baltimore, knowing all that they were, and then speak of nonviolence?
  • the beauty of the black body was never celebrated in movies, in television, or in the textbooks I’d seen as a child. Everyone of any import, from Jesus to George Washington, was white. This was why your grandparents banned Tarzan and the Lone Ranger and toys with white faces from the house. They were rebelling against the history books that spoke of black people only as sentimental “firsts”—first black four-star general, first black congressman, first black mayor—always presented in the bemused manner of a category of Trivial Pursuit.
  • Serious history was the West, and the West was white. This was all distilled for me in a quote I once read, from the novelist Saul Bellow. I can’t remember where I read it, or when—only that I was already at Howard. “Who is the Tolstoy of the Zulus?,” Bellow quipped
  • this view of things was connected to the fear that passed through the generations, to the sense of dispossession. We were black, beyond the visible spectrum, beyond civilization. Our history was inferior because we were inferior, which is to say our bodies were inferior. And our inferior bodies could not possibly be accorded the same respect as those that built the West. Would it not be better, then, if our bodies were civilized, improved, and put to some legitimate Christian use?
  • now I looked back on my need for a trophy case, on the desire to live by the standards of Saul Bellow, and I felt that this need was not an escape but fear again—fear that “they,” the alleged authors and heirs of the universe, were right. And this fear ran so deep that we accepted their standards of civilization and humanity.
  • There is nothing uniquely evil in these destroyers or even in this moment. The destroyers are merely men enforcing the whims of our country, correctly interpreting its heritage and legacy. This legacy aspires to the shackling of black bodies
  • still and all I knew that we were something, that we were a tribe—on one hand, invented, and on the other, no less real. The reality was out there on the Yard, on the first warm day of spring when it seemed that every sector, borough, affiliation, county, and corner of the broad diaspora had sent a delegate to the great world party
  • I could see now that that world was more than a photonegative of that of the people who believe they are white. “White America” is a syndicate arrayed to protect its exclusive power to dominate and control our bodies. Sometimes this power is direct (lynching), and sometimes it is insidious (redlining). But however it appears, the power of domination and exclusion is central to the belief in being white, and without it, “white people” would cease to exist for want of reasons
  • “Tolstoy is the Tolstoy of the Zulus,” wrote Wiley. “Unless you find a profit in fencing off universal properties of mankind into exclusive tribal ownership.” And there it was. I had accepted Bellow’s premise. In fact, Bellow was no closer to Tolstoy than I was to Nzinga. And if I were closer it would be because I chose to be, not because of destiny written in DNA. My great error was not that I had accepted someone else’s dream but that I had accepted the fact of dreams, the need for escape, and the invention of racecraft.
  • Think of all the embraces, all the private jokes, customs, greetings, names, dreams, all the shared knowledge and capacity of a black family injected into that vessel of flesh and bone. And think of how that vessel was taken, shattered on the concrete, and all its holy contents, all that had gone into each of them, was sent flowing back to the earth. It is terrible to truly see our particular beauty, Samori, because then you see the scope of the loss. But you must push even further. You must see that this loss is mandated by the history of your country, by the Dream of living white.
  • I don’t know if you remember how the film we saw at the Petersburg Battlefield ended as though the fall of the Confederacy were the onset of a tragedy, not jubilee. I doubt you remember the man on our tour dressed in the gray wool of the Confederacy, or how every visitor seemed most interested in flanking maneuvers, hardtack, smoothbore rifles, grapeshot, and ironclads, but virtually no one was interested in what all of this engineering, invention, and design had been marshaled to achieve. You were only 10 years old. But even then I knew that I must trouble you, and this meant taking you into rooms where people would insult your intelligence, where thieves would try to enlist you in your own robbery and disguise their burning and looting as Christian charity. But robbery is what this is, what it always was.
  • American reunion was built on a comfortable narrative that made enslavement into benevolence, white knights of body snatchers, and the mass slaughter of the war into a kind of sport in which one could conclude that both sides conducted their affairs with courage, honor, and élan. This lie of the Civil War is the lie of innocence, is the Dream.
  • I, like every kid I knew, loved The Dukes of Hazzard. But I would have done well to think more about why two outlaws, driving a car named the General Lee, must necessarily be portrayed as “just some good ole boys, never meanin’ no harm”—a mantra for the Dreamers if there ever was one. But what one “means” is neither important nor relevant. It is not necessary that you believe that the officer who choked Eric Garner set out that day to destroy a body. All you need to understand is that the officer carries with him the power of the American state and the weight of an American legacy, and they necessitate that of the bodies destroyed every year, some wild and disproportionate number of them will be black.
  • Here is what I would like for you to know: In America, it is traditional to destroy the black body—it is heritage. Enslavement was not merely the antiseptic borrowing of labor—it is not so easy to get a human being to commit their body against its own elemental interest. And so enslavement must be casual wrath and random manglings, the gashing of heads and brains blown out over the river as the body seeks to escape. It must be rape so regular as to be industrial. There is no uplifting way to say this.
  • It had to be blood. It had to be the thrashing of kitchen hands for the crime of churning butter at a leisurely clip. It had to be some woman “chear’d ... with thirty lashes a Saturday last and as many more a Tuesday again.” It could only be the employment of carriage whips, tongs, iron pokers, handsaws, stones, paperweights, or whatever might be handy to break the black body, the black family, the black community, the black nation. The bodies were pulverized into stock and marked with insurance. And the bodies were an aspiration, lucrative as Indian land, a veranda, a beautiful wife, or a summer home in the mountains. For the men who needed to believe themselves white, the bodies were the key to a social club, and the right to break the bodies was the mark of civilization.
  • “The two great divisions of society are not the rich and poor, but white and black,” said the great South Carolina senator John C. Calhoun. “And all the former, the poor as well as the rich, belong to the upper class, and are respected and treated as equals.” And there it is—the right to break the black body as the meaning of their sacred equality. And that right has always given them meaning, has always meant that there was someone down in the valley because a mountain is not a mountain if there is nothing below.
  • There is no them without you, and without the right to break you they must necessarily fall from the mountain, lose their divinity, and tumble out of the Dream. And then they would have to determine how to build their suburbs on something other than human bones, how to angle their jails toward something other than a human stockyard, how to erect a democracy independent of cannibalism. I would like to tell you that such a day approaches when the people who believe themselves to be white renounce this demon religion and begin to think of themselves as human. But I can see no real promise of such a day. We are captured, brother, surrounded by the majoritarian bandits of America. And this has happened here, in our only home, and the terrible truth is that we cannot will ourselves to an escape on our own.
  • I think now of the old rule that held that should a boy be set upon in someone else’s chancy hood, his friends must stand with him, and they must all take their beating together. I now know that within this edict lay the key to all living. None of us were promised to end the fight on our feet, fists raised to the sky. We could not control our enemies’ number, strength, or weaponry. Sometimes you just caught a bad one. But whether you fought or ran, you did it together, because that is the part that was in our control. What we must never do is willingly hand over our own bodies or the bodies of our friends. That was the wisdom: We knew we did not lay down the direction of the street, but despite that, we could—and must—fashion the way of our walk. And that is the deeper meaning of your name—that the struggle, in and of itself, has meaning.
  • I have raised you to respect every human being as singular, and you must extend that same respect into the past. Slavery is not an indefinable mass of flesh. It is a particular, specific enslaved woman, whose mind is as active as your own, whose range of feeling is as vast as your own; who prefers the way the light falls in one particular spot in the woods, who enjoys fishing where the water eddies in a nearby stream, who loves her mother in her own complicated way, thinks her sister talks too loud, has a favorite cousin, a favorite season, who excels at dressmaking and knows, inside herself, that she is as intelligent and capable as anyone. “Slavery” is this same woman born in a world that loudly proclaims its love of freedom and inscribes this love in its essential texts, a world in which these same professors hold this woman a slave, hold her mother a slave, her father a slave, her daughter a slave, and when this woman peers back into the generations all she sees is the enslaved. She can hope for more. She can imagine some future for her grandchildren. But when she dies, the world—which is really the only world she can ever know—ends. For this woman, enslavement is not a parable. It is damnation. It is the never-ending night. And the length of that night is most of our history. Never forget that we were enslaved in this country longer than we have been free. Never forget that for 250 years black people were born into chains—whole generations followed by more generations who knew nothing but chains.
  • You must resist the common urge toward the comforting narrative of divine law, toward fairy tales that imply some irrepressible justice. The enslaved were not bricks in your road, and their lives were not chapters in your redemptive history. They were people turned to fuel for the American machine. Enslavement was not destined to end, and it is wrong to claim our present circumstance—no matter how improved—as the redemption for the lives of people who never asked for the posthumous, untouchable glory of dying for their children. Our triumphs can never redeem this. Perhaps our triumphs are not even the point. Perhaps struggle is all we have
  • I am not a cynic. I love you, and I love the world, and I love it more with every new inch I discover. But you are a black boy, and you must be responsible for your body in a way that other boys cannot know. Indeed, you must be responsible for the worst actions of other black bodies, which, somehow, will always be assigned to you. And you must be responsible for the bodies of the powerful—the policeman who cracks you with a nightstick will quickly find his excuse in your furtive movements. You have to make your peace with the chaos, but you cannot lie.
  • “I could have you arrested,” he said. Which is to say: “One of your son’s earliest memories will be watching the men who sodomized Abner Louima and choked Anthony Baez cuff, club, tase, and break you.” I had forgotten the rules, an error as dangerous on the Upper West Side of Manhattan as on the West Side of Baltimore. One must be without error out here. Walk in single file. Work quietly. Pack an extra No. 2 pencil. Make no mistakes.
  • the price of error is higher for you than it is for your countrymen, and so that America might justify itself, the story of a black body’s destruction must always begin with his or her error, real or imagined—with Eric Garner’s anger, with Trayvon Martin’s mythical words (“You are gonna die tonight”), with Sean Bell’s mistake of running with the wrong crowd, with me standing too close to the small-eyed boy pulling out.
  • You are called to struggle, not because it assures you victory but because it assures you an honorable and sane life
  • I am sorry that I cannot save you—but not that sorry. Part of me thinks that your very vulnerability brings you closer to the meaning of life, just as for others, the quest to believe oneself white divides them from it. The fact is that despite their dreams, their lives are also not inviolable. When their own vulnerability becomes real—when the police decide that tactics intended for the ghetto should enjoy wider usage, when their armed society shoots down their children, when nature sends hurricanes against their cities—they are shocked by the rages of logic and the natural world in a way that those of us who were born and bred to understand cause and effect can never be.
  • I would not have you live like them. You have been cast into a race in which the wind is always at your face and the hounds are always at your heels. And to varying degrees this is true of all life. The difference is that you do not have the privilege of living in ignorance of this essential fact.
  • I never wanted you to be twice as good as them, so much as I have always wanted you to attack every day of your brief bright life determined to struggle. The people who must believe they are white can never be your measuring stick. I would not have you descend into your own dream. I would have you be a conscious citizen of this terrible and beautiful world.
Javier E

At the Existentialist Café: Freedom, Being, and Apricot Cocktails with Jean-P... - 0 views

  • The phenomenologists’ leading thinker, Edmund Husserl, provided a rallying cry, ‘To the things themselves!’ It meant: don’t waste time on the interpretations that accrue upon things, and especially don’t waste time wondering whether the things are real. Just look at this that’s presenting itself to you, whatever this may be, and describe it as precisely as possible.
  • You might think you have defined me by some label, but you are wrong, for I am always a work in progress. I create myself constantly through action, and this is so fundamental to my human condition that, for Sartre, it is the human condition, from the moment of first consciousness to the moment when death wipes it out. I am my own freedom: no more, no less.
  • Sartre wrote like a novelist — not surprisingly, since he was one. In his novels, short stories and plays as well as in his philosophical treatises, he wrote about the physical sensations of the world and the structures and moods of human life. Above all, he wrote about one big subject: what it meant to be free. Freedom, for him, lay at the heart of all human experience, and this set humans apart from all other kinds of object.
  • ...97 more annotations...
  • Sartre listened to his problem and said simply, ‘You are free, therefore choose — that is to say, invent.’ No signs are vouchsafed in this world, he said. None of the old authorities can relieve you of the burden of freedom. You can weigh up moral or practical considerations as carefully as you like, but ultimately you must take the plunge and do something, and it’s up to you what that something is.
  • Even if the situation is unbearable — perhaps you are facing execution, or sitting in a Gestapo prison, or about to fall off a cliff — you are still free to decide what to make of it in mind and deed. Starting from where you are now, you choose. And in choosing, you also choose who you will be.
  • The war had made people realise that they and their fellow humans were capable of departing entirely from civilised norms; no wonder the idea of a fixed human nature seemed questionable.
  • If this sounds difficult and unnerving, it’s because it is. Sartre does not deny that the need to keep making decisions brings constant anxiety. He heightens this anxiety by pointing out that what you do really matters. You should make your choices as though you were choosing on behalf of the whole of humanity, taking the entire burden of responsibility for how the human race behaves. If you avoid this responsibility by fooling yourself that you are the victim of circumstance or of someone else’s bad advice, you are failing to meet the demands of human life and choosing a fake existence, cut off from your own ‘authenticity’.
  • Along with the terrifying side of this comes a great promise: Sartre’s existentialism implies that it is possible to be authentic and free, as long as you keep up the effort.
  • almost all agreed that it was, as an article in Les nouvelles littéraires phrased it, a ‘sickening mixture of philosophic pretentiousness, equivocal dreams, physiological technicalities, morbid tastes and hesitant eroticism … an introspective embryo that one would take distinct pleasure in crushing’.
  • he offered a philosophy designed for a species that had just scared the hell out of itself, but that finally felt ready to grow up and take responsibility.
  • In this rebellious world, just as with the Parisian bohemians and Dadaists in earlier generations, everything that was dangerous and provocative was good, and everything that was nice or bourgeois was bad.
  • Such interweaving of ideas and life had a long pedigree, although the existentialists gave it a new twist. Stoic and Epicurean thinkers in the classical world had practised philosophy as a means of living well, rather than of seeking knowledge or wisdom for their own sake. By reflecting on life’s vagaries in philosophical ways, they believed they could become more resilient, more able to rise above circumstances, and better equipped to manage grief, fear, anger, disappointment or anxiety.
  • In the tradition they passed on, philosophy is neither a pure intellectual pursuit nor a collection of cheap self-help tricks, but a discipline for flourishing and living a fully human, responsible life.
  • For Kierkegaard, Descartes had things back to front. In his own view, human existence comes first: it is the starting point for everything we do, not the result of a logical deduction. My existence is active: I live it and choose it, and this precedes any statement I can make about myself.
  • Studying our own moral genealogy cannot help us to escape or transcend ourselves. But it can enable us to see our illusions more clearly and lead a more vital, assertive existence.
  • What was needed, he felt, was not high moral or theological ideals, but a deeply critical form of cultural history or ‘genealogy’ that would uncover the reasons why we humans are as we are, and how we came to be that way. For him, all philosophy could even be redefined as a form of psychology, or history.
  • For those oppressed on grounds of race or class, or for those fighting against colonialism, existentialism offered a change of perspective — literally, as Sartre proposed that all situations be judged according to how they appeared in the eyes of those most oppressed, or those whose suffering was greatest.
  • She observed that we need not expect moral philosophers to ‘live by’ their ideas in a simplistic way, as if they were following a set of rules. But we can expect them to show how their ideas are lived in. We should be able to look in through the windows of a philosophy, as it were, and see how people occupy it, how they move about and how they conduct themselves.
  • the existentialists inhabited their historical and personal world, as they inhabited their ideas. This notion of ‘inhabited philosophy’ is one I’ve borrowed from the English philosopher and novelist Iris Murdoch, who wrote the first full-length book on Sartre and was an early adopter of existentialism
  • What is existentialism anyway?
  • An existentialist who is also phenomenological provides no easy rules for dealing with this condition, but instead concentrates on describing lived experience as it presents itself. By describing experience well, he or she hopes to understand this existence and awaken us to ways of living more authentic lives.
  • Existentialists concern themselves with individual, concrete human existence. They consider human existence different from the kind of being other things have. Other entities are what they are, but as a human I am whatever I choose to make of myself at every moment. I am free — and therefore I’m responsible for everything I do, a dizzying fact which causes an anxiety inseparable from human existence itself.
  • On the other hand, I am only free within situations, which can include factors in my own biology and psychology as well as physical, historical and social variables of the world into which I have been thrown. Despite the limitations, I always want more: I am passionately involved in personal projects of all kinds. Human existence is thus ambiguous: at once boxed in by borders and yet transcendent and exhilarating.
  • The first part of this is straightforward: a phenomenologist’s job is to describe. This is the activity that Husserl kept reminding his students to do. It meant stripping away distractions, habits, clichés of thought, presumptions and received ideas, in order to return our attention to what he called the ‘things themselves’. We must fix our beady gaze on them and capture them exactly as they appear, rather than as we think they are supposed to be.
  • Husserl therefore says that, to phenomenologically describe a cup of coffee, I should set aside both the abstract suppositions and any intrusive emotional associations. Then I can concentrate on the dark, fragrant, rich phenomenon in front of me now. This ‘setting aside’ or ‘bracketing out’ of speculative add-ons Husserl called epoché — a term borrowed from the ancient Sceptics,
  • The point about rigour is crucial; it brings us back to the first half of the command to describe phenomena. A phenomenologist cannot get away with listening to a piece of music and saying, ‘How lovely!’ He or she must ask: is it plaintive? is it dignified? is it colossal and sublime? The point is to keep coming back to the ‘things themselves’ — phenomena stripped of their conceptual baggage — so as to bail out weak or extraneous material and get to the heart of the experience.
  • Husserlian ‘bracketing out’ or epoché allows the phenomenologist to temporarily ignore the question ‘But is it real?’, in order to ask how a person experiences his or her world. Phenomenology gives a formal mode of access to human experience. It lets philosophers talk about life more or less as non-philosophers do, while still being able to tell themselves they are being methodical and rigorous.
  • Besides claiming to transform the way we think about reality, phenomenologists promised to change how we think about ourselves. They believed that we should not try to find out what the human mind is, as if it were some kind of substance. Instead, we should consider what it does, and how it grasps its experiences.
  • For Brentano, this reaching towards objects is what our minds do all the time. Our thoughts are invariably of or about something, he wrote: in love, something is loved, in hatred, something is hated, in judgement, something is affirmed or denied. Even when I imagine an object that isn’t there, my mental structure is still one of ‘about-ness’ or ‘of-ness’.
  • Except in deepest sleep, my mind is always engaged in this aboutness: it has ‘intentionality’. Having taken the germ of this from Brentano, Husserl made it central to his whole philosophy.
  • Husserl saw in the idea of intentionality a way to sidestep two great unsolved puzzles of philosophical history: the question of what objects ‘really’ are, and the question of what the mind ‘really’ is. By doing the epoché and bracketing out all consideration of reality from both topics, one is freed to concentrate on the relationship in the middle. One can apply one’s descriptive energies to the endless dance of intentionality that takes place in our lives: the whirl of our minds as they seize their intended phenomena one after the other and whisk them around the floor,
  • Understood in this way, the mind hardly is anything at all: it is its aboutness. This makes the human mind (and possibly some animal minds) different from any other naturally occurring entity. Nothing else can be as thoroughly about or of things as the mind is:
  • Some Eastern meditation techniques aim to still this scurrying creature, but the extreme difficulty of this shows how unnatural it is to be mentally inert. Left to itself, the mind reaches out in all directions as long as it is awake — and even carries on doing it in the dreaming phase of its sleep.
  • a mind that is experiencing nothing, imagining nothing, or speculating about nothing can hardly be said to be a mind at all.
  • Three simple ideas — description, phenomenon, intentionality — provided enough inspiration to keep roomfuls of Husserlian assistants busy in Freiburg for decades. With all of human existence awaiting their attention, how could they ever run out of things to do?
  • For Sartre, this gives the mind an immense freedom. If we are nothing but what we think about, then no predefined ‘inner nature’ can hold us back. We are protean.
  • way of this interpretation. Real, not real; inside, outside; what difference did it make? Reflecting on this, Husserl began turning his phenomenology into a branch of ‘idealism’ — the philosophical tradition which denied external reality and defined everything as a kind of private hallucination.
  • For Sartre, if we try to shut ourselves up inside our own minds, ‘in a nice warm room with the shutters closed’, we cease to exist. We have no cosy home: being out on the dusty road is the very definition of what we are.
  • One might think that, if Heidegger had anything worth saying, he could have communicated it in ordinary language. The fact is that he does not want to be ordinary, and he may not even want to communicate in the usual sense. He wants to make the familiar obscure, and to vex us. George Steiner thought that Heidegger’s purpose was less to be understood than to be experienced through a ‘felt strangeness’.
  • He takes Dasein in its most ordinary moments, then talks about it in the most innovative way he can. For Heidegger, Dasein’s everyday Being is right here: it is Being-in-the-world, or In-der-Welt-sein. The main feature of Dasein’s everyday Being-in-the-world right here is that it is usually busy doing something.
  • Thus, for Heidegger, all Being-in-the-world is also a ‘Being-with’ or Mitsein. We cohabit with others in a ‘with-world’, or Mitwelt. The old philosophical problem of how we prove the existence of other minds has now vanished. Dasein swims in the with-world long before it wonders about other minds.
  • Sometimes the best-educated people were those least inclined to take the Nazis seriously, dismissing them as too absurd to last. Karl Jaspers was one of those who made this mistake, as he later recalled, and Beauvoir observed similar dismissive attitudes among the French students in Berlin.
  • In any case, most of those who disagreed with Hitler’s ideology soon learned to keep their view to themselves. If a Nazi parade passed on the street, they would either slip out of view or give the obligatory salute like everyone else, telling themselves that the gesture meant nothing if they did not believe in it. As the psychologist Bruno Bettelheim later wrote of this period, few people will risk their life for such a small thing as raising an arm — yet that is how one’s powers of resistance are eroded away, and eventually one’s responsibility and integrity go with them.
  • for Arendt, if you do not respond adequately when the times demand it, you show a lack of imagination and attention that is as dangerous as deliberately committing an abuse. It amounts to disobeying the one command she had absorbed from Heidegger in those Marburg days: Think!
  • ‘Everything takes place under a kind of anaesthesia. Objectively dreadful events produce a thin, puny emotional response. Murders are committed like schoolboy pranks. Humiliation and moral decay are accepted like minor incidents.’ Haffner thought modernity itself was partly to blame: people had become yoked to their habits and to mass media, forgetting to stop and think, or to disrupt their routines long enough to question what was going on.
  • Heidegger’s former lover and student Hannah Arendt would argue, in her 1951 study The Origins of Totalitarianism, that totalitarian movements thrived at least partly because of this fragmentation in modern lives, which made people more vulnerable to being swept away by demagogues. Elsewhere, she coined the phrase ‘the banality of evil’ to describe the most extreme failures of personal moral awareness.
  • His communicative ideal fed into a whole theory of history: he traced all civilisation to an ‘Axial Period’ in the fifth century BC, during which philosophy and culture exploded simultaneously in Europe, the Middle East and Asia, as though a great bubble of minds had erupted from the earth’s surface. ‘True philosophy needs communion to come into existence,’ he wrote, and added, ‘Uncommunicativeness in a philosopher is virtually a criterion of the untruth of his thinking.’
  • The idea of being called to authenticity became a major theme in later existentialism, the call being interpreted as saying something like ‘Be yourself!’, as opposed to being phony. For Heidegger, the call is more fundamental than that. It is a call to take up a self that you didn’t know you had: to wake up to your Being. Moreover, it is a call to action. It requires you to do something: to take a decision of some sort.
  • Being and Time contained at least one big idea that should have been of use in resisting totalitarianism. Dasein, Heidegger wrote there, tends to fall under the sway of something called das Man or ‘the they’ — an impersonal entity that robs us of the freedom to think for ourselves. To live authentically requires resisting or outwitting this influence, but this is not easy because das Man is so nebulous. Man in German does not mean ‘man’ as in English (that’s der Mann), but a neutral abstraction, something like ‘one’ in the English phrase ‘one doesn’t do that’,
  • for Heidegger, das Man is me. It is everywhere and nowhere; it is nothing definite, but each of us is it. As with Being, it is so ubiquitous that it is difficult to see. If I am not careful, however, das Man takes over the important decisions that should be my own. It drains away my responsibility or ‘answerability’. As Arendt might put it, we slip into banality, failing to think.
  • Jaspers focused on what he called Grenzsituationen — border situations, or limit situations. These are the moments when one finds oneself constrained or boxed in by what is happening, but at the same time pushed by these events towards the limits or outer edge of normal experience. For example, you might have to make a life-or-death choice, or something might remind you suddenly of your mortality,
  • Jaspers’ interest in border situations probably had much to do with his own early confrontation with mortality. From childhood, he had suffered from a heart condition so severe that he always expected to die at any moment. He also had emphysema, which forced him to speak slowly, taking long pauses to catch his breath. Both illnesses meant that he had to budget his energies with care in order to get his work done without endangering his life.
  • If I am to resist das Man, I must become answerable to the call of my ‘voice of conscience’. This call does not come from God, as a traditional Christian definition of the voice of conscience might suppose. It comes from a truly existentialist source: my own authentic self. Alas, this voice is one I do not recognise and may not hear, because it is not the voice of my habitual ‘they-self’. It is an alien or uncanny version of my usual voice. I am familiar with my they-self, but not with my unalienated voice — so, in a weird twist, my real voice is the one that sounds strangest to me.
  • Marcel developed a strongly theological branch of existentialism. His faith distanced him from both Sartre and Heidegger, but he shared a sense of how history makes demands on individuals. In his essay ‘On the Ontological Mystery’, written in 1932 and published in the fateful year of 1933, Marcel wrote of the human tendency to become stuck in habits, received ideas, and a narrow-minded attachment to possessions and familiar scenes. Instead, he urged his readers to develop a capacity for remaining ‘available’ to situations as they arise. Similar ideas of disponibilité or availability had been explored by other writers,
  • Marcel made it his central existential imperative. He was aware of how rare and difficult it was. Most people fall into what he calls ‘crispation’: a tensed, encrusted shape in life — ‘as though each one of us secreted a kind of shell which gradually hardened and imprisoned him’.
  • Bettelheim later observed that, under Nazism, only a few people realised at once that life could not continue unaltered: these were the ones who got away quickly. Bettelheim himself was not among them. Caught in Austria when Hitler annexed it, he was sent first to Dachau and then to Buchenwald, but was then released in a mass amnesty to celebrate Hitler’s birthday in 1939 — an extraordinary reprieve, after which he left at once for America.
  • we are used to reading philosophy as offering a universal message for all times and places — or at least as aiming to do so. But Heidegger disliked the notion of universal truths or universal humanity, which he considered a fantasy. For him, Dasein is not defined by shared faculties of reason and understanding, as the Enlightenment philosophers thought. Still less is it defined by any kind of transcendent eternal soul, as in religious tradition. We do not exist on a higher, eternal plane at all. Dasein’s Being is local: it has a historical situation, and is constituted in time and place.
  • For Marcel, learning to stay open to reality in this way is the philosopher’s prime job. Everyone can do it, but the philosopher is the one who is called on above all to stay awake, so as to be the first to sound the alarm if something seems wrong.
  • Second, it also means understanding that we are historical beings, and grasping the demands our particular historical situation is making on us. In what Heidegger calls ‘anticipatory resoluteness’, Dasein discovers ‘that its uttermost possibility lies in giving itself up’. At that moment, through Being-towards-death and resoluteness in facing up to one’s time, one is freed from the they-self and attains one’s true, authentic self.
  • If we are temporal beings by our very nature, then authentic existence means accepting, first, that we are finite and mortal. We will die: this all-important realisation is what Heidegger calls authentic ‘Being-towards-Death’, and it is fundamental to his philosophy.
  • Hannah Arendt, instead, left early on: she had the benefit of a powerful warning. Just after the Nazi takeover, in spring 1933, she had been arrested while researching materials on anti-Semitism for the German Zionist Organisation at Berlin’s Prussian State Library. Her apartment was searched; both she and her mother were locked up briefly, then released. They fled, without stopping to arrange travel documents. They crossed to Czechoslovakia (then still safe) by a method that sounds almost too fabulous to be true: a sympathetic German family on the border had a house with its front door in Germany and its back door in Czechoslovakia. The family would invite people for dinner, then let them leave through the back door at night.
  • As Sartre argued in his 1943 review of The Stranger, basic phenomenological principles show that experience comes to us already charged with significance. A piano sonata is a melancholy evocation of longing. If I watch a soccer match, I see it as a soccer match, not as a meaningless scene in which a number of people run around taking turns to apply their lower limbs to a spherical object. If the latter is what I’m seeing, then I am not watching some more essential, truer version of soccer; I am failing to watch it properly as soccer at all.
  • Much as they liked Camus personally, neither Sartre nor Beauvoir accepted his vision of absurdity. For them, life is not absurd, even when viewed on a cosmic scale, and nothing can be gained by saying it is. Life for them is full of real meaning, although that meaning emerges differently for each of us.
  • For Sartre, we show bad faith whenever we portray ourselves as passive creations of our race, class, job, history, nation, family, heredity, childhood influences, events, or even hidden drives in our subconscious which we claim are out of our control. It is not that such factors are unimportant: class and race, in particular, he acknowledged as powerful forces in people’s lives, and Simone de Beauvoir would soon add gender to that list.
  • Sartre takes his argument to an extreme point by asserting that even war, imprisonment or the prospect of imminent death cannot take away my existential freedom. They form part of my ‘situation’, and this may be an extreme and intolerable situation, but it still provides only a context for whatever I choose to do next. If I am about to die, I can decide how to face that death. Sartre here resurrects the ancient Stoic idea that I may not choose what happens to me, but I can choose what to make of it, spiritually speaking.
  • But the Stoics cultivated indifference in the face of terrible events, whereas Sartre thought we should remain passionately, even furiously engaged with what happens to us and with what we can achieve. We should not expect freedom to be anything less than fiendishly difficult.
  • Freedom does not mean entirely unconstrained movement, and it certainly does not mean acting randomly. We often mistake the very things that enable us to be free — context, meaning, facticity, situation, a general direction in our lives — for things that define us and take away our freedom. It is only with all of these that we can be free in a real sense.
  • Nor did he mean that privileged groups have the right to pontificate to the poor and downtrodden about the need to ‘take responsibility’ for themselves. That would be a grotesque misreading of Sartre’s point, since his sympathy in any encounter always lay with the more oppressed side. But for each of us — for me — to be in good faith means not making excuses for myself.
  • Camus’ novel gives us a deliberately understated vision of heroism and decisive action compared to those of Sartre and Beauvoir. One can only do so much. It can look like defeatism, but it shows a more realistic perception of what it takes to actually accomplish difficult tasks like liberating one’s country.
  • Camus just kept returning to his core principle: no torture, no killing — at least not with state approval. Beauvoir and Sartre believed they were taking a more subtle and more realistic view. If asked why a couple of innocuous philosophers had suddenly become so harsh, they would have said it was because the war had changed them in profound ways. It had shown them that one’s duties to humanity could be more complicated than they seemed. ‘The war really divided my life in two,’ Sartre said later.
  • Poets and artists ‘let things be’, but they also let things come out and show themselves. They help to ease things into ‘unconcealment’ (Unverborgenheit), which is Heidegger’s rendition of the Greek term alētheia, usually translated as ‘truth’. This is a deeper kind of truth than the mere correspondence of a statement to reality, as when we say ‘The cat is on the mat’ and point to a mat with a cat on it. Long before we can do this, both cat and mat must ‘stand forth out of concealedness’. They must un-hide themselves.
  • Heidegger does not use the word ‘consciousness’ here because — as with his earlier work — he is trying to make us think in a radically different way about ourselves. We are not to think of the mind as an empty cavern, or as a container filled with representations of things. We are not even supposed to think of it as firing off arrows of intentional ‘aboutness’, as in the earlier phenomenology of Brentano. Instead, Heidegger draws us into the depths of his Schwarzwald, and asks us to imagine a gap with sunlight filtering in. We remain in the forest, but we provide a relatively open spot where other beings can bask for a moment. If we did not do this, everything would remain in the thickets, hidden even to itself.
  • The astronomer Carl Sagan began his 1980 television series Cosmos by saying that human beings, though made of the same stuff as the stars, are conscious and are therefore ‘a way for the cosmos to know itself’. Merleau-Ponty similarly quoted his favourite painter Cézanne as saying, ‘The landscape thinks itself in me, and I am its consciousness.’ This is something like what Heidegger thinks humanity contributes to the earth. We are not made of spiritual nothingness; we are part of Being, but we also bring something unique with us. It is not much: a little open space, perhaps with a path and a bench like the one the young Heidegger used to sit on to do his homework. But through us, the miracle occurs.
  • Beauty aside, Heidegger’s late writing can also be troubling, with its increasingly mystical notion of what it is to be human. If one speaks of a human being mainly as an open space or a clearing, or a means of ‘letting beings be’ and dwelling poetically on the earth, then one doesn’t seem to be talking about any recognisable person. The old Dasein has become less human than ever. It is now a forestry feature.
  • Even today, Jaspers, the dedicated communicator, is far less widely read than Heidegger, who has influenced architects, social theorists, critics, psychologists, artists, film-makers, environmental activists, and innumerable students and enthusiasts — including the later deconstructionist and post-structuralist schools, which took their starting point from his late thinking. Having spent the late 1940s as an outsider and then been rehabilitated, Heidegger became the overwhelming presence in university philosophy all over the European continent from then on.
  • As Levinas reflected on this experience, it helped to lead him to a philosophy that was essentially ethical, rather than ontological like Heidegger’s. He developed his ideas from the work of Jewish theologian Martin Buber, whose I and Thou in 1923 had distinguished between my relationship with an impersonal ‘it’ or ‘them’, and the direct personal encounter I have with a ‘you’. Levinas took it further: when I encounter you, we normally meet face-to-face, and it is through your face that you, as another person, can make ethical demands on me. This is very different from Heidegger’s Mitsein or Being-with, which suggests a group of people standing alongside one another, shoulder to shoulder as if in solidarity — perhaps as a unified nation or Volk.
  • For Levinas, we literally face each other, one individual at a time, and that relationship becomes one of communication and moral expectation. We do not merge; we respond to one another. Instead of being co-opted into playing some role in my personal drama of authenticity, you look me in the eyes — and you remain Other. You remain you.
  • This relationship is more fundamental than the self, more fundamental than consciousness, more fundamental even than Being — and it brings an unavoidable ethical obligation. Ever since Husserl, phenomenologists and existentialists had been trying to stretch the definition of existence to incorporate our social lives and relationships. Levinas did more: he turned philosophy around entirely so that these relationships were the foundation of our existence, not an extension of it.
  • Her last work, The Need for Roots, argues, among other things, that none of us has rights, but each one of us has a near-infinite degree of duty and obligation to the other. Whatever the underlying cause of her death — and anorexia nervosa seems to have been involved — no one could deny that she lived out her philosophy with total commitment. Of all the lives touched on in this book, hers is surely the most profound and challenging application of Iris Murdoch’s notion that a philosophy can be ‘inhabited’.
  • Other thinkers took radical ethical turns during the war years. The most extreme was Simone Weil, who actually tried to live by the principle of putting other people’s ethical demands first. Having returned to France after her travels through Germany in 1932, she had worked in a factory so as to experience the degrading nature of such work for herself. When France fell in 1940, her family fled to Marseilles (against her protests), and later to the US and to Britain. Even in exile, Weil made extraordinary sacrifices. If there were people in the world who could not sleep in a bed, she would not do so either, so she slept on the floor.
  • The mystery tradition had roots in Kierkegaard’s ‘leap of faith’. It owed much to the other great nineteenth-century mystic of the impossible, Dostoevsky, and to older theological notions. But it also grew from the protracted trauma that was the first half of the twentieth century. Since 1914, and especially since 1939, people in Europe and elsewhere had come to the realisation that we cannot fully know or trust ourselves; that we have no excuses or explanations for what we do — and yet that we must ground our existence and relationships on something firm, because otherwise we cannot survive.
  • One striking link between these radical ethical thinkers, all on the fringes of our main story, is that they had religious faith. They also granted a special role to the notion of ‘mystery’ — that which cannot be known, calculated or understood, especially when it concerns our relationships with each other. Heidegger was different from them, since he rejected the religion he grew up with and had no real interest in ethics — probably as a consequence of his having no real interest in the human.
  • Meanwhile, the Christian existentialist Gabriel Marcel was also still arguing, as he had since the 1930s, that ethics trumps everything else in philosophy and that our duty to each other is so great as to play the role of a transcendent ‘mystery’. He too had been led to this position partly by a wartime experience: during the First World War he had worked for the Red Cross’ Information Service, with the unenviable job of answering relatives’ inquiries about missing soldiers. Whenever news came, he passed it on, and usually it was not good. As Marcel later said, this task permanently inoculated him against warmongering rhetoric of any kind, and it made him aware of the power of what is unknown in our lives.
  • As the play’s much-quoted and frequently misunderstood final line has it: ‘Hell is other people.’ Sartre later explained that he did not mean to say that other people were hellish in general. He meant that after death we become frozen in their view, unable any longer to fend off their interpretation. In life, we can still do something to manage the impression we make; in death, this freedom goes and we are left entombed in other people’s memories and perceptions.
  • We have to do two near-impossible things at once: understand ourselves as limited by circumstances, and yet continue to pursue our projects as though we are truly in control. In Beauvoir’s view, existentialism is the philosophy that best enables us to do this, because it concerns itself so deeply with both freedom and contingency. It acknowledges the radical and terrifying scope of our freedom in life, but also the concrete influences that other philosophies tend to ignore: history, the body, social relationships and the environment.
  • The aspects of our existence that limit us, Merleau-Ponty says, are the very same ones that bind us to the world and give us scope for action and perception. They make us what we are. Sartre acknowledged the need for this trade-off, but he found it more painful to accept. Everything in him longed to be free of bonds, of impediments and limitations
  • Of course we have to learn this skill of interpreting and anticipating the world, and this happens in early childhood, which is why Merleau-Ponty thought child psychology was essential to philosophy. This is an extraordinary insight. Apart from Rousseau, very few philosophers before him had taken childhood seriously; most wrote as though all human experience were that of a fully conscious, rational, verbal adult who has been dropped into this world from the sky — perhaps by a stork.
  • For Merleau-Ponty, we cannot understand our experience if we don’t think of ourselves in part as overgrown babies. We fall for optical illusions because we once learned to see the world in terms of shapes, objects and things relevant to our own interests. Our first perceptions came to us in tandem with our first active experiments in observing the world and reaching out to explore it, and are still linked with those experiences.
  • Another factor in all of this, for Merleau-Ponty, is our social existence: we cannot thrive without others, or not for long, and we need this especially in early life. This makes solipsistic speculation about the reality of others ridiculous; we could never engage in such speculation if we hadn’t already been formed by them.
  • As Descartes could have said (but didn’t), ‘I think, therefore other people exist.’ We grow up with people playing with us, pointing things out, talking, listening, and getting us used to reading emotions and movements; this is how we become capable, reflective, smoothly integrated beings.
  • In general, Merleau-Ponty thinks human experience only makes sense if we abandon philosophy’s time-honoured habit of starting with a solitary, capsule-like, immobile adult self, isolated from its body and world, which must then be connected up again — adding each element around it as though adding clothing to a doll. Instead, for him, we slide from the womb to the birth canal to an equally close and total immersion in the world. That immersion continues as long as we live, although we may also cultivate the art of partially withdrawing from time to time when we want to think or daydream.
  • When he looks for his own metaphor to describe how he sees consciousness, he comes up with a beautiful one: consciousness, he suggests, is like a ‘fold’ in the world, as though someone had crumpled a piece of cloth to make a little nest or hollow. It stays for a while, before eventually being unfolded and smoothed away. There is something seductive, even erotic, in this idea of my conscious self as an improvised pouch in the cloth of the world. I still have my privacy — my withdrawing room. But I am part of the world’s fabric, and I remain formed out of it for as long as I am here.
  • By the time of these works, Merleau-Ponty is taking his desire to describe experience to the outer limits of what language can convey. Just as with the late Husserl or Heidegger, or Sartre in his Flaubert book, we see a philosopher venturing so far from shore that we can barely follow. Emmanuel Levinas would head out to the fringes too, eventually becoming incomprehensible to all but his most patient initiates.
  • Sartre once remarked — speaking of a disagreement they had about Husserl in 1941 — that ‘we discovered, astounded, that our conflicts had, at times, stemmed from our childhood, or went back to the elementary differences of our two organisms’. Merleau-Ponty also said in an interview that Sartre’s work seemed strange to him, not because of philosophical differences, but because of a certain ‘register of feeling’, especially in Nausea, that he could not share. Their difference was one of temperament and of the whole way the world presented itself to them.
  • The two also differed in their purpose. When Sartre writes about the body or other aspects of experience, he generally does it in order to make a different point. He expertly evokes the grace of his café waiter, gliding between the tables, bending at an angle just so, steering the drink-laden tray through the air on the tips of his fingers — but he does it all in order to illustrate his ideas about bad faith. When Merleau-Ponty writes about skilled and graceful movement, the movement itself is his point. This is the thing he wants to understand.
  • We can never move definitively from ignorance to certainty, for the thread of the inquiry will constantly lead us back to ignorance again. This is the most attractive description of philosophy I’ve ever read, and the best argument for why it is worth doing, even (or especially) when it takes us no distance at all from our starting point.
  • By prioritising perception, the body, social life and childhood development, Merleau-Ponty gathered up philosophy’s far-flung outsider subjects and brought them in to occupy the centre of his thought.
  • In his inaugural lecture at the Collège de France on 15 January 1953, published as In Praise of Philosophy, he said that philosophers should concern themselves above all with whatever is ambiguous in our experience. At the same time, they should think clearly about these ambiguities, using reason and science. Thus, he said, ‘The philosopher is marked by the distinguishing trait that he possesses inseparably the taste for evidence and the feeling for ambiguity.’ A constant movement is required between these two
  • As Sartre wrote in response to Hiroshima, humanity had now gained the power to wipe itself out, and must decide every single day that it wanted to live. Camus also wrote that humanity faced the task of choosing between collective suicide and a more intelligent use of its technology — ‘between hell and reason’. After 1945, there seemed little reason to trust in humanity’s ability to choose well.
  • Merleau-Ponty observed in a lecture of 1951 that, more than any previous century, the twentieth century had reminded people how ‘contingent’ their lives were — how at the mercy of historical events and other changes that they could not control. This feeling went on long after the war ended. After the A-bombs were dropped on Hiroshima and Nagasaki, many feared that a Third World War would not be long in coming, this time between the Soviet Union and the United States.
Javier E

What Have We Learned, If Anything? by Tony Judt | The New York Review of Books - 0 views

  • During the Nineties, and again in the wake of September 11, 2001, I was struck more than once by a perverse contemporary insistence on not understanding the context of our present dilemmas, at home and abroad; on not listening with greater care to some of the wiser heads of earlier decades; on seeking actively to forget rather than remember, to deny continuity and proclaim novelty on every possible occasion. We have become stridently insistent that the past has little of interest to teach us. Ours, we assert, is a new world; its risks and opportunities are without precedent.
  • the twentieth century that we have chosen to commemorate is curiously out of focus. The overwhelming majority of places of official twentieth-century memory are either avowedly nostalgo-triumphalist—praising famous men and celebrating famous victories—or else, and increasingly, they are opportunities for the recollection of selective suffering.
  • The problem with this lapidary representation of the last century as a uniquely horrible time from which we have now, thankfully, emerged is not the description—it was in many ways a truly awful era, an age of brutality and mass suffering perhaps unequaled in the historical record. The problem is the message: that all of that is now behind us, that its meaning is clear, and that we may now advance—unencumbered by past errors—into a different and better era.
  • ...19 more annotations...
  • Today, the “common” interpretation of the recent past is thus composed of the manifold fragments of separate pasts, each of them (Jewish, Polish, Serb, Armenian, German, Asian-American, Palestinian, Irish, homosexual…) marked by its own distinctive and assertive victimhood.
  • The resulting mosaic does not bind us to a shared past, it separates us from it. Whatever the shortcomings of the national narratives once taught in school, however selective their focus and instrumental their message, they had at least the advantage of providing a nation with past references for present experience. Traditional history, as taught to generations of schoolchildren and college students, gave the present a meaning by reference to the past: today’s names, places, inscriptions, ideas, and allusions could be slotted into a memorized narrative of yesterday. In our time, however, this process has gone into reverse. The past now acquires meaning only by reference to our many and often contrasting present concerns.
  • the United States thus has no modern memory of combat or loss remotely comparable to that of the armed forces of other countries. But it is civilian casualties that leave the most enduring mark on national memory and here the contrast is piquant indeed
  • Today, the opposite applies. Most people in the world outside of sub-Saharan Africa have access to a near infinity of data. But in the absence of any common culture beyond a small elite, and not always even there, the fragmented information and ideas that people select or encounter are determined by a multiplicity of tastes, affinities, and interests. As the years pass, each one of us has less in common with the fast-multiplying worlds of our contemporaries, not to speak of the world of our forebears.
  • What is significant about the present age of transformations is the unique insouciance with which we have abandoned not merely the practices of the past but their very memory. A world just recently lost is already half forgotten.
  • In the US, at least, we have forgotten the meaning of war. There is a reason for this. I
  • Until the last decades of the twentieth century most people in the world had limited access to information; but—thanks to national education, state-controlled radio and television, and a common print culture—within any one state or nation or community people were all likely to know many of the same things.
  • it was precisely that claim, that “it’s torture, and therefore it’s no good,” which until very recently distinguished democracies from dictatorships. We pride ourselves on having defeated the “evil empire” of the Soviets. Indeed so. But perhaps we should read again the memoirs of those who suffered at the hands of that empire—the memoirs of Eugen Loebl, Artur London, Jo Langer, Lena Constante, and countless others—and then compare the degrading abuses they suffered with the treatments approved and authorized by President Bush and the US Congress. Are they so very different?
  • As a consequence, the United States today is the only advanced democracy where public figures glorify and exalt the military, a sentiment familiar in Europe before 1945 but quite unknown today
  • the complacent neoconservative claim that war and conflict are things Americans understand—in contrast to naive Europeans with their pacifistic fantasies—seems to me exactly wrong: it is Europeans (along with Asians and Africans) who understand war all too well. Most Americans have been fortunate enough to live in blissful ignorance of its true significance.
  • That same contrast may account for the distinctive quality of much American writing on the cold war and its outcome. In European accounts of the fall of communism, from both sides of the former Iron Curtain, the dominant sentiment is one of relief at the closing of a long, unhappy chapter. Here in the US, however, the story is typically recorded in a triumphalist key.5
  • For many American commentators and policymakers the message of the twentieth century is that war works. Hence the widespread enthusiasm for our war on Iraq in 2003 (despite strong opposition to it in most other countries). For Washington, war remains an option—on that occasion the first option. For the rest of the developed world it has become a last resort.6
  • Ignorance of twentieth-century history does not just contribute to a regrettable enthusiasm for armed conflict. It also leads to a misidentification of the enemy.
  • This abstracting of foes and threats from their context—this ease with which we have talked ourselves into believing that we are at war with “Islamofascists,” “extremists” from a strange culture, who dwell in some distant “Islamistan,” who hate us for who we are and seek to destroy “our way of life”—is a sure sign that we have forgotten the lesson of the twentieth century: the ease with which war and fear and dogma can bring us to demonize others, deny them a common humanity or the protection of our laws, and do unspeakable things to them.
  • How else are we to explain our present indulgence for the practice of torture? For indulge it we assuredly do.
  • “But what would I have achieved by proclaiming my opposition to torture?” he replied. “I have never met anyone who is in favor of torture.”8 Well, times have changed. In the US today there are many respectable, thinking people who favor torture—under the appropriate circumstances and when applied to those who merit it.
  • American civilian losses (excluding the merchant navy) in both world wars amounted to less than 2,000 dead.
  • We are slipping down a slope. The sophistic distinctions we draw today in our war on terror—between the rule of law and “exceptional” circumstances, between citizens (who have rights and legal protections) and noncitizens to whom anything can be done, between normal people and “terrorists,” between “us” and “them”—are not new. The twentieth century saw them all invoked. They are the selfsame distinctions that licensed the worst horrors of the recent past: internment camps, deportation, torture, and murder—those very crimes that prompt us to murmur “never again.” So what exactly is it that we think we have learned from the past? Of what possible use is our self-righteous cult of memory and memorials if the United States can build its very own internment camp and torture people there?
  • We need to learn again—or perhaps for the first time—how war brutalizes and degrades winners and losers alike and what happens to us when, having heedlessly waged war for no good reason, we are encouraged to inflate and demonize our enemies in order to justify that war’s indefinite continuance.
B Mannke

The War No Image Could Capture - Deborah Cohen - The Atlantic - 1 views

  • Essay, December 2013: The War No Image Could Capture. Photography has given us iconic representations of conflict since the Civil War—with a notable exception. Why, during the Great War, the camera failed.
  • They could not be rescued yet, and so an anonymous official photographer attached to the Royal Engineers did what he could to record the scene. The picture he took, though, tells almost nothing without a caption. The landscape is flat and featureless. The dead and wounded look like dots. “Like a million bloody rugs,” wrote F. Scott Fitzgerald of the Somme carnage. In fact, you can’t make out blood. You can’t even tell you’re looking at bodies.
  • iconic representations of war
  • ...12 more annotations...
  • World War I yielded a number of striking and affecting pictures. Some, included in the gallery of 380 presented in The Great War: A Photographic Narrative, are famous: the line of gassed men, blinded and clutching each other’s shoulders as they approach a first-aid station in 1918; the haunting, charred landscapes of the Ypres Salient in 1917. And yet in both cases, the more-renowned versions were their painted successors of 1919: John Singer Sargent’s oil painting Gassed, and Paul Nash’s semi-abstract rendering of the blasted Belgian flatland, The Menin Road. The essence of the Great War lies in the absence of any emblematic photograph.
  • The quest to communicate an unprecedented experience of combat began almost as soon as the war did, and it has continued ever since
  • All Quiet on the Western Front (1929): the war was unimaginable, dehumanizing, the unredeemable sacrifice of a generation. It marked the origin of our ironic sensibility
  • The central conundrum in representing the First World War is a stark one: the staggering statistics of matériel, manpower, and casualties threaten constantly to extinguish the individual. That was what the war poets understood, and why the images they summoned in words have been transmitted down a century. As Wilfred Owen did in “Dulce et Decorum Est” (1917), the poets addressed their readers directly, unsettling them with a vision of the damage suffered by a particular man’s body or mind.
  • The British prime minister’s own eldest son, Raymond Asquith, was killed a few days later and a few miles away, at the Battle of Flers–Courcelette.
  • We felt they were mad.”
  • Needless to say, such a move was not repeated.
  • A great deal of the official photography of 1914 and 1915 borders on the risible: stiffly posed pictures that gesture to the heroic war that had been foretold rather than the war that was unfolding. In one picture, a marksman in a neat uniform crouches safely behind a fortification, intent on his quarry. In another, a dugout looks like a stage set, in which the actors have been urged to strike contemplative poses.
  • Taken after the Battle of Guillemont, a British and French offensive that was successful but at great cost, this image from September 1916, by the British official photographer John Warwick Brooke, is disorienting at first glance. Are the inert lumps on the ground dead bodies, or parts of dead bodies? They are neither. But the initial relief upon recognizing that they're inanimate objects evaporates
  • Photography, of course, can’t capture sounds or bitter intonations—that devastatingly exact gargling, not gurgling
  • All the way through—as he meticulously documents the laborious mobilization, the pointless charges, the dead and injured marooned in the field—Sacco’s perspective is from the British lines, which means the soldiers are seen mostly from the back. He gets the details of the carts, the guns, and the uniforms exactly right. The faces he draws are deliberately generic.
  • They visited the battlefields to find the small white headstone with their soldier’s name; when there was no grave, they touched the place where a name was engraved on a memorial. They held séances to summon the dead. But inevitably, as the decades roll on, what endures are the fearsome numbers.
Javier E

Cognitive Biases and the Human Brain - The Atlantic - 1 views

  • Present bias shows up not just in experiments, of course, but in the real world. Especially in the United States, people egregiously undersave for retirement—even when they make enough money to not spend their whole paycheck on expenses, and even when they work for a company that will kick in additional funds to retirement plans when they contribute.
  • When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”)
  • If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view
  • ...48 more annotations...
  • Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
  • The whole idea of cognitive biases and faulty heuristics—the shortcuts and rules of thumb by which we make judgments and predictions—was more or less invented in the 1970s by Amos Tversky and Daniel Kahneman
  • Tversky died in 1996. Kahneman won the 2002 Nobel Prize in Economics for the work the two men did together, which he summarized in his 2011 best seller, Thinking, Fast and Slow. Another best seller, last year’s The Undoing Project, by Michael Lewis, tells the story of the sometimes contentious collaboration between Tversky and Kahneman
  • Another key figure in the field is the University of Chicago economist Richard Thaler. One of the biases he’s most linked with is the endowment effect, which leads us to place an irrationally high value on our possessions.
  • In an experiment conducted by Thaler, Kahneman, and Jack L. Knetsch, half the participants were given a mug and then asked how much they would sell it for. The average answer was $5.78. The rest of the group said they would spend, on average, $2.21 for the same mug. This flew in the face of classic economic theory, which says that at a given time and among a certain population, an item has a market value that does not depend on whether one owns it or not. Thaler won the 2017 Nobel Prize in Economics.
  • “The question that is most often asked about cognitive illusions is whether they can be overcome. The message … is not encouraging.”
  • that’s not so easy in the real world, when we’re dealing with people and situations rather than lines. “Unfortunately, this sensible procedure is least likely to be applied when it is needed most,” Kahneman writes. “We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available.”
  • At least with the optical illusion, our slow-thinking, analytic mind—what Kahneman calls System 2—will recognize a Müller-Lyer situation and convince itself not to trust the fast-twitch System 1’s perception
  • Kahneman and others draw an analogy based on an understanding of the Müller-Lyer illusion, two parallel lines with arrows at each end. One line’s arrows point in; the other line’s arrows point out. Because of the direction of the arrows, the latter line appears shorter than the former, but in fact the two lines are the same length.
  • Because biases appear to be so hardwired and inalterable, most of the attention paid to countering them hasn’t dealt with the problematic thoughts, judgments, or predictions themselves
  • Is it really impossible, however, to shed or significantly mitigate one’s biases? Some studies have tentatively answered that question in the affirmative.
  • what if the person undergoing the de-biasing strategies was highly motivated and self-selected? In other words, what if it was me?
  • Over an apple pastry and tea with milk, he told me, “Temperament has a lot to do with my position. You won’t find anyone more pessimistic than I am.”
  • I met with Kahneman
  • “I see the picture as unequal lines,” he said. “The goal is not to trust what I think I see. To understand that I shouldn’t believe my lying eyes.” That’s doable with the optical illusion, he said, but extremely difficult with real-world cognitive biases.
  • In this context, his pessimism relates, first, to the impossibility of effecting any changes to System 1—the quick-thinking part of our brain and the one that makes mistaken judgments tantamount to the Müller-Lyer line illusion
  • The most effective check against them, as Kahneman says, is from the outside: Others can perceive our errors more readily than we can.
  • “slow-thinking organizations,” as he puts it, can institute policies that include the monitoring of individual decisions and predictions. They can also require procedures such as checklists and “premortems,”
  • A premortem attempts to counter optimism bias by requiring team members to imagine that a project has gone very, very badly and write a sentence or two describing how that happened. Conducting this exercise, it turns out, helps people think ahead.
  • “My position is that none of these things have any effect on System 1,” Kahneman said. “You can’t improve intuition.
  • Perhaps, with very long-term training, lots of talk, and exposure to behavioral economics, what you can do is cue reasoning, so you can engage System 2 to follow rules. Unfortunately, the world doesn’t provide cues. And for most people, in the heat of argument the rules go out the window.
  • Kahneman describes an even earlier Nisbett article that showed subjects’ disinclination to believe statistical and other general evidence, basing their judgments instead on individual examples and vivid anecdotes. (This bias is known as base-rate neglect.)
  • over the years, Nisbett had come to emphasize in his research and thinking the possibility of training people to overcome or avoid a number of pitfalls, including base-rate neglect, fundamental attribution error, and the sunk-cost fallacy.
  • Nisbett’s second-favorite example is that economists, who have absorbed the lessons of the sunk-cost fallacy, routinely walk out of bad movies and leave bad restaurant meals uneaten.
  • When Nisbett asks the same question of students who have completed the statistics course, about 70 percent give the right answer. He believes this result shows, pace Kahneman, that the law of large numbers can be absorbed into System 2—and maybe into System 1 as well, even when there are minimal cues.
  • about half give the right answer: the law of large numbers, which holds that outlier results are much more frequent when the sample size (at bats, in this case) is small. Over the course of the season, as the number of at bats increases, regression to the mean is inevitable
  • When Nisbett has to give an example of his approach, he usually brings up the baseball-phenom survey. This involved telephoning University of Michigan students on the pretense of conducting a poll about sports, and asking them why there are always several Major League batters with .450 batting averages early in a season, yet no player has ever finished a season with an average that high.
  • we’ve tested Michigan students over four years, and they show a huge increase in ability to solve problems. Graduate students in psychology also show a huge gain.”
  • , “I know from my own research on teaching people how to reason statistically that just a few examples in two or three domains are sufficient to improve people’s reasoning for an indefinitely large number of events.”
  • Nisbett suggested another factor: “You and Amos specialized in hard problems for which you were drawn to the wrong answer. I began to study easy problems, which you guys would never get wrong but untutored people routinely do … Then you can look at the effects of instruction on such easy problems, which turn out to be huge.”
  • Nisbett suggested that I take “Mindware: Critical Thinking for the Information Age,” an online Coursera course in which he goes over what he considers the most effective de-biasing skills and concepts. Then, to see how much I had learned, I would take a survey he gives to Michigan undergraduates. So I did.
  • The course consists of eight lessons by Nisbett—who comes across on-screen as the authoritative but approachable psych professor we all would like to have had—interspersed with some graphics and quizzes. I recommend it. He explains the availability heuristic this way: “People are surprised that suicides outnumber homicides, and drownings outnumber deaths by fire. People always think crime is increasing” even if it’s not.
  • When I finished the course, Nisbett sent me the survey he and colleagues administer to Michigan undergrads
  • It contains a few dozen problems meant to measure the subjects’ resistance to cognitive biases
  • I got it right. Indeed, when I emailed my completed test, Nisbett replied, “My guess is that very few if any UM seniors did as well as you. I’m sure at least some psych students, at least after 2 years in school, did as well. But note that you came fairly close to a perfect score.”
  • Nevertheless, I did not feel that reading Mindware and taking the Coursera course had necessarily rid me of my biases
  • For his part, Nisbett insisted that the results were meaningful. “If you’re doing better in a testing context,” he told me, “you’ll jolly well be doing better in the real world.”
  • The New York–based NeuroLeadership Institute offers organizations and individuals a variety of training sessions, webinars, and conferences that promise, among other things, to use brain science to teach participants to counter bias. This year’s two-day summit will be held in New York next month; for $2,845, you could learn, for example, “why are our brains so bad at thinking about the future, and how do we do it better?”
  • Philip E. Tetlock, a professor at the University of Pennsylvania’s Wharton School, and his wife and research partner, Barbara Mellers, have for years been studying what they call “superforecasters”: people who manage to sidestep cognitive biases and predict future events with far more accuracy than the pundits
  • One of the most important ingredients is what Tetlock calls “the outside view.” The inside view is a product of fundamental attribution error, base-rate neglect, and other biases that are constantly cajoling us into resting our judgments and predictions on good or vivid stories instead of on data and statistics
  • In 2006, seeking to prevent another mistake of that magnitude, the U.S. government created the Intelligence Advanced Research Projects Activity (iarpa), an agency designed to use cutting-edge research and technology to improve intelligence-gathering and analysis. In 2011, iarpa initiated a program, Sirius, to fund the development of “serious” video games that could combat or mitigate what were deemed to be the six most damaging biases: confirmation bias, fundamental attribution error, the bias blind spot (the feeling that one is less biased than the average person), the anchoring effect, the representativeness heuristic, and projection bias (the assumption that everybody else’s thinking is the same as one’s own).
  • most promising are a handful of video games. Their genesis was in the Iraq War
  • Together with collaborators who included staff from Creative Technologies, a company specializing in games and other simulations, and Leidos, a defense, intelligence, and health research company that does a lot of government work, Morewedge devised Missing. Some subjects played the game, which takes about three hours to complete, while others watched a video about cognitive bias. All were tested on bias-mitigation skills before the training, immediately afterward, and then finally after eight to 12 weeks had passed.
  • “The literature on training suggests books and classes are fine entertainment but largely ineffectual. But the game has very large effects. It surprised everyone.”
  • he said he saw the results as supporting the research and insights of Richard Nisbett. “Nisbett’s work was largely written off by the field, the assumption being that training can’t reduce bias,
  • even the positive results reminded me of something Daniel Kahneman had told me. “Pencil-and-paper doesn’t convince me,” he said. “A test can be given even a couple of years later. But the test cues the test-taker. It reminds him what it’s all about.”
  • Morewedge told me that some tentative real-world scenarios along the lines of Missing have shown “promising results,” but that it’s too soon to talk about them.
  • In the future, I will monitor my thoughts and reactions as best I can
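The law-of-large-numbers point behind Nisbett's baseball-phenom survey can be made concrete with a small simulation. This is an illustrative sketch, not anything from the article: the .270 true batting average, the 30 early-season at bats, the 550 full-season at bats, and the player count are all assumed numbers chosen only to show why outlier averages appear in small samples and vanish in large ones.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def batting_averages(n_players, at_bats, true_avg=0.270):
    """Simulate each player's at bats as independent Bernoulli trials
    and return the resulting batting averages."""
    return [
        sum(random.random() < true_avg for _ in range(at_bats)) / at_bats
        for _ in range(n_players)
    ]

# Early season: roughly 30 at bats per player.
early = batting_averages(2000, 30)
# Full season: roughly 550 at bats per player.
full = batting_averages(2000, 550)

# With identical .270 hitters, several ".450 phenoms" show up in the
# small early-season sample, but none survive a full season's at bats.
print("early-season .450+ hitters:", sum(a >= 0.450 for a in early))
print("full-season  .450+ hitters:", sum(a >= 0.450 for a in full))
```

Every simulated player has the same underlying skill; the early-season standouts are pure sampling noise, which is exactly the regression to the mean the surveyed students failed to invoke.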
Javier E

Opinion | Elon Musk, Geoff Hinton, and the War Over A.I. - The New York Times - 0 views

  • Beneath almost all of the testimony, the manifestoes, the blog posts and the public declarations issued about A.I. are battles among deeply divided factions
  • Some are concerned about far-future risks that sound like science fiction.
  • Some are genuinely alarmed by the practical problems that chatbots and deepfake video generators are creating right now.
  • ...31 more annotations...
  • Some are motivated by potential business revenue, others by national security concerns.
  • Sometimes, they trade letters, opinion essays or social threads outlining their positions and attacking others’ in public view. More often, they tout their viewpoints without acknowledging alternatives, leaving the impression that their enlightened perspective is the inevitable lens through which to view A.I.
  • you’ll realize this isn’t really a debate only about A.I. It’s also a contest about control and power, about how resources should be distributed and who should be held accountable.
  • It is critical that we begin to recognize the ideologies driving what we are being told. Resolving the fracas requires us to see through the specter of A.I. to stay true to the humanity of our values.
  • Because language itself is part of their battleground, the different A.I. camps tend not to use the same words to describe their positions
  • One faction describes the dangers posed by A.I. through the framework of safety, another through ethics or integrity, yet another through security and others through economics.
  • The Doomsayers
  • These are the A.I. safety people, and their ranks include the “Godfathers of A.I.,” Geoff Hinton and Yoshua Bengio. For many years, these leading lights battled critics who doubted that a computer could ever mimic capabilities of the human mind
  • The technology historian David C. Brock calls these fears “wishful worries” — that is, “problems that it would be nice to have, in contrast to the actual agonies of the present.”
  • Reasonable sounding on their face, these ideas can become dangerous if stretched to their logical extremes. A dogmatic long-termer would willingly sacrifice the well-being of people today to stave off a prophesied extinction event like A.I. enslavement.
  • Many doomsayers say they are acting rationally, but their hype about hypothetical existential risks amounts to making a misguided bet with our future
  • OpenAI’s Sam Altman and Meta’s Mark Zuckerberg, both of whom lead dominant A.I. companies, are pushing for A.I. regulations that they say will protect us from criminals and terrorists. Such regulations would be expensive to comply with and are likely to preserve the market position of leading A.I. companies while restricting competition from start-ups
  • the roboticist Rodney Brooks has pointed out that we will see the existential risks coming, the dangers will not be sudden and we will have time to change course.
  • While we shouldn’t dismiss the Hollywood nightmare scenarios out of hand, we must balance them with the potential benefits of A.I. and, most important, not allow them to strategically distract from more immediate concerns.
  • they appear deeply invested in the idea that there is no limit to what their creations will be able to accomplish.
  • While the doomsayer faction focuses on the far-off future, its most prominent opponents are focused on the here and now. We agree with this group that there’s plenty already happening to cause concern: Racist policing and legal systems that disproportionately arrest and punish people of color. Sexist labor systems that rate feminine-coded résumés lower
  • Superpower nations automating military interventions as tools of imperialism and, someday, killer robots.
  • Propagators of these A.I. ethics concerns — like Meredith Broussard, Safiya Umoja Noble, Rumman Chowdhury and Cathy O’Neil — have been raising the alarm on inequities coded into A.I. for years. Although we don’t have a census, it’s noticeable that many leaders in this cohort are people of color, women and people who identify as L.G.B.T.Q.
  • Others frame efforts to reform A.I. in terms of integrity, calling for Big Tech to adhere to an oath to consider the benefit of the broader public alongside — or even above — their self-interest. They point to social media companies’ failure to control hate speech or how online misinformation can undermine democratic elections. Adding urgency for this group is that the very companies driving the A.I. revolution have, at times, been eliminating safeguards
  • reformers tend to push back hard against the doomsayers’ focus on the distant future. They want to wrestle the attention of regulators and advocates back toward present-day harms that are exacerbated by A.I. misinformation, surveillance and inequity.
  • Integrity experts call for the development of responsible A.I., for civic education to ensure A.I. literacy and for keeping humans front and center in A.I. systems.
  • Surely, we are a civilization big enough to tackle more than one problem at a time; even those worried that A.I. might kill us in the future should still demand that it not profile and exploit us in the present.
  • Other groups of prognosticators cast the rise of A.I. through the language of competitiveness and national security.
  • Some arguing from this perspective are acting on genuine national security concerns, and others have a simple motivation: money. These perspectives serve the interests of American tech tycoons as well as the government agencies and defense contractors they are intertwined with.
  • The Reformers
  • U.S. megacompanies pleaded to exempt their general purpose A.I. from the tightest regulations, and whether and how to apply high-risk compliance expectations on noncorporate open-source models emerged as a key point of debate. All the while, some of the moguls investing in upstart companies are fighting the regulatory tide. The Inflection AI co-founder Reid Hoffman argued, “The answer to our challenges is not to slow down technology but to accelerate it.”
  • The warriors’ narrative seems to misrepresent that science and engineering are different from what they were during the mid-20th century. A.I. research is fundamentally international; no one country will win a monopoly.
  • As the science-fiction author Ted Chiang has said, fears about the existential risks of A.I. are really fears about the threat of uncontrolled capitalism
  • Regulatory solutions do not need to reinvent the wheel. Instead, we need to double down on the rules that we know limit corporate power. We need to get more serious about establishing good and effective governance on all the issues we lost track of while we were becoming obsessed with A.I., China and the fights picked among robber barons.
  • By analogy to the health care sector, we need an A.I. public option to truly keep A.I. companies in check. A publicly directed A.I. development project would serve to counterbalance for-profit corporate A.I. and help ensure an even playing field for access to the 21st century’s key technology while offering a platform for the ethical development and use of A.I.
  • Also, we should embrace the humanity behind A.I. We can hold founders and corporations accountable by mandating greater A.I. transparency in the development stage, in addition to applying legal standards for actions associated with A.I. Remarkably, this is something that both the left and the right can agree on.
Javier E

Sex, Morality, and Modernity: Can Immanuel Kant Unite Us? - The Atlantic - 1 views

  • Before I jump back into the conversation about sexual ethics that has unfolded on the Web in recent days, inspired by Emily Witt's n+1 essay "What Do You Desire?" and featuring a fair number of my favorite writers, it's worth saying a few words about why I so value debate on this subject, and my reasons for running through some sex-life hypotheticals near the end of this article.
  • As we think and live, the investment required to understand one another increases. So do the stakes of disagreeing. 18-year-olds on the cusp of leaving home for the first time may disagree profoundly about how best to live and flourish, but the disagreements are abstract. It is easy, at 18, to express profound disagreement with, say, a friend's notions of child-rearing. To do so when he's 28, married, and raising a son or daughter is delicate, and perhaps best avoided
  • I have been speaking of friends. The gulfs that separate strangers can be wider and more difficult to navigate because there is no history of love and mutual goodwill as a foundation for trust. Less investment has been made, so there is less incentive to persevere through the hard parts.
  • ...27 more annotations...
  • I've grown very close to new people whose perspectives are radically different than mine.
  • It floors me: These individuals are all repositories of wisdom. They've gleaned it from experiences I'll never have, assumptions I don't share, and brains wired different than mine. I want to learn what they know.
  • Does that get us anywhere? A little ways, I think.
  • "Are we stuck with a passé traditionalism on one hand, and total laissez-faire on the other?" Is there common ground shared by the orthodox-Christian sexual ethics of a Rod Dreher and those who treat consent as their lodestar?
  • Gobry suggests that Immanuel Kant provides a framework everyone can and should embrace, wherein consent isn't nearly enough to make a sexual act moral--we must, in addition, treat the people in our sex lives as ends, not means.
  • Here's how Kant put it: "Act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means to an end, but always at the same time as an end."
  • the disappearance of a default sexual ethic in America and the divergence of our lived experiences means we have more to learn from one another than ever, even as our different choices raise the emotional stakes.
  • It seems to me that the Kantian insight is exactly the sort of challenge traditionalist Christians should make to college students as they try to persuade them to look more critically at hookup culture. I think a lot of college students casually mislead one another about their intentions and degree of investment, feigning romantic interest when actually they just want to have sex. Some would say they're transgressing against consent. I think Kant has a more powerful challenge. 
  • It seems to me that, whether we're talking about a three-week college relationship or a 60-year marriage, it is equally possible to treat one's partner as a means or as an end (though I would agree that "treating as means" is more common in hookups than marriage)
  • my simple definition is this: It is wrong to treat human persons in such a way that they are reduced to objects. This says nothing about consent: a person may consent to be used as an object, but it is still wrong to use them that way. It says nothing about utility: society may approve of using some people as objects; whether those people are actual slaves or economically oppressed wage-slaves it is still wrong to treat them like objects. What it says, in fact, is that human beings have intrinsic worth and dignity such that treating them like objects is wrong.
  • what it means to treat someone as a means, or as an object, turns out to be in dispute.
  • Years ago, I interviewed a sister who was acting as a surrogate for a sibling who couldn't carry her own child. The notion that either regarded the other (or themselves) as an object seems preposterous to me. Neither was treating the other as a means, because they both freely chose, desired and worked in concert to achieve the same end.
  • Nor does it seem intuitively obvious that a suffering, terminally ill 90-year-old is regarding himself as a means, or an object, if he prefers to end his life with a lethal injection rather than waiting three months in semi-lucid agony for his lungs to slowly shut down and suffocate him. (Kant thought suicide impermissible.) The terminally ill man isn't denigrating his own worth or the preciousness of life or saying it's permissible "any time" it is difficult. He believes ending his life is permissible only because the end is nigh, and the interim affords no opportunity for "living" in anything except a narrow biological sense.
  • Ultimately, Kant only gets us a little way in this conversation because, outside the realm of sex, he thinks consent goes a long way toward mitigating the means problem, whereas in the realm of sex, not so much. This is inseparable from notions he has about sex that many of us just don't share.
  • two Biblical passages fit my moral intuition even better than Kant. "Love your neighbor as yourself." And "therefore all things whatsoever would that men should do to you, do ye even so to them.
  • "do unto others..." is extremely demanding, hard to live up to, and a very close fit with my moral intuitions.
  • "Do unto others" is also enough to condemn all sorts of porn, and to share all sorts of common ground with Dreher beyond consent. Interesting that it leaves us with so many disagreements too. "Do unto others" is core to my support for gay marriage.
  • Are our bones always to be trusted? The sexual behavior parents would be mortified by is highly variable across time and cultures. So how can I regard it as a credible guide of inherent wrong? Professional football and championship boxing are every bit as violent and far more physically damaging to their participants than that basement scene, yet their cultural familiarity is such that most people don't feel them to be morally suspect. Lots of parents are proud, not mortified, when a son makes the NFL.
  • "Porn operates in fantasy the way boxing and football operate in fantasy. The injuries are quite real." He is, as you can see, uncomfortable with both. Forced at gunpoint to choose which of two events could proceed on a given night, an exact replica of the San Francisco porn shoot or an Ultimate Fighting Championship tournament--if I had to shut one down and grant the other permission to proceed--what would the correct choice be?
  • insofar as there is something morally objectionable here, it's that the audience is taking pleasure in the spectacle of someone being abused, whether that abuse is fact or convincing illusion. Violent sports and violent porn interact with dark impulses in humanity, as their producers well know.
  • If Princess Donna was failing to "do unto others" at all, the audience was arguably who she failed. Would she want others to entertain her by stoking her dark human impulses? Then again, perhaps she is helping to neuter and dissipate them in a harmless way. That's one theory of sports, isn't it? We go to war on the gridiron as a replacement for going to war? And the rise in violent porn has seemed to coincide with falling, not rising, incidence of sexual violence. 
  • On all sorts of moral questions I can articulate confident judgments. But I am confident in neither my intellect nor my gut when it comes to judging Princess Donna, or whether others are transgressing against themselves or "nature" when doing things that I myself wouldn't want to do. Without understanding their mindset, why they find that thing desirable, or what it costs them, if anything, I am loath to declare that it's grounded in depravity or inherently immoral just because it triggers my disgust instinct, especially if the people involved articulate a plausible moral code that they are following, and it even passes a widely held standard like "do unto others."
  • Here's another way to put it. Asked to render moral judgments about sexual behaviors, there are some I would readily label as immoral. (Rape is an extreme example. Showing the topless photo your girlfriend sent to your best friend is a milder one.) But I often choose to hold back and err on the side of not rendering a definitive judgment, knowing that occasionally means I'll fail to label as unethical some things that actually turn out to be morally suspect.
  • Partly I take that approach because, unlike Dreher, I don't see any great value or urgency in the condemnations, and unlike Douthat, I worry more about wrongful stigma than lack of rightful stigmas
  • In a society where notions of sexual morality aren't coercively enforced by the church or the state, what purpose is condemnation serving?
  • People are great! Erring on the side of failing to condemn permits at least the possibility of people from all of these world views engaging in conversation with one another.
  • Dreher worries about the fact that, despite our discomfort, neither Witt nor I can bring ourselves to say that the sexual acts performed during the S.F. porn shoot were definitely wrong. Does that really matter? My interlocutors perhaps see a cost more clearly than me, as well they might. My bias is that just arguing around the fire is elevating.
Javier E

Elliot Ackerman Went From U.S. Marine to Bestselling Novelist - WSJ - 0 views

  • Years before he impressed critics with his first novel, “Green on Blue” (2015), written from the perspective of an Afghan boy, Ackerman was already, in his words, “telling stories and inhabiting the minds of others.” He explains that much of his work as a special-operations officer involved trying to grasp what his adversaries were thinking, to better anticipate how they might act
  • “Look, I really believe in stories, I believe in art, I believe that this is how we express our humanity,” he says. “You can’t understand a society without understanding the stories they tell about themselves, and how these stories are constantly changing.”
  • This, in essence, is the subject of “Halcyon,” in which a scientific breakthrough allows Robert Ableson, a World War II hero and renowned lawyer, to come back from the dead. Yet the 21st-century America he returns to feels like a different place, riven by debates over everything from Civil War monuments to workplace misconduct.
  • ...9 more annotations...
  • The novel probes how nothing in life is fixed, including the legacies of the dead and the stories we tell about our past.
  • “The study of history shouldn’t be backward looking,” explains a historian in “Halcyon.” “To matter, it has to take us forward.”
  • Ackerman was in college on Sept. 11, 2001, but what he remembers more vividly is watching the premiere of the TV miniseries “Band of Brothers” the previous Sunday. “If you wanted to know the zeitgeist in the U.S. at the time, it was this very sentimental view of World War II,” he says. “There was this nostalgia for a time where we’re the good guys, they’re the bad guys, and we’re going to liberate oppressed people.”
  • Ackerman, who also covers wars and veteran affairs as a journalist, says that America’s backing of Ukraine is essential in the face of what he calls “an authoritarian axis rising up in the world, with China, Russia and Iran.” Were the country to offer similar help to Taiwan in the face of an invasion from China, he notes, having some air bases in nearby Afghanistan would help, but the U.S. gave those up in 2021.
  • With Islamic fundamentalists now in control of places where he lost friends, he says he is often asked if he regrets his service. “When you are a young man and your country goes to war, you’re presented with a choice: You either fight or you don’t,” he writes in his 2019 memoir “Places and Names.” “I don’t regret my choice, but maybe I regret being asked to choose.”
  • Serving in the military at a time when wars are no longer generation-defining events has proven alienating for Ackerman. “When you’ve got wars with an all-volunteer military funded through deficit spending, they can go on forever because there are no political costs
  • The catastrophic withdrawal from Afghanistan in 2021, which Ackerman covers in his recent memoir “The Fifth Act,” compounded this moral injury. “The fact that there has been so little government support for our Afghan allies has left it to vets to literally clean this up,” he says, noting that he still fields requests for help on WhatsApp. He adds that unless lawmakers act, the tens of thousands of Afghans currently living in the U.S. on humanitarian parole will be sent back to Taliban-held Afghanistan later this year: “It’s very painful to see how our allies are treated.”
  • Looking back on America’s misadventures in Iraq, Afghanistan and elsewhere, he notes that “the stories we tell about war are really important to the decisions we make around war. It’s one reason why storytelling fills me with a similar sense of purpose.”
  • “We don’t talk about the world and our place in it in a holistic way, or a strategic way,” Ackerman says. “We were telling a story about ending America’s longest war, when the one we should’ve been telling was about repositioning ourselves in a world that’s becoming much more dangerous,” he adds. “Our stories sometimes get us in trouble, and we’re still dealing with that trouble today.”
Javier E

My Mom Believes In QAnon. I've Been Trying To Get Her Out. - 0 views

  • An early adopter of the QAnon mass delusion, on board since 2018, she held firm to the claim that a Satan-worshipping cabal of child sex traffickers controlled the world and the only person standing in their way was Trump. She saw him not merely as a politician but a savior, and she expressed her devotion in stark terms.
  • “The prophets have said Trump is anointed,” she texted me once. “God is using him to finally end the evil doings of the cabal which has hurt humanity all these centuries… We are in a war between good & evil.”
  • By 2020, I’d pretty much given up on swaying my mom away from her preferred presidential candidate. We’d spent many hours arguing over basic facts I considered indisputable. Any information I cited to prove Trump’s cruelty, she cut down with a corresponding counterattack. My links to credible news sources disintegrated against a wall of outlets like One America News Network, Breitbart, and Before It’s News. Any cracks I could find in her positions were instantly undermined by the inconvenient fact that I was, in her words, a member of “the liberal media,” a brainwashed acolyte of the sprawling conspiracy trying to take down her heroic leader.
  • ...20 more annotations...
  • The irony gnawed at me: My entire vocation as an investigative reporter was predicated on being able to reveal truths, and yet I could not even rustle up the evidence to convince my own mother that our 45th president was not, in fact, the hero she believed him to be. Or, for that matter, that John F. Kennedy Jr. was dead. Or that Tom Hanks had not been executed for drinking the blood of children.
  • The theories spun from Q’s messages seemed much easier to disprove. Oprah Winfrey couldn’t have been detained during a wave of deep state arrests because we could still see her conducting live interviews on television. Trump’s 4th of July speech at Mount Rushmore came to an end without John F. Kennedy Jr. revealing he was alive and stepping in as the president’s new running mate. The widespread blackouts that her Patriot friend’s “source from the Pentagon” had warned about failed to materialize. And I could testify firsthand that the CIA had no control over my newsroom’s editorial decisions.
  • “I believe the Holy Spirit led me to the QAnons to discover the truth which is being suppressed,” she texted me. “Otherwise, how would I be able to know the truth if the lamestream media suppresses the truth?”
  • Through the years, I’d battled against conspiracy theories my mom threw at me that were far more formidable than QAnon. I’d been stumped when she asked me to prove that Beyoncé wasn’t an Illuminati member, dumbfounded when research studies I sent her weren’t enough to reach an agreement on vaccine efficacy, and too worn down to say anything more than “that’s not true” when confronted with false allegations of murders committed by prominent politicians.
  • Eventually, I accepted the impasse. It didn’t seem healthy that every conversation we had would devolve into a circuitous debate about which one of us was on the side of the bad guys. So I tried to pick my battles.
  • She regretted not taking politics more seriously when I was younger. I’d grown up blinkered by American privilege, trained to ignore the dirty machinations securing my comforts. My mom had shed that luxury long ago.
  • With no overlap between our filters of reality, I was at a loss for any facts that would actually stick.
  • Meanwhile, she wondered where she’d gone wrong with me
  • But what I had dismissed as damaging inconsistencies turned out to be the core strength of the belief system: It was alive, flexible, sprouting more questions than answers, more clues to study, an investigation playing out in real time, with the fate of the world at stake.
  • The year my mom began falling down QAnon rabbit holes, I turned the age she was when she first arrived in the States. By then, I was no longer sure that America was worth the cost of her migration. When the real estate market collapsed under the weight of Wall Street speculation, she had to sell our house at a steep loss to avoid foreclosure and her budding career as a realtor evaporated. Her near–minimum wage jobs weren’t enough to cover her bills, so her credit card debts rose. She delayed retirement plans because she saw no path to breaking even anytime soon, though she was hopeful that a turnaround was on the horizon. Through the setbacks and detours, she drifted into the arms of the people and beliefs I held most responsible for her troubles.
  • With a fervor I knew was futile, I’d tell my mom she was missing the real conspiracy: The powerful people shaping policy to benefit their own interests, to maintain wealth and white predominance, through tax cuts and voter suppression, were commandeering her support solely by catering to her stance on the one issue she cared most about.
  • The voice my mom trusted most now was Trump’s. Our disagreements were no longer ideological to her but part of a celestial conflict.
  • “I love you but you have to be on the side of good,” she texted me. “Im sad cuz u have become part of the deep state. May God have mercy on you...I pray you will see the truth of the evil agenda and be on the side of Trump.”
  • She likened her fellow Patriots to the early Christians who spread the word of Jesus at the risk of persecution. She often sent me a meme with a caption about “ordinary people who spent countless hours researching, debating, meditating and praying” for the truth to be revealed to them. “Although they were mocked, dismissed and cast off, they knew their souls had agreed long ago to do this work.”
  • Last summer, as my mom marched in a pink MAGA hat amid maskless crowds, and armed extremists stalked racial justice protests, and a disputed election loomed like a time bomb, I entertained my darkest thoughts about the fate of our country. Was there any hope in a democracy without a shared set of basic facts? Had my elders fled one authoritarian regime only for their children to face another? Amid the gloom, I found only a single morsel of solace: My mom was as hopeful as she’d ever been.
  • I wish I could offer some evidence showing that the gulf between us might be narrowing, that my love, persistence, and collection of facts might be enough to draw her back into a reality we share, and that when our wager about the storm comes due in a few months, she’ll realize that the voices she trusts have been lying to her. But I don’t think that will happen
  • What can I do but try to limit the damage? Send my mom movie recommendations to occupy the free time she instead spends on conspiracy research. Shift our conversations to the common ground of cooking recipes and family gossip. Raise objections when her beliefs nudge her toward dangerous decisions.
  • I now understand our debates as marks of the very bond I thought was disintegrating. No matter how far she believes I’ve fallen into the deep state, how hard I fight for the forces of evil, how imminent the grand plan’s rapture, my mom will be there on the other side of the line putting in a good word for me with the angels and saints, trying to save me from damnation. And those are the two realities we live in. ●
Ellie McGinnis

The 50 Greatest Breakthroughs Since the Wheel - James Fallows - The Atlantic - 0 views

  • Some questions you ask because you want the right answer. Others are valuable because no answer is right; the payoff comes from the range of attempts.
  • The payoff here is the diversity of views about the types of historical breakthroughs that matter, along with a striking lack of consensus on whether the long trail of innovation recorded here is now nearing its end.
  • The clearest example of consensus was the first item on the final compilation, the printing press
  • ...27 more annotations...
  • Leslie Berlin, a historian of business at Stanford, organized her nominations not as an overall list but grouped into functional categories.
  • Innovations that expand the human intellect and its creative, expressive, and even moral possibilities.
  • Innovations that are integral to the physical and operating infrastructure of the modern world
  • Innovations that enabled the Industrial Revolution and its successive waves of expanded material output
  • Innovations extending life, to use Leslie Berlin’s term
  • Innovations that allowed real-time communication beyond the range of a single human voice
  • Innovations in the physical movement of people and goods.
  • Organizational breakthroughs that provide the software for people working and living together in increasingly efficient and modern ways
  • Finally, and less prominently than we might have found in 1950 or 1920—and less prominently than I initially expected—we have innovations in killing,
  • Any collection of 50 breakthroughs must exclude 50,000 more.
  • We learn, finally, why technology breeds optimism, which may be the most significant part of this exercise.
  • Popular culture often lionizes the stars of discovery and invention
  • For our era, the major problems that technology has helped cause, and that faster innovation may or may not correct, are environmental, demographic, and socioeconomic.
  • people who have thought deeply about innovation’s sources and effects, like our panelists, were aware of the harm it has done along with the good.
  • “Does innovation raise the wealth of the planet? I believe it does,” John Doerr, who has helped launch Google, Amazon, and other giants of today’s technology, said. “But technology left to its own devices widens rather than narrows the gap between the rich and the poor.”
  • Are today’s statesmen an improvement over those of our grandparents’ era? Today’s level of public debate? Music, architecture, literature, the fine arts—these and other manifestations of world culture continually change, without necessarily improving. Tolstoy and Dostoyevsky, versus whoever is the best-selling author in Moscow right now?
  • The argument that a slowdown might happen, and that it would be harmful if it did, takes three main forms.
  • Some societies have closed themselves off and stopped inventing altogether:
  • By failing to move forward, they inevitably moved backward relative to their rivals and to the environmental and economic threats they faced. If the social and intellectual climate for innovation sours, what has happened before can happen again.
  • visible slowdown in the pace of solutions that technology offers to fundamental problems.
  • a slowdown in, say, crop yields or travel time is part of a general pattern of what economists call diminishing marginal returns. The easy improvements are, quite naturally, the first to be made; whatever comes later is slower and harder.
  • America’s history as a nation happens to coincide with a rare moment in technological history now nearing its end. “There was virtually no economic growth before 1750,” he writes in a recent paper.
  • “We can be concerned about the last 1 percent of an environment for innovation, but that is because we take everything else for granted,” Leslie Berlin told me.
  • This reduction in cost, he says, means that the next decade should be a time of “amazing advances in understanding the genetic basis of disease, with especially powerful implications for cancer.”
  • the very concept of an end to innovation defied everything they understood about human inquiry. “If you look just at the 20th century, the odds against there being any improvement in living standards are enormous,”
  • “Two catastrophic world wars, the Cold War, the Depression, the rise of totalitarianism—it’s been one disaster after another, a sequence that could have been enough to sink us back into barbarism. And yet this past half century has been the fastest-ever time of technological growth. I see no reason why that should be slowing down.”
  • “I am a technological evolutionist,” he said. “I view the universe as a phase-space of things that are possible, and we’re doing a random walk among them. Eventually we are going to fill the space of everything that is possible.”
Javier E

Why Is Finland the Happiest Country on Earth? The Answer Is Complicated. - The New York... - 0 views

  • the United Nations Sustainable Development Solutions Network released its annual World Happiness Report, which rates well-being in countries around the world. For the sixth year in a row, Finland was ranked at the very top.
  • “I wouldn’t say that I consider us very happy,” said Nina Hansen, 58, a high school English teacher from Kokkola, a midsize city on Finland’s west coast. “I’m a little suspicious of that word, actually.”
  • what, supposedly, makes Finland so happy. Our subjects ranged in age from 13 to 88 and represented a variety of genders, sexual orientations, ethnic backgrounds and professions
  • ...21 more annotations...
  • While people praised Finland’s strong social safety net and spoke glowingly of the psychological benefits of nature and the personal joys of sports or music, they also talked about guilt, anxiety and loneliness. Rather than “happy,” they were more likely to characterize Finns as “quite gloomy,” “a little moody” or not given to unnecessary smiling
  • Many also shared concerns about threats to their way of life, including possible gains by a far-right party in the country’s elections in April, the war in Ukraine and a tense relationship with Russia, which could worsen now that Finland is set to join NATO.
  • It turns out even the happiest people in the world aren’t that happy. But they are something more like content.
  • Finns derive satisfaction from leading sustainable lives and perceive financial success as being able to identify and meet basic need
  • “In other words,” he wrote in an email, “when you know what is enough, you are happy.”
  • “‘Happiness,’ sometimes it’s a light word and used like it’s only a smile on a face,” Teemu Kiiski, the chief executive of Finnish Design Shop, said. “But I think that this Nordic happiness is something more foundational.”
  • The conventional wisdom is that it’s easier to be happy in a country like Finland where the government ensures a secure foundation on which to build a fulfilling life and a promising future. But that expectation can also create pressure to live up to the national reputation.
  • “We are very privileged and we know our privilege,” said Clara Paasimaki, 19, one of Ms. Hansen’s students in Kokkola, “so we are also scared to say that we are discontent with anything, because we know that we have it so much better than other people,” especially in non-Nordic countries.
  • “The fact that Finland has been ‘the happiest country on earth’ for six years in a row could start building pressure on people,” he wrote in an email. “If we Finns are all so happy, why am I not happy?
  • Since immigrating from Zimbabwe in 1992, Julia Wilson-Hangasmaa, 59, has come to appreciate the freedom Finland affords people to pursue their dreams without worrying about meeting basic needs
  • “Back in the day when it wasn’t that easy to survive the winter, people had to struggle, and then it’s kind of been passed along the generations,” said Ms. Paasimaki’s classmate Matias From, 18. “Our parents were this way. Our grandparents were this way. Tough and not worrying about everything. Just living life.”
  • The Finnish way of life is summed up in “sisu,” a trait said to be part of the national character. The word roughly translates to “grim determination in the face of hardships,” such as the country’s long winters: Even in adversity, a Finn is expected to persevere, without complaining.
  • When she returns to her home country, she is struck by the “good energy” that comes not from the satisfaction of sisu but from exuberant joy.
  • “What I miss the most, I realize when I enter Zimbabwe, are the smiles,” she said, among “those people who don’t have much, compared to Western standards, but who are rich in spirit.”
  • Many of our subjects cited the abundance of nature as crucial to Finnish happiness: Nearly 75 percent of Finland is covered by forest, and all of it is open to everyone thanks to a law known as “jokamiehen oikeudet,” or “everyman’s right,” that entitles people to roam freely throughout any natural areas, on public or privately owned land.
  • “I enjoy the peace and movement in nature,” said Helina Marjamaa, 66, a former track athlete who represented the country at the 1980 and 1984 Olympic Games. “That’s where I get strength. Birds are singing, snow is melting, and nature is coming to life. It’s just incredibly beautiful.”
  • “I am worried with this level of ignorance we have toward our own environment,” he said, citing endangered species and climate change. The threat, he said, “still doesn’t seem to shift the political thinking.”
  • Born 17 years after Finland won independence from Russia, Eeva Valtonen has watched her homeland transform: from the devastation of World War II through years of rebuilding to a nation held up as an exemplar to the world.
  • “My mother used to say, ‘Remember, the blessing in life is in work, and every work you do, do it well,’” Ms. Valtonen, 88, said. “I think Finnish people have been very much the same way. Everybody did everything together and helped each other.”
  • Maybe it isn’t that Finns are so much happier than everyone else. Maybe it’s that their expectations for contentment are more reasonable, and if they aren’t met, in the spirit of sisu, they persevere.
  • “We don’t whine,” Ms. Eerikainen said. “We just do.”
knudsenlu

How World War II Spurred a Veteran's Ambition - The Atlantic - 0 views

  • My first 18 months of military service were uninspiring. Donning the uniform did not fill me with pride, nor did the experience alter my perspective on life. What basic training had taught me was that the best way to get by was to stay out of sight.
  • Accordingly, my detachment expanded its mission: all males—soldiers and others of military serviceable age, no matter where encountered or whether in uniform—were to be taken prisoner unless they had persuasive evidence of either having been exempted or discharged from military service. Anyone without such proof was considered a potential guerrilla. A sweep of the countryside would yield scores of German “civilians,” among them soldiers who had simply shed their uniforms or party activists suspected of organizing a resistance, usually with cover stories that I had to break. Not only did I become very adept at this task, but it also gave me some great insights into postwar German mentalities—insights that would later inspire me to revisit my views on higher education.
  • Wilke and his family had emigrated from Germany in the early 1920s, and he still spoke German with a genuine Saxon dialect.
  • ...4 more annotations...
  • As soon as I was released from the hospital, I worked my way step-by-step from France back to Berlin, the capital of the four occupying powers. After some stumbling blocks and a job I despised, I found a position as a research analyst in the intelligence branch of the military government’s information-control division.
  • This job turned out to be extremely challenging, but that also made it a real blessing. I wrote drafts on a wide range of political topics, including the identification of potential political leaders not yet recognized, a catalog of the rumors circulating among the population, incidents indicative of how people felt about American troops, and the dominant mood among German youth. We gleaned information from reports compiled by field representatives stationed in roughly a dozen communities throughout the American occupation zone, supplemented with details from German newspapers—and, in my case, with insights based on contacts and conversation I had whenever I passed myself off as a German civilian.
  • That is how my war and post-war service induced me, in the fall of 1947, after an interlude of more than five years, to enroll in the University of Chicago as a freshman. I stayed for six years, left with a Ph.D., and ever since enjoyed a long academic career as a sociologist, first specializing in the military and then studying propaganda and the effect of television on politics.
  • World War II spurred my ambition by teaching me how to navigate the army. Those lessons led me to confront the society I had once known so well, and to study politics and people living in a time of upheaval.
dpittenger

How World War I Shapes U.S. Foreign Policy - The Atlantic - 0 views

  • Americans prefer the sequel: better villains, bigger explosions.” There’s something to that. But if this earlier war has faded from national memory, its aftermath shapes American culture.
  • The question confronting the U.S. in 1917 was the same question that confronted Americans in 1941, and again after World War II, and now again as China rises: Who will shape world order?
  • A struggle against totalitarian dictatorship undertaken in alliance with Joseph Stalin? A fight for freedom that left half of Europe under communist rule? A battle against genocide that ended with the indiscriminate atomic bombing of two Japanese cities? What about the Bengal famine? The internment of Japanese Americans? Racial segregation in the U.S. armed forces?
  • ...4 more annotations...
  • Americans are susceptible to the belief that their country is somehow not a state like other states: It is either something purer and higher, or something unforgivably worse.
  • Self-accusation is as American as self-assertion—and as based on illusions. America’s strength sways world politics even when it is not exerted.
  • At present, too, many worry whether this world is safe for democratic societies challenged by the aggressive and illiberal.
  • A better understanding of history can at least emancipate Americans from the isolationist polemics that caricatured the why and the how of U.S. entry into the First World War
ilanaprincilus06

'Third World' Is An Offensive Term. Here's Why : Goats and Soda : NPR - 0 views

  • When an armed mob stormed the U.S. Capitol and took over the building on Wednesday, many Americans said that's what happens in "Third World" countries.
  • Everyone knows what they meant — countries that are poor, where health care systems are weak, where democracy may not be exactly flourishing.
  • But the very term "Third World" is a problem.
  • ...17 more annotations...
  • "this assumption about people outside of the 'First World' — that they lived really different lives, the assumption they were poor, they should be happy to eat every day. As if we don't have the same value as humans."
  • "I think it's a very antiquated and offensive term."
  • "There is no 'Third World.' There were the oppressed and the oppressors,"
  • The oppressors, he says, often took resources from the countries they colonized
  • Yet as Wednesday's events made clear, "Third World" is often the first term that pops into Westerners' minds when they try to characterize less well-off, troubled countries.
  • The idea of a world divided into three domains dates back to the 1950s when the Cold War was just starting. It was Western capitalism versus Soviet socialism
  • The "First World" consisted of the U.S., Western Europe and their allies. The "Second World" was the so-called communist bloc: the Soviet Union, China, Cuba and friends. The remaining nations, which aligned with neither group, were assigned to the "Third World."
  • "That's the 'Fourth World,' " Farmer says, referring to parts of the United States and other wealthy nations where health and economic problems loom large.
  • Because many countries in the Third World were impoverished, the term came to be used to refer to countries where poverty is rampant, where health care is inadequate, and where democracy does not flourish.
  • Who is to say which part of the world is "first"? Plus, the Soviet Union doesn't even exist anymore.
  • And it's not like the "First World" is the best world in every way. It has pockets of deep urban and rural poverty,
  • "Although the phrase was widely used, it was never clear whether it was a clear category of analysis, or simply a convenient and rather vague label for an imprecise collection of states in the second half of the 20th century and some of the common problems that they faced,"
  • "Being called a 'developing country' gives me a chance to improve." He hopes that one day India will go "a few steps beyond what 'developed countries' have achieved."
  • "I dislike the term 'developing world' because it assumes a hierarchy between countries"
  • "It paints a picture of Western societies as ideal but there are many social problems in these societies as well. It also perpetuates stereotypes about people who come from the so-called 'developing world' as backward, lazy, ignorant, irresponsible."
  • There are extremely wealthy people in poor countries, for example. Kenya has slums and neighborhoods where real estate prices rival any nation. It's part of a growing trend of income inequality around the world, Over notes.
  • So income levels tell you something — but not everything. Over would like to see classifications based on a combination of income and equality.
Javier E

The War in Ukraine Has Unleashed a New Word - The New York Times - 0 views

  • As I read about Irpin, about Bucha, about Trostyanets, of the bodies crushed by tanks, of the bicyclists shot on the street, of the desecrated corpses, there it was, “рашизм,” again and again
  • Grasping its meaning requires crossing differences in alphabet and pronunciation, thinking our way into the experience of a bilingual society at war with a fascist empire.
  • “Рашизм” sounds like “fascism,” but with an “r” sound instead of an “f” at the beginning; it means, roughly, “Russian fascism.”
  • ...19 more annotations...
  • The aggressor in this war keeps trying to push back toward a past as it never happened, toward nonsensical and necrophiliac accounts of history. Russia must conquer Ukraine, Vladimir Putin says, because of a baptism a thousand years ago, or because of bloodshed during World War II.
  • The new word “рашизм” is a useful conceptualization of Putin’s worldview. Far more than Western analysts, Ukrainians have noticed the Russian tilt toward fascism in the last decade.
  • Undistracted by Putin’s operational deployment of genocide talk, they have seen fascist practices in Russia: the cults of the leader and of the dead, the corporatist state, the mythical past, the censorship, the conspiracy theories, the centralized propaganda and now the war of destruction
  • we have tended to overlook the central example of fascism’s revival, which is the Putin regime in the Russian Federation.
  • A bilingual nation like Ukraine is not just a collection of bilingual individuals; it is an unending set of encounters in which people habitually adjust the language they use to other people and new settings, manipulating language in ways that are foreign to monolingual nations
  • I have gone on Ukrainian television and radio, taken questions in Russian and answered them in Ukrainian, without anyone for a moment finding that switch worthy of mention.
  • Ukrainians change languages effortlessly — not just as situations change, but also to make situations change, sometimes in the middle of a sentence, or even in the middle of a word.
  • “Рашизм” is a word built up from the inside, from several languages, as a complex of puns and references that reveal a bilingual society thinking out its predicament and communicating to itself.
  • Putin’s ethnic imperialism insists that Ukrainians must be Russians because they speak Russian. They do — and they speak Ukrainian. But Ukrainian identity has as much to do with an ability to live between languages as it does with the use of any one of them
  • Those six Cyrillic letters contain references to Italian, Russian and English, all of which a mechanical, letter-by-letter transliteration would block
  • The best (if imperfect) way I have found to render “рашизм” from Ukrainian into English is “ruscism”
  • When we see “ruscism” we might guess this word has to do with Russia (“rus”), with politics (“ism”) and with the extreme right (“ascism”) — as, indeed, it does
  • I have had to spell “рашизм” as “ruscism” in English because we need “rus,” with a “u,” to see the reference to Russia. In losing the original Ukrainian “a,” though, we weaken a multilayered reference — because the “a” in “рашизм,” conveniently, allows the Ukrainian word to associate Russia and fascism in a way English cannot.
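The point about what a mechanical, letter-by-letter transliteration would block can be sketched in a few lines of Python. The mapping table below uses standard romanization values for the six Cyrillic letters involved; it is an illustration, not anything from the essay:

```python
# A minimal letter-by-letter Cyrillic-to-Latin transliteration table
# (standard romanization values for the letters in "рашизм"; illustrative only).
TRANSLIT = {
    "р": "r", "а": "a", "ш": "sh", "и": "i", "з": "z", "м": "m",
}

def mechanical(word: str) -> str:
    """Transliterate one letter at a time, with no regard for puns."""
    return "".join(TRANSLIT.get(ch, ch) for ch in word.lower())

# Mechanical transliteration yields "rashizm": the embedded references to
# "Russia" ("rus"), "fascism" and the English "-ism" are all lost, which is
# why Snyder instead rebuilds the word in English as "ruscism".
print(mechanical("рашизм"))
```

Running it shows why no single letter-for-letter spelling can carry all three of the references the Ukrainian original packs into six Cyrillic letters.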
  • If you don’t know either language, you might think that Russian and Ukrainian are very similar. They are pretty close — much as, say, Spanish and Italian are.
  • the semantics are not that close
  • From a Russian perspective, the false friends are legion. There is an elegant four-syllable Ukrainian word that simply means “soon” or “without delay,” but to a Russian it sounds like “not behind the bar.” The Ukrainian word for “cat” sounds like the Russian for “whale,” while the Ukrainian for “female cats” sounds like Russian for “intestines.”
  • Russians do not understand Ukrainian, because they have not learned it. Ukrainians do understand Russian, because they have learned it.
  • Ukrainian soldiers often speak Russian, though they are instructed to use Ukrainian to spot infiltrators and spies. This is a drastic example of a general practice of code-switching.
  • Ukrainians are perfectly capable of writing Russian correctly, but during the war some internet commentators have spelled the occasional Russian word using the Ukrainian writing system, leaving it looking unmoored and pitiable. Writing in Ukrainian, you might spell “освобождение” as “асвобаждение,” the way it is pronounced — a bit of lexicographic alchemy that makes it (and, by extension, Russians) look silly, and mocks the political concepts being used to justify a war. In a larger sense, such efforts are a means of displacing Russia from its central position in regional culture.
runlai_jiang

Taking On Adam Smith (and Karl Marx) - The New York Times - 0 views

  • “This sort of vaccinated me for life against lazy, anticapitalist rhetoric, because when you see these empty shops, you see these people queuing for nothing in the street,” he said, “it became clear to me that we need private property and market institutions, not just for economic efficiency but for personal freedom.”
  • But his disenchantment with communism doesn’t mean that Mr. Piketty has turned his back on the intellectual heritage of Karl Marx, who sought to explain the “iron laws” of capitalism. Like Marx, he is fiercely critical of the economic and social inequalities that untrammeled capitalism produces — and, he concludes, will continue to worsen. “I belong to a generation that never had any temptation with the Communist Party; I was too young for that,” Mr. Piketty said, in
  • In his new book “Capital in the Twenty-First Century” (Harvard University Press), Mr. Piketty, 42, has written a blockbuster, at least in the world of economics. His book punctures earlier assumptions about the benevolence of advanced capitalism and forecasts sharply increasing inequality of wealth in industrialized countries, with deep and deleterious impact on democratic values of justice and fairness.
  • ...7 more annotations...
  • Branko Milanovic, a former economist at the World Bank, called it “one of the watershed books in economic thinking.
  • “Capital in the Twenty-First Century,” with its title echoing Marx’s “Das Kapital,” is meant to be a return to the kind of economic history, of political economy, written by predecessors like Marx and Adam Smith. It is nothing less than a broad effort to understand Western societies and the economic rules that underpin them.
  • he said, are his generation’s “founding experiences”: the collapse of Communism, the economic degradation of Eastern Europe and the first Gulf War, in 1991.
  • Those events motivated him to try to understand a world where economic ideas had such bad consequences. As for the Gulf War, it showed him that “governments can do a lot in terms of redistribution of wealth when they want.” The rapid intervention to fo
  • The reason that postwar economies looked different — that inequality fell — was historical catastrophe. World War I, the Depression and World War II destroyed huge accumulations of private capital, especially in Europe. What the French call “les
  • In 2012 the top 1 percent of American households collected 22.5 percent of the nation’s income, the highest total since 1928. The richest 10 percent of Americans now take a larger slice of the pie than in 1913, at the close of the Gilded Age, owning more than 70 percent of the nation’s wealth. And half of that is owned by the top 1 percent. Mr. Piketty, father of three daughters — 11, 13 and 16 — is no revolutionary. He is a member of no political party, and says he never served as an economic adviser to any politician. He calls himself a pragmatist, who simply follows the data.
  • Net wealth is a better indicator of ability to pay than income alone, he said. “All I’m proposing is to reduce the property tax on half or three-quarters of the population who have very little wealth,” he said. Published a year ago in French, the book is not without critics, especially of Mr. Piketty’s policy prescriptions, which have been called politically naïve. Others point out that some of the increase in capital is because of aging populations and postwar pension plans, which are not necessarily inherited. More criticism is sure to come, and Mr. Piketty says he welcomes it. “I’m certainly looking forward to the debate.”
Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • ...52 more annotations...
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • Lauren Rivera, a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.)
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much-wider-ranging than what we had before
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
Javier E

Liu Cixin's War of the Worlds | The New Yorker - 0 views

  • he briskly dismissed the idea that fiction could serve as commentary on history or on current affairs. “The whole point is to escape the real world!” he said.
  • Chinese tech entrepreneurs discuss the Hobbesian vision of the trilogy as a metaphor for cutthroat competition in the corporate world; other fans include Barack Obama, who met Liu in Beijing two years ago, and Mark Zuckerberg. Liu’s international career has become a source of national pride. In 2015, China’s then Vice-President, Li Yuanchao, invited Liu to Zhongnanhai—an off-limits complex of government accommodation sometimes compared to the Kremlin—to discuss the books and showed Liu his own copies, which were dense with highlights and annotations.
  • In China, one of his stories has been a set text in the gao kao—the notoriously competitive college-entrance exams that determine the fate of ten million pupils annually; another has appeared in the national seventh-grade-curriculum textbook. When a reporter recently challenged Liu to answer the middle-school questions about the “meaning” and the “central themes” of his story, he didn’t get a single one right. “I’m a writer,” he told me, with a shrug.
  • ...20 more annotations...
  • Liu’s tomes—they tend to be tomes—have been translated into more than twenty languages, and the trilogy has sold some eight million copies worldwide. He has won China’s highest honor for science-fiction writing, the Galaxy Award, nine times, and in 2015 he became the first Asian writer to win the Hugo Award, the most prestigious international science-fiction prize
  • Liu believes that this trend signals a deeper shift in the Chinese mind-set—that technological advances have spurred a new excitement about the possibilities of cosmic exploration.
  • Concepts that seemed abstract to others took on, for him, concrete forms; they were like things he could touch, inducing a “druglike euphoria.” Compared with ordinary literature, he came to feel, “the stories of science are far more magnificent, grand, involved, profound, thrilling, strange, terrifying, mysterious, and even emotional
  • Pragmatic choices like this one, or like the decision his grandparents made when their sons were conscripted, recur in his fiction—situations that present equally unconscionable choices on either side of a moral fulcrum
  • The great flourishing of science fiction in the West at the end of the nineteenth century occurred alongside unprecedented technological progress and the proliferation of the popular press—transformations that were fundamental to the development of the genre
  • Joel Martinsen, the translator of the second volume of Liu’s trilogy, sees the series as a continuation of this tradition. “It’s not hard to read parallels between the Trisolarans and imperialist designs on China, driven by hunger for resources and fear of being wiped out,” he told me. Even Liu, unwilling as he is to endorse comparisons between the plot and China’s current face-off with the U.S., did at one point let slip that “the relationship between politics and science fiction cannot be underestimated.”
  • Speculative fiction is the art of imagining alternative worlds, and the same political establishment that permits it to be used as propaganda for the existing regime is also likely to recognize its capacity to interrogate the legitimacy of the status quo.
  • Liu has been criticized for peopling his books with characters who seem like cardboard cutouts installed in magnificent dioramas. Liu readily admits to the charge. “I did not begin writing for love of literature,” he told me. “I did so for love of science.”
  • “The Three-Body Problem” takes its title from an analytical problem in orbital mechanics which has to do with the unpredictable motion of three bodies under mutual gravitational pull. Reading an article about the problem, Liu thought, What if the three bodies were three suns? How would intelligent life on a planet in such a solar system develop? From there, a structure gradually took shape that almost resembles a planetary system, with characters orbiting the central conceit like moons. For better or worse, the characters exist to support the framework of the story rather than to live as individuals on the page.
  • Liu’s imagination is dauntingly capacious, his narratives conceived on a scale that feels, at times, almost hallucinogenic. The time line of the trilogy spans 18,906,450 years, encompassing ancient Egypt, the Qin dynasty, the Byzantine Empire, the Cultural Revolution, the present, and a time eighteen million years in the future
  • The first book is set on Earth, though some of its scenes take place in virtual reality; by the end of the third book, the scope of the action is interstellar and annihilation unfolds across several dimensions. The London Review of Books has called the trilogy “one of the most ambitious works of science fiction ever written.”
  • Although physics furnishes the novels’ premises, it is politics that drives the plots. At every turn, the characters are forced to make brutal calculations in which moral absolutism is pitted against the greater good
  • In Liu’s fictional universe, idealism is fatal and kindness an exorbitant luxury. As one general says in the trilogy, “In a time of war, we can’t afford to be too scrupulous.” Indeed, it is usually when people do not play by the rules of Realpolitik that the most lives are lost.
  • “I know what you are thinking,” he told me with weary clarity. “What about individual liberty and freedom of governance?” He sighed, as if exhausted by a debate going on in his head. “But that’s not what Chinese people care about. For ordinary folks, it’s the cost of health care, real-estate prices, their children’s education. Not democracy.”
  • Liu closed his eyes for a long moment and then said quietly, “This is why I don’t like to talk about subjects like this. The truth is you don’t really—I mean, can’t truly—understand.”
  • Liu explained to me, the existing regime made the most sense for today’s China, because to change it would be to invite chaos. “If China were to transform into a democracy, it would be hell on earth,”
  • It was an opinion entirely consistent with his systems-level view of human societies, just as mine reflected a belief in democracy and individualism as principles to be upheld regardless of outcomes
  • “I cannot escape and leave behind reality, just like I cannot leave behind my shadow. Reality brands each of us with its indelible mark. Every era puts invisible shackles on those who have lived through it, and I can only dance in my chains.
  • Chinese people of his generation were lucky, he said. The changes they had seen were so huge that they now inhabited a world entirely different from that of their childhood. “China is a futuristic country,” he said. “I realized that the world around me became more and more like science fiction, and this process is speeding up.”
  • “We have statues of a few martyrs, but we never—We don’t memorialize those, the individuals.” He took off his glasses and blinked, peering into the wide expanse of green and concrete. “This is how we Chinese have always been,” he said. “When something happens, it passes, and time buries the stories.”