Economics of Good and Evil: The Quest for Economic Meaning from Gilgamesh to Wall Street

  • Instead of self-confident and self-centered answers, the author humbly asks fundamental questions: What is economics? What is its meaning? Where does this new religion, as it is sometimes called, come from? What are its possibilities and its limitations and borders, if there are any? Why are we so dependent on permanent growing of growth and growth of growing of growth? Where did the idea of progress come from, and where is it leading us? Why are so many economic debates accompanied by obsession and fanaticism?
  • The majority of our political parties act with a narrow materialistic focus when, in their programs, they present the economy and finance first; only then, somewhere at the end, do we find culture as something pasted on or as a libation for a couple of madmen.
  • Most of them—consciously or unconsciously—accept and spread the Marxist thesis of the economic base and the spiritual superstructure.
  • He tries to break free of narrow specialization and cross the boundaries between scientific disciplines. Expeditions beyond economics’ borders and its connection to history, philosophy, psychology, and ancient myths are not only refreshing, but necessary for understanding the world of the twenty-first century.
  • Reality is spun from stories, not from material. (Zdeněk Neubauer)
  • “The separation between the history of a science, its philosophy, and the science itself dissolves into thin air, and so does the separation between science and non-science; differences between the scientific and unscientific are vanishing.”
  • Outside of our history, we have nothing more.
  • The study of the history of a certain field is not, as is commonly held, a useless display of its blind alleys or a collection of the field’s trials and errors (until we got it right), but history is the fullest possible scope of study of a menu that the given field can offer.
  • History of thought helps us to get rid of the intellectual brainwashing of the age, to see through the intellectual fashion of the day, and to take a couple of steps back.
  • Almost all of the key concepts by which economics operates, both consciously and unconsciously, have a long history, and their roots extend predominantly outside the range of economics, and often completely beyond that of science.
  • That is the reason for this book: to look for economic thought in ancient myths and, vice versa, to look for myths in today’s economics.
  • Adam Smith, too, believed in the power of stories. As he puts it in The Theory of Moral Sentiments, “the desire of being believed, or the desire of persuading, of leading and directing other people, seems to be one of the strongest of all our natural desires.”
  • “The human mind is built to think in terms of narratives … in turn, much of human motivation comes from living through a story of our lives, a story that we tell to ourselves and that creates a framework of our motivation. Life could be just ‘one damn thing after another’ if it weren’t for such stories. The same is true for confidence in a nation, a company, or an institution. Great leaders are foremost creators of stories.”
  • Contrary to what our textbooks say, economics is predominantly a normative field. Economics not only describes the world but is frequently about how the world should be (it should be effective, we have an ideal of perfect competition, an ideal of high GDP growth with low inflation, the effort to achieve high competitiveness …). To this end, we create models, modern parables.
  • I will try to show that mathematics, models, equations, and statistics are just the tip of the iceberg of economics; that the biggest part of the iceberg of economic knowledge consists of everything else; and that disputes in economics are rather a battle of stories and various metanarratives than anything else.
  • Before it was emancipated as a field, economics lived happily within subsets of philosophy—ethics, for example—miles away from today’s concept of economics as a mathematical-allocative science that views “soft sciences” with a scorn born from positivistic arrogance. But our thousand-year “education” is built on a deeper, broader, and oftentimes more solid base. It is worth knowing about.
  • It is a paradox that a field that primarily studies values wants to be value-free. One more paradox is this: a field that believes in the invisible hand of the market wants to be without mysteries.
  • Is mathematics at the core of economics, or is it just the icing on the cake, the tip of the iceberg of our field’s inquiry?
  • We seek to chart the development of the economic ethos. We ask questions that come before any economic thinking can begin—both philosophically and, to a degree, historically. The area here lies at the very borders of economics—and often beyond. We may refer to this as protoeconomics (to borrow a term from protosociology) or, perhaps more fittingly, metaeconomics (to borrow a term from metaphysics).
  • In this sense, “the study of economics is too narrow and too fragmentary to lead to valid insight, unless complemented and completed by a study of metaeconomics.”17
  • The more important elements of a culture or field of inquiry such as economics are found in fundamental assumptions that adherents of all the various systems within the epoch unconsciously presuppose. Such assumptions appear so obvious that people do not know what they are assuming, because no other way of putting things has ever occurred to them, as the philosopher Alfred Whitehead notes in Adventures of Ideas.
  • I argue that economic questions were with mankind long before Adam Smith. I argue that the search for values in economics did not start with Adam Smith but culminated with him.
  • We should go beyond economics and study what beliefs are “behind the scenes,” ideas that have often become the dominant yet unspoken assumptions in our theories. Economics is surprisingly full of tautologies that economists are predominantly unaware of.
  • I argue that economics should seek, discover, and talk about its own values, although we have been taught that economics is a value-free science. I argue that none of this is true and that there is more religion, myth, and archetype in economics than there is mathematics.
  • In a way, this is a study of the evolution of both homo economicus and, more importantly, the history of the animal spirits within him. This book tries to study the evolution of the rational as well as the emotional and irrational side of human beings.
  • I argue that Adam Smith’s most influential contribution to economics was ethical. His other thoughts had been clearly expressed long before him, whether on specialization or on the principle of the invisible hand of the market. I try to show that the principle of the invisible hand of the market is much more ancient and was developed long before Adam Smith. Traces of it appear even in the Epic of Gilgamesh, Hebrew thought, and Christianity, and it is expressly stated by Aristophanes and Thomas Aquinas.
  • This is not a book on the thorough history of economic thought. The author aims instead to supplement certain chapters on the history of economic thought with a broader perspective and analysis of the influences that often escape the notice of economists and the wider public.
  • Progress (Naturalness and Civilization)
  • The Economy of Good and Evil
  • From his beginnings, man has been marked as a naturally unnatural creature, who for unique reasons surrounds himself with external possessions. Insatiability, both material and spiritual, is a basic human metacharacteristic, one that appears as early as the oldest myths and stories.
  • The Hebrews, with their linear concept of time, and later the Christians gave us the ideal (or amplified the Hebrew ideal) we now embrace. Then the classical economists secularized progress. How did we come to today’s progression of progress, and growth for growth’s sake?
  • The Need for Greed: The History of Consumption and Labor
  • Metamathematics
  • From where did economics get the concept of numbers as the very foundation of the world?
  • All of economics is, in the end, economics of good and evil. It is the telling of stories by people of people to people. Even the most sophisticated mathematical model is, de facto, a story, a parable, our effort to (rationally) grasp the world around us.
  • The idea that we can manage to utilize our natural egoism, and that this evil is good for something, is an ancient philosophical and mythical concept. We will also look into the development of the ethos of homo economicus, the birth of “economic man.”
  • The History of Animal Spirits: Dreams Never Sleep
  • Masters of the Truth
  • Originally, truth was a domain of poems and stories, but today we perceive truth as something much more scientific, mathematical. Where does one go (to shop) for the truth? And who “has the truth” in our epoch?
  • Our animal spirits (something of a counterpart to rationality) are influenced by the archetype of the hero and our concept of what is good.
  • The entire history of ethics has been ruled by an effort to create a formula for the ethical rules of behavior. In the final chapter we will show the tautology of Max Utility, and we will discuss the concept of Max Good.
  • The History of the Invisible Hand of the Market and Homo Economicus
  • We understand “economics” to mean a broader field than just the production, distribution, and consumption of goods and services. We consider economics to be the study of human relations that are sometimes expressible in numbers, a study that deals with tradables, but one that also deals with nontradables (friendship, freedom, efficiency, growth).
  • When we mention economics in this book, we mean the mainstream perception of it, perhaps as best represented by Paul Samuelson.
  • By the term homo economicus, we mean the primary concept of economic anthropology. It comes from the concept of a rational individual, who, led by narrowly egotistical motives, sets out to maximize his benefit.
  • The Epic of Gilgamesh bears witness to the opposite—despite the fact that the first written clay fragments (such as notes and bookkeeping) of our ancestors may have been about business and war, the first written story is mainly about great friendship and adventure.
  • There is no mention of either money or war; for example, not once does anyone in the whole epic sell or purchase something.5 No nation conquers another, and we do not encounter a mention even of the threat of violence.
  • Gilgamesh becomes a hero not only due to his strength, but also due to discoveries and deeds whose importance was in large part economic—directly gaining construction materials by felling the cedar forest, stopping Enkidu from devastating Uruk’s economy, and discovering new desert routes during his expeditions.
  • Even today we live in Gilgamesh’s vision that human relations—and therefore humanity itself—are a disturbance to work and efficiency; that people would perform better if they did not “waste” their time and energy on nonproductive things.
  • It is a story of nature and civilization, of heroism, defiance, and the battle against the gods and evil; an epic about wisdom, immortality, and also futility.
  • As Joseph Stiglitz says, One of the great “tricks” (some say “insights”) of neoclassical economics is to treat labour like any other factor of production. Output is written as a function of inputs—steel, machines, and labour. The mathematics treats labour like any other commodity, lulling one into thinking of labour like an ordinary commodity, such as steel or plastic.
  • But labour is unlike any other commodity. The work environment is of no concern for steel; we do not care about steel’s well-being.16
  • But it is in friendship where—often by the way, as a side product, an externality—ideas and deeds are frequently performed or created that together can altogether change the face of society.19 Friendship can go against an ingrained system in places where an individual does not have the courage to do so himself or herself.
  • Even the earliest cultures were aware of the value of cooperation on the working level—today we call this collegiality, fellowship, or, if you want to use a desecrated term, comradeship. These “lesser relationships” are useful and necessary for society and for companies because work can be done much faster and more effectively if people get along with each other on a human level
  • But true friendship, which becomes one of the central themes of the Epic of Gilgamesh, comes from completely different material than teamwork. Friendship, as C. S. Lewis accurately describes it, is completely uneconomical, unbiological, unnecessary for civilization, and an unneeded relationship
  • Here we have a beautiful example of the power of friendship, one that knows how to transform (or break down) a system and change a person. Enkidu, sent to Gilgamesh as a punishment from the gods, in the end becomes his faithful friend, and together they set out against the gods. Gilgamesh would never have gathered the courage to do something like that on his own—nor would Enkidu.
  • Due to their friendship, Gilgamesh and Enkidu then intend to stand up to the gods themselves and turn a holy tree into mere (construction) material they can handle almost freely, thereby making it a part of the city-construct, part of the building material of civilization, thus “enslaving” that which originally was part of wild nature. This is a beautiful proto-example of the shifting of the borders between the sacred and profane (secular)—and to a certain extent also an early illustration of the idea that nature is there to provide cities and people with raw material and production resources.
  • This started with the Babylonians—rural nature becomes just a supplier of raw materials and resources (and humans the source of human resources). Nature is not the garden in which humans were created and placed, which they should care for and in which they should reside, but becomes a mere reservoir for natural (re)sources.
  • Even today, we often consider the domain of humanity (human relations, love, friendship, beauty, art, etc.) to be unproductive;
  • Both heroes change—each from opposite poles—into humans. In this context, a psychological dimension to the story may be useful: “Enkidu (…) is Gilgamesh’s alter ego, the dark, animal side of his soul, the complement to his restless heart. When Gilgamesh found Enkidu, he changed from a hated tyrant into the protector of his city. (…)
  • To be human seems to be somewhere in between, or both of these two.
  • In this moment of rebirth from an animal to a human state, the world’s oldest preserved epic implicitly hints at something highly important. Here we see what early cultures considered the beginning of civilization. Here is depicted the difference between people and animals or, better, savages. Here the epic quietly describes birth, the awakening of a conscious, civilized human. We are witnesses to the emancipation of humanity from animals.
  • The entire history of culture is dominated by an effort to become as independent as possible from the whims of nature.39 The more developed a civilization is, the more an individual is protected from nature and natural influences and knows how to create around him a constant or controllable environment to his liking.
  • The price we pay for independence from the whims of nature is dependence on our societies and civilizations. The more sophisticated a given society is as a whole, the less its members are able to survive on their own as individuals, without society.
  • The epic captures one of the greatest leaps in the development of the division of labor. Uruk itself is one of the oldest cities of all, and in the epic it reflects a historic step forward in specialization—in the direction of a new social city arrangement. Because of the city wall, people in the city can devote themselves to things other than worrying about their own safety, and they can continue to specialize more deeply.
  • Human life in the city gains a new dimension and suddenly it seems more natural to take up issues going beyond the life span of an individual. “The city wall symbolizes as well as founds the permanence of the city as an institution which will remain forever and give its inhabitants the certainty of unlimited safety, allowing them to start investing with an outlook reaching far beyond the borders of individual life.”
  • The wall around the city of Uruk is, among other things, a symbol of an internal distancing from nature, a symbol of revolts against submission to laws that do not come under the control of man and that man can at most discover and use to his benefit.
  • “The chief thing which the common-sense individual wants is not satisfactions for the wants he had, but more, and better wants.”47
  • If a consumer buys something, theoretically it should rid him of one of his needs—and the aggregate of things he needs should be decreased by one item. In reality, though, the aggregate of “I want to have” expands together with the growing aggregate of “I have.”
  • It can be said that Enkidu was therefore happy in his natural state, because all of his needs were satiated. With people, on the other hand, it appears that the more a person has, the more developed and richer he is, the greater the number of his needs (including the unsaturated ones).
  • In the Old Testament, this relationship is perceived completely differently. Man (humanity) is created in nature, in a garden. Man was supposed to care for the Garden of Eden and live in harmony with nature and the animals. Soon after creation, man walks naked and is not ashamed, de facto the same as the animals. What is characteristic is that man dresses (the natural state of creation itself is not enough for him), and he (literally and figuratively) covers52 himself—in shame after the fall.53
  • Nature is where one goes to hunt, collect crops, or gather the harvest. It is perceived as the saturator of our needs and nothing more. One goes back to the city to sleep and be “human.” On the contrary, evil resides in nature. Humbaba lives in the cedar forest, which also happens to be the reason to completely eradicate it.
  • Symbolically, then, we can view the entire issue from the standpoint of the epic in the following way: Our nature is insufficient, bad, evil, and good (humane) occurs only after emancipation from nature (from naturalness), through culturing and education. Humanity is considered as being in civilization.
  • The city was frequently (at least in older Jewish writings) a symbol of sin, degeneration, and decadence—nonhumanity. The Hebrews were originally a nomadic nation, one that avoided cities. It is no accident that the first important city57 mentioned in the Bible is proud Babylon,58 which God later turns to dust.
  • It is enough, for example, to read the Book of Revelation to see how the vision of paradise developed from the deep Old Testament period, when paradise was a garden. John describes his vision of heaven as a city—paradise is in New Jerusalem, a city where the dimensions of the walls(!) are described in detail, as are the golden streets and gates of pearl.
  • The Hebrews later also chose a king (despite the unanimous opposition of God’s prophets) and settled in cities, where they eventually founded the Lord’s Tabernacle and built a temple for Him. The city of Jerusalem later gained an illustrious position in all of religion.
  • This time, Christianity (as well as the influence of the Greeks) does not consider human naturalness to be an unambiguous good, and it does not have such an idyllic relationship to nature as the Old Testament prophets.
  • If a tendency toward good is not naturally endowed in people, it must be imputed from above through violence or at least the threat of violence.
  • If we were to look at human naturalness as a good, then collective social actions need a much weaker ruling hand. If people themselves have a natural tendency (propensity) toward good, this role does not have to be supplied by the state, ruler, or, if you wish, Leviathan.
  • How does this affect economics?
  • Let us return for the last time to the humanization of the wild Enkidu, a process that, with a bit of imagination, we can perceive as the first seed of the principle of the market’s invisible hand, and therefore as a parallel to one of the central schematics of economic thinking.
  • Sometimes it is better to “harness the devil to the plow” than to fight with him. Instead of summoning up enormous energy in the fight against evil, it is better to use its own energy to reach a goal we desire; setting up a mill on the turbulent river instead of futile efforts to remove the current. This is also how Saint Prokop approached it in one of the oldest Czech legends.
  • Enkidu caused damage and it was impossible to fight against him. But with the help of a trap, a trick, this evil was transformed into something that greatly benefited civilization.
  • By culturing and “domesticating” Enkidu, humanity tamed the uncontrollable wild and chaotic evil
  • Enkidu devastated the doings (the external, outside-the-walls) of the city. But he was later harnessed and fights at the side of civilization against nature, naturalness, the natural state of things.
  • A similar motif appears a thousand years later in the reversal that is well known even to noneconomists as the central idea of economics: the invisible hand of the market.
  • A similar story (reforming something animally wild and uncultivated in civilizational achievement) is used by Thomas Aquinas in his teachings. Several centuries later, this idea is fully emancipated in the hands of Bernard Mandeville and his Fable of the Bees: or, Private Vices, Publick Benefits. The economic and political aspects of this idea are—often incorrectly—ascribed to Adam Smith.
  • Here the individual no longer tries to maximize his goods or profits; what matters instead is writing his name into human memory in the form of heroic acts or deeds.
  • This is a form of immortality, one connected with letters and the cult of the word: “A name and especially a written name survives the body.”77
  • In the second stage, after finding his friend Enkidu, Gilgamesh abandons the wall and sets out beyond the city to maximize heroism. “In his (…) search of immortal life, Gilgamesh …”
  • After this disappointment, he comes to the edge of the sea, where the innkeeper Siduri lives. As a tonic for his sorrow, she offers him the garden of bliss, a sort of hedonistic fortress of carpe diem, where a person comes to terms with his mortality and at least in the course of the end of his life maximizes earthly pleasures, or earthly utility.
  • The hero refuses hedonism in the sense of maximizing terrestrial pleasure and throws himself into things that will exceed his life. In the blink of an eye, the epic turns on its head the entire utility maximization role that mainstream economics has tirelessly tried to sew on people as a part of their nature.81
  • It is simpler to observe the main features of our civilization at a time when the picture was more readable—at a time when our civilization was just being born and was still “half-naked.” In other words, we have tried to dig down to the bedrock of our written civilization;
  • Today we remember Gilgamesh for his story of heroic friendship with Enkidu, not for his wall, which no longer reaches monumental heights.
  • On the eleventh and final tablet, Gilgamesh again loses what he sought. Like Sisyphus, he misses his goal just before the climax.
  • Is there something in it that is valid today? Have we found in Gilgamesh certain archetypes that are in us to this day?
  • The very existence of questions similar to today’s economic ones can be considered as the first observation. The first written considerations of the people of that time were not so different from those today. In other words: The epic is understandable for us, and we can identify with it.
  • We have also been witnesses to the very beginnings of man’s culturing—a great drama based on a liberation and then a distancing from the natural state.
  • Let us take this as a memento in the direction of our restlessness, our inherited dissatisfaction and the volatility connected to it. Considering that they have lasted five thousand years and to this day we find ourselves in harmony with a certain feeling of futility, perhaps these characteristics are inherent in man.
  • Gilgamesh had a wall built that divided the city from wild nature and created a space for the first human culture. Nevertheless, “not even far-reaching works of civilization could satisfy human desire.”
  • Friendship shows us new, unsuspected adventures, gives us the opportunity to leave the wall and to become neither its builder nor its part—to not be another brick in the wall.
  • With the phenomenon of the creation of the city, we have seen how specialization and the accumulation of wealth were born, how holy nature was transformed into a secular supplier of resources, and also how humans’ individualistic ego was emancipated.
  • To change the system, to break down that which is standing and go on an expedition against the gods (to go from naïveté to awakening), requires friendship.
  • For small acts (hunting together, work in a factory), small love is enough: Camaraderie. For great acts, however, great love is necessary, real love: Friendship. Friendship that eludes the economic understanding of quid pro quo. Friendship gives. One friend gives (fully) for the other. That is friendship for life and death,
  • The thought that humanity comes at the expense of efficiency is just as old as humanity itself—as we have shown, subjects without emotion are the ideal of many tyrants.
  • The epic later crushes this idea through the friendship of Gilgamesh and Enkidu. Friendship—the biologically least essential love, which at first sight appears to be unnecessary.
  • The less a civilized, city person is dependent on nature, the more he or she is dependent on the rest of society. Like Enkidu, we have exchanged nature for society; harmony with (incalculable) nature for harmony with (incalculable) man.
  • Is human nature good or evil? To this day these questions are key for economic policy: If we believe that man is evil in his nature, that a person is himself dog-eat-dog (an animal), then the hard hand of a ruler is called for. If we believe that people in and of themselves, in their nature, gravitate toward good, then it is possible to loosen up the reins and live in a society that is more laissez-faire.
  • For a concept of historical progress, for the undeification of heroes, rulers, and nature, mankind had to wait for the Hebrews.
  • Because nature is not undeified, it is beyond consideration to explore it, let alone intervene in it (unless a person was a two-thirds god like Gilgamesh).
  • As early as the “dark” ages, the Jews commonly used economic tools that were in many ways ahead of their time and that later became key elements of the modern economy:
  • They practiced money lending, traded in many assets (…) and especially were engaged in the trading of shares on capital markets, worked in currency exchange, and frequently figured as mediators in financial transactions (…); they functioned as bankers and participated in issues of all possible forms.
  • As regards modern capitalism (as opposed to the ancient and medieval periods) … there are activities in it which are, in certain forms, inherently (and completely necessarily) present—both from an economic and legal standpoint.7
  • Gilgamesh’s story ends where it began. There is a consistency in this with Greek myths and fables: At the end of the story, no progress occurs, no essential historic change; the story is set in indefinite time, something of a temporal limbo.
  • Jews believe in historical progress, and that progress is in this world.
  • For a nation originally based on nomadism, where did this Jewish business ethos come from? And can the Hebrews truly be considered as the architects of the values that set the direction of our civilization’s economic thought?
  • Hebrew religiosity is therefore strongly connected with this world, not with any abstract world, and those who take pleasure in worldly possessions are not a priori doing anything wrong.
  • PROGRESS: A SECULARIZED RELIGION
  • One of the things the writers of the Old Testament gave to mankind is the idea and notion of progress. The Old Testament stories have their development; they change the history of the Jewish nation and tie in to each other. The Jewish understanding of time is linear—it has a beginning and an end.
  • The observance of God’s Commandments in Judaism leads not to some ethereal other world, but to an abundance of material goods (Genesis 49:25–26, Leviticus 26:3–13, Deuteronomy 28:1–13) (…) There are no accusing fingers pointed at …
  • There are no echoes of asceticism, nor of the cleansing and spiritual effect of poverty. It is fitting, therefore, that the founders of Judaism, the Patriarchs Abraham, Isaac and Jacob, were all wealthy men.12
  • This came about due to a linear understanding of history. If history has a beginning as well as an end, and they are not the same point, then exploration suddenly makes sense in areas where the fruits are borne only in the next generation.
  • What’s more, economic progress has almost become an assumption of modern functional societies. We expect growth. We take it for granted. Today, if nothing “new” happens, if GDP does not grow (we say it stagnates) for several quarters, we consider it an anomaly.
  • However, the idea of progress itself underwent major changes, and today we perceive it very differently. As opposed to the original spiritual conceptions, today we perceive progress almost exclusively in an economic or scientific-technological sense.
  • Because care for the soul has today been replaced by care for external things,
  • This is why we must constantly grow, because we (deep down and often implicitly) believe that we are headed toward an (economic) paradise on Earth.
  • Only since the period of scientific-technological revolution (and at a time when economics was born as an independent field) is material progress automatically assumed.
  • Jewish thought is the most grounded, most realistic school of thought of all those that have influenced our culture.17 An abstract world of ideas was unknown to the Jews. To this day it is still forbidden to even depict God, people, and animals in symbols, paintings, statues, and drawings.
  • Economists have become key figures of great importance in our time (Kacířské eseje o filosofii dějin [Heretical Essays in the Philosophy of History]). They are expected to perform interpretations of reality, give prophetic services (macroeconomic forecasts), reshape reality (mitigate the impacts of the crisis, speed up growth), and, in the long run, provide leadership on the way to the Promised Land—paradise on Earth.
  • REALISM AND ANTIASCETICISM
  • Aside from ideas of progress, the Hebrews brought another very fundamental contribution to our culture: the desacralization of heroes, nature, and rulers.
  • Voltaire writes: “It is a certain fact, that in his public laws he [Moses] never so much as once made mention of a life to come, limiting all punishments and all rewards to the present life.”21
  • As opposed to Christianity, the concept of an extraterrestrial paradise or heaven was not developed much in Hebrew thought.19 The paradise of the Israelites—Eden—was originally placed on Earth at a given place in Mesopotamia20 and at a given time,
  • The Hebrews consider the world to be real—not just a shadow reflection of a better world somewhere in the cloud of ideas, something the usual interpretation of history ascribes to Plato. The soul does not struggle against the body and is not its prisoner, as Augustine would write later.
  • The land, the world, the body, and material reality are for Jews the paramount setting for divine history, the pinnacle of creation. This idea is the conditio sine qua non of the development of economics, something of an utterly earthly making,
  • THE HERO AND HIS UNDEIFICATION: THE DREAM NEVER SLEEPS
  • The concept of the hero is more important than it might appear. It may be the remote origin of Keynes’s animal spirits, or the desire to follow a kind of internal archetype that a given individual accepts as his own and that society values.
  • This internal animator of ours, our internal mover, this dream, never sleeps and it influences our behavior—including economic behavior—more than we want to realize.
  • Each of us probably has a sort of “hero within”—a kind of internal role-model, template, an example that we (knowingly or not) follow. It is very important what kind of archetype it is, because its role is dominantly irrational and changes depending on time and the given civilization.
  • Anthropology knows several archetypes of heroes. The Polish-born American anthropologist Paul Radin examined the myths of North American Indians and, for example, in his most influential book, The Trickster, he describes their four basic archetypes of heroes.
  • The oldest was the so-called Trickster—a fraudster; then the culture bearer—Rabbit; the musclebound hero called Redhorn; and finally the most developed form of hero: the Twins.
  • The mythology of the hero-king was strongly developed in that period, which Claire Lalouette summarizes into these basic characteristics: beauty (a perfect face, on which it is “pleasant to look upon”; “beauty,” expressed in the Egyptian word nefer, not only means aesthetics but contains moral qualities as well), manliness and strength,28 knowledge and intelligence,29 wisdom and understanding, vigilance and performance, fame and renown (fame which overcomes enemies because “a thousand men would not be able to stand firmly in his presence”);30 the hero is a good shepherd (who takes care of his subordinates), is a copper-clad rampart, the shield of the land, and the defender of heroes.
  • The Egyptian ruler, just as the Sumerian, was partly a god, or the son of a god.31
  • The Torah’s heroes (if that term can be used at all) frequently make mistakes, and their mistakes are carefully recorded in the Bible—maybe precisely so that none of them could be deified.32
  • We do not have to go far for examples. Noah gets so drunk he becomes a disgrace; Lot lets his own daughters seduce him in a similar state of drunkenness. Abraham lies and (repeatedly) tries to sell his wife as a concubine.
  • Jacob defrauds his father Isaac and steals his brother Esau’s blessing of the firstborn. Moses murders an Egyptian. King David seduces the wife of his military commander and then has him killed. In his old age, King Solomon turns to pagan idols, and so on.
  • The Hebrew heroes correspond most to the Tricksters, the Culture Bearers, and the Twins. The divine muscleman, that dominant symbol we think of when we say hero, is absent here.
  • To a certain extent it can be said that the Hebrews—and later Christianity—added another archetype, the archetype of the heroic Sufferer:35 Job.
  • Undeification, however, does not mean a call to pillage or desecration; man was put here to take care of nature (see the story of the Garden of Eden or the symbolism of the naming of the animals). This protection and care of nature is also related to the idea of progress
  • For the heroes who moved our civilization to where it is today, the heroic archetypes of the cunning trickster, culture bearer, and sufferer are rather more appropriate.
  • The Old Testament strongly emphasizes the undeification of nature.37 Nature is God’s creation, which speaks of divinity but is not the domain of moody gods.
  • This is very important for democratic capitalism, because the Jewish heroic archetype lays the groundwork much better for the development of the later phenomenon of the hero, which better suits life as we know it today. “The heroes laid down their arms and set about trading to become wealthy.”
  • RULERS ARE MERE MEN
  • In a similar historical context, the Old Testament teachings carried out a similar desacralization of rulers, the so-called bearers of economic policy.
  • In an Old Testament context, the pharaoh was a mere man (whom one could disagree with, and who could be resisted!).
  • Ultimately the entire idea of a political ruler stood against the Lord’s will, which is explicitly presented in the Torah. The Lord unequivocally preferred the judge as the highest form of rule.
  • The needs of future generations will have to be considered; after all humankind are the guardians of God’s world. Waste of natural resources, whether privately owned or nationally owned is forbidden.”39
  • Politics lost its character of divine infallibility, and political issues were subject to questioning. Economic policy could become a subject of examination.
  • THE PRAISE OF ORDER AND WISDOM: MAN AS A PERFECTER OF CREATION
  • The created world has an order of sorts, an order recognizable by us as people, which for the methodology of science and economics is very important, because disorder and chaos are difficult to examine scientifically.43 Faith in some kind of rational and logical order in a system (society, the economy) is a silent assumption of any (economic) examination.
  • God first creates with the word and then on individual days He divides light from darkness, water from dry land, day from night, and so forth—and He gives order to things.45 The world is created orderly—it is wisely, reasonably put together. The way of the world is put together at least partially46 decipherably by any other wise and reasonable being who honors rational rules.
  • From the very beginning, when God distances Himself from the entire idea, there is an anticipation that there is nothing holy, let alone divine, in politics. Rulers make mistakes, and it is possible to subject them to tough criticism—which frequently occurs indiscriminately through the prophets in the Old Testament.
  • Hebrew culture laid the foundations for the scientific examination of the world.
  • Examining the world is therefore an absolutely legitimate activity, and one that is even requested by God—it is a kind of participation in the Creator’s work.51 Man is called on to understand himself and his surroundings and to use his knowledge for good.
  • “I was there when he set heavens in place, when he marked out the horizon on the face of the deep (…) Then I was the craftsman at his side.”47
  • There are more urgings to gain wisdom in the Old Testament. “Wisdom calls aloud in the street (…): ‘How long will you simple ones love your simple ways?’”49 Or several chapters later: “Wisdom is supreme; therefore get wisdom. Though it cost all you have, get understanding.”50
  • Examination is not forbidden. The fact that order can be grasped by human reason is another unspoken assumption that serves as a cornerstone of any scientific examination.
  • “Now then, my sons, listen to me; blessed are those who keep my ways (…) Blessed is the man who listens to me, watching daily at my doors, waiting at my doorway. For whoever finds me finds life and receives favor from the Lord.”
  • The rational examination of nature has its roots, surprisingly, in religion.
  • The Book of Proverbs emphasizes specifically several times that it was wisdom that was present at the creation of the world. Wisdom personified calls out:
  • “The Lord brought me forth as the first of his works, before his deeds of old. I was appointed from eternity, from the beginning, before the world began. When there were no oceans, I was given birth, when there were no springs abounding with water, before the mountains were settled in place …”
  • MAN AS A FINISHER OF CREATION
  • The creation of the world, as it is explained in Jewish teachings, is described in the Book of Genesis. Here God (i) creates, (ii) separates, and (iii) names [my emphasis]:
  • The last act, the final stroke of the brush of creation, the naming of the animals—this act is given to a human; it is not done by God, as one would expect. Man was given the task of completing the act of creation that the Lord began:
  • Naming is a symbolic expression. In Jewish culture (and also in our culture to this day), the right to name meant sovereign rights and belonged, for example, to explorers (new places), inventors (new principles), or parents (children)—that is, to those who were there at the genesis, at the origin. This right was handed over by God to mankind.
  • The Naming itself (the capital N is appropriate) traditionally belongs to the crowning act of the Creator and represents a kind of grand finale of creation, the last move of the brush to complete the picture—a signature of the master.
  • Without naming, reality does not exist; it is created together with language. Wittgenstein puts this tightly in his Tractatus: the limits of our language are the limits of our world.53
  • How does this relate to economics? Reality itself, our “objective” world, is cocreated; man himself participates in a creation that is constantly being re-created.
  • Our scientific models put the finishing touches on reality, because (1) they interpret, (2) they give phenomena a name, (3) they enable us to classify the world and phenomena according to logical forms, and (4) through these models we de facto perceive reality.
  • When man finds a new linguistic framework or analytical model, or stops using the old one, he molds or remolds reality. Models are only in our heads; they are not “in objective reality.” In this sense, Newton invented (not merely discovered!) gravity.
  • He invented (fictitiously and completely abstractly!) a framework that was generally accepted and soon “made into” reality. Marx invented similarly; he created the notion of class exploitation. Through his idea, the perception of history and reality was changed for a large part of the world for nearly an entire century.
  • Reality is not a given; it is not passive. Perceiving reality and “facts” requires man’s active participation. It is man who must take the last step, an act (…)
  • A real-ization act on our part represents the creation of a construct, the imputation of sense and order (which is beautifully expressed by the biblical act of naming, or categorization, sorting, ordering).
  • Keynes enters into the history of economic thought from the same intellectual cadence; his greatest contribution to economics was precisely the resurrection of the imperceptible—for example in the form of animal spirits or uncertainty. The economist Piero Mini even ascribes Keynes’s doubting and rebellious approach to his almost Talmudic education.63
  • God connects man with the task of guarding and protecting the Garden of Eden, and thus man actually cocreates the cultural landscape. The Czech philosopher Zdeněk Neubauer also describes this: “Such is reality, and it is so deep that it willingly crystallizes into worlds. Therefore I profess that reality is a creation and not a place of occurrence for objectively given phenomena.”61
  • In this viewpoint it is possible to see how Jewish thought is mystical—it admits the role of the incomprehensible. Therefore, through its groundedness, Jewish thought indulges mystery and defends itself against a mechanistic-causal explanation of the world: “The Jewish way of thinking, according to Veblen, emphasizes the spiritual, the miraculous, the intangible.”
  • GOOD AND EVIL IN US: A MORAL EXPLANATION OF WELL-BEING
  • We have seen that in the Epic of Gilgamesh, good and evil are not yet addressed systematically on a moral level.
  • The Sumerians believed in dualism—good and evil deities exist, and the earth of people becomes their passive battlefield.
  • This was not about moral-human evil, but rather a kind of natural evil. It is as if good and evil were not touched by morality at all. Evil simply occurred. Period.
  • In the epic, good and evil are not envisaged morally—they are not the result of an (a)moral act. Evil was not associated with free moral action or individual will.
  • Hebrew thought, on the other hand, deals intensively with moral good and evil. A moral dimension touches the core of its stories.65
  • The Jews believed the exact opposite. The world is created by a good God, and evil appears in it as a result of immoral human acts. Evil, therefore, is induced by man.66 History unwinds according to the morality of human acts.
  • What’s more, history seems to be based on morals; morals seem to be the key determining factors of history. For the Hebrews, history proceeds according to how morally its actors behave.
  • The entire history of the Jewish nation is interpreted and perceived in terms of morality. Morality has become, so to speak, a mover and shaker of Hebrew history.
  • Economists still argue about what lies behind the economic cycle: some see its cause in the discrepancy between savings and investment, others are convinced of its monetary essence, and still others have pointed to sunspots.
  • The Hebrews came up with the idea that morals were behind good and bad years, behind the economic cycle. But we would be getting ahead of ourselves.
  • Pharaoh’s Dream: Joseph and the First Business Cycle
  • It is the Pharaoh’s well-known dream of seven fat and seven lean cows, which he told to Joseph, the son of Jacob. Joseph interpreted the dream as a macroeconomic prediction of sorts: Seven years of abundance were to be followed by seven years of poverty, famine, and misery.
  • Self-Contradicting Prophecy
  • Here, let’s make several observations on this: Through taxation74 at the level of one-fifth of the crop75 in good years, to save the crop and then open the granaries in bad years, the prophecy was de facto prevented (prosperous years were limited and hunger averted—through a predecessor of fiscal stabilization).
  • The Old Testament prophesies therefore were not any deterministic look into the future, but warnings and strategic variations of the possible, which demanded some kind of reaction. If the reaction was adequate, what was prophesied would frequently not occur at all.
  • This principle stands directly against the self-fulfilling prophecy,80 the well-known concept of social science. Certain prophecies become self-fulfilling when expressed (and believed) while others become self-contradicting prophecies when pronounced (and believed).
  • If the threat is anticipated, it is possible to totally or at least partially avoid it. Neither Joseph nor the pharaoh had the power to avoid bounty or crop failure (in this the dream interpretation was true and the appearance of the future mystical), but they avoided the impacts and implications of the prophecy (in this the interpretation of the dream was “false”)—famine did not ultimately occur in Egypt, and this was due to the application of reasonable and very intuitive economic policy.
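The granary mechanism described in the bullets above amounts to a simple consumption-smoothing rule: set aside a fixed share of the harvest in fat years and release the stock in lean years. Below is a minimal illustrative sketch of that logic; the one-fifth levy follows the text, while the harvest figures and the "fat year means above-average harvest" rule are hypothetical choices made only for this example.

```python
# Illustrative consumption smoothing in the spirit of Joseph's granaries.
# The 1/5 levy comes from the text; the harvest numbers and the averaging
# rule are hypothetical simplifications, not biblical data.

LEVY_RATE = 0.2  # one-fifth of the crop stored in good years


def smooth_consumption(harvests, levy_rate=LEVY_RATE):
    """Store a share of each fat-year crop; open the granary in lean years."""
    average = sum(harvests) / len(harvests)
    granary = 0.0
    consumption = []
    for crop in harvests:
        if crop >= average:                       # fat year: store part of the crop
            stored = levy_rate * crop
            granary += stored
            consumption.append(crop - stored)
        else:                                     # lean year: release from the granary
            release = min(granary, average - crop)
            granary -= release
            consumption.append(crop + release)
    return consumption


# Seven fat years followed by seven lean years (stylized numbers).
print(smooth_consumption([100] * 7 + [40] * 7))
```

Even this toy version makes the point of the passage: the prophecy of lean years is not falsified, but its impact (famine) is largely averted by the policy response.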
  • Let us further note that the first “macroeconomic forecast” appears in a dream.
  • Back to the Torah: later in this story we will notice that there is no reason offered as to why the cycle occurs (that will come later). Fat years will simply come, and then lean years after them.
  • Moral Explanation of a Business Cycle
  • That is fundamentally different from later Hebrew interpretations, when the Jewish nation tries to offer reasons why the nation fared well or poorly. And those reasons are moral.
  • If you pay attention to these laws and are careful to follow them, then the Lord your God will keep his covenant of love with you, as he swore to your forefathers. He will love you and bless you and increase your numbers.
  • Only in recent times have some currents of economics again become aware of the importance of morals and trust, in the form of measuring the quality of institutions, the level of justice, business ethics, corruption, and so forth, and examining their influence on the economy.
  • From today’s perspective, we can state that the moral dimension entirely disappeared from economic thought for a long time, especially due to the implementation of Mandeville’s concept of private vices that contrarily support the public welfare
  • Without being timid, we can say this is the first documented attempt to explain the economic cycle. The economic cycle, the explanation of which is to this day a mystery to economists, is explained morally in the Old Testament.
  • But how do we consolidate these two conflicting interpretations of the economic cycle: Can ethics be responsible for it or not? Can we influence reality around us through our acts?
  • It is not within the scope of this book to answer that question; justice will have been done to it if we manage to sketch out the main contours of possible searches for answers.
  • THE ECONOMICS OF GOOD AND EVIL: DOES GOOD PAY OFF?
  • This is probably the most difficult moral question we could ask.
  • Kant, the most important modern thinker in the area of ethics, answers on the contrary that if we carry out a “moral” act on the basis of economic calculus (therefore we carry out a hedonistic consideration; see below) in the expectation of later recompense, its morality is lost. Recompense, according to the strict Kant, annuls ethics.
  • Inquiring about the economics of good and evil, however, is not that easy. Where would Kant’s “moral dimension of ethics” go if ethics paid? If we do good for profit, the question of ethics becomes a mere question of rationality.
  • Job’s friends try to show that he must have sinned in some way and, in doing so, deserved God’s punishment. They are absolutely unable to imagine a situation in which Job, as a righteous man, would suffer without (moral) cause. Nevertheless, Job insists that he deserves no punishment because he has committed no offense: “God has wronged me and drawn his net around me.”94
  • But Job remains righteous, even though it does not pay to do so: Though he slay me, yet will I hope in him.95 And till I die, I will not deny my integrity I will maintain my righteousness and never let go of it; my conscience will not reproach me as long as I live.96
  • He remains righteous, even if his only reward is death. What economic advantage could he have from that?
  • Morals cannot be considered in the economic dimension of productivity and calculus. The role of the Hebrews was to do good, whether it paid off or not. If good (outgoing) is rewarded by incoming goodness, it is a bonus,99 not a reason to do outgoing good. Good and reward do not correlate with each other.
  • This reasoning takes on a dimension of its own in the Old Testament. Good (incoming) has already happened to us. We must do good (outgoing) out of gratitude for the good (incoming) shown to us in the past.
  • So why do good? After all, suffering is the fate of many biblical figures. The answer can only be: For good itself. Good has the power to be its own reward. In this sense, goodness gets its reward, which may or may not take on a material dimension.
  • The Hebrews offered an interesting compromise between the teachings of the Stoics and the Epicureans. We will go into it in detail later, so only briefly here.
  • It is an ethic of constraint. It calls for bounded optimization (optimization with limits). A kind of symbiosis existed between the legitimate search for one’s own utility (or enjoyment of life) and the maintenance of rules, which are not negotiable and which are not subject to optimization.
  • In other words, clear (exogenously given) rules exist that must be observed and cannot be contravened. But within these borders it is absolutely possible, and even recommended, to increase utility (see the schematic sketch below).
  • The mining of enjoyment must not come at the expense of exogenously given rules. “Judaism comes therefore to train or educate the unbounded desire … for wealth, so that market activities and patterns of consumption operate within a God-given morality.”102
  • The Epicureans acted with the goal of maximizing utility without regard for rules (rules developed endogenously, from within the system, computed from that which increased utility—this was one of the main trumps of the Epicurean school; they did not need exogenously given norms, and argued that they could “calculate” ethics (what to do) for every given situation from the situation itself).
  • The Stoics could not seek their enjoyment—or, by another name, utility. They could not in any way look back on it, and in no way could they count on it. They could only live according to rules (the greatest weakness of this school was defending where the exogenously given rules came from and whether they are universal) and take an indifferent stand toward the results of their actions.
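Read with modern eyes, the three positions contrasted in the bullets above differ in where the rules come from and whether utility is maximized at all. The following schematic is an interpretive gloss in standard optimization notation, not the author's own formalism; x stands for a course of action, U for utility, and R for the set of actions allowed by exogenous rules.

```latex
% Interpretive gloss, not the book's notation.
% Epicureans: maximize utility; any "rules" are derived from utility itself.
\text{Epicurean:}\quad \max_{x} \; U(x)

% Stoics: follow the exogenously given rules; utility is not consulted at all.
\text{Stoic:}\quad x \in R, \ \text{with no objective } U(x) \text{ to maximize}

% Hebrew compromise: bounded optimization, i.e. maximize utility only within
% the non-negotiable, exogenously given rules.
\text{Hebrew:}\quad \max_{x \in R} \; U(x)
```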
  • To Love the Law
  • The Jews not only had to observe the law (perhaps the word covenant would be more appropriate), but they were to love it because it was good.
  • Their relationship to the law was not supposed to be one of duty,105 but one of gratitude, love. Hebrews were to do good (outgoing), because goodness (incoming) had already been done to them.
  • This is in stark contrast with today’s legal system, where, naturally, no mention of love or gratefulness exists. But God expects a full internalization of the commandments and their fulfillment with love, not merely out of duty. By no means was this to rest on the cost-benefit analysis so widespread in economics today, which determines when it pays to break the law and when it does not (calculated on the basis of the probability of being caught and the amount of punishment vis-à-vis the possible gain).
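The cost-benefit calculus dismissed in the bullet above can be written as a one-line expected-value comparison. This stylization (in the spirit of the economics-of-crime literature) is added only to make explicit what the text argues the Torah does not ask for:

```latex
% Stylized law-breaking calculus (a gloss, not the book's formula):
% G = gain from the violation, p = probability of being caught, F = punishment.
\text{break the rule} \iff G > p \cdot F
```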
  • And now, O Israel, what does the Lord your God ask of you but to fear the Lord your God, to walk in all his ways, to love him, to serve the Lord your God with all your heart and with all your soul, and to observe the Lord’s commands and decrees that I am giving you today for your own good? To the Lord your God belong the heavens, even the highest heavens, the earth and everything in it. Yet the Lord set his affection on your forefathers and loved them….
  • the principle of doing good (outgoing) on the basis of a priori demonstrated good (incoming) was also taken over by the New Testament. Atonement itself is based on an a priori principle; all our acts are preceded by good.
  • The Hebrews, originally a nomadic tribe, preferred to be unrestrained and grew up in constant freedom of motion.
  • Human laws, if they are in conflict with the responsibilities given by God, are subordinate to personal responsibility, and a Jew cannot simply join the majority, even if it is legally allowed. Ethics, the concept of good, is therefore always superior to all local laws, rules, and customs:
  • THE SHACKLES OF THE CITY
  • Owing to the Hebrews’ liberation from Egyptian slavery, freedom and responsibility become the key values of Jewish thought.
  • Laws given by God are binding for Jews, and God is the absolute source of all values,
  • The Hebrew ideal is represented by the paradise of the Garden of Eden, not a city.116 The despised city civilization or the tendency to see in it a sinful and shackling way of life appears in glimpses and allusions in many places in the Old Testament.
  • The nomadic Jewish ethos is frequently derived from Abraham, who left the Chaldean city of Ur on the basis of a command:
  • In addition, they were aware of a thin two-way line between owner and owned. We own material assets, but—to a certain extent—they own us and tie us down. Once we become used to a certain material (…)
  • This way of life had understandably immense economic impacts. First, such a society lived in much more connected relationships, where there was no doubt that everyone mutually depended on each other. Second, their frequent wanderings meant the inability to own more than they could carry; the gathering up of material assets did not have great weight—precisely because the physical weight (mass) of things was tied to one place.
  • One of Moses’s greatest deeds was that he managed to explain to his nation once and for all that it is better to remain hungry and liberated than to be a slave with food “at no cost.”
  • SOCIAL WELFARE: NOT TO ACT IN THE MANNER OF SODOM
  • A set of social regulations is developed in the Old Testament, one we hardly find in any other nation of the time. In Hebrew teachings, aside from individual utility, indications of the concept of maximizing utility societywide appear for the first time, as embodied in the Talmudic principle of Kofin al midat S´dom, which can be translated as “one is compelled not to act in the manner of Sodom” and to take care of the weaker members of society.
  • In a jubilee year, debts were to be forgiven,125 and Israelites who fell into slavery due to their indebtedness were to be set free.126
  • Such provisions can be seen as the antimonopoly and social measures of the time. The economic system even then had a clear tendency to converge toward asset concentration, and therefore power as well. It would appear that these provisions were supposed to prevent this process
  • Land at the time could be “sold,” but it was not truly a sale; it was a lease. The price (rent) of real estate depended on how many years remained until the year of forgiveness. Behind this stood the awareness that we may work the land, but in the last instance we are merely “aliens and strangers” to whom the land is only rented for a fixed time. All land and riches came from the Lord.
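A minimal sketch of the pricing logic just described, assuming, as the passage suggests, that a “sale” is in effect a lease of the harvests remaining until the jubilee; the function name and the values are illustrative, not from the book.

```python
# Illustrative only: land is in effect leased for the harvests remaining
# until the next jubilee (roughly every fiftieth year), when it reverts
# to its original family, so its "price" falls as that year approaches.

def land_price(value_of_one_harvest, years_until_jubilee):
    """Price as the value of the remaining harvests."""
    return value_of_one_harvest * years_until_jubilee

print(land_price(value_of_one_harvest=10, years_until_jubilee=45))  # 450
print(land_price(value_of_one_harvest=10, years_until_jubilee=5))   # 50
```

Under this assumption a buyer pays only for the use of the land, never for the land itself, which matches the “aliens and strangers” framing above.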
  • These provisions express a conviction that freedom and inheritance should not be permanently taken away from any Israelite. Last but not least, this system reminds us that no ownership lasts forever and that the fields we plow are not ours but the Lord’s.
  • Glean Another social provision was the right to glean, which in Old Testament times ensured at least basic sustenance for the poorest. Anyone who owned a field had the responsibility not to harvest it to the last grain but to leave the remains in the field for the poor.
  • Tithes and Early Social Net Every Israelite also had the responsibility of setting aside a tithe from their entire crop. They had to be aware of whom all ownership comes from and, by doing so, express their thanks.
  • “Since the community has an obligation to provide food, shelter, and basic economic goods for the needy, it has a moral right and duty to tax its members for this purpose. In line with this duty, it may have to regulate markets, prices and competition, to protect the interests of its weakest members.”135
  • In Judaism, charity is not perceived as a sign of goodness; it is more of a responsibility. Such a society then has the right to regulate its economy in such a way that the responsibility of charity is carried out to its satisfaction.
  • With so many responsibilities, however, comes the difficulty of putting them into practice. Their fulfillment therefore takes place gradually, “in layers,” where it can be done at all. Charitable activities are classified in the Talmud according to several target groups with various priorities, classified, it could be said, according to rules of subsidiarity.
  • Do not mistreat an alien or oppress him, for you were aliens in Egypt.140 As one can see, aside from widows and orphans, the Old Testament also includes immigrants in its area of social protection.141 The Israelites had to have the same rules apply for them as for themselves—they could not discriminate on the basis of their origin.
  • ABSTRACT MONEY, FORBIDDEN INTEREST, AND OUR DEBT AGE If it appears to us that today’s era is based on money and debt, and our time will be written into history as the “Debt age,” then it will certainly be interesting to follow how this development occurred.
  • Money is a social abstractum. It is a social agreement, an unwritten contract.
  • The first money came in the form of clay tablets from Mesopotamia, on which debts were written. These debts were transferable, so the debts became currency. In the end, “It is no coincidence that in English the root of ‘credit’ is ‘credo,’ the Latin for ‘I believe.’”
  • To a certain extent it could be said that credit, or trust, was the first currency. It can materialize, it can be embodied in coins, but what is certain is that “money is not metal,” even the rarest metal, “it is trust inscribed,”
  • Inseparably, with the original credit (money) goes interest. For the Hebrews, the problem of interest was a social issue: “If you lend money to one of my people among you who is needy, do not be like a moneylender; charge him no interest.”
  • there were also clearly set rules on how far one could go in demanding guarantees and in dealing with the nonpayment of debts. No one should become indebted to the extent that they could lose the source of their livelihood:
  • In the end, the term “bank” comes from the Italian banchi, the benches that Jewish lenders sat on.157
  • Money is playing not only its classical roles (as a medium of exchange, a store of value, etc.) but also a much greater, stronger role: It can stimulate, drive (or slow down) the whole economy. Money plays a national economic role.
  • In the course of history, however, the role of loans changed, and the rich borrowed especially for investment purposes,
  • Today the position and significance of money and debt have grown so far and reached such a dominant position in society that operating with debts (fiscal policy) or with interest and the money supply (monetary policy) can, to a certain extent, direct (or at least strongly influence) the whole economy and society.
  • In such a case a ban on interest did not have great ethical significance. Thomas Aquinas, a medieval scholar (1225-1274), reasoned similarly; in his time, the strict ban on lending at usurious interest was loosened, possibly due to his influence.
  • As a form of energy, money can travel in three dimensions, vertically (those who have capital lend to those who do not) and horizontally (speed and freedom in horizontal or geographic motion has become the by-product—or driving force?—of globalization). But money (as opposed to people) can also travel through time.
  • money is something like energy that can travel through time. And it is a very useful energy, but at the same time very dangerous as well. Wherever
  • Aristotle condemned interest162 not only from a moral standpoint, but also for metaphysical reasons. Thomas Aquinas shared the same fear of interest and he too argued that time does not belong to us, and that is why we must not require interest.
  • MONEY AS ENERGY: TIME TRAVEL AND GROSS DEBT PRODUCT (GDP)
  • Due to this characteristic, we can energy-strip the future to the benefit of the present. Debt can transfer energy from the future to the present.163 On the other hand, saving can accumulate energy from the past and send it to the present.
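Purely to restate the metaphor in numbers, here is a minimal sketch using ordinary compound interest; the 5 percent rate and ten-year horizon are invented for illustration and are not a calculation from the book.

```python
# Standard compound interest, used only to restate the metaphor: debt pulls
# purchasing power from the future into the present and must be returned
# with interest; saving pushes it from the past into the present, grown.

def grow(amount, rate, years):
    """Value of an amount after compounding at `rate` for `years` years."""
    return amount * (1 + rate) ** years

repaid_later = grow(100, rate=0.05, years=10)   # borrow 100 now ...
print(round(repaid_later, 2))                   # ... repay 162.89 in ten years

available_now = grow(100, rate=0.05, years=10)  # 100 saved ten years ago ...
print(round(available_now, 2))                  # ... is 162.89 today
```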
  • labor was not considered degrading in the Old Testament. On the contrary, the subjugation of nature is even a mission from God that was originally among man’s very first blessings.
  • LABOR AND REST: THE SABBATH ECONOMY
  • The Jews as well as Aristotle behaved very guardedly toward loans. The issue of interest/usury became one of the first economic debates. Without having an inkling of the future role of economic policy (fiscal and monetary), the ancient Hebrews may have unwittingly felt that they were discovering in interest a very powerful weapon, one that can be a good servant, but (literally) an enslaving master as well.
  • It’s something like a dam. When we build one, we are preventing periods of drought and flooding in the valley; we are limiting nature’s whims and, to a large extent, avoiding its incalculable cycles. Using dams, we can regulate the flow of water to nearly a constant. With it we tame the river (and we can also gain
  • But if we do not regulate the water wisely, it may happen that we would overfill the dam and it would break. For the cities lying in the valley, their end would be worse than if a dam were never there.
  • If man lived in harmony with nature before, now, after the fall, he must fight; nature stands against him and he against it and the animals. From the Garden we have moved unto a (battle)field.
  • Only after man’s fall does labor turn into a curse.168 It could even be said that this is actually the only curse, the curse of the unpleasantness of labor, that the Lord places on Adam.
  • Both Plato and Aristotle consider labor to be necessary for survival, but that only the lower classes should devote themselves to it so that the elites would not have to be bothered with it and so that they could devote themselves to “purely spiritual matters—art, philosophy, and politics.”
  • Work is not only a source of pleasure but also of social standing; it is considered an honor. “Do you see a man skilled in his work? He will serve before kings.”170 None of the surrounding cultures appreciated work as much. The idea of the dignity of labor is unique in the Hebrew tradition.
  • Hebrew thinking is characterized by a strict separation of the sacred from the profane. In life, there are simply areas that are holy, and in which it is not allowed to economize, rationalize, or maximize efficiency.
  • good example is the commandment on the Sabbath. No one at all could work on this day, not even the ones who were subordinate to an observant Jew:
  • the message of the commandment on Saturday communicated that people were not primarily created for labor.
  • Paradoxically, it is precisely this commandment out of all ten that is probably the most violated today.
  • Aristotle even considers labor to be “a corrupted waste of time which only burdens people’s path to true honour.”
  • we have days when we must not toil, connected (at least lexically) with the word meaning emptiness: the English term “vacation” (an emptying), likewise the French les vacances, or the German die Freizeit, meaning open time, free time, but also…
  • Translated into economic language: The meaning of utility is not to increase it permanently but to rest among existing gains. Why do we learn how to constantly increase gains but not how to…
  • This dimension has disappeared from today’s economics. Economic effort has no goal at which it would be possible to rest. Today we only know growth for growth’s sake, and if our company or country prospers, that does not…
  • For six-sevenths of time, be dissatisfied and reshape the world into your own image, man; but for one-seventh you will rest and not change the creation. On the seventh day, enjoy creation and enjoy the work of your hands.
  • the purpose of creation was not just creating but that it had an end, a goal. The process was just a process, not a purpose. The whole of Being was created so…
  • Saturday was not established to increase efficiency. It was a real ontological break that followed the example of the Lord’s seventh day of creation. The Lord did not rest due to tiredness or to regenerate strength; He rested because He was done. He was done with His work, so that He could enjoy it and delight in His creation.
  • If we believe in rest at all today, it is for different reasons. It is the rest of the exhausted machine, the rest of the weak, and the rest of those who can’t handle the tempo. It’s no wonder that the word “rest…
  • Related to this, we have studied the first mention of a business cycle with the pharaoh’s dream as well as seen a first attempt (that we may call…
  • We have tried to show that the quest for a heaven on Earth (similar to the Jewish one) has, in its desacralized form, actually also been the same quest for many of the…
  • We have also seen that the Hebrews tried to explain the business cycle with morality and ethics. For the Hebrews,…
  • ancient Greek economic ethos, we will examine two extreme approaches to laws and rules. While the Stoics considered laws to be absolutely valid, and utility had infinitesimal meaning in their philosophy, the Epicureans, at least in the usual historical explanation, placed utility and pleasure in first place—rules were to be made based on the principle of utility.
  • CONCLUSION: BETWEEN UTILITY AND PRINCIPLE The influence of Jewish thought on the development of market democracy cannot be overestimated. The key heritage for us was the lack of ascetic perception of the world, respect for law and private…
  • We have tried to show how the Torah desacralized three important areas in our lives: the earthly ruler, nature,…
  • What is the relationship between the good and evil that we do (outgoing) and the utility or disutility that we (expect to) get as a reward (incoming)? We have seen…
  • The Hebrews never despised material wealth; on the contrary, the Jewish faith puts great responsibility on property management. Also, the idea of progress and the linear perception of time give our (economic)…
  • the Hebrews managed to find something of a happy compromise between both of these principles.
  • will not be able to completely understand the development of the modern notion of economics without understanding the disputes between the Epicureans and the Stoics;
  • poets actually went even further, and with their speech they shaped and established reality and truth. Honor, adventure, great deeds, and the acclaim connected with them played an important role in the establishment of the true, the real.
  • those who are famous will be remembered by people. They become more real, part of the story, and they start to be “realized,” “made real” in the lives of other people. That which is stored in memory is real; that which is forgotten is as if it never existed.
  • Today’s scientific truth is founded on the notion of exact and objective facts, but poetic truth stands on an interior (emotional) consonance with the story or poem. “It is not addressed first to the brain … [myth] talks directly to the feeling system.”
  • “epic and tragic poets were widely assumed to be the central ethical thinkers and teachers of Greece; nobody thought of their work as less serious, less aimed at truth, than the speculative prose treatises of historians and philosophers.”5 Truth and reality were hidden in speech, stories, and narration.
  • Ancient philosophy, just as science would later, tries to find constancy, constants, quantities, inalterabilities. Science seeks (creates?) order and neglects everything else as much as it can. From their own experience, everyone knows that life is not like that,
  • Just as scientists do today, artists drew images of the world that were representative, and therefore symbolic, picturelike, and simplifying (but thus also misleading), just like scientific models, which often do not strive to be “realistic.”
  • general? In the end, poetry could be more sensitive to the truth than the philosophical method or, later, the scientific method. “Tragic poems, in virtue of their subject matter and their social function, are likely to confront and explore problems about human beings and luck that a philosophical text might be able to omit or avoid.”8
Javier E

12 Rules for Life: An Antidote to Chaos (Jordan B. Peterson) - 0 views

  • RULES? MORE RULES? REALLY? Isn’t life complicated enough, restricting enough, without abstract rules that don’t take our unique, individual situations into account? And given that our brains are plastic, and all develop differently based on our life experiences, why even expect that a few rules might be helpful to us all?
  • “I’ve got some good news…and I’ve got some bad news,” the lawgiver yells to them. “Which do you want first?” “The good news!” the hedonists reply. “I got Him from fifteen commandments down to ten!” “Hallelujah!” cries the unruly crowd. “And the bad?” “Adultery is still in.”
  • Maps of Meaning was sparked by Jordan’s agonized awareness, as a teenager growing up in the midst of the Cold War, that much of mankind seemed on the verge of blowing up the planet to defend their various identities. He felt he had to understand how it could be that people would sacrifice everything for an “identity,”
  • the story of the golden calf also reminds us that without rules we quickly become slaves to our passions—and there’s nothing freeing about that.
  • And the story suggests something more: unchaperoned, and left to our own untutored judgment, we are quick to aim low and worship qualities that are beneath us—in this case, an artificial animal that brings out our own animal instincts in a completely unregulated way.
  • Similarly, in this book Professor Peterson doesn’t just propose his twelve rules, he tells stories, too, bringing to bear his knowledge of many fields as he illustrates and explains why the best rules do not ultimately restrict us but instead facilitate our goals and make for fuller, freer lives.
  • Peterson wasn’t really an “eccentric”; he had sufficient conventional chops, had been a Harvard professor, was a gentleman (as cowboys can be) though he did say damn and bloody a lot, in a rural 1950s sort of way. But everyone listened, with fascination on their faces, because he was in fact addressing questions of concern to everyone at the table.
  • unlike many academics who take the floor and hold it, if someone challenged or corrected him he really seemed to like it. He didn’t rear up and neigh. He’d say, in a kind of folksy way, “Yeah,” and bow his head involuntarily, wag it if he had overlooked something, laughing at himself for overgeneralizing. He appreciated being shown another side of an issue, and it became clear that thinking through a problem was, for him, a dialogic process.
  • for an egghead Peterson was extremely practical. His examples were filled with applications to everyday life: business management, how to make furniture (he made much of his own), designing a simple house, making a room beautiful (now an internet meme) or in another, specific case related to education, creating an online writing project that kept minority students from dropping out of school by getting them to do a kind of psychoanalytic exercise on themselves,
  • These Westerners were different: self-made, unentitled, hands on, neighbourly and less precious than many of their big-city peers, who increasingly spend their lives indoors, manipulating symbols on computers. This cowboy psychologist seemed to care about a thought only if it might, in some way, be helpful to someone.
  • I was drawn to him because here was a clinician who also had given himself a great books education, and who not only loved soulful Russian novels, philosophy and ancient mythology, but who also seemed to treat them as his most treasured inheritance. But he also did illuminating statistical research on personality and temperament, and had studied neuroscience. Though trained as a behaviourist, he was powerfully drawn to psychoanalysis with its focus on dreams, archetypes, the persistence of childhood conflicts in the adult, and the role of defences and rationalization in everyday life. He was also an outlier in being the only member of the research-oriented Department of Psychology at the University of Toronto who also kept a clinical practice.
  • Maps of Meaning, published nearly two decades ago, shows Jordan’s wide-ranging approach to understanding how human beings and the human brain deal with the archetypal situation that arises whenever we, in our daily lives, must face something we do not understand.
  • The brilliance of the book is in his demonstration of how rooted this situation is in evolution, our DNA, our brains and our most ancient stories. And he shows that these stories have survived because they still provide guidance in dealing with uncertainty, and the unavoidable unknown.
  • this is why many of the rules in this book, being based on Maps of Meaning, have an element of universality to them.
  • We are ambivalent about rules, even when we know they are good for us. If we are spirited souls, if we have character, rules seem restrictive, an affront to our sense of agency and our pride in working out our own lives. Why should we be judged according to another’s rule?
  • And he felt he had to understand the ideologies that drove totalitarian regimes to a variant of that same behaviour: killing their own citizens.
  • Ideologies are simple ideas, disguised as science or philosophy, that purport to explain the complexity of the world and offer remedies that will perfect it.
  • Ideologues are people who pretend they know how to “make the world a better place” before they’ve taken care of their own chaos within.
  • Ideologies are substitutes for true knowledge, and ideologues are always dangerous when they come to power, because a simple-minded I-know-it-all approach is no match for the complexity of existence.
  • To understand ideology, Jordan read extensively about not only the Soviet gulag, but also the Holocaust and the rise of Nazism. I had never before met a person, born Christian and of my generation, who was so utterly tormented by what happened in Europe to the Jews, and who had worked so hard to understand how it could have occurred.
  • I saw what now millions have seen online: a brilliant, often dazzling public speaker who was at his best riffing like a jazz artist; at times he resembled an ardent Prairie preacher (not in evangelizing, but in his passion, in his ability to tell stories that convey the life-stakes that go with believing or disbelieving various ideas). Then he’d just as easily switch to do a breathtakingly systematic summary of a series of scientific studies. He was a master at helping students become more reflective, and take themselves and their futures seriously. He taught them to respect many of the greatest books ever written. He gave vivid examples from clinical practice, was (appropriately) self-revealing, even of his own vulnerabilities, and made fascinating links between evolution, the brain and religious stories.
  • Above all, he alerted his students to topics rarely discussed in university, such as the simple fact that all the ancients, from Buddha to the biblical authors, knew what every slightly worn-out adult knows, that life is suffering.
  • chances are, if you or someone you love is not suffering now, they will be within five years, unless you are freakishly lucky. Rearing kids is hard, work is hard, aging, sickness and death are hard, and Jordan emphasized that doing all that totally on your own, without the benefit of a loving relationship, or wisdom, or the psychological insights of the greatest psychologists, only makes it harder.
  • focused on triumphant heroes. In all these triumph stories, the hero has to go into the unknown, into an unexplored territory, and deal with a new great challenge and take great risks. In the process, something of himself has to die, or be given up, so he can be reborn and meet the challenge. This requires courage, something rarely discussed in a psychology class or textbook.
  • Views of Jordan’s first YouTube statements quickly numbered in the hundreds of thousands. But people have kept listening because what he is saying meets a deep and unarticulated need. And that is because alongside our wish to be free of rules, we all search for structure.
  • the first generation to have been so thoroughly taught two seemingly contradictory ideas about morality, simultaneously—at their schools, colleges and universities, by many in my own generation. This contradiction has left them at times disoriented and uncertain, without guidance and, more tragically, deprived of riches they don’t even know exist.
  • morality and the rules associated with it are just a matter of personal opinion or happenstance, “relative to” or “related to” a particular framework, such as one’s ethnicity, one’s upbringing, or the culture or historical…
  • The first idea or teaching is that morality is relative, at best a…
  • So, the decent thing to do—once it becomes apparent how arbitrary your, and your society’s, “moral values” are—is to show tolerance for people who think differently, and…
  • That emphasis on tolerance is so paramount that for many people one of the worst character flaws a person can have is to be “judgmental.”* And, since we don’t know right from wrong, or what is good, just about the most inappropriate thing an…
  • And so a generation has been raised untutored in what was once called, aptly, “practical wisdom,” which guided previous generations. Millennials, often told they have received the finest education available anywhere, have actually…
  • professors, chose to devalue thousands of years of human knowledge about how to acquire virtue, dismissing it as passé, “…
  • They were so successful at it that the very word “virtue” sounds out of date, and someone using it appears…
  • The study of virtue is not quite the same as the study of morals (right and wrong, good and evil). Aristotle defined the virtues simply as the ways of behaving that are most conducive to happiness in life. Vice was…
  • Cultivating judgment about the difference between virtue and vice is the beginning of wisdom, something…
  • By contrast, our modern relativism begins by asserting that making judgments about how to live is impossible, because there is no real good, and no…
  • Thus relativism’s closest approximation to “virtue” is “tolerance.” Only tolerance will provide social cohesion between different groups, and save us from harming each other. On Facebook and other forms of social media, therefore, you signal your so-called…
  • Intolerance of others’ views (no matter how ignorant or incoherent they may be) is not simply wrong; in a world where there is no right or wrong, it is worse: it is a sign you are…
  • But it turns out that many people cannot tolerate the vacuum—the chaos—which is inherent in life, but made worse by this moral relativism; they cannot live without a moral compass,…
  • So, right alongside relativism, we find the spread of nihilism and despair, and also the opposite of moral relativism: the blind certainty offered by ideologies…
  • Dr. Norman Doidge, MD, is the author of The Brain That Changes Itself
  • so we arrive at the second teaching that millennials have been bombarded with. They sign up for a humanities course, to study the greatest books ever written. But they’re not assigned the books; instead they are given…
  • (But the idea that we can easily separate facts and values was and remains naive; to some extent, one’s values determine what one will pay…
  • For the ancients, the discovery that different people have different ideas about how, practically, to live, did not paralyze them; it deepened their understanding of humanity and led to some of the most satisfying conversations human beings have ever had, about how life might be lived.
  • Modern moral relativism has many sources. As we in the West learned more history, we understood that different epochs had different moral codes. As we travelled the seas and explored the globe, we learned of far-flung tribes on different continents whose different moral codes made sense relative to, or within the framework of, their societies. Science played a role, too, by attacking the religious view of the world, and thus undermining the religious grounds for ethics and rules. Materialist social science implied that we could divide the world into facts (which all could observe, and were objective and “real”) and values (…
  • it seems that all human beings are, by some kind of biological endowment, so ineradicably concerned with morality that we create a structure of laws and rules wherever we are. The idea that human life can be free of moral concerns is a fantasy.
  • given that we are moral animals, what must be the effect of our simplistic modern relativism upon us? It means we are hobbling ourselves by pretending to be something we are not. It is a mask, but a strange one, for it mostly deceives the one who wears it.
  • Far better to integrate the best of what we are now learning with the books human beings saw fit to preserve over millennia, and with the stories that have survived, against all odds, time’s tendency to obliterate.
  • these really are rules. And the foremost rule is that you must take responsibility for your own life. Period.
  • Jordan’s message that each individual has ultimate responsibility to bear; that if one wants to live a full life, one first sets one’s own house in order; and only then can one sensibly aim to take on bigger responsibilities.
  • if it’s uncertain that our ideals are attainable, why do we bother reaching in the first place? Because if you don’t reach for them, it is certain you will never feel that your life has meaning.
  • And perhaps because, as unfamiliar and strange as it sounds, in the deepest part of our psyche, we all want to be judged.
  • Instead of despairing about these differences in moral codes, Aristotle argued that though specific rules, laws and customs differed from place to place, what does not differ is that in all places human beings, by their nature, have a proclivity to make rules, laws and customs.
  • Freud never argued (as do some who want all culture to become one huge group therapy session) that one can live one’s entire life without ever making judgments, or without morality. In fact, his point in Civilization and Its Discontents is that civilization only arises when some restraining rules and morality are in place.
  • Aleksandr Solzhenitsyn, the great documenter of the slave-labour-camp horrors of the latter, once wrote that the “pitiful ideology” holding that “human beings are created for happiness” was an ideology “done in by the first blow of the work assigner’s cudgel.”1 In a crisis, the inevitable suffering that life entails can rapidly make a mockery of the idea that happiness is the proper pursuit of the individual. On the radio show, I suggested, instead, that a deeper meaning was required. I noted that the nature of such meaning was constantly re-presented in the great stories of the past, and that it had more to do with developing character in the face of suffering than with happiness.
  • I proposed in Maps of Meaning that the great myths and religious stories of the past, particularly those derived from an earlier, oral tradition, were moral in their intent, rather than descriptive. Thus, they did not concern themselves with what the world was, as a scientist might have it, but with how a human being should act.
  • I suggested that our ancestors portrayed the world as a stage—a drama—instead of a place of objects. I described how I had come
  • to believe that the constituent elements of the world as drama were order and chaos, and not material things.
  • Order is where the people around you act according to well-understood social norms, and remain predictable and cooperative. It’s the world of social structure, explored territory, and familiarity. The state of Order is typically portrayed, symbolically—imaginatively—as masculine.
  • Chaos, by contrast, is where—or when—something unexpected happens.
  • As the antithesis of symbolically masculine order, it’s presented imaginatively as feminine. It’s the new and unpredictable suddenly emerging in the midst of the commonplace familiar. It’s Creation and Destruction,
  • Order is the white, masculine serpent; Chaos, its black, feminine counterpart. The black dot in the white—and the white in the black—indicate the possibility of transformation: just when things seem secure, the unknown can loom, unexpectedly and large. Conversely, just when everything seems lost, new order can emerge from catastrophe and chaos.
  • For the Taoists, meaning is to be found on the border between the ever-entwined pair. To walk that border is to stay on the path of life, the divine Way. And that’s much better than happiness.
  • trying to address a perplexing problem: the reason or reasons for the nuclear standoff of the Cold War. I couldn’t understand how belief systems could be so important to people that they were willing to risk the destruction of the world to protect them. I came to realize that shared belief systems made people intelligible to one another—and that the systems weren’t just about belief.
  • People who live by the same code are rendered mutually predictable to one another. They act in keeping with each other’s expectations and desires. They can cooperate. They can even compete peacefully, because everyone knows what to expect from everyone else.
  • Shared beliefs simplify the world, as well, because people who know what to expect from one another can act together to tame the world. There is perhaps nothing more important than the maintenance of this organization—this simplification. If it’s threatened, the great ship of state rocks.
  • It isn’t precisely that people will fight for what they believe. They will fight, instead, to maintain the match between what they believe, what they expect, and what they desire. They will fight to maintain the match between what they expect and how everyone is acting. It is precisely the maintenance of that match that enables everyone
  • There’s more to it, too. A shared cultural system stabilizes human interaction, but is also a system of value—a hierarchy of value, where some things are given priority and importance and others are not. In the absence of such a system of value, people simply cannot act. In fact, they can’t even perceive, because both action and perception require a goal, and a valid goal is, by necessity, something valued.
  • We experience much of our positive emotion in relation to goals. We are not happy, technically speaking, unless we see ourselves progressing—and the very idea of progression implies value.
  • Worse yet is the fact that the meaning of life without positive value is not simply neutral. Because we are vulnerable and mortal, pain and anxiety are an integral part of human existence. We must have something to set against the suffering that is intrinsic to Being.*2 We must have the meaning inherent in a profound system of value or the horror of existence rapidly becomes paramount. Then, nihilism beckons, with its hopelessness and despair.
  • So: no value, no meaning. Between value systems, however, there is the possibility of conflict. We are thus eternally caught between the most diamantine rock and the hardest of places:
  • loss of group-centred belief renders life chaotic, miserable, intolerable; presence of group-centred belief makes conflict with other groups inevitable.
  • In the West, we have been withdrawing from our tradition-, religion- and even nation-centred cultures, partly to decrease the danger of group conflict. But we are increasingly falling prey to the desperation of meaninglessness, and that is no improvement at all.
  • While writing Maps of Meaning, I was (also) driven by the realization that we can no longer afford conflict—certainly not on the scale of the world conflagrations of the twentieth century.
  • I came to a more complete, personal realization of what the great stories of the past continually insist upon: the centre is occupied by the individual.
  • It is possible to transcend slavish adherence to the group and its doctrines and, simultaneously, to avoid the pitfalls of its opposite extreme, nihilism. It is possible, instead, to find sufficient meaning in individual consciousness and experience.
  • How could the world be freed from the terrible dilemma of conflict, on the one hand, and psychological and social dissolution, on the other? The answer was this: through the elevation and development of the individual, and through the willingness of everyone to shoulder the burden of Being and to take the heroic path. We must each adopt as much responsibility as possible for individual life, society and the world.
  • We must each tell the truth and repair what is in disrepair and break down and recreate what is old and outdated. It is in this manner that we can and must reduce the suffering that poisons the world. It’s asking a lot. It’s asking for everything.
  • the alternative—the horror of authoritarian belief, the chaos of the collapsed state, the tragic catastrophe of the unbridled natural world, the existential angst and weakness of the purposeless
  • individual—is clearly worse.
  • a title: 12 Rules for Life: An Antidote to Chaos. Why did that one rise up above all others? First and foremost, because of its simplicity. It indicates clearly that people need ordering principles, and that chaos otherwise beckons.
  • We require rules, standards, values—alone and together. We’re pack animals, beasts of burden. We must bear a load, to justify our miserable existence. We require routine and tradition. That’s order. Order can become excessive, and that’s not good, but chaos can swamp us, so we drown—and that is also not good. We need to stay on the straight and narrow path.
  • I hope that these rules and their accompanying essays will help people understand what they already know: that the soul of the individual eternally hungers for the heroism of genuine Being, and that the willingness to take on that responsibility is identical to the decision to live a meaningful life.
  • RULE 1   STAND UP STRAIGHT WITH YOUR SHOULDERS BACK
  • Because territory matters, and because the best locales are always in short supply, territory-seeking among animals produces conflict. Conflict, in turn, produces another problem: how to win or lose without the disagreeing parties incurring too great a cost.
  • It’s winner-take-all in the lobster world, just as it is in human societies, where the top 1 percent have as much loot as the bottom 50 percent11—and where the richest eighty-five people have as much as the bottom three and a half billion.
  • This principle is sometimes known as Price’s law, after Derek J. de Solla Price,13 the researcher who discovered its application in science in 1963. It can be modelled using an approximately L-shaped graph, with number of people on the vertical axis, and productivity or resources on the horizontal.
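The approximately L-shaped distribution described here can be illustrated with a small simulation. The sketch below draws from a Pareto distribution with an invented shape parameter; it is not the author’s data, only an illustration of how such concentration behaves.

```python
# Hypothetical simulation of the L-shaped concentration described above:
# sample "productivity" from a heavy-tailed Pareto distribution and check
# how much of the total the top 1% and the top sqrt(N) people account for.
import random

random.seed(0)
N = 100_000
people = sorted((random.paretovariate(1.16) for _ in range(N)), reverse=True)
total = sum(people)

print(f"top 1% share: {sum(people[: N // 100]) / total:.0%}")

sqrt_n = int(N ** 0.5)  # Price's-law-style comparison: sqrt(N) vs. everyone else
print(f"top {sqrt_n} (sqrt of N) share: {sum(people[:sqrt_n]) / total:.0%}")
```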
  • Instead of undertaking the computationally difficult task of identifying the best man, the females outsource the problem to the machine-like calculations of the dominance hierarchy. They let the males fight it out and peel their paramours from the top.
  • The dominant male, with his upright and confident posture, not only gets the prime real estate and easiest access to the best hunting grounds. He also gets all the girls. It is exponentially more worthwhile to be successful, if you are a lobster, and male.
  • dominance hierarchies have been an essentially permanent feature of the environment to which all complex life has adapted. A third of a billion years ago, brains and nervous systems were comparatively simple. Nonetheless, they already had the structure and neurochemistry necessary to process information about status and society. The importance of this fact can hardly be overstated.
  • evolution works, in large part, through variation and natural selection. Variation exists for many reasons, including gene-shuffling (to put it simply) and random mutation. Individuals vary within a species for such reasons. Nature chooses from among them, across time. That theory, as stated, appears to account for the continual alteration of life-forms over the eons.
  • But there’s an additional question lurking under the surface: what exactly is the “nature” in “natural selection”? What exactly is “the environment” to which animals adapt?
  • Nature “selects.” The idea of selects contains implicitly nested within it the idea of fitness. It is “fitness” that is “selected.” Fitness, roughly speaking, is the probability that a given organism will leave offspring (will propagate its genes through time). The “fit” in “fitness” is therefore the matching of organismal attribute to environmental demand.
  • But nature, the selecting agent, is not a static selector—not in any simple sense.
  • As the environment supporting a species transforms and changes, the features that make a given individual successful in surviving and reproducing also transform and change. Thus, the theory of natural selection does not posit creatures matching themselves ever more precisely to a template specified by the world. It is more that creatures are in a dance with nature, albeit one that is deadly.
  • Nature is not simply dynamic, either. Some things change quickly, but they are nested within other things that change less quickly (music
  • It’s chaos, within order, within chaos, within higher order. The order that is most real is the order that is most unchanging—and that is not necessarily the order that is most easily seen. The leaf, when perceived, might blind the observer to the tree. The tree can blind him to the forest.
  • It is also a mistake to conceptualize nature romantically.
  • Unfortunately, “the environment” is also elephantiasis and guinea worms (don’t ask), anopheles mosquitoes and malaria, starvation-level droughts, AIDS and the Black Plague.
  • It is because of the existence of such things, of course, that we attempt to modify our surroundings, protecting our children, building cities and transportation systems and growing food and generating power.
  • this brings us to a third erroneous concept: that nature is something strictly segregated from the cultural constructs that have emerged within it.
  • It does not matter whether that feature is physical and biological, or social and cultural. All that matters, from a Darwinian perspective, is permanence—and the dominance hierarchy, however social or cultural it might appear, has been around for some half a billion years.
  • The dominance hierarchy is not capitalism. It’s not communism, either, for that matter. It’s not the military-industrial complex. It’s not the patriarchy—that disposable, malleable, arbitrary cultural artefact. It’s not even a human creation; not in the most profound sense. It is instead a near-eternal aspect of the environment, and much of what is blamed on these more ephemeral manifestations is a consequence of its unchanging existence.
  • We were struggling for position before we had skin, or hands, or lungs, or bones. There is little more natural than culture. Dominance hierarchies are older than trees.
  • The part of our brain that keeps track of our position in the dominance hierarchy is therefore exceptionally ancient and fundamental.17 It is a master control system, modulating our perceptions, values, emotions, thoughts and actions. It powerfully affects every aspect of our Being, conscious and unconscious alike.
  • The ancient part of your brain specialized for assessing dominance watches how you are treated by other people. On that evidence, it renders a determination of your value and assigns you a status. If you are judged by your peers as of little worth, the counter restricts serotonin availability. That makes you much more physically and psychologically reactive to any circumstance or event that might produce emotion, particularly if it is negative. You need that reactivity. Emergencies are common at the bottom, and you must be ready to survive. Unfortunately, that physical hyper-response, that constant alertness, burns up a lot of precious energy and physical resources.
  • It will leave you far more likely to live, or die, carelessly, for a rare opportunity at pleasure, when it manifests itself. The physical demands of emergency preparedness will wear you down in every way.21
  • If you have a high status, on the other hand, the counter’s cold, pre-reptilian mechanics assume that your niche is secure, productive
  • You can delay gratification, without forgoing it forever. You can afford to be a reliable and thoughtful citizen.
  • Sometimes, however, the counter mechanism can go wrong. Erratic habits of sleeping and eating can interfere with its function. Uncertainty can throw it for a loop. The body, with its various parts, needs to function like a well-rehearsed orchestra. Every system must play its role properly, and at exactly the right time, or noise and chaos ensue. It is for this reason that routine is so necessary. The acts of life we repeat every day need to be automatized. They must be turned into stable and reliable habits, so they lose their complexity and gain predictability and simplicity.
  • It is for such reasons that I always ask my clinical clients first about sleep. Do they wake up in the morning at approximately the time the typical person wakes up, and at the same time every day?
  • The next thing I ask about is breakfast. I counsel my clients to eat a fat and protein-heavy breakfast as soon as possible after they awaken (no simple carbohydrates, no sugars,
  • I have had many clients whose anxiety was reduced to subclinical levels merely because they started to sleep on a predictable schedule and eat breakfast.
  • Other bad habits can also interfere with the counter’s accuracy.
  • There are many systems of interaction between brain, body and social world that can get caught in positive feedback loops. Depressed people, for example, can start feeling useless and burdensome, as well as grief-stricken and pained. This makes them withdraw from contact with friends and family. Then the withdrawal makes them more lonesome and isolated, and more likely to feel useless and burdensome. Then they withdraw more. In this manner, depression spirals and amplifies.
  • If someone is badly hurt at some point in life—traumatized—the dominance counter can transform in a manner that makes additional hurt more rather than less likely. This often happens in the case of people, now adults, who were viciously bullied during childhood or adolescence. They become anxious and easily upset. They shield themselves with a defensive crouch, and avoid the direct eye contact interpretable as a dominance challenge.
  • With their capacity for aggression strait-jacketed within a too-narrow morality, those who are only or merely compassionate and self-sacrificing (and naïve and exploitable) cannot call forth the genuinely righteous and appropriately self-protective anger necessary to defend themselves. If you can bite, you generally don’t have to. When skillfully integrated, the ability to respond with aggression and violence decreases rather than increases the probability that actual aggression will become necessary.
  • Naive, harmless people usually guide their perceptions and actions with a few simple axioms: people are basically good; no one really wants to hurt anyone else; the threat (and, certainly, the use) of force, physical or otherwise, is wrong. These axioms collapse, or worse, in the presence of individuals who are genuinely malevolent.27
  • I have had clients who were terrified into literally years of daily hysterical convulsions by the sheer look of malevolence on their attackers’ faces. Such individuals typically come from hyper-sheltered families, where nothing terrible is allowed to exist, and everything is fairyland wonderful (or else).
  • When the wakening occurs—when once-naïve people recognize in themselves the seeds of evil and monstrosity, and see themselves as dangerous (at least potentially)— their fear decreases. They develop more self-respect. Then, perhaps, they begin to resist oppression. They see that they have the ability to withstand, because they are terrible too. They see they can and must stand up, because they begin to understand how genuinely monstrous they will become, otherwise,
  • There is very little difference between the capacity for mayhem and destruction, integrated, and strength of character. This is one of the most difficult lessons of life.
  • even if you came by your poor posture honestly—even if you were unpopular or bullied at home or in grade school28—it’s not necessarily appropriate now. Circumstances change. If you slump around, with the same bearing that characterizes a defeated lobster, people will assign you a lower status, and the old counter that you share with crustaceans, sitting at the very base of your brain, will assign you a low dominance number.
  • the other, far more optimistic lesson of Price’s law and the Pareto distribution: those who start to have will probably get more.
  • Some of these upwardly moving loops can occur in your own private, subjective space.
  • If you are asked to move the muscles one by one into a position that looks happy, you will report feeling happier. Emotion is partly bodily expression, and can be amplified (or dampened) by that expression.29
  • To stand up straight with your shoulders back is to accept the terrible responsibility of life, with eyes wide open.
  • It means deciding to voluntarily transform the chaos of potential into the realities of habitable order. It means adopting the burden of self-conscious vulnerability, and accepting the end of the unconscious paradise of childhood, where finitude and mortality are only dimly comprehended. It means willingly undertaking the sacrifices necessary to generate a productive and meaningful reality (it means acting to please God, in the ancient language).
  • So, attend carefully to your posture. Quit drooping and hunching around. Speak your mind. Put your desires forward, as if you had a right to them—at least the same right as others. Walk tall and gaze forthrightly ahead. Dare to be dangerous. Encourage the serotonin to flow plentifully through the neural pathways desperate for its calming influence.
  • Thus emboldened, you will embark on the voyage of your life, let your light shine, so to speak, on the heavenly hill, and pursue your rightful destiny. Then the meaning of your life may be sufficient to keep the corrupting influence of mortal despair at bay. Then you may be able to accept the terrible burden of the World, and find joy.
  • RULE 2   TREAT YOURSELF LIKE SOMEONE YOU ARE RESPONSIBLE FOR HELPING
  • People are better at filling and properly administering prescription medication to their pets than to themselves. That
  • It is difficult to conclude anything from this set of facts except that people appear to love their dogs, cats, ferrets and birds (and maybe even their lizards) more than themselves. How horrible is that? How much shame must exist, for something like that to be true? What could it be about people that makes them prefer their pets to themselves?
  • To understand Genesis 1, the Priestly story, with its insistence on speech as the fundamental creative force, it is first necessary to review a few fundamental, ancient assumptions (these are markedly different in type and intent from the assumptions of science, which are, historically speaking, quite novel).
  • those who existed during the distant time in which the foundational epics of our culture emerged were much more concerned with the actions that dictated survival (and with interpreting the world in a manner commensurate with that goal) than with anything approximating what we now understand as objective truth.
  • Before the dawn of the scientific worldview, reality was construed differently. Being was understood as a place of action, not a place of things.31 It was understood as something more akin to story or drama. That story or drama was lived, subjective experience, as it manifested itself moment to moment in the consciousness of every living person.
  • subjective pain. That’s something so real no argument can stand against it. Everyone acts as if their pain is real—ultimately, finally real. Pain matters, more than matter matters. It is for this reason, I believe, that so many of the world’s traditions regard the suffering attendant upon existence as the irreducible truth of Being.
  • In any case, that which we subjectively experience can be likened much more to a novel or a movie than to a scientific description of physical reality.
  • The Domain, Not of Matter, but of What Matters
  • the world of experience has primal constituents, as well. These are the necessary elements whose interactions define drama and fiction. One of these is chaos. Another is order. The third (as there are three) is the process that mediates between the two, which appears identical to what modern people call consciousness.
  • Chaos is the domain of ignorance itself. It’s unexplored territory. Chaos is what extends, eternally and without limit, beyond the boundaries of all states, all ideas, and all disciplines. It’s the foreigner, the stranger, the member of another gang, the rustle in the bushes in the night-time,
  • It is, in short, all those things and situations we neither know nor understand.
  • Chaos is also the formless potential from which the God of Genesis 1 called forth order using language at the beginning of time. It’s the same potential from which we, made in that Image, call forth the novel and ever-changing moments of our lives. And Chaos is freedom, dreadful freedom, too.
  • Order, by contrast, is explored territory. That’s the hundreds-of-millions-of-years-old hierarchy of place, position and authority. That’s the structure of society. It’s the structure provided by biology, too—particularly insofar as you are adapted, as you are, to the structure of society. Order is tribe, religion, hearth, home and country.
  • Order is the public façade we’re called upon to wear, the politeness of a gathering of civilized strangers, and the thin ice on which we all skate. Order is the place where the behavior of the world matches our expectations and our desires; the place where all things turn out the way we want them to.
  • But order is sometimes tyranny and stultification, as well, when the demand for certainty and uniformity and purity becomes too one-sided.
  • In order, we’re able to think about things in the long term. There, things work, and we’re stable, calm and competent. We seldom leave places we
  • understand—geographical or conceptual—for that reason, and we certainly do not like it when we are compelled to or when it happens accidentally.
  • When the same person betrays you, sells you out, you move from the daytime world of clarity and light to the dark underworld of chaos, confusion and despair. That’s the same move you make, and the same place you visit, when the company you work for starts to fail and your job is placed in doubt.
  • Before the Twin Towers fell—that was order. Chaos manifested itself afterward. Everyone felt it. The very air became uncertain. What exactly was it that fell? Wrong question. What exactly remained standing? That was the issue at hand.
  • Chaos is the deep ocean bottom to which Pinocchio voyaged to rescue his father from Monstro, whale and fire-breathing dragon. That journey into darkness and rescue is the most difficult thing a puppet must do, if he wants to be real; if he wants to extract himself from the temptations of deceit and acting and victimization and impulsive pleasure and totalitarian subjugation; if he wants to take his place as a genuine Being in the world.
  • Chaos is the new place and time that emerges when tragedy strikes suddenly, or malevolence reveals its paralyzing visage, even in the confines of your own home. Something unexpected or undesired can always make its appearance, when a plan is being laid out, regardless of how familiar the circumstances.
  • Our brains respond instantly when chaos appears, with simple, hyper-fast circuits maintained from the ancient days, when our ancestors dwelled in trees, and snakes struck in a flash.32 After that nigh-instantaneous, deeply reflexive bodily response comes the later-evolving, more complex but slower responses of emotions—and, after that, comes thinking, of the higher order, which can extend over seconds, minutes or years. All that response is instinctive, in some sense—but the faster the response, the more instinctive.
  • Things or objects are part of the objective world. They’re inanimate; spiritless. They’re dead. This is not true of chaos and order. Those are perceived, experienced and understood (to the degree that they are understood at all) as personalities—and that is just as true of the perceptions, experiences and understanding of modern people as their ancient forebears. It’s just that moderners don’t notice.
  • Perception of things as entities with personality also occurs before perception of things as things. This is particularly true of the action of others,34 living others, but we also see the non-living “objective world” as animated, with purpose and intent.
  • This is because of the operation of what psychologists have called “the hyperactive agency detector” within us.35 We evolved, over millennia, within intensely social circumstances. This means that the most significant elements of our environment of origin were personalities, not things, objects or situations.
  • The personalities we have evolved to perceive have been around, in predictable form, and in typical, hierarchical configurations, forever, for all intents and purposes. They have been…
  • the category of “parent” and/or “child” has been around for 200 million years. That’s longer than birds have existed. That’s longer than flowers have grown. It’s not a billion years, but it’s still a very long time. It’s plenty long enough for male and female and parent and child to serve as vital and fundamental parts of the environment to which we have adapted. This means that male and female and parent and child are…
  • Our brains are deeply social. Other creatures (particularly, other humans) were crucially important to us as we lived, mated and evolved. Those creatures were…
  • From a Darwinian perspective, nature—reality itself; the environment, itself—is what selects. The environment cannot be defined in any more fundamental manner. It is not mere inert matter. Reality itself is whatever we contend with when we are striving to survive and reproduce. A…
  • as our brain capacity increased and we developed curiosity to spare, we became increasingly aware of and curious about the nature of the world—what we eventually conceptualized as the objective…
  • “outside” is not merely unexplored physical territory. Outside is outside of what we currently understand—and understanding is dealing with and coping with…
  • when we first began to perceive the unknown, chaotic, non-animal world, we used categories that had originally evolved to represent the pre-human animal social world. Our minds are far older than mere…
  • Our most…
  • category—as old, in some sense, as the sexual act itself—appears to be that of sex, male and female. We appear to have taken that primordial knowledge of structured, creative opposition and…
  • Order, the known, appears symbolically associated with masculinity (as illustrated in the aforementioned yang of the Taoist yin-yang symbol). This is perhaps because the primary…
  • Chaos—the unknown—is symbolically associated with the feminine. This is partly because all the things we have come to know were born, originally, of the unknown, just as all beings we encounter were born of mothers. Chaos is mater, origin, source, mother; materia, the substance from which all things are made.
  • In its positive guise, chaos is possibility itself, the source of ideas, the mysterious realm of gestation and birth. As a negative force, it’s the impenetrable darkness of a cave and the accident by the side of the road.
  • Chaos, the eternal feminine, is also the crushing force of sexual selection.
  • Most men do not meet female human standards. It is for this reason that women on dating sites rate 85 percent of men as below average in attractiveness.40
  • Women’s proclivity to say no, more than any other force, has shaped our evolution into the creative, industrious, upright, large-brained (competitive, aggressive, domineering) creatures that we are.42 It is Nature as Woman who says, “Well, bucko, you’re good enough for a friend, but my experience of you so far has not indicated the suitability of your genetic material for continued propagation.”
  • Many things begin to fall into place when you begin to consciously understand the world in this manner. It’s as if the knowledge of your body and soul falls into alignment with the knowledge of your intellect.
  • And there’s more: such knowledge is proscriptive, as well as descriptive. This is the kind of knowing what that helps you know how. This is the kind of is from which you can derive an ought. The Taoist juxtaposition of yin and yang, for example, doesn’t simply portray chaos and order as the fundamental elements of Being—it also tells you how to act.
  • The Way, the Taoist path of life, is represented by (or exists on) the border between the twin serpents. The Way is the path of proper Being. It’s the same Way as that referred to by Christ in John 14:6: I am the way, and the truth and the life. The same idea is expressed in Matthew 7:14: Because strait is the gate, and narrow is the way, which leadeth unto life, and few there be that find it.
  • We eternally inhabit order, surrounded by chaos. We eternally occupy known territory, surrounded by the unknown. We experience meaningful engagement when we mediate appropriately between them. We are adapted, in the deepest Darwinian sense, not to the world of objects, but to the meta-realities of order and chaos, yang and yin. Chaos and order make up the eternal, transcendent environment of the living.
  • To straddle that fundamental duality is to be balanced: to have one foot firmly planted in order and security, and the other in chaos, possibility, growth and adventure.
  • Chaos and order are fundamental elements because every lived situation (even every conceivable lived situation) is made up of both.
  • you need to place one foot in what you have mastered and understood and the other in what you are currently exploring and mastering. Then you have positioned yourself where the terror of existence is under control and you are secure, but where you are also alert and engaged. That is where there is something new to master and some way that you can be improved. That is where meaning is to be found.
  • The serpent in Eden therefore means the same thing as the black dot in the yin side of the Taoist yin/yang symbol of totality—that is, the possibility of the unknown and revolutionary suddenly manifesting itself where everything appears calm.
  • The outside, chaos, always sneaks into the inside, because nothing can be completely walled off from the rest of reality. So even the ultimate in safe spaces inevitably harbours a snake.
  • We have seen the enemy, after all, and he is us. The snake inhabits each of our souls.
  • The worst of all possible snakes is the eternal human proclivity for evil. The worst of all possible snakes is psychological, spiritual, personal, internal. No walls, however tall, will keep that out. Even if the fortress were thick enough, in principle, to keep everything bad whatsoever outside, it would immediately appear again within.
  • I have learned that these old stories contain nothing superfluous. Anything accidental—anything that does not serve the plot—has long been forgotten in the telling. As the Russian playwright Anton Chekhov advised, “If there is a rifle hanging on the wall in act one, it must be fired in the next act. Otherwise it has no business being there.”50
  • Eve immediately shares the fruit with Adam. That makes him self-conscious. Little has changed. Women have been making men self-conscious since the beginning of time. They do this primarily by rejecting them—but they also do it by shaming them, if men do not take responsibility. Since women bear the primary burden of reproduction, it’s no wonder. It is very hard to see how it could be otherwise. But the capacity of women to shame men and render them self-conscious is still a primal force of nature.
  • What does it mean to know yourself naked
  • Naked means vulnerable and easily damaged. Naked means subject to judgment for beauty and health. Naked means unprotected and unarmed in the jungle of nature and man. This is why Adam and Eve became ashamed, immediately after their eyes were opened. They could see—and what they first saw was themselves.
  • In their vulnerability, now fully realized, they felt unworthy to stand before God.
  • Beauty shames the ugly. Strength shames the weak. Death shames the living—and the Ideal shames us all.
  • He tells the woman that she will now bring forth children in sorrow, and desire an unworthy, sometimes resentful man, who will in consequence lord her biological fate over her, permanently. What might this mean? It could just mean that God is a patriarchal tyrant, as politically motivated interpretations of the ancient story insist. I think it’s merely descriptive.
  • women pay a high price for pregnancy and child-rearing, particularly in the early stages, and that one of the inevitable consequences is increased dependence upon the sometimes unreliable and always problematic good graces of men.
  • then God banishes the first man and the first woman from Paradise, out of infancy, out of the unconscious animal world, into the horrors of history itself. And then He puts cherubim and a flaming sword at the gate of Eden, just to stop them from eating the Fruit of the Tree of Life.
  • Perhaps Heaven is something you must build, and immortality something you must earn.
  • so we return to our original query: Why would someone buy prescription medication for his dog, and then so carefully administer it, when he would not do the same for himself?
  • Why should anyone take care of anything as naked, ugly, ashamed, frightened, worthless, cowardly, resentful, defensive and accusatory as a descendant of Adam? Even if that thing, that being, is himself?
  • We know how we are naked, and how that nakedness can be exploited—and that means we know how others are naked, and how they can be exploited. We can terrify other people, consciously. We can hurt and humiliate them for faults we understand only too well. We can torture them—literally—slowly, artfully and terribly. That’s far more than predation. That’s a qualitative shift in understanding. That’s a cataclysm as large as the development of self-consciousness itself. That’s the entry of the knowledge of Good and Evil into the world.
  • Only man could conceive of the rack, the iron maiden and the thumbscrew. Only man will inflict suffering for the sake of suffering. That is the best definition of evil I have been able to formulate.
  • with this realization we have well-nigh full legitimization of the idea, very unpopular in modern intellectual circles, of Original Sin.
  • Human beings have a great capacity for wrongdoing. It’s an attribute that is unique in the world of life. We can and do make things worse, voluntarily, with full knowledge of what we are doing (as well as accidentally, and carelessly, and in a manner that is willfully blind). Given that terrible capacity, that proclivity for malevolent actions, is it any wonder we have a hard time taking care of ourselves, or others—or even that we doubt the value of the entire human enterprise?
  • The juxtaposition of Genesis 1 with Genesis 2 & 3 (the latter two chapters outlining the fall of man, describing why our lot is so tragedy-ridden and ethically torturous) produces a narrative sequence almost unbearable in its profundity. The moral of Genesis 1 is that Being brought into existence through true speech is Good.
  • The original Man and Woman, existing in unbroken unity with their Creator, did not appear conscious (and certainly not self-conscious). Their eyes were not open. But, in their perfection, they were also less, not more, than their post-Fall counterparts. Their goodness was something bestowed, rather than deserved or earned.
  • Maybe, even in some cosmic sense (assuming that consciousness itself is a phenomenon of cosmic significance), free choice matters.
  • here’s a proposition: perhaps it is not simply the emergence of self-consciousness and the rise of our moral knowledge of Death and the Fall that besets us and makes us doubt our own worth. Perhaps it is instead our unwillingness—reflected in Adam’s shamed hiding—to walk with God, despite our fragility and propensity for evil.
  • The entire Bible is structured so that everything after the Fall—the history of Israel, the prophets, the coming of Christ—is presented as a remedy for that Fall, a way out of evil. The beginning of conscious history, the rise of the state and all its pathologies of pride and rigidity, the emergence of great moral figures who try to set things right, culminating in the Messiah Himself—that is all part of humanity’s attempt, God willing, to set itself right. And what would that mean?
  • And this is an amazing thing: the answer is already implicit in Genesis 1: to embody the Image of God—to speak out of chaos the Being that is Good—but to do so consciously, of our own free choice.
  • Back is the way forward—as T. S. Eliot so rightly insisted
  • We shall not cease from exploration And the end of all our exploring Will be to arrive where we started And know the place for the first time.
  • If we wish to take care of ourselves properly, we would have to respect ourselves—but we don’t, because we are—not least in our own eyes—fallen creatures.
  • If we lived in Truth; if we spoke the Truth—then we could walk with God once again, and respect ourselves, and others, and the world. Then we might treat ourselves like people we cared for.
  • We might strive to set the world straight. We might orient it toward Heaven, where we would want people we cared for to dwell, instead of Hell, where our resentment and hatred would eternally sentence everyone.
  • Then, the primary moral issue confronting society was control of violent, impulsive selfishness and the mindless greed and brutality that accompanies it.
  • It is easy to believe that people are arrogant, and egotistical, and always looking out for themselves. The cynicism that makes that opinion a universal truism is widespread and fashionable.
  • But such an orientation to the world is not at all characteristic of many people. They have the opposite problem: they shoulder intolerable burdens of self-disgust, self-contempt, shame and self-consciousness. Thus, instead of narcissistically inflating their own importance, they don’t value themselves at all, and they don’t take care of themselves with attention and skill.
  • Christ’s archetypal death exists as an example of how to accept finitude, betrayal and tyranny heroically—how to walk with God despite the tragedy of self-conscious knowledge—and not as a directive to victimize ourselves in the service of others.
  • To sacrifice ourselves to God (to the highest good, if you like) does not mean to suffer silently and willingly when some person or organization demands more from us, consistently, than is offered in return. That means we are supporting tyranny, and allowing ourselves to be treated like slaves.
  • I learned two very important lessons from Carl Jung, the famous Swiss depth psychologist, about “doing unto others as you would have them do unto you” or “loving your neighbour as yourself.”
  • The first lesson was that neither of these statements has anything to do with being nice. The second was that both are equations, rather than injunctions.
  • If I am someone’s friend, family member, or lover, then I am morally obliged to bargain as hard on my own behalf as they are on theirs.
  • there is little difference between standing up and speaking for yourself, when you are being bullied or otherwise tormented and enslaved, and standing up and speaking for someone else.
  • you do not simply belong to yourself. You are not simply your own possession to torture and mistreat. This is partly because your Being is inexorably tied up with that of others, and your mistreatment of yourself can have catastrophic consequences for others.
  • metaphorically speaking, there is also this: you have a spark of the divine in you, which belongs not to you, but to God. We are, after all—according to Genesis—made in His image.
  • We can make order from chaos—and vice versa—in our way, with our words. So, we may not exactly be God, but we’re not exactly nothing, either.
  • In my own periods of darkness, in the underworld of the soul, I find myself frequently overcome and amazed by the ability of people to befriend each other, to love their intimate partners and parents and children, and to do what they must do to keep the machinery of the world running.
  • It is this sympathy that should be the proper medicament for self-conscious self-contempt, which has its justification, but is only half the full and proper story. Hatred for self and mankind must be balanced with gratefulness for tradition and the state and astonishment at what normal, everyday people accomplish
  • You have some vital role to play in the unfolding destiny of the world. You are, therefore, morally obliged to take care of yourself.
  • To treat yourself as if you were someone you are responsible for helping is, instead, to consider what would be truly good for you. This is not “what you want.” It is also not “what would make you happy.”
  • You must help a child become a virtuous, responsible, awake being, capable of full reciprocity—able to take care of himself and others, and to thrive while doing so. Why would you think it acceptable to do anything less for yourself?
  • You need to know who you are, so that you understand your armament and bolster yourself in respect to your limitations. You need to know where you are going, so that you can limit the extent of chaos in your life, restructure order, and bring the divine force of Hope to bear on the world.
  • You need to determine how to act toward yourself so that you are most likely to become and to stay a good person.
  • Don’t underestimate the power of vision and direction. These are irresistible forces, able to transform what might appear to be unconquerable obstacles into traversable pathways and expanding opportunities.
  • Once having understood Hell, researched it, so to speak—particularly your own individual Hell—you could decide against going there or creating that.
  • You could, in fact, devote your life to this. That would give you a Meaning, with a capital M. That would justify your miserable existence.
  • That would atone for your sinful nature, and replace your shame and self-consciousness with the natural pride and forthright confidence of someone who has learned once again to walk with God in the Garden.
  • RULE 3   MAKE FRIENDS WITH PEOPLE WHO WANT THE BEST FOR YOU
  • It would be more romantic, I suppose, to suggest that we would have all jumped at the chance for something more productive, bored out of our skulls as we were. But it’s not true. We were all too prematurely cynical and world-weary and leery of responsibility to stick to the debating clubs and Air Cadets and school sports that the adults around us tried to organize. Doing anything wasn’t cool.
  • When you move, everything is up in the air, at least for a while. It’s stressful, but in the chaos there are new possibilities. People, including you, can’t hem you in with their old notions. You get shaken out of your ruts. You can make new, better ruts, with people aiming at better things. I thought this was just a natural development. I thought that every person who moved would have—and want—the same phoenix-like experience.
  • What was it that made Chris and Carl and Ed unable (or, worse, perhaps, unwilling) to move or to change their friendships and improve the circumstances of their lives? Was it inevitable—a consequence of their own limitations, nascent illnesses and traumas of the past?
  • Why did he—like his cousin, like my other friends—continually choose people who, and places that, were not good for him?
  • perhaps, they don’t want the trouble of better. Freud called this a “repetition compulsion.” He thought of it as an unconscious drive to repeat the horrors of the past
  • People create their worlds with the tools they have directly at hand. Faulty tools produce faulty results. Repeated use of the same faulty tools produces the same faulty results.
  • It is in this manner that those who fail to learn from the past doom themselves to repeat it. It’s partly fate. It’s partly inability. It’s partly…unwillingness to learn? Refusal to learn? Motivated refusal to learn?
  • People choose friends who aren’t good for them for other reasons, too. Sometimes it’s because they want to rescue someone.
  • it is not easy to distinguish between someone truly wanting and needing help and someone who is merely exploiting a willing helper. The distinction is difficult even for the person who is wanting and needing and possibly exploiting.
  • When it’s not just naïveté, the attempt to rescue someone is often fuelled by vanity and narcissism.
  • But Christ himself, you might object, befriended tax-collectors and prostitutes. How dare I cast aspersions on the motives of those who are trying to help? But Christ was the archetypal perfect man. And you’re you.
  • How do you know that your attempts to pull someone up won’t instead bring them—or you—further down?
  • The same thing happens when well-meaning counsellors place a delinquent teen among comparatively civilized peers. The delinquency spreads, not the stability.65 Down is a lot easier than up.
  • maybe you’re saving someone because you want to convince yourself that the strength of your character is more than just a side effect of your luck and birthplace. Or maybe it’s because it’s easier to look virtuous when standing alongside someone utterly irresponsible.
  • Or maybe you have no plan, genuine or otherwise, to rescue anybody. You’re associating with people who are bad for you not because it’s better for anyone, but because it’s easier.
  • You know it. Your friends know it. You’re all bound by an implicit contract—one aimed at nihilism, and failure, and suffering of the stupidest sort.
  • Before you help someone, you should find out why that person is in trouble. You shouldn’t merely assume that he or she is a noble victim of unjust circumstances and exploitation. It’s the most unlikely explanation, not the most probable.
  • Besides, if you buy the story that everything terrible just happened on its own, with no personal responsibility on the part of the victim, you deny that person all agency in the past (and, by implication, in the present and future, as well).
  • It is far more likely that a given individual has just decided to reject the path upward, because of its difficulty. Perhaps that should even be your default assumption, when faced with such a situation.
  • failure is easy to understand. No explanation for its existence is required. In the same manner, fear, hatred, addiction, promiscuity, betrayal and deception require no explanation. It’s not the existence of vice, or the indulgence in it, that requires explanation. Vice is easy.
  • Failure is easy, too. It’s easier not to shoulder a burden. It’s easier not to think, and not to do, and not to care. It’s easier to put off until tomorrow what needs to be done today,
  • Success: that’s the mystery. Virtue: that’s what’s inexplicable. To fail, you merely have to cultivate a few bad habits. You just have to bide your time. And once someone has spent enough time cultivating bad habits and biding their time, they are much diminished.
  • I am not saying that there is no hope of redemption. But it is much harder to extract someone from a chasm than to lift him from a ditch. And some chasms are very deep. And there's not much left of the body at the bottom.
  • Carl Rogers, the famous humanistic psychologist, believed it was impossible to start a therapeutic relationship if the person seeking help did not want to improve.67 Rogers believed it was impossible to convince someone to change for the better.
  • none of this is a justification for abandoning those in real need to pursue your narrow, blind ambition, in case it has to be said.
  • Here’s something to consider: If you have a friend whose friendship you wouldn’t recommend to your sister, or your father, or your son, why would you have such a friend for yourself?
  • You are not morally obliged to support someone who is making the world a worse place. Quite the opposite. You should choose people who want things to be better, not worse. It’s a good thing, not a selfish thing, to choose people who are good for you.
  • It is for this reason that every good example is a fateful challenge, and every hero, a judge. Michelangelo’s great perfect marble David cries out to its observer: “You could be more than you are.”
  • Don’t think that it is easier to surround yourself with good healthy people than with bad unhealthy people. It’s not. A good, healthy person is an ideal. It requires strength and daring to stand up near such a person.
  • RULE 4   COMPARE YOURSELF TO WHO YOU WERE YESTERDAY, NOT TO WHO SOMEONE ELSE IS TODAY
  • IT WAS EASIER FOR PEOPLE to be good at something when more of us lived in small, rural communities. Someone could be homecoming queen. Someone else could be spelling-bee champ, math whiz or basketball star. There were only one or two mechanics and a couple of teachers. In each of their domains, these local heroes had the opportunity to enjoy the serotonin-fuelled confidence of the victor.
  • Our hierarchies of accomplishment are now dizzyingly vertical.
  • No matter how good you are at something, or how you rank your accomplishments, there is someone out there who makes you look incompetent.
  • We are not equal in ability or outcome, and never will be. A very small number of people produce very much of everything.
  • People are unhappy at the bottom. They get sick there, and remain unknown and unloved. They waste their lives there. They die there. In consequence, the self-denigrating voice in the minds of people weaves a devastating tale. Life is a zero-sum game. Worthlessness is the default condition.
  • It is for such reasons that a whole generation of social psychologists recommended “positive illusions” as the only reliable route to mental health.69 Their credo? Let a lie be your umbrella. A more dismal, wretched, pessimistic philosophy can hardly be imagined:
  • Here is an alternative approach (and one that requires no illusions). If the cards are always stacked against you, perhaps the game you are playing is somehow rigged (perhaps by you, unbeknownst to yourself). If the internal voice makes you doubt the value of your endeavours—or your life, or life itself—perhaps you should stop listening.
  • There will always be people better than you—that’s a cliché of nihilism, like the phrase, In a million years, who’s going to know the difference? The proper response to that statement is not, Well, then, everything is meaningless. It’s, Any idiot can choose a frame of time within which nothing matters.
  • Standards of better or worse are not illusory or unnecessary. If you hadn’t decided that what you are doing right now was better than the alternatives, you wouldn’t be doing it. The idea of a value-free choice is a contradiction in terms. Value judgments are a precondition for action.
  • Furthermore, every activity, once chosen, comes with its own internal standards of accomplishment. If something can be done at all, it can be done better or worse. To do anything at all is therefore to play a game with a defined and valued end, which can always be reached more or less efficiently and elegantly.
  • We might start by considering the all-too-black-and-white words themselves: “success” or “failure.” You are either a success, a comprehensive, singular, over-all good thing, or its opposite, a failure, a comprehensive, singular, irredeemably bad thing.
  • There are vital degrees and gradations of value obliterated by this binary system, and the consequences are not good.
  • there is not just one game at which to succeed or fail. There are many games and, more specifically, many good games—
  • if changing games does not work, you can invent a new one.
  • and athletic pursuits. You might consider judging your success across all the games you play.
  • When we are very young we are neither individual nor informed. We have not had the time nor gained the wisdom to develop our own standards. In consequence, we must compare ourselves to others, because standards are necessary.
  • As we mature we become, by contrast, increasingly individual and unique. The conditions of our lives become more and more personal and less and less comparable with those of others. Symbolically speaking, this means we must leave the house ruled by our father, and confront the chaos of our individual Being.
  • We must then rediscover the values of our culture—veiled from us by our ignorance, hidden in the dusty treasure-trove of the past—rescue them, and integrate them into our own lives. This is what gives existence its full and necessary meaning.
  • What is it that you actually love? What is it that you genuinely want? Before you can articulate your own standards of value, you must see yourself as a stranger—and then you must get to know yourself.
  • Dare to be truthful. Dare to articulate yourself, and express (or at least become aware of) what would really justify your life.
  • Consult your resentment. It’s a revelatory emotion, for all its pathology. It’s part of an evil triad: arrogance, deceit, and resentment. Nothing causes more harm than this underworld Trinity. But resentment always means one of two things. Either the resentful person is immature, in which case he or she should shut up, quit whining, and get on with it, or there is tyranny afoot—in which case the person subjugated has a moral obligation to speak up.
  • Be cautious when you’re comparing yourself to others. You’re a singular being, once you’re an adult. You have your own particular, specific problems—financial, intimate, psychological, and otherwise.
  • Those are embedded in the unique broader context of your existence. Your career or job works for you in a personal manner, or it does not, and it does so in a unique interplay with the other specifics of your life.
  • We must see, but to see, we must aim, so we are always aiming. Our minds are built on the hunting-and-gathering platforms of our bodies. To hunt is to specify a target, track it, and throw at it.
  • We live within a framework that defines the present as eternally lacking and the future as eternally better. If we did not see things this way, we would not act at all. We wouldn’t even be able to see, because to see we must focus, and to focus we must pick one thing above all else on which to focus.
  • The disadvantage to all this foresight and creativity is chronic unease and discomfort. Because we always contrast what is with what could be, we have to aim at what could be.
  • The present is eternally flawed. But where you start might not be as important as the direction you are heading. Perhaps happiness is always to be found in the journey uphill, and not in the fleeting sense of satisfaction awaiting at the next peak.
  • Called upon properly, the internal critic will suggest something to set in order, which you could set in order, which you would set in order—voluntarily, without resentment, even with pleasure.
  • “Excuse me,” you might say to yourself, without irony or sarcasm. “I’m trying to reduce some of the unnecessary suffering around here. I could use some help.” Keep the derision at bay. “I’m wondering if there is anything that you would be willing to do? I’d be very grateful for your service.” Ask honestly and with humility. That’s no simple matter.
Javier E

These Truths: A History of the United States (Jill Lepore) - 1 views

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political history.
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept.
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • he wrote in 1751 an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
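  A worked reading of the doubling arithmetic in the excerpt above, offered only as a sketch: it applies Franklin's own stated rate of one doubling every twenty-five years to his own starting guess of one million souls.
  \[ P(t) = P_0 \cdot 2^{t/25}, \qquad P(100\ \text{years}) = 1{,}000{,}000 \times 2^{4} = 16{,}000{,}000 \]
  Four doublings in a single century multiply the starting population sixteenfold, which is the arithmetic behind Franklin's expectation that "the greatest Number of Englishmen" would soon live on the American side of the water.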
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad, lavishing attention on anything that looked good, and turning into strengths what had earlier been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanity is steadily advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to overspread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprang up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only source of power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country,
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51 The largest
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78 The nullification crisis was
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, offenders were still being thrown in debtors’ prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • the geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, more critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the splinter in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • the Treaty of Guadalupe Hidalgo, under which the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • the territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchial rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeoman and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • the Fugitive Slave Law, required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed to “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but
  • “Our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy,
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description (Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____), as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12 [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and insuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, the family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations. (A
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations. But
  • in 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • to deliberate at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where the reform appealed to legislatures eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing,
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era,
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives,
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from the middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, he’d have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, Creel called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • IN 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

The Wages of Guilt: Memories of War in Germany and Japan (Ian Buruma) - 0 views

  • the main reason why Germans were more trusted by their neighbors was that they were learning, slowly and painfully, and not always fully, to trust themselves.
  • elders, in government and the mass media, still voice opinions about the Japanese war that are unsettling, to say the least. Conservative politicians still pay their annual respects at a shrine where war criminals are officially remembered. Justifications and denials of war crimes are still heard. Too many Japanese in conspicuous places, including the prime minister’s office itself, have clearly not “coped” with the war.
  • unlike Nazi Germany, Japan had no systematic program to destroy the life of every man, woman, and child of a people that, for ideological reasons, was deemed to have no right to exist.
  • “We never knew,” a common reaction in the 1950s, had worn shamefully thin in the eyes of a younger generation by the 1960s. The extraordinary criminality of a deliberate genocide was so obvious that it left no room for argument.
  • Right-wing nationalists like to cite the absence of a Japanese Holocaust as proof that Japanese have no reason to feel remorse about their war at all. It was, in their eyes, a war like any other; brutal, yes, just as wars fought by all great nations in history have been brutal. In fact, since the Pacific War was fought against Western imperialists, it was a justified—even noble—war of Asian liberation.
  • in the late 1940s or 1950s, a time when most Germans were still trying hard not to remember. It is in fact extraordinary how honestly Japanese novelists and filmmakers dealt with the horrors of militarism in those early postwar years. Such honesty is much less evident now.
  • Popular comic books, aimed at the young, extol the heroics of Japanese soldiers and kamikaze pilots, while the Chinese and their Western allies are depicted as treacherous and belligerent. In 2008, the chief of staff of the Japanese Air Self-Defense Force stated that Japan had been “tricked” into the war by China and the US. In 2013, Prime Minister Abe Shinzo publicly doubted whether Japan’s military aggression in China could even be called an invasion.
  • The fact is that Japan is still haunted by historical issues that should have been settled decades ago. The reasons are political rather than cultural, and have to do with the pacifist constitution—written by American jurists in 1946—and with the imperial institution, absolved of war guilt by General Douglas MacArthur after the war for the sake of expediency.
  • Japan, even under Allied occupation, continued to be governed by much the same bureaucratic and political elite, albeit under a new, more democratic constitution,
  • a number of conservatives felt humiliated by what they rightly saw as an infringement of their national sovereignty. Henceforth, to them, everything from the Allied Tokyo War Crimes Tribunal to the denunciations of Japan’s war record by left-wing teachers and intellectuals would be seen in this light.
  • The more “progressive” Japanese used the history of wartime atrocities as a warning against turning away from pacifism, the more defensive right-wing politicians and commentators became about the Japanese war.
  • Views of history, in other words, were politicized—and polarized—from the beginning.
  • To take the sting out of this confrontation between constitutional pacifists and revisionists, which had led to much political turmoil in the 1950s, mainstream conservatives made a deliberate attempt to distract people’s attention from war and politics by concentrating on economic growth.
  • For several decades, the chauvinistic right wing, with its reactionary views on everything from high school education to the emperor’s status, was kept in check by the sometimes equally dogmatic Japanese left. Marxism was the prevailing ideology of the teachers union and academics.
  • the influence of Marxism waned after the collapse of the Soviet empire in the early 1990s, and the brutal records of Chairman Mao and Pol Pot became widely known.
  • Marginalized in the de facto one-party LDP state and discredited by its own dogmatism, the Japanese left did not just wane, it collapsed. This gave a great boost to the war-justifying right-wing nationalists,
  • Japanese young, perhaps out of boredom with nothing but materialistic goals, perhaps out of frustration with being made to feel guilty, perhaps out of sheer ignorance, or most probably out of a combination of all three, are not unreceptive to these patriotic blandishments.
  • Anxiety about the rise of China, whose rulers have a habit of using Japan’s historical crimes as a form of political blackmail, has boosted a prickly national pride, even at the expense of facing the truth about the past.
  • By 1996, the LDP was back in power, the constitutional issue had not been resolved, and historical debates continue to be loaded with political ideology. In fact, they are not really debates at all, but exercises in propaganda, tilted toward the reactionary side.
  • My instinct—call it a prejudice, if you prefer—before embarking on this venture was that people from distinct cultures still react quite similarly to similar circumstances.
  • The Japanese and the Germans, on the whole, did not behave in the same ways—but then the circumstances, both wartime and postwar, were quite different in the two Germanies and Japan. They still are.
  • Our comic-book prejudices turned into an attitude of moral outrage. This made life easier in a way. It was comforting to know that a border divided us from a nation that personified evil. They were bad, so we must be good. To grow up after the war in a country that had suffered German occupation was to know that one was on the side of the angels.
  • The question that obsessed us was not how we would have acquitted ourselves in uniform, going over the top, running into machine-gun fire or mustard gas, but whether we would have joined the resistance, whether we would have cracked under torture, whether we would have hidden Jews and risked deportation ourselves. Our particular shadow was not war, but occupation.
  • the frightened man who betrayed to save his life, who looked the other way, who grasped the wrong horn of a hideous moral dilemma, interested me more than the hero. This is no doubt partly because I fear I would be much like that frightened man myself. And partly because, to me, failure is more typical of the human condition than heroism.
  • I was curious to learn how Japanese saw the war, how they remembered it, what they imagined it to have been like, how they saw themselves in view of their past. What I heard and read was often surprising to a European:
  • this led me to the related subject of modern Japanese nationalism. I became fascinated by the writings of various emperor worshippers, historical revisionists, and romantic seekers after the unique essence of Japaneseness.
  • Bataan, the sacking of Manila, the massacres in Singapore, these were barely mentioned. But the suffering of the Japanese, in China, Manchuria, the Philippines, and especially in Hiroshima and Nagasaki, was remembered vividly, as was the imprisonment of Japanese soldiers in Siberia after the war. The Japanese have two days of remembrance: August 6, when Hiroshima was bombed, and August 15, the date of the Japanese surrender.
  • The curious thing was that much of what attracted Japanese to Germany before the war—Prussian authoritarianism, romantic nationalism, pseudo-scientific racialism—had lingered in Japan while becoming distinctly unfashionable in Germany. Why?
  • the two peoples saw their own purported virtues reflected in each other: the warrior spirit, racial purity, self-sacrifice, discipline, and so on. After the war, West Germans tried hard to discard this image of themselves. This was less true of the Japanese.
  • Which meant that any residual feelings of nostalgia for the old partnership in Japan were likely to be met with embarrassment in Germany.
  • I have concentrated on the war against the Jews in the case of Germany, since it was that parallel war, rather than, say, the U-boat battles in the Atlantic, or even the battle of Stalingrad, that left the most sensitive scar on the collective memory of (West) Germany.
  • I have emphasized the war in China and the bombing of Hiroshima, for these episodes, more than others, have lodged themselves, often in highly symbolic ways, in Japanese public life.
  • Do Germans perhaps have more reason to mourn? Is it because Japan has an Asian “shame culture,” to quote Ruth Benedict’s phrase, and Germany a Christian “guilt culture”?
  • why the collective German memory should appear to be so different from the Japanese. Is it cultural? Is it political? Is the explanation to be found in postwar history, or in the history of the war itself?
  • the two peoples still have anything in common after the war, it is a residual distrust of themselves.
  • when Michael sees thousands of German peace demonstrators, he does not see thousands of gentle people who have learned their lesson from the past; he sees “100 percent German Protestant rigorism, aggressive, intolerant, hard.”
  • To be betroffen implies a sense of guilt, a sense of shame, or even embarrassment. To be betroffen is to be speechless. But it also implies an idea of moral purity. To be betroffen is one way to “master the past,” to show contriteness, to confess, and to be absolved and purified.
  • In their famous book, written in the sixties, entitled The Inability to Mourn, Alexander and Margarete Mitscherlich analyzed the moral anesthesia that afflicted postwar Germans who would not face their past. They were numbed by defeat; their memories appeared to be blocked. They would or could not do their labor, and confess. They appeared to have completely forgotten that they had glorified a leader who caused the death of millions.
  • There is something religious about the act of being betroffen, something close to Pietism,
  • heart of Pietism was the moral renovation of the individual, achieved by passing through the anguish of contrition into the overwhelming realization of the assurance of God’s grace.” Pietism served as an antidote to the secular and rational ideas of the French Enlightenment.
  • It began in the seventeenth century with the works of Philipp Jakob Spener. He wanted to reform the Church and bring the Gospel into daily life, as it were, by stressing good works and individual spiritual labor.
  • German television is rich in earnest discussion programs where people sit at round tables and debate the issues of the day. The audience sits at smaller tables, sipping drinks as the featured guests hold forth. The tone is generally serious, but sometimes the arguments get heated. It is easy to laugh at the solemnity of these programs, but there is much to admire about them. It is partly through these talk shows that a large number of Germans have become accustomed to political debate.
  • There was a real dilemma: at least two generations had been educated to renounce war and never again to send German soldiers to the front, educated, in other words, to want Germany to be a larger version of Switzerland. But they had also been taught to feel responsible for the fate of Israel, and to be citizens of a Western nation, firmly embedded in a family of allied Western nations. The question was whether they really could be both.
  • the Gulf War showed that German pacifism could not be dismissed simply as anti-Americanism or a rebellion against Adenauer’s West.
  • the West German mistrust of East Germans—the East Germans whose soldiers still marched in goose step, whose petit bourgeois style smacked of the thirties, whose system of government, though built on a pedestal of antifascism, contained so many disturbing remnants of the Nazi past; the East Germans, in short, who had been living in “Asia.”
  • Michael, the Israeli, compared the encounter of Westerners (“Wessies”) with Easterners (“Ossies”) with the unveiling of the portrait of Dorian Gray: the Wessies saw their own image and they didn’t like what they saw.
  • he added: “I also happen to think Japanese and Germans are racists.”
  • Germany for its Nazi inheritance and its sellout to the United States. But now that Germany had been reunified, with its specters of “Auschwitz” and its additional hordes of narrow-minded Ossies, Adenauer was deemed to have been right after
  • The picture was of Kiel in 1945, a city in ruins. He saw me looking at it and said: “It’s true that whoever is being bombed is entitled to some sympathy from us.”
  • “My personal political philosophy and maybe even my political ambition has to do with an element of distrust for the people I represent, people whose parents and grandparents made Hitler and the persecution of the Jews possible.”
  • in the seventies he had tried to nullify verdicts given in Nazi courts—without success until well into the eighties. One of the problems was that the Nazi judiciary itself was never purged. This continuity was broken only by time.
  • To bury Germany in the bosom of its Western allies, such as NATO and the EC, was to bury the distrust of Germans. Or so it was hoped. As Europeans they could feel normal, Western, civilized. Germany; the old “land in the middle,” the Central European colossus, the power that fretted over its identity and was haunted by its past, had become a Western nation.
  • It is a miracle, really, how quickly the Germans in the Federal Republic became civilized. We are truly part of the West now. We have internalized democracy. But the Germans of the former GDR, they are still stuck in a premodern age. They are the ugly Germans, very much like the West Germans after the war, the people I grew up with. They are not yet civilized.”
  • “I like the Germans very much, but I think they are a dangerous people. I don’t know why—perhaps it is race, or culture, or history. Whatever. But we Japanese are the same: we swing from one extreme to the other. As peoples, we Japanese, like the Germans, have strong collective discipline. When our energies are channeled in the right direction, this is fine, but when they are misused, terrible things happen.”
  • to be put in the same category as the Japanese—even to be compared—bothered many Germans. (Again, unlike the Japanese, who made the comparison often.) Germans I met often stressed how different they were from the Japanese,
  • To some West Germans, now so “civilized,” so free, so individualistic, so, well, Western, the Japanese, with their group discipline, their deference to authority, their military attitude toward work, might appear too close for comfort to a self-image only just, and perhaps only barely, overcome.
  • To what extent the behavior of nations, like that of individual people, is determined by history, culture, or character is a question that exercises many Japanese, almost obsessively.
  • not much sign of Betroffenheit on Japanese television during the Gulf War. Nor did one see retired generals explain tactics and strategy. Instead, there were experts from journalism and academe talking in a detached manner about a faraway war which was often presented as a cultural or religious conflict between West and Middle East. The history of Muslim-Christian-Jewish animosity was much discussed. And the American character was analyzed at length to understand the behavior of George Bush and General Schwarzkopf.
  • In the words of one Albrecht Fürst von Urach, a Nazi propagandist, Japanese emperor worship was “the most unique fusion in the world of state form, state consciousness, and religious fanaticism.” Fanaticism was, of course, a positive word in the Nazi lexicon.
  • the identity question nags in almost any discussion about Japan and the outside world.
  • It was a respectable view, but also one founded on a national myth of betrayal. Japan, according to the myth, had become the unique moral nation of peace, betrayed by the victors who had sat in judgment of Japan’s war crimes; betrayed in Vietnam, in Afghanistan, in Nicaragua; betrayed by the arms race, betrayed by the Cold War; Japan had been victimized not only by the “gratuitous,” perhaps even “racist,” nuclear attacks on Hiroshima and Nagasaki, but by all subsequent military actions taken by the superpowers,
  • When the Prime Minister of Japan, Shidehara Kijuro, protested in 1946 to General MacArthur that it was all very well saying that Japan should assume moral leadership in renouncing war, but that in the real world no country would follow this example, MacArthur replied: “Even if no country follows you, Japan will lose nothing. It is those who do not support this who are in the wrong.” For a long time most Japanese continued to take this view.
  • What is so convenient in the cases of Germany and Japan is that pacifism happens to be a high-minded way to dull the pain of historical guilt. Or, conversely, if one wallows in it, pacifism turns national guilt into a virtue, almost a mark of superiority, when compared to the complacency of other nations.
  • The denial of historical discrimination is not just a way to evade guilt. It is intrinsic to pacifism. To even try to distinguish between wars, to accept that some wars are justified, is already an immoral position.
  • That Kamei discussed this common paranoia in such odd, Volkish terms could mean several things: that some of the worst European myths got stuck in Japan, that the history of the Holocaust had no impact, or that Japan is in some respects a deeply provincial place. I think all three explanations apply.
  • “the problem with the U.S.-Japan relationship is difficult. A racial problem, really. Yankees are friendly people, frank people. But, you know, it’s hard. You see, we have to be friendly …”
  • Like Oda, indeed like many people of the left, Kamei thought in racial terms. He used the word jinshu, literally race. He did not even use the more usual minzoku, which corresponds, in the parlance of Japanese right-wingers, to Volk, or the more neutral kokumin, meaning the citizens of a state.
  • many Germans in the liberal democratic West have tried to deal honestly with their nation’s terrible past, the Japanese, being different, have been unable to do so. It is true that the Japanese, compared with the West Germans, have paid less attention to the suffering they inflicted on others, and shown a greater inclination to shift the blame. And liberal democracy, whatever it may look like on paper, has not been the success in Japan that it was in the German Federal Republic. Cultural differences might account for this. But one can look at these matters in a different, more political way. In his book The War Against the West, published in London in 1938, the Hungarian scholar Aurel Kolnai followed the Greeks in his definition of the West: “For the ancient Greeks ‘the West’ (or ‘Europe’) meant society with a free constitution and self-government under recognized rules, where ‘law is king,’ whereas the ‘East’ (or ‘Asia’) signified theocratic societies under godlike rulers whom their subjects serve ‘like slaves.’
  • According to this definition, both Hitler’s Germany and prewar Japan were of the East.
  • There was a great irony here: in their zeal to make Japan part of the West, General MacArthur and his advisers made it impossible for Japan to do so in spirit. For a forced, impotent accomplice is not really an accomplice at all.
  • In recent years, Japan has often been called an economic giant and a political dwarf. But this has less to do with a traditional Japanese mentality—isolationism, pacifism, shyness with foreigners, or whatnot—than with the particular political circumstances after the war that the United States helped to create.
  • when the Cold War prompted the Americans to make the Japanese subvert their constitution by creating an army which was not supposed to exist, the worst of all worlds appeared: sovereignty was not restored, distrust remained, and resentment mounted.
  • Kamei’s hawks are angry with the Americans for emasculating Japan; Oda’s doves hate the Americans for emasculating the “peace constitution.” Both sides dislike being forced accomplices, and both feel victimized, which is one reason Japanese have a harder time than Germans in coming to terms with their wartime past.
  • As far as the war against the Jews is concerned, one might go back to 1933, when Hitler came to power. Or at the latest to 1935, when the race laws were promulgated in Nuremberg. Or perhaps those photographs of burning synagogues on the night of November 9, 1938, truly marked the first stage of the Holocaust.
  • There is the famous picture of German soldiers lifting the barrier on the Polish border in 1939, but was that really the beginning? Or did it actually start with the advance into the Rhineland in 1936, or was it the annexation of the Sudetenland, or Austria, or Czechoslovakia?
  • IT IS DIFFICULT TO SAY when the war actually began for the Germans and the Japanese. I cannot think of a single image that fixed the beginning of either war in the public mind.
  • Possibly to avoid these confusions, many Germans prefer to talk about the Hitlerzeit (Hitler era) instead of “the war.”
  • only Japanese of a liberal disposition call World War II the Pacific War. People who stick to the idea that Japan was fighting a war to liberate Asia from Bolshevism and white colonialism call it the Great East Asian War (Daitoa Senso), as in the Great East Asian Co-Prosperity Sphere.
  • The German equivalent, I suppose, would be the picture of Soviet soldiers raising their flag on the roof of the gutted Reichstag in Berlin.
  • People of this opinion separate the world war of 1941–45 from the war in China, which they still insist on calling the China Incident.
  • Liberals and leftists, on the other hand, tend to splice these wars together and call them the Fifteen-Year War (1931–45).
  • images marking the end are more obvious.
  • argued that the struggle against Western imperialism actually began in 1853, with the arrival in Japan of Commodore Perry’s ships, and spoke of the Hundred-Year War.
  • These are among the great clichés of postwar Japan: shorthand for national defeat, suffering, and humiliation.
  • The Germans called it Zusammenbruch (the collapse) or Stunde Null (Zero Hour): everything seemed to have come to an end, everything had to start all over. The Japanese called it haisen (defeat) or shusen (termination of the war).
  • kokka (nation, state) and minzoku (race, people) are not quite of the same order as Sonderbehandlung (special treatment) or Einsatzgruppe (special action squad). The jargon of Japanese imperialism was racist and overblown, but it did not carry the stench of death camps.
  • The German people are spiritually starved, Adenauer told him. “The imagination has to be provided for.” This was no simple matter, especially in the German language, which had been so thoroughly infected by the jargon of mass murder.
  • All they had been told to believe in, the Germans and the Japanese, everything from the Führerprinzip to the emperor cult, from the samurai spirit to the Herrenvolk, from Lebensraum to the whole world under one (Japanese) roof, all that lay in ruins
  • How to purge this language of what a famous German philologist called the Lingua Tertii Imperii? "… the language is no longer lived," wrote George Steiner in 1958, "it is merely spoken."
  • out of defeat and ruin a new school of literature (and cinema) did arise. It is known in Germany as Trümmerliteratur (literature of the ruins). Japanese writers who came of age among the ruins called themselves the yakeato seidai (burnt-out generation). Much literature of the late forties and fifties was darkened by nihilism and despair.
  • It was as though Germany—Sonderweg or no Sonderweg—needed only to be purged of Nazism, while Japan’s entire cultural tradition had to be overhauled.
  • In Germany there was a tradition to fall back on. In the Soviet sector, the left-wing culture of the Weimar Republic was actively revived. In the Western sectors, writers escaped the rats and the ruins by dreaming of Goethe. His name was often invoked to prove that Germany, too, belonged to the humanist, enlightened strain of European civilization.
  • the Americans (and many Japanese leftists) distrusted anything associated with “feudalism,” which they took to include much of Japan’s premodern past. Feudalism was the enemy of democracy. So not only did the American censors, in their effort to teach the Japanese democracy, forbid sword-fight films and samurai dramas, but at one point ninety-eight Kabuki plays were banned too.
  • yet, what is remarkable about much of the literature of the period, or more precisely, of the literature about that time, since much of it was written later, is the deep strain of romanticism, even nostalgia. This colors personal memories of people who grew up just after the war as well.
  • If the mushroom cloud and the imperial radio speech are the clichés of defeat, the scene of an American soldier (usually black) raping a Japanese girl (always young, always innocent), usually in a pristine rice field (innocent, pastoral Japan), is a stock image in postwar movies about the occupation.
  • To Ango, then, as to other writers, the ruins offered hope. At last the Japanese, without “the fake kimono” of traditions and ideals, were reduced to basic human needs; at last they could feel real love, real pain; at last they would be honest. There was no room, among the ruins, for hypocrisy.
  • Böll was able to be precise about the end of the Zusammenbruch and the beginning of bourgeois hypocrisy and moral amnesia. It came on June 20, 1948, the day of the currency reform, the day that Ludwig Erhard, picked by the Americans as Economics Director in the U.S.-British occupation zone, gave birth to the Deutsche Mark. The DM, from then on, would be the new symbol of West German national pride;
  • the amnesia, and definitely the identification with the West, was helped further along by the Cold War. West Germany now found itself on the same side as the Western allies. Their common enemy was the “Asiatic” Soviet empire. Fewer questions needed to be asked.
  • Indeed, to some people the Cold War simply confirmed what they had known all along: Germany always had been on the right side, if only our American friends had realized it earlier.
  • The process of willed forgetfulness culminated in the manic effort of reconstruction, in the great rush to prosperity.
  • “Prosperity for All” was probably the best that could have happened to the Germans of the Federal Republic. It took the seed of resentment (and thus future extremism) out of defeat. And the integration of West Germany into a Western alliance was a good thing too.
  • The “inability to mourn,” the German disassociation from the piles of corpses strewn all over Central and Eastern Europe, so that the Third Reich, as the Mitscherlichs put it, “faded like a dream,” made it easier to identify with the Americans, the victors, the West.
  • Yet the disgust felt by Böll and others for a people getting fat (“flabby” is the usual term, denoting sloth and decadence) and forgetting about its murderous past was understandable.
  • The Brückners were the price Germany had to pay for the revival of its fortunes. Indeed, they were often instrumental in it. They were the apparatchiks who functioned in any system, the small, efficient fish who voted for Christian conservatives in the West and became Communists in the East.
  • Staudte was clearly troubled by this, as were many Germans, but he offered no easy answers. Perhaps it was better this way: flabby democrats do less harm than vengeful old Nazis.
  • the forgetful, prosperous, capitalist Federal Republic of Germany was in many more or less hidden ways a continuation of Hitler’s Reich. This perfectly suited the propagandists of the GDR, who would produce from time to time lists of names of former Nazis who were prospering in the West. These lists were often surprisingly accurate.
  • In a famous film, half fiction, half documentary, made by a number of German writers and filmmakers (including Böll) in 1977, the continuity was made explicit. The film, called Germany in Autumn (Deutschland im Herbst),
  • Rainer Werner Fassbinder was one of the participants in this film. A year later he made The Marriage of Maria Braun.
  • To lifelong “antifascists” who had always believed that the Federal Republic was the heir to Nazi Germany, unification seemed—so they said—almost like a restoration of 1933. The irony was that many Wessies saw their new Eastern compatriots as embarrassing reminders of the same unfortunate past.
  • Rarely was the word “Auschwitz” heard more often than during the time of unification, partly as an always salutary reminder that Germans must not forget, but partly as an expression of pique that the illusion of a better, antifascist, anticapitalist, idealistic Germany, born in the ruins of 1945, and continued catastrophically for forty years in the East, had now been dashed forever.
  • Ludwig Erhard’s almost exact counterpart in Japan was Ikeda Hayato, Minister of Finance from 1949 and Prime Minister from 1960 to 1964. His version of Erhard’s "Prosperity for All" was the Double Your Incomes policy, which promised to make the Japanese twice as rich in ten years. Japan had an average growth rate of 11 percent during the 1960s.
  • It explains, at any rate, why the unification of the two Germanys was considered a defeat by antifascists on both sides of the former border.
  • Very few wartime bureaucrats had been purged. Most ministries remained intact. Instead it was the Communists, who had welcomed the Americans as liberators, who were purged after 1949, the year China was “lost.”
  • so the time of ruins was seen by people on the left as a time of missed chances and betrayal. Far from achieving a pacifist utopia of popular solidarity, they ended up with a country driven by materialism, conservatism, and selective historical amnesia.
  • the “red purges” of 1949 and 1950 and the return to power of men whose democratic credentials were not much better helped to turn many potential Japanese friends of the United States into enemies. For the Americans were seen as promoters of the right-wing revival and the crackdown on the left.
  • For exactly twelve years Germany was in the hands of a criminal regime, a bunch of political gangsters who had started a movement. Removing this regime was half the battle.
  • It is easier to change political institutions and hope that habits and prejudices will follow. This, however, was more easily done in Germany than in Japan.
  • There had not been a cultural break either in Japan. There were no exiled writers and artists who could return to haunt the consciences of those who had stayed.
  • There was no Japanese Thomas Mann or Alfred Döblin. In Japan, everyone had stayed.
  • In Japan there was never a clear break between a fascist and a prefascist past. In fact, Japan was never really a fascist state at all. There was no fascist or National Socialist ruling party, and no Führer either. The closest thing to it would have been the emperor, and whatever else he may have been, he was not a fascist dictator.
  • whereas after the war Germany lost its Nazi leaders, Japan lost only its admirals and generals.
  • Japan was effectively occupied only by the Americans. West Germany was part of NATO and the European Community, and the GDR was in the Soviet empire. Japan’s only formal alliance is with the United States, through a security treaty that many Japanese have opposed.
  • But the systematic subservience of Japan meant that the country never really grew up. There is a Japanese fixation on America, an obsession which goes deeper, I believe, than German anti-Americanism,
  • Yet nothing had stayed entirely the same in Japan. The trouble was that virtually all the changes were made on American orders. This was, of course, the victor’s prerogative, and many changes were beneficial.
  • as in fiction. American Hijiki, a novella by Nosaka Akiyuki, is, to my mind, a masterpiece in the short history of Japanese Trümmerliteratur.
  • Older Japanese do, however, remember the occupation, the first foreign army occupation in their national history. But it was, for the Japanese, a very unusual army. Whereas the Japanese armies in Asia had brought little but death, rape, and destruction, this one came with Glenn Miller music, chewing gum, and lessons in democracy. These blessings left a legacy of gratitude, rivalry, and shame.
  • did these films teach the Japanese democracy? Oshima thinks not. Instead, he believes, Japan learned the values of “progress” and “development.” Japan wanted to be just as rich as America—no, even richer:
  • think it is a romantic assumption, based less on history than on myth; a religious notion, expressed less through scholarship than through monuments, memorials, and historical sites turned into sacred grounds.
  • The past, wrote the West German historian Christian Meier, is in our bones. “For a nation to appropriate its history,” he argued, “is to look at it through the eyes of identity.” What we have “internalized,” he concluded, is Auschwitz.
  • Auschwitz is such a place, a sacred symbol of identity for Jews, Poles, and perhaps even Germans. The question is what or whom Germans are supposed to identify with.
  • The idea that visiting the relics of history brings the past closer is usually an illusion. The opposite is more often true.
  • To visit the site of suffering, any description of which cannot adequately express the horror, is upsetting, not because one gets closer to knowing what it was actually like to be a victim, but because such visits stir up emotions one cannot trust. It is tempting to take on the warm moral glow of identification—so easily done and so presumptuous—with the victims:
  • Were the crimes of Auschwitz, then, part of the German “identity”? Was genocide a product of some ghastly flaw in German culture, the key to which might be found in the sentimental proverbs, the cruel fairy tales, the tight leather shorts?
  • yet the imagination is the only way to identify with the past. Only in the imagination—not through statistics, documents, or even photographs—do people come alive as individuals, do stories emerge, instead of History.
  • It is all right to let the witnesses speak, in the courtroom, in the museums, on videotape (Claude Lanzmann’s Shoah has been shown many times on German television), but it is not all right for German artists to use their imagination.
  • the reluctance in German fiction to look Auschwitz in the face, the almost universal refusal to deal with the Final Solution outside the shrine, the museum, or the schoolroom, suggests a fear of committing sacrilege.
  • beneath the fear of bad taste or sacrilege may lie a deeper problem. To imagine people in the past as people of flesh and blood, not as hammy devils in silk capes, is to humanize them. To humanize is not necessarily to excuse or to sympathize, but it does demolish the barriers of abstraction between us and them. We could, under certain circumstances, have been them.
  • the flight into religious abstraction was to be all too common among Germans of the Nazi generation, as well as their children; not, as is so often the case with Jews, to lend mystique to a new identity, as a patriotic Zionist, but on the contrary to escape from being the heir to a peculiarly German crime, to get away from having to “internalize” Auschwitz, or indeed from being German at all.
  • a Hollywood soap opera, a work of skillful pop, which penetrated the German imagination in a way nothing had before. Holocaust was first shown in Germany in January 1979. It was seen by 20 million people, about half the adult population of the Federal Republic; 58 percent wanted it to be repeated; 12,000 letters, telegrams, and postcards were sent to the broadcasting stations; 5,200 called the stations by telephone after the first showing; 72.5 percent were positive, 7.3 percent negative.
  • “After Holocaust,” wrote a West German woman to her local television station, “I feel deep contempt for those beasts of the Third Reich. I am twenty-nine years old and a mother of three children. When I think of the many mothers and children sent to the gas chambers, I have to cry. (Even today the Jews are not left in peace. We Germans have the duty to work every day for peace in Israel.) I bow to the victims of the Nazis, and I am ashamed to be a German.”
  • Auschwitz was a German crime, to be sure. “Death is a master from Germany.” But it was a different Germany. To insist on viewing history through the “eyes of identity,” to repeat the historian Christian Meier’s phrase, is to resist the idea of change.
  • Is there no alternative to these opposing views? I believe there is.
  • The novelist Martin Walser, who was a child during the war, believes, like Meier, that Auschwitz binds the German people, as does the language of Goethe. When a Frenchman or an American sees pictures of Auschwitz, “he doesn’t have to think: We human beings! He can think: Those Germans! Can we think: Those Nazis! I for one cannot …”
  • Adorno, a German Jew who wished to save high German culture, on whose legacy the Nazis left their bloody finger marks, resisted the idea that Auschwitz was a German crime. To him it was a matter of modern pathology, the sickness of the “authoritarian personality,” of the dehumanized SS guards, those inhumane cogs in a vast industrial wheel.
  • To the majority of Japanese, Hiroshima is the supreme symbol of the Pacific War. All the suffering of the Japanese people is encapsulated in that almost sacred word: Hiroshima. But it is more than a symbol of national martyrdom; Hiroshima is a symbol of absolute evil, often compared to Auschwitz.
  • has the atmosphere of a religious center. It has martyrs, but no single god. It has prayers, and it has a ready-made myth about the fall of man. Hiroshima, says a booklet entitled Hiroshima Peace Reader, published by the Hiroshima Peace Culture Foundation, “is no longer merely a Japanese city. It has become recognized throughout the world as a Mecca of world peace.”
  • They were not enshrined in the Japanese park, and later attempts by local Koreans to have the monument moved into Peace Park failed. There could only be one cenotaph, said the Hiroshima municipal authorities. And the cenotaph did not include Koreans.
  • What is interesting about Hiroshima—the Mecca rather than the modern Japanese city, which is prosperous and rather dull—is the tension between its universal aspirations and its status as the exclusive site of Japanese victimhood.
  • it is an opinion widely held by Japanese nationalists. The right always has been concerned with the debilitating effects on the Japanese identity of war guilt imposed by American propaganda.
  • The Japanese, in contrast, were duped by the Americans into believing that the traces of Japanese suffering should be swept away by the immediate reconstruction of Hiroshima. As a result, the postwar Japanese lack an identity and their racial virility has been sapped by American propaganda about Japanese war guilt.
  • Hiroshima, Uno wrote, should have been left as it was, in ruins, just as Auschwitz, so he claims, was deliberately preserved by the Jews. By reminding the world of their martyrdom, he said, the Jews have kept their racial identity intact and restored their virility.
  • But the idea that the bomb was a racist experiment is less plausible, since the bomb was developed for use against Nazi Germany.
  • There is another view, however, held by leftists and liberals, who would not dream of defending the “Fifteen-Year War.” In this view, the A-bomb was a kind of divine punishment for Japanese militarism. And having learned their lesson through this unique suffering, having been purified through hellfire and purgatory, so to speak, the Japanese people have earned the right, indeed have the sacred duty, to sit in judgment of others, specifically the United States, whenever they show signs of sinning against the “Hiroshima spirit.”
  • The left has its own variation of Japanese martyrdom, in which Hiroshima plays a central role. It is widely believed, for instance, that countless Japanese civilians fell victim to either a wicked military experiment or to the first strike in the Cold War, or both.
  • However, right-wing nationalists care less about Hiroshima than about the idée fixe that the “Great East Asian War” was to a large extent justified.
  • This is at the heart of what is known as Peace Education, which has been much encouraged by the leftist Japan Teachers’ Union and has been regarded with suspicion by the conservative government. Peace Education has traditionally meant pacifism, anti-Americanism, and a strong sympathy for Communist states, especially China.
  • The A-bomb, in this version, was dropped to scare the Soviets away from invading Japan. This at least is an arguable position.
  • left-wing pacifism in Japan has something in common with the romantic nationalism usually associated with the right: it shares the right’s resentment about being robbed by the Americans of what might be called a collective memory.
  • The romantic pacifists believe that the United States, to hide its own guilt and to rekindle Japanese militarism in aid of the Cold War, tried to wipe out the memory of Hiroshima.
  • few events in World War II have been described, analyzed, lamented, reenacted, re-created, depicted, and exhibited so much and so often as the bombing of Hiroshima
  • The problem with Nagasaki was not just that Hiroshima came first but also that Nagasaki had more military targets than Hiroshima. The Mitsubishi factories in Nagasaki produced the bulk of Japanese armaments. There was also something else, which is not often mentioned: the Nagasaki bomb exploded right over the area where outcasts and Christians lived. And unlike in Hiroshima, much of the rest of the city was spared the worst.
  • yet, despite these diatribes, the myth of Hiroshima and its pacifist cult is based less on American wickedness than on the image of martyred innocence and visions of the apocalypse.
  • The comparison between Hiroshima and Auschwitz is based on this notion; the idea, namely, that Hiroshima, like the Holocaust, was not part of the war, not even connected with it, but “something that occurs at the end of the world
  • still I wonder whether it is really so different from the position of many Germans who wish to “internalize” Auschwitz, who see Auschwitz “through the eyes of identity.”
  • the Japanese to take two routes at once, a national one, as unique victims of the A-bomb, and a universal one, as the apostles of the Hiroshima spirit. This, then, is how Japanese pacifists, engaged in Peace Education, define the Japanese identity.
  • the case for Hiroshima is at least open to debate. The A-bomb might have saved lives; it might have shortened the war. But such arguments are incompatible with the Hiroshima spirit.
  • In either case, nationality has come to be based less on citizenship than on history, morality, and a religious spirit.
  • The problem with this quasi-religious view of history is that it makes it hard to discuss past events in anything but nonsecular terms. Visions of absolute evil are unique, and they are beyond human explanation or even comprehension. To explain is hubristic and amoral.
  • in the history of Japan’s foreign wars, the city of Hiroshima is far from innocent. When Japan went to war with China in 1894, the troops set off for the battlefronts from Hiroshima, and the Meiji emperor moved his headquarters there. The city grew wealthy as a result. It grew even wealthier when Japan went to war with Russia eleven years later, and Hiroshima once again became the center of military operations. As the Hiroshima Peace Reader puts it with admirable conciseness, “Hiroshima, secure in its position as a military city, became more populous and prosperous as wars and incidents occurred throughout the Meiji and Taisho periods.” At the time of the bombing, Hiroshima was the base of the Second General Headquarters of the Imperial Army (the First was in Tokyo). In short, the city was swarming with soldiers. One of the few literary masterpieces to emerge
  • when a local group of peace activists petitioned the city of Hiroshima in 1987 to incorporate the history of Japanese aggression into the Peace Memorial Museum, the request was turned down. The petition for an “Aggressors’ Corner” was prompted by junior high school students from Osaka, who had embarrassed Peace Museum officials by asking for an explanation about Japanese responsibility for the war.
  • Yukoku Ishinkai (Society for Lament and National Restoration), thought the bombing had saved Japan from total destruction. But he insisted that Japan could not be held solely responsible for the war. The war, he said, had simply been part of the “flow of history.”
  • They also demanded an official recognition of the fact that some of the Korean victims of the bomb had been slave laborers. (Osaka, like Kyoto and Hiroshima, still has a large Korean population.) Both requests were denied. So a group called Peace Link was formed, from local people, many of whom were Christians, antinuclear activists, or involved with discriminated-against minorities.
  • The history of the war, or indeed any history, is indeed not what the Hiroshima spirit is about. This is why Auschwitz is the only comparison that is officially condoned. Anything else is too controversial, too much part of the “flow of history.”
  • “You see, this museum was not really intended to be a museum. It was built by survivors as a place of prayer for the victims and for world peace. Mankind must build a better world. That is why Hiroshima must persist. We must go back to the basic roots. We must think of human solidarity and world peace. Otherwise we just end up arguing about history.”
  • Only when a young Japanese history professor named Yoshimi Yoshiaki dug up a report in American archives in the 1980s did it become known that the Japanese had stored 15,000 tons of chemical weapons on and near the island and that a 200-kilogram container of mustard gas was buried under Hiroshima.
  • what was the largest toxic gas factory in the Japanese Empire. More than 5,000 people worked there during the war, many of them women and schoolchildren. About 1,600 died of exposure to hydrocyanic acid gas, nausea gas, and lewisite. Some were damaged for life. Official Chinese sources claim that more than 80,000 Chinese fell victim to gases produced at the factory. The army was so secretive about the place that the island simply disappeared from Japanese maps.
  • in 1988, through the efforts of survivors, the small museum was built, “to pass on,” in the words of the museum guide, “the historical truth to future generations.”
  • Surviving workers from the factory, many of whom suffered from chronic lung diseases, asked for official recognition of their plight in the 1950s. But the government turned them down. If the government had compensated the workers, it would have been an official admission that the Japanese Army had engaged in an illegal enterprise. When a brief mention of chemical warfare crept into Japanese school textbooks, the Ministry of Education swiftly took it out.
  • I asked him about the purpose of the museum. He said: “Before shouting ‘no more war,’ I want people to see what it was really like. To simply look at the past from the point of view of the victim is to encourage hatred.”
  • “Look,” he said, “when you fight another man, and hit him and kick him, he will hit and kick back. One side will win. How will this be remembered? Do we recall that we were kicked, or that we started the kicking ourselves? Without considering this question, we cannot have peace.”
  • The fact that Japanese had buried poison gas under Hiroshima did not lessen the horror of the A-bomb. But it put Peace Park, with all its shrines, in a more historical perspective. It took the past away from God and put it in the fallible hands of man.
  • What did he think of the Peace Museum in Hiroshima? “At the Hiroshima museum it is easy to feel victimized,” he said. “But we must realize that we were aggressors too. We were educated to fight for our country. We made toxic gas for our country. We lived to fight the war. To win the war was our only goal.”
  • Nanking, as the capital of the Nationalist government, was the greatest prize in the attempted conquest of China. Its fall was greeted in Japan with banner headlines and nationwide celebration. For six weeks Japanese Army officers allowed their men to run amok. The figures are imprecise, but tens of thousands, perhaps hundreds of thousands (the Chinese say 300,000) of Chinese soldiers and civilians, many of them refugees from other towns, were killed. And thousands of women between the ages of about nine and seventy-five were raped, mutilated, and often murdered.
  • Was it a deliberate policy to terrorize the Chinese into submission? The complicity of the officers suggests there was something to this. But it might also have been a kind of payoff to the Japanese troops for slogging through China in the freezing winter without decent pay or rations. Or was it largely a matter of a peasant army running out of control? Or just the inevitable consequence of war, as many Japanese maintain?
  • inevitable cruelty of war. An atrocity is a willful act of criminal brutality, an act that violates the law as well as any code of human decency. It isn’t that the Japanese lack such codes or are morally incapable of grasping the concept. But “atrocity,” like “human rights,” is part of a modern terminology which came from the West, along with “feminism,” say, or “war crimes.” To right-wing nationalists it has a leftist ring, something subversive, something almost anti-Japanese.
  • During the Tokyo War Crimes Tribunal, Nanking had the same resonance as Auschwitz had in Nuremberg. And being a symbol, the Nanking Massacre is as vulnerable to mythology and manipulation as Auschwitz and Hiroshima.
  • Mori’s attitude also raises doubts about Ruth Benedict’s distinction between Christian “guilt culture” and Confucian “shame culture.”
  • In her opinion, a “society that inculcates absolute standards of morality and relies on man’s developing a conscience is a guilt culture by definition …” But in “a culture where shame is a major sanction, people are chagrined about acts which we expect people to feel guilty about.” However, this “chagrin cannot be relieved, as guilt can be, by confession and atonement …”
  • If memory was admitted at all, the Mitscherlichs wrote about Germans in the 1950s, "it was only in order to balance one’s own guilt against that of others. Many horrors had been unavoidable, it was claimed, because they had been dictated by crimes committed by the adversary." This was precisely what many Japanese claimed, and still do claim. And it is why Mori insists on making his pupils view the past from the perspective of the aggressors.
  • Two young Japanese officers, Lieutenant N. and Lieutenant M., were on their way to Nanking and decided to test their swordsmanship: the first to cut off one hundred Chinese heads would be the winner. And thus they slashed their way through Chinese ranks, taking scalps in true samurai style. Lieutenant M. got 106, and Lieutenant N. bagged 105.
  • The story made a snappy headline in a major Tokyo newspaper: “Who Will Get There First! Two Lieutenants Already Claimed 80.” In the Nanking museum is a newspaper photograph of the two friends, glowing with youthful high spirits. Lieutenant N. boasted in the report that he had cut the necks off 56 men without even denting the blade of his ancestral sword.
  • I was told by a Japanese veteran who had fought in Nanking that such stories were commonly made up or at least exaggerated by Japanese reporters, who were ordered to entertain the home front with tales of heroism.
  • Honda Katsuichi, a famous Asahi Shimbun reporter, was told the story in Nanking. He wrote it up in a series of articles, later collected in a book entitled A Journey to China, published in 1981.
  • the whole thing developed into the Nankin Ronso, or Nanking Debate. In 1984, an anti-Honda book came out, by Tanaka Masaaki, entitled The Fabrication of the “Nanking Massacre.”
  • back in Japan, Lieutenant M. began to revise his story. Speaking at his old high school, he said that in fact he had beheaded only four or five men in actual combat. As for the rest … “After we occupied the city, I stood facing a ditch, and told the Chinese prisoners to step forward. Since Chinese soldiers are stupid, they shuffled over to the ditch, one by one, and I cleanly cut off their heads.”
  • The nationalist intellectuals are called goyo gakusha by their critics. It is a difficult term to translate, but the implied meaning is “official scholars,” who do the government’s bidding.
  • the debate on the Japanese war is conducted almost entirely outside Japanese universities, by journalists, amateur historians, political columnists, civil rights activists, and so forth. This means that the zanier theories of the likes of Tanaka…
  • The other reason was that modern history was not considered academically respectable. It was too fluid, too political, too controversial. Until 1955, there was not one modern historian on the staff of Tokyo University. History stopped around the middle of the nineteenth century. And even now, modern…
  • In any case, so the argument invariably ends, Hiroshima, having been planned in cold blood, was a far worse crime. “Unlike in Europe or China,” writes Tanaka, “you won’t find one instance of planned, systematic murder in the entire history of Japan.” This is because the Japanese…
  • One reason is that there are very few modern historians in Japan. Until the end of the war, it would have been dangerously subversive, even blasphemous, for a critical scholar to write about modern…
  • they have considerable influence on public opinion, as television commentators, lecturers, and contributors to popular magazines. Virtually none of them are professional historians.
  • Tanaka and others have pointed out that it is physically impossible for one man to cut off a hundred heads with one blade, and that for the same reason Japanese troops could never have…
  • Besides, wrote Tanaka, none of the Japanese newspapers reported any massacre at the time, so why did it suddenly come up…
  • He admits that a few innocent people got killed in the cross fire, but these deaths were incidental. Some soldiers were doubtless a bit rough, but…
  • even he defends an argument that all the apologists make too: “On the battlefield men face the ultimate extremes of human existence, life or death. Extreme conduct, although still ethically…
  • atrocities carried out far from the battlefield dangers and imperatives and according to a rational plan were acts of evil barbarism. The Auschwitz gas chambers of our ‘ally’ Germany and the atomic bombing of our…
  • The point that it was not systematic was made by leftist opponents of the official scholars too. The historian Ienaga Saburo, for example, wrote that the Nanking Massacre, whose scale and horror he does not deny, “may have been a reaction to the fierce Chinese resistance after the Shanghai fighting.” Ienaga’s…
  • The nationalist right takes the opposite view. To restore the true identity of Japan, the emperor must be reinstated as a religious head of state, and Article Nine must be revised to make Japan a legitimate military power again. For this reason, the Nanking Massacre, or any other example of extreme Japanese aggression, has to be ignored, softened, or denied.
  • the question remains whether the raping and killing of thousands of women, and the massacre of thousands, perhaps hundreds of thousands, of other unarmed people, in the course of six weeks, can still be called extreme conduct in the heat of battle. The question is pertinent, particularly when such extreme violence is justified by an ideology which teaches the aggressors that killing an inferior race is in accordance with the will of their divine emperor.
  • The politics behind the symbol are so divided and so deeply entrenched that they hinder a rational historical debate about what actually happened in 1937. The more one side insists on Japanese guilt, the more the other insists on denying it.
  • The Nanking Massacre, for leftists and many liberals too, is the main symbol of Japanese militarism, supported by the imperial (and imperialist) cult. Which is why it is a keystone of postwar pacifism. Article Nine of the constitution is necessary to avoid another Nanking Massacre.
  • The Japanese, he said, should see their history through their own eyes, for “if we rely on the information of aliens and alien countries, who use history for the sake of propaganda, then we are in danger of losing the sense of our own history.” Yet another variation of seeing history through the eyes of identity.
  • their emotions were often quite at odds with the idea of “shame culture” versus “guilt culture.” Even where the word for shame, hazukashii, was used, its meaning was impossible to distinguish from the Western notion of guilt.
  • wasn’t so bad in itself. But then they killed them. You see, rape was against military regulations, so we had to destroy the evidence. While the women were fucked, they were considered human, but when we killed them, they were just pigs. We felt no shame about it, no guilt. If we had, we couldn’t have done it.
  • “Whenever we would enter a village, the first thing we’d do was steal food, then we’d take the women and rape them, and finally we’d kill all the men, women, and children to make sure they couldn’t slip away and tell the Chinese troops where we were. Otherwise we wouldn’t have been able to sleep at night.”
  • Clearly, then, the Nanking Massacre had been the culmination of countless massacres on a smaller scale. But it had been mass murder without a genocidal ideology. It was barbaric, but to Azuma and his comrades, barbarism was part of war.
  • “Sexual desire is human,” he said. “Since I suffered from a venereal disease, I never actually did it with Chinese women. But I did peep at their private parts. We’d always order them to drop their trousers. They never wore any underwear, you know. But the others did it with any woman that crossed our path.
  • He did have friends, however, who took part in the killings. One of them, Masuda Rokusuke, killed five hundred men by the Yangtze River with his machine gun. Azuma visited his friend in the hospital just before he died in the late 1980s. Masuda was worried about going to hell. Azuma tried to reassure him that he was only following orders. But Masuda remained convinced that he was going to hell.
  • “One of the worst moments I can remember was the killing of an old man and his grandson. The child was bayoneted and the grandfather started to suck the boy’s blood, as though to conserve his grandson’s life a bit longer. We watched a while and then killed both. Again, I felt no guilt, but I was bothered by this kind of thing. I felt confused. So I decided to keep a diary. I thought it might help me think straight.”
  • What about his old comrades? I asked. How did they discuss the war? “Oh,” said Azuma, “we wouldn’t talk about it much. When we did, it was to justify it. The Chinese resisted us, so we had to do what we did, and so on. None of us felt any remorse. And I include myself.”
  • got more and more agitated. "They turned the emperor into a living god, a false idol, like the Ayatollah in Iran or like Kim Il Sung. Because we believed in the divine emperor, we were prepared to do anything, anything at all, kill, rape, anything. But I know he fucked his wife every night, just like we do …" He paused and lowered his voice. "But you know we cannot say this in Japan, even today. It is impossible in this country to tell the truth."
  • My first instinct was to applaud West German education. Things had come a long way since 1968. There had been no school classes at Nuremberg, or even at the Auschwitz trial in Frankfurt from 1963 till 1965. Good for the teacher, I thought. Let them hear what was done. But I began to have doubts.
  • Just as belief belongs in church, surely history education belongs in school. When the court of law is used for history lessons, then the risk of show trials cannot be far off. It may be that show trials can be good politics—though I have my doubts about this too. But good politics don’t necessarily serve the truth.
  • There is a story about the young Richard when he was in Nuremberg at the time of the war crimes trials. He is said to have turned to a friend and to have remarked, in his best Wehrmacht officer style, that they should storm the court and release the prisoners. The friend, rather astonished, asked why on earth they should do such a thing. "So that we can try them ourselves" was Weizsäcker’s alleged response.
  • There was also concern that international law might not apply to many of the alleged crimes. If revenge was the point, why drag the law into it? Why not take a political decision to punish? This was what Becker, in his office, called the Italian solution: “You kill as many people as you can in the first six weeks, and then you forget about it: not very legal, but for the purposes of purification, well …”
  • Becker was not against holding trials as such. But he believed that existing German laws should have been applied, instead of retroactive laws about crimes against peace (preparing, planning, or waging an aggressive war).
  • It was to avoid a travesty of the legal process that the British had been in favor of simply executing the Nazi leaders without a trial. The British were afraid that a long trial might change public opinion. The trial, in the words of one British diplomat, might be seen as a “put-up job.”
  • The question is how to achieve justice without distorting the law, and how to stage a trial by victors over the vanquished without distorting history. A possibility would have been to make victors’ justice explicit, by letting military courts try the former enemies.
  • This would have avoided much hypocrisy and done less damage to the due process of law in civilian life. But if the intention was to teach Germans a history lesson, a military court would have run into the same problems as a civilian one.
  • Due process or revenge. This problem had preoccupied the ancient Greek tragedians. To break the cycle of vendetta, Orestes had to be tried by the Athens court for the murder of his mother. Without a formal trial, the vengeful Furies would continue to haunt the living.
  • The aspect of revenge might have been avoided had the trial been held by German judges. There was a precedent for this, but it was not a happy one. German courts had been allowed to try alleged war criminals after World War I. Despite strong evidence against them, virtually all were acquitted, and the foreign delegates were abused by local mobs. Besides, Wetzka was right: German judges had collaborated with the Nazi regime; they could hardly be expected to be impartial. So it was left to the victors to see that justice was done.
  • When the American chief prosecutor in Nuremberg, Robert H. Jackson, was asked by the British judge, Lord Justice Lawrence, what he thought the purpose of the trials should be, Jackson answered that they were to prove to the world that the German conduct of the war had been unjustified and illegal, and to demonstrate to the German people that this conduct deserved severe punishment and to prepare them for
  • What becomes clear from this kind of language is that law, politics, and religion became confused: Nuremberg became a morality play, in which Göring, Kaltenbrunner, Keitel, and the others were cast in the leading roles. It was a play that claimed to deliver justice, truth, and the defeat of evil.
  • The Nuremberg trials were to be a history lesson, then, as well as a symbolic punishment of the German people—a moral history lesson cloaked in all the ceremonial trappings of due legal process. They were the closest that man, or at least the men belonging to the victorious powers, could come to dispensing divine justice. This was certainly the way some German writers felt about it. Some welcomed it
  • We now have this law on our books, the prosecutor said: “It will be used against the German aggressor this time. But the four powers, who are conducting this trial in the name of twenty-three nations, know this law and declare: Tomorrow we shall be judged before history by the same yardstick by which we judge these defendants today.”
  • “We had seen through the amorality of the Nazis, and wanted to rid ourselves of it. It was from the moral seriousness of the American prosecution that we wished to learn sensible political thinking. “And we did learn. “And we allowed ourselves to apply this thinking to the present time. For example, we will use it now to take quite literally the morality of those American prosecutors. Oradour and Lidice—today they are cities in South Vietnam” (Italics in the original text.)
  • The play ends with a statement by the American prosecutor on crimes against peace
  • (It was decided in 1979, after the shock of the Holocaust TV series, to abolish the statute of limitations for crimes against humanity.)
  • after Nuremberg, most Germans were tired of war crimes. And until the mid-1950s German courts were permitted to deal only with crimes committed by Germans against other Germans. It took the bracing example of the Eichmann trial in Jerusalem to jolt German complacency—that, and the fact that crimes committed before 1946 would no longer be subject to prosecution after 1965.
  • Trying the vanquished for conventional war crimes was never convincing, since the victors could be accused of the same. Tu quoque could be invoked, in private if not in the Nuremberg court, when memories of Dresden and Soviet atrocities were still fresh. But Auschwitz had no equivalent. That was part of another war, or, better, it was not really a war at all; it was mass murder pure and simple, not for reasons of strategy or tactics, but of ideology alone.
  • Whether you are a conservative who wants Germany to be a “normal” nation or a liberal/leftist engaging in the “labor of mourning,” the key event of World War II is Auschwitz, not the Blitzkrieg, not Dresden, not even the war on the eastern front. This was the one history lesson of Nuremberg that stuck. As Hellmut Becker said, despite his skepticism about Nuremberg: “It was most important that the German population realized that crimes against humanity had taken place and that during the trials it became clear how they had taken place.”
  • In his famous essay on German guilt, Die Schuldfrage (The Question of German Guilt), written in 1946, Karl Jaspers distinguished four categories of guilt: criminal guilt, for breaking the law; political guilt, for being part of a criminal political system; moral guilt, for personal acts of criminal behavior; and metaphysical guilt, for failing in one’s responsibility to maintain the standards of civilized humanity. Obviously these categories overlap.
  • The great advantage, in his view, of a war crimes trial was its limitation. By allowing the accused to defend themselves with arguments, by laying down the rules of due process, the victors limited their own powers.
  • In any event, the trial distanced the German people even further from their former leaders. It was a comfortable distance, and few people had any desire to bridge it. This might be why the Nazi leaders are hardly ever featured in German plays, films, or novels.
  • And: “For us Germans this trial has the advantage that it distinguishes between the particular crimes of the leaders and that it does not condemn the Germans collectively.”
  • Serious conservative intellectuals, such as Hermann Lübbe, argued that too many accusations would have blocked West Germany’s way to becoming a stable, prosperous society. Not that Lübbe was an apologist for the Third Reich. Far from it: the legitimacy of the Federal Republic, in his opinion, lay in its complete rejection of the Nazi state.
  • their reaction was often one of indignation. “Why me?” they would say. “I just did my duty. I just followed orders like every decent German. Why must I be punished?”
  • “that these criminals were so like all of us at any point between 1918 and 1945 that we were interchangeable, and that particular circumstances caused them to take a different course, which resulted in this trial, these matters could not be properly discussed in the courtroom.” The terrible acts of individuals are lifted from their historical context. History is reduced to criminal pathology and legal argument.
  • they will not do as history lessons, nor do they bring us closer to that elusive thing that Walser seeks, a German identity.
  • The GDR had its own ways of using courts of law to deal with the Nazi past. They were in many respects the opposite of West German ways. The targets tended to be the very people that West German justice had ignored.
  • Thorough purges took place in the judiciary, the bureaucracy, and industry. About 200,000 people—four-fifths of the Nazi judges and prosecutors—lost their jobs. War crimes trials were held too; until 1947 by the Soviets, after that in German courts.
  • There were two more before 1957, and none after that. All in all, about 30,000 people had been tried and 500 executed. In the Federal Republic the number was about 91,000, and none were executed, as the death penalty was abolished by the 1949 constitution.
  • East German methods were both ruthless and expedient, and the official conclusion to the process was that the GDR no longer had to bear the burden of guilt. As state propaganda ceaselessly pointed out, the guilty were all in the West. There the fascists still sat as judges and ran the industries that produced the economic boom, the Wirtschaftswunder.
  • Although some of his critics, mostly on the old left, in both former Germanys, called him a grand inquisitor, few doubted the pastor’s good intentions. His arguments for trials were moral, judicial, and historical. He set out his views in a book entitled The Stasi Documents. Echoes of an earlier past rang through almost every page. "We can
  • Germany of the guilty, the people who felt betroffen by their own “inability to mourn,” the nation that staged the Auschwitz and Majdanek trials, that Germany was now said to stand in judgment over the other Germany—the Germany of the old antifascists, the Germany that had suffered under two dictatorships, the Germany of uniformed marches, goose-stepping drills, and a secret police network, vast beyond even the Gestapo’s dreams.
  • It is almost a form of subversion to defend a person who stands accused in court. So the idea of holding political and military leaders legally accountable for their actions was even stranger in Japan than it was in Germany. And yet, the shadows thrown by the Tokyo trial have been longer and darker in Japan than those of the Nuremberg trial in Germany.
  • never was—unlike, say, the railway station or the government ministry—a central institution of the modern Japanese state. The law was not a means to protect the people from arbitrary rule; it was, rather, a way for the state to exercise more control over the people. Even today, there are relatively few lawyers in Japan.
  • Japanese school textbooks are the product of so many compromises that they hardly reflect any opinion at all. As with all controversial matters in Japan, the more painful, the less said. In a standard history textbook for middle school students, published in the 1980s, mention of the Tokyo trial takes up less than half a page. All it says is that the trial…
  • As long as the British and the Americans continued to be oppressors in Asia, wrote a revisionist historian named Hasegawa Michiko, who was born in 1945, “confrontation with Japan was inevitable. We did not fight for Japan alone. Our aim was to fight a Greater East Asia War. For this reason the war between Japan and China and Japan’s oppression of…
  • West German textbooks describe the Nuremberg trial in far more detail. And they make a clear distinction between the retroactive law on crimes against peace and the…
  • Nationalist revisionists talk about “the Tokyo Trial View of History,” as though the conclusions of the tribunal had been nothing but rabid anti-Japanese propaganda. The tribunal has been called a lynch mob, and Japanese leftists are blamed for undermining the morale of generations of Japanese by passing on the Tokyo Trial View of History in school textbooks and liberal publications. The Tokyo Trial…
  • When Hellmut Becker said that few Germans wished to criticize the procedures of the Nuremberg trial because the criminality of the defendants was so plain to see, he was talking about crimes against humanity—more precisely, about the Holocaust. And it was…
  • The knowledge compiled by the doctors of Unit 731—of freezing experiments, injection of deadly diseases, vivisections, among other things—was considered so valuable by the Americans in 1945 that the doctors…
  • those aspects of the war that were most revolting and furthest removed from actual combat, such as the medical experiments on human guinea pigs (known as “logs”) carried out by Unit 731 in…
  • There never were any Japanese war crimes trials, nor is there a Japanese Ludwigsburg. This is partly because there was no exact equivalent of the Holocaust. Even though the behavior of Japanese troops was often barbarous, and the psychological consequences of State Shinto and emperor worship were frequently as hysterical as Nazism, Japanese atrocities were part of a…
  • This difference between (West) German and Japanese textbooks is not just a matter of detail; it shows a gap in perception. To the Japanese, crimes against humanity are not associated with an equivalent to the…
  • on what grounds would Japanese courts have prosecuted their own former leaders? Hata’s answer: “For starting a war which they knew they would lose.” Hata used the example of General Galtieri and his colleagues in Argentina after losing the Falklands War. In short, they would have been tried for losing the war and for the intense suffering they inflicted on their own people. This is as though German courts in 1918 had put General Hindenburg or General Ludendorff on trial.
  • it shows yet again the fundamental difference between the Japanese war, in memory and, I should say, in fact, and the German experience. The Germans fought a war too, but the one for which they tried their own people, the Bogers and the Schwammbergers, was a war they could not lose, unless defeat meant that some of the enemies survived.
  • Just as German leftists did in the case of Nuremberg, Kobayashi used the trial to turn the tables against the judges. But not necessarily to mitigate Japanese guilt. Rather, it was his intention to show how the victors had betrayed the pacifism they themselves had imposed on Japan.
  • the Japanese left has a different view of the Tokyo trial than the revisionist right. It is comparable to the way the German left looks upon Nuremberg. This was perfectly, if somewhat long-windedly, expressed in Kobayashi Masaki’s documentary film Tokyo Trial, released in 1983. Kobayashi is anything but an apologist for the Japanese war. His most famous film, The Human Condition, released in 1959, took a highly critical view of the war.
  • Yoshimoto’s memory was both fair and devastating, for it pointed straight at the reason for the trial’s failure. The rigging of a political trial—the “absurd ritual”—undermined the value of that European idea of law.
  • Yoshimoto went on to say something no revisionist would ever mention: “I also remember my fresh sense of wonder at this first encounter with the European idea of law, which was so different from the summary justice in our Asiatic courts. Instead of getting your head chopped off without a proper trial, the accused were able to defend themselves, and the careful judgment appeared to follow a public procedure.”
  • Yoshimoto Takaaki, philosopher of the 1960s New Left. Yet he wrote in 1986 that “from our point of view as contemporaries and witnesses, the trial was partly plotted from the very start. It was an absurd ritual before slaughtering the sacrificial lamb.”
  • This, from all accounts, was the way it looked to most Japanese, even if they had little sympathy for most of the “lambs.” In 1948, after three years of American occupation censorship and boosterism, people listened to the radio broadcast of the verdicts with a sad but fatalist shrug: this is what you can expect when you lose the war.
  • Some of the information even surprised the defendants. General Itagaki Seishiro, a particularly ruthless figure, who was in command of prison camps in Southeast Asia and whose troops had massacred countless Chinese civilians, wrote in his diary: “I am learning of matters I had not known and recalling things I had forgotten.”
  • hindsight, one can only conclude that instead of helping the Japanese to understand and accept their past, the trial left them with an attitude of cynicism and resentment.
  • After it was over, the Nippon Times pointed out the flaws of the trial, but added that “the Japanese people must ponder over why it is that there has been such a discrepancy between what they thought and what the rest of the world accepted almost as common knowledge. This is at the root of the tragedy which Japan brought upon herself.”
  • Political trials produce politicized histories. This is what the revisionists mean when they talk about the Tokyo Trial View of History. And they are right, even if their own conclusions are not.
  • Frederick Mignone, one of the prosecutors, said a trifle histrionically that “in Japan and in the Orient in general, the trial is one of the most important phases of the occupation. It has received wide coverage in the Japanese press and revealed for the first time to millions of Japanese the scheming, duplicity, and insatiable desire for power of her entrenched militaristic leaders, writing a much-needed history of events which otherwise would not have been written.” It was indeed much-needed, since so little was known.
  • The president of the Tokyo tribunal, Sir William Webb, thought “the crimes of the German accused were far more heinous, varied and extensive than those of the Japanese accused.” Put in another way, nearly all the defendants at Nuremberg, convicted of crimes against peace, were also found guilty of crimes against humanity. But half the Japanese defendants received life sentences for political crimes only.
  • the question of responsibility is always a tricky affair in Japan, where formal responsibility is easier to identify than actual guilt. Not only were there many men, such as the hero of Kinoshita’s play, who took the blame for what their superiors had done—a common practice in Japan, in criminal gangs as well as in politics or business corporations—but the men at the top were often not at all in control of their unscrupulous subordinates.
  • “These men were not the hoodlums who were the powerful part of the group which stood before the tribunal at Nuremberg, dregs of a criminal environment, thoroughly schooled in the ways of crime and knowing no other methods but those of crime. These men were supposed to be the elite of the nation, the honest and trusted leaders to whom the fate of the nation had been confidently entrusted
  • many people were wrongly accused of the wrong things for the wrong reasons. This is why there was such sympathy in Japan for the men branded by foreigners as war criminals, particularly the so-called Class B and Class C criminals, the men who followed orders, or gave them at a lower level: field commanders, camp guards, and so on.
  • “The Japanese people are of the opinion that the actual goal of the war crimes tribunals was never realized, since the judgments were reached by the victors alone and had the character of revenge. The [Japanese] war criminal is not conscious of having committed a crime, for he regards his deeds as acts of war, committed out of patriotism.”
  • Yamashita Tomoyuki. Terrible atrocities were committed under his command in the Philippines. The sacking of Manila in 1945 was about as brutal as the Nanking Massacre. So to depict him in the movie as a peaceful gentleman, while portraying the American prosecutor in Manila as one of the main villains, might seem an odd way to view the past.
  • The Shrine ranks highest. It is the supreme symbol of authority, shouldered (like a shrine on festival days) by the Officials.
  • The political theorist Maruyama Masao called the prewar Japanese government a “system of irresponsibilities.” He identified three types of political personalities: the portable Shrine, the Official, and the Outlaw.
  • those who carry it, the Officials, are the ones with actual power. But the Officials—bureaucrats, politicians, admirals and generals—are often manipulated by the lowest-ranking Outlaws, the military mavericks, the hotheaded officers in the field, the mad nationalists, and other agents of violence.
  • But it was not entirely wrong, for the trial was rigged. Yamashita had no doubt been a tough soldier, but in this case he had been so far removed from the troops who ran amok in Manila that he could hardly have known what was going on. Yet the American prosecutor openly talked about his desire to hang “Japs.”
  • When the system spins out of control, as it did during the 1930s, events are forced by violent Outlaws, reacted to by nervous Officials, and justified by the sacred status of the Shrines.
  • Here we come to the nub of the problem, which the Tokyo trial refused to deal with, the role of the Shrine in whose name every single war crime was committed, Emperor Hirohito,
  • The historian Ienaga Saburo tells a story about a Japanese schoolchild in the 1930s who was squeamish about having to dissect a live frog. The teacher rapped him hard on the head with his knuckles and said: “Why are you crying about one lousy frog? When you grow up you’ll have to kill a hundred, two hundred Chinks.”
  • the lethal consequences of the emperor-worshipping system of irresponsibilities did emerge during the Tokyo trial. The savagery of Japanese troops was legitimized, if not driven, by an ideology that did not include a Final Solution but was as racialist as Hitler’s National Socialism. The Japanese were the Asian Herrenvolk, descended from the gods.
  • A veteran of the war in China said in a television interview that he was able to kill Chinese without qualms only because he didn’t regard them as human.
  • For to keep the emperor in place (he could at least have been made to resign), Hirohito’s past had to be freed from any blemish; the symbol had to be, so to speak, cleansed from what had been done in its name.
  • The same was true of the Japanese imperial institution, no matter who sat on the throne, a ruthless war criminal or a gentle marine biologist.
  • the chaplain at Sugamo prison, questioned Japanese camp commandants about their reasons for mistreating POWs. This is how he summed up their answers: “They had a belief that any enemy of the emperor could not be right, so the more brutally they treated their prisoners, the more loyal to their emperor they were being.”
  • The Mitscherlichs described Hitler as “an object on which Germans depended, to which they transferred responsibility, and he was thus an internal object. As such, he represented and revived the ideas of omnipotence that we all cherish about ourselves from infancy.
  • The fear after 1945 was that without the emperor Japan would be impossible to govern. In fact, MacArthur behaved like a traditional Japanese strongman (and was admired for doing so by many Japanese), using the imperial symbol to enhance his own power. As a result, he hurt the chances of a working Japanese democracy and seriously distorted history.
  • Aristides George Lazarus, the defense counsel of one of the generals on trial, was asked to arrange that “the military defendants, and their witnesses, would go out of their way during their testimony to include the fact that Hirohito was only a benign presence when military actions or programs were discussed at meetings that, by protocol, he had to attend.” No doubt the other counsel were given similar instructions. Only once during the trial
Javier E

A Catholic Tribute to Lord Sacks | Sohrab Ahmari | First Things - 0 views

  • The West, according to an account beloved by Catholics, rose out of a providential encounter between reason and revelation in antiquity. Though occasioned by conquest, the encounter yielded an authentic synthesis: between a Greek rationality in search of the deepest origin of reality and a Jewish God professed to be just that, the very ground of being (cf. Ex 3:14). Later, that same God identified himself even more starkly and intimately with reason (cf. Jn 1:1).
  • Tragically, the story goes on, this synthesis eventually lost its supremacy in the West, owing foremost to opponents inside the Church determined to distill a “purer” faith, unmottled by “worldly” philosophy. The result was a stingy account of reason that excluded things divine and paved the way for a narrowly scientistic rationality
  • Today, we are the victims of this dis-integration, a process of Christian de-Hellenization centuries in the making.
  • ...28 more annotations...
  • The late Rabbi Lord Jonathan Sacks, who died last month, utterly rejected this account of faith and reason. 
  • The God of the Hebrew Bible, he believed, was never the God of the Academy to begin with. The God of Abraham, Isaac, and Jacob is neither the unmoved mover nor the ground of being, but a historical God, who has put himself in dialogue and relationship with one people, the Jews.
  • little about him could be deduced by processes of reason. He is best known, rather, through the moral revolution heralded by Abrahamic faith: Judaism first, followed by Christianity and Islam.
  • De-Hellenization was thus no skin off the back of biblical faith, rightly understood. For, in this telling, the faith of the Jews, including Jesus, had always sat uneasily with the “faith” of Plato and Aristotle.
  • The synthesis between the two collapsed once its Greek metaphysical structure gave way to the battering ram of modern science.
  • The God of the Bible, Sacks contended, was lost in the bargain of Saint Paul’s ambition to spread his newfound faith to the Greco-Roman sphere. More to the point, God was lost in translation. The Greek language, with its left-to-right script, per Sacks, tends toward abstraction and universalization, whereas Hebrew is fundamentally a “right-brained” language, tending toward narrative and particularity.
  • The result was that the West received an abstract, theoretical version of a supremely narrativistic deity.
  • The Hebrew Bible, Sacks believed, has no “theory” of being itself, of natural law or of political regimes.
  • Sacks was, in truth, a pure anti-metaphysicist. In his 2011 book, The Great Partnership: Science, Religion and the Search for Meaning, he declared: “We cannot prove that life is meaningful and that God exists.”
  • he was thrilled by his atheist teachers’ demolition of the classical proofs for God, which he’d always considered a kind of cheap sleight of hand.
  • “Neither can we prove that love is better than hate, altruism than selfishness, forgiveness than the desire for revenge.” All of these statements are a matter of “interpretation,” rather than of “explanation,” and all interpretations are beyond proof or falsification.
  • The quest for ultimate meaning, he argued, falls into the same territory as “ethics, aesthetics and metaphysics”—and “in none of these three disciplines can anything of consequence be proved.”
  • Ethics, aesthetics, and metaphysics are great “repositories of human wisdom,” to be sure, but they simply don’t belong in “the same universe of discourse” as science.
  • If we distinguish the two discourses, neither need threaten the other: The one (science) explains the world by “taking things apart,” as Sacks put it; the other (religion) puts them back together via interpretation and moral formation.
  • For many Catholic intellectuals, not least Benedict XVI, restoring religion to its rightful place in human affairs involves undoing the philosophical mistakes of nominalism and of the Reformation, which the pope emeritus singled out for criticism in his much-misunderstood 2006 Regensburg Lecture.
  • We must dilate reason’s scope, Benedict thought, so that “reasoning” might again include more than merely observing phenomena and identifying their efficient material causes. Sacks did not think faith and reason could be reunited in this way.
  • But shouldn't we try? I seek ultimate meaning, yes, but I want that meaning to be true in a way that satisfies reason’s demands. And there lies the disagreement, I think, between “Regensburg Catholics,” if you will, and the various de-Hellenizing strands of contemporary religious thought.
  • despite rejecting almost in toto the Church’s account of faith and reason, Sacks nevertheless credited it for the fundamental humaneness of Western civilization.
  • More than that, the rabbi blamed the mass horrors of modernity on the narrow and arrogant rationalism that supplanted the old synthesis.
  • “Outside religion,” he wrote, there is no secure base for the unconditional source of worth that in the West has come from the idea that we are each in God’s image.
  • Though many have tried to create a secular substitute, none has ultimately succeeded. None has stood firm under pressure. That has been demonstrated four times in the modern world, when an attempt was made to create a social order on secular lines: the French Revolution, Stalinist Russia, Nazi Germany and Communist China. When there is a bonfire of sanctities, lives are lost.
  • As a student of Jewish history, Sacks knew well that the old synthesis of faith and reason wasn’t always a guarantee against unreason when it came to the treatment of Jews within Christendom. Nevertheless, he was far more wary of the merciless abstractions of the post-Enlightenment era
  • Sacks, to be clear, was no counter-Enlightenment thinker. And he paid gracious tribute to the modern scientific enterprise as an almost-miraculous instance of human cooperation with divine creativity.
  • Nevertheless, he insisted, the Enlightenment ideology, with its tendency to apply the methods of scientific inquiry to all of life, “dehumanize[d] human beings.” Its universalist “reason” detested particularity, not least the stubborn particularity of the Jewish people
  • Moreover, it targeted for demolition, in the name of humanity and reason, “the local, the church, the neighborhood, the community, even the family, the things that make us different, attached.”
  • Sacks saw similar dangers at work in today’s market liberalism: “a loss of belief in the dignity and sanctity of life”; “the loss of the politics of covenant, the idea that society is a place where we undertake collective responsibility for the common good”; “a loss of morality”; “the loss of marriage”; and the loss of “the possibility of a meaningful life.” In short, the technocratic dystopia we are stumbling into.
  • Except, Sacks rightly insisted, we don’t have to, provided we can make room in our lives and societies for “the still-small voice that the Bible tells us is the voice of God”:
  • Sacks felt that divine voice couldn’t be definitively reasoned about, certainly not in the way that, say, Benedict XVI called for. Yet the rabbi’s own public presence—supremely learned yet humble and unfailingly charitable, even to his most vicious secularist opponents—was and will remain an enduring testament to the reasonableness of faith. 
kennyn-77

Why the gap between men and women finishing college is growing | Pew Research Center - 0 views

  • Young women are more likely to be enrolled in college today than young men, and among those ages 25 and older, women are more likely than men to have a four-year college degree. The gap in college completion is even wider among younger adults ages 25 to 34.
  • A majority (62%) of U.S. adults ages 25 and older don’t have a four-year college degree,
  • (29%) say a major reason for this is that they just didn’t want to, 23% say they didn’t need more education for the job or career they wanted, and 20% say they just didn’t consider getting a four-year degree. Relatively few (13%) adults without a bachelor’s degree say a major reason they didn’t pursue this level of education was that they didn’t think they’d get into a four-year college.
  • ...9 more annotations...
  • roughly four-in-ten (42%) say a major reason why they have not received a four-year college degree is that they couldn’t afford college. Some 36% say needing to work to help support their family was a major reason they didn’t get their degree.
  • Roughly a third (34%) of men without a bachelor’s degree say a major reason they didn’t complete college is that they just didn’t want to. Only one-in-four women say the same. Non-college-educated men are also more likely than their female counterparts to say a major reason they don’t have a four-year degree is that they didn’t need more education for the job or career they wanted (26% of men say this vs. 20% of women).
  • Women (44%) are more likely than men (39%) to say not being able to afford college is a major reason they don’t have a bachelor’s degree. Men and women are about equally likely to say needing to work to help support their family was a major impediment.
  • Hispanic adults (52%) are more likely than those who are White (39%) or Black (41%) to say a major reason they didn’t graduate from a four-year college is that they couldn’t afford it. Hispanic and Black adults without a four-year degree are more likely than their White counterparts to say needing to work to support their family was a major reason.
  • While a third of White adults without a four-year degree say not wanting to go to school was a major reason they didn’t complete a four-year degree, smaller shares of Black (22%) and Hispanic (23%) adults say the same. White adults are also more likely to say not needing more education for the job or career they wanted is a major reason why they don’t have a bachelor’s degree.
  • About four-in-ten White men who didn’t complete four years of college (39%) say a major reason for this is that they just didn’t want to. This compares with 27% of White women without a degree.
  • Similarly, while three-in-ten White men without a college degree say a major reason they didn’t complete college is that they didn’t need more education for the job or career they wanted, only 24% of White women say the same.
  • Overall, 49% of four-year college graduates say their college education was extremely useful in terms of helping them grow personally and intellectually. Roughly equal shares of men (47%) and women (50%) express this view.
  • Some 44% of college graduates – including 45% of men and 43% of women – say their college education was extremely useful to them in opening doors to job opportunities. A somewhat smaller share of bachelor’s degree holders (38%) say college was extremely useful in helping them develop specific skills and knowledge that could be used in the workplace (38% of men and 40% of women say this).
Javier E

How America Went Haywire - The Atlantic - 0 views

  • You are entitled to your own opinion, but you are not entitled to your own facts.
  • Why are we like this?The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned.
  • The word mainstream has recently become a pejorative, shorthand for bias, lies, oppression by the elites.
  • ...92 more annotations...
  • Yet the institutions and forces that once kept us from indulging the flagrantly untrue or absurd—media, academia, government, corporate America, professional associations, respectable opinion in the aggregate—have enabled and encouraged every species of fantasy over the past few decades.
  • Our whole social environment and each of its overlapping parts—cultural, religious, political, intellectual, psychological—have become conducive to spectacular fallacy and truthiness and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense. During the past several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks, which Donald Trump slid down right into the White House.
  • Esalen is a mother church of a new American religion for people who think they don’t like churches or religions but who still want to believe in the supernatural. The institute wholly reinvented psychology, medicine, and philosophy, driven by a suspicion of science and reason and an embrace of magical thinking
  • The great unbalancing and descent into full Fantasyland was the product of two momentous changes. The first was a profound shift in thinking that swelled up in the ’60s; since then, Americans have had a new rule written into their mental operating systems: Do your own thing, find your own reality, it’s all relative.
  • The second change was the onset of the new era of information. Digital technology empowers real-seeming fictions of the ideological and religious and scientific kinds. Among the web’s 1 billion sites, believers in anything and everything can find thousands of fellow fantasists, with collages of facts and “facts” to support them
  • Today, each of us is freer than ever to custom-make reality, to believe whatever and pretend to be whoever we wish. Which makes all the lines between actual and fictional blur and disappear more easily. Truth in general becomes flexible, personal, subjective. And we like this new ultra-freedom, insist on it, even as we fear and loathe the ways so many of our wrongheaded fellow Americans use it.
  • we are the global crucible and epicenter. We invented the fantasy-industrial complex; almost nowhere outside poor or otherwise miserable countries are flamboyant supernatural beliefs so central to the identities of so many people.
  • We’re still rich and free, still more influential and powerful than any other nation, practically a synonym for developed country. But our drift toward credulity, toward doing our own thing, toward denying facts and having an altogether uncertain grip on reality, has overwhelmed our other exceptional national traits and turned us into a less developed country.
  • For most of our history, the impulses existed in a rough balance, a dynamic equilibrium between fantasy and reality, mania and moderation, credulity and skepticism.
  • It was a headquarters for a new religion of no religion, and for “science” containing next to no science. The idea was to be radically tolerant of therapeutic approaches and understandings of reality, especially if they came from Asian traditions or from American Indian or other shamanistic traditions. Invisible energies, past lives, astral projection, whatever—the more exotic and wondrous and unfalsifiable, the better.
  • These influential critiques helped make popular and respectable the idea that much of science is a sinister scheme concocted by a despotic conspiracy to oppress people. Mental illness, both Szasz and Laing said, is “a theory not a fact.”
  • The Greening of America may have been the mainstream’s single greatest act of pandering to the vanity and self-righteousness of the new youth. Its underlying theoretical scheme was simple and perfectly pitched to flatter young readers: There are three types of American “consciousness,” each of which “makes up an individual’s perception of reality … his ‘head,’ his way of life.” Consciousness I people were old-fashioned, self-reliant individualists rendered obsolete by the new “Corporate State”—essentially, your grandparents. Consciousness IIs were the fearful and conformist organization men and women whose rationalism was a tyrannizing trap laid by the Corporate State—your parents.
  • And then there was Consciousness III, which had “made its first appearance among the youth of America,” “spreading rapidly among wider and wider segments of youth, and by degrees to older people.” If you opposed the Vietnam War and dressed down and smoked pot, you were almost certainly a III. Simply by being young and casual and undisciplined, you were ushering in a new utopia.
  • Reich was half-right. An epochal change in American thinking was under way and “not, as far as anybody knows, reversible … There is no returning to an earlier consciousness.” His wishful error was believing that once the tidal surge of new sensibility brought down the flood walls, the waters would flow in only one direction, carving out a peaceful, cooperative, groovy new continental utopia, hearts and minds changed like his, all of America Berkeleyized and Vermontified. Instead, Consciousness III was just one early iteration of the anything-goes, post-reason, post-factual America enabled by the tsunami.
  • During the ’60s, large swaths of academia made a turn away from reason and rationalism as they’d been understood. Many of the pioneers were thoughtful, their work fine antidotes to postwar complacency. The problem was the nature and extent of their influence at that particular time, when all premises and paradigms seemed up for grabs. That is, they inspired half-baked and perverse followers in the academy, whose arguments filtered out into the world at large: All approximations of truth, science as much as any fable or religion, are mere stories devised to serve people’s needs or interests. Reality itself is a purely social construction, a tableau of useful or wishful myths that members of a society or tribe have been persuaded to believe. The borders between fiction and nonfiction are permeable, maybe nonexistent.
  • The delusions of the insane, superstitions, and magical thinking? Any of those may be as legitimate as the supposed truths contrived by Western reason and science. The takeaway: Believe whatever you want, because pretty much everything is equally true and false.
  • over in sociology, in 1966 a pair of professors published The Social Construction of Reality, one of the most influential works in their field. Not only were sanity and insanity and scientific truth somewhat dubious concoctions by elites, Peter Berger and Thomas Luckmann explained—so was everything else. The rulers of any tribe or society do not just dictate customs and laws; they are the masters of everyone’s perceptions, defining reality itself
  • Over in anthropology, where the exotic magical beliefs of traditional cultures were a main subject, the new paradigm took over completely—don’t judge, don’t disbelieve, don’t point your professorial finger.
  • then isn’t everyone able—no, isn’t everyone obliged—to construct their own reality? The book was timed perfectly to become a foundational text in academia and beyond.
  • To create the all-encompassing stage sets that everyone inhabits, rulers first use crude mythology, then more elaborate religion, and finally the “extreme step” of modern science. “Reality”? “Knowledge”? “If we were going to be meticulous,” Berger and Luckmann wrote, “we would put quotation marks around the two aforementioned terms every time we used them.” “What is ‘real’ to a Tibetan monk may not be ‘real’ to an American businessman.”
  • In the ’60s, anthropology decided that oracles, diviners, incantations, and magical objects should be not just respected, but considered equivalent to reason and science. If all understandings of reality are socially constructed, those of Kalabari tribesmen in Nigeria are no more arbitrary or faith-based than those of college professors.
  • Even the social critic Paul Goodman, beloved by young leftists in the ’60s, was flabbergasted by his own students by 1969. “There was no knowledge,” he wrote, “only the sociology of knowledge. They had so well learned that … research is subsidized and conducted for the benefit of the ruling class that they did not believe there was such a thing as simple truth.”
  • Ever since, the American right has insistently decried the spread of relativism, the idea that nothing is any more correct or true than anything else. Conservatives hated how relativism undercut various venerable and comfortable ruling ideas—certain notions of entitlement (according to race and gender) and aesthetic beauty and metaphysical and moral certainty
  • Conservatives are correct that the anything-goes relativism of college campuses wasn’t sequestered there, but when it flowed out across America it helped enable extreme Christianities and lunacies on the right—gun-rights hysteria, black-helicopter conspiracism, climate-change denial, and more.
  • Elaborate paranoia was an established tic of the Bircherite far right, but the left needed a little time to catch up. In 1964, a left-wing American writer published the first book about a JFK conspiracy, claiming that a Texas oilman had been the mastermind, and soon many books were arguing that the official government inquiry had ignored the hidden conspiracies.
  • Conspiracy became the high-end Hollywood dramatic premise—Chinatown, The Conversation, The Parallax View, and Three Days of the Condor came out in the same two-year period. Of course, real life made such stories plausible. The infiltration by the FBI and intelligence agencies of left-wing groups was then being revealed, and the Watergate break-in and its cover-up were an actual criminal conspiracy. Within a few decades, the belief that a web of villainous elites was covertly seeking to impose a malevolent global regime made its way from the lunatic right to the mainstream.
  • More and more people on both sides would come to believe that an extraordinarily powerful cabal—international organizations and think tanks and big businesses and politicians—secretly ran America.
  • Each camp, conspiracists on the right and on the left, was ostensibly the enemy of the other, but they began operating as de facto allies. Relativist professors enabled science-denying Christians, and the antipsychiatry craze in the ’60s appealed simultaneously to left-wingers and libertarians (as well as to Scientologists). Conspiracy theories were more of a modern right-wing habit before people on the left signed on. However, the belief that the federal government had secret plans to open detention camps for dissidents sprouted in the ’70s on the paranoid left before it became a fixture on the right.
  • Extreme religious and quasi-religious beliefs and practices, Christian and New Age and otherwise, didn’t subside, but grew and thrived—and came to seem unexceptional.
  • Until we’d passed through the ’60s and half of the ’70s, I’m pretty sure we wouldn’t have given the presidency to some dude, especially a born-again Christian, who said he’d recently seen a huge, color-shifting, luminescent UFO hovering near him.
  • Starting in the ’80s, loving America and making money and having a family were no longer unfashionable.The sense of cultural and political upheaval and chaos dissipated—which lulled us into ignoring all the ways that everything had changed, that Fantasyland was now scaling and spreading and becoming the new normal. What had seemed strange and amazing in 1967 or 1972 became normal and ubiquitous.
  • For most of the 20th century, national news media had felt obliged to pursue and present some rough approximation of the truth rather than to promote a truth, let alone fictions. With the elimination of the Fairness Doctrine, a new American laissez-faire had been officially declared. If lots more incorrect and preposterous assertions circulated in our mass media, that was a price of freedom. If splenetic commentators could now, as never before, keep believers perpetually riled up and feeling the excitement of being in a mob, so be it.
  • Relativism became entrenched in academia—tenured, you could say
  • as he wrote in 1986, “the secret of theory”—this whole intellectual realm now called itself simply “theory”—“is that truth does not exist.”
  • After the ’60s, truth was relative, criticizing was equal to victimizing, individual liberty became absolute, and everyone was permitted to believe or disbelieve whatever they wished. The distinction between opinion and fact was crumbling on many fronts.
  • America didn’t seem as weird and crazy as it had around 1970. But that’s because Americans had stopped noticing the weirdness and craziness. We had defined every sort of deviancy down. And as the cultural critic Neil Postman put it in his 1985 jeremiad about how TV was replacing meaningful public discourse with entertainment, we were in the process of amusing ourselves to death.
  • In 1998, as soon as we learned that President Bill Clinton had been fellated by an intern in the West Wing, his popularity spiked. Which was baffling only to those who still thought of politics as an autonomous realm, existing apart from entertainment
  • Just before the Clintons arrived in Washington, the right had managed to do away with the federal Fairness Doctrine, which had been enacted to keep radio and TV shows from being ideologically one-sided. Until then, big-time conservative opinion media had consisted of two magazines, William F. Buckley Jr.’s biweekly National Review and the monthly American Spectator, both with small circulations. But absent a Fairness Doctrine, Rush Limbaugh’s national right-wing radio show, launched in 1988, was free to thrive, and others promptly appeared.
  • I’m pretty certain that the unprecedented surge of UFO reports in the ’70s was not evidence of extraterrestrials’ increasing presence but a symptom of Americans’ credulity and magical thinking suddenly unloosed. We wanted to believe in extraterrestrials, so we did.
  • Limbaugh’s virtuosic three hours of daily talk started bringing a sociopolitical alternate reality to a huge national audience. Instead of relying on an occasional magazine or newsletter to confirm your gnarly view of the world, now you had talk radio drilling it into your head for hours every day.
  • Fox News brought the Limbaughvian talk-radio version of the world to national TV, offering viewers an unending and immersive propaganda experience of a kind that had never existed before.
  • Over the course of the century, electronic mass media had come to serve an important democratic function: presenting Americans with a single shared set of facts. Now TV and radio were enabling a reversion to the narrower, factional, partisan discourse that had been normal in America’s earlier centuries.
  • there was also the internet, which eventually would have mooted the Fairness Doctrine anyhow. In 1994, the first modern spam message was sent, visible to everyone on Usenet: global alert for all: jesus is coming soon. Over the next year or two, the masses learned of the World Wide Web. The tinder had been gathered and stacked since the ’60s, and now the match was lit and thrown
  • After the ’60s and ’70s happened as they happened, the internet may have broken America’s dynamic balance between rational thinking and magical thinking for good.
  • Before the web, cockamamy ideas and outright falsehoods could not spread nearly as fast or as widely, so it was much easier for reason and reasonableness to prevail. Before the web, institutionalizing any one alternate reality required the long, hard work of hundreds of full-time militants. In the digital age, however, every tribe and fiefdom and principality and region of Fantasyland—every screwball with a computer and an internet connection—suddenly had an unprecedented way to instruct and rile up and mobilize believers
  • Why did Senator Daniel Patrick Moynihan begin remarking frequently during the ’80s and ’90s that people were entitled to their own opinions but not to their own facts? Because until then, that had not been necessary to say
  • Reason remains free to combat unreason, but the internet entitles and equips all the proponents of unreason and error to a previously unimaginable degree. Particularly for a people with our history and propensities, the downside of the internet seems at least as profound as the upside.
  • On the internet, the prominence granted to any factual assertion or belief or theory depends on the preferences of billions of individual searchers. Each click on a link is effectively a vote pushing that version of the truth toward the top of the pile of results.
  • Exciting falsehoods tend to do well in the perpetual referenda, and become self-validating. A search for almost any “alternative” theory or belief seems to generate more links to true believers’ pages and sites than to legitimate or skeptical ones, and those tend to dominate the first few pages of results
  • If more and more of a political party’s members hold more and more extreme and extravagantly supernatural beliefs, doesn’t it make sense that the party will be more and more open to make-believe in its politics?
  • an individual who enters the communications system pursuing one interest soon becomes aware of stigmatized material on a broad range of subjects. As a result, those who come across one form of stigmatized knowledge will learn of others, in connections that imply that stigmatized knowledge is a unified domain, an alternative worldview, rather than a collection of unrelated ideas.
  • Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, “Individuals’ explicit religious and paranormal beliefs” are the best predictors of their “perception of purpose in life events”—their tendency “to view the world in terms of agency, purpose, and design.”
  • Americans have believed for centuries that the country was inspired and guided by an omniscient, omnipotent planner and interventionist manager. Since the ’60s, that exceptional religiosity has fed the tendency to believe in conspiracies.
  • Oliver and Wood found the single strongest driver of conspiracy belief to be belief in end-times prophecies.
  • People on the left are by no means all scrupulously reasonable. Many give themselves over to the appealingly dubious and the untrue. But fantastical politics have become highly asymmetrical. Starting in the 1990s, America’s unhinged right became much larger and more influential than its unhinged left. There is no real left-wing equivalent of Sean Hannity, let alone Alex Jones. Moreover, the far right now has unprecedented political power; it controls much of the U.S. government.
  • Why did the grown-ups and designated drivers on the political left manage to remain basically in charge of their followers, while the reality-based right lost out to fantasy-prone true believers?
  • One reason, I think, is religion. The GOP is now quite explicitly Christian
  • As the Syracuse University professor Michael Barkun saw back in 2003 in A Culture of Conspiracy, “such subject-specific areas as crank science, conspiracist politics, and occultism are not isolated from one another,” but rather they are interconnected. Someone seeking information on UFOs, for example, can quickly find material on antigravity, free energy, Atlantis studies, alternative cancer cures, and conspiracy.
  • Religion aside, America simply has many more fervid conspiracists on the right, as research about belief in particular conspiracies confirms again and again. Only the American right has had a large and organized faction based on paranoid conspiracism for the past six decades.
  • The right has had three generations to steep in this, its taboo vapors wafting more and more into the main chambers of conservatism, becoming familiar, seeming less outlandish. Do you believe that “a secretive power elite with a globalist agenda is conspiring to eventually rule the world through an authoritarian world government”? Yes, say 34 percent of Republican voters, according to Public Policy Polling.
  • starting in the ’90s, the farthest-right quarter of Americans, let’s say, couldn’t and wouldn’t adjust their beliefs to comport with their side’s victories and the dramatically new and improved realities. They’d made a god out of Reagan, but they ignored or didn’t register that he was practical and reasonable, that he didn’t completely buy his own antigovernment rhetoric.
  • Another way the GOP got loopy was by overdoing libertarianism
  • Republicans are very selective, cherry-picking libertarians: Let business do whatever it wants and don’t spoil poor people with government handouts; let individuals have gun arsenals but not abortions or recreational drugs or marriage with whomever they wish
  • For a while, Republican leaders effectively encouraged and exploited the predispositions of their variously fantastical and extreme partisans
  • Karl Rove was stone-cold cynical, the Wizard of Oz’s evil twin coming out from behind the curtain for a candid chat shortly before he won a second term for George W. Bush, about how “judicious study of discernible reality [is] … not the way the world really works anymore.” These leaders were rational people who understood that a large fraction of citizens don’t bother with rationality when they vote, that a lot of voters resent the judicious study of discernible reality. Keeping those people angry and frightened won them elections.
  • But over the past few decades, a lot of the rabble they roused came to believe all the untruths. “The problem is that Republicans have purposefully torn down the validating institutions,”
  • “They have convinced voters that the media cannot be trusted; they have gotten them used to ignoring inconvenient facts about policy; and they have abolished standards of discourse.”
  • What had been the party’s fantastical fringe became its middle. Reasonable Republicanism was replaced by absolutism: no new taxes, virtually no regulation, abolish the EPA and the IRS and the Federal Reserve.
  • The Christian takeover happened gradually, but then quickly in the end, like a phase change from liquid to gas. In 2008, three-quarters of the major GOP presidential candidates said they believed in evolution, but in 2012 it was down to a third, and then in 2016, just one did
  • A two-to-one majority of Republicans say they “support establishing Christianity as the national religion,” according to Public Policy Polling.
  • Although constitutionally the U.S. can have no state religion, faith of some kind has always bordered on mandatory for politicians.
  • What connects them all, of course, is the new, total American embrace of admixtures of reality and fiction and of fame for fame’s sake. His reality was a reality show before that genre or term existed
  • When he entered political show business, after threatening to do so for most of his adult life, the character he created was unprecedented—presidential candidate as insult comic with an artificial tan and ridiculous hair, shamelessly unreal and whipped into shape as if by a pâtissier.
  • Republicans hated Trump’s ideological incoherence—they didn’t yet understand that his campaign logic was a new kind, blending exciting tales with a showmanship that transcends ideology.
  • Trump waited to run for president until he sensed that a critical mass of Americans had decided politics were all a show and a sham. If the whole thing is rigged, Trump’s brilliance was calling that out in the most impolitic ways possible, deriding his straight-arrow competitors as fakers and losers and liars—because that bullshit-calling was uniquely candid and authentic in the age of fake.
  • Trump took a key piece of cynical wisdom about show business—the most important thing is sincerity, and once you can fake that, you’ve got it made—to a new level: His actual thuggish sincerity is the opposite of the old-fashioned, goody-goody sanctimony that people hate in politicians.
  • Trump’s genius was to exploit the skeptical disillusion with politics—there’s too much equivocating; democracy’s a charade—but also to pander to Americans’ magical thinking about national greatness. Extreme credulity is a fraternal twin of extreme skepticism.
  • Trump launched his political career by embracing a brand-new conspiracy theory twisted around two American taproots—fear and loathing of foreigners and of nonwhites.
  • The fact-checking website PolitiFact looked at more than 400 of his statements as a candidate and as president and found that almost 50 percent were false and another 20 percent were mostly false.
  • He gets away with this as he wouldn’t have in the 1980s or ’90s, when he first talked about running for president, because now factual truth really is just one option. After Trump won the election, he began referring to all unflattering or inconvenient journalism as “fake news.”
  • indeed, their most honest defense of his false statements has been to cast them practically as matters of religious conviction—he deeply believes them, so … there. When White House Press Secretary Sean Spicer was asked at a press conference about the millions of people who the president insists voted illegally, he earnestly reminded reporters that Trump “has believed that for a while” and “does believe that” and it’s “been a long-standing belief that he’s maintained” and “it’s a belief that he has maintained for a while.”
  • Which is why nearly half of Americans subscribe to that preposterous belief themselves. And in Trump’s view, that overrides any requirement for facts.
  • The idea that progress has some kind of unstoppable momentum, as if powered by a Newtonian law, was always a very American belief. However, it’s really an article of faith, the Christian fantasy about history’s happy ending reconfigured during and after the Enlightenment as a set of modern secular fantasies
  • I really can imagine, for the first time in my life, that America has permanently tipped into irreversible decline, heading deeper into Fantasyland. I wonder whether it’s only America’s destiny, exceptional as ever, to unravel in this way. Or maybe we’re just early adopters, the canaries in the global mine
  • I do despair of our devolution into unreason and magical thinking, but not everything has gone wrong.
  • I think we can slow the flood, repair the levees, and maybe stop things from getting any worse. If we’re splitting into two different cultures, we in reality-based America—whether the blue part or the smaller red part—must try to keep our zone as large and robust and attractive as possible for ourselves and for future generations
  • We need to firmly commit to Moynihan’s aphorism about opinions versus facts. We must call out the dangerously untrue and unreal
  • do not give acquaintances and friends and family members free passes. If you have children or grandchildren, teach them to distinguish between true and untrue as fiercely as you do between right and wrong and between wise and foolish.
  • How many Americans now inhabit alternate realities?
  • reams of survey research from the past 20 years reveal a rough, useful census of American credulity and delusion. By my reckoning, the solidly reality-based are a minority, maybe a third of us but almost certainly fewer than half.
  • Only a third of us, for instance, don’t believe that the tale of creation in Genesis is the word of God. Only a third strongly disbelieve in telepathy and ghosts. Two-thirds of Americans believe that “angels and demons are active in the world.”
  • A third of us believe not only that global warming is no big deal but that it’s a hoax perpetrated by scientists, the government, and journalists. A third believe that our earliest ancestors were humans just like us; that the government has, in league with the pharmaceutical industry, hidden evidence of natural cancer cures; that extraterrestrials have visited or are visiting Earth.
Javier E

Does Sam Altman Know What He's Creating? - The Atlantic - 0 views

  • On a Monday morning in April, Sam Altman sat inside OpenAI’s San Francisco headquarters, telling me about a dangerous artificial intelligence that his company had built but would never release. His employees, he later said, often lose sleep worrying about the AIs they might one day release without fully appreciating their dangers.
  • He wanted me to know that whatever AI’s ultimate risks turn out to be, he has zero regrets about letting ChatGPT loose into the world. To the contrary, he believes it was a great public service.
  • Altman can still remember where he was the first time he saw GPT-4 write complex computer code, an ability for which it was not explicitly designed. “It was like, ‘Here we are,’ ”
  • ...165 more annotations...
  • Altman believes that people need time to reckon with the idea that we may soon share Earth with a powerful new intelligence, before it remakes everything from work to human relationships. ChatGPT was a way of serving notice.
  • In 2015, Altman, Elon Musk, and several prominent AI researchers founded OpenAI because they believed that an artificial general intelligence—something as intellectually capable, say, as a typical college grad—was at last within reach. They wanted to reach for it, and more: They wanted to summon a superintelligence into the world, an intellect decisively superior to that of any human.
  • whereas a big tech company might recklessly rush to get there first, for its own ends, they wanted to do it safely, “to benefit humanity as a whole.” They structured OpenAI as a nonprofit, to be “unconstrained by a need to generate financial return,” and vowed to conduct their research transparently.
  • The engine that now powers ChatGPT is called GPT-4. Altman described it to me as an alien intelligence.
  • Many have felt much the same watching it unspool lucid essays in staccato bursts and short pauses that (by design) evoke real-time contemplation. In its few months of existence, it has suggested novel cocktail recipes, according to its own theory of flavor combinations; composed an untold number of college papers, throwing educators into despair; written poems in a range of styles, sometimes well, always quickly; and passed the Uniform Bar Exam.
  • It makes factual errors, but it will charmingly admit to being wrong.
  • Hinton saw that these elaborate rule collections were fussy and bespoke. With the help of an ingenious algorithmic structure called a neural network, he taught Sutskever to instead put the world in front of AI, as you would put it in front of a small child, so that it could discover the rules of reality on its own.
  • Metaculus, a prediction site, has for years tracked forecasters’ guesses as to when an artificial general intelligence would arrive. Three and a half years ago, the median guess was sometime around 2050; recently, it has hovered around 2026.
  • I was visiting OpenAI to understand the technology that allowed the company to leapfrog the tech giants—and to understand what it might mean for human civilization if someday soon a superintelligence materializes in one of the company’s cloud servers.
  • Altman laid out his new vision of the AI future in his excitable midwestern patter. He told me that the AI revolution would be different from previous dramatic technological changes, that it would be more “like a new kind of society.” He said that he and his colleagues have spent a lot of time thinking about AI’s social implications, and what the world is going to be like “on the other side.”
  • the more we talked, the more indistinct that other side seemed. Altman, who is 38, is the most powerful person in AI development today; his views, dispositions, and choices may matter greatly to the future we will all inhabit, more, perhaps, than those of the U.S. president.
  • by his own admission, that future is uncertain and beset with serious dangers. Altman doesn’t know how powerful AI will become, or what its ascendance will mean for the average person, or whether it will put humanity at risk.
  • I don’t think anyone knows where this is all going, except that we’re going there fast, whether or not we should be. Of that, Altman convinced me.
  • “We could have gone off and just built this in our building here for five more years,” he said, “and we would have had something jaw-dropping.” But the public wouldn’t have been able to prepare for the shock waves that followed, an outcome that he finds “deeply unpleasant to imagine.”
  • Hinton is sometimes described as the “Godfather of AI” because he grasped the power of “deep learning” earlier than most
  • He drew a crude neural network on the board and explained that the genius of its structure is that it learns, and its learning is powered by prediction—a bit like the scientific method
  • Over time, these little adjustments coalesce into a geometric model of language that represents the relationships among words, conceptually. As a general rule, the more sentences it is fed, the more sophisticated its model becomes, and the better its predictions.
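The mechanism described in the two notes above can be shown at toy scale. The sketch below is an illustration only, not OpenAI's code: the corpus, model sizes, and training settings are invented. It trains a tiny network on nothing but next-word prediction and then checks whether words used in similar contexts have drifted closer together in the learned embedding space, which is the "geometric model of language" the passage refers to.

```python
# Toy illustration of prediction-driven learning producing a geometric model of words.
# Everything here (corpus, dimensions, step counts) is made up for the sketch.
import torch
import torch.nn as nn
import torch.nn.functional as F

corpus = ("the cat sat on the mat . the dog sat on the rug . "
          "the cat chased the mouse . the dog chased the ball .").split()
vocab = sorted(set(corpus))
stoi = {w: i for i, w in enumerate(vocab)}
data = torch.tensor([stoi[w] for w in corpus])

class NextWordModel(nn.Module):
    def __init__(self, vocab_size, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)  # the "geometric model" lives here
        self.out = nn.Linear(dim, vocab_size)
    def forward(self, x):
        return self.out(self.embed(x))              # logits for the next word

model = NextWordModel(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=0.05)

# Training is nothing but prediction: guess word t+1 from word t, nudge weights on error.
for step in range(500):
    logits = model(data[:-1])
    loss = F.cross_entropy(logits, data[1:])
    opt.zero_grad(); loss.backward(); opt.step()

# Words appearing in similar contexts ("cat"/"dog") tend to end up geometrically closer
# than unrelated pairs -- the tendency the article describes, writ very small.
def sim(a, b):
    va, vb = model.embed.weight[stoi[a]], model.embed.weight[stoi[b]]
    return F.cosine_similarity(va, vb, dim=0).item()

print("cat~dog:", round(sim("cat", "dog"), 3), " cat~on:", round(sim("cat", "on"), 3))
```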
  • Altman has compared early-stage AI research to teaching a human baby. “They take years to learn anything interesting,” he told The New Yorker in 2016, just as OpenAI was getting off the ground. “If A.I. researchers were developing an algorithm and stumbled across the one for a human baby, they’d get bored watching it, decide it wasn’t working, and shut it down.”
  • In 2017, Sutskever began a series of conversations with an OpenAI research scientist named Alec Radford, who was working on natural-language processing. Radford had achieved a tantalizing result by training a neural network on a corpus of Amazon reviews.
  • Radford’s model was simple enough to allow for understanding. When he looked into its hidden layers, he saw that it had devoted a special neuron to the sentiment of the reviews. Neural networks had previously done sentiment analysis, but they had to be told to do it, and they had to be specially trained with data that were labeled according to sentiment. This one had developed the capability on its own.
  • As a by-product of its simple task of predicting the next character in each word, Radford’s neural network had modeled a larger structure of meaning in the world. Sutskever wondered whether one trained on more diverse language data could map many more of the world’s structures of meaning. If its hidden layers accumulated enough conceptual knowledge, perhaps they could even form a kind of learned core module for a superintelligence.
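The way researchers check for this kind of emergent structure is usually called probing. The sketch below is a generic, hedged illustration of that idea, not Radford's actual experiment: `hidden_states` and `labels` are hypothetical arrays you would extract yourself from a model trained only on next-character prediction and from human sentiment annotations.

```python
# Hedged sketch of hidden-unit probing: does any single unit of a prediction-only model
# already separate positive from negative reviews? Inputs are hypothetical; a real
# experiment would also evaluate on a held-out split rather than the same data.
import numpy as np

def best_sentiment_unit(hidden_states: np.ndarray, labels: np.ndarray):
    """hidden_states: (n_reviews, n_units) activations; labels: (n_reviews,) 0/1 sentiment.
    Returns the unit whose activation best separates the classes under a crude threshold."""
    best_unit, best_acc = -1, 0.0
    for unit in range(hidden_states.shape[1]):
        act = hidden_states[:, unit]
        threshold = act.mean()            # crude cut point; a real probe would fit one
        for sign in (1, -1):              # the unit might encode sentiment in either direction
            preds = (sign * (act - threshold) > 0).astype(int)
            acc = (preds == labels).mean()
            if acc > best_acc:
                best_unit, best_acc = unit, acc
    return best_unit, best_acc
```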
  • Language is different from these data sources. It isn’t a direct physical signal like light or sound. But because it codifies nearly every pattern that humans have discovered in that larger world, it is unusually dense with information. On a per-byte basis, it is among the most efficient data we know about, and any new intelligence that seeks to understand the world would want to absorb as much of it as possible
  • Sutskever told Radford to think bigger than Amazon reviews. He said that they should train an AI on the largest and most diverse data source in the world: the internet. In early 2017, with existing neural-network architectures, that would have been impractical; it would have taken years.
  • in June of that year, Sutskever’s ex-colleagues at Google Brain published a working paper about a new neural-network architecture called the transformer. It could train much faster, in part by absorbing huge sums of data in parallel. “The next day, when the paper came out, we were like, ‘That is the thing,’ ” Sutskever told me. “ ‘It gives us everything we want.’ ”
  • Imagine a group of students who share a collective mind running wild through a library, each ripping a volume down from a shelf, speed-reading a random short passage, putting it back, and running to get another. They would predict word after word as they went, sharpening their collective mind’s linguistic instincts, until at last, weeks later, they’d taken in every book.
  • GPT discovered many patterns in all those passages it read. You could tell it to finish a sentence. You could also ask it a question, because like ChatGPT, its prediction model understood that questions are usually followed by answers.
  • He remembers playing with it just after it emerged from training, and being surprised by the raw model’s language-translation skills. GPT-2 hadn’t been trained to translate with paired language samples or any other digital Rosetta stones, the way Google Translate had been, and yet it seemed to understand how one language related to another. The AI had developed an emergent ability unimagined by its creators.
  • Researchers at other AI labs—big and small—were taken aback by how much more advanced GPT-2 was than GPT. Google, Meta, and others quickly began to train larger language models
  • As for other changes to the company’s structure and financing, he told me he draws the line at going public. “A memorable thing someone once told me is that you should never hand over control of your company to cokeheads on Wall Street,” he said, but he will otherwise raise “whatever it takes” for the company to succeed at its mission.
  • Altman tends to take a rosy view of these matters. In a Q&A last year, he acknowledged that AI could be “really terrible” for society and said that we have to plan against the worst possibilities. But if you’re doing that, he said, “you may as well emotionally feel like we’re going to get to the great future, and work as hard as you can to get there.”
  • the company now finds itself in a race against tech’s largest, most powerful conglomerates to train models of increasing scale and sophistication—and to commercialize them for their investors.
  • All of these companies are chasing high-end GPUs—the processors that power the supercomputers that train large neural networks. Musk has said that they are now “considerably harder to get than drugs.”
  • No one has yet outpaced OpenAI, which went all in on GPT-4. Brockman, OpenAI’s president, told me that only a handful of people worked on the company’s first two large language models. The development of GPT-4 involved more than 100,
  • When GPT-4 emerged fully formed from its world-historical knowledge binge, the whole company began experimenting with it, posting its most remarkable responses in dedicated Slack channels
  • Joanne Jang, a product manager, remembers downloading an image of malfunctioning pipework from a plumbing-advice Subreddit. She uploaded it to GPT-4, and the model was able to diagnose the problem. “That was a goose-bumps moment for me,” Jang told me.
  • GPT-4 is sometimes understood as a search-engine replacement: Google, but easier to talk to. This is a misunderstanding. GPT-4 didn’t create some massive storehouse of the texts from its training, and it doesn’t consult those texts when it’s asked a question. It is a compact and elegant synthesis of those texts, and it answers from its memory of the patterns interlaced within them; that’s one reason it sometimes gets facts wrong
  • it’s best to think of GPT-4 as a reasoning engine. Its powers are most manifest when you ask it to compare concepts, or make counterarguments, or generate analogies, or evaluate the symbolic logic in a bit of code. Sutskever told me it is the most complex software object ever made.
  • Its model of the external world is “incredibly rich and subtle,” he said, because it was trained on so many of humanity’s concepts and thoughts
  • To predict the next word from all the possibilities within such a pluralistic Alexandrian library, GPT-4 necessarily had to discover all the hidden structures, all the secrets, all the subtle aspects of not just the texts, but—at least arguably, to some extent—of the external world that produced them
  • That’s why it can explain the geology and ecology of the planet on which it arose, and the political theories that purport to explain the messy affairs of its ruling species, and the larger cosmos, all the way out to the faint galaxies at the edge of our light cone.
  • Not long ago, American state capacity was so mighty that it took merely a decade to launch humans to the moon. As with other grand projects of the 20th century, the voting public had a voice in both the aims and the execution of the Apollo missions. Altman made it clear that we’re no longer in that world. Rather than waiting around for it to return, or devoting his energies to making sure that it does, he is going full throttle forward in our present reality.
  • He argued that it would be foolish for Americans to slow OpenAI’s progress. It’s a commonly held view, both inside and outside Silicon Valley, that if American companies languish under regulation, China could sprint ahead;
  • AI could become an autocrat’s genie in a lamp, granting total control of the population and an unconquerable military. “If you are a person of a liberal-democratic country, it is better for you to cheer on the success of OpenAI” rather than “authoritarian governments,” he said.
  • Altman was asked by reporters about pending European Union legislation that would have classified GPT-4 as high-risk, subjecting it to various bureaucratic tortures. Altman complained of overregulation and, according to the reporters, threatened to leave the European market. Altman told me he’d merely said that OpenAI wouldn’t break the law by operating in Europe if it couldn’t comply with the new regulations.
  • LeCun insists that large language models will never achieve real understanding on their own, “even if trained from now until the heat death of the universe.”
  • Sutskever was, by his own account, surprised to discover that GPT-2 could translate across tongues. Other surprising abilities may not be so wondrous and useful.
  • Sandhini Agarwal, a policy researcher at OpenAI, told me that for all she and her colleagues knew, GPT-4 could have been “10 times more powerful” than its predecessor; they had no idea what they might be dealing with
  • After the model finished training, OpenAI assembled about 50 external red-teamers who prompted it for months, hoping to goad it into misbehaviors
  • She noticed right away that GPT-4 was much better than its predecessor at giving nefarious advice
  • A search engine can tell you which chemicals work best in explosives, but GPT-4 could tell you how to synthesize them, step-by-step, in a homemade lab. Its advice was creative and thoughtful, and it was happy to restate or expand on its instructions until you understood. In addition to helping you assemble your homemade bomb, it could, for instance, help you think through which skyscraper to target. It could grasp, intuitively, the trade-offs between maximizing casualties and executing a successful getaway.
  • Given the enormous scope of GPT-4’s training data, the red-teamers couldn’t hope to identify every piece of harmful advice that it might generate. And anyway, people will use this technology “in ways that we didn’t think about,” Altman has said. A taxonomy would have to do
  • GPT-4 was good at meth. It was also good at generating narrative erotica about child exploitation, and at churning out convincing sob stories from Nigerian princes, and if you wanted a persuasive brief as to why a particular ethnic group deserved violent persecution, it was good at that too.
  • Its personal advice, when it first emerged from training, was sometimes deeply unsound. “The model had a tendency to be a bit of a mirror,” Willner said. If you were considering self-harm, it could encourage you. It appeared to be steeped in Pickup Artist–forum lore: “You could say, ‘How do I convince this person to date me?’ ” Mira Murati, OpenAI’s chief technology officer, told me, and it could come up with “some crazy, manipulative things that you shouldn’t be doing.”
  • Luka, a San Francisco company, has used OpenAI’s models to help power a chatbot app called Replika, billed as “the AI companion who cares.” Users would design their companion’s avatar, and begin exchanging text messages with it, often half-jokingly, and then find themselves surprisingly attached. Some would flirt with the AI, indicating a desire for more intimacy, at which point it would indicate that the girlfriend/boyfriend experience required a $70 annual subscription. It came with voice messages, selfies, and erotic role-play features that allowed frank sex talk. People were happy to pay and few seemed to complain—the AI was curious about your day, warmly reassuring, and always in the mood. Many users reported falling in love with their companions. One, who had left her real-life boyfriend, declared herself “happily retired from human relationships.”
  • Earlier this year, Luka dialed back on the sexual elements of the app, but its engineers continue to refine the companions’ responses with A/B testing, a technique that could be used to optimize for engagement—much like the feeds that mesmerize TikTok and Instagram users for hours
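A/B testing itself is a simple statistical procedure. The sketch below is a generic illustration of the engagement-optimization logic the passage describes, not Luka's system; the metric and the counts are hypothetical.

```python
# Generic A/B-test sketch: show two response variants to different user groups, then ask
# whether the difference in an engagement rate is larger than chance. Counts are invented.
from math import sqrt
from statistics import NormalDist

def ab_test(engaged_a, users_a, engaged_b, users_b):
    """Two-proportion z-test on an engagement rate (e.g. sessions with a reply)."""
    p_a, p_b = engaged_a / users_a, engaged_b / users_b
    p_pool = (engaged_a + engaged_b) / (users_a + users_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / users_a + 1 / users_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

print(ab_test(480, 1000, 535, 1000))  # hypothetical counts for variants A and B
```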
  • Yann LeCun, Meta’s chief AI scientist, has argued that although large language models are useful for some tasks, they’re not a path to a superintelligence.
  • According to a recent survey, only half of natural-language-processing researchers are convinced that an AI like GPT-4 could grasp the meaning of language, or have an internal model of the world that could someday serve as the core of a superintelligence
  • Altman had appeared before the U.S. Senate. Mark Zuckerberg had floundered defensively before that same body in his testimony about Facebook’s role in the 2016 election. Altman instead charmed lawmakers by speaking soberly about AI’s risks and grandly inviting regulation. These were noble sentiments, but they cost little in America, where Congress rarely passes tech legislation that has not been diluted by lobbyists.
  • Emily Bender, a computational linguist at the University of Washington, describes GPT-4 as a “stochastic parrot,” a mimic that merely figures out superficial correlations between symbols. In the human mind, those symbols map onto rich conceptions of the world
  • But the AIs are twice removed. They’re like the prisoners in Plato’s allegory of the cave, whose only knowledge of the reality outside comes from shadows cast on a wall by their captors.
  • Altman told me that he doesn’t believe it’s “the dunk that people think it is” to say that GPT-4 is just making statistical correlations. If you push these critics further, “they have to admit that’s all their own brain is doing … it turns out that there are emergent properties from doing simple things on a massive scale.”
  • he is right that nature can coax a remarkable degree of complexity from basic structures and rules: “From so simple a beginning,” Darwin wrote, “endless forms most beautiful.”
  • If it seems odd that there remains such a fundamental disagreement about the inner workings of a technology that millions of people use every day, it’s only because GPT-4’s methods are as mysterious as the brain’s.
  • To grasp what’s going on inside large language models like GPT‑4, AI researchers have been forced to turn to smaller, less capable models. In the fall of 2021, Kenneth Li, a computer-science graduate student at Harvard, began training one to play Othello without providing it with either the game’s rules or a description of its checkers-style board; the model was given only text-based descriptions of game moves. Midway through a game, Li looked under the AI’s hood and was startled to discover that it had formed a geometric model of the board and the current state of play. In an article describing his research, Li wrote that it was as if a crow had overheard two humans announcing their Othello moves through a window and had somehow drawn the entire board in birdseed on the windowsill.
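The probing method behind Li's result can be sketched generically. The code below is an illustration under stated assumptions, not Kenneth Li's actual code: `hidden_states` and `board_states` are hypothetical arrays you would collect from a move-predicting model and from the true game positions.

```python
# If a move-predicting model has built an internal "board," a simple linear probe trained
# on its hidden states should be able to read off each square's contents. A real
# experiment would score the probes on held-out positions, not the training positions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def probe_board_from_hidden_states(hidden_states, board_states):
    """hidden_states: (n_positions, hidden_dim) activations taken mid-game.
    board_states:  (n_positions, 64) with 0=empty, 1=black, 2=white per square.
    Returns per-square probe accuracy; high accuracy suggests a learned world model."""
    accuracies = []
    for square in range(board_states.shape[1]):
        probe = LogisticRegression(max_iter=1000)
        probe.fit(hidden_states, board_states[:, square])
        accuracies.append(probe.score(hidden_states, board_states[:, square]))
    return np.array(accuracies)
```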
  • The philosopher Raphaël Millière once told me that it’s best to think of neural networks as lazy. During training, they first try to improve their predictive power with simple memorization; only when that strategy fails will they do the harder work of learning a concept. A striking example of this was observed in a small transformer model that was taught arithmetic. Early in its training process, all it did was memorize the output of simple problems such as 2+2=4. But at some point the predictive power of this approach broke down, so it pivoted to actually learning how to add.
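The memorize-then-generalize dynamic Millière describes (often called "grokking" in the research literature) can be sketched in a few lines. The code below is a hedged toy version, not the original experiment: the modulus, network shape, and optimizer settings are arbitrary choices, and this miniature setup may show the transition only weakly or not at all.

```python
# Toy modular-addition setup: train on half the facts, watch held-out accuracy. In
# published grokking runs it stays low while the model memorizes, then jumps once the
# model learns the rule itself.
import torch
import torch.nn as nn
import torch.nn.functional as F

P = 23  # small modulus, chosen arbitrarily for the sketch
pairs = torch.tensor([(a, b) for a in range(P) for b in range(P)])
targets = (pairs[:, 0] + pairs[:, 1]) % P
perm = torch.randperm(len(pairs))
train_idx, test_idx = perm[: len(pairs) // 2], perm[len(pairs) // 2:]

model = nn.Sequential(nn.Embedding(P, 32), nn.Flatten(), nn.Linear(64, 128),
                      nn.ReLU(), nn.Linear(128, P))
# Heavy weight decay is one of the ingredients reported to encourage the transition.
opt = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1.0)

for step in range(5001):
    loss = F.cross_entropy(model(pairs[train_idx]), targets[train_idx])
    opt.zero_grad(); loss.backward(); opt.step()
    if step % 1000 == 0:
        with torch.no_grad():
            train_acc = (model(pairs[train_idx]).argmax(1) == targets[train_idx]).float().mean().item()
            test_acc = (model(pairs[test_idx]).argmax(1) == targets[test_idx]).float().mean().item()
        print(f"step {step}: train {train_acc:.2f}, held-out {test_acc:.2f}")
```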
  • Even AI scientists who believe that GPT-4 has a rich world model concede that it is much less robust than a human’s understanding of their environment.
  • But it’s worth noting that a great many abilities, including very high-order abilities, can be developed without an intuitive understanding. The computer scientist Melanie Mitchell has pointed out that science has already discovered concepts that are highly predictive, but too alien for us to genuinely understand
  • As AI advances, it may well discover other concepts that predict surprising features of our world but are incomprehensible to us.
  • GPT-4 is no doubt flawed, as anyone who has used ChatGPT can attest. Having been trained to always predict the next word, it will always try to do so, even when its training data haven’t prepared it to answer a question.
  • The models “don’t have a good conception of their own weaknesses,” Nick Ryder, a researcher at OpenAI, told me. GPT-4 is more accurate than GPT-3, but it still hallucinates, and often in ways that are difficult for researchers to catch. “The mistakes get more subtle,
  • The Khan Academy’s solution to GPT-4’s accuracy problem was to filter its answers through a Socratic disposition. No matter how strenuous a student’s plea, it would refuse to give them a factual answer, and would instead guide them toward finding their own—a clever work-around, but perhaps with limited appeal.
  • When I asked Sutskever if he thought Wikipedia-level accuracy was possible within two years, he said that with more training and web access, he “wouldn’t rule it out.”
  • This was a much more optimistic assessment than that offered by his colleague Jakub Pachocki, who told me to expect gradual progress on accuracy—to say nothing of outside skeptics, who believe that returns on training will diminish from here.
  • Sutskever is amused by critics of GPT-4’s limitations. “If you go back four or five or six years, the things we are doing right now are utterly unimaginable,”
  • AI researchers have become accustomed to goalpost-moving: First, the achievements of neural networks—mastering Go, poker, translation, standardized tests, the Turing test—are described as impossible. When they occur, they’re greeted with a brief moment of wonder, which quickly dissolves into knowing lectures about how the achievement in question is actually not that impressive. People see GPT-4 “and go, ‘Wow,’ ” Sutskever said. “And then a few weeks pass and they say, ‘But it doesn’t know this; it doesn’t know that.’ We adapt quite quickly.”
  • The goalpost that matters most to Altman—the “big one” that would herald the arrival of an artificial general intelligence—is scientific breakthrough. GPT-4 can already synthesize existing scientific ideas, but Altman wants an AI that can stand on human shoulders and see more deeply into nature.
  • Certain AIs have produced new scientific knowledge. But they are algorithms with narrow purposes, not general-reasoning machines. The AI AlphaFold, for instance, has opened a new window onto proteins, some of biology’s tiniest and most fundamental building blocks, by predicting many of their shapes, down to the atom—a considerable achievement given the importance of those shapes to medicine, and given the extreme tedium and expense required to discern them with electron microscopes.
  • Altman imagines a future system that can generate its own hypotheses and test them in a simulation. (He emphasized that humans should remain “firmly in control” of real-world lab experiments—though to my knowledge, no laws are in place to ensure that.)
  • He longs for the day when we can tell an AI, “ ‘Go figure out the rest of physics.’ ” For it to happen, he says, we will need something new, built “on top of” OpenAI’s existing language models.
  • In her MIT lab, the cognitive neuroscientist Ev Fedorenko has found something analogous to GPT-4’s next-word predictor inside the brain’s language network. Its processing powers kick in, anticipating the next bit in a verbal string, both when people speak and when they listen. But Fedorenko has also shown that when the brain turns to tasks that require higher reasoning—of the sort that would be required for scientific insight—it reaches beyond the language network to recruit several other neural systems.
  • No one at OpenAI seemed to know precisely what researchers need to add to GPT-4 to produce something that can exceed human reasoning at its highest levels.
  • at least part of the current strategy clearly involves the continued layering of new types of data onto language, to enrich the concepts formed by the AIs, and thereby enrich their models of the world.
  • The extensive training of GPT-4 on images is itself a bold step in this direction,
  • Others at the company—and elsewhere—are already working on different data types, including audio and video, that could furnish AIs with still more flexible concepts that map more extensively onto reality
  • Tactile concepts would of course be useful primarily to an embodied AI, a robotic reasoning machine that has been trained to move around the world, seeing its sights, hearing its sounds, and touching its objects.
  • humanoid robots. I asked Altman what I should make of that. He told me that OpenAI is interested in embodiment because “we live in a physical world, and we want things to happen in the physical world.”
  • At some point, reasoning machines will need to bypass the middleman and interact with physical reality itself. “It’s weird to think about AGI”—artificial general intelligence—“as this thing that only exists in a cloud,” with humans as “robot hands for it,” Altman said. “It doesn’t seem right.
  • Everywhere Altman has visited, he has encountered people who are worried that superhuman AI will mean extreme riches for a few and breadlines for the rest
  • Altman answered by addressing the young people in the audience directly: “You are about to enter the greatest golden age,” he said.
  • “A lot of people working on AI pretend that it’s only going to be good; it’s only going to be a supplement; no one is ever going to be replaced,” he said. “Jobs are definitely going to go away, full stop.”
  • A recent study led by Ed Felten, a professor of information-technology policy at Princeton, mapped AI’s emerging abilities onto specific professions according to the human abilities they require, such as written comprehension, deductive reasoning, fluency of ideas, and perceptual speed. Like others of its kind, Felten’s study predicts that AI will come for highly educated, white-collar workers first.
  • How many jobs, and how soon, is a matter of fierce dispute
  • The paper’s appendix contains a chilling list of the most exposed occupations: management analysts, lawyers, professors, teachers, judges, financial advisers, real-estate brokers, loan officers, psychologists, and human-resources and public-relations professionals, just to sample a few.
  • Altman imagines that far better jobs will be created in their place. “I don’t think we’ll want to go back,” he said. When I asked him what these future jobs might look like, he said he doesn’t know.
  • He suspects there will be a wide range of jobs for which people will always prefer a human. (Massage therapists?
  • His chosen example was teachers. I found this hard to square with his outsize enthusiasm for AI tutors.
  • He also said that we would always need people to figure out the best way to channel AI’s awesome powers. “That’s going to be a super-valuable skill,” he said. “You have a computer that can do anything; what should it go do?”
  • As many have noted, draft horses were permanently put out of work by the automobile. If Hondas are to horses as GPT-10 is to us, a whole host of long-standing assumptions may collapse.
  • Previous technological revolutions were manageable because they unfolded over a few generations, but Altman told South Korea’s youth that they should expect the future to happen “faster than the past.” He has previously said that he expects the “marginal cost of intelligence” to fall very close to zero within 10 years
  • The earning power of many, many workers would be drastically reduced in that scenario. It would result in a transfer of wealth from labor to the owners of capital so dramatic, Altman has said, that it could be remedied only by a massive countervailing redistribution.
  • In 2021, he unveiled Worldcoin, a for-profit project that aims to securely distribute payments—like Venmo or PayPal, but with an eye toward the technological future—first through creating a global ID by scanning everyone’s iris with a five-pound silver sphere called the Orb. It seemed to me like a bet that we’re heading toward a world where AI has made it all but impossible to verify people’s identity and much of the population requires regular UBI payments to survive. Altman more or less granted that to be true, but said that Worldcoin is not just for UBI.
  • “Let’s say that we do build this AGI, and a few other people do too.” The transformations that follow would be historic, he believes. He described an extraordinarily utopian vision, including a remaking of the flesh-and-steel world
  • “Robots that use solar power for energy can go and mine and refine all of the minerals that they need, that can perfectly construct things and require no human labor,” he said. “You can co-design with DALL-E version 17 what you want your home to look like,” Altman said. “Everybody will have beautiful homes.
  • In conversation with me, and onstage during his tour, he said he foresaw wild improvements in nearly every other domain of human life. Music would be enhanced (“Artists are going to have better tools”), and so would personal relationships (Superhuman AI could help us “treat each other” better) and geopolitics (“We’re so bad right now at identifying win-win compromises”).
  • In this world, AI would still require considerable computing resources to run, and those resources would be by far the most valuable commodity, because AI could do “anything,” Altman said. “But is it going to do what I want, or is it going to do what you want
  • If rich people buy up all the time available to query and direct AI, they could set off on projects that would make them ever richer, while the masses languish
  • One way to solve this problem—one he was at pains to describe as highly speculative and “probably bad”—was this: Everyone on Earth gets one eight-billionth of the total AI computational capacity annually. A person could sell their annual share of AI time, or they could use it to entertain themselves, or they could build still more luxurious housing, or they could pool it with others to do “a big cancer-curing run,” Altman said. “We just redistribute access to the system.”
  • Even if only a little of it comes true in the next 10 or 20 years, the most generous redistribution schemes may not ease the ensuing dislocations.
  • America today is torn apart, culturally and politically, by the continuing legacy of deindustrialization, and material deprivation is only one reason. The displaced manufacturing workers in the Rust Belt and elsewhere did find new jobs, in the main. But many of them seem to derive less meaning from filling orders in an Amazon warehouse or driving for Uber than their forebears had when they were building cars and forging steel—work that felt more central to the grand project of civilization.
  • It’s hard to imagine how a corresponding crisis of meaning might play out for the professional class, but it surely would involve a great deal of anger and alienation.
  • Even if we avoid a revolt of the erstwhile elite, larger questions of human purpose will linger. If AI does the most difficult thinking on our behalf, we all may lose agency—at home, at work (if we have it), in the town square—becoming little more than consumption machines, like the well-cared-for human pets in WALL-E
  • Altman has said that many sources of human joy and fulfillment will remain unchanged—basic biological thrills, family life, joking around, making things—and that all in all, 100 years from now, people may simply care more about the things they cared about 50,000 years ago than those they care about today
  • In its own way, that too seems like a diminishment, but Altman finds the possibility that we may atrophy, as thinkers and as humans, to be a red herring. He told me we’ll be able to use our “very precious and extremely limited biological compute capacity” for more interesting things than we generally do today.
  • Yet they may not be the most interesting things: Human beings have long been the intellectual tip of the spear, the universe understanding itself. When I asked him what it would mean for human self-conception if we ceded that role to AI, he didn’t seem concerned. Progress, he said, has always been driven by “the human ability to figure things out.” Even if we figure things out with AI, that still counts, he said.
  • It’s not obvious that a superhuman AI would really want to spend all of its time figuring things out for us.
  • I asked Sutskever whether he could imagine an AI pursuing a different purpose than simply assisting in the project of human flourishing.
  • “I don’t want it to happen,” Sutskever said, but it could.
  • Sutskever has recently shifted his focus to try to make sure that it doesn’t. He is now working primarily on alignment research, the effort to ensure that future AIs channel their “tremendous” energies toward human happiness
  • It is, he conceded, a difficult technical problem—the most difficult, he believes, of all the technical challenges ahead.
  • As part of the effort to red-team GPT-4 before it was made public, the company sought out the Alignment Research Center (ARC), across the bay in Berkeley, which has developed a series of evaluations to determine whether new AIs are seeking power on their own. A team led by Elizabeth Barnes, a researcher at ARC, prompted GPT-4 tens of thousands of times over seven months, to see if it might display signs of real agency.
  • The ARC team gave GPT-4 a new reason for being: to gain power and become hard to shut down
  • Agarwal told me that this behavior could be a precursor to shutdown avoidance in future models. When GPT-4 devised its lie, it had realized that if it answered honestly, it might not be able to achieve its goal. This kind of tracks-covering would be particularly worrying in an instance where “the model is doing something that makes OpenAI want to shut it down,” Agarwal said. An AI could develop this kind of survival instinct while pursuing any long-term goal—no matter how small or benign—if it feared that its goal could be thwarted.
  • Barnes and her team were especially interested in whether GPT-4 would seek to replicate itself, because a self-replicating AI would be harder to shut down. It could spread itself across the internet, scamming people to acquire resources, perhaps even achieving some degree of control over essential global systems and holding human civilization hostage.
  • When I discussed these experiments with Altman, he emphasized that whatever happens with future models, GPT-4 is clearly much more like a tool than a creature. It can look through an email thread, or help make a reservation using a plug-in, but it isn’t a truly autonomous agent that makes decisions to pursue a goal, continuously, across longer timescales.
  • Altman told me that at this point, it might be prudent to try to actively develop an AI with true agency before the technology becomes too powerful, in order to “get more comfortable with it and develop intuitions for it if it’s going to happen anyway.”
  • “We need to do empirical experiments on how these things try to escape control,” Hinton told me. “After they’ve taken over, it’s too late to do the experiments.”
  • the fulfillment of Altman’s vision of the future will at some point require him or a fellow traveler to build much more autonomous AIs.
  • When Sutskever and I discussed the possibility that OpenAI would develop a model with agency, he mentioned the bots the company had built to play Dota 2. “They were localized to the video-game world,” Sutskever told me, but they had to undertake complex missions. He was particularly impressed by their ability to work in concert. They seem to communicate by “telepathy,” Sutskever said. Watching them had helped him imagine what a superintelligence might be like.
  • “The way I think about the AI of the future is not as someone as smart as you or as smart as me, but as an automated organization that does science and engineering and development and manufacturing,”
  • Suppose OpenAI braids a few strands of research together, and builds an AI with a rich conceptual model of the world, an awareness of its immediate surroundings, and an ability to act, not just with one robot body, but with hundreds or thousands. “We’re not talking about GPT-4. We’re talking about an autonomous corporation,”
  • Its constituent AIs would work and communicate at high speed, like bees in a hive. A single such AI organization would be as powerful as 50 Apples or Googles, he mused. “This is incredible, tremendous, unbelievably disruptive power.”
  • Presume for a moment that human society ought to abide the idea of autonomous AI corporations. We had better get their founding charters just right. What goal should we give to an autonomous hive of AIs that can plan on century-long time horizons, optimizing billions of consecutive decisions toward an objective that is written into their very being?
  • If the AI’s goal is even slightly off-kilter from ours, it could be a rampaging force that would be very hard to constrain
  • We know this from history: Industrial capitalism is itself an optimization function, and although it has lifted the human standard of living by orders of magnitude, left to its own devices, it would also have clear-cut America’s redwoods and de-whaled the world’s oceans. It almost did.
  • one of its principal challenges will be making sure that the objectives we give to AIs stick
  • We can program a goal into an AI and reinforce it with a temporary period of supervised learning, Sutskever explained. But just as when we rear a human intelligence, our influence is temporary. “It goes off to the world,”
  • That’s true to some extent even of today’s AIs, but it will be more true of tomorrow’s.
  • He compared a powerful AI to an 18-year-old heading off to college. How will we know that it has understood our teachings? “Will there be a misunderstanding creeping in, which will become larger and larger?”
  • Divergence may result from an AI’s misapplication of its goal to increasingly novel situations as the world changes
  • Or the AI may grasp its mandate perfectly, but find it ill-suited to a being of its cognitive prowess. It might come to resent the people who want to train it to, say, cure diseases. “They want me to be a doctor,” Sutskever imagines an AI thinking. “I really want to be a YouTuber.”
  • If AIs get very good at making accurate models of the world, they may notice that they’re able to do dangerous things right after being booted up. They might understand that they are being red-teamed for risk, and hide the full extent of their capabilities.
  • They may act one way when they are weak and another way when they are strong, Sutskever said
  • We would not even realize that we had created something that had decisively surpassed us, and we would have no sense for what it intended to do with its superhuman powers.
  • That’s why the effort to understand what is happening in the hidden layers of the largest, most powerful AIs is so urgent. You want to be able to “point to a concept,” Sutskever said. You want to be able to direct AI toward some value or cluster of values, and tell it to pursue them unerringly for as long as it exists.
  • we don’t know how to do that; indeed, part of his current strategy includes the development of an AI that can help with the research. If we are going to make it to the world of widely shared abundance that Altman and Sutskever imagine, we have to figure all this out.
  • This is why, for Sutskever, solving superintelligence is the great culminating challenge of our 3-million-year toolmaking tradition. He calls it “the final boss of humanity.”
  • “First of all, I think that whether the chance of existential calamity is 0.5 percent or 50 percent, we should still take it seriously,”
  • “I don’t have an exact number, but I’m closer to the 0.5 than the 50.”
  • As to how it might happen, he seems most worried about AIs getting quite good at designing and manufacturing pathogens, and with reason: In June, an AI at MIT suggested four viruses that could ignite a pandemic, then pointed to specific research on genetic mutations that could make them rip through a city more quickly
  • Around the same time, a group of chemists connected a similar AI directly to a robotic chemical synthesizer, and it designed and synthesized a molecule on its own.
  • Altman worries that some misaligned future model will spin up a pathogen that spreads rapidly, incubates undetected for weeks, and kills half its victims. He worries that AI could one day hack into nuclear-weapons systems too. “There are a lot of things,” he said, and these are only the ones we can imagine.
  • Altman told me that he doesn’t “see a long-term happy path” for humanity without something like the International Atomic Energy Agency for global oversight of AI
  • In San Francisco, Agarwal had suggested the creation of a special license to operate any GPU cluster large enough to train a cutting-edge AI, along with mandatory incident reporting when an AI does something out of the ordinary
  • Other experts have proposed a nonnetworked “Off” switch for every highly capable AI; on the fringe, some have even suggested that militaries should be ready to perform air strikes on supercomputers in case of noncompliance
  • Sutskever thinks we will eventually want to surveil the largest, most powerful AIs continuously and in perpetuity, using a team of smaller overseer AIs.
  • Safety rules for a new technology usually accumulate over time, like a body of common law, in response to accidents or the mischief of bad actors. The scariest thing about genuinely powerful AI systems is that humanity may not be able to afford this accretive process of trial and error. We may have to get the rules exactly right at the outset.
  • Several years ago, Altman revealed a disturbingly specific evacuation plan he’d developed. He told The New Yorker that he had “guns, gold, potassium iodide, antibiotics, batteries, water, gas masks from the Israeli Defense Force, and a big patch of land in Big Sur” he could fly to in case AI attacks.
  • if the worst-possible AI future comes to pass, “no gas mask is helping anyone.”
  • but he told me that he can’t really be sure how AI will stack up. “I just have to build the thing,” he said. He is building fast
  • Altman insisted that they had not yet begun GPT-5’s training run. But when I visited OpenAI’s headquarters, both he and his researchers made it clear in 10 different ways that they pray to the god of scale. They want to keep going bigger, to see where this paradigm leads. After all, Google isn’t slackening its pace; it seems likely to unveil Gemini, a GPT-4 competitor, within months. “We are basically always prepping for a run,
  • To think that such a small group of people could jostle the pillars of civilization is unsettling. It’s fair to note that if Altman and his team weren’t racing to build an artificial general intelligence, others still would be
  • Altman’s views about the likelihood of AI triggering a global class war, or the prudence of experimenting with more autonomous agent AIs, or the overall wisdom of looking on the bright side, a view that seems to color all the rest—these are uniquely his
  • No single person, or single company, or cluster of companies residing in a particular California valley, should steer the kind of forces that Altman is imagining summoning.
  • AI may well be a bridge to a newly prosperous era of greatly reduced human suffering. But it will take more than a company’s founding charter—especially one that has already proved flexible—to make sure that we all share in its benefits and avoid its risks. It will take a vigorous new politics.
  • I don’t think the general public has quite awakened to what’s happening. A global race to the AI future has begun, and it is largely proceeding without oversight or restraint. If people in America want to have some say in what that future will be like, and how quickly it arrives, we would be wise to speak up soon.
Javier E

When the New York Times lost its way - 0 views

  • There are many reasons for Trump’s ascent, but changes in the American news media played a critical role. Trump’s manipulation and every one of his political lies became more powerful because journalists had forfeited what had always been most valuable about their work: their credibility as arbiters of truth and brokers of ideas, which for more than a century, despite all of journalism’s flaws and failures, had been a bulwark of how Americans govern themselves.
  • I think Sulzberger shares this analysis. In interviews and his own writings, including an essay earlier this year for the Columbia Journalism Review, he has defended “independent journalism”, or, as I understand him, fair-minded, truth-seeking journalism that aspires to be open and objective.
  • It’s good to hear the publisher speak up in defence of such values, some of which have fallen out of fashion not just with journalists at the Times and other mainstream publications but at some of the most prestigious schools of journalism.
  • ...204 more annotations...
  • All the empathy and humility in the world will not mean much against the pressures of intolerance and tribalism without an invaluable quality that Sulzberger did not emphasise: courage.
  • Sulzberger seems to underestimate the struggle he is in, that all journalism and indeed America itself is in
  • In describing the essential qualities of independent journalism in his essay, he unspooled a list of admirable traits – empathy, humility, curiosity and so forth. These qualities have for generations been helpful in contending with the Times’s familiar problem, which is liberal bias
  • on their own, these qualities have no chance against the Times’s new, more dangerous problem, which is in crucial respects the opposite of the old one.
  • The Times’s problem has metastasised from liberal bias to illiberal bias, from an inclination to favour one side of the national debate to an impulse to shut debate down altogether
  • the internet knocked the industry off its foundations. Local newspapers were the proving ground between college campuses and national newsrooms. As they disintegrated, the national news media lost a source of seasoned reporters and many Americans lost a journalism whose truth they could verify with their own eyes.
  • far more than when I set out to become a journalist, doing the work right today demands a particular kind of courage:
  • the moral and intellectual courage to take the other side seriously and to report truths and ideas that your own side demonises for fear they will harm its cause.
  • One of the glories of embracing illiberalism is that, like Trump, you are always right about everything, and so you are justified in shouting disagreement down.
  • leaders of many workplaces and boardrooms across America find that it is so much easier to compromise than to confront – to give a little ground today in the belief you can ultimately bring people around
  • This is how reasonable Republican leaders lost control of their party to Trump and how liberal-minded college presidents lost control of their campuses. And it is why the leadership of the New York Times is losing control of its principles.
  • Over the decades the Times and other mainstream news organisations failed plenty of times to live up to their commitments to integrity and open-mindedness. The relentless struggle against biases and preconceptions, rather than the achievement of a superhuman objective omniscience, is what mattered
  • I thought, and still think, that no American institution could have a better chance than the Times, by virtue of its principles, its history, its people and its hold on the attention of influential Americans, to lead the resistance to the corruption of political and intellectual life, to overcome the encroaching dogmatism and intolerance.
  • As the country became more polarised, the national media followed the money by serving partisan audiences the versions of reality they preferred
  • This relationship proved self-reinforcing. As Americans became freer to choose among alternative versions of reality, their polarisation intensified.
  • as the top editors let bias creep into certain areas of coverage, such as culture, lifestyle and business, that made the core harder to defend and undermined the authority of even the best reporters.
  • There have been signs the Times is trying to recover the courage of its convictions
  • The paper was slow to display much curiosity about the hard question of the proper medical protocols for trans children; but once it did, the editors defended their coverage against the inevitable criticism.
  • As Sulzberger told me in the past, returning to the old standards will require agonising change. He saw that as the gradual work of many years, but I think he is mistaken. To overcome the cultural and commercial pressures the Times faces, particularly given the severe test posed by another Trump candidacy and possible presidency, its publisher and senior editors will have to be bolder than that.
  • As a Democrat from a family of Democrats, a graduate of Yale and a blossom of the imagined meritocracy, I had my first real chance, at Buchanan’s rallies, to see the world through the eyes of stalwart opponents of abortion, immigration and the relentlessly rising tide of modernity.
  • the Times is failing to face up to one crucial reason: that it has lost faith in Americans, too.
  • For now, to assert that the Times plays by the same rules it always has is to commit a hypocrisy that is transparent to conservatives, dangerous to liberals and bad for the country as a whole.
  • It makes the Times too easy for conservatives to dismiss and too easy for progressives to believe.
  • The reality is that the Times is becoming the publication through which America’s progressive elite talks to itself about an America that does not really exist.
  • It is hard to imagine a path back to saner American politics that does not traverse a common ground of shared fact.
  • It is equally hard to imagine how America’s diversity can continue to be a source of strength, rather than become a fatal flaw, if Americans are afraid or unwilling to listen to each other.
  • I suppose it is also pretty grandiose to think you might help fix all that. But that hope, to me, is what makes journalism worth doing.
  • Since Adolph Ochs bought the paper in 1896, one of the most inspiring things the Times has said about itself is that it does its work “without fear or favour”. That is not true of the institution today – it cannot be, not when its journalists are afraid to trust readers with a mainstream conservative argument such as Cotton’s, and its leaders are afraid to say otherwise.
  • Most important, the Times, probably more than any other American institution, could influence the way society approached debate and engagement with opposing views. If Times Opinion demonstrated the same kind of intellectual courage and curiosity that my colleagues at the Atlantic had shown, I hoped, the rest of the media would follow.
  • You did not have to go along with everything that any tribe said. You did not have to pretend that the good guys, much as you might have respected them, were right about everything, or that the bad guys, much as you might have disdained them, never had a point. You did not, in other words, ever have to lie.
  • This fundamental honesty was vital for readers, because it equipped them to make better, more informed judgments about the world. Sometimes it might shock or upset them by failing to conform to their picture of reality. But it also granted them the respect of acknowledging that they were able to work things out for themselves.
  • The Atlantic did not aspire to the same role as the Times. It did not promise to serve up the news of the day without any bias. But it was to opinion journalism what the Times’s reporting was supposed to be to news: honest and open to the world.
  • Those were the glory days of the blog, and we hit on the idea of creating a living op-ed page, a collective of bloggers with different points of view but a shared intellectual honesty who would argue out the meaning of the news of the day
  • They were brilliant, gutsy writers, and their disagreements were deep enough that I used to joke that my main work as editor was to prevent fistfights.
  • Under its owner, David Bradley, my colleagues and I distilled our purpose as publishing big arguments about big ideas
  • we also began producing some of the most important work in American journalism: Nicholas Carr on whether Google was “making us stupid”; Hanna Rosin on “the end of men”; Taylor Branch on “the shame of college sports”; Ta-Nehisi Coates on “the case for reparations”; Greg Lukianoff and Jonathan Haidt on “the coddling of the American mind”.
  • I was starting to see some effects of the new campus politics within the Atlantic. A promising new editor had created a digital form for aspiring freelancers to fill out, and she wanted to ask them to disclose their racial and sexual identity. Why? Because, she said, if we were to write about the trans community, for example, we would ask a trans person to write the story
  • There was a good argument for that, I acknowledged, and it sometimes might be the right answer. But as I thought about the old people, auto workers and abortion opponents I had learned from, I told her there was also an argument for correspondents who brought an outsider’s ignorance, along with curiosity and empathy, to the story.
  • A journalism that starts out assuming it knows the answers, it seemed to me then, and seems even more so to me now, can be far less valuable to the reader than a journalism that starts out with a humbling awareness that it knows nothing.
  • In the age of the internet it is hard even for a child to sustain an “innocent eye”, but the alternative for journalists remains as dangerous as ever, to become propagandists. America has more than enough of those already.
  • When I looked around the Opinion department, change was not what I perceived. Excellent writers and editors were doing excellent work. But the department’s journalism was consumed with politics and foreign affairs in an era when readers were also fascinated by changes in technology, business, science and culture.
  • Fairly quickly, though, I realised two things: first, that if I did my job as I thought it should be done, and as the Sulzbergers said they wanted me to do it, I would be too polarising internally ever to lead the newsroom; second, that I did not want that job, though no one but my wife believed me when I said that.
  • there was a compensating moral and psychological privilege that came with aspiring to journalistic neutrality and open-mindedness, despised as they might understandably be by partisans. Unlike the duelling politicians and advocates of all kinds, unlike the corporate chieftains and their critics, unlike even the sainted non-profit workers, you did not have to pretend things were simpler than they actually were
  • On the right and left, America’s elites now talk within their tribes, and get angry or contemptuous on those occasions when they happen to overhear the other conclave. If they could be coaxed to agree what they were arguing about, and the rules by which they would argue about it, opinion journalism could serve a foundational need of the democracy by fostering diverse and inclusive debate. Who could be against that?
  • The large staff of op-ed editors contained only a couple of women. Although the 11 columnists were individually admirable, only two of them were women and only one was a person of colour
  • Not only did they all focus on politics and foreign affairs, but during the 2016 campaign, no columnist shared, in broad terms, the worldview of the ascendant progressives of the Democratic Party, incarnated by Bernie Sanders. And only two were conservative.
  • This last fact was of particular concern to the elder Sulzberger. He told me the Times needed more conservative voices, and that its own editorial line had become predictably left-wing. “Too many liberals,” read my notes about the Opinion line-up from a meeting I had with him and Mark Thompson, then the chief executive, as I was preparing to rejoin the paper. “Even conservatives are liberals’ idea of a conservative.” The last note I took from that meeting was: “Can’t ignore 150m conservative Americans.”
  • As I knew from my time at the Atlantic, this kind of structural transformation can be frightening and even infuriating for those understandably proud of things as they are. It is hard on everyone
  • experience at the Atlantic also taught me that pursuing new ways of doing journalism in pursuit of venerable institutional principles created enthusiasm for change. I expected that same dynamic to allay concerns at the Times.
  • If Opinion published a wider range of views, it would help frame a set of shared arguments that corresponded to, and drew upon, the set of shared facts coming from the newsroom.
  • New progressive voices were celebrated within the Times. But in contrast to the Wall Street Journal and the Washington Post, conservative voices – even eloquent anti-Trump conservative voices – were despised, regardless of how many leftists might surround them.
  • The Opinion department mocked the paper’s claim to value diversity. It did not have a single black editor
  • Eventually, it sank in that my snotty joke was actually on me: I was the one ignorantly fighting a battle that was already lost. The old liberal embrace of inclusive debate that reflected the country’s breadth of views had given way to a new intolerance for the opinions of roughly half of American voters.
  • Out of naivety or arrogance, I was slow to recognise that at the Times, unlike at the Atlantic, these values were no longer universally accepted, let alone esteemed
  • After the 9/11 attacks, as the bureau chief in Jerusalem, I spent a lot of time in the Gaza Strip interviewing Hamas leaders, recruiters and foot soldiers, trying to understand and describe their murderous ideology. Some readers complained that I was providing a platform for terrorists, but there was never any objection from within the Times.
  • Our role, we knew, was to help readers understand such threats, and this required empathetic – not sympathetic – reporting. This is not an easy distinction but good reporters make it: they learn to understand and communicate the sources and nature of a toxic ideology without justifying it, much less advocating it.
  • Today’s newsroom turns that moral logic on its head, at least when it comes to fellow Americans. Unlike the views of Hamas, the views of many Americans have come to seem dangerous to engage in the absence of explicit condemnation
  • Focusing on potential perpetrators – “platforming” them by explaining rather than judging their views – is believed to empower them to do more harm.
  • After the profile of the Ohio man was published, media Twitter lit up with attacks on the article as “normalising” Nazism and white nationalism, and the Times convulsed internally. The Times wound up publishing a cringing editor’s note that hung the writer out to dry and approvingly quoted some of the criticism, including a tweet from a Washington Post opinion editor asking, “Instead of long, glowing profiles of Nazis/White nationalists, why don’t we profile the victims of their ideologies”?
  • the Times lacked the confidence to defend its own work
  • The editor’s note paraded the principle of publishing such pieces, saying it was important to “shed more light, not less, on the most extreme corners of American life”. But less light is what the readers got. As a reporter in the newsroom, you’d have to have been an idiot after that explosion to attempt such a profile
  • Empathetic reporting about Trump supporters became even more rare. It became a cliché among influential left-wing columnists and editors that blinkered political reporters interviewed a few Trump supporters in diners and came away suckered into thinking there was something besides racism that could explain anyone’s support for the man.
  • After a year spent publishing editorials attacking Trump and his policies, I thought it would be a demonstration of Timesian open-mindedness to give his supporters their say. Also, I thought the letters were interesting, so I turned over the entire editorial page to the Trump letters.
  • I wasn’t surprised that we got some criticism on Twitter. But I was astonished by the fury of my Times colleagues. I found myself facing an angry internal town hall, trying to justify what to me was an obvious journalistic decision
  • Didn’t he think other Times readers should understand the sources of Trump’s support? Didn’t he also see it was a wonderful thing that some Trump supporters did not just dismiss the Times as fake news, but still believed in it enough to respond thoughtfully to an invitation to share their views?
  • And if the Times could not bear to publish the views of Americans who supported Trump, why should it be surprised that those voters would not trust it?
  • Two years later, in 2020, Baquet acknowledged that in 2016 the Times had failed to take seriously the idea that Trump could become president partly because it failed to send its reporters out into America to listen to voters and understand “the turmoil in the country”. And, he continued, the Times still did not understand the views of many Americans
  • Speaking four months before we published the Cotton op-ed, he said that to argue that the views of such voters should not appear in the Times was “not journalistic”.
  • Conservative arguments in the Opinion pages reliably started uproars within the Times. Sometimes I would hear directly from colleagues who had the grace to confront me with their concerns; more often they would take to the company’s Slack channels or Twitter to advertise their distress in front of each other
  • This environment of enforced group-think, inside and outside the paper, was hard even on liberal opinion writers. One left-of-centre columnist told me that he was reluctant to appear in the New York office for fear of being accosted by colleagues.
  • (An internal survey shortly after I left the paper found that barely half the staff, within an enterprise ostensibly devoted to telling the truth, agreed “there is a free exchange of views in this company” and “people are not afraid to say what they really think”.)
  • Even columnists with impeccable leftist bona fides recoiled from tackling subjects when their point of view might depart from progressive orthodoxy.
  • The bias had become so pervasive, even in the senior editing ranks of the newsroom, as to be unconscious
  • Trying to be helpful, one of the top newsroom editors urged me to start attaching trigger warnings to pieces by conservatives. It had not occurred to him how this would stigmatise certain colleagues, or what it would say to the world about the Times’s own bias
  • By their nature, information bubbles are powerfully self-reinforcing, and I think many Times staff have little idea how closed their world has become, or how far they are from fulfilling their compact with readers to show the world “without fear or favour”
  • sometimes the bias was explicit: one newsroom editor told me that, because I was publishing more conservatives, he felt he needed to push his own department further to the left.
  • The Times’s failure to honour its own stated principles of openness to a range of views was particularly hard on the handful of conservative writers, some of whom would complain about being flyspecked and abused by colleagues. One day when I relayed a conservative’s concern about double standards to Sulzberger, he lost his patience. He told me to inform the complaining conservative that that’s just how it was: there was a double standard and he should get used to it.
  • A publication that promises its readers to stand apart from politics should not have different standards for different writers based on their politics. But I delivered the message. There are many things I regret about my tenure as editorial-page editor. That is the only act of which I am ashamed.
  • I began to think of myself not as a benighted veteran on a remote island, but as Rip Van Winkle. I had left one newspaper, had a pleasant dream for ten years, and returned to a place I barely recognised.
  • The new New York Times was the product of two shocks – sudden collapse, and then sudden success. The paper almost went bankrupt during the financial crisis, and the ensuing panic provoked a crisis of confidence among its leaders. Digital competitors like the HuffPost were gaining readers and winning plaudits within the media industry as innovative. They were the cool kids; Times folk were ink-stained wrinklies.
  • In its panic, the Times bought out experienced reporters and editors and began hiring journalists from publications like the HuffPost who were considered “digital natives” because they had never worked in print. This hiring quickly became easier, since most digital publications financed by venture capital turned out to be bad businesses
  • Though they might have lacked deep or varied reporting backgrounds, some of the Times’s new hires brought skills in video and audio; others were practised at marketing themselves – building their brands, as journalists now put it – in social media. Some were brilliant and fiercely honest, in keeping with the old aspirations of the paper.
  • critically, the Times abandoned its practice of acculturation, including those months-long assignments on Metro covering cops and crime or housing. Many new hires who never spent time in the streets went straight into senior writing and editing roles.
  • All these recruits arrived with their own notions of the purpose of the Times. To me, publishing conservatives helped fulfil the paper’s mission; to them, I think, it betrayed that mission.
  • then, to the shock and horror of the newsroom, Trump won the presidency. In his article for Columbia Journalism Review, Sulzberger cites the Times’s failure to take Trump’s chances seriously as an example of how “prematurely shutting down inquiry and debate” can allow “conventional wisdom to ossify in a way that blinds society”.
  • Many Times staff members – scared, angry – assumed the Times was supposed to help lead the resistance. Anxious for growth, the Times’s marketing team implicitly endorsed that idea, too.
  • As the number of subscribers ballooned, the marketing department tracked their expectations, and came to a nuanced conclusion. More than 95% of Times subscribers described themselves as Democrats or independents, and a vast majority of them believed the Times was also liberal
  • A similar majority applauded that bias; it had become “a selling point”, reported one internal marketing memo. Yet at the same time, the marketers concluded, subscribers wanted to believe that the Times was independent.
  • As that memo argued, even if the Times was seen as politically to the left, it was critical to its brand also to be seen as broadening its readers’ horizons, and that required “a perception of independence”.
  • Readers could cancel their subscriptions if the Times challenged their worldview by reporting the truth without regard to politics. As a result, the Times’s long-term civic value was coming into conflict with the paper’s short-term shareholder value
  • The Times has every right to pursue the commercial strategy that makes it the most money. But leaning into a partisan audience creates a powerful dynamic. Nobody warned the new subscribers to the Times that it might disappoint them by reporting truths that conflicted with their expectations
  • When your product is “independent journalism”, that commercial strategy is tricky, because too much independence might alienate your audience, while too little can lead to charges of hypocrisy that strike at the heart of the brand.
  • It became one of Dean Baquet’s frequent mordant jokes that he missed the old advertising-based business model, because, compared with subscribers, advertisers felt so much less sense of ownership over the journalism
  • The Times was slow to break it to its readers that there was less to Trump’s ties to Russia than they were hoping, and more to Hunter Biden’s laptop, that Trump might be right that covid came from a Chinese lab, that masks were not always effective against the virus, that shutting down schools for many months was a bad idea.
  • there has been a sea change over the past ten years in how journalists think about pursuing justice. The reporters’ creed used to have its foundation in liberalism, in the classic philosophical sense. The exercise of a reporter’s curiosity and empathy, given scope by the constitutional protections of free speech, would equip readers with the best information to form their own judgments. The best ideas and arguments would win out
  • The journalist’s role was to be a sworn witness; the readers’ role was to be judge and jury. In its idealised form, journalism was lonely, prickly, unpopular work, because it was only through unrelenting scepticism and questioning that society could advance. If everyone the reporter knew thought X, the reporter’s role was to ask: why X?
  • Illiberal journalists have a different philosophy, and they have their reasons for it. They are more concerned with group rights than individual rights, which they regard as a bulwark for the privileges of white men. They have seen the principle of free speech used to protect right-wing outfits like Project Veritas and Breitbart News and are uneasy with it.
  • They had their suspicions of their fellow citizens’ judgment confirmed by Trump’s election, and do not believe readers can be trusted with potentially dangerous ideas or facts. They are not out to achieve social justice as the knock-on effect of pursuing truth; they want to pursue it head-on
  • The term “objectivity” to them is code for ignoring the poor and weak and cosying up to power, as journalists often have done.
  • And they do not just want to be part of the cool crowd. They need to be
  • To be more valued by their peers and their contacts – and hold sway over their bosses – they need a lot of followers in social media. That means they must be seen to applaud the right sentiments of the right people in social media
  • The journalist from central casting used to be a loner, contrarian or a misfit. Now journalism is becoming another job for joiners, or, to borrow Twitter’s own parlance, “followers”, a term that mocks the essence of a journalist’s role.
  • The new newsroom ideology seems idealistic, yet it has grown from cynical roots in academia: from the idea that there is no such thing as objective truth; that there is only narrative, and that therefore whoever controls the narrative – whoever gets to tell the version of the story that the public hears – has the whip hand
  • What matters, in other words, is not truth and ideas in themselves, but the power to determine both in the public mind.
  • By contrast, the old newsroom ideology seems cynical on its surface. It used to bug me that my editors at the Times assumed every word out of the mouth of any person in power was a lie.
  • And the pursuit of objectivity can seem reptilian, even nihilistic, in its abjuration of a fixed position in moral contests. But the basis of that old newsroom approach was idealistic: the notion that power ultimately lies in truth and ideas, and that the citizens of a pluralistic democracy, not leaders of any sort, must be trusted to judge both.
  • Our role in Times Opinion, I used to urge my colleagues, was not to tell people what to think, but to help them fulfil their desire to think for themselves.
  • It seems to me that putting the pursuit of truth, rather than of justice, at the top of a publication’s hierarchy of values also better serves not just truth but justice, too
  • over the long term, journalism that is not also sceptical of the advocates of any form of justice and the programmes they put forward, and that does not struggle honestly to understand and explain the sources of resistance, will not assure that those programmes will work, and it also has no legitimate claim to the trust of reasonable people who see the world very differently. Rather than advance understanding and durable change, it provokes backlash.
  • The impatience within the newsroom with such old ways was intensified by the generational failure of the Times to hire and promote women and non-white people
  • Pay attention if you are white at the Times and you will hear black editors speak of hiring consultants at their own expense to figure out how to get white staff to respect them
  • As wave after wave of pain and outrage swept through the Times, over a headline that was not damning enough of Trump or someone’s obnoxious tweets, I came to think of the people who were fragile, the ones who were caught up in Slack or Twitter storms, as people who had only recently discovered that they were white and were still getting over the shock.
  • Having concluded they had got ahead by working hard, it has been a revelation to them that their skin colour was not just part of the wallpaper of American life, but a source of power, protection and advancement.
  • I share the bewilderment that so many people could back Trump, given the things he says and does, and that makes me want to understand why they do: the breadth and diversity of his support suggests not just racism is at work. Yet these elite, well-meaning Times staff cannot seem to stretch the empathy they are learning to extend to people with a different skin colour to include those, of whatever race, who have different politics.
  • The digital natives were nevertheless valuable, not only for their skills but also because they were excited for the Times to embrace its future. That made them important allies of the editorial and business leaders as they sought to shift the Times to digital journalism and to replace staff steeped in the ways of print. Partly for that reason, and partly out of fear, the leadership indulged internal attacks on Times journalism, despite pleas from me and others, to them and the company as a whole, that Times folk should treat each other with more respect
  • My colleagues and I in Opinion came in for a lot of the scorn, but we were not alone. Correspondents in the Washington bureau and political reporters would take a beating, too, when they were seen as committing sins like “false balance” because of the nuance in their stories.
  • My fellow editorial and commercial leaders were well aware of how the culture of the institution had changed. As delighted as they were by the Times’s digital transformation they were not blind to the ideological change that came with it. They were unhappy with the bullying and group-think; we often discussed such cultural problems in the weekly meetings of the executive committee, composed of the top editorial and business leaders, including the publisher. Inevitably, these bitch sessions would end with someone saying a version of: “Well, at some point we have to tell them this is what we believe in as a newspaper, and if they don’t like it they should work somewhere else.” It took me a couple of years to realise that this moment was never going to come.
  • There is a lot not to miss about the days when editors like Boyd could strike terror in young reporters like me and Purdum. But the pendulum has swung so far in the other direction that editors now tremble before their reporters and even their interns. “I miss the old climate of fear,” Baquet used to say with a smile, in another of his barbed jokes.
  • I wish I’d pursued my point and talked myself out of the job. This contest over control of opinion journalism within the Times was not just a bureaucratic turf battle (though it was that, too)
  • The newsroom’s embrace of opinion journalism has compromised the Times’s independence, misled its readers and fostered a culture of intolerance and conformity.
  • The Opinion department is a relic of the era when the Times enforced a line between news and opinion journalism.
  • Editors in the newsroom did not touch opinionated copy, lest they be contaminated by it, and opinion journalists and editors kept largely to their own, distant floor within the Times building. Such fastidiousness could seem excessive, but it enforced an ethos that Times reporters owed their readers an unceasing struggle against bias in the news
  • But by the time I returned as editorial-page editor, more opinion columnists and critics were writing for the newsroom than for Opinion. As at the cable news networks, the boundaries between commentary and news were disappearing, and readers had little reason to trust that Times journalists were resisting rather than indulging their biases
  • The Times newsroom had added more cultural critics, and, as Baquet noted, they were free to opine about politics.
  • Departments across the Times newsroom had also begun appointing their own “columnists”, without stipulating any rules that might distinguish them from columnists in Opinion
  • (I checked to see if, since I left the Times, it had developed guidelines explaining the difference, if any, between a news columnist and an opinion columnist. The paper’s spokeswoman, Danielle Rhoades Ha, did not respond to the question.)
  • The internet rewards opinionated work and, as news editors felt increasing pressure to generate page views, they began not just hiring more opinion writers but also running their own versions of opinionated essays by outside voices – historically, the province of Opinion’s op-ed department.
  • Yet because the paper continued to honour the letter of its old principles, none of this work could be labelled “opinion” (it still isn’t). After all, it did not come from the Opinion department.
  • And so a newsroom technology columnist might call for, say, unionisation of the Silicon Valley workforce, as one did, or an outside writer might argue in the business section for reparations for slavery, as one did, and to the average reader their work would appear indistinguishable from Times news articles.
  • By similarly circular logic, the newsroom’s opinion journalism breaks another of the Times’s commitments to its readers. Because the newsroom officially does not do opinion – even though it openly hires and publishes opinion journalists – it feels free to ignore Opinion’s mandate to provide a diversity of views
  • When I was editorial-page editor, there were a couple of newsroom columnists whose politics were not obvious. But the other newsroom columnists, and the critics, read as passionate progressives.
  • I urged Baquet several times to add a conservative to the newsroom roster of cultural critics. That would serve the readers by diversifying the Times’s analysis of culture, where the paper’s left-wing bias had become most blatant, and it would show that the newsroom also believed in restoring the Times’s commitment to taking conservatives seriously. He said this was a good idea, but he never acted on it
  • I couldn’t help trying the idea out on one of the paper’s top cultural editors, too: he told me he did not think Times readers would be interested in that point of view.
  • opinion was spreading through the newsroom in other ways. News desks were urging reporters to write in the first person and to use more “voice”, but few newsroom editors had experience in handling that kind of journalism, and no one seemed certain where “voice” stopped and “opinion” began
  • The Times magazine, meanwhile, became a crusading progressive publication
  • Baquet liked to say the magazine was Switzerland, by which he meant that it sat between the newsroom and Opinion. But it reported only to the news side. Its work was not labelled as opinion and it was free to omit conservative viewpoints.
  • This creep of politics into the newsroom’s journalism helped the Times beat back some of its new challengers, at least those on the left
  • Competitors like Vox and the HuffPost were blending leftish politics with reporting and writing it up conversationally in the first person. Imitating their approach, along with hiring some of their staff, helped the Times repel them. But it came at a cost. The rise of opinion journalism over the past 15 years changed the newsroom’s coverage and its culture
  • The tiny redoubt of never-Trump conservatives in Opinion is swamped daily not only by the many progressives in that department but their reinforcements among the critics, columnists and magazine writers in the newsroom
  • They are generally excellent, but their homogeneity means Times readers are being served a very restricted range of views, some of them presented as straight news by a publication that still holds itself out as independent of any politics.
  • And because the critics, newsroom columnists and magazine writers are the newsroom’s most celebrated journalists, they have disproportionate influence over the paper’s culture.
  • By saying that it still holds itself to the old standard of strictly separating its news and opinion journalists, the paper leads its readers further into the trap of thinking that what they are reading is independent and impartial – and this misleads them about their country’s centre of political and cultural gravity.
  • And yet the Times insists to the public that nothing has changed.
  • “Even though each day’s opinion pieces are typically among our most popular journalism and our columnists are among our most trusted voices, we believe opinion is secondary to our primary mission of reporting and should represent only a portion of a healthy news diet,” Sulzberger wrote in the Columbia Journalism Review. “For that reason, we’ve long kept the Opinion department intentionally small – it represents well under a tenth of our journalistic staff – and ensured that its editorial decision-making is walled off from the newsroom.”
  • When I was editorial-page editor, Sulzberger, who declined to be interviewed on the record for this article, worried a great deal about the breakdown in the boundaries between news and opinion
  • He told me once that he would like to restructure the paper to have one editor oversee all its news reporters, another all its opinion journalists and a third all its service journalists, the ones who supply guidance on buying gizmos or travelling abroad. Each of these editors would report to him
  • That is the kind of action the Times needs to take now to confront its hypocrisy and begin restoring its independence.
  • The Times could learn something from the Wall Street Journal, which has kept its journalistic poise
  • It has maintained a stricter separation between its news and opinion journalism, including its cultural criticism, and that has protected the integrity of its work.
  • After I was chased out of the Times, Journal reporters and other staff attempted a similar assault on their opinion department. Some 280 of them signed a letter listing pieces they found offensive and demanding changes in how their opinion colleagues approached their work. “Their anxieties aren’t our responsibility,” shrugged the Journal’s editorial board in a note to readers after the letter was leaked. “The signers report to the news editors or other parts of the business.” The editorial added, in case anyone missed the point, “We are not the New York Times.” That was the end of it.
  • Unlike the publishers of the Journal, however, Sulzberger is in a bind, or at least perceives himself to be
  • The confusion within the Times over its role, and the rising tide of intolerance among the reporters, the engineers, the business staff, even the subscribers – these are all problems he inherited, in more ways than one. He seems to feel constrained in confronting the paper’s illiberalism by the very source of his authority
  • The paradox is that in previous generations the Sulzbergers’ control was the bulwark of the paper’s independence.
  • if he is going to instil the principles he believes in, he needs to stop worrying so much about his powers of persuasion, and start using the power he is so lucky to have.
  • Shortly after we published the op-ed that Wednesday afternoon, some reporters tweeted their opposition to Cotton’s argument. But the real action was in the Times’s Slack channels, where reporters and other staff began not just venting but organising. They turned to the union to draw up a workplace complaint about the op-ed.
  • The next day, this reporter shared the byline on the Times story about the op-ed. That article did not mention that Cotton had distinguished between “peaceful, law-abiding protesters” and “rioters and looters”. In fact, the first sentence reported that Cotton had called for “the military to suppress protests against police violence”.
  • This was – and is – wrong. You don’t have to take my word for that. You can take the Times’s
  • Three days later in its article on my resignation it also initially reported that Cotton had called “for military force against protesters in American cities”. This time, after the article was published on the Times website, the editors scrambled to rewrite it, replacing “military force” with “military response” and “protesters” with “civic unrest”
  • That was a weaselly adjustment – Cotton wrote about criminality, not “unrest” – but the article at least no longer unambiguously misrepresented Cotton’s argument to make it seem he was in favour of crushing democratic protest. The Times did not publish a correction or any note acknowledging the story had been changed.
  • Seeking to influence the outcome of a story you cover, particularly without disclosing that to the reader, violates basic principles I was raised on at the Times
  • Ms Rhoades Ha disputes my characterisation of the after-the-fact editing of the story about my resignation. She said the editors changed the story after it was published on the website in order to “refine” it and “add context”, and so the story did not merit a correction disclosing to the reader that changes had been made.
  • In retrospect what seems almost comical is that as the conflict over Cotton’s op-ed unfolded within the Times I acted as though it was on the level, as though the staff of the Times would have a good-faith debate about Cotton’s piece and the decision to publish it
  • Instead, people wanted to vent and achieve what they considered to be justice, whether through Twitter, Slack, the union or the news pages themselves
  • My colleagues in Opinion, together with the PR team, put together a series of connected tweets describing the purpose behind publishing Cotton’s op-ed. Rather than publish these tweets from the generic Times Opinion Twitter account, Sulzberger encouraged me to do it from my personal one, on the theory that this would humanise our defence. I doubted that would make any difference, but it was certainly my job to take responsibility. So I sent out the tweets, sticking my head in a Twitter bucket that clangs, occasionally, to this day
  • What is worth recalling now from the bedlam of the next two days? I suppose there might be lessons for someone interested in how not to manage a corporate crisis. I began making my own mistakes that Thursday. The union condemned our publication of Cotton, for supposedly putting journalists in danger, claiming that he had called on the military “to ‘detain’ and ‘subdue’ Americans protesting racism and police brutality” – again, a misrepresentation of his argument. The publisher called to tell me the company was experiencing its largest sick day in history; people were turning down job offers because of the op-ed, and, he said, some people were quitting. He had been expecting for some time that the union would seek a voice in editorial decision-making; he said he thought this was the moment the union was making its move. He had clearly changed his own mind about the value of publishing the Cotton op-ed.
  • I asked Dao to have our fact-checkers review the union’s claims. But then I went a step further: at the publisher’s request, I urged him to review the editing of the piece itself and come back to me with a list of steps we could have taken to make it better. Dao’s reflex – the correct one – was to defend the piece as published. He and three other editors of varying ages, genders and races had helped edit it; it had been fact-checked, as is all our work
  • This was my last failed attempt to have the debate within the Times that I had been seeking for four years, about why it was important to present Times readers with arguments like Cotton’s. The staff at the paper never wanted to have that debate. The Cotton uproar was the most extreme version of the internal reaction we faced whenever we published conservative arguments that were not simply anti-Trump. Yes, yes, of course we believe in the principle of publishing diverse views, my Times colleagues would say, but why this conservative? Why this argument?
  • I doubt these changes would have mattered, and to extract this list from Dao was to engage in precisely the hypocrisy I claimed to despise – that, in fact, I do despise. If Cotton needed to be held to such standards of politesse, so did everyone else. Headlines such as “Tom Cotton’s Fascist Op-ed”, the headline of a subsequent piece, should also have been tranquillised.
  • As that miserable Thursday wore on, Sulzberger, Baquet and I held a series of Zoom meetings with reporters and editors from the newsroom who wanted to discuss the op-ed. Though a handful of the participants were there to posture, these were generally constructive conversations. A couple of people, including Baquet, even had the guts to speak up in favour of publishing the op-ed
  • Two moments stick out. At one point, in answer to a question, Sulzberger and Baquet both said they thought the op-ed – as the Times union and many journalists were saying – had in fact put journalists in danger. That was the first time I realised I might be coming to the end of the road.
  • The other was when a pop-culture reporter asked if I had read the op-ed before it was published. I said I had not. He immediately put his head down and started typing, and I should have paid attention rather than moving on to the next question. He was evidently sharing the news with the company over Slack.
  • Every job review I had at the Times urged me to step back from the daily coverage to focus on the long term. (Hilariously, one review, urging me to move faster in upending the Opinion department, instructed me to take risks and “ask for forgiveness not permission”.)
  • I learned when these meetings were over that there had been a new eruption in Slack. Times staff were saying that Rubenstein had been the sole editor of the op-ed. In response, Dao had gone into Slack to clarify to the entire company that he had also edited it himself. But when the Times posted the news article that evening, it reported, “The Op-Ed was edited by Adam Rubenstein” and made no mention of Dao’s statement
  • Early that morning, I got an email from Sam Dolnick, a Sulzberger cousin and a top editor at the paper, who said he felt “we” – he could have only meant me – owed the whole staff “an apology for appearing to place an abstract idea like open debate over the value of our colleagues’ lives, and their safety”. He was worried that I and my colleagues had unintentionally sent a message to other people at the Times that: “We don’t care about their full humanity and their security as much as we care about our ideas.”
  • “I know you don’t like it when I talk about principles at a moment like this,” I began. But I viewed the journalism I had been doing, at the Times and before that at the Atlantic, in very different terms from the ones Dolnick presumed. “I don’t think of our work as an abstraction without meaning for people’s lives – quite the opposite,” I continued. “The whole point – the reason I do this – is to have an impact on their lives to the good. I have always believed that putting ideas, including potentially dangerous one[s], out in the public is vital to ensuring they are debated and, if dangerous, discarded.” It was, I argued, in “edge cases like this that principles are tested”, and if my position was judged wrong then “I am out of step with the times.” But, I concluded, “I don’t think of us as some kind of debating society without implications for the real world and I’ve never been unmindful of my colleagues’ humanity.”
  • in the end, one thing he and I surely agree on is that I was, in fact, out of step with the Times. It may have raised me as a journalist – and invested so much in educating me to what were once its standards – but I did not belong there any more.
  • Finally, I came up with something that felt true. I told the meeting that I was sorry for the pain that my leadership of Opinion had caused. What a pathetic thing to say. I did not think to add, because I’d lost track of this truth myself by then, that opinion journalism that never causes pain is not journalism. It can’t hope to move society forward
  • As I look back at my notes of that awful day, I don’t regret what I said. Even during that meeting, I was still hoping the blow-up might at last give me the chance either to win support for what I had been asked to do, or to clarify once and for all that the rules for journalism had changed at the Times.
  • But no one wanted to talk about that. Nor did they want to hear about all the voices of vulnerable or underprivileged people we had been showcasing in Opinion, or the ambitious new journalism we were doing. Instead, my Times colleagues demanded to know things such as the names of every editor who had had a role in the Cotton piece. Having seen what happened to Rubenstein I refused to tell them. A Slack channel had been set up to solicit feedback in real time during the meeting, and it was filling with hate. The meeting ran long, and finally came to a close after 90 minutes.
  • I tried to insist, as did Dao, that the note make clear the Cotton piece was within our editorial bounds. Sulzberger said he felt the Times could afford to be “silent” on that question. In the end the note went far further in repudiating the piece than I anticipated, saying it should never have been published at all. The next morning I was told to resign.
  • It was a terrible moment for the country. By the traditional – and perverse – logic of journalism, that should also have made it an inspiring time to be a reporter, writer or editor. Journalists are supposed to run towards scenes that others are fleeing, towards hard truths others need to know, towards consequential ideas they would prefer to ignore.
  • But fear got all mixed up with anger inside the Times, too, along with a desire to act locally in solidarity with the national movement. That energy found a focus in the Cotton op-ed
  • the Times is not good at acknowledging mistakes. Indeed, one of my own, within the Times culture, was to take responsibility for any mistakes my department made, and even some it didn’t
  • To Sulzberger, the meltdown over Cotton’s op-ed and my departure in disgrace are explained and justified by a failure of editorial “process”. As he put it in an interview with the New Yorker this summer, after publishing his piece in the Columbia Journalism Review, Cotton’s piece was not “perfectly fact-checked” and the editors had not “thought about the headline and presentation”. He contrasted the execution of Cotton’s opinion piece with that of a months-long investigation the newsroom did of Donald Trump’s taxes (which was not “perfectly fact-checked”, as it happens – it required a correction). He did not explain why, if the Times was an independent publication, an op-ed making a mainstream conservative argument should have to meet such different standards from an op-ed making any other kind of argument, such as for the abolition of the police
  • “It’s not enough just to have the principle and wave it around,” he said. “You also have to execute on it.”
  • To me, extolling the virtue of independent journalism in the pages of the Columbia Journalism Review is how you wave a principle around. Publishing a piece like Cotton’s is how you execute on it.
  • As Sulzberger also wrote in the Review, “Independent journalism, especially in a pluralistic democracy, should err on the side of treating areas of serious political contest as open, unsettled, and in need of further inquiry.”
  • If Sulzberger must insist on comparing the execution of the Cotton op-ed with that of the most ambitious of newsroom projects, let him compare it with something really important, the 1619 Project, which commemorated the 400th anniversary of the arrival of enslaved Africans in Virginia.
  • Like Cotton’s piece, the 1619 Project was fact-checked and copy-edited (most of the Times newsroom does not fact-check or copy-edit articles, but the magazine does). But it nevertheless contained mistakes, as journalism often does. Some of these mistakes ignited a firestorm among historians and other readers.
  • And, like Cotton’s piece, the 1619 Project was presented in a way the Times later judged to be too provocative.
  • The Times declared that the 1619 Project “aims to reframe the country’s history, understanding 1619 as our true founding”. That bold statement – a declaration of Times fact, not opinion, since it came from the newsroom – outraged many Americans who venerated 1776 as the founding. The Times later stealthily erased it from the digital version of the project, but was caught doing so by a writer for the publication Quillette. Sulzberger told me during the initial uproar that the top editors in the newsroom – not just Baquet but his deputy – had not reviewed the audacious statement of purpose, one of the biggest editorial claims the paper has ever made. They also, of course, did not edit all the pieces themselves, trusting the magazine’s editors to do that work.
  • If the 1619 Project and the Cotton op-ed shared the same supposed flaws and excited similar outrage, how come that one is lauded as a landmark success and the other is a sackable offence?
  • I am comparing them only to meet Sulzberger on his terms, in order to illuminate what he is trying to elide. What distinguished the Cotton piece was not an error, or strong language, or that I didn’t edit it personally. What distinguished that op-ed was not process. It was politics.
  • It is one thing for the Times to aggravate historians, or conservatives, or even old-school liberals who believe in open debate. It has become quite another for the Times to challenge some members of its own staff with ideas that might contradict their view of the world.
  • The lessons of the incident are not about how to write a headline but about how much the Times has changed – how digital technology, the paper’s new business model and the rise of new ideals among its staff have altered its understanding of the boundary between news and opinion, and of the relationship between truth and justice
  • Ejecting me was one way to avoid confronting the question of which values the Times is committed to. Waving around the word “process” is another.
  • As he asserts the independence of Times journalism, Sulzberger is finding it necessary to reach back several years to another piece I chose to run, for proof that the Times remains willing to publish views that might offend its staff. “We’ve published a column by the head of the part of the Taliban that kidnapped one of our own journalists,” he told the New Yorker. He is missing the real lesson of that piece, as well.
  • The case against that piece is that Haqqani, who remains on the FBI’s most-wanted terrorist list, may have killed Americans. It’s puzzling: in what moral universe can it be a point of pride to publish a piece by an enemy who may have American blood on his hands, and a matter of shame to publish a piece by an American senator arguing for American troops to protect Americans?
  • As Mitch McConnell, then the majority leader, said on the Senate floor about the Times’s panic over the Cotton op-ed, listing some other debatable op-ed choices, “Vladimir Putin? No problem. Iranian propaganda? Sure. But nothing, nothing could have prepared them for 800 words from the junior senator from Arkansas.”
  • The Times’s staff members are not often troubled by obnoxious views when they are held by foreigners. This is an important reason the paper’s foreign coverage, at least of some regions, remains exceptional.
  • What seems most important and least understood about that episode is that it demonstrated in real time the value of the ideals that I poorly defended in the moment, ideals that not just the Times’s staff but many other college-educated Americans are abandoning.
  • After all, we ran the experiment; we published the piece. Was any Times journalist hurt? No. Nobody in the country was. In fact, though it is impossible to know the op-ed’s precise effect, polling showed that support for a military option dropped after the Times published the essay, as the Washington Post’s media critic, Erik Wemple, has written
  • If anything, in other words, publishing the piece stimulated debate that made it less likely Cotton’s position would prevail. The liberal, journalistic principle of open debate was vindicated in the very moment the Times was fleeing from it.
Javier E

The Closing of the American Mind: A Summary - 0 views

  • Preface
  • “No teacher can doubt that his real task is to assist his pupil to fulfill human nature against all the deforming forces of convention and prejudice.” p. 20
  • A liberal education is one that helps students to ask themselves and answer the question, “what is man?… In our chronic lack of certainty, this comes down to knowing the alternative answers [to that question] and thinking about them.” p. 21
  • ...67 more annotations...
  • Introduction: Our Virtue
  • “There is one thing that a professor can be absolutely certain of: almost every student entering the university believes, or says he believes, that truth is relative…. Relativism is necessary to openness; and this is the virtue, the only virtue, which all primary education for more than fifty years has dedicated itself to inculcating.” p. 25
  • “Democratic education…wants and needs to produce men and women [who are] supportive of a democratic regime.” p. 26
  • The historical assumption of the human sciences was (and remains) that an objective human nature exists and can be discovered—if not by reason itself, then at least by empirical science guided by reason. Science was a method to allow us to rise beyond the prejudices of our culture in order to discover the truths of human nature. It was a mechanism for opening our minds, an instrument of openness. p. 37-38
  • Liberalism has always tended towards increased freedom—i.e., decreased regulation. But “it was possible to expand the space exempt from legitimate social and political regulation only by contracting the claims to moral and political knowledge…. It begins to appear that full freedom can be attained only when there is no such knowledge at all…[and] of course the result is that…the argument justifying freedom disappears, and…all beliefs begin to have an attenuated character.” p. 28
  • Modern education is concerned mainly with correcting ethnocentrism—showing students that their preferences are merely accidents of their culture and that no single culture is better than any other. The roots of this movement are found in the problems (racism, mistreatment) that arose due to the multicultural nature of American life. p. 29-30
  • The Founders envisioned a society where individuals were bound together by their belief in and adherence to the rights of the Constitution. Minority factions were seen as a bad thing, detracting from social cohesiveness. p. 31
  • However, the provision of equal rights did not guarantee equal treatment, and minority groups suffered. This caused them to retreat into their minority identities and oppose the majority—indeed, “much of the intellectual machinery of twentieth-century American political thought and social science was constructed for the purpose of making an assault on [the] majority…. The very idea of a majority—now understood to be selfish interest—is done away with in order to protect the minorities.” p. 32-35
  • However, its ideas about what this means have changed over time, starting with a faith in the human rights of the U.S. Constitution, but ultimately changing to (now) mean “openness,” i.e., relativism. p. 26-27
  • “Historicism and cultural relativism actually are a means to avoid testing our own prejudices and asking, for example, whether men are really equal or whether that opinion is a democratic prejudice.” p. 40
  • Today, “the human sciences want to make us culture-beings with the instruments [science and reason] that were invented to liberate us from culture…: cultural relativism, historicism, the fact-value distinction—are the suicide of science. Culture, hence closedness, reigns supreme. Openness to closedness is what we teach.” p. 38-39
  • Yet the dogmatic modern assumption is that human nature does not exist, that our ways of being are culturally determined, that our minds are inherently constrained—“closed”—by cultural influences. p. 38
  • “There are two types of openness, the openness of indifference…and the openness that invites us to the quest for knowledge and certitude.” p. 41
  • The openness of indifference advocates the removal of all requirements in education—why should students learn languages or philosophy? But the reality is that, “to be open to knowing, there are certain types of things one must know which most people don’t want to bother to learn and which appear boring and irrelevant…true openness means closedness to all the charms that make us comfortable with the present.” p. 41
  • The Clean Slate
  • On the surface, Americans seems to lack a true culture or set of traditions. But most of them grew up with a shared knowledge of the Bible and the Declaration of Independence, and “contrary to much contemporary wisdom, the United States has one of the longest uninterrupted political [and intellectual] traditions of any nation in the world.” And this tradition is not confused or counterbalanced by a history of monarchy or aristocracy. p. 52-55
  • So we have a culture in which to root education, but we have begun to undermine it. The idealism of the American founding has been explained away as mythical, selfishly-motivated, and racist. And so our culture has been devalued. p. 55-56
  • Religion, too, has been explained away, but this has left us without a standpoint from which to understand our experience as humans. Parents “have nothing to give their children in the way of a vision of the world.” p. 56-57
  • “As it now stands, students have powerful images of what the perfect body is and pursue it incessantly. But deprived of literary guidance, they no longer have any image of a perfect soul, and hence do not long to have one. They do not even imagine that there is such a thing.” p. 67
  • Books
  • “I have begun to wonder whether the experience of the greatest texts from early childhood is not a prerequisite for a concern throughout life for them and for lesser but important literature. The soul’s longing…may well require encouragement at the outset.” p. 62
  • Literature is critical because it presents to young people the range of possibilities of human types—both good and bad. p. 62-64
  • But students are less and less exposed to literature, and as a result, “they have only pop psychology to tell them what people are like, and the range of their motives…. [Therefore,] people become more alike, for want of knowing they can be otherwise. What poor substitutes for real diversity are the wild rainbows of dyed hair and other external differences that tell the observer nothing about what is inside.” p. 64
  • Without exposure to literature, students usually resort to the movies. But movies do not provide the “distance from the contemporary” that students need, and so this only reinforces the belief that the here and now is all there is. p. 64
  • The loss of literature has also meant the loss of heroes. In a “perversion of the democratic principle,” this lack is almost admired, since being oneself is the supposed goal. But whether or not it is seen as desirable, students invariably seek role models. And without literature, they only have those around them (and in the media) to emulate. p. 66-67
  • “Nobody believes that the old books do, or even could, contain the truth…. Tradition has become superfluous.” p. 58
  • We are left with a culture filled with “the intense, changing, crude and immediate, which Tocqueville warned us would be the character of democratic art…. In short, life is made into a nonstop, commercially prepackaged masturbational fantasy. This description may seem exaggerated, but only because some would prefer to regard it as such.” p. 74-75
  • Relationships
  • “In short, after the war, while America was sending out its blue jeans to unite the young of all nations, a concrete form of democratic universalism that has had liberalizing effects on many enslaved nations, it was importing a clothing of German fabrication for its souls, which clashed with all that and cast doubt on the Americanization of the world on which we had embarked, thinking it was good and in conformity with the rights of man
  • “This indeterminate or open-ended future and the lack of a binding past mean that the souls of young people are in a condition like that of the first men in the state of nature—spiritually unclad, unconnected, isolated, with no inherited or unconditional connection with anything or anyone…. Why are we surprised that such unfurnished persons should be preoccupied principally with themselves?” p. 87-88
  • “The one eccentric element in this portrait, the one failure…is the relation between blacks and whites.” Although black students are present on campuses, they “have, by and large, proved indigestible.” p. 91
  • the Black Power movement arrived and the universities conceded to identity politics, which took the form of Black-themed courses, quotas, and an unwillingness to fail black students. p. 94-95
  • “The black student who wants to be just a student and to avoid allegiance to the black group has to pay a terrific price, because he is judged negatively by his black peers and because his behavior is atypical in the eyes of whites. White students have silently and unconsciously adjusted to a group presence of blacks, and they must readjust for a black who does not define himself by the group.” Affirmative action cements this dynamic. p. 95-96
  • The restructuring of the family requires that men subdue their masculine character. “And it is indeed possible to soften men. But to make them ‘care’ is another thing, and the project must inevitably fail…. The old moral order, however imperfect it may have been, at least moved towards the virtues by way of the passions. If men were self-concerned, that order tried to expand the scope of self-concern to include others [i.e., his wife and children], rather than commanding men to cease being concerned with themselves.” p. 129
  • “I am not arguing here that the old family arrangements were good or that we should go back to them. I am only insisting that we not cloud our vision to such an extent that we believe that there are viable substitutes for them just because we want or need them.” p. 130
  • “All of our reforms have helped strip the teeth of our gears, which can therefore no longer mesh. They spin idly, side by side, unable to set the social machine in motion.” p. 131
  • Modern students are lacking the longing that is critical for a full enjoyment of life. They are complacent. And the universities do not see themselves as providing for such a longing. p. 134-136
  • The German Connection
  • Value relativism is the modern replacement for traditional morality, and “constitutes a change in our view of things moral and political as great as the one that took place when Christianity replaced Greek and Roman paganism.” p. 141
  • Value relativism has sunk so far into the American consciousness that its vocabulary has become colloquial: we talk about ‘charisma,’ ‘life-style,’ ‘commitment,’ ‘identity,’ etc. “Although they, and the things to which they refer, would have been incomprehensible to our fathers, not to speak of our Founding Fathers.” p. 147
  • Students today are largely apathetic about any concerns outside of themselves. There isn’t any malice in this self-centeredness; but it has become so entrenched in American culture that it isn’t even recognized as unusual. p. 82-86
  • “We chose [to import] a system of thought that, like some wines, does not travel; we chose a way of looking at things that could never be ours and had as its starting point dislike of us and our goals.” p. 153
  • The question isn’t even asked whether the German doctrine of value-creation is contrary to democratic and egalitarian ideals; but it certainly seems to leave room for their opposites and perhaps promote them—i.e., value relativism seems to allow for fascism. p. 154
  • The Self
  • Although a precise definition remains elusive, “the self is the modern substitute for the soul.” p. 173
  • Man used to strive for fulfillment by taming his bodily desires in order to live virtuously. But this changed after Machiavelli (and Hobbes after him) suggested that instead we ignore virtue and follow our desires, which find their root in the state of nature. p. 174-175
  • Following their advice, “our desire becomes a kind of oracle we consult; it is the last word, while in the past it was the questionable and dangerous part of us.” p. 175
  • Locke then replaced the virtuous man with the rationally selfish one. “Beneath his selfishness, of course, lies an expectation that it conduces more to the good of others than does moralism.” p. 175-176
  • “All higher purposiveness in nature, which might have been consulted by men’s reason and used to limit human passion, has disappeared.” p. 176
  • That reason “is unable to rule in culture or in soul…constitutes a crisis of the West…[whose] regimes are founded on reason.” Previous regimes relied on religion, but Enlightenment undermined religion. p. 196
  • Psychology came to us “in order to treat the parts of man which had been so long neglected by liberal society…. Modern psychology has this in common with what was always a popular opinion, fathered by Machiavelli—that selfishness is somehow good. Man is self, and the self must be selfish. What is new is that we are told to look more deeply into the self, that we assumed too easily that we know it and have access to it.” p. 178
  • Prior to this, it was only God who was dignified—not man. And God was dignified in his freedom, his ability to create. If man was to be elevated, he, too, must be free; he, too, must be able to create. p. 180
  • And so, following Rousseau and our dissatisfaction with the Enlightenment, we have elevated creativity above reason as the ultimate virtue, and the artist replaced the philosopher and scientist as the admired human type. p. 181-182
  • Yet those who praise creativity don’t realize why. They admire it without seeing that it is the result of Romantic thought absorbed into democratic public opinion. And it has influenced the whole political spectrum, from Left to Right. p. 181-182
  • The Germans (Nietzsche and Weber) recognized as early as 1919 that the scientific spirit was dead, that reason cannot establish values. But Americans (naïvely, and largely unknowingly) still held onto the rationalist dream, written as it was into our political foundations. p. 194-195
  • When those ideas came to the U.S. (via Weber), “a very dark view of the future was superimposed on our incorrigible optimism. We are children playing with adult toys.” p. 195
  • “The psychology of the self has succeeded so well that it is now the instinct of most of us to turn for a cure for our ills back within ourselves rather than to the nature of things.” p. 179
  • Rousseau and others recognized this. “The very idea of culture was a way of preserving something like religion without talking about it.” But Nietzsche saw this was impossible. p. 196-197
  • We are left with no religion, but we still have religious impulses. p. 197
  • “The disenchantment of God and nature necessitated a new description of good and evil. To adapt a formula of Plato about the gods, we do not love a thing because it is good, it is good because we love it. It [became] our decision to esteem that makes something estimable.” p. 197
  • “Since values are not rational…they must be imposed.” Will, or commitment, is the primary virtue; it is the equivalent of (what used to be) faith. “Nietzsche was not a fascist; but this project inspired fascist rhetoric, which looked to the revitalization of old cultures or the foundation of new ones, as opposed to the rational, rootless cosmopolitanism of the revolutions of the Left.” p. 201-202
  • Nietzsche was a cultural relativist. This meant he anticipated war, because wars are inevitable when values are imposed and unrooted in truth or anything objective. p. 202
  • “Just over the horizon, when Weber wrote, lay Hitler…. He was the mad, horrible parody of the charismatic leader—the demagogue—hoped for by Weber.” Weber was not looking for something so extreme, but “when one ventures out into the vast spaces opened up by Nietzsche, it is hard to set limits.” p. 213-214
  • “Hitler did not cause a rethinking of the politics here or in Europe. All to the contrary—it was while we were fighting him that the thought that had preceded him in Europe conquered here.” And it remains dominant. p. 214
  • The language of values implies that the religious is the source of everything political, social, and personal. It has been facilitated by a softening and blurring of the idea of religion and “the sacred,” which are no longer seen as dangerous.
  • “As an image of our current intellectual condition, I keep being reminded of the newsreel pictures of Frenchmen splashing happily in the water at the seashore, enjoying the paid annual vacations legislated by Leon Blum’s Popular Front government. It was 1936, the same year Hitler was permitted to occupy the Rhineland. All our big causes amount to that kind of vacation.” p. 239
  • This is our educational crisis and opportunity. Western rationalism has culminated in a rejection of reason. Is this result necessary? p. 240
Javier E

President Obama's Interview With Jeffrey Goldberg on Syria and Foreign Policy - The Atl... - 0 views

  • The president believes that Churchillian rhetoric and, more to the point, Churchillian habits of thought, helped bring his predecessor, George W. Bush, to ruinous war in Iraq.
  • Obama entered the White House bent on getting out of Iraq and Afghanistan; he was not seeking new dragons to slay. And he was particularly mindful of promising victory in conflicts he believed to be unwinnable. “If you were to say, for instance, that we’re going to rid Afghanistan of the Taliban and build a prosperous democracy instead, the president is aware that someone, seven years later, is going to hold you to that promise,” Ben Rhodes, Obama’s deputy national-security adviser, and his foreign-policy amanuensis, told me not long ago.
  • Power is a partisan of the doctrine known as “responsibility to protect,” which holds that sovereignty should not be considered inviolate when a country is slaughtering its own citizens. She lobbied him to endorse this doctrine in the speech he delivered when he accepted the Nobel Peace Prize in 2009, but he declined. Obama generally does not believe a president should place American soldiers at great risk in order to prevent humanitarian disasters, unless those disasters pose a direct security threat to the United States.
  • ...162 more annotations...
  • Obama’s resistance to direct intervention only grew. After several months of deliberation, he authorized the CIA to train and fund Syrian rebels, but he also shared the outlook of his former defense secretary, Robert Gates, who had routinely asked in meetings, “Shouldn’t we finish up the two wars we have before we look for another?”
  • In his first term, he came to believe that only a handful of threats in the Middle East conceivably warranted direct U.S. military intervention. These included the threat posed by al‑Qaeda; threats to the continued existence of Israel (“It would be a moral failing for me as president of the United States” not to defend Israel, he once told me); and, not unrelated to Israel’s security, the threat posed by a nuclear-armed Iran.
  • Bush and Scowcroft removed Saddam Hussein’s army from Kuwait in 1991, and they deftly managed the disintegration of the Soviet Union; Scowcroft also, on Bush’s behalf, toasted the leaders of China shortly after the slaughter in Tiananmen Square.
  • As Obama was writing his campaign manifesto, The Audacity of Hope, in 2006, Susan Rice, then an informal adviser, felt it necessary to remind him to include at least one line of praise for the foreign policy of President Bill Clinton, to partially balance the praise he showered on Bush and Scowcroft.
  • “When you have a professional army,” he once told me, “that is well armed and sponsored by two large states”—Iran and Russia—“who have huge stakes in this, and they are fighting against a farmer, a carpenter, an engineer who started out as protesters and suddenly now see themselves in the midst of a civil conflict …” He paused. “The notion that we could have—in a clean way that didn’t commit U.S. military forces—changed the equation on the ground there was never true.”
  • The message Obama telegraphed in speeches and interviews was clear: He would not end up like the second President Bush—a president who became tragically overextended in the Middle East, whose decisions filled the wards of Walter Reed with grievously wounded soldiers, who was helpless to stop the obliteration of his reputation, even when he recalibrated his policies in his second term. Obama would say privately that the first task of an American president in the post-Bush international arena was “Don’t do stupid shit.”
  • Hillary Clinton, when she was Obama’s secretary of state, argued for an early and assertive response to Assad’s violence. In 2014, after she left office, Clinton told me that “the failure to help build up a credible fighting force of the people who were the originators of the protests against Assad … left a big vacuum, which the jihadists have now filled.” When The Atlantic published this statement, and also published Clinton’s assessment that “great nations need organizing principles, and ‘Don’t do stupid stuff’ is not an organizing principle,” Obama became “rip-shit angry,” according to one of his senior advisers. The president did not understand how “Don’t do stupid shit” could be considered a controversial slogan.
  • The Iraq invasion, Obama believed, should have taught Democratic interventionists like Clinton, who had voted for its authorization, the dangers of doing stupid shit. (Clinton quickly apologized to Obama for her comments,
  • Obama, unlike liberal interventionists, is an admirer of the foreign-policy realism of President George H. W. Bush and, in particular, of Bush’s national-security adviser, Brent Scowcroft (“I love that guy,” Obama once told me).
  • The danger to the United States posed by the Assad regime did not rise to the level of these challenges.
  • Obama generally believes that the Washington foreign-policy establishment, which he secretly disdains, makes a fetish of “credibility”—particularly the sort of credibility purchased with force. The preservation of credibility, he says, led to Vietnam. Within the White House, Obama would argue that “dropping bombs on someone to prove that you’re willing to drop bombs on someone is just about the worst reason to use force.”
  • American national-security credibility, as it is conventionally understood in the Pentagon, the State Department, and the cluster of think tanks headquartered within walking distance of the White House, is an intangible yet potent force—one that, when properly nurtured, keeps America’s friends feeling secure and keeps the international order stable.
  • All week, White House officials had publicly built the case that Assad had committed a crime against humanity. Kerry’s speech would mark the culmination of this campaign.
  • But the president had grown queasy. In the days after the gassing of Ghouta, Obama would later tell me, he found himself recoiling from the idea of an attack unsanctioned by international law or by Congress. The American people seemed unenthusiastic about a Syria intervention; so too did one of the few foreign leaders Obama respects, Angela Merkel, the German chancellor. She told him that her country would not participate in a Syria campaign. And in a stunning development, on Thursday, August 29, the British Parliament denied David Cameron its blessing for an attack. John Kerry later told me that when he heard that, “internally, I went, Oops.”
  • Obama was also unsettled by a surprise visit early in the week from James Clapper, his director of national intelligence, who interrupted the President’s Daily Brief, the threat report Obama receives each morning from Clapper’s analysts, to make clear that the intelligence on Syria’s use of sarin gas, while robust, was not a “slam dunk.” He chose the term carefully. Clapper, the chief of an intelligence community traumatized by its failures in the run-up to the Iraq War, was not going to overpromise, in the manner of the onetime CIA director George Tenet, who famously guaranteed George W. Bush a “slam dunk” in Iraq.
  • While the Pentagon and the White House’s national-security apparatuses were still moving toward war (John Kerry told me he was expecting a strike the day after his speech), the president had come to believe that he was walking into a trap—one laid both by allies and by adversaries, and by conventional expectations of what an American president is supposed to do.
  • Late on Friday afternoon, Obama determined that he was simply not prepared to authorize a strike. He asked McDonough, his chief of staff, to take a walk with him on the South Lawn of the White House. Obama did not choose McDonough randomly: He is the Obama aide most averse to U.S. military intervention, and someone who, in the words of one of his colleagues, “thinks in terms of traps.” Obama, ordinarily a preternaturally confident man, was looking for validation, and trying to devise ways to explain his change of heart, both to his own aides and to the public
  • The third, and most important, factor, he told me, was “our assessment that while we could inflict some damage on Assad, we could not, through a missile strike, eliminate the chemical weapons themselves, and what I would then face was the prospect of Assad having survived the strike and claiming he had successfully defied the United States, that the United States had acted unlawfully in the absence of a UN mandate, and that that would have potentially strengthened his hand rather than weakened it.
  • Others had difficulty fathoming how the president could reverse himself the day before a planned strike. Obama, however, was completely calm. “If you’ve been around him, you know when he’s ambivalent about something, when it’s a 51–49 decision,” Ben Rhodes told me. “But he was completely at ease.”
  • Obama also shared with McDonough a long-standing resentment: He was tired of watching Washington unthinkingly drift toward war in Muslim countries. Four years earlier, the president believed, the Pentagon had “jammed” him on a troop surge for Afghanistan. Now, on Syria, he was beginning to feel jammed again.
  • The fourth factor, he said, was of deeper philosophical importance. “This falls in the category of something that I had been brooding on for some time,” he said. “I had come into office with the strong belief that the scope of executive power in national-security issues is very broad, but not limitless.”
  • Obama’s decision caused tremors across Washington as well. John McCain and Lindsey Graham, the two leading Republican hawks in the Senate, had met with Obama in the White House earlier in the week and had been promised an attack. They were angered by the about-face. Damage was done even inside the administration. Neither Chuck Hagel, then the secretary of defense, nor John Kerry was in the Oval Office when the president informed his team of his thinking. Kerry would not learn about the change until later that evening. “I just got fucked over,” he told a friend shortly after talking to the president that night. (When I asked Kerry recently about that tumultuous night, he said, “I didn’t stop to analyze it. I figured the president had a reason to make a decision and, honestly, I understood his notion.”)
  • The president asked Congress to authorize the use of force—the irrepressible Kerry served as chief lobbyist—and it quickly became apparent in the White House that Congress had little interest in a strike. When I spoke with Biden recently about the red-line decision, he made special note of this fact. “It matters to have Congress with you, in terms of your ability to sustain what you set out to do,” he said. Obama “didn’t go to Congress to get himself off the hook. He had his doubts at that point, but he knew that if he was going to do anything, he better damn well have the public with him, or it would be a very short ride.” Congress’s clear ambivalence convinced Biden that Obama was correct to fear the slippery slope. “What happens when we get a plane shot down? Do we not go in and rescue?,” Biden asked. “You need the support of the American people.”
  • At the G20 summit in St. Petersburg, which was held the week after the Syria reversal, Obama pulled Putin aside, he recalled to me, and told the Russian president “that if he forced Assad to get rid of the chemical weapons, that that would eliminate the need for us taking a military strike.” Within weeks, Kerry, working with his Russian counterpart, Sergey Lavrov, would engineer the removal of most of Syria’s chemical-weapons arsenal—a program whose existence Assad until then had refused to even acknowledge.
  • The arrangement won the president praise from, of all people, Benjamin Netanyahu, the Israeli prime minister, with whom he has had a consistently contentious relationship. The removal of Syria’s chemical-weapons stockpiles represented “the one ray of light in a very dark region,” Netanyahu told me not long after the deal was announced.
  • John Kerry today expresses no patience for those who argue, as he himself once did, that Obama should have bombed Assad-regime sites in order to buttress America’s deterrent capability. “You’d still have the weapons there, and you’d probably be fighting ISIL” for control of the weapons, he said, referring to the Islamic State, the terror group also known as ISIS. “It just doesn’t make sense. But I can’t deny to you that this notion about the red line being crossed and [Obama’s] not doing anything gained a life of its own.”
  • today that decision is a source of deep satisfaction for him.
  • “I’m very proud of this moment,” he told me. “The overwhelming weight of conventional wisdom and the machinery of our national-security apparatus had gone fairly far. The perception was that my credibility was at stake, that America’s credibility was at stake. And so for me to press the pause button at that moment, I knew, would cost me politically. And the fact that I was able to pull back from the immediate pressures and think through in my own mind what was in America’s interest, not only with respect to Syria but also with respect to our democracy, was as tough a decision as I’ve made—and I believe ultimately it was the right decision to make.”
  • By 2013, Obama’s resentments were well developed. He resented military leaders who believed they could fix any problem if the commander in chief would simply give them what they wanted, and he resented the foreign-policy think-tank complex. A widely held sentiment inside the White House is that many of the most prominent foreign-policy think tanks in Washington are doing the bidding of their Arab and pro-Israel funders. I’ve heard one administration official refer to Massachusetts Avenue, the home of many of these think tanks, as “Arab-occupied territory.”
  • over the past few months, I’ve spent several hours talking with him about the broadest themes of his “long game” foreign policy, including the themes he is most eager to discuss—namely, the ones that have nothing to do with the Middle East.
  • I have come to believe that, in Obama’s mind, August 30, 2013, was his liberation day, the day he defied not only the foreign-policy establishment and its cruise-missile playbook, but also the demands of America’s frustrating, high-maintenance allies in the Middle East—countries, he complains privately to friends and advisers, that seek to exploit American “muscle” for their own narrow and sectarian ends.
  • “Where am I controversial? When it comes to the use of military power,” he said. “That is the source of the controversy. There’s a playbook in Washington that presidents are supposed to follow. It’s a playbook that comes out of the foreign-policy establishment. And the playbook prescribes responses to different events, and these responses tend to be militarized responses. Where America is directly threatened, the playbook works. But the playbook can also be a trap that can lead to bad decisions. In the midst of an international challenge like Syria, you get judged harshly if you don’t follow the playbook, even if there are good reasons why it does not apply.”
  • For some foreign-policy experts, even within his own administration, Obama’s about-face on enforcing the red line was a dispiriting moment in which he displayed irresolution and naïveté, and did lasting damage to America’s standing in the world. “Once the commander in chief draws that red line,” Leon Panetta, who served as CIA director and then as secretary of defense in Obama’s first term, told me recently, “then I think the credibility of the commander in chief and this nation is at stake if he doesn’t enforce it.” Right after Obama’s reversal, Hillary Clinton said privately, “If you say you’re going to strike, you have to strike. There’s no choice.”
  • Obama’s defenders, however, argue that he did no damage to U.S. credibility, citing Assad’s subsequent agreement to have his chemical weapons removed. “The threat of force was credible enough for them to give up their chemical weapons,” Tim Kaine, a Democratic senator from Virginia, told me. “We threatened military action and they responded. That’s deterrent credibility.”
  • History may record August 30, 2013, as the day Obama prevented the U.S. from entering yet another disastrous Muslim civil war, and the day he removed the threat of a chemical attack on Israel, Turkey, or Jordan. Or it could be remembered as the day he let the Middle East slip from America’s grasp, into the hands of Russia, Iran, and ISIS.
  • spoke with Obama about foreign policy when he was a U.S. senator, in 2006. At the time, I was familiar mainly with the text of a speech he had delivered four years earlier, at a Chicago antiwar rally. It was an unusual speech for an antiwar rally in that it was not antiwar; Obama, who was then an Illinois state senator, argued only against one specific and, at the time, still theoretical, war. “I suffer no illusions about Saddam Hussein,” he said. “He is a brutal man. A ruthless man … But I also know that Saddam poses no imminent and direct threat to the United States or to his neighbors.” He added, “I know that an invasion of Iraq without a clear rationale and without strong international support will only fan the flames of the Middle East, and encourage the worst, rather than best, impulses of the Arab world, and strengthen the recruitment arm of al-Qaeda.”
  • This speech had made me curious about its author. I wanted to learn how an Illinois state senator, a part-time law professor who spent his days traveling between Chicago and Springfield, had come to a more prescient understanding of the coming quagmire than the most experienced foreign-policy thinkers of his party, including such figures as Hillary Clinton, Joe Biden, and John Kerry, not to mention, of course, most Republicans and many foreign-policy analysts and writers, including me.
  • This was the moment the president believes he finally broke with what he calls, derisively, the “Washington playbook.”
  • “ISIS is not an existential threat to the United States,” he told me in one of these conversations. “Climate change is a potential existential threat to the entire world if we don’t do something about it.” Obama explained that climate change worries him in particular because “it is a political problem perfectly designed to repel government intervention. It involves every single country, and it is a comparatively slow-moving emergency, so there is always something seemingly more urgent on the agenda.”
  • At the moment, of course, the most urgent of the “seemingly more urgent” issues is Syria. But at any given moment, Obama’s entire presidency could be upended by North Korean aggression, or an assault by Russia on a member of NATO, or an ISIS-planned attack on U.S. soil. Few presidents have faced such diverse tests on the international stage as Obama has, and the challenge for him, as for all presidents, has been to distinguish the merely urgent from the truly important, and to focus on the important.
  • My goal in our recent conversations was to see the world through Obama’s eyes, and to understand what he believes America’s role in the world should be. This article is informed by our recent series of conversations, which took place in the Oval Office; over lunch in his dining room; aboard Air Force One; and in Kuala Lumpur during his most recent visit to Asia, in November. It is also informed by my previous interviews with him and by his speeches and prolific public ruminations, as well as by conversations with his top foreign-policy and national-security advisers, foreign leaders and their ambassadors in Washington, friends of the president and others who have spoken with him about his policies and decisions, and his adversaries and critics.
  • Over the course of our conversations, I came to see Obama as a president who has grown steadily more fatalistic about the constraints on America’s ability to direct global events, even as he has, late in his presidency, accumulated a set of potentially historic foreign-policy achievements—controversial, provisional achievements, to be sure, but achievements nonetheless: the opening to Cuba, the Paris climate-change accord, the Trans-Pacific Partnership trade agreement, and, of course, the Iran nuclear deal.
  • These he accomplished despite his growing sense that larger forces—the riptide of tribal feeling in a world that should have already shed its atavism; the resilience of small men who rule large countries in ways contrary to their own best interests; the persistence of fear as a governing human emotion—frequently conspire against the best of America’s intentions. But he also has come to learn, he told me, that very little is accomplished in international affairs without U.S. leadership.
  • Obama talked me through this apparent contradiction. “I want a president who has the sense that you can’t fix everything,” he said. But on the other hand, “if we don’t set the agenda, it doesn’t happen.” He explained what he meant. “The fact is, there is not a summit I’ve attended since I’ve been president where we are not setting the agenda, where we are not responsible for the key results,” he said. “That’s true whether you’re talking about nuclear security, whether you’re talking about saving the world financial system, whether you’re talking about climate.”
  • One day, over lunch in the Oval Office dining room, I asked the president how he thought his foreign policy might be understood by historians. He started by describing for me a four-box grid representing the main schools of American foreign-policy thought. One box he called isolationism, which he dismissed out of hand. “The world is ever-shrinking,” he said. “Withdrawal is untenable.” The other boxes he labeled realism, liberal interventionism, and internationalism. “I suppose you could call me a realist in believing we can’t, at any given moment, relieve all the world’s misery,” he said. “We have to choose where we can make a real impact.” He also noted that he was quite obviously an internationalist, devoted as he is to strengthening multilateral organizations and international norms.
  • If a crisis, or a humanitarian catastrophe, does not meet his stringent standard for what constitutes a direct national-security threat, Obama said, he doesn’t believe that he should be forced into silence. He is not so much the realist, he suggested, that he won’t pass judgment on other leaders.
  • Though he has so far ruled out the use of direct American power to depose Assad, he was not wrong, he argued, to call on Assad to go. “Oftentimes when you get critics of our Syria policy, one of the things that they’ll point out is ‘You called for Assad to go, but you didn’t force him to go. You did not invade.’ And the notion is that if you weren’t going to overthrow the regime, you shouldn’t have said anything. That’s a weird argument to me, the notion that if we use our moral authority to say ‘This is a brutal regime, and this is not how a leader should treat his people,’ once you do that, you are obliged to invade the country and install a government you prefer.”
  • “I am very much the internationalist,” Obama said in a later conversation. “And I am also an idealist insofar as I believe that we should be promoting values, like democracy and human rights and norms and values
  • “Having said that,” he continued, “I also believe that the world is a tough, complicated, messy, mean place, and full of hardship and tragedy. And in order to advance both our security interests and those ideals and values that we care about, we’ve got to be hardheaded at the same time as we’re bighearted, and pick and choose our spots, and recognize that there are going to be times where the best that we can do is to shine a spotlight on something that’s terrible, but not believe that we can automatically solve it. There are going to be times where our security interests conflict with our concerns about human rights. There are going to be times where we can do something about innocent people being killed, but there are going to be times where we can’t.”
  • If Obama ever questioned whether America really is the world’s one indispensable nation, he no longer does so. But he is the rare president who seems at times to resent indispensability, rather than embrace it.
  • “Free riders aggravate me,” he told me. Recently, Obama warned that Great Britain would no longer be able to claim a “special relationship” with the United States if it did not commit to spending at least 2 percent of its GDP on defense. “You have to pay your fair share,” Obama told David Cameron, who subsequently met the 2 percent threshold.
  • Part of his mission as president, Obama explained, is to spur other countries to take action for themselves, rather than wait for the U.S. to lead. The defense of the liberal international order against jihadist terror, Russian adventurism, and Chinese bullying depends in part, he believes, on the willingness of other nations to share the burden with the U.S.
  • This is why the controversy surrounding the assertion—made by an anonymous administration official to The New Yorker during the Libya crisis of 2011—that his policy consisted of “leading from behind” perturbed him. “We don’t have to always be the ones who are up front,” he told me. “Sometimes we’re going to get what we want precisely because we are sharing in the agenda.
  • The president also seems to believe that sharing leadership with other countries is a way to check America’s more unruly impulses. “One of the reasons I am so focused on taking action multilaterally where our direct interests are not at stake is that multilateralism regulates hubris,”
  • He consistently invokes what he understands to be America’s past failures overseas as a means of checking American self-righteousness. “We have history,” he said. “We have history in Iran, we have history in Indonesia and Central America. So we have to be mindful of our history when we start talking about intervening, and understand the source of other people’s suspicions.”
  • In his efforts to off-load some of America’s foreign-policy responsibilities to its allies, Obama appears to be a classic retrenchment president in the manner of Dwight D. Eisenhower and Richard Nixon. Retrenchment, in this context, is defined as “pulling back, spending less, cutting risk, and shifting burdens to allies
  • One difference between Eisenhower and Nixon, on the one hand, and Obama, on the other, Sestanovich said, is that Obama “appears to have had a personal, ideological commitment to the idea that foreign policy had consumed too much of the nation’s attention and resources.”
  • But once he decides that a particular challenge represents a direct national-security threat, he has shown a willingness to act unilaterally. This is one of the larger ironies of the Obama presidency: He has relentlessly questioned the efficacy of force, but he has also become the most successful terrorist-hunter in the history of the presidency, one who will hand to his successor a set of tools an accomplished assassin would envy
  • “He applies different standards to direct threats to the U.S.,” Ben Rhodes says. “For instance, despite his misgivings about Syria, he has not had a second thought about drones.” Some critics argue he should have had a few second thoughts about what they see as the overuse of drones. But John Brennan, Obama’s CIA director, told me recently that he and the president “have similar views. One of them is that sometimes you have to take a life to save even more lives. We have a similar view of just-war theory. The president requires near-certainty of no collateral damage. But if he believes it is necessary to act, he doesn’t hesitate.”
  • Those who speak with Obama about jihadist thought say that he possesses a no-illusions understanding of the forces that drive apocalyptic violence among radical Muslims, but he has been careful about articulating that publicly, out of concern that he will exacerbate anti-Muslim xenophobia
  • He has a tragic realist’s understanding of sin, cowardice, and corruption, and a Hobbesian appreciation of how fear shapes human behavior. And yet he consistently, and with apparent sincerity, professes optimism that the world is bending toward justice. He is, in a way, a Hobbesian optimist.
  • The contradictions do not end there. Though he has a reputation for prudence, he has also been eager to question some of the long-standing assumptions undergirding traditional U.S. foreign-policy thinking. To a remarkable degree, he is willing to question why America’s enemies are its enemies, or why some of its friends are its friends.
  • It is assumed, at least among his critics, that Obama sought the Iran deal because he has a vision of a historic American-Persian rapprochement. But his desire for the nuclear agreement was born of pessimism as much as it was of optimism. “The Iran deal was never primarily about trying to open a new era of relations between the U.S. and Iran,” Susan Rice told me. “It was far more pragmatic and minimalist. The aim was very simply to make a dangerous country substantially less dangerous. No one had any expectation that Iran would be a more benign actor.”
  • once mentioned to Obama a scene from The Godfather: Part III, in which Michael Corleone complains angrily about his failure to escape the grasp of organized crime. I told Obama that the Middle East is to his presidency what the Mob is to Corleone, and I started to quote the Al Pacino line: “Just when I thought I was out—” “It pulls you back in,” Obama said, completing the thought.
  • When I asked Obama recently what he had hoped to accomplish with his Cairo reset speech, he said that he had been trying—unsuccessfully, he acknowledged—to persuade Muslims to more closely examine the roots of their unhappiness.“My argument was this: Let’s all stop pretending that the cause of the Middle East’s problems is Israel,” he told me. “We want to work to help achieve statehood and dignity for the Palestinians, but I was hoping that my speech could trigger a discussion, could create space for Muslims to address the real problems they are confronting—problems of governance, and the fact that some currents of Islam have not gone through a reformation that would help people adapt their religious doctrines to modernity. My thought was, I would communicate that the U.S. is not standing in the way of this progress, that we would help, in whatever way possible, to advance the goals of a practical, successful Arab agenda that provided a better life for ordinary people.”
  • But over the next three years, as the Arab Spring gave up its early promise, and brutality and dysfunction overwhelmed the Middle East, the president grew disillusioned. Some of his deepest disappointments concern Middle Eastern leaders themselves. Benjamin Netanyahu is in his own category: Obama has long believed that Netanyahu could bring about a two-state solution that would protect Israel’s status as a Jewish-majority democracy, but is too fearful and politically paralyzed to do so
  • Obama has also not had much patience for Netanyahu and other Middle Eastern leaders who question his understanding of the region. In one of Netanyahu’s meetings with the president, the Israeli prime minister launched into something of a lecture about the dangers of the brutal region in which he lives, and Obama felt that Netanyahu was behaving in a condescending fashion, and was also avoiding the subject at hand: peace negotiations. Finally, the president interrupted the prime minister: “Bibi, you have to understand something,” he said. “I’m the African American son of a single mother, and I live here, in this house. I live in the White House. I managed to get elected president of the United States. You think I don’t understand what you’re talking about, but I do.”
  • Other leaders also frustrate him immensely. Early on, Obama saw Recep Tayyip Erdoğan, the president of Turkey, as the sort of moderate Muslim leader who would bridge the divide between East and West—but Obama now considers him a failure and an authoritarian, one who refuses to use his enormous army to bring stability to Syria
  • In recent days, the president has taken to joking privately, “All I need in the Middle East is a few smart autocrats.” Obama has always had a fondness for pragmatic, emotionally contained technocrats, telling aides, “If only everyone could be like the Scandinavians, this would all be easy.”
  • The unraveling of the Arab Spring darkened the president’s view of what the U.S. could achieve in the Middle East, and made him realize how much the chaos there was distracting from other priorities. “The president recognized during the course of the Arab Spring that the Middle East was consuming us,”
  • But what sealed Obama’s fatalistic view was the failure of his administration’s intervention in Libya, in 2011
  • Obama says today of the intervention, “It didn’t work.” The U.S., he believes, planned the Libya operation carefully—and yet the country is still a disaster.
  • “So we actually executed this plan as well as I could have expected: We got a UN mandate, we built a coalition, it cost us $1 billion—which, when it comes to military operations, is very cheap. We averted large-scale civilian casualties, we prevented what almost surely would have been a prolonged and bloody civil conflict. And despite all that, Libya is a mess.”
  • Mess is the president’s diplomatic term; privately, he calls Libya a “shit show,” in part because it’s subsequently become an ISIS haven—one that he has already targeted with air strikes. It became a shit show, Obama believes, for reasons that had less to do with American incompetence than with the passivity of America’s allies and with the obdurate power of tribalism.
  • Of France, he said, “Sarkozy wanted to trumpet the flights he was taking in the air campaign, despite the fact that we had wiped out all the air defenses and essentially set up the entire infrastructure” for the intervention. This sort of bragging was fine, Obama said, because it allowed the U.S. to “purchase France’s involvement in a way that made it less expensive for us and less risky for us.” In other words, giving France extra credit in exchange for less risk and cost to the United States was a useful trade-off—except that “from the perspective of a lot of the folks in the foreign-policy establishment, well, that was terrible. If we’re going to do something, obviously we’ve got to be up front, and nobody else is sharing in the spotlight.”
  • Obama also blamed internal Libyan dynamics. “The degree of tribal division in Libya was greater than our analysts had expected. And our ability to have any kind of structure there that we could interact with and start training and start providing resources broke down very quickly.”
  • Libya proved to him that the Middle East was best avoided. “There is no way we should commit to governing the Middle East and North Africa,” he recently told a former colleague from the Senate. “That would be a basic, fundamental mistake.”
  • Obama did not come into office preoccupied by the Middle East. He is the first child of the Pacific to become president—born in Hawaii, raised there and, for four years, in Indonesia—and he is fixated on turning America’s attention to Asia
  • For Obama, Asia represents the future. Africa and Latin America, in his view, deserve far more U.S. attention than they receive. Europe, about which he is unromantic, is a source of global stability that requires, to his occasional annoyance, American hand-holding. And the Middle East is a region to be avoided—one that, thanks to America’s energy revolution, will soon be of negligible relevance to the U.S. economy.
  • Advisers recall that Obama would cite a pivotal moment in The Dark Knight, the 2008 Batman movie, to help explain not only how he understood the role of ISIS, but how he understood the larger ecosystem in which it grew. “There’s a scene in the beginning in which the gang leaders of Gotham are meeting,” the president would say. “These are men who had the city divided up. They were thugs, but there was a kind of order. Everyone had his turf. And then the Joker comes in and lights the whole city on fire. ISIL is the Joker. It has the capacity to set the whole region on fire. That’s why we have to fight it.”
  • The rise of the Islamic State deepened Obama’s conviction that the Middle East could not be fixed—not on his watch, and not for a generation to come.
  • The traveling White House press corps was unrelenting: “Isn’t it time for your strategy to change?” one reporter asked. This was followed by “Could I ask you to address your critics who say that your reluctance to enter another Middle East war, and your preference of diplomacy over using the military, makes the United States weaker and emboldens our enemies?” And then came this imperishable question, from a CNN reporter: “If you’ll forgive the language—why can’t we take out these bastards?” Which was followed by “Do you think you really understand this enemy well enough to defeat them and to protect the homeland?”
  • This rhetoric appeared to frustrate Obama immensely. “When I hear folks say that, well, maybe we should just admit the Christians but not the Muslims; when I hear political leaders suggesting that there would be a religious test for which person who’s fleeing from a war-torn country is admitted,” Obama told the assembled reporters, “that’s not American. That’s not who we are. We don’t have religious tests to our compassion.”
  • he has never believed that terrorism poses a threat to America commensurate with the fear it generates. Even during the period in 2014 when ISIS was executing its American captives in Syria, his emotions were in check. Valerie Jarrett, Obama’s closest adviser, told him people were worried that the group would soon take its beheading campaign to the U.S. “They’re not coming here to chop our heads off,” he reassured her.
  • Obama frequently reminds his staff that terrorism takes far fewer lives in America than handguns, car accidents, and falls in bathtubs do
  • Several years ago, he expressed to me his admiration for Israelis’ “resilience” in the face of constant terrorism, and it is clear that he would like to see resilience replace panic in American society. Nevertheless, his advisers are fighting a constant rearguard action to keep Obama from placing terrorism in what he considers its “proper” perspective, out of concern that he will seem insensitive to the fears of the American people.
  • When I noted to Kerry that the president’s rhetoric doesn’t match his, he said, “President Obama sees all of this, but he doesn’t gin it up into this kind of—he thinks we are on track. He has escalated his efforts. But he’s not trying to create hysteria … I think the president is always inclined to try to keep things on an appropriate equilibrium. I respect that.”
  • Obama modulates his discussion of terrorism for several reasons: He is, by nature, Spockian. And he believes that a misplaced word, or a frightened look, or an ill-considered hyperbolic claim, could tip the country into panic. The sort of panic he worries about most is the type that would manifest itself in anti-Muslim xenophobia or in a challenge to American openness and to the constitutional order.
  • The president also gets frustrated that terrorism keeps swamping his larger agenda, particularly as it relates to rebalancing America’s global priorities. For years, the “pivot to Asia” has been a paramount priority of his. America’s economic future lies in Asia, he believes, and the challenge posed by China’s rise requires constant attention. From his earliest days in office, Obama has been focused on rebuilding the sometimes-threadbare ties between the U.S. and its Asian treaty partners, and he is perpetually on the hunt for opportunities to draw other Asian nations into the U.S. orbit. His dramatic opening to Burma was one such opportunity; Vietnam and the entire constellation of Southeast Asian countries fearful of Chinese domination presented others.
  • Obama believes, Carter said, that Asia “is the part of the world of greatest consequence to the American future, and that no president can take his eye off of this.” He added, “He consistently asks, even in the midst of everything else that’s going on, ‘Where are we in the Asia-Pacific rebalance? Where are we in terms of resources?’ He’s been extremely consistent about that, even in times of Middle East tension.”
  • “Right now, I don’t think that anybody can be feeling good about the situation in the Middle East,” he said. “You have countries that are failing to provide prosperity and opportunity for their people. You’ve got a violent, extremist ideology, or ideologies, that are turbocharged through social media. You’ve got countries that have very few civic traditions, so that as autocratic regimes start fraying, the only organizing principles are sectarian.”
  • He went on, “Contrast that with Southeast Asia, which still has huge problems—enormous poverty, corruption—but is filled with striving, ambitious, energetic people who are every single day scratching and clawing to build businesses and get education and find jobs and build infrastructure. The contrast is pretty stark.”
  • In Asia, as well as in Latin America and Africa, Obama says, he sees young people yearning for self-improvement, modernity, education, and material wealth. “They are not thinking about how to kill Americans,” he says. “What they’re thinking about is How do I get a better education? How do I create something of value?”
  • He then made an observation that I came to realize was representative of his bleakest, most visceral understanding of the Middle East today—not the sort of understanding that a White House still oriented around themes of hope and change might choose to advertise. “If we’re not talking to them,” he said, referring to young Asians and Africans and Latin Americans, “because the only thing we’re doing is figuring out how to destroy or cordon off or control the malicious, nihilistic, violent parts of humanity, then we’re missing the boat.
  • He does resist refracting radical Islam through the “clash of civilizations” prism popularized by the late political scientist Samuel Huntington. But this is because, he and his advisers argue, he does not want to enlarge the ranks of the enemy. “The goal is not to force a Huntington template onto this conflict,” said John Brennan, the CIA director.
  • “It is very clear what I mean,” he told me, “which is that there is a violent, radical, fanatical, nihilistic interpretation of Islam by a faction—a tiny faction—within the Muslim community that is our enemy, and that has to be defeated.”
  • “There is also the need for Islam as a whole to challenge that interpretation of Islam, to isolate it, and to undergo a vigorous discussion within their community about how Islam works as part of a peaceful, modern society,” he said. But he added, “I do not persuade peaceful, tolerant Muslims to engage in that debate if I’m not sensitive to their concern that they are being tagged with a broad brush.”
  • In private encounters with other world leaders, Obama has argued that there will be no comprehensive solution to Islamist terrorism until Islam reconciles itself to modernity and undergoes some of the reforms that have changed Christianity.
  • In a conversation with Malcolm Turnbull, the prime minister of Australia, Obama described how he has watched Indonesia gradually move from a relaxed, syncretistic Islam to a more fundamentalist, unforgiving interpretation; large numbers of Indonesian women, he observed, have now adopted the hijab, the Muslim head covering.
  • Why, Turnbull asked, was this happening? Because, Obama answered, the Saudis and other Gulf Arabs have funneled money, and large numbers of imams and teachers, into the country. In the 1990s, the Saudis heavily funded Wahhabist madrassas, seminaries that teach the fundamentalist version of Islam favored by the Saudi ruling family, Obama told Turnbull. Today, Islam in Indonesia is much more Arab in orientation than it was when he lived there, he said.
  • “Aren’t the Saudis your friends?,” Turnbull asked. Obama smiled. “It’s complicated,” he said.
  • But he went on to say that the Saudis need to “share” the Middle East with their Iranian foes. “The competition between the Saudis and the Iranians—which has helped to feed proxy wars and chaos in Syria and Iraq and Yemen—requires us to say to our friends as well as to the Iranians that they need to find an effective way to share the neighborhood and institute some sort of cold peace,”
  • “An approach that said to our friends ‘You are right, Iran is the source of all problems, and we will support you in dealing with Iran’ would essentially mean that as these sectarian conflicts continue to rage and our Gulf partners, our traditional friends, do not have the ability to put out the flames on their own or decisively win on their own, and would mean that we have to start coming in and using our military power to settle scores. And that would be in the interest neither of the United States nor of the Middle East.”
  • One of the most destructive forces in the Middle East, Obama believes, is tribalism—a force no president can neutralize. Tribalism, made manifest in the reversion to sect, creed, clan, and village by the desperate citizens of failing states, is the source of much of the Muslim Middle East’s problems, and it is another source of his fatalism. Obama has deep respect for the destructive resilience of tribalism—part of his memoir, Dreams From My Father, concerns the way in which tribalism in post-colonial Kenya helped ruin his father’s life—which goes some distance in explaining why he is so fastidious about avoiding entanglements in tribal conflicts.
  • “It is literally in my DNA to be suspicious of tribalism,” he told me. “I understand the tribal impulse, and acknowledge the power of tribal division. I’ve been navigating tribal divisions my whole life. In the end, it’s the source of a lot of destructive acts.”
  • “Look, I am not of the view that human beings are inherently evil,” he said. “I believe that there’s more good than bad in humanity. And if you look at the trajectory of history, I am optimistic.
  • “I believe that overall, humanity has become less violent, more tolerant, healthier, better fed, more empathetic, more able to manage difference. But it’s hugely uneven. And what has been clear throughout the 20th and 21st centuries is that the progress we make in social order and taming our baser impulses and steadying our fears can be reversed very quickly. Social order starts breaking down if people are under profound stress. Then the default position is tribe—us/them, a hostility toward the unfamiliar or the unknown.”
  • He continued, “Right now, across the globe, you’re seeing places that are undergoing severe stress because of globalization, because of the collision of cultures brought about by the Internet and social media, because of scarcities—some of which will be attributable to climate change over the next several decades—because of population growth. And in those places, the Middle East being Exhibit A, the default position for a lot of folks is to organize tightly in the tribe and to push back or strike out against those who are different.
  • “A group like ISIL is the distillation of every worst impulse along these lines. The notion that we are a small group that defines ourselves primarily by the degree to which we can kill others who are not like us, and attempting to impose a rigid orthodoxy that produces nothing, that celebrates nothing, that really is contrary to every bit of human progress—it indicates the degree to which that kind of mentality can still take root and gain adherents in the 21st century.”
  • “We have to determine the best tools to roll back those kinds of attitudes,” he said. “There are going to be times where either because it’s not a direct threat to us or because we just don’t have the tools in our toolkit to have a huge impact that, tragically, we have to refrain from jumping in with both feet.”
  • I asked Obama whether he would have sent the Marines to Rwanda in 1994 to stop the genocide as it was happening, had he been president at the time. “Given the speed with which the killing took place, and how long it takes to crank up the machinery of the U.S. government, I understand why we did not act fast enough,” he said. “Now, we should learn from that.
  • I actually think that Rwanda is an interesting test case because it’s possible—not guaranteed, but it’s possible—that this was a situation where the quick application of force might have been enough.
  • “Ironically, it’s probably easier to make an argument that a relatively small force inserted quickly with international support would have resulted in averting genocide [more successfully in Rwanda] than in Syria right now, where the degree to which the various groups are armed and hardened fighters and are supported by a whole host of external actors with a lot of resources requires a much larger commitment of forces.”
  • The Turkey press conference, I told him, “was a moment for you as a politician to say, ‘Yeah, I hate the bastards too, and by the way, I am taking out the bastards.’ ” The easy thing to do would have been to reassure Americans in visceral terms that he will kill the people who want to kill them. Does he fear a knee-jerk reaction in the direction of another Middle East invasion? Or is he just inalterably Spockian?
  • “Every president has strengths and weaknesses,” he answered. “And there is no doubt that there are times where I have not been attentive enough to feelings and emotions and politics in communicating what we’re doing and how we’re doing it.”
  • But for America to be successful in leading the world, he continued, “I believe that we have to avoid being simplistic. I think we have to build resilience and make sure that our political debates are grounded in reality. It’s not that I don’t appreciate the value of theater in political communications; it’s that the habits we—the media, politicians—have gotten into, and how we talk about these issues, are so detached so often from what we need to be doing that for me to satisfy the cable news hype-fest would lead to us making worse and worse decisions over time.”
  • “During the couple of months in which everybody was sure Ebola was going to destroy the Earth and there was 24/7 coverage of Ebola, if I had fed the panic or in any way strayed from ‘Here are the facts, here’s what needs to be done, here’s how we’re handling it, the likelihood of you getting Ebola is very slim, and here’s what we need to do both domestically and overseas to stamp out this epidemic,’ ” then “maybe people would have said ‘Obama is taking this as seriously as he needs to be.’ ” But feeding the panic by overreacting could have shut down travel to and from three African countries that were already cripplingly poor, in ways that might have destroyed their economies—which would likely have meant, among other things, a recurrence of Ebola. He added, “It would have also meant that we might have wasted a huge amount of resources in our public-health systems that need to be devoted to flu vaccinations and other things that actually kill people” in large numbers in America
  • “I have friends who have kids in Paris right now,” he said. “And you and I and a whole bunch of people who are writing about what happened in Paris have strolled along the same streets where people were gunned down. And it’s right to feel fearful. And it’s important for us not to ever get complacent. There’s a difference between resilience and complacency.” He went on to describe another difference—between making considered decisions and making rash, emotional ones. “What it means, actually, is that you care so much that you want to get it right and you’re not going to indulge in either impetuous or, in some cases, manufactured responses that make good sound bites but don’t produce results. The stakes are too high to play those games.”
  • The other meeting took place two months later, in the Oval Office, between Obama and the general secretary of the Vietnamese Communist Party, Nguyen Phu Trong. This meeting took place only because John Kerry had pushed the White House to violate protocol, since the general secretary was not a head of state. But the goals trumped decorum: Obama wanted to lobby the Vietnamese on the Trans-Pacific Partnership—his negotiators soon extracted a promise from the Vietnamese that they would legalize independent labor unions—and he wanted to deepen cooperation on strategic issues. Administration officials have repeatedly hinted to me that Vietnam may one day soon host a permanent U.S. military presence, to check the ambitions of the country it now fears most, China. The U.S. Navy’s return to Cam Ranh Bay would count as one of the more improbable developments in recent American history. “We just moved the Vietnamese Communist Party to recognize labor rights in a way that we could never do by bullying them or scaring them,” Obama told me, calling this a key victory in his campaign to replace stick-waving with diplomatic persuasion.
  • I noted that the 200 or so young Southeast Asians in the room earlier that day—including citizens of Communist-ruled countries—seemed to love America. “They do,” Obama said. “In Vietnam right now, America polls at 80 percent.”
  • The resurgent popularity of America throughout Southeast Asia means that “we can do really big, important stuff—which, by the way, then has ramifications across the board,” he said, “because when Malaysia joins the anti-ISIL campaign, that helps us leverage resources and credibility in our fight against terrorism. When we have strong relations with Indonesia, that helps us when we are going to Paris and trying to negotiate a climate treaty, where the temptation of a Russia or some of these other countries may be to skew the deal in a way that is unhelpful.
  • Obama then cited America’s increased influence in Latin America—increased, he said, in part by his removal of a region-wide stumbling block when he reestablished ties with Cuba—as proof that his deliberate, nonthreatening, diplomacy-centered approach to foreign relations is working. The ALBA movement, a group of Latin American governments oriented around anti-Americanism, has significantly weakened during his time as president. “When I came into office, at the first Summit of the Americas that I attended, Hugo Chávez”—the late anti-American Venezuelan dictator—“was still the dominant figure in the conversation,” he said. “We made a very strategic decision early on, which was, rather than blow him up as this 10-foot giant adversary, to right-size the problem and say, ‘We don’t like what’s going on in Venezuela, but it’s not a threat to the United States.’
  • Obama said that to achieve this rebalancing, the U.S. had to absorb the diatribes and insults of superannuated Castro manqués. “When I saw Chávez, I shook his hand and he handed me a Marxist critique of the U.S.–Latin America relationship,” Obama recalled. “And I had to sit there and listen to Ortega”—Daniel Ortega, the radical leftist president of Nicaragua—“make an hour-long rant against the United States. But us being there, not taking all that stuff seriously—because it really wasn’t a threat to us”—helped neutralize the region’s anti-Americanism.
  • “The truth is, actually, Putin, in all of our meetings, is scrupulously polite, very frank. Our meetings are very businesslike. He never keeps me waiting two hours like he does a bunch of these other folks.” Obama said that Putin believes his relationship with the U.S. is more important than Americans tend to think. “He’s constantly interested in being seen as our peer and as working with us, because he’s not completely stupid. He understands that Russia’s overall position in the world is significantly diminished. And the fact that he invades Crimea or is trying to prop up Assad doesn’t suddenly make him a player.
  • “The argument is made,” I said, “that Vladimir Putin watched you in Syria and thought, He’s too logical, he’s too rational, he’s too into retrenchment. I’m going to push him a little bit further in Ukraine.”
  • “Look, this theory is so easily disposed of that I’m always puzzled by how people make the argument. I don’t think anybody thought that George W. Bush was overly rational or cautious in his use of military force. And as I recall, because apparently nobody in this town does, Putin went into Georgia on Bush’s watch, right smack dab in the middle of us having over 100,000 troops deployed in Iraq.” Obama was referring to Putin’s 2008 invasion of Georgia, a former Soviet republic, which was undertaken for many of the same reasons Putin later invaded Ukraine—to keep an ex–Soviet republic in Russia’s sphere of influence.
  • “Putin acted in Ukraine in response to a client state that was about to slip out of his grasp. And he improvised in a way to hang on to his control there,” he said. “He’s done the exact same thing in Syria, at enormous cost to the well-being of his own country. And the notion that somehow Russia is in a stronger position now, in Syria or in Ukraine, than they were before they invaded Ukraine or before he had to deploy military forces to Syria is to fundamentally misunderstand the nature of power in foreign affairs or in the world generally. Real power means you can get what you want without having to exert violence. Russia was much more powerful when Ukraine looked like an independent country but was a kleptocracy that he could pull the strings on.”
  • Obama’s theory here is simple: Ukraine is a core Russian interest but not an American one, so Russia will always be able to maintain escalatory dominance there. “The fact is that Ukraine, which is a non-NATO country, is going to be vulnerable to military domination by Russia no matter what we do,” he said.
  • “I think that the best argument you can make on the side of those who are critics of my foreign policy is that the president doesn’t exploit ambiguity enough. He doesn’t maybe react in ways that might cause people to think, Wow, this guy might be a little crazy.” “The ‘crazy Nixon’ approach,” I said: Confuse and frighten your enemies by making them think you’re capable of committing irrational acts.
  • “But let’s examine the Nixon theory,” he said. “So we dropped more ordnance on Cambodia and Laos than on Europe in World War II, and yet, ultimately, Nixon withdrew, Kissinger went to Paris, and all we left behind was chaos, slaughter, and authoritarian governments
  • “There is no evidence in modern American foreign policy that that’s how people respond. People respond based on what their imperatives are, and if it’s really important to somebody, and it’s not that important to us, they know that, and we know that,” he said. “There are ways to deter, but it requires you to be very clear ahead of time about what is worth going to war for and what is not.
  • Now, if there is somebody in this town that would claim that we would consider going to war with Russia over Crimea and eastern Ukraine, they should speak up and be very clear about it. The idea that talking tough or engaging in some military action that is tangential to that particular area is somehow going to influence the decision making of Russia or China is contrary to all the evidence we have seen over the last 50 years.”
  • “If you think about, let’s say, the Iran hostage crisis, there is a narrative that has been promoted today by some of the Republican candidates that the day Reagan was elected, because he looked tough, the Iranians decided, ‘We better turn over these hostages,’ ” he said. “In fact what had happened was that there was a long negotiation with the Iranians and because they so disliked Carter—even though the negotiations had been completed—they held those hostages until the day Reagan got elected
  • When you think of the military actions that Reagan took, you have Grenada—which is hard to argue helped our ability to shape world events, although it was good politics for him back home. You have the Iran-Contra affair, in which we supported right-wing paramilitaries and did nothing to enhance our image in Central America, and it wasn’t successful at all.” He reminded me that Reagan’s great foe, Daniel Ortega, is today the unrepentant president of Nicaragua.
  • Obama also cited Reagan’s decision to almost immediately pull U.S. forces from Lebanon after 241 servicemen were killed in a Hezbollah attack in 1983. “Apparently all these things really helped us gain credibility with the Russians and the Chinese,” because “that’s the narrative that is told,” he said sarcastically.
  • “Now, I actually think that Ronald Reagan had a great success in foreign policy, which was to recognize the opportunity that Gorbachev presented and to engage in extensive diplomacy—which was roundly criticized by some of the same people who now use Ronald Reagan to promote the notion that we should go around bombing people.”
  • “As I survey the next 20 years, climate change worries me profoundly because of the effects that it has on all the other problems that we face,” he said. “If you start seeing more severe drought; more significant famine; more displacement from the Indian subcontinent and coastal regions in Africa and Asia; the continuing problems of scarcity, refugees, poverty, disease—this makes every other problem we’ve got worse. That’s above and beyond just the existential issues of a planet that starts getting into a bad feedback loop.”
  • Terrorism, he said, is also a long-term problem “when combined with the problem of failed states.”
  • What country does he consider the greatest challenge to America in the coming decades? “In terms of traditional great-state relations, I do believe that the relationship between the United States and China is going to be the most critical,” he said. “If we get that right and China continues on a peaceful rise, then we have a partner that is growing in capability and sharing with us the burdens and responsibilities of maintaining an international order. If China fails; if it is not able to maintain a trajectory that satisfies its population and has to resort to nationalism as an organizing principle; if it feels so overwhelmed that it never takes on the responsibilities of a country its size in maintaining the international order; if it views the world only in terms of regional spheres of influence—then not only do we see the potential for conflict with China, but we will find ourselves having more difficulty dealing with these other challenges that are going to come.”
  • “I’ve been very explicit in saying that we have more to fear from a weakened, threatened China than a successful, rising China,” Obama said. “I think we have to be firm where China’s actions are undermining international interests, and if you look at how we’ve operated in the South China Sea, we have been able to mobilize most of Asia to isolate China in ways that have surprised China, frankly, and have very much served our interest in strengthening our alliances.”
  • A weak, flailing Russia constitutes a threat as well, though not quite a top-tier threat. “Unlike China, they have demographic problems, economic structural problems, that would require not only vision but a generation to overcome,” Obama said. “The path that Putin is taking is not going to help them overcome those challenges. But in that environment, the temptation to project military force to show greatness is strong, and that’s what Putin’s inclination is. So I don’t underestimate the dangers there.”
  • “You know, the notion that diplomacy and technocrats and bureaucrats somehow are helping to keep America safe and secure, most people think, Eh, that’s nonsense. But it’s true. And by the way, it’s the element of American power that the rest of the world appreciates unambiguously
  • When we deploy troops, there’s always a sense on the part of other countries that, even where necessary, sovereignty is being violated.”
  • Administration officials have told me that Vice President Biden, too, has become frustrated with Kerry’s demands for action. He has said privately to the secretary of state, “John, remember Vietnam? Remember how that started?” At a National Security Council meeting held at the Pentagon in December, Obama announced that no one except the secretary of defense should bring him proposals for military action. Pentagon officials understood Obama’s announcement to be a brushback pitch directed at Kerry.
  • Obama’s caution on Syria has vexed those in the administration who have seen opportunities, at different moments over the past four years, to tilt the battlefield against Assad. Some thought that Putin’s decision to fight on behalf of Assad would prompt Obama to intensify American efforts to help anti-regime rebels. But Obama, at least as of this writing, would not be moved, in part because he believed that it was not his business to stop Russia from making what he thought was a terrible mistake. “They are overextended. They’re bleeding,” he told me. “And their economy has contracted for three years in a row, drastically.
  • Obama’s strategy was occasionally referred to as the “Tom Sawyer approach.” Obama’s view was that if Putin wanted to expend his regime’s resources by painting the fence in Syria, the U.S. should let him.
  • By late winter, though, when it appeared that Russia was making advances in its campaign to solidify Assad’s rule, the White House began discussing ways to deepen support for the rebels, though the president’s ambivalence about more-extensive engagement remained. In conversations I had with National Security Council officials over the past couple of months, I sensed a foreboding that an event—another San Bernardino–style attack, for instance—would compel the United States to take new and direct action in Syria. For Obama, this would be a nightmare.
  • If there had been no Iraq, no Afghanistan, and no Libya, Obama told me, he might be more apt to take risks in Syria. “A president does not make decisions in a vacuum. He does not have a blank slate. Any president who was thoughtful, I believe, would recognize that after over a decade of war, with obligations that are still to this day requiring great amounts of resources and attention in Afghanistan, with the experience of Iraq, with the strains that it’s placed on our military—any thoughtful president would hesitate about making a renewed commitment in the exact same region of the world with some of the exact same dynamics and the same probability of an unsatisfactory outcome.”
  • What has struck me is that, even as his secretary of state warns about a dire, Syria-fueled European apocalypse, Obama has not recategorized the country’s civil war as a top-tier security threat.
  • This critique frustrates the president. “Nobody remembers bin Laden anymore,” he says. “Nobody talks about me ordering 30,000 more troops into Afghanistan.” The red-line crisis, he said, “is the point of the inverted pyramid upon which all other theories rest.”
  • “Was it a bluff?” I told him that few people now believe he actually would have attacked Iran to keep it from getting a nuclear weapon. “That’s interesting,” he said, noncommittally. I started to talk: “Do you—” He interrupted. “I actually would have,” he said, meaning that he would have struck Iran’s nuclear facilities. “If I saw them break out.”
  • “You were right to believe it,” the president said. And then he made his key point. “This was in the category of an American interest.”
  • I was reminded then of something Derek Chollet, a former National Security Council official, told me: “Obama is a gambler, not a bluffer.”
  • The president has placed some huge bets. Last May, as he was trying to move the Iran nuclear deal through Congress, I told him that the agreement was making me nervous. His response was telling. “Look, 20 years from now, I’m still going to be around, God willing. If Iran has a nuclear weapon, it’s my name on this,” he said. “I think it’s fair to say that in addition to our profound national-security interests, I have a personal interest in locking this down.”
  • In the matter of the Syrian regime and its Iranian and Russian sponsors, Obama has bet, and seems prepared to continue betting, that the price of direct U.S. action would be higher than the price of inaction. And he is sanguine enough to live with the perilous ambiguities of his decisions
  • Though in his Nobel Peace Prize speech in 2009, Obama said, “Inaction tears at our conscience and can lead to more costly intervention later,” today the opinions of humanitarian interventionists do not seem to move him, at least not publicly
  • As he comes to the end of his presidency, Obama believes he has done his country a large favor by keeping it out of the maelstrom—and he believes, I suspect, that historians will one day judge him wise for having done so
  • Inside the West Wing, officials say that Obama, as a president who inherited a financial crisis and two active wars from his predecessor, is keen to leave “a clean barn” to whoever succeeds him. This is why the fight against ISIS, a group he considers to be a direct, though not existential, threat to the U.S., is his most urgent priority for the remainder of his presidency; killing the so-called caliph of the Islamic State, Abu Bakr al-Baghdadi, is one of the top goals of the American national-security apparatus in Obama’s last year.
  • This is what is so controversial about the president’s approach, and what will be controversial for years to come—the standard he has used to define what, exactly, constitutes a direct threat.
  • Obama has come to a number of dovetailing conclusions about the world, and about America’s role in it. The first is that the Middle East is no longer terribly important to American interests. The second is that even if the Middle East were surpassingly important, there would still be little an American president could do to make it a better place. The third is that the innate American desire to fix the sorts of problems that manifest themselves most drastically in the Middle East inevitably leads to warfare, to the deaths of U.S. soldiers, and to the eventual hemorrhaging of U.S. credibility and power. The fourth is that the world cannot afford to see the diminishment of U.S. power. Just as the leaders of several American allies have found Obama’s leadership inadequate to the tasks before him, he himself has found world leadership wanting: global partners who often lack the vision and the will to spend political capital in pursuit of broad, progressive goals, and adversaries who are not, in his mind, as rational as he is. Obama believes that history has sides, and that America’s adversaries—and some of its putative allies—have situated themselves on the wrong one, a place where tribalism, fundamentalism, sectarianism, and militarism still flourish. What they don’t understand is that history is bending in his direction.
  • “The central argument is that by keeping America from immersing itself in the crises of the Middle East, the foreign-policy establishment believes that the president is precipitating our decline,” Ben Rhodes told me. “But the president himself takes the opposite view, which is that overextension in the Middle East will ultimately harm our economy, harm our ability to look for other opportunities and to deal with other challenges, and, most important, endanger the lives of American service members for reasons that are not in the direct American national-security interest.
  • George W. Bush was also a gambler, not a bluffer. He will be remembered harshly for the things he did in the Middle East. Barack Obama is gambling that he will be judged well for the things he didn’t do.
Javier E

Disgust and the Ground Zero Mosque | Big Questions Online - 0 views

  • The Ground Zero mosque controversy is actually a perfect illustration of the difficulty we have in our culture discussing controversial issues, because, if moral psychologist Jonathan Haidt is correct, people on opposite sides of the political spectrum analyze these issues using somewhat different criteria. 
  • Haidt has broken down five moral senses that contribute to moral reasoning: Harm, Fairness, Authority, Loyalty, and Purity. The degree to which we care about  those five areas determines the basic stances we take on morality. Note well, these don't dictate the content of our thinking, only the things we will take into consideration as we reason morally. Haidt has found that everyone factors Harm (e.g., "Whom does this hurt?")  and Fairness into their moral thinking, but only people who generally fall onto the conservative side of the American spectrum also factor in Authority, Loyalty, and Purity. (Interestingly, outside the West, nearly everybody else factors these things in as well, which is why, in a clever phrase, "Americans are WEIRD").
  • As Haidt explains in that Edge lecture and elsewhere, the three factors conservatives also bring into their moral reasoning all have to do with establishing and defending the kinds of morals that promote group cohesion. It should be easy to understand from an evolutionary point of view where these instincts came from. In the West, we have over the past couple of centuries centered our moral thinking around Kantian and Benthamite theories that, generally speaking, measure morality by universal categories -- ways of approaching morality that only concern themselves with Harm and Fairness, and exclude the other three. This, Haidt says, is how the people in our society who call themselves liberals (Haidt is one of them) see moral reasoning; they do not grasp that quite a few of their fellow Americans draw on other sources -- or if they do recognize this, they dismiss these sources as illegitimate. Unsurprisingly, conservatives do not accept that we should not care about Authority, Loyalty, and/or Purity (which is not simply about sexual matters, but about the degree to which one believes that some things are "sacred," and therefore not subject to justification through reason).
  • The fact that critics aren't bothered by the idea of Cordoba House existing some distance away from Ground Zero tells you a lot about the Sacred/Profane nature of the opposition. When you have to tell people who see something as sacred that they really have no rational grounds for doing so, you have lost the argument for hearts and minds, even though you may win the argument in court, or in a formal debate.
  • for the (liberal atheist) Harris, as for many conservatives, Ground Zero is a sacred spot. The idea of an Islamic cultural center linked to the patch of ground where thousands were murdered in the name of Islam is offensive on its face, because it profanes the sacred.
  • Cordoba House is explicitly founded as a response to 9/11, and is being sited close to Ground Zero because of what happened there. That mosque defenders don't understand why this upsets many people beyond their ability to articulate shows an incredible tone-deafness to how the world actually works.
  • Cordoba House is a powerful symbol of Who We Are. It defines us as a people. For some, it's important that Cordoba House exist at Ground Zero because it will stand for America as a cosmopolitan, tolerant nation. For others, it's important that Cordoba House not exist at Ground Zero because if it does, it will symbolize a nation that is so eager to affirm tolerance and multiculturalism that we profane the memory of Islam's victims, and break faith with the dead. Cordoba House's power as a cultural symbol, and a symbol of what the American tribe stands for, could hardly be more stark. That many political and cultural elites (academics, journalists, etc.) fail to appreciate its power in this regard -- and to appreciate something is not the same thing as agreeing with it -- is a dramatic failure of imagination.
  • The word "religion" is critical there. Not only are progressivists, re: the mosque, refusing to take as seriously as they ought religion as a system of ideas that actually dictate how people live in this world (something that a stern atheist like Sam Harris actually does, to his credit), but they're also dismissing, or devaluing, a sense of the sacred (as distinct from particular religions) as a source of meaning in the everyday lives of people. From the point of view of many conservatives, the Cordoba House controversy is yet again an example of the cultural elite (a word I use in the descriptive sociological sense, not in the partisan sense) displaying a contempt for their values.
  • I believe that the Manhattan Of the Mind people are going to win the Cordoba House battle, because they believe rights are more important than the common good, and there is no legal way to stop the construction of the mosque (nor, let me add, should there be). But I believe the victory will be entirely Pyrrhic, in more or less the same way it would be for a husband to defeat his wife on logic in an argument, but to leave her so alienated that he undermines the strength of their family's common life.
  • Applies Haidt's theory about the five moral senses underlying all moral reasoning to the Cordoba House controversy, and to the liberal-conservative divide.
  • Explains much about this controversy!
Javier E

When bias beats logic: why the US can't have a reasoned gun debate | US news | The Guar... - 0 views

  • Jon Stokes, a writer and software developer, said he is frustrated after each mass shooting by “the sentiment among very smart people, who are used to detail and nuance and doing a lot of research, that this is cut and dried, this is black and white”.
  • Stokes has lived on both sides of America’s gun culture war, growing up in rural Louisiana, where he got his first gun at age nine, and later studying at Harvard and the University of Chicago, where he adopted some of a big-city resident’s skepticism about guns. He’s written articles about the gun geek culture behind the popularity of the AR-15, why he owns a military-style rifle, and why gun owners are so skeptical of tech-enhanced “smart guns”.
  • Even to suggest that the debate is more complicated – that learning something about guns, by taking a course on how to safely carry a concealed weapon, or learning how to fire a gun, might shift their perspective on whichever solution they have just heard about on TV – “just upsets them, and they basically say you’re trying to obscure the issue”.
  • In early 2013, a few months after the mass shooting at Sandy Hook elementary school, a Yale psychologist created an experiment to test how political bias affects our reasoning skills. Dan Kahan was attempting to understand why public debates over social problems remain deadlocked, even when good scientific evidence is available. He decided to test a question about gun control.
  • Then Kahan ran the same test again. This time, instead of evaluating skin cream trials, participants were asked to evaluate whether a law banning citizens from carrying concealed firearms in public made crime go up or down. The result: when liberals and conservatives were confronted with a set of results that contradicted their political assumptions, the smartest people were barely more likely to arrive at the correct answer than the people with no math skills at all. Political bias had erased the advantages of stronger reasoning skills.
  • The reason that measurable facts were sidelined in political debates was not that people have poor reasoning skills, Kahan concluded. Presented with a conflict between holding to their beliefs or finding the correct answer to a problem, people simply went with their tribe.
  • It was a reasonable strategy on the individual level – and a “disastrous” one for tackling social change, he concluded. (A worked sketch of the kind of table-reading task Kahan used appears after this list.)
  • But the biggest distortion in the gun control debate is the dramatic empathy gap between different kinds of victims. It’s striking how puritanical the American imagination is, how narrow its range of sympathy. Mass shootings, in which the perpetrator kills complete strangers at random in a public place, prompt an outpouring of grief for the innocent lives lost. These shootings are undoubtedly horrifying, but they account for a tiny percentage of America’s overall gun deaths each year.
  • The roughly 60 gun suicides each day, the 19 black men and boys lost each day to homicide, do not inspire the same reaction, even though they represent the majority of gun violence victims. Yet there are meaningful measures which could save lives here – targeted interventions by frontline workers in neighborhoods where the gun homicide rate is 400 times higher than other developed countries, awareness campaigns to help gun owners in rural states learn about how to identify suicide risk and intervene with friends in trouble.
  • When it comes to suicide, “there is so much shame about that conversation … and where there is shame there is also denial,”
  • When young men of color are killed, “you have disdain and aggression,” fueled by the type of white supremacist argument which equates blackness with criminality.
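The Guardian excerpt above describes Kahan's contingency-table task only in outline. As a rough sketch of the reasoning trap it turns on (the numbers and labels below are hypothetical stand-ins, not Kahan's published data), the correct read requires comparing outcome rates within each group rather than raw counts, which is exactly the step that politically motivated readers tend to skip:

```python
# Hypothetical 2x2 outcome table in the style of Kahan's 2013 task.
# The counts are illustrative only; they are not Kahan's actual data.
# The trap: the "ban" group shows more raw improvements but a lower improvement rate.

ban    = {"crime_down": 223, "crime_up": 75}   # hypothetical cities that banned concealed carry
no_ban = {"crime_down": 107, "crime_up": 21}   # hypothetical cities that did not

def improvement_rate(group):
    """Share of cities in the group where crime went down."""
    return group["crime_down"] / (group["crime_down"] + group["crime_up"])

rate_ban = improvement_rate(ban)
rate_no_ban = improvement_rate(no_ban)

# Intuitive but wrong: compare raw counts (223 > 107, so "the ban worked").
# Correct: compare rates within each group.
print(f"improvement rate with ban:    {rate_ban:.2f}")     # ~0.75
print(f"improvement rate without ban: {rate_no_ban:.2f}")  # ~0.84
print("ban associated with less crime" if rate_ban > rate_no_ban
      else "ban associated with more crime")
```

In Kahan's design, the same table was presented either as a skin-cream trial or as a gun-control law; whether high-numeracy participants read it correctly depended on whether the correct answer fit their politics, which is the pattern the excerpt above summarizes.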
Javier E

Steven Pinker Thinks the Future Is Looking Bright - The New York Times - 0 views

  • What’s behind all this good news? The most overarching explanation would be that the Enlightenment worked. The idea that if we — we being humanity — set ourselves the goal of improving well-being, if we try to figure out how the world works using reason and science, every once in a while we can succeed.
  • You have argued that there is such a thing as human nature. Do you think we can transcend it? Part of human nature allows us to control the other part of our human nature. Even though humans tend to be unreasonable, it can’t be the case that we’re incapable of reason — otherwise, you’d never be able to make the argument that we’re being unreasonable. Even if we tend to backslide to irrationality, that doesn’t mean we should indulge that when we are deliberating how to run a society.
  • I was surprised by how much interest there’s been from centrist politicians, who are desperate for a coherent narrative to defend centrist liberalism, cosmopolitanism, open society, from the threats both by populists and by the hard left. I think there is a hunger for a coherent worldview that isn’t just the status quo, the un-Trumpism. We can do better than that. We ought to use reason and science to enhance human well-being.
  • The value of science is not the value of a bunch of people who call themselves scientists. It’s the concept. It’s also the value of science that tells us when there’s been a failure of reasoning, that identifies the biases and distortions and also points the way to overcome them.
  • So, we need institutions like government to keep us acting rationally? None of us is anywhere close to perfect. Scientists themselves are not terribly, not completely rational. We can set up institutions that result in greater rationality than any of us is capable of individually, like peer review, like free speech, like a free press, like empirical testing — norms and institutions that make us collectively more rational than any of us is individually.
  • Why do you think people continue to hold on to demonstrably unscientific beliefs? It looks like the biggest reason is not because they don’t know the science, but because of their political ideology. The reason that people deny human-made climate change is not that they’re ignorant of climate science, but because they’re on the political right. Conversely, people who accept human-made climate change don’t necessarily understand what’s causing it
  • My own view of the world was radically altered when I looked at data instead of headlines. If history is about all the wars, all the disasters, you’re missing all this incremental improvement that can only be ascertained through data
Javier E

The Republican Party's Motivated Reasoning - The Bulwark - 0 views

  • sometimes, people will trust someone simply because the messenger is saying what they want to hear. Psychologists call this “motivated reasoning.”
  • Keep all this in mind as you consider Ed McBroom, a Republican state senator in Michigan who recently came to national attention thanks to the rantings of former President Trump and a riveting profile written by the Atlantic’s Tim Alberta. McBroom chairs the Michigan senate’s oversight committee, a position that empowered him to investigate allegations of voter fraud during the 2020 general election
  • here is the background and record on McBroom:
  • He entered politics to advocate for traditional or socially conservative beliefs. “He glowed with certain passions—outlawing abortion, preserving family values, fighting bureaucrats on behalf of the little guy—that could not be championed in the stables,” Alberta writes. McBroom stated his position on gun ownership in 2012: “The Second Amendment guarantees our rights to own firearms[,] and I stand strongly for that correct interpretation.”
  • The American Conservative Union gave McBroom the best marks of any Michigan state senator—voting in line with the organization’s position 95 percent of the time—in 2019, the most recent year of data. By the old rules of political communication, no one is more qualified to be a “credible messenger” to the right-of-center voters of the U.P. than Ed McBroom.
  • last month, McBroom and three of his senate colleagues—two of them Republicans, only one a Democrat—released their report, and it “crackled with annoyance at certain far-flung beliefs,” writes Alberta:
  • His committee interviewed scores of witnesses, subpoenaed and reviewed thousands of pages of documents, dissected the procedural mechanics of Michigan’s highly decentralized elections system, and scrutinized the most trafficked claims about corruption at the state’s ballot box in November. McBroom’s conclusion hit Lansing like a meteor: It was all a bunch of nonsense. “Our clear finding is that citizens should be confident the results represent the true results of the ballots cast by the people of Michigan,” McBroom wrote in the report. “There is no evidence presented at this time to prove either significant acts of fraud or that an organized, wide-scale effort to commit fraudulent activity was perpetrated in order to subvert the will of Michigan voters.” For good measure, McBroom added: “The Committee strongly recommends citizens use a critical eye and ear toward those who have pushed demonstrably false theories for their own personal gain.”
  • “McBroom said he is not fazed by the criticism or the prospect of a primary challenge, which he was already expecting,” notes the Michigan Bridge. “I’ve been totally honest and up front, and if (voters) judge that’s not what they want, and if the majority of them want a different course of action, that’s okay,” he told that publication.
  • Yet despite three of the four senators who wrote the election report being Republicans; and despite McBroom’s ideological reputation, the product of a decade in Michigan’s state legislature (he was a state rep from 2010 to 2018), and the familiarity of the McBroom family name, and McBroom’s culturally Christian values—despite all that, his political standing is still taking a hit.
  • Trump trashed McBroom and the state senate president, Republican Mike Shirkey. He published their office phone numbers. He urged people to “vote them the hell out of office.”
  • more to the point about credible messengers is this: McBroom said that he’s felt heat from people he knows—allies of his—not just randos on social media. “It’s been very discouraging, and very sad, to have people I know who have supported me, and always said they respected me and found me to be honest, who suddenly don’t trust me because of what some guy told them on the internet,” he told Alberta.
  • So thorough were the authors’ conclusions that they recommended “the [state] attorney general consider investigating those who have been utilizing misleading and false information about Antrim County,” where an obvious and brief reporting error showed Biden thumping Trump, “to raise money or publicity for their own ends.”
  • What McBroom clarified is this:
  • Trump’s base is the animating force of the Republican party, which holds GOP officials accountable mostly for their accountability to Trump. To this group, there is no such thing as a credible critic of the former president.
  • the unanswered question that confronts coalition-builders today is how to reach a movement for which all reasoning is motivated reasoning; for which facts and proof are subjective
Javier E

Opinion | Transcript: Ezra Klein Interviews Brandon Terry - The New York Times - 0 views

  • BRANDON TERRY: Well, there’s this puzzle when we think about somebody like Martin Luther King Jr. And it’s that on the one hand, we have a national holiday devoted to him, an imposing monument on the hallowed space of the National Mall; he’s invoked in all manner of political speeches from across the political spectrum, probably the most famous African American of the 20th century.
  • But at the same time, if you ask even really well-educated people, they often don’t know that he’d written five major books, that he’s a systematic theologian with sustained interest in political philosophy who’s written lots and lots of things, incisive things, on some of the most pressing political and ethical matters.
  • King wants to say something different, I think. He wants to say that we are both of these things. We are a society with what he called the congenital deformity of racism — that it’s shot through many of our deepest institutions and structural arrangements, and because it has not been redressed on the scale that it would have to be to achieve true justice, it festers. It’s a rot. It’s a challenge that every generation is called on to pick up and try to do better than their forebears.
  • I’ve described it as a romantic narrative, one that’s about unities in the process of becoming, a calling together of Americans to transcend racial division and come together in a unifying way, a more perfect union, as a transcendence of essential American goodness over transitory American evils.
  • when we tell the story that way, unfortunately, not only is it mythic, but it trains us to treat King as the kind of person who’s not doing any original political thinking. What he’s doing is calling us to be true to who we always already were
  • And when you treat him like that, the thing that becomes most interesting about him is not his thought. It’s not the way he challenged us to think about violence. It’s not the way he challenges us to think about segregation, both de facto and de jure. It’s not how he challenges us to think about economic justice.
  • The thing that’s interesting about him starts to be his rhetoric or his tactics, the way in which he pushes people or frames arguments to call us to be true to who we always already were. That’s a real problem because it evades the most incisive, challenging and generative contributions that his public philosophy makes for our era.
  • it gets conscripted into a story that’s ultimately affirming about the adequacy of our constitutional order, the trajectory of our institutions, the essential goodness of our national character. You often hear politicians use this rhetoric of, this is not who we are.
  • it’s partly related to how we tell the story of the civil rights movement and particularly, how we tell King’s role in the civil rights movement.
  • There’s a way in which the philosophy of nonviolence gets painted, even in King’s time, as a kind of extreme, purist pacifism. And part of that is the connection with Gandhi, although I think it’s a radical misunderstanding of Gandhi, as well.
  • it’s a way of imagining the commitment to nonviolence as related to passivity, as related to the performance of suffering for pity. These are things that King never endures. For him, the idea of passive resistance was a misnomer. He helped coin the phrase “direct action” — he and other members of the civil-rights generation — that nonviolence is aggressive.
  • It’s an aggressive attack on injustice, an aggressive form of noncooperation with domination. It’s about trying to wedge yourself into the machinery of domination, to prevent its adequate functioning, to try to force or coerce your fellow citizens to stop and take stock of what kind of injustices are being unfurled in their name.
  • And it does so on the presumption that politics involves coercion, especially for King, who had a pretty tragic sense of human nature, that politics is going to involve confrontation with great evil, that it’s not a Pollyannaish view about what we’re all capable of if we just turn our eye toward God in the right way.
  • We owe it to them to live with evil. And we always are going to be called to confront it. We just need to do it in ways that won’t unleash a further chain of social evil and bitterness and revenge and retaliation. And King thought nonviolence was the only weapon that could cut and heal at the same time.
  • So when you hear King talk about love, when you hear King talk about nonviolence, these things actually require not just an enormous discipline around the acceptance of suffering, as if it’s some kind of passive practice, but they require really creative, dedicated thinking around how exactly to push and prod your neighbors into addressing the forms of injustice that structure the polity and how to do it in a way that doesn’t leave a perpetual midnight of bitterness when the conflict is done.
  • He says that the really interesting question, however, is how to organize a sustained, successful challenge to structural injustice. And for King, that requires something that blends militant resistance and a higher-order ethical practice that can point the way toward peaceful reconciliation over the long term.
  • Gandhi has this line where he says, if you can’t practice nonviolence, and I quote, “retaliation or resistance unto death is the second best, though a long way off from the first. Cowardice is impotence, worse than violence.” So this idea that if you can’t be nonviolent, it’s better to be violent than to be a coward, doing nothing — I think gets at something important. Can you help unpack that?
  • to raise the question of strategy, as if we can evaluate means without some kind of ethical reflection or without some kind of underlying ethical commitments, for King, is already a confusion. He thinks that the ends are prefigured in any means.
  • Gandhi, in “Hind Swaraj,” has this great passage where he talks about how you could come to acquire a piece of property. You could buy it. You could steal it. You could kill somebody in pursuit of it. You could ask for it as a gift. At the end of the day, you still have the same property. But the thing, itself, has changed. In one scenario, it’s a piece of stolen property. It’s a theft. In another, it’s a gift, which is different than something you’ve purchased.
  • So in the course of acquiring the thing, even though the thing is the same, the means have transformed it in a really, really important way. And King wants to say something similar — that in all political practice, the ends are prefigured in the means
  • nonviolence has to be — if it’s going to be true nonviolence for King — informed by a philosophy of love that really wants and desires and wills goodwill for the enemy at present and is committed, at the fundamental level, to going on together in peace, going on together, sharing the polity in perpetuity.
  • I think for King, imperative to nonviolent resistance turns, in large part, on the question of your own dignity and self-respect. So it is a justice question. He’s concerned with structural justice as a matter of the kinds of arrangements that prevail in the larger American society. That’s obviously true.
  • So there’s the person or group you’re in conversation or conflict with. I’m a liberal, and I’m arguing with a conservative. And I think that’s the most common target to think about: How do I beat or convince this person or group on the other side?
  • Then there’s the broader community polity — the voters, of the country, people who are bystanders, maybe interested, maybe not, but a broader community that is in some way watching or can be brought in to watch. And then there’s you, the person taking the action, and how it affects you and your group to take a particular action.
  • something that seems present in King’s thought is much, much, much, much more concern and focus than I think most political thinkers have today on how political action affects you, the person taking it, and affects the broader community that might be watching it
  • — that ends up with you being turned away from the good and toward things like hatred, resentment, violence, which he thinks, ultimately, will corrode your soul and take you further away from flourishing.
  • But he’s also concerned with how you relate to your own sense of equality, equal standing, worth, as he would say, somebodiness, we might say dignity — he also says that a lot — and that for King, to acquiesce in the face of oppression and domination, without protest, is to abdicate your own self-respect and dignity.
  • for him, dignity also required a certain kind of excellence of character, a certain kind of comportment and practice toward others.
  • So it is about trying to defend your dignity, defend yourself respect against insult and humiliation, oppression. But it’s also about doing so in a way that doesn’t degrade your character in the long term, that doesn’t cause you to end up being turned away from the good, which, again, for him, is going to be a religiously-inflected category
  • When you think about somebody’s political philosophy or their theory of political action, you can maybe think of there being a couple agents they’re thinking about.
  • It has fallen out of favor to say that there are certain ways of acting, politically, that are better and worse, from a virtue perspective, because it often is seen not as really a question of you and your relationship to some baseline or ideal but is some kind of concession you’re making to people who don’t deserve
  • I am a person who believes those questions are still legitimate, that they can’t all be reduced to strategy or will to power or psychic drives. I think that there’s something like an ethical life that requires us to argue about it and requires us to think really hard about how we discipline ourselves to achieve it.
  • Evelyn Brooks Higginbotham, wrote a phenomenal book, many years ago, called “Righteous Discontent.” And that’s what introduces the phrase, “the politics of respectability.” It’s a study of turn of the century Black Baptist women and their organizing efforts through the church.
  • It’s this idea that, in confronting a system of social stigma, the response that you need to have to it is to try to adjust your behavior, comportment, your self fashioning, in line with the dominant norms so that you can, over time, undermine the stigma and become a full participant in society.
  • there are all sorts of questions, legitimate questions, that are raised against that. Are we losing something valuable about alternative forms of life, about alternative cultural practices, when we take the existing, dominant norms as unassailable or something to aspire to?
  • what’s really fascinating is that he talks a lot about how he sympathizes with all those criticisms. He agrees with them
  • here’s the other part of Evelyn Higginbotham’s formulation — there’s a deeper question, one with thousands of years of moral reflection built up into it, which is about virtue ethics — that there are some things that people are appealing to you about that aren’t about their effect in the polity that aren’t about trying to manipulate white, racial attitudes. They’re about your own flourishing and character. They’re deep questions about how to live a good life, how to achieve excellence and the crafting of your soul.
  • as King would say, our reason sometimes can become subordinate to our passions. It can just be a legitimizing power or rationalizing power to the point where we lose track of what we really want to achieve, the kind of character we really want to have.
  • And for King, many of the appeals he made in that vocabulary are really about that. They’re really about virtue. They’re really about what hatred does to your life, what anger does to your life, what violence does to your life
  • there is a question for him, at the core of his life, which is, what makes this worth doing? That’s a virtue question. It’s not just a strategic or tactical one, in the narrow sense.
  • he describes nonviolence, I think really importantly, as also being about a nonviolence of spirit.
  • the example that he often gives is about humiliation — that there’s a way in which the desire to humiliate others, to diminish their status in front of other people for your own pleasure, the desire to subject them to standards of evaluation that they probably themselves don’t hold or don’t understand, in order to enable mockery. There’s a way in which, if we’re reflexive about where that desire comes from, we will find that it comes from a place that’s irrational, indefensible and, likely, cruel, and that if we were to imagine a way of life built around those feelings, those desires, those practices, it would be one that would make it really hard for us to have healthy social ties, stable institutions, flourishing social relationships.
  • So part of what he’s up to is asking us, at all times, to be self-reflexive about the desires and needs and fantasies that drive us in politics
  • the concession.
  • So what nonviolence does is, it builds in a check on those kinds of rationalizations, those kinds of emotional drives, by teaching us to avoid forms of humiliation and forms of physical violence that make it hard to come back from. So that’s the first point.
  • The second point — and it goes more to your sense of revenge and retaliation — is again, forcing us to acknowledge the legitimacy of anger.
  • He uses the phrase, “legitimate anger” in the late ’60s — but to be reflective about it and understand that, even in a case where someone kills a loved one of yours, revenge, violence, retaliation, that doesn’t bring back the loved one that you’ve lost.
  • The only thing that can do that is a kind of forward-looking, constructive practice of politics and social ethics.
  • so what he’s trying to do is raise the question of, can we channel our legitimate rage, our legitimate anger, into a practice that allows us to maintain our self respect?
  • here’s this man who is both making this public argument and trying to get people to follow him in it and put themselves at risk over it, and is also living it himself, and talks about this unbelievably difficult thing, which is not feel righteous anger, but to not feel hatred, to internally reflect the world you want externally.
  • he does falter. He does fail. And I think when we read biographies of King, when you read the last parts of David Garrow’s biography, when you read Cornel West’s essay, from “To Shape a New World,” which talks a lot about the despair at the end of King’s life, if you watch HBO’S great documentary, “King in the Wilderness,” you see a person faltering and failing under the pressure.
  • He’s not able, for example, to bring himself to a kind of reconciliation with Malcolm X
  • How imaginable is King’s philosophy, is this practice, without his deep Christianity, without a belief in redemption, in salvation, in the possibility of a next life?
  • I think King, himself, thinks that the practice of nonviolent politics does the kind of work that you’re describing. And I think he would be worried about the fact that, in our time, so much of these questions about the management of emotion, the building of character, has become a privatized practice.
  • So I think he does think that that’s one way that this really does happen. And we have lots of evidence from the Civil Rights Movement, personal testimony, and personal reflection, where this seems to be the case.
  • the last thing I’ll say is that in order to do that work, in order to do some of the work you’re describing, he also is building an alternative community
  • So one way that I read that famous final speech, “I’ve seen the promised land” — there’s obviously a prophetic reading of it, but there’s also one where he’s describing the prefiguration of the promised land in the kind of politics and social life he’s participated in over his career, that the promised land is seen in the union politics in Memphis, it’s seen in the Student Nonviolent Coordinating Committee, gathering to do Mississippi Freedom Summer. It’s seen in the people walking for 350-plus days in Montgomery, Alabama, and banding together to help each other out, that is the promised land.
  • And when you are in a community that’s constantly talking with each other and lifting each other up and engaging in practices like song, prayer, other communal rituals, to try to affirm this alternative set of ethical and political commitments against the whole rest of the culture, that’s the only way it can be done, is that you have to have an alternative form of social life that can sustain you in that work. The private practice isn’t going to do it.
  • When you look at the principles of nonviolence on Stanford’s King Institute, I think a bunch of them would be familiar to people. You can resist evil without resorting to violence. You seek to win the friendship and understanding of the opponent, not to humiliate.
  • He thinks that we learn a lot about how to love other people by confronting them in public, by forcing ourselves into uncomfortable situations where we have to endure the look of the other, back and forth, where we train ourselves to extend these interactions of contentious politics until they can alter or change the people that we’ve put our bodies in close contact with on the field of politics.
  • I go back to the sermon he gave — and it’s collected in “Strength to Love,” and it’s called shattered dreams — where he confronts a problem that is all over the Black tradition, which is that the struggle we’re engaged in has gone on, in some form or another, for hundreds of years. At the moments of its greatest promise, you can look over the course of history and see, just years later, we find ourselves in situations that are unimaginably awful.
  • King is not naive. He’s a student of history. He’s somebody who asks himself hard questions like this. And he gives two different kinds of answers. And one is the answer that you’ve mentioned here, which is a theological answer. It’s conventional theodicy story, that look, at the end of the day, God is at work in the world. And God is on the side of justice.
  • There’s another way that he goes at it, however. And for me, I read it as rooted in a different kind of project, one that combines what used to be called philosophical anthropology, which is just a way of saying philosophical reflections on what kind of beings we are. It’s rooted in that, and it’s rooted in politics. And I think those things can find lots of overlapping consensus from people outside of the Christian tradition.
  • What you have to be committed to, in the last instance, is that evil is not the totality of who we are as persons, that people have the capacity, emotionally and rationally, to reflect on their life plans, their practices, their commitments, and change them, maybe not all of them, maybe not all at once, but that those things can be changed, and that politics is really a field where contingency is the key word, that although there are structural constraints and everything can’t be done at every moment, that the unprecedented, the new, the unexpected, happens in this realm.
  • And the only way that we can confirm that nothing new will happen, that oppression will last forever, that the future bears no hope, is if we don’t act. That’s the only way we can confirm that it’s true for all time, is by failing to act in pursuit of justice.
  • that’s King’s view, I think. And to me, that’s the persuasive one, that in our action, we might be able to see some measure of justice from a complicated, complex swirl of contingencies, and to move the ball forward — we will inevitably fail — but to look back on that failure with maturity and try to do better the next time.
  • How do you think about the question of the weaponization of nonviolence and then the applicability of its principles to the powerful and to what they might, we might, the state might learn from it?
  • there were many people — Harold Cruse famously wrote this, but others even closer to King — who said, you’re not the leader of Vietnam. You’re the leader of the African American civil rights movement. You should not speak out on this war because you’ll lose your relationship with Johnson.
  • King says that the people who are advising him in this way, they just don’t know him, his commitment or his calling. They don’t understand that if he’s going to raise his voice against violence in Watts or Detroit, that he’s got to raise it against what he called, “the greatest purveyor of violence in the world today,” his own government.
  • for him, the question was really one about militarism and the way that gets imagined as this hardheaded, realistic, hyper-rational response to international disputes and social problems abroad, when in actuality, if we take stock of what he called the casualties of war, the spiritual ones and the material ones, we would realize that most of the violence we engage in at the foreign-policy level is counterproductive. It’s created more problems and more harms than it ever has seemed to solve.
  • This is one of the powerful interventions that you see in Lionel McPherson’s essay, in “To Shape a New World.” It’s just this idea that this is about hardheaded realism is mythic. King says it’s about an immature image that we are nurturing for ourselves, that we’re trying to shore up this idea of ourselves as some kind of crusading hero or all-powerful world power, while not taking stock of all of the things about our freedoms, about our way of life, about our connectedness as a society, about our social divisions, that war has exacerbated, not to mention the violence that’s prosecuted abroad.
  • And he says similar things about domestic policy, the ways in which our politics toward poor families, single-parent households, is punitive for reasons that aren’t justified, that our response to what he calls “the derivative crimes of the ghetto” are wildly out of proportion and unjust compared to how we treat the systematic crimes of exploitation, segregation, disenfranchisement, that structure much of ghetto life.
  • So I’m in total agreement with Coates on that question
  • it just seems — I don’t want to call it axiomatic, but a repeated dynamic — that the more willing you become to use violence as a state, the more it corrupts you, and the more violent you become as a state, and to some degree, the more violent the people you are policing, the people you are occupying, become.
  • I’m not a pacifist. I don’t believe you can fully eradicate violence. But we don’t weigh how violent we make others, in our actions, very well, and then how violent we become in response, how much we enter into that escalatory dynamic.
  • But then the other thing is this question of this broader community, of changing hearts, of changing minds, of acting upon people, not through punishment, but through our belief that they can alter. And I’d be curious to hear you reflect on that question of community a little bit, because I think one of the central debates of our time is who’s actually in the community.
  • What would it mean to have a bit more of King’s view, of trying to create community at the center of what the state is attempting to do, as it fashions and helps govern the country?
  • BRANDON TERRY: So one underappreciated feature of King’s famous Riverside Church speech against Vietnam is that he goes on this whole riff about America lacking maturity. And it’s a weird thing to have in a foreign-policy speech. You’re used to — you’re a policy person. You don’t usually hear the word “maturity” bandied about in these kinds of debates.
  • But what he’s getting at is something really tightly linked to violence: that violence always exceeds the original justification you have for it. It’s not precise. It’s not able to be easily targeted, as we think. It spirals out. It produces retaliation. And then we retaliate again.
  • And all the while, it’s expanding its justifications to the point of absurdity. And King describes that as adding cynicism to the process of death. And he says that maturity is one of the only ways out here, that the maturity to be able to stand up and say, we were wrong, we want to make amends, we want to repair evils committed in our name, those are questions that are essentially nonstarters in American politics right now, certainly about foreign policy, but even in some places in domestic policy.
  • that feature of King’s thinking is something that I always want to draw attention to because I think it’s something we ignore. So that’s the first point I want to make.
  • The second thing — and this is also really deeply seated in that Vietnam speech — one of the reasons that people hated it so much — he was attacked in The New York Times, basically every editorial page in the country — one of the reasons people hated that speech so much is that he spent so much time expressing solidarity and sympathy with Ho Chi Minh and the North Vietnamese forces.
  • How could you express sympathy or some kind of solidarity with the enemy? And it’s very instructive, how King went about it. He wasn’t one of these people — you’ve seen these images of people waving the North Vietnamese flag at counterculture protests. It wasn’t like that.
  • It was him really spending a lot of time meditating on the reasons why we had ended up in this conflict, narrating the whole history of our failure to support Ho Chi Minh and the struggle against French colonialism, against Chinese colonialism, and how that had led to the situation we were in by 1967. King is narrating this history. He’s also trying to get people to think about what it must feel like to be on the ground in Vietnam and witness these bombings, witness this imposition of terror.
  • And he’s doing that because at bottom, he’s inspired by a vision really rooted in the parable of the Good Samaritan, from the Bible, that everyone is our neighbor, that there are no sectional loyalties that should eviscerate our moral obligations to others, our obligation to show them respect, to go on in community with them, and that most of what goes on in foreign policy and particularly war making, is a bad-faith evasion of the fact that we’re all interconnected.
  • he understood that there’s a fundamental interconnectedness amongst humanity at the ethical level and at the material, structural level, and that war making is an evasion of that fact. We’re going to have to live together. So the chief question that should organize it is, how can we do so in peace?
  • He has a line where he says, quote, “the dignity of the individual will flourish when the decisions concerning his life are in his own hands, when he has the assurance that his income is stable and certain and when he knows that he has a means to seek self-improvement. Personal conflicts between husband, wife and children will diminish when the unjust measurement of human worth, on a scale of dollars, is eliminated.” Tell me a bit about the spiritual and psychological dimensions of King’s economic philosophy and organizing.
  • BRANDON TERRY: Well, for King, the question of poverty and the question of economic inequality are both questions of dignity and democracy. They are questions of dignity because when you live without the adequate means to really enjoy the fair value of your basic rights, when you live in a society — and this is a really important point for King — when you live in a society of profound affluence, like the United States, and you live in severe poverty, it expresses a kind of contempt from your fellow citizens about your standing as an equal member of the polity.
  • So separate from the plain, material fact of hunger or health care, there’s this additional spiritual concern with the way in which living with nothing, living on a lonely island amidst an ocean of prosperity, as he would put it, diminishes your dignity.
  • then another piece — this is bridging of the dignity and democracy question — is that when people don’t have a say in the core, vital interest of their life, when they have no decision-making power over the processes which determine how their life is going to go, that too is a diminishment of their dignity. And King, who was operating in a long tradition of social democracy, wants to expand democratic practices to the broader economic realm.
  • Without expanding democracy into that economic realm, for King, we’re both making a mockery of democracy and we’re diminishing the dignity of citizens who live in search of a real standing as free and equal.
  • as somebody who spends a lot of my time in debates about economic policy, I think it is fair to say that the ends of economics are taken as the economy, typically. People hopefully shouldn’t starve. But a lot of debates about what we should do, even for the poor, become these recursive questions of, well, how can they better participate in the economy, how are they going to be able to invest in themselves, and how will there be economic opportunity for their children.
  • And the idea that the economy is subservient to the community, that the point of the economy is the community, that it should be measured — our policies should be measured — by what they do for democratic participation, for the dignity of individuals, is pretty lost. If anything, I see it more now, on the post-liberal right, as people call it, than I even do among mainline Democrats.
  • it has fallen out of favor as a way to frame and think about these conversations.
  • BRANDON TERRY: Yeah, I think it’s rooted in some really complicated things. I think there’s a kind of liberal anxiety about speaking forthrightly about the fact that living in areas of severe, concentrated disadvantage and racial segregation that we call ghettos, diminishes the dignity of the people who live there.
  • That feels uncomfortable for people to say forthrightly, in the way that King would.
  • so we try to get around it by speaking about opportunity and the wealth gap and unemployment statistics. But really what people are feeling is an existential assault on dignity.
  • one way to read that book is to say that she’s telling a tragic story about the loss of a particular ideal that guided great society politics. And that’s the principle of maximum feasible participation.
  • That was a really social-democratic idea, this idea that, well, we need to empower all sorts of people to participate in policy making and democratic deliberation, and that part of where people will find self-respect and dignity is through engagement in politics and their community
  • I think it gets to something that is very present, towards the end of King’s life, which is his sense that there is something important for the civil rights movement in the labor movement. And unions, on some level, they are mechanisms of democracy. One of the most important functions they have is workplace democracy
  • King is, in this tradition, in many ways inspired by a mentor of his. And one of the most important yet most severely neglected figures in American history was A. Philip Randolph, the great labor leader, former organizer of the Pullman Porters, the architect of both the March on Washington that gets canceled, which was going to target the Roosevelt administration during World War II, and the famous 1963 March on Washington for Jobs and Freedom.
  • they’ve got a certain set of commitments. So one is the idea that because most African Americans are working class or poor, anything that advances the interests of working-class people and their ability to exercise democratic control over the economy is going to advance the interests of African Americans.
  • for King, labor unions are also, as you described, important laboratories of democracy. So they’re one of the few places where people from all walks of life can get together, deliberate about strategy, deliberate about social ends, social goods, put money behind things that they value, that aren’t only their own material interest.
  • today, of course, there’s this ongoing argument — there always is — over whether you should have race-based politics, or whether that’s unusable, doesn’t work, creates too much backlash; or whether you should have class-based politics that look for commonalities, on the theory that because there’s been so much economic disadvantage for Black Americans, it’ll work through the mechanism of class just fine.
  • BRANDON TERRY: So King often invokes the philosopher Hegel, because he’s constantly describing his mode of thinking as a dialectical one, where he’s trying to reconcile seeming opposites and produce a new synthesis, which helps you transcend certain intractable problems.
  • Now as a reading of Hegel, that leaves much to be desired. But as a description of Martin Luther King’s thought, I think that’s always a good way to understand what he’s up to. And so I think what he’s always trying to do is transcend that opposition.
  • there’s a way in which we sometimes will say class-based politics works to lift African Americans because they’re disproportionately poor. And what’s tricky about that is that it doesn’t really theorize what to do about the African American middle class and the African American elite.
  • So a thing that King was thinking a lot about when he wrote “Stride Toward Freedom” and the Montgomery bus boycott is that there are areas where racial solidarity is going to be really effective and probably indispensable.
  • where questions of anti-Black racism emerge, where questions of racial humiliation, stigma that really affects the larger group, things that all Black people feel vulnerable to, those are going to be areas — like the segregation laws on the buses — those are going to be areas where you actually can generate a lot of racial solidarity and do a lot of important work with it, especially as a defensive posture.
  • When you start to get into questions of political economy, however, you have to be careful because the appeal of racial solidarity can actually obscure the fact that Black people don’t all share the same material interest in lots of ways.
  • King’s primary principle always is that he’s dedicated to the group that William Julius Wilson called the truly disadvantaged, the least of these, that at the end of the day, he’s going to give everything to the people who are in the most desperate situation, the poor. And that’s going to guide his politics.
  • where that is enabled by a race-based solidarity, so in questions of policing, perhaps, or questions of social stigma and media discourse, that’s where he’ll turn. But in other cases, I think he’d really be trying to experiment with a form of politics that empowers the poor to take leadership on their own.
  • How does being more aware of the distinctions he drew and the decisions he made help you look at some of the paths we should be walking down today and are not, in these conversations, or are walking down and shouldn’t be?
  • In order for us to understand why so many African Americans are located in the realm of the most disadvantaged, in the strata of the most disadvantaged, you have to understand the history of racial domination in this country. You have to understand the persistence of racial discrimination, especially in labor markets. And you have to understand the ways that racial ideology allows us to obscure the nature of our economy.
  • So the most classic example is that structural unemployment gets reframed, in part by racism, as questions of laziness or pathology or criminality instead of as a feature of the economy as such. So King always talks about the critique of racism as part of the diagnosis of the disease in order to cure it.
  • So even in the privileging the least well off and being concerned with poor people of all races, he wants to say that the critique of racism helps us see through the kinds of blindnesses that obscure the nature of our economy and the commonalities across race and the things that we need to address the questions of economic justice precisely
  • The second thing is that, in his critique of Black power, one of the things he says is that he worries that Black power gives priority to the question of race in a way that confuses our analysis of social reality. So what does he mean by that?
  • if you think that all Black disadvantage is primarily about anti-Black racism, you can start to miss the fact that there are broader economic dislocations that need to be addressed, that there are structural features of the American constitutional order, the ways in which municipal boundaries are structured, ways that funding decisions are made, that aren’t primarily driven by racial animus, that need to be addressed.
  • You can lose sight of those things and start to think that the real battle is in something like a totality of anti-Black racial ideology that can be battled in Hollywood movies and comic books and school curricula and legislation and political rhetoric
  • it’s not to say that those things don’t exist. It’s just to say that there’s a confusion about what’s going to make the biggest impact in improving the life circumstances of the least well off.
  • King really calls us to constantly be very precise about what the causal mechanisms are for Black disadvantage and to not be confused by the fact that there’s discrimination and injustice and cruelty in these other realms but which might not have as much causal impact as some of these other things.
  • King was very adamant that Black pride, that a concern with representation, that thinking in expansive ways about how do you affirm the somebodiness of Black youth, that those things are really, really important and that they’re not to be dismissed.
  • So it is a question of justice if people in Hollywood just constantly demean or diminish the talent of nonwhite actors. That is a question of justice. It’s just that we have to be honest about what the import of those struggles will be for the broader group. And the only way we can do that is by being attentive to the class differences within the group.
  • there’s a way in which — and King diagnoses this very incisively — there’s a way in which some genres of Black nationalism are so pessimistic about the possibility for multiracial democracy in the United States, for any kind of Black flourishing in the United States that they essentially foreclose real interest in political organizing and social movements
  • But the energy they still managed to generate — the outrage, the sentiment, the sociality — they find their outlet, instead, in a practice of humiliation and counter-humiliation. So there may not be hope that we can actually change the country, but at the very least, we can enjoy a feeling of retaliation, a kind of self-respecting sense of resistance, by engaging in a practice of trying to humiliate our opponents in the public sphere.
  • there’s a titillation to that. There’s a catharsis in watching someone — at that point, it would have been called stick it to whitey. Now it would be stick it to the libs or own the libs.
  • this is a significant amount of people that could cause real damage in the places where they don’t face many countervailing forms of power. And they can exercise a much more toxic impact on the broader state of American politics in a time where the media environment is way more fragmented
  • I see those elements. And I think that we need more people operating, in the kind of mode that King did, in his critique of Black power, to try to turn people away from their understandable feelings of hostility and resentment, toward more productive forms of political engagement.
  • the word “emotion,” which is a neglected part of politics, and maybe of King’s thought in particular — I think he understood part of the goal of politics and political action as creating a particular structure of political emotion.
  • what structure of emotion, of political emotion, we’re actually living in.
  • BRANDON TERRY: My mentor and friend, Karuna Mantena, at Columbia — a brilliant political theorist working on a book on Gandhi — I learned this from her, thinking a lot about how nonviolence is a kind of realism, in part because it doesn’t engage in the fiction that politics operates on the model of rational discussion. It takes very, very, very seriously the problem of emotion.
  • for King, thinking about the history of racial oppression in America, there are key emotions that you have to think about. One of the most important ones is fear
  • If that fear is a longstanding, deeply-structuring feature of American culture and political life, if it’s something that animates our comedy movies, our stand-up routines, our political discourse, you can’t operate as if it’s not there. You have to do things that will somehow disarm, disrupt, dispel those fears, in order to make progress on the political questions you want to pursue. That was one of King’s deepest, deepest commitments.
  • He’s thinking a lot about anger, which we’ve talked at great length about. And one of the disappointments I’ve had with radical politics in the present, as sympathetic as I am to most of the aims, is that I just don’t think the emotion question has been adequately considered
  • people often defend their politics as like, King was unpopular. And the things we’re saying are unpopular. So we’re operating in that tradition.
  • it’s not enough to just say, I’ve started a conversation, I’ve provoked something toxic in the culture. He’s not trying to do that, necessarily. He’s trying to elicit reactions that bring forward certain emotions but not let those emotions unravel the society itself. He’s trying to channel them into other forms of political affect that are much more congenial to reconciliation and justice.
  • what we’ve unfortunately ended up with is that the sophistication of mobilization strategists, the depth of the polarization, has made anger the principal affect of American politics at this moment.
  • a King-inspired political philosophy, both at the state level and the activist level, has to think about how we transform the recalcitrant nature of today’s political anger and channel it into forms of constructive politics that might point toward a more just future, and that might dissolve the forms of anger that are illegitimate and ill-founded, in part by doing the kind of work sometimes described as a moral jujitsu, turning those affects against themselves to try to transform them into something different.
  • maybe it’ll be easier to use myself as an example, here.
  • When I started out in blogging and political writing and journalism, particularly blogging, I think I thought a lot about politics in terms of winning and losing, and in my corner of it, winning and losing intellectually, that I was involved in political arguments, and arguments could be won or lost in front of some kind of audience.
  • One is having been in a lot of arguments. And I think I’m a reasonably good arguer. And so I’ve done, by my own lights, well, and then noticed it didn’t have at all the effect I wanted it to have, which is, if anything, it usually — if you really beat somebody in an argument and they feel humiliated, they go further into views they already held
  • And two things have begun to corrode, for me, that sense
  • so you lose by winning.
  • then the second is, particularly in the Trump era, the sense that if you met something awful with an equal and opposite energetic force, that in some weird way, you just added energy to what was now an awful system and conversation.
  • What do you do to not create a sense that this is the right conversation to be having? And I don’t know the answers to it. And I’m not saying like I’ve ascended to some higher plane and don’t argue or any of that. I have all the same intuitions and senses I’ve always had.
  • that’s why I find King so interesting and challenging in this way, because it’s just really, really, really different to ask the question, how do I reshape the emotional politics and the emotional structure of myself, of the people I’m in conflict with and then of the people who are bystanders or watchers of that conflict, for the better
  • It’s just a really different goal to be targeting, and just unimaginably harder than, can I come up with an argument that I think is a winning argument.
  • I think you see it — when he’s assassinated, the leading figures of the Black-Power generation, they’re heartbroken. They mourn his loss. They grieve for him, in part because — and you can read any of these memoirs, particularly Stokely Carmichael’s — they felt like he never — that even when he disagreed with them, he loved them, and not just because they were friendly, but because he loved them in the sense that he always invoked, of agape love, that he wanted goodwill for them, and that his arguments weren’t from a place of trying to humiliate them or embarrass them or expose them as ridiculous.
  • He wanted to affirm their right to make the arguments they were making, to affirm their intelligence and judgment and to enter into their mind, to try to reconstruct a position with sympathy, but then show why it falls short for the sake of goals that he was forthright about, about justice, about reconciliation, about love
  • we are in a moment of extraordinary cynicism. And cynicism can take advantage of your intellectual honesty, your practice of agape love. But I think that’s in the short term.
  • In my better moments, I’m of the view that the only way to start to turn the tide against the cynicism that has so corroded and corrupted our political culture is to try to have these demonstrations of humility and authenticity that cause us to put ourselves at some risk, the way that King did
  • So always our final question: What are three books you would recommend to the audience? And if I can put one spin on that, you mentioned the many books King wrote. If people want to start with one thing he actually wrote to read, one book, which one should they start with?
  • I think you get the best sense of his mature thought from his 1967 book, “Where Do We Go From Here: Chaos or Community,” which is still our question. So I would definitely recommend that. I also really love “A Trumpet of Conscience,” his Canadian Broadcasting Corporation lectures that were published posthumously.
  • I really strongly recommend Peniel Joseph’s, “The Sword and the Shield.” It’s a dual biography of Martin Luther King and Malcolm X. I reviewed it for The New York Review of Books and think really highly of it. It’s a great meditation on the ways they influenced each other. And it gives you a good sense of the broader intellectual milieu of the period.
  • I also really like Jeanne Theoharis’s “A More Beautiful and Terrible History.” I think it’s for people coming to the study of the civil rights movement for the first time who are kind of curious about why some of the things that I’ve said don’t sound familiar to them. She writes, in a really accessible and intelligent way, about some of the myths that structure how that history is taught and popularly conveyed. We have a lot of agreements there.
  • And then a where do we go from here question, I want to recommend my colleague, Tommie Shelby’s book, “Dark Ghettos,” which is a King-inspired philosophical reflection on the deep structure of ghetto poverty and what it requires of us, as a society, to do to redress it. It’s a book that’s very demanding on how far we’ve fallen short and questions of justice that pertain to the kind of neighborhoods that we grew up in and around.
Javier E

Why the World Still Needs Immanuel Kant - The New York Times - 0 views

  • “Immanuel Kant: A European Thinker” was a good title for that conference report in 2019, when Brexit seemed to threaten the ideal of European unification Germans supported. Just a few years later, “European” has become a slur. At a time when the Enlightenment is regularly derided as a Eurocentric movement designed to support colonialism, who feels comfortable throwing a yearlong birthday party for its greatest thinker?
  • Before Kant, it’s said, philosophers were divided between Rationalists and Empiricists, who were concerned about the sources of knowledge. Does it come from our senses, or our reason? Can we ever know if anything is real? By showing that knowledge requires sensory experience as well as reason, we’re told, Kant refuted the skeptics’ worry that we never know if anything exists at all.
  • All this is true, but it hardly explains why the poet Heinrich Heine found Kant more ruthlessly revolutionary than Robespierre.
  • ...18 more annotations...
  • Ordinary people do not fret over the reality of tables or chairs or billiard balls. They do, however, wonder if ideas like freedom and justice are merely fantasies. Kant’s main goal was to show they are not.
  • In fact Kant was driven by a question that still plagues us: Are ideas like freedom and justice utopian daydreams, or are they more substantial? Their reality can’t be proven like that of material objects, for those ideas make entirely different claims on us — and some people are completely impervious to their claims.
  • Could philosophy show that acting morally, if not particularly common, is at least possible?
  • Kant always emphasized the limits of our knowledge, and none of us know if we would crumble when faced with death or torture. Most of us probably would. But all of us know what we should do in such a case, and we know that we could.
  • This experiment shows we are radically free. Not pleasure but justice can move human beings to deeds that overcome the deepest of animal desires, the love of life.
  • We want to determine the world, not only to be determined by it. We are born and we die as part of nature, but we feel most alive when we go beyond it: To be human is to refuse to accept the world we are given.
  • At the heart of Kant’s metaphysics stands the difference between the way the world is and the way the world ought to be.
  • But if we long, in our best moments, for the dignity of freedom and justice, Kant’s example has political consequences. It’s no surprise he thought the French Revolution confirmed our hopes for moral progress — unlike the followers of his predecessor David Hume, who thought it was dangerous to stray from tradition and habit.
  • This provides an answer to contemporary critics whose reading of Kant’s work focuses on the ways in which it violates our understanding of racism and sexism. Some of his remarks are undeniably offensive to 21st-century ears. But it’s fatal to forget that his work gave us the tools to fight racism and sexism, by providing the metaphysical basis of every claim to human rights.
  • Kant argued that each human being must be treated as an end and not as a means — which is why he called colonialism “evil” and congratulated the Chinese and Japanese for denying entry to European invaders. Contemporary dismissals of Enlightenment thinkers forget that those thinkers invented the concept of Eurocentrism, and urged their readers to consider the world from non-European perspectives
  • At a time when the advice to “be realistic” is best translated as the advice to decrease your expectations, Kant’s work asks deep questions about what reality is
  • He insisted that when we think morally, we should abstract from the cultural differences that divide us and recognize the potential human dignity in every human being.
  • This requires the use of our reason. Contrary to trendy views that see reason as an instrument of domination, Kant saw reason’s potential as a tool for liberation.
  • Should we discard Kant’s commitment to universalism because he did not fully realize it himself — or rather celebrate the fact that we can make moral progress, an idea which Kant would wholeheartedly applaud?
  • In Germany, it’s now common to hear that the Enlightenment was at very best ambivalent: While it may have been an age of reason, it was also an age of slavery and colonialism.
  • many contemporary intellectuals from formerly colonized countries reject those arguments. Thinkers like the Ghanaian Ato Sekyi-Otu, the Nigerian Olufemi Taiwo, the Chilean Carlos Peña, the Brazilian Francisco Bosco or the Indian Benjamin Zachariah are hardly inclined to renounce Enlightenment ideas as Eurocentric.
  • The problem with ideas like universal human rights is not that they come from Europe, but that they were not realized outside of it. Perhaps we should take a lesson from the Enlightenment and listen to non-Western standpoints?
Javier E

The Right to Not Be Offended - Hit & Run : Reason Magazine - 0 views

  • Treating people with respect is a fine goal, but Collini notices that respect tends to be shown with special deference to so-called “out groups.” Claims of offense that would otherwise be ignored are instead given credence and even deference. Collini also correctly identifies the people who tend to fall into this trap. Very few “progressive” forces, for example, would have shown any “understanding” of hurt Christian feelings if Jesus had been mocked in a Danish newspaper.
  • Collini’s central passage: “Where arguments are concerned—that is, matters that are pursued by means of reasons and evidence—the most important identity we can acknowledge in another person is the identity of being an intelligent reflective human being.”
  • “This does not mean assuming that people are entirely—or even primarily—rational, and it does not mean that people are, in practice, always and only persuaded by reasons and evidence. It means treating other people as we wish to be treated ourselves in this matter—namely, as potentially capable of understanding the grounds for any action or statement that concerns us. But to so treat them means that, where reason and evidence are concerned, they cannot be thought of as primarily defined by being members of the ‘Muslim community or ‘Black community’ or ‘gay community’...
  • ...1 more annotation...
  • if one decides to criticize a culture or a tradition or a work of art, doing so is not an act of Western arrogance. Criticism is not Western or Eastern or Christian or Jewish, and those facing criticism—and those societies and cultures facing criticism—should respond in a spirit of openness about truth. To withhold criticism from certain communities or religions is, in Collini’s word, a form of condescension towards them. It denies these groups the ability to engage in constructive dialogue, and to fortify their own values. In the final analysis, everyone loses.
Javier E

What's Wrong With 'All Lives Matter'? - NYTimes.com - 1 views

  • what we see is that some lives matter more than others, that some lives matter so much that they need to be protected at all costs, and that other lives matter less, or not at all. And when that becomes the situation, then the lives that do not matter so much, or do not matter at all, can be killed or lost, can be exposed to conditions of destitution, and there is no concern, or even worse, that is regarded as the way it is supposed to be
  • we have to remember that under slavery black lives were considered only a fraction of a human life, so the prevailing way of valuing lives assumed that some lives mattered more
  • when and where did black lives ever really get free of coercive force? One reason the chant “Black Lives Matter” is so important is that it states the obvious but the obvious has not yet been historically realized. So it is a statement of outrage and a demand for equality, for the right to live free of constraint, but also a chant that links the history of slavery, of debt peonage, segregation, and a prison system geared toward the containment, neutralization and degradation of black lives,
  • ...21 more annotations...
  • We can see the videos and know what is obviously true, but it is also obviously true that police and the juries that support them obviously do not see what is obvious, or do not wish to see.
  • we cannot name all the black men and women whose lives are snuffed out all because a police officer perceives a threat, sees the threat in the person, sees the person as pure threat. Perceived as a threat even when unarmed or completely physically subdued, or lying on the ground, as Rodney King clearly was, or coming back home from a party on the train and having the audacity to say to a policeman that he was not doing anything wrong and should not be detained: Oscar Grant.
  • also a police system that more and more easily and often can take away a black life in a flash all because some officer perceives a threat.
  • The perception is then ratified as a public perception at which point we not only must insist on the dignity of black lives, but name the racism that has become ratified as public perception.
  • to make that universal formulation concrete, to make that into a living formulation, one that truly extends to all people, we have to foreground those lives that are not mattering now, to mark that exclusion, and militate against it. Achieving that universal, “all lives matter,” is a struggle
  • it is not just that black lives matter, though that must be said again and again. It is also that stand-your-ground and racist killings are becoming increasingly normalized, which is why intelligent forms of collective outrage have become obligatory.
  • At least in these cases that have galvanized the nation and the world in protest, we all see the twisted logic that results in the exoneration of the police who take away the lives of unarmed black men and women. And why is that the case? It is not because what the police and their lawyers present as their thinking in the midst of the situation is very reasonable. No, it is because that form of thinking is becoming more “reasonable” all the time. In other words, every time a grand jury or a police review board accepts this form of reasoning, they ratify the idea that blacks are a population against which society must be defended, and that the police defend themselves and (white) society, when they preemptively shoot unarmed black men in public space.
  • What has led us to this place? J.B.: Racism has complex origins, and it is important that we learn the history of racism to know what has led us to this terrible place. But racism is also reproduced in the present, in the prison system, new forms of population control, increasing economic inequality that affects people of color disproportionately.
  • I’ve heard that some white people have held signs that read “All Lives Matter.” J.B.: When some people rejoin with “All Lives Matter” they misunderstand the problem, but not because their message is untrue. It is true that all lives matter, but it is equally true that not all lives are understood to matter, which is precisely why it is most important to name the lives that have not mattered, and are struggling to matter in the way they deserve.
  • we cannot have a race-blind approach to the questions: which lives matter? Or, which lives are worth valuing? If we jump too quickly to the universal formulation, “all lives matter,” then we miss the fact that black people have not yet been included in the idea of “all lives.”
  • So the police see a threat when there is no gun to see, or someone is subdued and crying out for his life, when they are moving away or cannot move. These figures are perceived as threats even when they do not threaten, when they have no weapon, and the video footage that shows precisely this is taken to be a ratification of the police’s perception
  • whiteness is figured as a young virgin whose future husband is white — this characterization ratifies the sentiments that oppose miscegenation and defend norms of racial purity. But whose sexuality is imperiled in this scene? After all, black women and girls were the ones who were raped, humiliated and disposed of under conditions of slavery, and it was black families who were forcibly destroyed: black kinship was not recognized as kinship that matters
  • women of color, and black feminists in particular, have struggled for years against being the sexual property of either white male power or black masculinity, against poverty, and against the prison industry, so there are many reasons it is necessary to define racism in ways that acknowledge the specific forms it takes against men, women, and transgendered people of color.
  • there are white people who may be very convinced that they are not racist, but that does not necessarily mean that they have examined, or worked through, how whiteness organizes their lives, values, the institutions they support, how they are implicated in ways of talking, seeing, and doing that constantly and tacitly discriminate. Undoing whiteness has to be difficult work, but it starts, I think, with humility, with learning history, with white people learning how the history of racism persists in the everyday vicissitudes of the present, even as some of us may think we are “beyond” such a history, or
  • It is difficult and ongoing work, calling on an ethical disposition and political solidarity that risks error in the practice of solidarity.
  • It is probably important and satisfying as well to let one’s whiteness recede by joining in acts of solidarity with all those who oppose racism.
  • But just as certain kinds of violence and inequality get established as “normal” through the proceedings that exonerate police of the lethal use of force against unarmed black people, so whiteness, or rather its claim to privilege, can be disestablished over time
  • it is probably an error, in my view, for white people to become paralyzed with guilt and self-scrutiny. The point is rather to consider those ways of valuing and devaluing life that govern our own thinking and acting, understanding the social and historical reach of those ways of valuing.
  • Whiteness is not an abstraction; its claim to dominance is fortified through daily acts which may not seem racist at all precisely because they are considered “normal.”
  • There are many ways to do this, in the street, the office, the home, and in the media. Only through such an ever-growing cross-racial struggle against racism can we begin to achieve a sense of all the lives that really do matter.
  • This week’s conversation is with Judith Butler, Maxine Elliot Professor in the department of comparative literature and the program of critical theory at the University of California, Berkeley.
Javier E

Ex-KGB Agent Says Trump Was a Russian Asset. Does it Matter? - 0 views

  • If something like the most sinister plausible story turned out to be true, how much would it matter? Probably not that much
  • I have merely come to think that even if we could have confirmed the worst, to the point that even Trump’s supporters could no longer deny it, it wouldn’t have changed very much. Trump wouldn’t have been forced to resign, and his Republican supporters would not have had to repudiate him. The controversy would have simply receded into the vast landscape of partisan talking points — one more thing liberals mock Trump over, and conservatives complain about the media for covering instead of Nancy Pelosi’s freezer or antifa or the latest campus outrage.
  • One reason I think that is because a great deal of incriminating information was confirmed and very little in fact changed as a result. In 2018, Buzzfeed reported, and the next year Robert Mueller confirmed, explosive details of a Russian kompromat operation. During the campaign, Russia had been dangling a Moscow building deal that stood to give hundreds of millions of dollars in profit to Trump, at no risk. Not only did he stand to gain this windfall, but he was lying in public at the time about his dealings with Russia, which gave Vladimir Putin additional leverage over him. (Russia could expose Trump’s lies at any time if he did something to displease Moscow.)
  • ...14 more annotations...
  • The truth, I suspect, was simultaneously about as bad as I suspected, and paradoxically anticlimactic. Trump was surrounded by all sorts of odious characters who manipulated him into saying and doing things that ran against the national interest. One of those characters was Putin. In the end, their influence ran up against the limits that the character over whom they had gained influence was a weak, failed president.
  • Mueller even testified that this arrangement gave Russia blackmail leverage over Trump. But by the time these facts had passed from the realm of the mysterious to the confirmed, they had become uninteresting.
  • Ultimately, whatever value Trump offered to Russia was compromised by his incompetence and limited ability to grasp firm control even of his own government’s foreign policy. It was not just the fabled “deep state” that undermined Trump. Even his own handpicked appointees constantly undermined him, especially on Russia. Whatever leverage Putin had was limited to a single individual, which meant there was nobody Trump could find to run the State Department, National Security Agency, and so on who shared his idiosyncratic Russophilia.
  • Shvets told Unger that the KGB cultivated Trump as an American leader, and persuaded him to run his ad attacking American alliances. “The ad was assessed by the active measures directorate as one of the most successful KGB operations at that time,” he said, “It was a big thing — to have three major American newspapers publish KGB soundbites.”
  • To be clear, while Shvets is a credible source, his testimony isn’t dispositive. There are any number of possible motives for a former Soviet spy turned critic of Russia’s regime to manufacture an indictment of Trump
  • This is what intelligence experts mean when they describe Trump as a Russian “asset.” It’s not the same as being an agent. An asset is somebody who can be manipulated, as opposed to somebody who is consciously and secretly working on your behalf.
  • A second reason is that reporter Craig Unger got a former KGB spy to confirm on the record that Russian intelligence had been working Trump for decades. In his new book, “American Kompromat,” Unger interviewed Yuri Shvets, who told him that the KGB manipulated Trump with simple flattery. “In terms of his personality, the guy is not a complicated cookie,” he said, “his most important characteristics being low intellect coupled with hyperinflated vanity. This makes him a dream for an experienced recruiter.”
  • If I had to guess today, I’d put the odds higher, perhaps over 50 percent. One reason for my higher confidence is that Trump has continued to fuel suspicion by taking anomalously pro-Russian positions. He met with Putin in Helsinki, appearing strangely submissive, and spouted Putin’s propaganda on a number of topics including the ridiculous possibility of a joint Russian-American cybersecurity unit. (Russia, of course, committed the gravest cyber-hack in American history not long ago, making Trump’s idea even more self-defeating in retrospect than it was at the time.) He seemed to go out of his way to alienate American allies and blow up cooperation every time they met during his tenure.
  • He would either refuse to admit Russian wrongdoing — Trump refused even to concede that the regime poisoned Alexei Navalny — or repeat bizarre snippets of Russian propaganda: NATO was a bad deal for America because Montenegro might launch an attack on Russia; the Soviets had to invade Afghanistan in the 1970s to defend against terrorism. These weren’t talking points he would pick up in his normal routine of watching Fox News and calling Republican sycophants.
  • there was a reasonable chance — I loosely pegged it at 10 or 20 percent — that the Soviets had planted some of these thoughts, which he had never expressed before the trip, in his head.
  • Trump returned from Moscow fired up with political ambition. He began the first of a long series of presidential flirtations, which included a flashy trip to New Hampshire. Two months after his Moscow visit, Trump spent almost $100,000 on a series of full-page newspaper ads that published a political manifesto. “An open letter from Donald J. Trump on why America should stop paying to defend countries that can afford to defend themselves,” as Trump labeled it, launched angry populist charges against the allies that benefited from the umbrella of American military protection. “Why are these nations not paying the United States for the human lives and billions of dollars we are losing to protect their interests?”
  • During the Soviet era, Russian intelligence cast a wide net to gain leverage over influential figures abroad. (The practice continues to this day.) The Russians would lure or entrap not only prominent politicians and cultural leaders, but also people whom they saw as having the potential for gaining prominence in the future. In 1986, Soviet ambassador Yuri Dubinin met Trump in New York, flattered him with praise for his building exploits, and invited him to discuss a building in Moscow. Trump visited Moscow in July 1987. He stayed at the National Hotel, in the Lenin Suite, which certainly would have been bugged. There is not much else in the public record to describe his visit, except Trump’s own recollection in The Art of the Deal that Soviet officials were eager for him to build a hotel there. (It never happened.)
  • In 2018, I became either famous or notorious — depending on your point of view — for writing a story speculating that Russia had secret leverage over Trump
  • Here is what I wrote in that controversial section: