Economics of Good and Evil: The Quest for Economic Meaning from Gilgamesh to Wall Street

  • Instead of self-confident and self-centered answers, the author humbly asks fundamental questions: What is economics? What is its meaning? Where does this new religion, as it is sometimes called, come from? What are its possibilities and its limitations and borders, if there are any? Why are we so dependent on permanent growing of growth and growth of growing of growth? Where did the idea of progress come from, and where is it leading us? Why are so many economic debates accompanied by obsession and fanaticism?
  • The majority of our political parties act with a narrow materialistic focus when, in their programs, they present the economy and finance first; only then, somewhere at the end, do we find culture as something pasted on or as a libation for a couple of madmen.
  • Most of them—consciously or unconsciously—accept and spread the Marxist thesis of the economic base and the spiritual superstructure.
  • He tries to break free of narrow specialization and cross the boundaries between scientific disciplines. Expeditions beyond economics’ borders and its connection to history, philosophy, psychology, and ancient myths are not only refreshing, but necessary for understanding the world of the twenty-first century.
  • Reality is spun from stories, not from material. (Zdeněk Neubauer)
  • “The separation between the history of a science, its philosophy, and the science itself dissolves into thin air, and so does the separation between science and non-science; differences between the scientific and unscientific are vanishing.”
  • Outside of our history, we have nothing more.
  • The study of the history of a certain field is not, as is commonly held, a useless display of its blind alleys or a collection of the field’s trials and errors (until we got it right); rather, history is the fullest possible study of the menu that the given field can offer.
  • History of thought helps us to get rid of the intellectual brainwashing of the age, to see through the intellectual fashion of the day, and to take a couple of steps back.
  • Almost all of the key concepts by which economics operates, both consciously and unconsciously, have a long history, and their roots extend predominantly outside the range of economics, and often completely beyond that of science.
  • That is the reason for this book: to look for economic thought in ancient myths and, vice versa, to look for myths in today’s economics.
  • Adam Smith, too, believed in stories. As he puts it in The Theory of Moral Sentiments, “the desire of being believed, or the desire of persuading, of leading and directing other people, seems to be one of the strongest of all our natural desires.”
  • “The human mind is built to think in terms of narratives … in turn, much of human motivation comes from living through a story of our lives, a story that we tell to ourselves and that creates a framework of our motivation. Life could be just ‘one damn thing after another’ if it weren’t for such stories. The same is true for confidence in a nation, a company, or an institution. Great leaders are foremost creators of stories.”
  • Contrary to what our textbooks say, economics is predominantly a normative field. Economics not only describes the world but is frequently about how the world should be (it should be effective, we have an ideal of perfect competition, an ideal of high GDP growth with low inflation, the effort to achieve high competitiveness …). To this end, we create models, modern parables.
  • I will try to show that mathematics, models, equations, and statistics are just the tip of the iceberg of economics; that the biggest part of the iceberg of economic knowledge consists of everything else; and that disputes in economics are rather a battle of stories and various metanarratives than anything else.
  • Before it was emancipated as a field, economics lived happily within subsets of philosophy—ethics, for example—miles away from today’s concept of economics as a mathematical-allocative science that views “soft sciences” with a scorn born from positivistic arrogance. But our thousand-year “education” is built on a deeper, broader, and oftentimes more solid base. It is worth knowing about.
  • It is a paradox that a field that primarily studies values wants to be value-free. One more paradox is this: A field that believes in the invisible hand of the market wants to be without mysteries.
  • Is mathematics at the core of economics, or is it just the icing on the cake, the tip of the iceberg of our field’s inquiry?
  • We seek to chart the development of the economic ethos. We ask questions that come before any economic thinking can begin—both philosophically and, to a degree, historically. The area here lies at the very borders of economics—and often beyond. We may refer to this as protoeconomics (to borrow a term from protosociology) or, perhaps more fittingly, metaeconomics (to borrow a term from metaphysics).
  • In this sense, “the study of economics is too narrow and too fragmentary to lead to valid insight, unless complemented and completed by a study of metaeconomics.”17
  • The more important elements of a culture or field of inquiry such as economics are found in fundamental assumptions that adherents of all the various systems within the epoch unconsciously presuppose. Such assumptions appear so obvious that people do not know what they are assuming, because no other way of putting things has ever occurred to them, as the philosopher Alfred Whitehead notes in Adventures of Ideas.
  • I argue that economic questions were with mankind long before Adam Smith. I argue that the search for values in economics did not start with Adam Smith but culminated with him.
  • We should go beyond economics and study what beliefs are “behind the scenes,” ideas that have often become the dominant yet unspoken assumptions in our theories. Economics is surprisingly full of tautologies that economists are predominantly unaware of.
  • I argue that economics should seek, discover, and talk about its own values, although we have been taught that economics is a value-free science. I argue that none of this is true and that there is more religion, myth, and archetype in economics than there is mathematics.
  • In a way, this is a study of the evolution of both homo economicus and, more importantly, the history of the animal spirits within him. This book tries to study the evolution of the rational as well as the emotional and irrational side of human beings.
  • I argue that his most influential contribution to economics was ethical. His other thoughts had been clearly expressed long before him, whether on specialization or on the principle of the invisible hand of the market. I try to show that the principle of the invisible hand of the market is much more ancient and was developed long before Adam Smith. Traces of it appear even in the Epic of Gilgamesh, in Hebrew thought, and in Christianity, and it is expressly stated by Aristophanes and Thomas Aquinas.
  • This book is not a thorough history of economic thought. The author aims instead to supplement certain chapters on the history of economic thought with a broader perspective and an analysis of the influences that often escape the notice of economists and the wider public.
  • Progress (Naturalness and Civilization)
  • The Economy of Good and Evil
  • From his beginnings, man has been marked as a naturally unnatural creature, who for unique reasons surrounds himself with external possessions. Insatiability, both material and spiritual, is a basic human metacharacteristic, which appears as early as the oldest myths and stories.
  • The Hebrews, with their linear concept of time, and later the Christians gave us the ideal (or amplified the Hebrew ideal) we now embrace. Then the classical economists secularized progress. How did we come to today’s progression of progress, and growth for growth’s sake?
  • The Need for Greed: The History of Consumption and Labor
  • Metamathematics
  • From where did economics get the concept of numbers as the very foundation of the world?
  • All of economics is, in the end, economics of good and evil. It is the telling of stories by people of people to people. Even the most sophisticated mathematical model is, de facto, a story, a parable, our effort to (rationally) grasp the world around us.
  • The idea that we can manage to utilize our natural egoism, and that this evil is good for something, is an ancient philosophical and mythical concept. We will also look into the development of the ethos of homo economicus, the birth of “economic man.”
  • The History of Animal Spirits: Dreams Never Sleep
  • Masters of the Truth
  • Originally, truth was a domain of poems and stories, but today we perceive truth as something much more scientific, mathematical. Where does one go (to shop) for the truth? And who “has the truth” in our epoch?
  • Our animal spirits (something of a counterpart to rationality) are influenced by the archetype of the hero and our concept of what is good.
  • The entire history of ethics has been ruled by an effort to create a formula for the ethical rules of behavior. In the final chapter we will show the tautology of Max Utility, and we will discuss the concept of Max Good.
  • The History of the Invisible Hand of the Market and Homo Economicus
  • We understand “economics” to mean a broader field than just the production, distribution, and consumption of goods and services. We consider economics to be the study of human relations that are sometimes expressible in numbers, a study that deals with tradables, but one that also deals with nontradables (friendship, freedom, efficiency, growth).
  • When we mention economics in this book, we mean the mainstream perception of it, perhaps as best represented by Paul Samuelson.
  • By the term homo economicus, we mean the primary concept of economic anthropology. It comes from the concept of a rational individual, who, led by narrowly egotistical motives, sets out to maximize his benefit.
  • The Epic of Gilgamesh bears witness to the opposite—despite the fact that the first written clay fragments (such as notes and bookkeeping) of our ancestors may have been about business and war, the first written story is mainly about great friendship and adventure.
  • There is no mention of either money or war; for example, not once does anyone in the whole epic sell or purchase something.5 No nation conquers another, and we do not encounter even a mention of the threat of violence.
  • Gilgamesh becomes a hero not only due to his strength, but also due to discoveries and deeds whose importance was in large part economic—the securing of construction materials by felling the cedar forest, stopping Enkidu from devastating Uruk’s economy, and discovering new desert routes during his expeditions.
  • Even today we live in Gilgamesh’s vision that human relations—and therefore humanity itself—are a disturbance to work and efficiency; that people would perform better if they did not “waste” their time and energy on nonproductive things.
  • It is a story of nature and civilization; of heroism, defiance, and the battle against the gods and evil; an epic about wisdom, immortality, and also futility.
  • But labour is unlike any other commodity. The work environment is of no concern for steel; we do not care about steel’s well-being.16
  • But it is in friendship where—often by the way, as a side product, an externality—ideas and deeds are frequently performed or created that can altogether change the face of society.19 Friendship can go against an ingrained system in places where an individual does not have the courage to do so himself or herself.
  • As Joseph Stiglitz says, One of the great “tricks” (some say “insights”) of neoclassical economics is to treat labour like any other factor of production. Output is written as a function of inputs—steel, machines, and labour. The mathematics treats labour like any other commodity, lulling one into thinking of labour like an ordinary commodity, such as steel or plastic.
  • Even the earliest cultures were aware of the value of cooperation on the working level—today we call this collegiality, fellowship, or, if you want to use a desecrated term, comradeship. These “lesser relationships” are useful and necessary for society and for companies because work can be done much faster and more effectively if people get along with each other on a human level.
  • But true friendship, which becomes one of the central themes of the Epic of Gilgamesh, comes from completely different material than teamwork. Friendship, as C. S. Lewis accurately describes it, is completely uneconomical, unbiological, unnecessary for civilization, and an unneeded relationship.
  • Here we have a beautiful example of the power of friendship, one that knows how to transform (or break down) a system and change a person. Enkidu, sent to Gilgamesh as a punishment from the gods, in the end becomes his faithful friend, and together they set out against the gods. Gilgamesh would never have gathered the courage to do something like that on his own—nor would Enkidu.
  • Due to their friendship, Gilgamesh and Enkidu then intend to stand up to the gods themselves and turn a holy tree into mere (construction) material they can handle almost freely, thereby making it a part of the city-construct, part of the building material of civilization, thus “enslaving” that which originally was part of wild nature. This is a beautiful proto-example of the shifting of the borders between the sacred and profane (secular)—and to a certain extent also an early illustration of the idea that nature is there to provide cities and people with raw material and production resources.
  • It started with the Babylonians—rural nature becomes just a supplier of raw materials, resources (and humans the source of human resources). Nature is not the garden in which humans were created and placed, which they should care for and in which they should reside, but becomes a mere reservoir for natural (re)sources.
  • Even today, we often consider the domain of humanity (human relations, love, friendship, beauty, art, etc.) to be unproductive.
  • Both heroes change—each from opposite poles—into humans. In this context, a psychological dimension to the story may be useful: “Enkidu (…) is Gilgamesh’s alter ego, the dark, animal side of his soul, the complement to his restless heart. When Gilgamesh found Enkidu, he changed from a hated tyrant into the protector of his city. (…)
  • To be human seems to be somewhere in between, or both of these two.
  • In this moment of rebirth from an animal to a human state, the world’s oldest preserved epic implicitly hints at something highly important. Here we see what early cultures considered the beginning of civilization. Here is depicted the difference between people and animals or, better, savages. Here the epic quietly describes birth, the awakening of a conscious, civilized human. We are witnesses to the emancipation of humanity from animals.
  • The entire history of culture is dominated by an effort to become as independent as possible from the whims of nature.39 The more developed a civilization is, the more an individual is protected from nature and natural influences and knows how to create around him a constant or controllable environment to his liking.
  • The price we pay for independence from the whims of nature is dependence on our societies and civilizations. The more sophisticated a given society is as a whole, the less its members are able to survive on their own as individuals, without society.
  • The epic captures one of the greatest leaps in the development of the division of labor. Uruk itself is one of the oldest cities of all, and in the epic it reflects a historic step forward in specialization—in the direction of a new social city arrangement. Because of the city wall, people in the city can devote themselves to things other than worrying about their own safety, and they can continue to specialize more deeply.
  • Human life in the city gains a new dimension and suddenly it seems more natural to take up issues going beyond the life span of an individual. “The city wall symbolizes as well as founds the permanence of the city as an institution which will remain forever and give its inhabitants the certainty of unlimited safety, allowing them to start investing with an outlook reaching far beyond the borders of individual life.”
  • The wall around the city of Uruk is, among other things, a symbol of an internal distancing from nature, a symbol of revolts against submission to laws that do not come under the control of man and that man can at most discover and use to his benefit.
  • “The chief thing which the common-sense individual wants is not satisfactions for the wants he had, but more, and better wants.”47
  • If a consumer buys something, theoretically it should rid him of one of his needs—and the aggregate of things he needs should decrease by one item. In reality, though, the aggregate of “I want to have” expands together with the growing aggregate of “I have.”
  • It can be said that Enkidu was therefore happy in his natural state, because all of his needs were satiated. With people, on the other hand, it appears that the more a person has, the more developed and richer he is, the greater the number of his needs (including the unsaturated ones).
  • In the Old Testament, this relationship is perceived completely differently. Man (humanity) is created in nature, in a garden. Man was supposed to care for the Garden of Eden and live in harmony with nature and the animals. Soon after creation, man walks naked and is not ashamed, de facto the same as the animals. What is characteristic is that man dresses (the natural state of creation itself is not enough for him), and he (literally and figuratively) covers52 himself—in shame after the fall.53
  • Nature is where one goes to hunt, collect crops, or gather the harvest. It is perceived as the saturator of our needs and nothing more. One goes back to the city to sleep and be “human.” Moreover, evil resides in nature: Humbaba lives in the cedar forest, which is also the reason for eradicating it completely.
  • Symbolically, then, we can view the entire issue from the standpoint of the epic in the following way: Our nature is insufficient, bad, evil; the good (the humane) occurs only after emancipation from nature (from naturalness), through culturing and education. Humanity is considered to reside in civilization.
  • The city was frequently (at least in older Jewish writings) a symbol of sin, degeneration, and decadence—nonhumanity. The Hebrews were originally a nomadic nation, one that avoided cities. It is no accident that the first important city57 mentioned in the Bible is proud Babylon,58 which God later turns to dust.
  • It is enough, for example, to read the Book of Revelation to see how the vision of paradise developed from the deep Old Testament period, when paradise was a garden. John describes his vision of heaven as a city—paradise is in New Jerusalem, a city where the dimensions of the walls(!) are described in detail, as are the golden streets and gates of pearl.
  • The Hebrews later also chose a king (despite the unanimous opposition of God’s prophets) and settled in cities, where they eventually founded the Lord’s Tabernacle and built a temple for Him. The city of Jerusalem later gained an illustrious position in all of religion.
  • By this time Christianity (as well as the influence of the Greeks) does not consider human naturalness to be an unambiguous good, and it does not have such an idyllic relationship to nature as the Old Testament prophets had.
  • If a tendency toward good is not naturally endowed in people, it must be imposed from above, through violence or at least the threat of violence.
  • If we were to look at human naturalness as a good, then collective social actions need a much weaker ruling hand. If people themselves have a natural tendency (propensity) toward good, this role does not have to be supplied by the state, ruler, or, if you wish, Leviathan.
  • How does this affect economics?
  • Let us return for the last time to the humanization of the wild Enkidu, a process we can perceive, with a bit of imagination, as the first seed of the principle of the market’s invisible hand, and therefore a parallel with one of the central schematics of economic thinking.
  • Sometimes it is better to “harness the devil to the plow” than to fight with him. Instead of summoning up enormous energy in the fight against evil, it is better to use its own energy to reach a goal we desire; setting up a mill on the turbulent river instead of futile efforts to remove the current. This is also how Saint Prokop approached it in one of the oldest Czech legends.
  • Enkidu caused damage, and it was impossible to fight against him. But with the help of a trap, a trick, this evil was transformed into something that greatly benefited civilization.
  • By culturing and “domesticating” Enkidu, humanity tamed the uncontrollable wild and chaotic evil.
  • Enkidu devastated the doings of the city (the external, outside-the-walls world). But he was later harnessed and fights at the side of civilization against nature, naturalness, the natural state of things.
  • A similar motif appears thousands of years later in the reversal that is well known even to noneconomists as the central idea of economics: the invisible hand of the market.
  • A similar story (reforming something animally wild and uncultivated into a civilizational achievement) is used by Thomas Aquinas in his teachings. Several centuries later, this idea is fully emancipated in the hands of Bernard Mandeville and his Fable of the Bees: or, Private Vices, Publick Benefits. The economic and political aspects of this idea are—often incorrectly—ascribed to Adam Smith.
  • Here the individual no longer tries to maximize his goods or profits; what is important is writing his name into human memory in the form of heroic acts or deeds.
  • There is another immortality, one connected with letters and the cult of the word: “A name and especially a written name survives the body.”77
  • After this disappointment, he comes to the edge of the sea, where the innkeeper Siduri lives. As a tonic for his sorrow, she offers him the garden of bliss, a sort of hedonistic fortress of carpe diem, where a person comes to terms with his mortality and, at least in the course of the end of his life, maximizes earthly pleasures, or earthly utility.
  • In the second stage, after finding his friend Enkidu, Gilgamesh abandons the wall and sets out beyond the city to maximize heroism: “In his (…) search of immortal life, Gilgamesh (…)”
  • The hero refuses hedonism in the sense of maximizing terrestrial pleasure and throws himself into things that will exceed his life. In the blink of an eye, the epic turns on its head the entire utility-maximization role that mainstream economics has tirelessly tried to sew onto people as a part of their nature.81
  • It is simpler to observe the main features of our civilization at a time when the picture was more readable—at a time when our civilization was just being born and was still “half-naked.” In other words, we have tried to dig down to the bedrock of our written civilization.
  • We today remember Gilgamesh for the story of his heroic friendship with Enkidu, not for his wall, which no longer reaches monumental heights.
  • On the eleventh and final tablet, Gilgamesh again loses what he sought. Like Sisyphus, he misses his goal just before the climax.
  • Is there something in it that is valid today? Have we found in Gilgamesh certain archetypes that are in us to this day?
  • The very existence of questions similar to today’s economic ones can be considered as the first observation. The first written considerations of the people of that time were not so different from those today. In other words: The epic is understandable for us, and we can identify with it.
  • We have also been witnesses to the very beginnings of man’s culturing—a great drama based on a liberation and then a distancing from the natural state.
  • Let us take this as a memento in the direction of our restlessness, our inherited dissatisfaction and the volatility connected to it. Considering that they have lasted five thousand years and to this day we find ourselves in harmony with a certain feeling of futility, perhaps these characteristics are inherent in man.
  • Gilgamesh had a wall built that divided the city from wild nature and created a space for the first human culture. Nevertheless, “not even far-reaching works of civilization could satisfy human desire.”
  • Friendship shows us new, unsuspected adventures, gives us the opportunity to leave the wall and to become neither its builder nor its part—to not be another brick in the wall.
  • With the phenomenon of the creation of the city, we have seen how specialization and the accumulation of wealth were born, how holy nature was transformed into a secular supplier of resources, and also how humans’ individualistic ego was emancipated.
  • To change the system, to break down that which is standing and go on an expedition against the gods (to awaken, from naïveté to awakening), requires friendship.
  • For small acts (hunting together, work in a factory), small love is enough: camaraderie. For great acts, however, great love is necessary, real love: friendship. Friendship that eludes the economic understanding of quid pro quo. Friendship gives. One friend gives (fully) for the other. That is friendship for life and death.
  • The thought that humanity comes at the expense of efficiency is just as old as humanity itself—as we have shown, subjects without emotion are the ideal of many tyrants.
  • The epic later crushes this idea through the friendship of Gilgamesh and Enkidu. Friendship—the biologically least essential love, which at first sight appears to be unnecessary.
  • The less a civilized, city person is dependent on nature, the more he or she is dependent on the rest of society. Like Enkidu, we have exchanged nature for society; harmony with (incalculable) nature for harmony with (incalculable) man.
  • Is human nature good or evil? To this day these questions are key for economic policy: If we believe that man is evil in his nature, that a person is dog-eat-dog (an animal), then the hard hand of a ruler is called for. If we believe that people in and of themselves, in their nature, gravitate toward good, then it is possible to loosen up the reins and live in a society that is more laissez-faire.
  • For a concept of historical progress, for the undeification of heroes, rulers, and nature, mankind had to wait for the Hebrews.
  • Because nature is not undeified, it is beyond consideration to explore it, let alone intervene in it (unless a person was a two-thirds god like Gilgamesh).
  • They practiced money lending, traded in many assets (…) and especially were engaged in the trading of shares on capital markets, worked in currency exchange and frequently figured as mediators in financial transactions (…), they functioned as bankers and participated in emissions of all possible forms.
  • As regards modern capitalism (as opposed to the ancient and medieval periods) … there are activities in it which are, in certain forms, inherently (and completely necessarily) present—both from an economic and legal standpoint.7
  • As early as the “dark” ages, the Jews commonly used economic tools that were in many ways ahead of their time and that later became key elements of the modern economy:
  • Gilgamesh’s story ends where it began. There is a consistency in this with Greek myths and fables: At the end of the story, no progress occurs, no essential historic change; the story is set in indefinite time, something of a temporal limbo.
  • Jews believe in historical progress, and that progress is in this world.
  • For a nation originally based on nomadism, where did this Jewish business ethos come from? And can the Hebrews truly be considered as the architects of the values that set the direction of our civilization’s economic thought?
  • Hebrew religiosity is therefore strongly connected with this world, not with any abstract world, and those who take pleasure in worldly possessions are not a priori doing anything wrong.
  • PROGRESS: A SECULARIZED RELIGION
  • One of the things the writers of the Old Testament gave to mankind is the idea and notion of progress. The Old Testament stories have their development; they change the history of the Jewish nation and tie in to each other. The Jewish understanding of time is linear—it has a beginning and an end.
  • The observance of God’s Commandments in Judaism leads not to some ethereal other world, but to an abundance of material goods (Genesis 49:25–26, Leviticus 26:3–13, Deuteronomy 28:1–13) (…) There are no accusing fingers pointed at (…)
  • There are no echoes of asceticism, nor of the cleansing and spiritual effect of poverty. It is fitting, therefore, that the founders of Judaism, the Patriarchs Abraham, Isaac, and Jacob, were all wealthy men.12
  • This came about due to a linear understanding of history. If history has a beginning as well as an end, and they are not the same point, then exploration suddenly makes sense in areas where the fruits are borne only in the next generation.
  • What’s more, economic progress has almost become an assumption of modern functional societies. We expect growth. We take it as automatic. Today, if nothing “new” happens, if GDP does not grow (we say it stagnates) for several quarters, we consider it an anomaly.
  • However, the idea of progress itself underwent major changes, and today we perceive it very differently. As opposed to the original spiritual conceptions, today we perceive progress almost exclusively in an economic or scientific-technological sense.
  • This is because care for the soul has today been replaced by care for external things.
  • This is why we must constantly grow, because we (deep down and often implicitly) believe that we are headed toward an (economic) paradise on Earth.
  • Only since the period of the scientific-technological revolution (and at a time when economics was born as an independent field) has material progress been automatically assumed.
  • Jewish thought is the most grounded, most realistic school of thought of all those that have influenced our culture.17 An abstract world of ideas was unknown to the Jews. To this day it is still forbidden to even depict God, people, and animals in symbols, paintings, statues, and drawings.
  • Economists have become key figures of great importance in our time (Kacířské eseje o filosofii dějin [Heretical Essays in the Philosophy of History]). They are expected to perform interpretations of reality, give prophetic services (macroeconomic forecasts), reshape reality (mitigate the impacts of the crisis, speed up growth), and, in the long run, provide leadership on the way to the Promised Land—paradise on Earth.
  • REALISM AND ANTIASCETICISM
  • Aside from ideas of progress, the Hebrews brought another very fundamental contribution to our culture: the desacralization of heroes, nature, and rulers.
  • Voltaire writes: “It is a certain fact, that in his public laws he [Moses] never so much as once made mention of a life to come, limiting all punishments and all rewards to the present life.”21
  • As opposed to Christianity, the concept of an extraterrestrial paradise or heaven was not developed much in Hebrew thought.19 The paradise of the Israelites—Eden—was originally placed on Earth, at a given place in Mesopotamia20 and at a given time.
  • The Hebrews consider the world to be real—not just a shadow reflection of a better world somewhere in the cloud of ideas, something the usual interpretation of history ascribes to Plato. The soul does not struggle against the body and is not its prisoner, as Augustine would write later.
  • The land, the world, the body, and material reality are for Jews the paramount setting for divine history, the pinnacle of creation. This idea is the conditio sine qua non of the development of economics, something of an utterly earthly making.
  • The mythology of the hero-king was strongly developed in that period, which Claire Lalouette summarizes into these basic characteristics: Beauty (a perfect face, on which it is “pleasant to look upon,” but also “beauty,” expressed in the Egyptian word nefer, not only means aesthetics, but contains moral qualities as well),
  • THE HERO AND HIS UNDEIFICATION: THE DREAM NEVER SLEEPS
  • The concept of the hero is more important than it might appear. It may be the remote origin of Keynes’s animal spirits, or the desire to follow a kind of internal archetype that a given individual accepts as his own and that society values.
  • This internal animator of ours, our internal mover, this dream, never sleeps and it influences our behavior—including economic behavior—more than we want to realize.
  • manliness and strength,28 knowledge and intelligence,29 wisdom and understanding, vigilance and performance, fame and renown (fame which overcomes enemies because “a thousand men would not be able to stand firmly in his presence”);30 the hero is a good shepherd (who takes care of his subordinates), is a copper-clad rampart, the shield of the land, and the defender of heroes.
  • Each of us probably has a sort of “hero within”—a kind of internal role-model, template, an example that we (knowingly or not) follow. It is very important what kind of archetype it is, because its role is dominantly irrational and changes depending on time and the given civilization.
  • The oldest was the so-called Trickster—a fraudster; then the culture bearer—Rabbit; the musclebound hero called Redhorn; and finally the most developed form of hero: the Twins.
  • the Egyptian ruler, just as the Sumerian, was partly a god, or the son of a god.31
  • Jacob defrauds his father Isaac and steals his brother Esau’s blessing of the firstborn. Moses murders an Egyptian. King David seduces the wife of his military commander and then has him killed. In his old age, King Solomon turns to pagan idols, and so on.
  • Anthropology knows several archetypes of heroes. The Polish-born American anthropologist Paul Radin examined the myths of North American Indians and, for example, in his most influential book, The Trickster, he describes their four basic archetypes of heroes.
  • The Torah’s heroes (if that term can be used at all) frequently make mistakes and their mistakes are carefully recorded in the Bible—maybe precisely so that none of them could be deified.32
  • We do not have to go far for examples. Noah gets so drunk he becomes a disgrace; Lot lets his own daughters seduce him in a similar state of drunkenness. Abraham lies and (repeatedly) tries to sell his wife as a concubine.
  • The Hebrew heroes correspond most to the Tricksters, the Culture Bearers, and the Twins. The divine muscleman, that dominant symbol we think of when we say hero, is absent here.
  • To a certain extent it can be said that the Hebrews—and later Christianity—added another archetype, the archetype of the heroic Sufferer.35 Job is the prime example.
  • Undeification, however, does not mean a call to pillage or desecration; man was put here to take care of nature (see the story of the Garden of Eden or the symbolism of the naming of the animals). This protection and care of nature is also related to the idea of progress
  • For the heroes who moved our civilization to where it is today, the heroic archetypes of the cunning trickster, culture bearer, and sufferer are rather more appropriate.
  • The Old Testament strongly emphasizes the undeification of nature.37 Nature is God’s creation, which speaks of divinity but is not the domain of moody gods.
  • This is very important for democratic capitalism, because the Jewish heroic archetype lays the groundwork much better for the development of the later phenomenon of the hero, which better suits life as we know it today. “The heroes laid down their arms and set about trading to become wealthy.”
  • In an Old Testament context, the pharaoh was a mere man (whom one could disagree with, and who could be resisted!).
  • RULERS ARE MERE MEN
  • In a similar historical context, the Old Testament teachings carried out a similar desacralization of rulers, the so-called bearers of economic policy.
  • Ultimately the entire idea of a political ruler stood against the Lord’s will, which is explicitly presented in the Torah. The Lord unequivocally preferred the judge as the highest form of rule.
  • “The needs of future generations will have to be considered; after all, humankind are the guardians of God’s world. Waste of natural resources, whether privately owned or nationally owned, is forbidden.”39
  • Politics lost its character of divine infallibility, and political issues were subject to questioning. Economic policy could become a subject of examination.
  • God first creates with the word and then on individual days He divides light from darkness, water from dry land, day from night, and so forth—and He gives order to things.45 The world is created orderly—it is wisely, reasonably put together. The way of the world is put together at least partially46 decipherably by any other wise and reasonable being who honors rational rules.
  • This, for the methodology of science and economics, is very important, because disorder and chaos are difficult to examine scientifically.43 Faith in some kind of rational and logical order in a system (society, the economy) is a silent assumption of any (economic) examination.
  • THE PRAISE OF ORDER AND WISDOM: MAN AS A PERFECTER OF CREATION
  • The created world has an order of sorts, an order recognizable by us as people.
  • From the very beginning, when God distances Himself from the entire idea, there is an anticipation that there is nothing holy, let alone divine, in politics. Rulers make mistakes, and it is possible to subject them to tough criticism—which frequently occurs indiscriminately through the prophets in the Old Testament.
  • Hebrew culture laid the foundations for the scientific examination of the world.
  • Examining the world is therefore an absolutely legitimate activity, and one that is even requested by God—it is a kind of participation in the Creator’s work.51 Man is called on to understand himself and his surroundings and to use his knowledge for good.
  • “I was there when he set the heavens in place, when he marked out the horizon on the face of the deep (…) Then I was the craftsman at his side.”47
  • There are more urgings to gain wisdom in the Old Testament. “Wisdom calls aloud in the street (…): ‘How long will you simple ones love your simple ways?’”49 Or several chapters later: “Wisdom is supreme; therefore get wisdom. Though it cost all you have, get understanding.”50
  • Examination is not forbidden. The fact that order can be grasped by human reason is another unspoken assumption that serves as a cornerstone of any scientific examination.
  • “Now then, my sons, listen to me; blessed are those who keep my ways (…) Blessed is the man who listens to me, watching daily at my doors, waiting at my doorway. For whoever finds me finds life and receives favor from the Lord.”
  • The rational examination of nature has its roots, surprisingly, in religion.
  • The Lord brought me forth as the first of his works, before his deeds of old. I was appointed from eternity, from the beginning, before the world began. When there were no oceans, I was given birth, when there were no springs abounding with water, before the mountains were settled in place,
  • The Book of Proverbs emphasizes specifically several times that it was wisdom that was present at the creation of the world. Wisdom personified calls out:
  • The last act, final stroke of the brush of creation, naming of the animals—this act is given to a human, it is not done by God, as one would expect. Man was given the task of completing the act of creation that the Lord began:
  • MAN AS A FINISHER OF CREATION
  • The creation of the world, as it is explained in Jewish teachings, is described in the Book of Genesis. Here God (i) creates, (ii) separates, and (iii) names [my emphasis]:
  • Naming is a symbolic expression. In Jewish culture (and also in our culture to this day), the right to name meant sovereign rights and belonged, for example, to explorers (new places), inventors (new principles), or parents (children)—that is, to those who were there at the genesis, at the origin. This right was handed over by God to mankind.
  • The Naming itself (the capital N is appropriate) traditionally belongs to the crowning act of the Creator and represents a kind of grand finale of creation, the last move of the brush to complete the picture—a signature of the master.
  • Without naming, reality does not exist; it is created together with language. Wittgenstein aptly expresses this in his Tractatus: the limits of our language are the limits of our world.53
  • He invented (fictitiously and completely abstractly!) a framework that was generally accepted and soon “made into” reality. Marx invented similarly; he created the notion of class exploitation. Through his idea, the perception of history and reality was changed for a large part of the world for nearly an entire century.
  • Reality is not a given; it is not passive. Perceiving reality and “facts” requires man’s active participation. It is man who must take the last step, an act.
  • How does this relate to economics? Reality itself, our “objective” world, is cocreated; man himself participates in the creation, a creation that is constantly being re-created.
  • Our scientific models put the finishing touches on reality, because (1) they interpret, (2) they give phenomena a name, (3) they enable us to classify the world and phenomena according to logical forms, and (4) through these models we de facto perceive reality.
  • When man finds a new linguistic framework or analytical model, or stops using the old one, he molds or remolds reality. Models are only in our heads; they are not “in objective reality.” In this sense, Newton invented (not merely discovered!) gravity.
  • A real-ization act on our part represents the creation of a construct, the imputation of sense and order (which is beautifully expressed by the biblical act of naming, or categorization, sorting, ordering).
  • Keynes enters into the history of economic thought from the same intellectual cadence; his greatest contribution to economics was precisely the resurrection of the imperceptible—for example in the form of animal spirits or uncertainty. The economist Piero Mini even ascribes Keynes’s doubting and rebellious approach to his almost Talmudic education.63
  • God connects man with the task of guarding and protecting the Garden of Eden, and thus man actually cocreates the cultural landscape. The Czech philosopher Zdeněk Neubauer also describes this: “Such is reality, and it is so deep that it willingly crystallizes into worlds. Therefore I profess that reality is a creation and not a place of occurrence for objectively given phenomena.”61
  • In this viewpoint it is possible to see how Jewish thought is mystical—it admits the role of the incomprehensible. Therefore, through its groundedness, Jewish thought indulges mystery and defends itself against a mechanistic-causal explanation of the world: “The Jewish way of thinking, according to Veblen, emphasizes the spiritual, the miraculous, the intangible.”
  • The Jews believed the exact opposite. The world is created by a good God, and evil appears in it as a result of immoral human acts. Evil, therefore, is induced by man.66 History unwinds according to the morality of human acts.
  • What’s more, history seems to be based on morals; morals seem to be the key determining factors of history. For the Hebrews, history proceeds according to how morally its actors behave.
  • The Sumerians believed in dualism—good and evil deities exist, and the earth of people becomes their passive battlefield.
  • GOOD AND EVIL IN US: A MORAL EXPLANATION OF WELL-BEING
  • We have seen that in the Epic of Gilgamesh, good and evil are not yet addressed systematically on a moral level.
  • This was not about moral-human evil, but rather a kind of natural evil. It is as if good and evil were not touched by morality at all. Evil simply occurred. Period.
  • In the epic, good and evil are not envisaged morally—they are not the result of an (a)moral act. Evil was not associated with free moral action or individual will.
  • Hebrew thought, on the other hand, deals intensively with moral good and evil. A moral dimension touches the core of its stories.65
  • Some see the cause of the economic cycle in the discrepancy between savings and investment, and others are convinced of the monetary essence of the cycle.
  • The entire history of the Jewish nation is interpreted and perceived in terms of morality. Morality has become, so to speak, a mover and shaker of Hebrew history.
  • Still others have blamed sunspots. The Hebrews came up with the idea that morals were behind good and bad years, behind the economic cycle. But we would be getting ahead of ourselves.
  • Pharaoh’s Dream: Joseph and the First Business Cycle
  • It is the Pharaoh’s well-known dream of seven fat and seven lean cows, which he told to Joseph, the son of Jacob. Joseph interpreted the dream as a macroeconomic prediction of sorts: Seven years of abundance were to be followed by seven years of poverty, famine, and misery.
  • Self-Contradicting Prophecy
  • Here, let us make several observations on this: Through taxation74 at the level of one-fifth of a crop75 in good years, saving the crop, and then opening the granaries in bad years, the prophecy was de facto prevented (prosperous years were limited and hunger averted—through a predecessor of fiscal stabilization).
  • The Old Testament prophecies therefore were not any deterministic look into the future, but warnings and strategic variations of the possible, which demanded some kind of reaction. If the reaction was adequate, what was prophesied would frequently not occur at all.
  • This principle stands directly against the self-fulfilling prophecy,80 the well-known concept of social science. Certain prophecies become self-fulfilling when expressed (and believed) while others become self-contradicting prophecies when pronounced (and believed).
  • If the threat is anticipated, it is possible to totally or at least partially avoid it. Neither Joseph nor the pharaoh had the power to avoid bounty or crop failure (in this the dream interpretation was true and the appearance of the future mystical), but they avoided the impacts and implications of the prophecy (in this the interpretation of the dream was “false”)—famine did not ultimately occur in Egypt, and this was due to the application of reasonable and very intuitive economic policy.
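    A minimal sketch, in Python, of the granary policy just described. The one-fifth levy comes from the text; the harvest sizes (100 units in fat years, 40 in lean ones) and the even release schedule are hypothetical, for illustration only:

        # Joseph's granary policy: store one-fifth of the crop in the seven
        # fat years, release the stores evenly over the seven lean years.
        FAT, LEAN, LEVY_RATE = 100, 40, 1 / 5  # harvest sizes are assumptions

        granary = 0.0
        for year in range(14):
            if year < 7:                           # fat years: levy and store
                levy = LEVY_RATE * FAT
                granary += levy
                consumption = FAT - levy           # 80 instead of 100
            else:                                  # lean years: open granaries
                release = min(granary, LEVY_RATE * FAT)
                granary -= release
                consumption = LEAN + release       # 60 instead of 40
            print(f"year {year + 1:2d}: consumption {consumption:5.1f}, granary {granary:6.1f}")

    The swing in consumption narrows from 100/40 to 80/60: the prophecy of famine is contradicted by the very policy it provoked.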
  • Let us further note that the first “macroeconomic forecast” appears in a dream.
  • Back to the Torah: Later in this story we will notice that there is no reason offered as to why the cycle occurs (that will come later). Fat years will simply come, and then lean years after them.
  • Moral Explanation of a Business Cycle
  • That is fundamentally different from later Hebrew interpretations, when the Jewish nation tries to offer reasons why the nation fared well or poorly. And those reasons are moral.
  • If you pay attention to these laws and are careful to follow them, then the Lord your God will keep his covenant of love with you, as he swore to your forefathers. He will love you and bless you and increase your numbers.
  • Only in recent times have some currents of economics again become aware of the importance of morals and trust, in the form of measuring the quality of institutions, the level of justice, business ethics, corruption, and so forth, and of examining their influence on the economy.
  • From today’s perspective, we can state that the moral dimension entirely disappeared from economic thought for a long time, especially due to the implementation of Mandeville’s concept of private vices that contrarily support the public welfare.
  • Without being timid, we can say this is the first documented attempt to explain the economic cycle. The economic cycle, the explanation of which is to this day a mystery to economists, is explained morally in the Old Testament.
  • But how do we consolidate these two conflicting interpretations of the economic cycle: Can ethics be responsible for it or not? Can we influence reality around us through our acts?
  • It is not within the scope of this book to answer that question; justice has been done to the question if it manages to sketch out the main contours of possible searches for answers.
  • THE ECONOMICS OF GOOD AND EVIL: DOES GOOD PAY OFF?
  • This is probably the most difficult moral question we could ask.
  • Kant, the most important modern thinker in the area of ethics, answers on the contrary that if we carry out a “moral” act on the basis of economic calculus (therefore we carry out a hedonistic consideration; see below) in the expectation of later recompense, its morality is lost. Recompense, according to the strict Kant, annuls ethics.
  • Inquiring about the economics of good and evil, however, is not that easy. Where would Kant’s “moral dimension of ethics” go if ethics paid? If we do good for profit, the question of ethics becomes a mere question of rationality.
  • Job’s friends try to show that he must have sinned in some way and, in doing so, deserved God’s punishment. They are absolutely unable to imagine a situation in which Job, as a righteous man, would suffer without (moral) cause. Nevertheless, Job insists that he deserves no punishment because he has committed no offense: “God has wronged me and drawn his net around me.”94
  • But Job remains righteous, even though it does not pay to do so: “Though he slay me, yet will I hope in him.”95 “And till I die, I will not deny my integrity. I will maintain my righteousness and never let go of it; my conscience will not reproach me as long as I live.”96
  • He remains righteous, even if his only reward is death. What economic advantage could he have from that?
  • Morals cannot be considered in the economic dimension of productivity and calculus. The role of the Hebrews was to do good, whether it paid off or not. If good (outgoing) is rewarded by incoming goodness, it is a bonus,99 not a reason to do outgoing good. Good and reward do not correlate with each other.
  • This reasoning takes on a dimension of its own in the Old Testament. Good (incoming) has already happened to us. We must do good (outgoing) out of gratitude for the good (incoming) shown to us in the past.
  • So why do good? After all, suffering is the fate of many biblical figures. The answer can only be: For good itself. Good has the power to be its own reward. In this sense, goodness gets its reward, which may or may not take on a material dimension.
  • The Hebrews offered an interesting compromise between the teachings of the Stoics and the Epicureans. We will go into it in detail later, so only briefly here.
  • It calls for bounded optimization (optimization with limits). A kind of symbiosis existed between the legitimate search for one’s own utility (or enjoyment of life) and the maintenance of rules, which are not negotiable and which are not subject to optimization.
  • In other words, clear (exogenously given) rules exist that must be observed and cannot be contravened. But within these borders it is absolutely possible, and even recommended, to increase utility.
  • The mining of enjoyment must not come at the expense of exogenously given rules. “Judaism comes therefore to train or educate the unbounded desire … for wealth, so that market activities and patterns of consumption operate within a God-given morality.”102
  • The Epicureans acted with the goal of maximizing utility without regard for rules (rules developed endogenously, from within the system, computed from that which increased utility—this was one of the main trumps of the Epicurean school; they did not need exogenously given norms, and argued that they could “calculate” ethics (what to do) for every given situation from the situation itself).
  • The Stoics could not seek their enjoyment—or, by another name, utility. They could not in any way look back on it, and in no way could they count on it. They could only live according to rules (the greatest weakness of this school was defending where the exogenously given rules came from and whether they are universal) and take an indifferent stand toward the results of their actions.
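    The three positions can be condensed into a small optimization sketch (the notation is mine, not the book’s: U for utility, R for the set of rule-abiding actions):

        \begin{align*}
        \text{Epicureans:}\quad & \max_{x} U(x) && \text{rules computed endogenously, from utility itself}\\
        \text{Stoics:}\quad & \text{choose } x \in R && \text{live by the rules, indifferent to } U\\
        \text{Hebrews:}\quad & \max_{x \in R} U(x) && \text{bounded optimization: maximize, but only within exogenous rules}
        \end{align*}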
  • To Love the Law
  • The Jews not only had to observe the law (perhaps the word covenant would be more appropriate), but they were to love it because it was good.
  • Their relationship to the law was not supposed to be one of duty,105 but one of gratitude, love. Hebrews were to do good (outgoing), because goodness (incoming) has already been done to them.
  • This is in stark contrast with today’s legal system, where, naturally, no mention of love or gratefulness exists. But God expects a full internalization of the commandments and their fulfillment with love, not merely out of duty. By no means was this to be on the basis of the cost-benefit analyses so widespread in economics today, which determine when it pays to break the law and when not to (calculated on the basis of the probability of being caught and the amount of punishment vis-à-vis the possible gain).
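    The cost-benefit calculus mentioned here can be written as a single expected-value comparison (symbols mine, for illustration). Such an analysis recommends breaking a law whenever

        G > p \cdot F

    where G is the possible gain, p the probability of being caught, and F the punishment. It is precisely this mode of reasoning that the commanded love of the law excludes.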
  • And now, O Israel, what does the Lord your God ask of you but to fear the Lord your God, to walk in all his ways, to love him, to serve the Lord your God with all your heart and with all your soul, and to observe the Lord’s commands and decrees that I am giving you today for your own good? To the Lord your God belong the heavens, even the highest heavens, the earth and everything in it. Yet the Lord set his affection on your forefathers and loved them….
  • The principle of doing good (outgoing) on the basis of a priori demonstrated good (incoming) was also taken over by the New Testament. Atonement itself is based on an a priori principle; all our acts are preceded by good.
  • The Hebrews, originally a nomadic tribe, preferred to be unrestrained and grew up in constant freedom of motion.
  • Human laws, if they are in conflict with the responsibilities given by God, are subordinate to personal responsibility, and a Jew cannot simply join the majority, even if it is legally allowed. Ethics, the concept of good, is therefore always superior to all local laws, rules, and customs:
  • THE SHACKLES OF THE CITY
  • Owing to the Hebrews’ liberation from Egyptian slavery, freedom and responsibility become the key values of Jewish thought.
  • Laws given by God are binding for Jews, and God is the absolute source of all values.
  • The Hebrew ideal is represented by the paradise of the Garden of Eden, not a city.116 The despised city civilization or the tendency to see in it a sinful and shackling way of life appears in glimpses and allusions in many places in the Old Testament.
  • The nomadic Jewish ethos is frequently derived from Abraham, who left the Chaldean city of Ur on the basis of a command:
  • In addition, they were aware of a thin two-way line between owner and owned. We own material assets, but—to a certain extent—they own us and tie us down. Once we become used to a certain material (…)
  • This way of life understandably had immense economic impacts. First, such a society lived in much more connected relationships, where there was no doubt that everyone mutually depended on each other. Second, their frequent wanderings meant the inability to own more than they could carry; the gathering up of material assets did not have great weight—precisely because the physical weight (mass) of things tied them to one place.
  • One of Moses’s greatest deeds was that he managed to explain to his nation once and for all that it is better to remain hungry and liberated than to be a slave with food “at no cost.”
  • SOCIAL WELFARE: NOT TO ACT IN THE MANNER OF SODOM
  • A system of social regulations is developed in the Old Testament, one we hardly find in any other nation of the time. In Hebrew teachings, aside from individual utility, indications of the concept of maximizing utility societywide appear for the first time, as embodied in the Talmudic principle of Kofin al midat S’dom, which can be translated as “one is compelled not to act in the manner of Sodom” and to take care of the weaker members of society.
  • In a jubilee year, debts were to be forgiven,125 and Israelites who fell into slavery due to their indebtedness were to be set free.126
  • Such provisions can be seen as the antimonopoly and social measures of the time. The economic system even then had a clear tendency to converge toward asset concentration, and therefore toward concentration of power as well. It would appear that these provisions were supposed to prevent this process.
  • Land at the time could be "sold," but it was not a sale so much as a rental. The price (rent) of real estate depended on how long remained until a forgiveness year. Behind this was the awareness that we may work the land, but in the last instance we are merely "aliens and strangers," who have the land only rented to us for a fixed time. All land and riches came from the Lord.
  • These provisions express a conviction that freedom and inheritance should not be permanently taken away from any Israelite. Last but not least, this system reminds us that no ownership lasts forever and that the fields we plow are not ours but the Lord’s.
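  The pricing rule in these provisions amounts to a lease valuation: land is worth the sum of the harvests remaining before the forgiveness year. A minimal sketch, with a hypothetical annual yield:

      # Land "sold" for the value of the harvests left until the jubilee,
      # per the rule described above (the yield figure is hypothetical).
      def land_price(annual_yield, years_to_jubilee):
          return annual_yield * years_to_jubilee

      print(land_price(annual_yield=30, years_to_jubilee=49))  # 1470: a full term
      print(land_price(annual_yield=30, years_to_jubilee=5))   # 150: jubilee near, land cheap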
  • Gleaning: Another social provision was the right to glean, which in Old Testament times ensured at least basic sustenance for the poorest. Anyone who owned a field had the responsibility not to harvest it to the last grain but to leave the remains in the field for the poor.
  • Tithes and Early Social Net: Every Israelite also had the responsibility of setting aside a tithe from their entire crop. They had to be aware from whom all ownership comes and, by doing so, express their thanks.
  • “Since the community has an obligation to provide food, shelter, and basic economic goods for the needy, it has a moral right and duty to tax its members for this purpose. In line with this duty, it may have to regulate markets, prices and competition, to protect the interests of its weakest members.”135
  • In Judaism, charity is not perceived as a sign of goodness; it is more of a responsibility. Such a society then has the right to regulate its economy in such a way that the responsibility of charity is carried out to its satisfaction.
  • With a number of responsibilities, however, comes the difficulty of putting them into practice. Their fulfillment, where possible, therefore takes place gradually, "in layers." Charitable activities are classified in the Talmud according to several target groups with various priorities, ordered by what could be called rules of subsidiarity.
  • Do not mistreat an alien or oppress him, for you were aliens in Egypt.140 As one can see, aside from widows and orphans, the Old Testament also includes immigrants in its area of social protection.141 The Israelites had to have the same rules apply for them as for themselves—they could not discriminate on the basis of their origin.
  • ABSTRACT MONEY, FORBIDDEN INTEREST, AND OUR DEBT AGE
  • If it appears to us that today's era is based on money and debt, and that our time will be written into history as the "Debt Age," then it will certainly be interesting to trace how this development occurred.
  • Money is a social abstractum. It is a social agreement, an unwritten contract.
  • The first money came in the form of clay tablets from Mesopotamia, on which debts were written. These debts were transferable, so the debts became currency. In the end, “It is no coincidence that in English the root of ‘credit’ is ‘credo,’ the Latin for ‘I believe.’”
  • To a certain extent it could be said that credit, or trust, was the first currency. It can materialize, it can be embodied in coins, but what is certain is that "money is not metal," even the rarest metal; "it is trust inscribed."
  • Inseparably, with the original credit (money) goes interest. For the Hebrews, the problem of interest was a social issue: “If you lend money to one of my people among you who is needy, do not be like a moneylender; charge him no interest.”
  • there were also clearly set rules on how far one could go in demanding guarantees and in collecting unpaid debts. No one should become so indebted as to lose the source of their livelihood:
  • In the end, the term "bank" comes from the Italian banchi, the benches that Jewish lenders sat on.157
  • Money plays not only its classical roles (a means of exchange, a store of value, etc.) but also a much greater, stronger one: it can stimulate, drive, or slow down the whole economy. Money plays a national economic role.
  • In the course of history, however, the role of loans changed, and the rich borrowed especially for investment purposes,
  • Today the position and significance of money and debt have reached such a dominant place in society that operating with debt (fiscal policy) or with interest and the money supply (monetary policy) can, to a certain extent, direct, or at least strongly influence, the whole economy and society.
  • In such a case, a ban on interest no longer had great ethical significance. Thomas Aquinas, a medieval scholar (1225–1274), reasoned similarly; in his time, the strict ban on lending at usurious interest was loosened, possibly due to his influence.
  • As a form of energy, money can travel in three dimensions: vertically (those who have capital lend to those who do not), horizontally (speed and freedom in geographic motion have become the by-product, or perhaps the driving force, of globalization), and, unlike people, through time.
  • money is something like energy that can travel through time. And it is a very useful energy, but at the same time very dangerous as well. Wherever…
  • Aristotle condemned interest162 not only from a moral standpoint, but also for metaphysical reasons. Thomas Aquinas shared the same fear of interest and he too argued that time does not belong to us, and that is why we must not require interest.
  • MONEY AS ENERGY: TIME TRAVEL AND GROSS DEBT PRODUCT (GDP)
  • Due to this characteristic, we can energy-strip the future to the benefit of the present. Debt can transfer energy from the future to the present.163 On the other hand, saving can accumulate energy from the past and send it to the present.
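  In modern terms, this "time travel" is just compounding and discounting. A minimal sketch of both directions, assuming a hypothetical 5 percent interest rate:

      # Money as energy moving through time (the rate is hypothetical).
      RATE = 0.05

      def debt_pulls_future_to_present(future_income, years):
          """Borrowing: the present value of income not yet earned."""
          return future_income / (1 + RATE) ** years

      def saving_pushes_past_to_present(past_surplus, years):
          """Saving: a past surplus compounded forward to today."""
          return past_surplus * (1 + RATE) ** years

      print(debt_pulls_future_to_present(100, 10))   # ~61.39 spendable now, repaid later
      print(saving_pushes_past_to_present(100, 10))  # ~162.89 available now, stored earlier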
  • labor was not considered degrading in the Old Testament. On the contrary, the subjugation of nature is even a mission from God, one that belonged among man's very first blessings.
  • LABOR AND REST: THE SABBATH ECONOMY
  • The Jews as well as Aristotle behaved very guardedly toward loans. The issue of interest/usury became one of the first economic debates. Without having an inkling of the future role of economic policy (fiscal and monetary), the ancient Hebrews may have unwittingly felt that they were discovering in interest a very powerful weapon, one that can be a good servant, but (literally) an enslaving master as well.
  • It’s something like a dam. When we build one, we are preventing periods of drought and flooding in the valley; we are limiting nature’s whims and, to a large extent, avoiding its incalculable cycles. Using dams, we can regulate the flow of water to nearly a constant. With it we tame the river (and we can also gain…
  • But if we do not regulate the water wisely, it may happen that we would overfill the dam and it would break. For the cities lying in the valley, their end would be worse than if a dam were never there.
  • If man lived in harmony with nature before, now, after the fall, he must fight; nature stands against him and he against it and the animals. From the Garden we have moved onto a (battle)field.
  • Only after man’s fall does labor turn into a curse.168 It could even be said that this is actually the only curse, the curse of the unpleasantness of labor, that the Lord places on Adam.
  • Both Plato and Aristotle consider labor to be necessary for survival, but that only the lower classes should devote themselves to it so that the elites would not have to be bothered with it and so that they could devote themselves to “purely spiritual matters—art, philosophy, and politics.”
  • Work is not only a source of pleasure but also of social standing; it is considered an honor. “Do you see a man skilled in his work? He will serve before kings.”170 None of the surrounding cultures appreciated work as much. The idea of the dignity of labor is unique to the Hebrew tradition.
  • Hebrew thinking is characterized by a strict separation of the sacred from the profane. In life, there are simply areas that are holy, and in which it is not allowed to economize, rationalize, or maximize efficiency.
  • A good example is the commandment on the Sabbath. No one at all could work on this day, not even those who were subordinate to an observant Jew:
  • the message of the Sabbath commandment was that people were not primarily created for labor.
  • Paradoxically, it is precisely this commandment out of all ten that is probably the most violated today.
  • Aristotle even considers labor to be “a corrupted waste of time which only burdens people’s path to true honour.”
  • we have days when we must not toil, connected (at least lexically) with the word meaning emptiness: the English term “vacation” (an emptying out), like the French les vacances, or the German die Freizeit, meaning open time, free time, but also…
  • Translated into economic language: The meaning of utility is not to increase it permanently but to rest among existing gains. Why do we learn how to constantly increase gains but not how to…
  • This dimension has disappeared from today’s economics. Economic effort has no goal at which it would be possible to rest. Today we only know growth for growth’s sake, and if our company or country prospers, that does not…
  • For six-sevenths of time, be dissatisfied and reshape the world into your own image, man; but for one-seventh you will rest and not change the creation. On the seventh day, enjoy creation and enjoy the work of your hands.
  • the purpose of creation was not just creating but that it had an end, a goal. The process was just a process, not a purpose. The whole of Being was created so…
  • Saturday was not established to increase efficiency. It was a real ontological break that followed the example of the Lord’s seventh day of creation. The Lord did not rest due to tiredness or to regenerate strength; He rested because He was done. He was done with His work, so that He could enjoy it and delight in His creation.
  • If we believe in rest at all today, it is for different reasons. It is the rest of the exhausted machine, the rest of the weak, and the rest of those who can’t handle the tempo. It’s no wonder that the word “rest…
  • Related to this, we have studied the first mention of a business cycle with the pharaoh’s dream as well as seen a first attempt (that we may call…
  • We have tried to show that the quest for a heaven on Earth (similar to the Jewish one) has, in its desacralized form, actually also been the same quest for many of the…
  • We have also seen that the Hebrews tried to explain the business cycle with morality and ethics. For the Hebrews,…
  • ancient Greek economic ethos, we will examine two extreme approaches to laws and rules. While the Stoics considered laws to be absolutely valid, and utility had infinitesimal meaning in their philosophy, the Epicureans, at least in the usual historical explanation, placed utility and pleasure in first place—rules were to be made based on the principle of utility.
  • CONCLUSION: BETWEEN UTILITY AND PRINCIPLE
  • The influence of Jewish thought on the development of market democracy cannot be overestimated. The key heritage for us was the lack of an ascetic perception of the world, respect for law and private…
  • We have tried to show how the Torah desacralized three important areas in our lives: the earthly ruler, nature,…
  • What is the relationship between the good and evil that we do (outgoing) and the utility or disutility that we (expect to) get as a reward (incoming)? We have seen…
  • The Hebrews never despised material wealth; on the contrary, the Jewish faith puts great responsibility on property management. Also, the idea of progress and the linear perception of time give our (economic)…
  • the Hebrews managed to find something of a happy compromise between both of these principles.
  • will not be able to completely understand the development of the modern notion of economics without understanding the disputes between the Epicureans and the Stoics;
  • poets actually went even further, and with their speech they shaped and established reality and truth. Honor, adventure, great deeds, and the acclaim connected with them played an important role in the establishment of the true, the real.
  • those who are famous will be remembered by people. They become more real, part of the story, and they start to be “realized,” “made real” in the lives of other people. That which is stored in memory is real; that which is forgotten is as if it never existed.
  • Today’s scientific truth is founded on the notion of exact and objective facts, but poetic truth stands on an interior (emotional) consonance with the story or poem. “It is not addressed first to the brain … [myth] talks directly to the feeling system.”
  • “epic and tragic poets were widely assumed to be the central ethical thinkers and teachers of Greece; nobody thought of their work as less serious, less aimed at truth, than the speculative prose treatises of historians and philosophers.”5 Truth and reality were hidden in speech, stories, and narration.
  • Ancient philosophy, just as science would later, tries to find constancy: constants, quantities, inalterabilities. Science seeks (creates?) order and neglects everything else as much as it can. Yet from their own experience, everyone knows that life is not like that…
  • Just as scientists do today, artists drew images of the world that were representative, and therefore symbolic, picturelike, and simplifying (but thus also misleading), just like scientific models, which often do not strive to be “realistic.”
  • general? In the end, poetry could be more sensitive to the truth than the philosophical method or, later, the scientific method. “Tragic poems, in virtue of their subject matter and their social function, are likely to confront and explore problems about human beings and luck that a philosophical text might be able to omit or avoid.”8
Javier E

These Truths: A History of the United States (Jill Lepore) - 1 views

  • It was meant to mark the start of a new era, in which the course of history might be made predictable and a government established that would be ruled not by accident and force but by reason and choice. The origins of that idea, and its fate, are the story of American history.
  • It seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force.6 This was the question of that autumn. And, in a way, it has been the question of every season since,
  • I once came across a book called The Constitution Made Easy.7 The Constitution cannot be made easy. It was never meant to be easy.
  • THE AMERICAN EXPERIMENT rests on three political ideas—“these truths,” Thomas Jefferson called them—political equality, natural rights, and the sovereignty of the people.
  • After Benjamin Franklin read Jefferson’s draft, he picked up his quill, scratched out the words “sacred & undeniable,” and suggested that “these truths” were, instead, “self-evident.” This was more than a quibble. Truths that are sacred and undeniable are God-given and divine, the stuff of religion. Truths that are self-evident are laws of nature, empirical and observable, the stuff of science. This divide has nearly rent the Republic apart.
  • The real dispute is between “these truths” and the course of events: Does American history prove these truths, or does it belie them?
  • The United States rests on a dedication to equality, which is chiefly a moral idea, rooted in Christianity, but it rests, too, on a dedication to inquiry, fearless and unflinching. Its founders agreed with the Scottish philosopher and historian David Hume, who wrote, in 1748, that “Records of Wars, Intrigues, Factions, and Revolutions are so many Collections of Experiments.”9 They believed that truth is to be found in ideas about morality but also in the study of history.
  • understanding history as a form of inquiry—not as something easy or comforting but as something demanding and exhausting—was central to the nation’s founding. This, too, was new.
  • A new kind of historical writing, less memorial and more unsettling, only first emerged in the fourteenth century. “History is a philosophical science,” the North African Muslim scholar Ibn Khaldun wrote in 1377, in the prologue to his history of the world, in which he defined history as the study “of the causes and origins of existing things.”11
  • Only by fits and starts did history become not merely a form of memory but also a form of investigation, to be disputed, like philosophy, its premises questioned, its evidence examined, its arguments countered.
  • Declaring independence was itself an argument about the relationship between the present and the past, an argument that required evidence of a very particular kind: historical evidence. That’s why most of the Declaration of Independence is a list of historical claims. “To prove this,” Jefferson wrote, “let facts be submitted to a candid world.”
  • In an attempt to solve this problem, the earliest historians of the United States decided to begin their accounts with Columbus’s voyage, stitching 1776 to 1492. George Bancroft published his History of the United States from the Discovery of the American Continent to the Present in 1834, when the nation was barely more than a half-century old, a fledgling, just hatched. By beginning with Columbus, Bancroft made the United States nearly three centuries older than it was, a many-feathered old bird.
  • In 1787, then, when Alexander Hamilton asked “whether societies of men are really capable or not of establishing good government from reflection and choice, or whether they are forever destined to depend for their political constitutions on accident and force,” that was the kind of question a scientist asks before beginning an experiment. Time alone would tell. But time has passed. The beginning has come to an end. What, then, is the verdict of history?
  • In deciding what to leave in and what to leave out, I’ve confined myself to what, in my view, a people constituted as a nation in the early twenty-first century need to know about their own past, mainly because this book is meant to double as an old-fashioned civics book, an explanation of the origins and ends of democratic institutions, from the town meeting to the party system, from the nominating convention to the secret ballot, from talk radio to Internet polls. This book is chiefly a political
  • Aside from being a brief history of the United States and a civics primer, this book aims to be something else, too: it’s an explanation of the nature of the past. History isn’t only a subject; it’s also a method.
  • The truths on which the nation was founded are not mysteries, articles of faith, never to be questioned, as if the founding were an act of God, but neither are they lies, all facts fictions, as if nothing can be known, in a world without truth.
  • Between reverence and worship, on the one side, and irreverence and contempt, on the other, lies an uneasy path, away from false pieties and petty triumphs over people who lived and died and committed both their acts of courage and their sins and errors long before we committed ours. “We cannot hallow this ground,” Lincoln said at Gettysburg. We are obliged, instead, to walk this ground, dedicating ourselves to both the living and the dead.
  • studying history is like that, looking into one face and seeing, behind it, another, face after face after face. “Know whence you came,” Baldwin told his nephew.17 The past is an inheritance, a gift and a burden. It can’t be shirked. You carry it everywhere. There’s nothing for it but to get to know it.
  • Nature takes one toll, malice another. History is the study of what remains, what’s left behind, which can be almost anything, so long as it survives the ravages of time and war: letters, diaries, DNA, gravestones, coins, television broadcasts, paintings, DVDs, viruses, abandoned Facebook pages, the transcripts of congressional hearings, the ruins of buildings. Some of these things are saved by chance or accident, like the one house that, as if by miracle, still stands after a hurricane razes a town. But most of what historians study survives because it was purposely kept—placed
  • As nation-states emerged, they needed to explain themselves, which they did by telling stories about their origins, tying together ribbons of myths, as if everyone in the “English nation,” for instance, had the same ancestors, when, of course, they did not. Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.15
  • When the United States declared its independence in 1776, plainly, it was a state, but what made it a nation? The fiction that its people shared a common ancestry was absurd on its face; they came from all over, and, having waged a war against England, the very last thing they wanted to celebrate was their Englishness.
  • Facts, knowledge, experience, proof. These words come from the law. Around the seventeenth century, they moved into what was then called “natural history”: astronomy, physics, chemistry, geology. By the eighteenth century they were applied to history and to politics, too. These truths: this was the language of reason, of enlightenment, of inquiry, and of history.
  • Against conquest, slaughter, and slavery came the urgent and abiding question, “By what right?”
  • Yet the origins of the United States date to 1492 for another, more troubling reason: the nation’s founding truths were forged in a crucible of violence, the products of staggering cruelty, conquest and slaughter, the assassination of worlds.
  • Locke, spurred both by a growing commitment to religious toleration and by a desire to distinguish English settlement from Spanish conquest, stressed the lack of cultivation as a better justification for taking the natives’ land than religious difference, an emphasis with lasting consequences.
  • Unlike Polo and Mandeville, Columbus did not make a catalogue of the ways and beliefs of the people he met (only later did he hire Pané to do that). Instead, he decided that the people he met had no ways and beliefs. Every difference he saw as an absence.22 Insisting that they had no faith and no civil government and were therefore infidels and savages who could not rightfully own anything, he claimed possession of their land, by the act of writing. They were a people without truth; he would make his truth theirs. He would tell them where the dead go.
  • It became commonplace, inevitable, even, first among the Spanish, and then, in turn, among the French, the Dutch, and the English, to see their own prosperity and good health and the terrible sicknesses suffered by the natives as signs from God. “Touching these savages, there is a thing that I cannot omit to remark to you,” one French settler wrote: “it appears visibly that God wishes that they yield their place to new peoples.” Death convinced them at once of their right and of the truth of their faith. “The natives, they are all dead of small Poxe,” John Winthrop wrote when he arrived in New England in 1630: “the Lord hathe cleared our title to what we possess.”
  • In much of New Spain, the mixed-race children of Spanish men and Indian women, known as mestizos, outnumbered Indians; an intricate caste system marked gradations of skin color, mixtures of Europeans, Native Americans, and Africans, as if skin color were like dyes made of plants, the yellow of sassafras, the red of beets, the black of carob. Later, the English would recognize only black and white, a fantasy of stark and impossible difference, of nights without twilight and days without dawns. And yet both regimes of race, a culture of mixing or a culture of pretending not to mix, pressed upon the brows of every person of the least curiosity the question of common humanity: Are all peoples one?
  • Elizabeth’s best defender argued that if God decided “the female should rule and govern,” it didn’t matter that women were “weake in nature, feable in bodie, softe in courage,” because God would make every right ruler strong. In any case, England’s constitution abided by a “rule mixte,” in which the authority of the monarch was checked by the power of Parliament; also, “it is not she that ruleth but the lawes.” Elizabeth herself called on yet another authority: the favor of the people.48 A mixed constitution, the rule of law, the will of the people: these were English ideas that Americans would one day make their own, crying, “Liberty!”
  • In the brutal, bloody century between Columbus’s voyage and John White’s, an idea was born, out of fantasy, out of violence, the idea that there exists in the world a people who live in an actual Garden of Eden, a state of nature, before the giving of laws, before the forming of government. This imagined history of America became an English book of genesis, their new truth. “In the beginning,” the Englishman John Locke would write, “all the world was America.” In America, everything became a beginning.
  • England’s empire would have a different character than that of either Spain or France. Catholics could make converts by the act of baptism, but Protestants were supposed to teach converts to read the Bible; that meant permanent settlements, families, communities, schools, and churches. Also, England’s empire would be maritime—its navy was its greatest strength. It would be commercial. And, of greatest significance for the course of the nation that would grow out of those settlements, its colonists would be free men, not vassals, guaranteed their “English liberties.”
  • Beginning with the Virginia charter, the idea of English liberties for English subjects was planted on American soil and, with it, the king’s claim to dominion, a claim that rested on the idea that people like Powhatan and his people lived in darkness and without government, no matter that the English called their leaders kings.
  • Twenty Englishmen were elected to the House of Burgesses. Twenty Africans were condemned to the house of bondage. Another chapter opened in the American book of genesis: liberty and slavery became the American Abel and Cain.
  • To build his case against the king, Coke dusted off a copy of an ancient and almost entirely forgotten legal document, known as Magna Carta (literally, the “great charter”), in which, in the year 1215, King John had pledged to his barons that he would obey the “law of the land.” Magna Carta wasn’t nearly as important as Coke made it out to be, but by arguing for its importance, he made it important, not only for English history, but for American history, too, tying the political fate of everyone in England’s colonies to the strange doings of a very bad king from the Middle Ages.
  • Magna Carta explains a great deal about how it is that some English colonists would one day come to believe that their king had no right to rule them and why their descendants would come to believe that the United States needed a written constitution. But Magna Carta played one further pivotal role, the role it played in the history of truth—a history that had taken a different course in England than in any other part of Europe.
  • The most crucial right established under Magna Carta was the right to a trial by jury.
  • in 1215, the pope banned trial by ordeal. In Europe, it was replaced by a new system of divine judgment: judicial torture. But in England, where there existed a tradition of convening juries to judge civil disputes—like disagreements over boundaries between neighboring freeholds—trial by ordeal was replaced not by judicial torture but by trial by jury.
  • This turn marked the beginning of a new era in the history of knowledge: it required a new doctrine of evidence and new method of inquiry and eventually led to the idea that an observed or witnessed act or thing—the substance, the matter, of fact—is the basis of truth. A judge decided the law; a jury decided the facts. Mysteries were matters of faith, a different kind of truth, known only to God.
  • The age of mystery began to wane, and, soon, the culture of fact spread from law to government.
  • There would never be very many Africans in New England, but New Englanders would have slave plantations, on the distant shores. Nearly half of colonial New Englanders’ wealth would come from sugar grown by West Indian slaves.
  • One million Europeans migrated to British America between 1600 and 1800 and two and a half million Africans were carried there by force over that same stretch of centuries, on ships that sailed past one another by day and by night.42 Africans died faster, but as a population of migrants, they outnumbered Europeans two and a half to one.
  • In the last twenty-five years of the seventeenth century, English ships, piloted by English sea captains, crewed by English sailors, carried more than a quarter of a million men, women, and children across the ocean, shackled in ships’ holds.44 Theirs was not a ship of state crossing a sea of troubles, another Mayflower, their bond a covenant. Theirs was a ship of slavery, their bonds forged in fire. They whispered and wept; they screamed and sat in silence. They grew ill; they grieved; they died; they endured.
  • By what right did the English hold these people as their slaves?
  • Under Roman law, all men are born free and can only be made slaves by the law of nations, under certain narrow conditions—for instance, when they’re taken as prisoners of war, or when they sell themselves as payment of debt. Aristotle had disagreed with Roman law, insisting that some men are born slaves. Neither of these traditions from antiquity proved to be of much use to English colonists attempting to codify their right to own slaves, because laws governing slavery, like slavery itself, had disappeared from English common law by the fourteenth century. Said one Englishman in Barbados in 1661, there was “no track to guide us where to walk nor any rule sett us how to govern such Slaves.”46
  • With no track or rule to guide them, colonial assemblies adopted new practices and devised new laws with which they attempted to establish a divide between “blacks” and “whites.”
  • Adopting these practices and passing these laws required turning English law upside down, because much in existing English law undermined the claims of owners of people. In 1655, a Virginia woman with an African mother and an English father sued for her freedom by citing English common law, under which children’s status follows that of their father, not their mother. In 1662, Virginia’s House of Burgesses answered doubts about “whether children got by any Englishman upon a Negro woman should be slave or ffree” by reaching back to an archaic Roman rule, partus sequitur ventrem (you are what your mother was). Thereafter, any child born of a woman who was a slave inherited her condition.
  • By giving Americans a more ancient past, he hoped to make America’s founding appear inevitable and its growth inexorable, God-ordained. He also wanted to celebrate the United States, not as an offshoot of England, but instead as a pluralist and cosmopolitan nation, with ancestors all over the world.
  • No book should be censored before publication, Milton argued (though it might be condemned after printing), because truth could only be established if allowed to do battle with lies. “Let her and falsehood grapple,” he urged, since, “whoever knew Truth to be put to the worst in a free and open encounter?” This view depended on an understanding of the capacity of the people to reason. The people, Milton insisted, are not “slow and dull, but of a quick, ingenious and piercing spirit, acute to invent, subtle and sinewy to discourse, not beneath the reach of any point the highest that human capacity can soar to.”52
  • All men, Locke argued, are born equal, with a natural right to life, liberty, and property; to protect those rights, they erect governments by consent. Slavery, for Locke, was no part either of a state of nature or of civil society. Slavery was a matter of the law of nations, “nothing else, but the state of war continued, between a lawful conqueror and a captive.” To introduce slavery in the Carolinas, then, was to establish, as fundamental to the political order, an institution at variance with everything about how Locke understood civil society.
  • Long before shots were fired at Lexington and Concord, long before George Washington crossed the Delaware, long before American independence was thought of, or even thinkable, a revolutionary tradition was forged, not by the English in America, but by Indians waging wars and slaves waging rebellions. They revolted again and again and again. Their revolutions came in waves that lashed the land. They asked the same question, unrelentingly: By what right are we ruled?
  • Rebellion hardened lines between whites and blacks. Before Bacon and his men burned Jamestown, poor Englishmen had very little political power. As many as three out of every four Englishmen and women who sailed to the colonies were either debtors or convicts or indentured servants; they weren’t slaves, but neither were they free.61 Property requirements for voting meant that not all free white men could vote. Meanwhile, the fact that slaves could be manumitted by their masters meant that it was possible to be both black and free and white and unfree. But after Bacon’s Rebellion, free white men were granted the right to vote, and it became nearly impossible for black men and women to secure their freedom. By 1680, one observer could remark that “these two words, Negro and Slave” had “grown Homogeneous and convertible”: to be black was to be a slave.
  • Benjamin Franklin eventually settled in the tidy Quaker town of Philadelphia and began printing his own newspaper, the Pennsylvania Gazette, in 1729. In its pages, he fought for freedom of the press. In a Miltonian 1731 “Apology for Printers,” he observed “that the Opinions of Men are almost as various as their Faces” but that “Printers are educated in the Belief, that when Men differ in Opinion, both Sides ought equally to have the Advantage of being heard by the Publick; and that when Truth and Error have fair Play, the former is always an overmatch for the latter.”
  • But if the culture of the fact hadn’t yet spread to newspapers, it had spread to history. In Leviathan, Thomas Hobbes had written that “The register of Knowledge of Fact is called History.”74 One lesson Americans would learn from the facts of their own history had to do with the limits of the freedom of the press, and this was a fact on which they dwelled, and a liberty they grew determined to protect.
  • Slavery does not exist outside of politics. Slavery is a form of politics, and slave rebellion a form of violent political dissent. The Zenger trial and the New York slave conspiracy were much more than a dispute over freedom of the press and a foiled slave rebellion: they were part of a debate about the nature of political opposition, and together they established its limits. Both Cosby’s opponents and Caesar’s followers allegedly plotted to depose the governor. One kind of rebellion was celebrated, the other suppressed—a division that would endure.
  • In American history, the relationship between liberty and slavery is at once deep and dark: the threat of black rebellion gave a license to white political opposition.
  • This, too, represented a kind of revolution: Whitefield emphasized the divinity of ordinary people, at the expense of the authority of their ministers.
  • In 1751 he wrote an essay about the size of the population, called “Observations concerning the Increase of Mankind, Peopling of Countries, &c.”
  • Franklin guessed the population of the mainland colonies to be about “One Million English Souls,” and his calculations suggested that this number would double every twenty-five years. At that rate, in only a century, “the greatest Number of Englishmen will be on this Side the Water.” Franklin’s numbers were off; his estimates weren’t too high; they were too low. At the time, more than 1.5 million people lived in Britain’s thirteen mainland colonies. Those colonies were far more densely settled than New France or New Spain. Only 60,000 French settlers lived in Canada and 10,000 more in Louisiana. New Spain was even more thinly settled.
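  Franklin’s projection is plain exponential arithmetic: a population that doubles every twenty-five years grows sixteenfold in a century. A quick sketch in Python, using his own (too-low) starting figure of one million:

      # Franklin's 1751 projection: doubling every 25 years.
      population = 1_000_000  # his guess of "One Million English Souls"
      for year in (0, 25, 50, 75, 100):
          print(year, population * 2 ** (year // 25))
      # Year 100: 16,000,000, the basis of his claim that the greatest
      # number of Englishmen would soon be on the American side of the water.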
  • he wrote about a new race, a people who were “white.” “The Number of purely white People in the World is proportionably very small,” Franklin began. As he saw it, Africans were “black”; Asians and Native Americans were “tawny”; Spaniards, Italians, French, Russians, Swedes, and Germans were “swarthy.” That left very few people, and chiefly the English, as the only “white people” in the world. “I could wish their Numbers were increased,” Franklin said, adding, wonderingly, “But perhaps I am partial to the Complexion of my Country, for such Kind of Partiality is natural to Mankind.”
  • Franklin’s “JOIN, or DIE” did some of that, too: it offered a lesson about the rulers and the ruled, and the nature of political communities. It made a claim about the colonies: they were parts of a whole.
  • When Benjamin Franklin began writing his autobiography, in 1771, he turned the story of his own escape—running away from his apprenticeship to his brother James—into a metaphor for the colonies’ growing resentment of parliamentary rule. James’s “harsh and tyrannical Treatment,” Franklin wrote, had served as “a means of impressing me with that Aversion to arbitrary Power that has stuck to me thro’ my whole Life.”7 But that was also the story of every runaway slave ad, testament after testament to an aversion to arbitrary power.
  • The American Revolution did not begin in 1775 and it didn’t end when the war was over. “The success of Mr. Lay, in sowing the seeds of . . . a revolution in morals, commerce, and government, in the new and in the old world, should teach the benefactors of mankind not to despair, if they do not see the fruits of their benevolent propositions, or undertakings, during their lives,” Philadelphia doctor Benjamin Rush later wrote.
  • There were not one but two American revolutions at the end of the eighteenth century: the struggle for independence from Britain, and the struggle to end slavery. Only one was won.
  • The Revolution was at its most radical in the challenge it presented to the institution of slavery and at its most conservative in its failure to meet that challenge. Still, the institution had begun to break, like a pane of glass streaked with cracks but not yet shattered.
  • “I wish our Poor Distracted State would atend to the many good Lessons” of history, Jane Franklin wrote to her brother, and not “keep always in a Flame.”21
  • After Annapolis, Madison went home to Virginia and resumed his course of study. In April of 1787, he drafted an essay called “Vices of the Political System of the United States.” It took the form of a list of eleven deficiencies,
  • it closed with a list of causes for these vices, which he located primarily “in the people themselves.” By this last he meant the danger that a majority posed to a minority: “In republican Government the majority however composed, ultimately give the law. Whenever therefore an apparent interest or common passion unites a majority what is to restrain them from unjust violations of the rights and interests of the minority, or of individuals?”27 What force restrains good men from doing bad things? Honesty, character, religion—these, history demonstrated, were not to be relied upon. No, the only force that could restrain the tyranny of the people was the force of a well-constructed constitution. It would have to be as finely wrought as an iron gate.
  • At the convention, it proved impossible to set the matter of slavery aside, both because the question of representation turned on it and because any understanding of the nature of tyranny rested on it. When Madison argued about the inevitability of a majority oppressing a minority, he cited ancient history, and told of how the rich oppressed the poor in Greece and Rome. But he cited, too, modern American history. “We have seen the mere distinction of color made in the most enlightened period of time, the ground of the most oppressive dominion ever exercised by man over man.”40
  • If not for the three-fifths rule, the representatives of free states would have outnumbered representatives of slave states by 57 to 33.44
  • Wilson, half Franklin’s age, read his remarks instead. “Mr. President,” he began, addressing Washington, “I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them.” He suggested that he might, one day, change his mind. “For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.” Hoping to pry open the minds of delegates who were closed to the compromise before them, he reminded them of the cost of zealotry. “Most men indeed as well as most sects in Religion, think themselves in possession of all truth, and that wherever others differ from them it is so far error.” But wasn’t humility the best course, in such circumstances? “Thus I consent, Sir, to this Constitution,” he closed, “because I expect no better, and because I am not sure, that it is not the best.”
  • Except for the Massachusetts Constitution, in 1780, and the second New Hampshire Constitution, in 1784, no constitution, no written system of government, had ever before been submitted to the people for their approval. “This is a new event in the history of mankind,” said the governor of Connecticut at his state’s ratification convention.
  • Nearly everything Washington did set a precedent. What would have happened if he had decided, before taking that oath of office, to emancipate his slaves? He’d grown disillusioned with slavery; his own slaves, and the greater number of slaves owned by his wife, were, to him, a moral burden, and he understood very well that for all the wealth generated by forced, unpaid labor, the institution of slavery was a moral burden to the nation. There is some evidence—slight though it is—that Washington drafted a statement announcing that he intended to emancipate his slaves before assuming the presidency. (Or maybe that statement, like Washington’s inaugural address, had been written by Hamilton, a member of New York’s Manumission Society.) This, too, Washington understood, would have established a precedent: every president after him would have had to emancipate his slaves. And yet he would not, could not, do it.65 Few of Washington’s decisions would have such lasting and terrible consequences as this one failure to act.
  • In the century and a half between the Connecticut charter and the 1787 meeting of the constitutional convention lies an entire revolution—not just a political revolution but also a religious revolution. So far from establishing a religion, the Constitution doesn’t even mention “God,” except in naming the date (“the year of our Lord . . .”). At a time when all but two states required religious tests for office, the Constitution prohibited them. At a time when all but three states still had an official religion, the Bill of Rights forbade the federal government from establishing one. Most Americans believed, with Madison, that religion can only thrive if it is no part of government, and that a free government can only thrive if it is no part of religion.
  • The replacement of debtors’ prison with bankruptcy protection would change the nature of the American economy, spurring investment, speculation, and the taking of risks.
  • as early as 1791, Madison had begun to revise his thinking. In an essay called “Public Opinion,” he considered a source of instability particular to a large republic: the people might be deceived. “The larger a country, the less easy for its real opinion to be ascertained,” he explained. That is, factions might not, in the end, consist of wise, knowledgeable, and reasonable men. They might consist of passionate, ignorant, and irrational men, who had been led to hold “counterfeit” opinions by persuasive men. (Madison was thinking of Hamilton and his ability to gain public support for his financial plan.)
  • The way out of this political maze was the newspaper. “A circulation of newspapers through the entire body of the people,” he explained, “is equivalent to a contraction of territorial limits.” Newspapers would make the country, effectively, smaller.90 It was an ingenious idea. It would be revisited by each passing generation of exasperated advocates of republicanism. The newspaper would hold the Republic together; the telegraph would hold the Republic together; the radio would hold the Republic together; the Internet would hold the Republic together. Each time, this assertion would be both right and terribly wrong.
  • Newspapers in the early republic weren’t incidentally or inadvertently partisan; they were entirely and enthusiastically partisan. They weren’t especially interested in establishing facts; they were interested in staging a battle of opinions. “Professions of impartiality I shall make none,” wrote a Federalist printer. “They are always useless, and are besides perfect nonsense.”92
  • Washington’s Farewell Address consists of a series of warnings about the danger of disunion. The North and the South, the East and the West, ought not to consider their interests separate or competing, Washington urged: “your union ought to be considered as a main prop of your liberty.” Parties, he warned, were the “worst enemy” of every government, agitating “the community with ill-founded jealousies and false alarms,” kindling “the animosity of one part against another,” and even fomenting “riot and insurrection.”
  • As to the size of the Republic, “Is there a doubt whether a common government can embrace so large a sphere? Let experience solve it.” The American experiment must go on. But it could only thrive if the citizens were supported by religion and morality, and if they were well educated. “Promote, then, as an object of primary importance, institutions for the general diffusion of knowledge,” he urged. “In proportion as the structure of a government gives force to public opinion, it is essential that public opinion should be enlightened.”95
  • “Passion” or variants of the word appear seven times in the Farewell; it is the source of every problem; reason is its only remedy. Passion is a river. There would be no changing its course.
  • Adams and Jefferson lived in an age of quantification. It began with the measurement of time. Time used to be a wheel that turned, and turned again; during the scientific revolution, time became a line. Time, the easiest quantity to measure, became the engine of every empirical inquiry: an axis, an arrow. This new use and understanding of time contributed to the idea of progress—if time is a line instead of a circle, things can get better and even better, instead of forever rising and falling in endless cycles, like the seasons. The idea of progress animated American independence and animated, too, the advance of capitalism.
  • The quantification of time led to the quantification of everything else: the counting of people, the measurement of their labor, and the calculation of profit as a function of time. Keeping time and accumulating wealth earned a certain equivalency. “Time is money,” Benjamin Franklin used to say.
  • The two-party system turned out to be essential to the strength of the Republic. A stable party system organizes dissent. It turns discontent into a public good. And it insures the peaceful transfer of power, in which the losing party willingly, and without hesitation, surrenders its power to the winning party.
  • Behind Madison’s remarks about “lessening the proportion of slaves to the free people,” behind Jefferson’s tortured calculations about how many generations would have to pass before his own children could pass for “white,” lay this hard truth: none of these men could imagine living with descendants of Africans as political equals.
  • If the battle between John Adams and Thomas Jefferson had determined whether aristocracy or republicanism would prevail (and, with Jefferson, republicanism won), the battle between Andrew Jackson and John Quincy Adams would determine whether republicanism or democracy would prevail (and, with Jackson, democracy would, eventually, win). Jackson’s rise to power marked the birth of American populism. The argument of populism is that the best government is that most closely directed by a popular majority.
  • He was provincial, and poorly educated. (Later, when Harvard gave Jackson an honorary doctorate, John Quincy Adams refused to attend the ceremony, calling him “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”)68 He had a well-earned reputation for being ferocious, ill-humored, and murderous, on the battlefield and off. When he ran for president, he had served less than a year in the Senate. Of his bid for the White House Jefferson declared, “He is one of the most unfit men I know of for such a place.”69 Jackson made a devilishly shrewd decision. He would make his lack of certain qualities—judiciousness, education, political experience—into strengths.
  • Eaton, who ran Jackson’s campaign, shrewdly revised his Life of Andrew Jackson, deleting or dismissing everything in Jackson’s past that looked bad and lavishing attention on anything that looked good and turning into strengths what earlier had been considered weaknesses: Eaton’s Jackson wasn’t uneducated; he was self-taught. He wasn’t ill-bred; he was “self-made.”
  • Watching the rise of American democracy, an aging political elite despaired, and feared that the Republic could not survive the rule of the people. Wrote John Randolph of Virginia, “The country is ruined past redemption.”
  • “The first principle of our system,” Jackson said, “is that the majority is to govern.” He bowed to the people. Then, all at once, the people nearly crushed him with their affection.
  • The democratization of American politics was hastened by revivalists like Stewart who believed in the salvation of the individual through good works and in the equality of all people in the eyes of God. Against that belief stood the stark and brutal realities of an industrializing age, the grinding of souls.
  • The great debates of the middle decades of the nineteenth century had to do with the soul and the machine. One debate merged religion and politics. What were the political consequences of the idea of the equality of souls? Could the soul of America be redeemed from the nation’s original sin, the Constitution’s sanctioning of slavery?
  • Another debate merged politics and technology. Could the nation’s new democratic traditions survive in the age of the factory, the railroad, and the telegraph? If all events in time can be explained by earlier events in time, if history is a line, and not a circle, then the course of events—change over time—is governed by a set of laws, like the laws of physics, and driven by a force, like gravity. What is that force? Is change driven by God, by people, or by machines? Is progress the progress of Pilgrim’s Progress, John Bunyan’s 1678 allegory—the journey of a Christian from sin to salvation? Is progress the extension of suffrage, the spread of democracy? Or is progress invention, the invention of new machines?
  • A distinctively American idea of progress involved geography as destiny, picturing improvement as change not only over time but also over space.
  • If the sincerity of converts was often dubious, another kind of faith was taking deeper root in the 1820s, an evangelical faith in technological progress, an unquestioning conviction that each new machine was making the world better. That faith had a special place in the United States, as if machines had a distinctive destiny on the American continent. In prints and paintings, “Progress” appeared as a steam-powered locomotive, chugging across the continent, unstoppable. Writers celebrated inventors as “Men of Progress” and “Conquerors of Nature” and lauded their machines as far worthier than poetry. The triumph of the sciences over the arts meant the defeat of the ancients by the moderns. The genius of Eli Whitney, hero of modernity, was said to rival that of Shakespeare; the head of the U.S. Patent Office declared the steamboat “a mightier epic” than the Iliad.18
  • To Jackson’s supporters, his election marked not degeneration but a new stage in the history of progress. Nowhere was this argument made more forcefully, or more influentially, than in George Bancroft’s History of the United States from the Discovery of the American Continent to the Present. The book itself, reviewers noted, voted for Jackson. The spread of evangelical Christianity, the invention of new machines, and the rise of American democracy convinced Bancroft that “humanism is steady advancing,” and that “the advance of liberty and justice is certain.” That advance, men like Bancroft and Jackson believed, required Americans to march across the continent, to carry these improvements from east to west, the way Jefferson had pictured it. Democracy, John O’Sullivan, a New York lawyer and Democratic editor, argued in 1839, is nothing more or less than “Christianity in its earthly aspect.” O’Sullivan would later coin the term “manifest destiny” to describe this set of beliefs, the idea that the people of the United States were fated “to over spread and to possess the whole of the continent which Providence has given for the development of the great experiment of liberty.”23
  • To evangelical Democrats, Democracy, Christianity, and technology were levers of the same machine. And yet, all along, there were critics and dissenters and objectors who saw, in the soul of the people, in the march of progress, in the unending chain of machines, in the seeming forward movement of history, little but violence and backwardness and a great crushing of men, women, and children. “Oh, America, America,” Maria Stewart cried, “foul and indelible is thy stain!”24
  • The self-evident, secular truths of the Declaration of Independence became, to evangelical Americans, the truths of revealed religion. To say that this marked a turn away from the spirit of the nation’s founding is to wildly understate the case. The United States was founded during the most secular era in American history, either before or since. In the late eighteenth century, church membership was low, and anticlerical feeling was high.
  • The United States was not founded as a Christian nation. The Constitution prohibits religious tests for officeholders. The Bill of Rights forbids the federal government from establishing a religion, James Madison having argued that to establish
  • The separation of church and state allowed religion to thrive; that was one of its intentions. Lacking an established state religion, Americans founded new sects, from Shakers to Mormons, and rival Protestant denominations sprung up in town after town. Increasingly, the only unifying, national religion was a civil religion, a belief in the American creed. This faith bound the nation together, and provided extraordinary political stability in an era of astonishing change,
  • Slavery wasn’t an aberration in an industrializing economy; slavery was its engine. Factories had mechanical slaves; plantations had human slaves. The power of machines was measured by horsepower, the power of slaves by hand power. A healthy man counted as “two hands,” a nursing woman as a “half-hand,” a child as a “quarter-hand.”
  • With Walker, the antislavery argument for gradual emancipation, with compensation for slave owners, became untenable. Abolitionists began arguing for immediate emancipation. And southern antislavery societies shut their doors. As late as 1827, the number of antislavery groups in the South had outnumbered those in the North by more than four to one. Southern antislavery activists were usually supporters of colonization, not of emancipation. Walker’s Appeal ended the antislavery movement in the South and radicalized it in the North.
  • The rebellion rippled across the Union. The Virginia legislature debated the possibility of emancipating its slaves, fearing “a Nat Turner might be in every family.” Quakers submitted a petition to the state legislature calling for abolition. The petition was referred to a committee, headed by Thomas Jefferson’s thirty-nine-year-old grandson, Thomas Jefferson Randolph, who proposed a scheme of gradual emancipation. Instead, the legislature passed new laws banning the teaching of slaves to read and write, and prohibiting, too, teaching slaves about the Bible.43 In a nation founded on a written Declaration, made sacred by evangelicals during a religious revival, reading about equality became a crime.
  • One consequence of the rise of Jacksonian democracy and the Second Great Awakening was the participation of women in the reformation of American politics by way of American morals. When suffrage was stripped of all property qualifications, women’s lack of political power became starkly obvious. For women who wished to exercise power, the only power seemingly left to them was their role as mothers, which, they suggested, rendered them morally superior to men—more loving, more caring, and more responsive to the cries of the weak.
  • Purporting to act less as citizens than as mothers, cultivating the notion of “republican motherhood,” women formed temperance societies, charitable aid societies, peace societies, vegetarian societies, and abolition societies. The first Female Anti-Slavery Society was founded in Boston in 1833; by 1837, 139 Female Anti-Slavery Societies had been founded across the country.
  • After 1835, she never again spoke in public. As Catherine Beecher argued in 1837, in An Essay on Slavery and Abolitionism, with Reference to the Duty of American Females, “If the female advocate chooses to come upon a stage, and expose her person, dress, and elocution to public criticism, it is right to express disgust.”
  • Jacksonian democracy distributed political power to the many, but industrialization consolidated economic power in the hands of a few. In Boston, the top 1 percent of the population controlled 10 percent of wealth in 1689, 16 percent in 1771, 33 percent in 1833, and 37 percent in 1848, while the lowest 80 percent of the population controlled 39 percent of the wealth in 1689, 29 percent in 1771, 14 percent in 1833, and a mere 4 percent in 1848.
  • In New York, the top 1 percent of the population controlled 40 percent of the wealth in 1828 and 50 percent in 1845; the top 4 percent of the population controlled 63 percent of the wealth in 1828 and 80 percent in 1845.49
  • While two and a half million Europeans had migrated to all of the Americas between 1500 and 1800, the same number—two and a half million—arrived specifically in the United States between 1845 and 1854 alone. As a proportion of the U.S. population, European immigrants grew from 1.6 percent in the 1820s to 11.2 percent in 1860. Writing in 1837, one Michigan reformer called the nation’s rate of immigration “the boldest experiment upon the stability of government ever made in the annals of time.”51
  • Critics of Jackson—himself the son of Irish immigrants—had blamed his election on the rising population of poor, newly enfranchised Irishmen. “Everything in the shape of an Irishman was drummed to the polls,” one newspaper editor wrote in 1828.52 By 1860, more than one in eight Americans were born in Europe, including 1.6 million Irish and 1.2 million Germans, the majority of whom were Catholic. As the flood of immigrants swelled, the force of nativism gained strength, as did hostility toward Catholics, fueled by the animus of evangelical Protestants.
  • The insularity of both Irish and German communities contributed to a growing movement to establish tax-supported public elementary schools, known as “common schools,” meant to provide a common academic and civic education to all classes of Americans. Like the extension of suffrage to all white men, this element of the American experiment propelled the United States ahead of European nations. Much of the movement’s strength came from the fervor of revivalists. They hoped that these new schools would assimilate a diverse population of native-born and foreign-born citizens by introducing them to the traditions of American culture and government, so that boys, once men, would vote wisely, and girls, once women, would raise virtuous children. “It is our duty to make men moral,” read one popular teachers’ manual, published in 1830. Other advocates hoped that a shared education would diminish partisanship. Whatever the motives of its advocates, the common school movement emerged out of, and nurtured, a strong civic culture.56
  • With free schools, literacy spread, and the number of newspapers rose, a change that was tied to the rise of a new party system. Parties come and go, but a party system—a stable pair of parties—has characterized American politics since the ratification debates. In American history the change from one party system to another has nearly always been associated with a revolution in communications that allows the people to shake loose of the control of parties. In the 1790s, during the rise of the first party system, which pitted Federalists against Republicans, the number of newspapers had swelled. During the shift to the second party system, which, beginning in 1833, pitted Democrats against the newly founded Whig Party, not only did the number of newspapers rise, but their prices plummeted.
  • The newspapers of the first party system, which were also known as “commercial advertisers,” had consisted chiefly of partisan commentary and ads, and generally sold for six cents an issue. The new papers cost only one cent, and were far more widely read. The rise of the so-called penny press also marked the beginning of the triumph of “facts” over “opinion” in American journalism, mainly because the penny press aimed at a different, broader, and less exclusively partisan, audience. The New York Sun appeared in 1833. “It shines for all” was its common-man motto. “The object of this paper is to lay before the public, at a price within the means of everyone, ALL THE NEWS OF THE DAY,” it boasted. It dispensed with subscriptions and instead was circulated at newsstands, where it was sold for cash, to anyone who had a ready penny. Its front page was filled not with advertising but with news. The penny press was a “free press,” as James Gordon Bennett of the New York Herald put it, because it wasn’t beholden to parties. (Bennett, born in Scotland, had immigrated to the United States after reading Benjamin Franklin’s Autobiography.) Since the paper was sold at newsstands, rather than mailed to subscribers, he explained, its editors and writers were “entirely ignorant who are its readers and who are not.” They couldn’t favor their readers’ politics because they didn’t know them. “We shall support no party,” Bennett insisted. “We shall endeavor to record facts.”
  • During the days of the penny press, Tocqueville observed that Americans had a decided preference for weighing the facts of a matter themselves: “They mistrust systems; they adhere closely to facts and study facts with their own senses. As they do not easily defer to the mere name of any fellow man, they are never inclined to rest upon any man’s authority; but, on the contrary, they are unremitting in their efforts to find out the weaker points of their neighbor’s doctrine.”60
  • For centuries, Europeans had based their claims to lands in the New World on arguments that native peoples had no right to the land they inhabited, no sovereignty over it, because they had no religion, or because they had no government, or because they had no system of writing. The Cherokees, with deliberation and purpose, challenged each of these arguments.
  • Calhoun argued that if a state were to decide that a law passed by Congress was unconstitutional, the Constitution would have to be amended, and if such an amendment were not ratified—if it didn’t earn the necessary approval of three-quarters of the states—the objecting state would have the right to secede from the Union. The states had been sovereign before the Constitution was ever written, or even thought of, Calhoun argued, and they remained sovereign. Calhoun also therefore argued against majority rule; nullification is fundamentally anti-majoritarian. If states can secede, the majority does not rule.78
  • New York abolished debtors’ prison in 1831, and in 1841, Congress passed a federal law offering bankruptcy protection to everyone. Within two years, 41,000 Americans had filed for bankruptcy. Two years later, the law was repealed, but state laws continued to offer bankruptcy protection and, still more significantly, debtors’ prisons were gone for good. In Britain and all of Europe except Portugal, debtors were still being thrown into prison (a plot that animated many a nineteenth-century novel); in the United States, debtors could declare bankruptcy and begin again.
  • A nation of debtors, Americans came to see that most people who fall into debt are victims of the business cycle and not of fate or divine retribution or the wheel of fortune. The nation’s bankruptcy laws, even as they came and went again, made taking risks less risky for everyone, which meant that everyone took more risks.
  • The geographical vastness of the United States meant that the anxiety about the machinery of industrial capitalism took the form not of Marxism, with its argument that “the history of all hitherto existing society is the history of class struggles,” but instead of a romance with nature, and with the land, and with all things rustic. Against the factory, Americans posed not a socialist utopia but the log cabin.
  • Were all these vast designs and rapid strides worth it? Thoreau thought not. He came to this truth: “They are but improved means to an unimproved end.”112
  • Expansion, even more than abolition, pressed upon the public the question of the constitutionality of slavery. How or even whether this crisis would be resolved was difficult to see not only because of the nature of the dispute but also because there existed very little agreement about who might resolve it: Who was to decide whether a federal law was unconstitutional?
  • In the midst of all this clamoring among the thundering white-haired patriarchs of American politics, there emerged the idea that the authority to interpret the Constitution rests with the people themselves. Or, at least, this became a rather fashionable thing to say. “It is, Sir, the people’s Constitution, the people’s government, made for the people, made by the people, and answerable to the people,” Daniel Webster roared from the floor of Congress.14 Every man could read and understand the Constitution, Webster insisted.
  • The Notes, it appeared, could be read as variously as the Constitution itself. As one shrewd observer remarked, “The Constitution threatens to be a subject of infinite sects, like the Bible.” And, as with many sects, those politicians who most strenuously staked their arguments on the Constitution often appeared the least acquainted with it. Remarked New York governor Silas Wright, “No one familiar with the affairs of our government, can have failed to notice how large a proportion of our statesmen appear never to have read the Constitution of the United States with a careful reference to its precise language and exact provisions, but rather, as occasion presents, seem to exercise their ingenuity . . . to stretch both to the line of what they, at the moment, consider expedient.”22
  • A NATION HAS borders but the edges of an empire are frayed.23 While abolitionists damned the annexation of Texas as an extension of the slave power, other critics called it an act of imperialism, inconsistent with a republican form of government. “We have a republic, gentlemen, of vast extent and unequalled natural advantages,” Daniel Webster pointed out. “Instead of aiming to enlarge its boundaries, let us seek, rather, to strengthen its union.”24 Webster lost that argument, and, in the end, it was the American reach for empire that, by sundering the Union, brought about the collapse of slavery.
  • Although hardly ever reported in the press, the years between 1830 and 1860 saw more than one hundred incidents of violence between congressmen, from melees in the aisles to mass brawls on the floor, from fistfights and duels to street fights. “It is the game of these men, and of their profligate organs,” Dickens wrote, “to make the strife of politics so fierce and brutal, and so destructive of all self-respect in worthy men, that sensitive and delicate-minded persons shall be kept aloof, and they, and such as they, be left to battle out their selfish views unchecked.”
  • They spat venom. They pulled guns. They unsheathed knives. Divisions of party were abandoned; the split in Congress was sectional. Before heading to the Capitol every morning, southern congressmen strapped bowie knives to their belts and tucked pistols into their pockets. Northerners, on principle, came unarmed. When northerners talked about the slave power, they meant that literally.32
  • If the United States were to acquire territory from Mexico, and if this territory were to enter the Union, would Mexicans become American citizens? Calhoun, now in the Senate, vehemently opposed this idea. “I protest against the incorporation of such a people,” he declared. “Ours is the government of the white man.”
  • And yet, as different as were Wilmot’s interests from Calhoun’s, they were both interested in the rights of white men, as Wilmot made plain. “I plead the cause of the rights of white freemen,” he said. “I would preserve for free white labor a fair country, a rich inheritance, where the sons of toil, of my own race and own color, can live without the disgrace which association with negro slavery brings upon free labor.”
  • If the problem was the size of the Republic, the sprawl of its borders, the frayed edges of empire, couldn’t railroads, and especially the telegraph, tie the Republic together? “Doubt has been entertained by many patriotic minds how far the rapid, full, and thorough intercommunication of thought and intelligence, so necessary to the people living under a common representative republic, could be expected to take place throughout such immense bounds,” said one House member in 1845, but “that doubt can no longer exist.”45
  • Even Americans with an unflinching faith in machine-driven progress understood that a pulse along a wire could not stop the slow but steady dissolution of the Union.
  • Under the Treaty of Guadalupe Hidalgo, the top half of Mexico became the bottom third of the United States. The gain to the United States was as great as the loss to Mexico. In 1820, the United States of America had spanned 1.8 million square miles, with a population of 9.6 million people; Mexico had spanned 1.7 million square miles, with a population of 6.5 million people. By 1850, the United States had acquired one million square miles of Mexico, and its population had grown to 23.2 million; Mexico’s population was 7.5 million.49
  • The Louisiana Purchase had doubled the size of the United States. In gaining territory from Mexico, the United States grew by 64 percent.
  • The territory comprising the United States had grown to “nearly ten times as large as the whole of France and Great Britain combined; three times as large as the whole of France, Britain, Austria, Prussia, Spain, Portugal, Belgium, Holland, and Denmark, together; one-and-a-half times as large as the Russian empire in Europe; one-sixth less only than the area covered by the fifty-nine or sixty empires, states, and Republics of Europe; of equal extent with the Roman Empire or that of Alexander, neither of which is said to have exceeded 3,000,000 square miles.”50
  • Sentiment was not Fuller’s way; debate was her way. She was a scourge of lesser intellects. Edgar Allan Poe, whose work she did not admire, described her as wearing a perpetual sneer. In “The Great Lawsuit: Man versus Men, Woman versus Women,” Fuller argued that the democratization of American politics had cast light on the tyranny of men over women: “As men become aware that all men have not had their fair chance,” she observed, women had become willing to say “that no women have had a fair chance.”
  • In 1845, in Woman in the Nineteenth Century, Fuller argued for fundamental and complete equality: “We would have every path laid open to Woman as freely as to Man.”56 The book was wildly successful, and Greeley, who had taken to greeting Fuller with one of her catchphrases about women’s capacity—“Let them be sea-captains, if you will”—sent her to Europe to become his newspaper’s foreign correspondent.
  • Reeling from those revolutions, the king of Bavaria asked the historian Leopold von Ranke to explain why his people had rebelled against monarchical rule, as had so many peoples in Europe that year. “Ideas spread most rapidly when they have found adequate concrete expression,” Ranke told the king, and the United States had “introduced a new force in the world,” the idea that “the nation should govern itself,” an idea that would determine “the course of the modern world”: free speech, spread by wire, would make the whole world free.61
  • Unlike Thoreau, who cursed the railroads, Free-Soilers believed in improvement, improvement through the hard work of the laboring man, his power, his energy. “Our paupers to-day, thanks to free labor, are our yeomen and merchants of tomorrow,” the New York Times boasted. “Why, who are the laboring people of the North?” Daniel Webster asked. “They are the whole North. They are the people who till their own farms with their own hands, freeholders, educated men, independent men.”
  • This attack by northerners led southerners to greater exertions in defending their way of life. They battled on several fronts. They described northern “wage slavery” as a far more exploitative system of labor than slavery. They celebrated slavery as fundamental to American prosperity. Slavery “has grown with our growth, and strengthened with our strength,” Calhoun said. And they elaborated an increasingly virulent ideology of racial difference, arguing against the very idea of equality embodied in the American creed.
  • Conservative Virginian George Fitzhugh, himself inspired by ethnological thinking, dismissed the “self-evident truths” of the Declaration of Independence as utter nonsense. “Men are not born physically, morally, or intellectually equal,” he wrote. “It would be far nearer the truth to say, ‘that some were born with saddles on their backs, and others booted and spurred to ride them,’—and the riding does them good.”
  • For Fitzhugh, the error had begun in the imaginations of the philosophes of the Enlightenment and in their denial of the reality of history. Life and liberty are not “inalienable rights,” Fitzhugh argued: instead, people “have been sold in all countries, and in all ages, and must be sold so long as human nature lasts.” Equality means calamity: “Subordination, difference of caste and classes, difference of sex, age, and slavery beget peace and good will.”
  • Progress is an illusion: “the world has not improved in the last two thousand, probably four thousand years.” Perfection is to be found in the past, not in the future.66 As for the economic systems of the North and the South, “Free laborers have not a thousandth part of the rights and liberties of negro slaves,” Fitzhugh insisted. “The negro slaves of the South are the happiest, and, in some sense, the freest people in the world.”67
  • HISTORY TEEMS WITH mishaps and might-have-beens: explosions on the Potomac, storms not far from port, narrowly contested elections, court cases lost and won, political visionaries drowned. But over the United States in the 1850s, a sense of inevitability fell, as if there were a fate, a dismal dismantlement, that no series of events or accidents could thwart.
  • Douglas promoted the idea of popular sovereignty, proclaiming, “If there is any one principle dearer and more sacred than all others in free governments, it is that which asserts the exclusive right of a free people to form and adopt their own fundamental law.”75 Unfree people, within Stephen Douglas’s understanding, had no such rights.
  • The Fugitive Slave Law required citizens to turn in runaway slaves and denied fugitives the right to a jury trial. The law, said Harriet Jacobs, a fugitive slave living in New York, marked “the beginning of a reign of terror to the colored population.”76 Bounty hunters and slave catchers hunted down and captured former slaves and returned them to their owners for a fee. Little stopped them from seizing men, women, and children who had been born free, or who had been legally emancipated, and selling them to the South, too. Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.
  • In February 1854, at their convention in Philadelphia, northern Know-Nothings proposed a platform plank calling for the reinstatement of the Missouri Compromise. When that motion was rejected, some fifty delegates from eight northern states bolted: they left the convention, and the party, to set up their own party, the short-lived North American Party. Nativism would endure as a force in American politics, but, meanwhile, nativists split over slavery.
  • Lincoln’s was the language of free soil, free speech, and free labor. He grounded his argument against slavery in his understanding of American history, in the language of Frederick Douglass, and in his reading of the Constitution. “Let no one be deceived,” he said. “The spirit of seventy-six and the spirit of Nebraska, are utter antagonisms.”
  • As a nation, we began by declaring that “all men are created equal.” We now practically read it “all men are created equal, except negroes.” When the Know-Nothings get control, it will read “all men are created equal, except negroes, and foreigners, and Catholics.” When it comes to this I should prefer emigrating to some country where they make no pretense of loving liberty—to Russia, for instance, where despotism can be taken pure, and without the base alloy of hypocrisy.
  • “That negroes, whether slave or free, that is, men of the African race, are not citizens of the United States by the Constitution.” The implications of the ruling stunned his readers. Even Americans who held no strong views on the question of slavery—and they were rare enough—were nonetheless shocked by the court’s exercise of the authority to determine the unconstitutionality of the law.
  • “A large meeting of colored people” was held in Philadelphia in April, at which it was resolved that “the only duty the colored man owes to a Constitution under which he is declared to be an inferior and degraded being, having no rights which white men are bound to respect, is to denounce and repudiate it, and to do what he can by all proper means to bring it into contempt.”
  • “You may close your Supreme Court against the black man’s cry for justice, but you cannot, thank God, close against him the ear of a sympathising world, nor shut up the Court of Heaven.” Taney’s interpretation of the Constitution would be ignored, Douglass predicted. “Slavery lives in this country not because of any paper Constitution, but in the moral blindness of the American people.”102
  • A PHOTOGRAPH STOPS TIME, TRAPPING IT LIKE A BUTTERFLY in a jar.
  • No other kind of historical evidence has this quality of instantaneity, of an impression taken in a moment, in a flicker, an eye opened and then shut. Photographs also capture the ordinary, the humble, the speechless. The camera discriminates between light and dark but not between the rich and the poor, the literate and the illiterate, the noisy and the quiet.
  • Portraits were also closely associated with death, with being trapped in time, on glass, for eternity, and, even more poignantly, with equality.3 With photography, Walt Whitman predicted, “Art will be democratized.”
  • Morse had long predicted that the telegraph would usher in an age of world peace. “I trust that one of its effects will be to bind man to his fellow-man in such bonds of amity as to put an end to war,” he insisted.8 War was a failure of technology, Morse argued, a shortcoming of communication that could be remedied by way of a machine. Endowing his work with the grandest of purposes, he believed that the laying of telegraph wires across the American continent would bind the nation together into one people, and that the laying of cable across the ocean would bind Europe to the Americas, ushering in the dawn of an age of global harmony.
  • But war isn’t a failure of technology; it’s a failure of politics.
  • Debate is to war what trial by jury is to trial by combat: a way to settle a dispute without coming to blows. The form and its rules had been established over centuries. They derived from rules used in the courts and in Parliament, and even from the rules of rhetoric used in the writing of poetry. Since the Middle Ages and the founding of the first universities, debate had been the foundation of a liberal arts education.
  • (Etymologically and historically, the artes liberales are the arts acquired by people who are free, or liber.)10 In the eighteenth century, debate was understood as the foundation of civil society. In 1787, delegates to the constitutional convention had agreed “to argue without asperity, and to endeavor to convince the judgment without hurting the feelings of each other.”
  • Some twelve thousand people showed up for their first debate, at two o’clock in the afternoon on August 21, in Ottawa, Illinois. There were no seats; the audience stood, without relief, for three hours.
  • They’d agreed to strict rules: the first speaker would speak for an hour and the second for an hour and a half, whereupon the first speaker would offer a thirty-minute rebuttal.
  • And, as to the wrongness of slavery, he called it tyranny, and the idea of its naturalness as much an error as a belief in the divine right of kings. The question wasn’t sectionalism or nationalism, the Democratic Party or the Republican Party. The question was right against wrong. “That is the issue that will continue in this country when these poor tongues of Judge Douglas and myself shall be silent,” Lincoln said.16
  • The price of slaves grew so high that a sizable number of white southerners urged the reopening of the African slave trade. In the 1850s, legislatures in several states, including South Carolina, proposed reopening the trade. Adopting this measure would have violated federal law. Some “reopeners” believed that the federal ban on the trade was unconstitutional; others were keen to nullify it, in a dress rehearsal for secession.
  • “If it is right to buy slaves in Virginia and carry them to New Orleans, why is it not right to buy them in Cuba, Brazil, or Africa and carry them there?”21 Proslavery southerners made these arguments under the banner of “free trade,” their rhetorical answer to “free labor.”
  • To George Fitzhugh, all societies were “at all times and places, regulated by laws as universal and as similar as those which control the affairs of bees,” and trade itself, including the slave trade, was “as old, as natural, and irresistible as the tides of the ocean.”
  • In 1855, David Christy, the author of Cotton Is King, wrote about the vital importance of “the doctrine of Free Trade,” which included abolishing the tariffs that made imported English goods more expensive than manufactured goods produced in the North. As one southerner put it, “Free trade, unshackled industry, is the motto of the South.”23
  • Darwin’s Origin of Species would have a vast and lingering influence on the world of ideas. Most immediately, it refuted the racial arguments of ethnologists like Louis Agassiz. And, in the months immediately following the book’s publication—the last, unsettling months before the beginning of the Civil War—abolitionists took it as evidence of the common humanity of man.30
  • The truths of the Confederacy disavowed the truths of the Union. The Confederacy’s newly elected vice president, a frail Georgian named Alexander Stephens, delivered a speech in Savannah in which he made those differences starkly clear. The ideas that lie behind the Constitution “rested upon the assumption of the equality of races,” Stephens said, but “our new government is founded upon exactly the opposite idea: its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery . . . is his natural and moral condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”52 It would become politically expedient, after the war, for ex-Confederates to insist that the Confederacy was founded on states’ rights. But the Confederacy was founded on white supremacy.
  • Opposition to free speech had long been the position of slave owners, a position taken at the constitutional convention and extended through the gag rule, antiliteracy laws, bans on the mails, and the suppression of speakers. An aversion to political debate also structured the Confederacy, which had both a distinctive character and a lasting influence on Americans’ ideas about federal authority as against popular sovereignty.
  • Secessionists were attempting to build a modern, proslavery, antidemocratic state. In order to wage a war, the leaders of this fundamentally antidemocratic state needed popular support. Such support was difficult to gain and impossible to maintain. The Confederacy therefore suppressed dissent.55
  • By May of 1861, the Confederacy comprised fifteen states stretching over 900,000 square miles and containing 12 million people, including 4 million slaves, and 4 million white women who were disenfranchised. It rested on the foundational belief that a minority governs a majority. “The condition of slavery is with us nothing but a form of civil government for a class of people not fit to govern themselves,” said Jefferson Davis.
  • There would be those, after the war ended, who said that it had been fought over states’ rights or to preserve the Union or for a thousand other reasons and causes. Soldiers, North and South, knew better. “The fact that slavery is the sole undeniable cause of this infamous rebellion, that it is a war of, by, and for Slavery, is as plain as the noon-day sun,” a soldier writing for his Wisconsin regimental newspaper explained in 1862. “Any man who pretends to believe that this is not a war for the emancipation of the blacks,” a soldier writing for his Confederate brigade’s newspaper wrote that same year, “is either a fool or a liar.”
  • Lincoln would remain a man trapped in time, in the click of a shutter and by the trigger of a gun. In mourning him, in sepia and yellow, in black and white, beneath plates of glinting glass, Americans deferred a different grief, a vaster and more dire reckoning with centuries of suffering and loss, not captured by any camera, not settled by any amendment, the injuries wrought on the bodies of millions of men, women, and children, stolen, shackled, hunted, whipped, branded, raped, starved, and buried in unmarked graves.
  • No president consecrated their cemeteries or delivered their Gettysburg address; no committee of arrangements built monuments to their memory. With Lincoln’s death, it was as if millions of people had been crammed into his tomb, trapped in a vault that could not hold them.
  • People running for Congress didn’t have to meet property requirements; they didn’t have to have been born in the United States; and they couldn’t be subjected to religious tests. This same logic applied to citizenship, and for the same reason: the framers of the Constitution understood these sorts of requirements as forms of political oppression. The door to the United States was meant to be open.
  • Before the 1880s, no federal law restricted immigration. And, despite periods of fervent nativism, especially in the 1840s, the United States welcomed immigrants into citizenship, and valued them. After the Civil War, the U.S. Treasury estimated the worth of each immigrant as equal to an $800 contribution to the nation’s economy.
  • Nineteenth-century politicians and political theorists interpreted American citizenship within the context of an emerging set of ideas about human rights and the authority of the state, holding dear the conviction that a good government guarantees everyone eligible for citizenship the same set of political rights, equal and irrevocable.
  • The Civil War raised fundamental questions not only about the relationship between the states and the federal government but also about citizenship itself and about the very notion of a nation-state. What is a citizen? What powers can a state exert over its citizens? Is suffrage a right of citizenship, or a special right, available only to certain citizens? Are women citizens? And if women are citizens, why aren’t they voters? What about Chinese immigrants, pouring into the West? They were free. Were they, under American law, “free white persons” or “free persons of color” or some other sort of persons?
  • In 1866, Congress searched in vain for a well-documented definition of the word “citizen.” Over the next thirty years, that definition would become clear, and it would narrow.
  • In 1896, the U.S. passport office, in the Department of State, which had grown to thousands of clerks, began processing applications according to new “Rules Governing the Application of Passports,” which required evidence of identity, including a close physical description (Age, _____ years; stature, _____ feet _____ inches (English measure); forehead, _____; eyes, _____; nose, _____; mouth, _____; chin, _____; hair, _____; complexion, _____; face, _____) as well as affidavits, signatures, witnesses, an oath of loyalty, and, by way of an application fee, one dollar.12
    [Photograph caption: Lew Wa Ho worked at a dry goods shop in St. Louis; the photograph was included in his Immigration Service case file as evidence of employment.]
  • The Fourteenth Amendment, drafted by the Joint Committee on Reconstruction, marked the signal constitutional achievement of a century of debate and war, of suffering and struggle. It proposed a definition of citizenship guaranteeing its privileges and immunities, and ensuring equal protection and due process to all citizens. “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside,”
  • “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.”20
  • During the drafting of the amendment, the committee betrayed the national phalanx of women who for decades had fought for abolition and for black civil rights by proposing to insert, into the amendment’s second section, a provision that any state that denied the right to vote “to any of the male inhabitants of such state” would lose representation in Congress. “Male” had never before appeared in any part of the Constitution. “If that word ‘male’ be inserted,” Stanton warned, “it will take us a century at least to get it out.”21 She was not far wrong.
  • Women protested. “Can any one tell us why the great advocates of Human Equality . . . forget that when they were a weak party and needed all the womanly strength of the nation to help them on, they always united the words ‘without regard to sex, race, or color’?” asked Ohio-born reformer Frances Gage. Charles Sumner offered this answer: “We know how the Negro will vote, but are not so sure of the women.” How women would vote was impossible to know. Would black women vote the way black men voted? Would white women vote like black women? Republicans decided they’d rather not find out.
  • In the federal census of 1860, 24,282 out of 34,935 Chinese toiled in mines. Although some Chinese immigrants left mining—and some were forced out—many continued to mine well into the 1880s, often working in sites abandoned by other miners.
  • An 1867 government report noted that in Montana, “the diggings now fall into the hands of the Chinese, who patiently glean the fields abandoned by the whites.” Chinese workers began settling in Boise in 1865 and only five years later constituted a third of Idaho’s settlers and nearly 60 percent of its miners. In 1870, Chinese immigrants and their children made up nearly 9 percent of the population of California, and one-quarter of the state’s wage earners.
  • Their rights, under state constitutions and statutes, were markedly limited. Oregon’s 1857 constitution barred “Chinamen” from owning real estate, while California barred Chinese immigrants from testifying in court, a provision upheld in an 1854 state supreme court opinion, People v. Hall, which described the Chinese as “a race of people whom nature has marked as inferior, and who are incapable of progress or intellectual development beyond a certain point, as their history has shown.”29
  • And what about the voting rights of U.S.-born Chinese Americans? Much turned on the Fifteenth Amendment, proposed early in 1869. While the aim of the amendment was to guarantee African Americans the right to vote and hold office, its language inevitably raised the question of Chinese citizenship and suffrage. Opponents of the amendment found its entire premise scandalous. Garrett Davis, a Democratic senator from Kentucky, fumed, “I want no negro government; I want no Mongolian government; I want the government of the white man which our fathers incorporated.”33
  • Douglass spoke about what he called a “composite nation,” a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them: “I want a home here not only for the negro, the mulatto and the Latin races; but I want the Asiatic to find a home here in the United States, and feel at home here, both for his sake and for ours.”36
  • Tilden won the nomination anyway and, in the general election, he won the popular vote against Hayes. Unwilling to accept the result of the election, Republicans disputed the returns in Florida, Louisiana, and South Carolina.
  • Eventually, the decision was thrown to an electoral commission that brokered a nefarious compromise: Democrats agreed to throw their support behind the man ever after known as Rutherfraud B. Hayes, so that he could become president, in exchange for a promise from Republicans to end the military occupation of the South. For a minor and petty political win over the Democratic Party, Republicans first committed electoral fraud and then, in brokering a compromise, abandoned a century-long fight for civil rights.
  • As soon as federal troops withdrew, white Democrats, calling themselves the “Redeemers,” took control of state governments of the South, and the era of black men’s enfranchisement came to a violent and terrible end. The Klan terrorized the countryside, burning homes and hunting, torturing, and killing people. (Between 1882 and 1930, murderers lynched more than three thousand black men and women.)
  • Black politicians elected to office were thrown out. And all-white legislatures began passing a new set of black codes, known as Jim Crow laws, that segregated blacks from whites in every conceivable public place, down to the last street corner. Tennessee passed the first Jim Crow law, in 1881, mandating the separation of blacks and whites in railroad cars. Georgia became the first state to demand separate seating for whites and blacks in streetcars, in 1891.
  • “Capital buys and sells to-day the very heart-beats of humanity,” she said. Democracy itself had been corrupted by it: “the speculators, the land-robbers, the pirates and gamblers of this Nation have knocked unceasingly at the doors of Congress, and Congress has in every case acceded to their demands.”44 The capitalists, she said, had subverted the will of the people.
  • In the late nineteenth century, a curious reversal took place. Electoral politics, the politics men engaged in, became domesticated, the office work of education and advertising—even voting moved indoors. Meanwhile, women’s political expression moved to the streets. And there, at marches, rallies, and parades, women deployed the tools of the nineteenth-century religious revival: the sermon, the appeal, the conversion.45
  • In 1862 alone, in addition to the Homestead Act, the Republican Congress passed the Pacific Railway Act (chartering railroad companies to build the line from Omaha, Nebraska, to Sacramento, California) and the National Bank Act (to issue paper money to pay for it all). After the war, political power moved from the states to the federal government and, as the political influence of the South waned, the importance of the West rose. Congress not only sent to the states amendments to the Constitution that defined citizenship and guaranteed voting rights but also passed landmark legislation involving the management of western land, the control of native populations, the growth and development of large corporations, and the construction of a national transportation infrastructure.
  • The independent farmer—the lingering ideal of the Jeffersonian yeoman—remained the watchword of the West, but in truth, family farming for subsistence, free of government interference, was far less common than a federally subsidized, capitalist model of farming and cattle raising for a national or even an international market. The small family farm—Jefferson’s republican dream—was in many parts of the arid West an environmental impossibility.
  • Much of the property distributed under the terms of the Homestead Act, primarily in the Great Basin, was semi-arid, the kind of land on which few farmers could manage a productive farm with only 160 acres. Instead, Congress typically granted the best land to railroads, and allowed other, bigger interests to step in, buying up large swaths for agricultural business or stock raising and fencing it in, especially after the patenting of barbed wire in 1874.46
  • In 1885, an American economist tried to reckon the extraordinary transformation wrought by what was now 200,000 miles of railroad, more than in all of Europe. It was possible to move one ton of freight one mile for less than seven-tenths of one cent, “a sum so small,” he wrote, “that outside of China it would be difficult to find a coin of equivalent value to give a boy as a reward for carrying an ounce package across a street.”48
  • Instability contributed to a broader set of political concerns that became Mary Lease’s obsession, concerns known as “the money question,” and traceable all the way back to Hamilton’s economic plan: Should the federal government control banking and industry?
  • No group of native-born Americans was more determined to end Chinese immigration than factory workers. The 1876 platform of the Workingmen’s Party of California declared that “to an American death is preferable to life on par with a Chinaman.”55 In 1882, spurred by the nativism of populists, Congress passed its first-ever immigration law, the Chinese Exclusion Act, which barred immigrants from China from entering the United States and, determining that the Fourteenth Amendment did not apply to people of Chinese ancestry, decreed that Chinese people already in the United States were permanent aliens who could never become citizens.
  • Populists, whether farmers or factory workers, for all their invocation of “the people,” tended to take a narrow view of citizenship. United in their opposition to the “money power,” members of the alliance, like members of the Knights of Labor, were also nearly united in their opposition to the political claims of Chinese immigrants, and of black people. The Farmers’ Alliance excluded African Americans, who formed their own association, the Colored Farmers’ Alliance. Nor did populists count Native Americans within the body of “the people.”
  • In 1887, Congress passed the Dawes Severalty Act, under whose terms the U.S. government offered native peoples a path to citizenship in a nation whose reach had extended across the lands of their ancestors. The Dawes Act granted to the federal government the authority to divide Indian lands into allotments and guaranteed U.S. citizenship to Indians who agreed to live on those allotments and renounce tribal membership.
  • In proposing the allotment plan, Massachusetts senator Henry Laurens Dawes argued that the time had come for Indians to choose between “extermination or civilization” and insisted that the law offered Americans the opportunity to “wipe out the disgrace of our past treatment” and instead lift Indians up “into citizenship and manhood.”58
  • But in truth the Dawes Act understood native peoples neither as citizens nor as “persons of color,” and led to nothing so much as forced assimilation and the continued takeover of native lands. In 1887 Indians held 138 million acres; by 1900, they held only half of that territory.
  • In 1877, railroad workers protesting wage cuts went on strike in cities across the country. President Hayes sent in federal troops to end the strikes, marking the first use of the power of the federal government to support business against labor. The strikes continued, with little success in improving working conditions. Between 1881 and 1894, there was, on average, one major railroad strike a week. Labor was, generally and literally, crushed: in a single year, of some 700,000 men working on the railroads, more than 20,000 were injured on the job and nearly 2,000 killed.59
  • In 1882, Roscoe Conkling represented the Southern Pacific Railroad Company’s challenge to a California tax rule. He told the U.S. Supreme Court, “I come now to say that the Southern Pacific Railroad Company and its creditors and stockholders are among the ‘persons’ protected by the Fourteenth Amendment.”
  • In offering an argument about the meaning and original intention of the word “person” in the Fourteenth Amendment, Conkling enjoyed a singular authority: he’d served on the Joint Committee on Reconstruction that had drafted the amendment and by 1882 was the lone member of that committee still living. With no one alive to contradict him, Conkling assured the court that the committee had specifically rejected the word “citizen” in favor of “person” in order to include corporations.
  • Much evidence suggests, however, that Conkling was lying. The record of the deliberations of the Joint Committee on Reconstruction does not support his argument regarding the committee’s original intentions, nor is it plausible that between 1866 and 1882, the framers of the Fourteenth Amendment had kept mysteriously hidden their secret intention to guarantee equal protection and due process to corporations.
  • In 1886, when another railroad case, Santa Clara County v. Southern Pacific Railroad, reached the Supreme Court, the court’s official recorder implied that the court had accepted the doctrine that “corporations are persons within the meaning of the Fourteenth Amendment.”62 After that, the Fourteenth Amendment, written and ratified to guarantee freed slaves equal protection and due process of law, became the chief means by which corporations freed themselves from government regulation.
  • In 1937, Supreme Court Justice Hugo Black would observe, with grim dismay, that, over the course of fifty years, “only one half of one percent of the Fourteenth Amendment cases that came before the court had anything to do with African Americans or former slaves, while over half of the cases were about protecting the rights of corporations.”63 Rights guaranteed to the people were proffered, instead, to corporations.
  • He devised an economic plan that involved abolishing taxes on labor and instead imposing a single tax on land. Tocqueville had argued that democracy in America is made possible by economic equality; people with equal estates will eventually fight for, and win, equal political rights. George agreed. But, like Mary Lease, he thought that financial capitalism was destroying democracy by making economic equality impossible. He saw himself as defending “the Republicanism of Jefferson and the Democracy of Jackson.”72
  • Between 1889 and 1893, the mortgages on so many farms were foreclosed that 90 percent of farmland fell into the hands of bankers. The richest 1 percent of Americans owned 51 percent of the nation’s wealth, and the poorest 44 percent owned less than 2 percent.
  • For all its passionate embrace of political equality and human rights and its energetic championing of suffrage, the People’s Party rested on a deep and abiding commitment to exclude from full citizenship anyone from or descended from anyone from Africa or Asia.
  • Many of the reforms proposed by populists had the effect of diminishing the political power of blacks and immigrants. Chief among them was the Australian ballot, more usually known as the secret ballot, which, by serving as a de facto literacy test, disenfranchised both black men in the rural South and new immigrants in northern cities.
  • State legislatures deliberated at length over the secret ballot. Quickest to adopt the reform were the states of the former Confederacy, where it appealed to lawmakers eager to find legal ways to keep black men from voting. In 1890, Mississippi held a constitutional convention for the express purpose of restricting black suffrage.
  • Both by law and by brute force, southern legislators, state by state, and poll workers, precinct by precinct, denied black men the right to vote. In Louisiana, black voter registration dropped from 130,000 in 1898 to 5,300 in 1908, and to 730 in 1910. In 1893, Arkansas Democrats celebrated their electoral advantage by singing:
        The Australian ballot works like a charm
        It makes them think and scratch
        And when a Negro gets a ballot
        He has certainly met his match.82
  • One Republican said, “I felt that Bryan was the first politician I had ever heard speak the truth and nothing but the truth,” even though in every case, when he read a transcript of the speech in the newspaper the next day, he “disagreed with almost all of it.”85
  • In 1894, Bryan tacked an income tax amendment to a tariff bill, which managed to pass. But the populist victory—a 2 percent federal income tax that applied only to Americans who earned more than $4,000—didn’t last long. The next year, in Pollock v. Farmers’ Loan and Trust Company, the Supreme Court ruled 5–4 that the tax was a direct tax, and therefore unconstitutional, one justice calling the tax the first campaign in “a war of the poor against the rich.”
  • POPULISM ENTERED AMERICAN politics at the end of the nineteenth century, and it never left. It pitted “the people,” meaning everyone but the rich, against corporations, which fought back in the courts by defining themselves as “persons”; and it pitted “the people,” meaning white people, against nonwhite people who were fighting for citizenship and whose ability to fight back in the courts was far more limited, since those fights require well-paid lawyers.
  • After 1859, and the Origin of Species, the rise of Darwinism contributed to the secularization of the university, as did the influence of the German educational model, in which universities were divided into disciplines and departments, each with a claim to secular, and especially scientific, expertise. These social sciences—political science, economics, sociology, and anthropology—used the methods of science, and especially of quantification, to study history, government, the economy, society, and culture.96
  • For Wilson’s generation of political scientists, the study of the state replaced the study of the people. The erection of the state became, in their view, the greatest achievement of civilization. The state also provided a bulwark against populism. In the first decades of the twentieth century, populism would yield to progressivism as urban reformers applied the new social sciences to the study of political problems, to be remedied by the intervention of the state.
  • The rise of populism and the social sciences reshaped the press, too. In the 1790s, the weekly partisan newspaper produced the two-party system. The penny press of the 1830s produced the popular politics of Jacksonian democracy. And in the 1880s and 1890s the spirit of populism and the empiricism of the social sciences drove American newspapers to a newfound obsession with facts.
  • The newspapers of the 1880s and 1890s were full of stunts and scandals and crusades, even as they defended their accuracy. “Facts, facts piled up to the point of dry certitude was what the American people really wanted,” wrote the reporter Ray Stannard Baker. Julius Chambers said that writing for the New York Herald involved “Facts; facts; nothing but facts. So many peas at so much a peck; so much molasses at so much a quart.”
  • Ballot reform, far from keeping money out of elections, had ushered more money into elections, along with a new political style: using piles of money to sell a candidate’s personality, borrowing from the methods of business by using mass advertising and education, slogans and billboards. McKinley ran a new-style campaign; Bryan ran an old-style campaign. Bryan barnstormed all over the country: he gave some six hundred speeches to five million people in twenty-seven states and traveled nearly twenty thousand miles.
  • But McKinley’s campaign coffers were fuller: Republicans spent $7 million; Democrats, $300,000. John D. Rockefeller alone provided the GOP with a quarter of a million dollars. McKinley’s campaign manager, Cleveland businessman Mark Hanna, was nearly buried in donations from fellow businessmen. He used that money to print 120 million pieces of campaign literature. He hired fourteen hundred speakers to stump for McKinley; dubbing the populists Popocrats, they agitated voters to a state of panic.108 As Mary Lease liked to say, money elected McKinley.
  • Turner, born in Wisconsin in 1861, was one of the first Americans to receive a doctorate in history. At the exposition, he delivered his remarks before the American Historical Association, an organization that had been founded in 1884 and incorporated by an act of Congress in 1889 “for the promotion of historical studies, the collection and preservation of historical manuscripts and for kindred purposes in the interest of American history and of history in America.”110
  • Like journalists, historians borrowed from the emerging social sciences, relying on quantitative analysis to understand how change happens. Where George Bancroft, in his History of the United States, had looked for explanations in the hand of providence, Frederick Jackson Turner looked to the census.
  • The difference between Turner’s methods and Bancroft’s signaled a profound shift in the organization of knowledge, one that would have lasting consequences for the relationship between the people and the state and for civil society itself. Like Darwinism, the rise of the social sciences involved the abdication of other ways of knowing, and, indirectly, contributed to the rise of fundamentalism.
  • Across newly defined academic disciplines, scholars abandoned the idea of mystery—the idea that there are things known only by God—in favor of the claim to objectivity, a development sometimes called “the disenchantment of the world.”111 When universities grew more secular, religious instruction became confined to divinity schools and theological seminaries.
  • A theologian at the University of Chicago’s divinity school defined modernism as “the use of scientific, historical, and social methods in understanding and applying evangelical Christianity to the needs of living persons.”112 Increasingly, this is exactly what evangelicals who eventually identified themselves as fundamentalists found objectionable.
  • Influenced by both Jefferson and Darwin, Turner saw the American frontier as the site of political evolution, beginning with the “savages” of a “wilderness,” proceeding to the arrival of European traders, and continuing through various forms of settlement, through the establishment of cities and factories, “the evolution of each into a higher stage,” and culminating in the final stage of civilization: capitalism and democracy.114
  • “American democracy is fundamentally the outcome of the experiences of the American people in dealing with the West,” by which he meant the experience of European immigrants to the United States in defeating its native peoples, taking possession of their homelands, and erecting there a civilization of their own. This, for Turner, was the story of America and the lesson of American history: evolution.116
  • Douglass, who, as the former U.S. ambassador to Haiti, had represented the nation of Haiti at the Haitian pavilion, was the only eminent African American with a role at the fair, whose program had been planned by a board of 208 commissioners, all white.117 There were, however, black people at the fair: on display. In the Hall of Agriculture, old men and women, former slaves, sold miniature bales of cotton, souvenirs, while, in a series of exhibits intended to display the Turnerian progress of humankind from savagery to civilization, black Americans were posed in a fake African village. “As if to shame the Negro,” Douglass wrote, they “exhibit the Negro as a repulsive savage.”118
  • “A ship at anchor, with halliards broken, sails mildewed, hull empty, her bottom covered with sea-weed and barnacles, meets no resistance,” Douglass said that day, turning the idea of a ship of state to the problem of Jim Crow. “But when she spread her canvas to the breeze and sets out on her voyage, turns prow to the open sea, the higher shall be her speed, the greater shall be her resistance. And so it is with the colored man.”
  • He paused to allow his listeners to conjure the scene, and its meaning, of a people struggling against the sea. “My dear young friends,” Douglass closed. “Accept the inspiration of hope. Imitate the example of the brave mariner, who, amid clouds and darkness, amid hail, rain and storm bolts, battles his way against all that the sea opposes to his progress and you will reach the goal of your noble ambition in safety.”124
  • The majority in Plessy v. Ferguson asserted that separation and equality were wholly separate ideas. “We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.” The resulting legal principle—that public accommodations could be “separate but equal”—would last for more than half a century.
  • The sole dissenter, John Marshall Harlan, objecting to the establishment of separate classes of citizens, insisted that the achievement of the United States had been the establishment, by amendment, of a Constitution that was blind to race. “Our constitution is color-blind, and neither knows nor tolerates classes among citizens,” Harlan wrote, and it is therefore a plain violation of the Constitution “for a state to regulate the enjoyment by citizens of their civil rights solely upon the basis of race.”
  • What all these laws had in common, Harlan argued, was that they were based on race. And yet a war had been fought and won to establish that laws in the United States could not be based on race; nor could citizenship be restricted by race. The court’s opinion in Plessy, Harlan warned, was so dreadfully in error as to constitutional principles that “the judgment this day rendered will, in time, prove to be quite as pernicious as the decision made by this tribunal in the Dred Scott Case.”128 This prediction proved true.
  • Four centuries had passed since continents, separated by oceans, had met again. A century had passed since Jefferson had declared all men equal. Three decades had passed since the Fourteenth Amendment had declared all persons born or naturalized in the United States to be citizens.
  • And now the Supreme Court ruled that those who would set aside equality in favor of separation had not violated the nation’s founding truths. In one of the most wrenching tragedies in American history—a chronicle not lacking for tragedy—the Confederacy had lost the war, but it had won the peace.
  • Lippmann started out as a socialist, when even mentioning the masses hinted at socialism; The Masses was the name of a socialist monthly, published in New York, and, especially after the Russian Revolution of 1917, which brought the Bolshevists to power (“bol’shinstvo” means “the majority”), “the masses” sounded decidedly Red.
  • But Lippmann soon began to write about the masses as “the bewildered herd,” unthinking and instinctual, and as dangerous as an impending stampede. For Lippmann, and for an entire generation of intellectuals, politicians, journalists, and bureaucrats who styled themselves Progressives—the term dates to 1910—the masses posed a threat to American democracy.
  • This change was wrought in the upheaval of the age. In the years following the realigning election of 1896, everything seemed, suddenly, bigger than before, more crowded, and more anonymous: looming and teeming. Even buildings were bigger: big office buildings, big factories, big mansions, big museums. Quantification became the only measure of value: how big, how much, how many.
  • To fight monopolies, protect the people, and conserve the land, the federal government grew bigger, too; dozens of new federal agencies were founded in this era,
  • “Mass” came to mean anything that involved a giant and possibly terrifying quantity, on a scale so great that it overwhelmed existing arrangements—including democracy. “Mass production” was coined in the 1890s, when factories got bigger and faster, when the number of people who worked in them skyrocketed, and when the men who owned them got staggeringly rich.
  • “Mass migration” dates to 1901, when nearly a million immigrants were entering the United States every year, “mass consumption” to 1905, “mass consciousness” to 1912. “Mass hysteria” had been defined by 1925 and “mass communication” by 1927, when the New York Times described the radio as “a system of mass communication with a mass audience.”3
  • And the masses themselves? They formed a mass audience for mass communication and had a tendency, psychologists believed, to mass hysteria—the political stampede—posing a political problem unanticipated by James Madison and Thomas Jefferson,
  • To meet that challenge in what came to be called the Progressive Era, activists, intellectuals, and politicians campaigned for and secured far-reaching reforms that included municipal, state, and federal legislation.
  • Their most powerful weapon was the journalistic exposé. Their biggest obstacle was the courts, which they attempted to hurdle by way of constitutional amendments. Out of these campaigns came the federal income tax, the Federal Reserve Bank, the direct election of U.S. senators, presidential primaries, minimum-wage and maximum-hour laws, women’s suffrage, and Prohibition.
  • And all of what Progressives accomplished in the management of mass democracy was vulnerable to the force that so worried the unrelenting Walter Lippmann: the malleability of public opinion, which could all too easily be molded into mass delusion.
  • Progressives championed the same causes as Populists, and took their side in railing against big business, but while Populists generally wanted less government, Progressives wanted more, seeking solutions in reform legislation and in the establishment of bureaucracies, especially government agencies.6
  • Populists believed that the system was broken; Progressives believed that the government could fix it. Conservatives, who happened to dominate the Supreme Court, didn’t believe that there was anything to fix but believed that, if there was, the market would fix it. Notwithstanding conservatives’ influence in the judiciary, Progressivism spanned both parties.
  • Woodrow Wilson himself admitted, “When I sit down and compare my views with those of a Progressive Republican I can’t see what the difference is.”7
  • Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives,
  • The Social Gospel movement was led by seminary professors—academic theologians who accepted the theory of evolution, seeing it as entirely consistent with the Bible and evidence of a divinely directed, purposeful universe; at the same time, they fiercely rejected the social Darwinism of writers like Herbert Spencer, the English natural scientist who coined the phrase “the survival of the fittest” and used the theory of evolution to defend all manner of force, violence, and oppression.
  • argued that fighting inequality produced by industrialism was an obligation of Christians: “We must make men believe that Christianity has a right to rule this kingdom of industry, as well as all the other kingdoms of this world.”9 Social Gospelers brought the zeal of abolitionism to the problem of industrialism.
  • In 1908, Methodists wrote a Social Creed and pledged to fight to end child labor and to promote a living wage. It was soon adopted by the thirty-three-member Federal Council of Churches, which proceeded to investigate a steelworkers’ strike in Bethlehem, ultimately taking the side of the strikers.10
  • In Washington, in the debate over the annexation of the Philippines, Americans revisited unsettled questions about expansion that had rent the nation during the War with Mexico and unsettled questions about citizenship that remained the unfinished business of Reconstruction. The debate also marked the limits of the Progressive vision: both sides in this debate availed themselves, at one time or another, of the rhetoric of white supremacy. Eight million people of color in the Pacific and the Caribbean, from the Philippines to Puerto Rico, were now part of the United States, a nation that already, in practice, denied the right to vote to millions of its own people because of the color of their skin.
  • “You are undertaking to annex and make a component part of this Government islands inhabited by ten millions of the colored race, one-half or more of whom are barbarians of the lowest type,” said Ben Tillman, a one-eyed South Carolina Democrat who’d boasted of having killed black men and expressed his support for lynch mobs. “It is to the injection into the body politic of the United States of that vitiated blood, that debased and ignorant people, that we object.”
  • Tillman reminded Republicans that they had not so long ago freed slaves and then “forced on the white men of the South, at the point of the bayonet, the rule and domination of those ex-slaves. Why the difference? Why the change? Do you acknowledge that you were wrong in 1868?”14
  • The war that began in Cuba in 1898 and was declared over in the Philippines in 1902 dramatically worsened conditions for people of color in the United States, who faced, at home, a campaign of terrorism. Pro-war rhetoric, filled with racist venom, only further incited American racial hatreds. “If it is necessary, every Negro in the state will be lynched,” the governor of Mississippi pledged in 1903.
  • By one estimate, someone in the South was hanged or burned alive every four days. The court’s decision in Plessy v. Ferguson meant that there was no legal recourse to fight segregation, which grew more brutal with each passing year.
  • Nor was discrimination confined to the South. Cities and counties in the North and West passed racial zoning laws, banning blacks from middle-class communities. In 1890, in Montana, blacks lived in all fifty-six counties in the state; by 1930, they’d been confined to just eleven. In Baltimore, blacks couldn’t buy houses on blocks where whites were a majority.
  • In 1917, in Buchanan v. Warley, the Supreme Court availed itself of the Fourteenth Amendment not to guarantee equal protection for blacks but to guarantee what the court had come to understand as the “liberty of contract”—the liberty of businesses to discriminate.16
  • A generation earlier, Woodrow Wilson would have become a preacher, like his father, but instead he became a professor of political science.23 In the academy and later in the White House, he dedicated himself to the problem of adapting a Constitution written in the age of the cotton gin to the age of the automobile.
  • “We have grown more and more inclined from generation to generation to look to the President as the unifying force in our complex system, the leader both of his party and of the nation. To do so is not inconsistent with the actual provisions of the Constitution; it is only inconsistent with a very mechanical theory of its meaning and intention.” A president’s power, Wilson concluded, is virtually limitless: “His office is anything he has the sagacity and force to make it.”24
  • the U.S. Supreme Court overruled much Progressive labor legislation. The most important of these decisions came in 1905. In a 5–4 decision in Lochner v. New York, the U.S. Supreme Court voided a state law establishing that bakers could work no longer than ten hours a day, six days a week, on the ground that the law violated a business owner’s liberty of contract, the freedom to forge agreements with his workers, something the court’s majority said was protected under the Fourteenth Amendment.
  • The laissez-faire conservatism of the court was informed, in part, by social Darwinism, which suggested that the parties in disputes should be left to battle it out, and if one side had an advantage, even so great an advantage as a business owner has over its employees, then it should win.
  • In a dissenting opinion in Lochner, Oliver Wendell Holmes accused the court of violating the will of the people. “This case is decided upon an economic theory which a large part of the country does not entertain,” he began. The court, he said, had also wildly overreached its authority and had carried social Darwinism into the Constitution. “A Constitution is not intended to embody a particular economic theory,” Holmes wrote. “The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.”
  • Wilson pointed out that the Constitution, written before mass industrialization, couldn’t be expected to have anticipated it, and couldn’t solve the problems industrialization had created, unless the Constitution were treated like a living thing that, like an organism, evolved.
  • Critics further to the left argued that the courts had become an instrument of business interests. Unions, in fact, often failed to support labor reform legislation, partly because they expected it to be struck down by the courts as unconstitutional, and partly because they wanted unions to provide benefits to their members, which would be an argument for organizing.
  • conservatives insisted that the courts were right to protect the interests of business and that either market forces would find a way to care for sick, injured, and old workers, or (for social Darwinists) the weakest, who were not meant to thrive, would wither and die.
  • “No other social movement in modern economic development is so pregnant with benefit to the public,” wrote the editor of the Journal of the American Medical Association. “At present the United States has the unenviable distinction of being the only great industrial nation without compulsory health insurance,” the Yale economist Irving Fisher pointed out in 1916.36 It would maintain that unenviable distinction for a century.
  • In California, the legislature passed a constitutional amendment providing for universal health insurance. But when it was put on the ballot for ratification, a federation of insurance companies took out an ad in the San Francisco Chronicle warning that it “would spell social ruin in the United States.” Every voter in the state received in the mail a pamphlet with a picture of the kaiser and the words “Born in Germany. Do you want it in California?” The measure was defeated. Opponents called universal health insurance “UnAmerican, Unsafe, Uneconomic, Unscientific, Unfair and Unscrupulous.”
  • “Scientific management has no place for a bird that can sing and won’t sing,” answered Taylor. “We are not . . . dealing with horses nor singing birds,” Wilson told Taylor. “We are dealing with men who are a part of society and for whose benefit society is organized.”
  • Jim Crow thrived because, after the end of Reconstruction in 1877, reformers who had earlier fought for the cause of civil rights abandoned it for the sake of forging a reunion between the states and the federal government and between the North and the South. This wasn’t Wilson’s doing; this was the work of his generation, the work of the generation that came before him, and the work of the generation that would follow him, an abdication of struggle, an abandonment of justice.
  • War steered the course of American politics like a gale-force wind. The specter of slaughter undercut Progressivism, suppressed socialism, and produced anticolonialism. And, by illustrating the enduring wickedness of humanity and appearing to fulfill prophecies of apocalypse as a punishment for the moral travesty of modernism, the war fueled fundamentalism.
  • Bryan’s difficulty was that he saw no difference between Darwinism and social Darwinism, but it was social Darwinism that he attacked, the brutality of a political philosophy that seemed to believe in nothing more than the survival of the fittest, or what Bryan called “the law of hate—the merciless law by which the strong crowd out and kill the weak.”77
  • Germany was the enemy, the same Germany whose model of education had secularized American colleges and universities, which were now teaching eugenics, sometimes known as the science of human betterment, calling for the elimination from the human race of people deemed unfit to reproduce on the basis of their intelligence, criminality, or background.
  • Nor was this academic research without consequence. Beginning in 1907, with Indiana, two-thirds of American states passed forced sterilization laws.
  • In 1916, Madison Grant, the president of the Museum of Natural History in New York, who had degrees from Yale and Columbia, published The Passing of the Great Race; Or, the Racial Basis of European History, a “hereditary history” of the human race, in which he identified northern Europeans (the “blue-eyed, fair-haired peoples of the north of Europe” that he called the “Nordic race”) as genetically superior to southern Europeans (the “dark-haired, dark-eyed” people he called “the Alpine race”) and lamented the presence of “swarms of Jews” and “half-breeds.” In the United States, Grant argued, the Alpine race was overwhelming the Nordic race, threatening the American republic, since “democracy is fatal to progress when two races of unequal value live side by side.”79
  • fundamentalists were, of course, making an intellectual argument, if one that not many academics wanted to hear. In 1917, William B. Riley, who, like J. Frank Norris, had trained at the Southern Baptist Theological Seminary, published a book called The Menace of Modernism, whose attack on evolution included a broader attack on the predominance in public debate of liberal faculty housed at secular universities—and the silencing of conservative opinion.
  • The horror of the war fueled the movement, convincing many evangelicals that the growing secularization of society was responsible for this grotesque parade of inhumanity: mass slaughter. “The new theology has led Germany into barbarism,” one fundamentalist argued in 1918, “and it will lead any nation into the same demoralization.”
  • “If my re-election as President depends upon my getting into war, I don’t want to be President,” Wilson said privately. “He kept us out of war” became his campaign slogan, and when Theodore Roosevelt called that an “ignoble shirking of responsibility,” Wilson countered, “I am an American, but I do not believe that any of us loves a blustering nationality.”
  • Wilson had in fact pledged not to make the world democratic, or even to support the establishment of democratic institutions everywhere, but instead to establish the conditions of stability in which democracy was possible.
  • nearly five million were called to serve. How were they to be persuaded of the war’s cause? In a speech to new recruits, Wilson’s new secretary of state, Robert Lansing, ventured an explanation. “Were every people on earth able to express their will, there would be no wars of aggression and, if there were no wars of aggression, then there would be no wars, and lasting peace would come to this earth,” Lansing said, stringing one conditional clause after another. “The only way that a people can express their will is through democratic institutions,” Lansing went on. “Therefore, when the world is made safe for democracy . . . universal peace will be an accomplished fact.”88
  • Wilson, the political scientist, tried to earn the support of the American people with an intricate theory of the relationship between democracy and peace. It didn’t work. To recast his war message and shore up popular support, he established a propaganda department,
  • Social scientists called the effect produced by wartime propaganda “herd psychology”; the philosopher John Dewey called it the “conscription of thought.”89
  • To suppress dissent, Congress passed a Sedition Act in 1918. Not since the Alien and Sedition Acts of 1798 had Congress so brazenly defied the First Amendment. Fewer than two dozen people had been arrested under the 1798 Sedition Act. During the First World War, the Justice Department charged more than two thousand Americans with sedition and convicted half of them. Appeals that went to the Supreme Court failed.
  • “If we want real peace,” Du Bois wrote, “we must extend the democratic ideal to the yellow, brown, and black peoples.” But after the United States entered the war, George Creel, the head of Wilson’s new propaganda department, called thirty-one black editors and publishers to a conference in Washington and warned them about “Negro subversion.”
  • Du Bois asked black men who could not vote in the United States to give their lives to make the world “safe for democracy” and asked black people to hold off on fighting against lynchings, whose numbers kept rising.91
  • Wilson signed a tax bill, raising taxes on incomes, doubling a tax on corporate earnings, eliminating an exemption for dividend income, and introducing an estate tax and a tax on excess profits. Rates for the wealthiest Americans rose from 2 percent to 77 percent, but most people paid no tax at all (80 percent of the revenue was drawn from the income of the wealthiest 1 percent of American families).
  • War, as ever, expanded the powers of the state. It rearranged the relationship between the federal government and business, establishing new forms of cooperation, oversight, and regulation that amounted to erecting a welfare state for business owners.
  • As the war drew to a close, the reckoning began. American losses were almost trivial compared to the staggering losses in European nations. Against America’s 116,000 casualties, France lost 1.6 million lives, Britain 800,000, and Germany 1.8 million. Cities across Europe lay in ashes; America was untouched. Europe, composed of seventeen countries before the war, had splintered into twenty-six, all of them deeply in debt, and chiefly to Americans.
  • Before the war, Americans owed $3.7 billion to foreigners; after the war, foreigners owed $12.6 billion to Americans. Even the terrifying influenza epidemic of 1918, which took 21 million lives worldwide, claimed the lives of only 675,000 Americans. The war left European economies in ruins, America’s thriving. In the United States, steel production rose by a quarter between 1913 and 1920; everywhere else, it fell by a third.98 The Armistice came on November 11, 1918.
  • Wilson left a lasting legacy: his rhetoric of self-determination contributed to a wave of popular protests in the Middle East and Asia, including a revolution in Egypt in 1919; made the nation-state the goal of stateless societies; and lies behind the emergence and force of anticolonial nationalism.100
  • Thirty black men were lynched in 1917, twice as many the next year, and in 1919, seventy-six, including ten veterans, some still wearing their uniforms, having fought, some people thought, the wrong war.101
  • In 1922, when Walter Lippmann turned thirty-two, he wrote a book called Public Opinion, in which he concluded that in a modern democracy the masses, asked to make decisions about matters far removed from their direct knowledge, had been asked to do too much. “Decisions in a modern state tend to be made by the interaction, not of Congress and the executive, but of public opinion and the executive,” he’d once observed.108 Mass democracy can’t work, Lippmann argued, because the new tools of mass persuasion—especially mass advertising—meant that a tiny minority could very easily persuade the majority to believe whatever it wished them to believe.
  • The best hope for mass democracy might have seemed to be the scrupulously and unfailingly honest reporting of news, but this, Lippmann thought, was doomed to fall short, because of the gap between facts and truth.
  • Reporters chronicle events, offering facts, but “they cannot govern society by episodes, incidents, and eruptions,” he said.109 To govern, the people need truth, sense out of the whole, but people can’t read enough in the morning paper or hear enough on the evening news to turn facts into truth when they’re driven like dray horses all day.
Javier E

How 9/11 changed us - Washington Post - 0 views

  • “The U.S. government must define what the message is, what it stands for,” the report asserts. “We should offer an example of moral leadership in the world, committed to treat people humanely, abide by the rule of law, and be generous and caring to our neighbors. . . . We need to defend our ideals abroad vigorously. America does stand up for its values.”
  • the authors pause to make a rousing case for the power of the nation’s character.
  • Rather than exemplify the nation’s highest values, the official response to 9/11 unleashed some of its worst qualities: deception, brutality, arrogance, ignorance, delusion, overreach and carelessness.
  • ...103 more annotations...
  • Reading or rereading a collection of such books today is like watching an old movie that feels more anguishing and frustrating than you remember. The anguish comes from knowing how the tale will unfold; the frustration from realizing that this was hardly the only possible outcome.
  • This conclusion is laid bare in the sprawling literature to emerge from 9/11 over the past two decades
  • Whatever individual stories the 9/11 books tell, too many describe the repudiation of U.S. values, not by extremist outsiders but by our own hand.
  • In these works, indifference to the growing terrorist threat gives way to bloodlust and vengeance after the attacks. Official dissembling justifies wars, then prolongs them. In the name of counterterrorism, security is politicized, savagery legalized and patriotism weaponized.
  • that state of exception became our new American exceptionalism.
  • The latest works on the legacy of 9/11 show how war-on-terror tactics were turned on religious groups, immigrants and protesters in the United States. The war on terror came home, and it walked in like it owned the place.
  • It happened fast. By 2004, when the 9/11 Commission urged America to “engage the struggle of ideas,” it was already too late; the Justice Department’s initial torture memos were already signed, the Abu Ghraib images had already eviscerated U.S. claims to moral authority.
  • “It is for now far easier for a researcher to explain how and why September 11 happened than it is to explain the aftermath,” Steve Coll writes in “Ghost Wars,” his 2004 account of the CIA’s pre-9/11 involvement in Afghanistan. Throughout that aftermath, Washington fantasized about remaking the world in its image, only to reveal an ugly image of itself to the world.
  • “We anticipate a black future for America,” bin Laden told ABC News more than three years before the 9/11 attacks. “Instead of remaining United States, it shall end up separated states and shall have to carry the bodies of its sons back to America.”
  • bin Laden also came to grasp, perhaps self-servingly, the benefits of luring Washington into imperial overreach, of “bleeding America to the point of bankruptcy,” as he put it in 2004, through endless military expansionism, thus beating back its global sway and undermining its internal unity.
  • To an unnerving degree, the United States moved toward the enemy’s fantasies of what it might become — a nation divided in its sense of itself, exposed in its moral and political compromises, conflicted over wars it did not want but would not end.
  • “The most frightening aspect of this new threat . . . was the fact that almost no one took it seriously. It was too bizarre, too primitive and exotic.” That is how Lawrence Wright depicts the early impressions of bin Laden and his terrorist network among U.S. officials
  • The books traveling that road to 9/11 have an inexorable, almost suffocating feel to them, as though every turn invariably leads to the first crush of steel and glass.
  • With the system “blinking red,” as CIA Director George Tenet later told the 9/11 Commission, why were all these warnings not enough? Wright lingers on bureaucratic failings
  • Clarke’s conclusion is simple, and it highlights America’s we-know-better swagger, a national trait that often masquerades as courage or wisdom. “America, alas, seems only to respond well to disasters, to be undistracted by warnings,” he writes. “Our country seems unable to do all that must be done until there has been some awful calamity.”
  • The problem with responding only to calamity is that underestimation is usually replaced by overreaction. And we tell ourselves it is the right thing, maybe the only thing, to do.
  • A last-minute flight change. A new job at the Pentagon. A retirement from the fire station. The final tilt of a plane’s wings before impact. If the books about the lead-up to 9/11 are packed with unbearable inevitability, the volumes on the day itself highlight how randomness separated survival from death.
  • Had the World Trade Center, built in the late 1960s and early 1970s, been erected according to the city building code in effect since 1938, Dwyer and Flynn explain, “it is likely that a very different world trade center would have been built.”
  • Instead, it was constructed according to a new code that the real estate industry had avidly promoted, a code that made it cheaper and more lucrative to build and own skyscrapers. “It increased the floor space available for rent . . . by cutting back on the areas that had been devoted, under the earlier law, to evacuation and exit,” the authors write. The result: Getting everybody out on 9/11 was virtually impossible.
  • The towers embodied the power of American capitalism, but their design embodied the folly of American greed. On that day, both conditions proved fatal.
  • Garrett Graff quotes Defense Department officials marveling at how American Airlines Flight 77 struck a part of the Pentagon that, because of new anti-terrorism standards, had recently been reinforced and renovated
  • “In any other wedge of the Pentagon, there would have been 5,000 people, and the plane would have flown right through the middle of the building.” Instead, fewer than 200 people were killed in the attack on the Pentagon, including the passengers on the hijacked jet. Chance and preparedness came together.
  • The bravery of police and firefighters is the subject of countless 9/11 retrospectives, but these books also emphasize the selflessness of civilians who morphed into first responders
  • The passengers had made phone calls when the hijacking began and had learned the fate of other aircraft that day. “According to one call, they voted on whether to rush the terrorists in an attempt to retake the plane,” the commission report states. “They decided, and acted.”
  • The civilians aboard United Airlines Flight 93, whose resistance forced the plane to crash into a Pennsylvania field rather than the U.S. Capitol, were later lionized as emblems of swashbuckling Americana
  • Such episodes, led by ordinary civilians, embodied values that the 9/11 Commission called on the nation to display. Except those values would soon be dismantled, in the name of security, by those entrusted to uphold them.
  • Lawyering to death. The phrase appears in multiple 9/11 volumes, usually uttered by top officials adamant that they were going to get things done, laws and rules be damned
  • “I had to show the American people the resolve of a commander in chief that was going to do whatever it took to win,” Bush explains. “No yielding. No equivocation. No, you know, lawyering this thing to death.” In “Against All Enemies,” Clarke recalls the evening of Sept. 11, 2001, when Bush snapped at an official who suggested that international law looked askance at military force as a tool of revenge. “I don’t care what the international lawyers say, we are going to kick some ass,” the president retorted.
  • The message was unmistakable: The law is an obstacle to effective counterterrorism
  • Except, they did lawyer this thing to death. Instead of disregarding the law, the Bush administration enlisted it. “Beginning almost immediately after September 11, 2001, [Vice President Dick] Cheney saw to it that some of the sharpest and best-trained lawyers in the country, working in secret in the White House and the United States Department of Justice, came up with legal justifications for a vast expansion of the government’s power in waging war on terror,
  • Through public declarations and secret memos, the administration sought to remove limits on the president’s conduct of warfare and to deny terrorism suspects the protections of the Geneva Conventions by redefining them as unlawful enemy combatants. Nothing, Mayer argues of the latter effort, “more directly cleared the way for torture than this.”
  • Tactics such as cramped confinement, sleep deprivation and waterboarding were rebranded as “enhanced interrogation techniques,” legally and linguistically contorted to avoid the label of torture. Though the techniques could be cruel and inhuman, the OLC acknowledged in an August 2002 memo, they would constitute torture only if they produced pain equivalent to organ failure or death, and if the individual inflicting such pain really, really meant to do so: “Even if the defendant knows that severe pain will result from his actions, if causing such harm is not his objective, he lacks the requisite specific intent.” It’s quite the sleight of hand, with torture moving from the body of the interrogated to the mind of the interrogator.
  • the memo concludes that none of it actually matters. Even if a particular interrogation method would cross some legal line, the relevant statute would be considered unconstitutional because it “impermissibly encroached” on the commander in chief’s authority to conduct warfare
  • You have informed us. Experts you have consulted. Based on your research. You do not anticipate. Such hand-washing words appear throughout the memos. The Justice Department relies on information provided by the CIA to reach its conclusions; the CIA then has the cover of the Justice Department to proceed with its interrogations. It’s a perfect circle of trust.
  • In these documents, lawyers enable lawlessness. Another May 2005 memo concludes that, because the Convention Against Torture applies only to actions occurring under U.S. jurisdiction, the CIA’s creation of detention sites in other countries renders the convention “inapplicable.”
  • David Cole describes the documents as “bad-faith lawyering,” which might be generous. It is another kind of lawyering to death, one in which the rule of law that the 9/11 Commission urged us to abide by becomes the victim.
  • Similarly, because the Eighth Amendment’s prohibition on cruel and unusual punishment is meant to protect people convicted of crimes, it should not apply to terrorism detainees — because they have not been officially convicted of anything. The lack of due process conveniently eliminates constitutional protections
  • Years later, the Senate Intelligence Committee would investigate the CIA’s post-9/11 interrogation program. Its massive report — the executive summary of which appeared as a 549-page book in 2014 — found that torture did not produce useful intelligence, that the interrogations were more brutal than the CIA let on, that the Justice Department did not independently verify the CIA’s information, and that the spy agency impeded oversight by Congress and the CIA inspector general.
  • “The CIA’s effectiveness representations were almost entirely inaccurate,” the Senate report concluded. It is one of the few lies of the war on terror unmasked by an official government investigation and public report, but just one of the many documented in the 9/11 literature.
  • Officials in the war on terror didn’t deceive or dissemble just with lawmakers or the public. In the recurring tragedy of war, they lied just as often to themselves.
  • “The decision to invade Iraq was one made, finally and exclusively, by the president of the United States, George W. Bush,” Robert Draper writes.
  • In Woodward’s “Bush at War,” the president admitted that before 9/11, “I didn’t feel that sense of urgency [about al-Qaeda], and my blood was not nearly as boiling.”
  • A president initially concerned about defending and preserving the nation’s moral goodness against terrorism found himself driven by darker impulses. “I’m having difficulty controlling my bloodlust,” Bush confessed to religious leaders in the Oval Office on Sept. 20, 2001,
  • Bloodlust, moral certainty and sudden vulnerability make a dangerous combination. The belief that you are defending good against evil can lead to the belief that whatever you do to that end is good, too.
  • Draper distills Bush’s worldview: “The terrorists’ primary objective was to destroy America’s freedom. Saddam hated America. Therefore, he hated freedom. Therefore, Saddam was himself a terrorist, bent on destroying America and its freedom.”
  • The president assumed the worst about what Hussein had done or might do, yet embraced best-case scenarios of how an American invasion would proceed.
  • “Iraqis would rejoice at the sight of their Western liberators,” Draper recaps. “Their newly shared sense of national purpose would overcome any sectarian allegiances. Their native cleverness would make up for their inexperience with self-government. They would welcome the stewardship of Iraqi expatriates who had not set foot in Baghdad in decades. And their oil would pay for everything.”
  • It did not seem to occur to Bush and his advisers that Iraqis could simultaneously hate Hussein and resent the Americans — feelings that could have been discovered by speaking to Iraqis and hearing their concerns.
  • Shadid’s is one of the few books on the war that get deep inside Iraqis’ aversion to the Americans in their midst. “What gives them the right to change something that’s not theirs in the first place?” a woman in a middle-class Baghdad neighborhood asks him. “I don’t like your house, so I’m going to bomb it and you can rebuild it again the way I want it, with your money?”
  • The occupation did not dissuade such impressions when it turned the former dictator’s seat of government into its own luxurious Green Zone, or when it retrofitted the Abu Ghraib prison (“the worst of Saddam’s hellholes,” Shadid calls it) into its own chamber of horrors.
  • Shadid hears early talk of the Americans as “kuffar” (heathens); a 51-year-old former teacher complains that “we’ve exchanged a tyrant for an occupier.”
  • Shadid understood that governmental legitimacy — who gets to rule, and by what right — was a matter of overriding importance for Iraqis. “The Americans never understood the question,” he writes; “Iraqis never agreed on the answer.”
  • When the United States so quickly shifted from liberation to occupation, it lost whatever legitimacy it enjoyed. “Bush handed that enemy precisely what it wanted and needed, proof that America was at war with Islam, that we were the new Crusaders come to occupy Muslim land,” Clarke writes. “It was as if Usama bin Laden, hidden in some high mountain redoubt, were engaging in long-range mind control of George Bush, chanting ‘invade Iraq, you must invade Iraq.’ ”
  • The foolishness and arrogance of the American occupation didn’t help. In “Imperial Life in the Emerald City: Inside Iraq’s Green Zone,” Rajiv Chandrasekaran explains how, even as daily security was Iraqis’ overwhelming concern, viceroy L. Paul Bremer, Bush’s man in Baghdad, was determined to turn the country into a model free-market economy, complete with new investment laws, bankruptcy courts and a state-of-the-art stock exchange.
  • a U.S. Army general, when asked by local journalists why American helicopters must fly so low at night, thus scaring Iraqi children, replied that the kids were simply hearing “the sound of freedom.” Message: Freedom sounds terrifying.
  • For some Americans, inflicting that terror became part of the job, one more tool in the arsenal. In “The Forever War” by Dexter Filkins, a U.S. Army lieutenant colonel in Iraq assures the author that “with a heavy dose of fear and violence, and a lot of money for projects, I think we can convince these people that we are here to help them.”
  • Chandrasekaran recalls the response of a top communications official under Bremer, when reporters asked about waves of violence hitting Baghdad in the spring of 2004. “Off the record: Paris is burning,” the official told the journalists. “On the record: Security and stability are returning to Iraq.”
  • the Iraq War, conjured in part out of the false connections between Iraq and al-Qaeda, ended up helping the terrorist network: It pulled resources from the war in Afghanistan, gave space for bin Laden’s men to regroup and spurred a new generation of terrorists in the Middle East. “A bigger gift to bin Laden was hard to imagine,” Bergen writes.
  • “U.S. officials had no need to lie or spin to justify the war,” Washington Post reporter Craig Whitlock writes in “The Afghanistan Papers,” a damning contrast of the war’s reality vs. its rhetoric. “Yet leaders at the White House, the Pentagon and the State Department soon began to make false assurances and to paper over setbacks on the battlefield.” As the years passed, the deceit became entrenched, what Whitlock calls “an unspoken conspiracy” to hide the truth.
  • Afghanistan was where al-Qaeda, supported by the Taliban, had made its base — it was supposed to be the good war, the right war, the war of necessity and not choice, the war endorsed at home and abroad.
  • If Iraq was the war born of lies, Afghanistan was the one nurtured by them
  • Whitlock finds commanding generals privately admitting that they long fought the war “without a functional strategy.” That, two years into the conflict, Rumsfeld complained that he had “no visibility into who the bad guys are.”
  • That Army Lt. Gen. Douglas Lute, a former coordinator of Iraq and Afghanistan policy, acknowledged that “we didn’t have the foggiest idea of what we were undertaking.”
  • That U.S. officials long wanted to withdraw American forces but feared — correctly so, it turns out — that the Afghan government might collapse. “Bin Laden had hoped for this exact scenario,” Whitlock observes. “To lure the U.S. superpower into an unwinnable guerrilla conflict that would deplete its national treasury and diminish its global influence.”
  • All along, top officials publicly contradicted these internal views, issuing favorable accounts of steady progress
  • Bad news was twisted into good: Rising suicide attacks in Kabul meant the Taliban was too weak for direct combat, for instance, while increased U.S. casualties meant America was taking the fight to the enemy.
  • The deceptions transpired across U.S. presidencies, but the Obama administration, eager to show that its first-term troop surge was working, “took it to a new level, hyping figures that were misleading, spurious or downright false,” Whitlock writes. And then under President Donald Trump, he adds, the generals felt pressure to “speak more forcefully and boast that his war strategy was destined to succeed.”
  • “In public, almost no senior government officials had the courage to admit that the United States was slowly losing,” Whitlock writes. “With their complicit silence, military and political leaders avoided accountability and dodged reappraisals that could have changed the outcome or shortened the conflict.”
  • Deputy Secretary of State Richard Armitage traveled to Moscow shortly after 9/11 to give officials a heads up about the coming hostilities in Afghanistan. The Russians, recent visitors to the graveyard of empires, cautioned that Afghanistan was an “ambush heaven” and that, in the words of one of them, “you’re really going to get the hell kicked out of you.”
  • a war should not be measured only by the timing and the competence of its end. We still face an equally consequential appraisal: How good was this good war if it could be sustained only by lies?
  • In the two decades since the 9/11 attacks, the United States has often attempted to reconsider its response
  • They are written as though intending to solve problems. But they can be read as proof that the problems have no realistic solution, or that the only solution is to never have created them.
  • the report sets the bar for staying so high that an exit strategy appears to be its primary purpose.
  • The counterinsurgency manual is an extraordinary document. Implicitly repudiating notions such as “shock and awe” and “overwhelming force,” it argues that the key to battling an insurgency in countries such as Iraq and Afghanistan is to provide security for the local population and to win its support through effective governance
  • It also attempts to grasp the nature of America’s foes. “Most enemies either do not try to defeat the United States with conventional operations or do not limit themselves to purely military means,” the manual states. “They know that they cannot compete with U.S. forces on those terms. Instead, they try to exhaust U.S. national will.” Exhausting America’s will is an objective that al-Qaeda understood well.
  • “Counterinsurgents should prepare for a long-term commitment,” the manual states. Yet, just a few pages later, it admits that “eventually all foreign armies are seen as interlopers or occupiers.” How to accomplish the former without descending into the latter? No wonder so many of the historical examples of counterinsurgency that the manual highlights, including accounts from the Vietnam War, are stories of failure.
  • “Soldiers and Marines are expected to be nation builders as well as warriors,” the manual proclaims, but the arduous tasks involved — reestablishing government institutions, rebuilding infrastructure, strengthening local security forces, enforcing the rule of law — reveal the tension at the heart of the new doctrine
  • In his foreword, Army Lt. Col. John Nagl writes that the document’s most lasting impact may be as a catalyst not for remaking Iraq or Afghanistan, but for transforming the Army and Marine Corps into “more effective learning organizations,” better able to adapt to changing warfare. And in her introduction, Sarah Sewall, then director of Harvard’s Carr Center for Human Rights Policy, concludes that its “ultimate value” may be in warning civilian officials to think hard before engaging in a counterinsurgency campaign.
  • “The thing that got to everyone,” Finkel explains in the latter book, “was not having a defined front line. It was a war in 360 degrees, no front to advance toward, no enemy in uniform, no predictable patterns, no relief.” It’s a powerful summation of battling an insurgency.
  • Hitting the wrong house is what counterinsurgency doctrine is supposed to avoid. Even successfully capturing or killing a high-value target can be counterproductive if in the process you terrorize a community and create more enemies. In Iraq, the whole country was the wrong house. America’s leaders knew it was the wrong house. They hit it anyway.
  • Another returning soldier, Nic DeNinno, struggles to tell his wife about the time he and his fellow soldiers burst into an Iraqi home in search of a high-value target. He threw a man down the stairs and held another by the throat. After they left, the lieutenant told him it was the wrong house. “The wrong f---ing house,” Nic says to his wife. “One of the things I want to remember is how many times we hit the wrong house.”
  • “As time passes, more documents become available, and the bare facts of what happened become still clearer,” the report states. “Yet the picture of how those things happened becomes harder to reimagine, as that past world, with its preoccupations and uncertainty, recedes.” Before making definitive judgments, then, they ask themselves “whether the insights that seem apparent now would really have been meaningful at the time.”
  • Two of the latest additions to the canon, “Reign of Terror” by Spencer Ackerman and “Subtle Tools” by Karen Greenberg, draw straight, stark lines between the earliest days of the war on terror and its mutations in our current time, between conflicts abroad and divisions at home. These works show how 9/11 remains with us, and how we are still living in the ruins.
  • When Trump declared that “we don’t have victories anymore” in his 2015 speech announcing his presidential candidacy, he was both belittling the legacy of 9/11 and harnessing it to his ends. “His great insight was that the jingoistic politics of the War on Terror did not have to be tied to the War on Terror itself,” Ackerman writes. “That enabled him to tell a tale of lost greatness.” And if greatness is lost, someone must have taken it.
  • “Trump had learned the foremost lesson of 9/11,” Ackerman writes, “that the terrorists were whomever you said they were.”
  • The backlash against Muslims, against immigrants crossing the southern border and against protesters rallying for racial justice was strengthened by the open-ended nature of the global war on terror.
  • the war is not just far away in Iraq or Afghanistan, in Yemen or Syria, but it’s happening here, with mass surveillance, militarized law enforcement and the rebranding of immigration as a threat to the nation’s security rather than a cornerstone of its identity
  • Both authors point to the Authorization for Use of Military Force, drafted by administration lawyers and approved by Congress just days after the attacks, as the moment when America’s response began to go awry. The brief joint resolution allowed the president to use “all necessary and appropriate force” against any nation, organization or person who committed the attacks, and to prevent any future ones.
  • It was the “Ur document in the war on terror and its legacy,” Greenberg writes. “Riddled with imprecision, its terminology was geared to codify expansive powers.” Where the battlefield, the enemy and the definition of victory all remain vague, war becomes endlessly expansive, “with neither temporal nor geographical boundaries.”
  • This was the moment the war on terror was “conceptually doomed,” Ackerman concludes. This is how you get a forever war.
  • There were moments when an off-ramp was visible. The killing of bin Laden in 2011 was one such instance, Ackerman argues, but “Obama squandered the best chance anyone could ever have to end the 9/11 era.”
  • The author assails Obama for making the war on terror more “sustainable” through a veneer of legality — banning torture yet failing to close the detention camp at Guantánamo Bay and relying on drone strikes that “perversely incentivized the military and the CIA to kill instead of capture.”
  • There would always be more targets, more battlefields, regardless of president or party. Failures became the reason to double down, never wind down.
  • The longer the war went on, the more that what Ackerman calls its “grotesque subtext” of nativism and racism would move to the foreground of American politics
  • Absent the war on terror, it is harder to imagine a presidential candidate decrying a sitting commander in chief as foreign, Muslim, illegitimate — and using that lie as a successful political platform.
  • Absent the war on terror, it is harder to imagine a travel ban against people from Muslim-majority countries. Absent the war on terror, it is harder to imagine American protesters labeled terrorists, or a secretary of defense describing the nation’s urban streets as a “battle space” to be dominated
  • In his latest book on bin Laden, Bergen argues that 9/11 was a major tactical success but a long-term strategic failure for the terrorist leader. Yes, he struck a vicious blow against “the head of the snake,” as he called the United States, but “rather than ending American influence in the Muslim world, the 9/11 attacks greatly amplified it,” with two lengthy, large-scale invasions and new bases established throughout the region.
  • “A vastly different America has taken root” in the two decades since 9/11, Greenberg writes. “In the name of retaliation, ‘justice,’ and prevention, fundamental values have been cast aside.”
  • the legacy of the 9/11 era is found not just in Afghanistan or Iraq, but also in an America that drew out and heightened some of its ugliest impulses — a nation that is deeply divided (like those “separated states” bin Laden imagined); that bypasses inconvenient facts and embraces conspiracy theories; that demonizes outsiders; and that, after failing to spread freedom and democracy around the world, seems less inclined to uphold them here
  • Seventeen years after the 9/11 Commission called on the United States to offer moral leadership to the world and to be generous and caring to our neighbors, our moral leadership is in question, and we can barely be generous and caring to ourselves.
  • Still reeling from an attack that dropped out of a blue sky, America is suffering from a sort of post-traumatic stress democracy. It remains in recovery, still a good country, even if a broken good country.
  • 9/11 was a test. The books of the last two decades show how America failed.
  • Deep within the catalogue of regrets that is the 9/11 Commission report
Javier E

AI is already writing books, websites and online recipes - The Washington Post - 0 views

  • Experts say those books are likely just the tip of a fast-growing iceberg of AI-written content spreading across the web as new language software allows anyone to rapidly generate reams of prose on almost any topic. From product reviews to recipes to blog posts and press releases, human authorship of online material is on track to become the exception rather than the norm.
  • Semrush, a leading digital marketing firm, recently surveyed its customers about their use of automated tools. Of the 894 who responded, 761 said they’ve at least experimented with some form of generative AI to produce online content, while 370 said they now use it to help generate most if not all of their new content, according to Semrush Chief Strategy Officer Eugene Levin.
  • What that may mean for consumers is more hyper-specific and personalized articles — but also more misinformation and more manipulation, about politics, products they may want to buy and much more.
  • ...32 more annotations...
  • As AI writes more and more of what we read, vast, unvetted pools of online data may not be grounded in reality, warns Margaret Mitchell, chief ethics scientist at the AI start-up Hugging Face
  • “The main issue is losing track of what truth is,” she said. “Without grounding, the system can make stuff up. And if it’s that same made-up thing all over the world, how do you trace it back to what reality is?”
  • a raft of online publishers have been using automated writing tools based on ChatGPT’s predecessors, GPT-2 and GPT-3, for years. That experience shows that a world in which AI creations mingle freely and sometimes imperceptibly with human work isn’t speculative; it’s flourishing in plain sight on Amazon product pages and in Google search results.
  • “If you have a connection to the internet, you have consumed AI-generated content,” said Jonathan Greenglass, a New York-based tech investor focused on e-commerce. “It’s already here.”
  • “In the last two years, we’ve seen this go from being a novelty to being pretty much an essential part of the workflow,”
  • the news credibility rating company NewsGuard identified 49 news websites across seven languages that appeared to be mostly or entirely AI-generated.
  • The sites sport names like Biz Breaking News, Market News Reports, and bestbudgetUSA.com; some employ fake author profiles and publish hundreds of articles a day, the company said. Some of the news stories are fabricated, but many are simply AI-crafted summaries of real stories trending on other outlets.
  • Ingenio, the San Francisco-based online publisher behind sites such as horoscope.com and astrology.com, is among those embracing automated content. While its flagship horoscopes are still human-written, the company has used OpenAI’s GPT language models to launch new sites such as sunsigns.com, which focuses on celebrities’ birth signs, and dreamdiary.com, which interprets highly specific dreams.
  • Ingenio used to pay humans to write birth sign articles on a handful of highly searched celebrities like Michael Jordan and Ariana Grande, said Josh Jaffe, president of its media division. But delegating the writing to AI allows sunsigns.com to cheaply crank out countless articles on not-exactly-A-listers
  • In the past, Jaffe said, “We published a celebrity profile a month. Now we can do 10,000 a month.”
  • It isn’t just text. Google users have recently posted examples of the search engine surfacing AI-generated images. For instance, a search for the American artist Edward Hopper turned up an AI image in the style of Hopper, rather than his actual art, as the first result.
  • Jaffe said he isn’t particularly worried that AI content will overwhelm the web. “It takes time for this content to rank well” on Google, he said — meaning that it appears on the first page of search results for a given query, which is critical to attracting readers. And it works best when it appears on established websites that already have a sizable audience: “Just publishing this content doesn’t mean you have a viable business.”
  • Google clarified in February that it allows AI-generated content in search results, as long as the AI isn’t being used to manipulate a site’s search rankings. The company said its algorithms focus on “the quality of content, rather than how content is produced.”
  • Reputations are at risk if the use of AI backfires. CNET, a popular tech news site, took flak in January when fellow tech site Futurism reported that CNET had been using AI to create articles or add to existing ones without clear disclosures. CNET subsequently investigated and found that many of its 77 AI-drafted stories contained errors.
  • But CNET’s parent company, Red Ventures, is forging ahead with plans for more AI-generated content, which has also been spotted on Bankrate.com, its popular hub for financial advice. Meanwhile, CNET in March laid off a number of employees, a move it said was unrelated to its growing use of AI.
  • BuzzFeed, which pioneered a media model built around reaching readers directly on social platforms like Facebook, announced in January it planned to make “AI inspired content” part of its “core business,” such as using AI to craft quizzes that tailor themselves to each reader. BuzzFeed announced last month that it is laying off 15 percent of its staff and shutting down its news division, BuzzFeed News.
  • it’s finding traction in the murkier worlds of online clickbait and affiliate marketing, where success is less about reputation and more about gaming the big tech platforms’ algorithms.
  • That business is driven by a simple equation: how much it costs to create an article vs. how much revenue it can bring in. The main goal is to attract as many clicks as possible, then serve the readers ads worth just fractions of a cent on each visit — the classic form of clickbait
  • In the past, such sites often outsourced their writing to businesses known as “content mills,” which harness freelancers to generate passable copy for minimal pay. Now, some are bypassing content mills and opting for AI instead.
  • “Previously it would cost you, let’s say, $250 to write a decent review of five grills,” Semrush’s Levin said. “Now it can all be done by AI, so the cost went down from $250 to $10.”
  • The problem, Levin said, is that the wide availability of tools like ChatGPT means more people are producing similarly cheap content, and they’re all competing for the same slots in Google search results or Amazon’s on-site product reviews
  • So they all have to crank out more and more article pages, each tuned to rank highly for specific search queries, in hopes that a fraction will break through. The result is a deluge of AI-written websites, many of which are never seen by human eyes.
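A back-of-the-envelope sketch can make that arithmetic concrete. This is a minimal Python illustration, assuming the $250 and $10 production costs quoted above and a hypothetical revenue of $0.002 per visit (the article says only "fractions of a cent"); the figures and function names are mine, not the article's.

```python
# Break-even clicks for the clickbait equation: cost to produce an article
# vs. the ad revenue it brings in. Revenue-per-click is an assumption.
import math

COST_HUMAN = 250.00        # human-written review of five grills (quoted above)
COST_AI = 10.00            # the same review generated with AI (quoted above)
REVENUE_PER_CLICK = 0.002  # hypothetical "fraction of a cent" per visit

def breakeven_clicks(cost: float) -> int:
    """Visits needed before an article covers its production cost."""
    return math.ceil(cost / REVENUE_PER_CLICK)

print(f"human-written article breaks even at {breakeven_clicks(COST_HUMAN):,} clicks")
print(f"AI-written article breaks even at {breakeven_clicks(COST_AI):,} clicks")
# 125,000 clicks vs. 5,000: the economics tilt sharply toward mass-produced
# AI pages, even if most of them are never seen by human eyes.
```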
  • Jaffe said his company discloses its use of AI to readers, and he promoted the strategy at a recent conference for the publishing industry. “There’s nothing to be ashamed of,” he said. “We’re actually doing people a favor by leveraging generative AI tools” to create niche content that wouldn’t exist otherwise.
  • The rise of AI is already hurting the business of Textbroker, a leading content platform based in Germany and Las Vegas, said Jochen Mebus, the company’s chief revenue officer. While Textbroker prides itself on supplying credible, human-written copy on a huge range of topics, “People are trying automated content right now, and so that has slowed down our growth,”
  • Mebus said the company is prepared to lose some clients who are just looking to make a “fast dollar” on generic AI-written content. But it’s hoping to retain those who want the assurance of a human touch, while it also trains some of its writers to become more productive by employing AI tools themselves.
  • He said a recent survey of the company’s customers found that 30 to 40 percent still want exclusively “manual” content, while a similar-size chunk is looking for content that might be AI-generated but human-edited to check for tone, errors and plagiarism.
  • Levin said Semrush’s clients have also generally found that AI is better used as a writing assistant than a sole author. “We’ve seen people who even try to fully automate the content creation process,” he said. “I don’t think they’ve had really good results with that. At this stage, you need to have a human in the loop.”
  • For Cowell, whose book title appears to have inspired an AI-written copycat, the experience has dampened his enthusiasm for writing. “My concern is less that I’m losing sales to fake books, and more that this low-quality, low-priced, low-effort writing is going to have a chilling effect on humans considering writing niche technical books in the future,”
  • It doesn’t help, he added, knowing that “any text I write will inevitably be fed into an AI system that will generate even more competition.”
  • Amazon removed the impostor book, along with numerous others by the same publisher, after The Post contacted the company for comment.
  • AI-written books aren’t against Amazon’s rules, per se, and some authors have been open about using ChatGPT to write books sold on the site.
  • “Amazon is constantly evaluating emerging technologies and innovating to provide a trustworthy shopping experience for our customers,”
Javier E

Opinion | How AI is transforming education at the University of Mississippi - The Washi... - 0 views

  • Perplexity AI “unlocks the power of knowledge with information discovery and sharing.” This, it turns out, means “does research.” Type something into it, and it spits out a comprehensive answer, always sourced and sometimes bulleted. You might say this is just Google on steroids — but really, it is Google with a bibliography.
  • Caleb Jackson, a 22-year-old junior at Ole Miss studying part time, is a fan. This way, he doesn’t have to spend hours between night shifts and online classes trawling the internet for sources. Perplexity can find them, and he can get to writing that much sooner.
  • What’s most important to Ole Miss faculty members is that students use these tools with integrity. If the university doesn’t have a campuswide AI honor code, and so far it doesn’t, individual classes should. And no matter whether professors permit all applications of AI, as some teachers have tried, or only the narrowest, students should have to disclose just how much help they had from robots.
  • ...25 more annotations...
  • “Write a five-paragraph essay on Virginia Woolf’s ‘To the Lighthouse.’” Too generic? Well, how about “Write a five-paragraph essay on the theme of loss in ‘To the Lighthouse’”? Too high-schoolish? “Add some bigger words, please.” The product might not be ready to turn in the moment it is born, fully formed, from ChatGPT’s head. But with enough tweaking — either by the student or by the machine at the student’s demand — chances are the output can muster at least a passing grade.
  • Which of these uses are okay? Which aren’t? The harnessing of an AI tool to create an annotated bibliography likely doesn’t rankle even librarians the way relying on that same tool to draft a reflection on Virginia Woolf offends the professor of the modern novel. Why? Because that kind of contemplation goes closer to the heart of what education is really about.
  • the core of the question colleges now face. They can’t really stop students from using AI in class. They might not be able to notice students have done so at all, and when they do think they’ve noticed they’ll be acting only on suspicion. But maybe teachers can control the ways in which students use AI in class.
  • Figuring out exactly what ways those ought to be requires educators to determine what they care about in essays — what they are desperate to hear. The purpose of these papers is for students to demonstrate what they’ve learned, from hard facts to compositional know-how, and for teachers to assess how their pupils are progressing. The answer to what teachers want to get from students in their written work depends on what they want to give to students.
  • ChatGPT is sort of in a class of its own, because it can be almost anything its users want it to be so long as they possess one essential skill: prompt engineering. This means, basically, manipulating the machine not only into giving you an answer but also into giving you the kind of answer you’re looking for.
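As a rough illustration of that refinement loop (following the Woolf example above), here is a minimal sketch assuming the `openai` Python package's v1-style client and an API key in the environment; the model name and the exact wording of the prompts are illustrative assumptions, not details from the article.

```python
# Prompt engineering as an iterative loop: keep the conversation thread and
# steer the model toward the kind of answer you are looking for.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
messages = []

refinements = [
    "Write a five-paragraph essay on Virginia Woolf's 'To the Lighthouse'.",
    "Too generic. Focus the essay on the theme of loss.",   # narrow the topic
    "Too high-schoolish. Add some bigger words, please.",   # adjust the register
]

for prompt in refinements:
    messages.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    draft = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": draft})  # keep the thread

print(draft)  # the tweaked draft that "can muster at least a passing grade"
```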
  • The next concern is that students should use AI in a manner that improves not only their writing but also their thinking — in short, in a manner that enhances learning rather than bypasses the need to learn at all.
  • This simple principle makes for complicated practice. Certainly, no one is going to learn anything by letting AI write an essay in its entirety. What about letting AI brainstorm an idea, on the other hand, or write an outline, or gin up a counter-argument? Lyndsey Cook, a senior at Ole Miss planning a career in nursing, finds the brainstorming especially helpful: She’ll ask ChatGPT or another tool to identify the themes in a piece of literature, and then she’ll go back and look for them herself.
  • These shortcuts, on the one hand, might interfere with students’ learning to brainstorm, outline or see the other side of things on their own
  • But — here comes a human-generated counterargument — they may also aid students in surmounting obstacles in their composition that otherwise would have stopped them short. That’s particularly true of kids whose high schools didn’t send them to college already equipped with these capabilities.
  • Allow AI to boost you over these early hurdles, and suddenly the opportunity for deeper learning — the opportunity to really write — will open up. That’s how Caleb Jackson, the part-time student for whom Perplexity has been such a boon, sees it: His professor, he says, wanted them to “get away from the high-school paper and go further, to write something larger like a thesis.”
  • maybe, as one young Ole Miss faculty member put it to me, this risks “losing the value of the struggle.” That, she says, is what she is scared will go away.
  • All this invites the most important question there is: What is learning for?
  • Learning, in college, can be instrumental. According to this view, the aim of teaching is to prepare students to live in the real world, so all that really matters is whether they have the chops to field jobs that feed themselves and their families. Perhaps knowing how to use AI to do any given task for you, then, is one of the most valuable skills out there — the same way it pays to be quick with a calculator.
  • If you accept this line of argument, however, there are still drawbacks to robotic crutches. Some level of critical thinking is necessary to function as an adult, and if AI stymies its development even the instrumental aim of education is thwarted. The same goes for that “value of the struggle.” The real world is full of adversity, much of which the largest language model can’t tell you how to overcome.
  • more compelling is the idea, probably shared by most college professors, that learning isn’t only instrumental after all — that it has intrinsic value and that it is the end rather than merely a means to one.
  • Every step along the way that is skipped, the shorter the journey becomes, the less we will take in as we travel.
  • This glummest of outlooks suggests that AI will stunt personal growth even if it doesn’t harm professional prospects.
  • While that doesn’t mean it’s wise to prohibit every little application of the technology in class, it probably does mean discouraging those most closely related to critical thinking.
  • One approach is to alter standards for grading, so that the things the machines are worst at are also the things that earn the best marks: originality, say, or depth of feeling, or so-called metacognition — the process of thinking about one’s own thinking or one’s own learning.
  • Hopefully, these things are also the most valuable because they are what make us human.
  • Caleb Jackson only wants AI to help him write his papers — not to write them for him. “If ChatGPT will get you an A, and you yourself might get a C, it’s like, ‘Well, I earned that C.’” He pauses. “That might sound crazy.”
  • Dominic Tovar agrees. Let AI take charge of everything, and, “They’re not so much tools at that point. They’re just replacing you.”
  • Lyndsey Cook, too, believes that even if these systems could reliably find the answers to the most vexing research problems, “it would take away from research itself” — because scientific inquiry is valuable for its own sake. “To have AI say, ‘Hey, this is the answer …’” she trails off, sounding dispirited.
  • Claire Mischker, lecturer of composition and director of the Ole Miss graduate writing center, asked her students at the end of last semester to turn in short reflections on their experience in her class. She received submissions that she was near certain were produced by ChatGPT — “that,” she says as sarcastically as she does mournfully, “felt really good.”
  • The central theme of the course was empathy.
Javier E

The Coming Software Apocalypse - The Atlantic - 0 views

  • Our standard framework for thinking about engineering failures—reflected, for instance, in regulations for medical devices—was developed shortly after World War II, before the advent of software, for electromechanical systems. The idea was that you make something reliable by making its parts reliable (say, you build your engine to withstand 40,000 takeoff-and-landing cycles) and by planning for the breakdown of those parts (you have two engines). But software doesn’t break. Intrado’s faulty threshold is not like the faulty rivet that leads to the crash of an airliner. The software did exactly what it was told to do. In fact it did it perfectly. The reason it failed is that it was told to do the wrong thing.
  • Software failures are failures of understanding, and of imagination. Intrado actually had a backup router, which, had it been switched to automatically, would have restored 911 service almost immediately. But, as described in a report to the FCC, “the situation occurred at a point in the application logic that was not designed to perform any automated corrective actions.”
  • This is the trouble with making things out of code, as opposed to something physical. “The complexity,” as Leveson puts it, “is invisible to the eye.”
  • ...52 more annotations...
  • Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.
  • Technological progress used to change the way the world looked—you could watch the roads getting paved; you could see the skylines rise. Today you can hardly tell when something is remade, because so often it is remade by code.
  • Software has enabled us to make the most intricate machines that have ever existed. And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code.
  • The programmer, the renowned Dutch computer scientist Edsger Dijkstra wrote in 1988, “has to be able to think in terms of conceptual hierarchies that are much deeper than a single mind ever needed to face before.” Dijkstra meant this as a warning.
  • “The serious problems that have happened with software have to do with requirements, not coding errors.” When you’re writing code that controls a car’s throttle, for instance, what’s important is the rules about when and how and by how much to open it. But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”
  • What made programming so difficult was that it required you to think like a computer.
  • The introduction of programming languages like Fortran and C, which resemble English, and tools, known as “integrated development environments,” or IDEs, that help correct simple mistakes (like Microsoft Word’s grammar checker but for code), obscured, though did little to actually change, this basic alienation—the fact that the programmer didn’t work on a problem directly, but rather spent their days writing out instructions for a machine.
  • “The problem is that software engineers don’t understand the problem they’re trying to solve, and don’t care to,” says Leveson, the MIT software-safety expert. The reason is that they’re too wrapped up in getting their code to work.
  • As programmers eagerly poured software into critical systems, they became, more and more, the linchpins of the built world—and Dijkstra thought they had perhaps overestimated themselves.
  • a nearly decade-long investigation into claims of so-called unintended acceleration in Toyota cars. Toyota blamed the incidents on poorly designed floor mats, “sticky” pedals, and driver error, but outsiders suspected that faulty software might be responsible
  • software experts spent 18 months with the Toyota code, picking up where NASA left off. Barr described what they found as “spaghetti code,” programmer lingo for software that has become a tangled mess. Code turns to spaghetti when it accretes over many years, with feature after feature piling on top of, and being woven around, what is already there.
  • Using the same model as the Camry involved in the accident, Barr’s team demonstrated that there were actually more than 10 million ways for the onboard computer to cause unintended acceleration. They showed that as little as a single bit flip—a one in the computer’s memory becoming a zero or vice versa—could make a car run out of control. The fail-safe code that Toyota had put in place wasn’t enough to stop it
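To make the bit-flip failure mode concrete, here is a toy Python sketch, not Toyota's actual code; the 16-bit throttle encoding and the values are invented for illustration.

```python
# One flipped bit in memory can turn a small stored value into a huge one.

def flip_bit(value: int, bit: int) -> int:
    """Return `value` with a single bit inverted."""
    return value ^ (1 << bit)

throttle = 12                        # intended command on a 0-65535 scale
corrupted = flip_bit(throttle, 15)   # one bit flips in memory

print(f"intended: {throttle}, after a single bit flip: {corrupted}")
# intended: 12, after a single bit flip: 32780 -- without an independent
# fail-safe, downstream code acts on the corrupted command.
```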
  • In all, Toyota recalled more than 9 million cars, and paid nearly $3 billion in settlements and fines related to unintended acceleration.
  • The problem is that programmers are having a hard time keeping up with their own creations. Since the 1980s, the way programmers work and the tools they use have changed remarkably little.
  • “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. All this work had been put into this thing, but it missed the fundamental problems that people faced. And the biggest one that I took away from it was that basically people are playing computer inside their head.” Programmers were like chess players trying to play with a blindfold on—so much of their mental energy is spent just trying to picture where the pieces are that there’s hardly any left over to think about the game itself.
  • The fact that the two of them were thinking about the same problem in the same terms, at the same time, was not a coincidence. They had both just seen the same remarkable talk, given to a group of software-engineering students in a Montreal hotel by a computer researcher named Bret Victor. The talk, which went viral when it was posted online in February 2012, seemed to be making two bold claims. The first was that the way we make software is fundamentally broken. The second was that Victor knew how to fix it.
  • Though he runs a lab that studies the future of computing, he seems less interested in technology per se than in the minds of the people who use it. Like any good toolmaker, he has a way of looking at the world that is equal parts technical and humane. He graduated top of his class at the California Institute of Technology for electrical engineering,
  • WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”
  • “Our current conception of what a computer program is,” he said, is “derived straight from Fortran and ALGOL in the late ’50s. Those languages were designed for punch cards.”
  • in early 2012, Victor had finally landed upon the principle that seemed to thread through all of his work. (He actually called the talk “Inventing on Principle.”) The principle was this: “Creators need an immediate connection to what they’re creating.” The problem with programming was that it violated the principle. That’s why software systems were so hard to think about, and so rife with bugs: The programmer, staring at a page of text, was abstracted from whatever it was they were actually making.
  • Victor’s point was that programming itself should be like that. For him, the idea that people were doing important work, like designing adaptive cruise-control systems or trying to understand cancer, by staring at a text editor, was appalling.
  • With the right interface, it was almost as if you weren’t working with code at all; you were manipulating the game’s behavior directly.
  • When the audience first saw this in action, they literally gasped. They knew they weren’t looking at a kid’s game, but rather the future of their industry. Most software involved behavior that unfolded, in complex ways, over time, and Victor had shown that if you were imaginative enough, you could develop ways to see that behavior and change it, as if playing with it in your hands. One programmer who saw the talk wrote later: “Suddenly all of my tools feel obsolete.”
  • When John Resig saw the “Inventing on Principle” talk, he scrapped his plans for the Khan Academy programming curriculum. He wanted the site’s programming exercises to work just like Victor’s demos. On the left-hand side you’d have the code, and on the right, the running program: a picture or game or simulation. If you changed the code, it’d instantly change the picture. “In an environment that is truly responsive,” Resig wrote about the approach, “you can completely change the model of how a student learns ... [They] can now immediately see the result and intuit how underlying systems inherently work without ever following an explicit explanation.” Khan Academy has become perhaps the largest computer-programming class in the world, with a million students, on average, actively using the program each month.
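As a rough console analogue of that immediate feedback, the sketch below watches a hypothetical sketch.py and re-runs it on every save; real live environments like Victor's demos or Khan Academy's editor re-render graphics in place rather than re-executing a whole script.

```python
# A bare-bones "liveness" loop: watch a file and re-run it the moment it
# changes, so the effect of an edit is visible immediately.
import pathlib
import time

script = pathlib.Path("sketch.py")   # hypothetical file being live-edited
last_mtime = 0.0

while True:
    mtime = script.stat().st_mtime
    if mtime != last_mtime:                # the file was just saved
        last_mtime = mtime
        print("--- change detected, re-running ---")
        try:
            exec(script.read_text(), {})   # naive: no sandboxing
        except Exception as err:           # keep watching even if the edit broke it
            print("error in sketch:", err)
    time.sleep(0.1)
```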
  • In traditional programming, your task is to take complex rules and translate them into code; most of your energy is spent doing the translating, rather than thinking about the rules themselves. In the model-based approach, all you have is the rules. So that’s what you spend your time thinking about. It’s a way of focusing less on the machine and more on the problem you’re trying to get it to solve.
  • “Everyone thought I was interested in programming environments,” he said. Really he was interested in how people see and understand systems—as he puts it, in the “visual representation of dynamic behavior.” Although code had increasingly become the tool of choice for creating dynamic behavior, it remained one of the worst tools for understanding it. The point of “Inventing on Principle” was to show that you could mitigate that problem by making the connection between a system’s behavior and its code immediate.
  • In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface.
  • Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.
  • Victor implored professional software developers to stop pouring their talent into tools for building apps like Snapchat and Uber. “The inconveniences of daily life are not the significant problems,” he wrote. Instead, they should focus on scientists and engineers—as he put it to me, “these people that are doing work that actually matters, and critically matters, and using really, really bad tools.”
  • “people are not so easily transitioning to model-based software development: They perceive it as another opportunity to lose control, even more than they have already.”
  • In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop.
  • Bantegnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules
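A minimal sketch of that idea, using the elevator rules described above: the "model" is a plain transition table, and the running code only consults it. The states and transitions come from the article; this Python encoding of them is an assumption.

```python
# The rules live in a data structure (the "model"); code is just a lookup.
MODEL = {
    "door_open":   {"close_door": "door_closed"},
    "door_closed": {"open_door": "door_open", "start": "moving"},
    "moving":      {"stop": "door_closed"},
}

def step(state: str, event: str) -> str:
    """Apply one event; anything the model does not allow is rejected."""
    try:
        return MODEL[state][event]
    except KeyError:
        raise ValueError(f"cannot '{event}' while '{state}'") from None

# The rules are visible at a glance, just as in the diagram: the only way to
# get the elevator moving is to close the door, and the only way to open the
# door is to stop first.
state = step("door_open", "close_door")  # -> "door_closed"
state = step(state, "start")             # -> "moving"
# step(state, "open_door") raises ValueError: you must "stop" before opening.
```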
  • “Typically the main problem with software coding—and I’m a coder myself,” Bantegnie says, “is not the skills of the coders. The people know how to code. The problem is what to code. Because most of the requirements are kind of natural language, ambiguous, and a requirement is never extremely precise, it’s often understood differently by the guy who’s supposed to code.”
  • On this view, software becomes unruly because the media for describing what software should do—conversations, prose descriptions, drawings on a sheet of paper—are too different from the media describing what software does do, namely, code itself.
  • for this approach to succeed, much of the work has to be done well before the project even begins. Someone first has to build a tool for developing models that are natural for people—that feel just like the notes and drawings they’d make on their own—while still being unambiguous enough for a computer to understand. They have to make a program that turns these models into real code. And finally they have to prove that the generated code will always do what it’s supposed to.
  • This practice brings order and accountability to large codebases. But, Shivappa says, “it’s a very labor-intensive process.” He estimates that before they used model-based design, on a two-year-long project only two to three months was spent writing code—the rest was spent working on the documentation.
  • Much of the benefit of the model-based approach comes from being able to add requirements on the fly while still ensuring that existing ones are met; with every change, the computer can verify that your program still works. You’re free to tweak your blueprint without fear of introducing new bugs. Your code is, in FAA parlance, “correct by construction.”
  • The ideas spread. The notion of liveness, of being able to see data flowing through your program instantly, made its way into flagship programming tools offered by Google and Apple. The default language for making new iPhone and Mac apps, called Swift, was developed by Apple from the ground up to support an environment, called Playgrounds, that was directly inspired by Light Table.
  • The bias against model-based design, sometimes known as model-driven engineering, or MDE, is in fact so ingrained that according to a recent paper, “Some even argue that there is a stronger need to investigate people’s perception of MDE than to research new MDE technologies.”
  • “Human intuition is poor at estimating the true probability of supposedly ‘extremely rare’ combinations of events in systems operating at a scale of millions of requests per second,” he wrote in a paper. “That human fallibility means that some of the more subtle, dangerous bugs turn out to be errors in design; the code faithfully implements the intended design, but the design fails to correctly handle a particular ‘rare’ scenario.”
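A worked example of Newcombe's point, with hypothetical rates chosen only to make the arithmetic concrete:

```python
# A "one in a billion" per-request event at a large service's scale.
p = 1e-9                   # per-request probability of the rare combination
rate = 1_000_000           # requests per second, "millions" per the quote
n = rate * 86_400          # requests in one day

prob_at_least_once = 1 - (1 - p) ** n
print(f"P(hit at least once in a day) = {prob_at_least_once:.6f}")
# ~1.000000: what intuition files under "extremely rare" is, at this scale,
# a daily certainty -- which is why such bugs must be ruled out by design.
```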
  • Newcombe was convinced that the algorithms behind truly critical systems—systems storing a significant portion of the web’s data, for instance—ought to be not just good, but perfect. A single subtle bug could be catastrophic. But he knew how hard bugs were to find, especially as an algorithm grew more complex. You could do all the testing you wanted and you’d never find them all.
  • An algorithm written in TLA+ could in principle be proven correct. In practice, it allowed you to create a realistic model of your problem and test it not just thoroughly, but exhaustively. This was exactly what he’d been looking for: a language for writing perfect algorithms.
  • TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer. That is, before you write any code, you write a concise outline of your program’s logic, along with the constraints you need it to satisfy
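TLA+ itself uses its own mathematical notation; as a toy analogue of what exhaustive checking buys you, the Python sketch below enumerates every reachable state of a small invented model (two traffic lights at one intersection) and tests an invariant in each one. Nothing here is TLA+ syntax or an example from the article.

```python
# Exhaustively explore every reachable state and check an invariant in each,
# the way a model checker does -- no spot-testing, no sampling.
from collections import deque

COLORS = ("red", "green", "yellow")

def next_states(state):
    ns, ew = state
    yield (COLORS[(COLORS.index(ns) + 1) % 3], ew)  # north-south light advances
    yield (ns, COLORS[(COLORS.index(ew) + 1) % 3])  # east-west light advances

def invariant(state):
    return state != ("green", "green")  # the two directions never conflict

seen, frontier = set(), deque([("red", "red")])
while frontier:
    state = frontier.popleft()
    if state in seen:
        continue
    seen.add(state)
    if not invariant(state):
        print("design flaw found in state:", state)
    frontier.extend(next_states(state))

print(f"explored all {len(seen)} reachable states")
# This naive design does reach ("green", "green"): the exhaustive search
# finds the flaw with certainty, where ordinary testing might never hit it.
```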
  • Programmers are drawn to the nitty-gritty of coding because code is what makes programs go; spending time on anything else can seem like a distraction. And there is a patient joy, a meditative kind of satisfaction, to be had from puzzling out the micro-mechanics of code. But code, Lamport argues, was never meant to be a medium for thought. “It really does constrain your ability to think when you’re thinking in terms of a programming language,”
  • Code makes you miss the forest for the trees: It draws your attention to the working of individual pieces, rather than to the bigger picture of how your program fits together, or what it’s supposed to do—and whether it actually does what you think. This is why Lamport created TLA+. As with model-based design, TLA+ draws your focus to the high-level structure of a system, its essential logic, rather than to the code that implements it.
  • But TLA+ occupies just a small, far corner of the mainstream, if it can be said to take up any space there at all. Even to a seasoned engineer like Newcombe, the language read at first as bizarre and esoteric—a zoo of symbols.
  • this is a failure of education. Though programming was born in mathematics, it has since largely been divorced from it. Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”
  • “In the 15th century,” he said, “people used to build cathedrals without knowing calculus, and nowadays I don’t think you’d allow anyone to build a cathedral without knowing calculus. And I would hope that after some suitably long period of time, people won’t be allowed to write programs if they don’t understand these simple things.”
  • Programmers, as a species, are relentlessly pragmatic. Tools like TLA+ reek of the ivory tower. When programmers encounter “formal methods” (so called because they involve mathematical, “formally” precise descriptions of programs), their deep-seated instinct is to recoil.
  • Formal methods had an image problem. And the way to fix it wasn’t to implore programmers to change—it was to change yourself. Newcombe realized that to bring tools like TLA+ to the programming mainstream, you had to start speaking their language.
  • he presented TLA+ as a new kind of “pseudocode,” a stepping-stone to real code that allowed you to exhaustively test your algorithms—and that got you thinking precisely early on in the design process. “Engineers think in terms of debugging rather than ‘verification,’” he wrote, so he titled his internal talk on the subject to fellow Amazon engineers “Debugging Designs.” Rather than bemoan the fact that programmers see the world in code, Newcombe embraced it. He knew he’d lose them otherwise. “I’ve had a bunch of people say, ‘Now I get it,’” Newcombe says.
  • In the world of the self-driving car, software can’t be an afterthought. It can’t be built like today’s airline-reservation systems or 911 systems or stock-trading systems. Code will be put in charge of hundreds of millions of lives on the road and it has to work. That is no small task.
Javier E

Essay-Grading Software, as Teacher's Aide - Digital Domain - NYTimes.com - 0 views

  • AS a professor and a parent, I have long dreamed of finding a software program that helps every student learn to write well. It would serve as a kind of tireless instructor, flagging grammatical, punctuation or word-use problems, but also showing the way to greater concision and clarity.
  • The standardized tests administered by the states at the end of the school year typically have an essay-writing component, requiring the hiring of humans to grade them one by one.
  • the Hewlett Foundation sponsored a study of automated essay-scoring engines now offered by commercial vendors. The researchers found that these produced scores effectively identical to those of human graders.
  • ...11 more annotations...
  • humans are not necessarily ideal graders: they provide an average of only three minutes of attention per essay
  • We are talking here about providing a very rough kind of measurement, the assignment of a single summary score on, say, a seventh grader’s essay
  • “A few years back, almost all states evaluated writing at multiple grade levels, requiring students to actually write,” says Mark D. Shermis, dean of the college of education at the University of Akron in Ohio. “But a few, citing cost considerations, have either switched back to multiple-choice format to evaluate or have dropped writing evaluation altogether.”
  • As statistical models for automated essay scoring are refined, Professor Shermis says, the current $2 or $3 cost of grading each one with humans could be virtually eliminated, at least theoretically.
  • the cost of commercial essay-grading software is now $10 to $20 a student per year. But as the technology improves and the costs drop, he expects that it will be incorporated into the word processing software that all students use
  • As essay-scoring software becomes more sophisticated, it could be put to classroom use for any type of writing assignment throughout the school year, not just in an end-of-year assessment. Instead of the teacher filling the essay with the markings that flag problems, the software could do so. The software could also effortlessly supply full explanations and practice exercises that address the problems — and grade those, too.
  • “Providing students with instant feedback about grammar, punctuation, word choice and sentence structure will lead to more writing assignments,” Mr. Vander Ark says, “and allow teachers to focus on higher-order skills.”
  • When sophisticated essay-evaluation software is built into word processing software, Mr. Vander Ark predicts “an order-of-magnitude increase in the amount of writing across the curriculum.”
  • the essay-scoring software that he and his teammates developed uses relatively small data sets and ordinary PCs — so the additional infrastructure cost for schools could be nil.
  • the William and Flora Hewlett Foundation sponsored a competition to see how well algorithms submitted by professional data scientists and amateur statistics wizards could predict the scores assigned by human graders. The winners were announced last month — and the predictive algorithms were eerily accurate.
  • wanted to create a neutral and fair platform to assess the various claims of the vendors. It turns out the claims are not hype.”
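As a toy sketch of the general recipe such scoring engines follow, the code below reduces essays to shallow numeric features and fits them to human-assigned scores with least squares; real engines use far richer features and training sets, and nothing here is any vendor's actual model.

```python
# Feature-based essay scoring: shallow text features fit to human scores.
import numpy as np

def features(essay: str) -> list:
    words = essay.split()
    sentences = max(essay.count("."), 1)
    return [
        float(len(words)),                                        # essay length
        len(set(w.lower() for w in words)) / max(len(words), 1),  # vocabulary richness
        len(words) / sentences,                                   # mean sentence length
        1.0,                                                      # intercept term
    ]

# Hypothetical training data: essays paired with human-assigned scores.
essays = [
    "Short essay. It is bad.",
    "A longer essay, with varied vocabulary and ideas, tends to read better. It flows.",
]
human_scores = [2.0, 4.0]

X = np.array([features(e) for e in essays])
weights, *_ = np.linalg.lstsq(X, np.array(human_scores), rcond=None)

def predict(essay: str) -> float:
    """Score a new essay with the fitted weights."""
    return float(np.dot(features(essay), weights))

print(predict("A brand-new essay of middling length. It has some variety."))
```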
Javier E

When the New York Times lost its way - 0 views

  • There are many reasons for Trump’s ascent, but changes in the American news media played a critical role. Trump’s manipulation and every one of his political lies became more powerful because journalists had forfeited what had always been most valuable about their work: their credibility as arbiters of truth and brokers of ideas, which for more than a century, despite all of journalism’s flaws and failures, had been a bulwark of how Americans govern themselves.
  • I think Sulzberger shares this analysis. In interviews and his own writings, including an essay earlier this year for the Columbia Journalism Review, he has defended “independent journalism”, or, as I understand him, fair-minded, truth-seeking journalism that aspires to be open and objective.
  • It’s good to hear the publisher speak up in defence of such values, some of which have fallen out of fashion not just with journalists at the Times and other mainstream publications but at some of the most prestigious schools of journalism.
  • ...204 more annotations...
  • All the empathy and humility in the world will not mean much against the pressures of intolerance and tribalism without an invaluable quality that Sulzberger did not emphasise: courage.
  • Sulzberger seems to underestimate the struggle he is in, that all journalism and indeed America itself is in
  • In describing the essential qualities of independent journalism in his essay, he unspooled a list of admirable traits – empathy, humility, curiosity and so forth. These qualities have for generations been helpful in contending with the Times’s familiar problem, which is liberal bias
  • on their own, these qualities have no chance against the Times’s new, more dangerous problem, which is in crucial respects the opposite of the old one.
  • The Times’s problem has metastasised from liberal bias to illiberal bias, from an inclination to favour one side of the national debate to an impulse to shut debate down altogether
  • the internet knocked the industry off its foundations. Local newspapers were the proving ground between college campuses and national newsrooms. As they disintegrated, the national news media lost a source of seasoned reporters and many Americans lost a journalism whose truth they could verify with their own eyes.
  • far more than when I set out to become a journalist, doing the work right today demands a particular kind of courage:
  • the moral and intellectual courage to take the other side seriously and to report truths and ideas that your own side demonises for fear they will harm its cause.
  • One of the glories of embracing illiberalism is that, like Trump, you are always right about everything, and so you are justified in shouting disagreement down.
  • leaders of many workplaces and boardrooms across America find that it is so much easier to compromise than to confront – to give a little ground today in the belief you can ultimately bring people around
  • This is how reasonable Republican leaders lost control of their party to Trump and how liberal-minded college presidents lost control of their campuses. And it is why the leadership of the New York Times is losing control of its principles.
  • Over the decades the Times and other mainstream news organisations failed plenty of times to live up to their commitments to integrity and open-mindedness. The relentless struggle against biases and preconceptions, rather than the achievement of a superhuman objective omniscience, is what mattered
  • I thought, and still think, that no American institution could have a better chance than the Times, by virtue of its principles, its history, its people and its hold on the attention of influential Americans, to lead the resistance to the corruption of political and intellectual life, to overcome the encroaching dogmatism and intolerance.
  • As the country became more polarised, the national media followed the money by serving partisan audiences the versions of reality they preferred
  • This relationship proved self-reinforcing. As Americans became freer to choose among alternative versions of reality, their polarisation intensified.
  • as the top editors let bias creep into certain areas of coverage, such as culture, lifestyle and business, that made the core harder to defend and undermined the authority of even the best reporters.
  • There have been signs the Times is trying to recover the courage of its convictions
  • The paper was slow to display much curiosity about the hard question of the proper medical protocols for trans children; but once it did, the editors defended their coverage against the inevitable criticism.
  • As Sulzberger told me in the past, returning to the old standards will require agonising change. He saw that as the gradual work of many years, but I think he is mistaken. To overcome the cultural and commercial pressures the Times faces, particularly given the severe test posed by another Trump candidacy and possible presidency, its publisher and senior editors will have to be bolder than that.
  • As a Democrat from a family of Democrats, a graduate of Yale and a blossom of the imagined meritocracy, I had my first real chance, at Buchanan’s rallies, to see the world through the eyes of stalwart opponents of abortion, immigration and the relentlessly rising tide of modernity.
  • the Times is failing to face up to one crucial reason: that it has lost faith in Americans, too.
  • For now, to assert that the Times plays by the same rules it always has is to commit a hypocrisy that is transparent to conservatives, dangerous to liberals and bad for the country as a whole.
  • It makes the Times too easy for conservatives to dismiss and too easy for progressives to believe.
  • The reality is that the Times is becoming the publication through which America’s progressive elite talks to itself about an America that does not really exist.
  • It is hard to imagine a path back to saner American politics that does not traverse a common ground of shared fact.
  • It is equally hard to imagine how America’s diversity can continue to be a source of strength, rather than become a fatal flaw, if Americans are afraid or unwilling to listen to each other.
  • I suppose it is also pretty grandiose to think you might help fix all that. But that hope, to me, is what makes journalism worth doing.
  • Since Adolph Ochs bought the paper in 1896, one of the most inspiring things the Times has said about itself is that it does its work “without fear or favour”. That is not true of the institution today – it cannot be, not when its journalists are afraid to trust readers with a mainstream conservative argument such as Cotton’s, and its leaders are afraid to say otherwise.
  • Most important, the Times, probably more than any other American institution, could influence the way society approached debate and engagement with opposing views. If Times Opinion demonstrated the same kind of intellectual courage and curiosity that my colleagues at the Atlantic had shown, I hoped, the rest of the media would follow.
  • You did not have to go along with everything that any tribe said. You did not have to pretend that the good guys, much as you might have respected them, were right about everything, or that the bad guys, much as you might have disdained them, never had a point. You did not, in other words, ever have to lie.
  • This fundamental honesty was vital for readers, because it equipped them to make better, more informed judgments about the world. Sometimes it might shock or upset them by failing to conform to their picture of reality. But it also granted them the respect of acknowledging that they were able to work things out for themselves.
  • The Atlantic did not aspire to the same role as the Times. It did not promise to serve up the news of the day without any bias. But it was to opinion journalism what the Times’s reporting was supposed to be to news: honest and open to the world.
  • Those were the glory days of the blog, and we hit on the idea of creating a living op-ed page, a collective of bloggers with different points of view but a shared intellectual honesty who would argue out the meaning of the news of the day
  • They were brilliant, gutsy writers, and their disagreements were deep enough that I used to joke that my main work as editor was to prevent fistfights.
  • Under its owner, David Bradley, my colleagues and I distilled our purpose as publishing big arguments about big ideas
  • we also began producing some of the most important work in American journalism: Nicholas Carr on whether Google was “making us stupid”; Hanna Rosin on “the end of men”; Taylor Branch on “the shame of college sports”; Ta-Nehisi Coates on “the case for reparations”; Greg Lukianoff and Jonathan Haidt on “the coddling of the American mind”.
  • I was starting to see some effects of the new campus politics within the Atlantic. A promising new editor had created a digital form for aspiring freelancers to fill out, and she wanted to ask them to disclose their racial and sexual identity. Why? Because, she said, if we were to write about the trans community, for example, we would ask a trans person to write the story
  • There was a good argument for that, I acknowledged, and it sometimes might be the right answer. But as I thought about the old people, auto workers and abortion opponents I had learned from, I told her there was also an argument for correspondents who brought an outsider’s ignorance, along with curiosity and empathy, to the story.
  • A journalism that starts out assuming it knows the answers, it seemed to me then, and seems even more so to me now, can be far less valuable to the reader than a journalism that starts out with a humbling awareness that it knows nothing.
  • In the age of the internet it is hard even for a child to sustain an “innocent eye”, but the alternative for journalists remains as dangerous as ever, to become propagandists. America has more than enough of those already.
  • When I looked around the Opinion department, change was not what I perceived. Excellent writers and editors were doing excellent work. But the department’s journalism was consumed with politics and foreign affairs in an era when readers were also fascinated by changes in technology, business, science and culture.
  • Fairly quickly, though, I realised two things: first, that if I did my job as I thought it should be done, and as the Sulzbergers said they wanted me to do it, I would be too polarising internally ever to lead the newsroom; second, that I did not want that job, though no one but my wife believed me when I said that.
  • there was a compensating moral and psychological privilege that came with aspiring to journalistic neutrality and open-mindedness, despised as they might understandably be by partisans. Unlike the duelling politicians and advocates of all kinds, unlike the corporate chieftains and their critics, unlike even the sainted non-profit workers, you did not have to pretend things were simpler than they actually were
  • On the right and left, America’s elites now talk within their tribes, and get angry or contemptuous on those occasions when they happen to overhear the other conclave. If they could be coaxed to agree what they were arguing about, and the rules by which they would argue about it, opinion journalism could serve a foundational need of the democracy by fostering diverse and inclusive debate. Who could be against that?
  • The large staff of op-ed editors contained only a couple of women. Although the 11 columnists were individually admirable, only two of them were women and only one was a person of colour
  • Not only did they all focus on politics and foreign affairs, but during the 2016 campaign, no columnist shared, in broad terms, the worldview of the ascendant progressives of the Democratic Party, incarnated by Bernie Sanders. And only two were conservative.
  • This last fact was of particular concern to the elder Sulzberger. He told me the Times needed more conservative voices, and that its own editorial line had become predictably left-wing. “Too many liberals,” read my notes about the Opinion line-up from a meeting I had with him and Mark Thompson, then the chief executive, as I was preparing to rejoin the paper. “Even conservatives are liberals’ idea of a conservative.” The last note I took from that meeting was: “Can’t ignore 150m conservative Americans.”
  • As I knew from my time at the Atlantic, this kind of structural transformation can be frightening and even infuriating for those understandably proud of things as they are. It is hard on everyone
  • experience at the Atlantic also taught me that pursuing new ways of doing journalism in pursuit of venerable institutional principles created enthusiasm for change. I expected that same dynamic to allay concerns at the Times.
  • If Opinion published a wider range of views, it would help frame a set of shared arguments that corresponded to, and drew upon, the set of shared facts coming from the newsroom.
  • New progressive voices were celebrated within the Times. But in contrast to the Wall Street Journal and the Washington Post, conservative voices – even eloquent anti-Trump conservative voices – were despised, regardless of how many leftists might surround them.
  • The Opinion department mocked the paper’s claim to value diversity. It did not have a single black editor
  • Eventually, it sank in that my snotty joke was actually on me: I was the one ignorantly fighting a battle that was already lost. The old liberal embrace of inclusive debate that reflected the country’s breadth of views had given way to a new intolerance for the opinions of roughly half of American voters.
  • Out of naivety or arrogance, I was slow to recognise that at the Times, unlike at the Atlantic, these values were no longer universally accepted, let alone esteemed
  • After the 9/11 attacks, as the bureau chief in Jerusalem, I spent a lot of time in the Gaza Strip interviewing Hamas leaders, recruiters and foot soldiers, trying to understand and describe their murderous ideology. Some readers complained that I was providing a platform for terrorists, but there was never any objection from within the Times.
  • Our role, we knew, was to help readers understand such threats, and this required empathetic – not sympathetic – reporting. This is not an easy distinction but good reporters make it: they learn to understand and communicate the sources and nature of a toxic ideology without justifying it, much less advocating it.
  • Today’s newsroom turns that moral logic on its head, at least when it comes to fellow Americans. Unlike the views of Hamas, the views of many Americans have come to seem dangerous to engage in the absence of explicit condemnation
  • Focusing on potential perpetrators – “platforming” them by explaining rather than judging their views – is believed to empower them to do more harm.
  • After the profile of the Ohio man was published, media Twitter lit up with attacks on the article as “normalising” Nazism and white nationalism, and the Times convulsed internally. The Times wound up publishing a cringing editor’s note that hung the writer out to dry and approvingly quoted some of the criticism, including a tweet from a Washington Post opinion editor asking, “Instead of long, glowing profiles of Nazis/White nationalists, why don’t we profile the victims of their ideologies?”
  • the Times lacked the confidence to defend its own work
  • The editor’s note paraded the principle of publishing such pieces, saying it was important to “shed more light, not less, on the most extreme corners of American life”. But less light is what the readers got. As a reporter in the newsroom, you’d have to have been an idiot after that explosion to attempt such a profile
  • Empathetic reporting about Trump supporters became even more rare. It became a cliché among influential left-wing columnists and editors that blinkered political reporters interviewed a few Trump supporters in diners and came away suckered into thinking there was something besides racism that could explain anyone’s support for the man.
  • After a year spent publishing editorials attacking Trump and his policies, I thought it would be a demonstration of Timesian open-mindedness to give his supporters their say. Also, I thought the letters were interesting, so I turned over the entire editorial page to the Trump letters.
  • I wasn’t surprised that we got some criticism on Twitter. But I was astonished by the fury of my Times colleagues. I found myself facing an angry internal town hall, trying to justify what to me was an obvious journalistic decision
  • Didn’t he think other Times readers should understand the sources of Trump’s support? Didn’t he also see it was a wonderful thing that some Trump supporters did not just dismiss the Times as fake news, but still believed in it enough to respond thoughtfully to an invitation to share their views?
  • And if the Times could not bear to publish the views of Americans who supported Trump, why should it be surprised that those voters would not trust it?
  • Two years later, in 2020, Baquet acknowledged that in 2016 the Times had failed to take seriously the idea that Trump could become president partly because it failed to send its reporters out into America to listen to voters and understand “the turmoil in the country”. And, he continued, the Times still did not understand the views of many Americans
  • Speaking four months before we published the Cotton op-ed, he said that to argue that the views of such voters should not appear in the Times was “not journalistic”.
  • Conservative arguments in the Opinion pages reliably started uproars within the Times. Sometimes I would hear directly from colleagues who had the grace to confront me with their concerns; more often they would take to the company’s Slack channels or Twitter to advertise their distress in front of each other
  • This environment of enforced group-think, inside and outside the paper, was hard even on liberal opinion writers. One left-of-centre columnist told me that he was reluctant to appear in the New York office for fear of being accosted by colleagues.
  • An internal survey shortly after I left the paper found that barely half the staff, within an enterprise ostensibly devoted to telling the truth, agreed “there is a free exchange of views in this company” and “people are not afraid to say what they really think”.
  • Even columnists with impeccable leftist bona fides recoiled from tackling subjects when their point of view might depart from progressive orthodoxy.
  • The bias had become so pervasive, even in the senior editing ranks of the newsroom, as to be unconscious
  • Trying to be helpful, one of the top newsroom editors urged me to start attaching trigger warnings to pieces by conservatives. It had not occurred to him how this would stigmatise certain colleagues, or what it would say to the world about the Times’s own bias
  • By their nature, information bubbles are powerfully self-reinforcing, and I think many Times staff have little idea how closed their world has become, or how far they are from fulfilling their compact with readers to show the world “without fear or favour”
  • sometimes the bias was explicit: one newsroom editor told me that, because I was publishing more conservatives, he felt he needed to push his own department further to the left.
  • The Times’s failure to honour its own stated principles of openness to a range of views was particularly hard on the handful of conservative writers, some of whom would complain about being flyspecked and abused by colleagues. One day when I relayed a conservative’s concern about double standards to Sulzberger, he lost his patience. He told me to inform the complaining conservative that that’s just how it was: there was a double standard and he should get used to it.
  • A publication that promises its readers to stand apart from politics should not have different standards for different writers based on their politics. But I delivered the message. There are many things I regret about my tenure as editorial-page editor. That is the only act of which I am ashamed.
  • I began to think of myself not as a benighted veteran on a remote island, but as Rip Van Winkle. I had left one newspaper, had a pleasant dream for ten years, and returned to a place I barely recognised.
  • The new New York Times was the product of two shocks – sudden collapse, and then sudden success. The paper almost went bankrupt during the financial crisis, and the ensuing panic provoked a crisis of confidence among its leaders. Digital competitors like the HuffPost were gaining readers and winning plaudits within the media industry as innovative. They were the cool kids; Times folk were ink-stained wrinklies.
  • In its panic, the Times bought out experienced reporters and editors and began hiring journalists from publications like the HuffPost who were considered “digital natives” because they had never worked in print. This hiring quickly became easier, since most digital publications financed by venture capital turned out to be bad businesses
  • Though they might have lacked deep or varied reporting backgrounds, some of the Times’s new hires brought skills in video and audio; others were practised at marketing themselves – building their brands, as journalists now put it – in social media. Some were brilliant and fiercely honest, in keeping with the old aspirations of the paper.
  • critically, the Times abandoned its practice of acculturation, including those months-long assignments on Metro covering cops and crime or housing. Many new hires who never spent time in the streets went straight into senior writing and editing roles.
  • All these recruits arrived with their own notions of the purpose of the Times. To me, publishing conservatives helped fulfil the paper’s mission; to them, I think, it betrayed that mission.
  • then, to the shock and horror of the newsroom, Trump won the presidency. In his article for Columbia Journalism Review, Sulzberger cites the Times’s failure to take Trump’s chances seriously as an example of how “prematurely shutting down inquiry and debate” can allow “conventional wisdom to ossify in a way that blinds society.”
  • Many Times staff members – scared, angry – assumed the Times was supposed to help lead the resistance. Anxious for growth, the Times’s marketing team implicitly endorsed that idea, too.
  • As the number of subscribers ballooned, the marketing department tracked their expectations, and came to a nuanced conclusion. More than 95% of Times subscribers described themselves as Democrats or independents, and a vast majority of them believed the Times was also liberal
  • A similar majority applauded that bias; it had become “a selling point”, reported one internal marketing memo. Yet at the same time, the marketers concluded, subscribers wanted to believe that the Times was independent.
  • As that memo argued, even if the Times was seen as politically to the left, it was critical to its brand also to be seen as broadening its readers’ horizons, and that required “a perception of independence”.
  • Readers could cancel their subscriptions if the Times challenged their worldview by reporting the truth without regard to politics. As a result, the Times’s long-term civic value was coming into conflict with the paper’s short-term shareholder value
  • The Times has every right to pursue the commercial strategy that makes it the most money. But leaning into a partisan audience creates a powerful dynamic. Nobody warned the new subscribers to the Times that it might disappoint them by reporting truths that conflicted with their expectations
  • When your product is “independent journalism”, that commercial strategy is tricky, because too much independence might alienate your audience, while too little can lead to charges of hypocrisy that strike at the heart of the brand.
  • It became one of Dean Baquet’s frequent mordant jokes that he missed the old advertising-based business model, because, compared with subscribers, advertisers felt so much less sense of ownership over the journalism
  • The Times was slow to break it to its readers that there was less to Trump’s ties to Russia than they were hoping, and more to Hunter Biden’s laptop, that Trump might be right that covid came from a Chinese lab, that masks were not always effective against the virus, that shutting down schools for many months was a bad idea.
  • there has been a sea change over the past ten years in how journalists think about pursuing justice. The reporters’ creed used to have its foundation in liberalism, in the classic philosophical sense. The exercise of a reporter’s curiosity and empathy, given scope by the constitutional protections of free speech, would equip readers with the best information to form their own judgments. The best ideas and arguments would win out
  • The journalist’s role was to be a sworn witness; the readers’ role was to be judge and jury. In its idealised form, journalism was lonely, prickly, unpopular work, because it was only through unrelenting scepticism and questioning that society could advance. If everyone the reporter knew thought X, the reporter’s role was to ask: why X?
  • Illiberal journalists have a different philosophy, and they have their reasons for it. They are more concerned with group rights than individual rights, which they regard as a bulwark for the privileges of white men. They have seen the principle of free speech used to protect right-wing outfits like Project Veritas and Breitbart News and are uneasy with it.
  • They had their suspicions of their fellow citizens’ judgment confirmed by Trump’s election, and do not believe readers can be trusted with potentially dangerous ideas or facts. They are not out to achieve social justice as the knock-on effect of pursuing truth; they want to pursue it head-on
  • The term “objectivity” to them is code for ignoring the poor and weak and cosying up to power, as journalists often have done.
  • And they do not just want to be part of the cool crowd. They need to be
  • To be more valued by their peers and their contacts – and hold sway over their bosses – they need a lot of followers in social media. That means they must be seen to applaud the right sentiments of the right people in social media
  • The journalist from central casting used to be a loner, contrarian or a misfit. Now journalism is becoming another job for joiners, or, to borrow Twitter’s own parlance, “followers”, a term that mocks the essence of a journalist’s role.
  • The new newsroom ideology seems idealistic, yet it has grown from cynical roots in academia: from the idea that there is no such thing as objective truth; that there is only narrative, and that therefore whoever controls the narrative – whoever gets to tell the version of the story that the public hears – has the whip hand
  • What matters, in other words, is not truth and ideas in themselves, but the power to determine both in the public mind.
  • By contrast, the old newsroom ideology seems cynical on its surface. It used to bug me that my editors at the Times assumed every word out of the mouth of any person in power was a lie.
  • And the pursuit of objectivity can seem reptilian, even nihilistic, in its abjuration of a fixed position in moral contests. But the basis of that old newsroom approach was idealistic: the notion that power ultimately lies in truth and ideas, and that the citizens of a pluralistic democracy, not leaders of any sort, must be trusted to judge both.
  • Our role in Times Opinion, I used to urge my colleagues, was not to tell people what to think, but to help them fulfil their desire to think for themselves.
  • It seems to me that putting the pursuit of truth, rather than of justice, at the top of a publication’s hierarchy of values better serves not just truth but justice, too
  • over the long term journalism that is not also sceptical of the advocates of any form of justice and the programmes they put forward, and that does not struggle honestly to understand and explain the sources of resistance, will not assure that those programmes will work, and it also has no legitimate claim to the trust of reasonable people who see the world very differently. Rather than advance understanding and durable change, it provokes backlash.
  • The impatience within the newsroom with such old ways was intensified by the generational failure of the Times to hire and promote women and non-white people
  • Pay attention if you are white at the Times and you will hear black editors speak of hiring consultants at their own expense to figure out how to get white staff to respect them
  • As wave after wave of pain and outrage swept through the Times, over a headline that was not damning enough of Trump or someone’s obnoxious tweets, I came to think of the people who were fragile, the ones who were caught up in Slack or Twitter storms, as people who had only recently discovered that they were white and were still getting over the shock.
  • Having concluded they had got ahead by working hard, it has been a revelation to them that their skin colour was not just part of the wallpaper of American life, but a source of power, protection and advancement.
  • I share the bewilderment that so many people could back Trump, given the things he says and does, and that makes me want to understand why they do: the breadth and diversity of his support suggests not just racism is at work. Yet these elite, well-meaning Times staff cannot seem to stretch the empathy they are learning to extend to people with a different skin colour to include those, of whatever race, who have different politics.
  • The digital natives were nevertheless valuable, not only for their skills but also because they were excited for the Times to embrace its future. That made them important allies of the editorial and business leaders as they sought to shift the Times to digital journalism and to replace staff steeped in the ways of print. Partly for that reason, and partly out of fear, the leadership indulged internal attacks on Times journalism, despite pleas from me and others, to them and the company as a whole, that Times folk should treat each other with more respect
  • My colleagues and I in Opinion came in for a lot of the scorn, but we were not alone. Correspondents in the Washington bureau and political reporters would take a beating, too, when they were seen as committing sins like “false balance” because of the nuance in their stories.
  • My fellow editorial and commercial leaders were well aware of how the culture of the institution had changed. As delighted as they were by the Times’s digital transformation, they were not blind to the ideological change that came with it. They were unhappy with the bullying and group-think; we often discussed such cultural problems in the weekly meetings of the executive committee, composed of the top editorial and business leaders, including the publisher. Inevitably, these bitch sessions would end with someone saying a version of: “Well, at some point we have to tell them this is what we believe in as a newspaper, and if they don’t like it they should work somewhere else.” It took me a couple of years to realise that this moment was never going to come.
  • There is a lot not to miss about the days when editors like Boyd could strike terror in young reporters like me and Purdum. But the pendulum has swung so far in the other direction that editors now tremble before their reporters and even their interns. “I miss the old climate of fear,” Baquet used to say with a smile, in another of his barbed jokes.
  • I wish I’d pursued my point and talked myself out of the job. This contest over control of opinion journalism within the Times was not just a bureaucratic turf battle (though it was that, too)
  • The newsroom’s embrace of opinion journalism has compromised the Times’s independence, misled its readers and fostered a culture of intolerance and conformity.
  • The Opinion department is a relic of the era when the Times enforced a line between news and opinion journalism.
  • Editors in the newsroom did not touch opinionated copy, lest they be contaminated by it, and opinion journalists and editors kept largely to their own, distant floor within the Times building. Such fastidiousness could seem excessive, but it enforced an ethos that Times reporters owed their readers an unceasing struggle against bias in the news
  • But by the time I returned as editorial-page editor, more opinion columnists and critics were writing for the newsroom than for Opinion. As at the cable news networks, the boundaries between commentary and news were disappearing, and readers had little reason to trust that Times journalists were resisting rather than indulging their biases
  • The Times newsroom had added more cultural critics, and, as Baquet noted, they were free to opine about politics.
  • Departments across the Times newsroom had also begun appointing their own “columnists”, without stipulating any rules that might distinguish them from columnists in Opinion
  • I checked to see if, since I left the Times, it had developed guidelines explaining the difference, if any, between a news columnist and opinion columnist. The paper’s spokeswoman, Danielle Rhoades Ha, did not respond to the question.
  • The internet rewards opinionated work and, as news editors felt increasing pressure to generate page views, they began not just hiring more opinion writers but also running their own versions of opinionated essays by outside voices – historically, the province of Opinion’s op-ed department.
  • Yet because the paper continued to honour the letter of its old principles, none of this work could be labelled “opinion” (it still isn’t). After all, it did not come from the Opinion department.
  • And so a newsroom technology columnist might call for, say, unionisation of the Silicon Valley workforce, as one did, or an outside writer might argue in the business section for reparations for slavery, as one did, and to the average reader their work would appear indistinguishable from Times news articles.
  • By similarly circular logic, the newsroom’s opinion journalism breaks another of the Times’s commitments to its readers. Because the newsroom officially does not do opinion – even though it openly hires and publishes opinion journalists – it feels free to ignore Opinion’s mandate to provide a diversity of views
  • When I was editorial-page editor, there were a couple of newsroom columnists whose politics were not obvious. But the other newsroom columnists, and the critics, read as passionate progressives.
  • I urged Baquet several times to add a conservative to the newsroom roster of cultural critics. That would serve the readers by diversifying the Times’s analysis of culture, where the paper’s left-wing bias had become most blatant, and it would show that the newsroom also believed in restoring the Times’s commitment to taking conservatives seriously. He said this was a good idea, but he never acted on it
  • I couldn’t help trying the idea out on one of the paper’s top cultural editors, too: he told me he did not think Times readers would be interested in that point of view.
  • opinion was spreading through the newsroom in other ways. News desks were urging reporters to write in the first person and to use more “voice”, but few newsroom editors had experience in handling that kind of journalism, and no one seemed certain where “voice” stopped and “opinion” began
  • The Times magazine, meanwhile, became a crusading progressive publication
  • Baquet liked to say the magazine was Switzerland, by which he meant that it sat between the newsroom and Opinion. But it reported only to the news side. Its work was not labelled as opinion and it was free to omit conservative viewpoints.
  • This creep of politics into the newsroom’s journalism helped the Times beat back some of its new challengers, at least those on the left
  • Competitors like Vox and the HuffPost were blending leftish politics with reporting and writing it up conversationally in the first person. Imitating their approach, along with hiring some of their staff, helped the Times repel them. But it came at a cost. The rise of opinion journalism over the past 15 years changed the newsroom’s coverage and its culture
  • The tiny redoubt of never-Trump conservatives in Opinion is swamped daily not only by the many progressives in that department but their reinforcements among the critics, columnists and magazine writers in the newsroom
  • They are generally excellent, but their homogeneity means Times readers are being served a very restricted range of views, some of them presented as straight news by a publication that still holds itself out as independent of any politics.
  • And because the critics, newsroom columnists and magazine writers are the newsroom’s most celebrated journalists, they have disproportionate influence over the paper’s culture.
  • By saying that it still holds itself to the old standard of strictly separating its news and opinion journalists, the paper leads its readers further into the trap of thinking that what they are reading is independent and impartial – and this misleads them about their country’s centre of political and cultural gravity.
  • And yet the Times insists to the public that nothing has changed.
  • “Even though each day’s opinion pieces are typically among our most popular journalism and our columnists are among our most trusted voices, we believe opinion is secondary to our primary mission of reporting and should represent only a portion of a healthy news diet,” Sulzberger wrote in the Columbia Journalism Review. “For that reason, we’ve long kept the Opinion department intentionally small – it represents well under a tenth of our journalistic staff – and ensured that its editorial decision-making is walled off from the newsroom.”
  • When I was editorial-page editor, Sulzberger, who declined to be interviewed on the record for this article, worried a great deal about the breakdown in the boundaries between news and opinion
  • He told me once that he would like to restructure the paper to have one editor oversee all its news reporters, another all its opinion journalists and a third all its service journalists, the ones who supply guidance on buying gizmos or travelling abroad. Each of these editors would report to him
  • That is the kind of action the Times needs to take now to confront its hypocrisy and begin restoring its independence.
  • The Times could learn something from the Wall Street Journal, which has kept its journalistic poise
  • It has maintained a stricter separation between its news and opinion journalism, including its cultural criticism, and that has protected the integrity of its work.
  • After I was chased out of the Times, Journal reporters and other staff attempted a similar assault on their opinion department. Some 280 of them signed a letter listing pieces they found offensive and demanding changes in how their opinion colleagues approached their work. “Their anxieties aren’t our responsibility,” shrugged the Journal’s editorial board in a note to readers after the letter was leaked. “The signers report to the news editors or other parts of the business.” The editorial added, in case anyone missed the point, “We are not the New York Times.” That was the end of it.
  • Unlike the publishers of the Journal, however, Sulzberger is in a bind, or at least perceives himself to be
  • The confusion within the Times over its role, and the rising tide of intolerance among the reporters, the engineers, the business staff, even the subscribers – these are all problems he inherited, in more ways than one. He seems to feel constrained in confronting the paper’s illiberalism by the very source of his authority
  • The paradox is that in previous generations the Sulzbergers’ control was the bulwark of the paper’s independence.
  • if he is going to instil the principles he believes in, he needs to stop worrying so much about his powers of persuasion, and start using the power he is so lucky to have.
  • Shortly after we published the op-ed that Wednesday afternoon, some reporters tweeted their opposition to Cotton’s argument. But the real action was in the Times’s Slack channels, where reporters and other staff began not just venting but organising. They turned to the union to draw up a workplace complaint about the op-ed.
  • The next day, one of these reporters shared the byline on the Times story about the op-ed. That article did not mention that Cotton had distinguished between “peaceful, law-abiding protesters” and “rioters and looters”. In fact, the first sentence reported that Cotton had called for “the military to suppress protests against police violence”.
  • This was – and is – wrong. You don’t have to take my word for that. You can take the Times’s
  • Three days later in its article on my resignation it also initially reported that Cotton had called “for military force against protesters in American cities”. This time, after the article was published on the Times website, the editors scrambled to rewrite it, replacing “military force” with “military response” and “protesters” with “civic unrest”
  • That was a weaselly adjustment – Cotton wrote about criminality, not “unrest” – but the article at least no longer unambiguously misrepresented Cotton’s argument to make it seem he was in favour of crushing democratic protest. The Times did not publish a correction or any note acknowledging the story had been changed.
  • Seeking to influence the outcome of a story you cover, particularly without disclosing that to the reader, violates basic principles I was raised on at the Times
  • Ms Rhoades Ha disputes my characterisation of the after-the-fact editing of the story about my resignation. She said the editors changed the story after it was published on the website in order to “refine” it and “add context”, and so the story did not merit a correction disclosing to the reader that changes had been made.
  • In retrospect what seems almost comical is that as the conflict over Cotton’s op-ed unfolded within the Times I acted as though it was on the level, as though the staff of the Times would have a good-faith debate about Cotton’s piece and the decision to publish it
  • Instead, people wanted to vent and achieve what they considered to be justice, whether through Twitter, Slack, the union or the news pages themselves
  • My colleagues in Opinion, together with the PR team, put together a series of connected tweets describing the purpose behind publishing Cotton’s op-ed. Rather than publish these tweets from the generic Times Opinion Twitter account, Sulzberger encouraged me to do it from my personal one, on the theory that this would humanise our defence. I doubted that would make any difference, but it was certainly my job to take responsibility. So I sent out the tweets, sticking my head in a Twitter bucket that clangs, occasionally, to this day
  • What is worth recalling now from the bedlam of the next two days? I suppose there might be lessons for someone interested in how not to manage a corporate crisis. I began making my own mistakes that Thursday. The union condemned our publication of Cotton, for supposedly putting journalists in danger, claiming that he had called on the military “to ‘detain’ and ‘subdue’ Americans protesting racism and police brutality” – again, a misrepresentation of his argument. The publisher called to tell me the company was experiencing its largest sick day in history; people were turning down job offers because of the op-ed, and, he said, some people were quitting. He had been expecting for some time that the union would seek a voice in editorial decision-making; he said he thought this was the moment the union was making its move. He had clearly changed his own mind about the value of publishing the Cotton op-ed.
  • I asked Dao to have our fact-checkers review the union’s claims. But then I went a step further: at the publisher’s request, I urged him to review the editing of the piece itself and come back to me with a list of steps we could have taken to make it better. Dao’s reflex – the correct one – was to defend the piece as published. He and three other editors of varying ages, genders and races had helped edit it; it had been fact-checked, as is all our work
  • This was my last failed attempt to have the debate within the Times that I had been seeking for four years, about why it was important to present Times readers with arguments like Cotton’s. The staff at the paper never wanted to have that debate. The Cotton uproar was the most extreme version of the internal reaction we faced whenever we published conservative arguments that were not simply anti-Trump. Yes, yes, of course we believe in the principle of publishing diverse views, my Times colleagues would say, but why this conservative? Why this argument?
  • I doubt these changes would have mattered, and to extract this list from Dao was to engage in precisely the hypocrisy I claimed to despise – that, in fact, I do despise. If Cotton needed to be held to such standards of politesse, so did everyone else. Headlines such as “Tom Cotton’s Fascist Op-ed”, the headline of a subsequent piece, should also have been tranquillised.
  • As that miserable Thursday wore on, Sulzberger, Baquet and I held a series of Zoom meetings with reporters and editors from the newsroom who wanted to discuss the op-ed. Though a handful of the participants were there to posture, these were generally constructive conversations. A couple of people, including Baquet, even had the guts to speak up in favour of publishing the op-ed
  • Two moments stick out. At one point, in answer to a question, Sulzberger and Baquet both said they thought the op-ed – as the Times union and many journalists were saying – had in fact put journalists in danger. That was the first time I realised I might be coming to the end of the road.
  • The other was when a pop-culture reporter asked if I had read the op-ed before it was published. I said I had not. He immediately put his head down and started typing, and I should have paid attention rather than moving on to the next question. He was evidently sharing the news with the company over Slack.
  • Every job review I had at the Times urged me to step back from the daily coverage to focus on the long term. (Hilariously, one review, urging me to move faster in upending the Opinion department, instructed me to take risks and “ask for forgiveness not permission”.)
  • I learned when these meetings were over that there had been a new eruption in Slack. Times staff were saying that Rubenstein had been the sole editor of the op-ed. In response, Dao had gone into Slack to clarify to the entire company that he had also edited it himself. But when the Times posted the news article that evening, it reported, “The Op-Ed was edited by Adam Rubenstein” and made no mention of Dao’s statement
  • Early that morning, I got an email from Sam Dolnick, a Sulzberger cousin and a top editor at the paper, who said he felt “we” – he could have only meant me – owed the whole staff “an apology for appearing to place an abstract idea like open debate over the value of our colleagues’ lives, and their safety”. He was worried that I and my colleagues had unintentionally sent a message to other people at the Times that: “We don’t care about their full humanity and their security as much as we care about our ideas.”
  • “I know you don’t like it when I talk about principles at a moment like this,” I began. But I viewed the journalism I had been doing, at the Times and before that at the Atlantic, in very different terms from the ones Dolnick presumed. “I don’t think of our work as an abstraction without meaning for people’s lives – quite the opposite,” I continued. “The whole point – the reason I do this – is to have an impact on their lives to the good. I have always believed that putting ideas, including potentially dangerous one[s], out in the public is vital to ensuring they are debated and, if dangerous, discarded.” It was, I argued, in “edge cases like this that principles are tested”, and if my position was judged wrong then “I am out of step with the times.” But, I concluded, “I don’t think of us as some kind of debating society without implications for the real world and I’ve never been unmindful of my colleagues’ humanity.”
  • in the end, one thing he and I surely agree on is that I was, in fact, out of step with the Times. It may have raised me as a journalist – and invested so much in educating me to what were once its standards – but I did not belong there any more.
  • Finally, I came up with something that felt true. I told the meeting that I was sorry for the pain that my leadership of Opinion had caused. What a pathetic thing to say. I did not think to add, because I’d lost track of this truth myself by then, that opinion journalism that never causes pain is not journalism. It can’t hope to move society forward
  • As I look back at my notes of that awful day, I don’t regret what I said. Even during that meeting, I was still hoping the blow-up might at last give me the chance either to win support for what I had been asked to do, or to clarify once and for all that the rules for journalism had changed at the Times.
  • But no one wanted to talk about that. Nor did they want to hear about all the voices of vulnerable or underprivileged people we had been showcasing in Opinion, or the ambitious new journalism we were doing. Instead, my Times colleagues demanded to know things such as the names of every editor who had had a role in the Cotton piece. Having seen what happened to Rubenstein I refused to tell them. A Slack channel had been set up to solicit feedback in real time during the meeting, and it was filling with hate. The meeting ran long, and finally came to a close after 90 minutes.
  • I tried to insist, as did Dao, that the note make clear the Cotton piece was within our editorial bounds. Sulzberger said he felt the Times could afford to be “silent” on that question. In the end the note went far further in repudiating the piece than I anticipated, saying it should never have been published at all. The next morning I was told to resign.
  • It was a terrible moment for the country. By the traditional – and perverse – logic of journalism, that should also have made it an inspiring time to be a reporter, writer or editor. Journalists are supposed to run towards scenes that others are fleeing, towards hard truths others need to know, towards consequential ideas they would prefer to ignore.
  • But fear got all mixed up with anger inside the Times, too, along with a desire to act locally in solidarity with the national movement. That energy found a focus in the Cotton op-ed
  • the Times is not good at acknowledging mistakes. Indeed, one of my own, within the Times culture, was to take responsibility for any mistakes my department made, and even some it didn’t
  • To Sulzberger, the meltdown over Cotton’s op-ed and my departure in disgrace are explained and justified by a failure of editorial “process”. As he put it in an interview with the New Yorker this summer, after publishing his piece in the Columbia Journalism Review, Cotton’s piece was not “perfectly fact-checked” and the editors had not “thought about the headline and presentation”. He contrasted the execution of Cotton’s opinion piece with that of a months-long investigation the newsroom did of Donald Trump’s taxes (which was not “perfectly fact-checked”, as it happens – it required a correction). He did not explain why, if the Times was an independent publication, an op-ed making a mainstream conservative argument should have to meet such different standards from an op-ed making any other kind of argument, such as for the abolition of the police
  • “It’s not enough just to have the principle and wave it around,” he said. “You also have to execute on it.”
  • To me, extolling the virtue of independent journalism in the pages of the Columbia Journalism Review is how you wave a principle around. Publishing a piece like Cotton’s is how you execute on it.
  • As Sulzberger also wrote in the Review, “Independent journalism, especially in a pluralistic democracy, should err on the side of treating areas of serious political contest as open, unsettled, and in need of further inquiry.”
  • If Sulzberger must insist on comparing the execution of the Cotton op-ed with that of the most ambitious of newsroom projects, let him compare it with something really important, the 1619 Project, which commemorated the 400th anniversary of the arrival of enslaved Africans in Virginia.
  • Like Cotton’s piece, the 1619 Project was fact-checked and copy-edited (most of the Times newsroom does not fact-check or copy-edit articles, but the magazine does). But it nevertheless contained mistakes, as journalism often does. Some of these mistakes ignited a firestorm among historians and other readers.
  • And, like Cotton’s piece, the 1619 Project was presented in a way the Times later judged to be too provocative.
  • The Times declared that the 1619 Project “aims to reframe the country’s history, understanding 1619 as our true founding”. That bold statement – a declaration of Times fact, not opinion, since it came from the newsroom – outraged many Americans who venerated 1776 as the founding. The Times later stealthily erased it from the digital version of the project, but was caught doing so by a writer for the publication Quillette. Sulzberger told me during the initial uproar that the top editors in the newsroom – not just Baquet but his deputy – had not reviewed the audacious statement of purpose, one of the biggest editorial claims the paper has ever made. They also, of course, did not edit all the pieces themselves, trusting the magazine’s editors to do that work.
  • If the 1619 Project and the Cotton op-ed shared the same supposed flaws and excited similar outrage, how come that one is lauded as a landmark success and the other is a sackable offence?
  • I am comparing them only to meet Sulzberger on his terms, in order to illuminate what he is trying to elide. What distinguished the Cotton piece was not an error, or strong language, or that I didn’t edit it personally. What distinguished that op-ed was not process. It was politics.
  • It is one thing for the Times to aggravate historians, or conservatives, or even old-school liberals who believe in open debate. It has become quite another for the Times to challenge some members of its own staff with ideas that might contradict their view of the world.
  • The lessons of the incident are not about how to write a headline but about how much the Times has changed – how digital technology, the paper’s new business model and the rise of new ideals among its staff have altered its understanding of the boundary between news and opinion, and of the relationship between truth and justice
  • Ejecting me was one way to avoid confronting the question of which values the Times is committed to. Waving around the word “process” is another.
  • As he asserts the independence of Times journalism, Sulzberger is finding it necessary to reach back several years to another piece I chose to run, for proof that the Times remains willing to publish views that might offend its staff. “We’ve published a column by the head of the part of the Taliban that kidnapped one of our own journalists,” he told the New Yorker. He is missing the real lesson of that piece, as well.
  • The case against that piece is that Haqqani, who remains on the FBI’s most-wanted terrorist list, may have killed Americans. It’s puzzling: in what moral universe can it be a point of pride to publish a piece by an enemy who may have American blood on his hands, and a matter of shame to publish a piece by an American senator arguing for American troops to protect Americans?
  • As Mitch McConnell, then the majority leader, said on the Senate floor about the Times’s panic over the Cotton op-ed, listing some other debatable op-ed choices, “Vladimir Putin? No problem. Iranian propaganda? Sure. But nothing, nothing could have prepared them for 800 words from the junior senator from Arkansas.”
  • The Times’s staff members are not often troubled by obnoxious views when they are held by foreigners. This is an important reason the paper’s foreign coverage, at least of some regions, remains exceptional.
  • What seems most important and least understood about that episode is that it demonstrated in real time the value of the ideals that I poorly defended in the moment, ideals that not just the Times’s staff but many other college-educated Americans are abandoning.
  • After all, we ran the experiment; we published the piece. Was any Times journalist hurt? No. Nobody in the country was. In fact, though it is impossible to know the op-ed’s precise effect, polling showed that support for a military option dropped after the Times published the essay, as the Washington Post’s media critic, Erik Wemple, has written
  • If anything, in other words, publishing the piece stimulated debate that made it less likely Cotton’s position would prevail. The liberal, journalistic principle of open debate was vindicated in the very moment the Times was fleeing from it.
Javier E

Language and the Invention of Writing - Talking Points Memo - 0 views

  • Language is not an invention. As best we can tell it is an evolved feature of the human brain.
  • Critically, it is something that is hardwired into us.
  • Writing is an altogether different and artificial thing.
  • ...6 more annotations...
  • there’s nothing natural about it or inevitable.
  • There have been a number of independent originations of writing in human history.
  • But our modern world is dominated by only two lines of descent: the character system which is used in China and with some modifications much of East Asia and the ancestor of the alphabet we use as readers and writers of English.
  • What is particularly fascinating is that most historians of writing believe that this invention – the alphabet, designed by and for sub-literate Semites living on the borderlands of Egypt about 4,000 years ago – is likely the origin point of all modern alphabets.
  • the creators of the alphabets that now dominate South Asia (originating 2500 to 3000 years ago) also seem to have borrowed at least the idea of the alphabet from these Semitic innovators, though others believe they are an indigenous creation.
  • The deep history of these letters we are now communicating through is like the DNA – or perhaps rather the record of the DNA – of human cognition and thought, processed through language and encoded into writing.
Javier E

AI is about to completely change how you use computers | Bill Gates - 0 views

  • Health care
  • before the sophisticated agents I’m describing become a reality, we need to confront a number of questions about the technology and how we’ll use it.
  • Today, AI’s main role in healthcare is to help with administrative tasks. Abridge, Nuance DAX, and Nabla Copilot, for example, can capture audio during an appointment and then write up notes for the doctor to review.
  • ...38 more annotations...
  • agents will open up many more learning opportunities.
  • Already, AI can help you pick out a new TV and recommend movies, books, shows, and podcasts. Likewise, a company I’ve invested in, recently launched Pix, which lets you ask questions (“Which Robert Redford movies would I like and where can I watch them?”) and then makes recommendations based on what you’ve liked in the past
  • Productivity
  • copilots can do a lot—such as turn a written document into a slide deck, answer questions about a spreadsheet using natural language, and summarize email threads while representing each person’s point of view.
  • I don’t think any single company will dominate the agents business – there will be many different AI engines available.
  • Helping patients and healthcare workers will be especially beneficial for people in poor countries, where many never get to see a doctor at all.
  • To create a new app or service, you won’t need to know how to write code or do graphic design. You’ll just tell your agent what you want. It will be able to write the code, design the look and feel of the app, create a logo, and publish the app to an online store
  • Agents will do even more. Having one will be like having a person dedicated to helping you with various tasks and doing them independently if you want. If you have an idea for a business, an agent will help you write up a business plan, create a presentation for it, and even generate images of what your product might look like
  • For decades, I’ve been excited about all the ways that software would make teachers’ jobs easier and help students learn. It won’t replace teachers, but it will supplement their work—personalizing the work for students and liberating teachers from paperwork and other tasks so they can spend more time on the most important parts of the job.
  • Mental health care is another example of a service that agents will make available to virtually everyone. Today, weekly therapy sessions seem like a luxury. But there is a lot of unmet need, and many people who could benefit from therapy don’t have access to it.
  • Entertainment and shopping
  • The real shift will come when agents can help patients do basic triage, get advice about how to deal with health problems, and decide whether they need to seek treatment.
  • They’ll replace word processors, spreadsheets, and other productivity apps.
  • Education
  • For example, few families can pay for a tutor who works one-on-one with a student to supplement their classroom work. If agents can capture what makes a tutor effective, they’ll unlock this supplemental instruction for everyone who wants it. If a tutoring agent knows that a kid likes Minecraft and Taylor Swift, it will use Minecraft to teach them about calculating the volume and area of shapes, and Taylor’s lyrics to teach them about storytelling and rhyme schemes. The experience will be far richer—with graphics and sound, for example—and more personalized than today’s text-based tutors.
  • your agent will be able to help you in the same way that personal assistants support executives today. If your friend just had surgery, your agent will offer to send flowers and be able to order them for you. If you tell it you’d like to catch up with your old college roommate, it will work with their agent to find a time to get together, and just before you arrive, it will remind you that their oldest child just started college at the local university.
  • To see the dramatic change that agents will bring, let’s compare them to the AI tools available today. Most of these are bots. They’re limited to one app and generally only step in when you write a particular word or ask for help. Because they don’t remember how you use them from one time to the next, they don’t get better or learn any of your preferences.
  • Agents will affect how we use software as well as how it’s written. They’ll replace search sites because they’ll be better at finding information and summarizing it for you
  • Businesses that are separate today—search advertising, social networking with advertising, shopping, productivity software—will become one business.
  • other issues won’t be decided by companies and governments. For example, agents could affect how we interact with friends and family. Today, you can show someone that you care about them by remembering details about their life—say, their birthday. But when they know your agent likely reminded you about it and took care of sending flowers, will it be as meaningful for them?
  • In the computing industry, we talk about platforms—the technologies that apps and services are built on. Android, iOS, and Windows are all platforms. Agents will be the next platform.
  • A shock wave in the tech industry
  • Agents won’t simply make recommendations; they’ll help you act on them. If you want to buy a camera, you’ll have your agent read all the reviews for you, summarize them, make a recommendation, and place an order for it once you’ve made a decision.
  • The current state of the art is Khanmigo, a text-based bot created by Khan Academy. It can tutor students in math, science, and the humanities—for example, it can explain the quadratic formula and create math problems to practice on. It can also help teachers do things like write lesson plans.
  • they’ll be dramatically better. You’ll be able to have nuanced conversations with them. They will be much more personalized, and they won’t be limited to relatively simple tasks like writing a letter.
  • Companies will be able to make agents available for their employees to consult directly and be part of every meeting so they can answer questions.
  • AI agents that are well trained in mental health will make therapy much more affordable and easier to get. Wysa and Youper are two of the early chatbots here. But agents will go much deeper. If you choose to share enough information with a mental health agent, it will understand your life history and your relationships. It’ll be available when you need it, and it will never get impatient. It could even, with your permission, monitor your physical responses to therapy through your smart watch—like if your heart starts to race when you’re talking about a problem with your boss—and suggest when you should see a human therapist.
  • If the number of companies that have started working on AI just this year is any indication, there will be an exceptional amount of competition, which will make agents very inexpensive.
  • Agents are smarter. They’re proactive—capable of making suggestions before you ask for them. They accomplish tasks across applications. They improve over time because they remember your activities and recognize intent and patterns in your behavior. Based on this information, they offer to provide what they think you need, although you will always make the final decisions. [A toy sketch of this remember-and-suggest loop appears after this list.]
  • Agents are not only going to change how everyone interacts with computers. They’re also going to upend the software industry, bringing about the biggest revolution in computing since we went from typing commands to tapping on icons.
  • In the distant future, agents may even force humans to face profound questions about purpose. Imagine that agents become so good that everyone can have a high quality of life without working nearly as much. In a future like that, what would people do with their time? Would anyone still want to get an education when an agent has all the answers? Can you have a safe and thriving society when most people have a lot of free time on their hands?
  • The ramifications for the software business and for society will be profound.
  • In the next five years, this will change completely. You won’t have to use different apps for different tasks. You’ll simply tell your device, in everyday language, what you want to do. And depending on how much information you choose to share with it, the software will be able to respond personally because it will have a rich understanding of your life. In the near future, anyone who’s online will be able to have a personal assistant powered by artificial intelligence that’s far beyond today’s technology.
  • You’ll also be able to get news and entertainment that’s been tailored to your interests. CurioAI, which creates a custom podcast on any subject you ask about, is a glimpse of what’s coming.
  • An agent will be able to help you with all your activities if you want it to. With permission to follow your online interactions and real-world locations, it will develop a powerful understanding of the people, places, and activities you engage in. It will get your personal and work relationships, hobbies, preferences, and schedule. You’ll choose how and when it steps in to help with something or ask you to make a decision.
  • even the best sites have an incomplete understanding of your work, personal life, interests, and relationships and a limited ability to use this information to do things for you. That’s the kind of thing that is only possible today with another human being, like a close friend or personal assistant.
  • The most exciting impact of AI agents is the way they will democratize services that today are too expensive for most people
  • They’ll have an especially big influence in four areas: health care, education, productivity, and entertainment and shopping.
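A minimal sketch of the bot-versus-agent distinction drawn in the excerpts above, purely for illustration. Gates’s post names no APIs, so every class, method, and threshold here is hypothetical; the only behaviour encoded is what the excerpts claim: a bot is stateless and reactive, while an agent remembers activities, recognizes patterns in them, and volunteers suggestions before being asked.

```python
# Toy contrast between a "bot" and an "agent" as the excerpts define them.
# Hypothetical names throughout; the post describes behaviour, not an API.
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class Bot:
    """Stateless: steps in only when asked, keeps no history."""

    def respond(self, request: str) -> str:
        return f"Here is some help with: {request}"


@dataclass
class Agent:
    """Stateful: logs activity, spots patterns, suggests before being asked."""

    memory: list[str] = field(default_factory=list)  # remembered activities

    def observe(self, activity: str) -> None:
        # "They remember your activities": persist everything the user does.
        self.memory.append(activity)

    def suggest(self) -> str | None:
        # Crude stand-in for "recognize intent and patterns": volunteer the
        # most frequent remembered activity once it has recurred.
        if not self.memory:
            return None
        habit, count = Counter(self.memory).most_common(1)[0]
        if count >= 2:  # arbitrary threshold for "a pattern has emerged"
            return f"You often '{habit}'. Want me to set that up now?"
        return None  # no pattern yet, so stay quiet


if __name__ == "__main__":
    agent = Agent()
    for act in ["read AI news", "order flowers", "read AI news"]:
        agent.observe(act)
    print(agent.suggest())  # proactive output derived purely from memory
```

The point of the toy is that the suggestion is computed entirely from logged behaviour; the stateless Bot has no equivalent, which is exactly the gap the excerpts describe.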
Javier E

Guernica / The Storytellers of Empire - 1 views

  • Hiroshima is a book about what happened in Japan, to Japan, in August 1945. It is a book about five Japanese and one German hibakusha, or bomb survivors. It is not a book which concerns itself with what the bombing meant for America in military terms, but rather what it meant for the people of Hiroshima in the most human terms.
  • Inevitably, it also contains within it two Americas. One is the America which develops and uses—not once, but twice—a weapon of a destructive capability which far outstrips anything that has come before, the America which decides what price some other country’s civilian population must pay for its victory. There is nothing particular to America in this—all nations in war behave in much the same way. But in the years between the bombing of Hiroshima and now, no nation has intervened militarily with as many different countries as America, and always on the other country’s soil; which is to say, no nation has treated as many other civilian populations as collateral damage as America while its own civilians stay well out of the arena of war. So that’s one of the Americas in Hiroshima—the America of brutal military power.
  • But there’s another America in the book, that of John Hersey. The America of looking at the destruction your nation has inflicted and telling it like it is. The America of stepping back and allowing someone else to tell their story through you because they have borne the tragedy and you have the power to bear witness to it. It is the America of The New Yorker of William Shawn, which, for the only time in its history, gave over an entire edition to a single article and kept its pages clear of its famed cartoons. It is the America which honored Hersey for his truth telling.
  • ...12 more annotations...
  • How to reconcile these two Americas? I didn’t even try. It was a country I always looked at with one eye shut. With my left eye I saw the America of John Hersey; with my right eye I saw the America of the two atom bombs. This one-eyed seeing was easy enough from a distance. But then I came to America as an undergraduate and realized that with a few honorable exceptions, all of America looked at America with one eye shut.
  • I had grown up in a country with military rule; I had grown up, that is to say, with the understanding that the government of a nation is a vastly different thing than its people. The government of America was a ruthless and morally bankrupt entity; but the people of America, well, they were different, they were better. They didn’t think it was okay for America to talk democracy from one side of its mouth while heaping praise on totalitarian nightmares from the other side. They just didn’t know it was happening, not really, not in any way that made it real to them. For a while this sufficed. I grumbled a little about American insularity. But it was an affectionate grumble. All nations have their failings. As a Pakistani, who was I to cast stones from my brittle, blood-tipped glass house?
  • Then came September 11, and for a few seconds, it brought this question: why do they hate us?
  • It was asked not only about the men on the planes but also about those people in the world who didn’t fall over with weeping but instead were seen to remark that now America, too, knew what it felt like to be attacked. It was asked, and very quickly it was answered: they hate our freedoms. And just like that a door was closed and a large sign pasted onto it saying, “You’re Either With Us or Against Us.” Anyone who hammered on the door with mention of the words “foreign policy” was accused of justifying the murder of more than three thousand people.
  • I found myself looking to writers. Where were the novels that could be proffered to people who asked, “Why do they hate us?”, which is actually the question “Who are these people and what do they have to do with us?” No such novel, as far as I knew, had come from the post-Cold War generation of writers who started writing after the 1980s when Islam replaced Communism as the terrifying Other. But that would change, I told myself.
  • The writers would write. The novels would come. They didn’t. They haven’t.
  • So where are they, the American fiction writers—and I mean literary fiction—whose works are interested in the question “What do these people have to do with us?” and “What are we doing out there in the world?”
  • I grew up in Pakistan in the 1980s, aware that thinking about my country’s history and politics meant thinking about America’s history and politics. This is not an unusual position. Many countries of the world from Asia to South America exist, or have existed, as American client states, have seen U.S.-backed coups, faced American missiles or sanctions, seen their government’s policies on various matters dictated in Washington. America may not be an empire in the nineteenth century way which involved direct colonization. But the neo-imperialism of America was evident to me by the time I was an adolescent and able to understand these things.
  • why is it that the fiction writers of my generation are so little concerned with the history of their own nation once that history exits the fifty states. It’s not because of a lack of dramatic potential in those stories of America in the World; that much is clear.
  • The stories of America in the World rather than the World in America stubbornly remain the domain of nonfiction. Your soldiers will come to our lands, but your novelists won’t. The unmanned drone hovering over Pakistan, controlled by someone in Langley, is an apt metaphor for America’s imaginative engagement with my nation.
  • Where is the American writer who looks on his or her country with two eyes, one shaped by the experience of living here, the other filled with the sad knowledge of what this country looks like when it’s not at home. Where is the American writer who can tell you about the places your nation invades or manipulates, brings you into those stories and lets you draw breath with its characters?
  • why, when there are astonishing stories out in the world about America, to do with America, going straight to the heart of the question: who are these people and what do they have to do with us?—why are the fiction writers staying away from the stories? The answer, I think, comes from John Hersey. He said of novelists, “A writer is bound to have varying degrees of success, and I think that that is partly an issue of how central the burden of the story is to the author’s psyche.” And that’s the answer. Even now, you just don’t care very much about us. One eye remains closed. The pen, writing its deliberate sentences, is icy cold.
  • Asks why American fiction writers don't write stories about America's effect on countries where it has intervened.
Javier E

The Dispossessed: An Ambiguous Utopia (Hainish Cycle Book 5) (Ursula K. Le Guin) - 0 views

  • instead of merely looking at it from outside. He took on two seminars and an open lecture course. No teaching was requested of him, but he had asked if he could teach, and the administrators had arranged the seminars. The open class was neither his idea nor theirs. A delegation of students came and asked him to give it. He consented at once. This was how courses were organized in Anarresti learning centers by student demand, or on the teacher’s initiative, or by students and teachers together. When he found that the administrators were upset, he laughed. “Do they expect students not to be anarchists?” he said. “What else can the young be? When you are on the bottom, you must organize from the bottom up!” He had no intention of being administered out of the course—he had fought this kind of battle before—and because he communicated his firmness to the students, they held firm. To avoid unpleasant publicity, the Rectors of the University gave in, and Shevek began his course to a first-day audience of two thousand. Attendance soon dropped. He stuck to physics, never going off into the personal or the political, and it was physics on a pretty advanced level. But several hundred students continued to come. Some came out of mere curiosity, to see the man from the Moon; others were drawn by Shevek’s personality, by the glimpses of the man and the libertarian which they could catch from his words even when they could not follow his mathematics. And a surprising number of them were capable of following both the philosophy and the mathematics. They were superbly trained, these students. Their minds were fine, keen, ready. When they weren’t working, they rested. They were not blunted and distracted by a dozen other obligations. They never fell asleep in class because they were tired from having worked on rotational duty the day before. Their society maintained them in complete freedom from want, distractions, and cares. What they were free to do, however, was another question. It appeared to Shevek that their freedom from obligation was in exact proportion to their lack of freedom of initiative. He was appalled by the examination system, when it was explained to him; he could not imagine a greater deterrent to the natural wish to learn than this pattern of cramming in information and disgorging it at demand. At first he refused to give any tests or grades, but this upset the University administrators so badly that, not wishing to be discourteous to his hosts, he gave in. He asked his students to write a paper on any problem in physics that interested them, and told them that he would give them all the highest mark, so that the bureaucrats would have something to write on their forms and lists. To his surprise a good many students came to him to complain. They wanted him to set the problems, to ask the right questions; they did not want to think about questions, but to write down the answers they had learned. And some of them objected strongly to his giving everyone the same mark. How could the diligent students be distinguished from the dull ones? What was the good in working hard? If no competitive distinctions were to be made, one might as well do nothing. “Well, of course,” Shevek said, troubled. “If you do not want to do the work, you should not do it.” The boys went away unappeased, but polite. They were pleasant boys, with frank and civil manners. Shevek’s readings in Urrasti history led him to decide that they were, in fact, though the word was seldom used these days, aristocrats.
In feudal times the aristocracy had sent their sons to university, conferring superiority on the institution. Nowadays it was the other way round: the university conferred superiority on the man. They told Shevek with pride that the competition for scholarships to Ieu Eun was stiffer every year, proving the essential democracy of the institution. He said, “You put another lock on the door and call it democracy.” He liked his polite, intelligent students, but he felt no great warmth towards any of them. They were planning careers as academic or industrial scientists, and what they learned from him was to them a means to that end, success in their careers. They either had, or denied the importance of, anything else he might have offered them.
  • Shevek touched her, silver arm with his silver hand, marveling at the warmth of the touch in that cool light. “If you can see a thing whole,” he said, “it seems that it’s always beautiful. Planets, lives. . . . But close up, a world’s all dirt and rocks. And day to day, life’s a hard job, you get tired, you lose the pattern. You need distance, interval. The way to see how beautiful the earth is, is to see it as the moon. The way to see how beautiful life is, is from the vantage point of death.”
  • ...5 more annotations...
  • He found himself, therefore, with no duties at all beyond the preparation of his three classes; the rest of his time was all his own. He had not been in a situation like this since his early twenties, his first years at the Institute in Abbenay. Since those years his social and personal life had got more and more complicated and demanding. He had been not only a physicist but also a partner, a father, an Odonian, and finally a social reformer. As such, he had not been sheltered, and had expected no shelter, from whatever cares and responsibilities came to him. He had not been free from anything: only free to do anything. Here, it was the other way around. Like all the students and professors, he had nothing to do but his intellectual work, literally nothing. The beds were made for them, the rooms were swept for them, the routine of the college was managed for them, the way was made plain for them.
  • she was not a temporal physicist. She saw time naïvely as a road laid out. You walked ahead, and you got somewhere. If you were lucky, you got somewhere worth getting to. But when Shevek took her metaphor and recast it in his terms, explaining that, unless the past and the future were made part of the present by memory and intention, there was, in human terms, no road, nowhere to go, she nodded before he was half done. “Exactly,” she said. “That’s what I was doing these last four years. It isn’t all luck. Just partly.”
  • Odo wrote: “A child free from the guilt of ownership and the burden of economic competition will grow up with the will to do what needs doing and the capacity for joy in doing it. It is useless work that darkens the heart. The delight of the nursing mother, of the scholar, of the successful hunter, of the good cook, of the skillful maker, of anyone doing needed work and doing it well—this durable joy is perhaps the deepest source of human affection, and of sociality as a whole.” There was an undercurrent of joy, in that sense, in Abbenay that summer. There was a lightheartedness at work however hard the work, a
  • Fulfillment, Shevek thought, is a function of time. The search for pleasure is circular, repetitive, atemporal. The variety seeking of the spectator, the thrill hunter, the sexually promiscuous, always ends in the same place. It has an end. It comes to the end and has to start over. It is not a journey and return, but a closed cycle, a locked room, a cell. Outside the locked room is the landscape of time, in which the spirit may, with luck and courage, construct the fragile, makeshift, improbable roads and cities of fidelity: a landscape inhabitable by human beings. It is not until an act occurs within the landscape of the past and the future that it is a human act. Loyalty, which asserts the continuity of past and future, binding time into a whole, is the root of human strength; there is no good to be done without it. So, looking back on the last four years, Shevek saw them not as wasted, but as part of the edifice that he and Takver were building with their lives. The thing about working with time, instead of against it, he thought, is that it is not wasted. Even pain counts.
Javier E

A former Russian troll speaks: 'It was like being in Orwell's world' - The Washington Post - 0 views

  • How did you end up at the troll factory? I worked there from November 2014 to February 2015. I ended up there totally by accident — I happened to be unemployed, and this place had work right by my house. So I went there. I realized quickly that this was the kind of place where I only wanted to spend enough time until I got my salary and I could leave.
  • Your first feeling, when you ended up there, was that you were in some kind of factory that turned lying, telling untruths, into an industrial assembly line. The volumes were colossal — there were huge numbers of people, 300 to 400, and they were all writing absolute untruths. It was like being in Orwell’s world.
  • I worked in the commenting department — I had to comment on the news. No one asked me my opinion. My opinions were already written for me, and I had to write in my own words that which I was ordered to write.
  • ...6 more annotations...
  • We were commenting on Russian sites — all sorts of them, LiveJournal for example, and all the Russian news websites. Wherever a given news item appeared on Russian websites, trolls were immediately created to provide the illusion of support.
  • There were two shifts of 12 hours, day and night. You had to arrive exactly on time, that is, from 9 a.m. to 9 p.m. There were production norms, for example, 135 comments of 200 characters each. … You come in and spend all day in a room with the blinds closed and 20 computers. There were multiple such rooms spread over four floors. It was like a production line, everyone was busy, everyone was writing something. You had the feeling that you had arrived in a factory rather than a creative place.
  • You got a list of topics to write about. Every piece of news was taken care of by three trolls each, and the three of us would make up an act. We had to make it look like we were not trolls but real people. One of the three trolls would write something negative about the news, the other two would respond, “You are wrong,” and post links and such. And the negative one would eventually act convinced. Those are the kinds of plays we had to act out.
  • We didn’t visit other departments, but I knew there was a “Facebook department.” … It wasn’t a secret. We all had essentially the same topics, they were focused on American readers and we were focused on Russians.
  • I speak English, and they asked me if I would like to transfer to the Facebook department. The pay there was two times as high. I said, “Well, let me try.” I failed the test because you had to know English perfectly. The reader must not have the feeling that you are a foreigner. The language demands were in fact very high, they were demanding high-end translators, basically.
  • And what were the people like who worked in the American department? I would see them on smoking breaks. … They were totally modern-looking young people, like hipsters, wearing fashionable clothes with stylish haircuts and modern devices. They were so modern that you wouldn’t think they could do something like this.
Javier E

Jill Lepore On Why We Need a New American National Story - 0 views

  • Degler, a gentle and quietly heroic man, accused his colleagues of nothing short of dereliction of duty: appalled by nationalism, they had abandoned the study of the nation.
  • “If we historians fail to provide a nationally defined history, others less critical and less informed will take over the job for us.”
  • historians seemed to believe that if they stopped studying it, it would die sooner: starved, neglected, and abandoned.
  • ...44 more annotations...
  • Nation-states, when they form, imagine a past. That, at least in part, accounts for why modern historical writing arose with the nation-state.
  • For more than a century, the nation-state was the central object of historical inquiry.
  • studying American history meant studying the American nation. As the historian John Higham put it, “From the middle of the nineteenth century until the 1960s, the nation was the grand subject of American history.”
  • “A history in common is fundamental to sustaining the affiliation that constitutes national subjects,” the historian Thomas Bender once observed. “Nations are, among other things, a collective agreement, partly coerced, to affirm a common history as the basis for a shared future.”
  • in the 1970s, studying the nation fell out of favor in the American historical profession. Most historians started looking at either smaller or bigger things, investigating the experiences and cultures of social groups or taking the broad vantage promised by global history
  • The endurance of nationalism proves that there’s never any shortage of blackguards willing to prop up people’s sense of themselves and their destiny with a tissue of myths and prophecies, prejudices and hatreds, or to empty out old rubbish bags full of festering resentments and calls to violence.
  • When historians abandon the study of the nation, when scholars stop trying to write a common history for a people, nationalism doesn’t die. Instead, it eats liberalism. 
  • is there any option other than to try to craft a new American history—one that could foster a new Americanism? 
  • To review: a nation is a people with common origins, and a state is a political community governed by laws.
  • A nation-state is a political community governed by laws that unites a people with a supposedly common ancestry.
  • As Lepore writes in These Truths, “Very often, histories of nation-states are little more than myths that hide the seams that stitch the nation to the state.”
  • Not until the 1840s, when European nations were swept up in what has been called “the age of nationalities,” did Americans come to think of themselves as belonging to a nation, with a destiny
  • the state-nation, which arises when the state is formed before the development of any sense of national consciousness. The United States might be seen as a, perhaps the only, spectacular example of the latter”
  • Bancroft’s ten-volume History of the United States, From the Discovery of the American Continent, was published between 1834 and 1874.
  • An architect of manifest destiny, Bancroft wrote his history in an attempt to make the United States’ founding appear inevitable, its growth inexorable, and its history ancient. De-emphasizing its British inheritance, he celebrated the United States as a pluralistic and cosmopolitan nation, with ancestors all over the world:
  • Nineteenth-century nationalism was liberal, a product of the Enlightenment. It rested on an analogy between the individual and the collective
  • “The concept of national self-determination—transferring the ideal of liberty from the individual to the organic collectivity—was raised as the banner of liberalism.” 
  • Nineteenth-century Americans understood the nation-state within the context of an emerging set of ideas about human rights: namely, that the power of the state guaranteed everyone eligible for citizenship the same set of irrevocable political rights.
  • The American Civil War was a struggle over two competing ideas of the nation-state. This struggle has never ended; it has just moved around
  • Southerners were nationalists, too. It’s just that their nationalism was what would now be termed “illiberal” or “ethnic,” as opposed to the Northerners’ liberal or civic nationalism.
  • much of U.S. history has been a battle between them. 
  • “Ours is the government of the white man,” the American statesman John C. Calhoun declared in 1848, arguing against admitting Mexicans as citizens of the United States. “This Government was made by our fathers on the white basis,” the American politician Stephen Douglas said in 1858. “It was made by white men for the benefit of white men and their posterity forever.” 
  • In 1861, the Confederacy’s newly elected vice president, Alexander Stephens, delivered a speech in Savannah in which he explained that the ideas that lay behind the U.S. Constitution “rested upon the assumption of the equality of races”—here ceding Lincoln’s argument—but that “our new government is founded upon exactly the opposite ideas; its foundations are laid, its cornerstone rests, upon the great truth that the negro is not equal to the white man; that slavery is his natural and moral condition.”
  • the battle between liberal and illiberal nationalism raged on, especially during the debates over the 14th and 15th Amendments, which marked a second founding of the United States on terms set by liberal ideas about the rights of citizens and the powers of nation-states—namely, birthright citizenship, equal rights, universal (male) suffrage, and legal protections for noncitizens
  • For Douglass, progress could only come in this new form of a nation, the composite nation. “We shall spread the network of our science and civilization over all who seek their shelter, whether from Asia, Africa, or the Isles of the sea,” he said, and “all shall here bow to the same law, speak the same language, support the same Government, enjoy the same liberty, vibrate with the same national enthusiasm, and seek the same national ends.”
  • that effort had been betrayed by white Northerners and white Southerners who patched the United States back together by inventing a myth that the war was not a fight over slavery at all but merely a struggle between the nation and the states. “We fell under the leadership of those who would compromise with truth in the past in order to make peace in the present,” Du Bois wrote bitterly.
  • Nationalism was taking a turn, away from liberalism and toward illiberalism, including in Germany
  • That “placed the question of the ‘nation,’ and the citizen’s feelings towards whatever he regarded as his ‘nation,’ ‘nationality’ or other centre of loyalty, at the top of the political agenda.”
  • began in the United States in the 1880s, with the rise of Jim Crow laws, and with a regime of immigration restriction, starting with the Chinese Exclusion Act, the first federal law restricting immigration, which was passed in 1882. Both betrayed the promises and constitutional guarantees made by the 14th and 15th Amendments.
  • the white men who delivered speeches at the annual meetings of the American Historical Association during those years had little interest in discussing racial segregation, the disenfranchisement of black men, or immigration restriction
  • All offered national histories that left out the origins and endurance of racial inequality.
  • the uglier and more illiberal nationalism got, the more liberals became convinced of the impossibility of liberal nationalism
  • The last, best single-volume popular history of the United States written in the twentieth century was Degler’s 1959 book, Out of Our Past: The Forces That Shaped Modern America: a stunning, sweeping account that, greatly influenced by Du Bois, placed race, slavery, segregation, and civil rights at the center of the story, alongside liberty, rights, revolution, freedom, and equality. Astonishingly, it was Degler’s first book.
  • hatred for nationalism drove American historians away from it in the second half of the twentieth century.
  • with the coming of the Vietnam War, American historians stopped studying the nation-state in part out of a fear of complicity with atrocities of U.S. foreign policy and regimes of political oppression at home.
  • Bender observed in Rethinking American History in a Global Age in 2002. “Only recently,” he continued, “and because of the uncertain status of the nation-state has it been recognized that history as a professional discipline is part of its own substantive narrative and not at all sufficiently self-conscious about the implications of that circularity.” Since then, historians have only become more self-conscious, to the point of paralysis
  • If nationalism was a pathology, the thinking went, the writing of national histories was one of its symptoms, just another form of mythmaking
  • Beginning in the 1960s, women and people of color entered the historical profession and wrote new, rich, revolutionary histories, asking different questions and drawing different conclusions
  • a lot of historians in the United States had begun advocating a kind of historical cosmopolitanism, writing global rather than national history
  • Michael Walzer grimly announced that “the tribes have returned.” They had never left. They’d only become harder for historians to see, because they weren’t really looking anymore. 
  • Writing national history creates plenty of problems. But not writing national history creates more problems, and these problems are worse.
  • What would a new Americanism and a new American history look like? They might look rather a lot like the composite nationalism imagined by Douglass and the clear-eyed histories written by Du Bois
  • A nation born in contradiction will forever fight over the meaning of its history. But that doesn’t mean history is meaningless, or that anyone can afford to sit out the fight.
  • “The history of the United States at the present time does not seek to answer any significant questions,” Degler told his audience some three decades ago. If American historians don’t start asking and answering those sorts of questions, other people will, he warned
Javier E

Reading in the Time of Books Bans and A.I. - The New York Times - 0 views

  • We are in the throes of a reading crisis.
  • While right and left are hardly equivalent in their stated motivations, they share the assumption that it’s important to protect vulnerable readers from reading the wrong things.
  • But maybe the real problem is that children aren’t being taught to read at all.
  • ...44 more annotations...
  • In May, David Banks, the chancellor of New York City’s public schools, for many years a stronghold of “whole language” instruction, announced a sharp pivot toward phonics, a major victory for the “science of reading” movement and a blow to devotees of entrenched “balanced literacy” methods
  • As corporate management models and zealous state legislatures refashion the academy into a gated outpost of the gig economy, the humanities have lost their luster for undergraduates. According to reports in The New Yorker and elsewhere, fewer and fewer students are majoring in English, and many of those who do (along with their teachers) have turned away from canonical works of literature toward contemporary writing and pop culture. Is anyone reading “Paradise Lost” anymore? Are you?
  • While we binge and scroll and D.M., the robots, who are doing more and more of our writing, may also be taking over our reading.
  • There is so much to worry about. A quintessentially human activity is being outsourced to machines that don’t care about phonics or politics or beauty or truth. A precious domain of imaginative and intellectual freedom is menaced by crude authoritarian politics. Exposure to the wrong words is corrupting our children, who aren’t even learning how to decipher the right ones. Our attention spans have been chopped up and commodified, sold off piecemeal to platforms and algorithms. We’re too busy, too lazy, too preoccupied to lose ourselves in books.
  • the fact that the present situation has a history doesn’t mean that it isn’t real.
  • the reading crisis isn’t simply another culture-war combat zone. It reflects a deep ambivalence about reading itself, a crack in the foundations of modern consciousness.
  • Just what is reading, anyway? What is it for? Why is it something to argue and worry about? Reading isn’t synonymous with literacy, which is one of the necessary skills of contemporary existence. Nor is it identical with literature, which designates a body of written work endowed with a special if sometimes elusive prestige.
  • Is any other common human undertaking so riddled with contradiction? Reading is supposed to teach us who we are and help us forget ourselves, to enchant and disenchant, to make us more worldly, more introspective, more empathetic and more intelligent. It’s a private, even intimate act, swathed in silence and solitude, and at the same time a social undertaking. It’s democratic and elitist, soothing and challenging, something we do for its own sake and as a means to various cultural, material and moral ends.
  • Fun and fundamental: Together, those words express a familiar utilitarian, utopian promise — the faith that what we enjoy doing will turn out to be what we need to do, that our pleasures and our responsibilities will turn out to be one and the same. It’s not only good; it’s good for you.
  • Reading is, fundamentally, both a tool and a toy. It’s essential to social progress, democratic citizenship, good government and general enlightenment.
  • It’s also the most fantastically, sublimely, prodigiously useless pastime ever invented
  • Teachers, politicians, literary critics and other vested authorities labor mightily to separate the edifying wheat from the distracting chaff, to control, police, correct and corral the transgressive energies that propel the turning of pages.
  • His despair mirrors his earlier exhilaration and arises from the same source. “I envied my fellow-slaves for their stupidity. I have often wished myself a beast. I preferred the condition of the meanest reptile to my own. Any thing, no matter what, to get rid of thinking!”
  • Reading is a relatively novel addition to the human repertoire — less than 6,000 years old — and the idea that it might be available to everybody is a very recent innovation
  • Written language, associated with the rise of states and the spread of commerce, was useful for trade, helpful in the administration of government and integral to some religious practices. Writing was a medium for lawmaking, record-keeping and scripture, and reading was the province of priests, bureaucrats and functionaries.
  • For most of history, that is, universal literacy was a contradiction in terms. The Latin word literatus designated a member of the learned elite
  • Anyone could learn to do it, but the mechanisms of learning were denied to most people on the grounds of caste, occupation or gender.
  • According to Steven Roger Fischer’s lively and informative “A History of Reading” (2003), “Western Europe began the transition from an oral to a literate society in the early Middle Ages, starting with society’s top rungs — aristocracy and clergy — and finally including everyone else around 1,200 years later.”
  • The print revolution catalyzed a global market that flourishes to this day: Books became commodities, and readers became consumers.
  • For Fischer, as for many authors of long-range synthetic macrohistories, the story of reading is a chronicle of progress, the almost mythic tale of a latent superpower unlocked for the benefit of mankind.
  • “If extraordinary human faculties and powers do lie dormant until a social innovation calls them into life,” he writes, “perhaps this might help to explain humanity’s constant advancement.” “Reading,” he concludes, “had become our union card to humanity.”
  • For one thing, the older, restrictive model of literacy as an elite prerogative proved to be tenacious
  • The novel, more than any other genre, catered to this market. Like every other development in modern popular culture, it provoked a measure of social unease. Novels, at best a source of harmless amusement and mild moral instruction, were at worst — from the pens of the wrong writers, or in the hands of the wrong readers — both invitations to vice and a vice unto themselves
  • More consequential — and more revealing of the destabilizing power of reading — was the fear of literacy among the laboring classes in Europe and America. “Reading, writing and arithmetic,” the Enlightenment political theorist Bernard Mandeville asserted, were “very pernicious to the poor” because education would breed restlessness and discontent.
  • “It was unlawful, as well as unsafe, to teach a slave to read,” Frederick Douglass writes in his “Narrative of the Life” recalling the admonitions of one of his masters, whose wife had started teaching young Frederick his letters. If she persisted, the master explained, their chattel would “become unmanageable, and of no value to his master. As to himself, it could do him no good, but a great deal of harm. It would make him discontented and unhappy.”
  • “As I read and contemplated the subject, behold! that very discontentment which Master Hugh had predicted would follow my learning to read had already come, to torment and sting my soul to unutterable anguish. As I writhed under it, I would at times feel that learning to read had been a curse rather than a blessing.”
  • The crisis is what happens either when those efforts succeed or when they fail. Everyone likes reading, and everyone is afraid of it.
  • Douglass’s literary genius resides in the way he uses close attention to his own situation to arrive at the essence of things — to crack the moral nut of slavery and, in this case, to peel back the epistemological husk of freedom.
  • He has freed his mind, but the rest has not followed. In time it would, but freedom itself brings him uncertainty and terror, an understanding of his own humanity that is embattled and incomplete.
  • Here, the autobiographical touches on the mythic, specifically on the myth of Prometheus, whose theft of fire — a curse as well as a blessing bestowed on a bumbling, desperate species — is a primal metaphor for reading.
  • A school, however benevolently conceived and humanely administered, is a place of authority, where the energies of the young are regulated, their imaginations pruned and trained into conformity. As such, it will inevitably provoke resistance, rebellion and outright refusal on the part of its wards
  • Schools exist to stifle freedom, and also to inculcate it, a dialectic that is the essence of true education. Reading, more than any other discipline, is the engine of this process, precisely because it escapes the control of those in charge.
  • Apostles of reading like to quote Franz Kafka’s aphorism that “a book must be the ax for the frozen sea within us.” By itself, the violence of the metaphor is tempered by its therapeutic implication.
  • Kafka’s previous sentence: “What we need are books that hit us like the most painful misfortune, like the death of someone we loved more than we love ourselves, that make us feel as though we had been banished to the woods, far from any human presence, like a suicide.”
  • Are those the books you want in your child’s classroom? To read in this way is to go against the grain, to feel oneself at odds, alienated, alone. Schools exist to suppress those feelings, to blunt the ax and gently thaw the sea
  • That is important work, but it’s equally critical for that work to be subverted, for the full destructive potential of reading to lie in reach of innocent hands.
  • Roland Barthes distinguished between two kinds of literary work:
  • Text of pleasure: the text that contents, fills, grants euphoria: the text that comes from culture and does not break with it, is linked to a comfortable practice of reading.
Text of bliss: the text that imposes a state of loss, the text that discomforts (perhaps to the point of a certain boredom), unsettles the reader’s historical, cultural, psychological assumptions, the consistency of his tastes, values, memories, brings to a crisis his relation with language.
  • he is really describing modalities of reading. To a member of the slaveholding Southern gentry, “The Columbian Orator” is a text of pleasure, a book that may challenge and surprise him in places, but that does not undermine his sense of the world or his place in it. For Frederick Douglass, it is a text of bliss, “bringing to crisis” (as Barthes would put it) his relation not only to language but to himself.
  • If you’ll forgive a Dungeons and Dragons reference, it might help to think of these types of reading as lawful and chaotic.
  • Lawful reading rests on the certainty that reading is good for us, and that it will make us better people. We read to see ourselves represented, to learn about others, to find comfort and enjoyment and instruction. Reading is fun! It’s good and good for you.
  • Chaotic reading is something else. It isn’t bad so much as unjustified, useless, unreasonable, ungoverned. Defenses of this kind of reading, which are sometimes the memoirs of a certain kind of reader, favor words like promiscuous, voracious, indiscriminate and compulsive.
  • Bibliophilia is lawful. Bibliomania is chaotic.
  • The point is not to choose between them: This is a lawful publication staffed by chaotic readers. In that way, it resembles a great many English departments, bookstores, households and classrooms. Here, the crisis never ends. Or rather, it will end when we stop reading. Which is why we can’t.
Javier E

Welcome to the blah blah blah economy - 0 views


unpredictable economy global

started by Javier E on 17 Dec 22 no follow-up yet
Javier E

Guest Post: Robert Lane Greene on Language Sticklers - NYTimes.com - 0 views

  • more people are writing than ever before. Even most of the poor today have cell phones and internet. When they text or scribble on Facebook, they’re writing. We easily forget that this is something that farmhands and the urban poor almost never did in centuries past. They lacked the time and means even if they had the education. So a bigger proportion of Americans than ever before write sometimes, or even frequently, maybe daily. Naturally that means more people are writing with poor grammar and mechanics. Education is universal, and every texter and Facebooker is a writer. A century ago, a nation of 310 million engaged with the written word on a daily basis was unthinkable. Now its uneven results are taken as proof by some that language skills are in decline.
Javier E

Donald Trump will win in a landslide. The mind behind 'Dilbert' explains why. - The Wa... - 0 views

  • Adams believes Trump will win because he’s “a master persuader.”
  • what Trump is doing? He is acknowledging the suffering of some, Adams says, and then appealing emotionally to that.
  • And he bolsters that approach, Adams says, by “exploiting the business model” like an entrepreneur. In this model, which “the news industry doesn’t have the ability to change … the media doesn’t really have the option of ignoring the most interesting story,” says Adams, contending that Trump “can always be the most interesting story if he has nothing to fear and nothing to lose.”
  • ...7 more annotations...
  • Having nothing to lose essentially then increases his chance of winning, because it opens up his field of rhetorical play. “Psychology is the only necessary skill for running for president,” writes Adams, adding: “Trump knows psychology.”
  • “Did Trump’s involvement in the birther thing confuse you?” Adams goes on to ask. “Were you wondering how Trump could believe Obama was not a citizen? The answer is that Trump never believed anything about Obama’s place of birth. The facts were irrelevant, so he ignored them while finding a place in the hearts of conservatives. For later.
  • “If you see voters as rational you’ll be a terrible politician,” Adams writes on his blog. “People are not wired to be rational. Our brains simply evolved to keep us alive. Brains did not evolve to give us truth. Brains merely give us movies in our minds that keeps us sane and motivated. But none of it is rational or true, except maybe sometimes by coincidence.”
  • “While his opponents are losing sleep trying to memorize the names of foreign leaders – in case someone asks – Trump knows that is a waste of time … ,” Adams writes. “There are plenty of important facts Trump does not know. But the reason he doesn’t know those facts is – in part – because he knows facts don’t matter. They never have and they never will. So he ignores them.
  • “The evidence is that Trump completely ignores reality and rational thinking in favor of emotional appeal,” Adams writes. “Sure, much of what Trump says makes sense to his supporters, but I assure you that is coincidence. Trump says whatever gets him the result he wants. He understands humans as 90-percent irrational and acts accordingly.
  • Among the persuasive techniques that Trump uses to help bend reality, Adams says, are repetition of phrases; “thinking past the sale” so the initial part of his premise is stated as a given; and knowing the appeal of the simplest answer, which relates to the concept of Occam’s razor.
  • Writes Adams: “Identity is always the strongest level of persuasion. The only way to beat it is with dirty tricks or a stronger identity play. … [And] Trump is well on his way to owning the identities of American, Alpha Males, and Women Who Like Alpha Males. Clinton is well on her way to owning the identities of angry women, beta males, immigrants, and disenfranchised minorities.
Javier E

Jordan Peterson's Gospel of Masculinity | The New Yorker - 0 views

  • his accent and vocabulary combine to make him seem like a man out of time and out of place, especially in America.
  • His central message is a thoroughgoing critique of modern liberal culture, which he views as suicidal in its eagerness to upend age-old verities.
  • a possibly spurious quote that nevertheless captures his style and his substance: “Sort yourself out, bucko.”
  • ...41 more annotations...
  • His fame grew in 2016, during the debate over a Canadian bill known as C-16. The bill sought to expand human-rights law by adding “gender identity and gender expression” to the list of grounds upon which discrimination is prohibited. In a series of videotaped lectures, Peterson argued that such a law could be a serious infringement of free speech
  • His main focus was the issue of pronouns: many transgender or gender-nonbinary people use pronouns different from the ones they were assigned at birth—including, sometimes, “they,” in the singular, or nontraditional ones, like “ze.” The Ontario Human Rights Commission had found that, in a workplace or a school, “refusing to refer to a trans person by their chosen name and a personal pronoun that matches their gender identity” would probably be considered discrimination.
  • Peterson resented the idea that the government might force him to use what he called neologisms of politically correct “authoritarians.”
  • To many people disturbed by reports of intolerant radicals on campus, Peterson was a rallying figure: a fearsomely self-assured debater, unintimidated by liberal condemnation.
  • He remains a psychology professor by trade, and he still spends much of his time doing something like therapy. Anyone in need of his counsel can find plenty of it in “12 Rules for Life.”
  • One of his many fans is PewDiePie, a Swedish video gamer who is known as the most widely viewed YouTube personality in the world—his channel has more than sixty million subscribers
  • In a video review of “12 Rules for Life,” PewDiePie confessed that the book had surprised him. “It’s a self-help book!” he said. “I don’t think I ever would have read a self-help book.” (He nonetheless declared that Peterson’s book, at least the parts he read, was “very interesting.”)
  • Political polemic plays a relatively small role; Peterson’s goal is less to help his readers change the world than to help them find a stable place within it. One of his most compelling maxims is strikingly modest: “You should do what other people do, unless you have a very good reason not to.”
  • Of course, he is famous today precisely because he has determined that, in a range of circumstances, there are good reasons to buck the popular tide.
  • He is, by turns, a defender of conformity and a critic of it, and he thinks that if readers pay close attention, they, too, can learn when to be which.
  • “I stopped attending church, and joined the modern world.” He turned first to socialism and then to political science, seeking an explanation for “the general social and political insanity and evil of the world,” and each time finding himself unsatisfied.
  • The question was, he decided, a psychological one, so he sought psychological answers, and eventually earned a Ph.D. from McGill University, having written a thesis examining the heritability of alcoholism.
  • In “Maps of Meaning,” Peterson drew from Jung, and from evolutionary psychology: he wanted to show that modern culture is “natural,” having evolved over hundreds of thousands of years to reflect and meet our human needs.
  • Then, rather audaciously, he sought to explain exactly how our minds work, illustrating his theory with elaborate geometric diagrams
  • In “Maps of Meaning,” he traced this sense of urgency to a feeling of fraudulence that overcame him in college. When he started to speak, he would hear a voice telling him, “You don’t believe that. That isn’t true.” To ward off mental breakdown, he resolved not to say anything unless he was sure he believed it; this practice calmed the inner voice, and in time it shaped his rhetorical style, which is forceful but careful.
  • “You have to listen very carefully and tell the truth if you are going to get a paranoid person to open up to you,” he writes. Peterson seems to have found that this approach works on much of the general population, too.
  • He is particularly concerned about boys and men, and he flatters them with regular doses of tough love. “Boys are suffering in the modern world,” he writes, and he suggests that the problem is that they’re not boyish enough. Near the end of the chapter, he tries to coin a new catchphrase: “Toughen up, you weasel.”
  • his tone is more pragmatic in this book, and some of his critics might be surprised to find much of the advice he offers unobjectionable, if old-fashioned: he wants young men to be better fathers, better husbands, better community members.
  • Where the pickup artists promised to make guys better sexual salesmen (sexual consummation was called “full close,” as in closing a deal), Peterson, more ambitious, promises to help them get married and stay married. “You have to scour your psyche,” he tells them. “You have to clean the damned thing up.
  • When he claims to have identified “the culminating ethic of the canon of the West,” one might brace for provocation. But what follows, instead, is prescription so canonical that it seems self-evident: “Attend to the day, but aim at the highest good.” In urging men to overachieve, he is also urging them to fit in, and become productive members of Western society.
  • Every so often, Peterson pauses to remind his readers how lucky they are. “The highly functional infrastructure that surrounds us, particularly in the West,” he writes, “is a gift from our ancestors: the comparatively uncorrupt political and economic systems, the technology, the wealth, the lifespan, the freedom, the luxury, and the opportunity.”
  • Peterson seems to view Trump, by contrast, as a symptom of modern problems, rather than a cause of them. He suggests that Trump’s rise was unfortunate but inevitable—“part of the same process,” he writes, as the rise of “far-right” politicians in Europe. “If men are pushed too hard to feminize,” he warns, “they will become more and more interested in harsh, fascist political ideology.”
  • Peterson sometimes asks audiences to view him as an alternative to political excesses on both sides. During an interview on BBC Radio 5, he said, “I’ve had thousands of letters from people who were tempted by the blandishments of the radical right, who’ve moved towards the reasonable center as a consequence of watching my videos.”
  • But he typically sees liberals, or leftists, or “postmodernists,” as aggressors—which leads him, rather ironically, to frame some of those on the “radical right” as victims. Many of his political stances are built on this type of inversion.
  • Postmodernists, he says, are obsessed with the idea of oppression, and, by waging war on oppressors real and imagined, they become oppressors themselves. Liberals, he says, are always talking about the importance of compassion—and yet “there’s nothing more horrible for children, and developing people, than an excess of compassion.”
  • The danger, it seems, is that those who want to improve Western society may end up destroying it.
  • But Peterson remains a figurehead for the movement to block or curtail transgender rights. When he lampoons “made-up pronouns,” he sometimes seems to be lampooning the people who use them, encouraging his fans to view transgender or gender-nonbinary people as confused, or deluded
  • Once, after a lecture, he was approached on campus by a critic who wanted to know why he would not use nonbinary pronouns. “I don’t believe that using your pronouns will do you any good, in the long run,” he replied.
  • In a debate about gender on Canadian television, in 2016, he tried to find some middle ground. “If our society comes to some sort of consensus over the next while about how we’ll solve the pronoun problem,” he said, “and that becomes part of popular parlance, and it seems to solve the problem properly, without sacrificing the distinction between singular and plural, and without requiring me to memorize an impossible list of an indefinite number of pronouns, then I would be willing to reconsider my position.
  • Despite his fondness for moral absolutes, Peterson is something of a relativist; he is inclined to defer to a Western society that is changing in unpredictable ways
  • Peterson excels at explaining why we should be careful about social change, but not at helping us assess which changes we should favor; just about any modern human arrangement could be portrayed as a radical deviation from what came before.
  • In the case of gender identity, Peterson’s judgment is that “our society” has not yet agreed to adopt nontraditional pronouns, which isn’t quite an argument that we shouldn’t.
  • Peterson—like his hero, Jung—has a complicated relationship to religious belief. He reveres the Bible for its stories, reasoning that any stories that we have been telling ourselves for so long must be, in some important sense, true.
  • In a recent podcast interview, he mentioned that people sometimes ask him if he believes in God. “I don’t respond well to that question,” he said. “The answer to that question is forty hours long, and I can’t condense it into a sentence.”
  • At times, Peterson emphasizes his interest in empirical knowledge and scientific research—although these tend to be the least convincing parts of “12 Rules for Life.”
  • Peterson’s story about the lobster is essentially a modern myth. He wants forlorn readers to imagine themselves as heroic lobsters; he wants an image of claws to appear in their mind whenever they feel themselves start to slump; he wants to help them.
  • Peterson wants to help everyone, in fact. In his least measured moments, he permits himself to dream of a world transformed. “Who knows,” he writes, “what existence might be like if we all decided to strive for the best?
  • His many years of study fostered in him a conviction that good and evil exist, and that we can discern them without recourse to any particular religious authority. This is a reassuring belief, especially in confusing times: “Each human being understands, a priori, perhaps not what is good, but certainly what is not.
  • there are therapists and life coaches all over the world dispensing some version of this formula, nudging their clients to pursue lives that better conform to their own moral intuitions. The problem is that, when it comes to the question of how to order our societies—when it comes, in other words, to politics—our intuitions have proved neither reliable nor coherent.
  • The “highly functional infrastructure” he praises is the product of an unceasing argument over what is good, for all of us; over when to conform, and when to dissent
  • We can, most of us, sort ourselves out, or learn how to do it. That doesn’t mean we will ever agree on how to sort out everyone else.
Javier E

How Do You Know When Society Is About to Fall Apart? - The New York Times - 0 views

  • Tainter seemed calm. He walked me through the arguments of the book that made his reputation, “The Collapse of Complex Societies,” which has for years been the seminal text in the study of societal collapse, an academic subdiscipline that arguably was born with its publication in 1988
  • It is only a mild overstatement to suggest that before Tainter, collapse was simply not a thing.
  • His own research has moved on; these days, he focuses on “sustainability.”
  • ...53 more annotations...
  • He writes with disarming composure about the factors that have led to the disintegration of empires and the abandonment of cities and about the mechanism that, in his view, makes it nearly certain that all states that rise will one day fall
  • societal collapse and its associated terms — “fragility” and “resilience,” “risk” and “sustainability” — have become the objects of extensive scholarly inquiry and infrastructure.
  • Princeton has a research program in Global Systemic Risk, Cambridge a Center for the Study of Existential Risk
  • even Tainter, for all his caution and reserve, was willing to allow that contemporary society has built-in vulnerabilities that could allow things to go very badly indeed — probably not right now, maybe not for a few decades still, but possibly sooner. In fact, he worried, it could begin before the year was over.
  • Plato, in “The Republic,” compared cities to animals and plants, subject to growth and senescence like any living thing. The metaphor would hold: In the early 20th century, the German historian Oswald Spengler proposed that all cultures have souls, vital essences that begin falling into decay the moment they adopt the trappings of civilization.
  • that theory, which became the heart of “The Collapse of Complex Societies.” Tainter’s argument rests on two proposals. The first is that human societies develop complexity, i.e. specialized roles and the institutional structures that coordinate them, in order to solve problems
  • All history since then has been “characterized by a seemingly inexorable trend toward higher levels of complexity, specialization and sociopolitical control.”
  • Something more than the threat of violence would be necessary to hold them together, a delicate balance of symbolic and material benefits that Tainter calls “legitimacy,” the maintenance of which would itself require ever more complex structures, which would become ever less flexible, and more vulnerable, the more they piled up.
  • Eventually, societies we would recognize as similar to our own would emerge, “large, heterogeneous, internally differentiated, class structured, controlled societies in which the resources that sustain life are not equally available to all.”
  • Social complexity, he argues, is inevitably subject to diminishing marginal returns. It costs more and more, in other words, while producing smaller and smaller profits.
  • Take Rome, which, in Tainter's telling, was able to win significant wealth by sacking its neighbors but was thereafter required to maintain an ever larger and more expensive military just to keep the imperial machine from stalling — until it couldn’t anymore.
  • This is how it goes. As the benefits of ever-increasing complexity — the loot shipped home by the Roman armies or the gentler agricultural symbiosis of the San Juan Basin — begin to dwindle, Tainter writes, societies “become vulnerable to collapse.”
  • haven’t countless societies weathered military defeats, invasions, even occupations and lengthy civil wars, or rebuilt themselves after earthquakes, floods and famines?
  • Only complexity, Tainter argues, provides an explanation that applies in every instance of collapse.
  • Complexity builds and builds, usually incrementally, without anyone noticing how brittle it has all become. Then some little push arrives, and the society begins to fracture.
  • A disaster — even a severe one like a deadly pandemic, mass social unrest or a rapidly changing climate — can, in Tainter’s view, never be enough by itself to cause collapse
  • Societies evolve complexity, he argues, precisely to meet such challenges.
  • Whether any existing society is close to collapsing depends on where it falls on the curve of diminishing returns.
  • The United States hardly feels like a confident empire on the rise these days. But how far along are we?
  • Scholars of collapse tend to fall into two loose camps. The first, dominated by Tainter, looks for grand narratives and one-size-fits-all explanations
  • The second is more interested in the particulars of the societies they study
  • Patricia McAnany, who teaches at the University of North Carolina at Chapel Hill, has questioned the usefulness of the very concept of collapse — she was an editor of a 2010 volume titled “Questioning Collapse” — but admits to being “very, very worried” about the lack, in the United States, of the “nimbleness” that crises require of governments.
  • “We’re too vested and tied to places.” Without the possibility of dispersal, or of real structural change to more equitably distribute resources, “at some point the whole thing blows. It has to.”
  • In Peter Turchin’s case the key is the loss of “social resilience,” a society’s ability to cooperate and act collectively for common goals. By that measure, Turchin judges that the United States was collapsing well before Covid-19 hit. For the last 40 years, he argues, the population has been growing poorer and less healthy as elites accumulate more and more wealth and institutional legitimacy founders. “The United States is basically eating itself from the inside out.”
  • Inequality and “popular immiseration” have left the country extremely vulnerable to external shocks like the pandemic, and to internal triggers like the killing of George Floyd.
  • Turchin is keenly aware of the essential instability of even the sturdiest-seeming systems. “Very severe events, while not terribly likely, are quite possible,” he says. When he emigrated from the U.S.S.R. in 1977, he adds, no one imagined the country would splinter into its constituent parts. “But it did.”
  • Eric H. Cline, who teaches at the George Washington University, argued in “1177 B.C.: The Year Civilization Collapsed” that Late Bronze Age societies across Europe and western Asia crumbled under a concatenation of stresses, including natural disasters — earthquakes and drought — famine, political strife, mass migration and the closure of trade routes. On their own, none of those factors would have been capable of causing such widespread disintegration, but together they formed a “perfect storm” capable of toppling multiple societies all at once.
  • Collapse “really is a matter of when,” he told me, “and I’m concerned that this may be the time.”
  • In “The Collapse of Complex Societies,” Tainter makes a point that echoes the concern that Patricia McAnany raised. “The world today is full,” Tainter writes. Complex societies occupy every inhabitable region of the planet. There is no escaping. This also means, he writes, that collapse, “if and when it comes again, will this time be global.” Our fates are interlinked. “No longer can any individual nation collapse. World civilization will disintegrate as a whole.”
  • If it happens, he says, it would be “the worst catastrophe in history.”
  • The quest for efficiency, he wrote recently, has brought on unprecedented levels of complexity: “an elaborate global system of production, shipping, manufacturing and retailing” in which goods are manufactured in one part of the world to meet immediate demands in another, and delivered only when they’re needed. The system’s speed is dizzying, but so are its vulnerabilities.
  • A more comprehensive failure of fragile supply chains could mean that fuel, food and other essentials would no longer flow to cities. “There would be billions of deaths within a very short period,” Tainter says.
  • If we sink “into a severe recession or a depression,” Tainter says, “then it will probably cascade. It will simply reinforce itself.”
  • Tainter tells me he has seen “a definite uptick” in calls from journalists: The study of societal collapse suddenly no longer seems like a purely academic pursuit.
  • The only precedent Tainter could think of, in which pandemic coincided with mass social unrest, was the Black Death of the 14th century. That crisis reduced the population of Europe by as much as 60 percent.
  • He writes of visions of “bloated bureaucracies” becoming the basis of “entire political careers.” Arms races, he observes, presented a “classic example” of spiraling complexity that provides “no tangible benefit for much of the population” and “usually no competitive advantage” either.
  • It is hard not to read the book through the lens of the last 40 years of American history, as a prediction of how the country might deteriorate if resources continued to be slashed from nearly every sector but the military, prisons and police.
  • The more a population is squeezed, Tainter warns, the larger the share that “must be allocated to legitimization or coercion.”
  • And so it was: As U.S. military spending skyrocketed, from $138 billion in 1980 to, by some estimates, more than $1 trillion today, the government would try both tactics, ingratiating itself with the wealthy by cutting taxes while dismantling public-assistance programs and incarcerating the poor in ever-greater numbers.
  • “As resources committed to benefits decline,” Tainter wrote in 1988, “resources committed to control must increase.”
  • The overall picture drawn by Tainter’s work is a tragic one. It is our very creativity, our extraordinary ability as a species to organize ourselves to solve problems collectively, that leads us into a trap from which there is no escaping.
  • Complexity is “insidious,” in Tainter’s words. “It grows by small steps, each of which seems reasonable at the time.” And then the world starts to fall apart, and you wonder how you got there.
  • Perhaps collapse is not, actually, a thing. Perhaps, as an idea, it was a product of its time, a Cold War hangover that has outlived its usefulness, or an academic ripple effect of climate-change anxiety, or a feedback loop produced by some combination of the two.
  • If you pay attention to people’s lived experience, and not just to the abstractions imposed by a highly fragmented archaeological record, a different kind of picture emerges.
  • Since the beginning of the pandemic, the total net worth of America’s billionaires, all 686 of them, has jumped by close to a trillion dollars.
  • Tainter’s understanding of societies as problem-solving entities can obscure as much as it reveals.
  • Plantation slavery arose in order to solve a problem faced by the white landowning class: The production of agricultural commodities like sugar and cotton requires a great deal of backbreaking labor. That problem, however, has nothing to do with the problems of the people they enslaved. Which of them counts as “society”?
  • If societies are not in fact unitary, problem-solving entities but heaving contradictions and sites of constant struggle, then their existence is not an all-or-nothing game.
  • Collapse appears not as an ending, but a reality that some have already suffered — in the hold of a slave ship, say, or on a long, forced march from their ancestral lands to reservations far away — and survived.
  • The current pandemic has already given many of us a taste of what happens when a society fails to meet the challenges that face it, when the factions that rule over it tend solely to their own problems.
  • The real danger comes from imagining that we can keep living the way we always have, and that the past is any more stable than the present.
  • If you close your eyes and open them again, the periodic disintegrations that punctuate our history — all those crumbling ruins — begin to fade, and something else comes into focus: wiliness, stubbornness and, perhaps the strongest and most essential human trait, adaptability.
  • When one system fails, we build another. We struggle to do things differently, and we push on. As always, we have no other choice.